
SoundTouch Controller

My Role
Product Design, Visual Design, Software Development
Duration
12 weeks
Tools
Illustrator, Photoshop, Xcode
Team
Corporate Concepts Group, UX Research
 

What is Bose SoundTouch?

Overview

Bose SoundTouch is a series of wireless music systems developed by Bose Corporation. Designed to deliver high-quality audio, these systems let users stream music from a variety of sources, including:

Wi-Fi.
SoundTouch systems can connect to your home Wi-Fi network, allowing you to stream music directly from the internet.
Bluetooth.
You can also stream music from any Bluetooth-enabled device, such as a smartphone or tablet.
Music Services.
The systems are compatible with popular music streaming services like Spotify, Amazon Music, Pandora, and more.
Internet Radio.
Access thousands of internet radio stations.
Local Music Libraries.
Play music stored on your computer or network-attached storage (NAS) devices.

With the ability to set up to six personalized presets, users can access their favorite stations or content effortlessly. Initially, units shipped with a simple remote control in the box.

The standard remote control and a Bose SoundTouch 10 speaker.

Enhancements

We needed to create a premium SoundTouch remote

The project came to us as an invitation to create a brand-new premium SoundTouch remote accessory.

"We understood the need for a remote that matched the sophistication of your Bose SoundTouch system — seamlessly intuitive, beautifully designed, and packed with functionality. This remote goes beyond mere control; it's a natural extension of your listening environment, offering you easy access to your favorite presets with just a touch. Whether you're switching between playlists, adjusting the volume, or exploring new music, the Bose SoundTouch Controller is designed to make every interaction effortless, ensuring your audio experience is as exceptional as the sound."

Six presets were not enough for some users

Our user research had indicated that six presets were optimal for most users, aligning with the classic car-radio design of six AM/FM presets. After launch, however, we received numerous requests to expand the number of presets. The existing hardware shipped with six physical preset buttons, as did the remote control, so a new optional remote accessory could handle the task.

Many users wanted to have quick physical access to more than just six presets.

Easier access to music

The remote that shipped with each speaker system offered some convenience, but users wanted a more premium experience. A small remote control was easy to misplace; we could offer something more substantial with a more permanent home if desired, such as on a wall, a refrigerator, or a shelf.

User-centered design (UCD) framework

The investigation and subsequent tasks followed a user-centered design framework. We sought to create an experience that met the needs of the majority of users through surveys and interviews, iterative design, and generative brainstorming.

My specific tasks

  • Conceive and Design: Conceptualize, sketch, and develop intuitive and engaging product UI/UX designs.
  • Collaborate and Innovate: Partner with the Industrial Design (ID) team to brainstorm and refine the product's industrial design.
  • Communicate and Influence: Present design directions to key stakeholders with clarity and impact.
  • Prototype with Precision: Design visually stunning and seamless user experiences, and develop high-fidelity experiential prototypes on an iPad, offering a near-final representation in the absence of custom hardware.
  • Refine and Perfect: Implement all necessary adjustments to ensure the design meets rigorous evaluation standards.

Kick-off meeting

I joined a collaborative meeting with colleagues from User Research, Marketing, and the Bose Design Center, where we began brainstorming potential solutions to the feedback we were receiving.

One foundational idea we quickly embraced was the creation of an easy-to-use, centerpiece remote control designed to be prominently placed on a coffee table, side table, near a light switch, or even magnetically attached to a refrigerator door—always within easy reach.

In addition to its accessible form factor, this remote would display metadata to inform users about what they were listening to. We also envisioned a more tactile and precise volume control, featuring a rotary encoder dial for fine-grained adjustments.

Example of in-meeting notes taken with some initial ideas recorded.

Preparation

Target deliverables

After our initial brainstorming and discussions, we made a list of should-have and shall-have aspects of the remote control. We had a window of only six months to prepare a design for manufacturing, so we needed to plan with precision.

Six preset buttons on a volume dial surrounding a rectangular screen, covered by a smoked lens to hide the screen boundaries. This would give the new industrial design visual heft, making the remote easier to locate.
A center area to display text and icons in monochrome to keep unit pricing down.
The ability to peek into other station content with a touch before a press (see the sketch after this list).
A volume ring around the outside edge of the remote to allow for elegant volume adjustment.
The center area should function as a play/pause control, with the entire top surface acting as a button.
Since stations were mostly streamed, the fast-forward and rewind controls would need to be selectively backlit to indicate whether they were available.
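
To illustrate the peek-before-press idea, here is a minimal sketch of how it could be wired up in UIKit, separating the touch-down (preview) from the completed press (commit). The class and selector names are hypothetical, not taken from the shipped prototype.

            #import <UIKit/UIKit.h>

            @interface PresetPeekViewController : UIViewController
            @end

            @implementation PresetPeekViewController

            - (void)attachPeekHandlersToButton:(UIButton *)button {
                // Touch-down begins the peek: show the preset's metadata only.
                [button addTarget:self action:@selector(beginPeek:)
                 forControlEvents:UIControlEventTouchDown];
                // A completed press commits the preset.
                [button addTarget:self action:@selector(commitPreset:)
                 forControlEvents:UIControlEventTouchUpInside];
                // Releasing outside, or a system cancellation, just ends the peek.
                [button addTarget:self action:@selector(endPeek:)
                 forControlEvents:(UIControlEventTouchUpOutside | UIControlEventTouchCancel)];
            }

            - (void)beginPeek:(UIButton *)sender {
                NSLog(@"Peek: show metadata for preset %ld", (long)sender.tag);
            }

            - (void)commitPreset:(UIButton *)sender {
                NSLog(@"Commit: switch playback to preset %ld", (long)sender.tag);
            }

            - (void)endPeek:(UIButton *)sender {
                NSLog(@"End peek: restore the now-playing display");
            }

            @end

Because a peek only previews metadata and never changes state, the user can explore all six presets by touch without interrupting what is currently playing.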

Prototyping

No existing hardware...

Without any existing hardware or software, I took on the task of creating the UI once we finalized the design direction. We sketched various screens to ensure we displayed the right amount of information. Since the controller would be small enough to fit on an iPad screen at 1:1 scale, we decided I could use one to prototype. Perfect: I could leverage both my visual design skills and my software engineering and iOS experience.

I considered driving a SoundTouch system or a laptop by sending Bluetooth key presses and releases, but opted to render and control local audio in the simulation application itself, which was good enough.

Tapping the keyboard above (on desktop) gives it focus; you can then type characters (while it auto-types) and see press-and-hold activation. This was part of my debugging for systems that accept keyboard input.
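
For the keyboard-driven side of that debugging, a UIKit view controller can observe hardware key presses through UIKeyCommand. The sketch below is illustrative only; the class name and key mappings are assumptions, not the prototype's actual code.

            #import <UIKit/UIKit.h>

            @interface KeyDebugViewController : UIViewController
            @end

            @implementation KeyDebugViewController

            - (BOOL)canBecomeFirstResponder {
                return YES; // required to receive key commands
            }

            - (NSArray<UIKeyCommand *> *)keyCommands {
                // Hypothetical mapping: space bar toggles playback, arrows drive volume.
                return @[
                    [UIKeyCommand keyCommandWithInput:@" " modifierFlags:0 action:@selector(togglePlayPause:)],
                    [UIKeyCommand keyCommandWithInput:UIKeyInputUpArrow modifierFlags:0 action:@selector(volumeUp:)],
                    [UIKeyCommand keyCommandWithInput:UIKeyInputDownArrow modifierFlags:0 action:@selector(volumeDown:)],
                ];
            }

            - (void)togglePlayPause:(UIKeyCommand *)cmd { NSLog(@"play/pause"); }
            - (void)volumeUp:(UIKeyCommand *)cmd { NSLog(@"volume up"); }
            - (void)volumeDown:(UIKeyCommand *)cmd { NSLog(@"volume down"); }

            @end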

I then collaborated with the assigned industrial designer to craft a realistic top-down image representing the hardware on the iPad screen. With that in place, I coded a simulation with presets of local audio content to mimic a real user's collection. We tested a single preset, a full set, and all possible gestures to drive the system. Additionally, we explored error states, experimental features like station peeking, and extended presets with six banks of six presets each.
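
As a sketch of how local audio can stand in for streamed stations in such a simulation, something along these lines works with AVFoundation's AVAudioPlayer (the class name and bundled file names here are hypothetical):

            #import <AVFoundation/AVFoundation.h>

            @interface PresetAudioPlayer : NSObject
            @property (nonatomic, strong) AVAudioPlayer *player;
            @end

            @implementation PresetAudioPlayer

            - (void)playPresetAtIndex:(NSInteger)index {
                // Bundled files such as preset1.m4a ... preset6.m4a stand in for stations.
                NSString *name = [NSString stringWithFormat:@"preset%ld", (long)index + 1];
                NSURL *url = [[NSBundle mainBundle] URLForResource:name withExtension:@"m4a"];
                if (url == nil) {
                    NSLog(@"Missing audio for preset %ld", (long)index + 1);
                    return;
                }
                NSError *error = nil;
                self.player = [[AVAudioPlayer alloc] initWithContentsOfURL:url error:&error];
                if (error != nil) {
                    NSLog(@"Player error: %@", error);
                    return;
                }
                [self.player play];
            }

            - (void)togglePlayPause {
                // Mirrors the controller's center play/pause surface.
                if (self.player.isPlaying) {
                    [self.player pause];
                } else {
                    [self.player play];
                }
            }

            @end

Each preset button can then call playPresetAtIndex:, while the center surface maps to togglePlayPause.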

While the actual hardware was being prototyped and iterated upon, we refined the entire interaction model independently of the hardware. This approach dramatically accelerated the development process.

Development

I quickly placed the top-down mock-up of the industrial design into the project and began implementing the screen views and the volume controls for hard-coded preset content on the device. The most difficult part was converting rotational touches into a volume-ring approximation that felt nearly as satisfying as operating a true rotary encoder.

Since the screens had been wire-framed earlier, implementation was easier. Anticipated states were taken into account, and I could run the iPad over to nearby desks to collect quick feedback during the prototyping phase. I could also supply TestFlight builds to team members with their own iPads so that they could drive the experience from home if they chose.

Objective-C

At the time I used UIKit and Objective-C to develop the prototype. I don't believe Swift had been released yet; I certainly would have preferred that language.

Some sample touch-tracking Objective-C code. It doesn't take the distance from center into account, which would be needed to approximate an outside-edge volume ring.

            typedef NS_ENUM(NSInteger, PanDirection) {
                PanDirectionNone,
                PanDirectionLeft,
                PanDirectionRight,
                PanDirectionUp,
                PanDirectionDown
            };

            // View Controller Code 
            
            #import "ViewController.h"

            @interface ViewController ()
            @property (nonatomic, strong) UIView *knobView;
            @property (nonatomic) CGPoint lastTouchPoint;
            @property (nonatomic) CGFloat volumeLevel;
            @end

            @implementation ViewController

            - (void)viewDidLoad {
                [super viewDidLoad];
                
                // Set up the knob view
                self.knobView = [[UIView alloc] initWithFrame:CGRectMake(150, 300, 60, 60)];
                self.knobView.backgroundColor = [UIColor blueColor];
                self.knobView.layer.cornerRadius = 30;
                [self.view addSubview:self.knobView];
                
                // Add pan gesture recognizer
                UIPanGestureRecognizer *panGesture = [[UIPanGestureRecognizer alloc] initWithTarget:self action:@selector(handlePan:)];
                [self.knobView addGestureRecognizer:panGesture];
                
                // Initialize volume level
                self.volumeLevel = 0.5; // Starting volume level
            }

            - (void)handlePan:(UIPanGestureRecognizer *)gesture {
                CGPoint touchPoint = [gesture locationInView:self.view];
                
                switch (gesture.state) {
                    case UIGestureRecognizerStateBegan:
                        self.lastTouchPoint = touchPoint;
                        break;
                    case UIGestureRecognizerStateChanged: {
                        CGFloat deltaX = touchPoint.x - self.lastTouchPoint.x;
                        CGFloat deltaY = touchPoint.y - self.lastTouchPoint.y;
                        
                        // Calculate direction
                        PanDirection direction = [self panDirectionForDeltaX:deltaX deltaY:deltaY];
                        switch (direction) {
                            case PanDirectionLeft:
                                NSLog(@"Moving Left");
                                break;
                            case PanDirectionRight:
                                NSLog(@"Moving Right");
                                break;
                            case PanDirectionUp:
                                NSLog(@"Moving Up");
                                break;
                            case PanDirectionDown:
                                NSLog(@"Moving Down");
                                break;
                            default:
                                break;
                        }
                        
                        // Approximate gesture speed from the recognizer's velocity
                        CGPoint velocity = [gesture velocityInView:self.view];
                        CGFloat speed = hypot(velocity.x, velocity.y); // points per second
                        NSLog(@"Speed: %f", speed);
                        
                        // Update last touch point
                        self.lastTouchPoint = touchPoint;
                        
                        // Update volume level based on touch movement
                        [self updateVolumeWithDeltaX:deltaX deltaY:deltaY direction:direction];
                        break;
                    }
                    case UIGestureRecognizerStateEnded:
                    case UIGestureRecognizerStateCancelled:
                        NSLog(@"Touch Ended");
                        break;
                    default:
                        break;
                }
            }

            - (PanDirection)panDirectionForDeltaX:(CGFloat)deltaX deltaY:(CGFloat)deltaY {
                if (fabs(deltaX) > fabs(deltaY)) {
                    return (deltaX > 0) ? PanDirectionRight : PanDirectionLeft;
                } else {
                    return (deltaY > 0) ? PanDirectionDown : PanDirectionUp;
                }
            }

            - (void)updateVolumeWithDeltaX:(CGFloat)deltaX deltaY:(CGFloat)deltaY direction:(PanDirection)direction {
                CGFloat adjustment = 0.0;
                if (direction == PanDirectionLeft || direction == PanDirectionRight) {
                    // Panning right raises the volume, panning left lowers it.
                    adjustment = deltaX / 1000.0;
                } else if (direction == PanDirectionUp || direction == PanDirectionDown) {
                    // Screen Y grows downward, so invert it: panning up raises the volume.
                    adjustment = -deltaY / 1000.0;
                }
                self.volumeLevel = MIN(MAX(self.volumeLevel + adjustment, 0.0), 1.0);
                NSLog(@"Volume Level: %f", self.volumeLevel);
            }

            @end
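
To actually approximate the outside-edge volume ring, the touch's angle around the center can be tracked with atan2, gated on distance from center, and the angular delta mapped to a volume change. The following is a minimal sketch under those assumptions; RingVolumeView and its band thresholds are illustrative, not the prototype's actual code.

            #import <UIKit/UIKit.h>

            @interface RingVolumeView : UIView
            @property (nonatomic) CGFloat volumeLevel;  // 0.0 ... 1.0
            @property (nonatomic) CGFloat lastAngle;    // radians
            @property (nonatomic) BOOL onRing;          // touch began in the ring band
            @end

            @implementation RingVolumeView

            - (CGFloat)angleForPoint:(CGPoint)p {
                // Angle of the touch around the view's center, in radians.
                CGPoint c = CGPointMake(CGRectGetMidX(self.bounds), CGRectGetMidY(self.bounds));
                return atan2(p.y - c.y, p.x - c.x);
            }

            - (BOOL)pointIsOnRing:(CGPoint)p {
                // Only the outer 30% band of the circle counts as the volume ring.
                CGPoint c = CGPointMake(CGRectGetMidX(self.bounds), CGRectGetMidY(self.bounds));
                CGFloat radius = hypot(p.x - c.x, p.y - c.y);
                CGFloat outer = MIN(self.bounds.size.width, self.bounds.size.height) / 2.0;
                return (radius > outer * 0.7 && radius <= outer);
            }

            - (void)touchesBegan:(NSSet<UITouch *> *)touches withEvent:(UIEvent *)event {
                CGPoint p = [touches.anyObject locationInView:self];
                self.onRing = [self pointIsOnRing:p];
                if (self.onRing) {
                    self.lastAngle = [self angleForPoint:p];
                }
            }

            - (void)touchesMoved:(NSSet<UITouch *> *)touches withEvent:(UIEvent *)event {
                CGPoint p = [touches.anyObject locationInView:self];
                if (!self.onRing || ![self pointIsOnRing:p]) return;
                CGFloat angle = [self angleForPoint:p];
                CGFloat delta = angle - self.lastAngle;
                // Unwrap across the -pi/+pi boundary so a continuous twist stays smooth.
                if (delta > M_PI)  { delta -= 2.0 * M_PI; }
                if (delta < -M_PI) { delta += 2.0 * M_PI; }
                // In UIKit coordinates a clockwise twist increases the angle;
                // one full revolution sweeps the entire volume range.
                self.volumeLevel = MIN(MAX(self.volumeLevel + delta / (2.0 * M_PI), 0.0), 1.0);
                self.lastAngle = angle;
                NSLog(@"Volume Level: %f", self.volumeLevel);
            }

            @end

Because the angular delta is unwrapped at the ±π boundary, the user can keep twisting past twelve o'clock without the volume jumping, which is what makes the interaction feel like a real encoder.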
        

Impact

Prototype

  1. As a team, we could iterate on the design and try out new ideas on a development track separate from the hardware
  2. Partnered with an industrial designer and a mechanical designer in developing the look and feel of the hardware prototype, keeping the design in sync
  3. Provided stakeholders with feature options beyond what was originally expected
  4. Broadened awareness of Research's development capabilities
  5. Led user research initiatives that influenced the direction of the product

Operational

  1. Expanded the Research Department's in-house development capabilities
  2. Shared the completed Xcode project with the Bose GitHub community
  3. Through timely iteration and design sprints, we were able to deliver an experience and document it for the release team
  4. This product concept/simulation directly led to the release of the retail product

English promotional video

Japanese exploration video

"A really good product concept, yet perhaps because few were sold, there is far too little information about it." (translated from Japanese)

Service Manual

I found a service manual online for a version of the controller.