Bose SoundTouch is a series of wireless music systems developed by Bose Corporation. These systems are designed to deliver high-quality audio and let users stream music from a variety of sources.
With up to six personalized presets, users can access their favorite stations or content effortlessly. Initially, units shipped with a simple remote control in the box.
An invitation came our way: create a brand-new premium SoundTouch remote accessory.
"We understood the need for a remote that matched the sophistication of your Bose SoundTouch system — seamlessly intuitive, beautifully designed, and packed with functionality. This remote goes beyond mere control; it's a natural extension of your listening environment, offering you easy access to your favorite presets with just a touch. Whether you're switching between playlists, adjusting the volume, or exploring new music, the Bose SoundTouch Controller is designed to make every interaction effortless, ensuring your audio experience is as exceptional as the sound."
Our user research revealed that six presets were optimal for most users, echoing the classic car radio design with its six AM and FM presets. After launch, we received numerous requests to expand the number of presets to offer more options. Our current hardware shipped with six physical preset buttons, as did the remote control. A new optional remote accessory could handle the task.
The remote that shipped with each speaker system was convenient, but users wanted a more premium experience. A small remote control was easy to misplace. We could offer something more substantial with a more permanent home, such as on a wall, a refrigerator, or a shelf.
The investigation and subsequent work followed a user-centered design framework. We sought to create an experience that met the needs of the majority of users through surveys and interviews, iterative design, and generative brainstorming.
I joined a collaborative meeting with colleagues from User Research, Marketing, and the Bose Design Center, where we began brainstorming potential solutions to the feedback we were receiving.
One foundational idea we quickly embraced was the creation of an easy-to-use, centerpiece remote control designed to be prominently placed on a coffee table, side table, near a light switch, or even magnetically attached to a refrigerator door, always within easy reach.
In addition to its accessible form factor, this remote would display metadata to inform users about what they were listening to. We also envisioned a more tactile and precise volume control, featuring a rotary encoder dial for fine-grained adjustments.
After our initial brainstorming and discussions, we made a list of must-have and should-have aspects of the remote control. We had a window of only six months to prepare a design for manufacturing, so we needed to plan with precision.
Without any existing hardware or software, I took on the task of creating the UI once we finalized the design and direction. We sketched various screens to ensure we displayed the right amount of information. It was decided that since the controller would be small enough to fit on an iPad screen at a 1:1 scale, I could use one to prototype. Perfect. I could leverage both my visual design skills as well as my software engineering and iOS experience.
We considered driving a SoundTouch system or a laptop by sending Bluetooth key presses and releases, but opted to render and control local audio in the simulation application itself, which was good enough. Tapping the keyboard (on desktop) above to give it focus, you can type some characters (while it auto-types) and also see press-and-hold activated; this was part of my debugging for systems accepting keyboard input.
I then collaborated with the assigned industrial designer to craft a realistic top-down image representing the hardware on the iPad screen. With that in place, I coded a simulation with presets of local audio content to mimic a real user's collection. We tested a single preset, a full set, and all possible gestures to drive the system. Additionally, we explored error states, experimental features like station peeking, and extended presets with six banks of six presets each.
While the actual hardware was being prototyped and iterated upon, we refined the entire interaction model independently of the hardware. This approach dramatically accelerated the development process.
I quickly placed the mock-up of the industrial design (top-down) into the project and then began to implement the screen views and the volume controls for hard-coded preset content (on device). The most difficult part was replicating the rotational user touches to convert them into volume ring approximations that felt nearly as satisfying as the operation of a true rotary encoder.
Since the screens were wire-framed earlier, implementation was easier. Anticipated states were taken into account, and I could run the iPad over to nearby desks to collect quick feedback during the prototyping phase. I could also supply TestFlight builds to team members with their own iPads so that they could drive the experience from the comfort of their own homes if they chose.
At the time, I used UIKit and Objective-C to develop the prototype; Swift had not yet been released, though I certainly would have preferred that language.
typedef NS_ENUM(NSInteger, PanDirection) {
    PanDirectionNone,
    PanDirectionLeft,
    PanDirectionRight,
    PanDirectionUp,
    PanDirectionDown
};

// View Controller Code
#import "ViewController.h"

@interface ViewController ()
@property (nonatomic, strong) UIView *knobView;
@property (nonatomic) CGPoint lastTouchPoint;
@property (nonatomic) CGFloat volumeLevel;
@end

@implementation ViewController

- (void)viewDidLoad {
    [super viewDidLoad];

    // Set up the knob view
    self.knobView = [[UIView alloc] initWithFrame:CGRectMake(150, 300, 60, 60)];
    self.knobView.backgroundColor = [UIColor blueColor];
    self.knobView.layer.cornerRadius = 30;
    [self.view addSubview:self.knobView];

    // Add pan gesture recognizer
    UIPanGestureRecognizer *panGesture = [[UIPanGestureRecognizer alloc] initWithTarget:self action:@selector(handlePan:)];
    [self.knobView addGestureRecognizer:panGesture];

    // Initialize volume level
    self.volumeLevel = 0.5; // Starting volume level
}

- (void)handlePan:(UIPanGestureRecognizer *)gesture {
    CGPoint touchPoint = [gesture locationInView:self.view];

    switch (gesture.state) {
        case UIGestureRecognizerStateBegan:
            self.lastTouchPoint = touchPoint;
            break;

        case UIGestureRecognizerStateChanged: {
            CGFloat deltaX = touchPoint.x - self.lastTouchPoint.x;
            CGFloat deltaY = touchPoint.y - self.lastTouchPoint.y;

            // Determine the dominant direction of movement
            PanDirection direction = [self panDirectionForDeltaX:deltaX deltaY:deltaY];
            switch (direction) {
                case PanDirectionLeft:
                    NSLog(@"Moving Left");
                    break;
                case PanDirectionRight:
                    NSLog(@"Moving Right");
                    break;
                case PanDirectionUp:
                    NSLog(@"Moving Up");
                    break;
                case PanDirectionDown:
                    NSLog(@"Moving Down");
                    break;
                default:
                    break;
            }

            // The recognizer reports instantaneous velocity (points per second)
            CGPoint velocity = [gesture velocityInView:self.view];
            CGFloat speed = hypot(velocity.x, velocity.y);
            NSLog(@"Speed: %f", speed);

            // Update last touch point
            self.lastTouchPoint = touchPoint;

            // Update volume level based on touch movement
            [self updateVolumeWithDeltaX:deltaX deltaY:deltaY direction:direction];
            break;
        }

        case UIGestureRecognizerStateEnded:
        case UIGestureRecognizerStateCancelled:
            NSLog(@"Touch Ended");
            break;

        default:
            break;
    }
}

- (PanDirection)panDirectionForDeltaX:(CGFloat)deltaX deltaY:(CGFloat)deltaY {
    if (fabs(deltaX) > fabs(deltaY)) {
        return (deltaX > 0) ? PanDirectionRight : PanDirectionLeft;
    } else {
        return (deltaY > 0) ? PanDirectionDown : PanDirectionUp;
    }
}

- (void)updateVolumeWithDeltaX:(CGFloat)deltaX deltaY:(CGFloat)deltaY direction:(PanDirection)direction {
    CGFloat adjustment = 0.0;
    if (direction == PanDirectionLeft || direction == PanDirectionRight) {
        adjustment = deltaX / 1000.0;
    } else if (direction == PanDirectionUp || direction == PanDirectionDown) {
        // Screen-space deltaY grows downward, so invert it: dragging up raises the volume
        adjustment = -deltaY / 1000.0;
    }
    self.volumeLevel = MIN(MAX(self.volumeLevel + adjustment, 0.0), 1.0);
    NSLog(@"Volume Level: %f", self.volumeLevel);
}

@end
Even for such a great product concept, perhaps because so few were sold, there is far too little information available about it.
I found a service manual online for a version of the controller.