eric e. dolecki
 

VideoWave

My Role
Concept Design, Visual Design, Software Development
Duration
4 months
Tools
Illustrator, Flash, SEPY
Team
Bose Design Center, ID Team, Marketing, R&D Engineering
 

What is a Bose VideoWave?

Overview

The Bose VideoWave was an innovative entertainment system that combined a high-definition television with an integrated, advanced audio system. Some of its key features and aspects:

Integrated Sound System.
One of the most notable features of the VideoWave was its built-in sound system. It used an array of speakers and Bose's proprietary audio processing technology to deliver immersive surround sound without the need for external speakers or a separate home theater system.
Simplicity and Elegance.
The VideoWave aimed to simplify home entertainment setups by reducing the need for multiple components and wires. This made it appealing to users who wanted high-quality audio and video without the complexity of traditional home theater systems.
Unify™ Technology.
Bose's Unify™ technology was incorporated into the VideoWave, providing an easy-to-use interface for connecting and controlling various external devices such as Blu-ray players, gaming consoles, and cable boxes.
Size and Design.
The VideoWave was available in different sizes to fit various room settings, and it featured a sleek, minimalist design that complemented modern home decor.
Image Quality.
In addition to its audio capabilities, the VideoWave offered high-definition video quality, ensuring that users could enjoy both superior sound and sharp, clear images.

Typical television remote controls of the era.

The Project

The ask

Dr. Bose wanted to unveil a television that was extremely easy to use and sounded incredible out of the box. Remote controls of the day were covered with a multitude of buttons, leaving users confused much of the time. Dr. Bose posed a challenge to the company.

"I'd like a television remote control that only has five buttons." - Dr. Amar Bose

This simple request kicked off a series of related projects, each with its own unique codename: the remote control itself, the on-screen user interface to go with it, the connected Lifestyle box, the panel itself, and the remote-to-system communications. All kinds of explorations began.

After attending design meetings about the user interface and the remote that would control it, I alone was tasked with designing and building a functional prototype convincing enough to show key stakeholders that the experience we envisioned would be great and deserved a place in the product.

My assigned tasks (hardware & software)

  1. Design and code the approved visual design direction of the on-screen user interface and necessary states.
  2. Help with the industrial design of the remote control itself.
  3. Inform the remote control communications layer for user control.
  4. Implement a video source for real-world prototype use.
  5. Create and document all user flows and interactions.
  6. Host weekly (or ad hoc) design meetings to keep things on track.
  7. Hand-off visual assets and design documentation to the final software team.

 Early whiteboard sketches
We gathered initial whiteboard sketches of the user interface and remote control before moving on to the idea of revealing content below the video layer. This foundational step established the framework for the next design phase.

 Whiteboard sketch photos:

Prototyping

The Bose VideoWave's remote control and on-screen user interface were designed for simplicity and ease of use. The remote control had an ergonomic design, fitting comfortably in the user's hand, and featured a minimalist layout with fewer buttons than traditional remotes. This reduction in complexity made it user-friendly. It also functioned as a universal remote, capable of controlling the VideoWave system as well as other connected devices, thus eliminating the need for multiple remotes.

The remote included a touchpad for intuitive navigation, allowing users to swipe and select options on the screen. The remote also had programmable buttons, enabling users to customize certain buttons for specific functions. The buttons were backlit, making the remote easy to use in low-light conditions. The remote provided one-touch access to frequently used features and settings, such as volume control, source selection, and power on/off.

A source-based user interface

In order to reduce the number of physical buttons, we decided to place the majority of them on screen, below the video source. When the user touched a square trackpad on the remote, the video would scale down, exposing the button elements. A positional cursor indicated which button currently had input focus, and a click on the trackpad (clickpad) would select the highlighted button. We called the cursor "Tinkerbell".
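The trackpad-to-focus mapping can be sketched roughly like this (in JavaScript for illustration; the prototype itself was ActionScript, and the button names and equal-slice layout here are invented, not the shipped design):

```javascript
// Hypothetical sketch: map a normalized trackpad position (0..1) to the
// on-screen button that should receive input focus.
const buttons = ["Power", "Source", "Guide", "Settings", "Mute"];

function focusedButton(normalizedX, items) {
  // Clamp so touches at the pad's edge still resolve to an outermost button.
  const x = Math.min(1, Math.max(0, normalizedX));
  // Each button owns an equal slice of the pad's horizontal travel.
  const index = Math.min(items.length - 1, Math.floor(x * items.length));
  return items[index];
}
```

Mapping the pad's full travel across the button row means the finger never "runs out" of surface before reaching the last button.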

We came close to using only five buttons, but the final design, which tested very well with 87% of our test users, ended up with eight:

  1. Power
  2. Source
  3. Clickpad
  4. D-Pad (for cable guide navigation)
  5. Volume +/-
  6. Channel +/-
  7. Mute
  8. Last

I prefer to sketch ideas before committing to digital designs.

Details

Remote control communication

I worked with an engineer named Lazlo who provided the input data to my application, sending me X and Y locations on the trackpad. He also sent me press and release states for all of the buttons, including the trackpad itself. We worked hard to make the communication fast and accurate.
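The shape of that input layer can be sketched as a small message parser. The wire format below ("TP x y" for touch position, "BTN name DOWN|UP" for button states) is invented for illustration; the actual protocol we used is not documented here.

```javascript
// Hypothetical sketch of parsing the remote's input stream into events.
function parseInputMessage(line) {
  const parts = line.trim().split(/\s+/);
  if (parts[0] === "TP") {
    // Trackpad touch position in pad coordinates.
    return { type: "touch", x: Number(parts[1]), y: Number(parts[2]) };
  }
  if (parts[0] === "BTN") {
    // Press/release state for a named button (including the trackpad click).
    return { type: "button", name: parts[1], pressed: parts[2] === "DOWN" };
  }
  return { type: "unknown", raw: line };
}
```

Normalizing everything into typed events early keeps the interface code free of wire-format details.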

Interface audio experimentation

I created several audio clips for the on-screen user interface using a diverse range of MIDI instruments. While many were not used, this experimentation led us to a remarkable spatial audio track: a recording of an orchestra warming up. The result was a stunning, immersive sound that truly filled the space. It played during each cold startup of the system.

Below are a few samples I was able to locate.

Audio SFX (seven sample clips).
 

Cable television as a source

We could watch and control live cable television! I installed a video capture card that supported HDMI input. My application could see that card as a video source and stream it into my interface. We got a Comcast drop in the design center and hooked up the box in my office. I worked with Lazlo again, who used the Comcast remote to retrieve the IR codes it sent to the cable box. He then gave me an IR blaster that I could use to send those commands to the cable box from my application. I just needed to make sure the blaster was pointed at the cable box.

  1. I could send IR codes to a Comcast cable box from my application, controlling a live television experience: volume, channels, numeric input, everything.
  2. A tethered remote control with a clickpad and dedicated volume and channel buttons let me control the experience exactly as we would implement it for production.
  3. The HDMI signal fed the video capture card, which in turn gave my application a streaming video object I could scale up and down and give a drop shadow.
  4. Dynamic button hit detection for the track elements around the edge of the interface gave them a certain magnetism while swiping, improving the experience.
  5. A selection could still resolve to the currently highlighted button when the user's finger was near its edge; algorithms accounted for this to improve the user experience.
  6. We dubbed the touch-position cursor "Tinkerbell" because it felt magical and enhanced the experience so much. Its design was key to button navigation: users could easily anticipate upcoming transitions, and the cursor moved with a pixel-smooth, eased response.
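The magnetism idea reduces to pulling the rendered cursor toward a nearby button center, so slightly imprecise swipes still land cleanly. A minimal one-axis sketch, with invented positions and tuning values rather than the prototype's actual numbers:

```javascript
// Hypothetical sketch of "magnetic" cursor behavior near buttons.
function magnetize(cursorX, buttonCenters, pullRadius, strength) {
  // Find the nearest button center to the raw finger position.
  let nearest = null;
  let nearestDist = Infinity;
  for (const cx of buttonCenters) {
    const d = Math.abs(cursorX - cx);
    if (d < nearestDist) { nearestDist = d; nearest = cx; }
  }
  // Inside the magnetic field, pull part of the way toward the button.
  if (nearest !== null && nearestDist < pullRadius) {
    return cursorX + (nearest - cursorX) * strength;
  }
  return cursorX; // outside the field: no adjustment
}
```

A fractional pull (rather than a hard snap) keeps the cursor feeling continuous while still rewarding near-misses.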

Application

A debugging console

To save time making algorithmic changes, I created a debugging console that appeared on the television display, allowing the team to see the state of the application and edit parameters, tuning interactions until they felt just right.

I collected this data later to see which settings were most effective across everyone who used the prototype. The console also displayed the x/y coordinate data from touches, press states, and more; everything was saved to a local file I could parse later to help debug all state conditions. This saved many hours of development time and let everyone customize the settings to their liking (which would later be evaluated).
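The data side of such a console can be sketched as a parameter store plus an event log that can be dumped for later parsing. Parameter names here are examples, not the prototype's actual settings:

```javascript
// Hypothetical sketch of the debugging console's tunables and session log.
class DebugConsole {
  constructor(defaults) {
    this.params = { ...defaults }; // live, editable tuning parameters
    this.log = [];                 // every change and touch, in order
  }
  set(name, value) {
    this.log.push({ t: "param", name, value });
    this.params[name] = value;
  }
  recordTouch(x, y, pressed) {
    this.log.push({ t: "touch", x, y, pressed });
  }
  // Serialize the session (one JSON record per line) for later analysis.
  dump() {
    return this.log.map((e) => JSON.stringify(e)).join("\n");
  }
}
```

Logging parameter edits alongside touch data is what makes it possible to compare, after the fact, which tunings people converged on.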

A rolling cart

I put the whole prototype on a rolling cart so I could wheel it around the Research and Development building, letting people experience it in their own offices. A local video file looped in place of the cable feed, and in this "mobile" state no IR commands were sent from the blaster. The cart was used to evaluate source navigation and the remote control.

Over 20 iterations

There were easily over 20 iterations of the prototype - all in the effort of making the experience as smooth and effortless as possible while also delivering on production-quality assets. We wanted there to be no guesswork and no surprises at all.

There was a week when the prototype was close, but not close enough. I rewrote part of the input engine, which yielded even snappier interaction, smoothed out all the animations, and almost completely eliminated mistaken input. The project was a resounding success!

"Everyone, this is truly remarkable. We believe that this is a game-changing feature, and we are really excited about it. We're going to include it!" - Senior Marketing Mgr.

A very early (and ugly) state of the prototype I found on a backup drive. First test of cable box control.

Operating guide

I found a system operating guide online which goes into great detail about the entire system and all of its components. View the PDF.

Initial AS3 code I found, where I began by simply mapping a 1080p rectangle to a simulated touch position with the mouse. This must have been day-one-or-two throwaway code. {x,y} not set.

    package
    {
        import flash.display.Sprite;
        import flash.events.Event;

        public class MapCoordinates extends Sprite
        {
            // 16:9 (1080p)
            private const RECT_WIDTH:Number = 1920; 
            private const RECT_HEIGHT:Number = 1080;
            private var items:Array;
            private var previousMouseX:Number;
            private var previousMouseY:Number;
            private var velocityX:Number = 0;
            private var velocityY:Number = 0;
            private const ACCELERATION:Number = 0.1;

            public function MapCoordinates()
            {
                items = [];
                for (var i:int = 0; i < 10; i++) {
                    var item:Sprite = createItem();
                    items.push(item);
                    addChild(item);
                }
                
                this.addEventListener(Event.ENTER_FRAME, onEnterFrame);
            }

            private function createItem():Sprite
            {
                var item:Sprite = new Sprite();
                item.graphics.beginFill(0xFF0000); // Red color
                item.graphics.drawCircle(0, 0, 10); // Draw a circle with radius 10
                item.graphics.endFill();
                return item;
            }

            private function onEnterFrame(event:Event):void
            {
                var mouseX:Number = this.mouseX;
                var mouseY:Number = this.mouseY;

                // Track pointer velocity between frames (computed but not yet used in this early sketch)
                if (!isNaN(previousMouseX) && !isNaN(previousMouseY)) {
                    velocityX = mouseX - previousMouseX;
                    velocityY = mouseY - previousMouseY;
                }

                previousMouseX = mouseX;
                previousMouseY = mouseY;

                // Calculate target position
                var target:Object = mapToEdge(mouseX, mouseY);
                var targetX:Number = target.x;
                var targetY:Number = target.y;

                // Move items with acceleration
                for each (var item:Sprite in items) {
                    var dx:Number = targetX - item.x;
                    var dy:Number = targetY - item.y;
                    item.x += dx * ACCELERATION;
                    item.y += dy * ACCELERATION;
                    item.alpha = 0.5 + 0.5 * Math.sqrt(dx*dx + dy*dy) / Math.sqrt(RECT_WIDTH*RECT_WIDTH + RECT_HEIGHT*RECT_HEIGHT);
                }
            }

            private function mapToEdge(x:Number, y:Number):Object
            {
                var targetX:Number;
                var targetY:Number;

                var aspectRatio:Number = RECT_WIDTH / RECT_HEIGHT;
                // Guard against division by zero when the pointer sits on y = 0.
                var mouseAspectRatio:Number = x / Math.max(y, 1);

                if (mouseAspectRatio > aspectRatio) {
                    // Snap to the nearer vertical edge, keeping the pointer's y.
                    targetX = (x < RECT_WIDTH / 2) ? 0 : RECT_WIDTH;
                    targetY = y;
                } else {
                    // Snap to the nearer horizontal edge, keeping the pointer's x.
                    targetX = x;
                    targetY = (y < RECT_HEIGHT / 2) ? 0 : RECT_HEIGHT;
                }

                return {x: targetX, y: targetY};
            }
        }
    }                
        

Impact

Prototype

  1. The prototype gave cross-functional team members a voice during development, grounded in hands-on experience with the system.
  2. As a team we could iterate the design and try new ideas out on a separate development thread from different hardware components.
  3. I was able to form tight bonds with other team members and thought partners.
  4. I was able to experience a true nuts-and-bolts build of multiple components synchronized together.
  5. I experienced true scrum while also self-managing a lot of my time to make sure tasks were completed while allowing enough time for creative solutions.
  6. I was awarded a few design patents while working on this project.
  7. One of the few projects I worked on that shipped in its entirety.

Operational

  1. Expanded the Design Department's in-house prototyping capabilities.
  2. Dr. Bose took a keen interest in the Design Department's in-house development capabilities.
  3. Prototypes provided tangible feedback, allowing the team to iterate rapidly and refine concepts based on real-world data.
  4. Development time and communication were vastly improved.
  5. On-screen real-time settings and visible logging influenced later product development; those who saw them were impressed by their utility.
  6. I expanded my technical expertise in multiple areas - learning on the job. I was able to flex my creative muscles and the approaches and assets I created ultimately shipped.

Daily Giz Wiz 1199: The Bose VideoWave TV

Hosts: Dick DeBartolo with Leo Laporte. Combine a vibrant 46" HD screen with an invisible surround sound system and you have the Bose VideoWave entertainment system.