LeakyPhones

Gaze is an important social signal in human interactions. While its interpretation may vary across cultures, it is generally agreed that eye contact indicates the point of attention in a conversation, and in Western countries it is a positive and desirable indication of attention and interest in a conversation.

Despite this, many common personal computing technologies require significant visual attention and thus harm their users' ability to give others feedback regarding their interest and engagement.

LeakyPhones is a public/private headset that was designed to promote conversation, face-to-face interaction, sharing of interests, and healthy social skills by letting users "peek" into each other just by looking.

The aim of this ongoing project (kindly supported by BOSE through the MIT Media Lab and the LeakyPhones SIG) is to explore gaze and other subtle yet natural forms of signaling to foster engagement in conversations, using headphones as a non-digital mediator for interaction. We are testing whether the combination of eye contact and an icebreaker, such as a piece of shared music, can help people notice and interact with their surroundings.

Leaky_Amos_Timna.jpg

HOW DOES IT WORK?

Each person who owns a headset can tune their privacy setting as follows:

Leaky_privacy_with_colors.jpg
  1. Bidirectional mode: one could be receiving content from others while transmitting their own content.

  2. Transmit-only mode: one could be sharing content while not interested in receiving content from others (like a person with a boombox).

  3. Receive-only mode: one could be receiving content without sharing. In this mode, no music source is needed at all; one could simply scan around to hear what others are listening to.

  4. Traditional headphones: one could just be listening to music the old-fashioned way.
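The four privacy settings above can be thought of as two independent switches: transmit and receive. A minimal sketch of that routing logic might look like this (the mode names and helper functions are illustrative, not taken from the actual firmware):

```python
from enum import Enum, auto

# Hypothetical model of the four privacy modes described above.
class PrivacyMode(Enum):
    BIDIRECTIONAL = auto()  # transmit own content, receive others'
    TRANSMIT_ONLY = auto()  # share, but ignore incoming content ("boombox")
    RECEIVE_ONLY = auto()   # listen to others, share nothing
    PRIVATE = auto()        # traditional headphones

def is_transmitting(mode):
    """True if this mode broadcasts the user's own audio."""
    return mode in (PrivacyMode.BIDIRECTIONAL, PrivacyMode.TRANSMIT_ONLY)

def is_receiving(mode):
    """True if this mode accepts other users' audio."""
    return mode in (PrivacyMode.BIDIRECTIONAL, PrivacyMode.RECEIVE_ONLY)
```

Framing the modes as a transmit/receive pair makes it clear that receive-only needs no music source, while transmit-only ignores everything coming in.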


The Standard Scenario

A common interaction with LeakyPhones would look like this:

1. User A looks in another user's direction; let's call them user B.

2. The longer user A looks at B, the more dominant B's music becomes.

3. User A's audio switches to B's music for as long as they look at user B.

4. When user A looks away from B, the audio switches back to user A's own music.
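The scenario above amounts to a gaze-driven crossfade: the fraction of B's music in A's mix ramps up with dwell time and resets when A looks away. A minimal simulation of that behavior, with an assumed three-second ramp (the real fade time is not specified in the text):

```python
# Illustrative simulation of the gaze-driven crossfade, not the actual firmware.
# The gain ramps from 0.0 (all of A's own music) to 1.0 (all of B's music)
# as A keeps looking at B, and snaps back to 0.0 when A looks away.

FADE_SECONDS = 3.0  # assumed ramp time

def crossfade_gain(dwell_time, fade_seconds=FADE_SECONDS):
    """Fraction of B's music in A's mix after `dwell_time` seconds of gaze."""
    if dwell_time <= 0:
        return 0.0  # looking away: back to A's own music
    return min(dwell_time / fade_seconds, 1.0)
```

For example, half the ramp time yields an even blend of the two streams, and anything past the ramp time holds B's music at full level.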

Leaky_figure3.jpg
Leaky_figure1.jpg

The First System Design:

The first implementation of the system was designed to provide the user with varying degrees of privacy, from a standard private headset to a fully public listening experience where users can share their content and receive other people's content by looking at them.

The first implementation of LeakyPhones was built by hacking a simple headset, using IR transmitters and receivers for gaze detection and user identification, and a radio transmitter/receiver pair for broadcasting and receiving audio from other users.

A microcontroller (Arduino) and a simple digital-potentiometer mixer were used to blend incoming audio with the user's own audio when an appropriate IR signal was received.
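The digital potentiometer essentially turns the microcontroller's blend decision into an analog mix. A rough sketch of that logic, in Python for readability (the 8-bit wiper range and the linear weighted sum are assumptions about a typical digital pot, not the actual circuit):

```python
# Sketch of the mixing logic a microcontroller might run: when a valid IR ID
# is detected, the digital potentiometer's wiper position sets how much of
# the incoming stream is blended into the user's own stream.

POT_STEPS = 255  # typical 8-bit digital potentiometer (assumed)

def pot_setting(blend):
    """Convert a 0..1 blend fraction to an 8-bit wiper position."""
    blend = max(0.0, min(1.0, blend))
    return round(blend * POT_STEPS)

def mix_sample(own, incoming, wiper):
    """Weighted sum of two audio samples, as the analog mixer would produce."""
    a = wiper / POT_STEPS
    return (1 - a) * own + a * incoming
```

On the real hardware the mix happens in the analog domain; the microcontroller only writes the wiper position, so the audio path stays simple.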


The Second Design Iteration:

The first design of LeakyPhones was problematic. The first audio mixer that I designed had some major flaws, and the location of the IR transmitter on top of the headset proved inefficient for users of different heights, even with an omnidirectional lens. In addition, the sound quality of these cheap headsets was quite poor.

To overcome these difficulties, I designed a new mixer PCB and created new earcaps for a BOSE Soundlink headset. Close attention was paid to the placement of the radio transmitter and receiver antennas to ensure good reception and audio quality.

A slightly better mixer design, incorporating two dual potentiometers, four unity-gain buffers, and three summing amplifiers.

The IR LEDs were embedded in a new 3D-printed headset earcap. I paid careful attention to the effective light cone of each individual LED so that the whole area around each user would be covered without any dead spots.
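Avoiding dead spots is ultimately a tiling problem: the LEDs' cones must cover the full 360° around the user. A back-of-the-envelope check, where the 60° cone angle is an assumed figure rather than a measured one:

```python
import math

# Rough coverage check for the IR LED ring: given each LED's effective cone
# angle, how many LEDs are needed for full 360-degree coverage? An optional
# overlap between adjacent cones guards against dead spots at the seams.

def leds_for_full_coverage(cone_angle_deg, overlap_deg=0.0):
    """Minimum LED count so adjacent cones tile 360 degrees."""
    effective = cone_angle_deg - overlap_deg
    if effective <= 0:
        raise ValueError("overlap must be smaller than the cone angle")
    return math.ceil(360.0 / effective)
```

With an assumed 60° effective cone, six evenly spaced LEDs suffice; demanding 10° of overlap between neighbors pushes the count to eight.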


The Third Design Iteration:

The system was now much more robust, but the audio quality was still not good enough. To test the different interaction scenarios I had in mind, the headset really needed to deliver the same listening experience that people are used to getting from standard headsets.

I decided to build a more robust platform for testing and experimentation with human subjects. This platform also enabled me to gather additional data, such as the duration of interactions, the identity of the participants in each interaction, and their musical choices, along with many other aspects of the interaction relevant to testing the concept.

To do so, we have started to design the A.SAP system.