
Worked with other teams in Bell Labs to define research-based humanistic principles for how to design human-robot interaction. Defined three key pillars - Intent, Agency, and Relationships - for a 10-year plan addressing robots in industrial and consumer contexts. Supplemented by ROS development for language functionality, sound design, and more.

Human-Robot Interaction

Bell Labs

Get Life/Love's Work

Ed Atkins


Led a collaboration between Ed Atkins and the 3D graphics team at the Antwerp location of Bell Labs. Ed's experimental use of CGI has produced an incredible body of work, and we wanted to expose him to the best of our research to inspire future pieces. The resulting solo show at the New Museum runs from June 30th to October 3rd, 2021.

Ultrasonic Sensing

Bell Labs


Performed research on ultrasonic sensing capabilities for robots. Using a deep convolutional neural network, the location and identity of objects around a microphone could be determined solely from recordings of ultrasonic chirps echoing around a room.
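The blurb doesn't specify the pipeline, but a typical front end for this kind of acoustic sensing turns chirp recordings into spectrogram images that a CNN can consume. A minimal NumPy sketch, with illustrative (not the project's actual) sample rate and chirp parameters:

```python
import numpy as np

def spectrogram(signal, frame_len=256, hop=128):
    """Magnitude spectrogram: the time-frequency image a CNN
    could take as input for object localization/identification."""
    window = np.hanning(frame_len)
    n_frames = 1 + (len(signal) - frame_len) // hop
    frames = np.stack([signal[i * hop : i * hop + frame_len] * window
                       for i in range(n_frames)])
    return np.abs(np.fft.rfft(frames, axis=1))  # (n_frames, frame_len//2 + 1)

# Hypothetical ultrasonic chirp sampled at 192 kHz (illustrative values).
fs = 192_000
t = np.arange(0, 0.02, 1 / fs)
chirp = np.sin(2 * np.pi * (30_000 + 1e6 * t) * t)  # rising ultrasonic sweep
spec = spectrogram(chirp)
print(spec.shape)  # one spectrogram "image" per recording
```

Each recording becomes a fixed-size 2D array, so a standard image-classification CNN can be applied with little modification.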

Secret Garden

Stephanie Dinkins


Secret Garden was the culmination of Stephanie Dinkins's residency at Bell Labs, with ideas developed through many conversations with leading AI researchers. The resulting exhibition and 3D online experience presented narratives of generations of Black women. Shown at the Sundance Film Festival 2021.

Building Visualization

Bell Labs

Bell Labs project to build the future of software for location understanding. Built a functioning Unity prototype used for the final designs, tracing different scenarios related to tracking personnel and deploying equipment in COVID-19 use cases.

uRTcmix

Columbia Computer Music Center


Project with Brad Garton to port the real-time synthesis language RTcmix into the Unity game engine. The resulting open-source package was the first true integration of a synthesis language with a commercial game engine, giving unprecedented creative control to musicians and game designers who want to create music algorithmically in 3D environments.

Package can be downloaded here: http://rtcmix.org/urtcmix/
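To illustrate the kind of algorithmic control a language like RTcmix enables, here is a hedged sketch that generates a score as text. WAVETABLE is a standard RTcmix instrument, though the parameter layout shown (start, duration, amplitude, frequency) is simplified for illustration; inside Unity, uRTcmix receives this kind of score from C# scripts rather than from Python:

```python
import random

def make_score(n_notes=4, seed=7):
    """Generate a small RTcmix-style score algorithmically:
    random durations and pitches from a fixed palette."""
    random.seed(seed)
    lines = ['load("WAVETABLE")']
    start = 0.0
    for _ in range(n_notes):
        dur = random.choice([0.25, 0.5, 1.0])
        freq = random.choice([220, 330, 440, 660])
        lines.append(f"WAVETABLE({start}, {dur}, 20000, {freq})")
        start += dur  # notes play back to back
    return "\n".join(lines)

print(make_score())
```

Because the score is just text built at runtime, game state (player position, events, physics) can drive the musical choices.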

Empathic Communication

Bell Labs


Long-term Bell Labs project to experiment with the next medium for deep empathic communication. Current communications technologies are vastly inferior to in-person meetings for information exchange, and even worse for any conversation of emotional importance. Many experiments and investigations were conducted on the potential of musical and haptic communication. For more information on my own contribution, see the Research page.

Passing

Fei Liu


Project with Fei Liu to create a video and performance using the Rethink Robotics Sawyer arm. Sawyer was programmed using the ROS ecosystem to learn and perform tasks as part of an artist-written script, and the resulting video was shown at the 2021 Bell Labs and New Inc showcase: https://furtherexperiments.rhizome.org/
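In ROS terms, an artist-written script like this often reduces to named waypoints executed in sequence. The following is a hypothetical sketch of that structure only; the waypoint values and `move_fn` hook are invented for illustration, and no actual ROS or Intera API calls are made:

```python
# Illustrative joint-angle waypoints (radians); not the project's real values.
WAYPOINTS = {
    "rest":  [0.0, -1.0, 0.0, 1.5, 0.0, 0.5, 0.0],
    "reach": [0.4, -0.6, 0.2, 1.2, 0.1, 0.8, 0.0],
}

def run_script(script, move_fn):
    """Execute an artist-written script: each step names a waypoint.
    `move_fn` would wrap the robot's actual motion command (e.g. via ROS)."""
    for step in script:
        move_fn(step, WAYPOINTS[step])

# Dry run with a logging stub instead of a real robot.
log = []
run_script(["rest", "reach", "rest"], lambda name, joints: log.append(name))
print(log)  # ['rest', 'reach', 'rest']
```

Keeping the script as data lets the artist edit the performance without touching the robot-control code.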

C4C

Bell Labs


System created in collaboration with Lainie Fefferman to bridge the gap between concert audiences and performers by turning guests' cell phones into loudspeakers. The system was re-engineered to use Bell Labs IoT technology and adapted for many public concerts over several years, allowing flexible deployment over the web and audience interaction.
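A core problem in turning many phones into one distributed loudspeaker is clock synchronization: each phone must translate a server-scheduled start time into its own clock. The sketch below assumes an NTP-style offset estimate from one request/response round trip; the function names and numbers are hypothetical, not the system's actual protocol:

```python
def estimate_offset(t_send, t_server, t_recv):
    """Estimate local_clock - server_clock from one round trip,
    assuming the server stamped the midpoint of the exchange."""
    return (t_send + t_recv) / 2.0 - t_server

def local_play_time(server_play_at, offset):
    """Translate a server-scheduled start time into this phone's clock
    so every phone begins playback at the same real-world moment."""
    return server_play_at + offset

# Example round trip: sent at local 10.0 s, server stamped 109.95 s,
# reply received at local 10.1 s -> local clock is ~99.9 s behind.
offset = estimate_offset(10.0, 109.95, 10.1)
print(local_play_time(200.0, offset))  # when (locally) to start playing
```

Averaging several round trips and discarding outliers tightens the estimate further, which matters when dozens of phones must sound as one.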

Networked Ecosystem

Mark Ramos & Ziyang Wu


Mark Ramos and Ziyang Wu's innovative take on data visualization. Sensor information from Bell Labs robots and IoT infrastructure was sent through databases to a 3D program that grows an organic ecosystem based on the inputs. A completely fresh take on how to understand complex industrial data. The piece was shown at the 2021 Bell Labs and New Inc showcase and can be accessed here: https://furtherexperiments.rhizome.org/
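The bridge from industrial sensors to an organic ecosystem is, at minimum, a normalization step: raw readings become bounded growth factors the 3D program can map to branch length, color, or density. A minimal sketch; the sensor names and ranges are illustrative, not the artists' actual schema:

```python
def growth_params(readings, bounds):
    """Normalize raw sensor readings into 0..1 growth factors,
    clamping values that fall outside the expected range."""
    params = {}
    for key, value in readings.items():
        lo, hi = bounds[key]
        params[key] = min(1.0, max(0.0, (value - lo) / (hi - lo)))
    return params

print(growth_params({"temperature": 25.0, "motion": 3.0},
                    {"temperature": (0.0, 50.0), "motion": (0.0, 10.0)}))
# temperature -> 0.5, motion -> 0.3
```

Keeping every factor in 0..1 means the visualization side can remap freely (growth rate, hue, particle count) without caring about physical units.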

Haptic Music

Reeps One


Collaboration with Reeps One to enhance his We Speak Music performance with a prototype haptic device from Bell Labs, the wearable Sleeve. Worked with the artist to create customized haptic performances synced with his music, including a novel triggering program that gave him greater artistic control. An important advance for the potential of haptics in performance.
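One simple way to trigger haptic pulses from music (not necessarily the method used here) is crude energy-based onset detection: fire a pulse whenever a frame's energy jumps sharply past the previous frame's. A sketch, with made-up frame size and threshold:

```python
import numpy as np

def haptic_triggers(audio, fs, frame=512, threshold=2.0):
    """Return times (seconds) where frame energy jumps past `threshold`
    times the previous frame - candidate moments to fire a haptic pulse."""
    n = len(audio) // frame
    energy = (audio[:n * frame].reshape(n, frame) ** 2).sum(axis=1)
    triggers = []
    for i in range(1, n):
        # Require both a relative jump and a minimum absolute level.
        if energy[i] > threshold * energy[i - 1] and energy[i] > 1e-3:
            triggers.append(i * frame / fs)
    return triggers
```

A performer-facing triggering program would add manual overrides on top of this, so detected onsets suggest rather than dictate the haptic score.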

Future X Tours

Bell Labs


Various HoloLens projects used to enhance the Bell Labs FutureX Lab tour experience. Customers from major carriers and government organizations were shown E.A.T.-designed AR experiences to better convey Bell Labs' approach to 5G and edge computing.

Common Domain

Kalle Rasinkangas


Led a collaboration between Kalle Rasinkangas, the Finnish National Opera and Ballet, and various Bell Labs teams in Finland to help create this innovative VR piece. Common Domain is a mixed-reality installation set in worlds of non-existent but possible operas. The piece incorporates unique movable displays that allow the audience to experience the virtual environment outside of the VR headset. It premiered in Helsinki in September 2020.

Recalibrating

Andrew Demirjian


Created computer vision overlays for Andrew Demirjian's drone video project Recalibrating, a piece of experimental science fiction made with a Bell Labs drone that explores the nature of artificial curiosity. The piece was shown at the 2021 Bell Labs and New Inc showcase and can be accessed here: https://furtherexperiments.rhizome.org/

Coriolis

Seth Cluett


Unique visuals for a video and music piece created by Seth Cluett and the International Contemporary Ensemble (ICE): a special composition and performance showcasing the emotional journey of around-the-world sailor Alex Thomson, who raced across the ocean with the help of Bell Labs technology.

Being in Real-Time

Sarah Rothberg


Collaboration with Sarah Rothberg using Bell Labs prototype hardware to reflect on the interaction between computer systems and the personal environment. Created a custom multi-server pipeline to handle image and speech data, as well as custom neural nets, trained on small datasets, that recognize and understand objects from the artist's space.
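One common way to make recognition work with only a handful of examples per object (offered here as an illustration, not necessarily the project's method) is nearest-centroid classification on embeddings from a pretrained network. A sketch with toy 2-D "embeddings" standing in for real feature vectors:

```python
import numpy as np

def fit_centroids(embeddings, labels):
    """Average each class's embedding vectors - viable even when
    only a few examples per object exist."""
    classes = sorted(set(labels))
    return {c: np.mean([e for e, l in zip(embeddings, labels) if l == c],
                       axis=0)
            for c in classes}

def predict(centroids, embedding):
    """Assign the class whose centroid is nearest in embedding space."""
    return min(centroids,
               key=lambda c: np.linalg.norm(embedding - centroids[c]))

# Toy 2-D embeddings; in practice these would come from a pretrained net.
X = [np.array([0.0, 1.0]), np.array([0.1, 0.9]),
     np.array([1.0, 0.0]), np.array([0.9, 0.1])]
y = ["plant", "plant", "mug", "mug"]
cents = fit_centroids(X, y)
print(predict(cents, np.array([0.05, 0.95])))  # -> plant
```

Because fitting is just averaging, new objects from the artist's space can be added with a few photos and no retraining of the embedding network.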

LazyArms

daisy*


Interactive installation by daisy*. Using Arduino, Kinect, and various IoT connections, we made an experience that allows the user to control the movement and colors of robotic arms. The piece has been shown at many events throughout Japan. More details and images can be seen here.
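The glue in this kind of installation is usually a small mapping from Kinect body coordinates to Arduino servo angles. A minimal sketch, with an invented coordinate range (the installation's actual calibration would differ):

```python
def hand_to_servo(x, x_min=-0.6, x_max=0.6, servo_min=0, servo_max=180):
    """Map a Kinect hand coordinate (meters, illustrative range) onto
    a servo angle, clamped to the servo's travel."""
    x = max(x_min, min(x_max, x))            # clamp to tracked range
    frac = (x - x_min) / (x_max - x_min)     # normalize to 0..1
    return int(round(servo_min + frac * (servo_max - servo_min)))

print(hand_to_servo(0.0))   # centered hand -> 90
print(hand_to_servo(1.0))   # beyond range clamps to 180
```

The resulting angle would then be sent to the Arduino over serial or an IoT connection, one value per tracked joint.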