Worked with other teams at Bell Labs to define research-based humanistic principles for designing human-robot interaction. Defined three key pillars, Intent, Agency, and Relationships, for a ten-year plan addressing robots in industrial and consumer contexts. Supplemented by ROS development for language functionality, sound design, and more.
Get Life/Love's Work
Led collaboration between Ed Atkins and the 3D graphics team at the Antwerp location of Bell Labs. Ed's experimental use of CGI has produced an incredible body of work, and we wanted to expose him to the best of our research to inspire future pieces. The resulting solo show at the New Museum runs from June 30 to October 3, 2021.
Performed research on ultrasonic sensing capabilities for robots. Using a deep convolutional neural network, the location and identity of objects around a microphone could be determined solely from recordings of a room filled with ultrasonic chirps.
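The project itself used a deep CNN over room recordings; as a minimal sketch of the underlying sensing principle only (not the research method), here is time-of-flight ranging from a simulated ultrasonic chirp echo, with illustrative sample rates, delays, and noise levels chosen for the example:

```python
import numpy as np

def chirp(fs, dur, f0, f1):
    """Linear chirp sweeping f0 -> f1 Hz over dur seconds."""
    t = np.arange(int(fs * dur)) / fs
    return np.sin(2 * np.pi * (f0 * t + 0.5 * (f1 - f0) / dur * t**2))

def echo_delay_samples(recording, reference):
    """Estimate echo arrival by cross-correlating with the emitted chirp."""
    corr = np.correlate(recording, reference, mode="valid")
    return int(np.argmax(corr))

fs = 192_000                            # sample rate high enough for ultrasound
ref = chirp(fs, 0.005, 30_000, 60_000)  # 5 ms ultrasonic sweep

# Simulate a room recording: an attenuated echo arriving 600 samples late,
# plus a little sensor noise.
true_delay = 600
recording = np.zeros(true_delay + len(ref) + 100)
recording[true_delay:true_delay + len(ref)] += 0.3 * ref
recording += 0.01 * np.random.default_rng(0).standard_normal(len(recording))

delay = echo_delay_samples(recording, ref)
distance_m = delay / fs * 343 / 2       # speed of sound, halved for round trip
```

In the actual research, a learned model replaces this hand-built correlation step and additionally infers object identity from the echo structure.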
Secret Garden was the culmination of Stephanie Dinkins's residency at Bell Labs, with ideas developed through many conversations with leading AI researchers. The resulting exhibition and 3D online experience presented narratives of generations of Black women. Shown at the Sundance Film Festival 2021.
Bell Labs project to build the future of software for location understanding. Built a functioning Unity prototype used for final designs, tracing different scenarios related to the tracking of personnel and deployment of equipment in COVID-19 use cases.
Columbia Computer Music Center
Project with Brad Garton to port the real-time synthesis language RTcmix into the Unity game engine. The resulting open-source package was the first true integration of a synthesis language with a commercial game engine, giving unprecedented creative control to musicians and game designers who want to create music algorithmically in 3D environments.
Package can be downloaded here: http://rtcmix.org/urtcmix/
Long-term Bell Labs project to experiment with the next medium for deep empathic communication. Current communications technologies are vastly inferior to in-person meetings for information exchange, and even worse for any conversation of emotional importance. Many experiments and investigations were done on the potential for musical and haptic communication. For more information on my own contribution, see the Research page.
Project with Fei Liu to create a video and performance using the Rethink Robotics Sawyer Arm. Sawyer was programmed using the ROS ecosystem to learn and perform tasks as part of an artist-written script, and the resulting video was shown at the 2021 Bell Labs and New Inc showcase: https://furtherexperiments.rhizome.org/
System created in collaboration with Lainie Fefferman to bridge the gap between concert audiences and performers by turning the cell phones of guests into loudspeakers. System reengineered to use Bell Labs IoT technology and adapted for many public concerts over several years, allowing for flexible deployment over the web and audience interaction.
Mark Ramos & Ziyang Wu
Mark Ramos and Ziyang Wu's innovative take on data visualization. Sensor data from Bell Labs robots and IoT infrastructure was streamed through databases to a 3D program that grew an organic ecosystem from the inputs. A completely fresh take on how to understand complex industrial data. The piece was shown at the 2021 Bell Labs and New Inc showcase and can be accessed there: https://furtherexperiments.rhizome.org/
Collaboration with Reeps One to enhance his We Speak Music performance with a prototype haptic device from Bell Labs, the wearable Sleeve. Worked with the artist to create customized and synced haptic performances with his music, including a novel triggering program which allowed him greater artistic control. An important advance for the potential of haptics in performance.
Future X Tours
Various Hololens projects used to enhance the Bell Labs FutureX Lab tour experiences. Customers from major carriers and government organizations were shown E.A.T.-designed AR experiences to better convey Bell Labs' approach to 5G and edge computing.
Led collaboration between Kalle Rasinkangas, the Finnish National Opera and Ballet, and various Bell Labs teams in Finland to help create this innovative VR piece. Common Domain is a mixed reality installation set in worlds of non-existent, but possible, operas. The piece incorporates unique moveable displays that allow the audience to experience the virtual environment outside of the VR headset. Premiered in Helsinki in September 2020.
Created computer vision overlays for Andrew Demirjian's drone video project Recalibrating, a piece of experimental science fiction created with a Bell Labs drone that explores the nature of artificial curiosity. The piece was shown at the 2021 Bell Labs and New Inc showcase and can be accessed there: https://furtherexperiments.rhizome.org/
Unique visuals for a video and music piece created by Seth Cluett and the International Contemporary Ensemble (ICE). A special composition and performance showcasing the emotional journey of around-the-world sailor Alex Thomson, who raced across the ocean with the help of Bell Labs technology.
Being in Real-Time
Collaboration with Sarah Rothberg using Bell Labs prototype hardware to reflect on the interaction between computer systems and the personal environment. Created custom multi-server pipeline to handle image and speech data, as well as custom neural nets to recognize and understand objects from the artist's space using small dataset training.
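The piece relied on custom neural nets trained on a small dataset of objects from the artist's space; as a hedged illustration of the small-data recognition idea only (not the project's actual models), here is a nearest-centroid classifier over hypothetical, made-up 2-D "embedding" points, where a real system would use features from a pretrained network:

```python
import math
import random

# Hypothetical few-shot data: five noisy examples per class around
# invented class centers standing in for embedding vectors.
random.seed(1)
centers = {"plant": (1.0, 0.0), "lamp": (0.0, 1.0)}
train = {
    name: [(cx + random.gauss(0, 0.05), cy + random.gauss(0, 0.05))
           for _ in range(5)]
    for name, (cx, cy) in centers.items()
}

def classify(point, train):
    """Nearest-centroid: average each class's few examples, pick the closest."""
    centroids = {
        name: (sum(p[0] for p in pts) / len(pts),
               sum(p[1] for p in pts) / len(pts))
        for name, pts in train.items()
    }
    return min(centroids, key=lambda name: math.dist(point, centroids[name]))

label = classify((0.9, 0.1), train)
```

Nearest-centroid methods like this are a common baseline when only a handful of labeled examples per object are available, which is why small-dataset training leans on strong pretrained features rather than training from scratch.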
Interactive installation by daisy*. Using Arduino, Kinect, and various IoT connections, we made an experience that allows the user to control the movement and colors of robotic arms. The piece has been shown at many events throughout Japan. More details and images can be seen here.