H(n)MI
Day 1: Introduction to H(n)MI and soft sensors
The first day introduced the concept of the human body as both a source and medium for interaction in the digital age. We explored affective computing, embodiment, biometrics, and prototyping, with a focus on using our bodies to collect small data sets. We built soft pressure sensors from velostat and conductive fabric, connected them to an Arduino, and visualized the real-time data in Processing. This hands-on approach highlighted how bodily actions can be translated into digital feedback.



Example code for Arduino:
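A minimal sketch along these lines, assuming the velostat pad is wired as a voltage divider into analog pin A0 (the pin, baud rate, and sample rate here are illustrative, not the exact values we used):

```cpp
// Read a velostat pressure sensor wired as a voltage divider on A0
// and stream the readings over serial for visualization.
const int sensorPin = A0;   // assumed analog pin

void setup() {
  Serial.begin(9600);       // baud rate must match the Processing/p5.js side
}

void loop() {
  int pressure = analogRead(sensorPin);  // 10-bit reading, 0-1023
  Serial.println(pressure);              // one value per line
  delay(20);                             // roughly 50 samples per second
}
```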
Data visualization with Processing:
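A possible Processing counterpart, assuming the Arduino sends one value per line at 9600 baud and appears as the first serial port on the machine (the port index and the circle mapping are assumptions):

```java
// Read pressure values from serial and map them to the size of a circle.
import processing.serial.*;

Serial port;
int pressure = 0;

void setup() {
  size(400, 400);
  // Assumes the Arduino is the first listed port; adjust the index if needed.
  port = new Serial(this, Serial.list()[0], 9600);
  port.bufferUntil('\n');
  noStroke();
}

void draw() {
  background(20);
  // Map the 10-bit reading to a diameter so pressing harder grows the circle.
  float d = map(pressure, 0, 1023, 10, width);
  ellipse(width / 2, height / 2, d, d);
}

void serialEvent(Serial p) {
  String line = trim(p.readStringUntil('\n'));
  if (line != null && line.length() > 0) {
    pressure = int(line);
  }
}
```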
Day 2: Exploring p5.js and sound interaction
The second day focused on p5.js, where we created visual sketches that responded to sound inputs from the microphone. We made circles that changed size with sound levels and generated colorful, dynamic visuals. This session showed how auditory inputs could drive digital visuals, adding another dimension to interactive design.

Example code in p5.js:
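A minimal sketch in that spirit, using p5.sound's AudioIn to read the microphone level (the size and colour mappings are illustrative):

```javascript
// Circle size and hue follow the microphone level (requires the p5.sound library).
let mic;

function setup() {
  createCanvas(400, 400);
  mic = new p5.AudioIn();
  mic.start();              // the browser will ask for microphone permission
  colorMode(HSB, 255);
  noStroke();
}

function draw() {
  background(0);
  let level = mic.getLevel();                  // roughly 0.0-1.0
  let d = map(level, 0, 1, 20, width);         // louder = bigger circle
  fill(map(level, 0, 1, 120, 255), 200, 255);  // louder = warmer hue
  ellipse(width / 2, height / 2, d, d);
}

function mousePressed() {
  userStartAudio();         // audio needs a user gesture in most browsers
}
```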
Day 3: Serial communication and time-based interaction
The third session introduced serial communication between Arduino and p5.js using the Web Serial library. We controlled the size and color of a circle based on pressure sensor data and used millis() to create time-based animations. This integration of real-time data, dynamic visuals, and user interaction demonstrated how technology can reflect bodily states interactively.
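A rough sketch of that setup, written here against the browser's native Web Serial API (available in Chromium-based browsers) rather than the exact library we used in class; the pressure reading drives the circle's size and colour, while millis() drives a slow background pulse (baud rate and mappings are assumptions):

```javascript
// p5.js + Web Serial: sensor data controls a circle, millis() animates the background.
let port, reader;
let pressure = 0;
let buffer = '';

function setup() {
  createCanvas(400, 400);
  noStroke();
}

function draw() {
  // Time-based animation: background brightness pulses using millis().
  const pulse = 40 + 30 * sin(millis() / 1000);
  background(pulse);

  // Sensor-driven circle: diameter and red channel follow the 10-bit reading.
  const d = map(pressure, 0, 1023, 20, width);
  fill(map(pressure, 0, 1023, 0, 255), 100, 200);
  ellipse(width / 2, height / 2, d, d);
}

// Web Serial requires a user gesture before a port can be requested.
async function mousePressed() {
  port = await navigator.serial.requestPort();
  await port.open({ baudRate: 9600 });
  reader = port.readable.getReader();
  readLoop();
}

async function readLoop() {
  const decoder = new TextDecoder();
  while (true) {
    const { value, done } = await reader.read();
    if (done) break;
    buffer += decoder.decode(value);
    const lines = buffer.split('\n');
    buffer = lines.pop();                       // keep any partial line
    const last = lines.filter(l => l.trim().length > 0).pop();
    if (last !== undefined) pressure = int(last.trim());
  }
}
```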

We also learned that interaction is more than just reaction; it involves reciprocity and mutual influence. This connects to Andy Clark's Extended Mind theory, which suggests that cognitive systems extend beyond the mind into tools such as pencils or maps.
Final thoughts: This workshop provided essential skills in soft sensor development, data visualization, and interactive design, encouraging new ways to integrate physiological data into my design space.
Workshop and final result
For our final project in the H(n)MI workshop, our group decided to explore music interaction using p5.js. We started with a simple theremin-like digital instrument, where moving the mouse controlled the frequency and volume of an oscillator. Our first goal was to replace the mouse input with hand tracking, making the interaction more intuitive and performative.
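A simplified version of that mouse-controlled starting point, using a p5.Oscillator from p5.sound with mouseX mapped to frequency and mouseY to volume (the frequency and volume ranges are illustrative):

```javascript
// Theremin-like instrument: mouseX -> pitch, mouseY -> volume (requires p5.sound).
let osc;

function setup() {
  createCanvas(400, 400);
  osc = new p5.Oscillator('sine');
  osc.amp(0);
  osc.start();
}

function draw() {
  background(30);
  const freq = map(mouseX, 0, width, 100, 1000);  // left/right = lower/higher pitch
  const vol = map(mouseY, height, 0, 0, 0.8);     // higher on screen = louder
  osc.freq(freq, 0.1);   // short ramp times avoid clicks
  osc.amp(vol, 0.1);
  ellipse(mouseX, mouseY, 30, 30);
}

function mousePressed() {
  userStartAudio();      // unlock audio on a user gesture
}
```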
Phase 1: Hand tracking implementation
We integrated ML5.js HandPose, focusing on tracking the index fingers of both hands. Initially, one index finger replaced the mouse movement, allowing us to control pitch and volume. Then, we introduced a second index finger as a modifier, acting as an LFO (low-frequency oscillator) to alter the sound dynamically.
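A simplified sketch of that first phase, assuming the ml5.js (1.x) handPose model and its "index_finger_tip" keypoint name; one fingertip stands in for the mouse, and the second-finger LFO modifier is left out for brevity:

```javascript
// One index fingertip controls oscillator pitch and volume
// (requires the p5.sound and ml5.js libraries).
let video, handPose, osc;
let hands = [];

function preload() {
  handPose = ml5.handPose();            // ml5.js hand-tracking model
}

function setup() {
  createCanvas(640, 480);
  video = createCapture(VIDEO);
  video.size(640, 480);
  video.hide();
  handPose.detectStart(video, results => { hands = results; });

  osc = new p5.Oscillator('sine');
  osc.amp(0);
  osc.start();
}

function draw() {
  image(video, 0, 0);
  if (hands.length > 0) {
    const tip = hands[0].keypoints.find(k => k.name === 'index_finger_tip');
    if (tip) {
      osc.freq(map(tip.x, 0, width, 100, 1000), 0.1);  // horizontal = pitch
      osc.amp(map(tip.y, height, 0, 0, 0.8), 0.1);     // vertical = volume
      fill(255, 0, 0);
      circle(tip.x, tip.y, 16);
    }
  } else {
    osc.amp(0, 0.2);    // fade out when no hand is detected
  }
}

function mousePressed() {
  userStartAudio();     // audio still needs a user gesture to start
}
```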
Phase 2: Expanding the concept
As we progressed, we realized the potential of modifying both the interaction method and the sound output. To deepen our exploration, each team member developed their own hand/(non)hand-controlled instrument, leading to a final collaborative orchestra performance, which we named the "Conductive Noise Orchestra".
Andrea focused on percussive sounds and effects, using mouth opening and closing to trigger drum beats.
Ziming designed a system where finger movements controlled multiple sounds, creating complex textures.
I developed a mixing table where two index fingers controlled track transitions, adjusting volume (Y-axis) and playback speed (X-axis); a simplified sketch of this mapping is included below.
Here you can play with my instrument code online
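The core mapping of the mixing table can be sketched roughly like this, using the first two detected hands as controllers and two placeholder audio files standing in for the actual tracks (file names, ranges, and keypoint names are assumptions, not the exact code linked above):

```javascript
// Two index fingertips control two looping tracks:
// Y-axis -> volume, X-axis -> playback speed (requires p5.sound and ml5.js).
let video, handPose;
let hands = [];
let tracks = [];

function preload() {
  handPose = ml5.handPose();
  tracks.push(loadSound('trackA.mp3'));   // placeholder audio assets
  tracks.push(loadSound('trackB.mp3'));
}

function setup() {
  createCanvas(640, 480);
  video = createCapture(VIDEO);
  video.size(640, 480);
  video.hide();
  handPose.detectStart(video, results => { hands = results; });
  noFill();
}

function draw() {
  image(video, 0, 0);
  // Map up to two fingertips onto the two tracks.
  for (let i = 0; i < tracks.length; i++) {
    if (i < hands.length) {
      const tip = hands[i].keypoints.find(k => k.name === 'index_finger_tip');
      if (tip) {
        tracks[i].setVolume(map(tip.y, height, 0, 0, 1));  // higher hand = louder
        tracks[i].rate(map(tip.x, 0, width, 0.5, 1.5));    // left/right = slower/faster
        circle(tip.x, tip.y, 16);
      }
    } else {
      tracks[i].setVolume(0);   // silence a track that has no controlling hand
    }
  }
}

function mousePressed() {
  userStartAudio();
  tracks.forEach(t => { if (!t.isPlaying()) t.loop(); });  // start both loops on click
}
```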
Final performance: visual & sound integration
To enhance the live experience, we implemented real-time reactive visuals synchronized with the sounds. This turned our performance into a fully immersive audio-visual piece, blending generative art with interactive music.
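As an illustration of the idea (not our exact visuals), p5.sound's FFT can drive a simple spectrum display that reacts to whatever the sketch is currently playing; the bar count and colours here are arbitrary:

```javascript
// Sound-reactive visuals: a bar spectrum driven by p5.FFT (requires p5.sound).
let fft;

function setup() {
  createCanvas(640, 360);
  fft = new p5.FFT();       // analyses the sketch's master audio output by default
  noStroke();
}

function draw() {
  background(0);
  const spectrum = fft.analyze();   // 1024 frequency bins, values 0-255
  const bars = 64;
  const w = width / bars;
  for (let i = 0; i < bars; i++) {
    const amp = spectrum[floor(map(i, 0, bars, 0, spectrum.length))];
    const h = map(amp, 0, 255, 0, height);
    fill(map(i, 0, bars, 60, 255), 180, 220);   // colour varies across the bars
    rect(i * w, height - h, w - 2, h);
  }
}
```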
Our Conductive Noise Orchestra demonstrated how hand gestures could transform digital instruments into expressive tools, blurring the lines between body movement, technology, and sound composition.

