H(n)MI


Day 1: Introduction to H(n)MI and soft sensors

The first day introduced the concept of the human body as both a source and a medium for interaction in the digital age. We explored affective computing, embodiment, biometrics, and prototyping, with a focus on using our bodies to collect small data sets. We built soft pressure sensors from velostat and conductive fabric, connected them to an Arduino, and visualized the readings in real time with Processing. This hands-on approach highlighted how bodily actions can be translated into digital feedback.

Example code for Arduino:

int sensorPin = A0;   // Velostat pressure sensor on analog pin A0
int sensorValue;

void setup() {
  Serial.begin(9600);          // Open the serial port at 9600 baud
  pinMode(sensorPin, INPUT);
}

void loop() {
  sensorValue = analogRead(sensorPin); // Read the sensor (0-1023)
  Serial.println(sensorValue);         // Send the value over serial
  delay(1000);                         // One reading per second
}

Data visualization with Processing:

import processing.serial.*;

Serial mySerial;
String myString;
int nl = 10;   // ASCII code for the newline character
float myVal;

void setup() {
  size(800, 600);                    // Canvas size
  printArray(Serial.list());         // List available ports
  delay(1000);
  String myPort = Serial.list()[0];  // Select the first available port
  mySerial = new Serial(this, myPort, 9600);
  smooth();
}

void draw() {
  background(125, 0, 125);
  while (mySerial.available() > 0) {
    myString = mySerial.readStringUntil(nl);
    if (myString != null) {
      myVal = float(trim(myString));         // Strip the newline before parsing
      println(myVal);
      circle(width / 2, height / 2, myVal);  // Circle diameter follows the sensor value
    }
  }
}

Day 2: Exploring p5.js and sound interaction

The second day focused on p5.js, where we created visual sketches that responded to sound inputs from the microphone. We made circles that changed size with sound levels and generated colorful, dynamic visuals. This session showed how auditory inputs could drive digital visuals, adding another dimension to interactive design.

Example code in p5.js:

let mic;

function setup() {
  createCanvas(400, 400);
  mic = new p5.AudioIn(); // Microphone input
  mic.start();
}

function draw() {
  background(0);
  let vol = mic.getLevel();                             // Volume level between 0 and 1
  ellipse(width / 2, height / 2, vol * 500, vol * 500); // Circle grows with the sound level
}

Day 3: Serial communication and time-based interaction

The third session introduced serial communication between Arduino and p5.js using the Web Serial library. We controlled the size and color of a circle based on pressure sensor data and used millis() to create time-based animations. This integration of real-time data, dynamic visuals, and user interaction demonstrated how technology can reflect bodily states interactively.
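The sketch below only covers the millis() part of the session; the serial part (reading the pressure sensor in the browser) is not included in it. As a rough sketch of that side, the following uses the browser's Web Serial API directly to read the values the Day 1 Arduino prints at 9600 baud and maps them to a circle's size and colour. It is a simplified reconstruction, not the exact code we used in class.

let sensorValue = 0; // Last value received from the Arduino
let reader;

function setup() {
  createCanvas(400, 400);
}

async function mousePressed() {
  // Web Serial needs a user gesture, so we connect on the first click
  if (reader) return;
  const port = await navigator.serial.requestPort(); // Ask the user to pick the Arduino port
  await port.open({ baudRate: 9600 });
  const decoder = new TextDecoderStream();           // Turn incoming bytes into text
  port.readable.pipeTo(decoder.writable);
  reader = decoder.readable.getReader();
  readLoop();
}

async function readLoop() {
  let buffer = '';
  while (true) {
    const { value, done } = await reader.read();
    if (done) break;
    buffer += value;
    const lines = buffer.split('\n');
    buffer = lines.pop();              // Keep the incomplete last line for later
    for (const line of lines) {
      const n = parseFloat(line);
      if (!isNaN(n)) sensorValue = n;  // Store the latest reading
    }
  }
}

function draw() {
  background(0);
  // Map the 10-bit reading (0-1023) to circle size and colour
  const d = map(sensorValue, 0, 1023, 10, width);
  fill(map(sensorValue, 0, 1023, 0, 255), 0, 255);
  circle(width / 2, height / 2, d);
}

The time-based part of the session, using millis() to add a new dot every second, is the sketch below: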

let prevTime;
let actualTime;
let x;
let y;
let counter;

function setup() {
  createCanvas(640, 400);
  prevTime = millis();
  x = 20;
  y = 20;
  counter = 0;
}

function draw() {
  // background(220); // Left commented out so the dots accumulate on screen

  actualTime = millis();

  // Every second, move one step to the right and draw a new dot
  if (actualTime - prevTime >= 1000) {
    print("1 second");
    prevTime = actualTime;
    x = x + 10;
    counter = counter + 1;
  }

  // After 60 dots (one minute), start a new row
  if (counter == 60) {
    counter = 0;
    x = 20;
    y = y + 10;
  }

  fill(random(0, 255), random(0, 255), random(0, 255)); // Random colour each frame
  circle(x, y, 10);
}

function keyPressed() {
  if (key == "s") {
    save("myData" + frameCount + ".png"); // Save a snapshot of the canvas
  }
}

We also learned that interaction is more than just reaction; it involves reciprocity and mutual influence. This connects to Andy Clark's Extended Mind Theory, which suggests that cognitive systems extend beyond the brain into tools like pencils or maps.


Final thoughts: This workshop provided essential skills in soft sensor development, data visualization, and interactive design, encouraging new ways to integrate physiological data into my design space.


Workshop and final result

For our final project in the H(n)MI workshop, our group decided to explore music interaction using p5.js. We started with a simple theremin-like digital instrument, where moving the mouse controlled the frequency and volume of an oscillator. Our first goal was to replace the mouse input with hand tracking, making the interaction more intuitive and performative.
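The theremin starting point is referenced again at the end of this page; since the code itself is not embedded here, the sketch below is a minimal reconstruction of that first step, assuming p5.sound's p5.Oscillator, with the mouse mapped to pitch (X) and volume (Y).

let osc;
let playing = false;

function setup() {
  createCanvas(400, 400);
  osc = new p5.Oscillator('sine');
}

function mousePressed() {
  userStartAudio(); // Browsers require a user gesture before audio can start
  osc.start();
  playing = true;
}

function draw() {
  background(20);
  if (playing) {
    let freq = map(mouseX, 0, width, 100, 1000); // X position -> pitch
    let vol = map(mouseY, 0, height, 1, 0);      // Y position -> volume (top = loud)
    osc.freq(freq);
    osc.amp(vol, 0.1); // Short ramp to avoid clicks
  }
  circle(mouseX, mouseY, 30); // Simple cursor feedback
}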

Phase 1: Hand tracking implementation

let handPose;
let video;
let indexFinger = null; // First index finger (track 1)
let indexFinger2 = null; // Second index finger (track 2)
let audioPlayer1, audioPlayer2; // Audio players
let trail1 = []; // Trail for track 1 (green)
let trail2 = []; // Trail for track 2 (red)

function preload() {
  handPose = ml5.handPose();
}

function setup() {
  createCanvas(windowWidth, windowHeight);

  video = createCapture(VIDEO);
  video.size(640, 480);
  video.hide();
  handPose.detectStart(video, gotHands);

  // Load the audio files
  audioPlayer1 = createAudio('track1.mp3'); // Relative path to the file
  audioPlayer2 = createAudio('track2.mp3'); // Relative path to the file

  // Configure the audio players
  [audioPlayer1, audioPlayer2].forEach((player) => {
    player.loop(); // Play on a loop
    player.volume(0); // Start muted (volume 0)
    player.showControls(); // Show the player controls
  });
}

function draw() {
  // Solid black background
  background(0);

  // Draw the trail for track 1 (green)
  if (indexFinger) {
    let volume1 = map(indexFinger.y, 0, height, 1, 0); // Volume based on the Y position
    let strokeWeight1 = map(volume1, 0, 1, 2, 10); // Stroke weight based on the volume
    noFill();
    stroke(0, 255, 0); // Green
    strokeWeight(strokeWeight1);
    beginShape();
    for (let i = 0; i < trail1.length; i++) {
      vertex(trail1[i].x, trail1[i].y);
    }
    endShape();
  }

  // Draw the trail for track 2 (red)
  if (indexFinger2) {
    let volume2 = map(indexFinger2.y, 0, height, 1, 0); // Volume based on the Y position
    let strokeWeight2 = map(volume2, 0, 1, 2, 10); // Stroke weight based on the volume
    noFill();
    stroke(255, 0, 0); // Red
    strokeWeight(strokeWeight2);
    beginShape();
    for (let i = 0; i < trail2.length; i++) {
      vertex(trail2[i].x, trail2[i].y);
    }
    endShape();
  }

  // Control track 1 with the first index finger
  if (indexFinger) {
    // Control the speed (X axis)
    let speed1 = map(indexFinger.x, 0, width, 0.5, 2); // Map X to playback speed (0.5x - 2x)
    audioPlayer1.speed(speed1);

    // Control the volume (Y axis)
    let volume1 = map(indexFinger.y, 0, height, 1, 0); // Map Y to volume (1 to 0, inverted)
    volume1 = constrain(volume1, 0, 1); // Keep the volume in the [0, 1] range
    audioPlayer1.volume(volume1);

    // Store the current fingertip position in the trail
    trail1.push({ x: indexFinger.x, y: indexFinger.y });

    // Adjust the maximum trail length according to the speed
    let maxTrailLength1 = map(speed1, 0.5, 2, 1000, 100); // More speed = fewer points
    if (trail1.length > maxTrailLength1) {
      trail1.shift(); // Cap the trail size
    }
  } else {
    // If the finger is not detected, silence track 1
    audioPlayer1.volume(0);
  }

  // Control track 2 with the second index finger
  if (indexFinger2) {
    // Control the speed (X axis)
    let speed2 = map(indexFinger2.x, 0, width, 0.5, 2); // Map X to playback speed (0.5x - 2x)
    audioPlayer2.speed(speed2);

    // Control the volume (Y axis)
    let volume2 = map(indexFinger2.y, 0, height, 1, 0); // Map Y to volume (1 to 0, inverted)
    volume2 = constrain(volume2, 0, 1); // Keep the volume in the [0, 1] range
    audioPlayer2.volume(volume2);

    // Store the current fingertip position in the trail
    trail2.push({ x: indexFinger2.x, y: indexFinger2.y });

    // Adjust the maximum trail length according to the speed
    let maxTrailLength2 = map(speed2, 0.5, 2, 1000, 100); // More speed = fewer points
    if (trail2.length > maxTrailLength2) {
      trail2.shift(); // Cap the trail size
    }
  } else {
    // If the finger is not detected, silence track 2
    audioPlayer2.volume(0);
  }

  // Draw the track timelines
  drawTimeline();
}

// Function that draws the progress bar for each track
function drawTimeline() {
  let timelineHeight = 20;
  let timelineY = height - timelineHeight - 10;

  // Track 1 timeline (left)
  let progress1 = audioPlayer1.time() / audioPlayer1.duration();
  fill(0, 255, 0); // Green
  rect(10, timelineY, width / 2 - 20, timelineHeight);
  fill(255);
  rect(10, timelineY, (width / 2 - 20) * progress1, timelineHeight);

  // Track 2 timeline (right)
  let progress2 = audioPlayer2.time() / audioPlayer2.duration();
  fill(255, 0, 0); // Red
  rect(width / 2 + 10, timelineY, width / 2 - 20, timelineHeight);
  fill(255);
  rect(width / 2 + 10, timelineY, (width / 2 - 20) * progress2, timelineHeight);
}

// Callback that receives the hand-detection results
function gotHands(results) {
  if (results.length > 0) {
    // First index finger (track 1)
    let hand1 = results[0];
    let keypoints1 = hand1.keypoints;
    indexFinger = keypoints1.find(point => point.name === 'index_finger_tip');

    if (indexFinger) {
      // Flip the X coordinate to compensate for the mirrored camera
      indexFinger.x = width - map(indexFinger.x, 0, video.width, 0, width);
      indexFinger.y = map(indexFinger.y, 0, video.height, 0, height);
    }

    // Second index finger (track 2)
    if (results.length > 1) {
      let hand2 = results[1];
      let keypoints2 = hand2.keypoints;
      indexFinger2 = keypoints2.find(point => point.name === 'index_finger_tip');

      if (indexFinger2) {
        // Flip the X coordinate to compensate for the mirrored camera
        indexFinger2.x = width - map(indexFinger2.x, 0, video.width, 0, width);
        indexFinger2.y = map(indexFinger2.y, 0, video.height, 0, height);
      }
    } else {
      indexFinger2 = null; // If there is no second hand, reset the second finger
    }
    }
  } else {
    indexFinger = null;
    indexFinger2 = null;
  }
}
The index.html for the sketch loads p5.js, p5.sound, and ml5.js:

<!DOCTYPE html>
<html lang="en">
  <head>
    <meta charset="UTF-8" />
    <meta http-equiv="X-UA-Compatible" content="IE=edge" />
    <meta name="viewport" content="width=device-width, initial-scale=1.0" />
    <title>Hand-Controlled Oscillator</title>
    
    <!-- p5.js and p5.sound libraries -->
    <script src="https://cdnjs.cloudflare.com/ajax/libs/p5.js/1.11.1/p5.js"></script>
    <script src="https://cdnjs.cloudflare.com/ajax/libs/p5.js/1.11.1/addons/p5.sound.min.js"></script>
    
    <!-- ml5.js library for hand detection -->
    <script src="https://unpkg.com/ml5@1/dist/ml5.min.js"></script>
  </head>
  <body>
    <script src="sketch.js"></script>
  </body>
</html>

Phase 2: Expanding the concept

As we progressed, we realized the potential of modifying both the interaction method and the sound output. To deepen our exploration, each team member developed their own hand- (or non-hand-) controlled instrument, leading to a final collaborative performance that we named the "Conductive Noise Orchestra":

  • Andrea focused on percussive sounds and effects, using mouth opening and closing to trigger drum beats.

  • Ziming designed a system where finger movements controlled multiple sounds, creating complex textures.

  • I developed a mixing table where two index fingers controlled track transitions, adjusting volume (Y-axis) and playback speed (X-axis).

Final performance: visual & sound integration

To enhance the live experience, we implemented real-time reactive visuals synchronized with the sounds. This turned our performance into a fully immersive audio-visual piece, blending generative art with interactive music.
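As a simplified illustration of the idea (not our performance code), a p5.Amplitude level-follower from p5.sound can drive a shape that pulses with whatever is playing; the file name track1.mp3 is just reused from the mixing-table sketch above.

let song, amp;

function preload() {
  song = loadSound('track1.mp3'); // One of the tracks from the mixing table
}

function setup() {
  createCanvas(640, 400);
  amp = new p5.Amplitude(); // Follows the overall output level
}

function mousePressed() {
  userStartAudio();
  if (!song.isPlaying()) song.loop();
}

function draw() {
  background(0, 40);          // Translucent background leaves short trails
  let level = amp.getLevel(); // Roughly 0..1
  let d = map(level, 0, 0.5, 20, height);
  noStroke();
  fill(0, 255, 0, 180);
  circle(width / 2, height / 2, d);
}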

Our Conductive Noise Orchestra demonstrated how hand gestures could transform digital instruments into expressive tools, blurring the lines between body movement, technology, and sound composition.

Here is the code that uses the mouse as a theremin.

Here is the link to the p5 sketch connected to serial.

Here is the link to our group repository with all the files and info.

We integrated ML5.js HandPose, focusing on tracking the index fingers of both hands. Initially, one index finger replaced the mouse movement, allowing us to control pitch and volume. Then, we introduced a second index finger as a modifier, acting as an LFO (low-frequency oscillator) to alter the sound dynamically.
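As a simplified illustration of that idea (not the final instrument), the sketch below keeps the theremin mapping on the first finger and uses the second finger's Y position to set the rate of a hand-rolled LFO computed with sin() and millis(). The variables finger1 and finger2 are placeholders standing in for the fingertip positions that gotHands() would provide, as in the hand-tracking code above.

let osc;
let finger1 = { x: 200, y: 200 }; // Placeholder; in practice updated from handPose results
let finger2 = null;               // Second fingertip, or null when only one hand is detected

function setup() {
  createCanvas(400, 400);
  osc = new p5.Oscillator('sine');
}

function mousePressed() {
  userStartAudio();
  osc.start();
}

function draw() {
  background(0);
  // First finger behaves like the mouse did: pitch on X, volume on Y
  let baseFreq = map(finger1.x, 0, width, 100, 1000);
  let vol = map(finger1.y, 0, height, 1, 0);
  // Second finger adds a slow wobble: its Y position sets the LFO rate
  if (finger2) {
    let lfoRate = map(finger2.y, 0, height, 0.5, 10);         // Hz
    baseFreq += sin(millis() / 1000 * TWO_PI * lfoRate) * 30; // +/- 30 Hz depth
  }
  osc.freq(baseFreq);
  osc.amp(vol, 0.1);
}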

You can play with my instrument code online here.
