Microchallenges



Microchallenge I: Bio-electronic interface for body response

For this microchallenge, Carlos, Ziming, and I collaborated, merging our diverse interests to create an innovative bio-electronic interface. Carlos brought a strong focus on healthcare applications, exploring how technology could enhance well-being and monitor physiological signals. Ziming and I leaned more towards artistic expression and body sensing, aiming to capture and communicate emotional responses through wearable technology. This fusion of perspectives inspired the concept of a device that not only monitors the body but also serves as a form of self-expression.

The initial idea began as a bio-reactive tattoo designed to reflect bodily changes through color and movement. However, as our exploration deepened, the concept evolved into a more complex wearable interface that could integrate both biological sensing and dynamic visual feedback. This shift allowed us to experiment with multiple materials, sensors, and design approaches, leading to a versatile prototype.

The device features two key layers: a pH-reactive hydrogel with Bromothymol Blue (BTB) that changes color based on sweat acidity, visualizing the body's pH in real time, and a ferrofluid microfluidic circuit that moves in response to electromagnetic fields controlled by a galvanic skin response (GSR) sensor, reflecting stress and emotional fluctuations.

The project was divided into several tasks:

  1. Hydrogel Development: We experimented with BTB as a pH indicator, testing both agar-agar and sodium alginate to create a reactive hydrogel. While agar-agar showed good reactivity, it was unstable over time, prone to fracturing, and retained moisture. Sodium alginate, though slower to dry, proved more stable and transparent, making it a better option for future iterations.

  2. Ferrofluid Creation: Producing stable ferrofluid was challenging. We initially used iron oxide but later switched to metal shavings, which were more magnetic. The best suspension medium was supersaturated saltwater, which stabilized the ferrofluid and prevented separation. We learned that the ferrofluid must be fully submerged to maintain its integrity.

  3. Circuit and GSR Integration: We built a circuit using a GSR sensor to detect changes in skin conductivity, intending to control electromagnets for ferrofluid movement. However, the GSR sensor produced overly stable signals, insufficient to drive the electromagnets effectively. We adapted by designing a variable-intensity driver circuit that could work with more responsive sensors, and considered a simpler on/off control as a fallback (a sketch of the intended mapping follows this list).

  4. Wearable Design and Molding: The device was designed to be worn on the neck, with 3D-printed molds used for silicone casting. Some materials proved too fragile, and we found that thinner silicone walls improved the electromagnetic response. Adjustments are still needed to optimize the mold design, possibly replacing the hydrogel with liquid-filled capsules for clearer sweat visualization.
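Since the raw GSR signal proved too stable to drive the coils directly, the mapping we were aiming for can be sketched in Python. This is an illustrative simulation only: the real prototype would run on a microcontroller, and read_gsr() and set_coil_pwm() are hypothetical stand-ins for the actual sensor read and electromagnet driver.

import random
import time

def read_gsr():
    """Hypothetical stand-in for an ADC read of the GSR sensor (0-1023)."""
    return 500 + random.randint(-30, 30)  # real GSR values drift in a narrow band

def set_coil_pwm(duty):
    """Hypothetical stand-in for the electromagnet driver (0-100 % duty)."""
    print(f"coil duty: {duty:.1f} %")

# Raw GSR values barely move, so instead of mapping them directly we track
# a slow rolling baseline and amplify deviations from it.
baseline = read_gsr()
ALPHA = 0.05  # baseline smoothing factor
GAIN = 2.0    # amplification of deviations

for _ in range(20):
    value = read_gsr()
    baseline = (1 - ALPHA) * baseline + ALPHA * value
    duty = min(100.0, abs(value - baseline) * GAIN)  # clamp to valid PWM range
    set_coil_pwm(duty)
    time.sleep(0.1)

The rolling baseline is what compensates for the signal's flatness: only deviations from it, rather than the absolute value, modulate the coil.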

Key Takeaways:

  • BTB is highly reactive and offers reversible color changes, but its compatibility with agar-agar is limited.

  • Ferrofluid requires precise stabilization, with saltwater providing the best results.

  • GSR sensors need further calibration or alternative sensors for effective ferrofluid control.

  • Design iterations are crucial for improving material durability and system responsiveness.

Although the project remains incomplete, the extensive experimentation provided valuable insights into the technical and material challenges, laying a strong foundation for future development.




Microchallenge II: Form Follows Life

Introduction

For this microchallenge, developed alongside Paula Rydel and Maithili Sathe, we explored the concept of intelligence by integrating artificial intelligence (AI) with biological intelligence. Our project analyzed the movement patterns of ants using computer vision and translated their paths into G-code to 3D print an object in clay. By doing so, we allowed the ants' natural behaviors to shape the final design, ultimately returning the printed structure to their environment.

Concept and system diagram

The project aimed to capture and interpret the paths ants naturally follow, ensuring that their transit zones remained open while other regions were defined through negative space. The workflow involved:

  1. Tracking ant movement – Recording video footage of ants in motion.

  2. Image mapping in Python – Using OpenCV to overlay their movement into a single image.

  3. 3D surface generation – Converting the mapped movement into a surface using Firefly in Grasshopper.

  4. G-code generation – Processing the 3D surface in Grasshopper for clay printing.

  5. Clay printing & returning to ants – Fabricating the piece and placing it back in the ants’ environment.

Step-by-step process

1. Early experimentation

We initially tested P5.js to map ant movement, which produced a point cloud. While promising, the point cloud proved complex to handle in Grasshopper. We then explored direct image processing in Grasshopper before transitioning to Python and OpenCV for better control.

2. Final mapping with Python & OpenCV

Using OpenCV in Python, we processed a video of moving ants to generate a movement-based overlay. The process involved:

  • Detecting ants as dark regions and tracking their centroids.

  • Compiling movement frames into a final image.

  • Experimenting with heatmaps to highlight frequent dwell areas.

  • Applying Gaussian filtering to unify scattered points into continuous paths.

import cv2
import numpy as np

# Path to the video (adjust to your file)
video_path = "C:/Users/javis/OneDrive/Escritorio/ant/video.mp4"

# Load the video
cap = cv2.VideoCapture(video_path)

# Check that the video opened correctly
if not cap.isOpened():
    print("❌ ERROR: Could not open the video.")
    exit()

# Read the first frame to get the image dimensions
ret, first_frame = cap.read()
if not ret:
    print("❌ ERROR: Could not read the first frame of the video.")
    cap.release()
    exit()

height, width, _ = first_frame.shape

# Create a black base image to accumulate the ant trails
overlay = np.zeros((height, width), dtype=np.uint8)

while cap.isOpened():
    ret, frame = cap.read()
    if not ret:  # No more frames, stop
        break

    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

    # Ant detection (adjust the threshold if necessary)
    _, thresh = cv2.threshold(gray, 60, 255, cv2.THRESH_BINARY_INV)

    # Detect contours (ants)
    contours, _ = cv2.findContours(thresh, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)

    for cnt in contours:
        x, y, w, h = cv2.boundingRect(cnt)
        cx, cy = x + w // 2, y + h // 2  # Centroid of the ant

        # Draw a larger area (10 px) to simulate the transit region
        cv2.circle(overlay, (cx, cy), 10, 255, -1)

cap.release()

# Apply a Gaussian filter to smooth and expand the regions
smoothed_overlay = cv2.GaussianBlur(overlay, (101, 101), 0)  # Increase the kernel to expand further

# Binarize the image to leave only black and white
_, binary_overlay = cv2.threshold(smoothed_overlay, 50, 255, cv2.THRESH_BINARY)

# Save the resulting black-and-white image
output_path = "C:/Users/javis/OneDrive/Escritorio/ant/trayectoria_hormigas_bn.png"
cv2.imwrite(output_path, binary_overlay)

print(f"✅ Image saved to {output_path}")
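The heatmap experiments mentioned above followed the same detection loop; a minimal sketch of that variant is below, accumulating a per-pixel visit count instead of stamping a binary circle (file paths are placeholders):

import cv2
import numpy as np

cap = cv2.VideoCapture("video.mp4")  # placeholder path

ret, first_frame = cap.read()
if not ret:
    raise SystemExit("Could not read the video.")
height, width, _ = first_frame.shape
heat = np.zeros((height, width), dtype=np.float32)

while True:
    ret, frame = cap.read()
    if not ret:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    _, thresh = cv2.threshold(gray, 60, 255, cv2.THRESH_BINARY_INV)
    contours, _ = cv2.findContours(thresh, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)

    # One mask per frame so each region counts at most once per frame
    mask = np.zeros((height, width), dtype=np.uint8)
    for cnt in contours:
        x, y, w, h = cv2.boundingRect(cnt)
        cv2.circle(mask, (x + w // 2, y + h // 2), 10, 255, -1)
    heat += mask.astype(np.float32) / 255.0

cap.release()

# Normalize counts to 0-255 and apply a color map: frequently visited
# (dwell) areas come out warmer.
norm = cv2.normalize(heat, None, 0, 255, cv2.NORM_MINMAX).astype(np.uint8)
cv2.imwrite("heatmap.png", cv2.applyColorMap(norm, cv2.COLORMAP_JET))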

3. Integrating Python into Grasshopper

We aimed for an automated pipeline but encountered challenges:

  • Rhino 7: Could not run Python scripts with external libraries like OpenCV.

  • Remote Python Plugin: Attempted integration with Anaconda but lacked full functionality.

  • Rhino 8: Successfully executed Python scripts but required additional steps to transfer processed data.
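For reference, a minimal sketch of the kind of Rhino 8 Script component that finally worked for us. It assumes a text input named img_path added to the component, and relies on Rhino 8's "# r:" requirement directive to fetch pip packages into its embedded Python 3 environment (check the Rhino documentation for your version):

#! python3
# r: opencv-python
# Rhino 8 Grasshopper Script component -- the "# r:" line asks Rhino to
# install the named pip package before running the script.
import cv2

# "img_path" is a hypothetical text input pointing at the PNG produced by
# the tracking script; "a" is the component's default output.
img = cv2.imread(img_path, cv2.IMREAD_GRAYSCALE)
a = img.flatten().tolist()  # brightness values for downstream components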

4. From bitmap to contours

Using Firefly in Grasshopper, we transformed the Python-generated bitmap into a 3D mesh:

  • Inverted regions to highlight pathways.

  • Created a height-based mesh using brightness values.

  • Applied Gaussian blur to smooth transitions.

  • Extracted clean contour lines by slicing the mesh with an XY plane.
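Outside of Firefly, the last two operations in this list can be sketched with OpenCV alone: since brightness is treated as height, thresholding the blurred bitmap at a given brightness is equivalent to slicing the height mesh with an XY plane at the corresponding height. A sketch, with placeholder paths and an assumed 20 mm height range:

import cv2
import numpy as np

# Brightness-as-height: each pixel's gray value becomes a Z coordinate.
img = cv2.imread("trayectoria_hormigas_bn.png", cv2.IMREAD_GRAYSCALE)
Z_SCALE = 20.0 / 255.0  # map 0-255 brightness to 0-20 mm (assumed range)
heights = img.astype(np.float32) * Z_SCALE

# Slicing the height field with an XY plane at z = 10 mm is the same as
# thresholding the bitmap at the matching brightness value.
slice_z = 10.0
_, level = cv2.threshold(img, slice_z / Z_SCALE, 255, cv2.THRESH_BINARY)
contours, _ = cv2.findContours(level, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
print(f"{len(contours)} contour(s) at z = {slice_z} mm")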

5. Generating the final surface for G-code

After extracting contours, we prepared the model for 3D printing:

  • Simplified curves using the Rebuild Curve command.

  • Addressed surface recognition issues in Grasshopper.

  • Organized printing sequences for optimized layer-by-layer deposition.

  • Used a 2 mm layer height, considering the 4 mm nozzle diameter.
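Since we ended up reading the G-code line by line, a toy generator helps illustrate what the printer actually consumes. This is a sketch only, with arbitrary geometry and extrusion values; our real toolpaths came from Grasshopper:

# Toy G-code generator: one square perimeter per layer, using the 2 mm
# layer height from our setup. Illustrative only.
LAYER_HEIGHT = 2.0  # mm, chosen for the 4 mm nozzle
SIDE = 80.0         # mm, demo square side (arbitrary)
LAYERS = 10

lines = ["G21 ; units in mm", "G90 ; absolute positioning"]
e = 0.0  # cumulative extrusion

for layer in range(LAYERS):
    z = LAYER_HEIGHT * (layer + 1)
    lines.append(f"G0 X0.00 Y0.00 Z{z:.2f} F600 ; travel to layer start")
    for x, y in [(SIDE, 0.0), (SIDE, SIDE), (0.0, SIDE), (0.0, 0.0)]:
        e += 2.0  # crude constant extrusion per edge (illustrative)
        lines.append(f"G1 X{x:.2f} Y{y:.2f} E{e:.2f} F1200 ; print edge")

with open("square_demo.gcode", "w") as f:
    f.write("\n".join(lines) + "\n")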

6. Clay preparation & printing process

We tested different clay mixtures:

Mix                  Clay (g)    Water (g)
1st                  500         40
2nd                  500         43
3rd (a day passed)   500         42
4th                  500         50
5th (final)*         500         47

*Final mix; additional water was required depending on how long the clay had been exposed to air. Across all mixes the water content stayed around 8-10% of the clay weight.

  • The clay was pressed into the container to eliminate air bubbles, ensuring a stable flow.

  • A pneumatic air pump applied pressure (~3 bars) to push the clay through the extruder.

  • The extruder screw rotation direction was adjusted to control the material flow.

The live printing process was exciting to observe, and the final piece dried over several days. Small fractures appeared, likely due to shrinkage, indicating that controlled drying would be necessary for future iterations.

Key takeaways

  • Python is not as intimidating as it seemed – We successfully implemented OpenCV for computer vision.

  • Firefly in Grasshopper – An interesting tool for integrating images into computational design workflows.

  • Understanding G-code – We examined the code behind 3D printing, which is typically auto-generated.

  • Clay preparation was crucial – From mix ratios to extrusion, every step impacted the final print quality.

This project was an engaging intersection of nature, technology, and digital fabrication, showcasing how biological intelligence can inform computational design.
