For this microchallenge, Carlos, Ziming, and I collaborated, merging our diverse interests to create an innovative bio-electronic interface. Carlos brought a strong focus on healthcare applications, exploring how technology could enhance well-being and monitor physiological signals. Ziming and I leaned more towards artistic expression and body sensing, aiming to capture and communicate emotional responses through wearable technology. This fusion of perspectives inspired the concept of a device that not only monitors the body but also serves as a form of self-expression.
The initial idea began as a bio-reactive tattoo designed to reflect bodily changes through color and movement. However, as our exploration deepened, the concept evolved into a more complex wearable interface that could integrate both biological sensing and dynamic visual feedback. This shift allowed us to experiment with multiple materials, sensors, and design approaches, leading to a versatile prototype.
The project was divided into several tasks:
Hydrogel Development: We experimented with bromothymol blue (BTB) as a pH indicator, testing both agar-agar and sodium alginate as the gel base. While agar-agar showed good reactivity, it was unstable over time, prone to fracturing, and retained moisture. Sodium alginate, though slower to dry, proved more stable and transparent, making it the better option for future iterations.
Ferrofluid Creation: Producing stable ferrofluid was challenging. We initially used iron oxide but later switched to metal shavings, which were more magnetic. The best suspension medium was hypersaturated saltwater, which stabilized the ferrofluid and prevented separation. We learned that the ferrofluid must be fully submerged to maintain its integrity.
Circuit and GSR Integration: We built a circuit using a GSR sensor to detect skin conductivity changes, intending to control electromagnets for ferrofluid movement. However, the GSR sensor's output was too stable to drive the electromagnets effectively. We adapted by building a variable-intensity circuit compatible with other sensors, and also considered a simpler on/off control mechanism.
Wearable Design and Molding: The device was designed to be worn on the neck, with 3D-printed molds used for silicone casting. Some materials were too fragile, and we found that thinner silicone walls improved the electromagnetic response. Further adjustments are needed to optimize the mold design, possibly replacing the hydrogel with liquid-filled capsules for clearer sweat visualization.
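Since the raw GSR signal barely varied, the variable-intensity idea can be sketched as a simple mapping from sensor reading to electromagnet duty cycle. This is an illustrative Python sketch, not our actual circuit logic; `baseline`, `span`, and the noise threshold are hypothetical calibration values:

```python
def gsr_to_duty(reading, baseline=512, span=200, min_duty=0.2):
    """Map a raw GSR reading (0-1023) to a PWM duty cycle (0.0-1.0).

    Readings near the resting baseline leave the coil off; deviations
    are scaled up so that even small conductivity changes produce
    visible ferrofluid motion. baseline/span are hypothetical values.
    """
    deviation = abs(reading - baseline) / span
    if deviation < 0.05:  # ignore sensor noise around the baseline
        return 0.0
    # Clamp into [min_duty, 1.0] so the electromagnet always receives
    # enough current to move the ferrofluid once it is switched on.
    return min(1.0, max(min_duty, deviation))
```

The same function degrades gracefully into the simpler on/off scheme we considered: treat any nonzero duty cycle as "on".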
Key Takeaways:
BTB is highly reactive and offers reversible color changes, but its compatibility with agar-agar is limited.
Ferrofluid requires precise stabilization, with saltwater providing the best results.
GSR sensors need further calibration or alternative sensors for effective ferrofluid control.
Design iterations are crucial for improving material durability and system responsiveness.
Although the project remains incomplete, the extensive experimentation provided valuable insights into the technical and material challenges, laying a strong foundation for future development.
The project aimed to capture and interpret the paths ants naturally follow, ensuring that their transit zones remained open while other regions were defined through negative space. The workflow involved:
Tracking ant movement – Recording video footage of ants in motion.
Image mapping in Python – Using OpenCV to overlay their movement into a single image.
3D surface generation – Converting the mapped movement into a surface using Firefly in Grasshopper.
G-code generation – Processing the 3D surface in Grasshopper for clay printing.
Clay printing & returning to ants – Fabricating the piece and placing it back in the ants’ environment.
1. Early experimentation
We initially tested P5.js to map ant movement, which produced a point cloud. The approach was promising, but handling these points in Grasshopper proved complex. We then explored direct image processing in Grasshopper before transitioning to Python and OpenCV for better control.
2. Final mapping with Python & OpenCV
Using OpenCV in Python, we processed a video of moving ants to generate a movement-based overlay. The process involved:
Detecting ants as dark regions and tracking their centroids.
Compiling movement frames into a final image.
Experimenting with heatmaps to highlight frequent dwell areas.
Applying Gaussian filtering to unify scattered points into continuous paths.
3. Integrating Python into Grasshopper
We aimed for an automated pipeline but encountered challenges:
Rhino 7: Could not run Python scripts with external libraries like OpenCV.
Remote Python Plugin: Attempted integration with Anaconda but lacked full functionality.
Rhino 8: Successfully executed Python scripts but required additional steps to transfer processed data.
4. From bitmap to contours
Using Firefly in Grasshopper, we transformed the Python-generated bitmap into a 3D mesh:
Inverted regions to highlight pathways.
Created a height-based mesh using brightness values.
Applied Gaussian blur to smooth transitions.
Extracted clean contour lines by slicing the mesh with an XY plane.
5. Generating the final surface for G-code
After extracting contours, we prepared the model for 3D printing:
Simplified curves using the Rebuild Curve command.
Addressed surface recognition issues in Grasshopper.
Organized printing sequences for optimized layer-by-layer deposition.
Used 2mm layer height, considering the 4mm nozzle diameter.
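A minimal sketch of what this G-code step produces, assuming closed contour paths given as (x, y) points in millimetres; the feed rate and the absence of extrusion commands are simplifications of the real clay-printer output:

```python
def contours_to_gcode(paths, layer_height=2.0, layers=5, feed=1200):
    """Emit minimal G-code that traces each closed path once per layer.

    Layer height defaults to 2 mm to suit the 4 mm nozzle. Each layer
    raises Z, then travels to and traces every contour in sequence.
    """
    lines = ["G21 ; millimetres", "G90 ; absolute positioning"]
    for layer in range(layers):
        z = (layer + 1) * layer_height
        lines.append(f"G1 Z{z:.2f} F{feed}")
        for path in paths:
            x0, y0 = path[0]
            lines.append(f"G0 X{x0:.2f} Y{y0:.2f}")  # travel to path start
            for x, y in path[1:] + [path[0]]:        # close the loop
                lines.append(f"G1 X{x:.2f} Y{y:.2f} F{feed}")
    return "\n".join(lines)
```

Ordering the paths per layer is where the "organized printing sequences" step matters: a sensible sequence minimises travel moves and keeps wet clay from being dragged across the print.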
6. Clay preparation & printing process
We tested different clay mixtures:
Mix                   Clay (g)   Water (g)
1st                   500        40
2nd                   500        43
3rd (a day passed)    500        42
4th                   500        50
5th (final)*          500        47

*Final mix; additional water was required depending on air exposure over time.
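Expressed as a percentage of clay weight, the mixes span roughly 8–10% water (values taken from the table above):

```python
clay_g = 500
mixes = {"1st": 40, "2nd": 43, "3rd": 42, "4th": 50, "5th (final)": 47}

def water_percent(water_g, clay=clay_g):
    """Water content as a percentage of clay weight."""
    return round(water_g / clay * 100, 1)

for name, water_g in mixes.items():
    print(f"{name}: {water_percent(water_g)}% water")
```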
The clay was pressed into the container to eliminate air bubbles, ensuring a stable flow.
A pneumatic air pump applied pressure (~3 bars) to push the clay through the extruder.
The extruder screw rotation direction was adjusted to control the material flow.
The live printing process was exciting to observe, and the final piece dried over several days. Small fractures appeared, likely due to shrinkage, indicating that controlled drying would be necessary for future iterations.
Key Takeaways:
Python is not as intimidating as it seemed – We successfully implemented OpenCV for computer vision.
Firefly in Grasshopper – An interesting tool for integrating images into computational design workflows.
Understanding G-code – We examined the code behind 3D printing, which is typically auto-generated.
Clay preparation was crucial – From mix ratios to extrusion, every step impacted the final print quality.
This project was an engaging intersection of nature, technology, and digital fabrication, showcasing how biological intelligence can inform computational design.
For this Microchallenge, developed alongside and , we explored the concept of intelligence by integrating artificial intelligence (AI) with biological intelligence. Our project analyzed the movement patterns of ants using machine learning and translated them into G-code to 3D print an object with clay. By doing so, we allowed the ants’ natural behaviors to shape the final design, ultimately returning the printed structure to their environment.