2025 Body as Textile
p5.js ml5.js
fullstack installation 1 month New York
Interactive installation exploring movement over time and archiving.
Try the web version here.
Body as Textile is an experience that treats textile as a hypertext operating beyond thread and fabric, weaving information through bodies, spaces, and histories. It is more than material: it is a living archive that records movement, encodes organic behaviors, and transmits knowledge across time.
By expanding ideas about human kinesthetics and exploring unconventional archival media, we can rethink how bodies inhabit space and how textiles both enclose that space and provide comfort within it, ultimately revealing how even the clothing we wear can serve as a carrier of information. What does textile look like in its digital form? What can a digital textile achieve that a physical one cannot?
The work builds on textiles' historical role as information carriers: traditional clothing has long encoded stories, social and marital status, cultural identity, and regional histories through patterns, materials, and construction techniques. The digital textile accelerates this archival function. Where traditional textiles slowly absorb traces of wear and accumulate social meaning over time, the digital textile captures and records movement data in real time. Physical textiles cannot document movement itself; they can only hold static cultural meanings and gradual wear patterns.
Each person's movement textile becomes part of an evolving collective archive. Individual heatmaps blend and layer with previous participants' data, creating composite textiles that hold traces of multiple bodies and movement histories. This temporal layering creates new forms of collective memory through encoded gesture, expanding textiles' traditional archival capacity into real-time kinesthetic documentation that physical fabrics could not achieve.
Technical Architecture
This p5.js application integrates ml5.js's bodyPose model to capture real-time skeletal tracking data from the user's webcam. Pose estimation identifies key body landmarks and records their coordinates over time as the user moves or dances.
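A minimal sketch of how the capture loop might look, assuming ml5.js's bodyPose model with its default settings; the confidence threshold and the `trail` array used to record keypoints over time are illustrative choices, not the installation's exact code.

```javascript
let video;
let bodyPose;
let poses = [];
let trail = []; // recorded keypoint positions over time

function preload() {
  // Load the bodyPose model (MoveNet by default in ml5.js)
  bodyPose = ml5.bodyPose();
}

function setup() {
  createCanvas(640, 480);
  video = createCapture(VIDEO);
  video.size(640, 480);
  video.hide();
  // Start continuous detection on the webcam feed
  bodyPose.detectStart(video, gotPoses);
}

function gotPoses(results) {
  poses = results;
  // Record each confident keypoint with a timestamp
  for (let pose of poses) {
    for (let kp of pose.keypoints) {
      if (kp.confidence > 0.3) {
        trail.push({ x: kp.x, y: kp.y, name: kp.name, t: millis() });
      }
    }
  }
}

function draw() {
  image(video, 0, 0);
  // Draw current keypoints as live feedback
  noStroke();
  fill(255, 0, 0);
  for (let pose of poses) {
    for (let kp of pose.keypoints) {
      if (kp.confidence > 0.3) {
        circle(kp.x, kp.y, 8);
      }
    }
  }
}
```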
Movement data is processed into heatmap visualizations in which position frequency determines pixel intensity: areas with more movement activity render darker, creating unique visual "textiles" based on individual kinesthetic patterns. Each completed session generates a dataset that is stored in a MongoDB database, preserving both the raw coordinate data and the rendered textile visualization.
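A sketch of how such a frequency-based heatmap could be built in p5.js, assuming recorded coordinates are binned into a grid; the cell size and grayscale mapping are illustrative.

```javascript
// Accumulate recorded keypoint positions into a frequency grid
// and render it as a heatmap "textile".
const CELL = 8;   // heatmap resolution in pixels per cell
let grid;         // 2D array of visit counts

function initGrid(w, h) {
  grid = Array.from({ length: Math.ceil(h / CELL) }, () =>
    new Array(Math.ceil(w / CELL)).fill(0)
  );
}

// Call this for every recorded keypoint coordinate
function accumulate(x, y) {
  const col = Math.floor(x / CELL);
  const row = Math.floor(y / CELL);
  if (grid[row] && grid[row][col] !== undefined) grid[row][col]++;
}

// Render: more visits in a cell means a darker cell
function drawHeatmap() {
  const maxCount = Math.max(1, ...grid.flat());
  noStroke();
  for (let r = 0; r < grid.length; r++) {
    for (let c = 0; c < grid[r].length; c++) {
      const intensity = grid[r][c] / maxCount;   // 0..1
      fill(255 * (1 - intensity));               // frequent areas render darker
      rect(c * CELL, r * CELL, CELL, CELL);
    }
  }
}
```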
The application features a gallery interface where visitors can browse the collection of stored textiles. Individual textiles can be customized through color manipulation controls, allowing users to reinterpret existing movement data with different visual treatments. The collective textile view blends multiple datasets, layering movement histories from different users into composite visualizations.
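A possible sketch of the collective blending and color reinterpretation, assuming each stored textile is represented as a frequency grid like the one above; `blendGrids` and the `lerpColor` ramp are illustrative, not the project's exact treatment.

```javascript
// Combine several participants' heatmap grids into one collective textile
// by layering movement histories additively.
function blendGrids(grids) {
  const rows = grids[0].length;
  const cols = grids[0][0].length;
  const combined = Array.from({ length: rows }, () => new Array(cols).fill(0));
  for (const g of grids) {
    for (let r = 0; r < rows; r++) {
      for (let c = 0; c < cols; c++) {
        combined[r][c] += g[r][c];
      }
    }
  }
  return combined;
}

// Recolor a grid with user-chosen colors: intensity drives a color ramp
function drawTintedHeatmap(grid, cellSize, lowColor, highColor) {
  const maxCount = Math.max(1, ...grid.flat());
  noStroke();
  for (let r = 0; r < grid.length; r++) {
    for (let c = 0; c < grid[r].length; c++) {
      const t = grid[r][c] / maxCount;
      fill(lerpColor(lowColor, highColor, t)); // reinterpret intensity as color
      rect(c * cellSize, r * cellSize, cellSize, cellSize);
    }
  }
}

// e.g. drawTintedHeatmap(blendGrids(storedGrids), 8, color('#f2efe9'), color('#5a2d82'));
```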
Users can save their customized textiles directly to their mobile devices through the browser's download functionality. The MongoDB backend enables persistent storage and retrieval of the growing archive, supporting the conceptual framework of textiles as evolving repositories of embodied information across time and participants.
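A sketch of how saving might be wired up, using p5.js's built-in `saveCanvas()` for the device download and a hypothetical `/api/textiles` endpoint for the MongoDB-backed archive; the endpoint and payload shape are assumptions, not the project's documented API.

```javascript
// Trigger the browser download dialog for the rendered textile;
// p5's canvas export also works in mobile browsers.
function downloadTextile() {
  saveCanvas('body-as-textile', 'png');
}

// Persist a session to the backend (assumed REST endpoint over MongoDB).
async function saveSession(trail, grid) {
  const payload = {
    createdAt: new Date().toISOString(),
    keypoints: trail,   // raw coordinate data recorded during the session
    heatmap: grid,      // rendered frequency grid
  };
  const res = await fetch('/api/textiles', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify(payload),
  });
  if (!res.ok) throw new Error(`Failed to save textile: ${res.status}`);
  return res.json();    // e.g. the stored document's id
}
```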