Three.js · TensorFlow.js · React · Blender · GSAP
University of the Arts London (UAL)
Installation

Testing Live Interaction

A preview of the interactive 3D model particles

Client: University of the Arts London (UAL)

Deliverable: Interactive Digital Installation

The Challenge

As part of the Work in Progress Show for the MA in Design for Data Visualisation, I was tasked with creating an installation that captured my current thinking and direction ahead of the final major project. The theme of the show was Responsible Design, and whatever we displayed had to represent a meaningful step toward our final outcome.

Value Delivered

My project explores data as performance, and this installation served as a live test of that idea: specifically, audience interaction within a digital space. Visitors were drawn in by the animation, intrigued by the presence of a live camera, and engaged by the playful unpredictability of the installation. It attracted consistent crowds and proved to be a successful experiment in interactive engagement.

Background

My broader practice investigates how data can be performed rather than simply displayed. For this installation, I asked: What does performance look like in an exhibition context? My answer: audience participation. The audience becomes part of the piece: not just observers, but co-performers.

This raised another question: How do you encourage engagement with data in a gallery space? My answer was interactivity: using motion, gesture, and spectacle to draw people into a conversation with data.

Info cards providing background, displayed behind the experience

Tools Used

  • Three.js (3D rendering)
  • TensorFlow.js (gesture recognition)
  • React (UI framework)
  • Blender (3D modeling)
  • GSAP (animation)
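As a rough illustration of how these tools can fit together (a hypothetical sketch, not the installation's actual source), a React component can own the Three.js canvas and render loop; the names here (ParticleStage, mountRef) are invented for the example:

```jsx
import { useEffect, useRef } from 'react';
import * as THREE from 'three';

// Hypothetical component: React owns the lifecycle, Three.js owns the pixels.
function ParticleStage() {
  const mountRef = useRef(null);

  useEffect(() => {
    const width = mountRef.current.clientWidth;
    const height = mountRef.current.clientHeight;

    const scene = new THREE.Scene();
    const camera = new THREE.PerspectiveCamera(50, width / height, 0.1, 100);
    camera.position.z = 5;

    const renderer = new THREE.WebGLRenderer({ antialias: true });
    renderer.setSize(width, height);
    mountRef.current.appendChild(renderer.domElement);

    let frameId;
    const animate = () => {
      frameId = requestAnimationFrame(animate);
      renderer.render(scene, camera);
    };
    animate();

    // Clean up GPU resources when the component unmounts.
    return () => {
      cancelAnimationFrame(frameId);
      renderer.dispose();
      renderer.domElement.remove();
    };
  }, []);

  return <div ref={mountRef} style={{ width: '100%', height: '100vh' }} />;
}

export default ParticleStage;
```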

The Solution

The installation took the form of a digital interface exploring three types of audience interaction:

Visual Spectacle: A particle system animated across a series of 3D models. Audiences were drawn in by the fluid visuals and ambient motion.
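A minimal sketch of one way such a system could work, assuming a Blender-exported glTF model and Three.js's MeshSurfaceSampler helper (the file name model.glb and the particle count are placeholders):

```js
import * as THREE from 'three';
import { GLTFLoader } from 'three/examples/jsm/loaders/GLTFLoader.js';
import { MeshSurfaceSampler } from 'three/examples/jsm/math/MeshSurfaceSampler.js';

// Scatter `count` particles across a mesh's surface and return them as points.
function particlesFromMesh(mesh, count = 20000) {
  const sampler = new MeshSurfaceSampler(mesh).build();
  const positions = new Float32Array(count * 3);
  const point = new THREE.Vector3();

  for (let i = 0; i < count; i++) {
    sampler.sample(point);
    positions.set([point.x, point.y, point.z], i * 3);
  }

  const geometry = new THREE.BufferGeometry();
  geometry.setAttribute('position', new THREE.BufferAttribute(positions, 3));
  const material = new THREE.PointsMaterial({ size: 0.01, color: 0x88ccff });
  return new THREE.Points(geometry, material);
}

// Usage: replace each mesh of a Blender-exported model with its particle form.
// `scene` is the THREE.Scene from the surrounding setup.
function loadModelAsParticles(scene, url = 'model.glb') {
  new GLTFLoader().load(url, (gltf) => {
    gltf.scene.traverse((child) => {
      if (child.isMesh) scene.add(particlesFromMesh(child));
    });
  });
}
```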

Camera Motion Tracking: A camera detected visitors’ hand movements, which then disrupted or shaped the particle forms in real time.
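One plausible implementation, sketched under assumptions: TensorFlow.js's hand-pose-detection model supplies hand landmarks, and the wrist position repels nearby particles. The toSceneCoords mapping from camera pixels to scene space is a hypothetical helper, and the radius and strength values are invented:

```js
import '@tensorflow/tfjs-core';
import '@tensorflow/tfjs-backend-webgl';
import * as handPoseDetection from '@tensorflow-models/hand-pose-detection';

// Create a browser-side hand detector (MediaPipe Hands via the tfjs runtime).
async function createHandDetector() {
  return handPoseDetection.createDetector(
    handPoseDetection.SupportedModels.MediaPipeHands,
    { runtime: 'tfjs', modelType: 'lite' }
  );
}

// Called each frame: push particles within `radius` of the hand outward.
function disruptParticles(points, hand, radius = 0.8, strength = 0.05) {
  const positions = points.geometry.attributes.position;
  for (let i = 0; i < positions.count; i++) {
    const dx = positions.getX(i) - hand.x;
    const dy = positions.getY(i) - hand.y;
    const dist = Math.hypot(dx, dy);
    if (dist > 0 && dist < radius) {
      const push = (strength * (radius - dist)) / dist;
      positions.setX(i, positions.getX(i) + dx * push);
      positions.setY(i, positions.getY(i) + dy * push);
    }
  }
  positions.needsUpdate = true;
}

// Per frame, with `video` a webcam <video> element:
//   const hands = await detector.estimateHands(video);
//   if (hands.length) {
//     const wrist = hands[0].keypoints.find((k) => k.name === 'wrist');
//     disruptParticles(particles, toSceneCoords(wrist, video)); // toSceneCoords: hypothetical
//   }
```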

Gesture Recognition: Using machine learning, the system recognized specific hand gestures and responded by cycling through corresponding emoticons.
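On top of those landmarks, a classifier maps hand shapes to emoticons. The sketch below uses a simplified finger-extension heuristic over the keypoint names the hand model emits; the installation's actual gesture set and recognition method may differ, and the gesture-to-emoji table is invented:

```js
// Hypothetical gesture-to-emoticon mapping.
const GESTURE_EMOJI = { open_palm: '✋', thumbs_up: '👍', point: '👉' };

// A finger counts as extended if its tip is farther from the wrist than its middle joint.
function isExtended(keypoints, tipName, midName, wrist) {
  const dist = (a, b) => Math.hypot(a.x - b.x, a.y - b.y);
  const tip = keypoints.find((k) => k.name === tipName);
  const mid = keypoints.find((k) => k.name === midName);
  return dist(tip, wrist) > dist(mid, wrist);
}

function classifyGesture(keypoints) {
  const wrist = keypoints.find((k) => k.name === 'wrist');
  const fingers = [
    ['index_finger_tip', 'index_finger_pip'],
    ['middle_finger_tip', 'middle_finger_pip'],
    ['ring_finger_tip', 'ring_finger_pip'],
    ['pinky_finger_tip', 'pinky_finger_pip'],
  ].map(([tip, mid]) => isExtended(keypoints, tip, mid, wrist));
  const thumb = isExtended(keypoints, 'thumb_tip', 'thumb_ip', wrist);

  if (fingers.every(Boolean)) return 'open_palm';
  if (thumb && fingers.every((f) => !f)) return 'thumbs_up';
  if (fingers[0] && !fingers[1] && !fingers[2] && !fingers[3]) return 'point';
  return null;
}

// Usage per frame:
//   const gesture = classifyGesture(hands[0].keypoints);
//   if (gesture) emojiEl.textContent = GESTURE_EMOJI[gesture]; // emojiEl: hypothetical DOM node
```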

These layers created a playful space where visitors experimented naturally: some interacted deliberately, others simply observed. Both approaches became part of the performance.

Concept Design

Inspired by interactive digital events I’ve attended, I noticed that people love particles: the movement, the flow, the sensation of interacting with something alive. I built the experience using Three.js to keep it browser-based, which aligns with the direction of my final project.
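GSAP covers the movement-and-flow side. A hedged sketch, assuming both forms share the same particle count: tween every particle from its current position toward positions sampled from the next model, so the cloud appears to morph between forms:

```js
import gsap from 'gsap';

// Morph a THREE.Points cloud toward a new set of target positions.
// `targetPositions` is a Float32Array sampled from the next model's surface,
// e.g. with the MeshSurfaceSampler sketch above (same particle count assumed).
function morphTo(points, targetPositions, duration = 2) {
  const attr = points.geometry.attributes.position;
  const from = attr.array.slice(); // snapshot of current positions
  const state = { t: 0 };

  gsap.to(state, {
    t: 1,
    duration,
    ease: 'power2.inOut',
    onUpdate() {
      // Linear blend between the snapshot and the target on every tween tick.
      for (let i = 0; i < attr.array.length; i++) {
        attr.array[i] = from[i] + (targetPositions[i] - from[i]) * state.t;
      }
      attr.needsUpdate = true;
    },
  });
}
```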

A live test of the digital experience

Gesture tracking added another dimension: data as a mirror of the body. The camera captured hand joint data in real time, interpreting it into predictions and visual responses. This “human data visualization” approach allowed for playful experimentation: many visitors waved, posed, or simply moved their hands just to see what would happen. Curiosity became the gateway to interaction.

Results

The installation successfully attracted and held attention. A steady crowd gathered around the display, intrigued by its behaviour and keen to know more about the final piece it was building towards. Even with the occasional glitch or limitation in gesture detection, people continued to experiment, enjoying the unpredictability.

An audience gathered around the piece

Importantly, I learned how people want to interact: which gestures they instinctively make, what surprises them, and where they lose interest. These insights are directly shaping the development of my final major project.

See It in Action

🔗 Live Installation Preview