
Supervision - Locus Web Version


Project Goal

Our main goal was to bring Locus dance framework play into an online version so that anyone can try it at home, with a particular focus on supporting the blind community. More details on my Locus framework exploration can be found here.


My Role

My role was to support the students in exploring different ways to achieve a design goal. For instance, I gave them time to experiment with different technologies, both hardware and software. I then introduced them to new methods of empathising with problems and ideating. Once we had a concrete concept, I guided them to try bodystorming.

 

Something I learned during my PhD, and again in this supervision, is the mindset shift of balancing future outcomes against what we can achieve now. Sometimes we stop ourselves from experimenting just because there is no carved path at the current moment, even though it could become future research. I also learned to accept that learning is a curve: we cannot rush things, no matter how urgent they look.



Student Team

 

  • Chanel (with the peace hand gesture) is responsible for coordinating the team, representing the project at conferences such as SXSW, and creating the front end that combines the work done by the other team members.

  • Angel (with shades on the sweater) explored how sound can be incorporated to provide feedback during movement play. She experimented with Tone.js and pose-training models.

  • Divyana (glasses on her head) also explored pose-training models and integrated them to provide randomised Locus locations. She explored how a dancer's hand position can be mapped to body-tracking models.

  • Ben explored how to do pose detection for contemporary dance body poses. He started by working on training a model on pose data.
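To give a flavour of the "randomised Locus locations" idea mentioned above, here is a minimal sketch. It assumes the Locus framework's points can be modelled as a 3×3×3 grid of positions around the dancer's body; the grid shape, function name, and sequence length are illustrative assumptions, not the team's actual implementation.

```python
import random

# A 3x3x3 grid of points around the dancer's body, in the spirit of the
# Locus cube; the exact grid shape here is an illustrative assumption.
GRID = [(x, y, z) for x in range(3) for y in range(3) for z in range(3)]

def random_locus_sequence(length, seed=None):
    """Return `length` grid points for the dancer to reach toward."""
    rng = random.Random(seed)  # seeded for repeatable practice sequences
    return [rng.choice(GRID) for _ in range(length)]

# A short sequence of target locations for one round of play
print(random_locus_sequence(5, seed=42))
```

In the actual prototype these targets would be announced non-visually (sound or haptics) rather than printed, and the pose-tracking model would check whether the dancer's hand reaches each one.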


Read more about this project on the official website.


Technologies

I started supervising two undergraduates in 2023, when we explored designing representations of body movement using different technologies such as 3D printing and Arduino. Chanel created a 3D-printed moveable body representation. Angel built a mini theremin that responds to body movement. Two more students joined the team in 2024: Ben is keen on gesture-recognition model training, and Divyana is keen on exploring different programming languages for body tracking. At its current stage, the project is geared towards body tracking and gesture recognition through libraries like MediaPipe and TensorFlow, translating dance movements into non-visual modalities such as sonification and haptics.
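As a rough illustration of the sonification idea, the sketch below maps a tracked hand height to a pitch. It assumes the tracker reports normalised coordinates (0.0 at the top of the frame, 1.0 at the bottom, as MediaPipe landmarks do); the function name and frequency range are illustrative, not the prototype's actual mapping.

```python
def y_to_frequency(y_norm, f_min=220.0, f_max=880.0):
    """Map a normalised landmark y (0.0 = top of frame, 1.0 = bottom)
    to a pitch in Hz. Raising the hand raises the pitch; the mapping is
    exponential so equal movements give equal musical intervals."""
    y = min(max(y_norm, 0.0), 1.0)   # clamp to the valid range
    height = 1.0 - y                 # invert: top of frame = 1.0
    return f_min * (f_max / f_min) ** height

# Hand at the top of the frame -> 880 Hz; at the bottom -> 220 Hz
print(y_to_frequency(0.0))  # 880.0
print(y_to_frequency(1.0))  # 220.0
```

In the web prototype the same mapping could drive a Tone.js oscillator's frequency from MediaPipe's per-frame landmark output; the Python version here just shows the mapping itself.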

Prototype

The following video demonstrates how the system works: Chanel and Ben presenting at an open event, and me trying it out myself.

Work-in-progress prototype
