Project Idea: First presentation
October 7, 2023: Please post your first project proposal by responding to this post with a title, a brief description, the technology involved, references, a schedule (what you expect to have realized), and any other relevant information.
George Legrady
legrady@mat.ucsb.edu
Re: Project Idea: First presentation
Hyun Cho
- Attachments
- OOBprojectMAT265.pdf
- (1.98 MiB) Downloaded 325 times
MidTerm presentation
Please post your midterm project presentation here
George Legrady
legrady@mat.ucsb.edu
Re: Project Idea: First presentation
I have been developing a VR project that uses breathing as an input signal to control visual effects in Unity. At first, I tried to connect the biosignal sensor using OSC, but the connection was unstable. So I built a custom network server to transmit the data to Unity instead.
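A server of this kind might look like the following minimal Python sketch (the post doesn't specify the protocol or packet format, so everything here — UDP transport, one ASCII float per packet, the port number — is an assumption for illustration):

```python
import socket

def parse_breath_packet(data: bytes) -> float:
    """Decode one breathing sample sent as a plain ASCII float, e.g. b'0.73'."""
    return float(data.decode("ascii").strip())

def run_server(host: str = "0.0.0.0", port: int = 9000) -> None:
    """Listen for breathing samples over UDP and print each one as it arrives.
    A Unity client could receive the same packets, or this process could
    forward validated values on to Unity."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind((host, port))
    while True:
        data, _addr = sock.recvfrom(64)
        print(f"breath: {parse_breath_packet(data):.2f}")

if __name__ == "__main__":
    run_server()
```

Using raw UDP instead of OSC removes one protocol layer, which can help when an OSC bridge is the unstable link; the trade-off is that packet framing and validation become your own responsibility.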
After successfully receiving the breathing data, I created a real-time graph showing the inhale and exhale cycles, which helped visualize the sensor signal.
To connect this data to the Unity VFX system, I studied how to generate particle effects on the surface of a 3D mesh. I then mapped the breathing values to the particle force, so that when the user exhales, the particles expand outward. Finally, I implemented a color transition—when the user holds their breath, the particles gradually change from a warm red tone to a cooler blue tone.
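The two mappings described above — breath value to outward force, and breath-hold time to a warm-to-cool color shift — can be sketched as plain functions. This is Python for illustration only (the actual project runs in Unity's VFX system), and the value ranges, color tuples, and parameter names are all assumptions:

```python
def breath_to_force(breath: float, max_force: float = 5.0) -> float:
    """Map a normalized breath value (0 = full inhale, 1 = full exhale)
    to an outward particle force, so exhaling pushes particles outward."""
    breath = max(0.0, min(1.0, breath))  # clamp to [0, 1]
    return breath * max_force

def hold_color(t: float,
               warm=(1.0, 0.2, 0.1),
               cool=(0.1, 0.3, 1.0)) -> tuple:
    """Blend RGB from a warm red toward a cool blue as the breath is held.
    t is the fraction of the hold duration elapsed, clamped to [0, 1]."""
    t = max(0.0, min(1.0, t))
    return tuple(w * (1.0 - t) + c * t for w, c in zip(warm, cool))
```

In Unity the same idea would typically be wired up through exposed VFX Graph properties updated from a script each frame; the linear blend above corresponds to a simple color lerp.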
- Attachments
- siggraphprocess.pdf
- (3.23 MiB) Downloaded 72 times
jintongyang
- Posts: 7
- Joined: Wed Oct 01, 2025 2:38 pm
Re: Project Idea: First presentation
In the group research project, I was responsible for creating data visualizations of the cross-attention maps in Stable Diffusion.
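One common way to turn a cross-attention tensor into an image-space visualization is to average over heads, select one text token, and reshape the pixel axis into a square heatmap. The sketch below (pure NumPy; the tensor shape and function name are assumptions, and extracting the weights from the model is a separate step) shows that reduction:

```python
import numpy as np

def token_attention_heatmap(attn: np.ndarray, token_idx: int) -> np.ndarray:
    """Reduce a cross-attention tensor of shape (heads, pixels, tokens)
    to a square heatmap for one text token, normalized to [0, 1]
    so it can be displayed or overlaid on the generated image."""
    heads, pixels, tokens = attn.shape
    side = int(np.sqrt(pixels))          # pixel axis is a flattened square map
    heat = attn.mean(axis=0)[:, token_idx].reshape(side, side)
    heat -= heat.min()
    rng = heat.max()
    return heat / rng if rng > 0 else heat
```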
Re: Project Idea: First presentation
https://www.canva.com/design/DAG27rWBzo ... AbS8g/edit
This is the link for our PowerPoint.
This project is a collaborative study between Gao Xue and me, inspired by and based on our reconstruction of Rodger Luo’s project created seven years ago. Luo’s original work explored how multiple machines could move, sense, and coordinate with one another—raising questions about how robots perceive and interpret the world around them.
Building on this foundation, we are conducting a systematic technical investigation using the Zumo 32U4 robot. Our research focuses on understanding its sensing capabilities, motion behavior, and potential for inter-robot interaction. So far, we have successfully replicated key functions, including proximity detection, directional tracking, and object avoidance. We have also accessed the Raspberry Pi, repaired the onboard camera, and enabled the robot to capture images, giving the system a visual input channel.
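The avoidance behavior described above boils down to a small decision rule over the robot's left and right proximity readings. The actual Zumo 32U4 firmware is Arduino C++ (via the Pololu library), so the following is only a Python sketch of the decision logic, with hypothetical threshold and speed values:

```python
def avoidance_command(left_prox: int, right_prox: int,
                      threshold: int = 4) -> tuple:
    """Decide a motor command from the Zumo's left and right proximity
    readings (a higher reading means a closer obstacle).
    Returns (left_speed, right_speed) for a differential drive."""
    if left_prox >= threshold and right_prox >= threshold:
        return (-200, -200)   # obstacle straight ahead: back up
    if left_prox >= threshold:
        return (200, -200)    # obstacle on the left: spin right
    if right_prox >= threshold:
        return (-200, 200)    # obstacle on the right: spin left
    return (200, 200)         # path clear: drive forward
```

On the real robot, the equivalent logic would read the IR proximity sensors each loop iteration and feed the resulting pair to the motor driver.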
Looking ahead, we aim to expand beyond technical replication and move toward a more human-centered conceptual direction.