Proj 4: Final Project I: Concept Definition

glegrady
Posts: 203
Joined: Wed Sep 22, 2010 12:26 pm

Proj 4: Final Project I: Concept Definition

Post by glegrady » Fri Apr 01, 2016 6:22 pm

Final Project I: Concept Definition

Topic: Students will create a project using the Kinect and Processing.

Criteria: The project will have conceptual, aesthetic, and computational components. It will be based on ideas, technical solutions, and aesthetic methods learned from the research presentations and from other sources as found through individual research.

Schedule:
May 3: A general concept to be posted at the student forum, with a sketch.
May 10: Classroom presentation - continuation of research presentations followed by introduction of project.
May 17: Completion of project, with final version details (screenshots, code, references, etc.) posted at the forum.
George Legrady
legrady@mat.ucsb.edu

qiu0717
Posts: 9
Joined: Wed Jan 06, 2016 1:44 pm

Re: Proj 4: Student Project I

Post by qiu0717 » Tue May 17, 2016 9:28 am

"Bubbles"
Weihao Qiu


"Bubbles" is a Kinect project that represent users movements in form of a bubbles. When a person moves, a bubble will be enlarged as the outline of his body be the bubble wand. The start and end of making bubbles is controlled by the person's moving speed. As time goes by, the shape of bubble should become more organic. Moreover, people can interact with the bubble they make, by pushing it to float away or piercing it to break it apart.
img_2880.jpg
Inspiration for making bubbles
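
A minimal Processing sketch of the speed-controlled bubble idea, using the mouse as a stand-in for the Kinect-tracked body position; the thresholds and growth rate are arbitrary placeholders:

Code: Select all

// Bubbles grow while the tracked point moves fast and are released
// (and float away) once the movement slows down.
ArrayList<PVector> bubbles = new ArrayList<PVector>();   // x, y, radius stored in z
float currentRadius = 0;                                  // the bubble being "blown"
PVector prev;

void setup() {
  size(640, 480);
  prev = new PVector(mouseX, mouseY);
}

void draw() {
  background(20);
  PVector now = new PVector(mouseX, mouseY);
  float speed = PVector.dist(now, prev);   // movement speed per frame
  prev = now;

  if (speed > 2) {
    // fast movement inflates the current bubble
    currentRadius += speed * 0.2;
  } else if (currentRadius > 5) {
    // slowing down releases the bubble
    bubbles.add(new PVector(now.x, now.y, currentRadius));
    currentRadius = 0;
  }

  noFill();
  stroke(120, 200, 255);
  ellipse(now.x, now.y, currentRadius * 2, currentRadius * 2);

  // released bubbles slowly float upward
  for (PVector b : bubbles) {
    b.y -= 0.5;
    ellipse(b.x, b.y, b.z * 2, b.z * 2);
  }
}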
Last edited by qiu0717 on Fri May 27, 2016 1:52 pm, edited 2 times in total.

jing_yan
Posts: 5
Joined: Fri Apr 01, 2016 2:33 pm

Re: Proj 4: Student Project I

Post by jing_yan » Tue May 17, 2016 6:55 pm

/

the BLUE GUITAR

Jing Yan

They said, "You have a blue guitar, You do not play things as they are." --- Wallace Stevens
Inspired by David Hockney, who was inspired by Wallace Stevens, who was inspired by Pablo Picasso.

The “BLUE GUITAR” is a motion-based spatial sound construction project. The intention is to create a potential audio environment that users can interact with, and which gradually reveals and evolves itself through that interaction. The blue guitar is used as a metaphor not only for an instrument that you can play, but also for a distorted and unrealistic acoustic world that you might encounter.

The basic concept of the sound structure and sound materials:
sketch-01.jpg
Interaction process:
The sound is triggered by the entrance of the audience. The amplitude, frequency, and panning are related to the number of users, the distance to the Kinect device, the positions of body parts, and the position in space.

Visualization:
In contrast to the complexity of the sound, the visualization will be simplified into a net-like grid of dots that waves up and down according to the users' movement.
000101.png
000675.png
Technique:
Sound materials are built in SuperCollider.
Motion detection is achieved through the Kinect.
All parts are connected through Processing.
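
A minimal sketch of the Processing-to-SuperCollider link, assuming the oscP5 library and sclang listening on its default port 57120; the mouse here stands in for the Kinect-derived values, and the /blueguitar address is just an example name:

Code: Select all

import oscP5.*;
import netP5.*;

OscP5 osc;
NetAddress supercollider;

void setup() {
  size(640, 480);
  osc = new OscP5(this, 12000);                        // local listening port
  supercollider = new NetAddress("127.0.0.1", 57120);  // sclang default port
}

void draw() {
  background(0);
  // placeholders: distance to the Kinect -> amplitude, x position -> panning
  float amplitude = map(mouseY, 0, height, 1.0, 0.0);
  float panning   = map(mouseX, 0, width, -1.0, 1.0);

  // one OSC message per frame carrying the current control values
  OscMessage msg = new OscMessage("/blueguitar");
  msg.add(amplitude);
  msg.add(panning);
  osc.send(msg, supercollider);
}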

ihwang
Posts: 5
Joined: Fri Apr 01, 2016 2:35 pm

Re: Proj 4: Student Project I

Post by ihwang » Wed May 18, 2016 12:25 am

Title: Visualizing Energy Flow in 3D with Kinect

In this project, I will visualize the energy flow of the human body using the Kinect V2. In Asian martial arts, especially in China, it is believed that humans can increase their internal energy through training. Tai Chi is one example of a Chinese martial art: it focuses on the state of mind and on controlling the body's energy through slow movement, optimizing internal energy with meditation. Calligraphy performance is another good example of this: black ink and brush strokes are a reflection of body movement, and controlling a huge brush is the result of demanding training.

Using the Kinect V2 and Unity 5, along with several libraries to support both the device and the program, I will generate a flow of movement paths; this will be converted into a 3D form that represents the speed of movement. In addition, the Oculus will heighten the aesthetic effect, offering a virtual experience as opposed to watching a screen.
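
The project itself targets Unity 5, but the underlying idea, a movement path whose width encodes speed, can be sketched in Processing; the mouse stands in for a Kinect V2 joint position:

Code: Select all

// A trail of recent positions; faster movement leaves a thicker mark,
// like a wider brush stroke.
ArrayList<PVector> path = new ArrayList<PVector>();   // x, y, speed stored in z

void setup() {
  size(800, 600);
  noStroke();
}

void draw() {
  background(255);
  PVector now = new PVector(mouseX, mouseY, 0);
  if (!path.isEmpty()) {
    PVector last = path.get(path.size() - 1);
    now.z = dist(now.x, now.y, last.x, last.y);   // speed per frame
  }
  path.add(now);
  if (path.size() > 300) path.remove(0);          // keep a fixed-length trail

  fill(0);
  for (PVector p : path) {
    float w = map(p.z, 0, 40, 2, 30);             // speed -> stroke width
    ellipse(p.x, p.y, w, w);
  }
}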

This video shows progress.
https://www.youtube.com/watch?v=7E_GzYp ... e=youtu.be

zhenyuyang
Posts: 9
Joined: Fri Apr 01, 2016 2:34 pm

Re: Proj 4: Student Project I

Post by zhenyuyang » Wed May 18, 2016 3:08 pm

Zhenyu Yang
/ / P A R A L L E L W O R L D / /

/ Concept
The idea of this project is based on two concepts from my previous project (the sculpture project). In that project, I mentioned that there are two types of depth: general depth and detailed depth. General depth means the distance between the general objects detected by the Kinect camera. This kind of depth describes the spatial relationship among objects; for example, based on general depth, we can tell whether an object is near or far from us. Detailed depth describes the detailed geometry of an object; for example, based on detailed depth, we can tell what a car looks like by looking at it from different angles.

In this project, I am thinking about creating a space by removing the detailed depth and keeping the general depth. In this space, the audience can actually observe themselves. However, the perception can be unusual, since all detailed depth is removed in this space and all the audience can perceive is the general distance between themselves and the abstract world.

Another feature brought by removing the detailed depth is parallelism: everything is compressed into two dimensions, so all objects will always be parallel to each other in the space.

1.png
Since this project involves creating a 2D world in a 3D space, a better 3D display technology can definitely enhance the experience (the ambiguity caused by the fusion of 2D and 3D perception). So I am considering making this project compatible with VR devices like the Oculus Rift or HTC Vive, so that the audience can feel they are walking in the space.
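
A minimal sketch of the "general depth only" filter, assuming the Open Kinect for Processing library and a Kinect v2 (a different wrapper such as SimpleOpenNI would use other calls); depth is quantized into coarse bands so each slice of the scene is flattened while the distances between slices remain:

Code: Select all

import org.openkinect.processing.*;

Kinect2 kinect2;
int bandSize = 500;   // quantization step in millimeters

void setup() {
  size(512, 424);     // Kinect v2 depth resolution
  kinect2 = new Kinect2(this);
  kinect2.initDepth();
  kinect2.initDevice();
}

void draw() {
  background(0);
  int[] depth = kinect2.getRawDepth();   // depth in mm per pixel
  loadPixels();
  for (int i = 0; i < depth.length; i++) {
    if (depth[i] == 0) { pixels[i] = color(0); continue; }   // no reading
    // collapse detailed depth: every pixel inside a band gets the same value,
    // so only the coarse distance between "slices" of the scene remains
    int band = depth[i] / bandSize;
    pixels[i] = color(map(band, 0, 9, 255, 40));
  }
  updatePixels();
}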


/ Features
- Kinect Camera implementation, depth map filter
band.png
- Virtual camera movement controls
camera_movements.jpg
- Artificial stereo sound effects in planar space
stereo.png
/ Inspirational Resources
Some inspirational resources that are related to the concept of this project (keeping depth/distance among objects and removing the geometrical depth of each object).

Two-dimensional Egyptian art: http://www.shira.net/culture/kemetic-2d-art.htm
funerary-anubis-detail.gif
Camera movement techniques: https://www.videomaker.com/article/c10/ ... -and-truck

Stereophonic sound: https://en.wikipedia.org/wiki/Stereophonic_sound
Last edited by zhenyuyang on Sat May 28, 2016 10:48 pm, edited 7 times in total.

changhe
Posts: 6
Joined: Wed Jan 06, 2016 1:39 pm

Re: Proj 4: Student Project I

Post by changhe » Mon May 23, 2016 2:36 pm

Experimental Gesture Interface in Processing

Hilda Chang HE

My idea is to make a hand-gesture recognition and control interaction sub-system for an existing data visualization project. It needs to be appended to an existing project, such as the data visualization project I made last quarter. The ultimate goal is to let users control and interact with their project by hand gestures, so that there is an alternative to the dominant mouse interface.

This sub-system will include two modes: a one-hand mode and a two-hand mode. I'll pre-define some postures as standard, basic control gestures; this is the configuration process. Then the system will run detection and recognition while the user interacts with it. It will only support one person as the controller, for a better user experience and in keeping with HCI principles.

I have been reading papers and have implemented the one-hand mode. There is already a framework illustrated in an academic paper. However, I haven't found any complete system implementation in Processing that can be run as an individual library. I'll be doing research to see whether it's possible to wrap an existing library into another so that I can possibly make it a public standard library. I don't have a definite answer yet; it also needs to be decided after I finish implementing the whole sub-system, given the time limitation.

I don't have images to show yet, since it's still under development and debugging right now.
However, here is part of the code:

Code: Select all

public class HandTracker {
    public PVector position;
    public PVector velocity;

    boolean is_position_available;
    boolean is_velocity_available;

    public double last_measurement_time;


    public HandTracker() {
        position = new PVector();
        velocity = new PVector();
        last_measurement_time = millis();
    }

    // Exponentially smooth the measured hand position and estimate velocity
    // (in pixels per second) from the change between smoothed positions.
    public void update(PVector current_position) {
        double t = millis();
        double dt = (t - last_measurement_time) / 1000.0;
        last_measurement_time = t;

        // Weighted blend of the previous estimate (0.6) and the new measurement (0.4).
        PVector position_new = PVector.add(PVector.mult(position, 0.6f), PVector.mult(current_position, 0.4f));
        if (dt > 0) {
            velocity = PVector.mult(PVector.sub(position_new, position), (float)(1.0 / dt));
        }
        position = position_new;

        // update() must be called at least twice before is_velocity_available becomes true.
        if (is_position_available) is_velocity_available = true;
        is_position_available = true;
    }
}
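
A minimal usage sketch for the tracker above; getHandPosition() is a hypothetical stand-in for whatever hand-joint call the chosen Kinect library provides, so the mouse is used here to keep the sketch runnable anywhere:

Code: Select all

HandTracker tracker;

void setup() {
  size(640, 480);
  tracker = new HandTracker();
}

// Hypothetical hand-joint source; replace with the Kinect library's hand joint.
PVector getHandPosition() {
  return new PVector(mouseX, mouseY);
}

void draw() {
  background(255);
  tracker.update(getHandPosition());

  if (tracker.is_velocity_available) {
    // a fast rightward swipe could be mapped to a control command
    if (tracker.velocity.x > 800) println("swipe right detected");
  }

  fill(0);
  ellipse(tracker.position.x, tracker.position.y, 20, 20);
}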

lliu
Posts: 9
Joined: Wed Jan 06, 2016 1:41 pm

Re: Proj 4: Student Project I

Post by lliu » Mon May 23, 2016 4:28 pm

Intrinsic Variations
A human body filled with 3D Voronoi cells
LU LIU spring M265 2016
voronoiske1.jpg
Concept:
Cells constitute the human body. What do cells look like?
Intrinsic Variations is a visualization of the motion structure inside the human body. The main focuses are the fluidity of the cells and the transformation of the cells themselves. It is based on the Voronoi algorithm and realized in 3D space.
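
A minimal 2D Processing sketch of the Voronoi idea (the project itself is in 3D): each pixel takes the colour of its nearest seed point, and the seeds drift slowly to suggest the fluidity of cells:

Code: Select all

int numSeeds = 20;
PVector[] seeds = new PVector[numSeeds];
color[] cellColors = new color[numSeeds];

void setup() {
  size(320, 240);
  for (int i = 0; i < numSeeds; i++) {
    seeds[i] = new PVector(random(width), random(height));
    cellColors[i] = color(random(100, 255), random(100, 255), random(100, 255));
  }
}

void draw() {
  // drift the seeds to suggest the transformation of the cells
  for (PVector s : seeds) {
    s.x = constrain(s.x + random(-1, 1), 0, width);
    s.y = constrain(s.y + random(-1, 1), 0, height);
  }
  loadPixels();
  for (int y = 0; y < height; y++) {
    for (int x = 0; x < width; x++) {
      // brute-force nearest-seed search per pixel
      int nearest = 0;
      float best = Float.MAX_VALUE;
      for (int i = 0; i < numSeeds; i++) {
        float d = sq(x - seeds[i].x) + sq(y - seeds[i].y);   // squared distance
        if (d < best) { best = d; nearest = i; }
      }
      pixels[y * width + x] = cellColors[nearest];
    }
  }
  updatePixels();
}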


Work-in-Progress:
IMG_2500.JPG
Last edited by lliu on Tue May 24, 2016 1:45 pm, edited 6 times in total.

junxiangyao
Posts: 10
Joined: Wed Jan 06, 2016 1:38 pm

Re: Proj 4: Final Project I: Concept Definition

Post by junxiangyao » Mon May 23, 2016 10:11 pm

Interactive Particle System
Junxiang Yao
In my final project, I want to use a particle system to create the visual effect. The movement of the particles will be influenced by the motion of the user. When the Kinect is not capturing any user, the particles will move in their own patterns, and once the Kinect captures a user, different kinds of attractive and repulsive forces will be created by the user's gestures or motion. The Kinect will capture the depth image, and using this image, the computer will calculate the contour, joints, and skeleton of the user. The forces created from the area inside the contour and the area outside it will differ from each other, and the particles will respond to the distribution and values of the forces.
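
A minimal sketch of the force idea, with the mouse standing in for a tracked joint and a mouse press standing in for user detection; the force law and constants are placeholders:

Code: Select all

int numParticles = 300;
PVector[] pos = new PVector[numParticles];
PVector[] vel = new PVector[numParticles];

void setup() {
  size(640, 480);
  for (int i = 0; i < numParticles; i++) {
    pos[i] = new PVector(random(width), random(height));
    vel[i] = PVector.random2D();   // particles start with their own motion
  }
}

void draw() {
  background(0);
  PVector target = new PVector(mouseX, mouseY);   // stand-in for a joint position
  boolean userPresent = mousePressed;             // stand-in for user detection
  stroke(255);
  for (int i = 0; i < numParticles; i++) {
    if (userPresent) {
      // attractive force toward the tracked point, weaker with distance
      PVector force = PVector.sub(target, pos[i]);
      float d = max(force.mag(), 10);
      force.normalize();
      force.mult(500.0f / (d * d));
      vel[i].add(force);
    }
    vel[i].limit(3);
    pos[i].add(vel[i]);
    // wrap around the edges so particles keep their own motion pattern
    pos[i].x = (pos[i].x + width) % width;
    pos[i].y = (pos[i].y + height) % height;
    point(pos[i].x, pos[i].y);
  }
}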

esayyad
Posts: 2
Joined: Fri Apr 01, 2016 2:31 pm

Re: Proj 4: Final Project I: Concept Definition

Post by esayyad » Tue May 24, 2016 3:52 am

ReFLEXion

Image

“The lake was silent for some time. Finally, it said:
"I weep for Narcissus, but I never noticed that Narcissus was beautiful. I weep because, each time he knelt beside my banks, I could see, in the depths of his eyes, my own beauty reflected.”
― Paulo Coelho, The Alchemist

The piece would be made of a screen that acts as a water surface in front of the user.
You can lie down and see yourself in the reflection. If you touch the surface, the image will be distorted and regenerated as someone or something else. These can carry deep meanings about what humanity can be and what a human being can achieve.

Image

ambikayadav
Posts: 4
Joined: Fri Apr 01, 2016 2:32 pm

Re: Proj 4: Final Project I: Concept Definition

Post by ambikayadav » Tue May 24, 2016 11:07 am

FINAL PROJECT
Ambika Yadav

CONCEPT 1
YOU SEE YOU

This project is an imaginary piece of work. I was inspired by the basic capability of the Kinect: giving the depth values of elements in the Kinect's field of view. The Kinect output tells us how far away (in depth) every pixel of the camera output is. I want to take these depth values and invert them.
This implies that in the output, far-away objects will have smaller depth values and appear closer, while nearby objects will have larger depth values and appear far away.
What I imagine is that when the user approaches the system to look at their mirrored image, they will see themselves moving far away while other far-off objects move closer.
Putting the perspectives and pixels together is going to be quite tricky, and I am working on this as of now.
Inversion.jpg
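A minimal sketch of the depth inversion, assuming the Open Kinect for Processing library and a Kinect v2; maxDepth is an assumed working range, not a value from the post:

Code: Select all

import org.openkinect.processing.*;

Kinect2 kinect2;
int maxDepth = 4500;   // Kinect v2 reports roughly 500-4500 mm

void setup() {
  size(512, 424);      // Kinect v2 depth resolution
  kinect2 = new Kinect2(this);
  kinect2.initDepth();
  kinect2.initDevice();
}

void draw() {
  background(0);
  int[] depth = kinect2.getRawDepth();
  loadPixels();
  for (int i = 0; i < depth.length; i++) {
    if (depth[i] == 0) { pixels[i] = color(0); continue; }   // no reading
    int inverted = maxDepth - depth[i];            // near becomes far, far becomes near
    float shade = map(inverted, 0, maxDepth, 0, 255);
    pixels[i] = color(shade);                      // bright = "close" in the inverted world
  }
  updatePixels();
}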
CONCEPT 2
MIRRORS

As the idea behind You See You is a bit tricky, I wanted to keep a backup idea in case it is not implementable.
This idea is inspired by Daniel Rozin's Mirrors series. It is a simple implementation of a black-and-white grid. The white cells turn black when a user comes near the system, continuously mirroring the movements and motion of the user.
DR_Ref.jpg
Mirror.jpg
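A minimal sketch of the Rozin-style grid, under the same Open Kinect for Processing assumption as above: a cell turns black whenever something enters a chosen depth range in front of the sensor:

Code: Select all

import org.openkinect.processing.*;

Kinect2 kinect2;
int cellSize = 16;
int nearLimit = 500, farLimit = 1500;   // assumed "user present" range in mm

void setup() {
  size(512, 424);
  kinect2 = new Kinect2(this);
  kinect2.initDepth();
  kinect2.initDevice();
  noStroke();
}

void draw() {
  background(255);
  int[] depth = kinect2.getRawDepth();
  for (int gy = 0; gy + cellSize <= kinect2.depthHeight; gy += cellSize) {
    for (int gx = 0; gx + cellSize <= kinect2.depthWidth; gx += cellSize) {
      // sample the depth at the centre of each cell
      int d = depth[(gy + cellSize / 2) * kinect2.depthWidth + gx + cellSize / 2];
      if (d > nearLimit && d < farLimit) {
        fill(0);                        // user present: cell goes black
        rect(gx, gy, cellSize, cellSize);
      }
    }
  }
}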
