Proj 2: Depth Project

glegrady
Posts: 203
Joined: Wed Sep 22, 2010 12:26 pm

Proj 2: Depth Project

Post by glegrady » Fri Apr 01, 2016 5:48 pm

Proj 2: Depth Project

The assignment is to create a visualization that explores 3D space using the Kinect sensor, the SimpleOpenNI functions, and the Processing library PeasyCam: http://mrfeinberg.com/peasycam/

Using the movement of people recorded by the Kinect, create a virtual object or objects in 3D space. Examples:
. A series of lines that can be viewed in 3D
. A set of virtual rectangles
. Anything that has 3D dimensions which could then be printed using a 3D printer.

References
Antony Gormley is our artistic sculpture reference:
http://www.antonygormley.com/sculpture/chronology
http://www.antonygormley.com/drawing/series

And Art + Com Berlin is our VR interactive reference: https://artcom.de/en/project/the-invisi ... ings-past/

Post your project here:
. Provide a good concept description
. Screen shot(s)
. Code
. References
George Legrady
legrady@mat.ucsb.edu

ihwang
Posts: 5
Joined: Fri Apr 01, 2016 2:35 pm

Re: Proj 2: Depth Project

Post by ihwang » Tue Apr 12, 2016 1:30 am

Energy strip in space
tumblr_inline_mqli4e4XiG1qz4rgp.jpg
Source: http://dohanews.co/katara-to-hold-final ... rformance/

Topic
My initial idea came from a large-scale calligraphy performance. Using a huge brush, a performer writes a letter; it looks like a messy drawing, but the trace of black ink on the paper reveals the performer's dynamic movement.

Description
First, I started this project with a "simple" program that scatters dirt on the ground (Video 1). Then I developed the idea by connecting the dots to create a line field in space. After last week's critique I changed my initial plan: using the Kinect v2, I made a calligraphic drawing program. Instead of a strong brush stroke, I decided to create a virtual drawing tool that shows the movement of both of my hands. The sketch initially follows the movement of both hands, but you can see that the 3D objects are fabricated based on the positions of the hands. When the sphere moves very fast, it also changes the shape of the tube, somewhat like what Picasso did.

I also added a line function. The first and second videos show lines generated between two vectors, one at the hand and the other on the ground. But there is a time gap between them: the travel time of the drop between the two points. The dirt falls vertically onto the ground, yet because the hand keeps moving during the fall, the lines are not perpendicular to the ground.
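The time-gap effect described above can be sketched in plain Java (Kinect and Processing omitted; `TimeGapLines`, `FALL_FRAMES`, and the sample list are hypothetical stand-ins for the drop's travel time and the recorded hand positions). Each line joins the current hand position to a ground point seeded from an older hand sample, so a moving hand produces tilted lines:

```java
import java.util.ArrayList;
import java.util.List;

public class TimeGapLines {
    static final int FALL_FRAMES = 5; // assumed travel time, in frames

    // Return the two endpoints {hand, ground} of the line at a frame:
    // the ground point comes from the hand position FALL_FRAMES earlier.
    public static double[][] lineAt(List<double[]> handSamples, int frame) {
        int src = Math.max(0, frame - FALL_FRAMES);
        double[] hand = handSamples.get(frame);           // current (x, y)
        double[] ground = {handSamples.get(src)[0], 0.0}; // landed drop (x, 0)
        return new double[][]{hand, ground};
    }

    public static void main(String[] args) {
        List<double[]> samples = new ArrayList<>();
        // hand moving steadily to the right at constant height
        for (int f = 0; f < 10; f++) samples.add(new double[]{f * 10.0, 100.0});
        double[][] line = lineAt(samples, 9);
        // the hand moved while the drop fell, so the line is not vertical
        System.out.println(line[0][0] - line[1][0]); // horizontal offset
    }
}
```

With a stationary hand the two x values coincide and the line is perpendicular to the ground, matching the behavior in the videos.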

For the next step, I turned the tube into a mesh with several libraries and converted the polygon into an STL file. Then I submitted a 3D printing job.
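The mesh-to-file step can be illustrated with a minimal hand-rolled ASCII STL writer. This is only a sketch of the file format idea, not the author's pipeline (the sketch actually used the nervous system OBJ exporter and the Shapes3D extrusion); the `toAsciiStl` helper is hypothetical:

```java
import java.util.Locale;

public class StlWriter {
    // Each triangle is 3 vertices of 3 floats. The facet normal is left
    // as 0 0 0; most slicers recompute normals from the vertex winding.
    public static String toAsciiStl(float[][][] triangles, String name) {
        StringBuilder sb = new StringBuilder("solid " + name + "\n");
        for (float[][] tri : triangles) {
            sb.append("  facet normal 0 0 0\n    outer loop\n");
            for (float[] v : tri)
                sb.append(String.format(Locale.US,
                    "      vertex %f %f %f\n", v[0], v[1], v[2]));
            sb.append("    endloop\n  endfacet\n");
        }
        return sb.append("endsolid ").append(name).append("\n").toString();
    }

    public static void main(String[] args) {
        // a single triangle standing in for the extruded tube mesh
        float[][][] mesh = {{{0, 0, 0}, {1, 0, 0}, {0, 1, 0}}};
        System.out.print(toAsciiStl(mesh, "tube"));
    }
}
```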

[img]
proof.jpg
[/img]
0002.png
0058.png
0089.png

Proof.

https://www.youtube.com/watch?v=8Cub4_T ... tml5=False
https://www.youtube.com/watch?v=UreSpGm ... e=youtu.be
https://youtu.be/uA5hJT_T9yo


Reference

1. One of the examples in Thomas Sanchez Lengeling's KinectPV2 library
2. amnon.owed's answer (https://forum.processing.org/one/topic/eraser.html)
3. peasycam v201 (http://mrfeinberg.com/peasycam/)
4. For the point cloud, I borrowed a code from Processing forum (https://processing.org/discourse/beta/n ... 92578.html)
5. Daniel Shiffman's Sine Wave tutorial (https://processing.org/examples/sinewave.html)
6. Daniel Shiffman's ArrayList of objects (https://processing.org/examples/arraylistclass.html)
7. Shapes3D library ExtrudeBeam example (http://lagers.org.uk/index.html)
8. OBJ Export http://n-e-r-v-o-u-s.com/tools/obj/

Code

Code: Select all

/*
 MAT 265 class project No.2 20160412
 This code is the modified version from 
 1. One of the examples in Thomas Sanchez Lengeling's KinectPV2 library 
 2. amnon.owed's answer  (https://forum.processing.org/one/topic/eraser.html)
 3. peasycam v201 (http://mrfeinberg.com/peasycam/)
 4. For the point cloud, I borrowed a code from Processing forum (https://processing.org/discourse/beta/num_1241792578.html)
 5. Daniel Shiffman's Sine Wave tutorial (https://processing.org/examples/sinewave.html)
 6. Daniel Shiffman's ArrayList of objects (https://processing.org/examples/arraylistclass.html)
 7. Shapes3D library ExtrudeBeam example (http://lagers.org.uk/index.html) 
 8. OBJ Export http://n-e-r-v-o-u-s.com/tools/obj/
 
 Copyright (C) 2014  Thomas Sanchez Lengeling.
 KinectPV2, Kinect for Windows v2 library for processing
 
 Permission is hereby granted, free of charge, to any person obtaining a copy
 of this software and associated documentation files (the "Software"), to deal
 in the Software without restriction, including without limitation the rights
 to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
 copies of the Software, and to permit persons to whom the Software is
 furnished to do so, subject to the following conditions:
 
 The above copyright notice and this permission notice shall be included in
 all copies or substantial portions of the Software.
 
 THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
 IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
 FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
 AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
 LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
 OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
 THE SOFTWARE.
 
 https://forum.processing.org/one/topic/eraser.html

*/
 
import KinectPV2.KJoint;
import KinectPV2.*;
import peasy.*;

import shapes3d.utils.*;
import shapes3d.animation.*;
import shapes3d.*;

import nervoussystem.obj.*;
import processing.dxf.*;

PeasyCam cam;

KinectPV2 kinect;
Skeleton [] skeleton;

///////////////////////////// storage for x,y,z coordinates
float storeRX;
float storeRY;
float storeRZ;

float storeLX;
float storeLY;
float storeLZ;

float storeRXGround;
float storeRYGround;
float storeRZGround;

float storeLXGround;
float storeLYGround;
float storeLZGround;

int storeMul = 300; 

float RandomAngle;

///////////////////////////////

float zVal = 400; //scale value of skeleton
float rotX = PI;

Location [] pointsRightHand = new Location[1];      //Number of points
Location [] pointsLeftHand = new Location[1];      //Number of points


int pointX, pointY, pointZ;
int radius = 100; //size of sphere for point cloud

///////////////////////////////

int ScreenWidth = 1280;
int ScreenHeight = 960;
int ground = -550;
///////////////////////////////Sine Wave Ball Part

int xspacing = 16;   // How far apart should each horizontal location be spaced
int w;              // Width of entire wave

float theta = 0.0;  // Start angle at 0
float period = 25.0;  // How many pixels before the wave repeats
float dx;  // Value for incrementing X, a function of period and xspacing
float[] yvalues;  // Using an array to store height values for the wave
float rotY = TWO_PI/25;

////////////////////////////////Particle Array

ArrayList<Drop> Rdrops;
ArrayList<Drop> Rgrounddrops;

ArrayList<Drop> Ldrops;
ArrayList<Drop> Lgrounddrops;

ArrayList<PVector> vectors = new ArrayList<PVector>();
ArrayList<PVector> vectors1 = new ArrayList<PVector>();


ArrayList<Box> BoxesR;
ArrayList<Box> BoxesL;


int dropWidth = 18;

boolean DrawLine = true;
boolean DrawBox = false;

int lineterm = 1;
int boxterm = 100;

int timer;

//////////////////////////////////////////////////// extrusion
Path path;
Extrusion e;
Contour contour;
ContourScale conScale;
  
float x,y,z,xx,yy,zz;

ArrayList<PVector> vectors2 = new ArrayList<PVector>();

PVector[] knots;
PVector[] knots1;
PVector v1;

////////////////////////////////

boolean record;

class Box{
  
  float x1;
  float y1;
  float z1;
  
  int x,y,z;
  int xx,yy,zz;
 
  float life = 1;
  
  Box(float tempX1, float tempY1, float tempZ1){       
    x1 = tempX1;
    y1 = tempY1;
    z1 = tempZ1;
  }
 
  boolean finished(){
    life--;
    if(life < 0){
      return true;
    }else{
      return false;}
  }
 
  void display(){
    //fill(0,life);
    beginShape();
  
    if(DrawBox == true){    
       
        // iterate over the box lists themselves, not the line vectors,
        // to avoid reading past the end of BoxesR/BoxesL
        for(int i = 0; i < BoxesR.size(); i ++){    
          x = abs(int(BoxesR.get(i).x1));
          y = abs(int(BoxesR.get(i).y1));
          z = abs(int(BoxesR.get(i).z1));
         }
         
        for(int j = 0; j < BoxesL.size(); j++){  
          xx = abs(int(BoxesL.get(j).x1));
          yy = abs(int(BoxesL.get(j).y1));
          zz = abs(int(BoxesL.get(j).z1));       
         }          
        //box(x+xx,y+yy,z+zz);     
    }
    endShape();
    //box(x,y,z);   
    line(0,0,0,x,y,z);
  }
}

void draw() {
  
  if (record) {
    beginRecord("nervoussystem.obj.OBJExport", "filename.obj"); 
  } 
  
  background(0); 
  calcWave();
  
  //image(kinect.getColorImage(), 0, 0, 320, 240);
  skeleton =  kinect.getSkeleton3d();
  
   if (record) {
      beginRaw(DXF, "output.dxf");
    }



  pushMatrix();
  scale(zVal);
  rotateY(rotX);

   

  for (int i = 0; i < skeleton.length; i++) {
    if (skeleton[i].isTracked()) {
      KJoint[] joints = skeleton[i].getJoints();
      //drawBody(joints);
      //draw different color for each hand state
      drawRightHandState(joints[KinectPV2.JointType_HandRight]);
      drawLeftHandState(joints[KinectPV2.JointType_HandLeft]); 
      //drawJoint(joints, KinectPV2.JointType_FootLeft);
    }
  }
  
  scale(2/zVal);

      for(int i = 0; i <= pointsRightHand.length-1; i++)
      {
        {
          pushMatrix();
          translate(pointsRightHand[i].x+storeRX, pointsRightHand[i].y+storeRY, pointsRightHand[i].z+storeRZ);
           //Rdrops.add(new Drop(pointsRightHand[i].x+storeRX, pointsRightHand[i].y+storeRY, pointsRightHand[i].z+storeRZ,100));
            
            if(DrawLine == true){
              if(millis()- timer >= lineterm){
                 vectors.add(new PVector(pointsRightHand[i].x+storeRX, pointsRightHand[i].y+storeRY, pointsRightHand[i].z+storeRZ));
                 vectors1.add(new PVector(pointsRightHand[i].x+storeRX, pointsRightHand[i].y+storeRY, pointsRightHand[i].z+storeRZ));
                 
                timer = millis();
              }  
            }
            
            if(DrawBox == true){
               if(millis()- timer >= boxterm){
                 BoxesR.add(new Box(pointsRightHand[i].x+storeRX, pointsRightHand[i].y+storeRY, pointsRightHand[i].z+storeRZ));
                 timer = millis();
              }  
            }  
          sphere(1);
          popMatrix();
        }
      }
      
    
      for(int i = 0; i <= pointsLeftHand.length-1; i++)
      {
        {
          pushMatrix();
          translate(pointsLeftHand[i].x+storeLX, pointsLeftHand[i].y+storeLY, pointsLeftHand[i].z+storeLZ); 
          //Ldrops.add(new Drop(pointsLeftHand[i].x+storeLX, pointsLeftHand[i].y+storeLY, pointsLeftHand[i].z+storeLZ,100));
      
              if(DrawBox == true){
                 if(millis()- timer >= boxterm){
                    BoxesL.add(new Box(pointsLeftHand[i].x+storeLX, pointsLeftHand[i].y+storeLY, pointsLeftHand[i].z+storeLZ)); // use the left-hand offsets, not the right-hand ones
                    timer = millis();
                }  
              }  
          
      
          sphere(1);
          popMatrix();
        }
      }
         
   for(int j = Rdrops.size() - 1; j >= 0; j--){
          Drop Rdrop = Rdrops.get(j);
          Rdrop.moveR();
          Rdrop.display();
            if(Rdrop.finished()){
              Rdrops.remove(j);
            }
          }
          
    for(int j = Ldrops.size() - 1; j >= 0; j--){
          Drop Ldrop = Ldrops.get(j);
          Ldrop.moveR();
          Ldrop.display();
            if(Ldrop.finished()){
              Ldrops.remove(j);
            }
    }
/////////////////////////////////////////////////////////////////////////          
 for(int h = Rgrounddrops.size() - 1; h >= 0; h--){
      Drop Rgrounddrop = Rgrounddrops.get(h);
      Rgrounddrop.grounddisplay();
        if(Rgrounddrop.finished()){
          Rgrounddrops.remove(h);
        }
      }
      
  for(int h = Lgrounddrops.size() - 1; h >= 0; h--){
      Drop Lgrounddrop = Lgrounddrops.get(h);
      Lgrounddrop.grounddisplay();
        if(Lgrounddrop.finished()){
          Lgrounddrops.remove(h);
        }
      }
 
 
 
 for(int h = BoxesR.size() - 1; h >= 0; h--){
      Box BoxR = BoxesR.get(h);
      BoxR.display();
   //      if(BoxR.finished()){
   //       BoxesR.remove(h);
   //     }
     
 }
      
 
    
if(DrawLine == true){      
  for(int i = 1; i < vectors.size(); i ++){
    noFill();
    beginShape();
    curveVertex(int(vectors.get(i).x),int(vectors.get(i).y),int(vectors.get(i).z));
    
    for(int j = 0; j < vectors.size(); j++){  
      curveVertex(int(vectors1.get(j).x),int(vectors1.get(j).y),int(vectors1.get(j).z));  
      //println(vectors1.get(j));
    }
    endShape();
  }
}


 beginShape();       
     // Use a BezierSpline to define the extrusion path
     PVector[] arr = vectors.toArray(new PVector[0]);
      path = new P_BezierSpline(arr);
      // Custom contour object
      contour = new Qcontour();
      // Custom scale object
      conScale = new Qscale();

      e = new Extrusion(this, path, 100, contour, conScale);
      e.setTexture("ground.jpg", 4, 7);
      e.drawMode(S3D.TEXTURE );
    
      // End caps
      e.stroke(color(0, 0, 0), S3D.BOTH_CAP);
      e.fill(color(0, 0, 0), S3D.BOTH_CAP);
      e.strokeWeight(0.2f, S3D.BOTH_CAP);
      e.setTexture("ground.jpg", S3D.BOTH_CAP);
      e.drawMode(S3D.SOLID | S3D.WIRE, S3D.BOTH_CAP);
 
    e.draw();
  endShape();
/////////////////////////////////////////////////////////







//////////////////////////////////////////////////////////  
      
  stroke(255);   
  popMatrix();
  
  //text(frameRate, 50, 50);
  
 
  beginShape();
    fill(27,27,27);
    vertex(-2000, ground, -2000);
    vertex(-2000, ground, 2000);
    vertex(2000,  ground, 2000);
    vertex(2000,  ground, -2000);
  endShape(CLOSE);
  
  
     if (record) {
    endRecord();
    record = false;
  }
  ///////////////////////////////////////////////////
  
  
  saveFrame("frames/####.png");
  
  
}

void drawRightHandState(KJoint joint) {
  handState(joint.getState());
  //strokeWeight(5.0f + joint.getZ()*8);
  point(joint.getX(), joint.getY(), joint.getZ());
  
  storeRX = joint.getX()*storeMul;
  storeRY = joint.getY()*storeMul;
  storeRZ = joint.getZ()*storeMul;
}

void drawLeftHandState(KJoint joint) {
  handState(joint.getState());
  //strokeWeight(5.0f + joint.getZ()*8);
  point(joint.getX(), joint.getY(), joint.getZ());
  
  storeLX = joint.getX()*storeMul;
  storeLY = joint.getY()*storeMul;
  storeLZ = joint.getZ()*storeMul;
}

class Drop {
  float x;
  float y;
  float z;
  float speed;
  float gravity;
  float w;
  float life = 600;
  
  Drop(float tempX, float tempY, float tempZ, float tempW){
    x = tempX;
    y = tempY;
    z = tempZ;
    w = tempW;
    speed = 0;
    gravity = 0.1;
  }
  
  void moveR(){
    //Add gravity
    speed = speed - gravity;
    //Add speed to y
    y = y + speed;
    
    if(y < -270){
      speed = speed * -0.08;
      y = storeRY;
      Rgrounddrops.add(new Drop(x,-369.0,z,0));   
      
      storeRXGround = x;
      storeRYGround = -369.0;
      storeRZGround = z;     
    }
  }
  
   void moveL(){
    //Add gravity
    speed = speed - gravity;
    //Add speed to y
    y = y + speed;
    
    if(y < -270){
      speed = speed * -0.08;
      y = storeLY;
      Lgrounddrops.add(new Drop(x,-369.0,z,0)); 
         
      storeLXGround = x;
      storeLYGround = -369.0;
      storeLZGround = z;      
    }
  }
  

  boolean finished(){
    life--;
    if(life < 0){
      return true;
    }else{
      return false;}
  }
  
  void display(){
    fill(0,life);
    beginShape(POINTS);
    vertex(x,y,z);
    endShape();
  }
  
  void grounddisplay(){
    fill(0,life);
    beginShape(POINTS);
    vertex(x,-260,z); 
    endShape();
  }
}
void createPoints()
{
  for(int i = 0; i <=pointsRightHand.length-1; i++)
  {
    float angleA = random(TWO_PI); // sin()/cos() expect radians
    float angleB = random(TWO_PI);
      
    pointX = int(radius*sin(angleA)*cos(angleB));
    pointY = int(radius*sin(angleA)*sin(angleB));
    pointZ = int(radius*cos(angleA));
    
    pointsRightHand[i] = new Location (pointX, pointY, pointZ); 
    pointsLeftHand[i] = new Location (pointX, pointY, pointZ); 
       
  }
}

void calcWave() {
  // Increment theta (try different values for 'angular velocity' here
  theta += 0.05;

  // For every x value, calculate a y value with sine function
  float x = theta;
  for (int i = 0; i < yvalues.length; i++) {
    yvalues[i] = sin(x)*180+180;
    x+=dx;
    RandomAngle = yvalues[i];
   // println(RandomAngle);
  }
} 

class Location
{
  float x,y,z;
  Location(float pointX, float pointY, float pointZ)
  {
    x = pointX;
    y = pointY;
    z = pointZ;
  }
}
class Qcontour extends Contour {
  public Qcontour() {
    contour = new PVector[] {
      new PVector(13, 6.5f), 
      new PVector(13, -6.5f), 
      new PVector(6.5f, -13), 
      new PVector(-6.5f, -13), 
      new PVector(-13, -6.5f), 
      new PVector(-13, 6.5f), 
      new PVector(-6.5f, 13), 
      new PVector(5.3f, 13), 
      new PVector(11.5f, 18.5f), 
      new PVector(10, 20), 
      new PVector(-10, 20), 
      new PVector(-20, 10), 
      new PVector(-20, -10), 
      new PVector(-10, -20), 
      new PVector(10, -20), 
      new PVector(20, -10), 
      new PVector(20, 10), 
      new PVector(17.5f, 13), 
      new PVector(20, 15.5f), 
      new PVector(15.5f, 20), 
      new PVector(5.5f, 11), 
      new PVector(10, 5.5f), 
      new PVector(12, 7.7f)
      };
      for (PVector c : contour)
        c.mult(2);
  }
}
// Simple class to generate a scaling factor to apply 
// to the contour along its length
class Qscale implements ContourScale {
  // Parametric t in range 0.0 to 1.0 inclusive
  public float scale(float t) {
    float s = t - 0.5f;
    s = s * s * 2.2f + 0.2f;
    return s;
  }
}

void setup() {
  
  size(ScreenWidth, ScreenHeight, P3D);
   createPoints();//create point cloud
///////////////////////////////////////////// Kinect setup
  kinect = new KinectPV2(this);
  kinect.enableColorImg(true);
  kinect.enableSkeleton(true);
  //enable 3d Skeleton with (x,y,z) position
  kinect.enableSkeleton3dMap(true);
  kinect.init();
  smooth();
  
///////////////////////////////////////////// peasycam setup
  cam = new PeasyCam(this,1000);
  cam.setMinimumDistance(500);
  cam.setMaximumDistance(5000);
  cam.rotateZ(PI);
  cam.rotateX(1.5/PI);
  
///////////////////////////////////////////// Sine waveball setup
  w = width+16;
  dx = (TWO_PI / period) * xspacing;
  yvalues = new float[w/xspacing];

///////////////////////////////////////////// Drop setup

  Rdrops = new ArrayList<Drop>();
  Rdrops.add(new Drop(100,220,222,dropWidth));
  
  Rgrounddrops = new ArrayList<Drop>();
  Rgrounddrops.add(new Drop(0,0,0,0));
  
  Ldrops = new ArrayList<Drop>();
  Ldrops.add(new Drop(0,0,0,0));
  
  Lgrounddrops = new ArrayList<Drop>();
  Lgrounddrops.add(new Drop(0,0,0,0));
 
  BoxesR = new ArrayList<Box>();
  BoxesR.add(new Box(0,0,0));
  BoxesL = new ArrayList<Box>();
  BoxesL.add(new Box(0,0,0));
  
  vectors.add(new PVector(0,0,0));
  vectors1.add(new PVector(0,0,0));
  

  //////////////////////////////////////
  

}

void handState(int handState) {
  switch(handState) {
  case KinectPV2.HandState_Open:
    stroke(255);
    break;
  case KinectPV2.HandState_Closed:
    stroke(255);
    break;
  case KinectPV2.HandState_Lasso:
    stroke(255);
    break;
  case KinectPV2.HandState_NotTracked:
    stroke(255);
    break;
  }
}
void keyPressed()
{
  if (key == 'r') {
    record = true;
  }
  if (key == 'e'){
    record = false;
  }
}

jing_yan
Posts: 5
Joined: Fri Apr 01, 2016 2:33 pm

Re: Proj 2: Depth Project

Post by jing_yan » Sun Apr 17, 2016 5:21 pm

the SPACE between
CONSTRUCTION and DESTRUCTION


: : Jing Yan

Concept

Inspired by Antony Gormley's sculptures: I like the balance between abstract unity and individual character in his work.
56fa4d42b067b.jpg
553112f3482a7.jpg
I would like to create a virtual, real-time, abstract "sculpture" of the real-world scene, built from basic elements.

space element
I experimented with different basic elements, such as the cube and the cross, to feel their different senses of space and depth. While the 3D cross creates a graphic feeling, the cube adds more depth to the three-dimensional scene. Thus, in the final version of my project I combine both of them: I display the crosses on the surface to represent interesting detailed information, and construct cubes behind them as a skeleton or basis to create a better feeling of depth. The cubes are constructed according to the complexity of the surface.

construction and destruction
While a sculpture is mostly still, I am interested in the idea of its construction and destruction. In my view, construction is not only a moment but also the freezing of motion, of what happens over a certain span of time. In contrast to that there is destruction: something disappears over time, for some reason.
In my project, I achieve the effect of construction and destruction by recording a certain number of frames, removing the oldest frame and adding a new one when the buffer is full, and displaying them over time.
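The frame buffer described above is a fixed-size sliding window. A minimal plain-Java sketch of the idea (the `FrameWindow` class is a hypothetical stand-in; the actual sketch gets the same effect by overwriting slot `counter % numFrame` of an ArrayList):

```java
import java.util.ArrayDeque;
import java.util.Deque;

// Keep only the newest numFrame frames: adding a new frame
// ("construction") evicts the oldest one ("destruction").
public class FrameWindow {
    private final Deque<int[]> frames = new ArrayDeque<>();
    private final int numFrame;

    public FrameWindow(int numFrame) { this.numFrame = numFrame; }

    public void add(int[] frame) {
        if (frames.size() == numFrame) frames.removeFirst(); // destruction
        frames.addLast(frame);                               // construction
    }

    public int size() { return frames.size(); }
    public int[] oldest() { return frames.peekFirst(); }

    public static void main(String[] args) {
        FrameWindow window = new FrameWindow(5);
        for (int f = 0; f < 12; f++) window.add(new int[]{f});
        System.out.println(window.size()); // never exceeds 5
    }
}
```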

ScreenShot
frame-001837.png
frame-001802.png
frame-001521.png
frame-001359.png
Video Link
experiment with cube
https://youtu.be/37q3Zrgaqfs
experiment with both cube and cross
https://youtu.be/TxN5bz6yous

Code

Code: Select all

/* 2016-4-12 (Processing 3)
 
 M265 Optical-Computational Processes: Depth Project 
 
 ::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::
 :::::::: the SPACE between CONSTRUCTION and DESTRUCTION:::::::::::
 ::::::::::::::::::::::::::::::::::::::::::: code: Jing Yan :::::::
 ::::::::::::::::::: theuniqueeye@gmail.com :::::::::::::::::::::::
 ::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::
 ::::::: [VERSION 4] ::::::::::::::::::::::::::::::::::::::::::::*/

// version4: 
// 1. try different elements: box, cross
// 2. add more depth to the sculpture according to the complexity of the front

// reference: 
// inspired by - Antony Gormley's sculpture
// based on "point clouds example" - Greg.Borenstein <Making Things See> (2012.01) 


import peasy.*;
PeasyCam cam;
import SimpleOpenNI.*;
SimpleOpenNI kinect;
import processing.opengl.*;

// Cube - Sculpture - Frame of sculptures
ArrayList<ArrayList<Cube>> sculptures = new ArrayList<ArrayList<Cube>>(); 
Cube cube;
int step=15;
int maxRange=2000, minRange=400;
int numFrame=5, counter=0;
boolean isRecording=true;
//boolean isRecFrame;

boolean rotateY=false, rotateX=false;
boolean colorMode=false, saveFrame=false;
PShader lineShader;
PShape cross;

void setup() {
  size(1080, 800, OPENGL);
  cam = new PeasyCam(this, 500);

  kinect = new SimpleOpenNI(this);
  if (kinect.isInit() == false) {
    println("Can't init SimpleOpenNI."); 
    exit();
    return;
  }   
  kinect.setMirror(true);
  kinect.enableDepth();
  kinect.enableRGB();
  kinect.alternativeViewPointDepthToImage();

  // shader
  //lineShader = loadShader("linefrag.glsl", "linevert.glsl");
  //hint(DISABLE_DEPTH_MASK);

  // element: cross
  int a=30;
  cross = createShape();
  cross.beginShape(LINES);
  cross.vertex(-a/2, 0, 0);
  cross.vertex(a/2, 0, 0);
  cross.vertex(0, -a/2, 0);
  cross.vertex(0, a/2, 0);
  cross.vertex(0, 0, -a/2);
  cross.vertex(0, 0, a/2);

  //  cross.vertex(a/2, 0, 0);
  //  cross.vertex(a/2, a, 0);
  //  cross.vertex(a/2, a/2, 0);
  //  cross.vertex(a, a/2, 0);
  //  cross.vertex(0, a/2, 0);
  //  cross.vertex(a/2, a/2, 0);
  //  cross.vertex(a/2, a/2, -a/2);
  //  cross.vertex(a/2, a/2, a/2);
  cross.endShape();
}


void draw() {
  kinect.update();
  //shader(lineShader, LINES); // shader
  background(255); 
  frameRate(5);

  /*
  pushMatrix();
   PImage rgbImage= kinect.rgbImage();
   translate(-1080/2+220, -800/2+145);
   image(rgbImage, 0, 0);
   popMatrix();
   */

  stroke(0);
  noFill();
  strokeWeight(1);
  rotateX(radians(180)); 
  translate(0, 0, -700);

  PVector[] depthPoints = kinect.depthMapRealWorld(); 
  int[] depthMap=kinect.depthMap();

  int index;
  float len, transparency, weight;
  float posX, posY, posZ;

  // ::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::[ record ]::::::::

  if (isRecording) {
    //isRecFrame = (frameCount % 100 == 0); // record every 10 frames

    ArrayList<Cube> sculpture = new ArrayList<Cube>();

    for (int y=0; y<kinect.depthHeight (); y++) {  // loop every pixel from the screen
      for (int x=0; x<kinect.depthWidth (); x++) {
        index=x+y*kinect.depthWidth();
        PVector currentPoint = depthPoints[index];

        if (currentPoint.z > minRange && currentPoint.z < maxRange) { // only save the data of points within a certain distance
          posX = currentPoint.x;
          posY = currentPoint.y;
          posZ = currentPoint.z;
          step = int(map(currentPoint.z, minRange, maxRange, 5, 15)); // density between the other
          len = map(currentPoint.z, minRange, maxRange, 25, 3); // size 
          transparency = map(currentPoint.z, minRange, maxRange, 80, 230); // transparency of stroke
          weight = map(currentPoint.z, minRange, maxRange, 2, 0.3); // strokeWeight

          cube = new Cube(step, len, transparency, weight, posX, posY, posZ); // [ create a new cube ]
        } else {
          cube = new Cube(0, 0, 0, 0, 0, 0, 0);
        }
        sculpture.add(cube); // [ add cubes from every pixel of the screen into a sculpture ]
      }
    }

    if (sculptures.size()<numFrame) // [ add sculptures from a certain duration of frames into sculptures ]
      sculptures.add(sculpture);
    else {
      sculptures.set(counter%numFrame, sculpture); // update the old frame with new data, keep the number of frame at a certain range
      //println("counter%numFrame : " + counter%numFrame);
    }
    counter++; // count frame
  }


  // ::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::[ display ]::::::::

  for (int i=0; i<sculptures.size (); i++) { // display sculpture from each frame with the time range
    ArrayList<Cube> currentSculpture = sculptures.get(i);
    Cube currentCube = currentSculpture.get(0);
    int countBox=0;

    for (int y=0; y<kinect.depthHeight (); y+=10) {
      for (int x=0; x<kinect.depthWidth (); x+=10) {
        index=x+y*kinect.depthWidth();
        currentCube = currentSculpture.get(index);
        pushMatrix();
        translate(currentCube.posX, currentCube.posY, currentCube.posZ);
        stroke(0, currentCube.transparency);
        strokeWeight(currentCube.weight);
        //randomness(x,y,len); // add randomness

        if (currentCube.len!=0) { // only draw the element within certain distance range

          //box(currentCube.len, currentCube.len, currentCube.len); // [BOX-level1]
          if (x%2 ==0 && y%2 ==0) shape(cross, 0, 0); // [CROSS-level1]

          // add depth to the sculpture according to the complexity of the front 

          int lenBox1=30, lenBox2=60, gap=10; // for CROSS
          float strokeBox=0.7; // for CROSS
          //int lenBox1=60, lenBox2=120, gap=20;  // for BOX
          //float strokeBox=0.5; // for BOX

          if (currentCube.posZ <(minRange+maxRange)/3) { // [BOX-level2]
            countBox++;
            if (countBox>300 && x%20 ==0 && y%20 ==0) {
              translate(0, 0, lenBox1+gap);
              strokeWeight(strokeBox);
              box(lenBox1);
            }
            if (countBox>600 && x%30 ==0 && y%30 ==0) { // [BOX-level3]
              translate(0, 0, lenBox1+lenBox2+gap);
              strokeWeight(strokeBox);
              box(lenBox2);
            }
          }
        }
        popMatrix();
      }
    }
  }
  if (rotateY) cam.rotateY(radians(1)*0.8);
  if (rotateX) cam.rotateX(radians(1)*0.8);
  if (saveFrame) saveFrame("frame/frame-######.png");
}

Code: Select all

void keyPressed() {
  if (key == ' ') { 
    isRecording = !isRecording;
  }
  if (key == '1') {
    rotateY = !rotateY;
  }
  if (key == '2') {
    rotateX = !rotateX;
  }
  if (key == 'c' || key== 'C') {
    colorMode = !colorMode;
  }
  if (key == 's' || key == 'S') {  
    saveFrame=!saveFrame;
  }
}

Code: Select all

class Cube { 
  public int step;
  public float len, transparency, weight;
  public float posX, posY, posZ;
  //boolean ifDraw;
  //public PVector position;
  
  Cube(int step, float len, float transparency, float weight, float posX, float posY, float posZ){ // constructor
    this.step = step;
    this.len = len;
    this.transparency = transparency;
    this.weight = weight;
    this.posX = posX;
    this.posY = posY;
    this.posZ = posZ;
    //this.position = position;
  }
}
References
based on "point clouds example" - Greg.Borenstein <Making Things See> (2012.01)
(http://lizarum.com/assignments/physical ... intro.html)
(http://stackoverflow.com/questions/9863 ... rld-method)

qiu0717
Posts: 9
Joined: Wed Jan 06, 2016 1:44 pm

Re: Proj 2: Depth Project

Post by qiu0717 » Mon Apr 18, 2016 7:07 pm

FRUSTUM DIVISION
Spatial Interaction Project
Weihao Qiu



Concept:

What happens in a space when we move through it? What effects do we have on the space while we are inside it? These effects are real, but unfortunately they are invisible. My project proposes a way to visualize what we do to our space and to generate a form that is the accumulation of our actions within it.

The main idea is that a space without any user inside is intact and complete; in contrast, a space is made turbulent and broken into pieces by a user entering it. If we treat the space that a Kinect can detect as a frustum, then, most intuitively, one of the effects a person inside it has is to divide the frustum into three sub-frustums: the front, left, and right ones.

Inspired by Antony Gormley's project EXPANSION FIELD, which divides bodies into parts, presents each part as a cube, and expands each cube so that it overlaps with the others, I present each part of the original detected space as a sub-frustum and enlarge all the sub-frustums so that they overlap with each other, creating a unique form.

By recursively dividing the original frustum and its sub-frustums (a sub-frustum can itself be divided into sub-frustums by user interaction), an intensely overlapped and complicated form is generated.
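The recursive division can be sketched in one dimension in plain Java. The `divide` helper and the 1-D intervals are hypothetical stand-ins for the frustums; the padding mimics the enlargement that makes sub-frustums overlap:

```java
import java.util.ArrayList;
import java.util.List;

public class Division {
    // Split every interval containing the user's x into three enlarged,
    // overlapping children; intervals without the user are kept as-is.
    public static List<double[]> divide(List<double[]> cells, double userX) {
        List<double[]> next = new ArrayList<>();
        for (double[] c : cells) {
            if (userX > c[0] && userX < c[1]) {
                double third = (c[1] - c[0]) / 3.0;
                double pad = third * 0.2; // enlargement -> overlap
                for (int k = 0; k < 3; k++)
                    next.add(new double[]{c[0] + k * third - pad,
                                          c[0] + (k + 1) * third + pad});
            } else {
                next.add(c); // untouched cells survive unchanged
            }
        }
        return next;
    }

    public static void main(String[] args) {
        List<double[]> cells = new ArrayList<>();
        cells.add(new double[]{0, 100}); // the whole detected "frustum"
        for (int step = 0; step < 3; step++)
            cells = divide(cells, 30); // user standing at x = 30
        System.out.println(cells.size()); // more cells after each division
    }
}
```

Because the children overlap, the user can fall inside several cells at once, so each round of division multiplies the cells around the user, which is what produces the dense, complicated final form.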
2.jpg
Expansion Field
1.jpg
Expansion Field 2
3.jpeg
Spacial Division Illustration
Sketches:
IMG_4447副本.jpg
Sketch 1 Different ways of spatial division
IMG_4452.jpg
Sketch 2 Computation of coordinates

Results:
001000.jpg
When a user enters the space
001383.jpg
When a user enters the space 2
000563.jpg
Frustum is divided by position of the user
The sub-frustums with a green stroke are the ones the user is currently inside. Because of the overlapping areas, a user can be inside several frustums at once.
The ones with a red stroke are sub-frustums newly generated through spatial division.
This division happens once every few seconds.
004285.jpg
The final form after several times of recursive division
003318.jpg
The final form after several times of recursive division 2
004918.jpg
The final form after several times of recursive division 3
003010.jpg
The final form after several times of recursive division 4
The users' movement trail is documented by a red curve.

*Interaction:
-Space: Stop/Start the automatic rotation.
-D/d: Enable/Disable the space division.
-T/t: Show/Hide the user trails.
-H/h: Show/Hide the interface.
-S/s: Save a screenshot.

Code:

1. mat265_prj2_withKinect_2.pde

Code: Select all

import peasy.*;
import SimpleOpenNI.*;

PeasyCam cam;
SimpleOpenNI  context;

int w, l, h, maxDistance;
ArrayList<Unit> allUnits;

PVector userLocation;
ArrayList<PVector> userLocations;

int fc, T;
float trailsAlpha;
boolean shouldStop, showTrails, shouldDivide, showGUI;

Button shouldStopButton, showTrailsButton, shouldDivideButton;
PFont font;
void setup() {
  size(1920, 1200, P3D);

  maxDistance = 4000;
  cam = new PeasyCam(this, 600);
  cam.setMinimumDistance(0);
  cam.setMaximumDistance(maxDistance);
  cam.setWheelScale(0.2);


  // Unit Initialization
  //
  allUnits = new ArrayList<Unit>();
  userLocations = new ArrayList<PVector>();
  w =200;
  l = 125;
  h = 500;  
  Unit u = new Unit(new PVector(0, 0, 0), l, w, h);
  u.initItp(u);
  allUnits.add(u);


  // Kinect Setup
  //
  context = new SimpleOpenNI(this);
  context.enableDepth();
  context.enableUser();
  if (context.isInit() == false)
  {
    println("Can't init SimpleOpenNI, maybe the camera is not connected!"); 
    exit();
    return;
  }

  fc = 0;
  T = 35*40;
  trailsAlpha = 10;

  shouldStop = true;
  showTrails = true;
  shouldDivide = false;
  showGUI = true;


  float buttonSize = 100;
  float buttonX1 = 3.8*width/9;
  float buttonX2 = 4.5*width/9;
  float buttonX3 = 5.2*width/9;
  float buttonY = height-height/8;

  shouldStopButton = new Button("rectangle", buttonX1, buttonY, buttonSize, buttonSize/10, shouldStop, 75, 75, 20, 175);
  shouldDivideButton = new Button("rectangle", buttonX2, buttonY, buttonSize, buttonSize/10, shouldDivide, 75, 75, 20, 175);
  showTrailsButton = new Button("rectangle", buttonX3, buttonY, buttonSize, buttonSize/10, showTrails, 75, 75, 20, 175);

  font = loadFont("Futura-Medium-24.vlw");
  smooth();
}
void draw() {
  // ----------------  START Background and Camera settings  ----------------
  if ( frameCount % 200 > 100 ) {
    background(255 - (frameCount%200-100)/5.f);
    trailsAlpha = 200 -  (frameCount%200-100)*1.5;
  } else {
    background(235 + frameCount%200/5.f);
    trailsAlpha = 50 + (frameCount%200)*1.5;
  }

  if (!shouldStop) {
    fc++;
    fc = fc % T;
  }
  cameraMove(fc);
  // ----------------  END Background and Camera settings  ----------------



  // ----------------  START Kinect Part  ----------------
  context.update();

  int[] u = context.userMap();
  int[] d = context.depthMap();
  float sumD = 0;
  int count = 0;
  float sumX = 0, sumY = 0;

  for (int i =0; i<u.length; i++) {
    if (u[i] != 0) {
      count ++;
      sumD = sumD + d[i];

      float xx, yy;
      xx= i % 640;
      yy= i / 640;
      sumX = sumX + xx;
      sumY = sumY + yy;
    }
  }
  float userX, userY, userZ;
  userZ = map(sumD/count, 0, maxDistance, h, 0);
  userX = map(sumX/count, 0, 640, -userZ/h*w/2, userZ/h*w/2);
  userY = map(sumY/count, 0, 480, -userZ/h*l/2, userZ/h*l/2);
  userLocation = new PVector(userX, userY, userZ);
  //  println(userLocation);
  fill(0);
  pushMatrix();
  translate(userLocation.x, userLocation.y, userLocation.z);
  noStroke();
  box(4);
  popMatrix();

  for (int i =0; i<u.length; ) {
    if (u[i] != 0) {
      pushMatrix();
      translate(
      map(i % 640, 0, 640, -userZ/h*l/2, userZ/h*l/2), 
      map(i / 640, 0, 480, -userZ/h*w/2, userZ/h*w/2), 
      userLocation.z);
      stroke(1);
      point(0, 0);
      popMatrix();
    }
    i=i+20;
  }
  // ----------------  END Kinect Part  ----------------



  // ----------------  START Draw Shapes  ----------------

  // Draw every unit
  //
  for (int i = 0; i<allUnits.size (); i++) {
    allUnits.get(i).drawUnit();
    allUnits.get(i).isInside(userLocation);
  }

  // Draw user trails
  //
  if (showTrails) {
    drawUserTrails(userLocations, #BA261A, trailsAlpha, 3);
  }


  // Once every 40 frames (about one second), a unit is split,
  // and the user's location is documented.
  //
  if (frameCount%40 == 1 && shouldDivide) {

    // Document the user's location
    //
    if (!Float.isNaN(userLocation.x)) userLocations.add(userLocation);

    for (int i = 0; i<allUnits.size (); i++) {
      Unit temp = allUnits.get(i);
      if (temp.isInside(userLocation)) {
        temp.splitBy(userLocation);
        allUnits.remove(i);
        for (int j = 0; j < 3; j++) {
          temp.Children.get(j).initItp(temp);
          allUnits.add(temp.Children.get(j));
        }
        break;
      }
    }
  }
  // ----------------  END Draw Shapes  ----------------
  cam.beginHUD();
  if (showGUI) 
    gui();
  //  drawButtons();
  cam.endHUD();
}


void cameraMove(int fc) {
  if (fc < 5*40) {
    rotateX(fc*(PI/200));
  } else if (fc < 10*40 ) {
    rotateY((fc-5*40)*(1.5*PI/200));
  } else if (fc < 15*40 ) {
    rotateY(200*(1.5*PI/200)+(fc- 10*40)*0.5*PI/200);
    translate(0, 0, -(fc- 10*40)*(1000/200.f));
  } else if (fc < 20*40 ) {
    rotateY(PI/2);
    rotateX(PI/8);
    rotateZ(PI/200*(fc - 15*40));
    translate(-200, (fc- 17.5*40)*(750/200.f)*0.4-100, -(fc- 17.5*40)*(750/200.f)*0.8);
  } else if (fc < 25*40) {
    rotateX(PI/2);
    translate(0, 0, -(fc- 23.5*40)*(750/200.f)*0.8);
    //    rotateZ(PI/200*(fc-25*40));
  } else if (fc < 30*40) {
    rotateX(PI/2);
    translate(0, 0, -(25*40- 23.5*40)*(750/200.f)*0.8);
    rotateZ(PI/200*(fc-25*40));
  } else {
    rotateX(PI/2);
    translate(0, 0, -(25*40- 23.5*40)*(750/200.f)*0.8);
    rotateZ(PI);
  }
}

void drawUserTrails(ArrayList<PVector> userLocations, color clr, float alpha, float sw) {
  beginShape();
  strokeWeight(sw);
  stroke(clr, alpha);
  if (userLocations.size()>0) {
    PVector tp = userLocations.get(0);
    curveVertex(tp.x, tp.y, tp.z);
  }
  for (int i = 0; i<userLocations.size (); i++) {
    PVector tp = userLocations.get(i);
    curveVertex(tp.x, tp.y, tp.z);
    //    allUnits.get(i).isInside(userLocation);
  }
  if (userLocations.size()>1) {
    PVector tp = userLocations.get(userLocations.size()-1);
    curveVertex(tp.x, tp.y, tp.z);
  }
  strokeWeight(1);
  endShape();
}

void gui() {

  shouldStopButton.draw();
  showTrailsButton.draw();
  shouldDivideButton.draw();

  fill(20);
  textAlign(CENTER, TOP);
  textFont(font, height/100);
  text("Rotation", shouldStopButton.x, shouldStopButton.y+shouldStopButton.h);
  text("Divide", shouldDivideButton.x, shouldDivideButton.y+shouldDivideButton.h);
  text("Trails", showTrailsButton.x, showTrailsButton.y+showTrailsButton.h);
  textFont(font, height/75);
  text("FRUSTUM  DIVISION", width/2, shouldDivideButton.y-4*shouldDivideButton.h);
  //  showTrailsButton, 
  //  shouldDivideButton, 
  //  showGUIButton;
}
void keyPressed() {
  if (key == ' ') {
    shouldStop = !shouldStop;
  }
  if (key == 'd' || key == 'D') {
    shouldDivide = !shouldDivide;
  }
  if (key == 'T' || key == 't') {
    showTrails = !showTrails;
  }
  if (key == 'H' || key == 'h') {
    showGUI = !showGUI;
  }
  if (key == 'S' || key == 's') {
    saveFrame("######.jpg");
  }
}

void mousePressed() {
  if (shouldStopButton.mouseOnButton()) {
    shouldStop = !shouldStop;
    println("shouldStopButton"+ shouldStopButton.status);
    shouldStopButton.status = shouldStop;
  }
  if (showTrailsButton.mouseOnButton()) {
    showTrails = !showTrails;
    println("showTrailsButton" + showTrails);
    showTrailsButton.status = showTrails;
  }
  if (shouldDivideButton.mouseOnButton()) {
    shouldDivide = !shouldDivide;
    println("shouldDivideButton"+shouldDivide );
    shouldDivideButton.status = shouldDivide;
  }
}

2. Button.pde

Code: Select all

class Button {
  String rectOrCircle;
  float x;
  float y;
  float w;
  float h;
  boolean status;
  color pressedColor;
  color hoverColor;
  color onColor;
  color offColor;
  //PImage activeImg, hoverImg, inactiveImg;

  Button (String _rectOrCircle, 
    float _x, 
    float _y, 
    float _w, 
    float _h, 
    boolean _status, 
    color _pressedColor, 
    color _hoverColor, 
    color _onColor, 
    color _offColor) 
  {
    rectOrCircle = _rectOrCircle;
    x = _x;
    y = _y;
    w = _w;
    h = _h;
    status = _status;
    pressedColor = _pressedColor;
    hoverColor = _hoverColor;
    onColor = _onColor;
    offColor = _offColor;
  }

  void draw() {
    noStroke();
    if (this.mouseOnButton()) {
      if (mousePressed) fill(pressedColor);
      else fill(hoverColor);
    } else {
      if (status) fill (onColor);
      else fill(offColor);
    }
    if (rectOrCircle.equals("rectangle")) {
      rectMode(CENTER);
      rect(x, y, w, h);
    } else if (rectOrCircle.equals("circle"))
    {
      ellipse(x, y, w, h);
    }
  }

  //  Button setActiveImg (PImage aImg) {
  //    activeImg = aImg;
  //    return this;
  //  }
  //  Button setHoverImg (PImage hImg) {
  //    hoverImg = hImg;
  //    return this;
  //  }
  //  Button setInactiveImg (PImage iImg) {
  //    activeImg = iImg;
  //    return this;
  //  }

  boolean mouseOnButton() {
    return (between(mouseX, x-w/2, x+w/2) && between(mouseY, y-h/2, y+h/2));
  }

  void setStatus(boolean _status) {
    status = _status;
  }
  void changeStatus() {
    status = !status;
  }
}

boolean between(float a, float b, float c ) {
  return ((a>=b)&&(a<=c)) || ((a>=c)&&(a<=b));
}
3. Integrator.pde

Code: Select all

  /*

Original Author: Ben Fry
From book Visualizing Data

*/

class Integrator {

  final float DAMPING = 0.5f;
  final float ATTRACTION = 0.2f;

  float value;
  float vel;
  float accel;
  float force;
  float mass = 1;

  float damping = DAMPING;
  float attraction = ATTRACTION;
  boolean targeting;
  float target;


  Integrator() { }


  Integrator(float value) {
    this.value = value;
  }


  Integrator(float value, float damping, float attraction) {
    this.value = value;
    this.damping = damping;
    this.attraction = attraction;
  }


  void set(float v) {
    value = v;
  }


  void update() {
    if (targeting) {
      force += attraction * (target - value);      
    }

    accel = force / mass;
    vel = (vel + accel) * damping;
    value += vel;

    force = 0;
  }


  void target(float t) {
    targeting = true;
    target = t;
  }


  void noTarget() {
    targeting = false;
  }
}
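To illustrate how the integrator behaves, here is a minimal standalone Java version of the same update rule (mass fixed at 1, the default damping and attraction from the class above); after enough updates the value settles on its target:

```java
public class IntegratorDemo {
    static float value = 0, vel = 0, force = 0;
    static final float DAMPING = 0.5f, ATTRACTION = 0.2f;
    static float target = 100;

    // One step of the same physics used in Integrator.update():
    // a spring-like pull toward the target, with damped velocity.
    static void update() {
        force += ATTRACTION * (target - value);
        float accel = force / 1.0f; // mass = 1
        vel = (vel + accel) * DAMPING;
        value += vel;
        force = 0;                  // forces do not accumulate across frames
    }

    public static void main(String[] args) {
        for (int i = 0; i < 200; i++) update();
        System.out.println(Math.round(value));
    }
}
```

Because damping is below 1, the oscillation around the target dies out and the value converges; this is why the unit sizes in the sketch ease smoothly toward their new dimensions after each split.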
4. unit.pde

Code: Select all

class Unit {
  float l, w, h, a;
  PVector pv;
  Unit Parent;
  ArrayList<Unit> Children;
  Integrator itpL, itpW, itpH, itpX, itpY, itpZ, alpha;
  boolean userInside;

  Unit(PVector _pv, float _l, float _w, float _h) {
    pv = _pv;
    l = _l;
    w = _w;
    h = _h;
    Parent = null;
    Children = new ArrayList<Unit>();
    a = 30;
    userInside = false;
  }

  void initItp( Unit fu) {
    float p1 = 0.6;
    float p2 = 0.05;
    itpL = new Integrator(fu.l, p1, p2);
    itpW = new Integrator(fu.w, p1, p2);
    itpH = new Integrator(fu.h, p1, p2);
    itpX = new Integrator(fu.pv.x, p1, p2);
    itpY = new Integrator(fu.pv.y, p1, p2);
    itpZ = new Integrator(fu.pv.z, p1, p2);
    alpha = new Integrator(fu.a, p1, p2);
    itpL.target(l);
    itpW.target(w);
    itpH.target(h);
    itpX.target(pv.x);
    itpY.target(pv.y);
    itpZ.target(pv.z);
  }
  void drawUnit() {
    itpL.update();
    itpW.update();
    itpH.update();
    itpX.update();
    itpY.update();
    itpZ.update();
    alpha.update();
    a = alpha.value;
    if (frameCount % 10 == 0 ) alpha.target(random(10, 200));
    pushMatrix();
    //translate(pv.x, pv.y, pv.z);
    translate(itpX.value, itpY.value, itpZ.value);
    //box(5);
    stroke(0, 20);
    rectMode(CENTER);
    //fill(0,5);
    strokeWeight(0.5);
    stroke(0, 100);
    noFill();
    //rect(0, 0, 2*w, 2*l);
    rect(0, 0, 2*itpW.value, 2*itpL.value);
    strokeWeight(5);   
    stroke(0, 200); 
    point(0, 0, h);
    strokeWeight(1);

    if (userInside) {
      stroke(#007615, 200);
    } else {
      if ((itpL.value - l)<0.1)
        stroke(0, a);
      else stroke(#BA261A, 200);
    }


    beginShape();

    vertex(-itpW.value, itpL.value, 0);
    vertex(-itpW.value, -itpL.value, 0);
    vertex(0, 0, itpH.value);
    vertex(-itpW.value, itpL.value, 0);
    vertex(itpW.value, itpL.value, 0);
    vertex(itpW.value, -itpL.value, 0);
    vertex(0, 0, itpH.value);
    vertex(itpW.value, itpL.value, 0);
    vertex(itpW.value, -itpL.value, 0);
    vertex(-itpW.value, -itpL.value, 0);
    endShape();
    popMatrix();
  }

  void splitBy(PVector sp) {
    float[] offset = new float[3];
    offset[0] = 1;
    offset[1] = 1;
    offset[2] = 1;

    PVector asp1 = new PVector(pv.x-1, sp.y, sp.z);
    float ratio1 = 1-(abs(pv.z-sp.z)/h);
    Unit c1 = new Unit(asp1, l*ratio1, w*ratio1, h*ratio1);

    float dX = sp.x-pv.x;
    float dZ = sp.z-pv.z;
    PVector asp2 = new PVector();
    asp2.x = (2*h*dX+dZ*w-w*h)/(4*h);
    asp2.z = pv.z;
    asp2.y = pv.y;
    float h2 = (2*h*dX+dZ*w+w*h)/(2*w);
    float ratio2 = (2*h*dX+dZ*w+w*h)/(2*h*w);
    Unit c2 = new Unit(asp2, l*ratio2, w*ratio2, h*ratio2);

    PVector asp3 = new PVector();
    asp3.x = (w*h+2*h*dX-dZ*w)/(4*h);
    asp3.z = pv.z-1;
    asp3.y = sp.y;
    float h3 = (2*h*dX+dZ*w+w*h)/(2*w);
    float ratio3 = (w*h-2*h*dX+dZ*w)/(2*h*w);
    Unit c3 = new Unit(asp3, l*ratio3, w*ratio3, h*ratio3);

    Children.add(c1);
    Children.add(c2);
    Children.add(c3);
  }
  boolean isInside(PVector sp) {
    float dX, dY, dZ;
    
    dX = sp.x - itpX.value;
    dY = sp.y - itpY.value;
    dZ = sp.z - itpZ.value;    

    float ratio = dZ/h;
    if (isBetween(dZ, 0, h)) {
      if (isBetween(dX, -l/2*ratio, l/2*ratio) && isBetween(dY, -w/2*ratio, w/2*ratio)) {
        userInside = true;
        return true;
      }
    }
    userInside = false;
    return false;
  }
}

boolean isBetween(float a, float b, float c) {
  return (a>=b)&&(a<=c);
}
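The isInside() test scales the frustum's cross-section linearly with depth, so an x/y offset that falls outside the frustum near the apex fits easily near the base. A standalone sketch of the same check, using the sketch's unit dimensions (l = 125, w = 200, h = 500) for illustration:

```java
public class FrustumTest {
    // Same idea as Unit.isInside(): a point is inside when its depth offset
    // lies in [0, h] and its x/y offsets fall within a cross-section that
    // grows linearly with depth.
    static boolean isInside(float dX, float dY, float dZ,
                            float l, float w, float h) {
        if (dZ < 0 || dZ > h) return false;
        float ratio = dZ / h; // cross-section scale at this depth
        return Math.abs(dX) <= l / 2 * ratio && Math.abs(dY) <= w / 2 * ratio;
    }

    public static void main(String[] args) {
        // Near the apex the cross-section is tiny...
        System.out.println(isInside(10, 10, 1, 125, 200, 500));
        // ...while near the base the same x/y offset fits easily.
        System.out.println(isInside(10, 10, 499, 125, 200, 500));
    }
}
```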


Reference:

Antony Gormley, EXPANSION FIELD, 2014 - 2015, http://www.antonygormley.com/projects/i ... /id/307#p0
Last edited by qiu0717 on Mon Apr 18, 2016 11:31 pm, edited 2 times in total.

zhenyuyang
Posts: 9
Joined: Fri Apr 01, 2016 2:34 pm

Re: Proj 2: Depth Project

Post by zhenyuyang » Mon Apr 18, 2016 10:07 pm

Relief sculptures - Zhenyu Yang


Concept
This project involves two types of depth: detailed depth and overall depth. Detailed depth appears on the surfaces of the 3D models captured in real time, which show the user's posture, motion, and even expression. Overall depth is reflected in the positions of the sculptures, which are mapped directly from the user's real positions.

The scene starts as an empty space with a wood floor. When a user walks into this room, his postures are recorded by a Kinect and built into a 3D relief sculpture in the virtual space. Once a relief sculpture is created, it stays in the same position on the wood floor. These sculptures also give a sense of time in space: each model of the user is static, as if time were frozen at an instant. Moreover, a user can walk back to his sculptures and look at himself, or in other words, look into the past.

Sculptures in this project are built as mesh models, so they can be further manufactured by 3D printing.
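The sculpture mesh is built by walking the sampled depth grid and emitting up to two triangles per grid cell, as updateUserPoses() does in the code below. As a rough standalone sketch of the size of that mesh (grid dimensions and sampling step are the Kinect defaults used in the code; cells that fail the depth-validity checks are skipped in the real program, so this is an upper bound):

```java
public class GridMesh {
    // Counts the maximum number of triangles produced by splitting each
    // sampled grid cell of a w x h depth map into two triangles.
    static int triangleCount(int w, int h, int steps) {
        int count = 0;
        for (int y = 0; y < h - steps; y += steps)
            for (int x = 0; x < w - steps; x += steps)
                count += 2; // one quad cell -> two triangles
        return count;
    }

    public static void main(String[] args) {
        // A 640x480 depth map sampled every 4 pixels:
        System.out.println(triangleCount(640, 480, 4));
    }
}
```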

Progress
- Mapping user's position to positions in the virtual space [Done]
- Constructing a filter to process depth data from Kinect [Done]
- Constructing user's sculptures by using point cloud models [Done]
- Constructing user's sculptures by using mesh models [Done]
- Clean up noise data in mesh models [Done]
- Calibrating the distance range to model user's sculptures precisely [Done]
- Adjusting lighting [Done]
- Setting up a user system to store sculpture mesh models[Done]

Interaction
Press P to stop drawing new sculptures.


Screenshots
1.png
2.png
3.png
4.png
5.png

Code

Code: Select all


//  This is a project on the depth concept,
//  made for the MAT265 course taught by Prof. George Legrady in Spring 2016.
// 
//  This program only runs in Processing 2, since some of its libraries are not supported in other versions of Processing. 
//  Please connect a Kinect or Xtion before running this program.
//
//  Author: Zhenyu Yang
//  Date: 04/12/2016
//
//  Thanks for the help from Hilda Chang HE and the guidance of Prof. George Legrady.
//
//  Interaction: Press the P key to stop drawing new sculptures.


import peasy.*;
import SimpleOpenNI.*;
import java.util.Iterator;
PeasyCam cam;

SimpleOpenNI context;

boolean photo = true;
boolean ifSculpture = true;
int         steps           = 4; // draw only every fourth point, to speed up rendering

PImage img;
PImage ground;
PImage maskImage;
// width, height are given as window sizes
int screen_width = 1024; 
int screen_height = 700; 
float depth = 500;
float camDepth = 2000;

// Distance between the camera and the screen in the virtual space.
float camera_screen_distance = 4500; //500

//Boundary and initial values 
float left = -100;
float right = 100;
float top = -100;
float bottom = 100;
float back = 50;
float front = -500;

// SIZES
int canvasWidth  = 640;
int canvasHeight = 480;

int kinectWidth  = 640;
int kinectHeight = 480;

int[] depthValues;

int minDistance  = 500;  // 50cm
int maxDistance  = 1500; // 1.5m
int distanceThreshould = 250;

float groundScale = 0.8;

int a=0;
int bb =0;
int c = 0;

// Initialize OpenNI or not (for debugging).
boolean initialize_openni = true;
// Show hand positions (for debugging).
boolean show_Com_vectors = true;
boolean draw_boundary = true;
boolean draw_path = true;

// User tracking information.
HashMap<Integer, UserTrackingInfo> user_tracking_info = new HashMap<Integer, UserTrackingInfo>();

//avoid concurency modification
Object lock = new Object();

// Previous update time.
double previous_update_time;

//User location
PVector com;

//initiate time
float time;
int fadeSpeed = 100; // larger -> faster
int alphaThreshold = 80;//transparency

//distance measurement
float distance = 0;

//parameters for sculptures
float       strokeW         = 0.6;

PVector   s_rwp = new PVector(); // standardized realWorldPoint;
int       kdh;
int       kdw;
int       max_edge_len = 50000; //50
float     strokeWgt = 0;//0.4
int       i00, i01, i10, i11; // indices
PVector   p00, p10, p01, p11; // points
PVector   k_rwp; // kinect realWorldPoint;

PImage marble;
PShape square;

int red = 225;
int green = 225;
int blue = 255;

void setup() {
  ground = loadImage("woods.jpg");
  marble = loadImage("marble.jpg");
  maskImage  = createImage(kinectWidth, kinectHeight, RGB);

  if (ifSculpture) {
    distanceThreshould = 200;
    cam = new PeasyCam(this, width / 2, height / 2, -depth - camera_screen_distance, back + camDepth);
  } else {
    distanceThreshould = 25;
    cam = new PeasyCam(this, 1200);
  }

  // Initialize screen.
  // OpenGL rendering.
  size(screen_width, screen_height, P3D);

  // Initialize OpenNI.
  if (initialize_openni) {
    context = new SimpleOpenNI(this);
    if (context.isInit() == false) {
      println("Can't init SimpleOpenNI, maybe the camera is not connected!");
      exit();
      return;
    }
  } else {
    context = null;
  }
  if (context != null) {
    // Disable mirror.
    context.setMirror(true);
    // Seems it stops randomly crashing if we enable depth.
    context.enableDepth();
    // Enable user tracking.
    context.enableUser();
    context.enableRGB();
    context.update();
    com = new PVector(); // user location
  }
  //peasyCam
  //set frameRate
  frameRate(20);
  //set time
  time = 0.00;
  //hint(DISABLE_DEPTH_TEST);
  kdh = context.depthHeight();
  kdw = context.depthWidth();

  square = createShape(BOX, kdh, kdw, 50);
  square.scale(0.5); //0.1+1.0/(2*p.z));
  //square.setTexture(marble);
}

// Translate OpenNI space into image space.
// GL: translates the sensing distance from /real space sensing to virtual space
PVector translateOpenNIPosition(PVector position) {
  float x = -position.x;
  float y = -position.y;
  float z = position.z;// (position.z - 1000);
  return new PVector(x, y, z);
}

// Update hand positions.
// We don't actually use this, but this function could be useful.
void updateHandPoses() {
  // Calculate dt (in seconds).
  double current_time = millis();
  float dt = (float)((current_time - previous_update_time) / 1000.0);
  previous_update_time = current_time;

  // Read OpenNI data and update user hand positions and velocities.
  if (context != null) {
    // Get forces.
    int[] user_list = context.getUsers();
    for (int i = 0; i < user_list.length; i++) {
      if (context.isTrackingSkeleton(user_list[i])) {
        int user_id = user_list[i];
        PVector left_hand_position = new PVector();
        PVector right_hand_position = new PVector();
        float left_hand_confidence = context.getJointPositionSkeleton(user_id, SimpleOpenNI.SKEL_LEFT_HAND, left_hand_position);
        float right_hand_confidence = context.getJointPositionSkeleton(user_id, SimpleOpenNI.SKEL_RIGHT_HAND, right_hand_position);
        left_hand_position = translateOpenNIPosition(left_hand_position);
        right_hand_position = translateOpenNIPosition(right_hand_position);
        UserTrackingInfo info = user_tracking_info.get(user_id);
      }
    }
  }
}
//Update user position 
void updateUserPoses() {
  if (context != null) {
    int[] user_list = context.getUsers();

    for (int i = 0; i < user_list.length; i++) {
      //println("test for distance use " + i +":  "+ user_list[i]);
      if (context.getCoM(user_list[i], com)) {
        int user_id = user_list[i];
        if (com != null && user_tracking_info.get(user_id) != null&&photo) {
          if (ifSculpture) {
            PVector[] depthPoints = context.depthMapRealWorld(); 
            float thres = 200;
            //float thres = com.z+distanceThreshould;
            if (strokeWgt == 0) 
              noStroke();
            else strokeWeight(strokeWgt);
            for (int y = 0; y < kdh; y+=steps) {
              for (int x = 0; x < kdw; x+= steps) { 
                int counter = y * kdw + x;
                PVector p = depthPoints[counter];
                // if the point is on the edge or if it has no depth
                if (p.z < com.z - thres || p.z > com.z+thres || y == 0 || y == kdh - steps || x == 0 || x == kdw - steps) {
                  // replace it with a point at the depth of the backplane (i.e. maxZ)
                  PVector realWorld = new PVector();
                  //PVector projective = new PVector(x*(thres*1.0/p.z), y*(thres*1.0/p.z), thres);
                  PVector projective = new PVector(x, y, com.z+thres);
                  // to get the point in the right place, we need to translate
                  // from x/y to realworld coordinates to match our other points:
                  context.convertProjectiveToRealWorld(projective, realWorld);
                  depthPoints[counter] = realWorld;
                  //depthPoints[i] = projective;
                }
              }
            }
            PShape tempShape  = createShape();
            tempShape.beginShape(TRIANGLES);  

            for (int y=0; y < kdh-steps; y+=steps) {
              int y_steps_kdw = (y+steps)*kdw;
              int y_kdw = y * kdw;
              for (int x=0; x < kdw-steps; x+=steps)
              {
                i00 = x + y_kdw;
                i01 = x + y_steps_kdw;
                i10 = (x + steps) + y_kdw;
                i11 = (x + steps) + y_steps_kdw;
                p00 = depthPoints[i00];
                p01 = depthPoints[i01];
                p10 = depthPoints[i10];
                p11 = depthPoints[i11];
                if ((p00.z > 0) && (p01.z > 0) && (p10.z > 0) && // check for non valid values
                (abs(p00.z-p01.z) < max_edge_len) && (abs(p10.z-p01.z) < max_edge_len)) { // check for edge length
                  tempShape.vertex(p00.x, p00.y, p00.z, x, y); // x,y,x,u,v   position + texture reference
                  tempShape.vertex(p01.x, p01.y, p01.z, x, y+steps);
                  tempShape.vertex(p10.x, p10.y, p10.z, x+steps, y);
                }

                if ((p11.z > 0) && (p01.z > 0) && (p10.z > 0) &&(abs(p11.z-p01.z) < max_edge_len) && (abs(p10.z-p01.z) < max_edge_len)) {
                  tempShape.vertex(p01.x, p01.y, p01.z, x, y+steps);
                  tempShape.vertex(p11.x, p11.y, p11.z, x+steps, y+steps);
                  tempShape.vertex(p10.x, p10.y, p10.z, x+steps, y);
                }
                // endShape();
              }
            }
            tempShape.scale(0.1+1.0/(2*com.z)); 
            tempShape.endShape();
            user_tracking_info.get(user_id).update(new PVector(com.x, com.y, com.z), tempShape, (random(PI)-PI/2.0));
          } else {
            depthValues = context.depthMap();
            maskImage.loadPixels();
            for (int pic = 0; pic < depthValues.length; pic++)
              //if (depthValues[pic] > minDistance && depthValues[pic] < maxDistance)
              if (depthValues[pic] > com.z-distanceThreshould && depthValues[pic] < com.z+distanceThreshould)
                // IN RANGE: WHITE PIXEL
                maskImage.pixels[pic] = color(255);
              else
                maskImage.pixels[pic] = color(0);
            maskImage.updatePixels();
            // BLUR THE B/W IMAGE
            superFastBlur(3);
            // MASK THE RGB CAM IMAGE
            PImage rgbImage =  context.rgbImage();
            rgbImage.mask(maskImage);
            PImage temp = new PImage();
            temp = rgbImage.get(0, 0, rgbImage.width, rgbImage.height); 
            user_tracking_info.get(user_id).update(new PVector(com.x, com.y, com.z), temp);
          }
        }
      }
    }
  }
}

//Update the box boundary, which is the border of our movement. 
void updateBoxBoundary() {
  // Read OpenNI data and update boundary
  if (context != null) {
    // iteration
    int[] user_list = context.getUsers();
    for (int i = 0; i < user_list.length; i++) {
      if (context.getCoM(user_list[i], com)) {
        PVector p = translateOpenNIPosition(com);
        //println("User Location: x(" + p.x + ") y(" + p.y + ") z(" + p.z+")");
        if (p.x < left ) {
          left = p.x;
          //println("Update LEFT, x = " + left );
        } else if (p.x > right) {
          right = p.x; 
          //println("Update RIGHT, x = ", + right );
        }
        if (p.y < top ) {
          top = p.y;
          //println("Update TOP, y = " + top );
        } else if (p.y > bottom ) {
          bottom = p.y;
          //println("Update BOTTOM, y = " + bottom );
        }
        if (p.z > back) {
          back = p.z;
          //println("Update BACK, z = " + back );
          //change camera distance
          cam.setDistance(back + camDepth);
        }
      }
    }
  }
}
void draw() {
  background(0);

  pointLight(red, green, blue, 10000, -10000, -20000);
  pointLight(red, green, blue, 0, 0, 20000);
  pointLight(red/3, green/3, blue/3, -1000, 0, 0);
  pointLight(red/5, green/5, blue/5, 0, 10000, 0);
  pointLight(red, green, blue, 0, -20000, 0);

  if (ifSculpture) {
    pushMatrix();
    translate(-1000, 100, 0);
    for (int i = 0; i<50; i++) {
      for (int j = 0; j<50; j++) {
        pushMatrix();
        translate(i*100, 0, j*100);
        line(-100, 0, 0, 100, 0, 0);
        //rotateY(PI/2);
        line(0, 0, -100, 0, 0, 100);
        popMatrix();
      }
    }
    popMatrix();
  }
  if (context != null) context.update();
  synchronized (lock) {
    //updateHandPoses();
    updateBoxBoundary();
    updateUserPoses();
    //translate(width / 2, height / 2, -depth - camera_screen_distance);

    Iterator<Integer> it;


    //draw user position 
    if (show_Com_vectors) {
      stroke(255, 255, 255);
      fill(255, 255, 255);

      it = user_tracking_info.keySet().iterator();
      if (context != null) {
        // iteration
        int[] user_list = context.getUsers(); // =========================
        for (int i = 0; i < user_list.length; i++) {
          if (context.getCoM(user_list[i], com)) {
            PVector p = translateOpenNIPosition(com);
            //println("User Location: x(" + p.x + ") y(" + p.y + ") z(" + p.z+")");
            drawBox(p.x, p.y, p.z);
            float dist = sqrt((p.x/22.7)*(p.x/22.7) + (p.y/22.7)*(p.y/22.7) + (p.z/22.7)*(p.z/22.7));
            fill(255);
            textSize(24);
            text("Distance: " + (int)dist + " inch", p.x-25, p.y+25, p.z);
          }
        }
      }
    }
    if (draw_path) {
      it = user_tracking_info.keySet().iterator();
      if (context != null) {
        while (it.hasNext ()) {
          int user_id = it.next();
          UserTrackingInfo info = user_tracking_info.get(user_id);

          stroke(info.colour.x, info.colour.y, info.colour.z);
          for (int i = 1; i < info.positions.size (); i++) {
            //println("i: " + i);
            PVector p = translateOpenNIPosition(info.positions.get(i));
            if (ifSculpture) {
              PShape shape = info.shapes.get(i); 
              pushMatrix();
              translate(p.x, -70, p.z*2);
              rotateY(PI);
              rotateZ(PI);
              rotateY(info.phases.get(i));
              shape(shape);
              translate(0, 0, p.z*(0.1+1.0/(2*p.z))+25);
              shape(square);
              popMatrix();
            } else {
              PImage rgbImage = info.photos.get(i);
              pushMatrix();
              translate(p.x, p.y, p.z);
              image(rgbImage, 0, 0, 100, 100);
              popMatrix();
            }
          }
        }
      }
    }
  }

  time += 0.01;
}

//draw a small cube at the given position
void drawBox(float x, float y, float z) {
  pushMatrix();
  translate(x, y, z);
  box(20, 20, 20);
  popMatrix();
}


// -----------------------------------------------------------------
// SimpleOpenNI user events
//event handler when found new user
void onNewUser(SimpleOpenNI current_context, int user_id) {
  synchronized (lock) {
    println("New user: id = " + user_id + ", started tracking.");
    user_tracking_info.put(user_id, new UserTrackingInfo());
    //context.startTrackingSkeleton(user_id);
  }
}

//event handler when lost track of this user
void onLostUser(SimpleOpenNI current_context, int user_id) {
  synchronized (lock) {
    println("Lost user: id = " + user_id);
    UserTrackingInfo info = user_tracking_info.get(user_id);
    //user_tracking_info.remove(user_id);
  }
}

void keyPressed() {
  if (key == 'p' || key== 'P') {
    photo = !photo;
  }
}
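The non-sculpture branch above builds its black-and-white mask by thresholding each depth sample against the user's center-of-mass depth. A minimal standalone version of that rule (the sample values below are made up for illustration):

```java
public class DepthMask {
    // White (255) where the depth sample lies within range of the user's
    // center-of-mass depth, black (0) elsewhere -- the same rule the
    // maskImage loop applies before blurring and masking the RGB image.
    static int[] mask(int[] depth, int comZ, int threshold) {
        int[] out = new int[depth.length];
        for (int i = 0; i < depth.length; i++)
            out[i] = (depth[i] > comZ - threshold && depth[i] < comZ + threshold)
                     ? 255 : 0;
        return out;
    }

    public static void main(String[] args) {
        int[] depth = {400, 900, 1100, 2000}; // millimeters
        for (int v : mask(depth, 1000, 250)) System.out.print(v + " ");
        System.out.println();
    }
}
```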


YouTube Link:
https://youtu.be/hdCboH-hLbA

Reference
1. The "HildaDemo" example.
2. Super Fast Blur filter (based on Mario Klingemann's work).
3. Greg Borenstein, Making Things See (2012), pages 301 to 344.
Attachments
sculpture.zip
Source code for downloading and testing.
(20.91 MiB) Downloaded 186 times
Last edited by zhenyuyang on Thu May 19, 2016 1:47 am, edited 6 times in total.

lliu
Posts: 9
Joined: Wed Jan 06, 2016 1:41 pm

Re: Proj 2: Depth Project

Post by lliu » Tue Apr 19, 2016 12:41 pm

-----------------------------------

WILD TRAILS
Polygon overlays created by a moving object

LU LIU
-------------------------------------------------------

CONCEPT:
I used heptagons to track the movement of an object (for example, a person). The Kinect records the closest depth point every second, and each heptagon is drawn from the latest seven points in this list. The fill is a random grey.
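Keeping only the latest seven points amounts to a bounded queue; here is a standalone sketch of that bookkeeping (a hypothetical helper, not taken from the sketch's code):

```java
import java.util.ArrayDeque;
import java.util.Deque;

public class LatestSeven {
    // Keeps only the most recent seven samples -- the set of points each
    // heptagon would be drawn from.
    static final int N = 7;
    static Deque<float[]> recent = new ArrayDeque<>();

    static void addPoint(float x, float y, float z) {
        recent.addLast(new float[] {x, y, z});
        if (recent.size() > N) recent.removeFirst(); // drop the oldest point
    }

    public static void main(String[] args) {
        for (int i = 0; i < 10; i++) addPoint(i, 0, 0);
        System.out.println(recent.size()); // capped at 7
    }
}
```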

CODE:

Code: Select all

import wblut.math.*;
import wblut.processing.*;
import wblut.core.*;
import wblut.hemesh.*;
import wblut.geom.*;
import java.util.List;
import peasy.*;
boolean beginShape=false;
PeasyCam cam;

import processing.opengl.*;
import SimpleOpenNI.*;
SimpleOpenNI kinect;

float closestValue;
float closestX;
float closestY;

HE_Mesh mesh;
WB_Render3D render;
HEC_ConvexHull creator=new HEC_ConvexHull();

int counter=1;

List<WB_Point> points;
//List<WB_Point> currentpoints;
ArrayList<List<WB_Point>> shapes= new ArrayList<List<WB_Point>>(50);

int col;
color[] colarray= new color[50];
ArrayList<Float> trail= new ArrayList<Float>();

int numpoints;

void setup() {
  size(1152, 720, OPENGL);
  cam = new PeasyCam(this, 800);
  smooth(8);
  render= new WB_Render3D(this);
  points= new ArrayList<WB_Point>();
  //  array=new float[Max][3];
  //frameRate(1);
  // seed the trail with seven random points for the first heptagon
  for (int i=0; i<7; i++) {
    points.add(new WB_Point(random(-200, 200), random(-200, 200), random(200, 400)));
  }


  kinect = new SimpleOpenNI(this);
  kinect.enableDepth();
  kinect.setMirror(true);

  // pre-allocate 50 empty point lists, matching the size of colarray
  for (int i=0; i<50; i++) {
    shapes.add(new ArrayList<WB_Point>());
  }


  // the first heptagon uses the seven seed points
  for (int i=0; i<7; i++) {
    shapes.get(0).add(points.get(i));
  }


  //colarray[0]=color(random(0, 255), random(0, 255), random(0, 255));
  colarray[0]=color(random(0, 255));
}

void draw() {
  closestValue = 1725;  // start at the far end of the range so the minimum wins

  background(50);
  kinect.update();
  translate(0, 0, -1000);

  //rotateY(radians(map(mouseX, 0, width, -180, 180)));
  //    rotateY(frameCount*0.001);
  //    rotateX(frameCount*0.001);

  PVector[] depthPoints = kinect.depthMapRealWorld(); 
  for (int i = 0; i < depthPoints.length; i++) {
    // get the current point from the point array
    PVector currentPoint = depthPoints[i];
    // draw the current point
    //    point(currentPoint.x, currentPoint.y, currentPoint.z);
    //    if (currentPoint.z > 610 && currentPoint.z < 1525) {
    if (currentPoint.z > 410 && currentPoint.z < 1725) {
      if (currentPoint.z < closestValue) {  // track the closest in-range point
        closestValue= currentPoint.z;
        closestX=currentPoint.x;
        closestY=currentPoint.y;
        //        println(closestX + " " + closestY + " " + closestValue);
        //        println("YEAH");
        pushMatrix();
        translate(closestX, closestY, closestValue);
        fill(255, 0, 0);
        box(40);
        popMatrix();
      }
    }
  }


  creator.setPoints(shapes.get(0));
  mesh=new HE_Mesh(creator); 
  col=colarray[0];
  stroke(col, 30);
  render.drawEdges(mesh);
  fill(col, 30);
  render.drawFaces(mesh);

  for (int i=1; i<counter; i++) {
    creator.setPoints(shapes.get(i));
    mesh=new HE_Mesh(creator); 
    col=colarray[i];
    println(counter);
    stroke(122, 125);
    strokeWeight(1);
    render.drawEdges(mesh);
    fill(col, 128);
    render.drawFaces(mesh);
    println("Counter: " + counter);
    println("Size: " + trail.size());
    if (trail.size()>5) {
      for (int j=0; j<counter-2; j++) {
        strokeWeight(2);
        stroke(255);
        line(trail.get(0+(3*j)), trail.get(1+(3*j)), trail.get(2+(3*j)), trail.get(3+(3*j)), trail.get(4+(3*j)), trail.get(5+(3*j)));
      }
    }

  }

  // Add a new tracked point once per second (at 60 fps) and build the next
  // heptagon from the latest seven points. This block must sit outside the
  // shape-drawing loop above; otherwise counter never advances past its
  // initial value of 1 and no heptagons are ever added.
  if (beginShape) {
    if (frameCount%60 == 0) {
      points.add(new WB_Point(closestX, closestY, closestValue));
      trail.add(closestX);
      trail.add(closestY);
      trail.add(closestValue);
      colarray[counter] = color(random(0, 255));
      for (int i=0; i<7; i++) {
        shapes.get(counter).add(points.get(counter+i));
      }
      counter += 1;
      println(closestX + " " + closestY + " " + closestValue);
    }
  }
}

void keyPressed() {
  if (key==' ') {
    beginShape=!beginShape;
  }
}
RESULTS:
Color Version:
QQ20160419-2@2x.png
Black & White Version:
QQ20160418-3@2x.png
QQ20160418-5@2x.png
QQ20160418-4@2x.png
QQ20160418-6@2x.png
QQ20160419-1@2x.png
REFERENCE:
1."Making Things See" by Greg Borenstein
2.WBLUT Library

junxiangyao
Posts: 10
Joined: Wed Jan 06, 2016 1:38 pm

Re: Proj 2: Depth Project

Post by junxiangyao » Tue Apr 19, 2016 12:44 pm

Description
In this project, I tracked my right hand with the Kinect to create an abstract structure. The trajectory of the hand's movement is saved in an array list. I also wanted the thickness of the trajectory to be decided by the speed of my movement: if I move slowly, more points are drawn randomly around my hand, and the range of that randomness grows the longer I stay slow. I used line() and bezier() to connect the points. To enhance the effect, I use dist() to calculate the distance between the newest point and several previously drawn points; if the result is less than a certain threshold, the newest point and that earlier point are connected by a line, which makes the difference in thickness much more obvious. Finally, I placed a sphere in the scene that acts both as a switch to turn the drawing process on and off and as a reference object while I move my hand.
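The distance-threshold connection described above can be sketched in plain Java; this is a minimal, hypothetical helper (not the sketch's actual code) that counts which earlier points would be connected to the newest one:

```java
public class NearbyLines {
    // Plain 3D Euclidean distance, equivalent to Processing's dist()
    static float dist3(float x1, float y1, float z1, float x2, float y2, float z2) {
        float dx = x2 - x1, dy = y2 - y1, dz = z2 - z1;
        return (float) Math.sqrt(dx*dx + dy*dy + dz*dz);
    }

    // Count how many earlier points fall within `threshold` of the newest
    // point; in the sketch, each such pair gets a line() between them.
    static int countNeighbors(float[][] earlier, float[] newest, float threshold) {
        int n = 0;
        for (float[] p : earlier)
            if (dist3(p[0], p[1], p[2], newest[0], newest[1], newest[2]) < threshold) n++;
        return n;
    }

    public static void main(String[] args) {
        float[][] earlier = { {0, 0, 0}, {100, 0, 0}, {300, 0, 0} };
        float[] newest = {50, 0, 0};
        System.out.println(countNeighbors(earlier, newest, 200f)); // 2
    }
}
```

Because slow movement produces denser points, more pairs fall under the threshold and the drawn "tube" visibly thickens, which is the effect described above.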

Screenshot
Screen Shot 2016-04-21 at 11.35.51 AM.png
屏幕快照 2016-04-19 上午1.29.23.png
屏幕快照 2016-04-19 下午2.16.25.png
Original 2D version
屏幕快照 2016-04-19 下午4.45.30.png
Code

Code: Select all

/********************************************************
 * MAT265 PROJ.2   Depth Project                       *
 *                                                      *
 * Junxiang Yao                                         * 
 *                                                      *
 *                                                      *
 *                                                      *
 * Press S to show / hide the skeleton.                 *
 *                                                      *
 * Press B to show / hide the reference sphere.         *
 *                                                      *
 * Press O to turn on / off the rotation mode.          *
 *                                                      *
 * Press R to show / hide the righthand tracking point. *
 *                                                      *
 ********************************************************/




import peasy.*;
import peasy.org.apache.commons.math.*;
import peasy.org.apache.commons.math.geometry.*;
import peasy.test.*;

PeasyCam cam;

import processing.opengl.*;
import SimpleOpenNI.*;
SimpleOpenNI kinect;



ArrayList <ArrayList<PVector>> dots = new ArrayList <ArrayList<PVector>> ();
PVector rightHand;
PVector pRightHand;
float pMouseX, pMouseY;
ArrayList <PVector> test = new ArrayList <PVector>();
int frameNum = 0;
ArrayList <PVector> position = new ArrayList <PVector>();
PVector speedChecker;
int tubeWidth = 8;
boolean connect = true;
boolean skeleton = true;
boolean start = false;
boolean isInSphere = false;
boolean ball = true;
boolean left = false;
boolean right = true;
boolean rotate = true;

int r = 150;
int spherePos = 1500;
float angle = 0;
ArrayList <PVector> spherePoints = new ArrayList <PVector>();

int boxSize = 150;
PVector boxCenter = new PVector(0, 0, 1000);
// this will be used for zooming
// start at normal

void setup() {
  size(1024, 768, OPENGL);
  kinect = new SimpleOpenNI(this);
  kinect.enableDepth();
  kinect.setMirror(true);
  cam = new PeasyCam(this, 4000);
  cam.setMinimumDistance(200);
  cam.setMaximumDistance(5000);
  kinect.enableUser();
  rightHand = new PVector(0, 0, 0);
  pRightHand = new PVector(0, 0, 0);
}



void draw() {
  background(0);
  kinect.update();
  rotateX(radians(180));
  if (rotate) {
    if (start) {
      rotateY(angle);
      angle+= 0.01;
      if (angle > TWO_PI) {
        angle = 0;
      }
    }
    if (!start&&angle != 0) {
      if (angle>PI) {
        if (angle>TWO_PI-0.05) {
          angle = 0;
          rotateY(angle);
        } else {
          angle+= 0.2;
          rotateY(angle);
        }
      }
      if (angle<=PI) {
        if (angle<0.05) {
          angle = 0;
          rotateY(angle);
        } else {
          angle-= 0.2;
          rotateY(angle);
        }
      }
    }
  }
  // bumped up the translation
  // so that scale is better centered
  translate(0, 0, 0);

  IntVector userList = new IntVector();
  kinect.getUsers(userList);
  if (ball) {
    pushMatrix();
    translate(0, 0, spherePos);
    noStroke();
    fill(255);
    if (start) {
      fill(200);
    }    
    sphere(r);
    popMatrix();
  }



  ////  float boxAlpha = map(depthPointsInBox, 0, 1000, 0, 255);  
  //  translate(boxCenter.x, boxCenter.y, boxCenter.z);
  ////  fill(255, 0, 0, boxAlpha);
  //  stroke(255, 0, 0);
  //  box(boxSize);


  //user__________________________________________


  if (userList.size() > 0) {
    // get the first user
    int userId = userList.get(0);

    // if we’re successfully calibrated
    if ( kinect.isTrackingSkeleton(userId)) {
      if (skeleton) {
        drawSkeleton(userId);
      }

      PVector rightHand = new PVector();
      kinect.getJointPositionSkeleton(userId, 
      SimpleOpenNI.SKEL_RIGHT_HAND, 
      rightHand);
      PVector leftHand = new PVector();
      kinect.getJointPositionSkeleton(userId, 
      SimpleOpenNI.SKEL_LEFT_HAND, 
      leftHand);
      stroke(255, 0, 0);
      strokeWeight(10);
      if (right) {
        point(rightHand.x, rightHand.y, rightHand.z);
      }
      stroke(0, 255, 0);
      if (left) {
        point(leftHand.x, leftHand.y, leftHand.z);
      }

      if (dist(rightHand.x, rightHand.y, rightHand.z, leftHand.x, leftHand.y, leftHand.z)<20) {
        start = !start;
      } 

      if (dist(rightHand.x, rightHand.y, rightHand.z, 0, 0, spherePos)<r*2-20&&!isInSphere) {
        start = !start;
        isInSphere = true;
        print("switch!");
      } 
      if (dist(rightHand.x, rightHand.y, rightHand.z, 0, 0, spherePos)>=r*2-20) {
        isInSphere = false;
      }

      if (start) {
        if (rightHand.x != pRightHand.x || rightHand.y != pRightHand.y || rightHand.z != pRightHand.z) {
          //    PVector p = new PVector(mouseX, mouseY);
          //position.add(p);
          float d = dist(rightHand.x, rightHand.y, rightHand.z, 
          pRightHand.x, pRightHand.y, pRightHand.z);
          if (d >= 0 && d < tubeWidth) {
            for (int i = 0; i < tubeWidth - d; i+=3) {
              position.add(new PVector(rightHand.x+pow(random(d-2*tubeWidth, 2*tubeWidth-d), 1), 
              rightHand.y+pow(random(d-2*tubeWidth, 2*tubeWidth-d), 1), 
              rightHand.z+pow(random(d-2*tubeWidth, 2*tubeWidth-d), 1)));
            }
          }    
          pRightHand.x = rightHand.x;
          pRightHand.y = rightHand.y;
          pRightHand.z = rightHand.z;      
          frameNum = frameCount;
          dots.add(position);
          position = new ArrayList <PVector>();
        }
        if (rightHand.x == pRightHand.x && rightHand.y == pRightHand.y && rightHand.z == pRightHand.z) {
          position.add(new PVector(rightHand.x+random(-(frameCount-frameNum), (frameCount-frameNum)), 
          rightHand.y+random(-(frameCount-frameNum), (frameCount-frameNum)), 
          rightHand.z+random(-(frameCount-frameNum), (frameCount-frameNum))));
        }
      }

      for (int i = 0; i < dots.size (); i++) {
        for (int j = 0; j < dots.get (i).size(); j++) {
          stroke(255);
          point(dots.get(i).get(j).x, dots.get(i).get(j).y, dots.get(i).get(j).z);
          float c = dist(dots.get(i).get(j).x, dots.get(i).get(j).y, dots.get(i).get(j).z, 0, 0, spherePos);
          if (c < 240) {
            if (!connect) {
              stroke(0);
            }
            line(dots.get(i).get(j).x, dots.get(i).get(j).y, dots.get(i).get(j).z, 0, 0, spherePos);
          }
          if (i<240) {
            for (int k = 0; k < i; k+=4) {
              for (int l = 0; l < dots.get (k).size(); l+=2) {
                PVector p = (PVector) dots.get(k).get(l);
                float d = dist(dots.get(i).get(j).x, dots.get(i).get(j).y, dots.get(i).get(j).z, p.x, p.y, p.z);
                strokeWeight(1);
                if (d < 200) {
                  if (!connect) {
                    stroke(0);
                  }
                  //if (random(10) < 5) // Skip some lines randomly
                  line(dots.get(i).get(j).x, dots.get(i).get(j).y, dots.get(i).get(j).z, p.x, p.y, p.z);
                }
              }
            }
          } else { 
            for (int k = i-240; k < i; k+=4) {
              for (int l = 0; l < dots.get (k).size(); l+=2) {
                PVector p = (PVector) dots.get(k).get(l);
                float d = dist(dots.get(i).get(j).x, dots.get(i).get(j).y, dots.get(i).get(j).z, p.x, p.y, p.z);
                strokeWeight(1);
                stroke(255, 100);
                if (d < 200) {
                  if (!connect) {
                    stroke(0);
                  }
                  //if (random(10) < 5) // Skip some lines randomly
                  line(dots.get(i).get(j).x, dots.get(i).get(j).y, dots.get(i).get(j).z, p.x, p.y, p.z);
                }
              }
            }
          }
        }
      }


      strokeWeight(1.5);
      stroke(255, 0);
      for (int i = 0; i < dots.size (); i++) {
        if (i>4) {
          if (dots.get(i).size()>=dots.get(i-1).size()) {
            for (int j = 0; j < dots.get (i).size(); j++) {
              stroke(255);
              noFill();
              bezier(dots.get(i).get(j).x, dots.get(i).get(j).y, dots.get(i).get(j).z, 
              dots.get(i-1).get(j%dots.get(i-1).size()).x, 
              dots.get(i-1).get(j%dots.get(i-1).size()).y, 
              dots.get(i-1).get(j%dots.get(i-1).size()).z, 
              dots.get(i-2).get(j%dots.get(i-2).size()).x, 
              dots.get(i-2).get(j%dots.get(i-2).size()).y, 
              dots.get(i-2).get(j%dots.get(i-2).size()).z, 
              dots.get(i-3).get(j%dots.get(i-3).size()).x, 
              dots.get(i-3).get(j%dots.get(i-3).size()).y, 
              dots.get(i-3).get(j%dots.get(i-3).size()).z);
            }
          } else {
            for (int j = 0; j < dots.get (i-3).size(); j++) {
              stroke(255);
              bezier(dots.get(i-3).get(j).x, dots.get(i-3).get(j).y, dots.get(i-3).get(j).z, 
              dots.get(i-2).get(j%dots.get(i-2).size()).x, 
              dots.get(i-2).get(j%dots.get(i-2).size()).y, 
              dots.get(i-2).get(j%dots.get(i-2).size()).z, 
              dots.get(i-1).get(j%dots.get(i-1).size()).x, 
              dots.get(i-1).get(j%dots.get(i-1).size()).y, 
              dots.get(i-1).get(j%dots.get(i-1).size()).z, 
              dots.get(i).get(j%dots.get(i).size()).x, 
              dots.get(i).get(j%dots.get(i).size()).y, 
              dots.get(i).get(j%dots.get(i).size()).z);
            }
          }
        }
      }
    }
  }
}
void keyPressed() {
  if (key == 'n' || key == 'N') {
    connect = ! connect;
  }
  if (key == 's' || key == 'S') {
    skeleton = ! skeleton;
  }
  if (key == 'b' || key == 'B') {
    ball = ! ball;
  }
  if (key == 'r' || key == 'R') {
    right = ! right;
  }
  if (key == 'l' || key == 'L') {
    left = ! left;
  }
  if (key == 'o' || key == 'O') {
    rotate = ! rotate;
  }
}
void drawSkeleton(int userId)
{
  strokeWeight(3);

  // to get the 3d joint data
  drawLimb(userId, SimpleOpenNI.SKEL_HEAD, SimpleOpenNI.SKEL_NECK);
  drawLimb(userId, SimpleOpenNI.SKEL_NECK, SimpleOpenNI.SKEL_LEFT_SHOULDER);
  drawLimb(userId, SimpleOpenNI.SKEL_LEFT_SHOULDER, SimpleOpenNI.SKEL_LEFT_ELBOW);
  drawLimb(userId, SimpleOpenNI.SKEL_LEFT_ELBOW, SimpleOpenNI.SKEL_LEFT_HAND);

  drawLimb(userId, SimpleOpenNI.SKEL_NECK, SimpleOpenNI.SKEL_RIGHT_SHOULDER);
  drawLimb(userId, SimpleOpenNI.SKEL_RIGHT_SHOULDER, SimpleOpenNI.SKEL_RIGHT_ELBOW);
  drawLimb(userId, SimpleOpenNI.SKEL_RIGHT_ELBOW, SimpleOpenNI.SKEL_RIGHT_HAND);

  drawLimb(userId, SimpleOpenNI.SKEL_LEFT_SHOULDER, SimpleOpenNI.SKEL_TORSO);
  drawLimb(userId, SimpleOpenNI.SKEL_RIGHT_SHOULDER, SimpleOpenNI.SKEL_TORSO);

  drawLimb(userId, SimpleOpenNI.SKEL_TORSO, SimpleOpenNI.SKEL_LEFT_HIP);
  drawLimb(userId, SimpleOpenNI.SKEL_LEFT_HIP, SimpleOpenNI.SKEL_LEFT_KNEE);
  drawLimb(userId, SimpleOpenNI.SKEL_LEFT_KNEE, SimpleOpenNI.SKEL_LEFT_FOOT);

  drawLimb(userId, SimpleOpenNI.SKEL_TORSO, SimpleOpenNI.SKEL_RIGHT_HIP);
  drawLimb(userId, SimpleOpenNI.SKEL_RIGHT_HIP, SimpleOpenNI.SKEL_RIGHT_KNEE);
  drawLimb(userId, SimpleOpenNI.SKEL_RIGHT_KNEE, SimpleOpenNI.SKEL_RIGHT_FOOT); 

  stroke(255, 200, 200);

  strokeWeight(1);
}

void drawLimb(int userId, int jointType1, int jointType2)
{
  PVector jointPos1 = new PVector();
  PVector jointPos2 = new PVector();
  float  confidence;

  // draw the joint position
  confidence = kinect.getJointPositionSkeleton(userId, jointType1, jointPos1);
  confidence = kinect.getJointPositionSkeleton(userId, jointType2, jointPos2);

  stroke(255, 0, 0, confidence * 200 + 55);
  line(jointPos1.x, jointPos1.y, jointPos1.z, 
  jointPos2.x, jointPos2.y, jointPos2.z);

  drawJointOrientation(userId, jointType1, jointPos1, 50);
}

void drawJointOrientation(int userId, int jointType, PVector pos, float length)
{
  // draw the joint orientation  
  PMatrix3D  orientation = new PMatrix3D();
  float confidence = kinect.getJointOrientationSkeleton(userId, jointType, orientation);
  if (confidence < 0.001f) 
    // nothing to draw, orientation data is useless
    return;

  pushMatrix();
  translate(pos.x, pos.y, pos.z);

  // set the local coordsys
  applyMatrix(orientation);

  // coordsys lines are 100mm long
  // x - r
  stroke(255, 0, 0, confidence * 200 + 55);
  line(0, 0, 0, 
  length, 0, 0);
  // y - g
  stroke(0, 255, 0, confidence * 200 + 55);
  line(0, 0, 0, 
  0, length, 0);
  // z - b    
  stroke(0, 0, 255, confidence * 200 + 55);
  line(0, 0, 0, 
  0, 0, length);
  popMatrix();
}
// SimpleOpenNI user events

void onNewUser(SimpleOpenNI curContext, int userId)
{
  println("onNewUser - userId: " + userId);
  println("\tstart tracking skeleton");

  kinect.startTrackingSkeleton(userId);
}

void onLostUser(SimpleOpenNI curContext, int userId)
{
  println("onLostUser - userId: " + userId);
}

void onVisibleUser(SimpleOpenNI curContext, int userId)
{
  //println("onVisibleUser - userId: " + userId);
}
Reference
Making Things See, Borenstein, Chapter 3 & Chapter 4
http://www.openprocessing.org/user/31023

ambikayadav
Posts: 4
Joined: Fri Apr 01, 2016 2:32 pm

Re: Proj 2: Depth Project

Post by ambikayadav » Tue Jun 07, 2016 9:38 am

SYMMETRY AND REFLECTION

In this project, I use geometrical concepts to achieve a symmetric structure derived from the user's interaction in space.

INSPIRATION: The main inspiration for this work was to give users the ability to control the system and produce artwork through their body movements. Concepts from mathematics, geometry, physics, and reflection are vital to this piece; I have tried to explore how these subjects can be used to create beautiful art.
IMG_1178.JPG
Concept Sketch
DESCRIPTION:
The user interacts with the system. The hand is tracked, which gives us the x, y, z position of the hand in space. This position is then mapped into the virtual space created by the system. The space is divided into six portions of 60 degrees each; in each division, reflections and symmetric copies of whatever the user sketches emerge over time as the user interacts.
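The six-fold division can be sketched as rotating the tracked point by multiples of 60 degrees about the center; this plain-Java helper is a hypothetical illustration of that mapping, not the project's actual code:

```java
public class SixFold {
    // Rotate (x, y) about the origin by k * 60 degrees; drawing the six
    // copies (k = 0..5) of each tracked hand position produces the
    // symmetric sketch.
    static double[] rotated(double x, double y, int k) {
        double a = Math.toRadians(60.0 * k);
        return new double[] { x * Math.cos(a) - y * Math.sin(a),
                              x * Math.sin(a) + y * Math.cos(a) };
    }

    public static void main(String[] args) {
        double[] p = rotated(1, 0, 3);                 // 3 * 60 = 180 degrees
        System.out.printf("%.3f %.3f%n", p[0], p[1]);  // -1.000 0.000
    }
}
```

Mirroring each rotated copy (negating y before rotating) would add the reflections, giving the full kaleidoscope-like symmetry described above.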
Screen Shot 2016-06-07 at 10.19.36 AM.png
Output 1
Screen Shot 2016-06-07 at 10.14.44 AM.png
Output 2

Post Reply