{Walk & Talk}


Walk & Talk

is the final project by Ziv and me for Spatial Media.

 

Ideas

For the final of Spatial Media there's no restriction on content or context, so, because brainstorming itself was such a struggle, we decided to make a project that helps people brainstorm! Drawing inspiration from Land and Sea, a game Ziv knew from Israel, and from Scott Snibbe's Boundary Functions, which explores the relation between people and space, we built a system in which people expand their territory by walking and shaking; once they stop moving, their territories shrink and eventually disappear. Since several studies suggest that body movement can influence problem solving (e.g. "Body Movements Can Influence Problem Solving," Science Daily, May 13, 2009), the piece has the potential to be installed in an office space and help employees brainstorm.


 

Concept

Each person in the game is assigned an initial territory, which they can expand by walking. If they don't walk, their territory gets smaller. And if they don't talk (to brainstorm, or just to chit-chat), their territory also shrinks. This way, people have to walk and talk in order to keep their territory. A rough sketch of the rule is below.
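
Here's a minimal sketch of that rule in plain C++ (not the actual openFrameworks project code; the growth and decay rates are made-up placeholders): the territory radius grows while movement or speech is detected and otherwise decays toward zero.

#include <algorithm>
#include <iostream>

struct Player {
  float radius;  // current territory radius, in pixels
};

// One update step: movement and speech feed the territory; stillness and
// silence let it shrink until it disappears. All rates are placeholder values.
void updateTerritory(Player& p, float movement, float speech, float dt) {
  const float growPerMove   = 40.0f;  // growth per unit of movement per second
  const float growPerSpeech = 25.0f;  // growth per unit of speech per second
  const float shrinkPerSec  = 15.0f;  // constant decay per second
  p.radius += (movement * growPerMove + speech * growPerSpeech - shrinkPerSec) * dt;
  p.radius = std::max(0.0f, std::min(p.radius, 400.0f));  // gone at 0, capped at 400 px
}

int main() {
  Player a{100.0f};
  for (int frame = 0; frame < 600; ++frame) {        // ~10 seconds at 60 fps
    float movement = (frame < 300) ? 1.0f : 0.0f;    // walk for 5 s, then stop
    updateTerritory(a, movement, 0.0f, 1.0f / 60.0f);
  }
  std::cout << "radius after standing still: " << a.radius << "\n";
  return 0;
}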

 

Context

This game can be used for multiple purposes, e.g. DECISION MAKING • BRAINSTORMING • VOTING ON IDEAS • BUDGET PLANNING. Overall, the main function is to provoke thoughts about sharing and space through movement.

http://www.jhclaura.com/wp-content/uploads/2014/04/walk-n-talk.pdf

 

Technique


Tools: openFrameworks, 1 Kinect, 2 projectors.

 

Developing

1) First attempt at expanding the territories based on the blob movement captured by the Kinect.

2) Using ofPolyline to smooth the shape of the geometry (see the sketch after this list).

3) Beautiful mistakes 😉

4) Final version of the color-filled geometry.
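
As a rough illustration of step 2, here's a minimal openFrameworks sketch (not the project code; the Kinect blob is faked with a noisy circle) that resamples and smooths a contour with ofPolyline:

#include "ofMain.h"

class SmoothBlobApp : public ofBaseApp {
public:
  ofPolyline rough, smooth;

  void setup() {
    // stand-in for a Kinect blob contour: a noisy circle
    for (int i = 0; i < 60; i++) {
      float a = ofMap(i, 0, 60, 0, TWO_PI);
      float r = 150 + ofRandom(-20, 20);
      rough.addVertex(ofGetWidth() / 2 + r * cos(a), ofGetHeight() / 2 + r * sin(a));
    }
    rough.close();
    smooth = rough.getResampledByCount(100);  // even out the vertex spacing
    smooth = smooth.getSmoothed(8);           // rolling-average smoothing
    smooth.close();
  }

  void draw() {
    ofBackground(0);
    ofSetColor(120);
    rough.draw();              // original jagged contour
    ofSetColor(255, 80, 80);
    smooth.draw();             // smoothed territory outline
  }
};

int main() {
  ofSetupOpenGL(800, 600, OF_WINDOW);
  ofRunApp(new SmoothBlobApp());
}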

Virtual Reality Tour of Met

For my internship during the Spring 2014 semester at the Media Lab of The Metropolitan Museum of Art, I hooked up

    1. 3D models of the Met from the Architecture Department
    2. the official Audio Guide
    3. 3D models of art pieces in the Greek and Roman galleries, made by 3D-scanning photos
    4. Unity as the game engine
    5. the virtual reality head-mounted display Oculus Rift as the controller

and created an immersive virtual reality tour of the Met!

With the Oculus Rift, users can wander around the museum, listen to the audio guide and admire the art pieces, walk upstairs, watch butterflies, get blocked by a huge bowl, and go inside the surreal mash-up models (credits to Decho <horse> and Rui <uncolored triangles>).

IDEA

With a background as a VFX artist in 3D animation and post-production, I've always been interested in 3D and how it can be made interactive in creative ways. Once I got the chance to intern at the Media Lab of the Met and learned we could access the museum's 3D models, I wanted to use the Oculus Rift to walk inside a fantasy version of the Met and enjoy the immersive experience of the space.

 

PROJECT_DEVELOPMENT

Virtual Met Museum –> Fantasy Experiment –> Art piece + Audio Guide

 

BASIC_SETUP_HOW_TO

First of all, there's a ton of basic knowledge about Unity here. And how to set up a project from scratch, here.

 

✓ Import BIM 3D models into Unity

Basically, just put the .fbx file into the Assets folder of the project you just created. It's not too complicated, but there's one thing you should be aware of: the SCALE. It's good practice to set the scale correctly in the modeling application before importing the model into Unity, with the associated details described below:

  • 1 Unity unit = 1m
  • the fewer GameObjects the better. Also, use 1 material if you can
  • useful link: wiki unity3d

 

✓ Oculus Rift Plugin in Unity 3d Setup

Just follow the clear instructions on YouTube!

 

✓ Add collider to meshes

In order to prevent the player from walking through meshes (e.g. walls, stairs), we need to add a Collider component to the models. Steps below:

  • select the model
  • @inspector
  • Add Component –> Physics –> Box Collider or Mesh Collider
  • a Mesh Collider follows the geometry more precisely than a Box Collider, but it is also more expensive to use.


 

✓ Occlusion Culling

Occlusion culling means that objects you aren't looking at (because they're hidden behind other geometry) aren't rendered, so the game will run faster.

  •  geometry must be broken into sensibly sized pieces.
    • if you have one object that contains all the furniture, either all or none of the entire set of furniture will be culled.
  • tag all scene objects that you want to be part of the occlusion as Occluder Static in the Inspector.
  • Bake!
  • useful link: unity3d manual

 

✓ Import 3D-Scanned Models from 123D Catch

  • Take about 20 photos around the object you want to 3D-scan (all 360 degrees!).
  • Upload the photos to 123D Catch.
  • Yeah, now you'll have both an .obj model file and a texture file!
  • Just download the files and drag the whole folder into the Assets folder of Unity!

 

POSSIBILITIES

  • Give access to people who can't visit the museum in person.
  • Installation design simulation.

 

Thanks_to

It was a really good experience interning at the Media Lab of the Met. I knew I wanted to keep working on 3D and also step into the virtual reality world with the Oculus Rift, so it was a great match that I could take this topic as my own project while also meeting the needs of the Met! From this internship I gained valuable resources from the museum and got to know amazing mentors and colleagues at the Lab. This project led me into the world of virtual reality, and I'm glad and thankful to be a Spring '14 intern of the Media Lab of The Metropolitan Museum of Art.

{Rabbit_Hole}

{currently works with the Chrome and Firefox browsers}

For the composition assignment, the final of the Coding for Emotional Impact class, I wanted to create something with multiple layers that is self-explanatory. Inspired by Andy's description of computer vision as a rabbit hole (and because I've been learning Three.js on my own recently), I wanted to make a game about the "Rabbit Hole", and my biggest assumption is that everyone is, in some way, down the rabbit hole.

P.S. It's not really a fun game to play. I'm still unsure whether it should be fun to play or just an emotion-building nowhere…

– Title
Rabbit Hole
– Environment
– Audience
Whoever is also down the rabbit hole, or wonders how it feels down there.
 
– Narrative arc
Rabbit Hole – a metaphor for the conceptual path which is thought to lead to the true nature of reality. Infinitesimally deep and complex; venturing too far down is probably not that great an idea.

Literary Nonsense – has no system of logic, although it may imply the existence of an inscrutable one, just beyond our grasp.

And below are snapshots of what I've built so far. I made my own models in Maya and drew the textures in Photoshop. It couldn't be viewed online because of a web-related issue loading the music that I couldn't solve (SOLVED_by hard-coding the URL of the music file path). But I still have no idea how to do the transitions from scene to scene…

SCENE_ZERO: http://www.rabbithole.link/

open

SCENE_ONE: http://www.rabbithole.link/index_D.html


SCENE_TWO: http://www.rabbithole.link/index_G.html


SCENE_THREE: http://www.rabbithole.link/index_S.html


SCENE_FOUR: http://www.rabbithole.link/index_M.html

maze

SCENE_FIVE: http://www.rabbithole.link/index_T.html

TV

SCENE_SIX: http://www.rabbithole.link/index_F.html

jump

SCENE_SEVEN: http://www.rabbithole.link/index_V.html

voice

SCENE_EIGHT: http://www.rabbithole.link/index_E.html

Elevator

( Three.js + web stuff ) == super deep rabbit hole.

study note_{Three.js}

The 3D-on-the-web journey starts! I'm trying to convert last week's sketch from Processing into Three.js, but I couldn't finish it by Monday's class, so it's still in progress… Here's what I've got so far (WARNING: ROUGH), and below are my notes on translating Processing into JavaScript. I'll organize them once I finish the work. Stay tuned!

Experience learned from banging my head against a bloody wall

  1. put your code in a function and then execute it by calling the function
    • I tried to rotate my goddamn tetrahedron without doing this and kept failing for nearly 2 hours, just failing and failing and failing….
  2. OOP
    • links
      • https://developer.mozilla.org/en-US/docs/Web/JavaScript/Introduction_to_Object-Oriented_JavaScript
      • http://stackoverflow.com/questions/572897/how-does-javascript-prototype-work
      • https://github.com/shiffman/The-Nature-of-Code-Examples-p5.js/issues/8
      • http://javascriptissexy.com/javascript-prototype-in-plain-detailed-language/
      • https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Object/prototype
      • http://www.objectplayground.com/
  3. color
    • if using HSL, material.color.setHSL( h, s, l ); –> *careful* don't set s and l both to 1, or it will be white even though h < 1
  4. Web Audio API
    • http://srchea.com/experimenting-with-web-audio-api-three-js-webgl
  5. Clickable!!
    • http://soledadpenades.com/articles/three-js-tutorials/object-picking/
    • http://threejs.org/examples/canvas_interactive_cubes.html
    • http://stackoverflow.com/questions/11036106/three-js-projector-and-ray-objects

Awareness

(3/25_Updated_footage version)

It's a project of material experiments and mycelium-network simulation. The ultimate goal is to pull humans' relationship with fungi closer, increase awareness, and explore the uses of mycelium by holding workshops and gathering public resources.

 

 

material experiment

In 2007, Eben Bayer and Gavin McIntyre noticed mycelium's self-assembling, glue-like character. By growing mycelium with agricultural byproducts, they discovered a biological, durable, and compostable material that performs, and they founded a company called Ecovative. Their products are pressed under pressure during production, and are thick, chunky, and volumetric. Inspired by artist Eric Klarenbeek's 3D-printed chair with straw, I guessed that as long as I followed the principle that "mycelium digests nutrients and water and grows harder," the production process could be free-form and without boundaries. So I gave it a try.

diagram

For the blender part, the ratio of mycelium + straw to water is approximately 2:1.

Hang the balls in a separate area to avoid contamination. After 3–5 days the balls will turn an obvious white, showing that the mycelium is growing.

After 10 days, harvest the balls and pop the balloons, and voila!

Put them aside and let their interiors dry for a day (since they were blocked by the balloon).


 

 

mycelium network

I'm also interested in how mycelium communicates. The roots of most land plants are colonised by mycorrhizal fungi that provide mineral nutrients in exchange for carbon, and according to "Underground signals carried through common mycelial networks warn neighbouring plants of aphid attack" (Ecology Letters) by Zdenka Babikova, Lucy Gilbert, Toby J. A. Bruce, Michael Birkett, John C. Caulfield, Christine Woodcock, John A. Pickett and David Johnson, mycorrhizal mycelia can also act as a conduit for signalling between plants, acting as an early warning system for herbivore attack.

Figure from Babikova et al.: experimental setup (right) and results (left)

The experiment builds on the fact that Vicia faba emits plant volatiles, particularly methyl salicylate, which make the bean plants repellent to aphids but attractive to aphid enemies such as parasitoids. It set up five Vicia faba plants, had only one of them attacked by aphids, and connected that plant to the others either with roots or without roots, and with mycelium or without mycelium (as in the picture on the right above). The result (picture on the left above) shows that the plants connected to the Donor (infested with aphids) by mycelium act the same as the Donor, producing volatiles to repel aphids and attract the aphids' enemies. This underground messaging system allows neighbouring plants to invoke herbivore defences before being attacked.

This interests me a lot, and I want to use it as content to inform people about the amazing behavior of fungi by visualizing the mycelium network. The idea is –>

  1. when there's no one around, each mycelium bulb breathes in its own pattern, presented with LEDs, while a video plays footage of fungus and mycelium life (a sketch of the breathing curve follows below).
  2. once someone comes near, the mycelium bulbs communicate with each other, lighting up and off one by one, and the video switches to human-related footage (e.g. garbage, oil spills, and mycoremediation).
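
As a side note, the "breathing" in state 1 is just a piecewise-linear curve: ramp up for inTime, hold at full brightness for pauseTime, then ramp back down over outTime. Here's a standalone sketch of that curve using the first channel's numbers from the Arduino code below (an illustration only, not the installation code):

#include <algorithm>
#include <cstdio>

// Brightness at time t (ms since the cycle started): ramp up, hold, ramp down.
int breathLevel(double t, double inTime, double pauseTime, double outTime,
                int minV, int maxV) {
  if (t <= inTime)                       // fade in
    return (int)(minV + (maxV - minV) * t / inTime);
  if (t <= inTime + pauseTime)           // hold at full brightness
    return maxV;
  double tOut = t - inTime - pauseTime;  // fade out
  return std::max(minV, (int)(maxV - (maxV - minV) * tOut / outTime));
}

int main() {
  // channel 0 values from the sketch below: inTime 1500, pauseTime 350, outTime 2000
  for (int t = 0; t <= 3800; t += 400)
    std::printf("t=%4d ms  level=%d\n", t, breathLevel(t, 1500, 350, 2000, 5, 220));
  return 0;
}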


footage Breathing, password: fungus

footage Aware, password: fungus

 

 

And here is my Arduino code. The LEDs are driven on PWM pins with analogWrite.

//#include <LED.h>
#include <NewPing.h>

#define TRIGGER_PIN 8
#define ECHO_PIN 7
#define MAX_DISTANCE 30

//for ultrasonic sensor
NewPing sonar(TRIGGER_PIN, ECHO_PIN, MAX_DISTANCE);
int value;
int interval;  //to trigger the change of LEDs

//for smoothing
const int numReadings = 5;
int readings[numReadings];
int oriReading;
int index = 0;
int total = 0;
int average = 0;

//pin
int ledPins[] = { 
  3,5,6,9,10,11 };

int lastFade[6] = {
  0};
int level[] = {
  10, 23, 45, 50, 100, 205};

//output
int maxV = 220;
int minV = 5;

//slope & intercept of the piecewise-linear breathing curve (fade in / hold / fade out)
double ain[6], bin[6], aex[6], bex[6];

//time
double inTime[] = {
  1500, 1700, 1900, 2000, 2100, 2300};
double pauseTime[] = {
  350, 400, 450, 500, 550, 600};
double outTime[] = {
  2000, 2200, 2400, 2500, 2600, 2800};
double thirdT[6], cycleT[6];
double levels[6];

boolean lightUp[6];
int awareTime[] = {
  0, 1, 2, 3, 4, 5};
int awareOriTime[] = {
  0, 1, 2, 3, 4, 5};

void setup() {
  Serial.begin(9600);

  //smoothing
  for(int i=0; i<numReadings; i++){
    readings[i] = 0;
  }

  for(int i=0; i<6; i++) {
    pinMode(ledPins[i], OUTPUT);

    thirdT[i] = inTime[i] + pauseTime[i];
    cycleT[i] = inTime[i] + pauseTime[i] + outTime[i];

    ain[i] = (maxV - minV)/inTime[i];
    bin[i] = minV;
    aex[i] = (minV - maxV)/outTime[i];
    bex[i] = maxV - aex[i]*(inTime[i]+pauseTime[i]);

    lightUp[i] = false;
  }  
}

unsigned long tstart[6];
double time;

void loop() {

  //ultrasonic sensor
  oriReading = sonar.ping();
  value = (int) oriReading/US_ROUNDTRIP_CM;  //convert ping time to distance in cm
  Serial.write(value);  //send the distance to the Processing sketch that switches footage

  for(int thisChannel=0; thisChannel<6; thisChannel++) {

    //if detect ppl, all light up
    if(value > 0) {

      //toggle this channel every 6 iterations
      if ( (awareTime[thisChannel])%6 == 0 ) {
        lightUp[thisChannel] = !lightUp[thisChannel];
      }

      if(lightUp[thisChannel] == true)
        levels[thisChannel] = 255;
      else
        levels[thisChannel] = 0;

      analogWrite(ledPins[thisChannel], levels[thisChannel]);

      //determine whether to restart the time cycle
      awareTime[thisChannel] += 1;

      if( awareTime[thisChannel] >= (180) )
        awareTime[thisChannel] = awareOriTime[0];
    } 

    //if not, do LED pattern
    else {

      if (lastFade[thisChannel] <= inTime[thisChannel]) {
        levels[thisChannel] = int( ain[thisChannel]*lastFade[thisChannel] + bin[thisChannel] );
      }
      else if (lastFade[thisChannel] <= thirdT[thisChannel]) {
        levels[thisChannel] = maxV;
      }
      else {
        levels[thisChannel] = int( aex[thisChannel]*lastFade[thisChannel] + bex[thisChannel] );
      }

      analogWrite(ledPins[thisChannel], levels[thisChannel]);
      delay(1);

      //determine whether to restart the time cycle
      if(lastFade[thisChannel] >= cycleT[thisChannel]) {
        lastFade[thisChannel] = 0;
        tstart[thisChannel] = millis();
      }
      else {
        lastFade[thisChannel] = millis() - tstart[thisChannel];
      }
    }
  }
}

 

And my Processing code, which switches footage based on the serial signal received from the Arduino.

import processing.serial.*;
import processing.video.*;
import java.awt.MouseInfo;
import java.util.Arrays;
import java.util.Collections;
import java.awt.Rectangle;

Movie aware;
Movie grow;
boolean playGrow = true;

Serial myPort;

void setup() {
  size(displayWidth, displayHeight);
  if (frame != null) {
    frame.setResizable(true);
  }
  background(0);
  // Load and play the video in a loop
  aware = new Movie(this, "aware_2.mp4");
  grow = new Movie(this, "grow_v2.mp4");
  aware.loop();
  grow.loop();

//  println(Serial.list());
  String portName = Serial.list()[5];
  myPort = new Serial(this, portName, 9600);
}

void movieEvent(Movie m) {
  m.read();
}

void draw() {
  if(playGrow)
    image(grow, 0, 0, width, height);
  else
    image(aware, 0, 0, width, height);
}

void serialEvent (Serial myPort) {
  int inByte = myPort.read();
  println(inByte);

  if (inByte > 10)
    playGrow = false;
  else
    playGrow = true;

}

void keyPressed() {
  if(key == '1')
    playGrow = true;
  if(key == '2')
    playGrow = false;
}

int mX;
int mY;

boolean sketchFullScreen() {
  return true;
}

void mouseDragged() {
  frame.setLocation(
  MouseInfo.getPointerInfo().getLocation().x-mX, 
  MouseInfo.getPointerInfo().getLocation().y-mY);
}

public void init() {
  frame.removeNotify();
  frame.setUndecorated(true);
  frame.addNotify();
  super.init();
}

 

 

photos of Fabrication



 


For further development, I'm thinking about maybe cooperating with Kate's "mushroom craft" and holding some craft workshops! Through the whole process of making those mycelium light bulbs, I went through fabrication work I'd never tried before, and it felt great! I think direct "hand" touch is the most effective way to pull the relationship between people and materials closer.

By starting the production from searching for and growing the material, we can better appreciate the resources we take from nature and also become more aware of environmental issues. Not just sitting there receiving the news from TV, but actually caring about and being aware of it, because you feel it affecting the fabrication process directly. The maker/crafter spirit is one of the answers for the future.

css update: Healthy Movie Night

One page web-fantasy UPDATE! GO PLAY.

  1. Type in the movie you’re going to see.
  2. Type in the food you’re going to eat.
  3. Choose which exercise you're going to do.
  4. Type in your weight, in kg or lbs.
  5. Based on the duration of the movie, the calories in the food, the calories burned by the chosen exercise, and your weight, Healthy Movie Night tells you how much of the food you can eat without worrying about gaining any weight! (A rough sketch of the arithmetic follows below.)
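
For example, a rough sketch of the arithmetic only, with hypothetical placeholder numbers (however the page actually looks up movie durations and food calories), the idea is roughly:

#include <iostream>

// Calories you can "afford": what the chosen exercise burns over the length of the movie.
double caloriesBurned(double kcalPerKgPerHour, double weightKg, double movieHours) {
  return kcalPerKgPerHour * weightKg * movieHours;
}

int main() {
  double movieHours     = 1.7;    // e.g. a ~100-minute movie
  double weightKg       = 51.0;
  double exerciseRate   = 6.0;    // hypothetical kcal per kg per hour for the chosen exercise
  double kcalPerServing = 285.0;  // hypothetical calories in one slice of pizza

  double budget   = caloriesBurned(exerciseRate, weightKg, movieHours);
  double servings = budget / kcalPerServing;
  std::cout << "You can eat about " << servings << " servings guilt-free\n";
  return 0;
}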


 

e.g. “The Grand Budapest Hotel”, pizza, Jog in Water, 51kg.

 

e.g. “Wanted”, spaghetti,  fishing, 51kg.

 

e.g. “Alice in Wonderland”, cake, Ballet, 51kg.