Work in Progress Notes - Taylor Moon
Objective of my project: I am exploring the intersections of cinema, virtual reality, architecture, and robotics. I draw specifically from lectures on aesthetic narrative and computational photography, while also incorporating the mechanics and functionality of our Zumo robots.
Goldfeather, Jack. “Tracking in Virtual Reality.” Math Horizons, vol. 10, no. 3, 2003, pp. 27–31. JSTOR.
http://www.jstor.org/stable/pdf/2567840 ... b7b544e4bc
object detection and tracking
utilizes eye-tracking systems p 27
by having information about a chair stored in the computer, it is able to project the chair onto your goggle screens the way an actual object would be projected onto the retinas of your right and left eyes p 27
it factors in your eye coordinates, the world coordinates and the angle of projection from your goggles p 27
optical ceiling tracking p 27
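The projection notes above (eye coordinates, world coordinates, projection from the goggles) can be sketched as a basic pinhole model: shift the world point into eye-centered coordinates, then divide by depth. This is an illustrative NumPy sketch under my own assumptions (viewing axis along +z, arbitrary focal length), not Goldfeather's actual formulation:

```python
import numpy as np

def project_to_screen(world_point, eye_position, focal_length=1.0):
    """Project a 3D world point onto a 2D screen plane in front of the eye.

    Simple pinhole model: the screen sits `focal_length` units along the
    eye's viewing axis (assumed here to be the +z axis).
    """
    # Express the point in eye-centered coordinates
    p = np.asarray(world_point, dtype=float) - np.asarray(eye_position, dtype=float)
    if p[2] <= 0:
        raise ValueError("point is behind the eye")
    # Perspective divide: similar triangles give x' = f*x/z, y' = f*y/z
    return focal_length * p[0] / p[2], focal_length * p[1] / p[2]

# A chair corner 2 units in front of the eye and 1 unit to the right
# lands at screen x = 0.5 (half a focal length off-center):
x, y = project_to_screen([1.0, 0.0, 2.0], [0.0, 0.0, 0.0])
```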
“the whole field of computer graphics, of which virtual reality is a part, has been dubbed ‘mathematical archaeology’” p 27
“computation of position and orientation of an object begins by collecting data from the environment. Data collection might be done by cameras, transmitted signals, magnetic field distortions, etc. Typically, the data are used as coefficients in a system of equations with the position and orientation parameters as variables.” - p 28
“problems with systems of equations used in tracking.... the system may have more than one solution. Determining which one to pick can be impossible in some cases. Such situations arise when the system is either under-constrained or ill-conditioned. Unfortunately, this is a common occurrence in tracking systems.... The system may have no solution. This often arises when noise in the data (i.e., measurement errors) produces inconsistent equations. Usually what is done in this case is to try to find a least-squares fit which minimizes error. However, there is no guarantee that the least-squares fit is a meaningful result or even that optimization methods will find it.” - p 29 // My camera would address which solution to pick by presenting the viewer with the multiple options it found (i.e., the multiple photos it took while conducting object tracking). If multiple scenarios match the computer’s objective, it becomes the user’s discretion to pick which image.
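The least-squares idea in the quote above can be made concrete with a toy tracking problem: estimating a 2D position from distance measurements to known beacons. The beacon layout and the NumPy approach are my own illustration, not from the paper; squaring the range equations and subtracting the first one linearizes the system so `np.linalg.lstsq` can fit it even when noisy ranges make the equations inconsistent:

```python
import numpy as np

def estimate_position(beacons, ranges):
    """Least-squares position estimate from distances to known beacons.

    Each range gives |x - b_i|^2 = r_i^2; subtracting the first equation
    from the rest linearizes the system, which np.linalg.lstsq then solves
    in the least-squares sense (useful when noisy ranges are inconsistent).
    """
    b = np.asarray(beacons, dtype=float)
    r = np.asarray(ranges, dtype=float)
    A = 2.0 * (b[1:] - b[0])
    c = (r[0] ** 2 - r[1:] ** 2) + np.sum(b[1:] ** 2, axis=1) - np.sum(b[0] ** 2)
    x, *_ = np.linalg.lstsq(A, c, rcond=None)
    return x

# Four made-up beacons at the corners of a square; exact ranges from the
# true point (1, 2) should be recovered (noisy ranges would give a fit).
beacons = [(0, 0), (4, 0), (0, 4), (4, 4)]
true = np.array([1.0, 2.0])
ranges = [np.linalg.norm(true - np.array(b)) for b in beacons]
est = estimate_position(beacons, ranges)   # approximately [1, 2]
```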
Self
my camera would use optical tracking to detect objects within the environment; it would recognize objects and locate their coordinates, converting real-world coordinates into a local object coordinate system - see Goldfeather p 28
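The world-to-local conversion mentioned above can be sketched in 2D as a translation followed by an inverse rotation. This is a minimal sketch of the standard rigid-transform math, with an object pose I made up for illustration:

```python
import numpy as np

def world_to_local(points, origin, angle):
    """Convert 2D world coordinates into an object-local frame.

    The local frame is defined by the object's world-space `origin` and a
    rotation `angle` (radians) of its axes relative to the world axes.
    Translating then applying the inverse (transpose) rotation expresses
    each point relative to the object.
    """
    c, s = np.cos(angle), np.sin(angle)
    R_inv = np.array([[c, s], [-s, c]])   # transpose of the rotation matrix
    p = np.asarray(points, dtype=float) - np.asarray(origin, dtype=float)
    return p @ R_inv.T

# An object at world (2, 1) with its axes rotated 90 degrees: a point one
# unit along the world x-axis lies at (0, -1) in the object's own frame.
local = world_to_local([(3.0, 1.0)], origin=(2.0, 1.0), angle=np.pi / 2)
```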
images that I want it to capture to generate the narrative sequence/series:
a tight shot/close up
motion/action shot
landscape
color detection/color picker
image processing
color segmentation
color space
model selection
skin detection
the ways in which photoshop and illustrator use a color picker
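One item on the list above, color space conversion, can be sketched with Python's standard colorsys module; the degrees/percent scaling below is a common picker convention (as in Photoshop-style tools), not from any source here:

```python
import colorsys

def rgb_to_hsv(r, g, b):
    """Convert 8-bit RGB to HSV (hue in degrees, sat/val as percentages).

    colorsys works on floats in [0, 1], so the 8-bit channels are scaled
    down first and the results scaled to the familiar picker ranges.
    """
    h, s, v = colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0)
    return h * 360.0, s * 100.0, v * 100.0

hue, sat, val = rgb_to_hsv(255, 0, 0)   # pure red: hue 0, sat 100%, val 100%
```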
Cândido, Jorge, and Maurício Marengoni. “Combining Information in a Bayesian Network for Face Detection.” Brazilian Journal of Probability and Statistics, vol. 23, no. 2, 2009, pp. 179–195. JSTOR,
www.jstor.org/stable/43601135.
http://www.jstor.org/stable/pdf/4360113 ... 38f93b8f3e
“One of the main tasks in computer vision is object detection. Object detection is the first step in most vision tasks” p 179
“Object detection is a challenging step because, in general, there are no constraints on how the object shows up in an image. There are differences related to illumination, type of sensor used, visualization, and color, among others. The detection of human faces is important due to its application in surveillance systems, human computer interaction (HCI), and biometrics systems” p 179
“The work developed for face detection can be divided into two main groups: knowledge-based methods and appearance-based methods” p 179
related projects // “The work presented here uses the knowledge-based method. The motivation for this work is related to human computer interaction and it is part of an ongoing optical mouse project for disabled computer users. The optical mouse concept designed here allows users with certain disabilities (e.g., Parkinson's disease) to operate or navigate in the Internet using the eyes to move the mouse and click on certain positions” - p 179 // the way in which it tracks a viewer’s eyes and the subtle changes in motion and direction it is able to pick up on
“The image stream will b[e captured in] real time by the webcam. Once the face is detected, it will be tracked [using] a geometrical face model, the eyes expected position will be determin[ed] and, finally, the gaze estimation will be computed. Once the eyes are [located, a] simple calibration process should provide enough accuracy for the optical mouse” p 180 // this parallels how my project intends to produce a motion-capture image in the computer-generated narrative.
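Cândido and Marengoni's idea of combining information in a Bayesian network can be sketched, under a naive conditional-independence assumption, as multiplying per-feature likelihoods and normalizing once at the end. The detectors and numbers below are hypothetical, purely to show the mechanics:

```python
def combine_evidence(prior, likelihoods):
    """Combine independent feature likelihoods with Bayes' rule.

    `likelihoods` is a list of (P(feature | face), P(feature | non-face))
    pairs; assuming conditional independence of the features (as in a
    naive Bayes network), the posterior P(face | all features) follows
    from one normalization over the two running products.
    """
    p_face, p_not = prior, 1.0 - prior
    for l_face, l_not in likelihoods:
        p_face *= l_face
        p_not *= l_not
    return p_face / (p_face + p_not)

# Hypothetical detectors: skin color, eye region, face outline. Even with
# a low prior, three agreeing cues push the posterior up substantially.
posterior = combine_evidence(0.1, [(0.9, 0.3), (0.8, 0.2), (0.7, 0.4)])
```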
Golda, Gregory J. "Integrated Arts 10 - Film Terminology and Other Resources." Film Terminology. Penn State University, n.d. Web. 24 May 2017.
http://www.psu.edu/dept/inart10_110/inart10/film.html
wide angle
zoom shot
tilt shot
soft focus
fish-eye
dissolve
Vezhnevets, Vladimir, Vassili Sazonov, and Alla Andreeva. "A survey on pixel-based skin color detection techniques." Proc. Graphicon. Vol. 3. 2003.
http://academic.aua.am/Skhachat/Public/ ... niques.pdf
“The final goal of skin color detection is to build a decision rule, that will discriminate between skin and non-skin pixels. This is usually accomplished by introducing a metric, which measures distance (in general sense) of the pixel color to skin tone. The type of this metric is defined by the skin color modeling method. One method to build a skin classifier is to define explicitly (through a number of rules) the boundaries of the skin cluster in some colorspace. For example [Peer et al. 2003]:
(R,G,B) is classified as skin if:
R > 95 and G > 40 and B > 20 and
max{R,G,B} - min{R,G,B} > 15 and
|R-G| > 15 and R > G and R > B”
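The Peer et al. rule quoted above translates directly into code. This is a literal transcription of those thresholds; the example pixels are my own choosing:

```python
def is_skin(r, g, b):
    """Classify an (R, G, B) pixel as skin per the Peer et al. (2003) rule."""
    return (r > 95 and g > 40 and b > 20
            and max(r, g, b) - min(r, g, b) > 15
            and abs(r - g) > 15 and r > g and r > b)

skin = is_skin(220, 170, 140)       # a typical skin tone: True
not_skin = is_skin(30, 60, 200)     # saturated blue: False
```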
"Oscar Nominees." Oscar.go.com. ABC News, n.d. Web. 31 May 2017.
“Film Synopsis: Set inside their home, a beloved hatchback, PEARL follows a girl and her dad as they crisscross the country chasing their dreams. It’s a story about the gifts we hand down, their power to carry love.. and finding grace in the unlikeliest of places.”
“This is the second Academy Award nomination for Patrick Osborne. He was previously nominated for: FEAST (2014). Winner, Short Film (Animated)”
It is a 360-degree animated film made for virtual reality. It plays on subject position and perspective to make the viewer feel immersed, like an active participant in the movie.
D'Zurilla, Christie. "Watch the Oscar-nominated 360-degree Short Film 'Pearl,' Set Entirely inside a Car." Los Angeles Times. Los Angeles Times, n.d. Web. 31 May 2017.
“the viewer experiences sitting passenger side throughout the whole story.”
This 360 experience is part of a Google initiative, Google Spotlight Stories
Curtis, Cassidy, et al. "The making of pearl, a 360° google spotlight story." ACM SIGGRAPH 2016 Appy Hour. ACM, 2016.
“Pearl is the first Spotlight Story to include hard cuts from shot to shot, a common film technique once considered impossible for VR, made possible in this case due to the visual anchor of the car, which remains constant while the location, time of day, props and characters are always changing (38 shots, with 26 distinct environments).”
“Pearl combines spatial audio emitters and multiple Ambisonic sound fields which track viewer orientation and are mixed binaurally in real time, as well as a musical score that blends diegetic (on-screen) and non-diegetic sources from scene to scene.”
uses synchronization for the audio
“Non-photorealistic VR: Pearl’s distinctive visual style required a break from traditional CG workflows. Instead of illuminating models with lights, object colors were baked into compact swatch textures that were then customized to achieve the exact palette required for every scene. Rough edges were achieved by warping a color pass with a structured noise field (in a second buffer) that tracks objects in space and time, has correct stereo disparity, and can be animated at any frame rate. A third buffer let lighters create art-directable contours to delineate lit and shadowed regions.”
“Pearl is a single interactive narrative experience that we adapted to a diverse range of hardware modalities, including handheld devices (both mono- and stereoscopic), non-interactive video (both rectangular and spherical), and full 6-degree-of-freedom VR. Our Story Development Kit and platform-agnostic realtime engine enabled the filmmakers to focus on the story, and made adapting it to multiple mediums relatively simple.”
Google Spotlight Stories. Google, n.d. Web. 31 May 2017.
Google Spotlight Stories caters to multiple platforms for VR storytelling. It accommodates, “mobile 360, mobile VR and room-scale VR headsets” while “building the innovative tech that makes it possible.”
It is a full sensory experience.
https://www.youtube.com/watch?v=WqCH4DN ... e=youtu.be
Films featured on Google Spotlight Stories are Pearl by Patrick Osborne, Rain or Shine by Felix Massie, The Simpsons: Planet of the Couches, Buggie Night by Mark Oftedal, On Ice by Shannon Tindle, Help by Justin Lin, and Special Delivery by Tim Ruffle.
"Get Colors from Image (BETA)." HTML Color Codes. N.p., n.d. Web. 31 May 2017.
color picker
the picker samples a 9 x 9 pixel region of the screen
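A picker that samples a 9 x 9 region rather than a single pixel can be sketched in NumPy by averaging the neighborhood, which smooths over noise and compression artifacts. The edge-clamping behavior here is my own assumption, not from the tool's documentation:

```python
import numpy as np

def pick_color(image, x, y, size=9):
    """Average the size x size pixel neighborhood centered at (x, y).

    `image` is an (H, W, channels) array; the patch is clipped to the
    image bounds near the edges, and the per-channel mean is rounded
    back to integer color values.
    """
    h = size // 2
    img = np.asarray(image, dtype=float)
    patch = img[max(0, y - h):y + h + 1, max(0, x - h):x + h + 1]
    return tuple(patch.reshape(-1, img.shape[-1]).mean(axis=0).round().astype(int))

# A synthetic 20x20 solid-red image; picking near the center returns pure red.
image = np.zeros((20, 20, 3), dtype=np.uint8)
image[..., 0] = 255
color = pick_color(image, 10, 10)   # (255, 0, 0)
```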