Cosm is an integrated collection of extensions to Max/MSP/Jitter to assist the construction of navigable, sonified, complex virtual worlds, and has been designed to facilitate use in CAVE-like environments. Cosm adds support for six degrees of freedom navigation for both the camera and world-objects, collision detection between objects (based on spherical intersection), and spatialized audio for mobile objects. It also provides rich support for 3D fields as dynamic environments, and for agent-environment interactions.
Features
Cosm has been designed so that only minimal changes to existing Max/MSP/Jitter patches are needed to support a number of features valuable in the creation of virtual worlds. In addition, several basic templates and numerous detailed examples are included in the download.
Six Degrees of Freedom navigation
The positions and orientations of the scene camera and any number of mobile objects can be manipulated as absolute values or by relative increments.
Relative increments are specified in object-local coordinate frames (in Euler or quaternion formats for orientations). Absolute positions and orientations are specified in the world-global coordinate frame (in axis/angle or quaternion formats for orientations). All internal calculations use quaternions (avoiding gimbal lock).
The Cosm toolkit download includes a set of objects for general operations using quaternions, duplicating the quaternion library previously available here.
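As a rough illustration of the underlying math (a Python sketch, not Cosm's implementation; the function names here are hypothetical), a relative orientation increment amounts to post-multiplying the current orientation quaternion, so no Euler angles are involved and gimbal lock cannot occur:

    import math

    def quat_mul(a, b):
        """Hamilton product of two quaternions (w, x, y, z)."""
        aw, ax, ay, az = a
        bw, bx, by, bz = b
        return (aw*bw - ax*bx - ay*by - az*bz,
                aw*bx + ax*bw + ay*bz - az*by,
                aw*by - ax*bz + ay*bw + az*bx,
                aw*bz + ax*by - ay*bx + az*bw)

    def quat_from_axis_angle(axis, angle):
        """Unit quaternion for a rotation of `angle` radians about `axis`."""
        x, y, z = axis
        n = math.sqrt(x*x + y*y + z*z)
        s = math.sin(angle / 2) / n
        return (math.cos(angle / 2), x*s, y*s, z*s)

    # A relative increment in the object-local frame post-multiplies
    # the current orientation:
    orientation = (1.0, 0.0, 0.0, 0.0)  # identity
    turn_left = quat_from_axis_angle((0, 1, 0), math.radians(5))
    orientation = quat_mul(orientation, turn_left)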
Collision detection
Collisions between mobile objects are based on spherical intersection (query sphere), where the radius can be configured per object; a collision reports the name, distance, and position of the other colliding party.
Objects can also be easily queried for their current properties at any time, supporting relationships between mobile agents at distances beyond collision range (e.g. flocking, camera following, etc.).
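The intersection test itself is simple: two spheres collide when the distance between their centers is at most the sum of their radii. The following Python sketch is illustrative only (Cosm's actual collision detection builds on the Neighand library, per the acknowledgements below):

    import math

    def check_collisions(objects):
        """objects: dict name -> (position (x, y, z), radius).
        Returns collision reports of (name, other, distance, other_position)."""
        reports = []
        items = list(objects.items())
        for i, (name_a, (pos_a, r_a)) in enumerate(items):
            for name_b, (pos_b, r_b) in items[i+1:]:
                d = math.dist(pos_a, pos_b)
                if d <= r_a + r_b:  # query spheres intersect
                    reports.append((name_a, name_b, d, pos_b))
                    reports.append((name_b, name_a, d, pos_a))
        return reports

    world = {"boid1": ((0, 0, 0), 1.0), "boid2": ((1.5, 0, 0), 1.0)}
    print(check_collisions(world))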
2D/3D dynamic field interaction
Cosm provides objects to support interactions between agents and dynamic environments, the latter modeled as matrices spanning the world dimensions. Fast interpolated lookup and accumulation of field values are supported both per-object (message-based) and in bulk (matrix-based). Cosm also provides additional objects and examples to support intrinsically dynamic fields, such as diffusion and advection.
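As an illustration of what interpolated lookup means here, the Python sketch below samples a scalar field at an arbitrary world position by trilinear interpolation. It is a minimal sketch assuming a field stored as a 3D array spanning the world dimensions, not Cosm's actual code:

    import numpy as np

    def field_lookup(field, pos, world_dim):
        """Trilinearly interpolated lookup of a 3D scalar field.
        field: array of shape (nx, ny, nz); pos: (x, y, z) in world units
        (assumed within bounds); world_dim: the world size the field spans."""
        # Map world coordinates into fractional cell coordinates:
        idx = [p / w * (n - 1) for p, w, n in zip(pos, world_dim, field.shape)]
        i0 = [min(int(c), n - 2) for c, n in zip(idx, field.shape)]
        fx, fy, fz = (c - i for c, i in zip(idx, i0))
        x, y, z = i0
        # Blend the 8 surrounding cell values:
        c00 = field[x, y, z]     * (1-fx) + field[x+1, y, z]     * fx
        c10 = field[x, y+1, z]   * (1-fx) + field[x+1, y+1, z]   * fx
        c01 = field[x, y, z+1]   * (1-fx) + field[x+1, y, z+1]   * fx
        c11 = field[x, y+1, z+1] * (1-fx) + field[x+1, y+1, z+1] * fx
        return (c00*(1-fy) + c10*fy) * (1-fz) + (c01*(1-fy) + c11*fy) * fz

    field = np.random.rand(16, 16, 16)
    print(field_lookup(field, (3.2, 7.7, 1.1), (10.0, 10.0, 10.0)))

Accumulation works in the reverse direction, distributing a value from an agent's position into the surrounding cells with the same interpolation weights.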
Stereographics
Cosm has been specifically designed with stereographics in mind. With stereo enabled, Cosm does the necessary work to render successive left- and right-eye views with appropriate camera displacements on each frame.
Cosm was designed for active (frame-sequential) quad-buffered stereographics, which requires hardware support and shutter glasses. However, example patches are included in the distribution to support passive (dual-display) stereographics for polarized stereo displays, with both two-window and stretched-desktop implementations.
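The per-frame camera displacement can be pictured as offsetting the camera along its local right axis by half the interocular distance. The following Python sketch is illustrative only (the eye_positions helper and its default interocular distance are hypothetical, not Cosm's API):

    import numpy as np

    def eye_positions(cam_pos, cam_quat, iod=0.06):
        """Left/right eye positions for stereo rendering.
        cam_quat: unit quaternion (w, x, y, z); iod: interocular distance."""
        w, x, y, z = cam_quat
        # Camera-local +X (right) axis rotated into world space:
        right = np.array([1 - 2*(y*y + z*z),
                          2*(x*y + w*z),
                          2*(x*z - w*y)])
        half = 0.5 * iod * right
        return cam_pos - half, cam_pos + half  # (left eye, right eye)

    left, right = eye_positions(np.array([0.0, 1.7, 5.0]), (1.0, 0.0, 0.0, 0.0))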
Spatial audio
Spatial objects can have audio signals associated with them for spatial audio rendering. The position of the object relative to the camera, and the camera orientation, are taken into account to synthesize directional and distance cues. Distance cues include distance attenuation, filtering (suggesting e.g. air absorption), and delay (e.g. Doppler shift). Directional cues are simulated using Ambisonics (2D or 3D, up to 3rd order). Many sources can be summed onto shared ambisonic-domain signal busses, and then decoded to location-specific speaker layouts. Particle systems can also be sonified in 3D higher-order Ambisonics by means of buffer granulation.
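As a rough sketch of the directional encoding stage (illustrative Python, first order only, with a simplified 1/distance attenuation; Cosm itself supports up to 3rd order and richer distance cues):

    import numpy as np

    def encode_first_order(signal, azimuth, elevation, distance):
        """Encode a mono signal into first-order B-format (W, X, Y, Z)
        with simple 1/distance amplitude attenuation."""
        gain = 1.0 / max(distance, 1.0)  # clamp to avoid blow-up near zero
        s = signal * gain
        w = s / np.sqrt(2.0)                           # omnidirectional
        x = s * np.cos(azimuth) * np.cos(elevation)    # front/back
        y = s * np.sin(azimuth) * np.cos(elevation)    # left/right
        z = s * np.sin(elevation)                      # up/down
        return w, x, y, z

    # Many sources can be summed onto the same four busses,
    # then decoded once per speaker layout:
    sr = 44100
    t = np.arange(sr) / sr
    src = np.sin(2 * np.pi * 440 * t)
    busses = encode_first_order(src, azimuth=np.pi/4, elevation=0.0, distance=2.0)

Because the encoded busses are independent of the speaker layout, the same world can be decoded to stereo, a ring, or a full 3D array by swapping only the decoder.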
Remote rendering
Cosm has been designed to scale up to situations in which the OpenGL rendering occurs on one or more machines separate from the control logic (where more projectors are needed than a single machine can support). A distinction between world master and render objects allows networked transmission of control messages for remote scene management.
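One way to picture the master/render split (an illustrative Python sketch, not Cosm's actual protocol or message format; the hosts and port are hypothetical): the master sends each control change to every render machine, and each render machine applies it to its local copy of the scene.

    import socket

    # Hypothetical render hosts for illustration:
    RENDER_HOSTS = [("192.168.0.11", 7400), ("192.168.0.12", 7400)]

    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

    def send_to_renderers(message: str):
        """Send one control message (e.g. an object's new pose)
        to every render machine."""
        data = message.encode("utf-8")
        for host in RENDER_HOSTS:
            sock.sendto(data, host)

    send_to_renderers("object ball1 position 0.0 1.5 -2.0")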
THIRD PARTY PROJECTS
AUGMENTED REALITY (FUTURE CINEMA LAB)
Cosm also forms a crucial component of Future Cinema Lab's CAVE and Augmented Reality frameworks (including integration with InterSense tracking systems), developed by Andrew Roth at the Augmented Reality Lab at York University and the Banff New Media Institute.
For more information and downloads, point your browser to Future Cinema Lab.
HISTORY
Cosm originated in the development of inaugural content for the AlloSphere at the University of California, Santa Barbara. This content was based on Marcos Novak's AlloBrain project, which was undertaken by Graham Wakefield, Lance Putnam, John Thompson, Dan Overholt and Wesley Smith. In the course of this development, Wesley Smith and Graham Wakefield authored several extensions to Max/MSP/Jitter to support the immersive, multimodal demands of the AlloSphere; these extensions have since been developed as the Cosm toolkit. Development has continued within the context of the AlloSphere and the transLAB at the Media Arts & Technology Program at UC Santa Barbara, USA, and Cosm has also formed a valuable component of several educational and research programs:
- Marcos Novak's Transvergence seminars at MAT, UC Santa Barbara, USA, and various workshops internationally.
- Wesley Smith and Graham Wakefield's Spatial Interactive Computational Composition seminars at the Southern California Institute of Architecture, Los Angeles, USA.
- Professor JoAnn Kuchera-Morin's Composition for the AlloSphere course (MAT 594P) at UC Santa Barbara, USA.
- Dr. Mark-David Hosale at TU Delft, the Netherlands.
- Andrew Roth's Augmented Reality projects at Future Cinema Lab, Dept. of Film, York University, Toronto, Canada and Banff New Media Institute.
Acknowledgements
With gratitude to professors JoAnn Kuchera-Morin and Marcos Novak. Collision detection is based upon the Neighand library by Nicolas Brodu. Parts of the audio processing derive from Lance Putnam's Gamma library.