Cosm for Max/MSP/Jitter

Construct interactive, navigable, sonified virtual worlds using Max/MSP/Jitter.

© 2008 Wesley Smith & Graham Wakefield.

Cosm concepts

A Cosm world involves several conceptual components:

Space

A Cosm world-space can be either bounded or unbounded, chosen by means of the @infinite attribute. Either way, it has a cuboid size associated with it (set by the @size attribute); in the unbounded case this represents the region around the observer in which objects will be visualized and sonified. Boundary conditions for agents can be set per cosm.nav object with the @boundmode attribute, which when enabled forces objects to wrap around this specified region.
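As a sketch of what wrapping means in practice, the snippet below (plain Python, not Max code; the function name and the assumption that the cuboid is centered on the origin are ours, not Cosm's) folds a position back into a cuboid of a given @size:

```python
# Hypothetical sketch of a wrapping boundary condition, in the spirit of
# cosm.nav's @boundmode. Assumes the world cuboid is centered on the origin.
def wrap(pos, size):
    """Wrap each coordinate into [-size/2, size/2)."""
    return [((p + s / 2.0) % s) - s / 2.0 for p, s in zip(pos, size)]

print(wrap([5.0, 0.0, -5.0], [8.0, 8.0, 8.0]))  # -> [-3.0, 0.0, 3.0]
```

An object leaving one face of the region re-enters at the opposite face, which is what makes an unbounded, observer-centered world navigable without objects drifting away forever.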

Fields

There are certain properties of worlds that can be modeled as quantities or intensities which may vary across space (and time), but always have a value at any particular point. For example, if we drop a block of sugar into a glass of water, we can ask what the concentration of sugar is at any particular position in the glass. Initially the concentration will be high near the base of the glass, where the block landed, but over time it will diffuse throughout the entire liquid until it is roughly the same value throughout.

We can model intensities that vary over space (fields) by dividing a space into cells (voxels) and associating each cell with a value. In Jitter, this can be done using a 3D matrix. Note: the dimensions of the field matrix do not need to match those of the world; the cosm.field objects scale the coordinates between the two accordingly. Computing dynamics over 3D matrices can rapidly become CPU-intensive, so tuning the dimensions of the field matrix can be important to maintain an efficient patch.
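The coordinate scaling mentioned above can be sketched as follows (a pure-Python illustration; the function name and the centered-cuboid assumption are ours, not the cosm.field internals):

```python
# Map a world position inside a cuboid of @size into fractional voxel
# indices of a field matrix whose dimensions need not match the world.
# Assumes the world cuboid is centered on the origin (illustrative only).
def world_to_voxel(pos, world_size, dim):
    return [(p / s + 0.5) * (d - 1) for p, s, d in zip(pos, world_size, dim)]

print(world_to_voxel([0.0, 0.0, 4.0], [8.0, 8.0, 8.0], [16, 16, 16]))
# -> [7.5, 7.5, 15.0]
```

Because the result is fractional, reads and writes at that position naturally involve interpolation between neighboring voxels, as described below.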

Multi-plane matrices can be used to store multiple values for each cell, such as modeling the concentrations of several different chemicals through a space using the same matrix. In this case, query interactions will accept and return multiple values (lists).

Querying & Modifying

The value (or values) of the field at any arbitrary location can be returned by interpolating those of the nearest voxels. The values can also be modified (by addition) with interpolation, in which case the added value is mixed between the nearest voxels. The cosm.field.query object supports reading and writing on a per-message basis, suitable for use with a cosm.nav-controlled agent/object. The cosm.field object supports reading and writing en masse, using Jitter matrices, suitable for particle-based simulations.

Field dynamics

A growing set of externals, abstractions and examples specifically oriented to spatial algorithms over 2D/3D fields is included in Cosm. The cosm.diffuse object uses stable solutions to model the way in which concentrations tend to even out over time. Diffusion rates can be varied per plane, making it possible to simulate reaction-diffusion systems, for example.
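To illustrate the evening-out behavior, here is a minimal explicit diffusion step (our sketch, not cosm.diffuse's stable solver; reduced to 1D for brevity, and only stable for small rates, whereas a stable solver has no such restriction):

```python
# One explicit diffusion step over a 1D field with a periodic boundary
# (matching a wrapped world; an assumption for illustration). The rate k
# plays the role of a per-plane diffusion rate.
def diffuse(cells, k):
    n = len(cells)
    return [cells[i] + k * (cells[(i - 1) % n] + cells[(i + 1) % n] - 2 * cells[i])
            for i in range(n)]

field = [0.0, 0.0, 1.0, 0.0, 0.0]   # a concentrated 'drop' in the middle
for _ in range(3):
    field = diffuse(field, 0.25)
print(field)   # the peak spreads out; the total concentration is conserved
```

Each step moves a fraction of every cell's difference from its neighbors, so the peak flattens over time while the total quantity stays constant, just as with the sugar-in-water example.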

Objects

Some aspects of a world are more singularly located, and possibly mobile. In Cosm the cosm.nav object can represent a mobile oriented point, sending the appropriate position and rotation messages to objects for rendering and to objects for sonification. An oriented point is one that has its own local reference coordinate frame.

Objects can be named, using the @name attribute. Note that the main Cosm scene camera is also an implicit cosm.nav with the name camera.

Placing & Movement

The positions of cosm.nav objects can be set in absolute (world-global) coordinates, and the velocities of objects can be set in relative (object-local) values. By convention in Cosm, we treat the positive Z axis as 'forward' and the positive Y axis as 'up'.

The orientation of an object can be set in absolute (world-global) quaternion or axis/angle formats, and the angular velocity can be set in relative (object-local) quaternion or Euler angle formats. Local Euler angles assume a convention of azimuth (tilt), elevation (tumble) and bank (roll). Velocity and angular velocity are applied to position and orientation whenever the cosm.nav receives a bang message.
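The key step on a bang, for position, is rotating the object-local velocity into world-global coordinates by the current orientation. A hedged sketch of that step (our Python, not Cosm's API):

```python
import math

# Rotate an object-local velocity into world coordinates by the current
# orientation quaternion, then add it to the world-global position.
def quat_mul(a, b):
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return (aw*bw - ax*bx - ay*by - az*bz,
            aw*bx + ax*bw + ay*bz - az*by,
            aw*by - ax*bz + ay*bw + az*bx,
            aw*bz + ax*by - ay*bx + az*bw)

def rotate(q, v):
    """Rotate vector v by unit quaternion q (computes q * v * q^-1)."""
    w, x, y, z = q
    p = quat_mul(quat_mul(q, (0.0,) + tuple(v)), (w, -x, -y, -z))
    return list(p[1:])

pos = [0.0, 0.0, 0.0]
# Orientation: 90-degree yaw about +Y, so local 'forward' (+Z in Cosm's
# convention) points along world +X.
q = (math.cos(math.pi / 4), 0.0, math.sin(math.pi / 4), 0.0)
vel = [0.0, 0.0, 1.0]           # one unit 'forward' in object-local terms
pos = [p + d for p, d in zip(pos, rotate(q, vel))]
print(pos)                      # approximately [1.0, 0.0, 0.0]
```

The same quaternion product applies the angular velocity to the orientation; both integrations happen once per bang.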

Collisions

The @radius attribute sets the spherical region, centered on the object, in which it will detect intersections (collisions) with other cosm.nav objects, provided its @nhood attribute is not set to 0. Collisions are reported from the middle outlet of the cosm.nav object, and include the name of the collided object, its distance, and its position.
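The detection itself is conceptually a distance test against each neighbor. A hypothetical sketch of such a report (our names and output shape, not Cosm's actual algorithm; this version simply tests each neighbor against the querying object's @radius):

```python
import math

# For each named neighbor within 'radius' of 'pos', report a tuple of
# (name, distance, position), echoing the middle-outlet report described
# in the text. Requires Python 3.8+ for math.dist.
def collisions(pos, radius, others):
    out = []
    for name, p in others.items():
        d = math.dist(pos, p)
        if d <= radius:
            out.append((name, d, p))
    return out

others = {"a": [1.0, 0.0, 0.0], "b": [5.0, 0.0, 0.0]}
print(collisions([0.0, 0.0, 0.0], 2.0, others))
# -> [('a', 1.0, [1.0, 0.0, 0.0])]
```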

Queries

The properties (attributes) of any named cosm.nav object, including position, orientation, coordinate frame, velocities etc., can be queried or set via the cosm.query object.

Particles

Particles can be represented using a jit.matrix of 3 planes, float32 type and typically one dimension, where each matrix cell represents a point in space. Standard Jitter matrix operations can be used to operate on and modify these positions. There are examples showing how this can integrate with the jit.p.* particle system objects. Particles are not named and cannot query each other, but they can interact with fields (both reading and writing) by means of the cosm.field object. Particles can also be rendered, and sonified by granular synthesis as described below.
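A 3-plane, one-dimensional float32 matrix is conceptually just a list of (x, y, z) cells. The sketch below (pure Python, illustrative only) applies a uniform drift to every particle, analogous to operating on such a matrix with standard Jitter operators:

```python
# Each 'cell' holds one particle position; a matrix-wide operation maps
# over all cells at once.
particles = [[0.0, 0.0, 0.0], [1.0, 2.0, 3.0]]
drift = [0.0, 0.1, 0.0]
particles = [[c + d for c, d in zip(p, drift)] for p in particles]
print(particles)  # -> [[0.0, 0.1, 0.0], [1.0, 2.1, 3.0]]
```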


Stereographics

Cosm has been specifically designed with active stereographics in mind. The @stereo attribute of cosm.master is forwarded through any attached cosm.render objects, which configure the rendering destination appropriately. With stereo enabled, cosm.render does the necessary work to render successive left- and right-eye views, with appropriate camera displacements, into the appropriate buffers on the graphics card on each frame. The @lens_angle, @focus and @spread attributes of cosm.master may be used to adjust the stereographic geometry as desired.
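The camera displacement can be illustrated as follows (assumed names and geometry, ours rather than cosm.master's internals; only the eye-separation aspect is shown, with @lens_angle and @focus effects omitted):

```python
# Displace the left/right eye positions from the camera position along
# the camera's local X ('right') axis by half an eye-separation 'spread'.
def eye_positions(cam_pos, right_axis, spread):
    half = spread / 2.0
    left = [c - half * r for c, r in zip(cam_pos, right_axis)]
    right = [c + half * r for c, r in zip(cam_pos, right_axis)]
    return left, right

l, r = eye_positions([0.0, 0.0, 0.0], [1.0, 0.0, 0.0], 0.12)
print(l, r)  # -> [-0.06, 0.0, 0.0] [0.06, 0.0, 0.0]
```

Each eye's view is then rendered into its own buffer on every frame, which is what active stereo hardware alternates between.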

Audio

Spatial objects can have audio signals associated with them by linking a cosm.nav to an audio external. The position of the object relative to the camera, and the camera's orientation, are taken into account to synthesize directional and distance cues.

Distance cues are calculated within the external, including distance attenuation, filtering (suggesting e.g. air absorption) and delay (producing e.g. Doppler shift). These effects are parameterizable for the scene as a whole via attributes of the same object.
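Typical forms for two of these cues can be sketched as follows (assumed formulas for illustration; Cosm's actual curves are governed by attributes and are not reproduced here):

```python
# Inverse-distance attenuation (clamped near the listener to avoid a
# blow-up) and a propagation delay; when the distance changes over time,
# a time-varying delay line produces the Doppler shift.
SPEED_OF_SOUND = 344.0  # m/s, a common room-temperature approximation

def distance_cues(distance, near=1.0):
    gain = near / max(distance, near)
    delay_seconds = distance / SPEED_OF_SOUND
    return gain, delay_seconds

g, d = distance_cues(10.0)
print(round(g, 3), round(d, 4))  # -> 0.1 0.0291
```

The filtering cue (air absorption) would additionally roll off high frequencies as distance grows.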

Directional cues are simulated using ambisonics: each source's output can be encoded into ambisonic-domain signals (2D or 3D, up to 3rd order) using the cosm.ambi.encode~ object. Many sources can be summed onto shared ambisonic-domain signal busses, and then decoded to location-specific speaker layouts using the cosm.ambi.decode~ object.
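To make the bus-based workflow concrete, here is a minimal first-order 2D encode/decode sketch (the classic W/X/Y equations; ambisonic normalization conventions vary, and this is not necessarily the scheme Cosm uses):

```python
import math

# Encode a sample at a given azimuth into first-order 2D ambisonic
# components W/X/Y; decode a W/X/Y bus to a single speaker feed.
def encode(sample, azimuth):
    w = sample * (1 / math.sqrt(2))
    x = sample * math.cos(azimuth)
    y = sample * math.sin(azimuth)
    return [w, x, y]

def decode(bus, speaker_azimuth):
    w, x, y = bus
    return 0.5 * (math.sqrt(2) * w
                  + x * math.cos(speaker_azimuth)
                  + y * math.sin(speaker_azimuth))

# Two sources summed onto one shared W/X/Y bus, then decoded to a speaker
# at azimuth 0 (aligned with the louder source).
bus = [0.0, 0.0, 0.0]
for sample, az in [(1.0, 0.0), (0.5, math.pi)]:
    for i, c in enumerate(encode(sample, az)):
        bus[i] += c
print(round(decode(bus, 0.0), 3))  # -> 1.0
```

Because encoding and summing are linear, any number of sources share the same fixed-size bus, and only the final decode depends on the speaker layout.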

Particles are sonified by granulating a named buffer~, using the same distance-filtering settings and mixing into the same ambisonic signal busses as above.

Cosm is developed by Wesley Smith & Graham Wakefield in the AlloSphere Research Group, Media Arts & Technology, University of California Santa Barbara.
