Visual Communication

On Monday, we were warmly welcomed by Chris Bowman, Holger Deuter and Jason Benedek and told what to expect for the next fortnight. This included our brief, which would be broken into several parts; the main goal was to explore new ways to visualise environmental, sound and movement data according to the theme 'Estuary' and the concept of the Data Arena as an 'intelligent membrane'.

After lunch, we paid the highly anticipated visit to the UTS Data Arena in Building 11. We were provided with powered 3D glasses, which were necessary to view the space in 3D, and stepped into the darkened cylindrical room.

In a nutshell, the UTS Data Arena provides an open-source, interactive space for data visualisation based on 360-degree video projection as well as motion detection. Boasting a solid six HD cameras plus secondary cameras for motion capture, this facility is truly amazing.

There were two different controls used to manipulate the visuals; however, I found one particular mo-cap (motion capture) marker, used as a controller, the most interesting. Oddly crab-like in appearance, with spheres pierced onto spokes, it provides a recognition target for the cameras to detect and respond to. Switching up the positions of the spheres effectively creates a completely reassigned controller, and moving a cursor or clicking on a screen element simply requires a flip of the marker.


Monday 08.02.2016


Today Holger Deuter introduced us to the MotionBuilder interface and showed us how to create light emissions from a moving 3D object. Starting with a basic shape asset, we applied a texture and a particle shader to it. By adjusting the emitted particles' lifespan, as well as other values, we were able to create a wispy light-painting tool, which we would soon learn to anchor to Dean Walsh's motion capture data.
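MotionBuilder exposes these settings through its interface rather than code, but a rough sketch of the underlying idea (plain Python with made-up values, not MotionBuilder's actual API) shows why the lifespan value shapes the trail: every frame the moving object emits new particles, and each one fades out over its lifespan, so a longer lifespan leaves a longer glowing wake.

```python
import math

LIFESPAN = 30        # frames a particle stays visible (the value we tweaked)
EMIT_PER_FRAME = 5   # particles spawned at the emitter each frame

particles = []       # each particle is [x, y, age]

def emitter_position(frame):
    # stand-in for the moving 3D object (here, a simple circular path)
    return math.cos(frame * 0.1), math.sin(frame * 0.1)

for frame in range(120):
    x, y = emitter_position(frame)
    particles.extend([x, y, 0] for _ in range(EMIT_PER_FRAME))
    for p in particles:
        p[2] += 1                                  # age every particle
    particles = [p for p in particles if p[2] < LIFESPAN]  # expire old ones
    # opacity falls off with age, giving the wispy fade of the light painting
    trail = [(p[0], p[1], 1.0 - p[2] / LIFESPAN) for p in particles]

print(f"steady-state trail length: {len(trail)} particles")
```

Doubling LIFESPAN in this toy model doubles the number of live particles, which is exactly the longer, smokier streak we saw when increasing the lifespan in the particle shader.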

This lesson was particularly engaging, as it allowed me to delve into the world of 3D software, familiarise myself with the asset-building and camera components, and experiment with the various settings.


Motion capture suit in action (Jürmann 2016)


Light painting with markers at the UTS Data Arena (Jürmann 2016)


Tuesday 09.02.2016

Today we spent time at the Data Arena exploring motion capture technology. We were introduced to the mo-cap suit, which was studded with several highly reflective markers, the same kind we found on the crab-like controllers. We were shown how a dance form or movement can be tracked through these assistive devices and recorded as an .fbx file, which could later be manipulated in MotionBuilder.

We also spent some time experimenting with the various hand-held markers to do some light painting on the 360-degree screen.



Jürmann, B. 2016, Blake William Jürmann on Vimeo, viewed 10 February 2016, <>.


Wednesday 10.02.2016

Returning to our lab studio today, we learnt more techniques for making and rendering sprites to use in Autodesk MotionBuilder's particle shader. During this lesson, I also felt the need to experiment with shapes to form a blood-cell-like structure in relation to the theme of Estuary (the estuary being the life of the land, the way a blood vessel is to the body). I rendered a 3D torus to use as a sprite, which would be emitted from a larger torus that looked like a cut-away diagram of a major artery.


Sprites attached to the torus


The resultant render, transformed into a .gif file.


Playing around with parenting and constraining


Constraints were also added to create fixed points on the movement data, allowing the number of points to be simplified before the data was brought into Softimage (see the sketch below). In this software, we were also shown more techniques for creating particle effects, such as cigarette smoke, fire, bubbles and falling leaves.
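As a rough illustration of what a constraint does here (a minimal sketch in plain Python with hypothetical values, not the Softimage or MotionBuilder API): the constrained point carries no animation of its own; its position is simply derived from the parent's movement data each frame, which is why a handful of constrained points can stand in for a much denser marker set.

```python
# Hypothetical marker positions per frame (not real capture data).
parent_path = [(0.0, 0.0), (1.0, 0.0), (2.0, 1.0), (3.0, 3.0)]
offset = (0.5, 0.0)   # fixed offset between the parent and the constrained point

# The constrained point needs no keyframes of its own; every frame its
# position is computed from the parent, so one path drives many points.
child_path = [(x + offset[0], y + offset[1]) for x, y in parent_path]
print(child_path)   # [(0.5, 0.0), (1.5, 0.0), (2.5, 1.0), (3.5, 3.0)]
```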


Experimenting with the cigarette smoke particle generator


Friday 12.02.2016 & Monday 15.02.2016

We explored the use of a script made by Darren, the technician from the Data Arena, to create a Windows batch process that merges several images into a video file, with variables such as resolution, file name, location and frame rate. This is an alternative way to turn a collection of sequentially numbered 3D-rendered frames into a video that plays back independently of any rendering software. Apart from that, we also briefly learnt how to frame a motion capture ready for rendering at 4096 x 476 pixels.
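Darren's actual script isn't reproduced here, but a minimal sketch of the same idea, calling ffmpeg from Python (the tool choice, paths and values below are all my own assumptions), might look something like this:

```python
# Sketch only: hypothetical paths and values, not Darren's actual script.
# The variables mirror the ones his script exposed: location, file-name
# pattern, frame rate and output resolution.
import subprocess

framerate = 25                                  # playback rate of the stills
input_pattern = r"C:\renders\frame_%04d.png"    # numbered frames, e.g. frame_0001.png
resolution = "1920x1080"
output = "render.mp4"

# -framerate reads the numbered stills at the given rate; -s scales the
# output; libx264 with yuv420p produces a file most players can open.
subprocess.run([
    "ffmpeg", "-framerate", str(framerate), "-i", input_pattern,
    "-s", resolution, "-c:v", "libx264", "-pix_fmt", "yuv420p", output,
], check=True)
```

The appeal of the batch approach is that only the variables at the top ever need changing between jobs; the frames never have to be reopened in the 3D software just to be assembled into a video.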