On Monday, we were warmly welcomed by Chris Bowman, Holger Deuter and Jason Benedek and told what to expect over the next fortnight. This included our brief, which would be broken into several parts. The main goal was to explore new ways to visualise environmental, sound and movement data according to the theme 'Estuary' and the concept of the Data Arena as an 'intelligent membrane'.
After lunch, we paid the highly anticipated visit to the UTS Data Arena in Building 11. We were given powered 3D glasses, which were necessary to view the space in 3D, and stepped into the dark cylindrical room.
In a nutshell, the UTS Data Arena is an open-source interactive space for data visualisation, built around 360-degree video projection and motion detection. Boasting six HD projectors plus an array of secondary cameras for motion capture, this facility is truly amazing.
Two different controllers were used to manipulate the visuals; of these, I found the mo-cap (motion capture) marker controller the most interesting. Oddly crab-like in appearance, with spheres mounted on protruding spokes, it provides a recognition target for the cameras to detect and respond to. Rearranging the spheres on the controller effectively creates a completely new controller, since the system identifies each unique sphere configuration as a distinct object. Moving a cursor or clicking on a screen element requires only a flip of the marker.
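To get a feel for how a flip of the marker could register as a click, here is a minimal sketch of the idea. Everything in it is invented for illustration (the `MarkerPose` type, the normalised coordinates, the pose stream); a real system such as the Data Arena's would stream tracked rigid-body poses from its camera array, but the edge-triggered "flip = click" logic would look broadly similar.

```python
# Hypothetical sketch: mapping a tracked marker's pose to cursor and
# click events. A click fires on the transition from upright to flipped,
# not while the marker simply stays upside down.
from dataclasses import dataclass

@dataclass
class MarkerPose:
    x: float      # horizontal cursor position, normalised 0..1 (assumed)
    y: float      # vertical cursor position, normalised 0..1 (assumed)
    up_z: float   # z-component of the marker's "up" vector: +1 upright, -1 flipped

def interpret(poses):
    """Turn a stream of poses into ((x, y), clicked) events."""
    events = []
    was_flipped = False
    for p in poses:
        flipped = p.up_z < 0                   # marker turned upside down
        clicked = flipped and not was_flipped  # edge-trigger on the flip itself
        events.append(((p.x, p.y), clicked))
        was_flipped = flipped
    return events

# Simulated stream: move right while upright, then flip once to click.
stream = [MarkerPose(0.1, 0.5, 1.0),
          MarkerPose(0.4, 0.5, 1.0),
          MarkerPose(0.4, 0.5, -1.0),   # flip: this frame clicks
          MarkerPose(0.4, 0.5, -1.0)]   # still flipped: no new click
events = interpret(stream)
```

The edge-trigger is the key design choice: holding the marker upside down produces one click, not a click on every frame.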