projects

Real Imaginary Objects
Melbourne-based artist Daniel Crooks works on the visualization of time in video sequences; his central concern is the treatment of time as a physical material. In a research collaboration with the Ars Electronica Futurelab, approaches were evaluated for achieving similar results in three-dimensional space by developing an encompassing sensor setup that records movement in a defined area and stacks the captured frames over time. In contrast to the video-based work, the outcome is a sculpture whose height represents time.
The resulting “Slicer” setup consists of four laser ranging devices surrounding the tracking area, spanning a detection plane at variable height. As a user walks into and moves around in the space, the data of the four devices is correlated and merged as closely as possible into a closed shape (slice). Subsequent frames are then stacked on top of each other to represent the temporal aspect of the movement in the final sculpture. Exporters to mesh as well as point cloud file formats simplify any necessary post-processing.
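Conceptually, the stacking step boils down to lifting each merged 2D slice to a height proportional to its frame index. The sketch below is a minimal illustration under that assumption, not the actual Futurelab pipeline: it takes per-frame contour points (already merged from all four sensors) and writes them as a plain ASCII XYZ point cloud.

```python
# Minimal sketch: stack per-frame 2D slices into a point cloud where z encodes time.
# Assumes each frame is a list of (x, y) contour points already merged from all sensors.

def stack_slices(frames, layer_height=2.0):
    """Lift each 2D slice to z = frame_index * layer_height."""
    cloud = []
    for i, slice_points in enumerate(frames):
        z = i * layer_height
        cloud.extend((x, y, z) for x, y in slice_points)
    return cloud

def export_xyz(cloud, path):
    """Write the stacked points as a plain ASCII XYZ point cloud."""
    with open(path, "w") as f:
        for x, y, z in cloud:
            f.write(f"{x:.3f} {y:.3f} {z:.3f}\n")

# Example: two dummy frames of a person-sized blob drifting to the right.
frames = [[(0.0, 0.0), (300.0, 0.0), (300.0, 450.0), (0.0, 450.0)],
          [(50.0, 0.0), (350.0, 0.0), (350.0, 450.0), (50.0, 450.0)]]
export_xyz(stack_slices(frames), "sculpture.xyz")
```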
To lower the cost of a possible follow-up setup, we evaluated replacing the expensive laser ranging hardware with depth-imaging cameras. To reproduce similar functionality, only a single pixel row of the depth image is evaluated; at close range, the cameras actually outperformed the lasers in terms of precision.
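For illustration, reprojecting a single depth-image row onto the detection plane can be done with a simple pinhole model, as in the hedged sketch below; the focal length and principal point are hypothetical placeholders, not the intrinsics of the cameras that were actually evaluated.

```python
# Sketch: convert one row of a depth image into 2D points on the detection plane.
# fx and cx are hypothetical pinhole intrinsics; depth values are in millimetres.

def depth_row_to_points(depth_row, fx=570.0, cx=None):
    cx = (len(depth_row) - 1) / 2.0 if cx is None else cx
    points = []
    for u, d in enumerate(depth_row):
        if d <= 0:                  # 0 = no measurement at this pixel
            continue
        x = (u - cx) * d / fx       # lateral offset within the plane
        points.append((x, float(d)))
    return points

row = [0, 1200, 1210, 1195, 0, 0, 2500]   # dummy depth values for one pixel row
print(depth_row_to_points(row))
```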
The resulting sculptures were 3D-printed and put on display at exhibitions in Australia as well as at the Ars Electronica Festival 2014.

An interview about the development process can be found here.

Team: Otto Naderer, Daniel Crooks (danielcrooks.com)
Many thanks to the SICK sensor technology corporation for providing the laser ranging sensors.

OscFluctuation
The media art installation OscFluctuation can be seen as an interactive audiovisual instrument that is played by the visitors’ movement. The title combines the English terms oscillation and fluctuation, both of which can be traced back to fields as diverse as thermodynamics, music and quantum mechanics. Our work seeks to blur the line between predictability, improvisation and composition. Recorded movement sequences are projected as silhouettes onto the very distinctive windows of the Szene Salzburg. The movement within these sequences is analyzed and translated into sound parameters, which results in an ever-changing improvisation of music, movement and algorithms.
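A rough sketch of the general principle, movement analysis driving sound parameters, is given below. It assumes the python-osc package and a hypothetical synthesizer listening on port 9000 at an address such as /osc/intensity; the actual mapping in OscFluctuation is considerably more elaborate.

```python
# Sketch: map frame-to-frame silhouette movement to a sound parameter via OSC.
# Assumes python-osc is installed and a synth listens on localhost:9000.
from pythonosc.udp_client import SimpleUDPClient

client = SimpleUDPClient("127.0.0.1", 9000)

def movement_energy(prev_mask, curr_mask):
    """Fraction of silhouette pixels that changed between two binary frames."""
    changed = sum(p != c for p, c in zip(prev_mask, curr_mask))
    return changed / len(curr_mask)

def update_sound(prev_mask, curr_mask):
    energy = movement_energy(prev_mask, curr_mask)
    client.send_message("/osc/intensity", energy)   # hypothetical OSC address

update_sound([0, 0, 1, 1], [0, 1, 1, 0])
```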

OscFluctuation is a project in cooperation with “Altstadt Salzburg Marketing GmbH”
(city marketing of Salzburg)

Sound:
Florian Jindra

Team:
Robert Praxmarer, Gerlinde Emsenhuber, Robert Sommeregger, Steven Stojanovic, Thomas Wagner
CADET & PELS Research Team,
MultiMediaTechnology Department
Salzburg University of Applied Sciences

Moves Reloaded
“Moves Reloaded” is an interactive music video and dance installation that lets visitors become part of an endless choreography. Each visitor performs three seconds of his or her best dance moves, our system records them, and the clips are collaged, always differently, in sync with the music. The installation deals with the topics of sampling, remixing and user-generated music video production.
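A minimal sketch of the collage idea, assuming the recorded three-second clips are simply indexed and the track is cut into equal segments; the installation’s real scheduling follows the music more closely.

```python
# Sketch: pick a different recorded 3-second clip for every 3-second music segment.
import random

def build_collage(clip_ids, track_seconds, clip_len=3.0):
    """Return (start_time, clip_id) pairs covering the track, reshuffled each run."""
    schedule = []
    t = 0.0
    while t < track_seconds:
        schedule.append((t, random.choice(clip_ids)))
        t += clip_len
    return schedule

print(build_collage(clip_ids=[0, 1, 2, 3], track_seconds=12.0))
```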

a project in cooperation with “Stadtmarketing Saalfelden” (city marketing Saalfelden)

Team: Robert Praxmarer, Gerlinde Emsenhuber, Thomas Wagner
CADET & PELS Research Team,
MultiMediaTechnology Department
Salzburg University of Applied Sciences

Music: Vitalic, Poison Lips

IN MOTION
In Motion is an improvisational dance performance with interactive sound and visuals projected beneath the dancers. Each of the three scenes has its own interaction and improvisation rules.

Dancers: 8th semester of the Carl Orff Institute Salzburg
(Mozarteum, Salzburg)

Team: Nik Psaroudakis, Gerlinde Emsenhuber, Thomas Wagner
CADET & PELS Research Team,
MultiMediaTechnology Department
Salzburg University of Applied Sciences

Mudra 3D/EU
MUDRA 3D/EU is a multimedia reference work, educational and training program for Austrian and European sign languages with 3D sign language representation, bi-directional search function and 3D educational games.

MUDRA was developed by Fischer Film GmbH (client) together with the author Wolfgang Georgsdorf from 1993 to 2002. A limited edition of the program has been sold over the past eight years.

The main aims of the new project are the program’s advancement in terms of technology and content: a 3D application, games, a substantially expanded Austrian standard vocabulary, and an adaptation of the program for online use and e-learning.

The 3D application, the display of gestures as 3D animation, is a significant advantage for the user, as it enables an all-round view, including the subjective view of the gesture.

The implementation of an integrated software and hardware solution for 3D recording of gestures for the sign language program MUDRA 3D is realized by the Ars Electronica Futurelab.

The Ars Electronica Futurelab has conducted research on the technological aspects of device-based full-body tracking within the framework of the CADET project. The results led to the development of applied scenarios concerning the analysis of the technical possibilities of 3D recordings of gestures for subsequent real-time display within the MUDRA 3D/EU project.
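As an illustration of what a device-based gesture recording can boil down to, the sketch below stores tracked joint positions frame by frame and replays them at a fixed rate; the joint names, frame rate and capture source are hypothetical and not part of the actual MUDRA tooling.

```python
# Sketch: record tracked skeleton joints per frame and replay them for 3D display.
import time

def record(get_joints, frames=3, fps=30.0):
    """get_joints() should return {joint_name: (x, y, z)} for the current frame."""
    clip = []
    for _ in range(frames):
        clip.append(get_joints())
        time.sleep(1.0 / fps)
    return clip

def replay(clip, render, fps=30.0):
    for joints in clip:
        render(joints)              # hand the frame to whatever draws the 3D avatar
        time.sleep(1.0 / fps)

# Dummy capture source and renderer for demonstration.
dummy = lambda: {"right_hand": (0.2, 1.3, 0.5), "left_hand": (-0.2, 1.3, 0.5)}
replay(record(dummy), render=print)
```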

MUDRA 3D/EU is realized with the support of Austria Wirtschaftsservice AWS Impulse and the Austrian Research Promotion Agency FFG.

credits: Roland Haring, Michael Platz, Johannes Bauer-Marschalinger, Christopher Lindinger

EMG-Shield
This project deals with the biosignal EMG (electromyogram) and its potential as a control signal in connection with the microcontroller platform Arduino. The platform offers different pluggable modules, so-called shields; unfortunately, a professional EMG shield has been missing so far. For this reason, a first EMG shield prototype was developed. The process included the simulation of the electrical circuits, the implementation of these circuits on a breadboard and, finally, their verification. For the verification, a ping-pong game was modified to test the suitability of the EMG signal as control input. The test application showed that it is possible to control a ping-pong game quite comfortably just by the power of a muscle. This underlines the high control potential of the EMG signal and motivates further development. Consequently, the circuit board shown in our first video has now been integrated into an Arduino shield. The wiring and the software will soon be available online.
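For illustration, the host side of such a test could look roughly like the sketch below. It assumes pyserial, an Arduino that streams one rectified EMG reading per line over USB serial, and a paddle represented by a single coordinate; the port name and threshold are placeholders.

```python
# Sketch: read EMG values from the Arduino over serial and move a paddle when
# the (smoothed) muscle activity exceeds a threshold. Port and values are placeholders.
import serial  # pyserial

def run(port="/dev/ttyACM0", threshold=300, alpha=0.2):
    ser = serial.Serial(port, 115200, timeout=1)
    smoothed = 0.0
    paddle_y = 0.0
    while True:
        line = ser.readline().strip()
        if not line:
            continue
        value = float(line)                                 # one rectified EMG sample per line
        smoothed = alpha * value + (1 - alpha) * smoothed   # simple low-pass filter
        paddle_y += 1.0 if smoothed > threshold else -1.0   # tense = up, relax = down
        print(f"EMG {smoothed:6.1f} -> paddle y = {paddle_y:5.1f}")

if __name__ == "__main__":
    run()
```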

credits: Dieter Steininger, Michael Platz, Roland Haring, Christopher Lindinger, Veronika Pauser, Roland Aigner, Otto Naderer

BrainBattle
In the interactive arcade game “BrainBattle”, two players can take on the ultimate battle of brains against each other using brain-computer interfaces. Well-known classic games like Pong, Space Invaders and Pac-Man are thus revitalized by an unusual form of interaction.

The electroencephalogram (EEG) is probably far better known from the medical field; with this installation, however, the technology makes its way into gaming. In addition to realizing the gameplay with thoughts alone, the interface for “BrainBattle” also allows utilizing the tilt of the head and the movement of the facial muscles as means of interaction. A high degree of concentration is required to move the characters across the playing field entirely hands-free.
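Purely as an illustration of how such channels might be combined, the sketch below maps three hypothetical headset readings (concentration, head tilt, facial muscle activity) to game commands; the signal names and thresholds are assumptions, not those of the actual installation.

```python
# Sketch: combine three hypothetical headset channels into game commands.
# concentration in [0, 1], head_tilt in degrees, jaw_clench as a boolean.

def to_command(concentration, head_tilt, jaw_clench,
               focus_threshold=0.6, tilt_threshold=10.0):
    if jaw_clench:
        return "fire"
    if concentration < focus_threshold:
        return "idle"          # not enough focus: the character does not move
    if head_tilt > tilt_threshold:
        return "move_right"
    if head_tilt < -tilt_threshold:
        return "move_left"
    return "move_forward"

print(to_command(0.8, -15.0, False))   # -> "move_left"
```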

credits: Roland Haring, Veronika Pauser, Otto Naderer, Christopher Lindinger, Michael Badics, Imanol Gomez, My Trinh Gardiner

Jan 27-30, 2012. Techkriti Festival, Indian Institute of Technology, Kanpur, India.
Nov 26 and Nov 27, 2011. 10:00-18:00, LabDays: Gehirn at AEC, Linz.
Oct 20, 2011. INFORS 2011 at wolke19 Ares Tower, Vienna.

libHecato
libHecato is an easy-to-use framework for quickly setting up depth-imaging-enabled (e.g. “Kinect”) multi-user interaction applications. It is able to process touch events and gestural movements and to track the presence of users. Got a big projection you have always wanted to use as a multi-touch surface? Grab a depth camera and libHecato is your solution. The framework expects the camera to be placed above the interaction area to minimize occlusion while maximizing the number of possible simultaneous users. libHecato’s key strengths are the following:

Multi Aspect
Whether you want to process point-and-click input or gestures, libHecato has a tracking component that robustly correlates frame-wise detections to Kalman-filtered tracks. Besides a definable area for hand actions, it is also possible to set a region in the camera image in which persons are to be detected.
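The sketch below is a stripped-down illustration of that idea, greedy nearest-neighbour assignment combined with a constant-velocity Kalman filter; it is not libHecato’s actual tracker, and the gate distance and noise matrices are placeholder values.

```python
# Sketch: correlate per-frame blob detections to constant-velocity Kalman tracks.
import numpy as np

F = np.array([[1, 0, 1, 0], [0, 1, 0, 1], [0, 0, 1, 0], [0, 0, 0, 1]], float)  # state transition
H = np.array([[1, 0, 0, 0], [0, 1, 0, 0]], float)                              # we only measure position
Q, R = np.eye(4) * 0.01, np.eye(2) * 0.5                                        # process / measurement noise

class Track:
    def __init__(self, xy):
        self.x = np.array([xy[0], xy[1], 0.0, 0.0])   # position + velocity
        self.P = np.eye(4)

    def predict(self):
        self.x = F @ self.x
        self.P = F @ self.P @ F.T + Q

    def update(self, z):
        S = H @ self.P @ H.T + R
        K = self.P @ H.T @ np.linalg.inv(S)            # Kalman gain
        self.x += K @ (np.asarray(z, float) - H @ self.x)
        self.P = (np.eye(4) - K @ H) @ self.P

def correlate(tracks, detections, gate=50.0):
    """Greedy nearest-neighbour assignment of detections (x, y) tuples to tracks."""
    for t in tracks:
        t.predict()
    unmatched = list(detections)
    for t in tracks:
        if not unmatched:
            break
        d = min(unmatched, key=lambda z: np.linalg.norm(H @ t.x - z))
        if np.linalg.norm(H @ t.x - d) < gate:
            t.update(d)
            unmatched.remove(d)
    tracks.extend(Track(d) for d in unmatched)          # spawn tracks for new blobs
    return tracks

tracks = correlate([], [(100.0, 200.0)])
tracks = correlate(tracks, [(104.0, 203.0)])
print(tracks[0].x[:2])   # filtered position of the single track
```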

Scalability
The software can handle consistent tracking over a tracking space of arbitrary length. This means that multiple depth cameras can be concatenated to form larger installations, which makes the framework highly usable for big projections.
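A hedged sketch of the concatenation idea: each camera’s detections are shifted by a per-camera offset into installation coordinates, and near-duplicates from overlapping views are dropped. The offsets below are hypothetical calibration values, not those of any real setup.

```python
# Sketch: merge detections from several depth cameras laid out along one axis.
CAMERA_OFFSETS_MM = {0: 0.0, 1: 3000.0, 2: 6000.0}   # placeholder mounting positions

def to_global(cam_id, detections):
    """Shift a camera's local (x, y) detections into installation coordinates."""
    dx = CAMERA_OFFSETS_MM[cam_id]
    return [(x + dx, y) for x, y in detections]

def merge(per_camera, min_dist=100.0):
    """Concatenate all cameras and drop near-duplicates from overlapping views."""
    merged = []
    for cam_id, dets in per_camera.items():
        for p in to_global(cam_id, dets):
            if all((p[0] - q[0])**2 + (p[1] - q[1])**2 > min_dist**2 for q in merged):
                merged.append(p)
    return merged

print(merge({0: [(2950.0, 500.0)], 1: [(-40.0, 510.0)]}))   # overlap collapses to one point
```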

Output / Extendability
libHecato uses the TUIO protocol, which is widely used for tangible user interfaces, as its standard output. All classes along the processing chain, from image acquisition to output, are easily extendable to allow for modifications at every step. See HTBlobInterpreter to modify the handling of detections or to change how tracks are handled.
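For orientation, the snippet below approximates what a TUIO 1.1 /tuio/2Dcur update looks like when sent over OSC with the python-osc package; it is not libHecato’s output code, and velocity and acceleration are simply zeroed.

```python
# Sketch: send a minimal TUIO-1.1-style 2Dcur frame over OSC (python-osc assumed).
from pythonosc.udp_client import SimpleUDPClient

client = SimpleUDPClient("127.0.0.1", 3333)   # 3333 is the conventional TUIO port

def send_cursors(frame_id, cursors):
    """cursors: {session_id: (x, y)} with x, y normalized to [0, 1]."""
    client.send_message("/tuio/2Dcur", ["alive"] + list(cursors.keys()))
    for sid, (x, y) in cursors.items():
        # set: session id, position, velocity and acceleration (zeroed here)
        client.send_message("/tuio/2Dcur", ["set", sid, x, y, 0.0, 0.0, 0.0])
    client.send_message("/tuio/2Dcur", ["fseq", frame_id])

send_cursors(1, {12: (0.25, 0.75)})
```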

Platform Independence
Using only platform-independent libraries such as OpenCV and OpenNI itself, libHecato can be compiled for Linux, Windows and Mac.

Toolset
The framework includes two applications, one for calibration (HTCalibration) and one for operation (HTApp), which usually suffice for all but highly specific tasks. In that case, no programming experience is needed to use libHecato.

libHecato can be cloned from CADET’s GitHub repo; an extensive how-to, put together for the ConnectingCities project, can be found here.
credits: Otto Naderer

 

libHecato – Anything can be turned into a multi touch surface from Otto Naderer on Vimeo.

_2Real Kinect Wrapper
_2RealKinectWrapper is an API built as a library which simplifies the use of multiple Microsoft Kinect sensors for C++ programmers. It supports both major SDKs (OpenNI and Microsoft’s Kinect SDK) through one easy-to-use programming interface. Simple examples for libCinder, openFrameworks and plain OpenGL/GLUT accompany the release to demonstrate its capabilities and usage.
The programming interface should not change in ways that break your code in the future; new functions might nevertheless be added. You can acquire different images (depth, RGB, IR, …) from different Kinects at the same time. However, skeleton and user detection on multiple Kinects does not yet seem to be supported to a functioning degree by either OpenNI or the MS Kinect SDK.

Multi Device Kinect Wrapper from CADET on Vimeo.

Supported platforms: Win7, MacOS (OpenNI only)
authors: Robert Praxmarer, Gerlinde Emsenhuber, Robert Sommeregger
email: support@cadet.at
download: https://github.com/cadet

Enzitron Game
EnziTron is a new large-scale and fun gaming experience. As in the movie Tron, the players’ bodies are transported into a computer game and come alive on the huge glass facade of the Lentos Art Museum in Linz. The goal of the game is to move quickly and use hands and feet to collect puzzle shapes that fly towards the player. The pieces add up to the famous Enzis (Viennese outdoor furniture), which brings points and prizes for the best players.
The game was shown from Aug 25 to Aug 28, 2011 at the facade of Lentos Kunstmuseum Linz.
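As a rough sketch of the core game mechanic, the snippet below checks whether any tracked hand or foot position overlaps a falling puzzle shape; the joint names, catch radius and scoring are hypothetical placeholders rather than the game’s actual logic.

```python
# Sketch: test whether a player's tracked hands/feet catch falling puzzle shapes.
def catch_shapes(joints, shapes, radius=40.0):
    """joints: {name: (x, y)}; shapes: list of (x, y) centers. Returns (score, remaining)."""
    remaining, score = [], 0
    for sx, sy in shapes:
        hit = any((jx - sx) ** 2 + (jy - sy) ** 2 <= radius ** 2
                  for jx, jy in joints.values())
        if hit:
            score += 1
        else:
            remaining.append((sx, sy))
    return score, remaining

joints = {"left_hand": (100.0, 200.0), "right_foot": (400.0, 50.0)}
print(catch_shapes(joints, [(110.0, 210.0), (900.0, 300.0)]))   # -> (1, [(900.0, 300.0)])
```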

credits: Robert Praxmarer (Lead, Management), Thomas Wagner (Game Design, Project Management), Andreas Ortner (Programming).
