
teamLab Art, Singapore, 2018

Homogenizing and Transforming World


attraction

Creates a visual space where viewers can move to hear sounds as the colours change

affordance

Shows the fusion of visual medium and sound space arising from the viewer’s movement, expanding the experience from sight to hearing

affects

The balls pass their colour information to nearby balls, which pass it on in turn; the information spreads out until all the balls become unified in colour

feedback

Input (sensor): Touch, Movement

Output: Full colour LED, Sound

viewer behaviour

how does it work?

The balls change color when touched by viewers, or when they bump into things. Sounds are produced in relation to the colors. The balls send this color information to other balls, which in turn send the information to nearby balls, and the information spreads out so that all the balls become unified in color.
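The spreading behaviour described above can be sketched as a toy gossip simulation. The neighbour graph, the colour values, and the one-hop-per-step rule are illustrative assumptions; teamLab’s actual wireless protocol is not published.

```javascript
// Toy model: a touched ball takes on a new colour, and at each step every
// ball that already holds that colour passes it to its neighbours, until
// the whole space is unified. Assumes the neighbour graph is connected.
function spreadColour(neighbours, touched, colour) {
  const state = new Map(); // ball id -> current colour
  for (const id of Object.keys(neighbours)) state.set(id, "white");
  state.set(touched, colour);

  let steps = 0;
  while ([...state.values()].some(c => c !== colour)) {
    const holders = [...state.keys()].filter(id => state.get(id) === colour);
    for (const id of holders) {
      for (const n of neighbours[id]) state.set(n, colour);
    }
    steps += 1;
  }
  return steps; // hops until every ball shows the same colour
}

// A chain of five balls: touching one end takes four hops to unify.
const chain = { a: ["b"], b: ["a", "c"], c: ["b", "d"], d: ["c", "e"], e: ["d"] };
```

Because information travels one hop per step, a more densely connected arrangement of balls unifies in fewer steps than a long chain.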

Please turn up the sound. Video source: teamLab, “Homogenizing and Transforming World” (YouTube).

“Homogenizing and Transforming World” is an interactive media artwork built from spatial objects: individual balls float in the air, communicating with one another via a wireless connection. The Internet has spread throughout the world. Individuals are connected and information spreads back and forth freely. People act as intermediaries for information, and the instant the information spreads, the world unites. All individuals can freely and simply transmit information; the individual acts as an intermediary that transmits the information to the world, transforming it in an instant.

A simple button interface attracts users: the circle’s changing colour draws their attention, and a touch or tap changes the colour and auto-plays audio, pulling more attention to the object. Try it yourself by scanning or clicking the QR code! A laptop works best.
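The tap-to-change-colour behaviour can be sketched as a colour-cycling handler. The palette and its order are assumptions for illustration; in the demo, each colour would also trigger its own sound.

```javascript
// Hypothetical palette for the demo circle; each tap advances one colour,
// wrapping around when the end of the palette is reached.
const palette = ["red", "orange", "yellow", "green", "blue", "violet"];

function nextColour(current) {
  const i = palette.indexOf(current);
  return palette[(i + 1) % palette.length];
}
```

In a browser this would run inside a click or touch handler that also calls `play()` on an audio element; note that many mobile browsers block audio until the user has interacted with the page, which is why the text above warns that some devices may not play sound.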

Some mobile devices might not support audio playback. 

Scan or click!

The role of gesture and interface has been evolving and adapting across the mobile spectrum for a decade. As most would understand it, a gesture is a command input by touch on a touch-enabled screen or trackpad. The term arguably entered the mainstream with Apple’s trackpad and its input commands. With 12 standard gesture commands on desktop and 8 on mobile, gestures solve unique navigational problems for UI across platforms.
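At their simplest, gesture commands reduce to classifying raw touch coordinates. The sketch below distinguishes a tap from the four swipe directions; the 30-pixel threshold is an illustrative assumption, since real platforms tune such values per device.

```javascript
// Classify a gesture from its start and end touch coordinates.
// Small movements count as a tap; otherwise the dominant axis
// decides the swipe direction.
function classifyGesture(startX, startY, endX, endY, threshold = 30) {
  const dx = endX - startX;
  const dy = endY - startY;
  if (Math.abs(dx) < threshold && Math.abs(dy) < threshold) return "tap";
  if (Math.abs(dx) > Math.abs(dy)) return dx > 0 ? "swipe-right" : "swipe-left";
  return dy > 0 ? "swipe-down" : "swipe-up";
}
```

In a browser, the coordinates would come from `touchstart` and `touchend` events; the classification logic itself is platform-independent.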

 

Smartphones and tablets are the most common places where everyday consumers encounter gestural UI. From Apple iPhones to the Samsung Galaxy, the vast majority of contemporary phones incorporate some element of gesture UI, from swiping and scrolling, common to most phones and tablets, to orientation recognition.

The Samsung Galaxy S4 is an example of one smartphone that has taken gesture UI a step further. The phone tracks and recognizes your eye movement and automatically scrolls down as your eyes reach the bottom of the page.

Current research and experimentation suggest that, in the future, gestural UI in phones and tablets will use more cameras and sensors, increasing responsiveness and recognition capabilities. Experts assert that phones and tablets will not only understand non-touch gestures and facial expressions but will also be context-aware, meaning these devices will anticipate and predict what users want with greater accuracy than is currently possible.

