Sony is back in Austin this week for SXSW, the annual technology and entertainment festival, with a full slate of odd gadgets, games, and prototypes that, in one way or another, rely on Sony technology. The exhibition, called Wow Factory, is an opportunity for engineers and artists from the Japanese tech giant to collaborate on experimental projects.
These projects aim to show how Sony's display technology, particularly its advances in image sensors and projectors, can be stretched and transformed into hardware and software that go beyond a standard image on a flat screen. In this way, Sony can venture into areas like augmented reality using interactive holograms, instead of requiring users to wear bulky glasses or headsets. It achieves this with projectors and sensors that track movement and measure depth and pressure, letting you interact with objects made entirely of light.
Sony achieves AR with image sensors and projectors, not goggles or helmets
One example is a three-way augmented reality air hockey game Sony developed specifically for this year's Wow Factory. The game features a physical hockey puck and three physical paddles around a custom circular table. But the table also uses two of Sony's new IMX382 vision sensors, which can detect and track objects at 1,000 frames per second. One sensor sits above the table to track the puck, and another sits below to track the players' paddles. Meanwhile, an overhead projector superimposes the game interface and virtual pucks onto the table's surface.
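Sony hasn't published details of the software driving the table, but the setup described above suggests a loop that polls the sensors at their full 1,000 Hz tracking rate while redrawing the projected overlay at a more typical projector refresh. A minimal sketch of that pattern, with all function names, rates, and parameters being illustrative assumptions rather than anything Sony has disclosed:

```python
# Illustrative sense-then-project loop. SENSOR_HZ matches the
# IMX382's stated 1,000 fps tracking; PROJECTOR_HZ is an assumed
# typical refresh rate. The callbacks are hypothetical stand-ins.

SENSOR_HZ = 1000
PROJECTOR_HZ = 60

def run_table(total_steps, read_sensors, update_physics, render):
    """Poll tracking on every step; redraw only when a projector
    frame is due, so the physics runs at full sensor resolution."""
    steps_per_frame = SENSOR_HZ // PROJECTOR_HZ  # ~16 samples per frame
    state = None
    for step in range(total_steps):
        tracked = read_sensors()  # real puck + paddle positions
        state = update_physics(state, tracked, dt=1.0 / SENSOR_HZ)
        if step % steps_per_frame == 0:
            render(state)         # project the virtual pucks
    return state
```

Decoupling the two rates this way is one plausible reason the virtual pucks can react convincingly to fast paddle swings: collisions are resolved at millisecond granularity even though the projected image updates far less often.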
This sensor configuration is similar to the one in the experimental projectors Sony brought to SXSW in previous years. In those cases, Sony turned tabletops into touchscreens and built interactive software that overlaid physical props. For example, Sony used a copy of Lewis Carroll's Alice in Wonderland, along with a physical deck of cards and a cup of tea, to bring the events described in the text to life. The company also built an architecture demo that could turn a standard block of wood into a top-down model of a house, with light projected onto the table to color and annotate objects in real time.
In the AR air hockey game, Sony's software lets the real hockey puck interact with the virtual ones because the image sensors track both your hand and your paddle as they move across the table. So you can hit the virtual pucks with your paddle as if they were real, and the virtual pucks even bounce realistically off the real puck and the sides of the table.
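That bouncing behavior comes down to fairly standard 2D collision response: reflect a virtual puck's velocity about the surface normal whenever it reaches the circular wall or the tracked real puck. A minimal sketch of one simulation step, with the table and puck dimensions assumed for illustration and the real puck treated as immovable for simplicity:

```python
import math

TABLE_RADIUS = 0.5   # meters (assumed, not Sony's figure)
PUCK_RADIUS = 0.03   # meters (assumed)

def step_virtual_puck(pos, vel, real_puck_pos, dt=0.001):
    """Advance a virtual puck one step. Positions are (x, y)
    tuples with the table center at the origin; dt matches a
    1,000 Hz sensor update."""
    x, y = pos[0] + vel[0] * dt, pos[1] + vel[1] * dt
    vx, vy = vel

    # Bounce off the circular table wall: reflect the velocity
    # about the inward radial normal at the contact point.
    dist = math.hypot(x, y)
    if dist + PUCK_RADIUS > TABLE_RADIUS and dist > 0:
        nx, ny = x / dist, y / dist           # outward normal
        dot = vx * nx + vy * ny
        if dot > 0:                            # only if moving outward
            vx, vy = vx - 2 * dot * nx, vy - 2 * dot * ny

    # Bounce off the tracked real puck: same reflection, using
    # the normal along the line between the two centers.
    dx, dy = x - real_puck_pos[0], y - real_puck_pos[1]
    gap = math.hypot(dx, dy)
    if 0 < gap < 2 * PUCK_RADIUS:
        nx, ny = dx / gap, dy / gap
        dot = vx * nx + vy * ny
        if dot < 0:                            # only if closing in
            vx, vy = vx - 2 * dot * nx, vy - 2 * dot * ny

    return (x, y), (vx, vy)
```

The real system would also have to fold in the paddle positions and the real puck's own momentum, but the core trick is the same: because the sensors report where the physical objects are on every step, virtual objects can treat them as just another collider.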
Photo by Nick Statt / The Verge
The game itself is chaotic, with all three players simultaneously defending their own goal and going on the offensive against their opponents. Meanwhile, half a dozen hockey pucks, all but one made of light, fly around the table and collide in a nonstop frenzy.
Although it will never be a commercial product, the air hockey table shows, as Sony has over and over again, that its display and sensor technology can achieve a new form of AR. These demonstrations show how the right combination of sensor data and hardware can create immersive experiences that don't depend on blasting light into your eyes or strapping a screen to your face. AR is often thought of as something that will only arrive once it's packaged into a standard-looking pair of eyeglasses. And yet, right now, the common conception of the technology is selfie filters and other animations in Snapchat and similar apps, along with the awkward blending of real and virtual objects through a smartphone lens, as in Niantic's Pokémon Go.
But Sony's showcase here at SXSW illustrates how AR can be achieved through alternative means, if you're willing to expand how you think about the term and what a realistic version of it requires.