Microsoft Research MakeFest 2014

Long before Satya Nadella became CEO of Microsoft and established a company-wide hackathon, Microsoft Research had been holding its own annual hack week, called MakeFest. MakeFest is hosted by FUSE Labs, the team I'm part of, so we get to see our collaboration space turn into a vibrant maker community for a week, with visitors from other research labs around the world.

MakeFest is a week-long event that is open to all Microsoft full-time employees and vendors. During that week, folks set aside their usual jobs and focus on making pet projects real, or join in on other interesting projects. The week starts with a breakfast on Monday morning where people with ideas pitch their projects and form teams on the spot. These just-formed teams work together for a week and present the outcome of their work on Friday afternoon.

The final MakeFest presentation

This year’s projects included a bike that projects its own lane and alerts the rider when an obstacle is nearby, a flying quadcopter, a fiber-optic tree, an inflatable bar graph, an app that turns people into pixels, a UI you can control with your eyes, and a sound installation that plays sounds according to the moods of messages on Twitter. That last one was my project concept, by the way. It’s called MoodCloud, and I’ll blog about it in detail soon.

MoodCloud: a sound installation that allows you to hear the mood of topics on Twitter.
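MoodCloud will get its own detailed post, but the core idea — scoring a stream of tweets for mood and mapping the score to sound — can be sketched roughly. Everything below (the word lists, the score range, the pitch/tempo mapping) is illustrative, not MoodCloud’s actual code:

```python
# Conceptual sketch of MoodCloud's core loop: score the mood of a tweet,
# then map the score to sound parameters. The lexicons and mappings here
# are invented for illustration.

POSITIVE = {"love", "great", "happy", "awesome", "fun"}
NEGATIVE = {"hate", "sad", "angry", "awful", "terrible"}

def mood_score(tweet: str) -> float:
    """Crude lexicon score in [-1, 1]: +1 all-positive, -1 all-negative."""
    words = tweet.lower().split()
    hits = [(1 if w in POSITIVE else -1) for w in words if w in POSITIVE | NEGATIVE]
    return sum(hits) / len(hits) if hits else 0.0

def mood_to_sound(score: float) -> dict:
    """Map mood to sound: happier topics play higher and faster."""
    return {
        "pitch_hz": 220 + 220 * (score + 1) / 2,   # 220 Hz (sad) .. 440 Hz (happy)
        "tempo_bpm": 60 + 60 * (score + 1) / 2,    # 60 bpm .. 120 bpm
    }

print(mood_to_sound(mood_score("I love this awesome event")))
```

A real version would feed a proper sentiment model with a live Twitter stream, but the shape of the pipeline — text in, mood score, sound parameters out — is the same.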

Below is a nice video of the 2014 MakeFest:

And a Tumblr with more photos from the event:


The stage as a musical instrument


For Nine Inch Nails’ tour “Lights in the Sky”, fans were able to experience a fully interactive visual display that is as much a part of the show as the band’s instruments.

The band’s frontman, Trent Reznor, admits that when he’s in the studio working on an album, he only tries to please himself. On tour, though, he feels a certain responsibility to entertain the audience.

“I wanted to see how I could use video as an instrument,” he says, “and try to really make the stage feel like it’s organic — like it’s part of the overall set.”

For the track “Echoplex,” a huge real-time drum sequencer is operated on screen by Josh Freese, using sensors that detect his hand position. This restores the visibility of creative performance that electronic music tends to lose, and it shows a very interesting way to mix the digital with the physical (the electronic sequencer versus actual human players).
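The on-screen sequencer itself is conceptually simple: a grid of on/off cells stepped by a clock, with the performer’s tracked hand toggling cells. Here’s a toy model of that idea; the class, the grid size, and the direct `toggle` calls standing in for hand tracking are all my own invention, not the show’s software:

```python
# Toy step sequencer: a grid of on/off cells stepped by a clock.
# Hand-tracking input is simulated by calling toggle() directly.

class StepSequencer:
    def __init__(self, tracks: int, steps: int):
        self.grid = [[False] * steps for _ in range(tracks)]
        self.position = 0

    def toggle(self, track: int, step: int) -> None:
        """What the hand sensor would trigger: flip the cell pointed at."""
        self.grid[track][step] = not self.grid[track][step]

    def tick(self) -> list:
        """Advance one step; return the tracks that fire on this step."""
        fired = [t for t, row in enumerate(self.grid) if row[self.position]]
        self.position = (self.position + 1) % len(self.grid[0])
        return fired

seq = StepSequencer(tracks=2, steps=4)
seq.toggle(0, 0)   # kick on beat 1
seq.toggle(1, 2)   # snare on beat 3
print([seq.tick() for _ in range(4)])  # → [[0], [], [1], []]
```

The stage version adds the hard parts — latency-free hand tracking and rendering the grid at stage scale — but the musical data structure is just this.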

The company responsible for the technology driving most of the interactive elements is Moment Factory, a boutique Canadian outfit that has worked on a number of Cirque du Soleil shows and produced other large-scale visual installations. Moment Factory describes itself as a new media and entertainment studio specializing in the conception and production of multimedia environments, combining video, lighting, architecture, sound, and special effects to create remarkable experiences.

For the interactive portions of the show, all the onscreen video is rendered by Moment Factory’s custom rig, a trio of Linux-based devices collectively known as “the brain.”

Here’s a making-of video:

Digitally augmented Beyoncé

This is along the lines of my previous post on the MIDAS project. In this performance, Beyoncé digitally augments herself through tight choreography with what’s happening on the big projection screen. Alone, she is just one body, but with the help of the digital world she can extend across the whole projection area. It’s an interesting concept, whether fully interactive or choreographed like this one.

Interactive digitally augmented spaces


The MIDAS Project is a synesthetic exploration of traditional artistic performance and digital art. It devises an interactive projection-mapped space for the creative arts. Using the latest tracking technology, the space learns and reacts to the performance, allowing the artist to explore new improvised choreography live and in time. The team worked in tandem with dancer Tom O’Donnell. Given a narrative revolving around man’s ever-changing relationship with technology, the movements challenge assumptions within performance and the evolving role of the performer within art.

Here’s the promo video for the project:

An interactive projection-mapped project, MIDASpaces employs a combination of light projection, sound, and camera tracking to add a digital dimension to the creative arts in a real-world space. The project made use of custom software written in openFrameworks (C++) running in conjunction with Quartz Composer (OpenGL) to create the visuals. The programs were then controlled by Ableton Live and manipulated within VDMX and MadMapper. Readily available technologies such as Microsoft’s Kinect and Sony’s PlayStation 3 Eye were customized to allow for audio- and visual-reactive controls.
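The mathematical trick underlying projection-mapping tools like MadMapper is the homography: a 3×3 matrix that warps the projector’s flat image onto an arbitrarily placed quad in the real space. Here’s a minimal, generic version of that computation (the standard direct linear transform — not code from the MIDAS project, and the corner coordinates are made up):

```python
import numpy as np

# Projection mapping in one equation: find the 3x3 homography H that
# carries the projector's image corners onto the corners of a physical
# surface, then warp every point through H.

def homography(src, dst):
    """Solve for H mapping each src corner to its dst corner (DLT)."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    # The null space of A holds the 9 entries of H (up to scale).
    _, _, vt = np.linalg.svd(np.array(A, dtype=float))
    return vt[-1].reshape(3, 3)

def warp(H, point):
    """Apply H to a 2D point, with the homogeneous divide."""
    x, y, w = H @ np.array([point[0], point[1], 1.0])
    return (x / w, y / w)

# Map the unit square of the image onto a skewed quad on the surface.
src = [(0, 0), (1, 0), (1, 1), (0, 1)]
dst = [(10, 5), (40, 8), (38, 30), (12, 28)]
H = homography(src, dst)
print(warp(H, (0, 0)))  # ≈ (10.0, 5.0)
```

Tools like MadMapper let you drag those four corners by hand and do this warp on the GPU for live video, but the geometry is exactly this four-point fit.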

Here you can see the full performance:


Project website:

Facebook page:

The world’s deepest bin


This is another experiment from The Fun Theory, an initiative by Volkswagen. Can we get more people to throw trash into the bin, rather than onto the ground, by making it fun to do? The Fun Theory explores whether fun can change behavior for the better. The results are really cool, and they make us want to inject a bit more fun into everything we design.

Industrial robots star in projection-mapping video

Industrial robots are inherently badass. But when programmed as part of a work of art, they become magical.

“Box” is a live performance mixing robotics, computer graphics and choreography. It explores the synthesis of the real and digital spaces through projection mapping and moving surfaces.

With a nod to Arthur C. Clarke, San Francisco design and engineering firm Bot & Dolly understands this concept well. It spent two years working on “Box,” a five-minute film that explores the nexus of man, machine, and art. This film employs projection-mapping techniques, a human actor, and several large robotic arms to spectacular effect.

In the video below, you’ll see two robots wielding flat-panel displays that also serve as projection screens for high-res computer graphics. Meanwhile, an actor interacts with the screens as they go through a dizzying series of motions and tricks, both graphically and physically.

No doubt it will inspire creative minds working in different industries.

via CNET

Reemo Band: Control Home Automation Systems with Gestures

Playtabase is a company focused on improving quality of life in the home, with a home automation system controlled by a bracelet — Reemo — that was designed to empower the elderly and chronically ill, but is likely to have equal appeal to a consumer market. Reemo works a little like a computer mouse for the home, built on the idea of the Internet of Things. Through a series of gestures, it allows wearers to interact with everything from lights to televisions to alarm clocks. It can even be used to guide a cursor across a screen like an actual desktop mouse. Though that is clearly not its primary scenario, the bracelet could enable disabled folks to interact with things they otherwise couldn’t.
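Conceptually, a “mouse for the home” boils down to two inputs — which device you are pointing at, and which gesture you made — dispatched into a command. Here’s a toy sketch of that dispatch; the device names, gestures, and commands are invented for illustration, since Playtabase hasn’t published an API:

```python
# Toy point-and-gesture dispatcher: the bracelet reports the device being
# pointed at plus a gesture; the hub maps that pair to a command.
# All names here are hypothetical.

DEVICES = {"lamp": {"on": False}, "tv": {"on": False}}
GESTURES = {"flick_up": ("on", True), "flick_down": ("on", False)}

def handle(pointed_at: str, gesture: str) -> str:
    """Apply the gesture's command to whichever device is pointed at."""
    if pointed_at not in DEVICES or gesture not in GESTURES:
        return "ignored"
    prop, value = GESTURES[gesture]
    DEVICES[pointed_at][prop] = value
    return f"{pointed_at}.{prop} = {value}"

print(handle("lamp", "flick_up"))   # → lamp.on = True
print(handle("tv", "wave"))         # → ignored
```

The hard engineering problem Reemo actually solves is the first input — telling nearby devices apart from a pointing gesture — which is why accuracy comes up so often below.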

The bracelet works with basic pointing gestures and has been designed for very high accuracy so it can distinguish between objects in close proximity to one another. It communicates with a box that talks to retrofitted controls around the home. The prototype we saw was a band that can be taken on and off easily, without demanding a high level of dexterity. The final product will have a magnetic clasp to stop it jiggling around too much and keep accuracy high.

Reemo has been beta-tested in homes for the elderly and is due to launch in the US at Solidcon in May. The developers are working on an API, which would take Reemo to the next level.

Official website: