I run NoScript and, as a policy, don't enable sites unless it's truly imperative for me to do so.
So of course I did not get to see the real product, but it appears to be a system that provides audio output to the user and takes hand/gesture input to the control device.
There are some devices that go a step (or several) further. They are based on the recent finding that scalp sensors (the kind that just rest on your head) can pick up enough of your brain waves to provide satisfactory input controls.
In one experiment ... well, let's start off with the base experiment.
A Rhesus monkey has a sensor array implanted in her brain. The array is connected to a remote mechanical arm, and that arm can deliver treats/food.
Now, initially the setup of the experiment was that the monkey would move her own arm, the sensors would pick that up, and the signals would be translated into movement of the mechanical arm.
Everyone was happy when this worked. They went away, partied, wrote papers, and congratulated themselves.
BUT THEN they returned to the lab and .... well, their world was ROCKED.
The Monkey was no longer moving her arm to control the mechanical arm. She was doing it with thought alone.
Now for the new experiments, this time with humans: the newer devices do not need implants. They currently use visual/audio feedback systems, and people are controlling things on the screen through thought alone.
Now that is only a part of what I am waiting for.