The keyboard and mouse are no longer the only ways to control a computer, but other than the touchscreen, "natural" user interfaces haven't really taken off.
Some of the wackier ideas we saw at CES this year - like brainwave wristbands - might never become mainstream.
But if the newly-released Leap Motion and the upcoming Kinect 2 give us a taste of alternative inputs, what other interface technology is on the horizon?

Wave your hands

Kinect isn't the only motion control system around. Oblong - a company based on technology created by the designer of the famous Minority Report mid-air interface, John Underkoffler - has a (pricey) videoconferencing room system called Mezzanine that uses sensors mounted on the ceiling to let you drag documents around on the virtual whiteboard with a wand that also gives you a mouse button.
If you don't have the cash or the space for an installation like Mezzanine, you can use a Kinect, a PC, a projector and the Ubi software that turns any surface into an interactive touch screen - the price starts at $149, depending on how large a screen you want. Want to see your recipes on the kitchen counter or Angry Birds on your bedroom wall? Ubi does that.
The Leap Motion controller is available to buy now
If you want something smaller than Kinect that lets you wave your hands at your PC or Mac, the long-awaited Leap Motion is shipping (and looking more like an Apple accessory than the angular Kinect).
Although it's supposed to be a lot more sensitive than Kinect, it's still somewhat variable in use: scrolling through documents and web pages works well, but doing something precise like selecting an icon can be tricky. Even though you can buy it, we're classing this as a work in progress.
With the right software, you could do much the same with Kinect - check out this video of Microsoft researcher Cem Keskin using his hands to paint in FreshPaint and zoom in on Bing Maps.

One finger with Kinect 2

Kinect 2, shipping with Xbox One, will be a lot more sensitive than the original Kinect; three times as sensitive, in fact. Instead of a blur, it sees a pretty accurate 3D representation of whoever is standing in front of it: you can see lips moving when someone is talking, or the wrinkles on their shirt. The wider field of view means more people can play at once - we saw it detect six people at the same time at the Build conference, all dancing away.
And it can see further. During the demonstration we watched, the presenter had to walk off the stage to get out of range. The improved skeleton tracking picks up hand motions accurately. It can detect more of the joints in your hand, so it picks up finger and thumb movements too.
Microsoft says the voice recognition will be even better than the current Kinect's. In our tests with the new, fully voice-enabled Sky app for Xbox, it's pretty reliable already.
With Kinect in the box, more developers will take advantage of it, and it won't just be for games. If Xbox One is a success, Kinect 2 could be the next-generation interface in everyone's home. The combination of voice and gesture might show up at work too.

Multimode air traffic control

Chris Wild of design company Altran told us about a possible system they're experimenting with for air traffic controllers.
As flight paths get scheduled further in advance instead of today's ad hoc approach, sorting out routing problems without causing delays to other flights will get more difficult, so controllers need to work together closely.
In their setup, the supervisor who needs an overview of all the airspace they're responsible for stands at a large screen that shows them all the planes and routes in 3D.
They use gestures to get more information about any planes they're worried about and wear a headset so they can speak to the system to allocate those aircraft to a specific controller - who might get the information on an iPad, drag the planes around on screen and send the new routes back to the big board.
Instead of just one interface, it's a multimodal system: you could gesture, talk or touch the screen for different tasks. You could even have different gestures for your left and right hands. And the technology isn't the real problem. The difficulty is in the execution, with issues such as working out how to take turns at talking, and perfecting the ergonomics so users don't get backache.

Mixing your methods

Gartner analyst Angela McIntyre says that being able to mix different methods of control when they make sense is important to the success of new interfaces: "Being able to do one thing with your hands, another with your voice and a third with touch. Well, it's the way we normally interact with people and things in our life. You could be typing in a document and say the name of a song you want to hear and have it start playing in the background as you're working."
Wild suggests voice recognition systems could borrow a trick from the movie 2001 and use a camera to detect when you are talking to them: "When I want to talk to someone in a crowded room, I look at them. You could put images on the wall of the room that hide the camera and you hold your gaze on an image for a couple of seconds so the computer knows you're addressing it."
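The dwell idea Wild describes boils down to a simple timing rule: only treat the computer as "addressed" once your gaze has stayed on the camera's image for a couple of seconds. Here's a minimal illustrative sketch of that rule - all the names and numbers are ours, not from any real gaze-tracking product:

```python
# Illustrative sketch of gaze-dwell detection: the system counts as
# "addressed" only once the gaze has stayed inside a target region for
# a couple of seconds. A real gaze tracker would supply the (x, y)
# samples and timestamps; everything here is hypothetical.

DWELL_SECONDS = 2.0

class DwellDetector:
    def __init__(self, region, dwell=DWELL_SECONDS):
        self.region = region       # (x0, y0, x1, y1) of the camera image
        self.dwell = dwell
        self.entered_at = None     # when the gaze first entered the region

    def update(self, x, y, t):
        """Feed one gaze sample; return True once the dwell time is met."""
        x0, y0, x1, y1 = self.region
        inside = x0 <= x <= x1 and y0 <= y <= y1
        if not inside:
            self.entered_at = None  # gaze wandered off: reset the timer
            return False
        if self.entered_at is None:
            self.entered_at = t     # gaze just arrived: start the clock
        return (t - self.entered_at) >= self.dwell
```

Glancing away at any point resets the clock, which is what stops the computer from butting in every time your eyes happen to sweep past it.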
At CES this year we were impressed by the way Tobii's gaze tracking system uses what you're looking at on screen to scroll the right window or zoom the right part of the map, so maybe the combination of vision and sound is what we need.

Feel the feedback

Satisfying control systems need to give feedback to users
But in the real world, points out Chris Ullrich of Immersion (the company that designs the haptic feedback used in most phones), "There's a deep-seated human desire to get some kind of physical component to interactions. If you take that away you lose your confidence in how things work. It's not conscious but it makes things feel unsatisfactory. As you increase the physical reality of an experience you increase the pleasure and satisfaction you get from it."
So painting on a tablet screen is fun but it would feel more realistic if the paint that hadn't dried yet still felt 'wet' on screen. The same haptic feedback that buzzes when you type on your phone (or makes you feel that a marble rolling across the screen has fallen down a hole in a game) could do that.
In the longer term, an array of ultrasonic projectors or even a focused pulse of air against your hand could give you that feedback when you're making gestures without a touchscreen. More practical is giving people something to wear around their wrist. It might even be the smartwatch or a fitness tracking device you're already wearing.
The Fitbit Flex uses haptics to tell you when you've taken enough steps to reach your daily goal. Why couldn't it also give you some feedback when your hand is in the right place to grab an icon in a gesture display? In three to five years, Ullrich thinks it will.

Make everything a button

Haptic startup Redux Labs is taking a different approach, using transducers (piezoelectric or electromagnetic, depending on the size of the device) that turn screens into speakers and propagate microscopic "bending waves" to deliver the sensation in exactly the right place - so it feels like it's under your finger.
They will be able to make a touch button on a phone or a microwave or a car dashboard feel like a physical button you're pressing or let you feel a scrollbar or scroll wheel you're dragging your finger over. Or you could get a sensation when you slide your finger over a key so you can get your hand in the right place, but not type anything until you press down firmly - like a real button.
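That feel-first, type-on-firm-press behaviour amounts to two force thresholds: past the first, you get haptic feedback so your finger can find the key; only past the second does the keystroke fire. A toy sketch of the idea, with made-up threshold values rather than anything from Redux:

```python
# Illustrative sketch of a two-threshold virtual key: a light touch
# triggers "you're on a key" haptic feedback, and only a firm press
# registers a keystroke. The threshold values are invented for
# illustration, not taken from any real product.

FEEL_THRESHOLD = 0.1    # enough force to feel the key outline
PRESS_THRESHOLD = 0.6   # enough force to actually type the character

def classify_touch(force):
    """Map a normalised force reading (0.0 to 1.0) to an action."""
    if force >= PRESS_THRESHOLD:
        return "type"         # fire the keystroke
    if force >= FEEL_THRESHOLD:
        return "haptic_only"  # buzz so the finger can find the key
    return "ignore"           # too light to matter
```

The gap between the two thresholds is what lets you rest your fingers on the "keys" and feel your way around without typing gibberish.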
Oblong is experimenting with a whole office setup
It doesn't have to be built into a screen either; you could get this kind of sensation on a wooden or metal surface, through the fabric of your car seat or even on a cardboard sign, as well as on glass or plastic. And it doesn't have to be in the same place each time, so you can rotate a tablet and still have a "physical" home or Start button in the right place.
Doing away with physical buttons that move saves money (cutting the round hole in the glass of the iPad is expensive) and it saves a tiny bit of space (you could use that to put a bigger battery in or make the device smaller).
It also means there are fewer parts to fail or break and fewer ways for water to get inside if you drop your gadget. James Lewis of Redux thinks we might see the first products incorporating this technology in as little as a year's time.

Just use your phone

Angela McIntyre thinks that several of these extra technologies will come to devices you already use. Your TV remote control might get accelerometers so you can gesture to change the channel, as well as pressing a button to turn the volume down.
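A gesture remote like that could spot a channel-change flick with little more than a peak-detection rule on one accelerometer axis: a sharp swing one way means "next", a sharp swing the other way means "previous". A simplified sketch, with invented thresholds - a shipping remote would need filtering, debouncing and per-user calibration on top:

```python
# Illustrative sketch: classify a "flick" gesture from a short window
# of accelerometer samples on one axis. The threshold is invented for
# illustration; real gesture recognition would be considerably fussier.

FLICK_THRESHOLD = 15.0  # m/s^2 beyond gravity, picked arbitrarily

def classify_flick(samples):
    """Return 'next', 'previous' or None for a window of axis readings."""
    peak = max(samples, key=abs)  # the largest swing in either direction
    if peak >= FLICK_THRESHOLD:
        return "next"             # sharp swing one way: channel up
    if peak <= -FLICK_THRESHOLD:
        return "previous"         # sharp swing the other way: channel down
    return None                   # too gentle to count as a gesture
```

The `None` case matters as much as the other two: most of what an accelerometer in a remote sees is the sofa, not a gesture.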
Hillcrest designed a remote control called the Loop that looked like a giant bracelet. It didn't take off when it was introduced a few years ago, but the company is in talks with several TV makers about adding the technology to high-end models. Samsung is including a remote with a trackpad for controlling its interface with some TVs this Christmas.
Samsung's trackpad remote shows how different interfaces are extending to all kinds of products
She's seen a lot of unusual ideas – from head-up displays in cars and screens that show one thing to the driver and a different image to the passenger sitting next to them, to shirts and gloves that give you the physical sensation of what's going on in the game you're playing, to using galvanic skin response to turn different places on your arm into different buttons and controls. (Microsoft Research had a version of that, but you had to wear a Kinect on your shoulder to make it work.)
Some washing machine manufacturers are looking at using voice control to simplify complex features, but McIntyre thinks it makes more sense to use the smarts in something you already have. "Why not use your smartphone as a controller?" It has touch, it has voice recognition, it has an accelerometer.
"It is easier for that device to figure out what you're doing. You're close to it, so it's not like having the thermostat on the other side of the room trying to figure out what is a gesture and what is you playing with your kids. For at least the next five years, I think the most convenient device people want to use as their controller will be the smartphone."
In other words, the new interfaces are already here. They're just waiting for us to start using them.