June 30, 2014

Amazon Fire Phone release date, news and features

Sorry this image isn't in 3D

It's here! Meet Amazon Fire, the e-tailer's very first smartphone. Anyone out there nail its name?
The Amazon phone is about what we expected on the spec front, but it's loaded with two features that Amazon claims help users "see and interact with the world through a whole new lens."
Those features are Dynamic Perspective and Firefly, which we break down further below, plus plenty of details on everything else the Amazon phone has to offer.
Perhaps the biggest takeaway from Amazon's phone event, besides the eye-catching 3D (which doesn't necessarily mean customer-catching), is the phone's heavy ties to buying. Amazon wants you to purchase things, and now it's come up with a way for you to do so from your pocket.
What are your thoughts on Fire? Is it everything you were hoping for and more? Or a let-down that can't hold a candle to the iPhone 5S, Galaxy S5 or other flagship devices? Is Amazon simply trying to sell you more stuff, or looking like it legitimately wants to succeed in the smartphone space?

Amazon Fire Phone price and release date

The Amazon phone will cost $199.99 (about £117, AU$213) for a 32GB version and $299.99 (about £176, AU$320) for 64GB. Off contract, Fire costs $649.99 (about £382, AU$691) and $749.99 (about £441, AU$798), respectively.
The Fire Phone will be an AT&T exclusive, and pre-orders start today. It ships on July 25 and should be available in stores then as well.
AT&T customers with a Next early upgrade package can get away with paying $32.50/month for 20 months on Next 12 or $27.09/month for 24 months on Next 18 for the lesser storage flavor. A 64GB model will run $37.50/month for 20 months on Next 12, while a Next 18 option costs $31.25/month for 24 months.
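As a sanity check, those Next installment plans total out to almost exactly the off-contract prices quoted above. A quick sketch (the monthly figures come from this article; the loop and labels are ours):

```python
# Total cost of each AT&T Next installment plan for the Fire Phone.
# Monthly payments and terms are the figures quoted in the article.
plans = {
    "32GB, Next 12": (32.50, 20),   # (monthly payment in $, number of months)
    "32GB, Next 18": (27.09, 24),
    "64GB, Next 12": (37.50, 20),
    "64GB, Next 18": (31.25, 24),
}

for name, (monthly, months) in plans.items():
    total = round(monthly * months, 2)
    print(f"{name}: ${total:.2f} over {months} months")
```

The totals come to $650.00, $650.16, $750.00 and $750.00, so both Next plans land within a few cents of the $649.99 and $749.99 off-contract prices; the installment option carries essentially no premium.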
As an added bonus, customers who buy the Fire phone will be treated to 12 months of Prime membership free, but the offer is only running for a limited time.

Amazon Fire Phone specs

The device features a 4.7-inch screen, a size ideal for one-handed use, said CEO Jeff Bezos. It boasts 590 nits of brightness and other goodies like an ambient light sensor and Dynamic Image Contrast to make your screen images sing in various viewing situations. The resolution sits at 1280 x 720 with 315ppi.
Gorilla Glass 3 is slathered on the front and back, the buttons are made of aluminum, and stainless steel details and a rubberized polyurethane grip make for a chic profile.
On the inside, the Fire Phone features a quad-core Qualcomm Snapdragon 800 2.2GHz processor, Adreno 330 graphics and 2GB of RAM. As expected, the Fire runs a forked version of Android, Fire OS 3.5.0.
Amazon Phone back
A juicy 13MP snapper on the back
As for cameras, we know it's fixed with a 13MP snapper on the rear, complete with OIS and a powerful f/2.0 lens. There's even a dedicated camera hardware key - press once to turn it on, twice to take a shot. Amazon is throwing in free unlimited photo storage on Amazon Cloud Drive to sweeten the deal.
The front camera - the normal one - is a 2.1MP-er. Both it and the rear camera can capture video in 1080p.
Dolby Digital Plus surround sound speakers crank out the Fire Phone's audio. The Fire phone features global LTE and connectivity on nine LTE bands, four GSM bands and five UMTS. It features 802.11ac support, Wi-Fi channel bonding, Bluetooth and NFC. Note this is regular Bluetooth and not the LE kind that makes for wearable connections.
We suspect the device is going to need a lot of juice to run its 3D features, and Amazon only managed to put a 2,400mAh battery in to fuel the Fire. The company said in release notes that the Fire has 285 hours of standby time, up to 22 hours of talk time, up to 65 hours of audio playback and up to 11 hours of video playback. But extensive testing is needed to see if those numbers are attainable with Dynamic Perspective running.
Finally, a nanoSIM is preinstalled and the phone has microUSB 2.0 and 3.5mm headphone ports.

Amazon Fire Phone 3D features

The Amazon phone has an interface called Dynamic Perspective that adjusts a 3D image on the screen to match the user's head position. Lockscreens and wallpapers have a 3D effect, though that's not all.
Bezos demonstrated on stage how the device could render a building on a map in 3D. The building - the Empire State, to be exact - looked like it was coming out of the Amazon phone's screen, and moved as the user moved.
Neatly, in maps, you can tilt the phone to reveal "tucked" information that lives on another layer, like Yelp ratings and reviews, and peek under and around edges.
The fun doesn't stop there. Fire Phone also lets you tilt one-handed through a line-up of items you may be shopping for, like women's dresses, in the Amazon Shopping app. You can also auto-scroll through an article, a web page or an ebook, and tilting in Amazon Music reveals song lyrics.
And Dynamic Perspective seems acutely tuned to games, making the images you see on screen pop out and forcing you to manoeuvre around them just by moving your head.
3D images
See the world in 3D … on your Fire Phone
Dynamic Perspective is good at recognizing what's a human head and what's not, and there will even be an SDK for the feature so app developers can 3D-ify their games and offerings.
Bezos explained onstage in Seattle that in the early days of the Fire Phone, Amazon went so far as to make its own headset to emulate 3D effects. That's not really practical for real life, Amazon concluded, which is perhaps a little jibe at Google Glass.
To solve the 3D issue, Amazon did indeed stick four front-facing cameras on its phone, one in each corner. No matter what angle it's being held at, two cameras will always be facing the user, Bezos claimed. They are of the infrared variety - ultra-low power, Amazon swears - so they work in darkness.
The Dynamic Perspective system also relies on four infrared LEDs on the front to complement the cameras.
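Amazon hasn't published how Dynamic Perspective works internally, but the geometry of head-tracked parallax is simple: once the cameras give you the head's position relative to the screen, each UI layer is shifted in proportion to its virtual depth, by similar triangles. A minimal sketch of that relationship (the function name and all the numbers are our own illustration, not Amazon's API):

```python
def parallax_shift(head_offset_mm: float, head_dist_mm: float,
                   layer_depth_mm: float) -> float:
    """Horizontal shift of a UI layer, in mm on the screen plane.

    head_offset_mm: lateral head position relative to the screen center
    head_dist_mm:   distance from the eyes to the screen
    layer_depth_mm: virtual depth of the layer behind the glass
                    (0 = on the screen plane)

    By similar triangles, a layer at depth z appears shifted opposite
    to head movement by head_offset * z / (head_dist + z).
    """
    return -head_offset_mm * layer_depth_mm / (head_dist_mm + layer_depth_mm)

# Head 60 mm to the right of center, 300 mm from the screen:
# a layer 30 mm "behind" the glass slides about 5.45 mm left,
# while a layer on the screen plane (depth 0) stays put.
print(parallax_shift(60, 300, 30))
print(parallax_shift(60, 300, 0))
```

Deeper layers shift more than shallow ones, which is what produces the "looking around the edge" effect described above when you tilt the phone or move your head.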

More Amazon Fire phone features

The Amazon phone is full of little touches, like swipes, to make it easier to use. Bezos and Co. seem very keen to make the Fire Phone as user-friendly as possible, probably hoping to keep their customer satisfaction rankings cozy in their No. 1 slots.
Following in line with the Kindle tablets, the phone features a dedicated Mayday button to connect to customer support. It will work over Wi-Fi, 3G and 4G, and is free.
Because video is so tied to the Amazon experience, the company has included a number of video features with its first handset. IMDb's X-Ray is headed to the Fire Phone, and Second Screen lets users Miracast video from their Fire Phone to their Fire TV. ASAP, another Fire TV feature, is also making it to its phone-y cousin.
The Kindle Store, Audible, Kindle Newsstand and the recently purchased Comixology are accessible on the phone.
Taking advantage of Amazon's digital content library, the Fire provides "instant access" to over 33 million songs, apps, games, movies, TV shows, books, audiobooks and magazines. Prime members will get unlimited streaming access to movies and TV episodes at no extra charge. The same sort of deal applies to the Kindle Owners' Lending Library and Prime Music.
Apps
Amazon's app collection is ready for the picking
An enhanced carousel features "active widgets" that show you the last several messages, emails or alerts in your various communication and organization apps.
The info pops up right on the home screen and users can deal with it without ever wandering away. Third-party apps can come up with their own uses; USA Today flashed headlines that are relevant to a user while Zillow popped up property information based on location.
The Music app features a "three-panel design," with the left for navigation, the center for various controls and the right with lyrics.

Amazon Fire Phone Firefly

Amazon also unveiled something called Firefly. By pressing and holding a dedicated button, the Fire Phone can recognize printed phone numbers, email and web addresses, business cards and much more. Firefly even works at a distance, so you can capture a phone number on a sign from across the street, for example.
The idea is to be able to send an email, make a call, save a contact or go to a website without having to type it all into your phone.
Firefly
Firefly in action
It doesn't stop there though; Firefly can also recognize songs, TV episodes, art, magazines, movies, music, QR codes and bar codes. iHeart Radio and StubHub built their own apps with the Firefly SDK to make it easier for customers to start a new radio station or find concert tickets.
Users can pull up info on items like books or a painting, potentially making it a handy information tool.
By the numbers, Firefly recognizes 245,000 movies and TV episodes, 160 live TV channels and 35 million songs. It can supposedly ID 70 million items (over 100 million all told), such as books, DVDs, video games and CDs, and even work around issues like folds, glare and curves. Users can then read product details for these items, add them to their Wish List, and order them on Amazon.com.
Translation - it's easier to buy things with the Fire Phone.
Amazon is releasing an SDK for the feature, meaning third-party developers can take advantage of its item-recognition abilities in their apps, too. The SDK is available immediately.

Talk, touch, wave, buzz: meet the interfaces of the future

Next gen interfaces will shape our future

We're long past keyboard and mouse being the default for computer control, but other than the touchscreen, "natural" user interfaces haven't really taken off.
Some of the wackier ideas we saw at CES this year - like brainwave wristbands - might never become mainstream.
But if the newly-released Leap Motion and the upcoming Kinect 2 give us a taste for alternative inputs, what other interface technology is on the horizon?

Wave your hands

Kinect isn't the only motion control system around. Oblong - a company based on technology created by the designer of the famous Minority Report mid-air interface, John Underkoffler - has a (pricey) videoconferencing room system called Mezzanine that uses sensors mounted on the ceiling to let you drag documents around on the virtual whiteboard with a wand that also gives you a mouse button.
If you don't have the cash or the space for an installation like Mezzanine, you can use a Kinect, a PC, a projector and the Ubi software that turns any surface into an interactive touch screen - the price starts at $149, depending on how large a screen you want. Want to see your recipes on the kitchen counter or Angry Birds on your bedroom wall? Ubi does that.
Next gen interfaces
The Leap Motion controller is available to buy now
If you want something smaller than Kinect that lets you wave your hands at your PC or Mac, the long-awaited Leap Motion is shipping (and looking more like an Apple accessory than the angular Kinect).
Although it's supposed to be a lot more sensitive than Kinect, it's still somewhat variable in use. Scrolling through documents and web pages works well, but doing something precise like selecting an icon can be tricky. Even though you can buy it, we're classing this as a work in progress.
With the right software, you could do much the same with Kinect - check out this video of Microsoft researcher Cem Keskin using his hands to paint in FreshPaint and zoom in Bing Maps.

One finger with Kinect 2

Kinect 2, shipping with Xbox One, will be a lot more sensitive than the original Kinect; three times as sensitive, in fact. Instead of a blur, it sees a pretty accurate 3D representation of whoever is standing in front of it. You can see lips moving when someone is talking or the wrinkles on their shirt. The wider field of view means more people can play at once: we saw it detect six people at the same time at the Build conference, all dancing away.
And it can see further. During the demonstration we watched, the presenter had to walk off the stage to get out of range. The improved skeleton tracking picks up hand motions accurately. It can detect more of the joints in your hand, so it picks up finger and thumb movements too.
Microsoft says the voice recognition will be even better than the current Kinect. In our tests with the new fully voice enabled Sky app for Xbox, that's pretty reliable already.
With Kinect in the box, more developers will take advantage of it and it won't just be for games. If Xbox One is a success, Kinect 2 could be the next-generation interface in everyone's home. The combination of voice and gesture might show up at work too.

Multimode air traffic control

Chris Wild of design company Altran told us about a possible system they're experimenting with for air traffic controllers.
As flight paths get scheduled further in advance instead of today's ad hoc approach, sorting out routing problems without causing delays to other flights will get more difficult, so controllers need to work together closely.
In their setup, the supervisor who needs an overview of all the air space they're responsible for stands at a large screen that shows them all the planes and routes in 3D.
They use gestures to get more information about any planes they're worried about and wear a headset so they can speak to the system to allocate those aircraft to a specific controller - who might get the information on an iPad, drag the planes around on screen and send the new routes back to the big board.
Instead of just one interface, it's a multimodal system. You could gesture, talk or touch the screen for different tasks. You could even have different gestures for your left and right hands. And the technology isn't the real problem. The difficulty is in the execution, with issues such as working out how to take turns at talking, and perfecting the ergonomics so users don't get backache.

Mixing your methods

Gartner analyst Angela McIntyre says that being able to mix different methods of control when they make sense is important to the success of new interfaces: "Being able to do one thing with your hands, another with your voice and a third with touch. Well, it's the way we normally interact with people and things in our life. You could be typing in a document and say the name of a song you want to hear and have it start playing in the background as you're working."
Wild suggests voice recognition systems could borrow a trick from the movie 2001 and use a camera to detect when you are talking to them: "When I want to talk to someone in a crowded room, I look at them. You could put images on the wall of the room that hide the camera and you hold your gaze on an image for a couple of seconds so the computer knows you're addressing it."
At CES this year we were impressed by the way Tobii's gaze tracking system uses what you're looking at on screen to scroll the right window or zoom the right part of the map, so maybe the combination of vision and sound is what we need.

Feel the feedback

Next gen interfaces
Satisfying control systems need to give feedback to users
But in the real world, points out Chris Ullrich of Immersion (the company that designs the haptic feedback used in most phones), "There's a deep-seated human desire to get some kind of physical component to interactions. If you take that away you lose your confidence in how things work. It's not conscious but it makes things feel unsatisfactory. As you increase the physical reality of an experience you increase the pleasure and satisfaction you get from it."
So painting on a tablet screen is fun but it would feel more realistic if the paint that hadn't dried yet still felt 'wet' on screen. The same haptic feedback that buzzes when you type on your phone (or makes you feel that a marble rolling across the screen has fallen down a hole in a game) could do that.
In the longer term, an array of ultrasonic projectors or even a focused pulse of air against your hand could give you that feedback when you're making gestures without a touchscreen. More practical is giving people something to wear around their wrist. It might even be the smartwatch or a fitness tracking device you're already wearing.
The Fitbit Flex uses haptics to tell you when you've taken the number of steps to reach your daily goal. Why couldn't it also give you some feedback when your hand is in the right place to grab an icon in a gesture display? In three to five years, Ullrich thinks it will.

Make everything a button

Haptic startup Redux Labs is taking a different approach, using transducers (piezoelectric or electromagnetic, depending on the size of the device) that turn screens into speakers and propagate microscopic "bending waves" to deliver the sensation in exactly the right place - so it feels like it's under your finger.
They will be able to make a touch button on a phone or a microwave or a car dashboard feel like a physical button you're pressing or let you feel a scrollbar or scroll wheel you're dragging your finger over. Or you could get a sensation when you slide your finger over a key so you can get your hand in the right place, but not type anything until you press down firmly - like a real button.
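That "feel the key, then press to type" behavior boils down to two force thresholds: a light touch triggers a locating sensation so you can find the key by feel, and only a firm press registers a keystroke. A toy sketch of the logic (the thresholds, names and event strings are illustrative, not Redux's actual firmware):

```python
FEEL_THRESHOLD = 0.2   # light contact: play a locating haptic tick
PRESS_THRESHOLD = 1.0  # firm press: actually register the keystroke

def key_events(force_samples):
    """Turn a stream of fingertip force readings (arbitrary units)
    into haptic and keystroke events for one virtual key."""
    events = []
    pressed = False
    for force in force_samples:
        if force >= PRESS_THRESHOLD and not pressed:
            events.append("keystroke")        # fire once per firm press
            pressed = True
        elif force < PRESS_THRESHOLD:
            pressed = False                   # key released, re-arm it
            if force >= FEEL_THRESHOLD:
                events.append("haptic_tick")  # finger resting on the key
    return events

# Finger slides onto the key (feels it twice), then presses down (types):
print(key_events([0.0, 0.3, 0.4, 1.2, 1.3, 0.1]))
# → ['haptic_tick', 'haptic_tick', 'keystroke']
```

The one-shot `pressed` flag is the important design choice: holding the key down doesn't repeat the keystroke, and the locating ticks only play while the finger rests below the typing threshold, mirroring how a real key feels before it actuates.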
Next gen interfaces
Oblong is experimenting with a whole office setup
It doesn't have to be built into a screen either; you could get this kind of sensation on a wooden or metal surface, through the fabric of your car seat or even on a cardboard sign, as well as on glass or plastic. And it doesn't have to be in the same place each time, so you can rotate a tablet and still have a "physical" home or Start button in the right place.
Doing away with physical buttons that move saves money (cutting the round hole in the glass of the iPad is expensive) and it saves a tiny bit of space (you could use that to put a bigger battery in or make the device smaller).
It also means there are fewer parts to fail or break and fewer ways for water to get inside if you drop your gadget. James Lewis of Redux thinks we might see the first products incorporating this technology in as little as a year's time.

Just use your phone

Angela McIntyre thinks that several of these extra technologies will come to devices you already use. Your TV remote control might get accelerometers so you can gesture to change the channel as well as pressing a button to turn the volume down.
Hillcrest designed a remote control that looked like a giant bracelet called the Loop. It didn't take off when it was introduced a few years ago but the company is in talks with several TV makers about adding it to high-end models. Samsung is including a remote with a trackpad for controlling its interface with some TVs this Christmas.
Next gen interfaces
Samsung's trackpad remote shows how different interfaces are extending to all kinds of products
She's seen a lot of unusual ideas – from heads-up displays in cars and screens that show one thing to the driver and a different image to the passenger sitting next to them, to shirts and gloves that give you the physical sensation of what's going on in the game you're playing, to using galvanic skin response to turn different places on your arm into different buttons and controls. (Microsoft Research had a version of that but you had to wear a Kinect on your shoulder to make it work.)
Some washing machine manufacturers are looking at using voice control to simplify complex features, but McIntyre thinks it makes more sense to use the smarts in something you already have. "Why not use your smartphone as a controller?" It has touch, it has voice recognition, it has an accelerometer.
"It is easier for that device to figure out what you're doing. You're close to is so it's not like having the thermostat on the other side of the room trying to figure out what is a gesture and what is you playing with your kids. For at least the next five years, I think the most convenient device people want to use as their controller will be the smartphone."
In other words, the new interfaces are already here. They're just waiting for us to start using them.