Line is building its own digital assistant called Clova

Japan-based messaging app Line is wildly popular in parts of Asia, and the company has even expanded from a simple chat app into a full-service mobile carrier. According to a new report from the Financial Times, Line is branching out again and developing its own digital assistant, called Clova, to compete with the likes of Alexa and Google Assistant, complete with its own line of smart speakers.

Clova isn’t scheduled for release until sometime in “early summer” of this year, but Line views the platform, which is being developed with help from Sony and LG, as the next logical step for its messaging service. The platform will include the sort of features you might expect from a digital assistant in 2017 — like easy access to news, weather, calendars and online purchases — but Line is also promising that Clova will be able to handle “complicated questions” and include a facial recognition component similar to Apple’s rumored iPhone features.

Outside of phones, Clova is designed to work in third-party apps and hardware, but the first products to support it will be a standalone Clova app, a smart speaker called Wave and a “smart display” called Face. Although Wave looks like a direct knockoff of Google Home, Line didn’t offer any information about what sets Face apart aside from its small, cartoonish face display. According to the Financial Times, Clova could eventually find its way into headsets and other third-party hardware.

Given Line’s target markets in Asia, the company hopes to beat Google, Apple and Amazon on its home turf. The Clova app and Wave speaker will see a release in Japan and South Korea this summer, followed by a rollout to Line’s other major markets, Thailand and Indonesia.

Via: The Verge

Source: Financial Times

Engadget RSS Feed

Sony’s Xperia Ear is not the hands-free assistant I wanted

In theory, Sony’s newest wearable sounds promising. The Xperia Ear is a single Bluetooth earbud that lets you dictate messages, get weather updates and smartphone notifications, and carry out other little tasks just by talking to it. It’s like having an Amazon Echo in your ear, except with far fewer skills and third-party integrations. Sony also promises a long-lasting battery that can endure a full workday of talk time with the included charging case, so you can have the assistant ready for your commands all day. Unfortunately, the Xperia Ear simply doesn’t do enough to justify its $200 asking price.

Hardware

The Xperia Ear is a single black wireless earbud. The thumb-size, round-rectangular device has a slightly protruding speaker to help it latch onto your ear. There’s also a semicircular hook-like extrusion above the speaker that doesn’t appear to serve a purpose (other than perhaps helping it maintain a firmer grip on your ear). On its gray outer surface is a physical button that you can press to trigger the assistant, as well as a blue indicator light.

Inside, the earpiece houses a host of sensors, including a gyroscope, accelerometer, Bluetooth radio, NFC chip and proximity sensor. It also meets the IPX2 standard for water resistance, meaning it can survive light splashes or rain. I did not encounter wet weather during my testing period, but the Ear did survive the drops of water I splashed on it.

Importantly, the device comes in a sturdy, pager-size holder that charges the earbud whenever you stow it inside. This case was small enough to carry in even my tiniest of purses, which I appreciated.

In use

Getting started with the Ear is simple. But first, know that it’s compatible only with Android, so if you’re an iPhone user, you should probably stop reading this review. Sony says it is “currently focused on creating the Xperia Ear host app for Android as it’s powered by Sony Agent Technology, which is specifically designed and currently only available for Android.” The company declined to comment on whether iOS compatibility is on the way, so don’t hold your breath.

On your Android device, your first step is to download the Xperia Ear app and then pair the Ear with your phone over Bluetooth. You can also smush your phone together with the earbud if you have an NFC-enabled handset, which makes connecting them a cinch. I paired the Ear with the Huawei Mate 9, and the NFC handshake between both devices was indeed quick.

Once I was all set up, I put the earpiece on and went about my business. The Ear felt surprisingly secure and didn’t fall out even when I shook my head vigorously to test just how well it would stay put. Wearing the Ear was comfortable for about an hour, after which I started feeling a dull ache on the side of my head. It wasn’t terribly painful, but I didn’t always feel like putting up with it, either. Taking off the earbud made the discomfort go away, and I ended up periodically removing the device during my review.

Most of your interactions with the Ear involve pressing the device’s button, waiting for it to say it’s listening and then waiting for its three-tone chime (like the beep after a voicemail greeting). Only then can you ask your question. If that sounds tedious, it’s because it is. Sony could remove two steps from this process by getting rid of the redundant chime and the button push; the resulting speedup would make the Ear feel much more responsive.

I really want the Xperia Ear to always be listening for a trigger phrase, because pushing a button against my ear repeatedly makes the side of my head feel slightly sore over time. Plus, it’s not really a hands-free experience if you have to use your hands to get some help. But that function would come at the expense of battery life, so this is a tradeoff I’m willing to accept.

You can set up the Ear so that a long press of the button activates OK Google, allowing you to use an assistant you’re probably already familiar with. But by default, you’ll be working with Sony’s unnamed helper, which is very new compared with existing offerings. And with that youth come some quirks that, together with its one-sided, Bluetooth-headset-inspired design, make the Xperia Ear feel dated.

Talking to Sony’s assistant feels like interacting with a “futuristic” machine from Demolition Man. Its voice sounds artificial, robotic and disjointed, especially compared with Siri, the Google Assistant and Alexa, which have human voices with more natural inflections. The Ear pronounced my name the same way Engadget’s Southern-bred Editor-in-Chief Michael Gorman does — as in, “Churl-lynn,” with a hard “ch.” Thanks a lot, Sony.

That’s an understandable mistake, considering my name is quite uncommon, but the Ear made the same error when reading a news piece about actress Charlize Theron. It took me a few seconds to realize who the assistant was describing. It also mispronounced the word “cleanses,” saying “clean-suhs” instead of “clen-suhs.” For the most part, though, the Ear is easy enough to understand if you’re paying attention.

The reason I was talking about Charlize Theron, by the way, is that whenever you stick the device in your ear, it greets you and starts rattling off the time, your agenda for the day and the news headlines since you last put it on. The actress was the subject of one of the several headlines that Sony pulled together. You don’t get to pick the news sources you prefer; in the app settings, you can decide only whether you want to hear headlines at all.

You can also choose to get voice alerts from apps such as Calendar, Email, Gmail, Hangouts, SMS, Twitter and Facebook. This causes the Ear to recite your incoming notifications as they arrive on your phone, which can be distracting. I happen to be excellent at tuning out noise, though, so this didn’t bother me. You can also dismiss each alert at any time by pressing the button on the earbud. I actually appreciated having someone read out my new emails to me because it meant I could multitask even more effectively.

Instead of having to go to my inbox whenever I saw a new message, I could simply listen to the Ear narrate the entire email and decide if it was worth an immediate response. It was also adorable when the Ear read Managing Editor Dana Wollman’s email that opened, “Good news, bad news (mostly good news, I think),” but slightly less funny when it read out every last detail of each sender’s email signature, down to their ZIP codes. Still, with some software tuning, this feature could become truly useful for hardcore multitaskers like myself.

There are a few other things the Ear can do, including setting timers, reporting the weather, answering calls, streaming music from your phone and sending text messages. The earpiece’s dual microphones, noise suppression and echo cancellation worked well, and people I spoke with while using the Ear heard me clearly despite the loud Netflix video playing in the background. Because it’s a one-sided earbud, the Ear isn’t a good option for listening to music, but it works in a pinch. Just don’t expect great audio quality here; songs generally lack bass, with vocals sounding the clearest against tinny background instruments.

One of the niftiest things you can do with the Ear is to use voice dictation to compose messages. In general, the device accurately relayed what I said, but it spelled my name wrong. Again, given that I have a unique name, this isn’t a big deal, especially since most other words were spelled correctly.

Now, talking out loud is a rather conspicuous way to interact with any device, especially if you’re in an open office or walking outside. For those who want to be more stealthy, Sony built in an effective way to communicate nonverbally with the Ear: You can nod or shake your head in response to yes or no questions. This is a limited application, yes, but useful nonetheless for quick, discreet reactions. The device correctly interpreted my gestures (acknowledging them with a satisfying chime) when I answered its questions about whether the message it transcribed was correct and if I wanted to send my text.
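As a rough illustration of how a head-gesture answer like this might be detected, here’s a minimal Python sketch. This is purely an assumption on my part, not Sony’s actual algorithm: a nod mostly rotates the head around the pitch axis, while a shake rotates it around the yaw axis, so comparing the oscillation energy on each gyroscope axis is enough to tell the two apart.

```python
# Hypothetical sketch (not Sony's code): classify a nod vs. a head shake
# from gyroscope angular-velocity samples. A nod spins around the pitch
# axis, a shake around the yaw axis, so we compare per-axis energy.

def classify_gesture(samples, threshold=1.0):
    """samples: list of (pitch_rate, yaw_rate) gyroscope readings.
    Returns 'yes' (nod), 'no' (shake) or None if neither axis moved enough."""
    pitch_energy = sum(p * p for p, _ in samples)
    yaw_energy = sum(y * y for _, y in samples)
    if max(pitch_energy, yaw_energy) < threshold:
        return None  # head held still: no answer detected
    return "yes" if pitch_energy > yaw_energy else "no"
```

A real implementation would filter the sensor stream and look for repeated oscillation, but the axis comparison is the heart of the idea.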

That’s impressive for a first-generation device, but the Ear has its glitches. For instance, the earpiece would start reading out its greeting and list of headlines any time it got moved or bumped, even when I wasn’t wearing it. It was also inconsistent in delivering my alerts — I randomly received alerts about two really old unread Hangouts messages on my first day wearing the Ear.

Another gripe I have with the Ear is its inability to reconnect seamlessly with the synced phone after I leave and re-enter Bluetooth range. That means that when I step into the bathroom or leave the phone in a different room, the Ear stops working, only saying, “Device not connected.” When I get back to the phone, I have to press the button on the earbud to re-sync the devices. This should happen without any action on my part.

Like any other wireless earbud, the Xperia Ear’s battery life varies wildly depending on how much you use it. On my first day testing the device, which included a lot of email alerts and nearly an hour of song streaming, the Ear conked out (from a 60 percent charge) after a full day’s work. Another time, on a full charge, the Ear dropped just 60 percent of its energy after two days of testing, which included five to 10 minutes of music playback and multiple phone calls, text-message dictation and other small tasks. You can extend that runtime by activating Sony’s Battery Care mode via the companion app.

Speaking of battery life, recharging the Ear is easy — just put it back in its carrying case. The holder has two indicator lights: The top one flashes red, yellow or green to show how full the earbud’s battery is. Another LED on the bottom indicates the amount of power left in the case, which you can plug in via micro-USB. It took about a week for the container’s charge to go from green to red, after it recharged the earbud a handful of times.

The competition

The Xperia Ear is a unique device — nothing else on the market claims to do exactly what it does. The thing is, though, you can get a similar experience with some of today’s wireless earbuds that let you tap your phone’s digital assistant. Case in point: The $250 Bragi Dash lets you tap your cheek to talk to Siri. You can also activate Siri with your existing Apple earphones via a long press on the inline remote. Android owners don’t have a similar wireless option, though.

Compared with other wireless earbuds, such as the $200 Samsung Gear IconX and the $250 Jabra Elite Sport, the Xperia Ear is expensive, especially since it covers only one ear. Plus, the Samsung and Jabra devices are geared toward fitness users and offer more features (and two earbuds instead of one) for the same money or just $50 more. They also deliver better audio quality than the Xperia Ear, although Sony’s device offers longer battery life. Still, neither of them lets you control an assistant yet, and the Ear retains that advantage over the competition, at least until its rivals add the feature (which, let’s be real, is inevitable).

Wrap-up

I was excited about the Xperia Ear and what it promised until I realized that, as it stands, the device does nothing you can’t already do with Siri or the Google Assistant through wired earbuds. In particular, the fact that it requires a button press to use makes me question the device’s existence in the first place. What’s the point of getting a whole new gadget for an assistant in your ear if not for the convenience when your arms are full? It’s not like this is a cheap purchase, either.

Still, this is a first-generation device that has the potential to become truly useful if Sony tweaks its software. That’s an easy enough fix. The trouble is, makers of other wireless earbuds could almost as easily offer the same features, by tapping into Siri or the Google Assistant. If, or when, they do, the Xperia Ear risks becoming a completely forgettable device.


Allo brings Google’s ‘Assistant’ to your phone today

If you’re going to unveil a new messaging app, it had better do something unique. At this point, finding a place among entrenched options like Facebook Messenger, WhatsApp and iMessage is not an easy task. Google didn’t quite pull it off with Hangouts when that app launched in 2013. Sure, it’s installed on basically every Android phone out there, and anyone with a Gmail account has probably tried it, but Google’s messaging strategy never quite came together in a compelling or clear way.

So Google is rebooting yet again with Allo, a mobile-only messaging app that leverages the company’s biggest strength in an effort to stand out from the pack: the vast amount of knowledge Google has about you and the world around you. That knowledge shows up in the app via the Google Assistant, a conversational chatbot that provides you and your friends with contextual info based on your chat history. The bot will show up across multiple Google products, including Google Home, but this is our first look at it in action.

It’s an outgrowth of what Google’s been doing for a long time with the Knowledge Graph and the info it serves you in things like Google Now, and that really is something no other app can do. I’ve been playing with Allo for about a week to see just how much the app can do — and where it still falls flat.

Getting set up is a simple affair: Once the app is installed, you create a profile linked to your phone number and Google account. From there, you’ll be able to see who in your phone’s contact list is using Allo to initiate a chat; you can also invite friends who don’t have the app to give it a shot. Then you can start a one-on-one chat, a group chat, an encrypted “incognito chat” or talk directly to the Google Assistant.

The Assistant is what really sets Allo apart from other chat apps, and it can provide you with a host of info depending on whether you’re in a private chat with it or bringing it into a conversation with other human beings. Probably the best way to sum up the Assistant is that it lets you bring info from around the internet right into your conversations without having to jump back and forth between apps.

If you’re planning dinner, for example, you can ask it to show you nearby Indian restaurants, and then tap on a specific result to get more details. Results from the Google Assistant typically have “chips” below them to prompt you to continue getting more info; you can pull up a map, call the location, see pictures inside and more with one tap. And because it understands natural language, you can follow up your query about Indian restaurants by saying “What about Chinese?” and it’ll know you’re interested in food, not the language.
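To illustrate the kind of context carry-over described above, here’s a toy Python sketch. The `answer` function, the `CUISINES` set and the intent-tracking logic are all hypothetical stand-ins for how such a bot could work, not Google’s implementation: the trick is simply to remember the last intent so a terse follow-up can reuse it.

```python
# Hypothetical sketch of conversational context carry-over, in the spirit
# of following "nearby Indian restaurants" with "What about Chinese?".

CUISINES = {"indian", "chinese", "thai", "italian"}

def answer(query, context):
    """Resolve a query, falling back to the remembered intent for follow-ups."""
    words = set(query.lower().replace("?", "").split())
    cuisine = next((w for w in words if w in CUISINES), None)
    if "restaurants" in words or "restaurant" in words:
        context["intent"] = "food"  # remember what the user is after
    if cuisine and context.get("intent") == "food":
        return f"nearby {cuisine} restaurants"
    return "sorry, I didn't understand"

ctx = {}
answer("show me nearby indian restaurants", ctx)  # sets the "food" intent
follow_up = answer("what about chinese?", ctx)    # reuses that intent
```

Without the shared `ctx` dictionary, the follow-up question would be unanswerable, which is exactly the ambiguity the real Assistant resolves with its chat history.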

This can be genuinely useful — it’s easy to share things like flight status, local weather and nearby points of interest with groups of people just by asking Google. And there’s lots of silly fun to be had as well. Google built in some games like “emoji movies,” where you have to guess the name of a film based on a series of emojis. You can also have it pull up pictures and GIFs from Google images, so it’s pretty easy to drop cute cat pictures to your group on the fly.

The downside to the Google Assistant is that it doesn’t quite live up to the promise of letting you do everything in the app, through the bot. Many times, tapping on various items will bounce you out to your browser, and while I can look up a bunch of restaurants with my friends, I can’t actually book one through OpenTable right in the app, for example. The Assistant doesn’t yet work with third-party services, so I can’t say “get us a table for four at 8PM.” That’ll come down the line, though.

When it can’t complete a task itself, you get bounced out to the web. Sometimes that makes sense — seeing a restaurant’s full menu is better in a browser than in a chat app, and getting directions to a location is a lot better in the proper Google Maps app. But the experience occasionally felt a bit more disjointed than I’d like. Google says the Assistant is considered only a “preview” right now, so it should become smarter and better integrated in time.

Chatting directly with the Google Assistant (rather than interacting with it in a chat with other humans) opens up more functionality. For the sake of privacy, it can do certain things only in private chat — you can ask it to get you directions to work, show you emails from yesterday, pull up your calendar agenda and more things based on your personal Google account. You can even have it pull images from Google Photos using natural language like “show me my pictures of dogs.”

The app also lets you set reminders and alarms as well as sign up for recurring “subscriptions.” You can search for a particular news item (I tried “Red Sox news”) and it’ll pop up every day at the time you specify. This is all well and good, but I don’t think a chatbot is the best place for a lot of these interactions. In fact, in a lot of cases, it’s easier to just say “OK Google” and ask your Android phone for this sort of help or info. Siri also does a lot of this on the iPhone at this point, as does the Google iOS app. Don’t get me wrong, the Google Assistant can be quite knowledgeable and useful, but in a lot of ways it’s just replicating things you can already do in Google search.

Beyond the Assistant, Allo has the messaging basics covered, but there are few surprises here. You can tap and hold the “send” button and then scroll up and down to increase or decrease the size of text — Google calls this “yelling” or “whispering.” It’s quite similar to the “loud” and “gentle” settings Apple added to iMessage in iOS 10, if you’ve checked that out. Google has also added the “smart reply” feature that originated in Inbox. It’ll analyze the content of your chats or photos and offer suggestions. I found it to be pretty hit-or-miss; it’s handy to have it offer up a quick yes or no reply, but longer replies usually don’t work out so well.

Naturally, Allo also has stickers; there are 29 different sets you can download for starters, some of which are animated. They’re nice, and Google notes the name of the artist who created each set, but they’re not wildly different from what’s out there already. And as yet, there isn’t a way to add more third-party options.

You can share your location or photos in Allo, but I ran into one surprising omission during my testing: On Android, you can’t browse your Google Photos library to add images to a chat — you can access only images you’ve shot directly on your phone or downloaded to storage. There are workarounds — you can go to Google Photos directly and share a photo to Allo from there — but it still seems like a strange limitation. On Android, you can add text to photos and draw on top of them (a la Snapchat), a feature that’ll be coming to iOS down the line.

Allo also offers end-to-end encryption in “incognito” chats. The Google Assistant isn’t allowed here, and the participants in the chat can decide how long they want the messages to stick around for. You can set the chat expiration time as long as a week or as short as five seconds (you can also make it so messages don’t disappear). Most users probably won’t bother with this feature, but apps like Telegram made highly secure chat a feature of note, so it makes sense to see it pop up here.
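Mechanically, disappearing messages usually boil down to attaching an expiry timestamp to each message and filtering out anything past it. Here’s a minimal Python sketch of that general technique; the `IncognitoChat` class and its method names are my own illustration, not how Allo actually works (and real end-to-end encryption is a separate layer entirely).

```python
import time

# Illustrative sketch of expiring messages, as in Allo's incognito chats:
# each message carries an expiry timestamp and is dropped once it passes.
# Class and method names are hypothetical, not Google's code.

class IncognitoChat:
    def __init__(self, ttl_seconds=5):
        self.ttl = ttl_seconds  # Allo allows 5 seconds up to a week
        self.messages = []      # list of (expires_at, text)

    def send(self, text, now=None):
        now = time.time() if now is None else now
        self.messages.append((now + self.ttl, text))

    def visible(self, now=None):
        """Drop expired messages and return the ones still readable."""
        now = time.time() if now is None else now
        self.messages = [(exp, t) for exp, t in self.messages if exp > now]
        return [t for _, t in self.messages]
```

Passing `now` explicitly makes the expiry logic easy to test; a real client would also delete the ciphertext from local storage when a message expires.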

Overall, there’s not a lot to make Allo stand out from the competition beyond the Google Assistant. And unfortunately, the Assistant still feels a bit like it’s under construction. The breadth of information that Google has access to, about both a user and the world around them, is stunning, and it’s great to tap into. But Google has already given us a plethora of ways to do that; Allo is just another. The difference is that Allo makes it easy to bring that data into a conversation with other humans.

That’s the killer feature. But it’s not a simple one to explain, and it’s not something that becomes immediately useful. Some co-workers and I goofed around with Allo for several days, but the Assistant never elevated itself to a must-have feature. It was fun to show off and experiment with, but it didn’t feel like enough to keep any of us conversing in the app over the many other options we already have available to us. I’d like to keep giving it a shot, because it feels like it could be useful under the right circumstances. The trick is getting your friends to use it long enough for those situations to arise.


Alexa support coming to BMW’s ‘Connected’ assistant app

BMW first revealed its revamped “Connected” assistant app in March, and it will finally be available this month. As a reminder, it does a lot more than sync your phone and car, acting more like the love child of Waze and Google Now. It can scan your device’s calendar and address book, then calculate the drive time to an appointment based on your route and real-time traffic data. After factoring in the vehicle’s fuel or battery level, it will send a “time to leave” notification to your iPhone or Apple Watch.
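At its core, the “time to leave” calculation described above is simple arithmetic: appointment time minus a live drive-time estimate minus a small safety buffer. Here’s a minimal Python sketch of that idea; in the real app the drive time would come from traffic data, but it’s just a parameter here, and the function name and buffer value are my own assumptions.

```python
from datetime import datetime, timedelta

# Illustrative sketch (not BMW's code) of a "time to leave" notification:
# fire the alert at appointment time minus drive time minus a buffer.

def time_to_leave(appointment, drive_minutes, buffer_minutes=5):
    """Return the moment to send the leave-now notification."""
    return appointment - timedelta(minutes=drive_minutes + buffer_minutes)

meeting = datetime(2016, 8, 1, 9, 0)
leave_at = time_to_leave(meeting, drive_minutes=40)  # 08:15 with 5-min buffer
```

In practice the drive-time estimate would be refreshed as traffic changes, shifting `leave_at` accordingly.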

All of that information, including addresses and arrival times, is automatically synced to your car when you get in, assuming it’s a ConnectedDrive BMW, Rolls-Royce or Mini. Yes, other apps, including Android Auto and Waze, handle most of those functions. But Connected, being integrated with the vehicle, also lets you lock and unlock your car, flash the headlights to help find it, and turn on the AC before you get in, among other things. Once you arrive, it’ll give you “last mile” walking or transit directions.

Later this year, BMW will join Ford as one of the few automakers with Alexa support. That’ll let you shout commands at an Echo to remotely lock the doors and perform other functions, or get info like your vehicle’s fuel or battery level. BMW says the app will arrive on iOS sometime in August, with the Alexa update coming later in the year. There’s no word yet on Android support.

Source: BMW
