Apple’s 2019 iPhone could have a rear-facing 3D sensor

Apple has made no secret of its interest in augmented reality (AR) — in interviews CEO Tim Cook gives it as much attention as sales growth. Now, it’s rumoured that the company’s 2019 iPhone release will come with a rear-facing 3D sensor, potentially turning the model into a leading AR device.

People familiar with the plan have revealed that the sensor would complement, not replace, the existing TrueDepth sensor on the front of the iPhone X, Bloomberg reports. The current technology, which powers Apple’s Face ID, projects a pattern of 30,000 infrared laser dots onto the user’s face and measures the pattern’s distortion to generate a 3D image for authentication. The proposed rear sensor would use a “time-of-flight” method instead, measuring how long laser pulses take to bounce off surrounding objects and return, then building a 3D image from those measurements.
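The time-of-flight principle reduces to a one-line calculation: distance is the pulse’s round-trip time multiplied by the speed of light, halved. Here’s a minimal sketch of that math (the function name and the 10-nanosecond example are ours, purely for illustration):

```python
# Time-of-flight depth estimation: a light pulse's round trip,
# times the speed of light, divided by two.
SPEED_OF_LIGHT_M_S = 299_792_458.0

def tof_depth_m(round_trip_s: float) -> float:
    """Distance to the surface that reflected the pulse, in meters."""
    return SPEED_OF_LIGHT_M_S * round_trip_s / 2.0

# A pulse that returns after 10 nanoseconds implies a surface about 1.5 m away.
print(f"{tof_depth_m(10e-9):.2f} m")  # -> 1.50 m
```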

Apple released ARKit this year, a software tool that helps developers build AR apps for the iPhone. It has proven successful at basic AR tasks, but it struggles with more complex visuals and lacks true depth perception. A rear-facing 3D sensor is thought to be a way to mitigate these issues. However, sources say the tech is still in its infancy and might not be used in the final version of the phone. There’s certainly no shortage of companies manufacturing time-of-flight sensors, though, so if it doesn’t make it into the 2019 model, it’s likely that it — or some incarnation of the technology — will follow soon after.

Source: Bloomberg


iOS beta explains WiFi and Bluetooth controls with notifications

As we noted back in September, iOS 11’s Control Center buttons don’t actually turn off Bluetooth or WiFi, unlike previous versions. Instead, tapping on either one simply disconnects you from any devices or services your iPhone is currently connected to. Apple ostensibly made this change so that you could stay connected to other services like AirDrop and devices like your Apple Watch. Still, the behavior can be confusing to many. According to MacRumors, the latest iOS 11.2 beta gives you an explanatory notification when you tap either Control Center button.

According to screenshots, your iPhone will show a notification like the following: “Disconnecting Nearby WiFi Until Tomorrow,” with an explanation that the current network and others nearby will be disconnected until the following day. It also states that “WiFi will continue to be available for AirDrop, Personal Hotspot, and location accuracy.” You’ll see a similar notification for Bluetooth if you tap that button. This third iOS 11.2 beta arrives a week after the second beta, which added Apple Pay Cash via the Messages app, and a few days after Apple released iOS 11.1.1, which fixed an annoying autocorrect bug.

Source: MacRumors


Some iPhone X units suffer from crackling speakers at high volume

The iPhone X appears to have multiple teething troubles, albeit ones that aren’t necessarily common. Some users on Reddit, MacRumors and Twitter report that the new handset’s top speaker crackles at higher volume levels. The severity varies, but it happens regardless of what you’re playing and persists with replacement units. Most units don’t appear to be affected, but the reports are frequent enough that this isn’t an isolated case.

We’ve asked Apple for comment and will keep you updated. Apple support reps are already collecting diagnostic info, so they’re at least investigating the reports.

It’s difficult to pin down a cause at this stage. Although the differing levels of the problem suggest the crackling could be a hardware issue, this comes mere weeks after Apple fixed a software flaw that produced crackles on the iPhone 8 and 8 Plus. If it’s a related issue, the company could theoretically push out a patch that addresses the problem without replacements. Either way, this and other problems are a reminder that cutting-edge phones can have their share of early glitches — it can take time before manufacturers iron out the kinks.

Via: MacRumors

Source: Reddit


Security firm claims to thwart iPhone X’s Face ID with a mask

When Apple introduced Face ID security alongside the iPhone X, it boasted that even Hollywood-quality masks couldn’t fool the system. It might not be a question of movie-like authenticity, however — security researchers at Bkav claim to have thwarted Face ID by using a specially-built mask. Rather than strive for absolute realism, the team built its mask with the aim of tricking the depth-mapping technology. The creation uses hand-crafted “skin” made specifically to exploit Face ID, while 3D printing produced the face model. Other parts, such as the eyes, are 2D images. The proof of concept appears to work, as you can see in the clip below. The question is: do iPhone X owners actually have to worry about it?

The researchers maintain that they didn’t have to ‘cheat’ to make this work. The iPhone X was trained on a real person’s face, and the mask required only roughly $150 in supplies (not including the off-the-shelf 3D printer). The demo shows Face ID unlocking in one try, too, although it’s not clear how many false starts Bkav had before producing a mask that worked smoothly. The company says it started working on the mask on November 5th, so the completed project took about five days.

When asked for comment, Apple pointed us to its security white paper outlining how Face ID detects faces and authenticates users.

Is this a practical security concern for most people? Not necessarily. Bkav is quick to acknowledge that the effort involved makes it difficult to compromise “normal users.” As with fake fingerprints, this approach is more of a concern for politicians, celebrities and law enforcement agents whose data is valuable enough to justify days of effort. If someone is so determined to get into your phone that they build a custom mask and have the opportunity to use it, you have much larger security concerns than whether or not Face ID holds up.

More than anything, the seeming achievement emphasizes that biometric sign-ins are usually about convenience, not completely foolproof security. They make reasonable security painless enough that you’re more likely to use it instead of leaving your device unprotected. If someone is really, truly determined to get into your phone, there’s a real chance they will — this is more to deter thieves and nosy acquaintances who are likely to give up if they don’t get in after a few attempts.

Source: Bkav


The Pixel 2 XL has another screen issue: unresponsive edges

It looks like Google still isn’t done fielding complaints about the Pixel 2 XL’s display. While some users are experiencing premature screen burn-in and a bluish tint, others are apparently having trouble with its responsiveness: comments posted on the Pixel 2 community website reveal that some units fail to register touches near the edges of the screen. One poster even ran a test and found that while the edges of his display recognize swipes just fine, they can’t always recognize taps.

Here’s a video of the experiment:

According to Android Police, this happens because the device’s accidental touch protection is just a bit too aggressive. The good news is that it’s a software issue, and Google is already working on a fix. Orrin, a Pixel 2 XL community manager, posted in the thread to say that the Pixel team is investigating and will address the problem in an upcoming over-the-air update.
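To picture how an over-eager accidental-touch filter could swallow taps while letting swipes through, here’s a rough conceptual sketch. It’s our guess at the general logic, not Google’s actual code, and the margin and panel width are invented for illustration:

```python
# Naive edge-rejection heuristic: discard touches that both start and
# end inside a dead zone along the screen edges. A swipe travels far
# enough to escape the zone; a stationary tap never does.
EDGE_MARGIN_PX = 40      # assumed dead-zone width
SCREEN_WIDTH_PX = 1440   # Pixel 2 XL panel width

def accept_touch(start_x: float, end_x: float) -> bool:
    """Reject a touch only if it begins and ends inside an edge margin."""
    def in_margin(x: float) -> bool:
        return x < EDGE_MARGIN_PX or x > SCREEN_WIDTH_PX - EDGE_MARGIN_PX
    return not (in_margin(start_x) and in_margin(end_x))

assert not accept_touch(10, 12)   # tap near the edge: filtered out
assert accept_touch(10, 300)      # swipe inward from the edge: registers
```

A filter like this would explain the poster’s observation exactly: edge swipes register, edge taps don’t.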

In an effort to preempt similar complaints about bluish or greenish screen tints and burn-in, Apple recently updated its support page to explain that those are perfectly normal for OLED displays like the iPhone X’s and Pixel 2 XL’s. Nevertheless, the iPhone X’s screen seems to come with its own set of issues: some units have a nasty green line running down one edge, while others stop responding to touches in cold temperatures.

Source: Android Police


A dedicated AI chip is squandered on Huawei’s Mate 10 Pro

Let’s face it: The AI hype train isn’t going away, and soon all our devices will be run by artificial intelligence. While Apple’s answer to the AI takeover is simply to call its new A11 processor “Bionic,” Huawei has taken a more concrete approach. The company embedded a neural processing unit (NPU) in its Kirin 970 chip, which it claims can run AI tasks faster and with less power than its rivals. The newly launched Mate 10 Pro is the first phone to use the Kirin 970, and it’s meant to demonstrate the wonders of deeply embedded AI. So far, though, it’s a capable, well-designed phone that has yet to fully explore what a dedicated NPU can do.

When Huawei asked a group of reviewers what we wanted from AI, I didn’t have a real answer, though my peers pointed out things like natural language processing and battery management. But after a few days with the Mate 10 Pro, I’ve realized what I want.

My ideal AI would basically be able to predict what I wanted based on how and when I’m using my phone. For example, if I’m holding my phone up at eye level in my apartment at about the same time every day, I’m most likely starting one of my daily selfie sprees. It should know then to automatically activate (or at least suggest) the Portrait mode on my front camera and even take a series of photos when I push one button. It gets tiring having to keep pressing the volume down button to take dozens of pictures.

The Mate 10 Pro doesn’t live up to my unrealistic expectations, but it marks a step in the right direction. The phone can recognize things you’re pointing the camera at, like food, pets, flowers or buildings, and adjust settings like ISO, shutter speed and saturation to make your photos look good. For now, the Mate 10 Pro identifies only 13 scenes, but Huawei says it will keep adding situations that the phone can recognize.
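Conceptually, scene-based tuning like this boils down to mapping a classifier’s label to a bundle of capture settings. The sketch below is purely illustrative: the labels mirror the article’s examples, but the values are invented, not Huawei’s actual parameters:

```python
# Map a recognized scene label to capture settings. All values here are
# made up for illustration; Huawei hasn't published its tuning tables.
SCENE_SETTINGS = {
    "food":     {"saturation": +0.2, "iso": 200, "shutter_s": 1 / 60},
    "pet":      {"saturation": 0.0,  "iso": 400, "shutter_s": 1 / 250},
    "flowers":  {"saturation": +0.3, "iso": 100, "shutter_s": 1 / 125},
    "building": {"saturation": 0.0,  "iso": 100, "shutter_s": 1 / 250},
}

def settings_for(scene_label: str) -> dict:
    """Fall back to neutral auto settings for scenes the phone can't identify yet."""
    return SCENE_SETTINGS.get(
        scene_label,
        {"saturation": 0.0, "iso": "auto", "shutter_s": "auto"},
    )

print(settings_for("flowers"))           # boosted saturation for blooms
print(settings_for("basketball_court"))  # unknown scene: plain auto
```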

In other words, the Mate 10 Pro is smart enough to be both camera and photographer. That is, in theory. While the Mate 10 Pro does take lovely pictures that are bright, sharp and accurately colored, I suspect that has more to do with its camera hardware than clever AI. The two cameras on its rear both feature an aperture of f/1.6 — the widest yet on a smartphone (tied with the LG V30). That hardware not only allows for clearer pictures in low light, but also creates a pleasantly shallow depth of field.

When I compared pictures I took in manual mode to those where the AI decided what settings to use, I had a hard time seeing a difference. My photos of flowers appeared just as saturated whether the AI was at work or not, and the depth of field looked the same either way. The main difference I saw was a stronger bokeh effect applied by the AI. I guess this is kind of the point — the AI is as good as I, a human, am at determining the best settings.

Although the Mate 10 Pro’s tweaks aren’t very noticeable, its scene recognition is mostly quick and accurate. However, some situations stumped the Mate 10 Pro, like my messy dinner of chicken covered with onions and peppers in a chili paste. Then there are the many objects that the phone can’t identify yet — like a group of players on a basketball court or a pair of pretty shoes. Huawei also needs more data before the phone can learn the best settings for those situations, whether it be bumping up the shutter speed to capture fast-moving soccer balls or producing a shallower depth of field around shoes. The company said it will keep analyzing pictures (not user-generated ones) in the cloud and push out updates to continually improve its camera software. No, Huawei isn’t spying on your photos — these are pictures it got elsewhere (the company hasn’t told us the source).

The AI is absent on the front camera, but I still loved the selfies I took with the Mate 10 Pro. Huawei’s Portrait Mode relies on face detection rather than depth sensing like the iPhone X’s, which creates a softer depth of field that’s sometimes less defined than Apple’s. But the pictures from Huawei’s phone are more flattering: the iPhone X’s Portrait Mode selfies are so sharp that every imperfection and stray hair is obvious.

The primary benefit of having a dedicated neural processing unit on the phone’s chipset is that machine-learning tasks can be executed more quickly. Things like image recognition or language translation can be carried out in tandem with other general functions, so your phone shouldn’t slow down just to find the 3,500th picture of your cat’s face. With Huawei’s Kirin 970 chip, app developers can tap into the NPU by using either the Kirin API or popular machine-learning frameworks like Google’s TensorFlow or Facebook’s Caffe2.
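We don’t have Huawei’s Kirin API to show here, but the kind of workload the NPU accelerates can be expressed in the TensorFlow of the day (the 1.x API). The toy scene classifier below is a stand-in, not Huawei’s model; the layer sizes are arbitrary, and the 13-class output simply echoes the article’s scene count:

```python
# Minimal TensorFlow 1.x sketch of an image-classification workload,
# the sort of graph a vendor toolchain could offload to an NPU.
import numpy as np
import tensorflow as tf

images = tf.placeholder(tf.float32, [None, 224, 224, 3], name="images")
conv = tf.layers.conv2d(images, filters=8, kernel_size=3, activation=tf.nn.relu)
pooled = tf.reduce_mean(conv, axis=[1, 2])   # global average pooling
logits = tf.layers.dense(pooled, units=13)   # 13 scene classes, per the article
scene = tf.argmax(logits, axis=1, name="scene")

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    frame = np.random.rand(1, 224, 224, 3).astype(np.float32)  # stand-in camera frame
    print(sess.run(scene, feed_dict={images: frame}))
```

The point of the NPU is that inference like this runs on dedicated silicon instead of competing with the CPU for cycles.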

The problem is, not many apps have done this. So far, only Huawei’s own camera software and Microsoft Translator tap the NPU for improved performance. The latter comes preinstalled on the Mate 10 Pro, by the way, and only its image-based translation tool is optimized right now. I took a picture of the phrase “You’re so pretty” in Mandarin, and barely a second later Translator told me it meant “You’re beautiful.” Close enough. Subsequent attempts with the same printout yielded dubious results, though, with the app often translating the words to “Hello, Drift.” This is more likely an issue with Microsoft’s engine than with the Mate 10 Pro.

I tried the same thing out on a Galaxy Note 8 and an iPhone 8 Plus. All three phones performed within half a second of each other — with the Huawei frequently finishing the fastest. Sometimes the iPhone took the lead, but for the most part none of them lagged far behind the rest.

Aside from its camera and the Translator app, the Mate 10 Pro also uses AI to learn your habits over time so it can pre-allocate resources to the apps it thinks you’ll launch next. From my few days using the phone, it’s hard to judge how effective this has been, but the Mate 10 Pro certainly keeps up with my incessant selfie-taking, Instagram bingeing and light emailing.

So far, the Mate 10 Pro has too few AI integrations for me to really notice the benefits of a dedicated NPU. It’s a sleekly designed handset, though, and I love showing off the attractive “Signature” stripe on its elegant, shiny rear. The epic battery life is also a bonus. It easily gets through two days on a charge, and I can go four days without plugging it in under extremely light usage. I wish its display were sharper than 1080p, but that’s a minor complaint. Since Huawei hasn’t shared the US price and availability, I can’t definitively say if the Mate 10 Pro is a better deal than its competitors. But it’s an intriguing preview of the good that can come from a phone powered by AI.


Some iPhone X displays have a nasty green line

The iPhone X’s design revolves around its all-encompassing OLED display, so you can imagine the heartbreak when that display is glitchy… and unfortunately, it looks like a handful of owners are going through that pain. People on Apple’s forums, Reddit and elsewhere are reporting a glitch where a green line runs down the left or right edge of the display, regardless of what’s happening on-screen. This doesn’t appear to affect the functionality, but it’s clearly annoying.

We’ve asked Apple for comment on the issue. It doesn’t appear that restarts or other common software solutions fix it, though, and this might be strictly a hardware problem. It’s not necessarily an overscan line like you might see on a TV, either. No matter what, it’s safe to say that you can get a replacement if the usual troubleshooting proves fruitless.

It’s unclear how many people are affected by the green line, although it doesn’t appear to be a widespread issue. Between this and the (software-fixable) cold weather responsiveness issue, though, it appears that the iPhone X has some teething troubles. That’s not entirely surprising. It’s Apple’s first phone to use an OLED screen, and it’s using a custom (Samsung-manufactured) panel at that — there may be a learning curve involved as the companies master their production techniques. Samsung, for its part, has had problems with its own OLED phones. Provided the iPhone X flaw is a hardware issue, it illustrates the broader challenges of manufacturing cutting-edge OLED screens.

Via: 9to5Mac

Source: Apple Communities, Reddit, Lejia Peng (Twitter)


Apple pushes out iOS 11.1.1 to fix annoying autocorrect bug

Apple gave its mobile software a facelift when it released iOS 11 back in September, but bugs led the company to push out an 11.1 update a month later to protect users from the WPA2 KRACK vulnerability. Turns out that version introduced another set of squirrely issues, which has led Apple to release iOS 11.1.1 today. You can finally say goodbye to that stupid autocorrect bug that switched out the letter ‘i’ for all manner of gibberish.

The update also addresses an issue where the ‘Hey Siri’ feature occasionally stops working. And… that’s it. Even the security content is the same as the 11.1 release, meaning Apple pushed this update out just to fix these two issues. Consider your outrage heard, given that Apple could have waited to fix them until the forthcoming 11.2 update. Reportedly, the 11.2 beta also fixes some GPS issues with the iPhone 8 and iPhone X.

Via: Ars Technica

Source: Apple


Apple Clips has better controls and loads of new ‘Star Wars’ effects

Apple’s Clips video creation app is less than a year old, but it’s already getting a big update. Thanks to lots of user feedback and the proliferation of new, more powerful iOS devices, Clips is now more polished than ever, and that’s very good news for people looking to craft their next viral video masterpiece.

The philosophy behind the app hasn’t changed — it’s still all about making fun, short videos without much technical know-how — but Apple worked to make the app even easier to use. Consider the app’s interface: it was never particularly hard to wrap your head around, but Apple’s zeal for simplicity sometimes made the original layout feel a little too basic. While you’ll still use a big, bright record button to add clips to your timeline, a handful of new shortcuts beneath the viewfinder window make it easier to gussy up your work.

Apple also moved its controls for Live Titles (a feature that automatically turns what you’re saying into subtitles) and style transfer filters (which add fun, Prisma-style art effects to your photos and videos) to the left and right of that big record button. These were two of the most popular (not to mention most useful) features in Clips, so I’m glad they’re getting a little more prominence this time.

Clips now packs support for iCloud Drive as well, so you can start a video project on an iPhone and pick up where you left off on an iPad without issue. The same project reflects updates made on multiple devices, so there’s no need to worry about version control — something most average Clips users would probably loathe having to think about.

Interface revamps aside, the biggest new addition is what Apple calls Selfie Scenes. It’s unfortunately exclusive to the iPhone X, and one look at the feature in action confirms why — it uses the X’s TrueDepth camera to isolate your face, paint over it with some sweet artsy filters and replace your current background with something more scenic. Right now, the current batch of scenes includes a neon-soaked city in Asia, a hand-drawn rendition of Paris, an 8-bit city that looks like something out of Rampage and, uh, the Millennium Falcon. Seriously. Apple’s cozy partnership with Disney now means that you can virtually insert yourself into a corridor on the Falcon or the bridge of Supreme Leader Snoke’s Mega-class Star Destroyer from The Last Jedi. Naturally, the view of your face takes on the hazy blue of a Star Wars-style hologram.

Yes, Clips’s use of the iPhone X’s TrueDepth camera is a gimmick, but it’s a damned cool one. More importantly, it works well almost all the time — the iPhone X did a mostly great job isolating me and my extremities from my virtual background. There’s also something just a little wild about seamlessly inserting myself into a sci-fi universe I’ve yearned to be a part of since I was 8. It’s just too bad older iPhones don’t have the hardware necessary to make this work for more people. (There’s a small consolation prize for Star Wars buffs with older iPhones: loads of animated stickers depicting Chewie, Princess Leia, TIE Fighters and more.)

While not every iPhone will get all these new features, Apple’s thoughtful changes to the interface and workflow mean the Clips update is well worth installing — you can find it in the App Store now.


Apple offered to help FBI unlock Texas shooter’s phone

During a press conference, FBI special agent Christopher Combs complained that the agency couldn’t get into the Texas shooter’s phone. Turns out all they had to do was ask Apple for help. In a statement released to the media, the tech titan said it “immediately reached out to the FBI after learning from their press conference on Tuesday that investigators were trying to access a mobile phone.” Cupertino offered its assistance and even promised to “expedite [its] response to any legal process.” It added that it “work[s] with law enforcement every day” and “offer[s] training to thousands of agents so they understand [its] devices and how they can quickly request information from Apple.”

The company told Business Insider that the FBI has yet to ask for help accessing the phone. That pretty much confirms Reuters’ report that officials missed the 48-hour window during which the shooter’s fingerprint could have unlocked the device. Had the gunman enabled fingerprint access, Apple could have told authorities they had 48 hours to use his prints before Touch ID stopped working and began demanding the passcode.
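Apple’s documented rule here works out to a simple check: fingerprint unlock is only offered if the device has been unlocked within the past 48 hours. The sketch below is our illustration of that policy, not Apple’s code:

```python
# Illustration of the documented 48-hour rule: after two days without
# an unlock, Touch ID is disabled and the passcode is required.
from datetime import datetime, timedelta

BIOMETRIC_WINDOW = timedelta(hours=48)

def touch_id_available(last_unlock: datetime, now: datetime) -> bool:
    """Fingerprint unlock works only inside the 48-hour window."""
    return now - last_unlock <= BIOMETRIC_WINDOW

last_unlock = datetime(2017, 11, 5, 11, 20)  # hypothetical timestamp
print(touch_id_available(last_unlock, datetime(2017, 11, 6, 9, 0)))  # True
print(touch_id_available(last_unlock, datetime(2017, 11, 8, 9, 0)))  # False
```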

Now that it’s past the 48-hour mark, the agency has to find a legal means to get to the phone’s contents, and officials will have to serve Apple with a court order to get their hands on the shooter’s iCloud data. It’s unclear whether the FBI is already securing one, but it might have decided not to work with Apple after having a tough time convincing the company to unlock the San Bernardino shooter’s iPhone. Apple refused to open that device even after the FBI took the company to court. In the end, the feds paid big money for a third-party tool that was able to unlock it.

Via: CNET, Business Insider, MacRumors

Source: John Paczkowski‏ (Twitter)
