iPad Pro could be Apple’s next device to use Face ID

It’s safe to assume that the face recognition system in the iPhone X will eventually reach other devices, but which ones are next in line? KGI’s Ming-Chi Kuo might have an idea. The historically accurate analyst expects the next generation of the iPad Pro to adopt the TrueDepth camera and, by extension, Face ID. This would unify the experience across Apple’s mobile devices, the analyst says, and would encourage developers, who could count on face recognition working across multiple Apple devices rather than a single handset. The new iPads would ship sometime in Apple’s fiscal 2018, which ends in September of next year.

There’s another question to be answered: if this happens, will the Touch ID fingerprint reader go away? It’s not so clear. Apple clearly took advantage of eliminating the home button to expand the iPhone X’s screen size, but that’s not as necessary on devices that already have large displays. Also, Apple has typically kept larger bezels on the iPad due to its size — you need at least some space for your thumbs on a device that you can’t easily hold in one hand. We’d add that it could complicate multitasking, since Apple already uses an upward swipe on the iPad’s bottom edge to bring up the app dock. How would you handle that while also using a swipe to go to the home screen?

Whatever happens, it would make sense for the iPad Pro to get face recognition. Apple has made a habit of bringing relatively new features to its higher-end iPads (such as upgraded displays and the Smart Connector), and TrueDepth might be one more reason to spring for a Pro instead of sticking to the base model. And if Apple is going to continue pushing augmented reality, it’ll want tablets that are particularly well-suited to the task, regardless of which camera you’re using.

Source: 9to5Mac

Face ID parts could cause iPhone X shortages

It’s safe to say that people are eagerly anticipating the iPhone X; it represents a step forward in design and tech for Apple. But now, The Wall Street Journal reports that difficulties in manufacturing components crucial to Face ID could lead to significant shortages of the iPhone X.

The components are called Romeo and Juliet, and as their names suggest, they work together in Apple’s face recognition system. Romeo houses the projector that uses a laser beam to create a 3D map of the user’s face, while Juliet’s infrared camera reads that map. According to The Wall Street Journal’s sources, assembling the Romeo module, which requires integrating several parts, was taking longer than assembling its Juliet counterpart. This means there are more Juliets than Romeos.

While one source assured The Wall Street Journal that things were back on track, this is a troubling development for the iPhone X. Initially, rumors swirled around possible shortages of the phone’s OLED display. Coupled with the Face ID component issues, this could mean shortages beyond those we traditionally expect with a new iPhone launch. The iPhone X starts at $999 and will be available for preorder starting October 27th.

Via: Bloomberg

Source: The Wall Street Journal

iOS 11 could use the iPhone’s NFC chip for more than Apple Pay

Apple may have an awkward history of avoiding and then embracing NFC, but new developments at this week’s Worldwide Developers Conference indicate those days are long gone. Apple already announced new NFC functions coming to the Apple Watch with watchOS 4, but according to documents for the upcoming iOS 11 release, the iPhone’s NFC chip might also be handling much more than just Apple Pay transactions and Passbook check-ins.

Although the feature didn’t get any airtime onstage Monday, the iOS 11 beta adds support for Core NFC on the iPhone 7 and 7 Plus (and presumably future hardware as well). In the release docs, Core NFC is described as “a new framework for reading Near Field Communications (NFC) tags and data in NFC Data Exchange Format.” At the moment, the iPhone’s NFC chip is useless for anything other than Apple’s in-house payment system, but the new framework appears to let the chip in the latest iPhones read any tags — not just Apple Pay tags — and take action on them based on the phone’s location.

NFC could open up more ways for iOS apps to communicate with connected devices, and iPhones could also replace NFC-based keycards or transit passes like London’s Oyster card and the Bay Area’s Clipper card. In theory, Core NFC could also enable functions like tap-to-pair Bluetooth speakers — something Android users have been enjoying for a while now — but it’s possible Apple could block such features to keep the “magic” pairing experience limited to AirPods and other devices with its proprietary W1 chip.
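To illustrate what that framework exposes, here’s a minimal Swift sketch of the tag-reading flow, assuming an iOS 11 app that has the NFC reader entitlement; the class name and logging are illustrative, while NFCNDEFReaderSession and its delegate callbacks are the Core NFC pieces the release docs describe.

```swift
import CoreNFC

// A sketch, not production code: scans for NDEF-formatted NFC tags
// and reports their payloads. Class name and logging are illustrative.
class TagReader: NSObject, NFCNDEFReaderSessionDelegate {
    var session: NFCNDEFReaderSession?

    func beginScanning() {
        // invalidateAfterFirstRead ends the session once a tag is read
        session = NFCNDEFReaderSession(delegate: self, queue: nil, invalidateAfterFirstRead: true)
        session?.begin()
    }

    // Called when one or more NDEF messages are read from a tag
    func readerSession(_ session: NFCNDEFReaderSession, didDetectNDEFs messages: [NFCNDEFMessage]) {
        for message in messages {
            for record in message.records {
                print("Read \(record.payload.count) bytes of payload")
            }
        }
    }

    // Called when the session times out, fails or is otherwise invalidated
    func readerSession(_ session: NFCNDEFReaderSession, didInvalidateWithError error: Error) {
        print("Session ended: \(error.localizedDescription)")
    }
}
```

Each detected NDEF message carries one or more records, and it’s those payloads an app would act on — the movie-poster trailer or transit-pass scenarios mentioned above.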

On the other hand, opening up NFC could also invite potential privacy issues onto iOS. Like Bluetooth beacons, NFC tags allow for seamless, location-based interactions, for better or worse. While the ability to tap your phone to a movie poster and instantly bring up the trailer might seem magical, even anonymous data gathered from those sorts of interactions can paint a startlingly clear picture of a consumer.

Via: WCCFTech

Source: Apple Developer

Apple ‘Neural Engine’ chip could power AI on iPhones

Apple’s focused on increasing the speed of every new mobile processor generation, most recently pairing its quad-core A10 Fusion chip with the iPhone 7 and 7 Plus models last September. But to keep its devices competitive, Apple is building a secondary mobile processor dedicated to powering AI.

Sources told Bloomberg that Apple is developing the chips to participate in two key areas of artificial intelligence: augmented reality and self-driving cars. The tech titan’s devices currently split AI tasks between two chips — the main processor and a GPU — but this new one, allegedly known internally as the Apple Neural Engine, has its own module dedicated to AI requests. Offloading those tasks should improve battery life, too.

Unfortunately, it’s unclear if the chip will come out this year. That puts Apple further behind Qualcomm’s latest Snapdragon mobile chips, which already have a dedicated AI module, and Google’s Tensor Processing Units available in its Cloud Platform to do AI heavy lifting.

Apple announced it was deploying its own deep neural networks at last year’s WWDC, but that kind of machine learning happens on server racks, not mobile processors. Unlike the company’s differential privacy methods, which protect data sent to Apple’s servers, the Neural Engine chip would let devices sift through data on their own, which would be faster and easier on the battery, much as the M7 coprocessor did for motion tracking back in 2013.

Source: Bloomberg

Apple acquisition could help Siri make sense of your data

If it wasn’t already clear that Apple is committed to improving AI, it is now. The tech giant has confirmed that it recently bought Lattice Data, a company that uses AI to make sense of unorganized “dark” data like images and text. It’s not discussing what it plans to do with its acquisition, but a TechCrunch source claims that Apple paid $200 million. It’s not a gigantic deal, then, but no small potatoes when only 20 engineers are making the leap. And if that same source is correct, it could be important for Siri — Lattice had reportedly been talking to tech firms about “enhancing their AI assistants.” But what does that mean, exactly?

AI assistants frequently depend on structured data to provide meaningful answers, such as the latest scores for your favorite team or your upcoming calendar events. It’s harder for them to parse the massive amounts of data you generate outside of those neat-and-tidy containers. Lattice could make that data usable, helping Siri handle more of your commands. Need to find some obscure piece of information? You might have a better chance of finding it.

That could be important in the long run, and not just for the usual voice commands on your iPhone or Mac. If you believe rumors, Apple may be close to unveiling a Siri-based speaker. While that device would be unlikely to benefit from any of Lattice’s know-how in the short term (certainly not at WWDC 2017), any eventual upgrades to Siri would improve its ability to compete against rivals like the Amazon Echo series or Google Home. Lattice may not sound like an exciting company on the surface, but its work could be crucial to Apple’s visions for the smart home and beyond.

Source: TechCrunch

Apple’s 10th anniversary iPhone could cost over $1,000

To mark 10 years of metal and glass slabs, Apple is expected to debut an ultra high-end version of the iPhone alongside its next scheduled update. According to a report from Fast Company, Tim Cook and company will likely roll out three new phones this year: the incremental iPhone 7S in the 4.7-inch and 5.5-inch sizes, as well as a slightly larger, even more expensive 5.8-inch iPhone 8 with an edgeless OLED display and a few completely new features.

To really play up the 10th anniversary bit, Apple may even call the new flagship model the “iPhone X,” and the price is expected to shoot up past the $1,000 mark. That’s not too far-fetched by Apple’s standards, considering a maxed-out iPhone 7 Plus already costs $969 unlocked. We’ve heard rumors of an OLED iPhone before, but Fast Company’s sources seem to confirm its existence. The higher-end screen alone is expected to cost Apple twice as much as the LCD displays it currently uses, and with only Samsung’s OLEDs meeting Apple’s strict tolerances, the company is reportedly hogging manufacturing capacity as well. There’s also a chance the iPhone 8/iPhone X will eliminate physical buttons entirely by incorporating the Home button into the screen itself and replacing the side buttons with touch-sensitive inlays in a metal frame with a glass back.

Probably the most interesting rumor about the next-generation iPhone, however, is Apple’s partnership with Lumentum. According to Fast Company‘s sources, Apple plans to incorporate Lumentum’s 3D-sensing technology into the flagship phone in some way — which could mean anything from better camera performance to advanced augmented reality features or even a facial recognition system that could supplement Touch ID. Of course, these features are just rumors at this point, so take them with a big lick of salt for now.

Source: Fast Company

Apple’s next custom Mac chip could do a lot more

Intel processors have powered Apple’s Mac computers for over a decade now, but Apple has also found success designing its own A-series ARM-based chips for the iPhone and iPad. While the company isn’t going to dump Intel chips in the Mac any time soon, a report from Bloomberg indicates that Apple at least intends to dip its toes in the water and test out designing its own silicon for the Mac.

According to Bloomberg’s Mark Gurman and Ian King, Apple is building an ARM-based chip that’ll offload the Mac’s “Power Nap” features from the standard Intel processor as a way to save battery life. Power Nap currently lets the Mac run software updates, download email and calendar updates, sync to iCloud, back up to Time Machine drives and handle a number of other tasks while the computer is asleep. Some of these features only work when plugged in, though — perhaps with a chip that consumes less energy, Power Nap’s capabilities could be expanded.

This could also be a first step towards a move away from Intel processors entirely, although Bloomberg says such a move would not happen in the immediate future. But Apple has invested a lot of money in its own series of chips since 2010 and could have more freedom to update the Mac without having to rely on Intel’s schedule.

It’s worth noting that this rumored Power Nap chip wouldn’t be the first Apple-designed chip to make it into a Mac. That honor would go to the T1, an ARM-based chip that showed up in the new MacBook Pro last fall. That chip controls the laptop’s Touch Bar and the Touch ID sensor but otherwise doesn’t have to do any heavy lifting. Apple has been pretty quiet about the chip, but it seems that the next MacBook Pro could have another ARM chip — maybe the T2? — that takes more tasks away from the main Intel processor. If that’s the case, we likely won’t know for a while, as Apple probably won’t update the MacBook Pro lineup again until this fall.

USB-C’s new audio spec could get rid of your headphone jack

Like it or not, the effort to get rid of the headphone jack is well underway. The USB Implementers Forum has published its long-expected Audio Device Class 3.0 specification, giving device makers the standard they need to pipe sound through USB-C ports on everything from phones to PCs. And the organization isn’t shy about its goals, either — this is mainly about letting companies remove the ages-old 3.5mm port, according to the Forum. In theory, that means slimmer devices, better water resistance and an open “door to innovation” thanks to the room freed up for other features.

We’re not sure everyone will buy that last argument, but there are some advantages to the spec that are worthwhile even if the headphone jack is here to stay. Aside from offering better digital audio support (such as headphones with custom audio processing), the USB-C sound spec improves on earlier USB approaches with power-saving measures and keyword detection. In other words: a company could take advantage of USB audio without hurting your battery life as much as before, and it should be easier to implement voice recognition.

This doesn’t mean that every company will embrace 3.5mm-free hardware with the same enthusiasm as Apple or Motorola. After all, Samsung used its Galaxy Note 7 introduction to make a not-so-subtle dig at Apple’s then-rumored decision to drop the headphone jack on the iPhone 7. However, the USB-C spec may nudge vendors who were thinking about ditching the conventional audio socket and were just waiting for official support to make their move.

Via: AnandTech

Source: USB Implementers Forum (PDF)

Apple logs your iMessage contacts and could share them with police

Apple’s iMessage had a few security holes in March and April that potentially leaked photos and contacts, respectively. Though quickly patched, they are a reminder that the company faces a never-ending arms race to shore up its security to keep malicious hackers and government agencies out. But that doesn’t mean Apple will always be able to keep everything private. A report from The Intercept states that iMessage conversation metadata gets logged on Apple’s servers, which the company could be compelled to turn over to law enforcement by court order. While the content of those messages remains encrypted and out of the police’s hands, these records list time, date, frequency of contact and limited location information.

When an iOS user types in a phone number to begin a text conversation, their device pings Apple’s servers to determine whether the new contact uses iMessage. If not, texts are sent over SMS and appear in green bubbles, while Apple’s proprietary data messages appear in blue ones. Allegedly, Apple logs all of these unseen network requests.

Those logs also include time and date stamps along with the user’s IP address, identifying your location to some degree, according to The Intercept. As with the phone logs of yore, investigators could legally request these records and Apple would be obliged to comply. While the company insisted that iMessage was end-to-end encrypted in 2013, securing user messages even if law enforcement got access, Apple said nothing about metadata.

Apple confirmed to The Intercept that it does comply with subpoenas and other legal requests for these exact logs, but maintained that message content is still kept private. The company’s commitment to user security isn’t really undermined by these revelations (phone companies have been giving this information to law enforcement for decades), but it does illustrate what Apple can and cannot protect. While it resisted FBI requests for backdoor iPhone access earlier this year and then introduced a wholly redesigned file system with built-in, unified encryption on every device, it can’t keep authorities from knowing when and where you text people.

Source: The Intercept

Apple Watch could soon track your sleep and fitness levels

The Apple Watch is billed as a fitness-focused device, but it doesn’t really make sense of fitness data — you’re supposed to interpret the numbers yourself. However, Apple might soon give its wristwear some added smarts. Bloomberg sources claim that the Apple Watch will get apps that track sleeping patterns and fitness levels. It’s not certain how the sleep tracking would work (most likely through motion), but the watch would gauge your fitness by recording the time it takes for your heart rate to drop from its peak to its resting level.
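Apple hasn’t detailed how that measurement would work, but as a rough, hypothetical sketch, the heart rate recovery metric being described could be computed from timestamped readings along these lines; the sample type, resting threshold and function below are assumptions for illustration, not Apple’s implementation.

```swift
import Foundation

// Hypothetical illustration: time for heart rate to fall from its
// workout peak back down to a resting threshold.
struct HeartRateSample {
    let time: Date
    let bpm: Double
}

/// Returns the recovery interval in seconds, or nil if the rate never
/// returned to the resting threshold within the samples provided.
func recoveryTime(samples: [HeartRateSample], restingBPM: Double) -> TimeInterval? {
    // Find the peak reading
    guard let peak = samples.max(by: { $0.bpm < $1.bpm }) else { return nil }
    // Only consider readings taken after the peak
    let afterPeak = samples.filter { $0.time > peak.time }
    // The first reading at or below the resting threshold marks recovery
    guard let recovered = afterPeak.first(where: { $0.bpm <= restingBPM }) else { return nil }
    return recovered.time.timeIntervalSince(peak.time)
}
```

A shorter recovery interval generally indicates better cardiovascular fitness, which is presumably the kind of signal such an app would surface.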

It’s not certain when you’d get the apps. Apple, for its part, hasn’t commented. However, neither of these new features would require new hardware. Sleep tracking wearables have been around for a while, and the fitness measurement would just be a matter of parsing the heart rate data you can get from any Apple Watch.

If real, the move would be part of a broader effort to transform Apple’s overall approach to health. Reportedly, it wants its HealthKit framework to help “improve diagnoses,” not just collect data. You and your doctor could watch out for telltale signs of a condition, or measure your progress on the road to recovery. This would undoubtedly help Apple’s bottom line (you’d have to use at least an iPhone to get this information), but it could also help you make important life decisions.

Via: 9to5Mac

Source: Bloomberg
