It’s safe to assume that the face recognition system in the iPhone X will eventually reach other devices, but which ones are next in line? KGI’s Ming-Chi Kuo might have an idea. The historically accurate analyst expects the next generation of the iPad Pro to adopt the TrueDepth camera and, by extension, Face ID. This would unify the experience across Apple’s mobile devices, the analyst says, and would encourage developers, who would know they could use face recognition across multiple Apple devices, not just one handset. The new iPads would ship sometime in Apple’s fiscal 2018, which ends in September of next year.
There’s another question to be answered: if this happens, will the Touch ID fingerprint reader go away? It’s not so clear. Apple clearly took advantage of eliminating the home button to expand the iPhone X’s screen size, but that’s not as necessary on devices that already have large displays. Also, Apple has typically kept larger bezels on the iPad due to its size — you need at least some space for your thumbs on a device that you can’t easily hold in one hand. We’d add that it could complicate multitasking, since Apple already uses an upward swipe on the iPad’s bottom edge to bring up the app dock. How would you handle that while also using a swipe to go to the home screen?
Whatever happens, it would make sense for the iPad Pro to get face recognition. Apple has made a habit of bringing relatively new features to its higher-end iPads (such as upgraded displays and the Smart Connector), and TrueDepth might be one more reason to spring for a Pro instead of sticking to the base model. And if Apple is going to continue pushing augmented reality, it’ll want tablets that are particularly well-suited to the task regardless of the camera you’re using.
Engadget RSS Feed
It’s safe to say that people are eagerly anticipating the iPhone X; it represents a step forward in design and tech for Apple. But now, The Wall Street Journal reports that difficulties in manufacturing components crucial to Face ID could lead to significant shortages of the iPhone X.
The components are called Romeo and Juliet, and as their names suggest, they work together in Apple’s face recognition system. Romeo houses the projector that uses a laser to create a 3D map of the user’s face, while Juliet’s infrared camera reads that map. According to The Wall Street Journal’s sources, assembling the Romeo module, which involves integrating its various parts, was taking longer than assembling its Juliet counterpart. This means there are more Juliets than Romeos.
While one source assured The Wall Street Journal that things were back on track, this is a troubling development for the iPhone X. Rumors had already swirled about possible shortages of the phone’s OLED display. Coupled with the Face ID component issues, this could mean shortages beyond those we traditionally expect around a new iPhone launch. The iPhone X starts at $999 and will be available for preorder starting October 27th.
Source: The Wall Street Journal
Apple may have an awkward history with NFC, avoiding and then embracing the technology, but new developments at this week’s Worldwide Developers Conference indicate those days are long gone. Apple already announced new NFC functions coming to the Apple Watch with watchOS 4, but according to documents for the upcoming iOS 11 release, the iPhone’s NFC chip might also be handling much more than just Apple Pay transactions and Passbook check-ins.
Although the feature didn’t get any airtime onstage Monday, iOS 11 Beta adds support for Core NFC to the iPhone 7 and 7 Plus. (And presumably future hardware as well.) In release docs, Core NFC is described as “a new framework for reading Near Field Communications (NFC) tags and data in NFC Data Exchange Format.” At the moment, the iPhone’s NFC chip is useless for anything other than Apple’s in-house payment system, but the new framework appears to let the chip in the latest iPhones read any tags — not just Apple Pay tags — and take action on them based on the phone’s location. NFC could open up more ways for iOS apps to communicate with connected devices, and iPhones could also replace NFC-based keycards or transit passes like London’s Oyster card and the Bay Area’s Clipper card. In theory, Core NFC could also enable functions like tap-to-pair Bluetooth speakers — something Android users have been enjoying for a while now — but it’s possible Apple could block such features to keep the “magic” pairing experience limited to AirPods and other devices with its proprietary W1 chip.
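The NFC Data Exchange Format (NDEF) mentioned in those release docs is just a compact binary record layout. As a rough illustration, here’s a minimal Python sketch that decodes a single short NDEF Text record; the byte layout follows the published NDEF format, but the helper name and the single-short-record assumption are ours, not anything from Apple’s framework.

```python
def parse_ndef_text_record(data: bytes) -> str:
    """Decode one short-record (SR=1) NDEF Text record and return its text.

    Layout: flags, type length, payload length, [ID length], type, [ID], payload.
    For a Text record the payload starts with a status byte (encoding bit plus
    language-code length), then the language code, then the text itself.
    """
    flags = data[0]
    assert flags & 0x10, "this sketch only handles short records (SR flag set)"
    type_len = data[1]
    payload_len = data[2]
    offset = 3
    # An ID length byte is only present when the IL flag is set
    id_len = 0
    if flags & 0x08:
        id_len = data[offset]
        offset += 1
    rec_type = data[offset:offset + type_len]
    offset += type_len + id_len  # skip past the type and optional ID fields
    payload = data[offset:offset + payload_len]
    assert rec_type == b"T", "not a Text record"
    status = payload[0]
    lang_len = status & 0x3F              # low bits: language-code length
    encoding = "utf-16" if status & 0x80 else "utf-8"  # high bit: encoding
    return payload[1 + lang_len:].decode(encoding)
```

Feeding it the bytes of a tag that stores the English text “Hi” (`b"\xd1\x01\x05T\x02enHi"`) returns the string `"Hi"`; Core NFC would hand an app equivalent records as `NFCNDEFMessage` objects rather than raw bytes.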
On the other hand, opening NFC could also invite potential privacy issues onto iOS. Like Bluetooth beacons, NFC tags allow for seamless, location-based interactions, for better or worse. While the ability to tap your phone to a movie poster and instantly bring up the trailer might seem magical, even anonymous data gathered from those sorts of interactions can paint a startlingly clear picture of a consumer.
Source: Apple Developer
Apple’s focused on increasing the speed of every new mobile processor generation, most recently pairing its quad-core A10 Fusion chip with its iPhone 7 and 7 Plus models last September. But to keep its devices competitive, Apple is building a secondary mobile processor dedicated to powering AI.
Sources told Bloomberg that Apple is developing the chips to participate in two key areas of artificial intelligence: Augmented reality and self-driving cars. The tech titan’s devices currently split AI tasks between two chips — the main processor and a GPU — but this new one, allegedly known internally as the Apple Neural Engine, has its own module dedicated to AI requests. Offloading those tasks should improve battery life, too.
Unfortunately, it’s unclear if the chip will come out this year. That puts Apple further behind Qualcomm’s latest Snapdragon mobile chips, which already have a dedicated AI module, and Google’s Tensor Processing Units available in its Cloud Platform to do AI heavy lifting.
Apple announced it was deploying its own deep neural networks at last year’s WWDC, but that kind of machine learning happens on server racks, not mobile processors. Unlike the company’s differential privacy methods protecting data sent to Apple’s servers, the Neural Engine chip would let devices sift through data on their own, which would be faster and easier on the battery, just like the M7 processors did for motion back in 2013.
If it wasn’t already clear that Apple is committed to improving AI, it is now. The tech giant has confirmed that it recently bought Lattice Data, a company that uses AI to make sense of unorganized “dark” data like images and text. It’s not discussing what it plans to do with its acquisition, but a TechCrunch source claims that Apple paid $200 million. It’s not a gigantic deal, then, but no small potatoes when only 20 engineers are making the leap. And if that same source is correct, it could be important for Siri — Lattice had reportedly been talking to tech firms about “enhancing their AI assistants.” But what does that mean, exactly?
AI assistants frequently depend on structured data to provide meaningful answers, such as the latest scores for your favorite team or your upcoming calendar events. It’s harder for them to parse the massive amounts of data you generate outside of those neat-and-tidy containers. Lattice could make that data usable, helping Siri handle more of your commands. Need to find some obscure piece of information? You might have a better chance of finding it.
That could be important in the long run, and not just for the usual voice commands on your iPhone or Mac. If you believe rumors, Apple may be close to unveiling a Siri-based speaker. While that device would be unlikely to benefit from any of Lattice’s know-how in the short term (certainly not at WWDC 2017), any eventual upgrades to Siri would improve its ability to compete against rivals like the Amazon Echo series or Google Home. Lattice may not sound like an exciting company on the surface, but its work could be crucial to Apple’s visions for the smart home and beyond.
To mark 10 years of metal and glass slabs, Apple is expected to debut an ultra high-end version of the iPhone alongside its next scheduled update. According to a report from Fast Company, Tim Cook and company will likely roll out three new phones this year: the incremental iPhone 7S in the 4.7-inch and 5.5-inch sizes, as well as a slightly larger, even more expensive 5.8-inch iPhone 8 with an edgeless OLED display and a few completely new features.
To really play up the 10th anniversary bit, Apple may even call the new flagship model the “iPhone X,” and the price is expected to shoot up past the $1,000 mark. That’s not too far-fetched by Apple’s standards, considering a maxed-out iPhone 7 Plus already costs $969 unlocked. We’ve heard rumors of an OLED iPhone before, but Fast Company‘s sources seem to confirm its existence. The higher-end screen alone is expected to cost Apple twice as much as the LCD displays it currently uses, and with only Samsung’s OLEDs meeting Apple’s strict tolerances, the company is reportedly buying up manufacturing capacity as well. There’s also a chance the iPhone 8/iPhone X will eliminate physical buttons entirely by incorporating the Home button into the screen itself and replacing the side buttons with touch-sensitive inlays in a metal frame with a glass back.
Probably the most interesting rumor about the next-generation iPhone, however, is Apple’s partnership with Lumentum. According to Fast Company‘s sources, Apple plans to incorporate Lumentum’s 3D-sensing technology into the flagship phone in some way — which could mean anything from better camera performance to advanced augmented reality features or even a facial recognition system that could supplement Touch ID. Of course, these features are just rumors at this point, so take them with a big grain of salt for now.
Source: Fast Company
Intel processors have powered Apple’s Mac computers for over a decade now, but Apple has also found success designing its own A-series ARM-based chips for the iPhone and iPad. While the company isn’t going to dump Intel chips in the Mac any time soon, a report from Bloomberg indicates that Apple at least intends to dip a toe in the water and test out designing its own silicon for the Mac.
According to Bloomberg’s Mark Gurman and Ian King, Apple is building an ARM-based chip that’ll offload the Mac’s “Power Nap” features from the standard Intel processor as a way to save battery life. Power Nap currently lets the Mac run software updates, download email and calendar updates, sync to iCloud, back up to Time Machine drives and a number of other features while the computer is asleep. Some of these features only work when plugged in, though — perhaps with a chip that consumes less energy, Power Nap’s capabilities could be expanded.
This could also be a first step towards a move away from Intel processors entirely, although Bloomberg says such a move would not happen in the immediate future. But Apple has invested a lot of money in its own series of chips since 2010 and could have more freedom to update the Mac without having to rely on Intel’s schedule.
It’s worth noting that this rumored Power Nap chip wouldn’t be the first Apple-designed chip to make it into a Mac. That honor would go to the T1, an ARM-based chip that showed up in the new MacBook Pro last fall. That chip controls the laptop’s Touch Bar and the Touch ID sensor but otherwise doesn’t have to do any heavy lifting. Apple has been pretty quiet about the chip, but it seems that the next MacBook Pro could have another ARM chip — maybe the T2? — that takes more tasks away from the main Intel processor. If that’s the case, we probably won’t know for a while, as Apple probably won’t update the MacBook Pro lineup again until this fall.
Like it or not, the effort to get rid of the headphone jack is well underway. The USB Implementers Forum has published its long-expected Audio Device Class 3.0 specification, giving device makers the standard they need to pipe sound through USB-C ports on everything from phones to PCs. And the organization isn’t shy about its goals, either — this is mainly about letting companies remove the ages-old 3.5mm port, according to the Forum. In theory, that means slimmer devices, better water resistance and opening the “door to innovation” by freeing up room for other features.
We’re not sure everyone will buy that last argument, but there are some advantages to the spec that are worthwhile even if the headphone jack is here to stay. Aside from offering better digital audio support (such as headphones with custom audio processing), the USB-C sound spec improves on earlier USB approaches with power-saving measures and keyword detection. In other words: a company could take advantage of USB audio without hurting your battery life as much as before, and it should be easier to implement voice recognition.
This doesn’t mean that every company will embrace 3.5mm-free hardware with the same enthusiasm as Apple or Motorola. After all, Samsung used its Galaxy Note 7 introduction to make a not-so-subtle dig at Apple’s then-rumored decision to drop the headphone jack on the iPhone 7. However, the USB-C spec may nudge vendors who were thinking about ditching the conventional audio socket and were just waiting for official support to make their move.
Source: USB Implementers Forum (PDF)
The Apple Watch is billed as a fitness-focused device, but it doesn’t really make sense of fitness data — you’re supposed to interpret the numbers yourself. However, Apple might soon give its wristwear some added smarts. Bloomberg sources claim that the Apple Watch will get apps that track sleeping patterns and fitness levels. It’s not certain how the sleep tracking would work (most likely through motion), but the watch would gauge your fitness by recording the time it takes for your heart rate to drop from its peak to its resting level.
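That fitness gauge is essentially a heart-rate-recovery measurement: how quickly your pulse falls from its exercise peak back to its resting level. As a purely hypothetical sketch of the idea (not Apple’s implementation, and the function name and sample format are our own), it could be computed from timestamped heart-rate samples like this:

```python
def recovery_time(samples, resting_bpm):
    """Return seconds from the peak heart rate until it first falls back
    to the resting level, or None if it never does during the samples.

    `samples` is a time-ordered list of (seconds, bpm) pairs, such as the
    readings a watch's optical sensor might log after a workout.
    """
    # Find when the heart rate peaked
    peak_t, _ = max(samples, key=lambda s: s[1])
    # Scan forward from the peak for the first reading at or below resting
    for t, bpm in samples:
        if t >= peak_t and bpm <= resting_bpm:
            return t - peak_t
    return None
```

A faster drop (a smaller returned value) generally indicates better cardiovascular fitness, which is presumably the signal such an app would track over time.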
It’s not certain when you’d get the apps. Apple, for its part, hasn’t commented. However, neither of these new features would require new hardware. Sleep tracking wearables have been around for a while, and the fitness measurement would just be a matter of parsing the heart rate data you can get from any Apple Watch.
If real, the move would be part of a broader effort to transform Apple’s overall approach to health. Reportedly, it wants its HealthKit framework to help “improve diagnoses,” not just collect data. You and your doctor could watch out for telltale signs of a condition, or measure your progress on the road to recovery. This would undoubtedly help Apple’s bottom line (you’d have to use at least an iPhone to get this information), but it could also help you make important life decisions.