Intel faces multiple lawsuits over chip security vulnerabilities

Intel is already facing multiple lawsuits over the chip security flaws revealed earlier this week. Gizmodo reports that three have been filed so far — in California, Oregon and Indiana. All three are class-action complaints that cite Intel’s delay in disclosing the vulnerabilities — it knew about them for months — as well as the reduced performance caused by subsequent security patches. The Register reported that PC slowdowns could range from 5 to 30 percent, but Intel has said that its fixes’ impact is “highly workload-dependent” and shouldn’t be noticeable for the typical user.

It’s still early — the flaws were only officially revealed on Wednesday — so Intel could be facing more lawsuits going forward. In the week following Apple’s admission that it intentionally slows older iPhone models to prevent sudden shutdowns, the company was hit with a number of lawsuits in multiple countries.

Intel says 90 percent of affected chips should be patched by the end of the week while companies like Microsoft, Google and Apple are also releasing updates to mitigate the effects of the Spectre and Meltdown vulnerabilities.
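Curious whether a given machine has the fixes applied? On Linux, patched kernels report per-vulnerability mitigation status through sysfs; here's a minimal sketch, assuming a kernel recent enough to carry the reporting patches (the directory simply doesn't exist on older kernels):

```shell
#!/bin/sh
# Patched Linux kernels expose Meltdown/Spectre status as one sysfs file
# per vulnerability, e.g. "Mitigation: PTI" for Meltdown once page-table
# isolation is active.
dir=/sys/devices/system/cpu/vulnerabilities
if [ -d "$dir" ]; then
    for f in "$dir"/*; do
        printf '%s: %s\n' "$(basename "$f")" "$(cat "$f")"
    done
else
    echo "kernel predates vulnerability reporting (or sysfs is unavailable)"
fi
```

The page-table-isolation (PTI) mitigation shown for Meltdown is the kernel-level change behind the slowdown estimates mentioned above.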

Via: The Verge

Engadget RSS Feed

Google poaches a key Apple chip designer

Google is still snapping up Apple’s chip design talent as part of its ongoing quest to create custom processors. The Information has learned that the search giant has hired John Bruno, the designer who founded and ran Apple’s silicon competitive analysis group — that is, the team that helped iPhone and iPad processors stay ahead of rivals. It’s not certain what he’ll be doing at Google (his LinkedIn profile lists him only as a “System Architect”), but he started at graphics veteran ATI and rose to become a chief engineer at AMD, where he led the design of Fusion processors.

It’s reasonable to presume that the influx of new talent (which also includes veterans from Qualcomm) will be used to expand Google’s variety of custom processors. Right now, its only in-house silicon is the Pixel Visual Core imaging chip inside the Pixel 2. The question is just what Google will do with Bruno and others. It’s tempting to assume that its next step is a full-fledged CPU for its phones, but given Bruno’s background in graphics, it could also produce other specialized chips (such as AI accelerators or display controllers).

Whatever Bruno works on, it’s evident that Google is committed to giving its phones (and possibly other devices) hardware that stands out. It’s not hard to see why it would go that route. Up until the Pixel 2, Google’s Pixel and Nexus phones only occasionally stood out hardware-wise and frequently used parts you could find in competing models. You bought them mainly for the software (such as pure Android or the Pixel’s HDR+ camera mode), and any hardware perks were just gravy. If Google can design chips that are genuinely faster or more efficient than what you find in competing products, you may have a good reason to choose a Pixel even if you only care about raw performance.

Source: The Information, LinkedIn

A dedicated AI chip is squandered on Huawei’s Mate 10 Pro

Let’s face it: The AI hype train isn’t going away, and soon all our devices will be run by artificial intelligence. While Apple’s answer to the AI takeover is to just call its new A11 processor “Bionic,” Huawei has taken a more concrete approach. The company embedded a neural processing unit (NPU) on its Kirin 970 chip, which it claims can run AI tasks faster and with less power than rival silicon. The newly launched Mate 10 Pro is the first phone to use the Kirin 970, and it’s meant to demonstrate the wonders of deeply embedded AI. So far, though, it’s a capable, well-designed phone that has yet to fully explore what a dedicated NPU can do.

When Huawei asked a group of reviewers what we wanted from AI, I didn’t have a real answer, though my peers pointed out things like natural linguistics and battery management. But after a few days with the Mate 10 Pro, I’ve realized what I want.

My ideal AI would basically be able to predict what I wanted based on how and when I’m using my phone. For example, if I’m holding my phone up at eye level in my apartment at about the same time every day, I’m most likely starting one of my daily selfie sprees. It should know then to automatically activate (or at least suggest) the Portrait mode on my front camera and even take a series of photos when I push one button. It gets tiring having to keep pressing the volume down button to take dozens of pictures.

The Mate 10 Pro doesn’t live up to my unrealistic expectations, but it marks a step in the right direction. The phone can recognize things you’re pointing the camera at, like food, pets, flowers or buildings, and adjusts settings like ISO, shutter speed and saturation to make your photos look good. For now, the Mate 10 Pro identifies only 13 scenes, but Huawei says it will continue adding situations that the phone will recognize.

In other words, the Mate 10 Pro is smart enough to be both camera and photographer. That is, in theory. While the Mate 10 Pro does take lovely pictures that are bright, sharp and accurately colored, I suspect that has more to do with its camera hardware than clever AI. The two cameras on its rear both feature an aperture of f/1.6 — the widest yet on a smartphone (tied with the LG V30). That hardware not only allows for clearer pictures in low light, but also creates a pleasantly shallow depth of field.

When I compared pictures I took in manual mode to those where the AI decided what settings to use, I had a hard time seeing a difference. My photos of flowers appeared just as saturated whether the AI was at work or not, and the depth of field looked the same either way. The main difference I saw was a stronger bokeh effect applied by the AI. I guess this is kind of the point — the AI is as good as I, a human, am at determining the best settings.

Although the Mate 10 Pro’s tweaks aren’t very noticeable, its scene-recognition is mostly quick and accurate. However, some situations stumped the Mate 10 Pro, like my messy dinner of chicken covered with onions and peppers in a chili paste. Then there are the many objects that the phone can’t identify yet — like a group of players on a basketball court or a pair of pretty shoes. Huawei also needs more data before the phone can learn the best settings for those situations — whether it be bumping up the shutter speed to capture fast-moving soccer balls or producing shallower depth of field around shoes. The company said it will keep analyzing pictures (not user-generated) in the cloud and push out software updates to continually improve its camera software. No, Huawei isn’t spying on your photos — these are pictures it got elsewhere (the company hasn’t told us the source).

The AI is absent from the front camera, but I still loved the selfies I took with the Mate 10 Pro. Huawei’s Portrait Mode uses face detection instead of the iPhone X’s depth sensing, which produces a softer depth-of-field effect that’s sometimes less defined than Apple’s. But the pictures from Huawei’s phone are more flattering. The iPhone X’s Portrait Mode selfies are so sharp that every imperfection and stray hair is obvious.

The primary benefit of having a dedicated neural processing unit alongside the phone’s CPU is that machine-learning tasks can be executed more quickly. Things like image recognition or language translation can be carried out in tandem with other general functions, so your phone shouldn’t slow down just to find the 3,500th picture of your cat’s face. With Huawei’s Kirin 970 chip, app developers can tap into the NPU by using either the Kirin API or popular machine-learning frameworks like Google’s TensorFlow or Facebook’s Caffe2.

The problem is, not many apps have done this. So far, only Huawei’s own camera software and Microsoft Translator tap the NPU for improved performance. The latter comes preinstalled in the Mate 10 Pro, by the way, and only its image-based translating tool is optimized right now. I took a picture of the phrase “You’re so pretty” in Mandarin, and barely a second later Translator told me it meant “You’re beautiful.” Close enough. Subsequent attempts with the same printout yielded dubious results, though, with the app often translating the words to “Hello, Drift.” This is more likely an issue with Microsoft’s engine than the Mate 10 Pro.

I tried the same thing out on a Galaxy Note 8 and an iPhone 8 Plus. All three phones performed within half a second of each other — with the Huawei frequently finishing the fastest. Sometimes the iPhone took the lead, but for the most part none of them lagged far behind the rest.

Aside from its camera and the Translator app, the Mate 10 Pro also uses AI to learn your habits over time so it can pre-allocate resources to the apps it thinks you’ll launch next. From my few days using the phone, it’s hard to judge how effective this has been, but the Mate 10 Pro certainly keeps up with my incessant selfie-taking, Instagram bingeing and light emailing.

So far, the Mate 10 Pro has too few AI integrations for me to really notice the benefits of a dedicated NPU. It’s a sleekly designed handset, though, and I love showing off the attractive “Signature” stripe on its elegant, shiny rear. The epic battery life is also a bonus. It easily gets through two days on a charge, and I can go four days without plugging it in under extremely light usage. I wish its display were sharper than 1080p, but that’s a minor complaint. Since Huawei hasn’t shared the US price and availability, I can’t definitively say if the Mate 10 Pro is a better deal than its competitors. But it’s an intriguing preview of the good that can come from a phone powered by AI.

Intel accuses Qualcomm of abusing its mobile chip monopoly

Qualcomm’s dispute with Apple over patents on its chip tech recently took a nasty turn when it asked the US International Trade Commission (ITC) to ban iPhone sales in the US. In response to a request from the ITC, Intel has now made its own statement, accusing Qualcomm of abusing its monopoly position and not licensing “standard-essential” patents at a fair rate as required by law. Of course, if anyone knows how to spot abuse of a chip monopoly, it’s Intel.

The dispute started when Apple sued Qualcomm for “abusing its clout” in the mobile chip industry. It got more combative when Apple began withholding patent royalties via suppliers like Foxconn. Apple argued that Qualcomm hasn’t licensed its tech under “fair, reasonable and non-discriminatory” terms, claiming that Qualcomm charges it five times more than all of Apple’s other cellular patent licensors combined. (The US Federal Trade Commission filed a separate lawsuit against Qualcomm, and both South Korea and China have slapped hefty fines on the company over its trade practices.)

Qualcomm, for its part, said Apple “unilaterally declared the contract terms unacceptable; the same terms that have applied to iPhones and cellular-enabled iPads for a decade.” It then turned around and sued Apple’s suppliers that use Qualcomm patents to claw back the royalties.

Intel got involved in the dispute because its LTE modems, used in some of Apple’s latest iPhones, allegedly infringe on Qualcomm patents. Qualcomm has demanded that Apple replace those parts with chips that use its own baseband modems. (Intel’s LTE modem is reportedly used on GSM-radio iPhone 7 and 7 Plus models from T-Mobile, AT&T and the rest of the world, while Qualcomm radios are used on CDMA models by Sprint and Verizon in the US).

Apple’s iPhone 7 and 7 Plus (AOL)

Intel accused Qualcomm of further anti-competitive practices — namely, offering Apple lower licensing fees for using its chips exclusively. “These arrangements foreclosed rivals like Intel from competing for Apple’s vital business,” it said.

Intel believes that Qualcomm has a more sinister aim with its Apple patent dispute: Crushing Intel in the mobile chip market. “Qualcomm did not initiate this investigation to stop the alleged infringement of its patent rights; rather, its complaint is a transparent effort to stave off lawful competition from Qualcomm’s only remaining rival,” it states.

Overall, Intel believes that the ITC needs to consider Qualcomm’s pattern of what it calls “abusive” legal practices. “This twisted use of the Commission’s process is just the latest in a long line of anticompetitive strategies that Qualcomm has used to quash incipient and potential competitors and avoid competition on the merits.”

There’s a lot of irony in this, of course. In the ITC claim, Intel depicts itself as the victim of a mobile chip monopolist, even though it was fined $1.4 billion by the EU for abusing its own PC chip monopoly at AMD’s expense. Nevertheless, its claim to be an underdog is effectively correct: next to Qualcomm, Intel has a pitiful share of the mobile chip market. The ITC is set to study the complaint in August, and a trial is expected sometime next year.


Source: Intel (PDF)

iOS 11 could use the iPhone’s NFC chip for more than Apple Pay

Apple may have an awkward history of first avoiding and then embracing NFC, but new developments at this week’s Worldwide Developers Conference indicate those days are long gone. Apple already announced new NFC functions coming to the Apple Watch with watchOS 4, but according to documents for the upcoming iOS 11 release, the iPhone’s NFC chip might also handle much more than just Apple Pay transactions and Passbook check-ins.

Although the feature didn’t get any airtime onstage Monday, the iOS 11 beta adds support for Core NFC to the iPhone 7 and 7 Plus (and presumably future hardware as well). In release docs, Core NFC is described as “a new framework for reading Near Field Communications (NFC) tags and data in NFC Data Exchange Format.” At the moment, the iPhone’s NFC chip is useless for anything other than Apple’s in-house payment system, but the new framework appears to let the chip in the latest iPhones read any tags — not just Apple Pay tags — and take action on them based on the phone’s location. NFC could open up more ways for iOS apps to communicate with connected devices, and iPhones could also replace NFC-based keycards or transit passes like London’s Oyster card and the Bay Area’s Clipper card. In theory, Core NFC could also enable functions like tap-to-pair Bluetooth speakers — something Android users have been enjoying for a while now — but it’s possible Apple could block such features to keep the “magic” pairing experience limited to AirPods and other devices with its proprietary W1 chip.
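Since Core NFC’s job is handing apps NDEF payloads, it helps to see how simple the underlying format actually is. Here’s a minimal sketch of parsing a short-form NDEF Text record — written in Python purely for illustration (on a device, these bytes would come from Core NFC’s reader session), with a hand-built sample record:

```python
def parse_ndef_text_record(data: bytes) -> tuple[str, str]:
    """Parse a single short-form NDEF well-known Text record.

    Layout: header byte, type length, payload length (1 byte when the
    SR/short-record flag is set), type field, then the payload.
    """
    header = data[0]
    assert header & 0x10, "only short records are handled in this sketch"
    assert header & 0x07 == 0x01, "expected TNF=0x01 (NFC Forum well-known type)"
    type_len = data[1]
    payload_len = data[2]
    type_field = data[3:3 + type_len]
    assert type_field == b"T", "expected a Text record"
    payload = data[3 + type_len:3 + type_len + payload_len]
    status = payload[0]
    lang_len = status & 0x3F                      # low 6 bits: language-code length
    encoding = "utf-16" if status & 0x80 else "utf-8"
    lang = payload[1:1 + lang_len].decode("ascii")
    text = payload[1 + lang_len:].decode(encoding)
    return lang, text

# Hand-built record: MB|ME|SR flags + TNF=well-known, type "T", "en" + "hello"
record = bytes([0xD1, 0x01, 0x08, 0x54, 0x02]) + b"en" + b"hello"
print(parse_ndef_text_record(record))  # → ('en', 'hello')
```

A tag read this way could carry a URL, a Wi-Fi config or an app-specific payload — which is exactly the kind of data the new framework would hand to third-party apps.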

On the other hand, opening up NFC could also invite potential privacy issues onto iOS. Like Bluetooth beacons, NFC tags allow for seamless, location-based interactions — for better or worse. While the ability to tap your phone to a movie poster and instantly bring up the trailer might seem magical, even anonymous data gathered from those sorts of interactions can paint a startlingly clear picture of a consumer.


Via: WCCFTech

Source: Apple Developer

Apple ‘Neural Engine’ chip could power AI on iPhones

Apple has focused on increasing the speed of every new mobile processor generation, most recently pairing its quad-core A10 Fusion chip with its iPhone 7 and 7 Plus models last September. But to keep its devices competitive, Apple is reportedly building a secondary mobile processor dedicated to powering AI.

Sources told Bloomberg that Apple is developing the chips to participate in two key areas of artificial intelligence: Augmented reality and self-driving cars. The tech titan’s devices currently split AI tasks between two chips — the main processor and a GPU — but this new one, allegedly known internally as the Apple Neural Engine, has its own module dedicated to AI requests. Offloading those tasks should improve battery life, too.

Unfortunately, it’s unclear if the chip will come out this year. That puts Apple further behind Qualcomm’s latest Snapdragon mobile chips, which already have a dedicated AI module, and Google’s Tensor Processing Units available in its Cloud Platform to do AI heavy lifting.

Apple announced it was deploying its own deep neural networks at last year’s WWDC, but that kind of machine learning happens on server racks, not mobile processors. Unlike the company’s differential privacy methods, which protect data sent to Apple’s servers, the Neural Engine chip would let devices sift through data on their own, which would be faster and easier on the battery — just as the M7 coprocessor did for motion data back in 2013.

Source: Bloomberg

Samsung’s chip business kept things looking up to start 2017

Samsung’s Q1 2017 earnings are in, showing the company’s highest quarterly profit since Q3 2013. That’s despite the Galaxy Note 7 recall and the markdown on its Galaxy Note 7 inventory, apparently because the company’s chip business (making memory, processors and camera sensors for phones) is booming. As a company, it brought home the expected $8.75 billion in operating profit and looks forward to better results next quarter, since that period will include sales of the new Galaxy S8 phones.

On a call with reporters, execs reaffirmed that the reddish tint reported on some S8 screens is a “natural difference” in the OLED technology, one that a software update will let users tweak. Samsung also mentioned “the launch of a new flagship smartphone in the second half,” but didn’t attach the Galaxy Note name to whatever that presumably large-screened device will be. It also didn’t play into any expectations for an OLED iPhone that it could supply screens for, simply saying that “YoY revenue growth in the OLED business is forecast on the back of increased flexible panel shipments in the second half.”

Source: Samsung

Apple’s next custom Mac chip could do a lot more

Intel processors have powered Apple’s Mac computers for over a decade now, but Apple has also found success designing its own A-series ARM-based chips for the iPhone and iPad. While the company isn’t going to dump Intel chips in the Mac any time soon, a report from Bloomberg indicates that Apple at least intends to dip its toe in the water and test out designing its own silicon for the Mac.

According to Bloomberg’s Mark Gurman and Ian King, Apple is building an ARM-based chip that’ll offload the Mac’s “Power Nap” features from the standard Intel processor as a way to save battery life. Power Nap currently lets the Mac run software updates, download email and calendar updates, sync to iCloud, back up to Time Machine drives and handle a number of other tasks while the computer is asleep. Some of these features only work when plugged in, though — perhaps with a chip that consumes less energy, Power Nap’s capabilities could be expanded.

This could also be a first step towards a move away from Intel processors entirely, although Bloomberg says such a move would not happen in the immediate future. But Apple has invested a lot of money in its own series of chips since 2010 and could have more freedom to update the Mac without having to rely on Intel’s schedule.

It’s worth noting that this rumored Power Nap chip wouldn’t be the first Apple-designed chip to make it into a Mac. That honor goes to the T1, an ARM-based chip that showed up in the new MacBook Pro last fall. That chip controls the laptop’s Touch Bar and the Touch ID sensor but otherwise doesn’t have to do any heavy lifting. Apple has been pretty quiet about the chip, but it seems the next MacBook Pro could have another ARM chip — maybe the T2? — that takes more tasks away from the main Intel processor. If that’s the case, we likely won’t know for a while, as Apple probably won’t update the MacBook Pro lineup again until this fall.
