Now Apple’s Live Photos can work on any website

Apple first introduced Live Photos with the iPhone 6s line back in 2015, but the odd photo/video-clip hybrid has taken its time coming to the internet. Tumblr was the first to integrate Live Photos into its site last September; now Apple is finally introducing an official JavaScript API to bring the format to the web at large.

Developers add a Live Photo to the page as a DOM element, much like any other photo or video. They can control how long the media plays, or whether it plays only when users hover over a “LIVE” button in the Live Photo’s top-right corner.
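
Apple hadn’t shared full documentation at press time, but based on what the company has described, embedding a Live Photo should look roughly like the sketch below. Treat the CDN URL, data attributes and PlaybackStyle values as assumptions drawn from Apple’s developer materials rather than confirmed specifics.

```html
<!-- Minimal LivePhotosKit JS sketch; the URL and attribute names are assumptions -->
<script src="https://cdn.apple-livephotoskit.com/lpk/1/livephotoskit.js"></script>

<!-- Declarative form: the library finds elements marked data-live-photo -->
<div data-live-photo
     data-photo-src="https://example.com/seaside.jpg"
     data-video-src="https://example.com/seaside.mov"
     style="width: 320px; height: 320px;">
</div>

<script>
  // Programmatic form: create a player, point it at the photo/video pair,
  // and only animate on hover (the "LIVE" button behavior described above).
  var player = LivePhotosKit.Player(document.createElement('div'));
  player.photoSrc = 'https://example.com/seaside.jpg';
  player.videoSrc = 'https://example.com/seaside.mov';
  player.playbackStyle = LivePhotosKit.PlaybackStyle.HINT; // hover-to-play
  document.body.appendChild(player);
</script>
```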

Tumblr was the first to fully integrate the format last fall, but Google was the first to store Live Photos online over a year ago with an update to its Google Photos app for iOS. Then it released Motion Stills for Apple devices in June, an app that converts the hybrid media to looping GIFs or movie files so they can be exported and uploaded anywhere. Even after another update last month, Google’s app will likely remain the most popular way to move Live Photos to the web for a while, since Apple’s API still requires developers to build it into their apps and sites.

Via: 9to5Mac

Source: Apple

iTunes movie rentals finally work across multiple devices

Somehow, Apple went until 2017 before adding one of the most basic features to iTunes. You see, for the past nine years, when you’d rent a movie via the app, you’d have to watch it on the device the rental originated from. So, if you rented Manchester by the Sea on your commute, watched a few minutes and then wanted to finish bumming yourself out on your big screen at home, you were out of luck. With the latest version of iTunes (12.6) and “rent once, watch anywhere,” that’s changed.

Assuming you have iOS 10.3 installed on your iPhone or iPad, and tvOS 10.2 on your Apple TV, the feature should be ready to take for a spin. A caveat, though: As 9to5Mac notes, those OS updates are currently only available in the public beta and developer beta channels, respectively. Once they go wide, the feature itself should follow suit. This isn’t a massive improvement, but hey, neither is a red iPhone 7.

Via: 9to5Mac

Source: Apple

Here’s how the iPhone 7 Plus’ dual cameras could work

Apple’s 2016 iPhone launch event may be just days away, but that isn’t stemming the tide of leaks and rumors. KGI Securities analyst Ming-Chi Kuo (who is frequently, though not always, on the mark with Apple launches) has published a last-minute report claiming very detailed knowledge of Apple’s handset plans, including a few tidbits that have remained unclear. He now says he understands how the larger 5.5-inch model’s (for the sake of reference, the iPhone 7 Plus) long-reported dual rear cameras would work. The two 12-megapixel sensors would reportedly be used for both zoom and “light field camera applications,” which typically means after-shot refocusing. This would be at least somewhat similar to the dual-camera setup on the Huawei P9, where you can play with focal points and simulate different apertures. Huawei doesn’t offer an enhanced zoom, though.
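
To make the refocusing idea concrete: two horizontally offset cameras see the scene from slightly different positions, and the per-pixel shift (disparity) between the two views encodes depth, which is what after-shot refocusing needs. The toy block-matching sketch below is purely illustrative; it is not Apple’s pipeline, and every name in it is made up.

```js
// Toy stereo block matching: estimate per-pixel disparity from two
// grayscale images (Uint8Arrays) taken by horizontally offset cameras.
// Larger disparity means a closer subject (disparity ~ 1 / depth).
function disparityMap(left, right, width, height, block = 5, maxDisp = 32) {
  const half = Math.floor(block / 2);
  const disp = new Uint8Array(width * height);
  for (let y = half; y < height - half; y++) {
    for (let x = half; x < width - half; x++) {
      let best = 0, bestCost = Infinity;
      // Slide a window along the same row of the right image and keep
      // the horizontal shift with the lowest sum of absolute differences.
      for (let d = 0; d <= Math.min(maxDisp, x - half); d++) {
        let cost = 0;
        for (let dy = -half; dy <= half; dy++) {
          for (let dx = -half; dx <= half; dx++) {
            const l = left[(y + dy) * width + (x + dx)];
            const r = right[(y + dy) * width + (x + dx - d)];
            cost += Math.abs(l - r);
          }
        }
        if (cost < bestCost) { bestCost = cost; best = d; }
      }
      disp[y * width + x] = best; // coarse depth cue for refocusing
    }
  }
  return disp;
}
```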

On top of that, Apple would purportedly include higher-quality lenses (with more elements) and extra LED flashes to produce more natural color in low-light photos.

If the report is accurate, you also wouldn’t have to worry quite so much about Apple ditching the headphone jack. Much like Motorola, Apple is supposedly bundling a headphone adapter (in this case, Lightning to 3.5mm) in every iPhone 7 and 7 Plus box on top of native Lightning earbuds. It still wouldn’t be as elegant as a native 3.5mm port (you’d likely have to go wireless to listen to music while you charge), but you wouldn’t have to buy a dongle to keep using your pricey wired headphones.

There’s more. Kuo also hears that the A10 chip powering the new iPhones will run at a much higher 2.4GHz clock speed (the A9 in the iPhone 6s and SE tops out at 1.85GHz). And if you’re the sort who has to get a new color to prove that you have the latest iPhone, it might be your lucky day. The analyst elaborates on a previous rumor by claiming that Apple will replace its seemingly ubiquitous space gray color with “dark black,” and there would even be a glossy “piano black” if you’re feeling ostentatious. Oh, and the purported second speaker grille? That would hold a new sensor to improve Force Touch, though it’s not certain how that would work.

To top it all off, the report also supports a few existing stories. The new iPhones would indeed be water-resistant, surviving depths of 3.3 feet for 30 minutes. And Apple would not only double the base storage, but the mid-tier’s storage as well. You’d be shopping between 32GB, 128GB and 256GB models, much like you do with the iPad Pro. The display resolution won’t be going up, Kuo says (boo!), but you would get the smaller iPad Pro’s wider color range. All told, Apple would be counting on a ton of iterative improvements to get you to upgrade. Even if this isn’t the big redesign you’d hope for, it’d be more than just a modest tune-up.

Source: 9to5Mac

Prisma’s arty photo filters now work offline

There’s a lot going on behind the curtain with Prisma, the app that turns your banal photos into Lichtenstein- or Van Gogh-esque artworks. The app actually sends your cat photo to its servers, where a neural network does the complex transformation. Soon, that will no longer be necessary. “We have managed to implement neural networks to smartphones, which means users will no longer need an internet connection to turn their photos into art pieces,” the company says. Only half of Prisma’s styles (16 of them) will be available offline at first, but others will be added in the “near future.”

Running the algorithms locally will speed things up (depending on your smartphone), help folks with poor internet service and free up valuable CPU cycles on Prisma’s servers. That freed-up server capacity will let the company bring its tech to video in a later release. “Now that we’ve implemented neural networks right to the smartphones, we have enough servers capacity to run full videos on them in the near future,” Prisma adds.
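
Prisma hasn’t published any implementation details, but the general shape of on-device style transfer is a small feed-forward network that repaints an image in a single pass. Here’s a hedged sketch of that inference flow using TensorFlow.js; the model URL is hypothetical, and Prisma’s actual stack is unknown.

```js
import * as tf from '@tensorflow/tfjs';

// Hypothetical pre-trained feed-forward style-transfer network, converted
// to TensorFlow.js format. Prisma's real model and format are not public;
// this only illustrates the on-device inference flow.
const MODEL_URL = 'https://example.com/models/mosaic-style/model.json';

async function stylize(inputCanvas, outputCanvas) {
  const model = await tf.loadGraphModel(MODEL_URL);

  const output = tf.tidy(() => {
    // Read pixels, scale to [0, 1] and add a batch dimension.
    const input = tf.browser.fromPixels(inputCanvas)
      .toFloat()
      .div(255)
      .expandDims(0);
    // One forward pass repaints the photo in the trained style --
    // no server round-trip required.
    return model.predict(input).squeeze().clipByValue(0, 1);
  });

  await tf.browser.toPixels(output, outputCanvas);
  output.dispose();
}
```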

Prisma claims it’s the first to implement neural network tech on a smartphone, and that “no team or company has ever done anything close.” That, it says, opens up AI to developers without access to server farms, meaning “we will see [a lot more] new products based on neural networks.” Companies like Google and Apple may beg to differ, as they have already implemented smartphone AI for translation, voice recognition and more.

Some 52 million folks have installed Prisma and 4 million use it daily, according to the company. Much as Snapchat has done, it plans to monetize the app via brand filters while keeping it free for users. Offline processing speed depends on which smartphone you have: Prisma says “it takes three seconds for the iPhone 6 to repaint a photo and 2.5 seconds for the iPhone 6s.” The new features will arrive on iOS shortly and hit Android after that.
