iOS 11 preview: Full of promise, especially on bigger screens

As always, Apple spent a considerable chunk of WWDC earlier this month hyping up iOS 11 and all of the new features it brings. Now it’s your turn to take them for a spin. The first public release of the iOS 11 beta goes live today for people participating in Apple’s testing program, and we’ve been playing with it for a few days to get a better sense of what it has to offer. Long story short, it’s already shaping up to be a very valuable, very comprehensive release.

To find out for yourself, you’ll need the right hardware: an iPhone 5s or newer, an iPad mini 2 or newer, or a sixth-generation iPod touch. Before you replace your iVessel’s perfectly functional software with something that’s still months away from being ready, keep reading for a primer on what to expect.

But first…

Before we go any further, here’s the usual disclaimer: This software, while mostly functional, is a long way from being finished. Over the past few days of testing, I’ve seen my share of lock-ups, app crashes and overall funkiness. (As I write this, my iPhone’s “home row” has disappeared and I can’t figure out how to get it back.)

Since we’ve had a limited time with this preview, we haven’t been able to test all of the updates it contains either. Even though I work for Engadget, my home resembles that of a Luddite, so I didn’t have much of a need for the updated Home app. And since my car is relatively ancient, CarPlay was also a no-go. Meanwhile, other things just weren’t ready for prime time, including multi-room support in AirPlay 2 and the ability to send cash to friends via iMessage. And while we’re starting to see some really neat augmented reality tricks made with ARKit, none of those are available in the App Store yet. Long story short, just make sure you know what you’re getting into before you agree to the install.

Familiar, but different

The iOS aesthetic has undergone some major changes over the years, but that’s not really the case here if you’re using an iPhone. In fact, you’d be hard-pressed to find a difference until you swipe up in search of that flashlight. The iOS Control Center no longer looks like a handful of pages with quick options; it’s a more condensed cluster of buttons and controls that you can finally customize. I appreciate Apple squeezing all of this functionality into one place; it generally works well, and if your iOS device supports 3D Touch, you can press on these icons to access more controls. That said, I’ve already screwed up my screen brightness while trying to close Control Center maybe a thousand times, and I’m not sure I love the look either.

You can also view all your recent notifications just by swiping up on the lock screen, which is nice if you need to get caught up on things quickly. That said, if you’re a digital pack rat (like me) and never clear your notifications, this is a great way to see iOS lag.

You’ll also see a big focus on big text: It’s meant to be clear and visually punchy, but if you didn’t like the Apple Music redesign, you’re probably not going to like this either. That bold approach is used everywhere to some extent, from the Messages app to your list of albums in Photos. The best new example, however, is the revamped App Store. It’s not just a place with lists of apps (though those still exist) — it’s more curated, and there’s a strong editorial bent. Featured apps get miniature articles (crafted with help from the developers), lots of big imagery, and more video to help explain what makes them so special. It kind of feels like Apple squeezed a teensy blog into the App Store.

And for the first time, games and apps are kept separate from one another. Sifting through these distinct lists is definitely more convenient than before, but it mostly benefits developers. With these lists now separate, apps won’t get pushed down in the Top Paid and Free lists by whatever the buzzy game of the moment is.

Intelligence everywhere

Apple’s pushing the concept of “intelligence” really hard with this release. With Core ML, developers will be able to weave machine learning features into their apps, and hopefully make them more responsive to our desires and behaviors. Too bad none of those apps are ready yet. There’s still one concrete example of Apple’s pronounced focus on intelligence here, though: Siri.

For one, it sounds profoundly more natural than before. There are still small tells that you’re talking to a collection of algorithms, but the line between listening to Siri and listening to an actual person is growing strangely thin. (You’ll notice the improved voice in other places too, like when Apple Maps is giving you directions.) Hell, Siri even sounds good when you ask it to translate something you’ve just said in English into Spanish, French, German or Chinese.

It’s also able to act on more unorthodox requests like “play me something sad,” which happens to launch a playlist called “Tearjerkers.” And if you’re tired of hearing Siri altogether, you can now type queries and commands to it instead. Unfortunately, you’ll have to disable the ability to talk to Siri in the process. Ideally, Apple wouldn’t be so binary about this, but there’s at least one workaround. Worst-case scenario, you can enable dictation for the keyboard, tap the button and start chatting with it.

If some of this sounds familiar, that’s because Siri actually has a lot in common with Google Assistant. While the feature gap between the two assistants is closing, Google is still better for answering general-purpose questions. Apple’s working on it, though. The company says Siri now pulls more answers from Wikipedia, which may be true, but you’ll still just get search results most of the time.

More important, the underlying intelligence that makes Siri work has been woven into other apps. Siri can help suggest stories you might be interested in inside the News app, and if you register for an event within Safari, Siri will add it to your calendar.

Getting social

Sometimes I wonder why Apple doesn’t just go all out and create its own social media service. Then I remember it did. It was called Ping, and it flopped hard. So it’s a little worrying to see Apple bake a stronger social element into Apple Music. At least the company’s approach this time is based on delivering features people actually use. In addition to creating a profile (which only partially mattered before), you can now share your playlists and follow other users. Sound familiar? Well, it would if you were a Spotify user. Apple’s attempts to stack up more favorably against major social services don’t end there, either.

With the addition of new features, iMessage has become an even more competent competitor to apps like Line and Facebook Messenger. You want stickers and stuff? Apple made it easier to skim through all of your installed iMessage apps, so you can send bizarro visuals to your friends quickly. You’ll get a handful of new, full-screen iMessage effects for good measure, and it’s not hard to see how the newfound ability to send money through iMessage itself could put a dent in Venmo’s fortunes. (Again, this feature doesn’t work in this build, so don’t bother trying to pay your friends back via text.)

And then there’s the most social tool of all: the camera app. The all-too-popular Portrait mode has apparently been improved, though I’ve been hard-pressed to tell the difference. (It’ll officially graduate from beta when iOS 11 launches later this year.) You’ll also find some new filters, but the most fun additions are the new Live Photo modes. You can take the tiny video clip associated with a Live Photo and make it loop, or reverse itself, or even blur to imitate a long exposure. Just know this: If you try to send these new Live Photos to anyone not on iOS 11, they just get a standard Live Photo.

The iPad experience

The new update brings welcome changes to iPhones, but it completely overhauls the way iPads work. This is a very good thing. Thanks in large part to the dock, which works much like the one in macOS, they’re much better multitaskers. You can pull up the dock while using any other app to either switch what you’re doing or get two apps running next to each other.

Just drag an app from the dock into the main part of the screen and it’ll start running in a thin, phone-like window. Most apps I’ve tested work just fine in this smaller configuration, since they’re meant to scale across different-sized displays. And you can move these windowed apps around as needed. To get them running truly side by side, just swipe down — that locks them into the Split View we’ve had since iOS 9.

Having those apps next to each other means you can drag and drop images, links or text from one window into the other. This feels like a revelation compared with having to copy and paste, or saving an image to your camera roll so you could insert it somewhere else. Now it just needs more buy-in from developers. Literally all I want to do sometimes is drag a photo from the new Files app into Slack to share it, but that’s just not possible yet.

Oh, right, there’s a Files app now. It’s another one of those things that do what the name implies: You can manage stuff you’ve saved directly on your iPad, along with other services like Dropbox and Google Drive. Those third-party integrations are sort of theoretical right now, though: Dropbox sync isn’t ready yet, and navigating your Google Drive doesn’t really work the way it’s supposed to. It’s a great idea in concept, and I can’t wait to try it when it actually works.

When you’re done dragging and dropping, one upward swipe on the dock launches the new multitasking view. The most annoying part of this new workflow isn’t how your recent apps are laid out as a grid instead of the usual cards. No, it’s that you can’t just swipe up on those cards to close an app like you used to; you have to long-press the card and hit a tiny X to do that. I get that it’s more akin to the way you delete apps, but the original gesture was so much more intuitive and elegant. Otherwise, sifting through open apps to pick up where you left off is a breeze.

That said, it’s odd to see the Control Center to the right of those app windows. Having all these extra control toggles shoved into the side of the screen looks kind of lousy to me, but don’t expect that to change anytime soon. Thankfully, there’s no shortage of thoughtful touches on display here. Consider the new on-screen keyboard: Instead of tapping a button to switch layouts for punctuation and numbers, you can just swipe down on a key to invoke the alternate character. I still haven’t gotten completely used to it, but I’m much faster than I was on day one. Hopefully, your muscle memory resets more easily than mine. The Notes app also has been updated with the ability to scan documents on the fly, which has already made my life easier when I’m filing work expenses.

And don’t forget about the Apple Pencil. It was always kind of a hassle going through multiple steps before I started writing a note — you had to unlock the iPad, open Notes and tap a button to enable pen input. Now I can just tap the lock screen with my Pencil and I’m already writing. Longtime readers probably know my handwriting sucks, but it’s generally clean enough for iOS to parse, so I can search for things I’ve written straight from Spotlight. Tapping a result brings up my note, and, even in its unfinished state, it’s honestly a little crazy how fast Apple’s handwriting interpretation works. Then again, Apple is pushing on-device machine-learning processes like this in a big way, so if we’re lucky, behavior like this will be the rule, not the exception.

These are all valuable improvements, and I’m sure I’ll wind up using these features a lot. At this point, though, I still wouldn’t choose an iPad over a traditional notebook or convertible as my primary machine. The situation will improve as more app developers embed support for all these features into their software, but the foundation still doesn’t seem to be as flexible as I need.

The little things

As always, there are lots of little changes baked into these releases that don’t require a ton of words. Let’s see…

  • There’s a handy one-handed keyboard in iOS 11, but it’s disabled by default. I have no idea why.
  • When you’re on a FaceTime call, you can now take a screenshot of what you’re seeing without that pesky box with your own face in it.
  • Do Not Disturb While Driving is good at knowing when you’re using an iPhone in a car — just be sure to add a toggle for it in the Control Center for when you’re a passenger.
  • It’s basically impossible to miss when an app starts using your location: You’ll see a blue banner at the top of the screen telling you as much.

Even in its unfinished state, iOS 11 seems promising, especially for iPad users. I’ve always maintained that iOS 10 was a release meant to weave Apple’s sometimes disparate features and services into a platform that felt more whole. It was maybe a little unglamorous, but it was necessary. When iOS 11 launches in the fall, we’ll be able to get a better sense of its character and value.

Engadget RSS Feed

VSCO adds full RAW photo support to its iPhone app

VSCO, smartphone photographers’ image-tweaking app of choice, is letting iOS users tap into all the original image data captured on the iPhone 6 and later. Alongside a host of new community features, it’s offering full RAW image support for capture, importing and editing. This means photo editors will be able to access a wider range of colors and tones that are sometimes lost to compression in typical JPEG photos. RAW support even extends to the DSLR images you import to share.

The update is also the culmination of the VSCO team’s efforts to better showcase its community and editorial content. That includes a machine-learning engine that surfaces related images based on what it spots in your photos, plus a new search feature and a discovery section for notable community posts.

VSCO has also introduced a new (invite-only, subscription-based) membership at an early-access price of $20 per year. It gives users monthly updates and early access to filter presets, particularly VSCO’s new Film X interactive presets. These tap into SENS, the company’s new imaging engine, and attempt to offer, in the words of VSCO CEO and founder Joel Flory, “a physical model of film and not just a static preset.” Current presets include Fuji Pro 400H as well as Kodak Portra 160 and 400, and the system also includes real-time shaders you can tweak during live capture.

If you’re willing to subscribe, you’ll net the entire preset library (more than 100 in all), which would total around $200 if purchased individually through the app. RAW support, at least, comes free in the new update, available now. Oh, and for that invite-only membership? Add your name to the waitlist here, and get ready to feel exclusive.