Can an iPad Pro replace your PC?

In 2016, Apple believed its professional-grade tablet, the iPad Pro, was ready for the big time. Phil Schiller even called the machine “the ultimate PC replacement” when he presented it onstage. The company’s own advertisements claimed that the device could do everything a desktop or laptop could do. But that wasn’t really true until the launch of iOS 11, when the company really let the iPad off the leash.

One of iOS 11’s headline features is truer multitasking than was available before. In fact, most of the commentary about the new operating system is about features, like the dock, that are at the heart of macOS. When a tablet gets the famous Mac dock, you know it’s time to consider it as a genuine PC replacement. That’s why I’ve spent a couple of days working (almost) exclusively from one, to see if I’d be tempted to switch.

I’m a particularly good candidate for the experiment, since I’m such a slavish desktop aficionado that I even resent using a laptop. Unless it’s got dual displays, keyboard and mouse, not to mention the ability to run 10 programs at a time, I’m not happy.

In the service of the experiment, I borrowed the latest 10.5-inch iPad Pro from Apple, complete with a Smart Keyboard and Pencil. I also begged a friend to let me play with his 12.9-inch iPad Pro, similarly with a Smart Keyboard, to compare and contrast. My challenge was to try to do my job at Engadget using just the smaller iPad to write, edit and upload images.

The first thing you notice about working from an iPad is just how much more productive it makes you, because the iPad is the enemy of distraction. On my desktop, I normally work with two Chrome windows, iTunes and a couple of Pages documents on my primary display. The second monitor is dedicated to Slack, ensuring that I’m always on hand to respond to messages.

On the iPad, it’s far harder to succumb to the ravages of multiple-window syndrome. In fact, for all of Apple’s trumpeting about the iPad’s improved multitasking, the device is built to do one thing at a time. Part of it is a result of the limitations of the iPad itself: with only 10.5 or 12.9 inches of real estate to play with, you always need to be conscious about how much screen you’re using.

I spent most of my working days with Pages occupying about four-fifths of the display, with either a web browser or Slack on the right. Not that I really needed the split view, because iOS also enables fast switching, either by control-tabbing around your open apps or with the dock. The dock, obviously, was cribbed from macOS, and it’s one of the best tweaks available here.

When I work from a touchscreen Windows laptop, I’m always leery about not having a mouse alongside, because there’s that disconnect when you need to go from keyboard to display. Not only is it a real break with what you’re doing, but there’s the fact that your screen can get pretty greasy, pretty quickly.

Apple has, thankfully, solved the first half of that equation, because iOS’ gestures are more natural and intuitive. Pull your fingers in to close an app, swipe left or right to switch apps, tap the screen to highlight something. It makes a lot more sense, so you experience less of that break in your mind between using a keyboard and touching a screen. You still need a cloth at hand, unfortunately. When I went back to using a desktop, I found that I missed that sense of connection with the display that allowed me to quickly brush my finger against the screen to move the cursor.

Then there’s the iPad Pro’s Smart Keyboard, which filled me with dread when I thought I had to deal with it for a week-plus. It does, after all, look like the sort of rubber, industrial keyboard I thought I left behind when I stopped working in factories. At first blush, it looks stiff, uncomfortable, with little to no travel — a retrograde step toward the days of the ZX Spectrum.

I need not have worried, since the Smart Keyboard has plenty of travel and is almost as comfortable as a laptop keyboard. Sure, it’s never going to match up to the sort of professional-grade mechanical keyboards I use on the desktop, or even the Apple-bundled chiclet keyboard. But it’s comfortable enough to use for long periods, and I’d happily use it as my primary input mechanism. Although I’d prefer the 12.9-inch version to its smaller sibling, because I’m a big guy with very big hands.

Oh, one thing: The angle of the iPad on its stand and my very large fingers mean that it’s far too easy to unintentionally brush the screen. It’s not a big issue, and I was able to learn to avoid it over time, but having keyboard controls at the bottom of the screen can sometimes be problematic.

I also want to talk about the Pencil, which I didn’t have much cause to use, since I’m not a very talented illustrator. However, I found out that, on top of being used for artistic purposes, the (don’t call it a) stylus pulls double duty as a mouse pointer.

For me — and I’d assume a large proportion of the people who work at Engadget — replacing our computers with iPads would be out of the question. Our CMS, the platform on which this site hangs, was designed more than a decade ago to work with keyboards and mice. Using it on phones and tablets, with their finger- and gesture-based interaction metaphors, is possible, but hellish. Not to mention that plenty of the apps we need for work aren’t really designed to be used on tablets.

And yet, once I’d settled into a groove, I found it reasonably easy to do the bulk of my work on the iPad without interruption. The Apple Pencil is smart enough to let me use it in place of my finger in our CMS, and you can even shoot and edit photos on the device. Using Lightroom, it’s possible to shoot RAW images from the iPad’s 12-megapixel camera. I was able to produce some excellent imagery that, unless you’re looking hard, you’d assume came from a dedicated camera.

Thankfully, iOS 11’s Files app also means that I can actually just push the edited files into Google Drive and back again without any fuss.

Daniel Cooper

There are some issues that are specific to me, like the fact that I can’t yet find a batch resizing and watermarking app that suits our system. That’s not an issue that’s going to affect the majority of folks who will use the device. The muscle memory for pretty much everything else still works, and, after a few days, I didn’t even notice that I wasn’t using a desktop — except for the fact that you need to pull up Control Center to change music tracks, which is a total productivity killer.
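That batch resizing and watermarking gap is exactly the kind of chore that takes a few lines of scripting on a desktop. As a rough illustration only (this is not the app I was looking for, and the library choice, folder names and watermark text are all my own assumptions), a minimal sketch with Python’s Pillow library might look like this:

```python
from pathlib import Path
from PIL import Image, ImageDraw

def resize_and_watermark(src, dst_dir, max_width=1600, text="(c) example"):
    """Shrink an image to a maximum width and stamp a small text watermark."""
    img = Image.open(src).convert("RGB")
    if img.width > max_width:
        ratio = max_width / img.width
        img = img.resize((max_width, int(img.height * ratio)))
    draw = ImageDraw.Draw(img)
    # Drop the watermark near the bottom-left corner in white.
    draw.text((10, img.height - 24), text, fill=(255, 255, 255))
    out = Path(dst_dir) / Path(src).name
    img.save(out, quality=90)
    return out

# Batch-process every JPEG in a hypothetical "shots" folder:
# for path in Path("shots").glob("*.jpg"):
#     resize_and_watermark(path, "publish")
```

Trivial on a Mac; on an iPad, you’re at the mercy of whatever apps the App Store happens to offer.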

One big trade-off between a personal computer and the iPad Pro is that the latter can’t really be the center of your digital universe. An iPad can’t host the sum of your iTunes media library, and you can’t sync devices with it. If you’re a fully paid-up member of the iCloud ecosystem, then that’s less of an issue. But if you’re still attached to physical media, you’re not going to be able to make that split so easily.

Another criticism, and one that’s often lobbed toward Apple, is that the iPhone and iPad are “closed” devices, hampering you from doing some of the things you would do on a desktop. Now, some of those things may not be on the right side of legality, but it may be something that you do anyway. Let’s imagine, for instance, that you enjoy watching controversial condiment-based cartoon Rick and Morty.

Here in the UK, Rick and Morty is available to view on Netflix seven days after its initial US broadcast. That’s easy to circumvent, however, since YouTube (and every other video hosting site on the internet) has streams of it available minutes after it airs. Now, on a desktop or laptop, you could simply visit one of the thousands of illegal streams on YouTube or elsewhere, save it to your hard drive and watch it at your leisure later. Or perhaps save it to a USB stick and then transfer it to a media player downstairs for family viewing.

You’ll get no prizes for guessing that such a job is fiddly to pull off on an iPad without plenty of help. Because you can’t simply save the file that’s being played in Safari, you need to use some creative workarounds. A service such as KeepVid, for instance, will save the purloined files to your Dropbox account, from which you can then move them on. For all of Apple’s claims that iOS 11 will free your iPad from the tyranny of sandboxing, there’s still plenty of incentive for you to keep to your lane.

iPads, for all of their compactness, aren’t always the ideal machine for road warriors. On field trips, I use my MacBook Air’s two USB ports to charge all of my digital devices, from my iPhone and headphones to my Kindle. That way, all I need to do is carry the charging cables, rather than the wall plugs, and I can charge up to three devices at a time.

An iPad, on the other hand, can share its battery only with the Pencil, and so is useless for power sharing. Whatever bag weight you’ve saved by not toting around a hefty laptop and its power adapter, you’ll make back by bringing USB plugs for all of your various devices.

On the upside, the iPad Pro occupies a lot less horizontal space than a laptop, making it better-suited for working on a train or airplane. You’ll never entirely eliminate the stresses of crunching elbows with your neighbor when typing, but it does help to mitigate the problem. And there are plenty of scenarios when the iPad’s speed enables you to get short bursts of work done much faster.

I often think that iOS will always be relatively hampered because macOS exists. The former is a sleek, stripped-down race car designed for speed and getting people to their destination in record time. The latter, however, is a pickup truck, useful and slow and versatile in all the ways its sibling is not.

It’s with that in mind that you should approach the notion of whether you could live your life with the iPad Pro as your primary — nay, only — machine. For the electronic minimalist in us all, the device can do plenty of the usual things you’d use a desktop for. But you’ll quickly butt up against the limits of what the iPad, and iOS 11, can do.

On the plus side, I love how focused the iPad Pro made me, and how comfortable the keyboard is to use. The screen, packing 120Hz ProMotion and True Tone display technology, is beautiful, and I actually really enjoyed spending time with it to work and read. Not to mention that, because it’s so fast, light and portable, it’s far easier to work with in places other than your office. You can prop it up beside you at breakfast or on the couch late at night, and it’s much easier to use where space is at a premium than a laptop.

What you’re giving up, however, is that sense of control and the ability to do what you want to do, how you want to do it. Because Apple has a very ingrained sense of how computing is done, and its devices are built to enforce that sense at all times. If you feel that you can cope with the rigidity, then you will probably have no qualms about making the switch.

It’s weird, because on one hand, I feel like I could do 90 percent of my job with an iPad Pro and eliminate so much stuff from my office overnight. But in doing so, I’d have to always have a laptop on standby for when I needed to do things that Apple doesn’t want you to do. The biggest drawback to recommending one, right now, is that the iPad Pro is this useful only because of its Smart Keyboard, and the price for the two together is $968 for the base-model 12.9-incher. This is an awful lot of money to spend on a very beautiful device that can’t save a video straight from Safari or efficiently batch-resize camera images suitable for publishing.

Can an iPad Pro replace a personal computer? No, and it’s likely that it won’t be able to for some time. But do you really need a personal computer for the majority of the things you do each day?

Engadget RSS Feed

Analyst rumor: iPhone 8 ‘function area’ to replace home button

While we’re still months away from finding out exactly what’s what with any new iPhone, the rumor mill is already running at full tilt. Following up on earlier reports of a 5.8-inch edgeless OLED-screened device arriving as the “iPhone 8,” well-connected analyst Ming-Chi Kuo is telling investors more about what its home-button-free front screen could be like.

As explained by AppleInsider and 9to5Mac, the analyst notes that this presumed OLED iPhone with its $1,000+ pricetag will be similar in size to the current 4.7-inch iPhone. However, instead of the home button, it will include a “function area” that can also display controls for video or games.

That would keep it matched in style with the recently released MacBook Pros and their OLED Touch Bar, and, the analyst says, reduce the screen size used for everything else to about 5.15 inches. Last year the New York Times reported that the next iPhone would ditch the home button for virtual buttons built into the screen, and this rumor explains how all that could work. Losing the home button could indicate a lack of Touch ID, which could be replaced by a fingerprint reader embedded in the display itself, or by other biometric technology like face recognition.

Source: Apple Insider, 9to5Mac


Apple will replace a lost AirPod for $69

Following a slight delay, Apple’s wireless AirPods are ready to order. They’re small and sleek, but the lack of cords has put a nagging thought in the back of my mind: I am guaranteed to lose one, if not both, within a few weeks. If you’re equally forgetful, or happen to commute in jam-packed subway carriages, you’ll be happy to hear that Apple will replace a single AirPod for $69 (£65). Given that a fresh pair costs $159 (£159), that seems like a reasonable fee. Similarly, a new AirPod charging case will set you back $69 (£65), for the inevitable “I threw it out thinking it was floss” stories.

To Apple’s credit, your music will stop as soon as one AirPod leaves your earhole. That serves two purposes: you don’t have to press pause when someone starts talking to you, and you get a heads-up whenever one AirPod drops out of your ear. If you’re somewhere busy, like a crowded train platform, that immediate notification could be vital to retrieving it. Otherwise, the allure of Apple’s AirPods is a tangle-free lifestyle with convenient pairing and charging. It’s doubly useful if you have the iPhone 7 with its non-existent 3.5mm jack. (Yeah, I’m still annoyed about it.)

Via: TechCrunch

Source: Apple (US), (UK)


Our fingerprints, eyes and faces will replace passwords

Passwords are a pain in the ass. They’re either easy to crack or hard to remember, and when breaches occur you have to come up with a whole new one. So people are trying to do away with passwords altogether, and so far, fingerprint scanners are doing the job nicely.

Still, fingerprints alone are not enough. Online security has become increasingly important, forcing service providers to come up with better measures such as two-factor authentication to defend user information. Companies are turning to other parts of our bodies to find biometric complements that are up to the task, and our faces and eyes are at the top of the list. Although facial and eye-based recognition appear gimmicky for now (the Galaxy Note 7’s iris scanner, anyone?), they may soon become as prevalent and popular as fingerprint scanners. That pairing could eradicate passwords and clunky text-message two-factor verification altogether, making it a completely biometric process.

Before you brush the notion aside, think about the history of fingerprint scanners on smartphones. After Apple first put Touch ID on the iPhone 5s in 2013, people pointed out that it didn’t work very well and that it wasn’t secure. But Apple soldiered on, improving the hardware and implementing more useful features. Since then, many other tech giants have followed suit. Today, fingerprint scanners are basically a given feature on flagship Samsung, Nexus (or Pixel), LG and HTC phones, and are even spreading to more affordable handsets such as the $99 ZMax Pro, the $200 Huawei Honor 5X, the $400 OnePlus 3 and the $400 ZTE Axon 7. We can expect to see them everywhere soon, said Sayeed Choudhury, Qualcomm’s senior director of product management.


Despite the proliferation of fingerprint sensors, companies continue to chase convenience and novelty by introducing new biometric methods of logging in. We started seeing facial recognition as a method of identification when Google first revealed Face Unlock on Android 4.0 Ice Cream Sandwich. Years later, eye-print authentication started popping up on phones such as the ZTE Grand S3 and the Alcatel Idol 3. Both phones used a retinal scan to match the user by looking at the full eye and veins.

The good thing about this method, said Choudhury, was that it didn’t require additional hardware — you could just use the selfie camera. The challenge in retinal scanning is in its computation and algorithms, which Choudhury described as “very heavyweight” and said “almost always uses the GPU in addition to the CPU.” This means it takes longer to detect and recognize your eye-prints. Indeed, in my experience reviewing the Eyeverify system on ZTE and Eye-D on the Alcatel Idol 3, snapping a pic of my eyes to unlock the phones was always excruciatingly slow.

In contrast, iris scanning, which was one of the highlights of the Galaxy Note 7 when it launched (and before all that exploding hoopla), uses more compact algorithms, said Choudhury. That means faster detection and a shorter wait time. Plus, iris scanning has been around for a long time. People have been using it to get into secure labs, buildings and even through airport security (Global Entry), so the technology is pretty mature. It’s also more secure than fingerprints. According to Choudhury, “Iris-recognition technologies found in devices today identify three to five times more ‘feature markers’ to classify a specific iris versus what today’s fingerprint technologies can do.” The bad news with iris scanning, though, is it requires an infrared (IR) camera, which isn’t on many phones. But Samsung isn’t alone in looking to implement it — other brands will likely follow suit.

One of the biggest forces pushing the move toward eye-based authentication is the payments industry, said Choudhury. “What we’re seeing, driven by the mobile payments industry, is that both iris and retina biometrics are going to be incorporated in many more devices,” he said. Mobile payments are a “killer-use case,” according to him, and it certainly has a history of forcing even the most stubborn companies to adopt new technologies. The most obvious example of this would be Apple finally incorporating NFC into the iPhone 6 to enable its payment system, after years of resisting the tech that’s proliferated in Android phones.

Payments giant Mastercard is one of the biggest proponents of biometric security, which encompasses fingerprints, eyes and faces. “We want to remove passwords,” said Ajay Bhalla, president of global enterprise risk and security at Mastercard. “Passwords are a big problem for people — they keep forgetting it or they use passwords which are very simple and dumb.”

The company has been researching biometric-authentication methods using facial recognition, eye-based tech, fingerprints, heartbeats and voice, because these are unique to the user and don’t require memorizing or guesswork. It found fingerprints and face detection to be the most easily scalable. “We feel it’s reached a stage where it can become mainstream — it’s on devices, and consumers understand it,” said Bhalla.

Mastercard recently launched its “selfie pay” authentication method in Europe via its Identity Check Mobile app. The feature lets you authorize transactions by taking a portrait of yourself and blinking to prove it’s you and not a picture some wannabe hacker printed.

While it may sound cheesy to hold up your phone and pose for a picture each time you want to buy something, the company claims it is well-received. According to research from its 2015 trials, 90 percent of respondents found the Identity Check app more convenient than what they had been using. Seventy-one percent rated facial recognition as “highly convenient,” while 93 percent rated fingerprint recognition the same.

The popularity, prevalence and convenience of fingerprint scanning mean it is here to stay, and face and eye recognition are by no means meant to replace it. Both Choudhury and Bhalla see the newer methods as complements to fingerprints, providing a more convenient second factor of authentication than entering a text code sent to your phone. While the tech we have right now may not be fast or secure enough to be truly convenient and helpful, we’re getting close. Using the adoption of fingerprint scanners as a model, Choudhury estimated that we are five years away from iris scanners and face detection becoming just as widespread. Until then, we’ll have to deal with changing our crappy passwords every so often and hope we don’t forget them.


This phone-powered vision test can replace your eye doctor

There are dozens of inexpensive ways to buy glasses online today, but getting a new eyeglass prescription is as old-school as ever: Book an appointment with your eye doctor, spend more time than you expect in the waiting room and go through a full exam. Even if you’re lucky enough to book through Zocdoc, it’s still a long process. Smart Vision Labs hopes to make it easier to get a new glasses prescription with the SVOne Enterprise, a smartphone-powered self-guided vision test that’s launching in some New York City glasses stores today.

It may not have the catchiest name, but the SVOne Enterprise could be a huge boon for the vision impaired. It’s based on the same autorefraction technology as the company’s first product, the handheld SVOne Pro, which lets doctors perform eye exams just about anywhere. In a nutshell, the tech involves bouncing a laser off your retina, which is then measured by the device. The new product adds a telemedicine element: after going through the vision test, the results are sent to a remote eye doctor who approves the final prescription. You can then download the prescription at any time and take it to the glasses retailer of your choice.

The SVOne Enterprise looks like an iPhone with a specialized eyepiece on top of a tripod. It’s more functional than attractive, the sort of thing an optical store can leave in a corner until it needs to test a customer. Since only a few stores can afford to have actual doctors on staff, most are left pointing customers elsewhere to get new prescriptions. Smart Vision Labs’ device allows stores to keep those customers in-house, so they’ll be more likely to buy a pair of glasses. The company charges its partners $40 a test, but it’s up to the individual stores to determine pricing for shoppers.

Founder Yaopeng Zhou says he was inspired to create the SVOne Enterprise after realizing there are almost 200 million people in America who need glasses, but only 106 million eye exams take place every year. He also points out there’s only one eye doctor for every 5,000 people in the US. There’s a definite need for a faster way to perform vision tests.

To be clear, the SVOne Enterprise isn’t a completely self-service product. You’ll still need a bit of help to step through the exam, though it’s still far less involved than going to the doctor. To start, I answered a few questions on the SVOne Enterprise’s iPhone screen about my age and pre-existing eye conditions. If I had any major eye problems, the app would direct me to take a full exam from a doctor.

After that, Yaopeng had me read from a fairly standard vision chart on the SVOne Enterprise’s iPhone screen using my glasses. I then placed my right eye in the device’s eyepiece and stared at a red laser as it took three photos. I repeated the same process with my left eye, but it took a few tries and a move to a darkened room for it to make a successful measurement. (Yaopeng noted that my pupils were smaller than most, so we had to dilate them a bit by moving to a dark environment.)

A day after the exam, I received a link to an official prescription from one of the company’s contracted doctors. Surprisingly, they didn’t make any changes to my current prescription, which is hopefully a sign that my terrible vision is stabilizing a bit. I can now take that prescription to an online eyeglass outfit like Warby Parker, or a local store in my neighborhood, to get a new pair of frames. (If you’ve only ever gotten new glasses directly from your doctor, it’s definitely worth exploring the wealth of new options out there.)

Smart Vision Labs isn’t the first company to pursue phone-powered eye exams. Blink claimed it would send someone to your home for an exam (it hasn’t launched yet). And Peek has been trying to bring vision tests to the developing world for years. But the SVOne Enterprise is the first product I’ve seen that delivers a valid prescription just as accurate as my current one.

Looking ahead, Yaopeng says the company is attempting to bring the SVOne Enterprise to more markets in the US. Smart Vision Labs’ handheld product is already available in 23 countries. Though it has sold only around 500 units of that device, the company has already completed more than 40,000 refraction eye tests over the past few years.
