It looks like Google still isn’t done fielding complaints about the Pixel 2 XL’s display. While some users are experiencing premature screen burn-in and seeing a bluish tint, others are apparently having trouble with its responsiveness. Comments posted on the Pixel 2 community website have revealed that some owners are having trouble getting their phones to register touches near the edges of the screen. One poster even conducted a test and found that while the edges of his display can recognize swipes just fine, they can’t always recognize taps.
Here’s a video of the experiment:
According to Android Police, this happens because the device’s accidental touch protection feature is just a bit too effective. The good news is that it’s a software issue, and Google is already working on a fix. Orrin, a Pixel 2 XL community manager, posted on the thread to inform people that the Pixel team is already investigating and addressing the problem in an upcoming over-the-air update.
In an effort to preempt similar complaints about bluish or greenish tinted screens and burn-in, Apple recently updated its support page to explain that those are perfectly normal for OLED displays like the iPhone X’s and Pixel 2 XL’s. Nevertheless, the iPhone X’s screen seems to come with its own set of issues. Some units have a nasty green line running down their edges, while others stop responding to touches in cold temperatures.
Let’s face it: The AI hype train isn’t going away, and soon all our devices will be run by artificial intelligence. While Apple’s answer to the AI takeover is to just call its new A11 processor “Bionic,” Huawei has taken a more concrete approach. The company embedded a neural processing unit (NPU) in its Kirin 970 chip, which it claims can run AI tasks faster and with less power than competing silicon. The newly launched Mate 10 Pro is the first phone to use the Kirin 970, and it’s meant to demonstrate the wonders of deeply embedded AI. So far, though, it’s a capable, well-designed phone that has yet to fully explore what a dedicated NPU can do.
When Huawei asked a group of reviewers what we wanted from AI, I didn’t have a real answer, though my peers pointed out things like natural linguistics and battery management. But after a few days with the Mate 10 Pro, I’ve realized what I want.
My ideal AI would basically be able to predict what I wanted based on how and when I’m using my phone. For example, if I’m holding my phone up at eye level in my apartment at about the same time every day, I’m most likely starting one of my daily selfie sprees. It should know then to automatically activate (or at least suggest) the Portrait mode on my front camera and even take a series of photos when I push one button. It gets tiring having to keep pressing the volume down button to take dozens of pictures.
The Mate 10 Pro doesn’t live up to my unrealistic expectations, but it marks a step in the right direction. The phone can recognize things you’re pointing the camera at, like food, pets, flowers or buildings, and adjusts settings like ISO, shutter speed and saturation to make your photos look good. For now, the Mate 10 Pro identifies only 13 scenes, but Huawei says it will continue adding situations that the phone will recognize.
In other words, the Mate 10 Pro is smart enough to be both camera and photographer. That is, in theory. While the Mate 10 Pro does take lovely pictures that are bright, sharp and accurately colored, I suspect that has more to do with its camera hardware than clever AI. The two cameras on its rear both feature an aperture of f/1.6 — the widest yet on a smartphone (tied with the LG V30). That hardware not only allows for clearer pictures in low light, but also creates a pleasantly shallow depth of field.
When I compared pictures I took in manual mode to those where the AI decided what settings to use, I had a hard time seeing a difference. My photos of flowers appeared just as saturated whether the AI was at work or not, and the depth of field looked the same either way. The main difference I saw was a stronger bokeh effect applied by the AI. I guess this is kind of the point — the AI is as good as I, a human, am at determining the best settings.
Although the Mate 10 Pro’s tweaks aren’t very noticeable, its scene recognition is mostly quick and accurate. However, some situations stumped the Mate 10 Pro, like my messy dinner of chicken covered with onions and peppers in a chili paste. Then there are the many objects the phone can’t identify yet — like a group of players on a basketball court or a pair of pretty shoes. Huawei also needs more data before the phone can learn the best settings for those situations — whether that’s bumping up the shutter speed to capture fast-moving soccer balls or producing a shallower depth of field around shoes. The company said it will keep analyzing pictures (not user-generated ones) in the cloud and push out software updates to continually improve its camera software. No, Huawei isn’t spying on your photos — these are pictures it got elsewhere (the company hasn’t told us the source).
The AI is absent on the front camera, but I still loved the selfies I took with the Mate 10 Pro. Huawei’s Portrait Mode relies on face detection rather than the depth sensing the iPhone X uses, which results in a softer depth of field that’s sometimes less defined than Apple’s. But the pictures from Huawei’s phone are more flattering. The iPhone X’s Portrait Mode selfies are so sharp that every imperfection and stray hair is obvious.
The primary benefit of having a dedicated neural processing unit on the phone’s chipset is that machine-learning tasks can be executed more quickly. Things like image recognition or language translation can be carried out in tandem with other general functions, so your phone shouldn’t slow down just to find the 3,500th picture of your cat’s face. With Huawei’s Kirin 970 chip, app developers can tap into the NPU by using either the Kirin API or popular machine-learning frameworks like Google’s TensorFlow or Facebook’s Caffe2.
The problem is, not many apps have done this. So far, only Huawei’s own camera software and Microsoft Translator tap the NPU for improved performance. The latter comes preinstalled on the Mate 10 Pro, by the way, and only its image-based translation tool is optimized right now. I took a picture of the phrase “You’re so pretty” in Mandarin, and barely a second later Translator told me it meant “You’re beautiful.” Close enough. Subsequent attempts with the same printout yielded dubious results, though, with the app often translating the words to “Hello, Drift.” That’s more likely an issue with Microsoft’s engine than with the Mate 10 Pro.
I tried the same thing out on a Galaxy Note 8 and an iPhone 8 Plus. All three phones performed within half a second of each other — with the Huawei frequently finishing the fastest. Sometimes the iPhone took the lead, but for the most part none of them lagged far behind the rest.
Aside from its camera and the Translator app, the Mate 10 Pro also uses AI to learn your habits over time so it can pre-allocate resources to the apps it thinks you’ll launch next. From my few days using the phone, it’s hard to judge how effective this has been, but the Mate 10 Pro certainly keeps up with my incessant selfie-taking, Instagram bingeing and light emailing.
So far, the Mate 10 Pro has too few AI integrations for me to really notice the benefits of a dedicated NPU. It’s a sleekly designed handset, though, and I love showing off the attractive “Signature” stripe on its elegant, shiny rear. The epic battery life is also a bonus. It easily gets through two days on a charge, and I can go four days without plugging it in under extremely light usage. I wish its display were sharper than 1080p, but that’s a minor complaint. Since Huawei hasn’t shared the US price and availability, I can’t definitively say if the Mate 10 Pro is a better deal than its competitors. But it’s an intriguing preview of the good that can come from a phone powered by AI.
The iPhone X’s design revolves around its all-encompassing OLED display, so you can imagine the heartbreak when that display is glitchy… and unfortunately, it looks like a handful of owners are going through that pain. People on Apple’s forums, Reddit and elsewhere are reporting a glitch where a green line runs down the left or right edge of the display, regardless of what’s happening on-screen. This doesn’t appear to affect the functionality, but it’s clearly annoying.
We’ve asked Apple for comment on the issue. It doesn’t appear that restarts or other common software solutions fix it, though, and this might be strictly a hardware problem. It’s not necessarily an overscan line like you might see on a TV, either. No matter what, it’s safe to say that you can get a replacement if the usual troubleshooting proves fruitless.
It’s unclear how many people are affected by the green line, although it doesn’t appear to be a widespread issue. Between this and the (software-fixable) cold-weather responsiveness issue, though, it appears that the iPhone X has some teething troubles. That’s not entirely surprising. It’s Apple’s first phone to use an OLED screen, and it’s using a custom (Samsung-manufactured) panel at that — there may be a learning curve involved as the companies master their production techniques. As it is, Samsung has had problems with its own OLED phones. If the iPhone X flaw is indeed a hardware issue, it illustrates the broader challenges of manufacturing cutting-edge OLED screens.
My new #iPhoneX appears a green line on the screen😂, and the faceID can’t recognize me when I with glasses.@Apple @AppleSupport pic.twitter.com/Fgj5fg9v2x
— Lejia Peng (@fanguy9412) November 6, 2017
Source: Apple Communities, Reddit, Lejia Peng (Twitter)
Apple gave its mobile software a facelift when it released iOS 11 back in September, but lingering bugs and the WPA2 KRACK vulnerability led the company to push out an 11.1 update a month later. Turns out that version introduced another set of squirrely issues, which has led Apple to release iOS 11.1.1 today. You can finally say goodbye to that stupid autocorrect bug switching out the letter ‘i’ for all manner of gibberish.
The update also addresses an issue where the ‘Hey Siri’ feature occasionally stops working. And… that’s it. Even the security content is the same as in the 11.1 release, meaning Apple pushed this update out just to fix these two issues. Consider your outrage heard: Apple could have waited and rolled these fixes into the forthcoming 11.2 update. Reportedly, some GPS issues with the iPhone 8 and iPhone X are fixed in the 11.2 beta.
Apple’s Clips video creation app is less than a year old, but it’s already getting a big update. Thanks to lots of user feedback and the proliferation of new, more powerful iOS devices, Clips is now more polished than ever, and that’s very good news for people looking to craft their next viral video masterpiece.
The philosophy behind the app hasn’t changed — it’s still all about making fun, short videos without much technical know-how — but Apple worked to make the app even easier to use. Consider the app’s interface: it was never particularly hard to wrap your head around, but Apple’s zeal for simplicity sometimes made the original layout feel a little too basic. While you’ll still use a big, bright record button to add clips to your timeline, a handful of new shortcuts beneath the viewfinder window make it easier to gussy up your work.
Apple also moved its controls for Live Titles (a feature that automatically turns what you’re saying into subtitles) and style transfer filters (which add fun, Prisma-style art effects to your photos and videos) to the left and right of that big record button. These were two of the most popular (not to mention most useful) features in Clips, so I’m glad they’re getting a little more prominence this time.
Clips now also supports iCloud Drive, so you can start a new video project on an iPhone and pick up where you left off on an iPad without issue. The same project reflects updates made on multiple devices, so there’s no need to worry about version control — something that most average Clips users would probably loathe having to think about.
Interface revamps aside, the biggest new addition is what Apple calls Selfie Scenes. It’s unfortunately exclusive to the iPhone X, and one look at the feature in action confirms why — it uses the X’s TrueDepth camera to isolate your face, paint over it with some sweet artsy filters and replace your current background with something more scenic. Right now, the current batch of scenes includes a neon-soaked city in Asia, a hand-drawn rendition of Paris, an 8-bit city that looks like something out of Rampage and, uh, the Millennium Falcon. Seriously. Apple’s cozy partnership with Disney now means that you can virtually insert yourself into a corridor on the Falcon or the bridge of Supreme Leader Snoke’s Mega-class Star Destroyer from The Last Jedi. Naturally, the view of your face takes on the hazy blue of a Star Wars-style hologram.
Yes, Clips’s use of the iPhone X’s TrueDepth camera is a gimmick, but it’s a damned cool one. More importantly, it works well almost all the time — the iPhone X did a mostly great job isolating me and my extremities from my virtual background. There’s also something just a little wild about seamlessly inserting myself into a sci-fi universe I’ve yearned to be a part of since I was 8. It’s just too bad older iPhones don’t have the hardware necessary to make this work for more people. (There’s a small consolation prize for Star Wars buffs with older iPhones: loads of animated stickers depicting Chewie, Princess Leia, TIE Fighters and more.)
While not every iPhone will get all these new features, Apple’s thoughtful changes to the interface and workflow mean the Clips update is well worth installing — you can find it in the App Store now.
During a press conference, FBI special agent Christopher Combs complained that the agency couldn’t get into the Texas shooter’s phone. Turns out all they had to do was ask Apple for help. In a statement the tech titan released to the media, it said it “immediately reached out to the FBI after learning from their press conference on Tuesday that investigators were trying to access a mobile phone.” Cupertino offered its assistance and even promised to “expedite [its] response to any legal process.” It added that it “work[s] with law enforcement every day” and “offer[s] training to thousands of agents so they understand [its] devices and how they can quickly request information from Apple.”
The company told Business Insider that the FBI has yet to ask for help accessing the phone. That pretty much confirms Reuters’ report that officials missed the 48-hour window that would have allowed them to unlock the device simply by using the shooter’s fingerprint. If the gunman had fingerprint access enabled, Apple could’ve told authorities that they had 48 hours to use his prints to unlock the phone before the feature ceased to function.
Now that the 48-hour window has closed, the agency has to find a legal means to get to the phone’s contents. Officials will have to serve Apple with a court order to get their hands on the shooter’s iCloud data. It’s unclear if the FBI is already securing a court order, but it might have decided not to work with Apple after having a tough time convincing the company to unlock the San Bernardino shooter’s iPhone. Apple refused to open the device for the agency even after the FBI took the company to court. In the end, the feds paid big money for a third-party tool that was able to unlock the device.
So … FBI didn’t reach out for assistance. Apple contacted agency. if iPhone with TouchID, contact established AFTER 48 hour window for touchID closed. finger could have been used to unlock (if touch ID enabled). https://t.co/BgVhfT8TdZ
— John Paczkowski (@JohnPaczkowski) November 8, 2017
Director Steven Soderbergh has made a name for himself by pushing cinematic boundaries, so it’s no surprise that his upcoming series for HBO, Mosaic, isn’t your usual TV fare. Today, he’s launching the Mosaic app on iPhone and Apple TV (with Android and web versions to follow soon), which will let you decide how you watch the show. It’s not quite “choose your own adventure,” since you’re not making any decisions on the show’s outcome. Instead, the app, which was developed by PodOp, lets you determine how Mosaic’s narrative flows.
The first episode introduces you to Olivia Lake, an author played by Sharon Stone. After viewing that, the narrative path branches into two episodes. You could just watch them in parallel, or you could follow the path down all the way to the end, then go back and catch up on what you’ve missed. You can also unlock additional clips, documents and recordings to flesh out the story. HBO is making all 7.5 hours of the series available in the app, but it’s also going to air a six-hour version of the series edited by Soderbergh (naturally) on January 22nd.
“While branching narratives have been around forever, technology now allows, I hope, for a more elegant form of engagement than used to be possible,” Soderbergh said in a statement. “At no point were we reverse-engineering the story to fit an existing piece of technology; the story was being created in lockstep with the technical team. The fluidity of that relationship made me feel comfortable because I wanted it to be a simple, intuitive experience.”
Conceptually, Mosaic sounds similar to what Arrested Development creator Mitch Hurwitz attempted with the fourth season of that show on Netflix. He originally said you’d be able to watch those episodes in any order, but then later backtracked on that suggestion. Francis Ford Coppola also tried something similar with Twixt in 2011, a film that he could “remix” narratively with an iPad. He wanted to tour with the movie and edit it live, but eventually settled for a traditional release.
For Soderbergh, Mosaic is just the latest in a string of TV experiments. His Cinemax series, The Knick, tackled the early days of medical surgery with an anachronistic synth-heavy score. Soderbergh’s film The Girlfriend Experience is now a TV show, as well, and its second season is also dabbling with branching narratives. Soderbergh says he’s working on two more series using the Mosaic platform. Eventually, he hopes to open it up to other directors.
I’ve only seen part of Mosaic’s first episode, but I’ll definitely be devouring the entire series as soon as I can. It’s unclear if the app will appeal to anyone beyond Soderbergh fans and cinephiles, though. In the age of binge-watching, it seems like more viewers simply want to sit back and consume hours of content without lifting a finger.
Whatever you think of your dual-camera iPhone, there’s one company that’s less than thrilled. Israeli startup Corephotonics is suing Apple for allegedly infringing on patented technology with the cameras in the iPhone 7 Plus and 8 Plus (it’s likely none too pleased about the iPhone X, for that matter). Corephotonics says it pitched Apple about a potential alliance, only to be shot down and see Apple implement dual cameras on its own. The plaintiff company even claims that Apple boasted it could infringe on patents without fear. Apple’s negotiator said it would take “years and millions of dollars” before the iPhone maker would have to pay if it did infringe, according to Corephotonics’ version of events.
We’ve asked Apple for comment and will let you know if it can provide its take on the situation.
The case may be more complicated than it seems at first. Apple has its own dual camera patents, so it’s clearly been exploring the idea. Corephotonics may need to show that Apple couldn’t have developed the iPhone’s dual cameras independently. Also, it may have to demonstrate that negotiations played out as described. There have been more than a few lawsuits where plaintiffs swore they’d informed tech giants about patents — Corephotonics’ detailed account of this is uncommon, but the court will likely want more tangible proof.
The one certainty is that this isn’t a fly-by-night lawsuit. Corephotonics got into dual camera technology relatively early, and it has worked with big-name partners like Samsung Electro-Mechanics and OmniVision. Whatever the truth, Apple can’t brush this off.
Welcome back to Gaming IRL, a monthly segment where several editors talk about what they’ve been playing in their downtime. This month we’ve been loving Super Beat Sports and Stardew Valley and taking an early look at Nintendo’s Animal Crossing mobile game. But first, let Kris Naudus tell you about the scariest dating sim she’s ever played.
This article contains spoilers for ‘Doki Doki Literature Club.’
Doki Doki Literature Club
Kris Naudus Senior Editor, Database
It’s sort of impossible to not have expectations when you start a game. I certainly had preconceived notions when I began playing Doki Doki Literature Club. I’d seen headlines that proclaimed it one of the scariest games of the year, and I certainly knew I was in for something… interesting when one of the opening screens warned that people with depression should not play.
The game looks like another dating sim, with your main character wooing the girl of his choice from among three options: the cheerful best friend (Sayori), the quiet geeky lady (Yuri) and the nasty but secretly nice freshman (Natsuki). Your courtship is conducted by writing poems, angling your word choices toward the girl you hope to end up with. I found the whole thing rather tedious. But when the girls would show me their own works of poetry, the cracks started to show. They were weird. They were unsettling. Clearly we were heading somewhere outside of the normal bounds of otome games.
The more I progressed with sensitive girl Yuri, the more my relationships with the other girls unraveled. Monika, the president of the eponymous literature club, was catty and passive-aggressive. Natsuki hated me. Sayori confessed that she suffered from severe depression. But it was when Sayori revealed her true feelings to me that things fell apart.
I could either tell her I loved her or reject her with an affirming “You’re my best friend.” I became disgusted with the game. I was angered by the obvious emotional blackmail, even if Sayori never actually said, “I will hurt myself if you reject me.” I had already committed myself to choosing Yuri. So I rejected Sayori. And the game did exactly what I expected it to.
Still, I felt awful, and resolved to do “right” by her on my next playthrough.
That’s one of the things we count on with video games. You live, you die, you live again. You can always reload your last save, start from the beginning of the board or even reset and do the whole thing all over again. This is especially important with visual novels and dating sims, where you might want to play again to see all the paths untaken. It’s expected enough that some games now count on it, requiring multiple playthroughs to reach the “true” ending, as the Zero Escape series does, or rewarding you with new story paths and game modes, as Hatoful Boyfriend does.
Doki Doki Literature Club punishes you.
I figured I’d pick Sayori the next time around, if only to see what her story would have been like, to see how things would have been different if only I had just chosen her. But when I loaded the game, she was nowhere to be found. She had been removed from the game.
The choices are always wrong. You’re always going to fail.
So, with that choice removed, I made a play for Natsuki instead. And while I did everything I was supposed to do, I somehow ended up getting scenes with Yuri again and again and again, until… well, things continued to go wrong.
In the end, that’s the real horror of Doki Doki. In visual novels, you’re supposed to make choices and have those decisions matter. Sometimes you’re wrong and you fail, but you try again. Here, the choices are always wrong. You’re always going to fail. The game will emotionally abuse you as long as you continue to play. It will even break down the fourth wall to do it, something that made me scream, even though I knew the entire time it was just a game.
I’m constantly reminded of the ending of WarGames, where “the only winning move is not to play.” And if you never open Doki Doki Literature Club, all of the girls get to live and be happy. Or not. It’s Schrödinger’s cat, but in a file folder.
Opening this box made me feel awful. But it also constantly surprised me. It’s like riding a roller coaster, or watching a jump scare in a horror movie. You feel a terrible shock for a brief moment, and then you find yourself laughing afterwards. Doki Doki didn’t make me laugh, but it subverted my expectations and denied my choices so brazenly, I can’t help but smile a bit.
Rob LeFebvre Contributing Writer
Oh, I do love Stardew Valley on the Nintendo Switch. I’ve just gotten through winter, my least favorite season so far, and my virtual farm is finally shaping up again. I’ve got beanstalks, parsnips and a few other “springtime” seeds in the ground, and I’m watering them daily with my upgraded watering can, which can pour across three different plants at once. I’ve got a full chicken coop with four egg-layers in there and a barn with a couple of cows that just started producing milk. I know pretty much all of the folks in town, including the wizard and that weird ancient mariner who has a magic amulet he refuses to sell to me.
If video games are all about a sense of progression and mastery, Stardew Valley ticks all the boxes. It’s clearly inspired by the Harvest Moon games, though it also has a touch of Animal Crossing thrown in for good measure. You are given a farm by a relative and tasked with meeting the residents, amassing a fortune and (of course) growing crops and raising animals. That’s really not the whole of it, though.
Stardew Valley offers quite a bit of exploration, combat (in the deep mines — I’ve only made it down to level 65) and supernatural mystery to boot, with a haunted community center, the aforementioned wizard and some weird totems scattered around town. There are holiday festivals for each major season change too. Taking it on the go is even better; I’ve whiled away plenty of time harvesting blueberries and fighting off slimes in the mines while waiting for my kid to finish a piano lesson. Overall, Stardew Valley is a charming title with a ton of things to do; you won’t get bored if you enjoy the gentle Zen of growing crops and exploring your little corner of the world.
Super Beat Sports
Timothy J. Seppala Associate Editor
Like many others, I got my first exposure to Harmonix’s work through Guitar Hero 2. But outside of The Beatles: Rock Band, I didn’t spend a ton of time with the studio’s band-simulator franchise. Usually I didn’t have friends around to play it with, and lugging out a plastic drum set for a quick song was always a pain. I’ve loved the studio’s one-off games like Rock Band Unplugged for PSP and Rock Band Blitz for consoles, though, because they took what I loved about the full-on games — awesome licensed music and beat-matching gameplay that was second to none — and stripped away the bulky plastic instruments. Imagine my surprise when I fired up the team’s Nintendo Switch effort Super Beat Sports and discovered it was basically a portable Rock Band in disguise.
I’m talking specifically about the “Whacky Bat” mini-game. On the surface, it looks like a simple batting practice exercise, with adorable pink monsters hurling baseballs at you in time with music. You have to knock them back from whence they came, using audio cues to get the timing right. It all seemed a little familiar, but I couldn’t figure out why. After a few rounds of this, I unlocked “Pro Mode,” which had me facing down multiple monster pitchers across five lanes, swapping between each. That’s when it hit me: This was basically one of the pared-back Rock Band games on my Switch.
The balls are the note gems; each pitcher’s lane is the note highway; and swinging my hockey stick (it makes sense in the game) to the beat, keeping a streak going, is nailing a full combo on a plastic instrument. Of course, there are other mini-games (“Net Ball,” a take on volleyball, and “Gobble Golf” are great as well) and deeper multiplayer offerings, but none of them grabbed me quite like “Whacky Bat.”
Super Mario Odyssey is one of the best games I’ve played in years, sure, but I’d rather experience that at home on my TV with surround sound. If I’m on the go, you can bet I’m playing Super Beat Sports.
Animal Crossing: Pocket Camp
Aaron Souppouris Features Editor
On first impression, I was as entranced by Animal Crossing: Pocket Camp as I was by New Leaf on the 3DS. Just as Fire Emblem Heroes is exactly what I want from a mobile Fire Emblem game, Pocket Camp seemed to be the perfect distillation of what makes the series so special. You shuffle around, solving various animals’ problems (mostly by gathering fruit, bugs or fish), and in return you get materials to add furniture to your campsite and camper. Pick the right objects and animals will come visit your camp, making space for more characters to appear around the game’s small world. It’s a nice loop that works great on mobile.
After a week or so, though, I felt like I was running out of things to do. The NPCs were giving me similar lines of dialogue, and the challenges were all the same. Perhaps that’s by design. A lot of the game seems to hinge on real people — you can make friends with people you know and people you don’t, then team up to complete challenges or wander around their campsites looking at how they’ve chosen to decorate. Because I played the game on a throwaway account, I’ve been unable to add anyone I actually know, and the world Nintendo has crafted began to feel oddly dull and lifeless.
This isn’t really Nintendo’s fault. I jumped through hoops to download Pocket Camp early, essentially lying to my iPhone until it believed I was living in the Sydney Opera House. I’m cautiously optimistic that when the game is released worldwide later this month, I’ll find more to do, because I’ll be playing with friends.
The other lingering question is about the payment structure. Pocket Camp is free to play, and the gifts that Nintendo gives away to new players dry up very quickly. Doing anything after a week seemed to take forever unless I paid to speed things up. Fire Emblem Heroes mostly strikes a good balance here, providing enough hooks for big spenders to keep spending while ensuring that you could choose never to part with real money and still have fun. That equilibrium doesn’t seem to be there for Pocket Camp.
This is definitely Animal Crossing; it’s just not very good right now. But even with these pre-launch issues, I’m still hopeful. The monthly updates to Fire Emblem Heroes over the past eight months have consistently improved it, and if Nintendo pays that much attention to Pocket Camp, it could grow into a great game.
“IRL” is a recurring column in which the Engadget staff run down what they’re buying, using, playing and streaming.
Back in 2015, Facebook introduced the ability to send money to friends through Messenger, and now it has brought that capability to UK users. This is the first time Facebook has launched the feature outside the US.
A number of companies have begun working peer-to-peer payment abilities into their services. Skype lets users in nearly two dozen countries send cash within its mobile app via PayPal, and PayPal has a bot that lets you send money within Slack. In May, the encrypted messaging app Telegram began supporting payments through chatbots, as did Facebook last year. Facebook Messenger also lets you send payments through PayPal and introduced a group payment option earlier this year. Apple is also in on the money-transfer game, allowing iPhone and iPad users to send money within iMessage via Venmo or by telling Siri to send cash via Square Cash, Monzo or PayPal. Additionally, Apple has its own Venmo-like money-transfer service in the works that’s due to be released sometime this fall.
Transfers through Facebook Messenger will work in the UK as they do in the US. Users will need to link a debit card to their account before sending or receiving money. The feature is rolling out to UK users in the next few weeks.