Black Mirror captivates for a variety of reasons. It’s both a dark reflection of our world and, in part, a cautionary tale: if there’s a lapse in our collective technological skepticism, the fantastical stories from the show could become real. CES, the consumer technology trade show, should elicit a similar level of concern, but it does not—despite the fact that this year’s event hosted a number of technologies that seem to jump straight from a Black Mirror script.
At the convention in Las Vegas, we were surrounded by robots and VR simulations and emotion-reading software, and yet attendees carried on as if everything were normal and any change gradual. This was my sixth year at CES, and I’m finally starting to understand why we’re more affected by a TV show than a hands-on demo. CES is a hyper-produced event with outlandish booths that obscure the gravity of the products being introduced. The terror or thrill of a hyper-realistic VR demo or a seemingly aware robot assistant doesn’t always sink in when it’s being handed to you along with a swag bag full of pens, or set up next to a concession stand selling soft pretzels.
The CES experience registers more as a playground crossed with a mall for adults than as a forum to unveil what’s coming to change our world. The exhibition is filled with excess and lacks the restraint that Black Mirror exercises to devastating effect. In a given episode, the characters still walk and talk and act and dress like us. They still live in homes that look like ours and have problems similar to ours. But at CES, a smart speaker is displayed inside a smart home staffed by a robot, and everything is white, like a spaceship. If you look past the ostentation, our AI-produced future is already here: robotic assistants, virtual-reality experiences that extend past traditional gaming, and new platforms built to share pieces of our minds exist now. They show us a future—in some cases, a present—we formerly knew solely from TV. What scares and thrills us in Black Mirror, and the future we see in its stories, is housed under the harsh, unflattering lights of the Las Vegas Convention Center.
Welcome to the darkness. We hope you find it enlightening. https://t.co/eTKVpDxk8n —Black Mirror (@blackmirror), January 10, 2018
Driverless Cars Are for Delivering Pizza
Victor Luckerson: The events of “Crocodile,” the third episode of season 4, are set in motion by two car collisions—one between a reckless driver and a cyclist, and another between an autonomous pizza-delivery vehicle and a pedestrian. The pedestrian’s insurance company uses a mind-reading device to recreate witness accounts of the crash, but the crucial technology at the center of the episode is the pizza-delivery car. It’s a concept we should get comfortable with, because, at CES, Ford and Domino’s demoed a driverless car that could deliver pizza pies to customers who ordered online.
When the car arrives at a hungry customer’s doorstep, they enter the last four digits of their phone number on a tablet mounted by the backseat window, which prompts the car to roll down the window and reveal the cheesy goods. The concept isn’t just a tech conference gimmick; real Domino’s customers used the driverless feature during a pilot program in Ann Arbor, Michigan, last year. “We’ve learned all kinds of things, like which side of the vehicle do people think they have to go to pick their stuff up,” says Jessica Robinson, director of City Solutions for Ford.
Questions about this delivery method abound. How will a customer know which pizza to take if the car is carrying multiple deliveries? Will pizza thieves roam the streets plundering unsuspecting autonomous cars? Will all delivery drivers soon be put out of work? Automakers and restaurant chains seem confident the questions will work themselves out. Toyota and Pizza Hut announced a similar partnership at CES, and Ford announced it wants to shuttle plenty of goods besides pizza via a partnership with Postmates. Here’s hoping these initiatives only lead to shorter waits for pizza and not chillingly rationalized murder sprees.
The Parent Surveillance State
Alyssa Bereznak: Thankfully there were no CES exhibitors foolhardy enough to invent a brain implant like the one in Black Mirror’s “Arkangel” episode that can actively pixelate a child’s view of disturbing imagery. (At least not this year.) That being said, a survey of monitoring technology on the convention floor hinted that a good portion of the “Arkangel” technology exists in disparate bits and pieces.
The Mountain View-based startup Cocoon Cam allows parents to monitor their baby via an app that displays a real-time “breathing graph,” much like the vitals that Rosemarie DeWitt’s character watched to determine whether her daughter was under duress. Separately, an alarmist app called My Safe Map allows family and friends to track each other, rate the safety of locations around them, and send out danger warnings to contacts if something sketchy takes place. (Iterations of this app have existed for years, and despite privacy concerns, they just won’t die.) The company Tabs sells a “Family Locator Kit” that includes a LoRaWAN hub to provide long-range tracking coverage and a watch-like device that can strap onto a child’s wrist and monitor their location and activity.
Beyond the typical baby cam or activity tracker, more and more family-oriented gadgets are being designed to track children in intimate and somewhat covert ways. As I was staring into the eyes of the AvatarMind iPal robot—a Teletubby-esque machine designed with children’s education and elder care in mind—a booth attendant informed me that its eyes were cameras. “Parents can use an app in their mobile phone to remotely monitor their children’s activities,” he said casually. Even when a product isn’t built as a tracking device, the capability is included anyway.
Virtual Reality’s Future Gets Real
McHugh: The future of mainstream virtual reality is murky, but walking into CES, it’s clear that developers are pushing ahead nonetheless. Arguably the biggest hurdle for VR is making it practical for the average consumer; there are still many, many barriers to entry, and for VR to be more than a next-gen gaming technology, a wider, more sophisticated array of experiences needs to be built. But complementary steps are being taken that may create a virtual world that will make VR viable, and perhaps lay the groundwork for a company like the one in Black Mirror’s season four opener, which introduces us to a VR gaming company where players plug in via earbud-like devices that attach to the sides of their heads. They lie back and let their minds do the work inside a different world. (The comatose bodies of the TV series are not unlike the Bodyfriend stations at CES.)

Looxid Labs brought its VR headset to CES, but the product (which you slide your phone into) isn’t the company’s real sell. The device is equipped with eye-tracking cameras and EEG sensors, which aren’t there to create a more immersive virtual world for the user, but to track how the user’s brain is reacting to the game. The headset is fun, sure, but what’s actually happening is that Looxid Labs could create a massive database of user reactions and emotions—data that could be hugely important to other VR developers.
Our minds aren’t the only things we need to measure to create an ideal VR world. Sense Glove, another product on display, could signal the end of handheld controllers for good. It’s a haptic-feedback controller that you wear like a glove, and a necessary step for the future of VR. During demos, Sense Glove reps said the product, which is still just a prototype, would be used to train people like assembly-line workers and manufacturers.
Share What’s In Your Mind
McHugh: ThirdEye is an augmented reality product with a few advancements. It allows you to watch sports through the company’s “X1 Smart Glasses” by turning your head to see different games. (It’s unclear how this would work with sports packages like NFL Sunday Ticket and NBA League Pass.) The glasses would not blend in as regular glasses in any way whatsoever. On a tour around Vegas, I wore the glasses, and information about different buildings popped up. When I looked at a map on a flat piece of paper, different content bubbles suddenly popped out in 3-D, giving me more of an inside look.
ThirdEye’s most popular feature is called “See What I See,” a tool reminiscent of the one in “Crocodile,” which helped an insurance agent get testimony straight from witnesses’ brains. ThirdEye calls this a “P.O.V. AR data communication” tool that allows users to see through each other’s eyes. The practical applications are obvious: Imagine teaching someone how to change a tire from miles away, able to see exactly what they’re doing. This is one of the most popular use cases, Nick Cherukuri, the company’s president and founder, told me when I tried the glasses. Many of the company’s users are in manufacturing, for example, and a technician wearing the glasses in the field can stream what he or she is seeing in real time to someone back at home base, who can view it from their desktop.
The smart glasses also have parallels with “Black Museum,” where a husband transfers his wife’s mind into his own. ThirdEye’s technology is able to learn from what it’s seeing, and compiles information—like a human. “Another cool application is one for people with Alzheimer’s,” Cherukuri told me. “We actually have image recognition so if you look at someone, we can train it through our neural network to recognize the face of that family member, and it will list their information next to their face.”
With these glasses, experiences can be streamed from one person to another, anywhere in the world. Sight isn’t the entirety of an experience, or a mind, but this software offers a new kind of intimacy.
Smart Assistants as Life Coaches
Bereznak: We may never see the day that an online dating service pours considerable time and resources into running the elaborate digital simulation we saw in “Hang the DJ.” (I think when you’ve developed that kind of technology, your company just pivots to something more lucrative, like black-ops military contracts.) But the hockey-puck-sized, voice-activated devices that the main characters use in the simulation are another story. Throughout the episode, our protagonists carry these assistants with them and use them to engage in free-flowing conversations. The assistants are game to complete simple tasks, but also answer philosophical questions about dating with vague platitudes like “Everything happens for a reason.”
When I spoke to Gummi Hafsteinsson, the product management director for the Google Assistant, at CES this week, he said that more casual interactions with voice assistants are becoming increasingly common and expected. Beyond shouting commands at these assistants, which include Amazon’s increasingly ubiquitous Alexa, people just want to talk to them. “Once you promise a conversation, you want to be able to have that conversation,” he told me. “A conversation without personality is something that we, as human beings, just don’t quite understand.” Google has a team of writers dedicated to filling out the personality of its AI assistant, which Google Communications manager Kara Stockton describes as a “hipster librarian.” Not only that, but Hafsteinsson imagines that in the near future, voice-activated helpers will be our constant companions—disembodied buddies that follow us from our homes to our cars to our phones to aid us wherever and whenever they can. “Ultimately we need to get to the point where it feels so natural that you almost forget about it being there, but you can always get to it,” he said.
Our Robot Friends (Eventually, Foes?) Are Here, En Masse
Luckerson: Killer robots can come in all shapes and sizes, even those shapes we typically view as comforting. That’s one reason “Metalhead” was so frightening—the murder-machine that’s chasing down the episode’s protagonist isn’t a Terminator-style humanoid, but rather a quadruped that looks kind of like a robotic dog. Aibo, the cute-and-cuddly robotic pup that Sony unveiled at CES, may be the closest thing we have to this living nightmare.
The device was originally part of the Furby-fueled electronic-pet wave of the late ’90s, but the toy line was discontinued in 2006. The revived Aibo is smarter and more lifelike than ever, featuring touch sensors on its head and chin as well as 22 actuators to simulate lifelike movement. Aibo also uses facial and voice recognition software to identify which humans are its friends (or enemies?), as well as cameras on its nose and back. Sometimes his emotive OLED eyes go blank—but luckily, I haven’t seen them turn red yet.
The Aibo demo at CES struck more tenderness than terror in the hearts of show attendees. The dog waddled across a mock living room, barked in the cutest way possible, and seemed to be mugging for the dozens of iPhones recording his every move. According to a Sony spokesperson I spoke to, Aibo isn’t equipped with any dangerous weapons, for now. But I suspect he’s playing with his accompanying electronic bone, recording his owner’s every move, and biding his time.