This week on The Ringer, we’re hosting the Best Video Game Character Bracket—an expansive competition between the greatest heroes, sidekicks, and villains of the gaming world. And along with delving into some of those iconic figures, we’ll also explore and celebrate the gaming industry as a whole. Welcome to Video Game Week.
Anyone whose video gaming origin story starts during the days of the medium’s intense technological puberty can recall being blown away by a game’s graphics as the state of the art accelerated from 2D to 3D or SD to HD. Like a docent at an art exhibit, my preteen self ushered a series of soon-to-be-awestruck, Dreamcast-deprived friends to the TV to watch Sonic sprint down the side of a skyscraper in the “Speed Highway” stage of Sonic Adventure (1998). I strafed the Death Star in Rogue Squadron II (2001) and mistakenly concluded that small-screen Star Wars space combat could never look better. I unconsciously crouched to dodge digital gunfire as I landed on Omaha Beach in Medal of Honor: Frontline (2002) and, later, tried to proselytize by demoing the mission for my unimpressed parents. (No, wait, watch! Video games are good!)
I’m gorging myself on memberberries, but it’s not just nostalgia that makes my mental catalog of gobsmacking, ape-approaching-the-monolith moments in video game graphics cluster around the turn of the millennium. Thanks to souped-up PCs and the recently released PlayStation 5 and Xbox Series X, current cutting-edge graphics are richer than ever, and future graphics will be better still. But modern games’ graphics are starting from such a beautiful baseline that the apparent pace of their improvement has significantly slowed.
On a 2017 episode of the Kotaku Splitscreen podcast, veteran video game programmer Brett Douville pointed to the late 1990s and early 2000s as a period of “huge differences” in visual fidelity, in contrast to the present’s “incremental” change. Compared to that earlier era’s revolutionary leaps, Douville said, “the apparent speed is a lot lower, because you can’t see the differences. Unless you actually keep up on this stuff and really read a lot, it’s very hard to be able to say, ‘What’s going on that’s different here?’ It looks better, but you wouldn’t be able to point and say, ‘This is why.’”
It’s fortunate, then, that the internet is teeming with technical authorities who have created a cottage industry devoted to scrutinizing video game graphics. It didn’t take a trained eye to distinguish a 16-bit release on the Super NES from a fully 3D world on the N64, or a comparatively primitive PS1 game from a PS2 title. But today’s differences are subtle on the surface, and the console landscape is increasingly complex. Mid-generation hardware updates, multiple flavors of newly released systems, and next-gen optimizations on backward-compatible consoles wring varying visual and technical performances from the “same” game, which may perform differently on an Xbox One, Xbox One X, Xbox Series S, or Xbox Series X. (Microsoft’s “fucking confusing” console names haven’t helped clear things up.) A surge of remasters and remakes (two terms that also sometimes cause confusion), coupled with patches that improve the performance of already-released games, has helped render the reality of how any given console title looks and plays more mutable than before.
The resulting uncertainty has heightened the need for comparative footage from multiple platforms and authoritative takes on which one wore it best. Pandemic-driven upticks in gaming and streaming have only augmented that consumer curiosity, and the accelerating trend toward purchasing games via remote digital downloads rather than retail outlets has strengthened the desire for figurative hands to hold from home. A number of companies and content creators have risen to meet that demand via videos that guide gamers through the bewildering world of postmodern video game graphics. First and foremost among them is Digital Foundry, a foundational site with a five-person staff scattered across the U.K. and Germany that specializes in technical analysis of video games and gaming hardware. But Digital Foundry’s success has helped inspire a smattering of popular YouTubers to pursue similar approaches, and the 800-pound gorillas of the mainstream gaming media have started to take notice and establish their own beachheads within this thriving online ecosystem.
“I like to think of us as sort of a universal translator, where we’re trying to present relatively complex topics and distill them down to something that’s actually enjoyable to read by a fairly large audience,” says John Linneman, a Germany-based writer and video producer for Digital Foundry. For Linneman, who grew up in Cincinnati and got into gaming in the late ’80s, the spark that lit a lifelong longing for graphical euphoria was Daytona USA, the arcade racing game designed as a showcase for Sega’s superpowered system board, the Model 2. The racer was released worldwide in 1994, and Linneman was entranced by the spectacle of a 3D game running at approximately 60 frames per second, complete with texture filtering techniques that weren’t yet achievable on at-home hardware.
“It was this magical moment where I just couldn’t believe what I was seeing,” Linneman says. “And this is something that’s hard to appreciate today, I think, for younger players, is that leap that you would get. … I miss seeing that kind of stuff. I loved experiencing something that you just couldn’t even believe was real at the time. I always chase that high, if you will. But it doesn’t come often.” Linneman, a former IT professional with a background in computer science, notes that developers are still capable of harnessing new hardware to make “major, remarkable achievements. … But to the average person, it’s not going to be as obvious, I think. Because everything kind of looks good.”
Digital Foundry was formed in 2004 by Richard Leadbetter and Gary Harrod, two longtime print journalists who marketed their editorial and production skills (and, later, hybrid DVD technology) to clients in the games industry. Leadbetter began blogging about games based on the videos he’d captured with his custom tech, and Eurogamer began hosting his (and the rest of the staff’s) written work in 2007. Ten years later, Microsoft unveiled the hardware that would become the Xbox One X exclusively via Digital Foundry, a nod to the site’s standing among tech heads. But the brand’s biggest draw is its 3,000-plus-video YouTube channel, which boasts more than a million subscribers and close to half a billion views. (DF also maintains a presence on Patreon, where users can sign up to access an archive of 4K video in all of its non-YouTube-compressed glory.)
Although the YouTube channel was created in 2008—kicking off with a side-by-side, 720p look at Call of Duty: World at War running on Xbox 360 and PS3—it wasn’t until 2015 (two years after Linneman left his IT job and joined DF) that the company began gravitating toward narrated and produced videos instead of wordless streams of tech specs. “There’s a ton of sites that review the games,” Linneman says. “I like to review the technology.”
By the time Digital Foundry pivoted to produced videos, a number of alternative channels, including NX Gamer, VG Tech, Gamersyde, Gaming Bolt, ElAnalistaDeBits, and Candyland, had begun to offer their own visual comparisons or narrated, tech-oriented breakdowns. NX Gamer is the creation of almost-50-year-old U.K. programmer Michael Thompson, who started his channel in late 2013. Like Linneman, he remembers what got him hooked on exploring under the hood of his hobby, though the scales fell from his eyes (and ears) even earlier. For Thompson, the gateway was the Sound Interface Device on the Commodore 64, which enabled games such as 1984 classic Impossible Mission to feature immersive music and sound effects, including digitized speech. “I was blown away by the fact that I had this little gray box, this beige bread bin in my house, that … would speak to me,” Thompson says, adding, “I just couldn’t believe that I could plug this into my TV and I had this other world going on, and I just wanted to understand it.”
Thompson began programming games at age 7 or 8, and he’s worked in engineering and computer science for the past 30 years. In 2013, a lighter workload at his day job left him time to start NX Gamer as a side project that he hoped would help inform less tech-savvy gamers. His commitment to the channel grew in response to the controversy surrounding Ubisoft’s 2014 open-world Watch Dogs, which (like a lot of hyped games before it) had suffered a graphical downgrade between its 2012 E3 reveal and its release two years later. Frustrated by the misinformation he’d come across online, Thompson tried to correct the record via video. Positive feedback flooded in, convincing him to keep publishing. “I all of a sudden thought, ‘Well, I know a lot of this stuff, because I’ve been doing it for years,’” he recalls.
Thompson says viewers seek out his videos for four main reasons. Some come for his educational and lightly comedic commentary, which he tries to keep spoiler-free. Others tune in to reassure themselves that a game looks good before buying it for their system of choice, or to decide which version to invest in if more than one option is available to them. “And then,” he says, “you probably get that smaller portion, but certainly the most vocal, which are just here to take any ammunition they can to say, ‘See, I knew it. Xbox is better than PlayStation. He’s shown it on that video.’”
Console-based bickering is the bane of the graphics gurus’ existence. Linneman, Thompson, and others in the field insist that they’re unbiased when it comes to Microsoft-Sony-Nintendo divides. “I really try hard to not stoke those flames, because I don’t think it’s interesting or useful,” Linneman says. Yet supporters from each side co-opt their productions to denigrate their rivals. Others accuse them of antipathy or favoritism, just as some sports fans contend that particular prospect evaluators hate their team. (Digital Foundry does accept sponsorships from gaming companies for some of its videos, though it says it maintains complete editorial control.) The committed console warriors who’ve formed tribal bonds with corporations or platforms “will never go away, and the best we can do is not give them visibility and try to foster a healthy community,” says Rafael Martín, a programmer and aspiring game developer from Madrid who operates YouTube channel ElAnalistaDeBits, which has amassed more than 430,000 subscribers and nearly 300 million views since he started it in 2012.
Martín encounters console warriors despite the fact that he doesn’t do commentary, which he worries would fragment his international audience. The 29-year-old native Spanish speaker makes all of his income from his videos, in which he wordlessly compares frame rates, resolutions, loading times, and other technical qualities by presenting synced footage from multiple versions of the same game in a split-screen format. His channel is comparable to Candyland, which has almost half a million subscribers and was started in 2014 by Stefan Seiler, a former employee of Webedia, the entertainment and media company that still owns it.
Rafael Selzer, the Munich-based man who recently succeeded Seiler at Candyland’s controls, says that Seiler “wanted to give people a platform where they can just plainly see what’s the difference.” The name of the channel, which consists of synced, stat-free footage without commentary, is a reference to the eye candy on display, as well as the kid-in-a-candy-store feeling it’s intended to evoke. “It’s not our intention, or certainly it isn’t mine, to put oil into the fire of the console wars,” Selzer says. “Because what we do is just the comparison without the judgment, and people are there to judge by themselves. If they feel that their system is the best, I guess they will always find reasons for that, no matter what it looks like.”
The irony of the continued console warring is that the consoles themselves sport increasingly similar specs. “More than ever, they’re almost the same,” says Linneman, who notes that the PlayStation 5 and the Series X/S run on processors and graphics cards made by the same manufacturer, AMD. Thompson concurs, saying, “I think it’s the smallest gap we’ve seen. It’s smaller than last generation between the PS4 and Xbox One, which generally wasn’t that big.”
In the SD or early-HD olden days, certain systems had obvious strengths and weaknesses that stemmed from technological limitations. “You had to pick and choose when designing the silicon, and that obviously greatly impacted what was possible on the platform,” Linneman says, drawing a contrast between those idiosyncratic systems and today’s “very standardized gear that can do kind of anything.”
Variations in resolution were once more important too. “Back then, resolution mattered a whole lot because you were dealing with far fewer pixels,” Linneman explains, adding, “You had displays that were poor at scaling. You had very basic anti-aliasing techniques that were often somewhat ineffective. When you combine all that together, suddenly the pixel difference between two versions of a game [could] make a huge difference.” Some modern games run in native 4K while others fall slightly short of that standard, but “it’s so difficult to tell during normal play from a normal viewing distance that it doesn’t matter.” (For me, the visual advance from 1080p to 4K was welcome but less memorable than the eye-opening moment in 2000 when I switched my Dreamcast’s cables from composite to S-video.)
Linneman likes doing platform comparisons, “but only when the platforms are actually really different. … Just having the wonder box that can do everything isn’t nearly as interesting.” The four-year-old Switch, whose sales haven’t suffered from Nintendo’s gameplay-over-graphics philosophy, often outputs drastically downgraded visuals when it goes head-to-head with its higher-powered competitors, though it’s impressive that some studios pull off ports at all. But where Microsoft and Sony are concerned, Linneman would “rather talk about the technology of the game than, ‘OK, which platform’s better?’ Because it’s not that important anymore.” Like Japanese holdouts after World War II, the flame warriors are continuing a conflict that the combatants themselves have largely abandoned.
The way of the graphics guru requires not only patience, finesse, and technical know-how, but a lot of fancy hardware: All of the consoles, of course, plus capture equipment and an oft-upgraded, high-end PC with enough RAM to wrangle 4K video and hard-drive space to store it. (4K is a killer: When he was working with 1080p video, Thompson says, his computer could almost render it in real time, whereas 4K video takes close to 10 times as long to render as it does to watch, forcing him to chart out his week’s workflow in advance.)
Linneman describes his setup: a suite of consoles connected to an AV receiver with dual outputs, one tethered to his TV and the other to an external capture device, which itself splits the signal to various capture cards. There’s more splitting, upscaling, and line doubling involved when he works with old-school consoles. “I have like four different capture solutions ready to go at any point,” he says. (“It’s way more complicated in terms of the way it’s wired up than it sounds,” he concludes, though it sounds pretty complicated.) Those who work for big companies get gear and games for free, but for others, the accoutrements come at considerable expense: Martín and Thompson say they purchase 70-80 percent of the games they cover (although Sony sent Thompson a PS5).
Linneman’s bio brags that he’s nicknamed “The Human FRAPS,” a reference to the venerable screen recording program that displays frame rates in real time. But Linneman doesn’t rely on his eyes to assess how smoothly and steadily software runs. Like his DF colleagues, as well as Thompson and Martín, he relies on proprietary, custom-designed software that ingests his stored footage and spits out the desired stats, which he embeds in his videos during post-production. Others in the graphics game use open-source software to do the same thing, although the proliferation of less-experienced analysts can lead to less-accurate info at times.
There's a real issue lately with people using open source frame-rate analysis tools to produce inaccurate results. Here's an example. When the character isn't moving, the tool loses track of frame updates suggesting a low frame-rate. It's wrong. https://t.co/9Va0wRJjbb — John Linneman (@dark1x) January 25, 2021
The rise in channels posting this stuff is deeply concerning as this misinformation only builds confusion and causes people to lash out. We spend a LOT of time ensuring accuracy but I can also vouch for @VGTech_ and @N_X_G (and Gamersyde). They are accurate. Others? Be cautious. — John Linneman (@dark1x) January 25, 2021
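The failure mode Linneman flags is easier to see with a toy model. Frame-rate analysis tools generally work by counting how often consecutive captured frames actually change: a 60 fps capture of a 30 fps game shows each rendered frame twice. The sketch below is a hypothetical, simplified illustration of that counting approach (not Digital Foundry's proprietary software or any real tool), including the static-scene case where identical frames make a naive counter report an absurdly low rate.

```python
import random

def estimate_fps(frames, capture_fps=60):
    """Estimate a game's frame rate from captured video by counting
    frames that differ from their predecessor. Naive by design: a
    paused or motionless scene produces identical consecutive frames,
    which this counter misreads as a near-zero frame rate."""
    unique = 1  # the first captured frame always counts
    for prev, cur in zip(frames, frames[1:]):
        if cur != prev:
            unique += 1
    duration_seconds = len(frames) / capture_fps
    return unique / duration_seconds

# Simulate one second of 60fps capture of a game rendering at 30fps:
# each rendered frame appears twice in the captured stream.
random.seed(0)
rendered = [bytes(random.randrange(256) for _ in range(16)) for _ in range(30)]
capture = [frame for frame in rendered for _ in range(2)]
print(estimate_fps(capture))  # → 30.0

# The "character isn't moving" case: 60 identical captured frames.
static = [rendered[0]] * 60
print(estimate_fps(static))   # → 1.0, wrongly suggesting a 1fps game
```

Real tools guard against this with sub-pixel difference thresholds and by ignoring stretches where the whole scene is static, which is exactly the safeguard the inaccurate channels skip.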
Thompson says he tries to spend at least five hours playing a given game before rendering a verdict on its visuals, though he prefers to finish games if he has time. Much more than 10,000 hours of combined programming, playing, and editing have given the veteran gurus an eye for imperfections and a sense of the most arresting or representative scenes, but outside of cutscenes, capturing footage can be a painstaking process. Linneman experimented with a dongle that theoretically allowed one player to control the PS4 and Xbox One with the same controller, but most games would go out of sync. Thus, the videographers line up parallel shots the old-fashioned way: recording gameplay footage from one console and then immediately trying to replicate the same sequence of movements in precisely the same place in a version of the game running on a second console. When they aren’t playing or editing, they’re often trying to learn more about what they’re watching by reading research, watching presentations, and talking to developers who are willing to illuminate the tricks of their trade.
Digital Foundry still tries to serve the segment of its audience that’s there for the facts and the frame rates. But between the near-parity among contemporary consoles and the industry’s diminishing returns in the quest for more lifelike looks and sounds, the company has gradually “shifted from what the details of things like resolution and frame rate are, to more [of the] how and why,” says Linneman, who relishes asking developers to divulge details of what they do. He’s also become more inclined to retreat to the past. In 2016, he started Digital Foundry’s popular DF Retro series, which allows him to investigate the graphics of games from eras when the visual leaps and disparities were still large, and oppressive hardware restrictions compelled programmers to develop ingenious workarounds.
Until 2020, Thompson (who has a little more than 50,000 subscribers) had typically played for fairly small crowds. But in a sign of the swelling appetite for technical breakdowns, he started freelancing last month for gaming-media behemoth IGN. Destin Legarie, IGN’s director of video content strategy, hired him to contribute to the site’s YouTube channel, which is approaching 15 million subscribers. (At IGN, Thompson is averaging a video a week, in addition to producing one every week or two for his own channel.)
Last November, Legarie started an in-depth, Digital Foundry–esque “Performance Review” series, which Thompson has continued while Legarie is on paternity leave. “Seeing that games became closer and closer across consoles, I wanted to find a way to get a more refined and detailed look at what’s going on with each,” Legarie explains via Twitter direct message. The response from IGN’s massive audience, he says, has been “overwhelmingly positive,” and Thompson’s first video for the site has garnered nearly 400,000 views, easily surpassing all but one of the 500-plus videos on his home turf.
The relationships among the most prominent graphics gurus seem mostly harmonious, even as they threaten to encroach on each other’s turf. “There’s no rivalry,” reports Thompson, who says he consulted with Leadbetter when the latter was moving DF’s videos in a more produced direction. “Digital Foundry is, for me, the benchmark in the sector,” Martín says via email, and Legarie, who says IGN checked its frame-rate findings against Digital Foundry’s when developing its internal tools, adds that “Digital Foundry were a big influence on my desire to create something similar but unique on IGN.” Linneman doesn’t mind that Digital Foundry’s style of analysis has spawned so many imitators. “It kind of shows that it was a good idea in the first place,” he says, adding, “There’s hundreds of thousands of people just doing normal reviews on YouTube, so why not have people do this as well?”
Some of the most prominent channels see their approaches to covering video game visuals as complementary. “There are other people who answer the deeper questions, and we are there to answer the simple ones,” says Selzer, who describes Candyland as “a service from gamers, for gamers” that helps players balance their hunger for the latest and greatest graphics against the constraints of finite time and budgets. “One way we plan to stand out,” says Legarie, “is to give developers a chance to comment on any findings (good or bad) and share with us why they made those decisions during development.” And although there’s bound to be overlap on big games, there’s room on the release schedule for creators to coexist. As Linneman says, “There’s so many games that could be covered, and we can’t cover them all.”
Experience has given the gurus a finely tuned feel for when graphics go wrong. But it’s also deepened their respect for the wonders on display. Anyone can tell that Horizon Zero Dawn’s grass looks good, but learning that it looks that way because the scenery is procedurally generated unlocks a different dimension of appreciation. Although Thompson prides himself on understanding how the sausage is made, he says that “there are techniques sometimes where I’m thinking, ‘How on earth did they do that?’”
Linneman, who feels like he’s dissecting the creations of “absolute geniuses,” says, “to me it’s amazing just thinking that some of these games can ship at all, especially at the complexity that we’re seeing now in these big Triple-A games. The hardware is no longer really the limiting point. It’s more just the sheer volume of people required to create some of this stuff.” The dense detail packed into spaces that were once bland and boxy means more labor for the builders of virtual worlds. “Hopefully people can appreciate that and stop and smell the scenery,” Linneman says.
As the recent Cyberpunk 2077 debacle demonstrated, that complexity can come at steep costs: months or years of crunch, culminating in a game that’s still unready for release. To get games done, developers cut any corners they can. “It’s all smoke and mirrors,” Thompson says, admiringly. “Everything’s a trick, a cheat, a cheap technique to get around … the cost or the expense of rendering something.” Developers have gotten better at faking effects—overlaying an animated texture map on a watery surface that resembles reflected light, say, instead of simulating a “real” reflection. But they may not have to fake it for long.
Even though the gurus acknowledge that the most stunning graphical gains are behind us—at least until we’re all living in VR—they’re still excited about the next steps. The rise of real-time ray tracing is transforming lighting and shadows from “good for a game” to photorealistic. Faster CPUs are raising frame rates, and SSDs are slashing loading times. Thompson also forecasts coming improvements in animation quality and the size, interactivity, and destructibility of environments. Those marvels will come to consoles a little later in the lifespans of the PS5 and Xbox Series X/S. “I think it is early to talk about differences between these two consoles, since we do not have a 100-percent-next-gen, multiplatform title to draw conclusions,” Martín says. “Time will settle these differences with the arrival of new engines like Unreal Engine 5.”
Increasingly, artificial intelligence is being brought to bear on both old and new games; an almost magical algorithmic technique called deep learning super sampling allows for higher resolutions with lower computational costs. The AI is arriving just in time to lighten the load on developers, because the bar for video game graphics is getting so high that, as Linneman says, “It’s almost reaching the point where it’s borderline too much for humans to do.” And, for that matter, for most humans to comprehend. That’s why the graphics gurus hope you’ll click “like” and “subscribe.”