By Salma Atallah

Are AIs Slowly Replacing Actors?

Updated: May 30, 2022

Haaave you heard of Ted Lasso? The American TV series set the world of entertainment ablaze after receiving a whopping 20 nominations and 7 awards at the 2021 Emmys. But perhaps the biggest flame this show has ignited has been on the internet, one centred around one of the show's main characters, Roy Kent. Masses online are convinced: he is not real.

Twitter users aren't convinced.

Roy Kent is played by (very real) actor Brett Goldstein, who snatched the 2021 Emmy for Outstanding Supporting Actor in a Comedy Series for his role in the show. Yet fans refuse to relent. It may sound ridiculous to question whether an actor is real – but, looking at snippets and images from the show, we can't help but join one Reddit commenter in wondering whether the character really is “CGI AF”.

Show snippet: S2 E8 - "Man City" Season 2 Official Trailer

In what has come to be dubbed the “Ted Lasso CGI discourse”, some are suggesting this could be a way for Apple and other tech-entertainment giants to experiment with anything from minor alterations to an actor's appearance to replacing the actor entirely with an algorithm. With Roy Kent's perfect facial hair, his seemingly calculated movement, and his sharply defined personality, numerous fans believe that Kent is pulled straight out of a FIFA video game. The fictional character's chant, “He's here, he's there, he's every-f***ing-where, Roy Kent”, only adds fuel to the burning conspiracy. But why would Apple create a CGI character and not disclose it, even to its ever-wondering and suspicious fans? While we don't know for certain, we do know the buzz has certainly made for great (free) marketing.

History of AI in entertainment

Artificial intelligence (AI) has come a long way since 1997, when the program Video Rewrite was first developed to synthesize lip movements from an audio track and apply them onto actors' mouths, making it appear as though they were mouthing words they never actually said (Bregler et al., 1997). In the decades that followed, thanks to advances in technologies such as facial recognition and voice processing, the foundations of the techniques behind modern computer-generated imagery (CGI) were developed and optimized.

Today, AI is playing a bigger and bigger role in how films are made. Its uses range from algorithms that predict a movie's success, to altering an actor's appearance to better fit their character, to replacing the actor altogether. When you hear the term CGI, the first thing that comes to mind is probably an action blockbuster or a video game figure like the Hulk. But as the technology advances, so do techniques that not only imitate or replicate real-life images, but also generate convincing emotions that complement and humanize those graphics.

With the help of AI, creating actors and characters is becoming not only quicker and cheaper, but also more realistic. A great illustration of AI's advanced capabilities is real-time facial re-enactment and animated facial expression – observable in the Snapchat filters that have attracted millions of users. In the world of entertainment, the use of AI for CGI can mean much more than a full face of makeup at the tap of a screen: it enables the creation of an entire character. For instance, using only a single image and AI, Google developer Damien Henry created a 56-minute video.

It's not uncommon for TV shows or movies to include a CGI character – think of the de-aged Princess Leia in Rogue One: A Star Wars Story. More than ever, TV shows and films are blurring the line between CGI and human actors. A popular example of the breakthrough of CGI in contemporary film is the 2009 movie Avatar, which mapped actors' faces onto computer-generated graphics, ushering in a groundbreaking change to the world of media production. A more recent example is Scorsese's The Irishman, which features the most extensive de-aging process ever seen in a feature-length movie.

Still from Avatar (2009)

De Niro's progressive ageing in The Irishman (2019)

Blurred lines

Whether used to enhance an actor's performance, accentuate their features, or replace them altogether, we can agree on one important thing: the increased use of AI in entertainment can blur our sense of what is real and what is not. What once seemed like a distant future is slowly becoming a reality. Just a few weeks back, Mark Zuckerberg announced Facebook's rebrand to Meta, and the company's intention to bring us into the metaverse. In its essence, the metaverse is a digital replica of our world. Zuckerberg 'jokes' (in dystopian Matrix humour) that if you die in the metaverse, you die in real life.

You could easily shrug this off as outlandish and unrealistic – who would actually sacrifice the truth of their being to live a virtual life instead? Well… as a matter of fact, you, me, and most people with a stable internet connection. As of 2020, we spent roughly 61% more time on social media than we did in 2012 – and that figure can only be expected to have risen during the pandemic. We're essentially already thriving in Zuckerberg's metaverse. Some might argue that social media is a different ballgame, that no one would truly give up their reality for a virtual world. To those people, I give the example of Cypher from The Matrix – he quite literally sacrificed his human body and its senses (arguably also his soul) to escape from reality into the virtual world of the Matrix. How does that saying go again? Give a poor man a steak, you feed him for a day. Teach him how to get infinite steaks in a virtual world, you feed him for a lifetime.

Still from The Matrix (1999)

You, I, and AI

Technologies like CGI increasingly challenge us to redefine what it means to be human, and how we distinguish between reality and fiction. Developments in AI have enabled CGI so realistic that it ignited a full-fledged online conspiracy about the realness of Roy Kent. This calls into question not just the future of entertainment, but that of humanity as a whole. Our inability to tell a human from a computer-generated figure – to distinguish between human and machine – raises endless questions. If AI becomes so advanced that we cannot tell a real person from CGI, would we still need human actors at all? Would acting careers perish? More importantly, it poses deeper existential questions worth considering: what implications would this have on how we define ourselves as humans, and what it means to be, well, real?


If Roy Kent really is CGI, this could have a big impact on how we envision and experience our reality – a reality that will exist within the 'metaverse' and the resulting IoT of Bodies.¹ With technological advancements on both ends (one blurring the lines of our current reality, the other allowing us to exist in a different reality altogether), the metaverse doesn't seem so outlandish after all. Educating people about the technologies that surround them, how to identify them, and how to engage with them is a necessary first step in battling paranoia and anxiety about technology.

It is vital for society to understand a technology if it is to accept that technology in the future. When Austria's attempts to introduce genetically modified organisms (GMOs) failed, it was largely due to a lack of citizen participation in the co-production of this socio-technical development, and the technology consequently never earned acceptance from Austrian citizens (Felt, 2015). To prevent this 'innovation resistance', technological solutions must be moulded to fit an accepted socio-technical imaginary (Jasanoff, 2004). For technologies like CGI to continue to thrive – creating an endless stream of dragons and infinite matrices in the process – science communication and transparency are therefore essential (we're looking at you, Apple).

Given how popular the Roy Kent conspiracy theory is getting, many Redditors are opting to investigate first-hand. One user said, “Just watched. Had to search. Roy Kent is definitely CGI. Must have imported him straight from FIFA.” Another argued that it may simply be a case of 'on point' acting, where Kent's personality is intended to be unemotional, tough, and purposefully robotic. So is Roy Kent CGI, and are we entering a period of blurred realities? Or could it be that the world is simply not ready for a man with such good posture? Grab some popcorn, give Ted Lasso a watch, and tell us your opinion!



  1. The IoT of Bodies is an ecosystem made up of internet-connected, software-driven devices (e.g. smart watches, heart monitors, smart baby monitors) that can collect personal health data or even alter the body's functions.


Bregler C, Covell M and Slaney M. (1997). Video Rewrite: Driving Visual Speech with Audio. Interval Research Corporation. ACM SIGGRAPH 97.

Felt U. (2015). Keeping Technologies Out: Sociotechnical imaginaries and the formation of Austrian national technopolitical identity.

Jasanoff S. ed. (2004). States of Knowledge. The Coproduction of Science and Social Order. London: Routledge.


