Salma Atallah and techQualia Team

Idiocracy: Is AI making us smartly dumber?

Who’s smarter: the TikTok algorithm, or the Gen-Zer manipulating it to maximise engagement on their videos? Our intelligence as humans is what sets us so distinctly apart from other species. Today, technology has radically shifted how we conceptualise that intelligence. The use of terms like “artificial intelligence” and “smart technology” alludes to the existence of a type of intelligence that can not only replicate our own, but surpass, and even diminish, it.



Not only has the proliferation of technology shifted our understanding of what intelligence denotes; it has also caused shifts in the social fabric of society. The digital divide ensures that anyone unable or unwilling to integrate these technologies into their everyday lives is effectively sidelined. Given how exponentially technology has advanced over the past few decades, it’s no surprise that an astounding intergenerational gap in tech know-how exists between those born into these technologies and those who merely grew up around them (think of your Boomer parents here).


In this article, I outline agreed-upon definitions of intelligence before exploring whether the intergenerational gap in the use of technology has an impact on our intellectual capabilities. I then engage with the plot of Idiocracy to show the importance of critically assessing our reliance on technology. A dark(ish) comedy, the movie depicts a dystopian future where our increased dependence on technology makes us, well, astoundingly stupid. Is this our fate? Is the technology we created surpassing us in intelligence? Or is it freeing up time that would otherwise be spent on boring, monotonous tasks - time that could be better spent on advancing our intelligence? Let’s dive in!


Intelligence


Our heightened reliance on technology comes with the promise that it will make us smarter. Technological advancement and implementation have become part of the never-ending quest to become more intelligent, productive, and efficient.


But the task of defining that intelligence is no easy one - how do we qualify it? For centuries, scholars across an interdisciplinary spectrum of fields have dedicated themselves to this question. While there’s no one right way to qualify it, ‘intelligence’ has a few agreed-upon characteristics. The word itself comes from the Latin intelligere - to understand. A person’s ability to comprehend new information, act on that newfound knowledge, and learn from the outcomes is considered a key feature of intelligence.


So, if we qualify intelligence by a person’s ability to learn and act accordingly, how do we then quantify that ability? The most widely agreed-upon method of measuring intelligence is through standardised tests such as IQ tests. These tests, adjusted for age, approach intelligence as a gradient - someone might possess the ability to learn and act in new ways, but just how well are they able to do that? Through metrics like IQ, an individual’s intellectual capabilities are defined and labelled.
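To make “adjusted for age” a little more concrete, here is a minimal sketch of how an IQ-style score is typically derived: a raw test score is standardised against an age-matched reference group and rescaled to a mean of 100 and a standard deviation of 15. The reference values and the function name below are purely illustrative; real tests use far more elaborate norming procedures.

```python
# A rough illustration of IQ-style scoring: standardise a raw score against an
# age-matched reference group, then rescale to mean 100, SD 15.
# The reference numbers are made up for the example.

def iq_style_score(raw_score: float, ref_mean: float, ref_sd: float) -> float:
    """Convert a raw test score into an IQ-style score (mean 100, SD 15)."""
    z = (raw_score - ref_mean) / ref_sd   # distance from the age group's average
    return 100 + 15 * z

# Example: a raw score of 34 in an age group averaging 30 with a SD of 5
print(iq_style_score(34, ref_mean=30, ref_sd=5))  # -> 112.0
```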


But, in light of the emergence of widely accessible computing technologies that automate much of the process of learning and doing, do these metrics still have weight? If we rely on technology to learn on our behalf, what could this say about our own intelligence?


Intergenerational gap


Compare how you spend your weekend with how your parents spent theirs at your age, and you will find their free time was almost entirely tech-less. The average day off for a teenager in the 70s meant doing all kinds of ‘offline’ things, like going to the cinema or the record store. Today, young people are almost as embedded in technology as tracking cookies are embedded in Instagram. People belonging to the ‘Boomer’ and ‘Gen X’ generations are significantly less tech-savvy than the younger ‘Millennials’ or ‘Gen Z’ers. But does this reliance on technology take away from the younger generations’ intelligence?


Let’s explore with an example. My ability to analyse data in a software program is not a reflection of my mental abilities or IQ, but rather of the enhanced capabilities of the computer programs I use. My father (a solid Boomer), on the other hand, understands what a linear regression is, how it is calculated, and its mathematical application. I (unproudly) admit my lack of knowledge of the logic and mathematics behind linear regressions. However, using an abundance of data and advanced computing technologies, I’m able to generate a graph demonstrating an elegant linear regression model that is just as proficient as - if not more proficient than - my father’s (and in record time, too).
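To give a flavour of what my side of that exchange looks like in practice, here is a minimal sketch using Python and scikit-learn. The library choice and the synthetic data are my own assumptions for illustration; the article doesn’t name the actual software.

```python
# A minimal sketch of the "let the machine do it" approach: fit a linear
# regression with a library call. scikit-learn is one possible choice; the
# data below is synthetic.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(42)
x = rng.uniform(0, 10, size=200).reshape(-1, 1)    # one explanatory variable
y = 2.5 * x.ravel() + 1.0 + rng.normal(0, 1, 200)  # noisy linear relationship

model = LinearRegression().fit(x, y)               # the library does all the maths
print(f"slope ≈ {model.coef_[0]:.2f}, intercept ≈ {model.intercept_:.2f}")
```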


To an observer, I might certainly seem ‘intelligent’. But this efficiency can mainly be attributed to the machine learning (ML) program I used to help me calculate the regression. Although it would take my father more time, he would do the calculations himself. This raises the question of whether we are witnessing an increase in intelligence for both the human and the machine, or just for the latter. It also highlights the difference in intergenerational understandings of, and reliance on, technology, and the impact this might have on the future of ‘human intelligence’.
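For comparison, the calculation my father would work through by hand is ordinary least squares, which for a single variable reduces to two closed-form formulas. A sketch of that arithmetic, reusing the same synthetic data as the example above:

```python
# The same fit done "by hand": for one variable, ordinary least squares reduces
# to closed-form formulas for the slope and intercept.
import numpy as np

rng = np.random.default_rng(42)                    # same synthetic data as above
x = rng.uniform(0, 10, size=200)
y = 2.5 * x + 1.0 + rng.normal(0, 1, 200)

x_mean, y_mean = x.mean(), y.mean()
slope = np.sum((x - x_mean) * (y - y_mean)) / np.sum((x - x_mean) ** 2)
intercept = y_mean - slope * x_mean
print(f"slope ≈ {slope:.2f}, intercept ≈ {intercept:.2f}")  # matches the library fit
```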


Let’s turn to our movie to help make sense of the questions and consequences that can arise from our reliance on technology. Although the movie is dystopian, I find that its exaggerated depiction compels us to reflect on our relationship to technology. Idiocracy is an all-time favourite movie of mine, and it just happens to be the perfect film through which to convey my thoughts.


Movie: Idiocracy (2006)

Source: Twentieth Century Fox Film Corporation

The film Idiocracy offers an exemplary illustration of a futuristic, idiotic, and uncivilised version of our world resulting from our massive reliance on technology. Set in America in 2005, the story begins with Private Joe Bowers, presented as ‘average in every way’, being ordered to take part in a secret military hibernation experiment. Alongside a woman named Rita, Joe enters an induced hibernation meant to last one full year.


But the base shuts down shortly thereafter and the slumbering pair are forgotten about for five hundred years. When the duo awakens, the year is 2505, and the planet is ruled by ‘idiocracy’ - a “disparaging term for a society run by or made up of idiots (or people perceived as such)”[1]. In this idiocracy, our (not so) average Joe is considered the smartest living man on the planet.


The movie imagines humans becoming so terrifically stupid that the entire population is profoundly anti-intellectual, named after corporate brands, and speaks a limited, sloppy register of English. Water has been completely replaced with a sports drink called Brawndo, resulting in the death of crops and a nationwide food shortage. Technology has continued to advance in this dystopian world, but human intelligence has not kept pace with it; in fact, the two appear to be mutually exclusive. One goes up, and the other comes down.


Through its humorous depiction of this idiotic society, the movie delivers a satirical warning to its audience: we are all slowly getting dumber, and our extreme dependence on technology is partially to blame.


It’s worth noting that this movie was released in 2006 - almost two decades ago. A lot has changed since - namely, our reliance on technology has increased exponentially. The future depicted is frightening because it isn’t far-fetched: are we getting dumber? Is our over-reliance on technology slowly eroding our cognitive abilities the way it did for the people of Idiocracy?


So, are we getting dumber?


The answers to these questions are not straightforward. In fact, there is no one decisive answer. The correlation between technological advancement and human intelligence cannot be reduced to a simple cause-and-effect relationship. Unlike in the fictional world of Idiocracy, the growth of one type of (artificial) intelligence does not necessarily stifle the presence of the other.


Here’s what we do know: research has shown that the average IQ of humans across the world has seen a steady increase over the past 100 years (the so-called Flynn effect), with no signs of significant decline. Researchers have (unsurprisingly) found a number of diverse factors influencing that increase, ranging from environmental factors such as family structure and access to education, to biological factors such as nutrition and blood lead levels. Amongst that long list, exposure to technology has also been identified as contributing to the rise in average IQ.


Remember how we said our intelligence is measured through our ability to learn and adapt? Well, in order to do so, we need access to basic resources that (as it turns out) modern technology has made more accessible than ever before: access to quality education (think about how technology redefined learning during the pandemic), comprehensive healthcare (think bio-tech), nutrition (think GMOs), and the list goes on. By automating and streamlining mundane, labour-intensive tasks, technology has actually facilitated our ability to take on more intellectual endeavours.


Idiocracy should be seen more as a cautionary tale than a prediction of what’s to come. Any fears surrounding our loss of intelligence are, at this point in time, speculative. The film unironically lends itself to interesting debate surrounding the future of human reliance on technology. We must cautiously walk the line between reliance and dependence; innovation and stagnation; creativity and indifference. Next time you’re having lunch with a friend and pull out your phone’s calculator to work out how much you owe, stop and think about what’s at stake: do you really need that calculator to add 16 and 85? Do you?


