Video games are one of the most popular forms of entertainment today. Although billions of people seem to be having fun playing them, the question still looms: could they also be bad for us? Much has been said about the impact of video games on players in the past few decades, particularly about their potential negative effects, while their potential positive effects have often been ignored. Video games have been accused of making players violent, isolated, dumb, or addicted. Just like rock’n’roll and comic books before them, video games worry parents and policymakers. But does academic research confirm these worries?
I grew up playing video games with my parents, friends, and other family members, so I wasn’t confronted with this suspicion towards video games until after I got my PhD in psychology (I specialized in cognitive psychology, the study of how humans process information and acquire knowledge). Despite my interest in both cognitive science and games, I never academically studied video games myself. My subjective opinion of video games was (and still is) that they’ve allowed me to explore new worlds, solve puzzles, learn to persevere, and simply have fun. Objectively, though, it’s important to take a step back and scrutinize games in a scientific way. But before I get to the research on video games, it’s important for me to explain my professional relationship with them.
After I graduated in 2004, I left academia and started a career in the private sector, participating in the development of so-called educational toys and games. At this time I came to realize how worried some parents were about video games, which prompted me to delve into the research on games. I then started writing articles and giving talks about the benefits of playing video games and the potential concerns about them, according to academic research. My conclusion back then was that video games, like any games, had cognitive and social benefits, depending on the type of game, and that overall the worries about videogame play were exaggerated. Thus, as long as children played age-appropriate games, video games did not dominate their fun activities, and they got a good night’s sleep, there was no particular reason for concern.
Fast forward to 2021. For the past 13 years, I’ve been working in the videogame industry. I started at Ubisoft HQ (France), working on their Games for Everyone line and teaching game developers how humans learn and what it means for educational games, and more broadly for any game (especially regarding its learning curve, such as game tutorials). I later moved to Ubisoft Montreal, where I worked at their playtest lab, and my focus shifted from educational games to commercial games for adults, such as the Rainbow Six franchise. In 2012, I joined LucasArts to work on Star Wars games (e.g. Star Wars: 1313) and when the studio was shut down in 2013, I moved to Epic Games and became director of user experience (UX) there.
The term “UX” refers to the consideration of the end user (a human) when we design something. It’s a mindset, a philosophy, focusing on improving the production cycles and the design process to offer the best experience possible to the users of a product, system, or service, while minding their best interests. This mindset is relatively new in the game industry, although it blossomed and expanded after the Second World War in many other industries (from industrial design to web design in the 1990s).
Cognitive science is the foundation of UX practice, which is why my background is relevant in this field. Experiencing an object, an environment, a service, a website, or a video game happens in our minds. So if creators and developers want to offer the best experience possible to their customers or users, they need to understand what’s going on in the mind of a human as they are interacting with the product. For example, if we see a door with a handle on it, we are likely to believe that we need to grab the handle and pull the door. If the door actually needs to be pushed, it’s frustrating, because it’s not working the way we anticipated. This is called a “UX fail.” UX practitioners do their best to anticipate humans’ expectations and needs, so that users can interact with objects or systems more intuitively (such as by putting a flat plate on the side of the door that needs to be pushed), and hopefully even enjoy their interaction with them. Despite its impressive performance, the human brain has significant limitations. Having a UX approach is therefore important if we are to offer the best experience possible to humans. UX practitioners do so by having some basic knowledge of how our mental processes work (such as perception, attention, or memory) and by having a “design thinking” process, whereby they start with a prototype that is then tested by users to verify whether it meets their needs and expectations before moving forward in the production development process.
This is what I’ve been doing in the past decade or so in the video game industry: striving for video games to offer a usable and fun experience to players, all players (inclusion and accessibility are key concerns of UX practitioners), as I did on the game Fortnite (more specifically on the “Save the World” mode). In October 2017 I became a freelance consultant and started to publish about UX, video game design, and psychology. Not long afterwards, with the increased popularity of Fortnite among teenagers (in particular its free-to-play “Battle Royale” mode), and the overall booming popularity of video games in the world, concerns around this interactive art form and entertainment seemed to be renewed. This is why I wrote the book The Psychology of Video Games: to explain to a broad audience how games are made, how psychology is used to improve them (i.e. the UX mindset), and to digest the current research on the potential impact of video games on players in a nuanced yet concise and approachable way.
So, back to my original question: does science confirm that we need to be worried? Or could video games actually be good for us?
What are the potential positive impacts of playing video games?
Let’s start with what has sadly been studied much less: whether games can be beneficial to players. Beyond the fact that play, defined as an “activity that is intrinsically motivated, entails active engagement, and results in joyful discovery,” is overall good for cognitive and social development, video games have been found to have several notable benefits. In particular, certain commercial action games have been found to enhance visual attention skills. Other games, such as Tetris, have been shown to improve spatial abilities such as mental rotation. Research on the cognitive benefits of videogame play is still emerging, but there is some evidence that certain games can have positive effects on visual and cognitive skills. Other games have been explored for their potential to foster prosocial behavior.
Video games are also being explored for their potential to engage players for educational purposes. The most notable impact here is admittedly made by teachers and educators who are using certain existing games for education in class (such as Minecraft or SimCity). Nonetheless, some scholars argue that video games can develop a “growth mindset” (Dweck, 2006). This refers to the idea that intelligence is incremental, and thus that hard work and perseverance are what matter. The hypothesis here is that video games could encourage perseverance, which is generally recognized as being important in learning.
One last main area of exploration is whether video games can have a positive impact on health and well-being. While some games are explicitly designed to treat children with attention deficit hyperactivity disorder, playing commercial games like Animal Crossing: New Horizons has also recently been found to be positively associated with affective well-being.
Overall, contrary to the popular belief that video games might have negative effects on health and well-being, the research currently shows that in many cases videogame play yields benefits, although those benefits are not as extraordinary as some video game enthusiasts might believe.
What are the potential negative impacts of playing video games?
One of the oldest concerns explored regarding video games is whether they can cause aggressive behavior in real life. Decades of research in this area have so far mostly yielded heated debates, yet no consensus on the matter. Nonetheless, there is currently no clear evidence allowing us to attribute real-life violence, such as mass shootings, to video games, as the American Psychological Association (APA) pointed out in a 2020 resolution: “Attributing violence to violent video gaming is not scientifically sound and draws attention away from other factors.” Moreover, their potential association with mild aggressive behavior is also highly debated among scholars and within the medical community.
The next concern was whether video games could negatively affect school performance (e.g. lack of attention at school). Here again, the results are highly debated among scholars. Some researchers found an association between videogame play and poorer grades, but others did not. And it’s important to note here that when such an association is found, it doesn’t say anything about a causal relationship. Correlation is not causation. It could very well be that children who perform poorly at school turn to video games instead of doing their homework because they don’t feel competent in class. That being said, it’s obvious that if a child (or an adult) spends so much time playing games that other activities are neglected (such as sleeping, doing school work, having a social life, etc.), it’s never good. Doing a variety of activities (including physical activity) and sleeping are very important for the brain, and even more so for brains in development. Which leads us to the last major concern about video games, becoming even more important today: can video games be “addictive”?
An increasing number of parents are worried about their children being “addicted” to certain video games. Addiction is a delicate and complex topic. It’s a pathology. While there isn’t any straightforward definition of what an addiction is, it’s generally considered the encounter between a person, a context, and an object (product) that causes significant distress to the person, who feels a compulsion to consume the product despite harmful consequences. The object is usually a substance, such as heroin, alcohol, or tobacco, but sometimes it can be a behavior, such as gambling. Gambling disorder is currently the only behavioral addiction recognized by the DSM-5 (the manual used to diagnose mental disorders). But what about the other addictions that people talk about, such as sports addiction, shopping addiction, or video game addiction? It is true that some people in a certain context can develop a pathological relationship with a pleasurable habit, such as playing video games (its prevalence varies between studies, for example from 0.1% to 1% in one study, or 3.1% in another). In this sense, some people do have a pathological relationship with video games, and they need help. Nobody is disputing this fact. However, when the World Health Organization (WHO) announced in 2017 the introduction of a “gaming disorder” in the next International Classification of Diseases, it stirred a lot of debate and controversy among scholars who disagree with the creation of a new disorder related to playing video games. The media psychology divisions of the APA and the Psychological Society of Ireland jointly released a statement disagreeing with the WHO diagnosis, pointing out that “the current research base is not sufficient for this disorder and that this disorder may be more a product of moral panic than good science”. On a side note, the same controversy applies to social media and the moral panic associated with it.
In the video below, Dr Rachel Kowert, research director at the nonprofit Take This, summarizes the state of the research on “gaming disorder.”
To clarify, scholars who disagree with the creation of a new “gaming disorder” claim that when a pathological relationship between a player and a video game emerges, it is best viewed as a coping mechanism for stress and anxiety (the specific context the player is currently living in), or as a way to satisfy the basic psychological needs of competence, autonomy, and relatedness when these are not felt in real life by players. Lastly, scholars are also debating the proposed diagnostic criteria for “gaming disorder,” because being engaged with a game and playing long hours is not enough for someone to be considered a pathological gamer. Clearly distinguishing between passionate gaming and pathological gaming is critical to avoid stigma, and to avoid downplaying the suffering of true addiction.
Overall, the worries around video games seem greatly exaggerated. Video games, as a medium, are neither good nor bad by themselves. Their impact greatly depends on what game we are talking about, how it is consumed by the player, and why. Moreover, social relationships are very important to teenagers, and in today’s world video games are where a lot of social connections happen, especially during a pandemic. But does that mean that video game developers should be washing their hands of any responsibility towards their players? I do not believe so.
Pushing for better ethics in the video game industry
Video games are not designed to be “addictive.” As we saw earlier, an addiction is a pathology that does not depend solely on an object; it also depends on the individual and their current life (context). And video games are not a substance that can disturb the brain’s chemical balance and lead to a physical dependence, as nicotine, alcohol, or heroin can. But that does not mean that some video games should not be under ethical scrutiny. In fact, certain design and monetization practices are used to exploit human brain limitations and biases in order to maximize revenue or play time, at the expense of players’ best interests.
Creating a video game, just like creating a movie, can be extremely expensive. Moreover, players today expect many games to be free to play, which means that studios need to be creative in order to generate revenue. Any monetization system will create friction for the user. After all, spending money is generally not perceived as a good experience. Games that have an amazing advertising campaign influencing enthusiastic gamers to pay for the chance of accessing the game, only for those players to find themselves disappointed because the game is nowhere near the amazing experience promised, are deceptive. With free-to-play games, the advantage is that players can try the game for free. They do not need to take the studio at its word and buy a game on faith that the advertised experience is the one they will have. But this creates new types of friction points for players, and new potential for “dark patterns.”
A dark pattern is a design that is purposely deceptive, with the ultimate goal of benefiting the company at the expense of users. In this sense, false advertising is a dark pattern. But with free-to-play games (and apps), new dark patterns have emerged. For example, they can take the form of pressure put on players to engage with the game for a certain amount of time or on a certain day, otherwise they will miss out or lose something. This is called FOMO (fear of missing out), and it can be used as a dark pattern to influence players to play every day, otherwise they might lose a reward that they care about (this technique, along with others, is often considered part of the “attention economy”). These mechanics were not invented by the videogame industry and are certainly not used only by this industry. Retail has been using FOMO for decades, such as when telling customers that they might miss out on amazing savings if they do not buy on a specific day (e.g. on Black Friday).
There are many other examples of dark patterns or grey areas used today by the game industry. It’s important for UX practitioners to learn to detect them and raise awareness, because using dark patterns fundamentally goes against the UX mindset; it’s when business goals take priority over the user’s best interests. But it’s first of all the responsibility of stakeholders to define company values. In an effort to draft what a code of ethics could look like in the game industry, some colleagues and I have started the ethicalgames.org initiative. This effort is going to take a lot of work, but the hope is that it will help game developers better understand what is at stake, and help gamers and parents be better informed so they can demand accountability.
Playing video games by itself should not be a particular worry for parents or policymakers. However, the videogame industry, just like any industry, should be under strict scrutiny regarding certain practices that can cross ethical lines, especially when minors can be affected. But to identify these lines, it’s important to have a nuanced and evidence-based approach to video games and their impact on players. The issue is that video games (and their makers) are sometimes accused of being purposely designed for addiction, just because they are fun and popular. Surely, this is not enough to question the ethics of a game; otherwise, why not question the ethics of a popular TV show, arguably designed to keep its audience on the edge of their couch and eager to watch the next episode? Just like movies, games are supposed to be engaging. That’s their whole point. They’re also supposed to manipulate people’s emotions, just like music, movies, books, or paintings do. Evaluating the ethics of video games is not an easy task, and the lines are blurry.
The Bottom Line
Over 2.8 billion people have fun playing video games, and the large majority of game creators are passionate about their work. Video games are an art form. They are a rich medium that offers a very diverse pool of experiences, some of which you play alone, others collaboratively, and others in competition with many other people. While the current moral panic around video games is greatly exaggerated, it’s important to point out the flaws of the game industry and to push for better ethical practices overall. The benefits of games are also, overall, exaggerated, yet certain games do have added value in health and education. An increase in funding to explore the positive impact of games could greatly help increase our understanding of them. But above all else, video games are just supposed to be fun.