In a recent survey, we explored gamers’ attitudes towards the use of Gen AI in video games and whether those attitudes varied by demographics and gaming motivations. The overwhelmingly negative attitude stood out compared to other surveys we’ve run over the past decade.
In an optional survey (N=1,799) we ran from October through December 2025 alongside the Gamer Motivation Profile, we invited gamers to answer additional questions after they had looked at their profile results. Some of these questions were specifically about attitudes towards Gen AI in video games.
Overall, the attitude towards the use of Gen AI in video games is very negative: 85% of respondents have a below-neutral attitude towards the use of Gen AI in video games, and a heavily skewed 63% selected the most negative response option.
Such a heavily skewed negative response is rare in the many years we’ve conducted survey research among gamers. As a point of comparison, in 2024 Q2-Q4, we collected survey data on attitudes towards a variety of game features. The chart below shows the % negative (i.e., below neutral) responses for each mentioned feature. In that survey, 79% had a negative attitude towards blockchain-based games. This helps anchor where the attitude towards Gen AI currently sits. We’ll come back to the “AI-generated quests/dialogue” feature later in this blog post, since another survey question breaks down attitudes by specific AI use.



Well, what I’m working on is a mod for STALKER Anomaly, and most large models already seem to have good enough awareness of the setting of the STALKER games. I can imagine it’s a much bigger challenge if you’re making your own game set in your own unique world. I still need to insert some minor game information into the prompt, but only a paragraph or so detailing some important game mechanics.
Getting longer-term interactions to work right is actually what I’ve been working on for the last few weeks: implementing long-term memory for game characters, using LLM calls to condense raw events into summaries that can be fed back into future prompts to retain context. The basics of this system were actually already in place, created by the original mod author; I just expanded it into a true, full-on hierarchical memory system with long- and mid-term memories.
But it turns out creating and refining the LLM prompts for memory management is harder than implementing the memory function itself!
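The hierarchical memory scheme described above could be sketched roughly like this. Everything here is an illustrative assumption, not the mod’s actual code: the class name, the buffer limits, and the `summarize` callable (which stands in for the LLM call that condenses raw events into a summary) are all hypothetical.

```python
from typing import Callable, List


class HierarchicalMemory:
    """Hypothetical sketch: raw events condense into mid-term summaries,
    which in turn condense into long-term summaries."""

    def __init__(self, summarize: Callable[[List[str]], str],
                 event_limit: int = 8, mid_term_limit: int = 4):
        # `summarize` stands in for an LLM call; any callable that
        # takes a list of strings and returns one string will do.
        self.summarize = summarize
        self.events: List[str] = []      # raw short-term events
        self.mid_term: List[str] = []    # condensed episode summaries
        self.long_term: List[str] = []   # summaries of summaries
        self.event_limit = event_limit
        self.mid_term_limit = mid_term_limit

    def record(self, event: str) -> None:
        """Log a raw game event; condense whenever a buffer fills."""
        self.events.append(event)
        if len(self.events) >= self.event_limit:
            self.mid_term.append(self.summarize(self.events))
            self.events.clear()
        if len(self.mid_term) >= self.mid_term_limit:
            self.long_term.append(self.summarize(self.mid_term))
            self.mid_term.clear()

    def context(self) -> str:
        """Assemble the memory text to feed into the next prompt,
        oldest (most condensed) material first."""
        return "\n".join(self.long_term + self.mid_term + self.events)
```

As the quote notes, the hard part in practice is not this bookkeeping but writing the summarization prompts so the condensed text actually preserves what future interactions need.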