AI Is Causing Cultural Stagnation, Researchers Find
Generative AI relies on vast amounts of training material, which consists primarily of human-authored content scraped from the Internet.
Scientists are still trying to understand what will happen when these AI models run out of human-made content and have to rely on AI-generated data instead, closing a potentially dangerous loop. Studies have found that AI models begin to degrade when trained on this AI-generated data, which can ultimately turn their neural networks to mush. As AI regurgitates recycled content, it starts producing bland and often distorted output.
There is also the question of what will happen to human culture as AI systems endlessly absorb and produce AI content. While AI executives promise that their models are capable enough to replace creative jobs, what will future models be trained on?
In an intriguing new study published this month in the journal Patterns, an international team of researchers found that a text-to-image generator, when hooked up to an image-to-text system and run in a loop over and over again, eventually converges on “very generic-looking images” that they dubbed “visual elevator music.”
“This finding reveals that even without additional training, AI’s autonomous feedback loops naturally drift toward common attractors,” they wrote. “Human-AI collaboration, rather than entirely autonomous creativity, may be necessary to maintain diversity and surprise in an increasingly machine-generated creative landscape.”
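The dynamic the researchers describe can be illustrated with a toy simulation (my own sketch, not the study’s actual pipeline): model each “image” as a vector of features, and treat one describe-then-regenerate round trip as a lossy step that pulls every image slightly toward the generator’s statistical average. Run the loop enough times and the population’s diversity collapses toward a single generic attractor, with no retraining involved.

```python
import random

# Hypothetical stand-ins for the models in the study: an "image" is a
# list of floats, and a describe->regenerate round trip is modeled as a
# lossy pull toward one fixed "generic" point the generator favors.
GENERIC = [0.5, 0.5, 0.5, 0.5]  # the shared attractor ("visual elevator music")
PULL = 0.3                      # fraction of detail averaged away per round trip

def round_trip(image):
    """One describe-then-regenerate cycle: detail is partially lost
    and replaced by the generator's statistical average."""
    return [(1 - PULL) * x + PULL * g for x, g in zip(image, GENERIC)]

def diversity(images):
    """Mean absolute spread of a set of images around their own mean."""
    n = len(images)
    dims = len(GENERIC)
    means = [sum(img[i] for img in images) / n for i in range(dims)]
    return sum(abs(img[i] - means[i]) for img in images for i in range(dims)) / n

random.seed(0)
# Fifty varied starting images, drawn uniformly at random.
population = [[random.random() for _ in GENERIC] for _ in range(50)]

before = diversity(population)
for _ in range(20):  # twenty autonomous round trips, no new data, no learning
    population = [round_trip(img) for img in population]
after = diversity(population)

print(f"diversity before: {before:.3f}, after 20 loops: {after:.5f}")
```

Because each round trip shrinks every deviation from the attractor by the same factor, the population’s diversity decays geometrically, which is one simple way convergence can happen through repeated use alone.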
As Ahmed Elgammal, a computer science professor at Rutgers University, wrote in an article about the work for The Conversation, it’s further evidence that generative AI may be driving a state of “cultural stagnation.”
The recent study shows that “generative AI systems themselves tend toward homogeneity when used autonomously and repeatedly,” he wrote. “They even suggest that AI systems work this way by default.”
“The convergence happened toward a set of pleasant, stock-photo-like images without any retraining,” he added. “No new data. Nothing learned. The collapse occurred through repeated use alone.”
It’s a particularly troubling predicament given the tidal wave of AI-generated material drowning out human-made content on the Internet. And while AI proponents argue that humans will always be the “final arbiter of creative decisions,” Elgammal notes that algorithms have already begun to float AI-generated content to the top, a homogenization that could significantly hinder creativity.
“The danger is not only that future models may be trained on AI-generated content, but that AI-mediated culture is already filtered in ways that favor the familiar, the describable, and the conventional,” the researcher wrote.
It remains to be seen to what extent existing creative outlets, from photography to theatre, will be affected by the advent of generative AI, or whether they can coexist peacefully.
However, it is a worrying trend that must be addressed. Elgammal argues that to halt this process of cultural stagnation, AI models must be encouraged or incentivized to “deviate from norms.”
“If generative AI aims to enrich culture rather than flatten it, then I believe systems must be designed in ways that resist convergence to statistically average outputs,” he concluded. “The study makes one thing clear: In the absence of these interventions, generative AI will continue to drift toward mediocre and uninspiring content.”
More about generative AI: San Diego Comic Con quietly bans AI art
2026-01-26 16:07:00



