This is a real-time map of how AI is evolving and devolving humanity's tastes and tolerances.
Each bubble on this map is a documented moment when AI-generated content crossed a threshold or changed a human norm. The bubbles include platform failures, legal reckonings, scandals, and cultural shifts.
Take about 90 seconds with these five pages. They'll make the map make sense.
Every bubble is a real, documented event. Here is how to read what you are looking at.
The vertical position of each bubble shows whether this moment is SLOP (the content itself) or a SPILL (the AI-enabled consequences).
The artifacts. Fake images, AI-written articles, synthetic personas, generated audio and video.
The consequences. Legislation, lawsuits, platform policies, cultural reckonings that AI content triggered.
The color of each bubble tells you what kind of moment it is. Use the filters on the map to focus on what interests you most.
Click a bubble to read what happened, what it tells us, the source, and occasionally an editor's note from Ellie.
Throughout the site, bubbles are also called nodes. Same thing, more precise. You will see it in the filters, the counter, and the admin tools.
Category and source: where it lands and where to read more.
What Happened: the documented fact. No editorializing.
What It Tells Us: why this moment matters beyond the headline.
Impact score: drives the bubble size on the map.
Some bubbles carry a personal note from Ellie. Look for the speech bubble icon on the map.
The amber arc connecting key nodes is the curator's through-line — the spine of the story. Toggle it on or off using ◆ THROUGH-LINE in the top nav.
slop, n. — digital content of low quality that is produced usually in quantity by means of artificial intelligence. Merriam-Webster Word of the Year, 2025 ↗
I started noticing slop in small, innocent ways. A family member shared an image that was… off. Then my social media feed grew increasingly hollow. Wherever I looked, the Internet was flooded with content mimicking humanity but lacking its soul and imperfections.
The Slop Galaxy tracks the moments when AI-generated content crossed a threshold: when it entered a courtroom, a newsroom, a presidential feed, a child's video stream. Some of these moments are about the slop itself: the fake images, articles, or personas. Others are about the spill: the legislation, the lawsuits, the cultural reckonings that follow. Together they form a record of how a technology is rewriting the terms of what we trust, consume, and tolerate.
Maybe you've been watching this closely. Maybe you've been trying not to. Either way, something is shifting, and it's worth understanding what and how, in real time.
This is not a neutral database. It's a curated map with a point of view: that what we tolerate online shapes what we become offline, and that someone should be keeping track.
Ellie Damashek is a strategist and researcher specializing in change management and emerging technology. She has spent her career helping organizations understand and navigate disruption before it becomes a crisis. The Slop Galaxy is her attempt to apply that same instinct to the cultural moment we're all living through.
Enter the admin password to enable editing.
Paste one URL per line. Claude AI will fetch each page, auto-fill the title, description, date, source, and category — then drop them into the galaxy. Review each one before it goes live.
⚠ Requires your Anthropic API key (stored locally in your browser only)
Crawl found 0 new items. Check the ones you want to add.
The galaxy can automatically search for new AI slop stories using the Anthropic API and web search. Set how often to crawl, and it runs silently in the background — adding only high-quality, non-duplicate nodes.
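For the technically curious: a minimal sketch of how a crawl request like the one described above might be assembled. The model id, prompt wording, and node fields here are illustrative assumptions, not the site's actual implementation, which may shape its prompts and schema differently.

```python
import json

# Hypothetical node schema; the live galaxy's fields may differ.
NODE_FIELDS = ["title", "description", "date", "source", "category", "impact"]

def build_crawl_request(existing_titles):
    """Shape a Messages API payload asking Claude to find new slop stories
    while skipping titles already in the galaxy (de-duplication)."""
    prompt = (
        "Search the web for recent, well-documented stories about "
        "AI-generated content. Return a JSON array of objects with the "
        "fields " + ", ".join(NODE_FIELDS) + ". Skip any story whose "
        "title matches one of these existing nodes: "
        + json.dumps(existing_titles)
    )
    return {
        "model": "claude-sonnet-4-5",  # assumed model id
        "max_tokens": 2048,
        "messages": [{"role": "user", "content": prompt}],
    }

request = build_crawl_request(["Example existing node"])
```

The payload above would then be sent via the Anthropic SDK's `messages.create`, and each returned item reviewed before going live, as the admin panel requires.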