Podcasting is a unique space where people can share their voices, ideas, and stories freely. Unlike platforms controlled by a single company (like YouTube or Instagram), podcasting supports true freedom of expression. However, this openness is now being threatened by AI tools, such as Notebook LM, which make it easy to produce fake, low-quality podcasts. Unfortunately, many of these AI-generated shows are created by spammers, scammers, or blackhat SEOs, and they are harming both listeners and genuine podcast creators.
At Listen Notes, the leading podcast search engine and podcast API, we believe that creating a quality podcast takes real effort. Listeners can tell when a show has been crafted with care, and that’s why we are committed to stopping the spread of fake, AI-generated podcasts on our platform.
Enter Notebook LM
Notebook LM is a new tool built by a small team at Google. It's seen as a very "un-Googley" product: instead of aiming for mass appeal with billion-dollar revenue potential, Notebook LM serves a niche audience. It also breaks away from Google's typical approach of perfecting a product before launch, opting instead for a faster, more experimental release.
Notebook LM allows users to upload documents or web links and turns them into a podcast, featuring two AI-generated hosts discussing the content. Many people appreciate it as a fresh tech tool that boosts personal productivity, particularly for researching topics while multitasking. However, Notebook LM has also made it easier for spammers to create fake podcasts, cluttering the podcasting space with low-quality, automated content that is not intended for genuine human listeners.
Flooding the Podcast World with Junk
Notebook LM is contributing to the influx of low-quality content in the podcasting space. AI-generated content is already becoming prevalent across the internet, and Google, as a search engine, has been actively trying to counter this by penalizing websites with low-quality, AI-made content—though not always with complete success (for example, nearly all Google image results for "baby peacock" are AI-generated). Ironically, Google's own tool, Notebook LM, has made it easier than ever to create AI-generated podcasts, which is impacting the overall quality of the medium.
The team behind Notebook LM has claimed they are working on a tool to detect AI-generated audio, but as of now it remains unavailable to the public, even after more than a week of back-and-forth emails with their product manager. At Listen Notes, we couldn’t wait any longer, so we took matters into our own hands: on a Friday afternoon we built a simple tool, NotebookLM Detector, to identify and remove fake podcasts. Just last weekend, we deleted over 500 fake shows created with Notebook LM, which gives a sense of the scale of the problem. You can see a curated list of such fake podcasts generated by Notebook LM here.
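For readers curious about what this kind of detection can look like in practice, here is a minimal, purely illustrative sketch in Python. It is not the actual NotebookLM Detector (whose signals we don't describe here); it only scores a feed on two hypothetical metadata heuristics that often accompany mass-produced shows, such as unnaturally uniform episode durations and near-identical episode descriptions.

```python
from dataclasses import dataclass
from statistics import pstdev


@dataclass
class Episode:
    title: str
    description: str
    duration_seconds: int


def suspicion_score(episodes: list[Episode]) -> float:
    """Return a rough 0..1 score for how "mass-produced" a feed looks.

    Hypothetical heuristics only -- not the signals used by the real
    NotebookLM Detector. Mass-generated feeds often have unnaturally
    uniform episode lengths and boilerplate descriptions.
    """
    if len(episodes) < 3:
        return 0.0

    score = 0.0

    # 1. Episode durations that barely vary suggest automated generation.
    durations = [e.duration_seconds for e in episodes]
    if pstdev(durations) < 30:  # less than ~30 seconds of variation
        score += 0.5

    # 2. Descriptions that are near-duplicates of one another.
    unique_descriptions = {e.description.strip().lower() for e in episodes}
    if len(unique_descriptions) <= len(episodes) // 2:
        score += 0.5

    return min(score, 1.0)


# Example: ten nearly identical ~8-minute episodes score as highly suspicious.
feed = [
    Episode(f"Episode {i}", "Two hosts discuss the uploaded document.", 480 + i)
    for i in range(10)
]
print(suspicion_score(feed))  # -> 1.0
```

In reality, metadata heuristics like these are only a first pass; flagged shows still need audio-level checks and human review before removal, which is how we combine automated tools with moderation on our platform.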
Bad for Listeners
Listeners tune into podcasts to hear real, interesting, and thoughtful content. They enjoy unique voices and personal stories. However, when AI tools like Notebook LM let anyone mass-produce low-quality shows, it becomes increasingly difficult for listeners to find high-quality, authentic podcasts. This wastes their time, erodes their trust in podcast platforms, and can ultimately make them less likely to stick with podcasting as a form of entertainment.
At Listen Notes, we are committed to minimizing AI-generated content on our platform. While AI-generated fake podcasts can often be found on services like Spotify and Apple Podcasts, we proactively remove them through a combination of automated tools and human moderation. Our goal is to ensure that listeners have access to meaningful, human-created content.
Although AI tools like Notebook LM can be useful for personal purposes, most people are unlikely to genuinely enjoy listening to AI-created content made by others. The absence of a human connection in these podcasts makes them less engaging. Furthermore, AI-generated conversations tend to be shallow and often contain fabricated information (also known as hallucinations), further diminishing the quality of the content.
Bad for Podcast Creators
Notebook LM doesn’t just hurt listeners—it also makes life harder for genuine podcasters. Creating a quality podcast takes time and effort, from researching topics to recording and editing episodes. However, with Notebook LM, anyone can quickly produce an AI-generated show with minimal effort. This means that real podcasters, who dedicate hours to crafting thoughtful content, are being drowned out by an influx of low-effort, AI-generated podcasts.
This trend is damaging to the podcasting community. Creators who put in the hard work to make compelling shows are increasingly overshadowed by the sheer volume of low-quality, AI-made content. This lack of visibility for human-produced podcasts can ultimately discourage creators, reducing the incentive to produce authentic content and paving the way for even more AI-generated shows—thus creating a vicious cycle.
If this trend continues, we risk losing the heart of podcasting: real human voices telling meaningful stories. At Listen Notes, we are dedicated to supporting real podcasters and preserving the quality of the podcasting medium. We believe that true connection comes from the effort, passion, and creativity that only human creators can bring.
Bad for Financial Infrastructure
Advertising is the primary way for podcasters to generate income, but AI-generated podcasts undermine this model. Podcast advertising relies on listener trust and engagement with the host. When ads are placed in AI-generated podcasts—which are often uninspiring and rarely listened to—advertisers see poor returns on their investment. Their messages fail to reach genuine listeners, making it a waste of resources. As the number of AI-generated podcasts grows, advertisers may start to question the value of investing in podcast ads altogether.
If advertisers pull away from podcasting, the financial infrastructure that supports real podcast creators will collapse. Creators will lose a sustainable way to monetize their content, which could discourage them from continuing in the podcasting space. Once again, we risk entering a vicious cycle that erodes the quality and diversity of podcasting.
Bad for Hosting Platforms
Podcast hosting platforms are also adversely affected by the surge of AI-generated content. Hosting these fake podcasts consumes valuable storage and computing resources—resources that could be better allocated to supporting high-quality, authentic podcasts. This not only wastes money but also makes it more challenging for hosting platforms to prioritize what truly matters: helping real podcasters succeed and providing listeners with quality content.
The Growing Problem: AI vs. Detection Tools
The rise of AI-generated content is becoming an increasingly serious issue. As AI tools get better at mimicking human-created content, distinguishing between real and fake podcasts becomes progressively harder. This challenge extends beyond podcasting and into our offline lives, such as the risk of deepfake audio being used in political campaigns or to spread misinformation.
Developing effective tools to detect AI-generated content is extremely challenging, especially when the creators of these AI tools are not always transparent about their methodologies. If this gap in detection capabilities continues to widen, Notebook LM and similar tools will cause greater harm to industries that thrive on human creativity and authenticity.
A Call for More Transparency and Responsible AI Use
While AI-generated content has a role to play as a productivity tool, its unchecked growth is negatively impacting the podcasting world and the broader internet. Notebook LM exemplifies this problem, as it allows spammers to flood the space with fake, low-quality podcasts. To protect the future of podcasting, AI developers need to be more transparent and take responsibility for the content their tools generate. One potential solution is to introduce clear watermarks or signals that help platforms like Listen Notes easily identify and remove AI-generated content.
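To make the watermark idea concrete, the sketch below shows one possible shape of a machine-readable disclosure signal: a channel-level tag in a podcast's RSS feed declaring that the audio is AI-generated. This is an assumption for illustration only; the tag name, the namespace URL, and the overall scheme are invented here and are not part of any real podcast RSS specification.

```python
import xml.etree.ElementTree as ET

# Hypothetical namespace and tag -- not part of any real podcast RSS spec.
AI_NS = "https://example.com/ai-disclosure"

SAMPLE_FEED = f"""<?xml version="1.0"?>
<rss version="2.0" xmlns:ai="{AI_NS}">
  <channel>
    <title>Some Show</title>
    <ai:generated tool="NotebookLM">true</ai:generated>
  </channel>
</rss>"""


def is_declared_ai_generated(feed_xml: str) -> bool:
    """Return True if the feed declares itself AI-generated via the
    hypothetical channel-level <ai:generated> tag."""
    root = ET.fromstring(feed_xml)
    channel = root.find("channel")
    if channel is None:
        return False
    tag = channel.find(f"{{{AI_NS}}}generated")
    return tag is not None and (tag.text or "").strip().lower() == "true"


print(is_declared_ai_generated(SAMPLE_FEED))  # -> True
```

A declared signal like this only works if the tools that generate the audio emit it by default, which is exactly why transparency from AI developers matters more than clever detection on our side.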
In the end, listeners want to hear more real human voices and authentic stories, and less machine-generated content. To preserve the unique and intimate nature of podcasting, we must work to prevent the spread of AI-generated audio and ensure that human creativity remains at the heart of the podcasting world.