AI in SEO, friend or foe?
As with most things in life, balance and moderation appear to be key.
→ Will I be writing important high-converting content with AI any time soon? Hell...no.
→ Am I eying up opportunities to scale top of funnel traffic via AI? HELL YES.
To understand whether adopting AI is worth the risks, I've collected a handful of AI SEO case studies from my network.
Here are some insightful ones👇
Note: Here's Google's guidance on using AI content.
1. The Largest AI SEO Case Study? Causal App: 0 to 1M/month
Jake Ward, Founder at Byword, used AI content to help Causal.App grow to 1 million monthly visitors in <1 year.
For this case study, he:
- Created 5,000 pages
- Used a proprietary GPT-3 model (now built into the tool: Byword)
- Focused on glossaries, question frameworks, formulas, and differences (X vs Y term)
- Ran the model but kept tweaking it until the output was high enough quality
Jake walked us through the full strategy here:
Notable advice from this case study:
- Leverage scalable keyword patterns: Some keywords follow clear, repetitive patterns (think "Define [X term]" or "Formulae for [X calculation]"). This is beneficial because you can invest time in defining inputs once, which helps deliver consistent, higher-quality outputs.
- Treat AI as minimum viable content: We introduced the term "minimum viable content" in my podcast episode with the SEOs at monday.com. The team there increases content production speed by reducing quality to their minimum standard, only doing a final edit and review on content that actually receives traffic. AI content is perfect for this strategy. You can create 3,000+ articles in a day, and when one of them reaches a threshold (say...100 readers per month), you can use humans to rewrite it and improve the quality.
- What not to do: Jake also recently dropped a great post explaining what NOT to do with AI content. That helps explain some of the big fails we see in the case studies below.
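To make the "scalable keyword patterns" idea concrete, here's a minimal sketch of expanding a few pattern templates into one prompt per page. The pattern names and example terms are my own illustrations, not Jake's actual setup:

```python
# Sketch: expanding scalable keyword patterns into one prompt per page.
# Patterns and terms below are illustrative, not from the case study.

PATTERNS = {
    "definition": "Define {term} in plain English, in under 150 words.",
    "formula": "Explain the formula for {term} and give a worked example.",
    "comparison": "Explain the difference between {a} and {b} for a beginner.",
}

def build_prompts(terms, pairs):
    """Expand single terms and (X, Y) comparison pairs into prompts."""
    prompts = []
    for term in terms:
        prompts.append(PATTERNS["definition"].format(term=term))
        prompts.append(PATTERNS["formula"].format(term=term))
    for a, b in pairs:
        prompts.append(PATTERNS["comparison"].format(a=a, b=b))
    return prompts

prompts = build_prompts(["gross margin", "burn rate"], [("ARR", "MRR")])
print(len(prompts))  # 2 terms x 2 term patterns + 1 comparison = 5
```

This is the "invest time in defining inputs once" part: once the templates are good, every page generated from them inherits that quality.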
2. How to write 1,000 articles via the GPT API
In this Twitter thread, Alton Lex dropped the *actual* code he used to automatically create AI content at scale.
He goes on to share one of his case studies where he generated 3,000 articles.
It was caught in one of Google's recent algorithm updates but then recovered dramatically.
Tip: Alton used these steps to figure out how to recover from a Google update.
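For a rough idea of what that kind of script looks like, here's a hedged sketch using the official OpenAI Python client. The model name, prompt, and file layout are my assumptions, not Alton's actual code (that's in his thread):

```python
# Sketch: generating articles in bulk through the OpenAI API and saving
# each one as a markdown file. Model, prompt, and output layout are assumptions.

import time
from pathlib import Path

def build_messages(keyword):
    """Turn a target keyword into a chat prompt (testable without an API key)."""
    return [
        {"role": "system", "content": "You are an experienced SEO content writer."},
        {"role": "user", "content": f"Write a ~600-word article targeting the keyword: {keyword}"},
    ]

def generate_batch(keywords, out_dir="articles"):
    from openai import OpenAI  # pip install openai; imported here so the helper above stays dependency-free
    client = OpenAI()          # reads OPENAI_API_KEY from the environment
    Path(out_dir).mkdir(exist_ok=True)
    for kw in keywords:
        resp = client.chat.completions.create(
            model="gpt-4o-mini",  # assumption: any chat-completion model works here
            messages=build_messages(kw),
        )
        slug = kw.replace(" ", "-")
        (Path(out_dir) / f"{slug}.md").write_text(resp.choices[0].message.content)
        time.sleep(1)  # crude rate limiting between requests

# generate_batch(["what is churn rate", "ltv formula"])  # needs OPENAI_API_KEY set
```

The loop-plus-sleep approach is the simplest thing that works; at real 1,000-article scale you'd want batching and retry logic on top.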
3. How we use AI to build a site to 45k visitors/mo
Jacky Chou (AKA Indexsy on Twitter) shows his process for building a niche site to 45k monthly visitors.
Here's the process:
"1. Find 1000s of low competition high volume keywords via Ahrefs
2. Export and post them onto the site using an AI service
3. Use the free RankMath instant indexing API plugin and use an indexing service
4. If the post ranks, we then rewrite it with our in-house writers with Surfer. We would also target the same keywords on our money sites"
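Step 1 is the easy part to sketch. Assuming a standard Ahrefs keyword export with Keyword/KD/Volume columns (the column names and thresholds below are my assumptions; adjust them to your own export), a filter might look like:

```python
# Sketch: shortlisting "low competition, high volume" keywords from an
# Ahrefs CSV export. Column names and thresholds are assumptions.

import csv

def shortlist(csv_path, max_kd=10, min_volume=500):
    """Keep keywords with Keyword Difficulty <= max_kd and Volume >= min_volume."""
    keep = []
    with open(csv_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            if int(row["KD"]) <= max_kd and int(row["Volume"]) >= min_volume:
                keep.append(row["Keyword"])
    return keep
```

From there, the shortlist feeds whatever AI service you use for step 2.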
I like this one because it uses AI to brute force the ranking, but human writers to futureproof that content. Smart.
PS. He then shared the AI tool he used to generate the content on his YouTube channel below.
4. 10,000 pages of AI content
"I tested 10k scraped content and AI content. Scraped content outperformed AI content. Both got panned."—Mark Williams-Cook
Mark took a brand new domain, created 10,000 pages, and did zero human editing on the content.
Here's the chart:
He notes that the first dip was the Helpful Content Update and the final plummet was October's Spam update.
Here's a little more on Mark's process (found in the comments—Mark, if you ever read this, that French comment was *chef's kiss*):
- He collected all of Google's People Also Asked data using Python
- He then used the GPT API to generate an answer for each question
- He then used WordPress's API to publish on the website
Note: this was an experiment with ZERO human intervention in the content. What would've happened with 30 minutes of human editing per article? Who knows.
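The publishing step in Mark's pipeline can be sketched with nothing but the Python standard library. This assumes WordPress application passwords for auth (set up under Users → Profile); the site URL and credentials below are placeholders:

```python
# Sketch: publishing a generated article via the WordPress REST API
# (POST /wp-json/wp/v2/posts) using an application password for Basic auth.
# Site URL and credentials are placeholders.

import base64
import json
import urllib.request

def build_request(site, user, app_password, title, content):
    """Build the POST request for the posts endpoint (testable offline)."""
    token = base64.b64encode(f"{user}:{app_password}".encode()).decode()
    body = json.dumps({"title": title, "content": content, "status": "publish"})
    return urllib.request.Request(
        f"{site}/wp-json/wp/v2/posts",
        data=body.encode(),
        headers={
            "Authorization": f"Basic {token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

def publish(site, user, app_password, title, content):
    req = build_request(site, user, app_password, title, content)
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)  # the created post object

# publish("https://example.com", "admin", "xxxx xxxx xxxx xxxx", "Title", "<p>Body</p>")
```

Wire the GPT-generated answers from the previous step into `content` and you have the zero-intervention pipeline Mark described, minus the PAA scraping.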