
The Most Overlooked Aspect of Your AI Search Strategy: A Conversation with Peter Rota

Updated on February 25, 2026


Peter Rota, technical SEO expert, on the Get Discovered podcast from Prerender.io


If you’ve been watching the SEO conversation play out over the past 12 months, you’ve likely noticed a shift in tone. Less chasing algorithm updates. More getting back to fundamentals. Peter Rota—technical SEO expert, 15-year industry veteran, and LinkedIn thought leader with 80,000 followers—is one of the clearest voices making this argument. And he makes it well.

Peter joined host Joe Walsh on the latest episode of Get Discovered, Prerender.io’s podcast about AI and online discoverability. The conversation covered where technical SEO stands today, what AI has actually broken in the traditional playbook, and why the teams that stay grounded in first principles are the ones that will come out ahead.

Watch the full episode below, or keep reading for a summary.

It’s Not AI vs. Google: It’s How You Show Up in Both

Peter opened with a stat that reframes the whole conversation: a Semrush study using clickstream data showed that people who started using LLMs actually increased their Google search sessions. AI didn’t cannibalize search. It complicated it.

“People are trying to ask direct questions, or they’re trying to validate things that are coming up in LLMs,” Peter explained. The buyer’s journey has gotten longer and messier, not shorter. That has big implications for how teams measure success.

For SEO practitioners and business leaders alike, this dismantles the “AI vs. Google” framing that still shows up in a lot of industry commentary. The more useful question isn’t which one wins. It’s how do you show up in both, and how do you measure what’s actually working across a fragmented path to purchase?

This challenge of fragmented attribution came up repeatedly throughout the conversation — and it’s one we’ve explored in depth with other guests this season. Klaus Schremser from Otterly AI made a similar point about the difficulty of tracking visibility across AI channels, and Elizabeth Thorn from Toggl described how her team built custom AI referral tracking to fill the gap.

The Biggest Shift in 15 Years of SEO

When Joe asked how this AI disruption compares to other seismic changes, Peter didn’t hesitate. This is the most significant shift he’s seen since starting in the industry. The reason? It fundamentally changes how success is defined.

“The metrics have always been clicks and rankings,” he said. “Now it’s more like: am I visible here? Conversions are getting more important. It’s just the overall understanding of what is important now versus what was important five years ago.”

That’s not a trivial reframe. It affects how you report to leadership, how you structure content strategy, and how you justify the budget. Traffic as the primary KPI made sense when traffic was a reliable proxy for visibility and intent. In a world where LLMs answer questions without sending a click, traffic numbers can flatline while your brand influence grows—or erodes.

What Industries Are Most Impacted by Generative AI?

Peter was candid about which content types are feeling the most pain right now: news publishers and content-heavy sites. “Glossary and blog sections have been hit very hard year over year because they’re being surfaced in AI Overviews or LLMs, so people aren’t clicking through to the website.”

Informational content is increasingly being answered before the user ever reaches your site. Peter’s take isn’t to abandon that content, but to shift the framing: create content that’s genuinely useful for the user, not content that exists primarily to capture search engine visits.

It’s a subtle but important distinction that echoes what Noah Greenberg from Stacker discussed earlier this season when unpacking what actually gets cited in LLMs — authoritative, specific, citable content that earns third-party validation.

What’s Crucial for AI Search Visibility: Technical SEO Fundamentals

This is where the conversation got particularly useful. Peter’s view is that good technical SEO and good AI discoverability are largely the same thing—and most of the ranking factors that matter more for LLMs are factors that great SEOs have always cared about. They’ve just been getting less attention. A few of the areas Peter flagged as increasingly critical:

  • Semantic HTML. LLMs, unlike Google, mostly can’t render JavaScript. That means they’re reading your raw HTML. If your content is buried in divs instead of proper semantic tags—headers, lists, footers, article blocks—you’re harder to parse and more likely to be skipped.
  • Schema markup. Peter described this as getting “a second light,” something that was always a good practice but is now more essential. Anything that helps LLMs more easily understand your site’s structure, relationships, and meaning is worth investing in. Our own technical SEO guide for AI optimization goes deep on this.
  • JavaScript rendering. This one is particularly close to home for us. Most LLMs (with a few notable exceptions) cannot execute JavaScript. If your key content lives in JavaScript-rendered components, as is the case with many modern websites, it may as well be invisible to them. Google has invested enormous infrastructure into rendering JavaScript at scale and still doesn’t render everything. The idea that AI crawlers will do the same is, as Joe put it during the conversation, “a non-starter.”
  • Page speed, and specifically time to first byte. LLMs using retrieval-augmented generation visit sites in real time when constructing answers. If your site doesn’t load fast enough, they move on. Peter’s recommendation is to target three to four seconds or under. “They’re almost like they have unlimited crawl budget,” he said. “They’re just gonna go somewhere else if they can’t get you.”
  • Crawl logs. Peter said he now considers crawl log analysis a proactive practice rather than a reactive one. With Google Search Console, you can see which bots are visiting, what they’re finding, and whether there are unresolved requests. This is especially important now because AI crawler behavior is still not fully understood, and the more data you have, the better your decisions will be.
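
The crawl log analysis Peter recommends can start very simply: count hits from known AI crawler user agents in your server access logs. A minimal sketch in Python, assuming standard access-log lines that include the user agent; the bot-name substrings below are illustrative, not an exhaustive list:

```python
from collections import Counter

# User-agent substrings for common AI crawlers (illustrative, not exhaustive).
AI_BOTS = ["GPTBot", "OAI-SearchBot", "ClaudeBot", "PerplexityBot", "Google-Extended"]

def count_ai_bot_hits(log_lines):
    """Tally requests per AI crawler from raw access-log lines."""
    hits = Counter()
    for line in log_lines:
        for bot in AI_BOTS:
            if bot in line:
                hits[bot] += 1
                break  # one bot per request line
    return hits

# Example with two fabricated log lines:
sample = [
    '1.2.3.4 - - [01/Feb/2026] "GET /blog HTTP/1.1" 200 "-" "Mozilla/5.0 GPTBot/1.0"',
    '5.6.7.8 - - [01/Feb/2026] "GET / HTTP/1.1" 200 "-" "PerplexityBot/1.0"',
]
print(count_ai_bot_hits(sample))  # one hit each for GPTBot and PerplexityBot
```

From there you can break counts down by URL or status code to see which pages AI crawlers reach and which requests go unresolved.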

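You can also spot-check the time-to-first-byte target Peter mentions (three to four seconds or under) without dedicated tooling. A minimal Python sketch, assuming a plain GET is a reasonable stand-in for what a retrieval-time fetcher sees:

```python
import time
import http.client
from urllib.parse import urlparse

def measure_ttfb(url, timeout=10):
    """Seconds from sending a GET to receiving the first response byte."""
    parts = urlparse(url)
    Conn = http.client.HTTPSConnection if parts.scheme == "https" else http.client.HTTPConnection
    conn = Conn(parts.netloc, timeout=timeout)
    try:
        start = time.monotonic()
        conn.request("GET", parts.path or "/", headers={"User-Agent": "ttfb-check"})
        resp = conn.getresponse()
        resp.read(1)  # first byte of the body arrives here
        return time.monotonic() - start
    finally:
        conn.close()

# Usage: flag pages a real-time fetcher would likely abandon.
# if measure_ttfb("https://www.example.com/") > 4: ...
```

This measures a single uncached request from wherever you run it, so treat it as a smoke test rather than a substitute for real monitoring.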
LLMs Care More About Brand Authority Than Search Engines Do

One of the most interesting threads in the conversation was about how LLMs evaluate credibility differently from Google.

Google’s algorithm has always valued backlinks as a proxy for authority, and gaming that system became its own cottage industry. LLMs think differently. They’re looking for consensus. They synthesize what’s being said across micro-communities, forums, review sites, social platforms, and the broader web. A brand with strong domain authority but a weak presence in third-party conversations may rank well in Google while being largely ignored by AI.

“LLMs are a lot more consensus-based,” Peter said. “They’re looking for branded signals because they want to serve something they’re confident in serving.”

This makes brand reputation management and PR strategy more important than ever. Getting mentioned in a Reddit thread might not produce a backlink. But it can influence how an LLM describes your brand. That’s a new kind of value that most SEO tools haven’t caught up to yet.

Peter also noted that the nofollow vs. followed link binary that dominated SEO thinking is less relevant in this context. A nofollow mention on a high-authority site may actually be more valuable for LLM visibility than a followed link sitting in content with no real context around your brand. The metric that matters is signal strength, not link attribute.

The Attribution Problem (Again)

Attribution has always been difficult for SEO. You could see that organic traffic increased and correlate it with content efforts, but drawing a direct line from specific pages to specific revenue was never clean. AI makes this messier—a common theme in this season of the podcast.

Peter framed it well: “A lot of times, SEO was always harder to even get buy-in on attribution because it’s like, I did these paid ads and I can clearly see where the conversion was. With SEO, we know it’s working, but it was hard to show the value. And obviously, LLMs make it even more complicated.”

Joe added a useful analogy: we’ve essentially moved from the attribution precision of digital advertising back toward something closer to billboard marketing. You can measure general uplift. You can observe trends. But the clean, last-click-attribution model that made Google’s ad network so appealing to CFOs? That era is fading.

For now, Peter’s approach is to report on what leadership understands: referrals, sessions, and conversions over time, while keeping an eye on visibility metrics as leading indicators, even if they’re imperfect.

What Teams Should Prioritize for AI Search Visibility

When asked to give concrete guidance on where to focus, Peter named a few key areas:

Prioritize:

  • Deeper schema investment, particularly to establish entity relationships and corroborate your brand’s “entity home” (a well-structured About page that’s then validated across the web).
  • Raw HTML content. Make sure your most important copy isn’t dependent on JavaScript.
  • Page speed.
  • Crawl log analysis.
  • Your overall brand presence in communities and third-party publications.
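
The “entity home” in the first bullet is typically expressed as Organization schema on an About page, with sameAs links pointing at the third-party profiles that corroborate the entity. A minimal sketch in Python that emits the JSON-LD; the company name, URLs, and properties below are placeholders, and what you actually need depends on your entity:

```python
import json

# Placeholder values: swap in your own organization's details.
entity_home = {
    "@context": "https://schema.org",
    "@type": "Organization",
    "name": "Example Co",
    "url": "https://www.example.com/",
    "description": "What the organization does, in one or two plain sentences.",
    # sameAs is where corroboration happens: profiles that third parties
    # (and LLMs) can use to confirm this is the same entity.
    "sameAs": [
        "https://www.linkedin.com/company/example-co",
        "https://en.wikipedia.org/wiki/Example_Co",
    ],
}

# Embed the output in a <script type="application/ld+json"> tag on the About page.
print(json.dumps(entity_home, indent=2))
```

Because LLMs are reading raw HTML, server-rendered JSON-LD like this is one of the cheaper ways to make entity relationships explicit.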

Deprioritize:

  • The “all-or-nothing” mentality around traffic as the primary metric.
  • Backlinks as the exclusive measure of off-site authority.
  • The expectation that recovering pre-AI traffic levels is a realistic goal. “LLMs will never send the amount of traffic Google sends,” Peter said. “I don’t think we can get the traffic back. It’s just setting that clear expectation.”

This mirrors advice we’ve heard across multiple guests this season, including from Ané Wiese at saas.group, who offered practical implementation tips for SEO teams navigating the same transition.

Looking Ahead

Looking out 12 to 18 months, Peter’s prediction is that organic clicks will continue to decline. This is not only because of AI, but because Google itself is adding more advertising to AI-powered search experiences.

But he raised a more nuanced concern beyond the traffic numbers: the erosion of critical thinking in teams that lean too hard on LLMs.

“I feel like people will look for more and more shiny objects to solve things and they lose that critical thinking,” he said. “LLMs will lie to your face and won’t even acknowledge it. You just really need to use your brain with a lot of these tools.”

Joe echoed this, noting that how humans relate to these tools—and where the actual value boundaries are—is something we’re still figuring out as an industry.

Key Takeaways

We asked Peter what he’d want listeners to remember from the conversation. His answer was simple: don’t forget the fundamentals of SEO.

“A lot of those will give you the 80/20 of showing up in LLMs,” he said. “A lot of the AI optimization principles evolved from SEO. A lot of them are SEO that’s just been repackaged. If you’re doing good SEO, you’ll definitely be off to a good start.”

This is a message that holds up across every guest we’ve spoken to this season. Getting back to first principles, like technically sound sites, authoritative content, and genuine brand presence, is not a retreat. It’s the actual strategy.

Tune Into the Full Conversation

This recap covers the highlights, but the full conversation goes deeper into crawl log analysis, visibility tracking tools and their limitations, prompt overlap, and what the next 12 months look like for SEO practitioners. Listen to the full episode on Get Discovered wherever you get your podcasts.

Subscribe to Get Discovered so you don’t miss future episodes with SEO experts and business leaders navigating the AI discovery challenge in real time.

You can also find Peter on LinkedIn or at his website, where he works with businesses on local, e-commerce, and technical SEO.

Make Sure AI Crawlers Can Actually Find You

Peter’s point about JavaScript and LLM visibility is one we hear consistently from guests, and it’s one we’ve built our product around. Most AI crawlers can’t execute JavaScript, which means if your content is JavaScript-rendered, it’s invisible to them by default.

Try Prerender.io for free and make sure your site is discoverable to ChatGPT, Perplexity, Claude, and every AI search platform your buyers are using.


