
Google’s JavaScript Warning & How It Relates To AI Search

The Challenge of JavaScript in Search

A recent discussion from Google’s Search Relations team highlights a common web development challenge: making JavaScript work effectively with modern search engines.

In the latest Search Off The Record podcast, the team addressed the increasing use of JavaScript and the tendency to rely on it even when unnecessary.

The Purpose of JavaScript

Martin Splitt, a Search Developer Advocate at Google, explained that JavaScript was originally designed to help websites compete with mobile apps, enabling features like push notifications and offline access.

However, the team cautioned that overusing JavaScript can sometimes create more problems than solutions. While it has many benefits, it’s not always the best choice for every aspect of a website.

The JavaScript Spectrum

Splitt described the web development landscape as a spectrum between traditional websites and web applications:

“We’re in this weird state where websites can be just that – websites, basically pages and information that is presented on multiple pages and linked, but it can also be an application.”

For example, an apartment listing website provides essential details like square footage, floor number, and address. However, if it includes an interactive 3D tour, it also functions as an application.

Why This Matters for AI Search

Google Search Advocate John Mueller pointed out that many developers over-rely on JavaScript:

“There are lots of people that like these JavaScript frameworks, and they use them for things where JavaScript really makes sense, and then they’re like, ‘Why don’t I just use it for everything?’”

This discussion ties into a recent study that found AI search engines struggle with JavaScript.

The AI Crawler Limitation

AI-powered search engines, such as ChatGPT Search, rely heavily on HTML-based crawling. Unlike traditional search engines, many AI crawlers cannot render JavaScript, so content that appears only after client-side scripts run may never be indexed.
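To make the limitation concrete, here is a hypothetical client-side rendering pattern of the kind that trips up HTML-only crawlers; the endpoint, markup, and field names are invented for illustration:

```javascript
// Hypothetical client-side rendering: assumes the server sends a page
// containing only an empty <div id="listing"></div>, which this script
// fills in after it runs in a browser.
// A crawler that fetches the raw HTML but never executes JavaScript
// sees the empty container, not the listing details.
fetch('/api/listings/42') // assumed endpoint, for illustration only
  .then((res) => res.json())
  .then((listing) => {
    document.getElementById('listing').innerHTML = `
      <h1>${listing.address}</h1>
      <p>${listing.squareFootage} sq ft, floor ${listing.floorNumber}</p>
    `;
  });
```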

A study found that AI bots account for a growing share of search crawler traffic, yet many of them cannot execute JavaScript. If your site relies heavily on JavaScript to display key content, that content may be invisible to these crawlers, costing you visibility.

Key Considerations for Developers

To ensure your site is optimized for both traditional and AI search engines, keep these points in mind:

  • Server-Side Rendering (SSR): Since many AI crawlers can’t execute JavaScript, SSR keeps key content accessible (see the sketch after this list).
  • Content Accessibility: AI crawlers like GPTBot prioritize HTML (57.7% of fetches), while others, like Claude, focus more on images (35.17%).
  • Development Strategy: The rise of AI search may require a shift away from a “JavaScript-first” approach.
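As a rough sketch of the SSR point above, here is one way to put key content into the initial HTML using Node with Express; the route, data, and field names are assumptions for illustration, not a prescribed setup:

```javascript
const express = require('express');
const app = express();

// Stand-in data source; a real site would query a database or CMS.
const listings = {
  42: { address: '123 Main St', squareFootage: 850, floorNumber: 3 },
};

// Server-side rendering: the listing details are part of the initial
// HTML response, so even crawlers that never execute JavaScript can
// read and index them.
app.get('/listings/:id', (req, res) => {
  const listing = listings[req.params.id];
  if (!listing) return res.status(404).send('Not found');
  res.send(`<!DOCTYPE html>
<html>
  <head><title>${listing.address}</title></head>
  <body>
    <h1>${listing.address}</h1>
    <p>${listing.squareFootage} sq ft, floor ${listing.floorNumber}</p>
    <!-- Optional extras like a 3D tour can still load via JavaScript. -->
  </body>
</html>`);
});

app.listen(3000);
```

Frameworks such as Next.js and Nuxt provide this kind of server rendering out of the box, so adopting SSR rarely means hand-writing templates like this.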

Moving Forward: Best Practices

As AI crawlers become more relevant, it’s crucial to balance modern website features with search engine accessibility. Here’s how you can adapt:

  • Implement server-side rendering for key content.
  • Ensure core content is included in the initial HTML.
  • Use progressive enhancement so pages work without JavaScript and get richer when it runs (see the example after this list).
  • Be strategic with JavaScript usage; don’t use it for everything.
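For the progressive enhancement item, a minimal sketch might look like the following; the markup, endpoint, and viewer function are hypothetical:

```javascript
// Progressive enhancement sketch: the server-rendered page already
// contains a working fallback, e.g.
//   <div id="tour" data-tour-url="/tours/42.json">
//     <a href="/tours/42">View photo gallery</a>  <!-- works without JS -->
//   </div>
// This script upgrades that markup only when JavaScript actually runs.

// Hypothetical viewer; a real one would mount an interactive 3D widget.
function renderInteractiveTour(container, frames) {
  container.textContent = `Interactive tour loaded (${frames.length} frames)`;
}

document.addEventListener('DOMContentLoaded', () => {
  const tour = document.getElementById('tour');
  if (!tour) return;
  fetch(tour.dataset.tourUrl)
    .then((res) => res.json())
    .then((frames) => renderInteractiveTour(tour, frames))
    .catch(() => {
      // On failure, the plain link remains: no functionality is lost.
    });
});
```

Crawlers and users without JavaScript get the fallback link; everyone else gets the richer experience.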

By optimizing your website for both traditional search engines and AI crawlers, you can maintain strong search visibility while providing a seamless user experience.
