1 min read

Link: The rise of the AI crawler - Vercel

AI crawlers like OpenAI's GPTBot and Anthropic's Claude are substantially active: together, their traffic on Vercel's network amounts to roughly 28% of Googlebot's volume, with GPTBot alone accounting for 569 million requests in the past month.

Our analysis, using data from sites like nextjs.org, reveals that AI crawlers fetch a range of content types but struggle with JavaScript execution. ChatGPT, for example, prioritizes HTML, fetching it in 57.7% of requests, whereas Claude leans more heavily toward images.

Despite fetching JavaScript files, AI crawlers like ChatGPT and Claude do not execute them, leaving a significant gap in how they see modern client-rendered applications. They are also inefficient in other ways, frequently spending requests on 404 errors and unnecessary redirects.

For site optimization, server-side rendering is recommended to ensure critical content remains accessible to AI crawlers. Efficient URL management can also reduce the high rates of failed fetches currently seen with AI traffic.
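To make that concrete (this is my sketch, not Vercel's code), a Next.js server component ships finished HTML, so a crawler that never executes JavaScript still sees the content. The route and API endpoint below are hypothetical:

```tsx
// app/products/[slug]/page.tsx — a server component, so the HTML arrives pre-rendered.
type Props = { params: { slug: string } };

export default async function ProductPage({ params }: Props) {
  // Hypothetical data source; the point is that the fetch happens on the server.
  const res = await fetch(`https://api.example.com/products/${params.slug}`, {
    cache: "no-store", // render per request (classic SSR) rather than at build time
  });
  const product: { name: string; description: string } = await res.json();

  // Crawlers that skip JavaScript still get this markup in the initial response.
  return (
    <main>
      <h1>{product.name}</h1>
      <p>{product.description}</p>
    </main>
  );
}
```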

Since these crawlers can't render JavaScript, a server-rendered approach and proper URL management could meaningfully improve how they index and understand web content. For developers, knowing these limitations helps in shaping strategies that keep content visible and accessible.
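On the URL-management side, consolidating moved pages into single-hop permanent redirects keeps crawlers from wasting requests on 404s and redirect chains. A minimal sketch with hypothetical paths (it assumes a Next.js version that accepts a TypeScript config; the same shape works in next.config.js):

```ts
// next.config.ts — map retired URLs straight to their current homes.
import type { NextConfig } from "next";

const config: NextConfig = {
  async redirects() {
    return [
      {
        source: "/docs/old-page",      // hypothetical retired path
        destination: "/docs/new-page", // reached in a single permanent (308) hop
        permanent: true,
      },
    ];
  },
};

export default config;
```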

Ultimately, recognizing the distinct capabilities and behaviors of AI crawlers compared to traditional search engines is crucial for optimizing web presence. Continued adjustments to web best practices will be necessary as AI-driven interactions become increasingly prevalent.

--

Yoooo, this is a quick note on a link that made me go, WTF? Find all past links here.