After migrating a website, some site owners discover HTTP 429 errors when using SEO tools like Ahrefs or Semrush. A 429 “Too Many Requests” error means the server is throttling requests, which can prevent search bots like Googlebot from crawling your site. Left unchecked, this can hurt your SEO visibility.
This guide walks through diagnosing and fixing 429 errors to keep your site crawlable and search-friendly.
Issue background
429 errors often appear after switching hosting providers or applying new security configurations. They occur when too many requests are sent within a short timeframe, triggering rate limiting or firewall rules.
For human users, this may go unnoticed. But for search engines, repeated 429 responses can block crawlers from indexing your site, reducing rankings and organic visibility.
Diagnosis
To troubleshoot 429 errors, follow these steps:
- Review server logs
Check access and error logs for 429 responses. Pay attention to:
  - The IP addresses generating the errors.
  - The user agents, especially bots like Googlebot, Bingbot, Ahrefs, or Semrush.
  - The frequency and timing of requests that trigger rate limiting.
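The log review above can be sketched in a few lines of shell. This is a minimal example assuming an Nginx/Apache combined log format; the log path and sample entries below are placeholders, so adjust them for your server.

```shell
#!/bin/sh
# Sketch: tally 429 responses by client IP in a combined-format access log.
# /tmp/access.log.sample is a stand-in for your real log (e.g. under /var/log).
LOG=/tmp/access.log.sample

# A tiny sample log standing in for your real access log.
cat > "$LOG" <<'EOF'
66.249.66.1 - - [01/Jan/2025:00:00:01 +0000] "GET / HTTP/1.1" 429 0 "-" "Googlebot/2.1"
66.249.66.1 - - [01/Jan/2025:00:00:02 +0000] "GET /blog HTTP/1.1" 429 0 "-" "Googlebot/2.1"
203.0.113.9 - - [01/Jan/2025:00:00:03 +0000] "GET / HTTP/1.1" 200 512 "-" "Mozilla/5.0"
EOF

# Field 9 of the combined format is the status code; keep only 429s,
# then count occurrences per client IP, busiest first.
awk '$9 == 429 {print $1}' "$LOG" | sort | uniq -c | sort -rn
```

Swapping `{print $1}` for a cut on the quoted user-agent field gives the same tally per bot instead of per IP.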
- Check rate limiting & firewall rules
Many hosting providers and Web Application Firewalls (WAFs) block or throttle traffic automatically. Confirm whether legitimate bots are being mistakenly restricted.
- Inspect response headers
Look for `Retry-After` headers in 429 responses, which indicate how long bots should wait before retrying. Also check `Cache-Control` headers.
- Verify robots.txt
Ensure your robots.txt file isn’t unintentionally blocking key paths or introducing crawl delays.
- Check cache bypasses
Sometimes, crawlers bypass caching and hit the origin server directly, creating high load. If this happens, your site may need better cache rules or a CDN.
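You can run the header inspection step from the command line with curl. A quick sketch, where the URL and the bot user-agent string are placeholders for your own site and the crawler you want to simulate:

```shell
# Sketch: fetch only the headers, presenting a Googlebot-style user agent,
# and surface the status line plus any Retry-After / Cache-Control hints.
# Replace https://example.com/ with a page on your own site.
curl -sS -o /dev/null -D - \
  -A "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)" \
  "https://example.com/" | grep -i -E "^(HTTP|retry-after|cache-control)"
```

If the status line shows 429 here but 200 without the `-A` flag, your rate limiting is keying on the user agent rather than on request volume.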
Resolution steps
- Whitelist verified bots
Update server or firewall settings to allow trusted crawlers like Googlebot. This ensures they aren’t throttled by rate limits.
- Optimize server performance
Reduce server strain by:
  - Using a Content Delivery Network (CDN).
  - Optimizing code, queries, and plugins.
  - Scaling hosting resources if necessary.
- Adjust crawler settings
SEO tools like Ahrefs or Semrush let you adjust crawl speeds. Lower the rate to avoid overwhelming your server (e.g., 1–2 requests per second).
- Monitor results
Use Google Search Console to check crawl stats and errors. The URL Inspection tool helps confirm whether Googlebot can access pages without hitting 429 errors.
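Alongside Search Console, you can spot-check a few key URLs yourself. A minimal sketch, where the URL list is a placeholder and the sleep keeps the check itself well under typical rate limits:

```shell
#!/bin/sh
# Sketch: report the HTTP status of a handful of important pages,
# pausing between requests so this check doesn't trigger throttling.
# Replace the example.com URLs with pages from your own site.
for url in "https://example.com/" "https://example.com/blog"; do
  code=$(curl -s -o /dev/null -w "%{http_code}" "$url")
  echo "$code $url"
  sleep 2   # ~0.5 requests/second stays far below most rate limits
done
```

Any line starting with 429 points at a page that is still being throttled and needs another pass through the steps above.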
Final outcome
By diagnosing server settings, optimizing performance, and whitelisting search bots, you can eliminate 429 errors and restore smooth crawling. This ensures your site remains visible to search engines and accessible to users.
If you’re struggling with 429 errors, crawlability, or SEO performance after a site migration, Freshy can help. Contact Freshy today for expert troubleshooting and optimization.