Google Won’t Necessarily Follow the Date or Time Set in Retry-After Header
The Retry-After HTTP header is good practice to use alongside a 503 status code, but Google doesn't always follow it. Because many sites set the header in a generic way, Google may retry your site sooner than the date or time you've specified.
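As a quick illustration (not taken from the article), Retry-After can carry either a delay in seconds or an HTTP-date; a small helper like the hypothetical one below builds the header value in either form, assuming only the Python standard library.

```python
from email.utils import formatdate

def retry_after_header(seconds=None, when=None):
    """Build a Retry-After header for a 503 response.

    Pass `seconds` for a delay in whole seconds, or `when` as a Unix
    timestamp to be rendered as an HTTP-date; both forms are valid
    per RFC 9110. (Hypothetical helper, for illustration only.)
    """
    if seconds is not None:
        return {"Retry-After": str(int(seconds))}
    # e.g. "Thu, 01 Jan 1970 00:00:00 GMT"
    return {"Retry-After": formatdate(when, usegmt=True)}
```

Bear in mind the point above: even a correctly formatted Retry-After is only a hint, and Google may recrawl sooner than it suggests.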

Avoid Serving a 503 Error for Days in a Row to Keep Site in Search
If a site serves a 503 error for several days in a row, Google may start to assume that the site is completely gone rather than temporarily unavailable, and may start removing its pages from search results.

Google Can Periodically Try to Recrawl 5xx Error Pages
If a server error is shown on a page for as long as a week, Google can treat it in a similar way to a 404: it will reduce crawling of that page and remove it from the index, but will still periodically try to recrawl it.

Signals are Kept for 4xx or 5xx Error Pages Previously Dropped from the Index
If your pages displayed a 4xx or 5xx error for a while and were dropped from the index, but become available again after a month or so, for example, Google will be able to return them to the search results with their previous signals.

Google Checks the Status Code of Pages Before Attempting to Render
Google checks the status code of a page before doing anything else, such as rendering content. This helps it to identify which pages can be indexed and which pages it shouldn't render.

Google Will Not Render JavaScript Content if the Page Returns a Redirect or an Error Code
If you have a page which contains JavaScript content but it returns a redirect or an error code, Google will not spend time rendering the content. For example, if you use JavaScript on a 404 page to add content, Google will not render it.

Google Doesn't Crawl Any URLs From a Hostname When Robots.txt Temporarily 503s
If Google encounters a 503 when crawling a robots.txt file, it will temporarily not crawl any URLs on that hostname.

Google Treats a Permanently 503'ing Robots.txt as an Error & Eventually Crawls the Site Normally
If a robots.txt returns a 503 for an extended period of time, Google will treat this as a permanent error and crawl the site normally to see what can be discovered.
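The two robots.txt behaviors above can be sketched as a single decision rule. This is a hypothetical reconstruction: the status handling mirrors the items above, but the 30-day cutoff for "extended period" is an assumption, not a figure Google has published.

```python
def robots_crawl_policy(robots_status: int, outage_days: int) -> str:
    """Decide how a crawler reacts to a robots.txt fetch, following the
    behavior described above (the day threshold is an assumption)."""
    if robots_status == 503:
        if outage_days > 30:  # assumed cutoff for "extended period"
            return "crawl-normally"   # persistent 503: treated as a permanent error
        return "pause-host"           # temporary 503: crawl nothing on this host
    return "honor-robots"             # normal case: obey the rules fetched
```

The notable design point is the reversal: a short 503 outage is the most restrictive state (nothing gets crawled), while a permanent one flips to the least restrictive.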

503s Can Help Prevent Pages Dropping From the Index Due to Technical Issues
One user described seeing a loss of pages from the index after a technical issue caused their website to be down for around 14 hours. John suggests that the best way to safeguard your site against this is to serve a 503 status code during the downtime.
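As a minimal sketch of that advice, assuming Python's standard-library HTTP server, a maintenance mode could answer every request with a 503 and a Retry-After hint rather than a 404 or an empty 200:

```python
import threading
import urllib.error
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

class MaintenanceHandler(BaseHTTPRequestHandler):
    """Answer every request with 503 + Retry-After during downtime."""

    def do_GET(self):
        self.send_response(503)
        self.send_header("Retry-After", "1800")  # hint: retry in 30 minutes
        self.send_header("Content-Type", "text/plain")
        self.end_headers()
        self.wfile.write(b"Down for maintenance; please retry later.\n")

    def log_message(self, *args):  # silence per-request logging
        pass

# Serve on an ephemeral port in a background thread for the demo.
server = HTTPServer(("127.0.0.1", 0), MaintenanceHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

try:
    urllib.request.urlopen(f"http://127.0.0.1:{server.server_port}/")
    status, retry_after = None, None  # not reached: a 503 raises HTTPError
except urllib.error.HTTPError as err:
    status, retry_after = err.code, err.headers["Retry-After"]
server.shutdown()
```

This deliberately signals "temporarily unavailable"; as the earlier items note, the same 503 served for days on end would eventually be read as the site being gone.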