Thousands of 503 errors in GSC for pages not important to organic search - Is this a problem?
-
Hi, folks
A client of mine now has roughly 30,000 503 errors (found in the crawl error section of GSC). These are mostly pages for limited-time offers and deals. The 503 error seems to occur when an offer expires and the page is no longer of any use. These pages are not important for organic search, but they do get traffic, mostly from direct visits and newsletters.
My question:
Does having a high number of 503 pages reported in GSC constitute a problem in terms of organic ranking for the domain and for the category and product pages (the pages that I want to rank for organically)? If it does, what is the best course of action to mitigate the problem?
Looking forward to your answers on this!
- Sigurd
-
Hi, Andy. Thank you so much for the insights
-
Yeah, I'd have to say that a 404 would be far preferable. A 301 would be ideal, but it would take some bandwidth to redirect to the next-most-relevant page whenever a deal expires.
-
Hi Sigurd,
A 503 seems like an odd choice in this circumstance - is this something that the eCommerce software dictates, or are there options for this?
A 503 means "Service Unavailable" and isn't something I would expect to see on a page that has expired. There are a number of alternatives depending on the product and niche. For Google to be seeing thousands of 503s might make it look like the site is having problems.
If a page really must go, then it should be a 404 / 410, or a 301 to a close alternative, but not a 503.
Are there any signs that these are causing issues with the site's SERPs?
-Andy
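To illustrate the approach Andy describes, here is a minimal sketch of how expired deal pages could return a 410 (or a 301 to a close alternative) instead of a 503. It assumes a Python/Flask front end with hypothetical helpers (get_deal, find_related_category) - the thread doesn't say what platform the client runs, so treat this purely as an illustration of the status-code logic, not their implementation:

```python
from flask import Flask, redirect, abort

app = Flask(__name__)

# Hypothetical helpers standing in for the real e-commerce platform's
# data layer - names and signatures are illustrative only.
def get_deal(slug):
    """Look up a deal by URL slug; return None if it never existed."""
    return None  # placeholder

def find_related_category(deal):
    """Return the URL of the closest still-relevant category page, or None."""
    return None  # placeholder

@app.route("/deals/<slug>")
def deal_page(slug):
    deal = get_deal(slug)

    if deal is None:
        # The URL never existed: a plain 404 is appropriate.
        abort(404)

    if deal["expired"]:
        related = find_related_category(deal)
        if related:
            # Best case: permanent redirect to the closest live alternative.
            return redirect(related, code=301)
        # Otherwise tell crawlers the page is gone for good (410)
        # instead of signalling a temporary outage with a 503.
        return "This offer has expired.", 410

    # Normal rendering for live deals.
    return f"Deal: {deal['title']}"
```

Whether a given URL gets a 410 or a 301 is the editorial call Andy mentions; the key point is that an expired page should not keep answering with a 503.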
Related Questions
-
Google Image Search - Is there a way to influence the related icons at the top of the image search results?
Google recently added related icons at the top of the image search results page. Some of the icons may be unrelated to the search. Are there any best practices to influence what is positioned in the related image icons section? Thank you.
Intermediate & Advanced SEO | JaredBroussard1 -
If a page ranks in the wrong country and is redirected, does that problem pass to the new page?
Hi guys, I'm having a weird problem: A new multilingual site was launched about 2 months ago. It has correct hreflang tags and geo targeting in GSC for every language version. We redirected some relevant pages (with good PA) from another website of our client's. It turned out that those pages were not ranking in the correct country markets (for example, the en-gb page ranking in the USA). The pages on our site seem to have the same problem. Do you think they inherited it through the redirects? Is it possible that Google will sort things out over time, given that the new pages have correct hreflang tags? Is there anything we could do to help the pages rank in the correct country markets?
Intermediate & Advanced SEO | ParisChildress1 -
My site shows a 503 error to Googlebot, but I can see the site fine. It's not being indexed in Google. Help!
Hi, This site is not indexed on Google at all: http://www.thethreehorseshoespub.co.uk. Looking into it, it seems to be giving a 503 error to Googlebot, although I can see the site fine in a browser. I have checked the source code and robots.txt, and the site did have a sitemap parameter, but I removed it for testing. GWMT is showing 'unreachable' if I submit a sitemap or fetch. Any ideas on how to remove this error? Many thanks in advance. (A quick check of what Googlebot is actually being served is sketched below.)
Intermediate & Advanced SEO | SolveWebMedia0 -
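As a first check of what Googlebot is actually being served in a case like the one above, here is a hedged sketch using Python and the requests library. The user-agent strings and URL are just examples, and some servers vary responses on IP, cookies, or firewall rules rather than the user agent alone, so this is only a starting point:

```python
import requests

URL = "http://www.thethreehorseshoespub.co.uk/"

# Compare the status code served to a browser-like client
# with the one served to a Googlebot-like user agent.
user_agents = {
    "browser": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)",
    "googlebot": ("Mozilla/5.0 (compatible; Googlebot/2.1; "
                  "+http://www.google.com/bot.html)"),
}

for label, ua in user_agents.items():
    response = requests.get(URL, headers={"User-Agent": ua}, timeout=10)
    print(f"{label:>9}: HTTP {response.status_code}")

# If the 'googlebot' request returns 503 while the 'browser' request
# returns 200, the server (or a firewall/CDN rule) is treating crawler
# user agents differently, and that rule is what needs fixing.
```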
How long does it take for a page to show up in Google results after removing noindex from the page?
Hi folks, A client of mine created a new page and used a meta robots noindex tag to keep the page out of search results while they were not ready to launch it. The problem is that Google had somehow already crawled the page, and now, after removing the meta robots noindex, the page does not show up in the results. We've tried to crawl it using Fetch as Googlebot and then submit it using the button that appears. We've included the page in sitemap.xml and also used the old Google submit-URL tool (https://www.google.com/webmasters/tools/submit-url). Does anyone know how long it will take for Google to show the page AFTER removing the meta robots noindex from it? Are there any reliable references for this? I did not find any Google video or post about it. I know it will appear within a few days, but I'd like to have a good reference for the future. Thanks.
Intermediate & Advanced SEO | fabioricotta-840380 -
How Do You Remove Video Thumbnails From Google Search Result Pages?
This is going to be a long question, but, in a nutshell, I am asking if anyone knows how to remove video thumbnails from Google's search result pages. We have had video thumbnails show up next to many of our organic listings in Google's search result pages for several months. To be clear, these are organic listings for our site, not results from performing a video search. When you click on the thumbnail or our listing title, you go to the same page on our site - a list of products or the product page. Although it was initially believed that these thumbnails drew the eye to our listings and that we would receive more traffic, we are actually seeing severe year-over-year declines in traffic to our category pages with thumbnails vs. category pages without thumbnails (where average rank remained relatively constant). We believe this decline is due to several things: (1) an old date stamp that makes our listing look outdated (despite the fact that we can prove Google has spidered and updated its cache of these pages as recently as 2 days ago) - we have no idea where Google is getting this date stamp from; (2) a thumbnail unrelated to the page title - sometimes a picture of a man's face when the category is for women's handbags; and (3) a difference in intent - the user intends to shop or browse, not watch a video, so they skip our listing because it looks like a video, even though both the thumbnail and our listing click through to a category page of products. So we want to remove these video thumbnails from Google's search results without removing our pages from the index. Does anyone know how to do this? We believed that this connection between category page and video was happening in our video sitemap, so we removed all references to videos and category pages in the sitemap. After making this change and resubmitting the sitemap in Webmaster Tools, we have not seen any changes in the search results (it's been over 2 weeks). I've been reading, and it appears many believe that Google can identify video embedded in pages. That makes sense. We can certainly remove videos from our category pages to truly remove the connection between a category page URL and a video thumbnail. However, I don't believe this is enough, because in some cases you can find video thumbnails next to listings where the page has not had a video in months (example: search for "leather handbags" and find www.ebags.com/category/handbags/m/leather - that video does not exist on that page and has not for months; similarly, search for "handbags" and find www.ebags.com/department/handbags - that video has not been on that page since 2010). Any ideas?
Intermediate & Advanced SEO | SharieBags0 -
Best practice for removing indexed internal search pages from Google?
Hi Mozzers, I know that it's best practice to block Google from indexing internal search pages, but what's best practice when "the damage is done"? I have a project where a substantial part of our visitors and income lands on an internal search page, because Google has indexed them (about 3%). I would like to block Google from indexing the search pages via the meta noindex,follow tag, for several reasons: (1) Google's guidelines say "Use robots.txt to prevent crawling of search results pages or other auto-generated pages that don't add much value for users coming from search engines" (http://support.google.com/webmasters/bin/answer.py?hl=en&answer=35769); (2) bad user experience; (3) the search pages are (probably) stealing rankings from our real landing pages; and (4) we received the Webmaster notification "Googlebot found an extremely high number of URLs on your site", with links to our internal search results. I want to use the meta tag to keep the link juice flowing. Do you recommend using robots.txt instead? If yes, why? Should we just go dark on the internal search pages, or how should we proceed with blocking them? I'm looking forward to your answers! (One way to apply noindex,follow to the search URLs is sketched below.) Edit: Google has currently indexed several million of our internal search pages.
Intermediate & Advanced SEO | HrThomsen0 -
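To illustrate the meta-tag route discussed in the question above, here is a minimal sketch that sends an X-Robots-Tag: noindex, follow header on internal search responses - Google treats this header like the equivalent meta robots tag. The Flask setup and the /search route are assumptions for the example; the real site would apply the same header (or meta tag) in its own stack:

```python
from flask import Flask, request

app = Flask(__name__)

@app.route("/search")
def internal_search():
    query = request.args.get("q", "")
    # ... run the site's own search and build the results page ...
    body = f"<html><body>Results for: {query}</body></html>"

    # noindex: drop the page from the index over time;
    # follow: keep passing link equity through the result links.
    return body, 200, {"X-Robots-Tag": "noindex, follow"}
```

Note that for the noindex to be seen at all, the search URLs must remain crawlable: blocking them in robots.txt at the same time would stop Google from re-fetching the already-indexed pages and reading the directive.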
Does Google crawl the pages which are generated via the site's search box queries?
For example, if I search for an 'x' item in a site's search box and the site displays a list of results based on the query, would that page be crawled? I am asking because this would be a URL that does not otherwise exist on the site, and hence I am confused as to whether Google's bots would be able to find it.
Intermediate & Advanced SEO | pulseseo0 -
There's a website I'm working with that has a .php extension. All the pages do. What's the best practice to remove the .php extension across all pages?
Client wishes to drop the .php extension on all their pages (they've got around 2k pages). I assured them that wasn't necessary. However, in the event that I do end up doing this, what's the best-practice (and easiest) way to do it? This is also a WordPress site. Thanks.
Intermediate & Advanced SEO | digisavvy0