Using NoIndex Tag instead of 410 Gone Code on Discontinued products?
-
Hello everyone, I am very new to SEO and I wanted to get some input & second opinions on a workaround I am planning to implement on our Shopify store. Any suggestions, thoughts, or insight you have are welcome & appreciated!
For those who aren't aware, Shopify as a platform doesn't let us return a 410 Gone status code under any circumstance. When you delete or archive a product/page, it simply becomes unavailable on the storefront, and the only thing Shopify natively lets me do about the old URL is set up a 301 redirect. So when we are forced to discontinue a product, customers currently get a 404 error when they try to visit that old URL.
My planned workaround is to automatically detect when a product has been discontinued and add a NoIndex meta tag to the product page. The product page will stay up but be unavailable for purchase. I am also adjusting the JSON-LD to list the product's availability as Discontinued instead of InStock/OutOfStock.
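Below is a minimal sketch of what that could look like in the theme's head (e.g. in theme.liquid or a snippet). The "discontinued" product tag and the exact markup are my own assumptions for illustration, not anything Shopify prescribes:

```liquid
{%- comment -%}
  Hedged sketch: assumes discontinued products carry a hypothetical "discontinued" tag.
  For those pages, emit a noindex robots meta tag plus JSON-LD that marks the
  offer availability as Discontinued.
{%- endcomment -%}
{%- if template contains 'product' and product.tags contains 'discontinued' -%}
  <meta name="robots" content="noindex">
  <script type="application/ld+json">
  {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": {{ product.title | json }},
    "url": {{ canonical_url | json }},
    "offers": {
      "@type": "Offer",
      "price": {{ product.price | divided_by: 100.0 | json }},
      "priceCurrency": {{ shop.currency | json }},
      "availability": "https://schema.org/Discontinued"
    }
  }
  </script>
{%- endif -%}
```

If the theme already outputs its own Product JSON-LD, the availability value would be adjusted there instead of adding a second block.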
Then I let the page sit for a few months so that crawlers have a chance to recrawl and remove the page from their indexes. I think that is how that works?
Once 3 or 6 months have passed, I plan on archiving the product and then setting up a 301 redirect pointing to our internal search results page. The redirect will send visitors to a search with a query aimed at similar products. That should prevent people with open tabs, bookmarks, and direct links to that page from receiving a 404 error.
I do have Google Search Console set up and integrated with our site, but manually telling Google to remove a page obviously only affects their index.
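To make that last step concrete, the redirect pairs I have in mind look roughly like this (the product handles and queries are made up; I believe the URL Redirects screen under Online Store > Navigation also accepts a CSV import in roughly this shape, though double-check the header names):

```csv
Redirect from,Redirect to
/products/acme-widget-2000,/search?q=widget
/products/blue-gadget-mk1,/search?q=blue+gadget
```

As far as I know, Shopify only applies a redirect once the original path stops resolving, so these would only kick in after the product is archived, which matches the order of steps above.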
Will this work the way I think it will?
Will search engines remove the page from their indexes if I add the NoIndex meta tag after it has already been indexed?
Is there a better way I should implement this?
P.S. For those wondering why I am not disallowing the page URLs in robots.txt: Shopify won't allow me to call collection or product data from within the template that assembles the robots.txt, so I can't automatically add product URLs to the list.
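For anyone curious about that template: my understanding is that robots.txt.liquid only exposes a robots object (the default user-agent groups and their rules), not products or collections, so the most you can do is append static rules. Roughly (from memory, so treat the exact objects and the example path as assumptions), the stock template looks like this:

```liquid
{%- comment -%}
  Approximate stock robots.txt.liquid: it iterates over robots.default_groups.
  Product/collection data isn't available here, so only static rules can be
  appended - the extra Disallow line below is a hypothetical example.
{%- endcomment -%}
{% for group in robots.default_groups %}
  {{- group.user_agent }}
  {% for rule in group.rules %}
    {{- rule }}
  {% endfor %}
  {%- if group.user_agent.value == '*' %}
    {{ 'Disallow: /pages/some-hidden-page' }}
  {%- endif %}
  {%- if group.sitemap != blank %}
    {{ group.sitemap }}
  {%- endif %}
{% endfor %}
```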
-
@maribailey10 If I could I would, but we very rarely have products similar enough to the discontinued one for that approach to make sense, which is why I plan on sending them to a search query page instead.
Occasionally we're able to point a discontinued product straight at a direct replacement, but that rarely happens.
-
No, just try to interlink them to other similar products and edit the content accordingly.
Related Questions
-
How to Increase Website Visibility on Google and Bing?
I am working on an e-commerce niche website and I aim to rank higher on Google to drive more traffic to my website. Any suggestions?
Link Building | | digitalenginehub0
-
Requiring customers to agree to shipping terms at checkout
I work for an ecommerce company that has many of its shipments go by LTL freight. Our customer service team has issues with a few customers per month who aren't equipped to receive freight shipments, which leads to returns and other issues. In an effort to better inform our customers, the customer service team is requesting that we add a checkbox to the checkout that requires customers to agree to our shipping and returns policy, including a link to the policy page. I am wondering how concerned people here would be that requiring the customer to check a box agreeing to those terms would lead to more customers abandoning during the checkout process. Or do you think it's not a concern? Thanks for your thoughts.
Conversion Rate Optimization | | Kyle_M0
-
Google Not Indexing Pages (Wordpress)
Hello, recently I started noticing that Google is not indexing our new pages or our new blog posts. We are simply getting a "Discovered - Currently Not Indexed" message on all new pages. When I click "Request Indexing" it takes a few days, but eventually the page does get indexed and appears on Google. This is very strange, as our website has been around since the late 90's and the new content is neither duplicate nor "low quality". We started noticing this happening around February. We also do not have many pages - maybe 500 maximum? I have looked at all the obvious answers (allowing for indexing, etc.), but just can't seem to pinpoint a reason why. Has anyone had this happen recently? It is getting very annoying having to manually go in and request indexing for every page, and it makes me think there may be some underlying issues with the website that should be fixed.
Technical SEO | | Hasanovic1
-
Blogs Not Getting Indexed Intermittently - Why?
Over the past 5 months, many of our clients have been having indexing issues with their blog posts.
A blog from 5 months ago could be indexed, and a blog from 1 month ago could be indexed, but blogs from 4, 3 and 2 months ago aren't indexed. It isn't consistent, and there is no commonality across all of these clients that would point to why this is happening. We've checked sitemaps, robots, canonical issues, internal linking, combed through Search Console, run Moz reports, run SEM Rush reports (sorry Moz), but can't find anything. We are now manually submitting URLs to be indexed to try and ensure they get into the index. Search Console reports for many of the URLs will show that the blog has been fetched and crawled, but not indexed (with no errors). In some cases we find that the blog paginated pages (i.e. blog/page/2, blog/page/3, etc.) are getting indexed but not the blogs themselves. There aren't any nofollow tags on the links going to the blogs either. Any ideas? I've added a screenshot of one of the URL inspection reports from Search Console.
Technical SEO | | JohnBracamontes
-
I am trying to generate GEO meta tags for my website, where one page has multiple locations. My question is: can I add GEO tagging for every address?
Am I restricted to one geo tag per page, or can I add multiple geo tags?
Technical SEO | | lina_digital0
-
The use of robots.txt
Could someone please confirm that if I do not want to block any pages on my site, then I do not need a robots.txt file? Thanks
Technical SEO | | ICON_Malta0
-
What if meta description tag comes before meta title tag? Do the search engines disregard or penalize if the order is not title then description in the HTML?
A client's webmaster is a newbie to SEO and did just this. Suggestions?
Technical SEO | | alankoen1230
-
How do I use only one URL
My site can be reached by both www.site.com and site.com. How do I make it use only www?
Technical SEO | | Weblion0