Unsolved: Using a NoIndex tag instead of a 410 Gone code on discontinued products?
-
Hello everyone, I am very new to SEO and I wanted to get some input & second opinions on a workaround I am planning to implement on our Shopify store. Any suggestions, thoughts, or insight you have are welcome & appreciated!
For those who aren't aware, Shopify as a platform doesn't allow us to send a 410 Gone Code/Error under any circumstance. When you delete or archive a product/page, it becomes unavailable on the storefront. Unfortunately, the only thing Shopify natively allows me to do is set up a 301 redirect. So when we are forced to discontinue a product, customers currently get a 404 error when trying to go to that old URL.
My planned workaround is to automatically detect when a product has been discontinued and add a NoIndex meta tag to the product page. The product page will stay up but be unavailable for purchase. I am also adjusting the JSON-LD to list the product's availability as Discontinued instead of InStock/OutOfStock.
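For illustration, here's a minimal Python sketch of the two markup changes described above. The helper and the sample values are hypothetical, but the schema.org availability URLs are the real ones:

```python
import json

# The robots meta tag to inject when a product is discontinued.
NOINDEX_TAG = '<meta name="robots" content="noindex">'

def product_jsonld(name: str, url: str, price: str, discontinued: bool) -> str:
    """Build a schema.org Product JSON-LD blob, flipping availability
    to Discontinued when the product has been pulled from sale."""
    availability = ("https://schema.org/Discontinued" if discontinued
                    else "https://schema.org/InStock")
    return json.dumps({
        "@context": "https://schema.org",
        "@type": "Product",
        "name": name,
        "url": url,
        "offers": {
            "@type": "Offer",
            "price": price,
            "availability": availability,
        },
    }, indent=2)
```

In a Shopify theme you'd produce the equivalent output in Liquid, keyed off a product tag or metafield that marks the item as discontinued.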
Then I let the page sit for a few months so that crawlers have a chance to recrawl and remove the page from their indexes. I think that is how that works?
Once 3 to 6 months have passed, I plan on archiving the product and then setting up a 301 redirect pointing to our internal search results page. The redirect will send the visitor to a search query aimed at similar products. That should prevent people with open tabs, bookmarks, and direct links to that page from receiving a 404 error.
I do have Google Search Console set up and integrated with our site, but manually telling Google to remove a page obviously only affects their index.
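The eventual redirect step can also be scripted against the Shopify Admin API's redirects endpoint rather than created by hand. The sketch below only builds the request payload; the paths and search terms are placeholder assumptions:

```python
from urllib.parse import quote_plus

def build_redirect(old_path: str, search_terms: str) -> dict:
    """Payload for Shopify's redirect resource (POST .../redirects.json):
    send the retired product URL to an internal search query aimed at
    similar products, so bookmarks and open tabs avoid a 404."""
    return {
        "redirect": {
            "path": old_path,
            "target": f"/search?q={quote_plus(search_terms)}",
        }
    }

# Hypothetical discontinued product redirected to a search for lookalikes.
payload = build_redirect("/products/blue-widget-mk1", "blue widget")
```

You would POST this payload to the store's Admin API with an authenticated client once the waiting period is over.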
Will this work the way I think it will?
Will search engines remove the page from their indexes if I add the NoIndex meta tag after it has already been indexed?
Is there a better way I should implement this?
P.S. For those wondering why I am not disallowing the page URL in robots.txt: Shopify won't let me call collection or product data from within the template that assembles the robots.txt, so I can't automatically add product URLs to the list.
-
@maribailey10 If I could I would, but we very rarely have products similar enough to the discontinued one for that approach to make sense. That's why I plan on sending them to a search query page.
Occasionally we can replace a discontinued product with a direct successor, but that rarely happens.
-
No. Just try to interlink them with other similar products and edit the content accordingly.
Related Questions
-
Over Optimised Magento Pages
We are working on a client's Magento site and we've added new copy with a keyword density in line with best practice. When we run it through Moz we get a Keyword Stuffing alert saying the page has 27 keywords, where we can only see about 11. This is the page: https://www.greatbeanbags.com/bean-bag-cushions The client is pushing back, saying the page must have already been optimised before, as our new copy has triggered the stuffing alert. But my guess is the page was already stuffed by some Magento code we can't see. Any ideas? #magento #Keyworddensity
Content Development | Marketing_Optimist
-
Good to use disallow or noindex for these?
Hello everyone, I am seeking your advice on a few technical SEO aspects of my website.
Technical SEO | williamhuynh
Below are the specific areas I would like to discuss:

a. Double and triple filter pages: I have identified certain URLs on my website that have a canonical tag pointing to the main /quick-ship page. These URLs are as follows:
https://www.interiorsecrets.com.au/collections/lounge-chairs/quick-ship+black
https://www.interiorsecrets.com.au/collections/lounge-chairs/quick-ship+black+fabric

Considering the need to optimise my crawl budget, would it be advisable to disallow or noindex these pages? My understanding is that by doing so, search engines can avoid wasting resources on crawling and indexing duplicate or filtered content.

b. Page URLs with parameters: Some of my page URLs include parameters such as ?variant and ?limit. Although these URLs already have canonical tags in place, is it still recommended to disallow or noindex them to further conserve crawl budget?

Additionally, I would appreciate any suggestions regarding internal linking strategies tailored to my website's structure and content. Thank you in advance, and please let me know if you need any further information. Cheers!
-
Unsolved: Duplicate content in order pages of multiple products
Hi, I have a website containing 30 software products. Each product has an order page. The problem is that the layout and content of these 30 order pages are very similar, except for the product name, for example: https://www.datanumen.com/access-repair-order/
On-Page Optimization | ccw
https://www.datanumen.com/outlook-repair-order/
https://www.datanumen.com/word-repair-order/ Siteliner reports these pages as duplicate content. I am thinking of noindexing these pages. However, in that case, if a user searches for "DataNumen Outlook Repair order page", he will not be able to see the order page of our product, which drives revenue away. So, how should I deal with this case? Thank you.
-
How can I make a list of all URLs indexed by Google?
I have a large site with over 6000 pages indexed but only 600 actual pages, and I need to clean it up with 301 redirects. Haven't had this need since Google stopped displaying the URLs in the results.
SEO Tactics | aplusnetsolutions
-
Product meta tags are not updating in my Magento website!
I need some help! For some reason, each time I update the product meta tags in my Magento website, the changes don't show on the live site. Could someone help me understand why that is?
Technical SEO | One2OneDigital
-
John Mueller says don't use Schema as its not working yet but I get markup conflicts using Google Mark-up
I recently watched John Mueller's Google Webmaster Hangout [Dec 5th]. In it he tells a member not to use Schema.org, as it's not working quite yet, but to use Google's own markup tool, the Structured Data Markup Helper. Fine, this I have done, and one of the tags I've used is 'author'. However, if you use Google's Structured Data Testing Tool in GWMT you get an error saying the following: Error: Page contains property "author" which is not part of the schema. Yet this is the tag generated by their own tool. Has anyone experienced this before? If so, what action did you take to rectify it and make it work? As it stands I'm considering just removing this tag altogether. Thanks, David
Technical SEO | David-E-Carey
-
Google SERPs and NoIndex directives.
We have pages that have been added to robots.txt as URL patterns in Disallow. We also have meta noindex tags on the pages themselves. But we are finding the pages in the index. I don't think they rank highly, and they don't have any descriptions, previews, or cached pages. Why does Google show these pages? Could it be due to internal or external linking?
Technical SEO | gaganc
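One likely cause worth noting for the question above: a robots.txt Disallow stops the crawler from fetching the page at all, so an on-page meta noindex is never seen, and the URL can still be indexed from internal or external links alone. A quick sketch of the blocking behavior using Python's stdlib robots.txt parser (URLs are placeholders):

```python
from urllib.robotparser import RobotFileParser

# Parse a minimal robots.txt that disallows a directory.
rp = RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /private/",
])

# A disallowed page is never fetched, so any noindex tag on it
# goes unread; to have noindex honored, the page must be crawlable.
blocked = not rp.can_fetch("*", "https://example.com/private/page.html")
print(blocked)  # True
```

In short: use robots.txt Disallow or meta noindex, not both on the same URL, if the goal is deindexing.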
Is anyone using Media Temple?
I'm looking to move 5 of my sites from Hostgator's shared servers to Media Temple's dedicated virtual servers. Anyone have experience with (mt)? I'm planning on adding a few more sites this year, and several things they offer are attractive to me:
- A (virtually) dedicated environment: faster websites, better user experience, plus I like having some control over my site's resources
- Scalability: I can add more resources easily (although not super cheap)
- Unique control panels for each site: more control for my tech-savvy clients
- Unique IPs for $1 a month: more link juice between my related sites
$50/month is a big jump from my $12/month Hostgator account, but I'm thinking it will be worth it. Am I on the right track or is this a fool's errand?
Technical SEO | AaronParrish