Is it possible to deindex old URLs that contain duplicate content?
-
Our client is a recruitment agency whose website used to contain a substantial amount of duplicate content, as many of the listed job descriptions were repeated and recycled. As a result, their rankings rarely progress beyond page 2 on Google. Although they have started using more unique content for each listing, the old job listing pages still appear to be indexed, so our assumption is that Google is holding down the rankings because of the duplicate content (one tool reported 43% duplicate content across the website).
Looking at other recruitment websites, it appears that they block the actual job listings via the robots.txt file.
Would blocking the job listing pages from being indexed, either via robots.txt or a noindex tag, reduce the negative impact of the duplicate content? And would doing so also lose any link juice coming to those pages?
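For clarity, here is roughly what each option would look like; the /jobs/ path below is a placeholder rather than our actual URL structure:

```
# robots.txt — blocks crawling of the listing pages
# (note: this stops Google from crawling the URLs, not necessarily
# from indexing them if they are linked to from elsewhere)
User-agent: *
Disallow: /jobs/
```

```html
<!-- noindex meta tag placed in the <head> of each job listing page.
     Google has to be able to crawl the page to see this tag, so the
     URL must NOT also be blocked in robots.txt. "follow" leaves the
     links on the page able to pass equity. -->
<meta name="robots" content="noindex, follow">
```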
In addition, the expired job listing URLs stay live, which is likely adding to the overall duplicate content. Would it be worth removing these pages and serving 404s, given that any links to those pages would be lost? And if the pages are removed, is it possible to permanently deindex the URLs?
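One option we are considering is serving a 410 (Gone) instead of a 404, since a 410 is generally understood to be a more explicit removal signal. A hypothetical Apache .htaccess sketch, using a made-up listing path:

```apache
# .htaccess (Apache, mod_alias) — return 410 Gone for an expired listing.
# "/jobs/1234-sales-executive" is a hypothetical path, not a real URL.
Redirect gone /jobs/1234-sales-executive
```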
Any help is greatly appreciated!
-
Hi Harry, he should be able to set this up dynamically. Otherwise, yes, I do feel sorry for him, but then he didn't build it right the first time anyway. Martijn.
-
Hi Martijn,
Thank you for responding. I think canonical tags are the best way forward; I look forward to explaining to the web developer that we need several hundred tags implemented!
Many thanks
-
I would definitely make sure that the expired pages do indeed return a 404.
In addition, for the duplicate content pages you could use a canonical tag pointing back to the original page. Wouldn't that work for you?
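A canonical tag is just a single line in the <head> of each duplicate page; a minimal sketch, with a placeholder URL standing in for the original listing:

```html
<!-- Placed in the <head> of each duplicate/recycled listing page.
     The href is a placeholder for the original listing's URL. -->
<link rel="canonical" href="https://www.example.com/jobs/original-listing">
```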
-
Hello,
If you want to speed up the deindexing process, then after setting up all the noindex tags you could upload a sitemap in GSC containing all the URLs you want recrawled, as sketched below.
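A minimal sketch of such a sitemap, using placeholder URLs:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Minimal sitemap listing the URLs you want Google to recrawl so it
     can pick up the new noindex tags. The URLs are placeholders. -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/jobs/expired-listing-1</loc>
  </url>
  <url>
    <loc>https://www.example.com/jobs/expired-listing-2</loc>
  </url>
</urlset>
```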