How to remove duplicate content that is still indexed but no longer linked to?
-
Dear community
A bug in the tool we use to create search-engine-friendly URLs (sh404sef) changed our whole URL structure overnight, and we only noticed after Google had already indexed the pages.
Now we have a massive duplicate content issue, causing a harsh drop in rankings. Webmaster Tools shows over 1,000 duplicate title tags, so I don't think Google understands what is going on.
<code>Right URL: abc.com/price/sharp-ah-l13-12000-btu.html
Wrong URL: abc.com/item/sharp-l-series-ahl13-12000-btu.html (created by mistake)</code>
After that, we ...
- Changed back all URLs to the "Right URLs"
- Set up a 301-redirect for all "Wrong URLs" a few days later
Now a massive number of pages is still in the index twice. As we do not link internally to the "Wrong URLs" anymore, I am not sure if Google will re-crawl them very soon.
What can we do to solve this issue and tell Google that all the "Wrong URLs" now redirect to the "Right URLs"?
Best, David
-
Yes, David, your link is very helpful.
-
Found the perfect answer:
http://www.seomoz.org/blog/uncrawled-301s-a-quick-fix-for-when-relaunches-go-too-well
-
Thanks a lot, Sanket.
Do you think it might help to submit a sitemap that also contains the "Wrong URLs", so we can trigger a recrawl of those pages? Maybe then Google will notice the 301 redirects.
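Just to illustrate what I mean, here is a rough Python sketch of how such a temporary sitemap listing only the "Wrong URLs" could be generated (the URL list and file name below are placeholders, not our real data):
<code># Rough sketch: build a temporary sitemap that lists only the old "Wrong URLs",
# so Google re-crawls them and discovers the 301 redirects.
# The URL list and output file name below are placeholders.
wrong_urls = [
    "http://abc.com/item/sharp-l-series-ahl13-12000-btu.html",
    # ... the rest of the old URLs, e.g. exported from our logs
]

with open("sitemap-old-urls.xml", "w", encoding="utf-8") as f:
    f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
    f.write('<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
    for url in wrong_urls:
        f.write(f"  <url><loc>{url}</loc></url>\n")
    f.write("</urlset>\n")</code>
We would submit that file in Webmaster Tools alongside the normal sitemap and remove it again once the redirects have been picked up.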
-
Hi David
The best thing in this situation is to wait a bit longer. You have only just set up the redirects from the "Wrong URLs" to the "Right URLs", so it will take some time. In Webmaster Tools you will see the changes later, because the data there is updated every 15 days or monthly, depending on the website, so you need to wait. A URL that has been 301-redirected should not appear in the search results, so the duplication problem should be sorted out shortly, don't worry. You can also verify whether the redirects are done correctly with this redirect checker tool: http://www.internetofficer.com/seo-tool/redirect-check/.
One suggestion to get your pages crawled faster: maximize the "Crawl Rate" under the Settings option in Webmaster Tools.
Hope my response helps. If you need any more help, feel free to ask.
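If you want to spot-check the old URLs in bulk yourself, a rough Python sketch like the one below could also do it. It assumes the requests library is installed, and the single URL in the list is just the example from your question; replace it with your real list.
<code>import requests

# Hypothetical list of the old "Wrong URLs" - replace with the real list,
# e.g. read from a file or exported from your logs.
wrong_urls = [
    "http://abc.com/item/sharp-l-series-ahl13-12000-btu.html",
]

for url in wrong_urls:
    # Do not follow the redirect automatically; we want the raw status code.
    response = requests.get(url, allow_redirects=False, timeout=10)
    status = response.status_code
    target = response.headers.get("Location", "-")
    if status == 301:
        print(f"OK     {url} -> {target}")
    else:
        print(f"CHECK  {url} returned {status} (expected 301)")</code>
Anything that does not come back as a 301 is worth fixing before you resubmit the sitemap.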
Related Questions
-
Could duplicate (copied) content actually hurt a domain?
Hi 🙂 I run a small WordPress multisite network where the main site is an informative portal about the Langhe region in Italy, and the subsites are websites of small local companies in the tourism and wine/food niche. As an additional service for those who build a website with us, I was thinking about giving them the possibility to use some of our portal's content (such as sights, events, etc.) on their website, in an automatic way. Not as an "SEO" plus, but more as a service for their current user/visitor base: so if you have a B&B you can have on your site an "events" section with curated content, or a section about things to see (monuments, parks, museums, etc.) in that area, so that your visitors can enjoy reading some content about the territory. I was wondering if, apart from NOT being beneficial, it would be BAD from an SEO point of view... i.e., if they could actually be penalized by Google. Thanks 🙂 Best
Intermediate & Advanced SEO | Enrico_Cassinelli
-
How to fix Duplicate Content Warnings on Pagination? Indexed Pagination?
Hi all! So we have a WordPress blog that properly has pagination tags of rel="prev" and rel="next" set up for its pages, but we're still getting duplicate content crawl errors from Moz on all of our pagination pages. All of our paginated pages are also being indexed, and I'm talking pages as deep as page 89 for the home page. Is this something I should ignore? Is it potentially hurting my SEO? If so, how can I start tackling a fix? Would "noindex" or "nofollow" be a good idea? Any help would be greatly appreciated!
Intermediate & Advanced SEO | jampaper
-
Duplicate content on URL trailing slash
Hello, Some time ago we accidentally made changes to our site which modified the way URLs in links are generated. At once, trailing slashes were added to many URLs (only in links). Links that used to point to example.com/webpage.html were now linking to example.com/webpage.html/. URLs in the XML sitemap remained unchanged (no trailing slash). We started noticing duplicate content (because our site renders the same page with or without the trailing slash). We corrected the problematic PHP URL function so that now all links on the site point to a URL without a trailing slash. However, Google had time to index these pages. Is implementing 301 redirects required in this case?
Intermediate & Advanced SEO | yacpro13
-
Partial duplicate content and canonical tags
Hi - I am rebuilding a consumer website, and each product page will contain a unique product image and a sentence or two about the product (and we tend to use a lot of the same words in different ways across products). I'd like to have a tabbed area below the product info that talks about the overall product line, and this content would be duplicated across all the product pages (a "Why use our products" type of thing). I'd have this duplicate content also living on its own URLs so they can be found alone in the SERPs. Question is, do I need to add the canonical tag to this page, since there's partial duplicate content on the product pages? And if I did that, would my product pages go unindexed? I understand how to handle completely duplicated content; it's the partial duplication that I'm having difficulty figuring out.
Intermediate & Advanced SEO | Jenny1
-
Making AJAX called content indexable
Hi, I've read up a bit on making AJAX-called content indexable and there seem to be a number of options available, and the recommended methods seem to change over time. My situation is this: on a product page I have a list of reviews, of which I show the latest 10. The rest of the reviews are in a paginated format where, if the user clicks a "next" button, the next set loads in the same page via AJAX. Now ideally I would like all this content indexable, as we have hundreds of reviews per product, but at the moment only the latest 10 reviews are indexed. So what is the best / simplest way of getting Google to index all these reviews and associate them with this product page? Many thanks
Intermediate & Advanced SEO | James77
-
Should I remove footer links?
I added footer links to my site some months ago, as I figured that any authority my home page had would be distributed to several of my other most important pages, helping them to rank. Would I be better off removing them, and would that improve the authority of my home page, as less 'link juice' would be distributed? I did originally set up a page per keyword on my site and started building links to each one, but as my home page has good authority I am going to target several keywords on my home page instead, as I have some way to go to improve the authority of my other important pages and think this would be a better solution. It would reduce the number of links I have per page; however, I did see Matt Cutts say that the "no more than 100 links per page" rule doesn't apply any more. Do footer links add any SEO value?
Intermediate & Advanced SEO | SamCUK
-
Is it fine to use an iframe for video content? Will it still be indexed on your URL?
If we host a video on a third party site and use an iframe to display it on our site, when the video is indexed in SERPs will it show on our site or on the third party site?
Intermediate & Advanced SEO | nicole.healthline
-
Should I do something about this duplicate content? If so, what?
On our real estate site we have our office listings displayed. The listings are generated from a scraping script that I wrote. As such, all of our listings have the exact same description snippet as every other agent in our office. The rest of the page consists of site-wide sidebars and a contact form. The title of the page is the address of the house and so is the H1 tag. Manually changing the descriptions is not an option. Do you think it would help to have some randomly generated stuff on the page such as "similar listings"? Any other ideas? Thanks!
Intermediate & Advanced SEO | MarieHaynes