Getting rid of duplicate content remaining from old misconfiguration
-
Hi Friends,
We recently (about a month ago) launched a new website, and during the review of that site we spotted a serious misconfiguration of our old, terrible WP site. This misconfiguration, which may have come from sitemaps, internal links, or both, led to our French, German, and English sites displaying on each other's domains. This should be solved now, but they still show in the SERPs. The big question is: what's the best way to safely remove those from the SERPs?
We haven't performed as well as we wanted for a while, and we believe this could be one of the issues. Try searching, for instance, "site:pissup.de stag do -junggesellenabschied" to find English pages on our German domain, each link showing either 301 or 404. This was cleaned up to show 301 or 404 when we launched our new site 4 weeks ago, but I can still see the results in the SERPs, so I assume they still count negatively?
Cheers!
-
Yep, I fixed this one just now as you sent it.
I think the issue with the wrong redirects is mostly me not spotting them all, rather than the redirects I already set not working correctly.
I expect there to be a thousand or more wrong pages, but when I use site:domain.tld plus a word in the wrong language, for instance "evg" (the French term for bachelor party), Google finds only up to 300 (suspiciously, the same maximum for all sites).
-
Hey Rasmus - I honestly think it's an issue with the redirects. I would double check them.
I did just visit https://www.pissup.de/package/basic-survival-4/ and it looks like it's redirecting. Were you able to get those shored up? If you are still having trouble, I would contact your web host to make sure they're in place.
-
Hi John
Yes, the idea is that https://www.pissup.de/package/basic-survival-4/ should redirect to a German equivalent where we have one.
It's strange that it isn't, as it has not been more than a week since I uploaded all the redirects. Perhaps this is down to the site: search not returning all results: if it caps the number of results, then when some are removed it may start showing others that weren't visible before?
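Rather than relying on the capped site: search, the redirects can be verified in bulk. Here is a minimal sketch using only the Python standard library (the helper names are mine, and the URL in the usage comment is just the one discussed above):

```python
import urllib.request
import urllib.error

class NoRedirectHandler(urllib.request.HTTPRedirectHandler):
    """Stop urllib from following redirects so the raw 3xx code is visible."""
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None

_opener = urllib.request.build_opener(NoRedirectHandler)

def fetch_status(url: str) -> int:
    """Return the HTTP status code for url without following redirects."""
    try:
        with _opener.open(url, timeout=10) as resp:
            return resp.status
    except urllib.error.HTTPError as err:
        # With redirects suppressed, 301/302 (and 404 etc.) surface here.
        return err.code

def is_handled(status: int) -> bool:
    """True if the URL has been redirected or removed (301/302/308/404/410)."""
    return status in (301, 302, 308, 404, 410)

# Usage (needs network access), e.g.:
#   status = fetch_status("https://www.pissup.de/package/basic-survival-4/")
#   print(status, "handled" if is_handled(status) else "STILL LIVE")
```

Feeding every known stray URL through `fetch_status` would show immediately which ones still return 200 and need a redirect added.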
-
Hey Rasmus,
Just so I understand - a URL like this: https://www.pissup.de/package/basic-survival-4/ should not be displaying on the German site. The German site should have only German content, right?
I found that page doing the site search listed in your initial question.
What's interesting is that this page isn't redirecting. Let me know your thoughts. I have feedback but I want to make sure of a few things before I share it.
Thanks!
John
-
Hi John
Thanks for taking the time to answer!
The URLs were already showing 301 or 404 when we discovered them after launching the new site.
What we did so far:
- set up 301 redirects from pissup.com/german-url to pissup.com/english-equivalent where available, or to the closest similar page
- added a sitemap with these URLs in the hope they'd be crawled faster
- waited
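For reference, the 301s in the first step might look something like this in Apache .htaccess (a sketch only; the paths mirror the placeholders above, not the real URLs):

```apache
# Hypothetical .htaccess on pissup.com; paths are placeholders.
# One-off mapping:
Redirect 301 /german-url /english-equivalent

# Pattern-based mapping when many stray URLs share a prefix:
RewriteEngine On
RewriteRule ^german-section/(.*)$ /english-section/$1 [R=301,L]
```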
We were advised it was better to redirect than to ask for removal. Do you disagree with this advice, and if so, why?
We're not really seeing a decrease yet for these issues in the SERPs. Some decrease by 5-10%, but some don't. Could it be because we aren't seeing them all in the SERPs? If so, is there anywhere else we could find them (all URLs indexed by Google on our domain)?
-
Hey Rasmus,
In finding these indexed pages, I'm assuming that you did the following:
1. no-indexed the pages from the domain you are concerned about
2. disallowed them in robots.txt (just another step to help speed things up)
3. Used the URL removal tool in Google Search Console
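As a concrete sketch of steps 1-2 (the path is illustrative, not the real site structure): step 1 means adding `<meta name="robots" content="noindex">` to the head of each stray page, and step 2 is a robots.txt rule such as:

```
# robots.txt on the affected domain (illustrative path)
User-agent: *
Disallow: /stray-section/
```

One caveat worth noting: once a path is disallowed in robots.txt, Google may stop recrawling it and so never see the noindex tag, which is why some people apply the noindex first and add the disallow only after the URLs drop out of the index.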
Unfortunately, it does take time for Google to process these URLs out of the SERPs. Hopefully you are seeing a decrease in the URLs shown in the SERPs.
Also, don't forget to do this via Bing Webmaster Tools too!