Getting rid of duplicate content remaining from old misconfiguration
-
Hi Friends,
We recently (about a month ago) launched a new website, and during the review of that site we spotted a serious misconfiguration of our old, terrible WP site. This misconfiguration, which may have come from sitemaps, internal links, or both, led to our French, German, and English sites being displayed on each other's domains. This should be solved now, but the pages still show in the SERPs.
The big question is: what's the best way to safely remove those from the SERPs?
We haven't performed as well as we wanted for a while, and we believe this could be one of the issues. Try searching, for instance, "site:pissup.de stag do -junggesellenabschied" to find English pages on our German domain, with each link returning either a 301 or a 404. This was cleaned up to return 301s or 404s when we launched our new site 4 weeks ago, but I can still see the results in the SERPs, so I assume they still count against us?
Cheers!
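P.S. For reference, this is roughly how we have been spot-checking what individual leftover URLs return (a quick curl sketch; the URL below is a placeholder, not a specific page):
# Print the status code and redirect target for one leftover URL (placeholder path)
curl -s -o /dev/null -I -w "%{http_code} -> %{redirect_url}\n" "https://www.pissup.de/example-english-page/"
# Typical output: "301 -> https://www.pissup.de/german-equivalent/" or "404 -> " for removed pages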
-
Yep, this one I fixed just now as you sent it.
I think the issue with the wrong redirects is mostly that I haven't spotted them all, rather than a problem with the redirects I've already set up not working correctly.
I expect there to be a thousand-plus wrong pages, but when I use site:domain.tld plus a word in the wrong language, for instance "evg" (the French term for bachelor party), Google only finds up to about 300 (suspiciously the same maximum for all our sites).
-
Hey Rasmus - I honestly think it's an issue with the redirects. I would double check them.
I did just visit https://www.pissup.de/package/basic-survival-4/ and it looks like it's redirecting now. Were you able to get the rest of those shored up? If you are still having trouble, I would contact your web host to make sure the redirects are in place.
-
Hi John
Yes, the idea is that https://www.pissup.de/package/basic-survival-4/ should redirect to a German equivalent where we have one.
It's strange that it isn't, as it has been no more than a week since I uploaded all the redirects. Perhaps this is down to the site: search not returning all results? If it caps the number of results, then as some URLs are removed it might start showing others that weren't visible before.
-
Hey Rasmus,
Just so I understand: a URL like this, https://www.pissup.de/package/basic-survival-4/, should not be appearing on the German site. The German site should just have German content, right?
I found that page using the site: search from your initial question.
What's interesting is that this page isn't redirecting. Let me know your thoughts. I have feedback but I want to make sure of a few things before I share it.
Thanks!
John
-
Hi John
Thanks for taking the time to answer!
The URLs were already returning a 301 or 404 when we discovered them after launching the new site.
What we have done so far is this:
- Set up 301 redirects from pissup.com/german-url to pissup.com/english-equivalent where available, or to the closest similar page (a rough sketch of these rules is just below)
- Added a sitemap containing these URLs in the hope they'd be recrawled faster
- Waited
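For context, the redirect rules look roughly like the following sketch (assuming an Apache .htaccess setup; the paths are placeholders, not our actual URLs):
# Sketch of the 301 mappings on pissup.com (assumes Apache with mod_alias; paths are placeholders)
# German URL that leaked onto the English domain -> its English equivalent
Redirect 301 /german-url-example/ https://www.pissup.com/english-equivalent-example/
# Where no direct equivalent exists -> the closest similar page
Redirect 301 /another-german-url/ https://www.pissup.com/closest-similar-page/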
We were advised it was better to redirect than to ask for removal. Do you disagree with this advice, and what makes you think so?
We're not really seeing a decrease yet for these issues in the SERPs. Some drop by 5-10%, but some don't. Could it be because we are not seeing them all in the SERPs, and in that case, is there anywhere else we could find them (all URLs indexed by Google on our domain)?
-
Hey Rasmus,
In dealing with these indexed pages, I'm assuming that you did the following:
1. Noindexed the pages on the domain you are concerned about
2. Disallowed them in robots.txt (just another step to help speed things up)
3. Used the URL removal tool in Google Search Console
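For clarity, here is a rough sketch of what steps 1 and 2 can look like (illustrative only; the path is a placeholder, not one of your real URLs):
<!-- Step 1 (sketch): robots meta tag in the <head> of each stray page you want dropped from the index -->
<meta name="robots" content="noindex">
# Step 2 (sketch): robots.txt on the affected domain, disallowing a placeholder path
User-agent: *
Disallow: /example-english-path/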
Unfortunately, it does take time for Google to process these URLs out of the SERPs. Hopefully, you are seeing a decrease in the URLs shown in the SERPs.
Also, don't forget to do this via Bing Webmaster Tools too!
-
Related Questions
-
Can I use duplicate content in different US cities without hurting SEO?
So, I have major concerns with this plan. My company has hundreds of facilities located all over the country. Each facility has its own website. We have a third-party company working to build a content strategy for us. What they came up with is to create a bank of content specific to each service line. If/when any facility offers that service, they then upload the content for that service line to that facility's website. So in theory, you might have 10-12 websites, all in different cities, with the same content for a service. They claim, "Google is smart, it knows the content is all from the same company, and because it's in different local markets, it will still rank." My contention is that duplicate content is duplicate content, and unless it is localized, Google is going to prioritize one page of it and the rest will get very little exposure in the rankings, no matter where you are. I could be wrong, but I want to be sure we aren't shooting ourselves in the foot with this strategy, because it is a major, major undertaking and too important to go off in the wrong direction. SEO experts, your help is genuinely appreciated!
Intermediate & Advanced SEO | MJTrevens
-
Duplicate page content errors for Web App Login
Hi there, I have 6 duplicate content errors, but they are for the Web App login from our website. I have put a noindex on the sitemap to stop Google from indexing them, to see if that would work, but it didn't. As far as I can see, these links are not even on the website www.skemaz.net, but are beyond the website and on the Web App itself, e.g.:
http://login.skemaz.net
http://login.skemaz.net/LogIn?ReturnUrl=%2Fchangepassword
http://login.skemaz.net/Login
http://login.skemaz.net/LogIn?ReturnUrl=%2FHome
Any suggestions would be greatly appreciated. Kind regards, Sarah
Intermediate & Advanced SEO | Skemazer
-
Duplicate content on product pages
Hi, we are considering the impact of delivering content directly on product pages. If the products were manufactured in a specific way, and it's the same process across 100 other products, you might want to tell your readers about it. If you believe the product page is the best place to deliver this information to your readers, then you could potentially be creating mass content duplication, especially as the storytelling about the product could equate to 60% of the page content, which could really flag as duplication. Our options would appear to be:
1. Instead add the content as a link on each product page to one centralised URL, and risk taking users away from the product page (not going to help with conversion rate or the designers' plans)
2. Put the content behind some JavaScript which requires interaction, hopefully deterring the search engine from crawling the content (doesn't fit the designers' plans, and users have to interact, which is a big ask)
3. Assign one product as the canonical and risk the other products not appearing in search for relevant searches
4. Leave the copy crawlable and risk being marked down or de-indexed for duplicated content
It seems the search engines do not offer a way for us to serve this great content to our readers without being at risk of going against guidelines, or of the search engines not being able to crawl it. How would you suggest a site should go about this for optimal results?
Intermediate & Advanced SEO | FashionLux
-
Pagination causing duplicate content problems
Hi, the pagination on our website www.offonhols.com is causing duplicate content problems. Is the best solution adding rel="prev" / rel="next" to the hrefs? At the moment the pagination links at the bottom of the page are just:
http://offonhols.com/default.aspx?dp=1
http://offonhols.com/default.aspx?dp=2
http://offonhols.com/default.aspx?dp=3
etc.
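As a rough illustration of that approach (placeholder markup, not site-specific advice), rel="prev"/rel="next" tags would sit in the <head> of each paginated page; e.g. on page 2:
<!-- Sketch for http://offonhols.com/default.aspx?dp=2 -->
<link rel="prev" href="http://offonhols.com/default.aspx?dp=1">
<link rel="next" href="http://offonhols.com/default.aspx?dp=3">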
Intermediate & Advanced SEO | offonhols
-
Parameter Strings & Duplicate Page Content
I'm managing a site that has thousands of pages due to all of the dynamic parameter strings that are being generated. It's a real estate listing site that allows people to create a listing, and it generates lots of new listings every day. The Moz crawl report is continually flagging A LOT (25k+) of the site's pages for duplicate content due to all of these parameter-string URLs. Example: sitename.com/listings & sitename.com/listings/?addr=street name. Do I really need to do anything about those pages? I have researched the topic quite a bit, but can't seem to find anything too concrete as to what the best course of action is. My original thinking was to add the rel=canonical tag to each of the main URLs that have parameters attached. I have also read that you can bypass that by telling Google what parameters to ignore in Webmaster Tools. We want these listings to show up in search results, though, so I don't know if either of these options is ideal, since each would cause the listing pages (the pages with parameter strings) to stop being indexed, right? Which is why I'm wondering if doing nothing at all will hurt the site. I should also mention that I originally recommended the rel=canonical option to the web developer, who has pushed back, saying that "search engines ignore parameter strings." Naturally, he doesn't want the extra workload of setting up the canonical tags, which I can understand, but I want to make sure I'm giving him both the most feasible option for implementation and the best option to fix the issues.
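For reference, the rel=canonical option described in this question would mean each parameterised listing URL carries a tag like this in its <head> (a sketch built from the example URLs above; the https scheme is assumed):
<!-- On sitename.com/listings/?addr=... (sketch) -->
<link rel="canonical" href="https://sitename.com/listings/">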
Intermediate & Advanced SEO | garrettkite
-
Duplicate Content
http://www.pensacolarealestate.com/JAABA/jsp/HomeAdvice/answers.jsp?TopicId=Buy&SubtopicId=Affordability&Subtopicname=What%20You%20Can%20Afford
http://www.pensacolarealestate.com/content/answers.html?Topic=Buy&Subtopic=Affordability
I have no idea how the first address exists at all... I ran the SEOmoz tool and I got 600-ish duplicate content errors! I have errors on content, titles, etc. How do I get rid of all the content being generated from this JAABA/JSP "gibberish"? Please ask questions that will help you help me. I have always been 1st on Google Local, and I have a business that is starting to hurt very seriously from being number three 😞
Intermediate & Advanced SEO | JML1179
-
I need to add duplicate content, how to do this without penalty
On a site I am working on, we provide a landing page summary (say, the top 10 information snippets) and a 'see more' link to take viewers to a page with all the snippets. Those first 10 snippets will be repeated in the full list. Is this going to be a duplicate content problem? If so, any suggestions?
Intermediate & Advanced SEO | oznappies
-
How to resolve Duplicate Page Content issue for root domain & index.html?
SEOmoz returns a Duplicate Page Content error for a website's index page, with both domain.com and domain.com/index.html listed separately. We had a rewrite in the .htaccess file, but for some reason this has not had an impact, and we have since removed it. What's the best way (in an HTML website) to ensure all index.html links are automatically redirected to the root domain, so these aren't seen as two separate pages?
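One common .htaccess approach for this (a sketch assuming Apache with mod_rewrite enabled; test carefully before deploying) is:
# 301-redirect direct requests for index.html (in any directory) back to the folder root
RewriteEngine On
RewriteCond %{THE_REQUEST} ^[A-Z]{3,}\s/+(.*/)?index\.html[\s?] [NC]
RewriteRule ^(.*/)?index\.html$ /$1 [R=301,L]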
Intermediate & Advanced SEO | ContentWriterMicky