Getting rid of duplicate content remaining from an old misconfiguration
-
Hi friends,
We recently (about a month ago) launched a new website, and during the review of that site we spotted a serious misconfiguration of our old, terrible WP site. This misconfiguration, which may have come from sitemaps, internal links, or both, led to our French, German, and English sites being displayed on each other's domains. This should be solved now, but the pages still show in the SERPs.
The big question is: what's the best way to safely remove them from the SERPs? We haven't performed as well as we wanted for a while, and we believe this could be one of the issues.
Try searching, for instance, "site:pissup.de stag do -junggesellenabschied" to find English pages on our German domain; each link returns either a 301 or a 404. This was cleaned up to return 301s or 404s when we launched our new site 4 weeks ago, but I can still see the results in the SERPs, so I assume they still count against us?
Cheers!
-
Yep, I fixed this one just now, as you sent it.
I think the issue with the wrong redirects is mostly that I haven't spotted them all, rather than the redirects I've already set up not working correctly.
I expect there to be a thousand-plus wrong pages, but when I search site:domain.tld plus a word in the wrong language, for instance "evg" (the French term for bachelor party), Google finds only up to 300 results (suspiciously, the same maximum for all our sites).
-
Hey Rasmus - I honestly think it's an issue with the redirects. I would double-check them.
I did just visit https://www.pissup.de/package/basic-survival-4/ and it looks like it's redirecting. Were you able to get those shored up? If you're still having trouble, I would contact your web host to make sure the rest are in place.
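If it helps, here's a minimal sketch of a bulk status check (assuming Python with the requests library, and a hypothetical urls.txt listing the redirected URLs, one per line), so the ones still not returning a 301 or 404 stand out:

```python
# redirect_check.py - minimal sketch: print the first status code each URL
# returns. Assumes a hypothetical urls.txt with one URL per line.
import requests

with open("urls.txt") as f:
    urls = [line.strip() for line in f if line.strip()]

for url in urls:
    try:
        # allow_redirects=False shows the initial 301/404/200 rather than
        # the status of the final destination page.
        resp = requests.get(url, allow_redirects=False, timeout=10)
    except requests.RequestException as exc:
        print(f"ERROR  {url}  {exc}")
        continue
    print(f"{resp.status_code}  {url}  {resp.headers.get('Location', '')}")
```

Anything printing 200 is a page still being served on the wrong domain and still needs a redirect.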
-
Hi John
Yes, the idea is that https://www.pissup.de/package/basic-survival-4/ should redirect to a German equivalent where we have one.
It's strange that it isn't, as it's been no more than a week since I uploaded all the redirects. Perhaps this is down to the site: search not returning all results - if it caps the number of results, then as some are removed it may start showing others that weren't visible before?
-
Hey Rasmus,
Just so I understand: a URL like https://www.pissup.de/package/basic-survival-4/ should not be displaying on the German site. The German site should just have German content, right?
I found that page by doing the site: search listed in your initial question.
What's interesting is that this page isn't redirecting. Let me know your thoughts. I have feedback but I want to make sure of a few things before I share it.
Thanks!
John
-
Hi John
Thanks for taking the time to answer!
The URLs were already returning 301 or 404 when we discovered them after launching the new site.
What we've done so far is this:
- set up 301 redirects from pissup.com/german-url to pissup.com/english-equivalent where available, or to the closest similar page (a sketch of what this looks like is below)
- added a sitemap with these URLs in the hope they'd be crawled faster
- waited
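For reference, here's a minimal sketch of what the first step can look like in an Apache .htaccess file (assuming an Apache host, which is typical for WP; the paths are hypothetical examples, not our real URLs):

```apache
# .htaccess on pissup.com - minimal sketch, hypothetical paths.
# One permanent (301) redirect per stray German URL, pointing at the English
# equivalent, or at the closest similar page where no equivalent exists.
Redirect 301 /junggesellenabschied-berlin/ /stag-do-berlin/
Redirect 301 /german-only-page/ /closest-similar-page/
```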
We were advised it was better to redirect than to ask for removal. Do you disagree with this advice, and if so, why?
We're not really seeing a decrease yet for these issues in the SERPs. Some drop by 5-10%, but some don't. Could it be because we're not seeing them all in the SERPs? In that case, is there anywhere else we could find them (all URLs indexed by Google on our domain)?
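As an aside, one way to see more of them than the site: operator shows would be the Search Console Search Analytics API, which can list every page on a property that is getting impressions. A minimal sketch, assuming Python with google-api-python-client and a hypothetical service-account.json that has been granted access to the property (the date range is a placeholder):

```python
# list_pages.py - minimal sketch: list every page with search impressions via
# the Search Console (webmasters v3) Search Analytics API.
# Assumes a hypothetical service-account.json with access to the property.
from google.oauth2 import service_account
from googleapiclient.discovery import build

creds = service_account.Credentials.from_service_account_file(
    "service-account.json",
    scopes=["https://www.googleapis.com/auth/webmasters.readonly"],
)
service = build("webmasters", "v3", credentials=creds)

response = service.searchanalytics().query(
    siteUrl="https://www.pissup.de/",
    body={
        "startDate": "2017-06-01",  # placeholder dates
        "endDate": "2017-07-01",
        "dimensions": ["page"],
        "rowLimit": 5000,
    },
).execute()

for row in response.get("rows", []):
    # row["keys"][0] is the page URL; impressions > 0 means Google showed it.
    print(row["impressions"], row["keys"][0])
```

It won't surface URLs with zero impressions, but it does reveal pages Google is still showing that a capped site: search hides.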
-
Hey Rasmus,
To get these indexed pages removed, I'm assuming that you did the following:
1. No-indexed the pages on the domain you're concerned about
2. Disallowed them in robots.txt - just another step to help speed things up (see the sketch below)
3. Used the URL removal tool in Google Search Console
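For steps 1 and 2, the noindex is the usual <meta name="robots" content="noindex"> tag in each affected page's head, and below is a minimal robots.txt sketch with a hypothetical path pattern (not necessarily your real structure). One caveat: a robots.txt disallow stops Googlebot from recrawling those pages, which also stops it from ever seeing the noindex tag, so some people add the disallow only once the pages have already dropped out.

```
# robots.txt - minimal sketch (hypothetical path pattern)
User-agent: *
Disallow: /package/
```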
Unfortunately, it does take time for Google to process these URLs out of the SERPs. Hopefully, you're seeing a decrease in the URLs shown in the SERPs.
Also, don't forget to do this in Bing Webmaster Tools too!