Duplicate content
-
Greetings! I have inherited a problem that I am not sure how to fix. The website I am working on had a 302 redirect from its original home URL (with all the link juice) to a newly designed page (with no real link juice). When the 302 redirect was removed, a duplicate content problem remained, since the new page had already been indexed by Google. What is the best way to handle the duplicate content? Thanks!
-
Thank you so much, Daniel! It is now fixed. Yippee!!!
-
A 301 redirect is done in the .htaccess file for the domain. There doesn't need to be a page present for a redirect; it is a server-side thing. Look up how to write a 301 redirect in the .htaccess file and you'll find what you need to set this up.
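For example, the whole fix can be a single line in that file (a sketch with placeholder names - swap in your real file name and domain):

    # Permanently redirect the removed page to the page you are keeping
    Redirect 301 /page-b.htm http://www.example.com/

No HTML file has to exist at /page-b.htm - the server answers with the 301 before it ever looks for a file.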
-
Thanks, Daniel! The page is indexed.
So I will need to redirect with a 301. However, the actual page is gone. Do I just write one with abbreviated code that includes only the language for the redirect? (Yes, it's true... I am new at this.)
-
If the page that now 404's is indexed, redirect it with a 301 to the page you are keeping. If it is not indexed, no harm done.
-
Hi Daniel,
I have a new problem with this "page B". It turns out that the designer, rather than changing the 302 to a 301, just completely removed page B. Now what we see is a 404 error. What are your thoughts (and/or those of anyone else who might be reading this) on how best to correct this now?
Thanks!
-
What you don't want to do is have C redirect to A and then have A redirect to B. You can have a billion 301 redirects, but you don't want to send the crawlers on a wild goose chase. If an old URL currently goes to A and you want A to now go to B, I would change the 301 of your old URL to now go straight to B.
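In .htaccess terms (hypothetical paths - yours will differ):

    # Wild goose chase - two hops for the crawler:
    # Redirect 301 /old-url.htm http://www.example.com/page-a.htm
    # Redirect 301 /page-a.htm http://www.example.com/page-b.htm

    # Better - every retired URL points straight at the final destination:
    Redirect 301 /old-url.htm http://www.example.com/page-b.htm
    Redirect 301 /page-a.htm http://www.example.com/page-b.htm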
-
Hi Daniel, Thanks!
How many 301 redirects are OK with Google? I ask because there is already one in place leading to A (using A & B from your example). This is from a still-older URL that is no longer being used.
-
If I understand, you have home page A with old design and home page B with new design. A formerly had a 302 redirect to B. Now the redirect is removed, so you have A and B and they are essentially the same thing, right?
If this is correct, you want to use a 301 redirect from one to the other. Since the URL of A has all the link juice, I would suggest putting the home page you want on the URL of A and then redirecting B with a 301 to A. This is the best way to handle it.
Related Questions
-
Duplicate content w/ same URLs
I am getting high priority issues for our privacy & terms pages that have the same URL. Why would this show up as duplicate content? Thanks!
Technical SEO | RanvirGujral
-
Does duplicate content not concern Rand?
Hello all, I'm a new SEOer and I'm currently trying to navigate the layman's minefield that is trying to understand duplicate content issues as best I can. I'm working on a website at the moment where there's a duplicate content issue with blog archives/categories/tags etc. I was planning to beat this by implementing a noindex meta tag on those pages where there are duplicate content issues.

Before I go ahead with this I thought: "Hey, these Moz guys seem to know what they're doing! What would Rand do?" Blogs on the website in question appear in full and in date order relating to the tag/category/what-have-you creating the duplicate content problem. Much like Rand's blog here at Moz - so I thought I'd have a look at the source code to see how it was dealt with. My amateur eyes could find nothing to help answer this question.

E.g. both of the following URLs appear in SERPs (using site:moz.com and very targeted keywords, but they're there):

https://moz.com/rand/does-making-a-website-mobile-friendly-have-a-universally-positive-impact-on-mobile-traffic/
https://moz.com/rand/category/moz/

Both pages have a rel="canonical" pointing to themselves. I can understand why he wouldn't be fussed about the category not ranking, but the blog? Is this not having a negative effect? I'm just a little confused, as there are so many conflicting "best practice" tips out there - and now, after digging around in the source code on Rand's blog, I'm more confused than ever! Any help much appreciated, Thanks
Technical SEO | sbridle
-
Looking at creating some auto-generated pages - duplicate content?
Hi Everyone! We just launched a new version of our research site, and the main CTA on the page sends users to a subdomain that's blocked by robots.txt. The subdomain link is one of our PPC landing pages, and they would be duplicate content for every model (cars).

We're also looking at a new content stream of deals pages on the main domain. The thought process was that we could rank these pages for things like "Volkswagen Golf deals" and also use them as canonical URLs from the PPC pages so that Panda doesn't get mad at us for sending hundreds of links to a subdomain that's blocked. It's going to take us a lot of time to write the copy for the deals pages, so if we auto-generate it by pulling a paragraph of copy from the car review and numerical stats about that model, will it be classed as duplicate, and/or is there any downside to doing it?

Review page: http://www.carwow.co.uk/car-reviews/Ford/Fiesta
Deals page: http://www.carwow.co.uk/deals/Ford/Fiesta
PPC landing page: http://quotes.carwow.co.uk/buy/Ford/Fiesta

I can't help but feel that this may all be a bit overkill, and perhaps it makes more sense to build one central deals page per model with unique content that we can also send the PPC traffic to, then lift any block from the quotes. subdomain. But that will take time, and we'd also like a quick solution. I'd also question whether it's even an issue to link to a blocked subdomain; Google adds the quote URL into the index but can't crawl it, which I've been told is bad - but is it bad enough to do something about? Thanks, JP
Technical SEO | Matt.Carwow
-
Duplicate page content
Hello, The Pro dashboard crawler bot that you get here reports mydomain.com and mydomain.com/index.htm as duplicate pages. Is this a problem? If so, how do I fix it? Thanks, Ian
Technical SEO | jwdl
-
Tags and Duplicate Content
Just wondering - for a lot of our sites we use tags as a way of re-grouping articles / news / blogs so that all of the info on, say, 'government grants' can be found on one page. These /tag pages often come up with duplicate content errors. Is it a big issue, and how can we minimise it?
Technical SEO | salemtas
-
Is there ever legitimate near duplicate content?
Hey guys, I've been reading the blogs and really appreciate all the great feedback. It's nice to see how supportive this community is to each other. I've got a question about near-duplicate content. I've read a bunch of great posts regarding what duplicate content is and how to fix it. However, I'm looking at a scenario that is a little different from what I've read about, and I'm not sure if we'd get penalized by Google or not.

We are working with a group of small insurance agencies that have combined some of their back-office work and work together to sell the same products, but for the most part act as what they are: independent agencies. So we now have 25 different little companies, in 25 different cities spread across the southeast, all selling the same thing. Each agency has its own URL, its own Google local places registration, its own backlinks to its local chambers, its own contact-us and staff pages, etc.

However, we have created landing pages for each product line, with the hope of attracting local searches. While we vary each landing page a little per agency (the auto insurance page in CA talks about driving down the 101, while the auto insurance page in Georgia says welcome to the Peach State), probably 75% of the landing page content is the same from agency to agency. There is only so much you can say about specific lines of insurance. The pages have slightly different titles and slightly different headers, but the bulk of each page is the same.

So here is the question: will Google hit us with a penalty for having similar content across the 25 sites? If so, how do you handle this? We are trying to write creative, unique content, but at the end of the day auto insurance in one city is pretty much the same as in another city. Thanks in advance for your help.
Technical SEO | mavrick
-
A site I am working with has multiple duplicate content issues.
A reasonably large ecommerce site I am working with has multiple duplicate content issues. On 4 or 5 keyword domains related to the site's content, the owners simply duplicated the home page, with category links pushing visitors to the category pages of the main site. There was no canonical URL instruction, so I have set the preferred URL via Webmaster Tools, but I now need to code this into the website itself. For a reasonably large ecommerce site, how would you approach that particular nest of troubles? That's even before we get to grips with the on-page duplication and wrong keywords!
Technical SEO | SkiBum
-
Solution for duplicate content not working
I'm getting a duplicate content error for:

http://www.website.com
http://www.website.com/default.htm

I searched the Q&A for the solution and found: access the .htaccess file and add this line:

    redirect 301 /default.htm http://www.website.com

I added the redirect to my .htaccess and then got the following browser error when trying to access the http://www.website.com/default.htm page: "This webpage has a redirect loop. The webpage at http://www.webpage.com/ has resulted in too many redirects. Clearing your cookies for this site or allowing third-party cookies may fix the problem. If not, it is possibly a server configuration issue and not a problem with your computer." "Error 310 (net::ERR_TOO_MANY_REDIRECTS): There were too many redirects."

How can I correct this? Thanks
Technical SEO | Joeuspe
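A common cause of that loop: DirectoryIndex serves / as default.htm internally, so the plain Redirect matches again on every request. A loop-safe sketch, assuming Apache with mod_rewrite enabled (same placeholder domain as above):

    # Only redirect when the client explicitly asked for /default.htm,
    # not when the server maps / to default.htm internally
    RewriteEngine On
    RewriteCond %{THE_REQUEST} ^[A-Z]+\ /default\.htm
    RewriteRule ^default\.htm$ http://www.website.com/ [R=301,L]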