Help With Duplicated Content
-
Hi Moz Community,
I am having some issues with duplicate content. I recently removed the .html extension from all of our links, and Moz has reported the pages as duplicated.
I have been reading up on canonicalization and would like to verify some details: when using the canonical tag, should it be placed on /mywebpage.html or /mywebpage? I am having a hard time sorting this out, so any help from you SEO experts would be great.
I have also updated my .htaccess file with the following:
Thanks in advance
-
Hi Tom,
Thanks so much for your prompt reply, I will get those tags updated and await some help with the 301.
Is there anyone who could maybe help me out with the 301 redirect?
Thanks again
Alec
-
Hi there, Alec
Regarding the canonical tag:
The URL in the canonical tag should be the page that you want to keep, that you want identified as the original, and that you want Google to rank. From what I gather, that would be this page: http://www.bereavementstationery.co.uk/funeral-order-of-service
Therefore, on that page, you should have a canonical tag pointing to http://www.bereavementstationery.co.uk/funeral-order-of-service - and on every subsequent duplicate page, you should have the same tag.
The tag effectively instructs Google to say "these other pages are duplicate versions of this URL, so I'm going to ignore those versions, not flag them as a problem, and just promote the original URL. What's the original URL? Here it is, it's http://www.bereavementstationery.co.uk/funeral-order-of-service"
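In the HTML, that tag sits in the head of each page and would look something like this (using your order-of-service page as the example target):

```html
<!-- Goes in the <head> of the original page AND of every duplicate
     version, all pointing at the single URL you want Google to index -->
<link rel="canonical" href="http://www.bereavementstationery.co.uk/funeral-order-of-service" />
```

Same tag everywhere, same URL in the href - that consistency is what tells Google which version is the original.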
Sometimes making a robot appear human actually helps, ha!
An even better solution, which I believe you're trying to implement, would be to 301 redirect all of the URLs containing .html to their equivalent pages without the .html extension. Not only does this remove the duplicate content problem, it also passes the SEO strength of the old URLs to the new ones.
Unfortunately, I'm not confident enough with .htaccess to vouch for the exact rules for your server - don't want to recommend the wrong thing and mess up the .htaccess file! Hopefully someone more proficient in .htaccess redirects will come along and confirm the details for you.
Hope my canonical explanation helps though!
Related Questions
-
Duplicate content from page links
So for the last month or so I have been going through fixing SEO content issues on our site. One of the biggest issues has been duplicate content with WHMCS. Some have been easy and others have been a nightmare to fix. Some of the duplicate content has been the login page when a page requires a login, for example knowledge base articles that are only viewable by clients. Easily fixed for me, as I don't really need them locked down like that. However, pages like affiliate.php and pwreset.php are only linked off of a page, and I am unsure how to take care of these types. Here are some pages that are being listed as duplicate:
https://www.bluerayconcepts.com/brcl...art.php?a=view
https://www.bluerayconcepts.com/brcl...php?a=checkout
Should this type of stuff be a 301 redirect to cart.php, or would that break something? I am guessing that everything should point back to cart.php. These are the ones that are really weird to me: they are showing as duplicate content, but pwreset is only a link off the KB category. It shows up as duplicate many times, as does affiliate.php:
https://www.bluerayconcepts.com/brcl...ebase/16/Email
https://www.bluerayconcepts.com/brcl...16/pwreset.php
Any help is overly welcome.
On-Page Optimization | | blueray
-
New Client Wants to Keep Duplicate Content Targeting Different Cities
We've got a new client who has about 300 pages on their website that are the same except the cities that are being targeted. Thus far the website has not been affected by penguin or panda updates, and the client wants to keep the pages because they are bringing in a lot of traffic for those cities. We are concerned about duplicate content penalties; do you think we should get rid of these pages or keep them?
On-Page Optimization | | waqid
-
Is my blog simply duplicate content of my authors' profiles?
www.example.com/blog is the full list of blog posts by various writers. The list contains the title of each article and the first paragraph from the article. In addition to /blog being indexed, each author's contribution list is being indexed separately. It's not a profile, really, just a list of articles in the same title-and-paragraph format as the /blog page. So if /blog is a list of 10 articles written by two writers, I have three pages:
/blog/author1 is a list of 4 articles
/blog/author2 is a list of 6 different articles
/blog is a list of 10 articles (the 4+6 from the two writers)
Is this going to be considered duplicate content?
On-Page Optimization | | Brocberry
-
Copyscape Duplicate Content Ownership Question
We have a site that has had its content copied verbatim to numerous other sites and articles. We were advised to change our content, but the content is originally ours. Does Google take that into account before applying duplicate content penalties? And shouldn't Copyscape be able to show this information in its reports? It just doesn't seem right that the originating author would have to change content because everyone else is stealing it. Any clarification on this?
On-Page Optimization | | anthonytjm
-
Duplicate content problem
I am having an issue with duplicate content that I can't seem to figure out. I got rid of the www.mydomain.com version by modifying the .htaccess file, but I can't figure out how to fix the problem of mydomain.com/ and mydomain.com.
On-Page Optimization | | ayetti
-
Checking Duplicate Content
Hi there, We are migrating to a new website and writing lots of new content for it. The new site is hosted on a development server which is password protected so that it cannot be indexed. What I would like to know is: how do I check for duplicate content issues out there on the web when the dev site is password protected? Hope this makes sense. Kind Regards,
On-Page Optimization | | Paul78
-
Website Content
Is it bad to have HTML pages on a blog? I converted a completely HTML site to WordPress, but have hundreds of article pages that are still HTML.
On-Page Optimization | | azguy
-
Avoiding "Duplicate Page Title" and "Duplicate Page Content" - Best Practices?
We have a website with a searchable database of recipes. You can search the database using an online form with dropdown options for:
Course (starter, main, salad, etc)
Cooking Method (fry, bake, boil, steam, etc)
Preparation Time (Under 30 min, 30 min to 1 hour, Over 1 hour)
Here are some examples of how URLs may look when searching for a recipe:
find-a-recipe.php?course=starter
find-a-recipe.php?course=main&preperation-time=30min+to+1+hour
find-a-recipe.php?cooking-method=fry&preperation-time=over+1+hour
There is also pagination of search results, so the URL could also have the variable "start", e.g. find-a-recipe.php?course=salad&start=30. There can be any combination of these variables, meaning there are hundreds of possible search results URL variations. This all works well on the site, however it gives multiple "Duplicate Page Title" and "Duplicate Page Content" errors when crawled by SEOmoz. I've searched online and found several possible solutions for this, such as:
Setting a canonical tag
Adding these URL variables to Google Webmaster Tools to tell Google to ignore them
Changing the title tag in the head dynamically based on what URL variables are present
However I am not sure which of these would be best. As far as I can tell, the canonical tag should be used when you have the same page available at two separate URLs, but this isn't the case here as the search results are always different. Adding these URL variables to Google Webmaster Tools won't fix the problem in other search engines, and we will presumably continue to get these errors in our SEOmoz crawl reports. Changing the title tag each time can lead to very long title tags, and it doesn't address the problem of duplicate page content. I had hoped there would be a standard solution for problems like this, as I imagine others will have come across this before, but I cannot find the ideal solution. Any help would be much appreciated. Kind Regards
On-Page Optimization | | smaavie