Help With Duplicated Content
-
Hi Moz Community,
I am having some issues with duplicated content. I recently removed the .html extension from all of our links, and Moz has reported the pages as duplicates.
I have been reading up on canonicalization and would like to verify some details: when using the canonical tag, should it be placed in the /mywebpage.html file or the /mywebpage file? I am having a hard time sorting this out, so any help from you SEO experts would be great.
I have also updated my .htaccess file with the following:
Thanks in advance
-
Hi Tom,
Thanks so much for your prompt reply. I will get those tags updated and await some help with the 301.
Is there anyone who could maybe help me out with the 301 redirect?
Thanks again
Alec
-
Hi there Alec
Regarding the canonical tag:
The URL you use in the tag should be the page that you want to keep, want identified as the original, and want Google to rank. From what I gather, that would be this page: http://www.bereavementstationery.co.uk/funeral-order-of-service
Therefore, on that page, you should have a canonical tag pointing to http://www.bereavementstationery.co.uk/funeral-order-of-service - and on every duplicate version of that page, you should have the same tag.
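In the page markup, that's a single link element in the <head> of each version of the page (the clean URL and the old .html URL alike), along these lines:

<!-- placed in the <head> of both /funeral-order-of-service and /funeral-order-of-service.html -->
<link rel="canonical" href="http://www.bereavementstationery.co.uk/funeral-order-of-service" />

The href stays identical on every version; that is what tells Google which URL is the original.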
The tag effectively prompts Google to say: "these other pages are duplicate versions of this URL, so I'm going to ignore those versions, not flag them as a problem, and just promote the original URL. What's the original URL? Here it is, it's http://www.bereavementstationery.co.uk/funeral-order-of-service"
Sometimes making a robot appear human actually helps, ha!
An even better solution, which I believe you're trying to implement, would be to 301 redirect all of the URLs containing .html to their equivalent pages without the .html extension. Not only does this remove the duplicate content problem, but it also passes the SEO strength of the old URLs to the new ones.
Unfortunately, I'm not qualified to help you there - don't want to recommend the wrong thing and mess up the .htaccess file! Hopefully someone more proficient in .htaccess redirects will come along and help you out there.
Hope my canonical explanation helps though!
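P.S. For reference only, since you mentioned the .htaccess route: the mod_rewrite pattern that tutorials commonly cite for dropping the .html extension looks roughly like the sketch below. Treat it as an untested starting point, back up your .htaccess first, and have someone who knows Apache check it, because the right rules depend on what's already in your file.

# Assumes Apache with mod_rewrite enabled; RewriteEngine On may already be in your file
RewriteEngine On

# If the browser actually requested a .html URL, send a 301 to the clean URL
RewriteCond %{THE_REQUEST} \s/+(.+?)\.html[\s?] [NC]
RewriteRule ^ /%1 [R=301,L]

# Internally serve the matching .html file when the clean URL is requested
RewriteCond %{REQUEST_FILENAME} !-d
RewriteCond %{REQUEST_FILENAME}.html -f
RewriteRule ^(.+?)/?$ $1.html [L]

Checking %{THE_REQUEST} (the original request line) rather than the rewritten URL is what keeps the redirect and the internal rewrite from looping into each other.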
Related Questions
-
Duplicate content in Shopify reported by Moz
According to the Moz crawl report, there are hundreds of duplicate pages in our Shopify store ewatchsale.com. The main duplicate pages are:
https://ewatchsale.com/collections/seiko-watches?page=2
https://ewatchsale.com/collections/all/brand_seiko
(the canonical page should be https://ewatchsale.com/collections/seiko-watches)
https://ewatchsale.com/collections/seiko-watches/gender_mens
(the canonical page should be https://ewatchsale.com/collections/seiko-watches/mens-watches)
Also, I want to exclude indexing of page URLs with "filter parameters" like https://ewatchsale.com/collections/seiko-watches/color_black+mens-watches+price_us-100-200. Shopify advised that we can't access our robots.txt file.
How can we exclude search engine crawling of the page URLs with filter names?
How can we access the robots.txt file?
How can we add canonical code to the preferred collection pages? Which templates and what code should we add? Thanks for your advice in advance!
On-Page Optimization | ycnetpro101
-
Delete or not delete outdated content
Hi there!
We run a website about a region in Italy, the Langhe area, where we write about wine, food, and local culture, and provide tourist information. The website also has a nice events calendar: in 4 years we (and our users) have loaded more than 5,700 events. Now we're starting to have some trouble managing this database. The events database is huge both in file size and number of rows, there are a lot of images that eat up disk space, and it's becoming difficult to manage all the data in our backend. Also, a lot of users are entering the website by landing on outdated events.
I was wondering if it could be a good idea to delete events older than 6 months: the idea is to keep only the most important and yearly recurring events (which we can update each year with fresh information) and trash everything else. This of course means that 404 errors will increase and our content will get thinner, but at the same time we'll have a more manageable database, and the content will be more relevant and "clean". What do you think? Thank you 🙂 Best
On-Page Optimization | Enrico_Cassinelli
-
Content Mismatch
Hi, I've added my app to Search Console, and 480 content mismatch pages are being reported. How can I solve this problem?
On-Page Optimization | Silviu
-
Moz Crawl Shows Duplicate Content Which Doesn't Seem To Appear In Google?
Morning all, first post, be gentle! So I had Moz crawl our website and it reported 2,500 high-priority duplicate content issues, which is not good. However, if I just do a simple site:www.myurl.com search in Google, I cannot see these duplicate pages... very odd. Here is an example:
http://goo.gl/GXTE0I
http://goo.gl/dcAqdU
So the same page has a different URL; Moz brings this up as an issue, and I would agree with that. However, if I search for both URLs in Google, they both bring up the same page, but with the original URL of http://goo.gl/zDzI7j ... in other words, two different URLs bring up the same indexed page in Google. Weird.
I thought about using a wildcard in robots.txt to disallow these duplicate pages with poor URLs, something like:
Disallow: /*display.php?product_id
However, I read various posts saying that it might not help our issue, and I don't want to make things worse.
On another note, my colleague paid for an "SEO service" and they just dumped thousands of backlinks to our website, and of course that's come back to bite us in the behind. Does anyone have any recommendations for a good service to remove these backlinks? Thanks in advance!!
On-Page Optimization | scottiedog
-
"Issue: Duplicate Page Content " in Crawl Diagnostics - but these pages are noindex
I saw a question back in 2011 about this and I'm experiencing the same issue: http://moz.com/community/q/issue-duplicate-page-content-in-crawl-diagnostics-but-these-pages-are-noindex We have pages that are meta-tagged as no-everything for bots but are being reported as duplicates. Any suggestions on how to exclude them from the Moz bot?
On-Page Optimization | Deb_VHB
-
When it comes to duplicate page content, how do I deal with correcting it? It's a dynamic e-commerce site.
I am under the impression that with e-commerce sites this happens often, and that there's either a plugin for it or I should simply not worry about it, since queries will often find similar content.
On-Page Optimization | Wayne_c
-
How dangerous are duplicate page titles
We ran an SEO crawl and our report flagged up duplicate page titles. We investigated further and found that these were page titles from the same product line that had more than one page, e.g. 1-50 (products), 51-100 (products), with a next button to move to the following 50 products. These were flagged as duplicate page titles: ".../range-1/page-1" and ".../range-1/page-2". These titles are obviously being read as duplicates, but because they are the same range we do not know what the best course of action is. We want to know how detrimental these page titles will be to our SEO, if at all. If anyone could shed some light on this issue it would be a massive help. Thanks
On-Page Optimization | SimonDixon
-
Duplicate Content
Hi, I have duplicate content that I don't understand:
1 - www.example.dk
2 - www.example.dk/
I thought it was the same page, with and without the trailing slash. Hope someone can help 🙂
On-Page Optimization | seopeter290