Duplicate content and the Moz bot
-
Hi
Does our little friend at SEOmoz follow the same rules as the search engine bots when he crawls my site?
He has sent back thousands of duplicate content errors, but I thought I had dealt with these using nofollow etc.
Can you advise please.
-
Hey There,
Rogerbot still follows nofollowed links because it needs a holistic view of the site, so nofollow won't keep duplicate pages out of your crawl report.
Here are some things you can do about duplicate pages:
Delete content that is similar on each page.
Add some new and unique content to each page that is on the report. This can be done by adding more information, ideas, product descriptions, or anything that can make it differ from other pages on the domain.
You can also use rel=canonical to consolidate the duplicate pages onto a single version. Here are a few steps to do this:
Add a rel="canonical" link element between the <head> and </head> tags. This should be done on the non-canonical version(s) of the page, pointing to the version you want to rank.
To specify http://www.seomoz.org/blog.php?item=seomoz-iscool as the canonical page, create a link element that looks like this: <link rel="canonical" href="http://www.seomoz.org/blog.php?item=seomoz-iscool" />
Copy this link element into the <head> section of all non-canonical versions of the page, such as http://www.seomoz.org/blog.php?item=seomoz-iscool&sort=fun.
This should successfully eliminate the duplicate content issues.
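If you want to sanity-check that your duplicate URL variants all carry the right canonical tag, a short script can do it. This is a minimal sketch using only Python's standard library; the sample HTML and URL below are illustrative, not taken from a live crawl:

```python
# Extract the rel="canonical" URL from an HTML document, so you can
# confirm that duplicate URL variants all point at one canonical page.
from html.parser import HTMLParser


class CanonicalFinder(HTMLParser):
    """Records the href of the first <link rel="canonical"> it sees."""

    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        if tag == "link":
            attr_map = dict(attrs)
            if attr_map.get("rel", "").lower() == "canonical":
                self.canonical = attr_map.get("href")


def find_canonical(html: str):
    """Return the canonical URL declared in `html`, or None if absent."""
    parser = CanonicalFinder()
    parser.feed(html)
    return parser.canonical


# Illustrative page source for a duplicate URL variant.
sample = """<html><head>
<link rel="canonical" href="http://www.seomoz.org/blog.php?item=seomoz-iscool" />
</head><body></body></html>"""

print(find_canonical(sample))
```

In practice you would fetch each URL variant (e.g. the `&sort=fun` version) and assert that `find_canonical` returns the same canonical URL for all of them.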
-
Make sure you have nofollowed the correct pages.
Related Questions
-
Suggestions on dealing with duplicate content?
What are the best ways to protect against / deal with duplicate content? I've added an example scenario: Nike Trainer model 1 has an overview page that also links to a sub-page about cushioning, one about Gore-Tex, and one about breathability. Nike Trainer models 2, 3, 4 and 5 each have an overview page that also links to sub-pages about cushioning, Gore-Tex and breathability. Each sub-page URL is a child of its parent, so the pages are distinct from each other, e.g. /nike-trainer/model-1/gore-tex and /nike-trainer/model-2/gore-tex. There are some differences in material composition, some different images, and of course the product name is referred to multiple times. This makes each page in the region of 80% unique. Any suggestions are welcome about the above example, or any other ways you know of dealing with duplicate content.
On-Page Optimization | punchseo
Ecommerce product page duplicate content
Hi, I know this topic has been covered in the past, but I haven't been able to find an answer to this specific question. Let's say that on a website, all the product pages contain partial duplicate content, e.g. delivery options or the returns policy. Would this be classed as duplicate content? Or is this something you would not worry about if it's, say, 5-10% of the content on the page? And if it is something you'd take into consideration, how would you fix it? Thank you!
On-Page Optimization | MH-UK
Magento Duplicate Content Question - HELP!
In Magento, when entering product information, does the short description have to be different than the meta description? If they are both the same is this considered duplicate content? Thanks for the help!!!
On-Page Optimization | LeapOfBelief
Duplicate Content - Blog Rewriting
I have a client who has requested a rewrite of 250 blog articles for his IT company. The blogs are dispersed across a variety of platforms: his own website's blog, a business innovation website, and an IT website. He wants each article optimised with keyword phrases and then posted to his new website three times a week. All of this is in an effort to attract potential customers to his new site and to establish his company as a leader in its field. To what extent would I need to rewrite each article to avoid duplicating the content? Would there even be an issue if I did not rewrite the articles and merely optimised them with keywords? Would the articles need to be completely taken down by all current publishers? Any advice would be greatly appreciated.
On-Page Optimization | StoryScout
Duplicate content issues?
Our company consists of several smaller companies, some of whom deal with very similar things. For instance, two of our companies resell accounts software, but only one provides after-sales support. Because of the number of different companies and websites we have, it would sometimes be easier to simply copy content from one site to another, optimised in the same manner, since in some instances we would want different websites to rank for the same keywords. I have been asked my opinion on the potential impact of this practice, and my initial response was that we should avoid it due to potential penalties. However, I thought I'd garner opinion from a wider audience before making any recommendations either way. What do people think? Thanks.
On-Page Optimization | HBPGroup
Duplicate Page Titles and Duplicate Content
I've been a Pro Member for nearly a year and I am bound and determined to finally clean up all the crawl errors on our site, PracticeRange.com. We have 180 errors for Duplicate Page Titles and Duplicate Content. I fixed many of the product pages with duplicate content; those product descriptions were edited and now have unique content. However, plenty of the remaining errors are puzzling. Many of them reference the same pages, for example the Home Page, the Login Page, and the Search page (our catalog pages). In the case of the Catalog Page errors, these pages have the same title ("Search") every time, and only the results differ according to category: http://www.practicerange.com/Search.aspx?m=6 and http://www.practicerange.com/Search.aspx?m=15. If this is a rel=canonical issue, how do I fix it on a search result page? I want each of the different category pages to be indexed; no one of them is more important than another, so how would I incorporate rel=canonical? In the case of the Home Page errors, I'm really confused and don't know where to start. They are the result of 404 errors that lead to the home page. Is the content of the 404 page the culprit, since it contains a link to the home page? Here are examples of the Home Page type of crawl errors: http://www.practicerange.com/404.aspx?aspxerrorpath=/Golf-Training-Aids/Golf-Nets/~/Assets/ProductImages/products/Golf-Training-Aids/Rubber-Wooden-Tee-Holder.aspx and http://www.practicerange.com/404.aspx?aspxerrorpath=/Golf-Training-Aids/Golf-Nets/~/Assets/ProductImages/products/Golf-Training-Aid/Impact-Bag.aspx. Thanks, Alan Wills, PracticeRange.com
On-Page Optimization | AlanWills
Duplicated Content Column in Excel
I'd like to see all the duplicated-content URLs in Excel, but when I export to CSV and then use text-to-columns, I end up with an empty Duplicated Content column. The URLs should be in column AF, but that column is empty. Can somebody help me with this?
On-Page Optimization | jdclerck
How would you deal with blog TAG & CATEGORY listings that are marked as 'duplicate content' in SEOmoz campaign reports?
We're seeing "Duplicate Content" warnings / errors in some of our clients' sites for blog / event calendar tags and category listings. For example the link to http://www.aavawhistlerhotel.com/news/?category=1098 provides all event listings tagged to the category "Whistler Events". The Meta Title and Meta Description for the "Whistler Events" category is the same as another other category listing. We use Umbraco, a .NET CMS, and we're working on adding some custom programming within Umbraco to develop a unique Meta Title and Meta Description for each page using the tag and/or category and post date in each Meta field to make it more "unique". But my question is .... in the REAL WORLD will taking the time to create this programming really positively impact our overall site performance? I understand that while Google, BING, etc are constantly tweaking their algorithms as of now having duplicate content primarily means that this content won't get indexed and there won't be any really 'fatal' penalties for having this content on our site. If we don't find a way to generate unique Meta Titles and Meta Descriptions we could 'no-follow' these links (for tag and category pages) or just not use these within our blogs. I am confused about this. Any insight others have about this and recommendations on what action you would take is greatly appreciated.
On-Page Optimization | RoyMcClean