Duplicate Titles and Duplicate Content
-
I'm a beginner at SEO and have a few questions. Moz Pro's Crawl Diagnostics show that I have a lot of duplicate titles and duplicate content. However, most of the duplicate titles are related to pagination. What should I do?
Also, regarding the duplicate content: because we sell similar products, the pages are almost identical, with only the item number differing. How can I avoid this?
-
The solutions can be a bit site-dependent, but rel="prev"/"next" is Google's approved choice for paginated content. It does depend a bit on how complicated your searches are (are there also sorts, filters, etc. that create unique URLs?) and on how many pages you're talking about. An older option some SEOs still prefer is to add META NOINDEX, FOLLOW to pages 2+ of results. Adam Audette has some good resources on the subject:
http://searchengineland.com/five-step-strategy-for-solving-seo-pagination-problems-95494
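As a rough illustration of the META NOINDEX, FOLLOW option mentioned above (the URL in the comment is hypothetical), pages 2 and beyond of a paginated series would each carry a tag like this in their head:

```html
<!-- On page 2+ of a paginated series, e.g. a hypothetical /products?page=2 -->
<!-- "noindex" keeps the page out of the index; "follow" still lets link equity flow through it -->
<meta name="robots" content="noindex, follow">
```

Page 1 of the series would omit the tag so it can rank normally.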
For products, it's a bit trickier. If you're talking about hundreds of variations, and they're very, very similar, then I think rel=canonical can be a good choice. Post-Panda, the risks of indexing hundreds of similar pages outweigh the benefits of potentially ranking for a few product variations. There's no one "right" answer, though - it's always a trade-off. In most cases, I think focusing your ranking power on your core, unique products is usually a good idea.
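A minimal sketch of the rel=canonical approach for near-identical product variations (the URLs and item numbers here are made up for illustration): each variation page points search engines at the primary product URL.

```html
<!-- Placed in the <head> of a variation page, e.g. a hypothetical /widget-item-1002 -->
<!-- Consolidates ranking signals onto the main product page instead of splitting
     them across hundreds of nearly identical variation pages -->
<link rel="canonical" href="https://www.example.com/widget">
```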
-
If you properly use rel="prev"/"next" ( http://googlewebmastercentral.blogspot.com.au/2011/09/pagination-with-relnext-and-relprev.html ) on paginated content, you don't have to worry about duplicate titles.
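For reference, here's a sketch of the rel="prev"/"next" markup Google describes in the post above, using hypothetical URLs. Page 2 of a three-page series would include both tags in its head:

```html
<!-- On a hypothetical /category?page=2, the middle page of a three-page series -->
<link rel="prev" href="https://www.example.com/category?page=1">
<link rel="next" href="https://www.example.com/category?page=3">
```

The first page of the series would carry only rel="next", and the last page only rel="prev".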
-
Thank you for your answer. I still have a question about duplicate titles, though. Is the approach the same?
-
Hi alexsu0910,
Using the Google Webmaster Tools console along with the Moz Pro Crawl Diagnostics is a good place to start for troubleshooting duplicate content. In particular, the HTML Improvements section of Google Webmaster Tools provides a guide to identifying and rectifying duplicate content. Also make sure you have a robots.txt file in place to instruct crawlers not to crawl pages that generate duplicate content. Consider as well the value of a 301 redirect to steer users away from unused or duplicate pages and take them to a tidy home page instead. Finally, you can also use Google Webmaster Tools to inform Google of URL parameters, much in the same way a rel="canonical" helps determine whether Google should index the content.
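To illustrate the robots.txt and 301-redirect suggestions above (the paths are hypothetical, and the redirect syntax assumes an Apache server; exact rules depend on your site structure), a minimal sketch might look like this:

```
# robots.txt - ask crawlers to skip a path that generates duplicate pages
User-agent: *
Disallow: /print/

# Apache .htaccess - permanently redirect a retired duplicate page to the home page
Redirect 301 /old-duplicate-page/ https://www.example.com/
```

Note that robots.txt only blocks crawling; for pages that are already indexed, a 301 redirect or rel="canonical" is the more reliable way to consolidate signals.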