Number of characters for duplicate content
-
How many characters must two page titles have in common before Google treats them as duplicate content?
Sorry for my English; I used Google Translate.
I'm from Brazil
Thanks. -
I have been wrestling with this issue for some time. As far as we have been made aware, if a page title exceeds 70 characters, not only will the extra text not appear in the search results, but Google will also not index text beyond that limit. But Andrew, do you believe this not to be the case, and that having longer page titles with long-tail keyword targeting is in fact a good way to improve rankings?
This article seems to agree with this as well:
http://searchenginewatch.com/article/2166510/4-SEO-Recommendations-to-Target-the-Long-Tail
We had begun to reduce page title lengths across the site, but we are now in two minds about whether to continue!
-
As stated, the titles have to be exactly the same; one character of difference and they are not duplicates. Although 70 characters is what Google displays in the search results, it indexes more than that. You can make your title 140 characters long to make it unique; it will simply be cut off in the search display.
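To illustrate the display cut-off described above, here is a rough sketch. The 70-character figure is approximate (Google's real limit is pixel-width based), so treat this as a heuristic, not a rule.

```python
# Rough sketch: preview how a long page title might be truncated in
# search results. The 70-character cutoff is an approximation.
DISPLAY_LIMIT = 70

def truncated_preview(title, limit=DISPLAY_LIMIT):
    """Return the title roughly as it might appear in search results."""
    if len(title) <= limit:
        return title
    # Cut at the limit and add an ellipsis, mimicking the SERP display.
    return title[:limit].rstrip() + "..."

title = ("Blue Widgets, Red Widgets and 140 Characters of "
         "Long-Tail Keyword Targeting for Widget Shops")
print(len(title))               # longer than 70
print(truncated_preview(title)) # cut off, ellipsis appended
```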
-
The way I have seen duplicate page titles reported is as "exact duplicates". So if you have two pages with the page title "Widgets", they will show up as duplicates. However, if one of them were "Blue Widgets" and the other "Red Widgets", they would not. It's a good idea to work on duplicate page titles, but don't do the work merely so that Google stops flagging them as duplicates. Look at them as opportunities to display correct, relevant information in the SERPs so that users click through to your website. I hope that helps.
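The "exact duplicates" behaviour described above can be sketched in a few lines. The URLs and titles here are made-up examples for illustration.

```python
# Minimal sketch of "exact duplicate" title detection: two pages only
# count as duplicates when their titles match exactly, so one character
# of difference ("Blue Widgets" vs "Red Widgets") is enough.
from collections import defaultdict

def find_duplicate_titles(pages):
    """Group URLs by title; return only titles used more than once."""
    by_title = defaultdict(list)
    for url, title in pages:
        by_title[title].append(url)
    return {t: urls for t, urls in by_title.items() if len(urls) > 1}

pages = [
    ("/widgets", "Widgets"),
    ("/more-widgets", "Widgets"),   # exact duplicate of the above
    ("/blue", "Blue Widgets"),      # one character different is enough
    ("/red", "Red Widgets"),
]
print(find_duplicate_titles(pages))  # only "Widgets" is flagged
```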
-
Less than 70 characters is best:
http://www.seomoz.org/learn-seo/title-tag
Not sure what you mean by duplicate content. Every title should be unique and then you will not have to worry about duplicate titles.
Duplicate content means that Google has indexed the same content more than once. Google does not like duplicate content on a page.
Google treats the first copy it finds as the original and indexes it as such; anything after that is duplicate content. If you want to use someone else's content, be sure to link back to the original article.
Related Questions
-
Dynamic links & duplicate content
Hi there, I am putting a proposal together for a client whose website has been optimised to include many dynamic links, so there are many pages with duplicate content: only the page title, h1 and URL are different. My client thinks this isn't an issue. What is the current consensus on this? Many thanks in advance.
On-Page Optimization | lorraine.mcconechy0
-
Duplicate content's effect on overall rankings
Hi guys, I have a website that has 23 pages with duplicate content. These pages serve the same function, which enables customers to upload their images. There is not much content on each one, but we require a different page for each of our products; here is an example page: http://www.point101.com/giclee_printing/upload#/upload I don't think it makes sense to use a canonical tag, as each page is for a different product and I think it's going to be difficult to differentiate each page. I was wondering: 1. If this has a negative effect on the ranking of our homepage and other main product pages, or if it's an issue we do not need to worry too much about. 2. If anyone has any other ideas as to how we can resolve this issue. Thanks, Kerry
On-Page Optimization | KerryK
-
Content with changing URL and duplicate content
Hi everyone, I have a question regarding content (user reviews) that changes URL all the time. We get a lot of reviews from users who have been dining at our partner restaurants, which get posted on our site under (new) "reviews". My worry, however, is that the URL for these reviews is changing all the time. The reason for this is that they start on page 1, then get pushed down to page 2, and so on, when new reviews come in. http://www.r2n.dk/restaurant-anmeldelser I'm guessing that this could cause serious indexing problems? I can see in Google that some reviews are indexed multiple times with different URLs, and some are not indexed at all. We furthermore have the specific reviews under each restaurant profile. I'm not sure if this could be considered duplicate content? Maybe we should tell Google not to index the "new reviews" section by using robots.txt. We don't get much traffic on these URLs anyway, and all reviews are still under each restaurant profile. Or maybe the canonical tag can be used? I look forward to your input. Cheers, Christian
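The canonical tag floated above would look roughly like this in the `<head>` of each paginated review page. This is a sketch only: the URL is taken from the question, and whether to canonicalize pagination to the first page is a judgment call that depends on the site.

```html
<!-- Sketch only: on page 2, 3, ... of the review listing, point search
     engines at one canonical URL so reviews shifting between pages do
     not register as duplicates. Adjust to the site's real structure. -->
<link rel="canonical" href="http://www.r2n.dk/restaurant-anmeldelser" />
```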
On-Page Optimization | Christian_T2
-
Duplication issue on my website
Hi, I have a CMS website with 2,000 pages. My problem is that these two URLs: 1. www.test.com/abc.html 2. www.test.com/abc.html?gallery?123testing are showing as duplicate pages in my SEOmoz error list, even though they are a single page. Please suggest a solution.
On-Page Optimization | wmsindia0
-
Duplicate Content - Delete it or NoIndex?
Last month I realized that one of my freelancers had been feeding my website with copied/spun content, and sadly there's lots of it. And of course it got my website hit hard by the last Panda update. Now that I've identified the content, what's the best thing to do? Should I delete it permanently and get 404 errors, or should I set the pages' robots meta tag to "nofollow"?
On-Page Optimization | sbrault740
-
Tags creating duplicate content issues?
Hello, I believe a lot of us use tags in our blogs as a way to categorize content and make it easily searchable, but this usually (at least in my case) causes duplicate content. For example, if one article has two tags, "SEO" and "Marketing", then this article will be visible and listed at two URLs inside the blog: domain.com/blog/seo and domain.com/blog/marketing. In the case of a blog with 300+ posts and dozens of different tags, this creates a huge issue. My questions are: 1. Is this really bad? 2. If yes, how do I fix it without removing tags?
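One common fix for the second question above, sketched under the assumption that the tag archives at domain.com/blog/seo and domain.com/blog/marketing are ordinary HTML pages: keep the tags in place for visitors, but ask search engines not to index the archive pages.

```html
<!-- Placed in the <head> of each tag archive page (e.g. /blog/seo).
     "noindex" keeps the archive out of the index; "follow" still lets
     crawlers follow its links to the individual posts. -->
<meta name="robots" content="noindex, follow">
```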
On-Page Optimization | Lakiscy0
-
Notonthehighstreet.co.uk - duplicate content? A reason not to sell via 3rd parties
A mixture of questions and discussion.
Question 1: can the following two pages be considered duplicate content?
http://www.notonthehighstreet.com/gardenbeet/product/deer-head-wall-art
http://www.notonthehighstreet.com/1/1/219933-deer-head-wall-art-by-garden-beet.html
Both pages are indexed and both have different meta, aimed at different search combinations.
Discussion: the search for 'deer head wall art gardenbeet' is generated by my PR company - we have done loads of print advertising for this item - yet the sheer mass and volume of noths.com stops my store http://www.gardenbeet.com/garden-wall-art/58-deer-head.html from obtaining the number one position. All is fair in the business world, I suppose, BUT the original marketing machine for noths.com claimed they were assisting the small business owner. I paid them over £600 to join and now they compete with me head on. Stupid me, I suppose. Let this be a key learning for those toying with the idea of investing in their own SEO or a 3rd-party selling platform. Ho hum
On-Page Optimization | GardenBeet0
-
Duplicate content issue with dynamically generated URLs
Hi, for those who have followed my previous question, I have a similar one regarding dynamically generated URLs. From this page http://www.selectcaribbean.com/listing.html the user can make a selection according to various criteria. Six results are presented and then the user can go to the next page. I know I should probably rewrite URLs such as these: http://www.selectcaribbean.com/listing.html?pageNo=1&selType=&selCity=&selPrice=&selBeds=&selTrad=&selMod=&selOcean= but since all the results presented are basically generated on the fly for the convenience of the user, I am afraid Google may consider this an attempt to generate more pages, as there are already pages for each individual listing. What is my solution for this? Nofollow these pages? Block them through robots.txt?
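The robots.txt option mentioned at the end could be sketched like this. Hedged: the pattern relies on Google-style prefix matching, and blocking crawling is not the same as removing already-indexed pages from the index.

```
# Hypothetical robots.txt sketch: block the parameterized listing URLs
# while leaving /listing.html itself crawlable. Prefix matching means
# this rule catches /listing.html?pageNo=1&... and similar variants.
User-agent: *
Disallow: /listing.html?
```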
On-Page Optimization | multilang0