Is it true that Google will not penalize duplicated content found in UL and LI tags?
-
I've read in a few places now that if you absolutely have to use a key term several times in a piece of copy, it is preferable to put it in li and ul tags, as Google will not penalise an excessive density of keywords found in these tags. Does anyone know if there is any truth to this?
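For reference, the technique being described would look something like the following snippet, where a key term is repeated across list items (the phrasing here is made up purely for illustration):

```html
<!-- Hypothetical example of the claimed technique: repeating a
     key term ("infographic design") inside ul/li markup in the
     hope that search engines ignore keyword density there. -->
<ul>
  <li>Infographic design tips for beginners</li>
  <li>Choosing colours for infographic design</li>
  <li>Infographic design tools and templates</li>
</ul>
```

As the replies below make clear, there is no evidence that Google treats text inside these tags any differently from other body copy.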
-
lol... thanks for that report.
Should we go back and read it for the laughs?
-
I just read several more articles on that site. Overall junk. I would find a new blog to get your info from.
-
**In that case you can use “li” and “ul” tag, moreover Google doesn’t penalize for repeating words under these tags.**
ha ha... that is B.S.
The author of that does not know how Google handles li and ul tags.
I can imagine Matt Cutts telling people ... "It's OK to stuff the li tag, guys!"
-
Thanks for the response,
I found it here, http://www.dailytechpost.com/index.php/8-best-tips-for-css-for-seo/#comment-69311, amongst several other places. I'm not into stuffing keywords and am fully aware that writing natural prose is the way to go; it was more a reference for cases where there is an excessive amount of keywords coincidentally, such as when using technical terms which cannot be substituted and form part of every element of a text. Or perhaps when you are talking about a concept and natural prose feels a little repetitive, such as when writing about infographics.
-
Maybe they are not today. I'm not too sure about this; like the others, I'm asking myself who told you this.
I recommend that you do not try to fool the big G. Duplicate content is not-so-valuable content in the best case. You should put your efforts into building great content instead of trying to duplicate.
Even if they are not penalizing it right now, they probably will one day.
From my experience, duplicate is duplicate anywhere you put it!
-
Exactly. **Content is written for the visitors, not the search engines.**
If you are familiar with the subject and are writing naturally, the content will do just fine with all of the search engines, and more importantly your visitors.
-
Where did you hear this? That makes no sense, and I have never heard anything like it.
Do not stuff keywords, or even try to see if you can get away with it. That's poor optimization and does not look good to users. Write and design for your users and you should be fine.
-
I have never heard that li and ul tags are safe for anything.
Don't bet on the behavior of Google.
Also, I don't pay any attention to the number of times that I use a word in copy. None. I try to write naturally without regard for search engines.