Is it true that Google will not penalize duplicated content found in UL and LI tags?
-
I've read in a few places now that if you absolutely have to use a key term several times in a piece of copy, it is preferable to put it in li and ul tags, as Google will not penalise excessive keyword density found in these tags. Does anyone know if there is any truth to this?
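For what it's worth, a crawler's text extraction doesn't have to treat list tags specially at all: once the markup is stripped, the words inside an li are just words. A minimal sketch using Python's standard-library HTML parser (an illustrative toy, not a claim about Google's actual pipeline — the sample strings and function names are made up for the example):

```python
from html.parser import HTMLParser
import re

class TextExtractor(HTMLParser):
    """Collects the visible text chunks, discarding all tags."""
    def __init__(self):
        super().__init__()
        self.chunks = []

    def handle_data(self, data):
        self.chunks.append(data)

def keyword_density(html, keyword):
    """Fraction of extracted words equal to `keyword`, ignoring markup."""
    parser = TextExtractor()
    parser.feed(html)
    words = re.findall(r"[a-z']+", " ".join(parser.chunks).lower())
    return words.count(keyword.lower()) / len(words) if words else 0.0

para = "<p>widgets widgets widgets are great widgets</p>"
lst = "<ul><li>widgets</li><li>widgets</li><li>widgets are great widgets</li></ul>"

# Same visible words, so the same density, whatever tags wrap them.
print(keyword_density(para, "widgets"))
print(keyword_density(lst, "widgets"))
```

Both calls return the same value, which is the intuition behind the skeptical replies below: the wrapping tag doesn't change the extracted text.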
-
lol... thanks for that report.
Should we go back and read for the laughs?
-
I just read several more articles on that site. Overall junk. I would find a new blog to get your info from.
-
**In that case you can use "li" and "ul" tag, moreover Google doesn't penalize for repeating words under these tags.**
ha ha... that is B.S.
The author of that does not know how Google handles ul and li tags.
I can imagine Matt Cutts telling people ... "It's OK to stuff the li tag, guys."
-
Thanks for the response,
I found it here http://www.dailytechpost.com/index.php/8-best-tips-for-css-for-seo/#comment-69311 amongst several other places. I'm not into stuffing keywords and am fully aware that writing natural prose is the way to go. It was more a reference for cases where there is an excessive amount of keywords coincidentally, such as when using technical terms which cannot be substituted and form part of every element of a text, or perhaps when you are talking about a concept and natural prose feels a little repetitive, such as writing about infographics.
-
Maybe they are not today. I'm not too sure about this; like the others, I'm asking myself who told you this.
I recommend you do not try to fool the big G. Duplicate content is, at best, not very valuable content. You should put your effort into building great content instead of duplicating.
Because even if it is the case that they are not penalizing it right now, they probably will one day.
From my experience, duplicate is duplicate anywhere you put it!
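The "duplicate is duplicate anywhere you put it" point can be sketched: a naive duplicate-content check fingerprints the extracted text after markup is stripped, so the wrapping tags never enter into the comparison. This is an illustrative toy under that assumption, not any search engine's real algorithm, and the sample strings are invented:

```python
import hashlib
import re

def fingerprint(html):
    """Hash the visible text: strip tags, lowercase, collapse whitespace."""
    text = re.sub(r"<[^>]+>", " ", html)
    text = " ".join(text.lower().split())
    return hashlib.sha256(text.encode()).hexdigest()

as_paragraph = "<p>Best blue widgets in town</p>"
as_list_items = "<ul><li>Best</li><li>blue</li><li>widgets in town</li></ul>"

print(fingerprint(as_paragraph) == fingerprint(as_list_items))  # True
```

The two fingerprints match: to a text-normalizing comparison, copy in a ul looks identical to the same copy in a p.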
-
Exactly. **Content is written for the visitors, not the search engines.**
If you are familiar with the subject and are writing naturally, the content will do just fine with all of the search engines, and more importantly your visitors.
-
Where did you hear this? That makes no sense, and I have never heard anything like it.
And do not stuff keywords, or even try to see if you can get away with it. That's poor optimization and does not look good to users. Write and design for your users and you should be fine.
-
I have never heard that ul or li tags are safe for anything.
Don't bet on the behavior of Google.
Also, I don't pay any attention to the number of times that I use a word in copy. None. I try to write naturally without regard for search engines.
-