Is it true that Google will not penalize duplicated content found in UL and LI tags?
-
I've read in a few places now that if you absolutely have to use a key term several times in a piece of copy, it is preferable to place it in li and ul tags, as Google will not penalise excessive keyword density found within these tags. Does anyone know if there is any truth to this?
-
lol... thanks for that report.
Should we go back and read it for the laughs?
-
I just read several more articles on that site. Overall junk. I would find a new blog to get your info from.
-
**"In that case you can use “li” and “ul” tag, moreover Google doesn’t penalize for repeating words under these tags."**
ha ha... that is B.S.
The author of that does not know how Google handles li and ul tags.
I can imagine Matt Cutts telling people ... "It's OK to stuff the li tag, guys."
-
Thanks for the response,
I've found it here http://www.dailytechpost.com/index.php/8-best-tips-for-css-for-seo/#comment-69311 amongst several other places. I'm not into stuffing keywords and am fully aware that writing natural prose is the way to go. It was more a reference for cases where there is an excessive amount of keywords coincidentally, such as when using technical terms which cannot be substituted and form part of every element of a text, or when you are talking about a concept and natural prose feels a little repetitive, such as writing about infographics.
-
Maybe they are not today. I'm not too sure about this; like the others, I'm asking myself who told you this.
I recommend you do not try to fool the big G. Duplicate content is, in the best case, not-so-valuable content. You should put your effort into building great content instead of trying to duplicate.
Because even if they are not penalizing it right now, they probably will one day.
From my experience, duplicate is duplicate anywhere you put it!
-
Exactly. **Content is written for the visitors, not the search engines.**
If you are familiar with the subject and are writing naturally, the content will do just fine with all of the search engines, and more importantly your visitors.
-
Where did you hear this? That makes no sense, and I have never heard anything like it.
And do not stuff keywords, or even try to see if you can get away with it. That's poor optimization and does not look good to users. Write and design for your users and you should be fine.
-
I have never heard that li and ul tags are safe for anything.
Don't bet on the behavior of Google.
Also, I don't pay any attention to the number of times that I use a word in copy. None. I try to write naturally without regard for search engines.
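To put the consensus here concretely: there is no documented exemption for list markup, so copy inside ul/li tags should be written exactly like any other copy. An illustrative sketch (the service names are invented):

```html
<!-- Natural: the term repeats because each item genuinely needs it -->
<ul>
  <li>Infographic design</li>
  <li>Infographic distribution and outreach</li>
  <li>Infographic performance reporting</li>
</ul>

<!-- Stuffing: the list exists only to repeat a keyword -->
<ul>
  <li>infographics</li>
  <li>best infographics</li>
  <li>cheap infographics</li>
</ul>
```

If a list only makes sense when you read it as a keyword pile rather than as items, the markup around it won't save it.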
-
Related Questions
-
Is it OK to include name of your town to the title tag or H1 tag on a blog to enhance local search results
I recently attended a webinar by ETNA Interactive on local search SEO. The presenter recommended including the name of your town in the title of the blog to increase local search SEO. Is this OK? I've always been concerned that it is such an obvious attempt to rank locally that Google would consider it "spammy," black hat, "sketchy," or otherwise manipulative. Have the rules changed? Is it OK to do? Brooke
On-Page Optimization | wianno1680
-
Adding meta description tags to Drupal 6 content
It looks as if a quick improvement to the (inherited) site that I manage would be to add meta description tags to all our pages. But I don't see a user-friendly way to modify the html headers from within Drupal. I'm debating whether to interpose CSS code between Drupal and the final html content, but I'd face a steep learning curve on CSS. Can you suggest a more excellent approach? Thanks!
On-Page Optimization | marcellu | John0
-
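For reference on the Drupal question above: the tag each page needs to emit is just the one below (the description text is a placeholder), and CSS cannot add it, since stylesheets only affect presentation, never the HTML head. In Drupal 6 this is normally handled with a contributed module such as Nodewords (Meta tags) rather than by hand-editing templates.

```html
<head>
  <!-- One unique description per page, roughly 150-160 characters -->
  <meta name="description" content="Short, unique summary of this page's content.">
</head>
```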
Mass Duplicate Content
Hi guys, Now that the full crawl is complete I've found the following: http://www.trespass.co.uk/mens-onslow-02022 http://www.trespass.co.uk/mens-moora-01816 http://www.trespass.co.uk/site/writeReview?ProductID=1816 http://www.trespass.co.uk/site/writeReview?ProductID=2022 The duplicate content on the first two is easily fixed by writing better product descriptions for each product (a lot of hours needed), but still an easy fix. The last two are review pages for each product, which are all the same except for the main h1 text. My thinking is to add noindex and nofollow to all of these review pages? The site will be changing to Magento very soon and there's still a lot of work to do. If anyone has any other suggestions or can spot any other issues, it's appreciated. Kind regards Robert
On-Page Optimization | yournetbiz1
-
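A sketch of the noindex idea raised in the question above, as it would sit in the head of each writeReview page. Note that "noindex, follow" (rather than nofollow) is often preferred, so links on the page can still pass equity:

```html
<!-- e.g. on /site/writeReview?ProductID=1816 -->
<head>
  <meta name="robots" content="noindex, follow">
</head>
```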
Why isn't our site being shown on the first page of Google for a query using the exact domain, when its pages are indeed indexed by Google
When I type our domain.com as a query into Google, I only see one of our pages on the homepage, and it's in 4th position. It seems though, that all pages of the site are indexed by google when I type in the query "site:domain.com". There was an issue at the site launch, where the robots.txt file was left active for around two weeks. Would this have been responsible for the fact that another domain ranks #1 when we type in our own domain? It has been around a couple of months now since the site was launched. Thanks in advance.
On-Page Optimization | featherseo0
-
Creating Duplicate Content on Shopping Sites
I have a client with an eCommerce site that is interested in adding their products to shopping sites. If we use the same information that is on the site currently, will we run into duplicate content issues when those same products & descriptions are published on shopping sites? Is it best practice to rewrite the product title and descriptions for shopping sites to avoid duplicate content issues?
On-Page Optimization | mj7750
-
Can internal duplicate content cause issues?
Hello all mozzers - has anyone used Nitrosell? We use them only because their inventory connects to our EPOS point, but because they have thousands of 301s on our domain we are getting duplicate content: different sizes of products (we sell womenswear) are creating separate URLs, so we are duplicating both content and URLs. I'm curious as to whether anyone has experienced similar problems that have affected their SERPs? Best wishes, Chris
On-Page Optimization | DaWillow1
-
Archive of content
Hi there, I have recently joined a company to look after the e-marketing side of things. The company I work for has been writing articles for a website that it owns for over 2 years - probably about 200 or so unique articles on that website. However, over the past year or so there has been no contribution to this site, and I was wondering if it would be worthwhile transferring these articles over to our blog, as this is where all the attention is in terms of marketing etc. Kind Regards,
On-Page Optimization | Paul780
-
Filtered Navigation, Duplicate content issue on an Ecommerce Website
I have navigation that allows for multiple levels of filtering. What is the best way to prevent the search engine from seeing this duplicate content? Is it a big deal nowadays? I've read many articles and I'm not entirely clear on the solution. For example, you have a page that lists 12 products out of 100: companyname.com/productcategory/page1.htm And then you filter these products: companyname.com/productcategory/filters/page1.htm The filtered page may or may not contain items from the original page, but does contain items that are in the unfiltered navigation pages. How do you help the search engine determine where it should crawl and index the page that contains these products? I can't use rel=canonical, because the exact set of products on the filtered page may not be on any other unfiltered page. What about robots.txt to block all the filtered pages? Will that also stop PageRank from flowing? What about the meta noindex tag on the filtered pages? I have also considered removing filters entirely, but I'm not sure if sacrificing usability is worth it in order to remove duplicate content. I've read a bunch of blogs and articles, seen the whiteboard special on faceted navigation, but I'm still not clear on how to deal with this issue.
On-Page Optimization | 13375auc30
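On the robots.txt option raised in the faceted-navigation question above, blocking would look like the sketch below (the path is taken from the poster's own example). The trade-off: a blocked URL is never crawled at all, so Google cannot see a noindex on it and the links on it cannot pass PageRank, which is why a meta robots "noindex, follow" on the filtered pages themselves is usually the preferred tool here.

```
# robots.txt - stops crawling of every filtered URL under this path
User-agent: *
Disallow: /productcategory/filters/
```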