No problem Luke - it's hard to judge whether your pages are keyword stuffed without some examples, such as your page titles, headings, and a sample of content (a paragraph or two). Do you think they read naturally, or have you included your keyword more times than is really necessary? Even with an in-your-face approach you can reduce keyword stuffing, I think.
Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we’re not completely removing the content - many posts will still be possible to view - we have locked both new posts and new replies. More details here.

Posts made by Matt-Williamson
-
RE: Keyword density and meta tags
Hi Luke
I wouldn't particularly worry about the actual keyword density of the page, but more about whether your keyword-optimized text actually looks natural and isn't stuffed with a set keyword. As for the meta keywords tag, it has no direct value for your search engine rankings. I personally don't use it, as it tells my competition exactly which keywords I am going after. The meta description, on the other hand, is important because it influences what text is displayed for your page in the search engine results. A good meta description can help persuade people to click through to your site from the SERPs. Hope this helps.
-
RE: Keyword density and meta tags
Keyword density is so old hat - if I were you I would stop measuring your pages by keyword density and instead make sure that you write decent content that contains your keyword. Also place your keywords in important elements such as the title tag, H1 heading, etc. In direct answer to your question: if you insist on measuring keyword density, meta data won't count, as it is not a ranking factor - the days when search engines relied on meta keywords are long gone!
-
RE: Blocking Pages Via Robots, Can Images On Those Pages Be Included In Image Search
Hi Leona - what you have done is along the lines of what I thought would work for you - sorry if I wasn't clear in my original response. I thought you meant that if you created a robots.txt and disallowed Googlebot, then Googlebot-Image would still pick up the photos - and, as I said, that wouldn't be the case: Googlebot-Image follows the rules set out for Googlebot unless you specify otherwise using the Allow directive, as I mentioned. Glad it has worked for you - keep us posted on your results.
-
RE: Blocking Pages Via Robots, Can Images On Those Pages Be Included In Image Search
Hi Leona
Googlebot-Image, like the other bots that Google uses, follows the rules set out for Googlebot, so blocking Googlebot would also block your images - the generic Googlebot rules override Googlebot-Image unless you specify a group for it explicitly. I don't think there is a way around this using the Disallow directive alone, as you are blocking the directory which contains your images, so they won't be indexed.
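For example, a minimal robots.txt sketch (hypothetical, based on the paths in this thread) showing how a dedicated Googlebot-Image group stops it inheriting the generic Googlebot rules - each crawler obeys only the most specific user-agent group that matches it:

```
User-agent: Googlebot
Disallow: /community/photos/

# A group named for Googlebot-Image overrides the generic Googlebot rules;
# an empty Disallow means nothing is blocked for image crawling.
User-agent: Googlebot-Image
Disallow:
```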
Something you may want to consider is the Allow directive -
Disallow: /community/photos/
Allow: /community/photos/~username~/
that is, if Google is already indexing images under the username path.
The Allow directive will only win out if its path contains an equal or greater number of characters than the Disallow path, so bear in mind that if you had the following:
Disallow: /community/photos/
Allow: /community/photos/
the Allow will win out and nothing will be blocked. Please note that I haven't actioned the Allow directive myself, but I looked into it in depth when I studied the robots.txt files for my own sites - it would be good to hear from someone else who has experience of this directive. Hope this helps.
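To make the tie-breaking above concrete, here is a small Python sketch (my own illustration, not an official tool, and using simple prefix matching with no wildcards) of Google's documented precedence: the matching rule with the longest path wins, and on a tie between Allow and Disallow, Allow wins:

```python
def is_allowed(rules, path):
    """Decide whether `path` may be crawled.

    `rules` is a list of (directive, path_prefix) pairs, where directive
    is "allow" or "disallow". The matching rule with the longest prefix
    wins; on an equal-length tie, the Allow rule wins.
    """
    best_len = -1
    best_allow = True  # no matching rule means the path is allowed
    for directive, prefix in rules:
        if path.startswith(prefix):
            allow = (directive == "allow")
            # Longer match wins; on an equal-length match, Allow wins.
            if len(prefix) > best_len or (len(prefix) == best_len and allow):
                best_len = len(prefix)
                best_allow = allow
    return best_allow

rules = [
    ("disallow", "/community/photos/"),
    ("allow", "/community/photos/someuser/"),  # hypothetical username path
]
print(is_allowed(rules, "/community/photos/someuser/pic.jpg"))  # True: longer Allow wins
print(is_allowed(rules, "/community/photos/other/pic.jpg"))     # False: only Disallow matches
# Equal-length Allow and Disallow: Allow wins, so nothing is blocked.
print(is_allowed([("disallow", "/p/"), ("allow", "/p/")], "/p/x"))  # True
```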