KW density and idiot clients. HELP!!!!
-
I have a client who insists on a 3% density for KW1 in a 600-word piece, i.e. 18 references to KW1 across roughly two pages. I upped the KW1 count to 18, but in doing so added 100 words of text, bringing the piece to 700 words. Now the client wants 21 appearances of KW1 to maintain that 3% density. If I add 3 more KW1s, I'll raise the word count again, which will require still more KW1s to hit the 3% mark. Any suggestions for solving the never-ending problem of KW density and idiot clients? Thanks in advance. Paul
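P.S. To show why this never ends, I ran the numbers. A quick sketch, assuming (per my figures above) that every 3 new KW1 mentions drag in about 100 words of supporting copy:

```python
# Quick simulation of the density treadmill, using my own numbers as the
# assumption: every 3 new KW1 mentions drag in ~100 words of supporting copy.
TARGET = 0.03                 # the client's required keyword density
WORDS_PER_MENTION = 100 / 3   # new copy each extra mention seems to cost me

words, mentions = 700, 18     # where the piece stands right now

for revision in range(1, 7):
    needed = round(TARGET * words)   # mentions the client will demand next
    if mentions >= needed:
        print(f"revision {revision}: done at {mentions} mentions / {words} words")
        break
    deficit = needed - mentions
    mentions = needed
    words += round(deficit * WORDS_PER_MENTION)  # the copy those mentions ride in on
    print(f"revision {revision}: {mentions} mentions / {words} words = {mentions / words:.2%}")
```

It never terminates: each batch of new copy carries exactly 3 mentions per 100 words, so the filler itself is exactly 3% dense and the overall density only creeps toward 3% (2.63%, 2.67%, 2.70%...) without ever reaching it. The deficit stays at 3 mentions forever unless I can slip mentions into existing sentences without adding copy, which is exactly the stuffed prose nobody wants to read.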
-
Thanks for the assist, all.
@Matt: If the client is an idiot, someone should tell her.
Though you're right. I shouldn't point this particular client to this thread.
I love this site. Very generous mozzers.
Best,
Paul
-
RTM.
-
Show them this:
http://www.youtube.com/watch?v=Rk4qgQdp2UA
If you mention it once, good. If you mention it twice, OK, it's clear the page is about this. If you mention it 3 times, you're getting less and less benefit. By the 4th, 5th, or 18th time you mention it, it's not really helping at all. And if you overdo it, it actively hurts.
I've found (and YMMV) that keyword consistency is more important than density. If my article is about flowers and I mention roses, daisies, sunflowers, and tulips, that matters more than saying "I like flowers because flowers smell nice and flowers are pretty flowers and you can give your wife flowers on Valentine's Day, a flowers holiday."
Also, as far as consistency goes: use the main keywords in the title and description, but don't spam them in there 3 or 4 times. Then use those keywords and the related set of keywords (types of flowers, potting, plants, garden) throughout the page. That's more helpful, I think.
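And if it helps to make the point to the client, keyword density is nothing more than a word-count ratio. A toy sketch (my own illustration, not any tool's official formula):

```python
import re

def keyword_density(text: str, keyword: str) -> float:
    """Crude density metric: keyword hits divided by total words."""
    words = re.findall(r"[a-z']+", text.lower())
    hits = sum(1 for word in words if word == keyword.lower())
    return hits / len(words)

spammy = ("I like flowers because flowers smell nice and flowers are "
          "pretty flowers and you can give your wife flowers")
natural = "I like roses, daisies, sunflowers and tulips on Valentine's Day"

print(f"spammy copy:  {keyword_density(spammy, 'flowers'):.0%} flowers")
print(f"natural copy: {keyword_density(natural, 'flowers'):.0%} flowers")
```

By this metric the stuffed sentence scores far "better," which tells you everything about its value as a quality target.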
(I'd suggest showing your client a forum topic where a bunch of professional SEOs say keyword density doesn't matter, but I'd really avoid using the word "idiot" for the people who pay you if you want them to continue to do so.)
-
Show them the content that already ranks highly for those KWs; I can almost guarantee it doesn't follow the criteria your client is suggesting.
-
Show him blog posts and Whiteboard Fridays that say not to obsess over keyword density.
Related Questions
-
Need Help - Lost 75% Of Traffic Since May 2018
Sorry to go in-depth here, but I want to give all the available information. We went live in late April 2018 with our two websites on Shopify (moved from Magento; same admin, different storeviews, which we later found caused some issues). Both websites sell close to the same products (we purchased a competitor about 5 years ago, which is why we have two). The nice thing is that they do almost identical amounts in sales, and both have done very well for years, especially the last two. Then the core algo update around May 22nd-24th, 2018 wiped out about 65% of our Google traffic for one website (MySupplementStore.com), and this latest update wiped out another 20%. I couldn't figure out why, because we were very cautious about keeping things separate, with unique descriptions, etc. So I did some digging, and this is what I found:
1. The reviews we migrated over from Magento were somehow combined and added to both websites. This is something I didn't notice at the time. I had it resolved a month ago, so each site's reviews now appear only on that site.
2. Our blog section was duplicated across both websites during the migration. Again, something I didn't notice, as we have over 1,000 blog posts per site. This was resolved two weeks ago.
3. Over the last 6 months, a person who worked for us for 3 years wrote descriptions and pasted them onto both websites instead of making them unique to each. I trusted her for years, but I think she just got lazy; she also quit about a month before the migration. We are working on this now, but it's been taking a while because we have over 5,000 products on each site and no idea which ones are duplicates.
I also noticed that the site is very slow in site-speed tools (working on that this week), that when I search for snippets of our text they often show up in omitted results, and that there are no messages in Google Webmaster Tools. So the question is: do you think the duplicate content issues caused the drop? Our other site, Best Price Nutrition, didn't see a big drop at all during that update. If not, any other ideas why?
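In case it helps anyone in the same spot, this is how we're planning to hunt the duplicate descriptions: a rough sketch, assuming Shopify's standard product-export columns ("Handle" and "Body (HTML)"); the CSV filenames are placeholders for our two exports.

```python
import csv

def load_descriptions(path):
    """Map product handle -> normalized description from a Shopify export."""
    with open(path, newline="", encoding="utf-8") as f:
        return {row["Handle"]: row["Body (HTML)"].strip().lower()
                for row in csv.DictReader(f) if row.get("Body (HTML)")}

site_a = load_descriptions("mysupplementstore_products.csv")   # placeholder
site_b = load_descriptions("bestpricenutrition_products.csv")  # placeholder

# Index one site's descriptions by text, then look the other site's up in it.
by_text = {text: handle for handle, text in site_b.items()}
dupes = [(handle, by_text[text]) for handle, text in site_a.items()
         if text in by_text]

print(f"{len(dupes)} products share an identical description across the sites")
for a_handle, b_handle in dupes[:20]:
    print(f"  {a_handle}  <->  {b_handle}")
```

This only catches verbatim copies; lightly reworded descriptions would need fuzzy matching (e.g. difflib.SequenceMatcher), but straight paste jobs seem to be the bulk of our problem.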
Intermediate & Advanced SEO | vetofunk
-
Difficulty with Indexing Pages - Desperate for Help!
I have a website with product pages that all use the same URL but load different data based on what's passed to them via GET. It's a WordPress site, but all of the page information is retrieved from a database and displayed with PHP. Somehow these pages are not being indexed by Google. I have done the following:
1. Created a sitemap pointing to each page.
2. Defined URL parameters in Search Console for these types of pages.
3. Created a product schema using schema.org and tested it without errors.
I have requested re-indexing repeatedly, and these pages, and the images on them, are still not being indexed! Does anybody have any suggestions?
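For reference, this is roughly how the sitemap gets built, since I understand each variant has to appear as its own full URL, query string included. A simplified sketch with a placeholder base URL and parameter name:

```python
# Simplified version of the sitemap build. BASE and the product_id parameter
# are placeholders; the real list of IDs comes out of the product database.
from xml.sax.saxutils import escape

BASE = "https://example.com/product/?product_id={}"
product_ids = ["101", "102", "103"]

entries = "\n".join(
    f"  <url><loc>{escape(BASE.format(pid))}</loc></url>" for pid in product_ids
)
sitemap = (
    '<?xml version="1.0" encoding="UTF-8"?>\n'
    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
    f"{entries}\n"
    "</urlset>"
)

with open("sitemap.xml", "w", encoding="utf-8") as f:
    f.write(sitemap)
```

My working assumption is that the sitemap alone isn't enough unless each variant also has a self-referencing canonical and internal links pointing at it, so corrections on that are welcome too.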
Intermediate & Advanced SEO | jacleaves
-
Website Indexing Issues - Search Bots will only crawl Homepage of Website, Help!
Hello Moz World, I am stuck on a problem and wanted to get some insight. When I attempt to use Screaming Frog's SEO Spider or SEO PowerSuite, the software only crawls the homepage of my website. I have 17 pages associated with the main domain, i.e. example.com/home, example.com/services, etc. I've done a bit of investigating and found that my client's website has no robots.txt file or sitemap. However, in Google Search Console, all of my client's website pages have been indexed. My questions: why is my software not crawling all of the pages associated with the website, and will adding a robots.txt file and sitemap resolve the issue? Thanks ahead of time for all of the great responses. B/R Will H.
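P.S. My current theory is that the internal links aren't plain <a href> tags in the raw HTML (e.g. the nav is built with JavaScript), since a missing robots.txt or sitemap shouldn't stop a crawler from following links. Is something like this the right first check? A rough sketch, with example.com standing in for the client's domain:

```python
# Fetch the homepage the way a basic crawler would and count the links it
# can actually see in the raw HTML. example.com is a stand-in for the domain.
import re
import urllib.request

html = urllib.request.urlopen("https://example.com/").read().decode("utf-8", "ignore")
links = re.findall(r'<a\s[^>]*href="([^"#]+)"', html, re.IGNORECASE)
internal = [link for link in links if link.startswith("/") or "example.com" in link]

print(f"{len(internal)} internal links visible in the raw HTML")
for link in sorted(set(internal))[:20]:
    print(" ", link)
```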
Intermediate & Advanced SEO | MarketingChimp10
-
URL Index Removal for Hacked Website - Will this help?
My main question is: how do we remove URLs (links) from Google's index, and clear the thousands of 404 errors associated with them, after a website was hacked (and has now been fixed)? The story: a customer came to us for a new website and some SEO. They had an existing website that had been hacked, and their previous vendor was unresponsive about it for months. The hack created THOUSANDS of URLs on their site that linked to pornographic and prescription-med SPAM sites. Now Google has 1,205 pages indexed that produce 404 errors on the new site, and I am confident these are holding back its organic rankings. Additional information: it is an entirely new website, built on WordPress, with a new host. Should we use Google's "Remove URLs" tool to submit all 1,205 of these pages? Do you think it will make a difference? This is down from the 22,500 URLs that existed when we started a few months back. Thank you in advance for any tips or suggestions!
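P.S. We have been verifying that the leftover URLs really do return 404s now, since my understanding is that the "Remove URLs" tool only hides a page temporarily, and Google drops it for good once it recrawls the URL and gets a 404/410. A quick sketch of the check (hacked_urls.txt is a placeholder for our export of the 1,205 indexed URLs):

```python
# Check which of the leftover hacked URLs still resolve to a live page.
import urllib.error
import urllib.request

with open("hacked_urls.txt") as f:
    urls = [line.strip() for line in f if line.strip()]

still_live = []
for url in urls:
    try:
        urllib.request.urlopen(url, timeout=10)
        still_live.append(url)        # 2xx (or redirect): still serving a page
    except urllib.error.HTTPError:
        pass                          # 404/410: exactly what Google needs to see
    except urllib.error.URLError:
        pass                          # connection trouble; re-check these later

print(f"{len(still_live)} of {len(urls)} URLs still return a page")
```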
Intermediate & Advanced SEO | Tosten
-
Client wants to show 2 different types of content based on cookie usage - potential cloaking issue?
Hi, a client of mine has compliance issues in their industry and has to show two different types of content to visitors:
domain.com/customer-a/about-us
domain.com/customer-b/about-us
Next year, they have to increase that to three types of customer. Rather than creating a third section (customer-c), because it's very similar to one of the existing types (customer-b), their web development agency is suggesting changing the content based on cookies. If a user has identified themselves as customer-b, they'll be shown /customer-b/, but if they've identified themselves as customer-c, they'll see a different version of /customer-b/. In other words, the URL won't change, but the content on the page will change based on their cookie selection. I'm uneasy about this from an SEO POV because:
1. Google will only be able to see one version (/customer-b/, presumably), so it might miss out on indexing valuable /customer-c/ content.
2. It makes sense to separate them into three URL paths so that Google can index them all.
3. It feels like a form of cloaking, i.e. Google only sees one version when two versions are actually available.
I've done some research, but everything I'm seeing says that it's fine and that it's not a form of cloaking. I can't find any examples specific to this situation, though. Any input/advice would be appreciated. Note: the content isn't shown differently based on geography; all three customer types would be within one country (e.g. the UK), which means hreflang/geo-targeting won't be a workaround, unfortunately.
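To make the difference concrete, here's a toy sketch of the two setups; the framework, route names, and cookie name are all illustrative, not the client's actual stack. A crawler sends no cookies, so under the agency's proposal it can only ever see the default branch:

```python
from flask import Flask, request

app = Flask(__name__)

# Agency's proposal: one URL whose content branches on a cookie. A crawler
# sends no cookie, so it can only ever see (and index) the default branch.
@app.route("/customer-b/about-us")
def about_cookie_switched():
    if request.cookies.get("customer_type") == "c":
        return "About us, customer-c version"   # invisible to Googlebot
    return "About us, customer-b version"

# The alternative: a third path, so every version has its own indexable URL.
@app.route("/customer-c/about-us")
def about_customer_c():
    return "About us, customer-c version"
```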
Intermediate & Advanced SEO | steviephil
-
Please help on this penalized site!
OK, this is slowly frying my brain and I would like some clarification from someone in the know. We have filed multiple reconsideration requests and keep getting the regular "site violates Google's quality guidelines" / "look for unnatural links" email, going back to March 2012. I came aboard the business in August 2012 to overcome bad SEO companies' work. So far I have filed several disavow requests by domain and cleared over 90% of our backlink profile, which was all directory links, forum spam, and the like, compiled from WMT, OSE, and Ahrefs and submitted to the disavow tool. We also sent a shared Google Docs file in our reconsideration request listing all the links we were able to remove as well as the disavowed ones; since most were built in 2009/2010, a lot were impossible to remove. We managed to shift about 12-15% of our backlink profile by working very, very hard to remove the links ourselves. The only links left are quality links and forum posts created by genuine users: relevant, non-spam links. On top of this, we now have a high-quality link profile that has counteracted a lot of the bad "SEO" work done by those previous companies. I explained all of this fully in our reconsideration request, along with a massive apology on behalf of the work those companies did, and we are STILL getting generic "site violates" messages. So far we have spent in excess of 150 hours on getting this penalty removed, and Google hasn't even batted an eyelid. We have worked SO hard to combat this issue that it almost feels personal; if Google read the reconsideration request, they would see how much work we have done. If anyone can give any pointers on anything we have missed, I would appreciate it. I feel like we have covered every base!! Chris www.palicomp.co.uk
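P.S. In case the issue is the file itself, here's roughly how our disavow file is generated; spam_domains.txt is a placeholder for our cleanup list, and the domain: prefix and # comments follow Google's documented disavow format:

```python
# Build the disavow file from our cleanup list.
with open("spam_domains.txt") as f:
    domains = sorted({line.strip().lower() for line in f if line.strip()})

with open("disavow.txt", "w", encoding="utf-8") as out:
    out.write("# Directory and forum spam built by previous SEO vendors\n")
    out.write("# Removal requested where possible; most webmasters unreachable\n")
    for domain in domains:
        out.write(f"domain:{domain}\n")
```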
Intermediate & Advanced SEO | palicomp
-
Help needed regarding 1:1 Redirection
Hi all, we are currently working on a very large site with approximately 5,000+ pages, and it is going to be 301-redirected to a new domain. For this we need to redirect each and every page on a 1:1 basis, as described in the Webmaster Central guide. The site is built from flat files, not a CMS, so setting up the redirects manually is becoming very tough. It is hosted on a Windows server and uses an IIS web.config file. Any help with an automated or easier way to do the 1:1 redirection would be appreciated. Thanks in advance,
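P.S. The most promising lead I've found so far is the rewrite-map pattern from IIS's URL Rewrite module: generate one <rewriteMap> of old-path/new-URL pairs and reference it from a single redirect rule, instead of hand-writing 5,000 rules. A rough sketch of the generator (redirects.csv is a hypothetical two-column file with no header row):

```python
# Build the <rewriteMap> block for IIS's URL Rewrite module from a
# two-column CSV of old-path,new-url pairs.
import csv
from xml.sax.saxutils import quoteattr

lines = ['<rewriteMap name="Redirects">']
with open("redirects.csv", newline="", encoding="utf-8") as f:
    for old_path, new_url in csv.reader(f):
        lines.append(f"  <add key={quoteattr(old_path)} value={quoteattr(new_url)} />")
lines.append("</rewriteMap>")

print("\n".join(lines))
```

The printed block would go inside <rewrite><rewriteMaps> in web.config, paired with one rule whose condition looks the request up via {Redirects:{REQUEST_URI}} and issues a Permanent redirect. This needs the URL Rewrite module installed; does that sound like the right approach?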
Intermediate & Advanced SEO | ITRIX
-
DMCA Complaint to Google - HELP
I have several sites copying my content, which I found out via Copyscape.com. Unfortunately, this is giving me duplicate content. I filed a DMCA complaint through Google, and the requests for the infringing pages were approved, but the pages still remain. Can someone please help me understand this better? I thought Google was supposed to remove these pages. Am I supposed to contact the site owner to get the content removed, or are their pages simply de-indexed?
Intermediate & Advanced SEO | tutugirl