How many keywords per page to keep from being "spammy"?
-
Hi all,
I am currently doing a marketing internship for a B2B company that does all sorts of outsourced recruiting work.
I have some experience with SEO, but I'm not completely confident.
My first question: I know Google sees websites that load up on keywords as "spammy", so what is the appropriate number of keywords per page?
Currently, I was thinking about this setup:
1 keyword in the URL
1 keyword per alt tag (1 per page, at most)
2 keywords in each title tag (for the roughly 4 pages I am going to follow internally, not counting the "about us" page)
After that, I was thinking of adding 2-3 more keywords in each meta description and 2-3 in the body copy.
That would come to 6-8 keywords on each page. Is this too many, and should keywords be repeated (on the same page or across multiple pages)?
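A sketch of what that proposed placement might look like on a single service page (all names, URLs, and keywords here are hypothetical, just to show where each element would sit):

```html
<!-- Hypothetical service page; the URL itself carries the keyword: -->
<!-- https://www.example.com/recruiting-services -->
<html>
<head>
  <!-- 2 keywords in the title tag -->
  <title>Recruiting Services &amp; Candidate Sourcing | Example Co</title>
  <!-- 2-3 keywords in the meta description -->
  <meta name="description"
        content="Outsourced recruiting services, candidate sourcing, and hiring support for B2B companies.">
</head>
<body>
  <!-- 1 keyword in the alt attribute (at most one per page) -->
  <img src="/img/team.jpg" alt="recruiting services team">
  <!-- 2-3 keywords worked naturally into the body copy -->
  <p>Our recruiting services help B2B companies source and hire candidates faster.</p>
</body>
</html>
```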
Since this website is brand new (zero links), would it make sense to nofollow all of the internal links so that the homepage can gain ranking as quickly as possible within Google?
-
As this website is brand spanking new, should I just allow the homepage to follow the subsequent pages (each service's page), but nofollow each internal link on all of the services' actual pages?
My thinking is that this will allow Google to crawl the 4-5 service pages from the homepage, but it will save me a little bit of link juice (when I start building links) and allow the homepage to rank more highly.
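For clarity, the linking pattern being described would look roughly like this (URLs are hypothetical, and this only illustrates the asker's proposed setup, not a recommendation):

```html
<!-- Homepage: followed links, so Google can crawl the service pages -->
<a href="/recruiting-services">Recruiting Services</a>
<a href="/candidate-sourcing">Candidate Sourcing</a>

<!-- On each service page: internal links marked nofollow -->
<a href="/candidate-sourcing" rel="nofollow">Candidate Sourcing</a>
<a href="/about-us" rel="nofollow">About Us</a>
```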
-
I'd suggest using it in the body as appropriate. If it goes with the flow, use it. Don't try to add it in just for the sake of impacting keyword density. The same goes for the internal links. One internal link related to the keyword should be safe.
-
Should I nofollow all of the other internal links? Also, do you mean to avoid using the keyword in the body if I use it in the alt and title tags?
-
You have to be very careful, because Google has gotten very aggressive with the over-optimization penalty of late. I would recommend sticking with just one keyword and the brand name in the title tag (Keyword | Brand Name), using the keyword only once in the meta description and once in the alt tag.
You can probably get away with 2 keywords, but it's better to be safe than sorry and use just one keyword.
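Applied to a hypothetical page, that recommendation would look something like this (keyword and brand name are made up for illustration):

```html
<title>Recruiting Services | Example Co</title>
<meta name="description"
      content="Outsourced recruiting services for growing B2B companies.">
<img src="/img/office.jpg" alt="recruiting services office">
```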