Suggestions on plans to optimize my site? (NOOB)
-
I am currently trying to plan how to optimize my site around keywords. I have read and understood the post on site architecture and usability at http://www.seomoz.org/blog/site-architecture-for-seo , but I am still somewhat confused about how to target each keyword per page, or when to split keyword-targeted pages:
http://www.seomoz.org/img/upload/splitting-keyword-targeted-.gif
Let me give you an example: we build databases for SMEs using three different technologies, one of which is MS Access. Based on PPC campaigns and keyword research, some of the possible keywords might be:
ms access programmer
ms access consultants
access database experts
According to the link provided, should these be separate pages? I feel that if they were, our site navigation would be cluttered and clients would not benefit from them at all. It might even lead to some redundant content, which I believe is bad, right? My instinct is to make one page and target one keyword, but I'm not sure.
For example, see one of our top-ranking competitors:
http://www.justgetproductive.com/content/access-programmer/index.php
Please look at the footer. Is that actually how I should structure my links? I hope the answer is NO! Then again, if I do have just one page targeting one keyword, what do I do about the others? Do I address those keywords with blog posts/articles? Do I not target them at all?
Thanks for any advice; please keep in mind I am just getting started. My approach is to outline a complete plan before I put a lot of time into it.
-
Hi JP,
Thanks for the response. I understand the concept of 'silo-ing', which does make the navigation better, but I feel it doesn't take away the spammy effect. Perhaps I don't fully understand it, but I believe in my case it would look like this:
Database solutions->Access->ms access programmer->ms access programmer in San Francisco
Database solutions->Access->ms access programmer->ms access programmer in Los Angeles
Database solutions->Access->ms access consultant
Pages on the same level would then link to one another, correct? i.e. consultant and programmer, or San Francisco and Los Angeles?
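To make the structure concrete, here's a rough sketch of how I'm picturing that silo tree and the sibling cross-links. The slugs are made up for illustration; this is just how I understand the structure, not a prescription:

```python
# Rough model of the silo hierarchy described above; all slugs are hypothetical.
from collections import defaultdict

# parent page -> child pages, mirroring Database solutions -> Access -> ...
silo = {
    "database-solutions": ["access"],
    "access": ["ms-access-programmer", "ms-access-consultant"],
    "ms-access-programmer": [
        "ms-access-programmer-san-francisco",
        "ms-access-programmer-los-angeles",
    ],
}

def sibling_links(tree):
    """For each page, list the same-level pages it would cross-link to."""
    links = defaultdict(list)
    for children in tree.values():
        for page in children:
            links[page] = [c for c in children if c != page]
    return dict(links)

for page, siblings in sibling_links(silo).items():
    targets = ", ".join(f"/{s}/" for s in siblings) or "(none)"
    print(f"/{page}/ cross-links to: {targets}")
```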
I understand keyword cannibalization, i.e. targeting one or two keywords per page, with each keyword on only one page across the site, so pages don't compete with one another. That makes sense.
The thing I don't understand about these methods: wouldn't all these pages have the same sort of content (almost duplicate)? What use is it to the user to navigate all of them? Or is the point for visitors to land on them (landing pages) and hopefully convert? Is the method just a workaround to target different keywords while minimizing the spammy effect? Can you provide an example of how you use 'silo-ing'? Is your content extremely similar, or do you tailor each page for its keyword?
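Just to show what I mean by "almost duplicate": if I only swap one or two words between pages, even a simple shingle comparison would flag them as near-duplicates. A throwaway sketch (the page text is invented):

```python
# Minimal near-duplicate check: word shingles + Jaccard similarity.
# The two page snippets below are invented placeholders.

def shingles(text, n=3):
    """Break text into overlapping n-word tuples."""
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def jaccard(a, b):
    """Overlap between two shingle sets, 0.0 (disjoint) to 1.0 (identical)."""
    return len(a & b) / len(a | b) if a | b else 0.0

page_programmer = "our ms access programmers build custom databases for small businesses"
page_consultant = "our ms access consultants build custom databases for small businesses"

score = jaccard(shingles(page_programmer), shingles(page_consultant))
print(f"similarity: {score:.2f}")  # anything close to 1.0 is nearly duplicate content
```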
-
This explains site architecture and 'what is a silo' perfectly: http://www.seomoz.org/blog/site-architecture-for-seo
Edit: as for the links at the bottom of the page, I believe they did that because their top-level menu is AJAX. Even though Google can now crawl and understand AJAX, the footer might date from years ago when AJAX crawling was a bit hit and miss and they wanted to be sure.
-
Eduardo,
I am in the same boat right now and am willing to pay to get this question answered, but I can't find any takers. If I find the answer I will email you; please let me know if you find out anything. I think we have the exact same problem, and I will probably have cannibalization issues too.
JP >>> I will have to find out more about "silo-ing". I was also told that links at the bottom of a page can be considered a "page envelope" and are okay as long as there is a much greater amount of valuable content above them.
-
I would say your idea to target the remaining keywords in blog posts developed over time is a reasonable approach. The footer links in your example seem a bit spammy, yet at the same time they do provide useful navigation and may reflect the overall architecture of the site. Are you going to target a local approach as well (e.g. "ms access consultants in California")? Maybe someone here can explain keyword cannibalization better than I can, but we've had good results "silo-ing" each page and are ranking high on those keywords. I think you could do the same. Good luck!
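If it helps, one lightweight way to sanity-check for cannibalization is to keep a one-keyword-one-page map and flag any URL that two keywords share. A rough sketch with made-up URLs and keywords (including the local variants):

```python
# Sketch of a keyword-to-URL map used to catch cannibalization early.
# Every keyword and URL here is illustrative only.

keyword_map = {
    "ms access programmer": "/access/programmer/",
    "ms access consultants": "/access/consultant/",
    "access database experts": "/access/experts/",
    "ms access consultants in california": "/access/consultant/california/",
}

def cannibalized(mapping):
    """Return URLs that more than one keyword targets."""
    by_url = {}
    for keyword, url in mapping.items():
        by_url.setdefault(url, []).append(keyword)
    return {url: kws for url, kws in by_url.items() if len(kws) > 1}

clashes = cannibalized(keyword_map)
print(clashes or "each keyword has its own page, so no cannibalization")
```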