The impact of using directories without target keywords on our rankings
-
Hello all,
I have a question regarding a website I am working on. I've read a lot of Q&As but couldn't really find the best answer.
For one of our new websites we are thinking about the site structure and the corresponding URL structure. Basically, we have a main product (and a few main keywords) that should drive most of the traffic to the website, and for which we want to optimize our homepage.
Besides those main keywords, we have an enormous base of long-tail keywords from which we would like to generate traffic, which means we want to create a lot of specific, optimized pages.
My main question is the following:
We are thinking of two options:
- Option 1: www.example.com/example-keyword-one
- Option 2: www.example.com/directory/example-keyword-one
With option 1 we will link directly from our homepage to the most important pages (which represent our most important keywords). All the pages with long-tail content will be linked from another section of our website, one click away from the homepage (specifically a /solutions page linked from the footer). These long-tail pages will use URLs like www.example.com/example-keyword-one, so the URLs will not contain the /solutions directory.
With option 2 we will use more subdirectories in our URLs. Specifically, for all the long-tail content we would use URLs like www.example.com/solutions/example-keyword-one.
The directories we want to use wouldn't really have added value in terms of SEO, since they don't represent important keywords.
- So what is the best way to go? Option 1: straightforward, short URLs that don't reflect the linking structure of our website but only contain important keywords? Or option 2: more directories in our URLs, reflecting the linking structure of our website but containing directory names that don't represent important keywords?
- Would the keyword ‘solutions’ in the directory (which doesn’t really relate to the content on the page) have a negative impact on our rankings for that URL?
-
Hi Rob,
Thanks for the helpful answer! I did a lot of research and also concluded that both options can work. I just haven't found any case studies that clearly show which of the two alternatives works best. So if anyone knows a good article on URL structure related to my question specifically, that would be very welcome!
Thanks!
Regards,
Jorg
-
It all depends on what you want (or are going to do):
1. Short URLs usually work best with regard to indexing and product correlation (URLs that are too long get truncated by Google when indexing). Keeping URLs short also helps Google index the full URL and get its full value, with your keywords reinforcing what the page is about.
-
Also - having these URLs linked to from the main page will help flow 'link juice' through the site, provided you keep the number of links on the homepage to a minimum and mix them with other links that are nofollowed. Links beyond roughly 100 on a page usually will not be crawled by Googlebot. (A small link-audit sketch follows this list.)
-
Also - if your URLs contain query strings, make sure 301 redirects are set up for URLs that include any type of string (?=question123456 or something along those lines), pointing that string to a keyword-rich URL such as www.domains.com/keyword-rich-content. (A minimal redirect sketch also follows this list.) This might be irrelevant for the site/domain you are working on, or it might be a step that needs to be included in the site's overhaul project.
2. Longer URLs (like adding directories or sub-folders) can be good too, depending on the product breakdown in your site architecture, though they might not be needed. If you have hundreds of thousands of products, directories will most likely be needed to sort the data and organize the database working alongside the CMS. In that case you would want to go this route rather than having an unorganized root directory with thousands of pages in it (even if dynamically generated).
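As a rough illustration of the homepage link point above, here is a minimal audit sketch (assuming Python with the requests and beautifulsoup4 libraries installed; the URL is a placeholder). It simply counts the links on a page and how many of them carry rel="nofollow", so you can see how close you are to that ~100-link guideline:

# Hypothetical sketch: count total, nofollow and followed links on a page.
# requests and beautifulsoup4 are assumed to be installed; the URL is a placeholder.
import requests
from bs4 import BeautifulSoup

def audit_homepage_links(url):
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    links = soup.find_all("a", href=True)
    # BeautifulSoup returns the rel attribute as a list of values (or None if absent).
    nofollow = [a for a in links if "nofollow" in (a.get("rel") or [])]
    followed = len(links) - len(nofollow)
    print(f"Total links: {len(links)}")
    print(f"Nofollow links: {len(nofollow)}")
    print(f"Followed links: {followed}")
    if followed > 100:
        print("Warning: more than ~100 followed links; the later ones may not get crawled.")

if __name__ == "__main__":
    audit_homepage_links("https://www.example.com/")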
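And for the query-string point, here is a minimal sketch of what such a 301 redirect could look like, assuming a small Python/Flask front end; the "id" parameter name and the slug mapping are purely illustrative, and on most sites this would live in the web server or CMS redirect rules instead:

# Hypothetical sketch: 301-redirect legacy query-string URLs
# (e.g. /page?id=question123456) to keyword-rich paths.
# Flask is assumed; the "id" parameter and the mapping below are illustrative.
from flask import Flask, request, redirect

app = Flask(__name__)

# Illustrative mapping from legacy string IDs to keyword-rich slugs.
LEGACY_ID_TO_SLUG = {
    "question123456": "example-keyword-one",
}

@app.route("/page")
def legacy_page():
    slug = LEGACY_ID_TO_SLUG.get(request.args.get("id", ""))
    if slug:
        # 301 (permanent) so search engines consolidate signals on the clean URL.
        return redirect(f"/{slug}", code=301)
    return "Not found", 404

if __name__ == "__main__":
    app.run()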
Each option works in its own way, and each has supporting documentation and methods behind it. Just something to consider in helping you steer the SEO seas.
Cheers!
-
Related Questions
-
Website Redesign & Ensuring Minimal Traffic/Rankings Lost
Hi there, We have undergone a website redesign (mycompany.com) and our site is ready to go live; however, the new website is built on a different platform, so all of our blog pages will not be copied over - to avoid a large web developer expense. Our intention is to leave all the blog pages as they are (on the old web design) but move them to the domain blog.mycompany.com, with 301 redirects inserted on mycompany.com for each blog post pointing to the corresponding blog.mycompany.com URL. Is there anything else we should do to ensure minimal traffic/rankings are lost? Thank you so much for your help.
Web Design | amitbroide
-
Does too much inline CSS impact SEO rankings
Hello, Does implementing a lot of inline CSS have a negative impact on SEO rankings? I imagine it could affect page speed, but are there any other issues I might run into?
Web Design | STP_SEO
-
Using Button Links vs Sidebar Menu
I have a services page with a lot of rich text and a slideshow of images. Currently, I am using a column of buttons linking to various services, and am wondering if a sidebar menu would be more effective for Google to crawl and rank.
Web Design | cinchmedia
-
Is Fall In Keyword Ranking After Launch of Revamped Website Normal
After launching my redesigned website (www.nyc-officespace-leader.com), Google ranking has dropped significantly for competitive keywords. The previous version of the site and the new version both have approximately 450 pages. My website developer was careful to implement 301 redirects. Monitoring Google Webmaster Tools shows that Google has picked up a quantity of duplicate content: more than 950 pages are shown in its index while my site only has 450 pages. There are also certain pages which require canonical tags, which my developer is in the process of implementing. The relaunch was July 10. My developer is of the opinion that this fluctuation in ranking is normal and that it will take Google about one month to reindex the new site and remove the old pages from the directory. Is this accurate? Anyone have any ideas on why my site has tanked in Google's search results? Thank you very much. Sincerely, Alan Rosinsky
Web Design | Kingalan1
-
Getting a highly ranked site a better result for 1 search term
I have a highly ranked website for a niche category. My site ranks higher in SEOMOZ than all of my competitors, but I can't get any higher than 4th on a page for one specific search term. What can I do to help my site increase its ranking on a specific search term?
Web Design | tadden
-
Homepage and Category pages rank for article/post titles after HTML5 Redesign
My site's URL (web address) is: http://bit.ly/g2fhhC
Timeline: At the end of March we released a site redesign in HTML5. As part of the redesign we used multiple H1s (for nested articles on the homepage) and for content sections other than articles on a page. In summary, our pages have many, many H1s compared to other notable sites that use HTML5 and only one H1 (some of these are the biggest sites on the web) - yet I don't want to say this is the culprit, because the HTML5 document outline (page sections) creates the equivalent of H1 - H6 tags. We have also been having Google cache snapshot issues due to Modernizr, for which we are working to apply the patch (https://github.com/h5bp/html5-boilerplate/issues/1086) - not sure if this could be driving our indexing issues below.
Situation: Since the redesign, when we query an article title, Google lists the homepage, category page or tag page that the article resides on. Most of the time the homepage ranks for the article query. If we link directly to the article pages from a relevant internal page it does not help Google index the correct page. If we link to an article from an external site it does not help Google index the correct page. Here are some images of example query results for our article titles:
- Homepage ranks for article title aged 5 hours: http://imgur.com/yNVU2
- Homepage ranks for article title aged 36 min.: http://imgur.com/5RZgB
- Homepage or uncategorized page listed instead of article for exact-match article query: http://imgur.com/MddcE
- Article aged over 10 days indexing correctly, so it is possible for Google to index our article pages: http://imgur.com/mZhmd
What we have done so far:
- Removed the H1 tag from the site-wide domain link
- Made the article title a link (how it was on the old version, so replicating that)
- Applying the Modernizr patch today to correct the blank caching issue
We are hoping you can assess the number of H1s we are using on our homepage (I think over 40) and on our article pages (I believe over 25 H1s) and let us know if this may be sending a confusing signal to Google, or if you see something else we're missing. All HTML5 and Google documentation makes clear that Google can parse multiple H1s, understand headers and sub-headers, and that multiple H1s are okay, etc., but it seems possible that algorithmic weighting may not have caught up with HTML5. Look forward to your thoughts. Thanks.
Web Design | mcluna
-
Image Replacement Using Cufon (Javascript)
Our agency is working with an outside developer that has designed a beautiful site. The possible problem is that they used Cufon to replace a large amount of the text on the page with an image of the text in a nicer font. On some pages all of the text is replaced, and on others it's about 20%. The text that is replaced is identical to what is shown to the user. I realize that Google stated years ago that sIFR (similar to Cufon) is okay in a limited way, but I am still a little leery of the large amount of image replacement that is happening. I am also worried about user experience should Flash not be enabled or should the page be slower to load. So I have a couple of questions: 1. Would this amount of image replacement raise a flag to Google, especially since it affects both the heading tags and large chunks of the body content? 2. I know about 2% of the site's users do not have JavaScript enabled. Do you have an idea of what percentage of people have issues, like slow connection speeds or slow computers, using JavaScript even if it is enabled?
Web Design | DirectiveGroup
-
Competitor Rockets to #1 and I'm looking at keyword stuffing. Will Google catch up with it?
We have a competitor whose home page rocketed up to number one, page one, on our key search term after they did a website redesign. They even beat out the original retailer for that position, as they are resellers of the product (not affiliate sales; resale in the secondary market). They are the first to knock the original seller out of the #1 position. In the past couple of years that I have been doing in-house SEO, they have never ranked on page one for the term. I ran their site through the SEOmoz page grader for the specific search term, loading the page of theirs that is ranking, and found that it grades a "B" but has some alerts for keyword stuffing (the search term is on the home page 30+ times), and they have eleven tags on said page. Aside from the two things listed above, they have pretty good site architecture on this new site, are pretty well branded, etc. Should I expect Google to catch the keyword stuffing and eleven tags, and possibly adjust their rank? Will their keyword stuffing come back to bite them?
Web Design | Ticket_King