How Important Is It to Use Keywords in the URL?
-
I want to know how much of an impact this has on rankings.
For example, if I have pages named "chair.html" or "sofa.html"
and I want to rank for the terms "seagrass chair" or "rattan sofa",
should I create new pages with the targeted keywords in the filename (e.g. "seagrass-chair.html"), copy everything from the old page to the new one, and set up 301 redirects?
Will this hurt my SEO rankings in the short term? I have over 40 pages I would have to rename and redirect, if doing so would really help in the long run.
Appreciate your input.
-
I can definitely say that keywords in a URL can make a difference in a competitive vertical. We have an article on our site with a slug of "best-place-to-buy-a-mac-online", and you get one guess what keywords it ranks for.
That said, if you're talking about changing URLs sitewide, I would be very cautious. Maybe make changes to one or two pages to start and watch them for a few weeks.
-
I've just checked your website, and it looks fine to me: you have clear navigation and all of that. I wouldn't worry about changing URLs (Google will still index the current ones), but do double-check in Search Console that Google has indexed all of your pages.
In regards to seagrass.html vs. seagrass-furniture.html, I wouldn't worry about changing it, because Google understands that your website is about selling furniture, so it realises that seagrass.html has to do with furniture. When you rename pages and put 301 redirects in place, you lose a bit of link juice, so if I were you I wouldn't worry too much about your URL situation (it's a minor ranking factor) and would focus on link building and on-site optimisation.
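If you do decide to move a page, the mechanics are simple. On an Apache server (an assumption here; hosted ecommerce platforms usually have their own redirect settings instead) a permanent redirect is one line in the site's .htaccess file. The filenames below are the examples from the question:

```apache
# Permanently (301) redirect the old generic slug to the keyword-rich one
Redirect 301 /chair.html /seagrass-chair.html
```

Each renamed page needs its own line, and the old URL should keep redirecting indefinitely so existing links continue to pass value.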
Hope that helps
-
It is an ecommerce site (Yahoo), so all the URLs are wickerparadise.com/(page-id).html
The page with all of the chairs would be "x-chairs", with x being the specific type of chair.
I have a main category page: http://www.wickerparadise.com/seagrass.html
Now this page has everything in the seagrass furniture category. I believe we made a mistake in calling this seagrass.html instead of seagrass-furniture.html (since we want to rank high for "seagrass furniture", and this page has more content than a specific item page).
Should I go ahead with renaming the URLs and doing the redirects on a massive scale (30+ at a time)?
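With 30+ pages to move, it may be less error-prone to generate the redirect rules from a single old-to-new mapping instead of writing them by hand. A minimal sketch in Python, assuming an Apache-style host; apart from seagrass.html, the slugs below are illustrative placeholders, not URLs from the site:

```python
# Generate Apache "Redirect 301" rules from an old-slug -> new-slug mapping.
# Only seagrass.html comes from this thread; the other entries are placeholders.
redirects = {
    "seagrass.html": "seagrass-furniture.html",
    "chair.html": "seagrass-chair.html",
    "sofa.html": "rattan-sofa.html",
}

def redirect_rules(mapping):
    """Return one 'Redirect 301' line per old/new URL pair."""
    return [f"Redirect 301 /{old} /{new}" for old, new in mapping.items()]

for rule in redirect_rules(redirects):
    print(rule)
```

Pasting the output into .htaccess, and re-running the script whenever the mapping changes, keeps all the rules consistent with one source of truth.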
-
Hi,
I would say that it is beneficial to have a clear architecture/structure for users and search engine spiders to navigate across your website. If you decide to change the structure, I would suggest using something like this: www.yourwebsite.com/chairs/seagrass-chair/.
When you say that you have a "chair" page, does that mean that all of your products are listed on that single page, or do you have separate product pages?