Keyword in URL - SEO impact
-
Hi,
We don't have the most important keyword of our industry in our domain or sub-domain. How important is it to have the keyword in the website URL? Most of our competitors' pages with "keyword" URLs are ranking in the SERPs. What role do backlinks play in this scenario? And which URL has more of an advantage: the keyword in a sub-domain, or a page with the keyword? For example, for the keyword "seo": seo.example.com or example.com/seo?
-
Hi JK,
Thanks for the reply.
First of all, that is an old article, and much (if not all) of the information in it has changed with updates to Google's search algorithms.
In the case you are describing, with today's SEO (2019), I'd suggest not focusing on getting any keyword weight from the domain/subdomain.
Keep both services on the same domain, under the same company name. Google will understand that it's the same company offering two services. Of course, you should have separate pages for the two services.
It would probably also help to create really good content to give context, and to use internal linking to tell Google which are your main pages and targeted search terms. Hope it helps.
Best of luck,
Gaston -
GR,
Appreciate the advice. We are working with a customer that offers two very different services (janitorial services and pest control). We have 2 options for organizing their site.
Option A: 1 website under the main company name with dual focus
Option B: 2 separate sites, 1 for each service, giving the main service the company domain name and the secondary service the company domain name with the service appended (e.g. companyname-janitorial.com)
Would love to hear your thoughts on this!
Thanks,
JK
-
Yep, it should be redirected to relevant pages, not to the homepage or to less relevant pages, because that kind of redirect is treated as a soft 404 error.
Here are some great articles:
- Proof That 301 Redirects To Less-Relevant Pages Are Seen As Soft 404s To Google [Case Study]
- Google May Treat Expired Products Page Redirects As Soft 404s
- What Google considers soft 404 errors
Hope it helps.
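If it's useful, here is a rough way to spot-check at scale where your old URLs actually land (a minimal Python sketch using the requests library; the URLs below are placeholders, not anything from this thread):

# Rough sketch: check where a list of old URLs actually ends up.
# The URLs are placeholders; requires the "requests" package.
import requests

OLD_URLS = [
    "https://www.example.com/old-category/discontinued-product/",
    "https://www.example.com/old-blog/renamed-post/",
]
HOMEPAGE = "https://www.example.com/"

for old_url in OLD_URLS:
    resp = requests.get(old_url, allow_redirects=True, timeout=10)
    chain = [r.status_code for r in resp.history] + [resp.status_code]
    print(f"{old_url} -> {resp.url} {chain}")
    if resp.url.rstrip("/") == HOMEPAGE.rstrip("/"):
        # Blanket redirects to the homepage are the pattern that tends to
        # be treated as a soft 404, so flag them for manual review.
        print("  WARNING: redirects to the homepage - likely a soft 404")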
GR. -
Thanks GR.
I have gone through the Moz article on 301 redirects. They say that every redirect comes with a certain amount of SEO risk. What exactly might that risk be?
And everybody says that we must redirect to a relevant page. I can understand that the content of the two pages must be related, but how important is the URL itself here? Do we need the URL keywords to match too? If a non-existing link is redirected to an existing page, what metrics decide how much link juice passes, and is there any risk, given that one of the pages has no content? How do bots check this? For example:
website.com/folder/page1/content/ is a non-existing page and it's redirected to website.com or to website.com/folder2/page14.
Thanks,
Satish
-
You're welcome, we are here to help.
Theoretically, there is no link juice loss with redirects.
Take a look at this article about the latest news and updates from Google on 3xx redirects:
301 Redirects Rules Change: What You Need to Know for SEO
Hope I've helped.
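And if redirects are handled at the application level rather than in the server config, a minimal sketch of issuing a proper 301 looks like this (Flask is used purely as an illustration and the paths are placeholders; adapt it to whatever stack you're on):

# Minimal illustration of a permanent (301) redirect at the application
# level. Flask is only an example framework; the paths are placeholders.
from flask import Flask, redirect

app = Flask(__name__)

@app.route("/old-page/")
def old_page():
    # Point users and crawlers at the most relevant replacement page,
    # not the homepage, and use an explicit 301 status code.
    return redirect("/new-relevant-page/", code=301)

if __name__ == "__main__":
    app.run()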
GR. -
Thanks for the suggestions GR. That gives us an idea of how to proceed.
We are also planning to move all the pages from one subdirectory to another subdirectory, and we will be redirecting them to the related pages.
For example, website.com/folder1/seo-changes to website.com/folder2/seo-changes
Only the subdirectory is going to change, but we have many links pointing to the subdirectory we are redirecting from. Will the same link juice pass through those links once they are redirected to the other subdirectory?
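For clarity, the mapping could be expressed as one redirect rule per page, or as a single pattern. A rough sketch (the slugs are placeholders, and the single-pattern line assumes an Apache-style setup):

# Rough sketch: one 301 rule per page so each old /folder1/ URL maps to
# the same slug under /folder2/. The slugs below are placeholders.
slugs = ["seo-changes", "another-article", "yet-another-article"]

rules = [f"Redirect 301 /folder1/{slug} /folder2/{slug}" for slug in slugs]
print("\n".join(rules))

# With Apache's mod_alias, the whole folder can instead be covered by a
# single pattern:  RedirectMatch 301 ^/folder1/(.*)$ /folder2/$1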
-
Hi there.
I've conducted some experiments on having the keyword in several places: Exact match domain, subdomain, subdirectory and in the slug.
My conclusions are:
- Having it in a subdomain doesn't help at all.
- Having it in an exact match domain (e.g. keyword.com) helps very little and creates a problem when the business tries to expand to more search terms.
- Having it in a subdirectory (e.g. domain.com/my-keyword/some-page) doesn't help much, unless you're trying to rank the page that sits at the subdirectory itself, in which case it effectively becomes a single page or just a slug.
- Having it in the slug (e.g. domain.com/single-page-keyword) helps and makes a real difference.
In my opinion, when it comes to on-page and keyword optimization, it's mandatory to place the main keyword (or some variation of it) in the final URL.
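For example, a rough sketch of how a keyword-friendly slug can be generated (the keyword below is just a placeholder):

# Rough sketch: turn a title / target keyword into a clean URL slug.
import re

def slugify(text: str) -> str:
    text = text.lower().strip()
    text = re.sub(r"[^a-z0-9\s-]", "", text)   # drop punctuation
    text = re.sub(r"[\s-]+", "-", text)        # collapse spaces and hyphens
    return text.strip("-")

print(slugify("SEO Impact of Keywords in URLs"))  # -> seo-impact-of-keywords-in-urls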
In all my experiments, the pages had correct optimization (on-page and for that keyword), and they were focused on similar keywords with similar search difficulties.
---- UPDATE ----
Here is some information and resources:
- 15 SEO Best Practices for Structuring URLs - Moz blog
- URLs - Moz's Learn SEO
- On-Page SEO: Anatomy of a Perfectly Optimized Page (2016 Update) - Backlinko
---- UPDATE ----
Hope it helps.
GR.
Related Questions
-
Javascript and SEO
I've done a bit of reading and I'm having difficulty grasping it. Can someone explain it to me in simple language?
What I've gotten so far: JavaScript can block search engine bots from fully rendering your website. If bots are unable to render your website, they may not be able to see important content and may leave that content out of their index.
To know if bots can render your site, check the following:
- Google Search Console Fetch and Render
- Turn off JavaScript in your browser and see which site elements still show and which disappear
- Use an online tool such as Technical SEO's Fetch and Render
- Screaming Frog's Rendered Page tab
- GTmetrix results: if it recommends "Defer parsing of JavaScript", that means there are elements being blocked from rendering (???)
Using our own site as an example, I ran it through all the tests listed above. Results:
- Google Search Console: rendered only the header image and text; anything below wasn't rendered. The resources Googlebot couldn't reach include Google Ad Services, Facebook, Twitter, our call tracker and Sumo, all "Low" or blank severity.
- Turn off JavaScript: shows only the logo and navigation menu; anything below didn't render/appear.
- Technical SEO Fetch and Render: our page rendered fully for Googlebot and Googlebot Mobile.
- Screaming Frog: the Rendered Page tab is blank; it says 'No Data'.
- GTmetrix results: "Defer parsing of JavaScript" was recommended.
From all these results and across all the tools I used, how do I know what needs fixing? Some tests didn't render our site fully while some did. With varying results, I'm not sure where to go from here.
Intermediate & Advanced SEO | | nhhernandez1 -
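One rough way to script the "turn off JavaScript" check described in the question above is to fetch the raw, un-rendered HTML (roughly what a non-rendering crawler sees) and look for the content that matters. A minimal Python sketch; the URL and phrases are placeholders:

# Rough sketch: check whether key content appears in the raw HTML,
# i.e. without JavaScript. The URL and phrases are placeholders.
import requests

URL = "https://www.example.com/important-page/"
MUST_HAVE_PHRASES = [
    "Our services",          # e.g. a heading that sits below the fold
    "Request a free quote",  # e.g. the main call to action
]

html = requests.get(URL, timeout=10).text.lower()
for phrase in MUST_HAVE_PHRASES:
    if phrase.lower() in html:
        print(f"'{phrase}' found in raw HTML")
    else:
        print(f"'{phrase}' MISSING - probably injected by JavaScript")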
Webjaguar SEO shortcomings
Hey All. I have a client whose ecommerce site is built in Webjaguar. Does anyone have experience with this platform? It appears to be loaded with technical SEO challenges (duplicate content, weird URLs, etc). Interestingly, when I Google "webjaguar SEO challenges" and things like that, nothing comes up. Suspicious, methinks. I appreciate any thoughts from SEO folks. Thanks!
Intermediate & Advanced SEO | | JBMediaGroup0 -
SEO impact: Link categories on the description of the products
Hi, Is there any positive or negative SEO impact if I link product categories in the description of the products? For example: on the product page of the Passenger Car Blue, a link to the Passenger Car category of the website in the product description. Or is this more a UX question? The product page already has a breadcrumb on top. Thanks!
Intermediate & Advanced SEO | | Tiedemann_Anselm0 -
We killed our SEO, but how come some of our keywords are still in the top 1-3
I am looking for the answer to this interesting question:
1. I have a static page with NO information on it; it is almost completely blank, with only a search box on it, which does nothing! The information it contains is absolutely zero, but the page has a specific URL which is equal to the keyword I am targeting.
2. And I have another page which is fully optimized (97%) with the help of the on-page grader for another specific keyword.
In case 1: searching Google for a keyword which matches the page (www.domain.hu/keyword), I get a top 1-3 SERP position!? (I say it again, the page contains no information, and the keyword is really frequent; Google AdWords says this keyword has high competition.)
In case 2: I have the URL which is exactly the same as the keyword and a 97% on-page grade, and I see that week by week I can only move up a little in the SERP. I created a lot of unique content and made several changes on this page, with almost no position change.
So the question is WHY, in case 1, can I rank in the top 1-3 for a really hard keyword with no information on an empty static page, and why can I not move up the list for a less frequent keyword even though I did everything I could?
Intermediate & Advanced SEO | | Neckermann0 -
Where is the best location for my primary keyword in my URL?
http://moz.com/learn/seo/url says: http://www.example.com/category-keyword/subcategory-keyword/primary-keyword.html
However, I am wondering about structuring things a little backwards from that:
http://www.example.com/primary-keyword/ (this would be an introduction and overview of the topic described by the primary keyword)
http://www.example.com/primary-keyword/secondary/ (this would be a category landing page with snippets from articles within the niche described by the secondary keyword, which is itself a niche of the primary keyword)
http://www.example.com/primary-keyword/secondary/article-title/ (an in-depth article on a topic within the scope of the secondary keyword, which is within the scope of the primary)
Here http://www.example.com/primary-keyword/ is the most important page targeting the most important keyword. Thoughts?
Intermediate & Advanced SEO | | TheEspresseo0 -
Complex URL Migration
Hi There, I have three separate questions which are all related. Some brief background: my client has an adventure tourism company that takes predominantly North American customers on adventure tours to three separate destinations: New Zealand, South America and the Himalayas. They previously had these sites on their own URLs, which had the destination in the URL (eg: sitenewzealand.com). 2 of the three URLs had good age and lots of incoming links. This time last year a new web company was brought in and convinced them to pull all three sites onto a single domain and to put the sites under sub folders (eg: site.com/new-zealand). They built a brand new site for them on a Joomla platform. Unfortunately the new sites have not performed and halved the previous call to action rates. Organic traffic was not adversely affected by this change, however it hasn't grown either. I have been overhauling these new sites with a project team and we have managed to keep the new design (and the CMS) in place while making usability/marketing changes that have brought the conversion rate nearly back to where it originally was. We have recently made programmatic changes to the Joomla system to push the separate destination sites back onto their original URLs. My first question is around whether technically this was a good idea.
Question 1: Does our logic below add up or is it flawed? The reasons we decided to migrate the sites back onto their old URLs were:
- We have assumed that, with the majority of searches containing the actual destination (eg: "New Zealand"), all other things being equal a domain such as www.sitenewzealand.com is likely to attract a higher click through rate than www.site.com/new-zealand.
- Having "newzealand" in the actual URL would provide a rankings boost for target keyword phrases containing "new zealand".
- We also wanted to create the consumer perception that we are specialists in each of the destinations which we service, rather than having a single site which positions us as a "multi-destination" global travel company.
- Two of the old sites had solid incoming links, and there have been very few new links acquired for the single domain over the past 12 months.
- It was also assumed that with the sites on their own domains, the theme of each site would be completely destination specific, rather than having a single site with multiple destinations diluting that destination theme relevance. It is assumed that this would also help us to rank better for destination specific search phrases (which account for 95% of all target keyword phrases).
The downsides of this approach were that we were splitting content out onto three sites instead of one, with a presumed associated drop in authority overall. The other major one was the actual disruption that a relatively complex domain migration could cause. Opinions on the logic we adopted for deciding to split these domains out would be highly appreciated.
Question 2: We migrated the folder based destination specific sites back onto their old domains at the start of March. We were careful to thoroughly prepare the htaccess file to ensure we covered off all the new redirects needed and to directly redirect the old redirects to the new pages. The structure of each site and the content remained the same across the destination specific folders (eg: site.com/new-zealand/hiking became sitenewzealand.com/hiking). To achieve this splitting out of sites while keeping the single instance of Joomla, we wrote custom code to dynamically rewrite the URLs. This worked as designed. Unfortunately, however, Joomla had a component which was dynamically creating the Google sitemaps, and as this had not had any code changes it got confused and started serving up a heap of URLs which never previously existed. This resulted in each site having 1000 - 2000 404s. It took us three weeks to work this out and to put a fix in place. This has now been done and we are down to zero 404s for each site in GWT, and we have proper Google sitemaps submitted (all done 3 days ago). In the meantime, our organic rankings and traffic began to decline around 5 days after the migration and after 10 days had dropped from around 700 daily visitors to around 300. It has remained at that level for the past 2 weeks with no sign of any recovery. Now that we have fixed the 404s and have accurate sitemaps in Google, how long do you think it will take to start to see an upwards trend again, and how long is it likely to take to get back to pre-migration levels of organic traffic (if at all)?
Question 3: The owner of the company is understandably nervous about the overall situation. He is wishing right now that we had never made the migration. If we decided to roll back to what we previously had, are we likely to cause further recovery delays, and would it come back to what we previously had in a reasonably quick time frame? A huge thanks to everyone for reading what is quite a technical and lengthy post, and a big thank you in advance for any answers. Kind Regards,
Conrad
Intermediate & Advanced SEO | | activenz0 -
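On the sitemap/404 part of the question above, one rough way to confirm that every URL in a submitted sitemap actually resolves is a quick script like this (the sitemap URL simply reuses the example domain from the question; adjust as needed):

# Rough sketch: pull every <loc> from an XML sitemap and report any URL
# that does not return a plain 200. The sitemap URL is a placeholder.
import requests
import xml.etree.ElementTree as ET

SITEMAP_URL = "https://sitenewzealand.com/sitemap.xml"
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

root = ET.fromstring(requests.get(SITEMAP_URL, timeout=10).content)
urls = [loc.text for loc in root.findall(".//sm:loc", NS)]

for url in urls:
    status = requests.get(url, allow_redirects=False, timeout=10).status_code
    if status != 200:
        print(f"{url} -> {status}")
print(f"Checked {len(urls)} URLs from the sitemap.")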
Switching Url
I started working with a roofer/contractor about a year ago. His website is http://www.lancasterparoofing.com/. The name of his business is Spicher Home Improvements. He used to have spicherhomeimprovements.com; well, he still does. He was focusing on roofing and siding but now would like to branch out to other areas like interior remodeling. So adding interior work under LancasterPaRoofing.com is not a good fit. I do not think starting another domain and having two is the best option. I think he should go back to using SpicherHomeImprovements.com; I assume he would take a small hit, but in time he should be better off. Plus the URL matches the real name of his business. Thanks for any feedback I receive. Chad
Intermediate & Advanced SEO | | ChadEisenhart0 -
Does capitalization matter for SEO?
Two places capitalization comes into play:
(1) on-page use (title, h1, body text, img alt text, etc.)
(2) external anchor text
I didn't think it mattered from Google's point of view for on-page usage (is this correct?), but I notice that Open Site Explorer's 'anchor text distribution' tab shows different counts for the same keyword if it's capitalized in different ways (e.g. seomoz.org is listed separately from SEOmoz.org). Is that just OSE, or does Google treat the keyword/phrase differently based on its capitalization, too? And if so, should I be creating external links to my site with both the 'regular' and 'Capitalized' versions of my key phrases?
Intermediate & Advanced SEO | | scanlin1
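For what it's worth, one rough way to see the "real" distribution behind OSE's case-sensitive counts is to normalize case before counting; a minimal Python sketch with made-up anchors:

# Rough sketch: count anchor text case-insensitively so "SEOmoz.org" and
# "seomoz.org" roll up into one bucket. The anchors below are made up.
from collections import Counter

anchors = ["seomoz.org", "SEOmoz.org", "SEO tools", "seo tools", "Seo Tools"]

counts = Counter(a.lower() for a in anchors)
for anchor, n in counts.most_common():
    print(f"{anchor}: {n}")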