Keyword in URL - SEO impact
-
Hi,
We don't have the most important keyword of our industry in our domain or subdomain. How important is it to have the keyword in the website URL? Most of our competitors' pages with keyword URLs have been listed in the SERPs. What is the role of backlinks in this scenario? And which URL has more of an advantage: the keyword in a subdomain, or a page with the keyword? For example, for the keyword "seo": seo.example.com or example.com/seo
-
Hi JK,
Thanks for the reply.
First of all, this is an old article, and much (if not all) of the information in it has since changed in Google's search algorithms.
In the case you are describing, in today's SEO (2019), I'd suggest not focusing on having any keyword weight in the domain/subdomain.
Keep both services on the same domain, under the same company name. Google will understand that it's the same company offering two services. Of course, you should have different pages for the two services.
It would probably help to create really good content to give context, and to use internal linking to tell Google which are your main pages and targeted search terms. Hope it helps.
Best luck.
Gaston -
GR,
Appreciate the advice. We are working with a customer that offers two very different services (janitorial services and pest control). We have two options for organizing their site.
Option A: 1 website under the main company name with a dual focus
Option B: 2 separate sites, 1 for each service, giving the main service the company domain name and the secondary service the company domain name - janitorial.com
Would love to hear your thoughts on this!
Thanks,
JK
-
Yep, it should be redirected to relevant pages, not to the homepage or to less relevant pages, because that is treated as a soft 404 error.
Here are some great articles:
- Proof That 301 Redirects To Less-Relevant Pages Are Seen As Soft 404s To Google [Case Study]
- Google May Treat Expired Products Page Redirects As Soft 404s
- And, what Google considers as soft 404 errors
Hope it helps.
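A quick way to sanity-check a redirect map against this "relevant page" rule is to compare the old and new URL paths. This is only a rough heuristic sketch, not Google's actual logic, and the URLs below are illustrative:

```python
from difflib import SequenceMatcher
from urllib.parse import urlparse

def path_similarity(old_url, new_url):
    """Rough 0-1 similarity between two URL paths, as a quick relevance sanity check."""
    old_path = urlparse(old_url).path.strip("/")
    new_path = urlparse(new_url).path.strip("/")
    return SequenceMatcher(None, old_path, new_path).ratio()

# A redirect to the homepage scores 0.0 here: one hint it could be judged a soft 404.
print(path_similarity("https://example.com/red-widgets/", "https://example.com/"))
print(path_similarity("https://example.com/red-widgets/", "https://example.com/widgets/red/"))
```

A low score is only a flag to review the pair by hand; relevance ultimately depends on the content, not the URL string.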
GR. -
Thanks GR.
I have gone through the Moz article on 301 redirects. They say that every redirect comes with a certain amount of SEO risk. What exactly might that be?
Also, everybody says that we must redirect to a relevant page. I understand that the two pages' content must be relevant, but what is the importance of the URL here? Do we need a match in the URL keywords too? If a non-existing link is redirected to an existing page, what metrics will pass link juice, and is there any risk, given that one of the pages has no content? How do bots check this? For example:
website.com/folder/page1/content/ is a non-existing page and is redirected to website.com or website.com/folder2/page14.
Thanks,
Satish
-
You're welcome, we are here to help.
Theoretically, there is no link juice loss with redirects.
Take a look at this article on the latest news and updates from Google about 3xx redirects:
301 Redirects Rules Change: What You Need to Know for SEO
Hope I've helped.
GR. -
Thanks for the suggestions GR. That gives us an idea of how to proceed.
We are also planning to move all the links from one subdirectory to another subdirectory, but we will be redirecting them to related pages.
For example, website.com/folder1/seo-changes to website.com/folder2/seo-changes
Only the subdirectory is going to change, but we have many links pointing to the subdirectory we are redirecting. Will the same link juice pass through the links redirected to the other subdirectory?
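A move like this can be expressed as a simple redirect-map builder that changes only the subdirectory and keeps each slug intact. A minimal sketch with illustrative URLs; each resulting pair would become one 301 rule in the server configuration:

```python
def build_redirect_map(urls, old_dir="/folder1/", new_dir="/folder2/"):
    """Map each old URL to its new location, swapping only the subdirectory."""
    return {u: u.replace(old_dir, new_dir, 1) for u in urls if old_dir in u}

mapping = build_redirect_map([
    "https://website.com/folder1/seo-changes",
    "https://website.com/folder1/another-post",
])
for old, new in mapping.items():
    # one 301 rule per pair, e.g. in .htaccess or the nginx config
    print(old, "->", new)
```

Generating the map programmatically keeps old and new URLs one-to-one, which is exactly the "redirect to the related page" pattern discussed above.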
-
Hi there.
I've conducted some experiments on having the keyword in several places: Exact match domain, subdomain, subdirectory and in the slug.
My conclusions are:
- Having it in a subdomain doesn't help at all.
- Having it in an exact-match domain (e.g. keyword.com) helps very little and creates a problem when the business tries to expand to more search terms.
- Having it in a subdirectory (e.g. domain.com/my-keyword/some-page) doesn't help much, unless you're trying to rank the page that sits at the subdirectory itself; in that case it effectively becomes a single page or just a slug.
- Having it in the slug (e.g. domain.com/single-page-keyword) helps and makes the difference.
In my opinion, when it comes to on-page and keyword optimization, it's mandatory to place the main keyword (or some variation of it) in the final URL.
In all my experiments there was always correct on-page optimization for the target keyword, and they were all focused on similar keywords with similar search difficulties.
---- UPDATE ----
Here is some information and resources:
- 15 SEO Best Practices for Structuring URLs - Moz blog
- URL - Moz's Learn
- On-Page SEO: Anatomy of a Perfectly Optimized Page (2016 Update) - Backlinko
---- UPDATE ----
Hope it helps.
GR.
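The keyword-in-the-slug recommendation above can be sketched as a small slug builder. A minimal, hedged example (the sample phrase is hypothetical):

```python
import re

def slugify(phrase):
    """Lowercase, collapse runs of punctuation/whitespace into hyphens, trim hyphens."""
    slug = re.sub(r"[^a-z0-9]+", "-", phrase.lower())
    return slug.strip("-")

print(slugify("SEO Best Practices, 2019!"))  # -> seo-best-practices-2019
```

Keeping the slug short, lowercase, and hyphen-separated matches the URL-structure best practices linked above.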
Related Questions
-
Print Button Creating Duplicate PDF URLs set to NoIndex, OK for SEO?
Our real estate website has 400 listings. We have added a button that allows the visitor to print listing pages in the form of a PDF. The PDF exists as a URL ending in ?print=17076. This print URL is set to noindex and follow. So our site has 400 additional URLs. Is this a negative for SEO, or neutral? I have read that using CSS it is possible to set up printing without creating all these extra URLs. Is this method better from an SEO perspective? Thanks, Alan
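One way to keep the noindex rule consistent across all 400 print variants is to derive the robots directive from the URL itself. A minimal sketch, assuming (as in the question) that print variants are identified by a print query parameter:

```python
from urllib.parse import urlparse, parse_qs

def robots_directive(url):
    """Return the robots directive a URL should carry: noindex print variants, index the rest."""
    params = parse_qs(urlparse(url).query)
    return "noindex, follow" if "print" in params else "index, follow"

print(robots_directive("https://example.com/listing-17076?print=17076"))
print(robots_directive("https://example.com/listing-17076"))
```

The returned string could feed either a robots meta tag or an X-Robots-Tag response header, whichever the site's stack supports.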
Intermediate & Advanced SEO | | Kingalan10 -
Reasonable to Ask URL of Link from SEO Providing New Links before Link Activation?
My firm has hired an SEO to create links to our site. We asked the SEO to provide a list of domains that they are targeting for potential links. The SEO did not agree to this request on the grounds that the list is their unique intellectual property. Alternatively I asked the SEO to provide the URL that will be linking to our site before the link is activated. The SEO did not agree to this. However, they did say we could provide comments afterwards so they could tweak their efforts when the next 4-5 links are obtained next month. The SEO is adamant that the links will not be spam. For whatever it is worth the SEO was highly recommended. I am an end user; the owner and operator of a commercial real estate site, not an SEO or marketing professional. Is this protectiveness over process and data typical of link building providers? I want to be fair with the provider and hope I will be working with them a long time, however I want to ensure I receive high quality links. Should I be concerned? Thanks,
Intermediate & Advanced SEO | | Kingalan1
Alan0 -
Javascript and SEO
I've done a bit of reading and I'm having difficulty grasping it. Can someone explain it to me in simple language? What I've gotten so far: JavaScript can block search engine bots from fully rendering your website. If bots are unable to render your website, they may not be able to see important content and may discount this content from their index. To know if bots can render your site, check the following: Google Search Console Fetch and Render; turning off JavaScript in your browser to see which site elements disappear; an online tool such as Technical SEO's Fetch and Render; Screaming Frog's Rendered Page tab; GTmetrix results (if "Defer parsing of JavaScript" appears as a recommendation, does that mean there are elements being blocked from rendering???). Using our own site as an example, I ran it through all the tests listed above. Results: Google Search Console rendered only the header image and text; anything below wasn't rendered. The resources Googlebot couldn't reach include Google Ad Services, Facebook, Twitter, our call tracker, and Sumo, all "Low" or blank severity. Turning off JavaScript shows only the logo and navigation menu; anything below didn't render/appear. Technical SEO Fetch and Render: our page rendered fully on Googlebot and Googlebot Mobile. Screaming Frog: the Rendered Page tab is blank; it says 'No Data'. GTmetrix results: defer parsing of JavaScript was recommended. From all these results and across all the tools I used, how do I know what needs fixing? Some tests didn't render our site fully while some did. With varying results, I'm not sure where to go from here.
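The "turn off JavaScript" test described above boils down to checking whether key content strings already exist in the raw, pre-rendered HTML. A simplified sketch using a hypothetical page where the body content is injected by JavaScript into an empty container:

```python
def content_in_raw_html(html, markers):
    """For each key content string, report whether it is present in the raw HTML."""
    return {marker: (marker in html) for marker in markers}

# Only the header is server-rendered; everything else would need JS to appear.
raw = "<html><body><h1>Site Header</h1><div id='app'></div></body></html>"
print(content_in_raw_html(raw, ["Site Header", "Pricing table"]))
```

Anything that comes back False is content a non-rendering crawler would miss, which is the signal the manual JS-off test is looking for.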
Intermediate & Advanced SEO | | nhhernandez1 -
Company name often shows in anchor text (important keyword phrase within), can this impact ranking?
Hi everyone, My company is called "Hawaii Job Engine" - www.hawaiijobengine.com - and many sites that link to my site use my company name as the anchor text "Hawaii Job Engine". I have heard Google may devalue a certain keyword phrase if it is used too often in anchor text. Does this mean I may, over time, get a poor ranking for the term "Hawaii Job", since that phrase is part of my company's name? Or will search engines easily notice it is my company name, so it will not have a negative impact on rankings? Example: if the anchor text leading to my company's homepage is the company's name 95% of the time (on authoritative sites), could this be an issue? I don't know the percentages, but I'd just like to establish whether there are % levels to keep in mind. Thank you, Kristian
Intermediate & Advanced SEO | | knielsen1 -
Dynamic URLs Appearing on Google Page 1. Convert to Static URLs or not?
Hi, I have a client who uses dynamic URLs throughout his site. For SEO purposes, I've advised him to convert dynamic URLs to static URLs whenever possible. However, the client has a few dynamic URLs that are appearing on Google Page 1 for strategically valuable keywords. For these URLs, is it still worth it to 301 them to static URLs? In this case, what are the potential benefits and/or pitfalls?
Intermediate & Advanced SEO | | mindflash0 -
Large Site SEO - Dev Issue Forcing URL Change - 301, 302, Block, What To Do?
Hola, Thanks in advance for reading and trying to help me out. A client of mine recently created a large-scale company directory (500k+ pages) in Drupal v6 while the "marketing" pages of their site were still in manually hard-coded HTML. They redesigned their "marketing" pages, but used Drupal v7. They're now experiencing server conflicts, with the two instances of Drupal unable to communicate or live on the same server. Eventually the directory will be upgraded to Drupal v7, but that could take weeks to months, and the client does not want to wait for the re-launch. The client wants to push the new marketing site live, but also does not want to ruin the overall SEO value of the directory. They have a few options, but I'm looking to help guide them down the path of least resistance: Option 1: Move the company directory onto a subdomain and the "marketing site" onto the www. subdomain. The client gets to push their redesign live, but large-scale 301s to the directory cause major issues in terms of shaking up the structure of the site, causing ripple effects that get it pulled out of the index for days to weeks. Rankings and traffic drop, subdomain authority gets lost, and the company directory's health looks bad for weeks to months. However, a 301 maintains partial SEO value and some long-tail traffic still exists. Once the directory gets moved to Drupal v7, it will cancel the 301 to the subdomain and revert back to the original www. subdomain URLs. Option 2: Block the company directory from search engines with robots.txt and meta instructions, essentially cutting off the floodgates from the established marketing pages. No major scaling 301 ripple effect; the directory takes a few weeks to filter out of the index and traffic is completely lost, but once Drupal v7 gets upgraded and the directory is re-opened, it will slowly regain SEO value to get close to its old rankings, traffic, etc. Option 3: 302 redirect? Lose all accumulated SEO value temporarily...
hmm Option 4: Something else? As you can see, this is not an ideal situation. However, a decision has to be made and I'm looking to choose the lesser of evils. Any help is greatly appreciated. Thanks again -Chris
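For Option 2, the robots.txt side can be generated mechanically so the block covers exactly the directory section and nothing else. A minimal sketch, assuming a hypothetical /directory/ path prefix for the 500k-page section:

```python
def robots_txt_block(paths):
    """Build a robots.txt body disallowing the given path prefixes for all crawlers."""
    lines = ["User-agent: *"] + ["Disallow: {}".format(p) for p in paths]
    return "\n".join(lines) + "\n"

# Hypothetical path for the directory section only; marketing pages stay crawlable.
print(robots_txt_block(["/directory/"]))
```

Note that robots.txt only stops crawling; pages already indexed need the meta/noindex instructions mentioned in the question to actually drop out of the index.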
Intermediate & Advanced SEO | | Bacon0 -
How do you implement dynamic SEO-friendly URLs using Ajax without using hashbangs?
We're building a new website platform and are using Ajax as the method for allowing users to select from filters. We want to dynamically insert elements into the URL as the filters are selected so that search engines will index multiple combinations of filters. We're struggling to see how this is possible using symfony framework. We've used www.gizmodo.com as an example of how to achieve SEO and user-friendly URLs but this is only an example of achieving this for static content. We would prefer to go down a route that didn't involve hashbangs if possible. Does anyone have any experience using hashbangs and how it affected their site? Any advice on the above would be gratefully received.
Intermediate & Advanced SEO | | Sayers1