Is a state name in the URL hindering national rankings?
-
My client's site has a state name in the URL, but he practices at a national level. Is this going to limit rankings to that state, and is it worth creating a whole new site with just the keywords in the URL?
Thank you,
Donna
-
No, it will not. I like to use brand + keyword for my domain. If my brand name were White Blue Green Red and my keyword were "color," I would look for the URL WBGRcolor.com, if it's available.
Related Questions
-
What is the risk of changing underscores to hyphens in URLs that are ranking
Client wants to change the URL structure from underscores to hyphens; the reason for doing this is purely cosmetic. What is the risk of changing underscores to hyphens for URLs that have been around since 2012 and have a lot of keywords ranking in the top 5 of the SERPs? When they created the site, they structured the URLs using dashes and underscores. Here is an example of what a URL looks like: /programs-degrees/clinical-psychology/clinical_phd_kansas-city/ This page ranks for many high-volume keywords in the top 5 of the SERPs. I have started to compile a list of reasons why the URL should not be changed:
- Building trust and authority from scratch
- 301 redirects do not pass 100% of the link juice
- 301 redirecting from underscored to hyphenated versions of the same content is an unnecessary risk to some of that link equity
- Good chance rankings/traffic will drop because of the URL change
Technical SEO | The-frank-Agency
-
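If the change does go ahead despite those risks, the lowest-risk mechanism is a one-to-one 301 map from each underscored URL straight to its hyphenated twin, with no redirect chains. A minimal, hypothetical sketch using the path from the question (Apache and mod_alias are assumptions; a real migration would need one mapping per ranking URL, or a programmatic rewrite rule):

```apache
# Hypothetical one-to-one 301 map (Apache mod_alias); one line per URL.
# Old underscored path -> new hyphenated path, single hop, no chains.
Redirect 301 /programs-degrees/clinical-psychology/clinical_phd_kansas-city/ /programs-degrees/clinical-psychology/clinical-phd-kansas-city/
```

Whatever the mechanism, each old URL should redirect directly to its final destination; chained redirects compound whatever equity loss a 301 already carries.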
URL Parameters
On our webshop we've added some URL parameters. We've set URLs like min_price, filter_cat, filter_color, etc. to "Don't crawl" in Google Search Console. We see that some parameters have 100,000+ URLs and some have 10,000+. Is it better to add these parameters to the robots.txt file? And if that's better, how can we write it so the URLs will not be crawled? Our robots.txt file currently shows:

# Added by SEO Ultimate's Link Mask Generator module
User-agent: *
Disallow: /go/
# End Link Mask Generator output

User-agent: *
Disallow: /wp-admin/
Technical SEO | Happy-SEO
-
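For what it's worth, Google's robots.txt parser supports `*` wildcards, so parameter URLs like those named in the question can be blocked with pattern rules. A hedged sketch (the parameter names come from the question; the patterns assume they appear as query-string arguments, and blocking crawling does not remove URLs that are already indexed). Note also that the existing file has two separate `User-agent: *` groups; some parsers honor only one, so merging them into a single group is safer:

```
User-agent: *
Disallow: /go/
Disallow: /wp-admin/
# Parameter patterns (wildcards are a Google-style extension,
# not part of the original robots.txt standard)
Disallow: /*?*min_price=
Disallow: /*?*filter_cat=
Disallow: /*?*filter_color=
```

Crawlers that don't support wildcards will ignore those pattern lines, so the Search Console parameter settings remain useful alongside them.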
Rankings after manual penalty removal
I've just started working on an ecommerce website that was hit by Penguin 2.0 in May (it was ranking 2nd for its major keyword at the time), and it hasn't ranked for that keyword since. After a lot of link removal, the reconsideration request was accepted and the manual penalty was removed. Rankings haven't really improved, and that specific keyword has not been reindexed. The site does have a lot of not-found errors (it was 5.5k, recently brought down to 4k), but it was still ranking before the penalty. Is there anything you believe I'm missing? Is it the onsite errors that are flagging the site as unreliable? I thought it would still appear for the keyword if that were the case.
Technical SEO | Sandeep_Matharu
-
Marketing URL
Hi, I need a bit of advice on marketing URLs. The destination URL is http://www.website.com/by-development.php?area=Isle Of Wight&development=developmentname. If we wanted to use www.website.com/developmentname on literature to send people to the ugly URL above, what would we do? Would we need to rewrite the ugly URL to the neat one, and then 301 the ugly to the neat? Currently, the team are using a new domain, neatandrelevant.info, and 301 redirecting it to the ugly URL; but there are lots of different developments they want to send people to, so a new domain is bought for each development, which seems a bit unnecessary. They point to different pages on the ugly-URL website. I assume a canonical tag would not be needed then, because the ugly URL page would be redirected. Also, as the website has ugly URLs anyway, would it not be best practice to use rewrites anyway, so that the URLs read www.mywebsite.com/region/development? Would it confuse things to then have extra-short marketing URLs missing out /region? Hope that makes sense.
Technical SEO | Houses
-
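One way to avoid buying a domain per development: keep the neat path on the main domain, rewrite it internally to the real script, and 301 any direct request for the parameterised URL back to the neat one. A hypothetical Apache mod_rewrite sketch (the paths, parameter values, and server type are all assumptions based on the question):

```apache
RewriteEngine On

# 301 direct requests for the "ugly" URL to the neat one.
# %{THE_REQUEST} matches the original client request line only,
# so this does not loop with the internal rewrite below.
RewriteCond %{THE_REQUEST} \s/by-development\.php\?area=Isle
RewriteRule ^by-development\.php$ /isle-of-wight/developmentname? [R=301,L]

# Internally map the neat URL to the real script
# (the browser's address bar keeps the neat URL).
RewriteRule ^isle-of-wight/developmentname$ /by-development.php?area=Isle+Of+Wight&development=developmentname [L]
```

With this pattern only one canonical, crawlable URL exists per development, so no canonical tag is needed for the ugly variant; it never resolves with a 200.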
Second URL
Hi. We have a .com and a .co.uk. The main website is the .co.uk; we also have a landing page on the .com. If we redirect the .com to the .co.uk, will it create duplicate content? It may seem like a silly question, but I want to be sure that visitors can't access our website at both URLs, as that would be duplicate content. Thanks in advance, John
Technical SEO | Johnny4B
-
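A site-wide 301 is the standard answer here: once it is in place, the .com serves no indexable content of its own, so there is nothing to duplicate. A hypothetical Apache sketch (the domain names are placeholders, not the real sites):

```apache
# Serve nothing from the .com; permanently redirect every path to the .co.uk.
<VirtualHost *:80>
    ServerName example.com
    ServerAlias www.example.com
    Redirect permanent / http://www.example.co.uk/
</VirtualHost>
```

`Redirect permanent /` matches by prefix and appends the remainder of the requested path to the target, so deep links on the .com carry visitors to the matching .co.uk page rather than the homepage.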
No Keyword in URL
SEOMoz (and other platforms) advise that I need to add my keyword to the page URL, but as far as I'm concerned it already has been, so why don't these platforms see it? My home page URL is www.salesandinternetmarketing.com, but apparently I haven't added the keyword "internet marketing" to the URL. What advice can you give me, please? Lindsay
Technical SEO | lindsayjhopkins
-
Changed URL of all web pages to a new updated one - Keywords still pick the old URL
A month ago we updated our website, and with that we created new URLs for each page. Under "On-Page", the keywords we track are still reporting information on the old URLs of our website. Slowly, some new URLs are popping up. I'm wondering if there's a way I can manually make the keywords report information from the new URLs.
Technical SEO | Champions
-
/$1 URL Showing Up
Whenever I crawl my site with any kind of bot or sitemap generator, it comes up with a /$1 version of my URLs. For example, it gives me hdiconference.com and hdiconference.com/$1, and hdiconference.com/purchases and hdiconference.com/purchases/$1. Then I get warnings saying that it's duplicate content. Here's the problem: I can't find these /$1 URLs anywhere. Even when I type them in, I get a 404 error. I don't know what they are or where they came from, and I can't find them when I scour my code. So I'm trying to figure out where the crawlers are picking this up. Where are these things? If sitemap generators and other site crawlers are seeing them, I have to assume that Googlebot is seeing them as well. Any help? My developers are at a loss as well.
Technical SEO | HDI
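One common cause (an assumption, since the server config isn't shown): a plain mod_alias Redirect directive written as if it were a mod_rewrite rule. Redirect does no regex capturing, so "$1" is not a backreference there; it gets appended literally to the redirect target, and crawlers following those redirects discover /$1 URLs that exist nowhere in the page code. A hypothetical Apache illustration:

```apache
# Buggy: mod_alias Redirect does not understand $1, so every redirected
# request gets a literal "$1" appended to the target -- producing /$1 URLs.
Redirect 301 / http://www.hdiconference.com/$1

# Likely intent: mod_rewrite, where $1 IS a real backreference.
RewriteEngine On
RewriteRule ^(.*)$ http://www.hdiconference.com/$1 [R=301,L]
```

Grepping the server, CDN, or redirect-plugin configuration for the literal string "$1" is usually faster than scouring page code, since the bad URL only ever appears in redirect Location headers, not in the HTML.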