URL Importance In Search
-
This may have been addressed before. If it has, please link me to the thread. I'm trying to do SEO for the surrounding cities my client services. It was suggested I purchase domains relevant to those cities and create separate pages optimized for those local keywords. I'm wondering if this is a good tactic.
For example, my client's business is located in Chicago, but it services the surrounding suburbs of Chicago. What's the current best way to handle SEO for this?
-
One website, with the city names in the URLs of the other pages, would be strongest, I would think.
-
Hi Russell,
If you were my client, I would advise you to stick with your one website and work on building out awesome content for your surrounding cities. Here's why I'd advise this:
-
Everything you build on your one website serves to strengthen your overall presence. Rather than dividing things up onto a bunch of different domains, you're building a big, powerful website.
-
I seldom find it warranted for a single business to have more than one website. For example, a San Francisco-based plumber is offering identical services, whether within San Francisco, or traveling to serve clients in San Jose, Oakland and Mill Valley. I'm not convinced of the genuine need for him to have millvalleyplumbing.com, sanjoseplumbing.com, etc. To me, it looks like an obvious attempted grab for rankings rather than a client-focused decision.
-
Tied into that is the fact that few plumbers have the time to develop enough truly unique and helpful content to flesh out multiple websites. This puts the client in danger of suffering EMD or duplicate-content penalties for publishing a bunch of thin (or worse, duplicate) content websites. From my experience working with local business owners, I've found that an attainable goal is creating a really strong, unique page for each of their service cities...not creating whole websites for each of them. There could, of course, be exceptions to this, but this is what I've found to be the case most of the time.
-
It's so much easier for the client (and you) to manage a single website than a bunch of different mini sites.
So, those are some reasons I'd advise going with a single website and creating city landing pages on that site for the client's service cities. Do not create duplicate pages. Create fantastic, unique pages. This is a must. Hope these make sense. And yes, write good URLs for the city landing pages when you create them.
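To make the "good URLs" point concrete, here is a minimal sketch in Python (the domain and city names are made up for illustration) of turning city names into clean, hyphenated landing-page slugs, all on a single site:

```python
import re

def city_slug(city_name):
    """Convert a city name like 'Mill Valley' into a URL-friendly slug like 'mill-valley'."""
    slug = city_name.strip().lower()
    slug = re.sub(r"[^a-z0-9]+", "-", slug)  # replace spaces and punctuation with hyphens
    return slug.strip("-")

# Hypothetical example: one landing page per service city, all on one domain
cities = ["Evanston", "Oak Park", "Naperville", "Arlington Heights"]
for city in cities:
    print(f"https://www.example.com/service-areas/{city_slug(city)}/")
```

Readable, keyword-bearing paths like these tell both visitors and search engines what each page covers, without splitting your link equity across separate domains.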
-
How does duplicate content affect this strategy?
-
One website, one URL with internal pages dedicated to regions.
The only reason for separate sites would be if you were trying to get exact match domain names.
-
Hi Russell,
You can either purchase separate domains for each location or create separate pages for each city under the same domain. Both ways work, but I would prefer using the same domain and creating different pages for each location. This way, all the backlinks/link juice will point to one website, instead of you having to build different backlinks for each website.
For each location, you can create a page and insert the location name in the URL so that search engines and visitors will know that the page is for that location. For example: www.sample.com/locationA/services/folder, www.sample.com/locationB/services/folder
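On top of that, it can help to list each location page in your sitemap so search engines find them all. Here's a minimal sketch in Python (the domain and location paths are hypothetical, mirroring the example above) of generating a simple sitemap.xml for those pages:

```python
# A minimal sketch: build one sitemap.xml entry per location page.
# The domain and paths below are hypothetical, mirroring the example above.
locations = ["locationA", "locationB"]

entries = "\n".join(
    f"  <url><loc>https://www.sample.com/{loc}/services/folder</loc></url>"
    for loc in locations
)
sitemap = (
    '<?xml version="1.0" encoding="UTF-8"?>\n'
    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
    f"{entries}\n"
    "</urlset>"
)

with open("sitemap.xml", "w") as f:
    f.write(sitemap)
```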
Hope this helps.