Will a big list of all the cities in a client's service area help or damage a page's SEO?
-
We have a client we inherited whose contact page has a flat text list of all the cities and counties they service.
They service the entire Southeast, so the list just looks ridiculous.
--------- Example: ----
South Carolina:
Abbeville, Aiken, Allendale, Anderson, Bamberg, Barnwell, Beaufort, Berkeley, Calhoun, Charleston, Cherokee, etc etc
------ end example ------
The question is: will this help or hinder their SEO in their very specific niche industry? Is this keyword spamming? It has an end-user purpose, so technically it isn't spam, but the engines may look at it otherwise. I couldn't find a definitive answer to the question; any help would be appreciated.
-
Right on! It worked for the tortoise.
-
Excellent suggestion. Slow and steady wins the race.
-
Scott,
Curious if the business in question has a blog? Could he blog about 'an engine I fixed for a client in Abbeville, SC', and put a content strategy in place to start blogging about his projects in his major cities? Maybe just start with the top 10 cities from which he gets orders for engine repair? Craft a writeup of each project he accomplishes for a unique client in each city and make it a blog post. Then, move on to the 10 next-most-important cities. So, maybe he would start with the capitals of South Carolina, Florida, Georgia, and Alabama, and then move on to other busy cities.
Eventually, you could have a page on the site (or a menu area) designated Successful Project Showcase that would link permanently to these posts.
My goal here would be to find an authentic and natural approach for showcasing his work in a way that adds great content to the site and doesn't simply list every city in the South East. This strategy, in combination with his service area map, could work well, I believe.
-
That certainly solves the design problem, but it would not help someone in Abbeville, South Carolina find the business (and the business certainly won't have a unique landing page for such a small city). Decisions, decisions. Thanks for the suggestions.
-
While I can't say this would result in an actual penalty, as you say, it looks spammy, so anything like that is on kind of shaky ground.
Have you considered making a service area map instead, showing all of the client's service states/cities?
If he services every city in every state of the South East, I simply cannot find a logical justification for listing them all. A map would send the same message, but in a logical, visual manner.
-
Good answers. They do some seriously technical stuff with broken engines. They only have one location, but because it's so niche and there are so few competitors they have clients all over the country that ship their engines to the client in Florida for repairs.
It certainly looks spammy design-wise (and we'll find ways to rectify that with some jQuery dropdowns), but I'm more concerned with any potential penalty this might cause, if any.
-
Hi Christopher,
Yes, I'd say that would end up looking pretty spammy if they've got a list like this for every state in the South East on their contact page. For the same reason that an e-commerce website wouldn't list all 1000 items they carry on a single page, this is not something I'd recommend.
What's the business model? Virtual or Local? If local, a more natural approach to this would be to have unique pages for each of their physical offices. I very much doubt they have an office in every one of those cities in South Carolina, right? But, perhaps they have 10 offices throughout the South East and could have a unique page for each of them?
Maybe you could share a few more details about the type of business this is?
-
I would create a page called "Service Area" and format the cities as an unordered list (ul); it may look nicer and is less spammy. Without knowing the product or service, I'm not sure if that will work for you.
Ex:
South Carolina
- Abbeville
- Aiken
- Allendale
- Anderson
- etc.
Georgia
- Atlanta
- Blah
- Clah
- Dlah
Most importantly: DO NOT post that list in the footer or sidebar of every page. That would significantly dilute its effectiveness. Containing this information on a single page, and peppering the rest of the site with mentions of your larger markets, will likely be the most effective approach for you.
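To illustrate, here's a minimal sketch of what that "Service Area" page markup could look like: one heading plus one unordered list per state (the city names beyond those quoted above are placeholders, and the exact structure is an assumption, not a prescription):

```html
<!-- Hypothetical "Service Area" page fragment: a heading per state,
     followed by a plain ul of the cities served in that state. -->
<section id="service-area">
  <h2>South Carolina</h2>
  <ul>
    <li>Abbeville</li>
    <li>Aiken</li>
    <li>Allendale</li>
    <li>Anderson</li>
    <!-- …remaining SC cities… -->
  </ul>

  <h2>Georgia</h2>
  <ul>
    <li>Atlanta</li>
    <!-- …remaining GA cities… -->
  </ul>
</section>
```

Keeping this on a single dedicated page (rather than repeating it sitewide) matches the advice above, and the per-state lists are easy to collapse later with the jQuery dropdowns mentioned earlier if the page still feels long.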