Location Pages and Duplicate Content and Doorway Pages, Oh My!
-
Google has this page on location pages. It's very useful, but it doesn't say anything about handling the duplicate content a location page might have, seeing as the locations may offer very similar services.
Let's say they have example.com/location/boston, example.com/location/chicago, or maybe boston.example.com, chicago.example.com, etc.
They are landing pages for each location, housing that location's contact information and showing the same services/products as every other location. This information may also live on the main domain's homepage or services page as well.
My initial reaction agrees with this article: http://moz.com/blog/local-landing-pages-guide - but I'm really asking: what does Google expect? Does this location pages guide from Google tell us we don't really have to make each of those location pages unique? Sometimes creating "unique" location pages feels like you're creating **doorway pages** - "Multiple pages on your site with similar content designed to rank for specific queries like city or state names".
In a nutshell, Google's Guidelines seem to have a conflict on this topic:
Location Pages: "Have each location's or branch's information accessible on separate webpages"
Doorway Pages: "Multiple pages on your site with similar content designed to rank for specific queries like city or state names"
Duplicate Content: "If you have many pages that are similar, consider expanding each page or consolidating the pages into one."
Now, you could avoid making it a doorway page or a duplicate content page if you just put the location information on a page. Each page would then have a unique address, phone number, email, contact name, etc. But then the page would technically be in violation of this page:
Thin Pages: "One of the most important steps in improving your site's ranking in Google search results is to ensure that it contains plenty of rich information that includes relevant keywords, used appropriately, that indicate the subject matter of your content."
...starting to feel like I'm in a Google Guidelines Paradox!
Do you think this guide from Google means that duplicate content on these pages is acceptable as long as you use that markup? Or do you have another opinion?
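For context, the markup the question refers to is structured data for a business location. A minimal sketch of schema.org LocalBusiness JSON-LD for one of the hypothetical location pages above might look like this (all names, addresses, and URLs here are made-up placeholders, not from the thread):

```html
<!-- Placed in the <head> or <body> of example.com/location/boston (hypothetical URL).
     Marks up the location-specific details that make each page unique:
     address, phone, and the page's own URL. -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Co. - Boston",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Main St",
    "addressLocality": "Boston",
    "addressRegion": "MA",
    "postalCode": "02101"
  },
  "telephone": "+1-617-555-0100",
  "url": "https://example.com/location/boston"
}
</script>
```

The markup doesn't make duplicate body copy acceptable by itself, but it does let search engines tie each page to a distinct physical location.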
-
Thanks for the comment Laura!
I was aware that duplicate content wasn't the issue, but it baffled me that this very obvious black-hat tactic wasn't punished by Google in any way, even though their guidelines clearly state that doorway pages are a big "no-no".
Let's hope the December 2017 update has a noticeable impact.
Have a nice day!
-
The Panda filter is just that, a filter. It doesn't remove pages from the index, and you won't get a manual penalty because of it.
In the case of duplicate content, Google chooses the most relevant or original content and filters out the duplicates. On the other hand, when a website has multiple pages with the same content, that can affect the overall quality of the entire website. This can affect search performance as well.
Then there's the issue of doorway pages, which are duplicate pages created for the purpose of funneling visitors to the same destination. This goes against Google's guidelines, and they confirmed a December 2017 algorithm update that affects sites using doorway pages.
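One concrete way to handle the consolidation case mentioned above, when several near-duplicate pages genuinely cover the same thing, is a canonical link pointing at the surviving page. A hedged sketch with hypothetical URLs (this is a common technique, not something this thread prescribes):

```html
<!-- On a near-duplicate page such as example.com/services/plumbing-westfield
     (hypothetical URL) that has been consolidated into one main services page:
     tell search engines which URL is the preferred version to index. -->
<link rel="canonical" href="https://example.com/services/plumbing" />
```

Note this only fits true duplicates; distinct location pages with their own address and contact details should usually remain self-canonical.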
-
Hi Laura,
It seems like this age-old black-hat tactic still works though. Maybe only outside of the US? Check out this SERP: https://www.google.be/search?q=site:trafficonline.be+inurl:seo-&ei=Z0RnWqHED47UwQLs5bkQ&start=0&sa=N&filter=0&biw=1920&bih=960&num=100
You don't have to understand the language to see that these are almost identical pages, set up purely to rank well for localized terms (city names). Each page has the exact same content but swaps in a few variables so the text isn't literally identical: nearby city names, a Google Map embed, and even the population of each city (as if that's relevant information for the user). The content itself is really thin and the same for all cities.
The crazy thing is this site ranks well for some city names in combination with their keywords, even though it's very clearly using black-hat SEO tactics (doorway pages) to manipulate rankings for localized search terms. I would think websites that so blatantly violate the Google Guidelines would be completely removed from the search index, but that definitely isn't the case here.
Any thoughts as to why sites like this aren't removed for violating Google's terms and conditions? Or how I can keep telling our clients they can't use black-hat tactics because Google might remove them from the index, when the chance of such a removal appears to be almost non-existent?
Thanks in advance,
Kind regards -
Some great ideas: Content Creation Strategy for Businesses with Multiple Location Pages
-
Yeah, it seems like the most logical answer is that each location page needs unique content developed for it, even though it still feels a little forced.
Goes to show you that Google has really pushed SEO firms to think differently about content; when you have to do something just for SEO purposes, it now feels icky.
Yes, creating unique content for each location's page can be seen as useful to users, but it feels icky because the user would probably be satisfied with the core content. We're creating unique location-specific content mostly to please Google... not the user.
For example, what if Walmart came to this same conclusion? Wouldn't it be a little forced if Walmart developed pages for every location with that location's weather, facts about the city, etc.?
Thanks to its brand, it's able to get away with the thin-content version of location pages: http://www.walmart.com/store/2300/details - they don't even use the markup... but any SEO knows you can't really follow what's working for a giant brand like Walmart.
-
In response to the extra landing pages, the key thing for our business, following on from the above comments, is to remember that fresh and unique content is best.
We have spent a lot of money building extra pages on our own websites as well as our clients', and what we do is have a plan. For example, if we have 30 pages to add, we spread them over a period of weeks or months rather than publishing them all at once. We do everything in a natural, organic manner.
Hope this helps - it's our first post!
-
Welcome to my hell! I have 18 locations. I think it's best practice to have a location page for each location with 100% original content. And plenty of it. Yes, it seems redundant to talk about plumbing in Amherst, and plumbing in Westfield, and plumbing in...wherever. Do your best and make the content valuable original content that users will find helpful. A little local flair goes a long way with potential customers too and also makes it pretty clear you're not spinning the same article. That said, with Google Local bulk spreadsheet uploads, according to the people I've spoken with at Google, your business description can be word for word the same between locations and it won't hurt your rank in the maps/local packs one bit. Hope this helps!
-
These do appear to be contradictory guidelines until you understand what Google is trying to avoid here. Historically, SEOs have tried to rank businesses for geo-specific searches in areas other than where a business is located.
Let's say you run a gardening shop in Atlanta and you have an ecommerce side of the business online. Yes, you want to get walk-in traffic from the metro Atlanta area, but you also want to sell products online to customers all over the country. Ten years ago, you might have set up 50 or so pages on your site with the exact same content, with just the city and state switched out. That way you could target keywords like the following:
- gardening supplies in Nashville, TN
- gardening supplies in Houston, TX
- gardening supplies in Seattle, WA
- gardening supplies in San Francisco, CA
- and so on...
That worked well 10 years ago, but the Panda update put a stop to that kind of nonsense. Google understands that someone searching for "gardening supplies in Nashville, TN" is looking for a brick and mortar location in Nashville and not an ecommerce store.
If you have locations in each of those cities, you have a legitimate reason to target the above search queries. On the other hand, you don't want to incur the wrath of Google with duplicate content on your landing pages. That's why the best solution is to create unique content that will appeal to users in that location. Yes, this requires time and possibly money to implement, but it's worth it when customers are streaming through the door at each location.
Check out Bright Local's recent InsideLocal Webinar: Powerful Content Creation Ideas for Local Businesses. They discussed several companies that are doing a great job with local landing page content.