Location Pages and Duplicate Content and Doorway Pages, Oh My!
-
Google has this page on location pages. It's very useful, but it doesn't say anything about handling the duplicate content a location page might have, seeing as the locations may offer very similar services.
Let's say they have example.com/location/boston, example.com/location/chicago, or maybe boston.example.com, chicago.example.com, etc.
These are landing pages for each location, housing that location's contact information and showing the same services/products as every other location. This information may also live on the main domain's homepage or services page.
My initial reaction agrees with this article: http://moz.com/blog/local-landing-pages-guide - but what I'm really asking is: what does Google expect? Does this location pages guide from Google mean we don't really have to make each of those location pages unique? Sometimes creating "unique" location pages feels like you're creating **doorway pages** - "Multiple pages on your site with similar content designed to rank for specific queries like city or state names".
In a nutshell, Google's Guidelines seem to have a conflict on this topic:
Location Pages: "Have each location's or branch's information accessible on separate webpages"
Doorway Pages: "Multiple pages on your site with similar content designed to rank for specific queries like city or state names"
Duplicate Content: "If you have many pages that are similar, consider expanding each page or consolidating the pages into one."
Now, you could avoid making it a doorway page or a duplicate-content page if you just put the location information on each page. Each page would then have a unique address, phone number, email, contact name, etc. But then the page would technically be in violation of this guideline:
Thin Pages: "One of the most important steps in improving your site's ranking in Google search results is to ensure that it contains plenty of rich information that includes relevant keywords, used appropriately, that indicate the subject matter of your content."
...starting to feel like I'm in a Google Guidelines Paradox!
Do you think this guide from Google means that duplicate content on these pages is acceptable as long as you use that markup? Or do you have another opinion?
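For context, here's roughly the kind of markup I mean - a minimal sketch of a LocalBusiness snippet for a hypothetical Boston page (every business detail below is a placeholder, not a real implementation):

```html
<!-- Hypothetical example for example.com/location/boston; all values are placeholders -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Co. - Boston",
  "url": "https://example.com/location/boston",
  "telephone": "+1-617-555-0100",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Example St",
    "addressLocality": "Boston",
    "addressRegion": "MA",
    "postalCode": "02110",
    "addressCountry": "US"
  },
  "openingHours": "Mo-Fr 09:00-17:00"
}
</script>
```

Each location page would carry its own version of that snippet with its own name, address, and phone details, but the visible services copy would still be nearly identical from page to page - which is exactly the conflict I'm asking about.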
-
Thanks for the comment, Laura!
I was aware that duplicate content wasn't the issue, but it baffled me that such an obvious black-hat tactic isn't punished by Google in any way, even though their guidelines clearly state that doorway pages are a big "no-no".
Let's hope the December 2017 update has a noticeable impact.
Have a nice day!
-
The Panda filter is just that, a filter. It doesn't remove pages from the index, and you won't get a manual penalty because of it.
In the case of duplicate content, Google chooses the most relevant or original version and filters out the duplicates. On the other hand, when a website has many pages with the same content, that can drag down the perceived quality of the entire site, which can hurt search performance as well.
Then there's the issue of doorway pages, which are duplicate pages created for the purpose of funneling visitors to the same destination. This goes against Google's guidelines, and they confirmed a December 2017 algorithm update that affects sites using doorway pages.
-
Hi Laura,
It seems like this age-old black-hat tactic still works though. Maybe only outside of the US? Check out this SERP: https://www.google.be/search?q=site:trafficonline.be+inurl:seo-&ei=Z0RnWqHED47UwQLs5bkQ&start=0&sa=N&filter=0&biw=1920&bih=960&num=100
You don't have to understand the language to see that these are almost identical pages, set up purely to rank well for localized terms (city names). Each page has the exact same content but uses a few variables so the text isn't literally identical: nearby city names, a Google Maps embed, and even the number of people living in the city (as if that were relevant information for the user). The content itself is really thin and the same for all cities.
The crazy thing is this site ranks well for some city names in combination with their keywords, even though it's very clearly using black-hat SEO tactics (doorway pages) to manipulate rankings for localized search terms. I would think websites that so blatantly violate the Google Guidelines would be completely removed from the search index, but that definitely isn't the case here.
Any thoughts as to why sites like this aren't removed for violating Google's terms and conditions? Or how I can keep telling our clients they can't use black-hat tactics because Google might remove them from the index, when it appears the chance of such a removal is almost non-existent?
Thanks in advance,
Kind regards -
Some great ideas: Content Creation Strategy for Businesses with Multiple Location Pages
-
Yeah, it seems like the most logical answer is that each location page needs unique content developed for it, even though it still kinda feels forced.
Goes to show that Google has really pushed SEO firms to think differently about content - when you have to do something just for SEO purposes, it now feels icky.
Yes, creating unique content for each location's page can be seen as useful to users, but it feels off because the user would probably be satisfied with the core content. We're creating unique, location-specific content mostly to please Google... not the user.
For example, what if Walmart came to this same conclusion? Wouldn't it be a little forced if Walmart developed pages for every location with that location's weather, facts about the city, etc.?
Due to its brand, it gets away with the thin-content version of location pages: http://www.walmart.com/store/2300/details - they don't even use the markup... but any SEO knows you can't really follow what works for a giant brand like Walmart.
-
Regarding the extra landing pages, and following on from the comments above, the key thing for our business is to remember that fresh and unique content is best.
We have spent a lot of money building extra pages on our own websites as well as clients', and what we do is have a plan. For example, if we have 30 pages to add, we spread this over a period of weeks or months rather than bashing them all out together. We do everything in a natural, organic manner.
Hope this helps, it is our first post!
-
Welcome to my hell! I have 18 locations. I think it's best practice to have a location page for each location with 100% original content - and plenty of it. Yes, it seems redundant to talk about plumbing in Amherst, and plumbing in Westfield, and plumbing in... wherever. Do your best and make the content valuable, original material that users will find helpful. A little local flair goes a long way with potential customers and also makes it pretty clear you're not spinning the same article. That said, with Google Local bulk spreadsheet uploads, according to the people I've spoken with at Google, your business description can be word-for-word the same between locations and it won't hurt your rank in the maps/local packs one bit. Hope this helps!
-
These do appear to be contradictory guidelines until you understand what Google is trying to avoid here. Historically, SEOs have tried to rank businesses for geo-specific searches in areas other than where a business is located.
Let's say you run a gardening shop in Atlanta and you also have an ecommerce side of the business online. Yes, you want walk-in traffic from the metro Atlanta area, but you also want to sell products online to customers all over the country. Ten years ago, you might have set up 50 or so pages on your site with the exact same content, with only the city and state switched out. That way you could target keywords like the following:
- gardening supplies in Nashville, TN
- gardening supplies in Houston, TX
- gardening supplies in Seattle, WA
- gardening supplies in San Francisco, CA
- and so on...
That worked well 10 years ago, but the Panda update put a stop to that kind of nonsense. Google understands that someone searching for "gardening supplies in Nashville, TN" is looking for a brick and mortar location in Nashville and not an ecommerce store.
If you have locations in each of those cities, you have a legitimate reason to target the above search queries. On the other hand, you don't want to incur the wrath of Google with duplicate content on your landing pages. That's why the best solution is to create unique content that will appeal to users in that location. Yes, this requires time and possibly money to implement, but it's worth it when customers are streaming through the door at each location.
Check out Bright Local's recent InsideLocal Webinar: Powerful Content Creation Ideas for Local Businesses. They discussed several companies that are doing a great job with local landing page content.
Related Questions
-
Unsolved Duplicate LocalBusiness Schema Markup
Hello! I've been having a hard time finding an answer to this specific question, so I figured I'd drop it here. I always add custom LocalBusiness markup to clients' homepages, but sometimes the client's website provider will include their own automated LocalBusiness markup. The markup I create often includes more information. Assuming the website provider is unwilling to remove their markup, is it a bad idea to include my code as well? It seems like it could potentially be read as spammy by Google. Do the pros of having more detailed markup outweigh that potential negative impact?
Do you use hreflang tags when each localised page only exists in that one language?
Hi, I have two questions I am seeking an answer for. We have a home page in English (GB), and we also have products which are specifically served to the US. For these pages, where the phone number is American, the spelling is American, and the address is American, do we need to implement hreflang tags? The page isn't a version of another page in English; the page only exists in that one language. Secondly, is it recommended to create a second home page and then localise that page for US users? I'd be really grateful if anyone has any pointers, as Google's forum doesn't explain best practice for this case (as far as I can tell). Many thanks
Service Area Location Pages vs. User Experience
I'm familiar with the SAB best practices outlined here. Here's my issue: doing local landing pages as described here might not be ideal from a user experience point of view. Having a "Cities We Serve" or "Service Areas" link in the main navigation isn't necessarily valuable to the user when the city-specific landing pages are all places within a 15-mile radius of the SAB's headquarters. It would just look like the company did it for SEO. It wouldn't look natural. Seriously, it feels like best practices are totally at odds with user experience here. If I absolutely must create location pages for 10 or so municipalities within my client's service area, I'd rather NOT put the service areas as a primary navigation item. It is not useful to the user. Anyone who sees that the company provides services in the [name of city] metropolitan area will already understand that the company can service their town that is 5 miles away. It is self-evident. For example, **who would wonder whether a plumbing company with a Los Angeles address also services Beverly Hills?** It's just... silly. But the Moz guide says I've got to do those location pages! And that I've got to put them high up in the navigation! This is a problem because we've got to do local SEO, but we also have to provide an ideal experience. Thoughts?
Are core pages considered "cornerstones"?
To check that I understand the terminology, "cornerstone articles" are posts (or pages) that have some extensive, detailed, important information about a subject that other blog posts and articles can link to in reference, right? For example, a website for an auto repair shop might have a blog post about what cold weather does to a car's transmission and that post could link to a cornerstone "explainer" article that goes into more detail explaining to car-dummies like me what a transmission even DOES. But are core pages also in this category of cornerstone content? Or are they something entirely different and should be constructed accordingly? By "core pages", I mean the base-level pages about what your business is and does. For the repair shop example, I mean things like an "About Us" page or a "Services" page*. *or broken up into individual pages listing the services related to brakes, engine, wheels, etc. Thanks!
How to Handle Franchise Duplicate Content
My agency handles digital marketing for about 80 Window World stores, each with separate sites. For the most part, the content across all of these sites is the exact same, though we have slowly but surely been working through getting new, unique content up on some of the top pages over the past year. These pages include resource pages and specific product pages. I'm trying to figure out the best temporary solution as we go through this process. Previously, we have tried to keep the pages we knew were duplicates from indexing, but some pages have still managed to slip through the cracks during redesigns. Would canonicals be the route to go? (do keep in mind that there isn't necessarily one "original version," so there isn't a clear answer as to which page/site all the duplicated pages should point to) Should we just continue to use robots.txt/noindex for all duplicate pages for now? Any other recommendations? Thanks in advance!
Duplicate Schema within webpage
I'm implementing schema across a few Wordpress sites. Most (probably all) WP sites use widgets for their footer, which offer their own editable HTML. Is it damaging (or helpful) to implement the exact same markup in the footer and a specific page, like for instance, a locations page that has the address and contact info (which are also in the footer)?
Schema for same location on multiple sites - can this be done?
I'm looking to find more information on location/local schema. Are you able to implement schema for one location on multiple different sites? (i.e. - Multiple brands/websites (same parent company) - the brands share the same location and address). Also, is schema still important for local SEO? Thank you in advance for your help!
Are there any suggestions for when you completely redesign your website, keeping the same domain but changing the host? I want it to go smoothly and don't want to lose the rankings we already have, including sub pages.
I am currently having our website completely redone by a design company. Are there any suggestions on this process so we don't lose the rankings we currently have for our site? The domain will remain the same; however, we are planning on changing our host. We also have a good number of subdomains that the web company will not be changing for us.