Duplicate LocalBusiness Schema Markup
Hello! I've been having a hard time finding an answer to this specific question, so I figured I'd drop it here. I always add custom LocalBusiness markup to clients' homepages, but sometimes the client's website provider includes its own automated LocalBusiness markup. The markup I create often includes more information.
Assuming the website provider is unwilling to remove its markup, is it a bad idea to include my code as well? It seems like it could be read as spammy by Google.
Do the pros of having more detailed markup outweigh that potential negative impact?
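To make the scenario concrete, here is a minimal, hypothetical sketch of the kind of duplication being described: two LocalBusiness JSON-LD blocks on the same homepage for the same business, one thin block auto-generated by the website provider and one hand-written block with richer detail. The business name, URL, phone number, address, and social profiles are all invented for illustration.

```html
<!-- Hypothetical example: two LocalBusiness blocks describing the same business on one page. -->
<!-- Block 1: minimal markup auto-generated by the website provider (illustrative only). -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Plumbing Co",
  "telephone": "+1-555-555-0100"
}
</script>

<!-- Block 2: hand-written markup with richer detail (illustrative only). -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Plumbing Co",
  "url": "https://www.example.com/",
  "telephone": "+1-555-555-0100",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Main St",
    "addressLocality": "Springfield",
    "addressRegion": "IL",
    "postalCode": "62701",
    "addressCountry": "US"
  },
  "openingHours": "Mo-Fr 08:00-17:00",
  "sameAs": [
    "https://www.facebook.com/exampleplumbing",
    "https://www.linkedin.com/company/exampleplumbing"
  ]
}
</script>
```

If the provider's block really can't be removed, one mitigation sometimes suggested is giving both blocks the same "@id" so parsers can treat them as a single entity; whether Google actually reconciles two overlapping blocks that way is not something that can be guaranteed.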
Related Questions
New business / content marketing
Hi all SEO experts. Our website is brand new: it was published in the last 3 months on a new domain with a new design, after we rebranded on entering a new business partnership. There doesn't seem to be much guidance on this at all from the various SEO websites, so our question is: would you publish new blog posts / content marketing less frequently because the company website is brand new? Or does it not matter, and you would still post every week as you would if the website had been live for a long time? In a nutshell, what we are wondering is: is the "Google Sandbox" still in use?
Local SEO | | Ryan070 -
Schema Markup Validator vs. Rich Results Test
I am working on a schema markup project. When I test the schema code in the Schema Markup Validator, everything looks fine, no errors detected. However, when I test it in the Rich Results Test, a few errors come back. What is the difference between these two tests? Should I trust one over the other?
Intermediate & Advanced SEO | | Collegis_Education -
Looking for live web examples of Medical schema
Has anyone seen a hospital system or medical clinic properly employ schema markup on their site? This seems like very new territory, and we want to do right by our client. Are there any best practices I should look out for? (A rough, hypothetical sketch of one approach follows below.)
Web Design | | Madgenius3 -
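Since the question above asks what medical schema can look like in practice, here is a minimal, hypothetical sketch of MedicalClinic markup in JSON-LD. Every value (clinic name, URL, phone, address, specialty) is invented for illustration and is not taken from any live site.

```html
<!-- Hypothetical MedicalClinic markup; every value below is invented for illustration. -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "MedicalClinic",
  "name": "Example Family Health Clinic",
  "url": "https://www.example-clinic.com/",
  "telephone": "+1-555-555-0123",
  "medicalSpecialty": "PrimaryCare",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "200 Wellness Way",
    "addressLocality": "Springfield",
    "addressRegion": "IL",
    "postalCode": "62701",
    "addressCountry": "US"
  },
  "openingHours": "Mo-Fr 08:00-18:00"
}
</script>
```

Running the result through the Schema Markup Validator and the Rich Results Test, as discussed elsewhere on this page, is a sensible sanity check before rolling it out.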
Can I use Schema zip code markup that includes multiple zip codes but no actual address?
The company doesn't have physical locations but offers services in multiple cities and states across the US. We want to develop a better hyperlocal SEO strategy and implement schema, but the only address information available is zip codes, city names, and states. Can we omit the street address from the markup and add multiple zip codes instead? (One way this is sometimes structured is sketched below.)
Local Website Optimization | | hristina-m0 -
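For the service-area question above, here is one hedged sketch of how a business without a storefront is sometimes marked up: a LocalBusiness with no streetAddress, using areaServed to list cities, states, and zip codes. Whether Google will surface rich results without a full postal address is a separate question; the business name, URL, phone number, cities, and zip codes below are all invented.

```html
<!-- Hypothetical service-area business markup; names, URL, and zip codes are invented. -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Mobile Services",
  "url": "https://www.example.com/",
  "telephone": "+1-555-555-0199",
  "areaServed": [
    { "@type": "City", "name": "Springfield", "containedInPlace": { "@type": "State", "name": "Illinois" } },
    { "@type": "City", "name": "Columbus", "containedInPlace": { "@type": "State", "name": "Ohio" } },
    "62701",
    "62702",
    "43004"
  ]
}
</script>
```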
How to Handle Franchise Duplicate Content
My agency handles digital marketing for about 80 Window World stores, each with separate sites. For the most part, the content across all of these sites is exactly the same, though over the past year we have slowly but surely been getting new, unique content up on some of the top pages. These pages include resource pages and specific product pages. I'm trying to figure out the best temporary solution as we work through this process. Previously, we tried to keep the pages we knew were duplicates from being indexed, but some pages have still slipped through the cracks during redesigns. Would canonicals be the route to go? (Do keep in mind that there isn't necessarily one "original version," so there isn't a clear answer as to which page/site all the duplicated pages should point to.) Should we just continue to use robots.txt/noindex for all duplicate pages for now? (Both options are sketched below.) Any other recommendations? Thanks in advance!
Local Website Optimization | | TriMarkDigital0 -
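Both interim options mentioned in the question above are sketched here. The domains and paths are placeholders; note that a cross-site canonical still requires picking one version to consolidate to, which the poster says isn't clear-cut, so noindex may remain the simpler interim signal until unique content is in place.

```html
<!-- Hypothetical snippets; store-a.example and the path are placeholders. -->

<!-- Option 1: point a duplicate page at whichever version is chosen as canonical. -->
<link rel="canonical" href="https://store-a.example/windows/double-hung/" />

<!-- Option 2: keep the duplicate page out of the index while it awaits unique content. -->
<meta name="robots" content="noindex, follow" />
```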
Location Pages and Duplicate Content and Doorway Pages, Oh My!
Google has this page on location pages. It's very useful, but it doesn't say anything about handling the duplicate content a location page might have, seeing as the locations may offer very similar services. Let's say they have example.com/location/boston, example.com/location/chicago, or maybe boston.example.com, chicago.example.com, etc. They are landing pages for each location, housing that location's contact information and showing the same services/products as every other location. This information may also live on the main domain's homepage or services page as well. My initial reaction agrees with this article: http://moz.com/blog/local-landing-pages-guide - but I'm really asking: what does Google expect? Does this location pages guide from Google tell us we don't really have to make sure each of those location pages is unique? Sometimes creating "unique" location pages feels like you're creating doorway pages: "Multiple pages on your site with similar content designed to rank for specific queries like city or state names". In a nutshell, Google's guidelines seem to conflict on this topic:
Location Pages: "Have each location's or branch's information accessible on separate webpages"
Doorway Pages: "Multiple pages on your site with similar content designed to rank for specific queries like city or state names"
Duplicate Content: "If you have many pages that are similar, consider expanding each page or consolidating the pages into one."
Now you could avoid making it a doorway page or a duplicate content page if you just put the location information on a page. Each page would then have a unique address, phone number, email, contact name, etc. But then the page would technically be in violation of this guideline:
Thin Pages: "One of the most important steps in improving your site's ranking in Google search results is to ensure that it contains plenty of rich information that includes relevant keywords, used appropriately, that indicate the subject matter of your content."
...starting to feel like I'm in a Google Guidelines paradox! Do you think this guide from Google means that duplicate content on these pages is acceptable as long as you use that markup? Or do you have another opinion?
Local Website Optimization | | eyeflow -
Duplicate content question for multiple sites under one brand
I would like to get some opinions on the best way to handle duplicate / similar content that is on our company website and our local facility-level sites. Our company website is our flagship site that contains all of our service offerings, and we use it to compete nationally in our SEO efforts. We then have around 100 localized facility-level sites for the different locations we operate, which we use to rank for local SEO. There is enough of a difference between these locations that it was decided (long ago, before me) that there would be a separate website for each. There is, however, much duplicate content across all these sites because the service offerings are roughly the same. Every website has its own unique domain name, but I believe they are all on the same C-block. I'm thinking of going with one of two options and wanted to get some opinions on which would be best.
1 - Keep the services content identical across the company website and all facility sites, and use the rel=canonical tag on all the facility sites to reference the company website. My only concern here is whether this would drastically hurt local SEO for the facility sites.
2 - Create two unique sets of services content. Use one set on the company website and the second set on the facility sites, and either live with the duplicate content or try to sprinkle in enough local geographic content to create some differentiation between the facility sites.
Or if there are other suggestions on a better way to handle this, I would love to hear them as well. Thanks!
Local Website Optimization | | KHCreative0 -
Local Business Schema Markup on every page?
Hello, I have two questions. If someone could shed some light on the topic, I would be very grateful!
1. I am still making my way through how schema is employed, and as far as I can tell it is much more specific (and therefore relevant) in its details than using the data highlighter tool. Is this true?
2. Most of my clients' sites have a footer with the local business info (address and phone) included on every page. That said, I have been using the structured data markup helper to add local business schema to the home page, and then including the footer markup in the footer file so that every page benefits from the local business markup. Is it incorrect to use it on every page? I also noticed that by relying on just the footer markup for the rest of the pages in the site, I am missing data that was included when I manually went through the index page (i.e. image, URL, name of business). Is it advisable and worth it to manually mark up every page with the local business schema, or should it just be used for certain pages such as location, contact us, and/or the index? (One common footer pattern is sketched below.) Any tips or help would be greatly appreciated! Thanks
Local Website Optimization | | lfrazer0
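On the footer question above, one pattern sometimes used is to emit the same complete LocalBusiness block sitewide (including image, url, and name, not just address and phone) and give it a stable "@id" so every page describes the same entity, rather than keeping the fuller data only on the index page. This is a hypothetical sketch, not a statement of what Google requires; the business details, URLs, and the "@id" value are all invented.

```html
<!-- Hypothetical sitewide footer include; the "@id", names, and URLs are invented. -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "@id": "https://www.example.com/#localbusiness",
  "name": "Example Dental Studio",
  "url": "https://www.example.com/",
  "image": "https://www.example.com/images/storefront.jpg",
  "telephone": "+1-555-555-0147",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "45 Elm Street",
    "addressLocality": "Springfield",
    "addressRegion": "IL",
    "postalCode": "62701",
    "addressCountry": "US"
  }
}
</script>
```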