Duplicate LocalBusiness Schema Markup (unsolved)
Hello! I've been having a hard time finding an answer to this specific question, so I figured I'd drop it here. I always add custom LocalBusiness markup to clients' homepages, but sometimes the client's website provider includes their own automated LocalBusiness markup. The markup I create often contains more information.
Assuming the website provider is unwilling to remove their markup, is it a bad idea to include my code as well? It seems like it could potentially be read as spammy by Google.
Do the pros of having more detailed markup outweigh that potential negative impact?
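For context, a minimal sketch of the kind of fuller custom markup described above. Every business detail is a placeholder, not a real client, and which properties are worth including depends on the business.

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Plumbing Co",
  "url": "https://www.example-plumbing.com/",
  "telephone": "+1-555-0123",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "456 Example St",
    "addressLocality": "Portland",
    "addressRegion": "OR",
    "postalCode": "97201",
    "addressCountry": "US"
  },
  "openingHoursSpecification": {
    "@type": "OpeningHoursSpecification",
    "dayOfWeek": ["Monday", "Tuesday", "Wednesday", "Thursday", "Friday"],
    "opens": "08:00",
    "closes": "17:00"
  },
  "sameAs": ["https://www.facebook.com/example-plumbing"]
}
</script>

If the provider's automated block stays on the page alongside a block like this, the practical risk is arguably less the duplication itself and more the two blocks disagreeing, for example on phone number or hours, which can undermine confidence in both.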
Related Questions
What is the importance of exact match keywords for local SEO in service industry businesses?
I am working with a local service contractor. Several of his competitors have domain names with exact-match keywords. Audits of competitor sites and other research tools reveal that those sites are behind on content and technical SEO, yet they consistently rank higher in organic search results. I am new to SEO, and I understand that some of my lack of clarity here comes from not understanding the value of keyword use in local SEO versus wider efforts.
Technical SEO | Andrew Woffenden
Best redirect destination for 18k highly-linked pages
Technical SEO question regarding redirects; I appreciate any insights on the best way to handle it.

Situation: We're decommissioning several major content sections on a website, comprising ~18k webpages. This is a well-established site (10+ years), and many of the pages within these sections have high-quality inbound links from .orgs and .edus.

Challenge: We're trying to determine the best place to redirect these 18k pages. For user experience, we believe the best option is the homepage, which has a statement about the changes to the site and links to the most important remaining sections. It's also the most important page on the site, so the boost from 301-redirected links doesn't seem bad. However, someone on our team is concerned that so many newly redirected pages and links pointing at the homepage will trigger a negative SEO flag for it, and recommends instead that they all go to our custom 404 page (which also links to the important remaining sections). What's the right approach to preserve the remaining SEO value of these soon-to-be-redirected pages without triggering Google penalties?
Technical SEO | davidvogel
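For illustration only, one way to implement section-level 301s on an Apache server, assuming hypothetically that the retired sections live under identifiable path prefixes. All paths below are placeholders, not the actual site's URLs.

# Hypothetical Apache (mod_alias) rules: send each retired section to the
# most closely related surviving page rather than one catch-all target.
RedirectMatch 301 ^/retired-section-a/ /resources/
RedirectMatch 301 ^/retired-section-b/ /guides/
# Fallback for anything that has no better match.
RedirectMatch 301 ^/retired-section-c/ /

Whatever destination is chosen, pattern rules like these keep the redirect table manageable at 18k URLs, compared with listing every page individually.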
Good to use disallow or noindex for these?
Hello everyone, I am reaching out for your expert advice on a few technical SEO aspects of my website.
Below are the specific areas I would like to discuss:

a. Double and triple filter pages: I have identified certain URLs on my website that have a canonical tag pointing to the main /quick-ship page. These URLs are as follows:
https://www.interiorsecrets.com.au/collections/lounge-chairs/quick-ship+black
https://www.interiorsecrets.com.au/collections/lounge-chairs/quick-ship+black+fabric

Given the need to optimize my crawl budget, would it be advisable to disallow or noindex these pages? My understanding is that by disallowing or noindexing these URLs, search engines can avoid wasting resources crawling and indexing duplicate or filtered content.

b. Page URLs with parameters: Some of my page URLs include parameters such as ?variant and ?limit. Although these URLs already have canonical tags in place, is it still recommended to disallow or noindex them to further conserve crawl budget? My understanding is that doing so prevents search engines from spending resources on redundant variations of the same content.

Additionally, I would welcome any suggestions on internal linking strategies tailored to my website's structure and content. Thank you in advance for your time and expertise. Cheers!
Technical SEO | williamhuynh
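As a sketch of the two mechanisms weighed above (the patterns are assumptions and would need checking against the site's real URL structure), a robots.txt Disallow stops crawling but does not guarantee de-indexing, and it also stops Google from ever seeing the canonical tags on those pages:

# robots.txt sketch: block crawling of filtered collection URLs.
# Google supports * wildcards; treat these patterns as illustrative only.
User-agent: *
Disallow: /collections/*/quick-ship+*
Disallow: /*?variant=
Disallow: /*?limit=

The noindex alternative is a per-page tag such as <meta name="robots" content="noindex, follow">, which only works if the page stays crawlable so the directive can actually be read; combining it with a Disallow defeats it.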
Should I avoid duplicate URL keywords?
I'm curious to know: can having a keyword repeat in the URL cause any penalties? For example:
xyzroofing.com
xyzroofing.com/commercial-roofing
xyzroofing.com/roofing-repairs
My competitors with the highest rankings seem to be doing it without any trouble, but I'm wondering if there is a better way. Also, one of the problems I've noticed is that my /commercial-roofing page outranks my homepage for both residential and commercial search queries. How can this be straightened out?
Local Website Optimization | Lyontups
Looking for live web examples of Medical schema
Has anyone seen a hospital system or medical clinic properly employ schema markup on their site? This seems like very new territory, and we want to do right by our client. Are there any best practices I need to look out for?
Web Design | Madgenius
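No live example to point to here, but as a reference sketch, schema.org has a MedicalClinic type (with Hospital and Physician as related types) that can be marked up much like any LocalBusiness. Every value below is a placeholder.

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "MedicalClinic",
  "name": "Example Family Clinic",
  "url": "https://www.example-clinic.com/",
  "telephone": "+1-555-0100",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Example Ave",
    "addressLocality": "Springfield",
    "addressRegion": "IL",
    "postalCode": "62701",
    "addressCountry": "US"
  },
  "medicalSpecialty": "PrimaryCare"
}
</script>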
How does Google read multiple Geo Shape Schema Mark Up?
Hi guys, I posted a question recently asking "Can I have multiple areaServed markups on one domain?" and the responses I got said no. My client works predominantly in specific towns in the South East of England, so I wanted to list all the areas they service. After being told no, I went ahead anyway and put multiple areaServed markups on the page to see whether it generates any errors, and it doesn't when I run it through the Structured Data Testing Tool, so hurray! But what I want to understand (and can't find the answer to anywhere) is whether this is okay, and how Google will read my markup. Will Google see that we are in multiple areas across the South East of England and push my content up ahead of other sites, or is this just going to confuse Google? By putting all these areas into the website as multiple locations, will Google identify that person X in area Y fits the areaServed markup I've added and push my content to them? Overall, has anyone else used multiple areaServed markup, and can you validate that this works?
Local Website Optimization | Virginia-Girtz
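For what it's worth, schema.org properties generally accept multiple values, so areaServed can be an array inside a single block rather than repeated markup. A sketch with placeholder business details and towns:

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Trades Ltd",
  "url": "https://www.example-trades.co.uk/",
  "areaServed": [
    { "@type": "City", "name": "Guildford" },
    { "@type": "City", "name": "Woking" },
    { "@type": "City", "name": "Reigate" }
  ]
}
</script>

Validating cleanly only means the syntax is legal; it does not by itself indicate whether the listed areas influence rankings.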
Local Business Schema Image requirement
Hello, I work exclusively with dentists, and we have been putting our JSON-LD schema in the footer for a while now. Just recently, 'image' became a required field for the Dentist category. We already use the logo in our schema, and that is an image. Since the schema is in the footer it appears on every page, and the only image on every page is the logo. Does the image we add to our schema need to be on the actual web page, or could it be anything related to the business, like a photo of the practice or the dentist? Would it hurt to have the logo listed twice in the schema, once as the logo and once as the image? I'm trying to figure out the best thing to do for the required 'image' field for a dentist. Thanks! Angela
Local Website Optimization | tntdental
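A sketch of how both fields can coexist: schema.org's image property accepts an array, so a practice photo can sit alongside the logo. All URLs are placeholders, and whether Google's rich-result check prefers a page-relevant photo is not something this sketch settles.

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Dentist",
  "name": "Example Dental Practice",
  "url": "https://www.exampledental.com/",
  "logo": "https://www.exampledental.com/images/logo.png",
  "image": [
    "https://www.exampledental.com/images/practice-exterior.jpg",
    "https://www.exampledental.com/images/logo.png"
  ]
}
</script>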
How to Handle Franchise Duplicate Content
My agency handles digital marketing for about 80 Window World stores, each with a separate site. For the most part, the content across all of these sites is exactly the same, though we have slowly but surely been getting new, unique content up on some of the top pages over the past year, including resource pages and specific product pages.

I'm trying to figure out the best temporary solution while we work through this process. Previously, we tried to keep the pages we knew were duplicates from indexing, but some pages still slipped through the cracks during redesigns. Would canonicals be the route to go? (Keep in mind that there isn't necessarily one "original version," so there isn't a clear answer as to which page/site all the duplicated pages should point to.) Should we just continue to use robots.txt/noindex for all duplicate pages for now? Any other recommendations? Thanks in advance!
Local Website Optimization | TriMarkDigital
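As a sketch of the two interim mechanisms mentioned in the question (URLs are placeholders, and which store counts as the "primary" version is exactly the unresolved part):

<!-- Option A: cross-domain canonical on each duplicate page, pointing at
     whichever store's copy is designated primary. -->
<link rel="canonical" href="https://www.example-primary-store.com/windows/double-hung/">

<!-- Option B: keep duplicate pages crawlable but out of the index until
     unique content replaces them. -->
<meta name="robots" content="noindex, follow">

Note that a page blocked in robots.txt cannot serve the noindex tag, so combining a robots.txt Disallow with a noindex on the same page works against itself.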