Multiple Local Schemas Per Page
-
I am working on a mid-size restaurant group's site. The new site (in development) has a drop-down listing each of the locations, and when you hover over a location in the drop-down it shows that business's info (NAP). Each of the locations in the nav list uses schema.org markup.
I think this could be confusing for search robots: every page ends up with 15 address schemas, and the individual restaurant page's own NAP sits below all of the other locations' schema/NAP in the DOM.
Have any of you dealt with multiple schemas per page or a similar structure?
-
I help run a directory site, and we have city pages with multiple listings. We don't mark up those pages; instead, we have a landing page for each location, and the landing page is what we mark up.
If you look at sites like Yelp, they do the same thing.
On the Dallas city page there is no schema markup, but if you visit an individual restaurant's page, such as http://www.yelp.com/biz/eddie-vs-prime-seafood-dallas, the schema markup shows up.
I was looking through the schema.org documentation (http://schema.org/docs/gs.html):
> Using the url property. Some web pages are about a specific item. For example, you may have a web page about a single person, which you could mark up using the Person item type. Other pages have a collection of items described on them. For example, your company site could have a page listing employees, with a link to a profile page for each person. For pages like this with a collection of items, you should mark up each item separately (in this case as a series of Persons) and add the url property to the link to the corresponding page for each item, like this:
>
> `<a href="alice.html" itemprop="url">Alice Jones</a>` `<a href="bob.html" itemprop="url">Bob Smith</a>`
So, in the schema.org example, you can mark up each location with its item type and then use the url property to point to the actual page that has the full information.
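A minimal sketch of how that could look for the locations drop-down in your nav (the restaurant names and URL paths below are placeholders, not your actual markup):

```html
<!-- Hypothetical nav entries; names and URLs are invented for illustration -->
<ul>
  <li itemscope itemtype="http://schema.org/Restaurant">
    <a itemprop="url" href="/locations/downtown-dallas/">
      <span itemprop="name">Example Bistro Downtown</span>
    </a>
  </li>
  <li itemscope itemtype="http://schema.org/Restaurant">
    <a itemprop="url" href="/locations/uptown/">
      <span itemprop="name">Example Bistro Uptown</span>
    </a>
  </li>
</ul>
<!-- The full NAP markup would then live only on each location's own landing page -->
```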
I then looked at CitySearch and saw an example of this:
http://dallas.citysearch.com/find/section/dallas/restaurants.html
and
http://dallas.citysearch.com/profile/34327220/lewisville_tx/mama_s_daughters_diner.html
If you look at the code on those Dallas restaurant list pages, they use ItemList markup:
`itemscope itemtype="http://schema.org/ItemList"`
That is a list of breakfast restaurants in Dallas, and for each place on that page they mark up
`itemtype="http://schema.org/LocalBusiness" itemprop="itemListElement"`
and
`itemscope itemtype="http://schema.org/AggregateRating" itemprop="aggregateRating"`
and then they reference the URL to the location page (as suggested in the schema.org documentation above).
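Putting those pieces together, a rough sketch of that pattern might look something like this (a reconstruction for illustration only; the business names, ratings, and URLs are placeholders, not CitySearch's actual code):

```html
<!-- Illustrative reconstruction of the ItemList pattern; all values are placeholders -->
<div itemscope itemtype="http://schema.org/ItemList">
  <h2 itemprop="name">Breakfast Restaurants in Dallas</h2>

  <div itemprop="itemListElement" itemscope itemtype="http://schema.org/LocalBusiness">
    <a itemprop="url" href="/profile/example-diner-dallas.html">
      <span itemprop="name">Example Diner</span>
    </a>
    <div itemprop="aggregateRating" itemscope itemtype="http://schema.org/AggregateRating">
      <span itemprop="ratingValue">4.5</span> stars from
      <span itemprop="reviewCount">120</span> reviews
    </div>
  </div>

  <div itemprop="itemListElement" itemscope itemtype="http://schema.org/LocalBusiness">
    <a itemprop="url" href="/profile/example-cafe-dallas.html">
      <span itemprop="name">Example Cafe</span>
    </a>
  </div>
</div>
```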
Check with your developer, but it looks like if you define the list of locations first, the spiders can see that all of the locations are part of that list (versus the page being dedicated to a single location). Then, on the link to each location's landing page (for example, http://dallas.citysearch.com/profile/41040743/dallas_tx/breadwinners_cafe_bakery.html), you can do the full markup.
Good luck!