What is the SEO effect of schema subtype deprecation? Do I really have to update the subtype if there isn't a suitable alternative?
-
Could someone please elaborate on the SEO effect of schema subtype deprecation? Does it even matter?
The "Local business properties" section of developers.google.com says to:

Define each local business location as a LocalBusiness type. Use the most specific LocalBusiness sub-type possible; for example, Restaurant, DaySpa, HealthClub, and so on.

Unfortunately, the ProfessionalService page of schema.org states that ProfessionalService has been deprecated, and many of my clients don't fit anywhere else (or if they do, it isn't a LocalBusiness subtype).
I find it inconvenient to have to modify my various clients' JSON-LD from LocalBusiness to ProfessionalService and then back to LocalBusiness. This doesn't happen every day, but how does one keep up with it all?
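For context, this is the kind of markup being changed each time a type is deprecated. A minimal JSON-LD sketch using a specific LocalBusiness subtype, per Google's guidance (the business name, address, and URL are placeholders, not from any real client):

```json
{
  "@context": "https://schema.org",
  "@type": "Restaurant",
  "name": "Example Bistro",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Main St",
    "addressLocality": "Springfield",
    "addressRegion": "IL",
    "postalCode": "62701"
  },
  "telephone": "+1-555-555-0123",
  "url": "https://www.example.com"
}
```

When a subtype like ProfessionalService is deprecated, only the "@type" value needs to change, but that still means touching every affected client's markup.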
I'm really trying to take advantage of the numerous types, attributes, etc., in structured data, but I feel the more I implement, the harder it will be to update later (true of many things, of course).
I do feel this is important and that a better workflow could be the answer.
If you have a workflow that works for you, please share it. If you think it's not important, tell us why not (i.e., why Google's guidance is wrong here). I understand there is always a better use of our time, but I'd like to limit the discussion to solving this Google/Schema.org deprecation issue specifically.
-
See the comments here from danbri (Dan Brickley, the Schema.org webmaster); the last one in particular should address your exact situation. You may not need to make any changes at all. Check whether an update is actually required and whether it is worth the time and effort.
https://github.com/schemaorg/schemaorg/issues/1109#issuecomment-212480234
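One pattern that gets suggested in cases like this (treat it as a sketch, not an official Google requirement, and note that Google's documentation does not promise any ranking benefit from additionalType) is to keep the generic, non-deprecated LocalBusiness type and use the schema.org additionalType property to record the more specific classification:

```json
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "additionalType": "https://schema.org/ProfessionalService",
  "name": "Example Consulting",
  "url": "https://www.example.com"
}
```

This keeps the markup valid against a stable supertype, so a future deprecation only affects the additionalType value rather than the primary "@type".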