Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies. More details here.
Best schema option for condos / condominiums?
-
Hey guys, I'm reviewing the schema markup on some of our sites. Most of them use the generic LocalBusiness type.
There are a few more specific schema types I could use, but I'm not sure which would be the most relevant. Do any of you have suggestions or ideas?
https://schema.org/LodgingBusiness
https://schema.org/ApartmentComplex
or I could just stick with LocalBusiness.
I'm leaning towards LodgingBusiness or ApartmentComplex... but when I think of LodgingBusiness, I think of something temporary / vacation-type deals like hotels. Apartments are kind of self-explanatory; a condominium isn't exactly an apartment, but it's probably more comparable to an apartment than to a hotel, motel, or inn. What are your thoughts on this?
Also, which "format" is better to use: RDFa, microdata, or JSON-LD? Does it matter?
-
Excellent. I have to say, the number of schema types they offer is bewildering. Glad to hear you're sorted.
-
Thanks, I actually found GatedResidenceCommunity, which is perfect for them.
I went with JSON-LD this time; in the past I used microdata and will probably never do that again, lol. JSON-LD was a lot easier and quicker. Oh wow, thanks for the link - I'll probably put that to use in the future!
Thanks for the reply!
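For anyone reading this later: a minimal JSON-LD block for that type, dropped into the page's &lt;head&gt; (or body), might look something like the sketch below. The business name, URL, phone number, and address are placeholders, not a real listing - swap in your own details and validate with a structured data testing tool.

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "GatedResidenceCommunity",
  "name": "Example Condominiums",
  "url": "https://www.example.com/",
  "telephone": "+1-555-555-0100",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Example Drive",
    "addressLocality": "Anytown",
    "addressRegion": "CA",
    "postalCode": "90210",
    "addressCountry": "US"
  }
}
</script>
```

GatedResidenceCommunity is a subtype of Residence, so it inherits the general Place properties (address, telephone, geo, etc.) used here.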
-
I think I'd go with either Residence or ApartmentComplex and pick whichever best suits from there - a quick glance suggests both could work (though you'll know more about that than I do).
I'd go with JSON-LD (and have done) all the way: it's a lightweight, flexible, and easy way to add metadata to sites. It's robust and forgiving, so it's unlikely to break anything. It plays well with Google and opens up a lot of options for you.
There's also a nifty looking JSON-LD generator, here: https://www.schemaapp.com/tools/jsonld-schema-generator/
All the best with the project!
Related Questions
-
URL keyword separator best practice
Hello. Wanted to reach out and see what the consensus is re: keyword separators. I've just taken on a new client and all their URLs are structured like /buybbqpacks rather than /buy-bbq-packs. My understanding is that it comes down to readability, which influences click-through, rather than search impact on the keyword. So we usually advise a hyphen, but the guy's going to have to change a lot of pages and set up redirects to change it all - not sure if it's worth it? Thanks! Stu
On-Page Optimization | bloomletsgrow
Updating Old Content at Scale - Any Danger from a Google Penalty/Spam Perspective?
We've read a lot about the power of updating old content (making it more relevant for today, finding other ways to add value to it) and republishing (Here I mean changing the publish date from the original publish date to today's date - not publishing on other sites). I'm wondering if there is any danger of doing this at scale (designating a few months out of the year where we don't publish brand-new content but instead focus on taking our old blog posts, updating them, and changing the publish date - ~15 posts/month). We have a huge archive of old posts we believe we can add value to and publish anew to benefit our community/organic traffic visitors. It seems like we could add a lot of value to readers by doing this, but I'm a little worried this might somehow be seen by Google as manipulative/spammy/something that could otherwise get us in trouble. Does anyone have experience doing this or have thoughts on whether this might somehow be dangerous to do? Thanks Moz community!
On-Page Optimization | paulz999
Random /feed 404 error from a wordpress site
My Moz Analytics report shows a 404 error on a page which I think should not exist at all. The URL is http://henryplumbingco.com/portfolio-item/butler-elementary/feed/. When I checked webmaster tools, it looks like there are a number of random /feed urls throwing 404 errors. I am using WordPress and the Enfold theme. Anyone know how to get rid of these errors? Thanks,
On-Page Optimization | aj613
Schema description wordcount guidelines ?
Hi, is there a word count guideline for the description field in Raven's schema creator? According to their page on event schema, an excerpt from the page will show up as a short description, but then their tool has a field for adding a description! I was just adding some edited copy from the page into this, but if it already pulls in an excerpt, is there any need? I take it it's a good idea for better control over what's displayed in the rich snippet; if so, what's the suggested word count limit? Cheers, Dan
On-Page Optimization | Dan-Lawrence
What is the best way to execute a geo redirect?
Based on what I've read, it seems like everyone agrees an IP-based, server-side redirect is fine for SEO if you have content that is "geo" in nature. What I don't understand is how to actually do this. After a bit of research, there seem to be 3 options: You can do a 301, which it seems like most sites do, but that basically means if Google crawls you from different US areas (which it may or may not) it essentially thinks you have multiple homepages. Does Google only crawl from SF-based IPs? A 302 passes no juice, so you probably don't want to do that. Yelp does a 303 redirect, which it seems like nobody else does, but Yelp is obviously very SEO-savvy. Is that perhaps a better way that solves the above issues? Thoughts on the best approach here?
On-Page Optimization | jcgoodrich
Blog.mysite.com or mysite.com/blog?
Hi, I'm just curious what the majority think is the best way to start a blog on your website for SEO benefits. Is it better to have it under a subdomain or a directory? Or does it even matter?
On-Page Optimization | truckguy77
Best practice for Meta-Robots tag in categories and author pages?
For some of our sites we use WordPress, which we really like working with. The question I have is about the category and author pages (and similar pages), i.e. ones like http://www.domain.com/authors/. Should you or should you not use "follow, noindex" for meta robots? We have a lot of categories/tags/authors, which generates a lot of pages. I'm a bit worried that Google won't like this and am leaning towards adding "follow, noindex". But the more I read about it, the more I see people disagree. What does the SEOmoz community think?
On-Page Optimization | Lobtec
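For context, the tag under discussion would sit in the &lt;head&gt; of each category/author archive template and looks like this (whether to add it at all is exactly the open question above):

```html
<meta name="robots" content="noindex, follow">
```

"noindex, follow" tells search engines to keep the archive page out of the index while still crawling the links on it; the directive order ("follow, noindex" vs "noindex, follow") makes no difference.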
Avoiding "Duplicate Page Title" and "Duplicate Page Content" - Best Practices?
We have a website with a searchable database of recipes. You can search the database using an online form with dropdown options for: Course (starter, main, salad, etc.), Cooking Method (fry, bake, boil, steam, etc.), and Preparation Time (Under 30 min, 30 min to 1 hour, Over 1 hour). Here are some examples of how URLs may look when searching for a recipe: find-a-recipe.php?course=starter, find-a-recipe.php?course=main&preperation-time=30min+to+1+hour, find-a-recipe.php?cooking-method=fry&preperation-time=over+1+hour. There is also pagination of search results, so the URL could also have the variable "start", e.g. find-a-recipe.php?course=salad&start=30. There can be any combination of these variables, meaning there are hundreds of possible search results URL variations. This all works well on the site; however, it gives multiple "Duplicate Page Title" and "Duplicate Page Content" errors when crawled by SEOmoz. I've searched online and found several possible solutions for this, such as: setting a canonical tag; adding these URL variables to Google Webmaster Tools to tell Google to ignore them; changing the title tag in the head dynamically based on what URL variables are present. However, I am not sure which of these would be best. As far as I can tell, the canonical tag should be used when you have the same page available at two separate URLs, but this isn't the case here as the search results are always different. Adding these URL variables to Google Webmaster Tools won't fix the problem in other search engines, and we will presumably continue to get these errors in our SEOmoz crawl reports. Changing the title tag each time can lead to very long title tags, and it doesn't address the problem of duplicate page content. I had hoped there would be a standard solution for problems like this, as I imagine others will have come across it before, but I cannot find the ideal solution. Any help would be much appreciated. Kind regards
On-Page Optimization | smaavie
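For context, the canonical tag mentioned in the question is a single line in the page's &lt;head&gt; that points search engines at a preferred URL. For example, a filtered or paginated results page could canonicalise to the base search page (the URL here is illustrative, not a recommendation for this specific site):

```html
<link rel="canonical" href="https://www.example.com/find-a-recipe.php" />
```

Note the trade-off raised in the question: canonicalising every filter combination to one URL tells search engines the variations are equivalent, which may not be appropriate when each combination shows genuinely different results.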