Schema.org - Right way to mark up the pages
Dear all,
Almost since we started designing our site, we have been using schema microdata. It is not only because of the rich snippets, but also because I want the search engines to better understand what we have.
For example, the +1 button would not work properly without schema microdata, because it seems to ignore the OpenGraph parameters that specify the image and description. And since we are a (very small) local business directory (among other things), all our clients have a hand-written, schema-compliant description on their listings, including address, opening hours, telephone number, description, etc. It is hand-written by us because the tools available are simply not good enough to cover all the different scenarios that a listing can present.
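For illustration, a hand-written listing of this kind might look roughly as follows. This is a minimal sketch: the business name, address, opening hours, and telephone number are placeholders, not a real client's data.

```html
<!-- Minimal sketch of a hand-written LocalBusiness listing; all values are placeholders. -->
<div itemscope itemtype="http://schema.org/LocalBusiness">
  <span itemprop="name">Example Bakery</span>
  <p itemprop="description">A family-run bakery in the old town.</p>
  <div itemprop="address" itemscope itemtype="http://schema.org/PostalAddress">
    <span itemprop="streetAddress">1 Example Street</span>,
    <span itemprop="addressLocality">Springfield</span>
  </div>
  <span itemprop="telephone">+1-555-0100</span>
  <time itemprop="openingHours" datetime="Mo-Fr 09:00-17:00">Mon-Fri, 9am to 5pm</time>
</div>
```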
I had not used, until today, a proper schema markup for the homepage, and it is probably the cause that our page lost the nice links below the site description in the Google snippet. I did not place it on the body tag, but near the description, closing it immediately after the description finishes. Now this is solved, and we will wait to see if the links come back in the next few weeks.
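For reference, scoping the markup to the description rather than the body tag looks like this. A sketch with a placeholder name and description, not our actual homepage code:

```html
<!-- The itemscope wraps only the description block and closes right after it. -->
<div itemscope itemtype="http://schema.org/WebPage">
  <span itemprop="name">Example Local Directory</span>
  <p itemprop="description">Find local businesses, with addresses, opening hours, and phone numbers.</p>
</div>
```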
Now to the question. Our site has three sections, with three different systems installed: two running WordPress and a third running another script. The main site is the local business directory. The front page is marked up as "schema.org/WebPage", and I do not know how to mark up the other pages of the main site. I was thinking of marking the listings as "schema.org/ItemPage", since they are related to specific clients. Would you consider that to be right? Then, we have landing pages for the categories: should they be marked as WebPage, or as Article, or something else?
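For clarity, this is the kind of markup I have in mind for a listing. A hypothetical sketch; whether ItemPage is the right type here is exactly what I am asking:

```html
<body itemscope itemtype="http://schema.org/ItemPage">
  <!-- the existing hand-written LocalBusiness block for the client sits inside -->
  <div itemscope itemtype="http://schema.org/LocalBusiness">
    <span itemprop="name">Example Bakery</span>
    <!-- address, opening hours, telephone, description, etc. -->
  </div>
</body>
```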
Many thanks in advance for your help,
Best Regards,
Daniel
Related Questions
Will Google Judge Duplicate Content on Responsive Pages to be Keyword Spamming?
Web Design | CurtisB

I have a website for my small business, and hope to improve the search results position for 5 landing pages. I recently modified my website to make it responsive (mobile friendly). I was not able to use Bootstrap; the layout of the pages is a bit unusual and doesn't lend itself to the options Bootstrap provides. Each landing page has 3 main divs: one for desktop, one for tablet, one for phone.

The text content displayed in each div is the same. Only one of the 3 divs is visible; the user's screen width determines which div is visible. When I wrote the HTML for the page, I didn't want each div to have identical text. I worried that when Google indexed the page it would see the same text 3 times and would conclude that keyword spamming was occurring. So I put the text in just one div, and when the page loads jQuery copies the text from the first div to the other two divs.

But now I've learned that when Google indexes a page it looks at both the page that is served AND the page that is rendered. And in my case the page that is rendered, after it loads and the jQuery code is executed, contains duplicate text content in three divs. So perhaps my approach of having the served page contain just one div with text content fails to help, because Google examines the rendered page, which has duplicate text content in three divs.

Here is the layout of one landing page, as served by the server: a desktop div holding the 1000 words of text, plus tablet and phone divs that contain no text and are filled by jQuery copying the text from div id="desktop" (see the sketch below).

My question is: Will Google conclude that keyword spamming is occurring because of the duplicate content the rendered page contains, or will it realize that only one of the divs is visible at a time, and the duplicate content is there only to achieve a responsive design? Thank you!
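A hedged reconstruction of the served layout described above; the original markup did not survive in the question, so the tablet and phone ids and the exact jQuery are assumptions:

```html
<!-- Served page: only the desktop div contains text; CSS media queries (not shown)
     make exactly one of the three divs visible at any screen width. -->
<div id="desktop">1000 words of text goes here.</div>
<div id="tablet"><!-- No text. jQuery will copy the text from div id="desktop" into here. --></div>
<div id="phone"><!-- No text. jQuery will copy the text from div id="desktop" into here. --></div>

<script src="https://code.jquery.com/jquery-1.11.3.min.js"></script>
<script>
  // After load, duplicate the desktop content into the other two divs,
  // producing the rendered-page duplication the question describes.
  $(function () {
    var text = $('#desktop').html();
    $('#tablet').html(text);
    $('#phone').html(text);
  });
</script>
```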
Business Services via Index Page or Dedicated Pages?
We're in the process of building a new website for our Business (B2B Event Services) and we've hit a design snag. Our designer wants to combine all of our business services information (there are six service lines) into a single Index Page titled "Services". Beyond this creating a needlessly long page to scroll through, I'm worried this will negatively impact our ability to SEO for each service line, as we wouldn't have any intention of letting people visit the individual pages from within the site. Our current site features individual pages which, in my opinion, is how we should build our new site. That being said, I'm completely open to any ideas that will further enhance usability and searchability. Any clarification would be greatly appreciated!
Web Design | SHWorldwide
HTTPS pages indexed but all web pages are HTTP - please can you offer some help?
Dear Moz Community, please could you see what you think and offer some definite steps or advice.

I contacted the host provider, and his initial thought was that WordPress was causing the https problem: e.g. when an https version of a page is called, things like videos and media don't always show up. An SSL certificate attached to a website can allow pages to load over https. The host said that there is no active configured SSL; it's just waiting as part of the hosting package, just in case. But I found that the SSL certificate is still showing up during a crawl.

It's important to eliminate the https problem before external backlinks link to any of the unwanted https pages that are currently indexed. Luckily I haven't started any intense backlinking work yet, and any links I have posted in search land have all been the http version.

I checked a few more URLs to see if it's necessary to create a permanent redirect from https to http. For example, I tried requesting domain.co.uk using the https:// prefix, and the https:// page loaded instead of redirecting automatically to the http:// version. I know that if I am automatically redirected to the http:// version of the page, then that is the way it should be. Search engines and visitors will stay on the http version of the site and not get lost anywhere in https. This also helps to eliminate duplicate content and to preserve link juice. What are your thoughts regarding that?

As I understand it, most server configurations should redirect by default when https isn't configured, and from my experience I've seen cases where pages requested via https return the default server page, a 404 error, or duplicate content. So I'm confused as to where to take this.

One suggestion would be to disable all https, since there is no need to have any traces of SSL when the site is crawled. I don't want to enable https in the htaccess only to then create an https-to-http rewrite rule; https shouldn't even be a crawlable function of the site at all:

RewriteEngine On
RewriteCond %{HTTPS} off

Or we could disable the SSL completely for now, until it becomes a necessity for the website.

I would really welcome your thoughts, as I'm really stuck as to what to do for the best, short term and long term. Kind Regards
Web Design | SEOguy1
What would be the best way to translate my website for international SEO?
I am planning on creating a multi-language website that targets different countries, and I would like to know the best way to translate my content from English into multiple different languages (French, Spanish, Swedish, Chinese, etc.). I would hire a translator, but doing so for all these different languages would be too costly. Using Google Translate will leave me with, at best, a rough translation. What are my other options? Is there a website that can provide me a better translation? Would Fiverr be a better/cheaper alternative? Thoughts?
Web Design | Shawn124
Main page redirect affecting search results?
Question... A recent change was made to our page www.BGU.edu by a marketing person, so now when you type in www.BGU.edu it actually redirects to a different page, www.BGU.edu/inquiry. This is a really bad idea, isn't it? I do not know enough about SEO to know a lot, and I just joined SEOmoz, but do I need to tell the admin to change it back?
Web Design | nongard1
Homepage and Category pages rank for article/post titles after HTML5 Redesign
My site's URL (web address) is: http://bit.ly/g2fhhC
Web Design | mcluna

Timeline:
At the end of March we released a site redesign in HTML5. As part of the redesign we used multiple H1s, both for nested articles on the homepage and for content sections other than articles on a page. In summary, our pages have many, many (I mean lots of) H1s compared to other notable sites that use HTML5 and only one H1 (some of these are the biggest sites on the web). Yet I don't want to say this is the culprit, because the HTML5 document outline (page sections) creates the equivalent of H1-H6 tags (see the sketch at the end of this question). We have also been having Google cache snapshot issues due to Modernizr, for which we are working to apply the patch: https://github.com/h5bp/html5-boilerplate/issues/1086. Not sure if this could be driving our indexing issues described below.

Situation:
Since the redesign, when we query one of our article titles, Google lists the homepage, category page, or tag page that the article resides on. Most of the time the homepage ranks for the article query. Linking directly to the article pages from a relevant internal page does not help Google index the correct page, and neither does linking to an article from an external site.

Here are some images of example query results for our article titles:
- Homepage ranks for article title aged 5 hours: http://imgur.com/yNVU2
- Homepage ranks for article title aged 36 min.: http://imgur.com/5RZgB
- Homepage and uncategorized page listed instead of article for an exact-match article query: http://imgur.com/MddcE
- Article aged over 10 days indexing correctly (so it is possible for Google to index our article pages, but only eventually): http://imgur.com/mZhmd

What we have done so far:
- Removed the H1 tag from the site-wide domain link.
- Made the article title a link, replicating how it was on the old version.
- Applying the Modernizr patch today to correct the blank caching issue.

We are hoping you can assess the number of H1s we are using on our homepage (I think over 40) and on our article pages (I believe over 25) and let us know if this may be sending a confusing signal to Google, or if you see something else we're missing. All HTML5 and Google documentation makes clear that Google can parse multiple H1s, understands header and sub-header levels, and that multiple H1s are okay, but it seems possible that algorithmic weighting may not have caught up with HTML5. Look forward to your thoughts. Thanks.
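For reference, the multiple-H1 pattern described in the timeline looks roughly like this under the HTML5 outline algorithm (a minimal sketch; the site name and article titles are placeholders):

```html
<!-- Each sectioning element (article) starts its own outline section,
     so HTML5 allows an H1 per article in addition to the site-wide H1. -->
<body>
  <header><h1>Site name</h1></header>
  <article>
    <h1>First article title</h1>
    <p>Article teaser...</p>
  </article>
  <article>
    <h1>Second article title</h1>
    <p>Article teaser...</p>
  </article>
</body>
```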
Sudden dramatic drops in SERPs along with no snippet and no cached page?
Web Design | jamestown

We are a very stable, time-tested domain (over 15 years old) with thousands of stable, time-tested inbound links. We are a large catalog/e-commerce business, and our web team has over a decade's experience with coding, SEO, etc. We do not engage in link exchanges or buying links, and we adhere strictly to best white-hat SEO practices. Our SERPs have generally been very stable for years and years. We continually update content, leverage user-generated content, and stay abreast of important algorithm and policy changes on Google's end.

On Wednesday, Jan 18th, we noticed dramatic, disturbing changes to our SERPs. Our formerly very stable positions for thousands of core keywords dropped. In addition, there is no snippet in the SERPs and no cached page for these results. Webmaster Tools shows our sitemap was most recently successfully downloaded by Google on Jan 14th. Over the weekend and Monday the 16th, our cloud-hosted site experienced some downtime here and there. I suspect that the sudden issues we are seeing are being caused by one of three possibilities:

1. Google came to crawl when the site was unavailable. However, there are no messages in the account or crawl issues otherwise noted to indicate this.
2. There is a malicious link spam or other attack on our site.
3. The last week of December 2011, we went live with Schema.org rich tagging on product-level pages (a sketch of that kind of markup follows below). The testing tool validates all but the breadcrumb, which it says is not supported by Schema. Could Google be hating our Schema.org microtagging and penalizing us? I sort of doubt it, because category/subcategory pages that have no such tags are among those suffering.

What's odd is that ever since we went live with Schema.org, Google has started preferring very thin content pages like video pages and articles over our product pages. This never happened in the past. The site is: www.jamestowndistributors.com. Any help or ideas are greatly, greatly appreciated. Thank You, DMG
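A hypothetical sketch of product-level Schema.org microdata of the kind mentioned in point 3 above; the product name, image, and price are placeholders, not the site's actual markup:

```html
<!-- Hypothetical product-level microdata; all values are placeholders. -->
<div itemscope itemtype="http://schema.org/Product">
  <span itemprop="name">Example Marine Cleaner</span>
  <img itemprop="image" src="/images/example-cleaner.jpg" alt="Example Marine Cleaner">
  <div itemprop="offers" itemscope itemtype="http://schema.org/Offer">
    $<span itemprop="price">19.99</span>
    <meta itemprop="priceCurrency" content="USD">
  </div>
</div>
```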
.org versus .com
I work at a non-profit that has always used the .org URL for any sites they create. I am designing a site right now with a heavy weight on SEO as part of the strategy, and I have both the .org and .com versions of the URL. Am I right in thinking that I should use the .com?
Web Design | dschreib_greenpeace.org