Is microdata on every page overkill?
-
Hi Rand & Team,
I am using microdata for the company address in the footer of all pages. This shows up as over 3,000 pages under "Structured Data" in Google WMT.
Is this overkill? Should I have the microdata only on the home page?
Please advise.
Mash
-
Thank you Sanket! Mash
-
As we know, search engines have announced that we can use microdata tagging to generate more relevant and more detailed search results. Microdata tags can help your site get indexed and ranked more accurately. To the best of my knowledge, it is not overkill to use microdata on every page.
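As a sketch of what the footer markup being discussed might look like, here is schema.org Organization/PostalAddress microdata (the company name and address values are placeholders, not taken from the question):

```html
<!-- Hypothetical footer block: schema.org Organization with a PostalAddress -->
<footer itemscope itemtype="https://schema.org/Organization">
  <span itemprop="name">Example Co.</span>
  <div itemprop="address" itemscope itemtype="https://schema.org/PostalAddress">
    <span itemprop="streetAddress">123 Example Street</span>,
    <span itemprop="addressLocality">Springfield</span>,
    <span itemprop="postalCode">00000</span>
  </div>
</footer>
```

Because the same block repeats sitewide, it is expected that WMT's Structured Data report counts every page carrying it, which is why the count roughly tracks the page count.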
Related Questions
-
How to find orphan pages
Hi all, I've been checking these forums for an answer on how to find orphaned pages on my site, and I can see a lot of people saying that I should cross-check my XML sitemap against a Screaming Frog crawl of my site. However, the sitemap is created using Screaming Frog in the first place... (I'm sure this is the case for a lot of people too.) Are there any other ways to get a full list of orphaned pages? I assume it would be a developer request, but where can I ask them to look / extract? Thanks!
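One common alternative is to compare a list of URLs you know exist (from server logs, an analytics export, or a CMS/database dump, which a developer can supply) against the URLs a crawler can actually reach from the homepage. A minimal sketch, assuming both lists are plain text files with one URL per line (the file names are placeholders):

```python
# Hypothetical sketch: flag orphan candidates by set difference between
# URLs known to exist (logs/analytics/CMS export) and URLs reachable by a crawl.

def load_urls(path):
    """Read one URL per line; normalise by stripping whitespace and trailing slashes."""
    with open(path) as f:
        return {line.strip().rstrip("/") for line in f if line.strip()}

def find_orphans(known, crawled):
    """Pages that exist but that no crawl can reach are orphan candidates."""
    return sorted(known - crawled)

# Usage (file names are placeholders):
#   orphans = find_orphans(load_urls("known_urls.txt"), load_urls("crawled_urls.txt"))
```

Anything in the "known" list but absent from the crawl has no internal link path to it, which is exactly the orphan condition.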
Technical SEO | | KJH-HAC1 -
I am trying to generate a GEO meta tag for my website, where one page lists multiple locations. My question is: can I add GEO tagging for every address?
Am I restricted to one geo tag per page, or can I add multiple geo tags?
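For reference, the legacy geo meta tags (geo.position, ICBM) sit in the page head and describe the page as a whole, so they do not scale to several locations on one page. One common alternative is per-location schema.org markup in the body, one block per address. A sketch (names and coordinates are placeholders):

```html
<!-- One block per location; repeat for each additional address -->
<div itemscope itemtype="https://schema.org/Place">
  <span itemprop="name">Branch One</span>
  <span itemprop="geo" itemscope itemtype="https://schema.org/GeoCoordinates">
    <meta itemprop="latitude" content="51.5074" />
    <meta itemprop="longitude" content="-0.1278" />
  </span>
</div>
```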
Technical SEO | | lina_digital0 -
Canonicalisation and Dynamic Pages
We have an e-commerce single-page app hosted at https://www.whichledlight.com, and part of this site is our search results page (http://www.whichledlight.com/t/gu10-led-bulbs?fitting_eq=GU10). To narrow down products on the results we make heavy use of query parameters. From an SEO perspective we are telling GoogleBot not to index pages that include these query parameters, to prevent duplicate content issues, and not to index pages where the combination of query parameters has returned no results. The only exception to this is the page parameter. We are posting here to check our homework, so to speak. Does the above sound sensible? Although we have told GoogleBot not to index these pages, Moz will still crawl them (to the best of my knowledge), so we will continue to see crawl errors within our Moz reports where in fact these issues don't exist. Is this true? Is there any way to make Moz ignore pages with certain query parameters? Any other suggestions to improve the SEO of our results pages are most appreciated. Thanks
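For reference, the two controls usually combined for parameterised results pages look like this (a sketch; the clean URL is taken from the question, but applying it this way is an assumption about their setup):

```html
<!-- On a filtered parameter combination you don't want indexed -->
<meta name="robots" content="noindex, follow">

<!-- On parameterised variants that should consolidate to the clean results URL -->
<link rel="canonical" href="https://www.whichledlight.com/t/gu10-led-bulbs">
```

noindex keeps the thin/duplicate combinations out of the index while still letting link equity flow, and the canonical consolidates signals onto the unparameterised page.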
Technical SEO | | TrueluxGroup0 -
3,511 Pages Indexed and 3,331 Pages Blocked by Robots
Morning, So I checked our site's index status on WMT, and I'm being told that Google is indexing 3,511 pages and the robots are blocking 3,331. This seems slightly odd, as we're only disallowing 24 pages in the robots.txt file. In light of this, I have the following queries: Do these figures mean that Google is indexing 3,511 pages and blocking 3,331 other pages? Or does it mean that it's blocking 3,331 of the 3,511 indexed? As there are only 24 URLs being disallowed in robots.txt, why are 3,331 pages being blocked? Will these be variations of the URLs we've submitted? Currently, we don't have a sitemap. I know, I know, it's pretty unforgivable, but the old one didn't really work and the developers are working on the new one. Once submitted, will this help? I think I know the answer to this, but is there any way to ascertain which pages are being blocked? Thanks in advance! Lewis
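One point worth noting on the "only 24 pages" puzzle: each robots.txt Disallow line is a path prefix (optionally with wildcards), not a single page, so 24 rules can easily match thousands of URLs. A sketch with placeholder paths:

```
# robots.txt - each Disallow line can block an unbounded number of URLs
User-agent: *
Disallow: /search/      # every URL under /search/
Disallow: /print/       # every printer-friendly variant
Disallow: /*?sort=      # any URL carrying a sort parameter (Google wildcard syntax)
```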
Technical SEO | | PeaSoupDigital0 -
Index page
To the SEO experts, this may well seem a silly question, so I apologise in advance, as I try not to ask questions I probably already know the answer to, but clarity is my goal. I have numerous sites, and as standard practice, through the .htaccess I will always redirect non-www to www, and redirect the index page to www.mysite.com. All straightforward; I have never questioned this practice and have always been advised it's the best practice to avoid duplicate content. Now, today, I was looking at a CMS service for a customer's website. The website is already built and is static, so CMS integration was going to mean a full rewrite of the website. Speaking to a friend on another forum, he told me about a service called Simple CMS. I had a look, and it looks perfect for the customer... I went to set it up on the client's site, and here is the problem: for the CMS software to work, it MUST access the index page. Because my index page is redirected to www.mysite.com, it won't work, as it can't find the index page (obviously). I questioned this with the software company, and they informed me that it must access the index page. I explained that it won't be able to, and why (because I have my index page redirected to avoid duplicate content). To my astonishment, the person there told me that duplicate content is a huge no-no with Google (that's not the astonishing part), but that it's not relevant to the index and non-index page of a website. This goes against everything I thought I knew... The person also reassured me that they have worked in the SEO area for 10 years. As I am a subscriber to SEOmoz and no one here has anything to gain but offering advice: is this true? Will it not be an issue for duplicate content to show both an index page and a non-index page? Will search engines not view this as duplicate content? Or is this SEO expert talking bull, which I suspect but cannot be sure?
Any advice would be greatly appreciated; it would make my life a lot easier for the customer to use this CMS software, but I would do it at the risk of tarnishing the work they and I have done on their ranking status. Many thanks in advance, John
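For context, the .htaccess pattern being discussed usually looks something like this (a sketch; the file name index.html is an assumption, swap in index.php as appropriate):

```apache
# Hypothetical: permanently redirect an explicit /index.html request to /
RewriteEngine On
RewriteCond %{THE_REQUEST} \ /index\.html[?\ ]
RewriteRule ^index\.html$ / [R=301,L]
```

One alternative that keeps index.html directly reachable for tools like the CMS is to drop the redirect and instead put a rel="canonical" on index.html pointing at the root URL, so search engines consolidate the two while the file itself still answers requests.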
Technical SEO | | Johnny4B0 -
301 by category rather than page?
Hi Guys, Is it possible (and best practice) to 301 a category as a whole rather than every page within the category? I was asked this by a developer the other day, and it was something I'd not considered before, as I've always done it by page. It seems there are some WordPress plugins out there that do it. Thanks in advance.
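Yes, a single pattern rule can redirect a whole category and preserve the path of each page within it. A sketch in Apache .htaccess (the category paths are placeholders):

```apache
# 301 everything under /old-category/ to the matching path under /new-category/
RedirectMatch 301 ^/old-category/(.*)$ /new-category/$1
```

The captured `(.*)` carries the rest of each URL across, so /old-category/widget-x lands on /new-category/widget-x without a rule per page.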
Technical SEO | | PerchDigital0 -
Putting blog excerpts in footer of every page?
I have roughly 150 non-blog pages and 500 articles in my blog. The footer of every non-blog page includes excerpts from 3 blog posts selected at random from the inventory of 500. The posts in the footer of each page change with every page refresh. So, if you scroll to the bottom of any non-blog page, you'll see about 85 words for each of 3 randomly selected blog posts, with a link to the source article in the blog section of my site. Each page will link to 3 different posts. One of my objectives is to drive visitors to some older blog content that has become buried deep in the archives over the years. Question 1: In a post-Panda/Penguin world, is this a good or bad technique? Question 2: Should the links to the full content in the blog use rel="nofollow"? Without it, the internal link structure for this part of the site looks pretty crazy and random - I assume nofollow would help make things look more orderly (and prevent my main non-blog pages from passing excess link juice to my blog). Thoughts or comments?
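For reference, the nofollow variant being weighed in Question 2 would look like this (the URL and anchor text are placeholders):

```html
<!-- Footer excerpt link with link-equity flow suppressed -->
<a href="https://example.com/blog/old-post" rel="nofollow">Read the full post</a>
```

Note that nofollow stops the footer links from passing equity to the buried posts, which works against the stated goal of resurfacing old content, so it is a trade-off rather than a free tidy-up.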
Technical SEO | | ahirai0 -
Removing pages from website
Hello all, I am fairly new to the SEOmoz community, but I am working for a company which organises exhibitions, events and training in Holland. A lot of these events are only given once or twice, and then we do not organise them any more because they are no longer relevant. Every event has its own few webpages which provide information about the event and are being indexed by Google. In the past we did not remove any of these events. I was looking in the CMS and saw a lot of events from 2008 and older which are still being indexed. To clean up the website and the CMS I am thinking of removing the pages for these old events. The risk is that these pages have some links to them and are getting some traffic, so if I remove them there is a risk of losing traffic and rankings. What would be the wise thing to do? Make a folder with an archive or something? Regards, Ruud
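Two common patterns for retiring event pages, sketched in Apache config (the event paths are placeholders): 301 the pages that earned links or traffic to a relevant archive or category page, and serve 410 Gone for the rest so engines drop them faster than a plain 404.

```apache
# Event earned links/traffic: pass its value to an archive or category page
Redirect 301 /events/old-expo-2008 /events/archive

# Event with nothing worth keeping: tell crawlers it is gone for good (HTTP 410)
Redirect gone /events/small-workshop-2007
```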
Technical SEO | | RuudHeijnen0