Should I delete 'data highlighter' markup in Webmaster Tools after adding schema.org markup?
-
LEDSupply.com is my site, and before becoming familiar with schema markup I used the 'data highlighter' in Webmaster Tools to mark up as much of the site as I could. Now that schema is set up, I'm wondering if having both active is bad, and I'm thinking I should delete the previous work done with the 'data highlighter' tool.
To delete or not to delete? Thank you!
-
Ah, ok. My mistake, I didn't drill down enough. One thing I did notice: you have authorship markup on those product pages as well. That should be removed.
According to Google's guidelines, product pages that are not specifically written/constructed by an "author" should not carry that markup. rel="publisher" is the only markup needed for content that isn't a blog post or article.
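To illustrate (the Google+ URLs below are placeholders, not anything taken from your site), the head of a product page would end up keeping only the publisher link:

<!-- remove the authorship link, e.g. -->
<!-- <link rel="author" href="https://plus.google.com/YOUR-PERSONAL-PROFILE-ID"/> -->
<!-- keep only the brand-level publisher link -->
<link rel="publisher" href="https://plus.google.com/YOUR-BRAND-PAGE-ID"/>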
The schema markup you've implemented looks good in the page source, and checks out as being correctly implemented (without any duplicates) using Google's Structured Data Testing Tool (found in Google Webmaster Tools). It appears the data highlighter markup is not causing duplicates.
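For reference, correctly implemented Product markup in the page source generally looks something like the snippet below; the product name, price, and image path are made-up placeholders, not values pulled from LEDSupply.com:

<div itemscope itemtype="http://schema.org/Product">
  <span itemprop="name">Example LED Strip Light</span>
  <img itemprop="image" src="/images/example-led-strip.jpg" alt="Example LED Strip Light"/>
  <div itemprop="offers" itemscope itemtype="http://schema.org/Offer">
    <meta itemprop="priceCurrency" content="USD"/>
    <span itemprop="price">19.95</span>
    <link itemprop="availability" href="http://schema.org/InStock"/>
  </div>
</div>

The data highlighter, by contrast, lives inside Webmaster Tools rather than in your HTML, so it won't show up in the source at all.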
I'd recommend double-checking all the product pages where you've added schema that were originally marked up with the data highlighter. There may be duplicates, there may not. To be honest, I've always gone straight to schema.org markup, but duplicates are the only thing you should really have to worry about.
Good luck!
-
Only on the product pages. It's live.
-
Where have you added schema markup? What pages?
After briefly scanning a few pages on your site, I'm not seeing any code from schema.org in the page source. Has this been implemented/gone live yet?
Related Questions
-
Site not showing up in search - was hacked - huge comment spam - cannot connect Webmaster tools
Hi Moz Community, A new client approached me yesterday for help with their site, which used to rank well for their designated keywords but now is not doing well. Actually, they are not on Google at all. It's like they were removed by Google. There is no reference to them when searching with "site:url". I investigated further and discovered the likely problem . . . 26 000 spam comments! All these comments have been removed now. I cleaned up this WordPress site pretty well. However, I now want to connect it to Google Webmaster Tools. I have admin access to the WP site, but not FTP. So I tried using Yoast to connect. Google failed to verify the site. So then I used a file-uploading console to upload the Google HTML verification file instead. I checked that the file is there. And Google still fails to verify the site. It is as if Google is so angry with this domain that they have wiped it completely from search and refuse to have any dealings with it at all. That said, I did run the "malware" / "dangerous content" check with them, which did not bring back any problems. I'm leaning towards the idea that this is a "cursed" domain in Google and that my client's best course of action is to build her business around another domain instead, and then point the old domain to the new domain, hopefully without attracting any bad karma in the process (advice on that step would be appreciated). Anyone have an idea as to what is going on here?
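(For reference, the Yoast route just adds Google's verification meta tag to the home page's head; something like the line below, where the content value would be the real token from Webmaster Tools rather than this placeholder.)

<meta name="google-site-verification" content="PLACEHOLDER-TOKEN-FROM-WEBMASTER-TOOLS"/>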
Intermediate & Advanced SEO | | AlistairC0 -
Google Penalties not in Webmaster tools?
Hi everybody, I have a client that used to rank very well in 2014. They launched an updated URL structure in early January 2015, and since then they rank very low on most of their keywords (except the brand keywords). I started working with them early this year and tried to understand what happened, but they have no access to their old website and I can't really compare. I tried the standard optimisation methods but nothing seems to work. I have a feeling they have been penalised by Google, probably a Panda penalty, but their Webmaster Tools account does not show any penalties under manual actions. Does Google impose penalties that are not shown in Webmaster Tools? If so, is there a way I can find out what penalties they have and what is wrong exactly so we can start fixing it? The website is for a recruitment agency and they have around 400 jobs listed on it. I would love to share the link to the website but I don't believe the client will be happy with that. Thank you in advance.
Intermediate & Advanced SEO | | iQi0 -
Why isn't the Google change of address tool working for me?
Last night I switched my site from http to https. Both sites are verified in Webmaster Tools, but when I try to use the change of address tool it says: "Your account doesn't contain any sites we can use for a change of address. Add and verify the new site, then try again." How do I fix this?
Intermediate & Advanced SEO | | EcommerceSite0 -
301 Redirect and Webmaster Central
I've been working on removing canonical issues. My host is Apache. Is this the correct code for my htaccess?
RewriteEngine On
RewriteCond %{HTTP_HOST} ^luckygemstones.com$ [NC]
RewriteRule ^(.*)$ http://www.luckygemstones.com/$1 [R=301,L]
SECOND!!! I have two websites under Google's Webmaster Central: http://luckygemstones.com, which gets NO soft 404 errors... AND http://www.luckygemstones.com, which has 247 soft 404 errors... I think I should DELETE the http://luckygemstones.com site from Webmaster Central -- the 301 redirect handles the "www" thing. Is this correct? I hate to hose things (even worse?) Help! Kathleen
Intermediate & Advanced SEO | | spkcp1110 -
Why is Google Webmaster Tools reporting a massive increase in 404s?
Several weeks back, we launched a new website, replacing a legacy system and moving it to a new server. With the site transition, we broke some of the old URLs, but it didn't seem to be too much of a concern. We blocked the ones I knew should be blocked in robots.txt, 301 redirected as much duplicate data and used canonical tags as far as I could (which is still an ongoing process), and simply returned 404 for any others that should never really have been there. For the last few months, I've been monitoring the 404s Google reports in Webmaster Tools (WMT), and while we had a few hundred due to the gradual removal of duplicate data, I wasn't too concerned. I've been generating updated sitemaps for Google multiple times a week with updated URLs. Then WMT started to report a massive increase in 404s, somewhere around 25,000 404s per day (making it impossible for me to keep up). The sitemap.xml has new URLs only, but it seems that Google still uses the old sitemap from before the launch. The reported sources of the 404s (in WMT) don't exist any longer; they all come from the old site. I attached a screenshot showing the drastic increase in 404s. What could possibly cause this problem? wmt-massive-404s.png
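(To illustrate the canonical-tag part of that cleanup: each duplicate page points at its preferred URL with a line like the one below in its head; the URL here is just a placeholder, not one of our real URLs.)

<link rel="canonical" href="http://www.example.com/preferred-page-url"/>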
Intermediate & Advanced SEO | | sonetseo0 -
Matt Cutts Announces Disavow Google Webmasters Tool
Today at Pubcon, Matt Cutts announced this tool, similar to Bing's: http://searchengineland.com/google-launches-disavow-links-tool-136826. My question is, has anybody used Bing's? Do you foresee any problems or issues to consider? Just checking before going ahead with using it 🙂 Thanks
Intermediate & Advanced SEO | | bradkrussell0 -
How to Set Custom Crawl Rate in Google Webmaster Tools?
This is a really silly question about setting a custom crawl rate in Google Webmaster Tools. Anyone can find that section under the Settings tab, but I'm not sure what numbers to enter in the 'requests per second' and 'seconds between requests' text fields. I want to set a custom crawl rate for my eCommerce website. I checked my Google Webmaster Tools and found the screen shown in the attachment. So, can I use this facility to improve my crawling? 6233755578_33ce83bb71_b.jpg
Intermediate & Advanced SEO | | CommercePundit0 -
When to delete low quality content
If 75% of a site is poor quality but still accounts for 35% of the site's traffic, should that content be 404ed? Or would it be better to move it to a subdomain and set up 301 redirects? This site was greatly affected by Panda.
Intermediate & Advanced SEO | | nicole.healthline0