Data highlighter in WMT displays old version of page
-
I want to mark up a business address for Google Local, so I thought I would use the Data Highlighter in WMT. However, I only just added the address to the bottom of the home page, and the Data Highlighter is giving me the old version of the page to mark up, without the address on it.
Rather frustrating. Does anybody have any experience of how long it takes until Google updates the page shown in the Data Highlighter?
According to this thread, it's not even related to the page being re-cached: Data Highlighter: Start link is pulling an old version of page
-
OK, I just checked it and it has now updated to the correct preview.
So it took up to approximately 19 hours to change, though bear in mind I wasn't checking it constantly, so it could be faster. Also note: if you have started highlighting the old version, when you go back into the saved page set it will still show the old preview, so you need to start over.
I can also confirm that the preview in the Data Highlighter is not connected to Google's cache of the page in the index, as the old version of the page is still cached.
-
Thanks for the tips, Thomas. I had considered doing it 'manually' but wanted to experiment with the Data Highlighter tool.
I'll keep an eye on it and report back how long it takes Google to update to the correct preview.
-
The only thing I can think of is that Google is showing you what it last indexed, and that does sound strange. However, there is another way to put your address into schema markup properly and quickly so you'll get the local search results you want.
Use this tool: http://www.feedthebot.com/tools/address/
It's 100% free and comes with a lot of other great tools attached.
Alternatively, if you prefer to use microdata, which is almost the same thing as schema, you can use this tool:
http://www.microdatagenerator.com/
However, I have been told not to mix the two, as it can cause issues with search engines. So pick either schema or microdata. That's what I've recently been told; I've been trying to get solid confirmation, and while it sounds plausible, I don't want to tell you something that isn't true.
My $.02: use the first tool and it will do the job just fine.
It comes in the form of a WordPress plug-in, but it also gives you the ability to create the schema correctly right on the site and paste it into your code.
The nice thing about it is that there's a little box to the right that shows you exactly what it's going to look like on your site.
If you don't want it to look like anything was reformatted, I would use the first tool. However, both of them are excellent.
One last thing: if you're using WordPress, consider Yoast Local SEO. It seems expensive, but it does a fantastic job.
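For reference, what these generators produce is schema.org LocalBusiness markup. Here is a minimal sketch in Python of building the JSON-LD form of that markup — all business details below are made-up placeholders, and this is an illustration of the general shape, not the exact output of either tool:

```python
import json

# Minimal schema.org LocalBusiness payload in JSON-LD form.
# Every business detail here is a made-up placeholder -- substitute your own.
local_business = {
    "@context": "http://schema.org",
    "@type": "LocalBusiness",
    "name": "Example Widgets Ltd",
    "telephone": "+44 20 7946 0000",
    "address": {
        "@type": "PostalAddress",
        "streetAddress": "1 Example Street",
        "addressLocality": "London",
        "postalCode": "SW1A 1AA",
        "addressCountry": "GB",
    },
}

# Wrap it in the <script> tag that goes in the page's HTML.
snippet = (
    '<script type="application/ld+json">\n'
    + json.dumps(local_business, indent=2)
    + "\n</script>"
)
print(snippet)
```

One advantage of the JSON-LD form is that it lives in a single block in the page source rather than being interleaved with your visible HTML the way microdata attributes are, which also sidesteps the "don't mix the two" concern.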
More great sources of information:
http://www.searchenginejournal.com/how-to-use-schema-markup-for-local-seo/
http://searchengineland.com/13-semantic-markup-tips-for-2013-a-local-seo-checklist-143708
Sincerely,
Thomas