Data highlighter in WMT displays old version of page
-
I want to mark up a business address for Google Local, so I thought I would use the data highlighter in WMT. However, I only just added the address to the bottom of the home page, and when I use the data highlighter it gives me the old version of the page to mark up, without the address on it.
Rather frustrating. Does anybody have any experience with the time frame until Google updates the page in the data highlighter?
According to this thread, it's not even related to the page being re-cached: Data Highlighter: Start link is pulling an old version of page
-
OK, I just checked it and it has now updated to the correct preview.
So it took up to approximately 19 hours to change, although bear in mind I wasn't checking it constantly, so it could be faster. Also note that if you started highlighting the old version, when you go back into the saved page set it will still have the old preview, so you need to start over.
I can also confirm the preview in the data highlighter is not connected to Google's cache of the page in the index, as the old version of the page is still cached.
-
Thanks for the tips, Thomas. I had considered doing it 'manually' but wanted to experiment with the data highlighter tool.
I'll keep an eye on it and report back how long it took Google to update to the correct preview.
-
The only thing I can think of is that Google is showing you what it last indexed, and that does sound strange. However, there is another way to mark up your address with schema properly and quickly so you'll get the local search results you want.
Use this tool: http://www.feedthebot.com/tools/address/
It's 100% free and has a lot of great extra tools connected to it.
You can also use microdata. If you prefer microdata, which is almost the same thing as schema (it's another syntax for expressing the same vocabulary), this tool will generate it for you:
http://www.microdatagenerator.com/
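For context, here is a rough sketch of what schema.org address markup in microdata syntax looks like — the business details below are placeholders, not output from either tool:

    <div itemscope itemtype="http://schema.org/LocalBusiness">
      <span itemprop="name">Example Widget Shop</span>
      <div itemprop="address" itemscope itemtype="http://schema.org/PostalAddress">
        <span itemprop="streetAddress">123 High Street</span>,
        <span itemprop="addressLocality">Springfield</span>,
        <span itemprop="postalCode">12345</span>
      </div>
      <span itemprop="telephone">(555) 555-5555</span>
    </div>

The generator fills in the itemscope/itemprop attributes for you; the text stays visible on the page exactly as before, which is why the markup doesn't change how anything looks.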
However, I have been told that mixing the two can cause issues with search engines, so pick either schema or microdata, not both. That's what I've recently been told; I've been trying to get solid confirmation, and while it seems plausible, I don't want to tell you something that isn't true.
My $.02: use the first tool and it will do the job just fine.
It comes in the form of a WordPress plugin, but it also gives you the ability to create the schema correctly right on their site and paste it into your code.
The nice thing about it is there's a little box to the right that gives you an exact preview of what it's going to look like on your site.
If you don't want the markup to change how anything looks, I would use the first tool. However, both of them are excellent.
One last thing: if you're using WordPress, consider Yoast Local SEO. It seems expensive, but it does a fantastic job.
More great sources of information:
http://www.searchenginejournal.com/how-to-use-schema-markup-for-local-seo/
http://searchengineland.com/13-semantic-markup-tips-for-2013-a-local-seo-checklist-143708
Sincerely,
Thomas
Related Questions
-
Revamping/Re-optimizing State Pages - What to do with old content?
Hello Moz Fam! I work in the insurance industry and we serve all 50 states. We have a state page for each state where the content is thin(ish). We're slowly revamping each page and hitting every talking point for that specific state. I've used multiple tools to come up with a content template and link-building template for each page as well. I spent 5 months last year proofreading all these pages, so the content is good, just not SEO good. I didn't have the team or resources to really optimize them all yet; now I do. My question is: what should I do with the old content? I was thinking of publishing it to other platforms that we have a contributor account on and linking back to each state page with it. Of course, I would wait a few days for the search engines to index the new content so it wouldn't be duplicated on these platforms. Good or bad idea?
Intermediate & Advanced SEO | LindsayE
-
Pages with Duplicate Page Content (with and without www)
How can we resolve pages with duplicate page content, where the same pages are served both with and without www? Thanks in advance.
Intermediate & Advanced SEO | directiq
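A common fix for www/non-www duplication is a site-wide 301 redirect from one hostname to the canonical one. A minimal sketch, assuming an Apache server and a placeholder domain (adjust for your setup):

    # .htaccess sketch: redirect www to the bare domain (domain is a placeholder)
    RewriteEngine On
    RewriteCond %{HTTP_HOST} ^www\.example\.com$ [NC]
    RewriteRule ^(.*)$ http://example.com/$1 [R=301,L]

Setting the preferred domain in WMT and adding rel="canonical" tags are complementary measures.
-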
Should We Add the W3.org Language Tag To Every Page Or Just The Home Page?
Greetings, We have five international sites around the world, two of which are in different languages. Currently we have a language declaration (an HTML tag containing "en") on the home page of each of the sites. Clearly, we need to change the "en" portion for the sites that aren't in English, but should we include that tag on each of the sites' pages, or will the home page suffice? Thanks!
Intermediate & Advanced SEO | CSawatzky
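The exact snippet was lost from the question above, but a language declaration usually takes one of the forms sketched here (the URLs are placeholders). Note that the lang attribute describes only the document it appears in, so it belongs on every page rather than just the home page; hreflang links are the complementary way to connect the alternate-language sites:

    <!-- On every page of the English site -->
    <html lang="en">
    <head>
      <!-- Optional: point search engines at the other-language versions -->
      <link rel="alternate" hreflang="de" href="http://example.de/" />
      <link rel="alternate" hreflang="fr" href="http://example.fr/" />
    </head>
-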
Urgent Site Migration Help: 301 redirect from legacy to new if legacy pages are NOT indexed but have links and domain/page authority of 50+?
Sorry for the long title, but that's the whole question. Notes: New site is on the same domain, but URLs will change because the URL structure was horrible. Old site has awful SEO. Like, real bad. Canonical tags point to a dev subdomain (which is still accessible and has a robots.txt, so the end result is that the old site IS NOT INDEXED by Google). Old site has links and domain/page authority north of 50; I suspect some shady links, but there have to be good links as well. My guess is that since there are likely legitimate incoming links, I should still attempt to use 301s to the versions of the pages on the new site (note: the content on the new site will be different, but in general it'll be about the same thing as the old page, just much improved and more relevant). So yeah, I guess that's it. Even though the old site's pages are not indexed, if the new site is set up properly, the 301s won't pass along the 'non-indexed' status, correct? Thanks in advance for any quick answers!
Intermediate & Advanced SEO | JDMcNamara
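For the redirect mechanics, a minimal sketch assuming an Apache server — the paths are hypothetical stand-ins for the real legacy and new URLs:

    # .htaccess sketch: map each legacy URL to its nearest new equivalent (paths are placeholders)
    Redirect 301 /old-widgets.html /widgets/
    Redirect 301 /old-about-us.html /about/

A 301 passes link equity from whatever legitimate links point at the old URLs; noindex is a per-page directive and does not travel through a redirect, so the new pages' indexing depends only on their own setup.
-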
How to Fix Duplicate Page Content?
Our latest SEOmoz crawl reports 1,138 instances of "duplicate page content." I have long been aware that our duplicate page content is likely a major reason Google has devalued our Web store. Our duplicate page content is the result of the following: 1. We sell audio books and use the publisher's description (narrative) of the title. Google likely recognizes the publisher as the owner/author of the description, making our copy duplicate content. 2. Many audio book titles are published in more than one format (abridged, unabridged CD, and/or unabridged MP3) by the same publisher, so the basic description at our Web store would be the same for each format = more duplicate content. Here are two examples (one abridged, one unabridged) of one title at our Web store: Kill Shot - abridged, Kill Shot - unabridged. How much would the body content of one of the above pages have to change so that a SEOmoz crawl does NOT say the content is duplicate?
Intermediate & Advanced SEO | lbohen
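One standard mitigation for the multiple-format pages, sketched with placeholder URLs (the question doesn't say which format should be the primary one): pick one format's page as canonical and point the others at it.

    <!-- In the head of the abridged and MP3 pages, pointing at the unabridged page -->
    <link rel="canonical" href="http://www.example.com/kill-shot-unabridged" />

That consolidates the near-duplicates; rewriting the publisher descriptions in your own words is the other half — the more the visible body text differs between pages, the less likely they are to be flagged.
-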
Will Creating a Keyword specific Page to replace the Category Section page cause any harm to my website?
I am running a WordPress install for my blog and recently had 3 of my main keywords set as categories. I decided to create a static page for each keyword instead of having the category page show all the posts within that category, and took the category off the navigation bar. I read about setting the categories to noindex so the search engines can shine more importance on the new pages I created to replace where the category was showing. Can this have a negative effect on my rankings? On http://junkcarsforcashnjcompany.com, "junk car removal nj" is showing the category section, so I placed the noindex on it. Will the search engines refresh the data and replace it with the new page I created?
Intermediate & Advanced SEO | junkcars
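For reference, the noindex directive described above usually ends up in the category template's head in this form (a sketch; in WordPress it is typically set via an SEO plugin rather than edited by hand):

    <!-- On the category archive pages only -->
    <meta name="robots" content="noindex, follow" />

Using "noindex, follow" keeps the archive out of the index while still letting crawlers follow its links through to the posts and the new static pages.
-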
Strange situation - Started over with a new site. WMT showing the links that previously pointed to old site.
I have a client whose site was severely affected by Penguin. A former SEO company had built thousands of horrible anchor-texted links on bookmark pages, forums, cheap articles, etc. We decided to start over with a new site rather than try to recover this one. Here is what we did:
-We noindexed the old site and blocked search engines via robots.txt
-Used the Google URL removal tool to tell it to remove the entire old site from the index
-Once the site was completely gone from the index we launched the new site. The new site had the same content as the old other than the home page. We changed most of the info on the home page because it was duplicated in many directory listings. (It's a good site... the content is not overoptimized, but the links pointing to it were bad.)
-Removed all of the pages from the old site and put up an index page saying essentially, "We've moved," with a nofollowed link to the new site.
We've slowly been getting new, good links to the new site. According to ahrefs and Majestic SEO we have a handful of new links. OSE has not picked up any as of yet. But if we go into WMT there are thousands of links pointing to the new site. WMT has picked up the new links, and it looks like it has all of the old ones that used to point at the old site, despite the fact that there is no redirect. There are no redirects from any pages of the old site to the new at all. The new site has a similar name: if the old one was examplekeyword.com, the new one is examplekeywordcity.com. There are redirects from the other TLDs of the same name to his (i.e. examplekeywordcity.org, examplekeywordcity.info), but no other redirects exist. The chance that a site previously existed on any of these TLDs is almost none, as it is a unique brand name. Can anyone tell me why Google is seeing the links that previously pointed to the old site as now pointing to the new?
ADDED: Before I hit the send button I found something interesting. In this article from Dejan SEO, where someone stole Rand Fishkin's content and ranked for it, they have the following line: "When there are two identical documents on the web, Google will pick the one with higher PageRank and use it in results. It will also forward any links from any perceived 'duplicate' towards the selected 'main' document." This may be what is happening here. And just to complicate things further, it looks like when I set up the new site in GA, the site owner took the GA tracking code and put it on the old page (the noindexed one that is set up with a nofollowed link to the new one). I can't see how this could affect things, but we're removing it. Confused yet? I'd love to hear your thoughts.
Intermediate & Advanced SEO | MarieHaynes
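For reference, a sketch of the de-indexing setup the post describes (file contents are reconstructions, not the client's actual files). One caveat worth noting: a robots.txt disallow stops Googlebot from recrawling the pages, which also stops it from seeing a meta noindex on them, so the URL removal tool was likely doing the decisive work here:

    # robots.txt on the old site: block all crawling
    User-agent: *
    Disallow: /

    <!-- On each old-site page -->
    <meta name="robots" content="noindex" />
    <!-- The "We've moved" page, linking without passing anchor-text signals -->
    <a href="http://examplekeywordcity.com/" rel="nofollow">We've moved</a>
-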
How can you index pages or content on pages that are behind a paywall or subscription login?
I have a client that has a boatload of awesome content they provide to their clients, and it's behind a paywall (i.e. only paid subscribers can access it). Any suggestions, Mozzers? How do I get those pages indexed without completely giving away the content on the front end?
Intermediate & Advanced SEO | BizDetox
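Google's mechanism for exactly this situation is paywalled-content structured data (it superseded the older First Click Free program): serve the full content to Googlebot, gate it for users, and declare the gated section so the difference isn't treated as cloaking. A sketch — the article details and CSS selector are placeholder assumptions:

    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "Article",
      "headline": "Members-only guide (placeholder)",
      "isAccessibleForFree": "False",
      "hasPart": {
        "@type": "WebPageElement",
        "isAccessibleForFree": "False",
        "cssSelector": ".paywalled"
      }
    }
    </script>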