Webmaster Tools Suddenly Asking For Verification of Site Registered for 5 Years
-
Google Webmaster Tools has been successfully installed on my website (www.nyc-officespace-leader.com) for more than five years. Suddenly, today I received a request to "Verify this Site". This makes no sense.
The only possibility I can think of is that this is somehow tied to the following events in the last month:
1. Launch of new version of website on June 4th
2. Installation of Google Tag Manager
3. Sudden increase in the number of pages indexed by Google: an unexplained 175 additional pages. About 625 pages should be indexed, while 800 are now indexed.
In the last month ranking and traffic have fallen sharply. Could it be that these issues are all linked? But the strangest issue is the request to verify the site.
Does anyone have any ideas?
Thanks,
Alan -
True. Could be that you were using a method that required reverification after a Google-determined time period.
"1. Launch of new version of website on June 4th"
Since you launched the new website, did someone check to make sure all the tracking and verification data was moved over as well? If you were using a meta tag, did you insert the same meta tag in the header of the new site? Or, if you were using the file upload method, did someone upload the verification file to the new site?
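One quick sanity check after a relaunch is to confirm the exact verification meta tag is still in the new site's `<head>`. A minimal sketch of that check, assuming the meta-tag method (the token and page snippets below are made-up placeholders, not real values from this site):

```python
import re

# Hypothetical token; substitute the one shown in Webmaster Tools'
# "Verify this site" instructions for your property.
VERIFICATION_TOKEN = "abc123exampletoken"

def has_verification_meta(html, token=VERIFICATION_TOKEN):
    """Return True if the google-site-verification meta tag with the
    expected token appears in the page's HTML source."""
    pattern = (r'<meta\s+name=["\']google-site-verification["\']\s+'
               r'content=["\']' + re.escape(token) + r'["\']')
    return re.search(pattern, html, re.IGNORECASE) is not None

# The old template carried the tag; the redesigned one dropped it.
old_home = '<head><meta name="google-site-verification" content="abc123exampletoken"></head>'
new_home = '<head><title>Relaunched site</title></head>'

print(has_verification_meta(old_home))  # True
print(has_verification_meta(new_home))  # False
```

If the check fails on the live homepage, reinserting the tag (or re-uploading the HTML verification file) and hitting reverify is usually all it takes.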
"3. Sudden Increase in number of pages indexed by Google. Unexplained indexing of an additional 175 pages. About 625 pages should be indexed, while 800 are now indexed.
In the last month ranking and traffic have fallen sharply. Could it be that these issues are all linked? But the strangest issue is the request to verify the site."
If you just launched a new website, check to make sure you have canonical links enabled and working. If you are using a CMS and the old site was HTML (or anything static and not database-driven), it could be that you have a ton of stuff indexed that you don't know about. This could also be influencing your traffic loss, as Google will see these additional pages as duplicates.
Example:
yourdomain.com
www.yourdomain.com
www.yourdomain.com/home...could possibly all be showing the same page.
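The variants above are really one canonicalization rule: every alias of the homepage should resolve (via rel="canonical" or a 301) to a single preferred URL. A minimal sketch of that collapse, with a hypothetical domain, preferred host, and homepage aliases:

```python
from urllib.parse import urlsplit

# Hypothetical values: the host you want Google to index, and the
# path aliases that all serve the homepage.
PREFERRED_HOST = "www.yourdomain.com"
HOME_ALIASES = {"", "/", "/home"}

def canonical_url(url):
    """Collapse host and path variants of the same page onto one
    preferred URL, the way a canonical tag or 301 redirect should."""
    if "://" not in url:
        url = "http://" + url  # bare domains have no scheme yet
    parts = urlsplit(url)
    host = PREFERRED_HOST if parts.netloc in ("yourdomain.com", PREFERRED_HOST) else parts.netloc
    path = "/" if parts.path in HOME_ALIASES else parts.path
    return "http://" + host + path

variants = ["yourdomain.com", "www.yourdomain.com", "www.yourdomain.com/home"]
print({canonical_url(v) for v in variants})  # all three collapse to one URL
```

If your indexed-page count is inflated by variants like these, each one Google keeps is competing with (and diluting) the page you actually want ranked.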
-
This has happened to me before. I just downloaded the verification file again and re-uploaded it, and all the data was there. Are you using HTML or a WordPress site?
-
How was your site verified in the past? Is there a meta tag or HTML verification file that might have gotten accidentally removed? It's been a while, but I think I've seen this happen on its own, with GWT occasionally wanting you to reverify even if nothing has changed. The HTML file was still there, so I just hit reverify and everything was fine.