Site property is verified in the new version of Search Console, but the same property is unverified in the old version
-
Hi all!
This is a weird one that I've never encountered before.
So basically, an admin granted me "owner" access to the site in Search Console, and everything worked fine. I inspected some URLs with the new tool and had access to everything. Then, when I realized I had to remove certain pages from the index, it directed me to the old Search Console, as there's no tool for that yet in the new version. However, the old version doesn't even list the site under the "property" dropdown as either verified or unverified, and if I try to add it, I'm forced to go through the verification process, which fails (I also have Analytics and GTM access, so verification shouldn't fail).
Has anyone experienced something similar or have any ideas for a fix? Thanks so much for any help!
-
That certainly used to be a problem, and these days I've found it hit and miss. Sometimes Google is able to reach the file directly without being redirected, but sometimes it still can't. In that case, you modify your .htaccess file to allow that one file (or URL) to be accessed via either protocol. I don't remember the exact rule, but from memory it isn't that hard to do
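For reference, the rule usually looks something like the following. This is only a sketch, assuming an Apache server with mod_rewrite enabled and a blanket HTTP-to-HTTPS redirect; the verification filename is a placeholder, not a real token:

```apache
RewriteEngine On
# Redirect everything to HTTPS...
RewriteCond %{HTTPS} off
# ...except the Google verification file (placeholder filename, use your own)
RewriteCond %{REQUEST_URI} !^/google1234567890abcdef\.html$
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [R=301,L]
```

The second RewriteCond excludes that single file from the redirect, so Google can fetch it over plain HTTP while everything else still forces HTTPS.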
Failing that, you should have access to this method:
https://support.google.com/webmasters/answer/9008080?hl=en
Ctrl+F (find) for "DNS record" and expand that section of Google's info. That method works really well, and I think it also gives you access to the new domain-level property
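The DNS method boils down to adding a TXT record at the domain root with the token Google gives you. A sketch of what the zone entry tends to look like (the domain and token here are placeholders):

```
example.com.   3600   IN   TXT   "google-site-verification=abcdefgh1234567890"
```

Once the record propagates, Google checks for it and the domain-level property verifies without touching the web server at all.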
The .htaccess mod method may be more applicable for you. Definitely make the change via FTP and not via a CMS back-end. If you break the .htaccess and take down the site, and your only way to fix it is the CMS back-end, which also breaks, you're stuck. Modifying your .htaccess file should not break FTP unless you do something spectacularly wrong (in fact, I'm not sure you can break FTP with your .htaccess file)
Another option: temporarily disable the HTTP-to-HTTPS redirects in the .htaccess, verify, make your changes, then put the rule back. This is a bad method because in a few weeks Google will fail to reach the file and you will be unverified again. Your site may also have legal reasons it must be on HTTPS, and allowing HTTP again may shake up and mess up your SERPs unless you act lightning fast (before Google's next crawl of your site)
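Nullifying the redirect is usually just a matter of commenting out the relevant lines. Again a sketch for Apache; your actual rule will likely differ:

```apache
# Temporarily commented out to allow HTTP verification -- restore after verifying!
# RewriteCond %{HTTPS} off
# RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [R=301,L]
```

Keep the window short for the reasons above: re-enable the redirect the moment verification succeeds.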
Something like this might help: https://serverfault.com/questions/740640/disable-https-for-a-single-file-in-apache or these search results: https://www.google.co.uk/search?q=disable+https+redirect+for+certain+file
Hope that helps
-
Thanks very much for your response. You are exactly right about the travails of the multiple properties, and I hadn't even thought about how the new domain level access should handle the multiple versions of each site (I'm still used to having to verify four separate properties).
In the end, you were spot on; I just had to re-upload the verification file via FTP and it worked immediately.
A question, though: if you were trying to verify the non-secure protocol (http://) of a site that is https://, and you were for some reason unable to verify through GA or GTM, wouldn't uploading a verification file automatically serve it over the secure protocol and therefore be invalid for verification? This is (thank goodness) purely theoretical, but it seems like it would be a rough task, and I'm sure it happens periodically.
Thanks again for the insight. You were a great help!
-
I have no experience with this particular error but from the sounds of it, you will just have to re-verify and that's all that you can do. One thing to keep in mind is that different versions of the same site (HTTPS/WWW, HTTPS, HTTP/WWW, HTTP, any sub-domains) all count as separate websites in Search Console
The days of that being a problem are numbered, as Google has rolled out new domain-level properties for Search Console, but verifying those requires hosting-level access, so most people still aren't using them and won't until Google makes the older verification methods applicable
What this does mean is that if the URLs you want to remove belong to a different version of the site (which still counts as a separate property), you have to verify that other version (maybe the pre-HTTPS version, or a version without WWW). If the version of the property registered in your GSC is the wrong one (i.e. it doesn't contain the URLs you want to remove), you still need to register the old version
A common issue arises when people move from HTTP to HTTPS and want to 'clean up' some of the old HTTP URLs and stop them from ranking (or at least redirect Google from the old property to the new one properly). They delete the HTTP version of the site from their GSC, but then they can't get back in to do a proper clean-up. In most instances Google still considers different site versions to be different sites in GSC. As mentioned, this isn't a problem for some people now, and soon it won't be a problem for anyone. But if you're looking at any kind of legacy account, for sites built and verified up to a few months ago, the likelihood is you still have to re-verify the other site versions
The new domain-level properties may also have bugs, where they defer back to the non-domain-level properties for some things. You may have just found an example of that, to be honest (but I can't confirm it)
I'd advise just doing what the UI tells you, it's really all you can feasibly do at this juncture