Site property is verified for new version of search console, but same property is unverified in the old version
-
Hi all!
This is a weird one that I've never encountered before.
So basically, an admin granted me "owner" access to the site in Search Console, and everything worked fine. I inspected some URLs with the new tool and had access to everything. Then, when I realized I had to remove certain pages from the index, it directed me to the old Search Console, as there's no removal tool yet in the new version. However, the old version doesn't even list the site under the "property" dropdown as either verified or unverified, and if I try to add it, it makes me undergo the verification process, which fails (I also have Analytics and GTM access, so verification shouldn't fail).
Has anyone experienced something similar or have any ideas for a fix? Thanks so much for any help!
-
That certainly used to be a problem, and these days I've found it hit and miss. Sometimes Google is able to reach the verification file directly without being redirected, but sometimes it still can't. In that case, you modify your .htaccess file to allow that one file (or URL) to be accessed via either protocol. I don't remember the exact rule, but from memory, doing this isn't that hard.
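As I say, I don't remember the exact rule, but a minimal sketch looks something like this (assuming Apache with mod_rewrite; the verification filename here is made up, yours will differ):

```apache
RewriteEngine On
# Force HTTPS for every request...
RewriteCond %{HTTPS} off
# ...except the Google verification file (example filename - use your own)
RewriteCond %{REQUEST_URI} !^/googleabc123\.html$
RewriteRule ^ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
```

The second RewriteCond simply exempts that one path from the redirect, so Google can fetch it over plain HTTP while everything else still goes to HTTPS.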
Failing that, you should have access to this method:
https://support.google.com/webmasters/answer/9008080?hl=en
Ctrl+F (find) for "DNS record" and expand that bit of info from Google. That method works really well, and I think it also gives you access to the new domain-level property.
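For reference, the DNS method just means adding a TXT record at your DNS host. In zone-file terms it looks roughly like this (the token is a placeholder; Google gives you the real value when you start verification):

```
example.com.  3600  IN  TXT  "google-site-verification=YOUR-TOKEN-HERE"
```

Because the record sits at the domain level, it covers every protocol and sub-domain variant at once, which is why it pairs with the new domain-level properties.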
The .htaccess mod method may be more applicable for you. Certainly make the change via FTP and not via a CMS back-end: if you break the .htaccess and kill the site, and the CMS back-end you'd use to fix it breaks along with it, you're stuck. Modding your .htaccess file should not break FTP unless you do something out of this world, crazy-insanely wrong (in fact, I'm not sure you can break FTP with your .htaccess file).
Another option: temporarily disable the HTTP-to-HTTPS redirects in the .htaccess, verify, make your changes, then put the rule back. This is a bad method because in a few weeks Google will fail to reach the file and you will be unverified again. Also, your site may have legal reasons it absolutely must be on HTTPS, and allowing HTTP again may shake up your SERPs unless you act lightning fast (before Google's next crawl of your site).
Something like this might help: https://serverfault.com/questions/740640/disable-https-for-a-single-file-in-apache or these search results: https://www.google.co.uk/search?q=disable+https+redirect+for+certain+file
Hope that helps
-
Thanks very much for your response. You are exactly right about the travails of the multiple properties, and I hadn't even thought about how the new domain-level access should handle the multiple versions of each site (I'm still used to having to verify four separate properties).
In the end, you were exactly right; I just had to FTP the verification file once more and it worked immediately.
A question, though: if you were trying to verify the non-secured protocol (http://) of a site that is https://, and you were for some reason unable to verify through GA or GTM, wouldn't the uploaded verification file automatically redirect to the secured protocol and therefore be invalid for verification? This is (thank goodness) purely theoretical, but it seems as though it would be a rough task, and I'm sure it happens periodically.
Thanks again for the insight. You were a great help!
-
I have no experience with this particular error, but from the sounds of it, you will just have to re-verify; that's all you can do. One thing to keep in mind is that different versions of the same site (HTTPS/WWW, HTTPS, HTTP/WWW, HTTP, and any sub-domains) all count as separate websites in Search Console.
The days of that being a problem are numbered, as Google have come out with new domain-level properties for Search Console, but to verify those you need hosting-level access, so most people still aren't using them until Google make the older verification methods applicable.
What this means is that if the URLs you want to remove belong to a different version of the site (which still counts as a separate property), you have to verify that other version (maybe the pre-HTTPS version, or a version without WWW). If the property registered in your GSC is the wrong version (one which doesn't contain the URLs you want to remove), you still need to register the old version.
A common issue arises when people move from HTTP to HTTPS and want to 'clean up' some of the old HTTP URLs and stop them from ranking (or at least redirect Google from the old property to the new one properly). They delete the HTTP version of the site from their GSC, but then they can't get back in to do proper clean-up. In most instances, Google still considers different site versions to be different sites in GSC. As mentioned, this isn't a problem for some people now, and soon it won't be a problem for anyone. But if you're looking at any kind of legacy account for sites that were built and verified up to a few months ago, the likelihood is you still have to re-verify other site versions.
The new domain-level properties may also have bugs where they defer back to the non-domain-level properties for some things. You may have just found an example of that, to be honest (but I can't confirm this).
I'd advise just doing what the UI tells you; it's really all you can feasibly do at this juncture.