Site property is verified in the new version of Search Console, but the same property is unverified in the old version
-
Hi all!
This is a weird one that I've never encountered before.
So basically, an admin granted me Search Console access as an "owner" of the site, and everything worked fine. I inspected some URLs with the new tool and had access to everything. Then, when I realized I had to remove certain pages from the index, it directed me to the old Search Console, as there's no tool for that yet in the new version. However, the old version doesn't even list the site under the "property" dropdown as either verified or unverified, and if I try to add it, I'm forced through the verification process, which fails (I also have Analytics and GTM access, so verification shouldn't fail).
Has anyone experienced something similar or have any ideas for a fix? Thanks so much for any help!
-
That certainly used to be a problem, and these days I've found it hit and miss. Sometimes Google is able to reach the file directly without being redirected, but sometimes it still can't. In that case, you modify your .htaccess file to allow that one file (or URL) to be accessed via either protocol. I don't remember the exact rule, but from memory it isn't that hard.
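For reference, here's a minimal sketch of that kind of exemption, assuming a standard mod_rewrite HTTP-to-HTTPS redirect (the verification filename below is made up - substitute whichever file Google issued you):

    RewriteEngine On
    # Redirect everything to HTTPS except the Search Console verification file
    RewriteCond %{HTTPS} off
    RewriteCond %{REQUEST_URI} !^/google1234567890abcdef\.html$
    RewriteRule ^ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]

With that extra RewriteCond, Google can fetch the verification file over plain HTTP while every other request still gets redirected.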
Failing that, you should have access to this method:
https://support.google.com/webmasters/answer/9008080?hl=en
Ctrl+F (find) for "DNS record" and expand that bit of info from Google. That method works really well and, I think, it also gives you access to the new domain-level property.
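For what it's worth, the DNS method comes down to adding a single TXT record at the root of the domain. A hypothetical BIND-style zone entry (the token here is invented - copy the real one from the verification screen):

    ; Google domain verification - the token below is a placeholder
    example.com.  3600  IN  TXT  "google-site-verification=AbCdEf123456GhIjKl"

Once the record has propagated, the check usually passes quickly, and a domain-level property covers every protocol and sub-domain variant at once.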
The .htaccess mod method may be more applicable for you. Certainly make the change via FTP and not via a CMS back-end: if you break the .htaccess and kill the site, and you only have the CMS back-end (which is now also broken) to fix it, you're stuck. Modifying your .htaccess file should not break FTP unless you do something out of this world, crazy-insanely wrong (in fact, I'm not sure you can break FTP with your .htaccess file).
Another option: temporarily nullify the HTTP-to-HTTPS redirects in the .htaccess, verify, make your changes, then put the rule back. This is a bad method because in a few weeks Google will fail to reach the file and you will become unverified again. Your site may also have legal reasons it absolutely must be on HTTPS, and allowing HTTP again may shake up and mess up your SERPs unless you act lightning fast (before Google's next crawl of your site).
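If you do go that route, it's usually just a matter of commenting out the redirect block while you verify (again assuming the usual mod_rewrite setup):

    # Temporarily disabled so Google can fetch the verification file over HTTP.
    # Restore these lines the moment verification succeeds.
    # RewriteCond %{HTTPS} off
    # RewriteRule ^ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]

But as above, only do this briefly.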
Something like this might help: https://serverfault.com/questions/740640/disable-https-for-a-single-file-in-apache or these search results: https://www.google.co.uk/search?q=disable+https+redirect+for+certain+file
Hope that helps
-
Thanks very much for your response. You are exactly right about the travails of multiple properties, and I hadn't even thought about how the new domain-level access should handle the multiple versions of each site (I'm still used to having to verify four separate properties).
In the end, I just had to upload the verification file via FTP once more and it worked immediately.
A question, though: if you were trying to verify the non-secure (http://) version of a site that is served over https://, and you were for some reason unable to verify through GA or GTM, wouldn't the verification file you upload be redirected to the secure protocol, and therefore fail to verify the HTTP property? This is (thank goodness) purely theoretical, but it seems like it would be a rough task, and I'm sure it happens periodically.
Thanks again for the insight. You were a great help!
-
I have no experience with this particular error, but from the sound of it, you will just have to re-verify; that's all you can do. One thing to keep in mind is that different versions of the same site (HTTPS/WWW, HTTPS, HTTP/WWW, HTTP, and any sub-domains) all count as separate websites in Search Console.
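For a hypothetical example.com, that means up to four separate properties before you even get to sub-domains:

    https://www.example.com/
    https://example.com/
    http://www.example.com/
    http://example.com/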
The days of that being a problem are numbered, as Google have come out with new domain-level properties for Search Console, but to verify those you need hosting-level access, so most people still aren't using them, and won't until Google make the older verification methods applicable to them.
What this means is that if the URLs you want to remove belong to a different version of the site (which still counts as a separate property), you have to verify that other version as well (maybe the pre-HTTPS version, or a version without WWW). If the version of the property registered in your GSC doesn't contain the URLs you want to remove, you still need to register the old version.
A common issue arises when people move from HTTP to HTTPS and want to 'clean up' some of the old HTTP URLs, stopping them from ranking (or at least redirecting Google from the old property to the new one properly). They delete the HTTP version of the site from their GSC, but then they can't get back in to do the clean-up. In most instances Google still considers different site versions to be different sites in GSC. As mentioned, this isn't a problem for some people now, and soon it won't be a problem for anyone; but if you're looking at any kind of legacy account, for sites built and verified up to a few months ago, the likelihood is you still have to re-verify the other site versions.
The new domain-level properties may also have bugs where they defer back to the non-domain-level properties for some things. You may have just found an example of that, to be honest (though I can't confirm it).
I'd advise just doing what the UI tells you; it's really all you can feasibly do at this juncture.