Site property is verified for new version of search console, but same property is unverified in the old version
-
Hi all!
This is a weird one that I've never encountered before.
So basically, an admin granted me Search Console access as an "owner" of the site, and everything worked fine. I inspected some URLs with the new tool and had access to everything. Then, when I realized I had to remove certain pages from the index, it directed me to the old Search Console, as there's no tool for that yet in the new version. However, the old version doesn't even list the site under the "property" dropdown as either verified or unverified, and if I try to add it, it makes me undergo the verification process, which fails (I also have Analytics and GTM access, so verification shouldn't fail).
Has anyone experienced something similar or have any ideas for a fix? Thanks so much for any help!
-
That certainly used to be a problem, and these days I've found it hit and miss. Sometimes Google is able to reach the verification file directly without being redirected, but sometimes Google still can't reach it. In that case, you modify your .htaccess file to allow that one file (or URL) to be accessed via either protocol. I don't remember the exact rule, but from memory, doing this isn't that hard.
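A minimal sketch of the kind of rule meant here, assuming Apache with mod_rewrite and an HTML-file verification method (the filename pattern is a placeholder; Search Console gives you the actual file):

```apache
# Hypothetical .htaccess sketch: force HTTPS for everything
# EXCEPT the Google verification file, so Google can fetch it
# over plain HTTP without being redirected.
RewriteEngine On

# Skip the redirect when the request is for the verification file
RewriteCond %{REQUEST_URI} !^/google[0-9a-f]+\.html$
# Otherwise, send any non-HTTPS request to HTTPS
RewriteCond %{HTTPS} !=on
RewriteRule ^(.*)$ https://%{HTTP_HOST}/$1 [R=301,L]
```

Both conditions apply to the same rule, so the redirect only fires when the request is *not* for the verification file *and* isn't already HTTPS. Exact syntax will depend on how your existing redirect rule is written, so adapt rather than copy.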
Failing that you should have access to this method:
https://support.google.com/webmasters/answer/9008080?hl=en
Ctrl+F (find) for "DNS record" and expand that bit of info from Google. That method works really well and, I think, it also gives you access to the new domain-level property.
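For reference, the DNS method just means adding a TXT record at the domain root. A sketch of what the zone-file entry looks like (the token below is a made-up placeholder; Search Console shows you the real value to copy):

```
example.com.  3600  IN  TXT  "google-site-verification=abc123exampletoken"
```

Once the record propagates, you verify in Search Console and the ownership covers every protocol and sub-domain variant of the domain.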
The .htaccess mod method may be more applicable for you. Definitely make the change via FTP and not via a CMS back-end: if you break the .htaccess and take down the site, and the only way to fix it is a CMS back-end that breaks along with it, you're stuck. Modifying your .htaccess file should not break FTP unless you do something spectacularly wrong (in fact, I'm not sure you can break FTP with your .htaccess file).
Another option: temporarily disable the HTTP-to-HTTPS redirects in the .htaccess, verify, make your changes, then put the rule back. This is a bad method because in a few weeks Google will fail to reach the file and you will become unverified again. Your site may also have legal reasons it must be on HTTPS, and allowing HTTP again may shake up your SERPs unless you act lightning fast (before Google's next crawl of your site).
Something like this might help: https://serverfault.com/questions/740640/disable-https-for-a-single-file-in-apache or these search results: https://www.google.co.uk/search?q=disable+https+redirect+for+certain+file
Hope that helps
-
Thanks very much for your response. You are exactly right about the travails of multiple properties, and I hadn't even thought about how the new domain-level access should handle the multiple versions of each site (I'm still used to having to verify four separate properties).
In the end, you were exactly right; I just had to FTP the verification file once more and it worked immediately.
A question, though: if you were trying to verify the non-secure (http://) version of a site that is served over https://, and you were for some reason unable to verify through GA or GTM, wouldn't the uploaded verification file be redirected to the secure protocol and therefore be unreachable for verification? This is (thank goodness) purely theoretical, but it seems like it would be a rough task, and I'm sure it happens periodically.
Thanks again for the insight. You were a great help!
-
I have no experience with this particular error, but from the sounds of it you'll just have to re-verify; that's all you can do. One thing to keep in mind is that different versions of the same site (HTTPS/WWW, HTTPS, HTTP/WWW, HTTP, and any sub-domains) all count as separate websites in Search Console.
The days of that being a problem are numbered, as Google has come out with new domain-level properties for Search Console. But verifying those requires hosting-level access, so most people still aren't using them until Google makes the older verification methods applicable.
What this does mean is that if the URLs you want to remove belong to a different version of the site (which still counts as a separate property), then you have to verify that other version too, maybe the pre-HTTPS version, or a version without WWW. If the version of the property (site) registered in your GSC is the wrong one (it doesn't contain the URLs you want to remove), then you still need to register the old version.
A common issue arises when people move from HTTP to HTTPS and want to 'clean up' some of the old HTTP URLs and stop them from ranking (or at least redirect Google from the old property to the new one properly). They delete the HTTP version of the site from their GSC, but then they can't get back in to do proper clean-up. In most instances Google still considers different site versions to be different sites in GSC. As mentioned, this isn't a problem for some people now, and soon it won't be a problem for anyone. But if you're looking at any kind of legacy account for sites that were built and verified up to a few months ago, the likelihood is you still have to re-verify the other site versions.
The new domain-level properties may also have bugs where they defer back to the non-domain-level properties for some things. You may have just found an example of that, to be honest (but I can't confirm it).
I'd advise just doing what the UI tells you; it's really all you can feasibly do at this juncture.