Search Console site verification
-
I've been going on the assumption that when verifying a website in Search Console, it's always good to register and verify all variants of the site URL:
- http
- https
- www
- non-www
However, if you create redirects to the preferred URL, is it really necessary to register/verify the other three? If so, why?
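For reference, this is the kind of redirect rule the question assumes: every non-canonical variant answers with a single 301 hop to the preferred URL. A minimal sketch as Apache `.htaccess` rules, assuming the canonical version is `https://site.com/` (the hostname is a placeholder; if www is preferred, or the site runs on a platform like Shopify that manages domains itself, the mechanics differ):

```apache
# Send every http and/or www variant to https://site.com/ in one 301 hop.
RewriteEngine On
RewriteCond %{HTTPS} off [OR]
RewriteCond %{HTTP_HOST} ^www\. [NC]
RewriteRule ^(.*)$ https://site.com/$1 [L,R=301]
```

The single-hop detail matters: chaining http → https → non-www wastes crawl requests and dilutes the redirect signal.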
-
Your dev is right on the specific question of whether the additional Search Console properties are necessary for Google to understand which is the primary version. They're not essential as long as the proper redirects are in place; it's a "belt and suspenders" approach. If there's no way for the search engine crawlers to ever reach anything but the primary version, then there's no way for them to get it wrong. The Search Console settings reinforce what the redirects are already doing. (That said, it's trivial to reinforce the redirects with the declaration in Search Console, so best practice is to do that as well. You need the Search Console property to properly manage the other aspects of marketing the site anyway, so...)
Put another way: declaring the primary version of the site using the alternate Search Console properties is Google's way of letting people who might not have dev access to the site accomplish at least part of the same thing from within Search Console, which they can manage.
The real value of verifying the other versions in Search Console is monitoring: you can confirm that the non-canonical versions of the site are definitely not getting indexed or ranked. This is essential after an HTTPS migration, for example, where you should see indexing steadily drop on the HTTP property while it steadily rises on the HTTPS one.
A periodic check of the non-canonical properties will alert you immediately to any newly discovered issue Google's crawlers may be encountering (a sitewide redirect that got accidentally removed or changed, for example).
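That periodic check can be partly automated by confirming the redirects themselves are still in place. A minimal sketch in Python; the canonical URL, variant list, and function names are all assumptions for illustration, and the `fetch` callable is injected so the logic can be exercised without network access (in practice it would wrap an HTTP HEAD request):

```python
# Hypothetical monitor: flag URL variants that no longer 301-redirect
# to the canonical version. All names/URLs here are placeholders.

CANONICAL = "https://site.com/"

VARIANTS = [
    "http://www.site.com/",
    "http://site.com/",
    "https://www.site.com/",
]

def check_redirects(fetch, canonical=CANONICAL, variants=VARIANTS):
    """Return the variants that do NOT 301-redirect to the canonical URL.

    `fetch` is any callable returning (status_code, location_header)
    for a URL, so the check is testable without touching the network.
    """
    problems = []
    for url in variants:
        status, location = fetch(url)
        if status != 301 or location != canonical:
            problems.append(url)
    return problems

# Fake fetcher standing in for a real HTTP request: one redirect is "broken".
def fake_fetch(url):
    if url == "https://www.site.com/":   # simulate a dropped redirect
        return 200, None
    return 301, CANONICAL

print(check_redirects(fake_fetch))   # → ['https://www.site.com/']
```

Run on a schedule, an empty result means the redirect setup is intact; anything else is exactly the "sitewide redirect got accidentally removed" case described above.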
Make sense?
Paul
-
If you don't have access to the old Google account, verify the property from the new account. Verify all versions and then select the preferred one. This is what Google asks people to do. It is very fast and easy, and it is always the preferred method.
Best Regards
-
The preferred version has been verified by adding an HTML tag in the Shopify theme (this is how I usually verify too). But I don't have access to the same search console account...
- can I generate a new HTML tag to verify the other three variants (i.e. is it OK to use two different HTML tags)?
- or should I create a new HTML tag to verify ALL 4 variants (i.e. is there any negative side to replacing the original HTML tag)?
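On the first option: the HTML-tag method just looks for a `google-site-verification` meta tag in the page `<head>`, and multiple tags (one per verified owner) can coexist, so a tag from the new account can sit alongside the old one. A sketch of what that looks like (the `content` tokens are made-up placeholders):

```html
<head>
  <!-- existing tag from the old, inaccessible account (placeholder token) -->
  <meta name="google-site-verification" content="AAAA1111placeholder" />
  <!-- second tag generated from the new account; both remain valid -->
  <meta name="google-site-verification" content="BBBB2222placeholder" />
</head>
```

Note that removing the original tag would eventually cause the old account to lose its verified status when Google re-checks, which may be fine here since that account is inaccessible anyway.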
-
Nope, you should not believe what your dev is saying (in this particular case).
As William said, Google suggests verifying all versions and setting the preferred one to be considered.
-
Thanks William,
That's always been my approach too, but the dev is adamant that if they redirect all variants to the preferred version, there is no need to verify them all.
My question is: should I believe what this dev is saying?
-
Hello,
Yes, Google suggests that you register all variants of your site, then select the preferred one in Search Console. That way Google will understand your intent.
http://www.site.com
http://site.com
https://www.site.com
https://site.com
Make sure you select the preferred one to show in the search results.
Best Regards