When choosing a GWT preferred domain, why is it asking for re-verification?
-
I'm trying to set a preferred domain in GWT. The site is verified via Google Analytics and a meta tag in the code, but it still asks:
Part of the process of setting a preferred domain is to verify that you own http://site.org/. Please verify http://site.org/.
I've tried looking for an answer to no avail. Am I missing anything?
-
Thanks again Devanur, same to you!
-
Understood, my friend. By propagation, I meant the propagation of the sitemap.xml file for the newly added preferred version of the site, and yes, you are absolutely correct: it takes a while for Google to gather stats for the preferred version. I assumed you would already have a 301 in place; I mentioned it just in case.
Wishing you good luck in all your endeavors.
Best,
Devanur Rafi
-
By propagating I meant all the statistics currently shown, not DNS propagation; it takes a while for Google to crawl the site and push the stats to the GWT dashboard. But it's not really hard to set that up again; I just don't see a reason to now that the old profile shows the preferred choice selected.
Yes, a 301 permanent redirect is set up.
-
You just need to add the sitemap for the preferred version and refresh the page. All your sitemap info will be readily available. You don't need to worry about it propagating or anything of that sort: since the sitemap.xml is still where it was (in the root folder), nothing has changed for Google.
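For reference, the sitemap.xml itself is just the standard Sitemaps-protocol file; the only thing to check is that its URLs use the preferred host. A minimal sketch, using site.org (the example domain from the question) and assuming www is the preferred version:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Each <loc> should use the preferred (here, www) version of the host -->
  <url>
    <loc>http://www.site.org/</loc>
  </url>
</urlset>
```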
If you have not done this already, you should set up a 301 permanent redirect from the non-preferred version of your site to the preferred version.
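In case it helps, here is a minimal sketch of such a redirect for an Apache .htaccess file, assuming mod_rewrite is enabled and that www is the preferred version (swap the hosts if you prefer non-www; site.org stands in for your domain):

```apache
# 301-redirect all non-www requests to the www version
RewriteEngine On
RewriteCond %{HTTP_HOST} ^site\.org$ [NC]
RewriteRule ^(.*)$ http://www.site.org/$1 [R=301,L]
```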
-
Hmm, I actually did, and it is still there under my original profile, with my preferred selection. It's just that I have to add the sitemap to the new account and wait until it propagates with all the info and crawl stats/errors.
I'll wait a few days and see if it changes. Or is that a bad idea?
-
Good to hear that, my friend. Please do not delete the newly added non-www version of your website, as you will not be able to set the preferred domain without it. Just leave it as it is.
You can access your preferred domain by clicking on it to get all the stats in your GWMT account.
Best,
Devanur Rafi
-
Great, worked like a charm... Now I just delete the new profile that I had created, since the initial profile now states my correct preferred domain, correct?
Many Thanks Devanur!!
-
Hi Vadim,
Yes, you need to get both the non-www and www versions of your website verified in your GWMT account to be able to set the preferred domain. It's as simple as adding your site without www to the account and hitting the Verify this site option. That's it. Since you already have your www version verified, verifying the non-www version of your website does not require going through the verification process again, like uploading the HTML file or adding the verification meta tag.
Just add your site without www by clicking the ADD A SITE button, then click the Verify this site option, and you should be good.
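For reference, the meta-tag verification method mentioned in the question is just a single tag in the page's head section; the content value below is a placeholder, as the real token is issued per-site in your GWMT account:

```html
<!-- Google site verification meta tag; the content value is a
     placeholder - the real token comes from your GWMT account -->
<meta name="google-site-verification" content="YOUR-VERIFICATION-TOKEN" />
```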
Best,
Devanur Rafi