Any harm, and why the differences? Multiple versions of the same site in WMT
-
In Google Webmaster Tools we have set up:
ourdomain.co.nz
ourdomain.co.uk
ourdomain.com
ourdomain.com.au
www.ourdomain.co.nz
www.ourdomain.co.uk
www.ourdomain.com
www.ourdomain.com.au
https://www.ourdomain.co.nz
https://www.ourdomain.co.uk
https://www.ourdomain.com
https://www.ourdomain.com.au
As you can imagine, this gets confusing and hard to manage. We are wondering whether having all these domains set up in WMT could be doing any damage. Here http://support.google.com/webmasters/bin/answer.py?hl=en&answer=44231 it says:
"If you see a message that your site is not indexed, it may be because it is indexed under a different domain. For example, if you receive a message that http://example.com is not indexed, make sure that you've also added http://www.example.com to your account (or vice versa), and check the data for that site."
The above quote suggests that there is no harm in having several versions of a site set up in WMT; however, the article then goes on to say:
"Once you tell us your preferred domain name, we use that information for all future crawls of your site and indexing refreshes. For instance, if you specify your preferred domain as http://www.example.com and we find a link to your site that is formatted as http://example.com, we follow that link as http://www.example.com instead."
This suggests that having multiple versions of the site loaded in WMT may cause Google to continue crawling multiple versions instead of only crawling the desired versions (https://www.ourdomain.com + .co.nz, .co.uk, .com.au).
However, even if Google does crawl any URLs on the non-HTTPS versions of the site (i.e. ourdomain.com or www.ourdomain.com), these 301 to https://www.ourdomain.com anyway... so shouldn't that mean that Google effectively cannot crawl any non-https://www versions (if it tries, they redirect)? If that were the case, you'd expect the ourdomain.com and www.ourdomain.com versions to show no pages indexed in WMT, but the opposite is true. The ourdomain.com and www.ourdomain.com versions have plenty of pages indexed, while the HTTPS versions show no data under the Index Status section of WMT, displaying this message instead:
Data for https://www.ourdomain.com/ is not available. Please try a site with http:// protocol: http://www.ourdomain.com/.
This is a problem as it means that we can't delete these profiles from our WMT account.
Any thoughts on the above would be welcome.
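To make the reasoning above concrete, the redirect scheme described in the question can be simulated in a few lines. This is only a sketch of the logic as stated in the thread (the hostnames are the thread's placeholders, not real domains): every scheme/host variant is supposed to 301 to its https://www counterpart, so any crawl attempt on a non-canonical URL should land on the canonical one.

```python
# Sketch of the 301 scheme described in the question: every
# non-canonical scheme/host combination redirects to the https://www
# version of the same ccTLD. Hostnames are placeholders from the thread.
from urllib.parse import urlsplit, urlunsplit

CANONICAL_HOSTS = {
    "ourdomain.com": "www.ourdomain.com",
    "www.ourdomain.com": "www.ourdomain.com",
    "ourdomain.co.uk": "www.ourdomain.co.uk",
    "www.ourdomain.co.uk": "www.ourdomain.co.uk",
    "ourdomain.co.nz": "www.ourdomain.co.nz",
    "www.ourdomain.co.nz": "www.ourdomain.co.nz",
    "ourdomain.com.au": "www.ourdomain.com.au",
    "www.ourdomain.com.au": "www.ourdomain.com.au",
}

def redirect_target(url):
    """Return the 301 Location for `url`, or None if it is already canonical."""
    parts = urlsplit(url)
    host = CANONICAL_HOSTS.get(parts.netloc, parts.netloc)
    canonical = urlunsplit(("https", host, parts.path or "/", parts.query, ""))
    return None if url == canonical else canonical

print(redirect_target("http://ourdomain.com/page"))
# -> https://www.ourdomain.com/page
```

Under this model, a crawler that follows redirects can only ever fetch content from the four https://www hosts, which is exactly why the indexed-page counts reported for the non-canonical WMT profiles are surprising.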
As an aside, it seems like WMT is picking up on the 301 redirects from the ourdomain.com and www.ourdomain.com domains, at least for links: no ourdomain.com or www.ourdomain.com URLs are registering any links in WMT, suggesting that Google sees all links pointing to URLs on these domains as 301ing to https://www.ourdomain.com. That's good, but it also means we now can't delete https://www.ourdomain.com either, so we are stuck with 12 profiles in WMT... what a pain.
Thanks for taking the time to read the above; it's quite complicated, sorry! Would love any thoughts...
-
I agree with Federico that you probably don't need to have every page be secure. Perhaps you should consider making the http://www. version your canonical default instead?
-
It is fine to have multiple versions of a site in different countries. Some of the biggest brands in the world do this. There are "right" and "wrong" ways to go about it, but if I had a ccTLD for the UK and lots of UK customers I wouldn't send them to my US site, regardless of whether I had a /uk/ folder or not.
-
Chris,
Is the content exactly the same on all domains? Anything changes between .com, .co.uk, etc?
If anything changes, you MUST use a canonical pointing to only ONE version (.com would be my guess) and rel="alternate" for the other domains. However, that doesn't make much sense if the content is the same. Why not just redirect all domains to .com (or whatever definitive version you choose)?
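If the four regional sites do stay separate, the canonical/alternate markup being discussed could be generated along these lines. This is a sketch only: it uses the common pattern of a self-referencing canonical per region plus hreflang alternates, and the language-region codes are assumptions for illustration, not from the thread.

```python
# Sketch: build <link> tags for the four regional sites in the thread.
# The en-* language-region codes are hypothetical examples.
ALTERNATES = {
    "en-us": "https://www.ourdomain.com",
    "en-gb": "https://www.ourdomain.co.uk",
    "en-au": "https://www.ourdomain.com.au",
    "en-nz": "https://www.ourdomain.co.nz",
}

def head_links(path, self_hreflang):
    """Self-referencing canonical plus hreflang alternates for one page."""
    tags = [f'<link rel="canonical" href="{ALTERNATES[self_hreflang]}{path}" />']
    for code, origin in ALTERNATES.items():
        tags.append(f'<link rel="alternate" hreflang="{code}" href="{origin}{path}" />')
    return "\n".join(tags)

print(head_links("/page", "en-gb"))
```

Each regional page canonicalizes to itself and cross-references the other regions, so Google can pick the right version per country instead of treating the four sites as duplicates.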
-
Hi Federico,
Thanks very much for your response. And yes, sorry, my initial question wasn't written very well!
ourdomain.com and www.ourdomain.com both 301 to https://www.ourdomain.com (which is also the canonical definitive version for the .com)
ourdomain.co.uk and www.ourdomain.co.uk both 301 to https://www.ourdomain.co.uk (which is also the canonical definitive version for the .co.uk)
and the same as above for .com.au domains, and .co.nz domains.
The content is the same across all domains.
The thing is that a lot of info appears in Webmaster Tools under the non-canonical versions of the sites and is not showing under the canonical profiles in WMT. That makes us feel like maybe we shouldn't delete those profiles?
Regarding the HTTP vs HTTPS issue... it sounds like you are saying we should consider only using HTTPS on pages that really need it; at the moment it is site-wide. That makes sense.
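The "HTTPS only where it's needed" idea can be expressed as a simple routing rule. This is a hypothetical sketch, not anything from the thread: the list of secure path prefixes is an invented example, and real sites would implement this in server config rather than application code.

```python
# Sketch: force HTTPS only on sensitive paths, plain HTTP elsewhere,
# via 301 redirects. SECURE_PREFIXES is a hypothetical example.
SECURE_PREFIXES = ("/checkout", "/account", "/login")

def preferred_scheme(path):
    return "https" if path.startswith(SECURE_PREFIXES) else "http"

def redirect_if_needed(scheme, host, path):
    """Return a 301 target when the request uses the wrong scheme, else None."""
    want = preferred_scheme(path)
    if scheme != want:
        return f"{want}://{host}{path}"
    return None

print(redirect_if_needed("http", "www.ourdomain.com", "/checkout/pay"))
# -> https://www.ourdomain.com/checkout/pay
```

Note that flip-flopping schemes like this creates its own redirect bookkeeping; many sites today simply go all-HTTPS instead, which avoids the split entirely.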
Thanks again and look forward to your thoughts as to whether there is any benefit or harm if we keep/remove the non canonical site profiles from WMT.
-
Hi Chris,
That was hard to follow. Let's start with the basics:
Do all those domains redirect to one single domain, or do they all serve the same content within whichever domain is accessed?
If you redirect all domains to a single domain, a 301 will do, and keeping the extra profiles in WMT is useless. If you serve the same content on all domains, you should use canonical tags pointing to a definitive version (which itself carries no canonical tag). You can still use WMT to track searches and links, but Google will serve one URL in its results: the one all other versions point to in their canonical tag.
Now, are you trying to serve all your content over SSL or standard HTTP? Serving both causes a duplicate content issue, and again, you should 301 to the version you prefer or use canonicals. There's no particular benefit or harm in using HTTPS for all your pages, but HTTPS can sometimes be slower, as the browser has to negotiate a TLS handshake when opening each connection (I would go with regular HTTP if you are not requesting input from your visitors or showing private information).
Am I on the right track so far?