Two "Twin" Domains Responding to Web Requests
-
I do not understand this point in my Campaign Set-Up.
They are the same site, as far as I understand. Can anyone help, please?
Quote from SEOMOZ
"We have detected that the domain www.neuronlearning.eu and the domain neuronlearning.eu both respond to web requests and do not redirect. Having two "twin" domains that both resolve forces them to battle for SERP positions, making your SEO efforts less effective. We suggest redirecting one, then entering the other here."
thanks
John
-
They're the same site, but not the same URL: notice that one of those URLs begins with www and the other does not. Web servers are often set up to answer on both hostnames, and the question of which one should be the version you use is called a canonicalization issue.
Most webmasters choose to use the www version and redirect the non-www version to it via settings on the web host. Here's some more reading on canonicalization.
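For example, on an Apache host this is commonly done with a couple of mod_rewrite lines in .htaccess. This is only a sketch using the domain from the question; it assumes Apache with mod_rewrite enabled, and many hosts expose the same thing as a control-panel setting instead:

```apache
RewriteEngine On
# 301 the non-www hostname to the www version, keeping the requested path
RewriteCond %{HTTP_HOST} ^neuronlearning\.eu$ [NC]
RewriteRule ^(.*)$ http://www.neuronlearning.eu/$1 [R=301,L]
```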
-
That is the problem... that they are the same site.
That means that Google can index both versions, and visitors and other sites can create backlinks to either one - which is not good, because it splits your backlinks between two URLs instead of consolidating them on one.
You need to set up a 301 redirect from one of the versions to the other, as well as set a preferred version in Google Webmaster Tools.
Hope this helps.
Mike
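When auditing existing links, a throwaway helper can show which hostname a given URL should map to once the redirect is in place. A minimal sketch using only the Python standard library (the function name is hypothetical, not part of any tool mentioned here):

```python
from urllib.parse import urlsplit, urlunsplit

def canonicalize(url, prefer_www=True):
    """Rewrite a URL onto the preferred hostname variant (www or non-www)."""
    parts = urlsplit(url)
    host = parts.netloc
    if prefer_www and not host.startswith("www."):
        host = "www." + host
    elif not prefer_www and host.startswith("www."):
        host = host[len("www."):]
    return urlunsplit((parts.scheme, host, parts.path, parts.query, parts.fragment))

print(canonicalize("http://neuronlearning.eu/courses"))
# -> http://www.neuronlearning.eu/courses
```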
Related Questions
-
Multisite domain
Good morning. I have a WordPress site with multisite enabled; the site currently has a domain authority of 8. When I publish a post it is indexed quite quickly, but when I publish a post in another language under the /es subdomain it takes 24 hours. Why? If the parent domain is the same, why does the subdomain post take longer to be indexed on Google? Thank you.
Technical SEO | alainscilly77
-
GSC: Change of Domain Not Processed, Despite Saying "Approved"?
Hi folks, I've just completed a straightforward olddomain -> newdomain migration. All the redirects were put in place on 7th Feb, and I submitted the change of address request the same day. All seemed fine, as can be seen in the attached. It's now 19th March and our pals at GSC are still saying that the domain migration is ongoing. I've never had this take so long before; 2-3 days tops. Our results are tanking, as I can't geo-target, and more features in GSC are out of action because it's 'locked' during the migration (I just get a screen as per the attached). Thoughts? Shall I risk withdrawing the request and starting anew? The old "turn it off and on again"? Thanks!
Technical SEO | tonyatfat
-
Google Webmaster Tools is saying "Sitemap contains urls which are blocked by robots.txt" after Https move...
Hi Everyone, I really don't see anything wrong with our robots.txt file after our HTTPS move that just happened, but Google says all URLs are blocked. The only change I know we need to make is switching the sitemap URL to https. Does anyone see anything wrong with this robots.txt file?

# This file is to prevent the crawling and indexing of certain parts of your
# site by web crawlers and spiders run by sites like Yahoo! and Google. By
# telling these "robots" where not to go on your site, you save bandwidth
# and server resources.
#
# This file will be ignored unless it is at the root of your host:
#   Used:    http://example.com/robots.txt
#   Ignored: http://example.com/site/robots.txt
#
# For more information about the robots.txt standard, see:
#   http://www.robotstxt.org/wc/robots.html
# For syntax checking, see:
#   http://www.sxw.org.uk/computing/robots/check.html

# Website Sitemap
Sitemap: http://www.bestpricenutrition.com/sitemap.xml

# Crawlers Setup
User-agent: *

# Allowable Index
Allow: /*?p=
Allow: /index.php/blog/
Allow: /catalog/seo_sitemap/category/

# Directories
Disallow: /404/
Disallow: /app/
Disallow: /cgi-bin/
Disallow: /downloader/
Disallow: /includes/
Disallow: /lib/
Disallow: /magento/
Disallow: /pkginfo/
Disallow: /report/
Disallow: /stats/
Disallow: /var/

# Paths (clean URLs)
Disallow: /index.php/
Disallow: /catalog/product_compare/
Disallow: /catalog/category/view/
Disallow: /catalog/product/view/
Disallow: /catalogsearch/
Disallow: /checkout/
Disallow: /control/
Disallow: /contacts/
Disallow: /customer/
Disallow: /customize/
Disallow: /newsletter/
Disallow: /poll/
Disallow: /review/
Disallow: /sendfriend/
Disallow: /tag/
Disallow: /wishlist/
Disallow: /aitmanufacturers/index/view/
Disallow: /blog/tag/
Disallow: /advancedreviews/abuse/reportajax/
Disallow: /advancedreviews/ajaxproduct/
Disallow: /advancedreviews/proscons/checkbyproscons/
Disallow: /catalog/product/gallery/
Disallow: /productquestions/index/ajaxform/

# Files
Disallow: /cron.php
Disallow: /cron.sh
Disallow: /error_log
Disallow: /install.php
Disallow: /LICENSE.html
Disallow: /LICENSE.txt
Disallow: /LICENSE_AFL.txt
Disallow: /STATUS.txt

# Paths (no clean URLs)
Disallow: /*.php$
Disallow: /*?SID=
Disallow: /*?cat=
Disallow: /*?price=
Disallow: /*?flavor=
Disallow: /*?dir=
Disallow: /*?mode=
Disallow: /*?list=
Disallow: /*?limit=5
Disallow: /*?limit=10
Disallow: /*?limit=15
Disallow: /*?limit=20
Disallow: /*?limit=25

Technical SEO | vetofunk
-
How do I "undo" or remove a Google Search Console change of address?
I have a client that set a change of address in Google Search Console, informing Google that their preferred domain was a subdomain, and now they want Google to also consider their base domain (without the change of address). How do I get the change of address in Google Search Console removed?
Technical SEO | KatherineWatierOng
-
Sitemaps and "noindex" pages
Experimenting a little bit to recover from Panda and added "noindex" tag for quite a few pages. Obviously now we need Google to re-crawl them ASAP and de-index. Should we leave these pages in sitemaps (with updated "lastmod") for that? Or just patiently wait? 🙂 What's the common/best way?
Technical SEO | LocalLocal
-
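For reference, the tag being discussed is the robots meta tag, placed in the head of each page to be de-indexed. This is a generic example, not the poster's actual markup:

```html
<!-- Asks crawlers to drop this page from the index
     while still following the links on it -->
<meta name="robots" content="noindex, follow">
```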
How to increase your Domain Authority
Hi Guys, can someone please provide some pointers on how best to increase your Domain Authority? Thanks, Gareth
Technical SEO | GAZ09
-
Two blogs on the same domain
I have had two blogs on the same domain for a while now, and it just occurred to me that no one else seems to do this and maybe it's even weird.
http://www.stadriemblems.com/blog/
http://www.stadriemblems.com/scouting/blog/
One is our main blog, and one is for a very concentrated niche of customers. What are your opinions on this? Everything from SEO to best practices, to overall unusualness?
Technical SEO | UnderRugSwept
-
Changing preferred domain
My company has an international website, and because of a technical issue visitors in one of our main countries cannot visit the "www" version of our site. Currently the www version is our preferred domain, and the non-www version redirects to it. To solve this problem, I was thinking of proposing the following and would greatly appreciate any feedback! (Note: if you answered my www vs. non-www question, thanks - this is a follow-up.)
1. Set the non-www site as the preferred version
2. Redirect from www to non-www
3. Contact our current links and ask them to change to the version without "www"
4. Change canonical URLs to the version without "www"
Technical SEO | theLotter
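Step 2 above can be sketched with a generic Apache mod_rewrite rule that 301s any www request to the bare hostname while preserving the path. This assumes an Apache host with mod_rewrite enabled; other servers or hosting control panels use different syntax:

```apache
RewriteEngine On
# Capture the hostname after "www." and redirect to it with the same path
RewriteCond %{HTTP_HOST} ^www\.(.+)$ [NC]
RewriteRule ^(.*)$ http://%1/$1 [R=301,L]
```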