"WWW" versus non "WWW" on domain
-
We plan on migrating our site to a new, shorter domain name. I like the idea of removing "www" to gain an additional 3 letters in the URL display.
Is there any disadvantage to doing so from a technical or SEO perspective?
Thanks,
Alan -
Hi Alan, I don't think the extra 3 characters would affect your SEO / rankings tremendously. Bots would still crawl and recognise the keywords in the URL. If you're after visibility on SERPs, you can optimise your meta title and description so that your brand & keywords are visible and clickable. As the meta title font is heaps bigger than the URL and description, that would be the first thing searchers see. Also, I doubt your URL will get truncated in SERPs over the extra 3 characters. Hope this helps!
-
Hi Nikki:
By not using "www", my objective would be to save 3 characters (valuable real estate) and improve the visibility of a longer URL. Is there any significant upside to this? I don't see this done commonly, so I'm not certain there is.
Any thoughts? Thanks! Alan
-
Hello there, according to this guide there is no SEO benefit to choosing one over the other - it comes down to preference. Whichever you choose, let Google know your preference through Google Search Console. As for technical differences, the www prefix acts as a hostname, which helps with DNS flexibility, restricting cookies to specific subdomains, and so on; according to the guide, a bare (non-www) domain doesn't have the same technical advantages.
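To make the guide's "DNS flexibility" point concrete: the DNS spec disallows a CNAME at the zone apex (the bare domain) alongside the other records required there, so the apex generally needs a fixed A record, while a www hostname can be a CNAME that follows a CDN or load balancer. A hypothetical zone-file sketch (names and IPs invented):

```text
; Hypothetical zone-file sketch, not a real configuration.
; The apex cannot be a CNAME, so it needs a fixed IP address;
; the www hostname is free to alias a CDN or load balancer.
example.com.      IN  A      203.0.113.10      ; apex: fixed A record required
www.example.com.  IN  CNAME  cdn.example.net.  ; www: can follow the CDN
```

The cookie point is similar: a cookie set only for www.example.com stays off other subdomains, while a cookie set for the bare domain is sent to every subdomain under it.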
Related Questions
-
Pages excluded from Google's index due to "different canonicalization than user"
Hi Moz community,
A few weeks ago we noticed a complete collapse in traffic on some of our pages (7 out of around 150 blog posts in question). We were able to confirm that those pages disappeared from Google's index for good at the end of January '18; they were still findable via all other major search engines. Using Google's Search Console (previously Webmaster Tools), we found the unindexed URLs in the list of pages excluded because "Google chose different canonical than user". Content-wise, the page that Google falsely determines as canonical has little to no similarity to the pages it thereby excludes from the index. [Screenshot: false canonicalization]
About our setup: We are a SPA, delivering our pages pre-rendered, each with an (empty) rel=canonical tag in the head that's then dynamically filled with a self-referential link to the page's own URL via JavaScript. This seemed, and seems, to work fine for 99% of our pages but happens to fail for one of our top-performing ones (which is why the hassle 😉).
What we tried so far: going through every step of this handy guide: https://moz.com/blog/panic-stations-how-to-handle-an-important-page-disappearing-from-google-case-study --> inconclusive (healthy pages, no penalties, etc.); manually requesting re-indexation via Search Console --> immediately brought back some pages, while others briefly re-appeared in the index and then got kicked out again for the aforementioned reason; checking other search engines --> the pages are only gone from Google and can still be found via Bing, DuckDuckGo, and other search engines.
Questions for you: How does Googlebot handle JavaScript, and does anybody know if its setup changed in that respect around the end of January? Can you think of any other reason that would cause the behaviour described above? Eternally thankful for any help!
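Not the poster's actual code, but a sketch of the setup as described (an empty canonical tag filled in client-side), which helps frame the question: any crawler that indexes the pre-rendered HTML before executing the script would see an empty canonical. The element id and URL handling here are invented for illustration:

```html
<!-- Hypothetical sketch of the described setup, not the poster's code:
     an empty canonical tag in the pre-rendered head, filled client-side. -->
<link id="canonical" rel="canonical" href="">
<script>
  // Self-referential canonical, set once the SPA knows its route.
  // A crawler reading the HTML before this script runs sees href="".
  document.getElementById('canonical')
    .setAttribute('href', window.location.origin + window.location.pathname);
</script>
```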
Intermediate & Advanced SEO | | SvenRi1 -
Community Discussion - What's the ROI of "pruning" content from your ecommerce site?
Happy Friday, everyone! 🙂 This week's Community Discussion comes from Monday's blog post by Everett Sizemore. Everett suggests that pruning underperforming product pages and other content from your ecommerce site can deliver the highest ROI available to a larger site in 2016. Do you agree or disagree? While the "pruning" tactic here is suggested for ecommerce and for larger sites, do you think you could implement a similar protocol on your own site with positive results? What would you change? What would you test?
Intermediate & Advanced SEO | | MattRoney2 -
Best Way To Go About Fixing "HTML Improvements"
So I have a site where I was creating dynamic pages for a while, and some of them accidentally ended up with lots of similar meta tags and titles. I then changed up my site but left those duplicate tags in place for a while, not knowing what had happened. Recently I began my SEO campaign once again and noticed these errors, so I did the following: removed the pages; removed the directories that contained these dynamic pages with the removal tool in Google Webmaster Tools; and blocked Google from crawling those pages with robots.txt. I have verified that the robots.txt works and the pages are no longer in Google search... however they still show up in the HTML Improvements section after a week (it has updated a few times). So I decided to remove the robots.txt block and add 301 redirects instead. Does anyone have experience with this, and am I going about it the right way? Any additional info is greatly appreciated, thanks.
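One detail worth noting about the sequence above: while a page is blocked in robots.txt, Googlebot cannot fetch it, so it never sees a 301 placed on that URL - removing the block before adding the redirects, as described, is the right order. A hypothetical .htaccess sketch for redirecting a whole family of old dynamic URLs at once (the page.php pattern and target path are invented examples, not the poster's URLs):

```apacheconf
# Hypothetical sketch: permanently redirect every old dynamic variant
# (e.g. /page.php?id=123) to one canonical replacement page.
RewriteEngine On
RewriteCond %{QUERY_STRING} ^id=\d+$
# The trailing "?" drops the old query string from the redirect target.
RewriteRule ^page\.php$ /canonical-page/? [R=301,L]
```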
Intermediate & Advanced SEO | | tarafaraz0 -
With or without the "www." ?
Is there any benefit whatsoever to having the www. in the URL?
Intermediate & Advanced SEO | | JordanBrown0 -
Do I need to set Preferred domain when the non-www redirects to www version?
Hi mozzers, I'm not sure whether I should set up a preferred domain when the non-www version already redirects to the www version. If yes, why? Thanks!
Intermediate & Advanced SEO | | Ideas-Money-Art0 -
Avoiding 301 on purpose; Landing homepage linking to another domain with "Click here to go" and 5 sec meta refresh
Hello, Some users, when searching for our site with the "ourbrand" keyword, ignore the first result (we'll call it ourbrand.de here - not the real name) and look for ourbrand.com instead. Even though we also have that domain registered (and it has high ranking power), we are currently 301-redirecting the dot-com to the dot-de. What we want to do is index the homepage of the dot-com, that is http://www.ourbrand.com, as a secondary result, while 301-redirecting any other internal URL of the dot-com to the dot-de. Yes, we will lose link juice for the main domain, but at least we will not lose visits from brand traffic (which is our main traffic). So the question is: would Google index ourbrand.com if we show just a landing page with our logo, a "Click here to go to ourbrand.de" link pointing to http://www.ourbrand.de, and a meta refresh of 6 seconds to that URL? Additionally, a cookie would be set for first-time visitors, so on their next visit they would be automatically redirected. PS: The 6 seconds is to avoid search engines treating it like a 301, as they do with short meta refreshes (not sure what the minimum time is to avoid that). Any other suggestions on how to deal with this problem are welcome.
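A hedged sketch of what the landing page described above might look like - the domain names follow the placeholders used in the question, and the logo path is invented:

```html
<!-- Hypothetical sketch of the described landing page: a plain link
     plus a 6-second meta refresh to the .de domain. -->
<!DOCTYPE html>
<html>
<head>
  <meta http-equiv="refresh" content="6; url=http://www.ourbrand.de/">
  <title>ourbrand</title>
</head>
<body>
  <img src="/logo.png" alt="ourbrand logo">
  <p><a href="http://www.ourbrand.de/">Click here to go to ourbrand.de</a></p>
</body>
</html>
```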
Intermediate & Advanced SEO | | Zillo0 -
Use of rel="alternate" hreflang="x"
Google states that use of rel="alternate" hreflang="x" is recommended when: (1) you translate only the template of your page, such as the navigation and footer, and keep the main content in a single language - common on pages featuring user-generated content, like a forum post; (2) your pages have broadly similar content within a single language, but with small regional variations - for example, English-language content targeted at readers in the US, GB, and Ireland; (3) your site content is fully translated - for example, you have both German and English versions of each page. Does this mean that if I write new content in a different language for a website hosted on my sub-domain, I should not use this tag? Regards, Shailendra Sial
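For the second scenario above (same language, regional variants), the head annotations might look like the following - the URLs and paths are invented examples; each variant conventionally carries the full set, including a self-referencing tag, and x-default covers unmatched visitors:

```html
<!-- Hypothetical sketch: hreflang annotations for regional English
     variants. example.com and the paths are made-up placeholders. -->
<link rel="alternate" hreflang="en-us" href="https://example.com/us/page/">
<link rel="alternate" hreflang="en-gb" href="https://example.com/gb/page/">
<link rel="alternate" hreflang="en-ie" href="https://example.com/ie/page/">
<link rel="alternate" hreflang="x-default" href="https://example.com/page/">
```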
Intermediate & Advanced SEO | | IM_Learner0 -
How to keep the link juice in E-commerce to an "out of stock" products URL?
I am running an e-commerce business selling fashion jewelry. We usually have 500 products on offer, and for some of them we have only one in stock. Many of our back links point directly to a specific product, so when a product sells out and is no longer in stock, the URL becomes inactive and we lose the link juice. What is the best practice or tool for 301-redirecting many URLs at the same time, without changing one URL at a time? Do you have any other suggestions on how to manage an out-of-stock product while still maintaining the link juice from the back links? Thanks!
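Not the poster's setup, but one hedged way to handle the "many URLs at once" part of the question: generate the redirect rules programmatically from an old-to-new mapping instead of hand-editing each URL. A minimal Python sketch, assuming Apache RewriteRule syntax; the product and category paths are invented examples:

```python
# Hypothetical sketch: build Apache-style 301 RewriteRule lines from a
# simple {old_path: new_path} mapping, so a batch of out-of-stock product
# URLs can be redirected in one pass. All paths are made-up examples.

def make_redirect_rules(redirects):
    """Turn {old_path: new_path} into permanent-redirect (301) rules."""
    rules = []
    for old, new in sorted(redirects.items()):
        # ^...$ anchors the old path exactly; [R=301,L] makes it permanent.
        rules.append(f"RewriteRule ^{old.lstrip('/')}$ {new} [R=301,L]")
    return rules

redirects = {
    "/products/silver-ring-123": "/collections/rings",
    "/products/gold-necklace-456": "/collections/necklaces",
}

for rule in make_redirect_rules(redirects):
    print(rule)
```

The generated lines could be pasted into an .htaccess file, or the same mapping fed into whatever redirect tool the shop platform provides; pointing each sold-out product at its closest category page is a common way to keep the passed link equity relevant.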
Intermediate & Advanced SEO | | ikomorin0