Best Practice for www and non-www
-
What is the best way to handle all the different variations of a website in terms of www vs. non-www and http vs. https?
In Google Search Console, I have all 4 versions and I have selected a preference.
In Open Site Explorer I can see that the www and non-www versions are treated differently, with a separate group of links pointing to each version of the same page. This gives each version a different PA score.
eg.
- http://mydomain.com DA 25 PA 35
- http://www.mydomain.com DA 19 PA 21
Each version of the home page has its own set of links and scores.
Should I try and "consolidate" all the scores into one page?
Should I set up redirects to my preferred version of the website?
Thanks in advance
-
Thanks for your answer, that was helpful.
-
Thanks for taking the time to put together such a wonderfully detailed answer.
-
Hi Samantha,
What you have are "canonical issues." By leaving multiple versions of your domain open and crawlable to search engines, you "split" your ranking authority, and that split results in the issues you are seeing right now.
The best practice is to choose one version of your domain as the "true canonical" and then 301 redirect the others at the server level by means of mod_rewrite code. Doing so will consolidate your content, incoming links and PageRank and greatly increase the root domain authority of your site.
If your site hasn't instituted 301 redirect commands at the server level, search engines will treat all of these versions of your site's home page as "separate pages," and each will accumulate authority individually:
http://yoursite.com/
http://www.yoursite.com/
http://yoursite.com/index.php
http://www.yoursite.com/index.php
https://yoursite.com
https://www.yoursite.com/
You get the idea.
Most websites are run on one of three different types of servers...
- Unix-based servers running Apache.
- Unix-based servers running Nginx.
- Microsoft Windows-based servers running IIS or similar.
If you're unsure of what kind of server runs your site, ask your hosting company. Most sites are run on Unix-based servers with Apache. In that case, the server's behavior is configured using something called the .htaccess file.
If your site's root domain already contains a .htaccess file, you can simply scroll to the end of whatever code is already there and append your 301 redirect code at the bottom of the file, starting on a new line. While this may sound complicated, it's actually very simple to do. If you can upload files to and from your Web server, then chances are you'll have no trouble managing (i.e. altering, or creating and uploading) your .htaccess file(s).
But yes, bottom line: you ALWAYS want to consolidate URLs and present one uniform "preferred" URL format to search engines and users. In your case, that would appear to be the non-www domain, which has the higher Domain Authority.
You can learn all about redirection best practices at the Moz resource here: https://moz.com/learn/seo/redirection