Solved: Should I consolidate my "www" and "non-www" pages?
-
My page rank for www and non-www is the same, but for one keyword in particular, my www version performs SO much better.
I want to consolidate to one or the other. My question is whether all of these issues would ultimately resolve to my chosen consolidated domain (i.e. www or non-www) regardless of which one I choose, or whether it would be smart to choose the one where I am already ranking high for this significant keyword phrase.
Thank you in advance for your help.
-
It may be that one version (www or non-www) has more historical links. You say your PageRank for both is the same, but how are you checking that? Google's public PageRank has not been updated in a decade or so.
Either way, I'd generally say that if you pick one version and stick to it (redirect the other, so that every non-www URL points to its www equivalent, for example), you should maintain all rankings. There is a theoretical advantage to picking the version with more links, but in my experience, this type of migration tends to be smooth in practice.
-
# Require the www
Options +FollowSymLinks
RewriteEngine On
RewriteBase /
RewriteCond %{HTTP_HOST} !^www\.askapache\.com$ [NC]
RewriteRule ^(.*)$ https://www.askapache.com/$1 [R=301,L]
-
Yes. I would recommend picking the version (either www or non-www) that has the historical data showing it performs better than the other. Check the list of indexed pages for each version to compare. Ideally, both the www and non-www versions of the website will be indexed in Google, which will help you decide which version makes the most sense to consolidate to.
Once you identify the preferred version (the one with more traffic, links, authority, etc.), set 301 redirects from the non-preferred URLs to the preferred version of each URL. This should be done site-wide so that all URLs are either www or non-www, not a mix of both. In my experience, between 90% and 99% of a site's SEO authority is preserved when you set a permanent 301 redirect.
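For illustration, a minimal .htaccess sketch for the opposite choice, consolidating on the non-www version, could look like the following (example.com is a placeholder for your own domain; this assumes Apache with mod_rewrite enabled):
# Hypothetical sketch: redirect every www URL to its non-www equivalent
RewriteEngine On
RewriteCond %{HTTP_HOST} ^www\.example\.com$ [NC]
RewriteRule ^(.*)$ https://example.com/$1 [R=301,L]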
-
@meditationbunny Sorry for the slow reply, but yes, I'd expect Page Authority to increase slightly if the "other" version had any value to it.
For Page Optimization, yes. For example, for my own site I see:
http://tcapper.co.uk redirects to https://www.tcapper.co.uk/. This on-page analysis is for https://www.tcapper.co.uk/.
-
@tom-capper
Thank you. Yes, I should be more clear: I am calling it page rank when I am actually referring to Moz's Domain Authority and Moz's keyword rankings. Still, I believe you answered my question. Under Page Optimization, I could see what appeared to be duplicate listings of my pages with different SERP rankings. It was confusing until I realized that one was the www and the other was the non-www version. I have since added code to my .htaccess file that sends everything to www. Can I expect the Page Optimization section to now show only the www versions of the pages? Also, can I expect Page Authority to increase because it is no longer a mish-mash and everything heads to the same domain and the same pages (i.e. the www version)?
It may be that one version ("www" or "non-www") has more historical links. You say your PageRank for both is the same, but how are you checking that? Google's public PageRank has not been updated in a decade or so.
Either way, I'd generally say that if you pick one version and stick to it (redirect the other, e.g. so every non-www. URL points to its www. equivalent), you should maintain all rankings. There is a theoretical advantage to picking the version with more links, but in my experience, in practice, this type of migration tends to be smooth.
-
This post is deleted! -
This post is deleted! -
This post is deleted!
Related Questions
-
Customer Reviews on Product Page / Pagination / Crawl 3 review pages only
Hi experts, I present customer feedback, basically reviews, on my website for the products that are sold, and with this comes the ability to read the reviews, with pagination to display them. I want users to be able to flick through and read the reviews to satisfy whatever curiosity they have. My only concern is that with each click of the pagination, the page containing the reviews will present roughly the same content; the only thing that changes is the title tag and the H1, which display the page number. I'm thinking this could be duplication, but I have yet to be notified by Google in my Search Console. Should I block crawlers from crawling beyond page 3 of reviews? Thanks
Technical SEO | Train4Academy.co.uk
-
My WP website got attacked by malware & now site:www.example.ca shows about 43,000 indexed pages in Google.
Hi all, my WordPress website got attacked by malware last week, and it has badly affected my indexed pages in Google: site:example.ca typically shows about 130 indexed pages, but now it shows about 43,000. I had my server company's tech support scan my site and clean the malware yesterday, but Google still shows the same number of indexed pages.
Has anybody ever experienced such a situation, and how did you fix it? I'm looking for help. Thanks.
FILE HIT LIST:
{YARA}Spam_PHP_WPVCD_ContentInjection : /home/example/public_html/wp-includes/wp-tmp.php
{YARA}Backdoor_PHP_WPVCD_Deployer : /home/example/public_html/wp-includes/wp-vcd.php
{YARA}Backdoor_PHP_WPVCD_Deployer : /home/example/public_html/wp-content/themes/oceanwp.zip
{YARA}webshell_webshell_cnseay02_1 : /home/example2/public_html/content.php
{YARA}eval_post : /home/example2/public_html/wp-includes/63292236.php
{YARA}webshell_webshell_cnseay02_1 : /home/example3/public_html/content.php
{YARA}eval_post : /home/example4/public_html/wp-admin/28855846.php
{HEX}php.generic.malware.442 : /home/example5/public_html/wp-22.php
{HEX}php.generic.cav7.421 : /home/example5/public_html/SEUN.php
{HEX}php.generic.malware.442 : /home/example5/public_html/Webhook.php
Technical SEO | Chophel
-
Quick Fix to "Duplicate page without canonical tag"?
When we pull up Google Search Console, in the Index Coverage section, under the Excluded category, there is a sub-category called 'Duplicate page without canonical tag'. The majority of the 665 pages in that section are from a test environment. If we included in the robots.txt file a wildcard covering every URL that starts with the particular root URL ("www.domain.com/host/"), could we eliminate the majority of these errors? That solution is not one of the 5 or 6 recommended solutions that the Google Search Console Help section suggests. It seems like a simple, effective solution. Are we missing something?
Technical SEO | CREW-MARKETING
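A minimal robots.txt sketch of the idea described in the question above, assuming the test environment lives under the /host/ path mentioned there; note that Disallow rules are prefix matches, so no explicit wildcard syntax is needed:
User-agent: *
# Hypothetical path from the question; blocks crawling of everything under /host/
Disallow: /host/
Bear in mind that robots.txt blocks crawling, not indexing, so URLs that are already indexed may linger in the report for a while.
-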
Are image pages considered 'thin' content pages?
I am currently doing a site audit. The total number of pages on the website is around 400; 187 of them are image pages that come up with a 'zero' word count in the Screaming Frog report. I need to know whether they will be considered 'thin' content by search engines, and whether I should include them as an issue. An answer would be much appreciated.
Technical SEO | MTalhaImtiaz
-
www vs. non-www: which is better?
Is it better to have all your pages point to the www version or the non-www version?
Technical SEO | bronxpad
-
301 for "index.php" in Web.config?
Hi there, I'm trying to create a 301 redirect for the file "index.php", but I keep getting a "fail to redirect" message in Firefox whenever I insert it into the Web.config file: <location path="index.php"></location> Is there any way around this? Thanks for any help. According to Open Site Explorer, there are about 500 links to my index file, but it currently has only a 302 status, so it will not be passing link juice.
Technical SEO | tdsnet
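A hedged Web.config sketch for this kind of single-file 301, assuming IIS 7 or later and its built-in httpRedirect element (the destination URL is a placeholder). Note that the empty <location> element quoted in the question configures no redirect at all, and the destination must not itself be served by index.php, or the redirect will loop, which would explain the error the browser reports:
<!-- Hypothetical sketch: 301 "index.php" to a placeholder destination -->
<location path="index.php">
  <system.webServer>
    <httpRedirect enabled="true" destination="https://www.example.com/" httpResponseCode="Permanent" />
  </system.webServer>
</location>
-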
What is best practice for redirecting "secondary" domain names?
For sites with multiple top-level domains that have been secured for a business or organization, I'm curious what is considered best practice for setting up 301 redirects for secondary domains. Is it best to do the 301 redirects at the registrar level or the hosting level, so that .net, .biz, and other secondary domains funnel visitors to the correct primary domain name? I'm looking for the "best practice" answer and want to avoid duplicate content problems or penalties from the search engines. I'm not trying to game the system with dozens of domain names, simply to handle the handful of domains that are important to the client. I've seen some registrars recommend hosting secondary domains and doing redirects from the hosting level (and they use meta refresh for "domain forwarding," which I want to avoid). It seems rather wasteful to set up hosting for a secondary domain and then 301 each URL.
Technical SEO | Scott-Thomas
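As one hedged illustration of a hosting-level redirect, an Apache virtual host for a secondary domain can 301 every request to the primary domain (example.net and example.com are placeholders):
# Hypothetical vhost: send all traffic for a secondary domain to the primary one
<VirtualHost *:80>
    ServerName example.net
    ServerAlias www.example.net
    Redirect permanent / https://www.example.com/
</VirtualHost>
Because mod_alias appends the original path to the destination, /page on the secondary domain lands on /page on the primary one, which preserves deep links.
-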
Is "last modified" time in XML Sitemaps important?
My tech lead is concerned that his use of a script to generate XML sitemaps for some client sites may be causing negative issues for those sites. His concern centers on the fact that the script generates a sitemap indicating that every URL in the site was last modified at the exact same date and time. I have never heard anything to indicate that this might be a problem, but I do know that the sitemap tools I use for other client sites can optionally take the modification time from the server response. What is the best way to generate the sitemap: last mod from the actual time modified, or all set to one date and time?
Technical SEO | ShaMenz
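For reference, a minimal sitemap entry with a per-URL lastmod (the URL and date are placeholders); the usual advice is to set lastmod from the actual modification time, since a single shared timestamp across every URL gives search engines no useful signal and may simply be ignored:
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Hypothetical entry: lastmod reflects the page's real last-modified date -->
  <url>
    <loc>https://www.example.com/page/</loc>
    <lastmod>2024-01-15</lastmod>
  </url>
</urlset>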