"non-WWW" vs "WWW" in Google SERPS and Lost Back Link Connection
-
A Screaming Frog report indicates that Google is indexing a client's site under both www and non-www URLs. To me this means Google is treating the two versions as different pages even though the content is identical.
The client has not set a preferred domain in Google Webmaster Tools (GWMT). Google says to 301 redirect from the non-preferred domain to the preferred version, but I believe there is a way to do this in .htaccess, which seems like an easier solution than canonical tags.
https://support.google.com/webmasters/answer/44231?hl=en
GWMT also shows that over the past few months this client has lost more than half of their backlinks. (There are no penalties, and the client swears they haven't done anything that would get them blacklisted.)
I'm curious as to whether Google figured out that the entire site was in their index under both "www" and "non-www" and therefore discounted half of the links.
Has anyone seen evidence of Google discounting links (both external and internal) due to duplicate content?
Thanks for your feedback.
Rosemary
-
I don't think so, as Google will update its index. If it's not too much work you can set it up, but I'm not sure it's worthwhile.
-
Thank you for your reply.
Do we still need canonicals if we set the preferred domain in GWMT and update .htaccess?
-
Yes, that's the best thing to do, and then also set the www preference in GWT.
-
You can edit your .htaccess file and do something like this:
RewriteEngine on
# Redirect any host that isn't www.yourdomain.com to the www version
RewriteCond %{HTTP_HOST} !^www\.yourdomain\.com$ [NC]
RewriteRule (.*) http://www.yourdomain.com/$1 [R=301,L]
# Collapse /index.php onto the root so the homepage isn't duplicated either
RewriteRule ^index\.php$ / [R=301,L]

This will make sure that no URL loads without www; if one does, it will be 301 redirected to the www version. You also want to make sure your homepage / index page is not duplicated, so have that one 301 redirect to the root as well.
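If the client ends up preferring the non-www version instead, the same approach works in reverse. A minimal sketch, again with a placeholder domain:

RewriteEngine on
# Send any www request to the bare (non-www) host, keeping the path
RewriteCond %{HTTP_HOST} ^www\.yourdomain\.com$ [NC]
RewriteRule (.*) http://yourdomain.com/$1 [R=301,L]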
-
I'm going through the same kind of situation these days. The client I'm providing SEO services for was ranking on the 4th page of the SERPs. The website had 8k duplicate pages because of the www vs non-www issue. I just added the canonical tag pointing to the preferred domain and am waiting for Google to deindex the non-preferred version.
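For anyone following along, the tag itself is a single line in the head of every duplicate page, pointing at the preferred URL. A minimal sketch with a placeholder domain:

<head>
  <!-- rel=canonical pointing the www/non-www duplicates at the preferred URL -->
  <link rel="canonical" href="http://www.yourdomain.com/sample-page/" />
</head>

Keep in mind Google treats the canonical as a hint rather than a directive, so the 301 / preferred-domain route discussed above is still the stronger fix.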
But I've also noticed that just after building 14 backlinks on authority sites, the site is ranking on the first page for 5 different keywords.
I don't see any drop in backlinks in WMT. I'm not sure, but I don't think Google cuts down backlinks because of duplicate pages. Wait for someone more senior to reply to your question.
Related Questions
-
Can I use a 301 redirect to pass 'back link' juice to a different domain?
Hi, I have a backlink from a high DA/PA government website pointing to www.domainA.com, which I own and can set up 301 redirects on if necessary. However, www.domainA.com is not in use and has no active website (but has hosting available which can 301 redirect). www.domainA.com is also contextually irrelevant to the backlink. I want the government website link to go to www.domainB.com, which is both the relevant site and the one that should be benefiting from the SEO juice of the backlink. So far I have had no luck getting the government website's administrators to change the URL on the link to point to www.domainB.com. Q1: If I use a 301 redirect on www.domainA.com to redirect to www.domainB.com, will most of the backlink's SEO juice still be passed on to www.domainB.com? Q2: If the answer to the above is yes, would there be benefit to taking this a step further and redirecting www.domainA.com to a deeper directory on www.domainB.com that is even more relevant? i.e. redirect www.domainA.com to www.domainB.com/categoryB, passing the link juice deeper.
Technical SEO | DGAU
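Just to sketch the mechanics (not a verdict on how much juice carries over): a whole-domain 301 in domainA's .htaccess, using the placeholder names from the question, would look roughly like this:

RewriteEngine on
# Send every request on domainA (www or bare) to the same path on domainB
RewriteCond %{HTTP_HOST} ^(www\.)?domainA\.com$ [NC]
RewriteRule ^(.*)$ http://www.domainB.com/$1 [R=301,L]

For the deeper-directory idea in Q2, the substitution would simply point somewhere under www.domainB.com/categoryB instead.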
-
Abnormally high internal link count reported in Google Search Console not matching Moz reports
If I'm looking at our internal link count and structure in Google Search Console, some pages are listed as having over a thousand internal links within our site. I've read that having too many internal links on a page devalues that page's PageRank, because the value is divided amongst the pages it links out to. Likewise, I've heard having too many internal links is just bad for SEO in general. Is that true? The problem I'm facing is determining how Google is "discovering" these internal links. If I look at one single page reported with, say, 1,350 links and I'm just looking at the code, it may only have 80 or 90 actual links. Moz confirms this as well. So why would Google Search Console report differently? Should I be concerned about this?
Technical SEO | Closetstogo
-
Google+ Contributor to: Link To Main Domain or Content Page?
Which is the best practice for the link to claim authorship for a guest post? I have tried both the main domain URL and the URL of the page where the post is in the "contributor to" section of my Google+ profile, and both show my picture when testing in the Structured Data Testing Tool. Which is best to use? Thanks in advance.
Technical SEO | WSIDW
-
Best use of robots.txt for "garbage" links from Joomla!
I recently started out on SEOmoz and am trying to do some cleanup according to the campaign report I received. One of my biggest gripes is the "Duplicate Page Content" issue: right now I'm seeing over 200 pages flagged with duplicate page content. This is triggered because SEOmoz has picked up auto-generated links from my site. My site has a "send to friend" feature, and every time someone wants to send an article or a product to a friend via email, a pop-up appears. It seems these pop-up pages have been picked up by the SEOmoz spider; however, these are pages I would never want indexed in Google, so I just want to get rid of them. Now to my question: I guess the best solution is to make a general rule via robots.txt so that these pages are not indexed or considered by Google at all. But how do I do this? What should my syntax be? A lot of the links look like this, but with different ID numbers depending on the product being sent: http://mywebshop.dk/index.php?option=com_redshop&view=send_friend&pid=39&tmpl=component&Itemid=167 I guess I need a rule that catches the following and makes Google ignore links that contain it: view=send_friend
Technical SEO | teleman
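A hedged sketch for the syntax being asked about: Google's robots.txt parsing supports the * wildcard, so a rule keyed on the send_friend parameter from the example URL could look like this:

User-agent: *
# Keep crawlers out of the auto-generated send-to-friend pop-up URLs
Disallow: /*view=send_friend

Bear in mind that robots.txt only blocks crawling; URLs that are already indexed may also need a noindex or canonical to drop out.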
-
Problem www/non-www domain rewrite
Hello, I made a site for a client about a year ago. The rankings are quite okay, but I think the home page suffers from a penalty. I found out via OSE that Page Authority, strangely, is higher on the 301-ed page:
www.myanmar-rundreisen.de - PA 32
myanmar-rundreisen.de/ - PA 33
I don't understand what is happening here, as I am using the usual .htaccess 301 redirect (domain.com -> www.domain.com):
RewriteCond %{HTTP_HOST} .
RewriteCond %{HTTP_HOST} !^www.myanmar-rundreisen.de [NC]
RewriteRule (.*) http://www.myanmar-rundreisen.de/$1 [L,R=301]
which is working fine with other domains ... I tried also (last line):
RewriteRule (.*) http://www.myanmar-rundreisen.de/$1 [L,R=301]
So thanks to anyone who can share an idea on that ...
Guenter
Technical SEO | hgw57
-
Duplicate content issue: index.html vs non-index.html
Hi, I have an issue. In my client's profile I found that the "index.html" URLs are mostly more authoritative than the non-"index.html" ones, and that the www version is more authoritative than the non-www one. The problem is that I also find the opposite situation, where a non-"index.html" URL is more authoritative than the "index.html" one, or the non-www more authoritative than the www. My logic would tell me to still redirect the non-"index.html" to "index.html". Am I right? And in the cases where I find the opposite happening, does it matter if I still redirect the non-"index.html" to "index.html"? The same question applies to the www vs non-www versions. Thank you
Technical SEO | Ideas-Money-Art
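Whichever direction ends up being canonical, the mechanics are the same. A hedged .htaccess sketch for the common choice of collapsing index.html onto the folder root (adjust if the preference goes the other way):

RewriteEngine on
# Only redirect when the original request itself asked for index.html,
# so Apache's internal DirectoryIndex lookup for "/" doesn't loop back
RewriteCond %{THE_REQUEST} /index\.html [NC]
RewriteRule ^(.*/)?index\.html$ /$1 [R=301,L]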
Does google "see through" php/asp redirects?
A lot of the time I see companies employing a technique like this: <a target="_blank" href="/external/wcpages/referral.aspx?URL=http%253a%252f%252fwww.xxxx.ca&ReferralType=W&ProfileID=22&ListingID=96&CategoryID=219">xxxxx</a> Or similarly with PHP, in an attempt to log all the clicks that exit their site from certain locations. When Googlebot comes along and crawls this page, does it still understand that this page links to www.xxxx.ca?
Technical SEO | adriandg
-
I have a ton of "duplicated content", "duplicated titles" in my website, solutions?
Hi, and thanks in advance. I have a Jomsocial site with 1000 users. It is highly customized, and as a result of the customization some of the pages have 5 or more different URLs pointing to the same content. Google has indexed 16,000 links already and the crawl report shows a lot of duplicated content. These links are important for some of the functionality, are dynamically created, and will continue growing. My developers offered to create rules in the robots file so a big part of these links don't get indexed, but a Google Webmaster Tools post says the following: "Google no longer recommends blocking crawler access to duplicate content on your website, whether with a robots.txt file or other methods. If search engines can't crawl pages with duplicate content, they can't automatically detect that these URLs point to the same content and will therefore effectively have to treat them as separate, unique pages. A better solution is to allow search engines to crawl these URLs, but mark them as duplicates by using the rel="canonical" link element, the URL parameter handling tool, or 301 redirects. In cases where duplicate content leads to us crawling too much of your website, you can also adjust the crawl rate setting in Webmaster Tools." Here is an example of the links:
http://anxietysocialnet.com/profile/edit-profile/salocharly
http://anxietysocialnet.com/salocharly/profile
http://anxietysocialnet.com/profile/preferences/salocharly
http://anxietysocialnet.com/profile/salocharly
http://anxietysocialnet.com/profile/privacy/salocharly
http://anxietysocialnet.com/profile/edit-details/salocharly
http://anxietysocialnet.com/profile/change-profile-picture/salocharly
So the question is: is this really that bad? What are my options? Is it really a good solution to set rules in robots so big chunks of the site don't get indexed? Is there any other way I can resolve this? Thanks again!
Salo
Technical SEO | Salocharly