"non-WWW" vs "WWW" in Google SERPS and Lost Back Link Connection
-
A Screaming Frog report indicates that Google is indexing a client's site under both www and non-www URLs. To me this means Google is treating the two versions as different pages even though the content is identical.
The client has not set up a preferred domain in GWMT. Google says to 301 redirect the non-preferred domain to the preferred version, but I believe there is a way to do this in .htaccess, which would be an easier solution than canonicals.
https://support.google.com/webmasters/answer/44231?hl=en
GWMT also shows that over the past few months this client has lost more than half of their backlinks. (But there are no penalties, and the client swears they haven't done anything to get blacklisted in this regard.)
I'm curious as to whether Google figured out that the entire site was in their index under both "www" and "non-www" and therefore discounted half of the links.
Has anyone seen evidence of Google discounting links (both external and internal) due to duplicate content?
Thanks for your feedback.
Rosemary
-
I don't think so, as Google will update its index. If it's not too much work you can do it, but I'm not sure it's worthwhile.
-
Thank you for your reply.
Do we still need canonicals if we set the preferred domain in GWMT and update htaccess?
-
Yes, that's the best thing you can do, and then set the www preference in GWT as well.
-
You can edit your .htaccess file and do something like this:
RewriteEngine on
# Any host that is not www.yourdomain.com gets 301 redirected to the www version
RewriteCond %{HTTP_HOST} !^www\.yourdomain\.com$ [NC]
RewriteRule (.*) http://www.yourdomain.com/$1 [R=301,L]
# Send the duplicate index page back to the root as well
RewriteRule ^index\.php$ / [R=301,L]

This will make sure that no URL loads without www; if one does, it will be 301 redirected to the www version. You also want to make sure that your homepage/index page is not duplicated, so have that one 301 redirect to the root as well.
-
I'm going through the same kind of situation these days. The client I'm providing SEO services to was ranking on the 4th page of the SERPs. The website had 8k duplicate pages because of the www/non-www issue. I just added the canonical tag pointing to the preferred domain and am waiting for Google to deindex the non-preferred version.
But I've also noticed that just after creating 14 backlinks on authority sites, the site is ranking on the first page for 5 different keywords.
I don't see any drop in backlinks in WMT. I'm not sure, but I don't think Google cuts backlinks because of duplicate pages. Wait for someone more senior to reply to your question.
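(For anyone not familiar with the tag mentioned above: a canonical tag is just a link element in the head of the duplicate pages pointing at the version you want indexed. A minimal sketch, with example.com standing in for the real preferred domain:)

<!-- placed in the <head> of both the www and non-www copies of the page -->
<!-- example.com is only a placeholder for the client's preferred domain -->
<link rel="canonical" href="http://www.example.com/some-page/" />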
Related Questions
-
Can I use a 301 redirect to pass 'back link' juice to a different domain?
Hi, I have a backlink from a high DA/PA Government Website pointing to www.domainA.com, which I own and can set up 301 redirects on if necessary. However, www.domainA.com is not used and has no active website (but has hosting available which can 301 redirect). www.domainA.com is also contextually irrelevant to the backlink. I want the Government Website link to go to www.domainB.com, which is both the relevant site and the one that should be benefiting from the SEO juice from the backlink. So far I have had no luck getting the Government Website's administrators to change the URL on the link to point to www.domainB.com. Q1: If I use a 301 redirect on www.domainA.com to redirect to www.domainB.com, will most of the backlink's SEO juice still be passed on to www.domainB.com? Q2: If the answer to the above is yes, would there be benefit to taking this a step further and redirecting www.domainA.com to a deeper directory on www.domainB.com which is even more relevant, i.e. redirecting www.domainA.com to www.domainB.com/categoryB and passing the link juice deeper?
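(For what it's worth, the redirect described in Q1/Q2 is only a couple of lines in domainA's .htaccess, much like the www/non-www rule shown earlier on this page. A rough sketch, assuming domainA's hosting runs Apache and with the domain names as placeholders:)

RewriteEngine on
# catch every request to domainA, with or without www ...
RewriteCond %{HTTP_HOST} ^(www\.)?domainA\.com$ [NC]
# ... and 301 it to the relevant page on domainB (use / instead of /categoryB/ to target the homepage)
RewriteRule (.*) http://www.domainB.com/categoryB/ [R=301,L]

Whether the deeper /categoryB/ URL or the homepage is the better target is the relevance judgment call in Q2; the mechanics are identical either way.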
Technical SEO | DGAU
-
Which URL do I request Google News inclusion for: the http or the non-http?
In Google WMT/Search Console, I've marked the non-www version of my site as the preferred one. But I haven't run into a choice between http:// and non-http:// before. Should I choose the one listed at the top, which is the non-http (AND the non-www) version? Thanks!
Technical SEO | christyrobinson1
-
Will Google Recrawl an Indexed URL Which is No Longer Internally Linked?
We accidentally introduced Google to our incomplete site. The end result: thousands of pages indexed which return nothing but a "Sorry, no results" page. I know there are many ways to go about this, but the sheer number of pages makes it frustrating. Ideally, in the interim, I'd love to 404 the offending pages and allow Google to recrawl them, realize they're dead, and begin removing them from the index. Unfortunately, we've removed the initial internal links that led to this premature indexation from our site. So my question is, will Google revisit these pages based on their own records (as in, "this page is indexed, let's go check it out again!"), or will they only revisit them by following links in the current site structure? We are signed up with WMT if that helps.
Technical SEO | kirmeliux0
-
Has Google Made Unnatural Link Building Easier?
I see lots of competitors and crappy sites ranking well for highly competitive keywords in the web hosting niche. After analysing their backlinks, I noticed that most of them had only 1 or 2 backlinks to the page they wanted to rank. The anchor text is usually a slight variation of the targeted keyword. Now suppose you are able to rank well for a handful of highly lucrative keywords using very few spammy links. That would mean that even if you got a Penguin penalty, cleaning up your link profile would take an hour at most. I really have no intentions of using this strategy but it's frustrating to see spammy competitors outranking you with crappy sites and a handful of backlinks. Your thoughts?
Technical SEO | sbrault740
-
What is "evttag=" used for?
I see evttag= used on realtor.com, what looks to be for click tracking purposes. Does anyone know if this is an official standard or something they made up?
Technical SEO | JDatSB0
-
Sitemaps and "noindex" pages
Experimenting a little bit to recover from Panda, we added a "noindex" tag to quite a few pages. Obviously we now need Google to re-crawl them ASAP and de-index them. Should we leave these pages in the sitemaps (with an updated "lastmod") for that? Or just patiently wait? 🙂 What's the common/best way?
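(For reference, the "noindex" tag here is the standard robots meta tag in the head of each page to be dropped, and the sitemap entry would just carry a fresh lastmod date; example.com and the date below are placeholders:)

<!-- in the <head> of each page that should be de-indexed -->
<meta name="robots" content="noindex" />

<!-- corresponding sitemap entry with an updated lastmod -->
<url>
  <loc>http://www.example.com/thin-page/</loc>
  <lastmod>2014-05-20</lastmod>
</url>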
Technical SEO | LocalLocal0
-
Domain "Forwarded"?
Hi SEOMoz! The company I work for has a website, www.accupos.com, but they also have an old domain which is not used anymore called http://accuposretail.com/ These two sites had duplicate content so I requested the OLD site (http://accuposretail.com/) be redirected to accupos.com to eliminate the dupe content. Unfortunately, I do not understand completely what happened but when they performed this forwarding the accuposretail.com URL is still in use. Now it just displays EXACTLY what accupos.com displays and not something similar. The tech team told me it is forwarded but I can't help but see the URL still in the search box on top. Is this unacceptable? The actual URL has to forward and change to the accupos.com URL in order to not be duplicate content, correct? I have limited experience in this. Please let me know if we are good to go, or if I need to tell them more action is required. Thanks! Derek M
Technical SEO | DerekM880
-
Follow up from http://www.seomoz.org/qa/discuss/52837/google-analytics
Ben, I have a follow-up question from our previous discussion at http://www.seomoz.org/qa/discuss/52837/google-analytics. To summarize, to implement what we need, we need to do three things:

1) Add GA code to the Darden page:

_gaq.push(['_setAccount', 'UA-12345-1']);
_gaq.push(['_setAllowLinker', true]);
_gaq.push(['_setDomainName', '.darden.virginia.edu']);
_gaq.push(['_setAllowHash', false]);
_gaq.push(['_trackPageview']);

2) Change links on the Darden page (e.g. http://www.darden.virginia.edu/web/MBA-for-Executives/) from

<a href="https://darden-admissions.symplicity.com/applicant">Apply Now</a>

into

<a href="https://darden-admissions.symplicity.com/applicant" onclick="_gaq.push(['_link', 'https://darden-admissions.symplicity.com/applicant']); return false;">Apply Now</a>

3) Have Symplicity add this code:

_gaq.push(['_setAccount', 'UA-12345-1']);
_gaq.push(['_setAllowLinker', true]);
_gaq.push(['_setDomainName', '.symplicity.com']);
_gaq.push(['_setAllowHash', false]);
_gaq.push(['_trackPageview']);

Our CMS does not allow the user to add onClick to the link, so we CANNOT do part 2). What will be the result if we have only 1) and 3) implemented? Will the data still be fed to GA account 'UA-12345-1'? If not, how can we get cross-domain tracking if we cannot change the link code?

Nick
Technical SEO | Darden0
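(One possible workaround for the question above, not something from the original discussion, just a sketch: if the CMS strips onClick attributes, the same _gaq link decoration can be attached from a separate script that runs after the page has loaded. It assumes the legacy ga.js _gaq snippet from step 1) is already on the page and that symplicity.com is the only cross-domain target.)

// attach the cross-domain _link handler without inline onclick attributes
// assumes the _gaq code from step 1) has already been added to the page
(function () {
  var links = document.getElementsByTagName('a');
  for (var i = 0; i < links.length; i++) {
    // only decorate links that point at the symplicity.com domain
    if (links[i].hostname && links[i].hostname.indexOf('symplicity.com') !== -1) {
      links[i].onclick = (function (href) {
        return function () {
          _gaq.push(['_link', href]); // appends the GA cookie values to the URL and navigates
          return false;               // cancel the default navigation so it is not doubled
        };
      })(links[i].href);
    }
  }
})();

Whether this is easier than changing the CMS templates is debatable, but it avoids touching the link markup itself.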