WWW vs. non-WWW
-
The canonical URLs (and all our link building efforts) are on the www version of the site.
However, the site has a massive technical problem, and we need to redirect some links (some of which are very important) from the www to the non-www version of the site (for these pages, the canonical link is still the www version).
How big of an SEO problem is this?
Can you please explain the exact SEO dangers?
Thanks!
-
Thanks for all your responses - I will use this as the basis of my answer to the technical team.
-
I'm endorsing Stephen's idea, because if you really have no choice, I think it's a good potential alternative. THB's comments (which I thumbed up) are very important, though.
If you really have no choice, I do think the 302 is safer here - the canonical tag should override it. There is some risk, though, and it's definitely not ideal.
I'm not clear on the problem, but could you return a 503? It basically says "We've got a temporary problem - come back later" and, if it really is temporary, Google won't de-index the pages. If you're talking a couple of days, this may be a better solution. If you're talking a few weeks, you may have to take Stephen's advice. You might want to pull in expert help, though, because my gut reaction is that there's a better way to fix what's broken here.
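To make the 503 suggestion concrete, here's a minimal sketch of the status and headers a server could send during the outage. This is illustrative only (the function name and the one-day Retry-After window are my own choices, not anything from the thread):

```python
def maintenance_response(in_maintenance: bool, retry_after_secs: int = 86400):
    """Sketch of the status/headers a server might send during a temporary outage."""
    if in_maintenance:
        # 503 signals "temporary problem, come back later": crawlers that see it
        # keep the URL indexed and re-crawl after the hinted Retry-After window,
        # unlike a 404/410 (drop the page) or a 301 (treat the move as permanent).
        return 503, {"Retry-After": str(retry_after_secs)}
    # Outage over: serve the page normally again.
    return 200, {}
```

The key point is that the 503 makes no claim about the URL's future, so nothing about the canonical setup has to change while it's in place.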
-
Hehe.
Generally speaking (and I've actually come across this quite a bit lately), it's better to put your efforts toward fixing the technical issues than to try to patch around the site with redirects and canonical tags. That's easy to say when it's not my technical problem, nor my money/time on the line to fix it! Still, fixing the root cause is always the best-case scenario, in my opinion.
-
Agreed. It's a problem waiting to bite you in the proverbials....
-
I worry about setting up a canonical tag that points to a URL Google can't access (since it's just 302'd back to the non-www version any time Google tries to read the canonical URL). And since a canonical tag is kinda sorta like a 301, you'd ultimately be 301'ing (kinda sorta) back to the www version, only to have a 302 header sent, 302'ing Google back to the non-www. An endless loop, so to speak. I'm not sure how Google would handle this.
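To see the worry spelled out, here's a toy simulation of a crawler chasing the two signals at each other. The URLs are hypothetical and this is a crude model, not how Googlebot actually reconciles conflicting hints:

```python
def crawl(url, rules, max_hops=5):
    """Follow 302s and canonical hints until the path stabilizes or loops.

    `rules` maps a URL to ("302", target) or ("canonical", target).
    Returns the list of URLs visited, ending on a repeat if a loop is found.
    """
    seen = [url]
    for _ in range(max_hops):
        action = rules.get(url)
        if action is None:
            break  # plain 200 response, nothing further to follow
        _, url = action
        seen.append(url)
        if seen.count(url) > 1:
            return seen  # loop detected: we've been here before
    return seen

# While the outage lasts: www 302s to non-www, and non-www declares
# www as canonical -- each signal points back at the other.
rules = {
    "http://www.example.com/page": ("302", "http://example.com/page"),
    "http://example.com/page": ("canonical", "http://www.example.com/page"),
}
```

Running `crawl("http://www.example.com/page", rules)` comes straight back to the www URL it started from, which is the circularity being described above.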
How about just working 24/7 to resolve the "technical problem" that's causing this? I know, easy for me to say!
-
I'm no expert on this but I think you'll be fine IF you:
1 - 302 redirect (temporary redirect) to the non-www page
2 - Add a rel canonical on the non-www page giving the www version the link credit.
When you've fixed your tech issues, remove the 302 redirect.
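As a rough sketch of those two steps together (the URLs are hypothetical, and this just models the headers and markup rather than any particular server), the www URL answers with a temporary 302 while the non-www page it lands on carries a canonical tag crediting the www version:

```python
def www_response():
    # Step 1: temporary (302, not 301) redirect from the www URL,
    # so the move reads as reversible rather than permanent.
    return 302, {"Location": "http://example.com/page"}

def non_www_html():
    # Step 2: the non-www page declares the www URL as canonical,
    # handing link credit back to the version we want to rank.
    return (
        "<html><head>"
        '<link rel="canonical" href="http://www.example.com/page">'
        "</head><body>...page content...</body></html>"
    )
```

When the technical problem is fixed, `www_response` goes back to serving the page with a 200 and the canonical tag stays as it always was.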
I THINK Google will play nice with this.
Hope that helps
Steve