Rel canonical issues for two URLs sharing the same IP address
-
Our client built a WordPress site on URL A, then opted for a better URL B. Rather than moving the whole WordPress site over to the new URL B, they just contacted GoDaddy, who now hosts BOTH URLs under the same IP address.
When I do a term target on URL B, I'm flagged for rel canonical use, and I can only get a B grade for each keyword. (I've also tried using URL A, but I get the same flag and the same B grades.)
I'm not sure whether this setup will thwart our SEO efforts for the site, because only the homepage comes up when you type in URL B anyway; every subsequent page displays the original URL A. Somewhere, WordPress is also adding a rel canonical link on the homepage source pointing to URL A, which we can't seem to edit.
So, the question is: is it okay to leave this setup as is, with both URLs hosted on the same IP address, or should we move the whole site over to the desired URL B?
Thanks much!
-
Thanks for your answer, John. So we should have the client 301-redirect URL A to the desired URL B. Confirmed!
-
This doesn't sound ideal. So only the home page is under URL B, and the rest of the pages are hosted under URL A? That would seem really strange to me as a user if I entered through URL B, went to another page, and the domain changed completely.
Ideally, you should 301 redirect everything under URL A to URL B rather than using rel=canonical for them. There's no reason to host two identical sites like this, and it's perfectly fine for multiple sites to share the same IP. Here's a really good SEOmoz blog post about using 301 redirects vs. canonical tags.
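Once the redirects are in place, it's worth sanity-checking them. Here's a minimal Python sketch (the `url-a.example` / `url-b.example` hostnames are placeholders, not the client's real domains) that fetches only the headers for the old URL and confirms a 301 pointing at the new host:

```python
import http.client
from urllib.parse import urlparse

def check_redirect(status: int, location: str, expected_host: str) -> bool:
    """True only for a permanent (301) redirect whose Location
    header points at the expected host."""
    if status != 301 or not location:
        return False
    return urlparse(location).hostname == expected_host

def fetch_redirect(url: str):
    """Request just the headers for `url`; return (status, Location)."""
    parts = urlparse(url)
    conn = http.client.HTTPConnection(parts.hostname, timeout=10)
    conn.request("HEAD", parts.path or "/")
    resp = conn.getresponse()
    return resp.status, resp.getheader("Location", "")

# Hypothetical usage -- the hostnames are placeholders:
# status, location = fetch_redirect("http://url-a.example/")
# print(check_redirect(status, location, "url-b.example"))
```

Note that a 302 (temporary redirect) fails this check on purpose: only a 301 tells search engines the move is permanent.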
Related Questions
-
"redirects" with no "redirect address"?
Episode 2 of the "Damon the idiot noob" web series . . . I have 90-plus temporary redirects in my Moz "medium priority diagnostics". The majority of them have a URL, but no redirect URL. How can it be a temporary redirect if there is no redirect address? Some of the addresses simply don't make any sense, like: http://www.thirdcoastsigns.com/catalog/seo_sitemap/product How on earth would an "seo_sitemap" be followed by a "/product"? This is a Magento site, so I know some of these things get created automatically, but what on earth is going on here? Help welcome, appreciated, and welcome. Did I mention it is welcome and appreciated?
Moz Pro | damon12120
-
Complex Rankings Issue For A Law Firm Site
Be warned, this is a complex issue that will require someone with advanced knowledge of 301s and link penalties. I have a law firm client whose site is having some issues. There are some very complex details here, so I'm going to lay them out in the hope of making the issues easy to understand. Here's the root problem: we have poor organic rankings (4th, 5th, 6th page for most terms) despite a Domain Authority of 32 (the average first-page competitor is 28) and some very strong white-hat link building over the last 60 days or so. How does their backlink profile look, you ask? In OSE, their spam score is 1/17 (not sure how credible that is), and links scoring 5 on the spam score make up about 10% of their OSE links. Here's where it gets tricky: those links are not directed at the client's new URL; they go to some old URLs the client used to have, for which a previous SEO guy built all those crappy links. Those URLs with the crappy links (we'll call them the Crappy URLs) were 301'd (can we all agree "301'd" is a verb?) to the new URL for just a couple of months. Shortly after that, the new URL dropped almost completely out of Google, so the client turned off the 301s. Despite the 301s being turned off, OSE still shows all the links going to the Crappy URLs but gives the new URL credit for them. Keep in mind the 301s were turned off about six months ago, so it's a little strange that OSE still shows them. This has led me to conclude that the Domain Authority of 32 that OSE shows is not a "real" number, since it is seemingly based on links inherited from 301s that no longer exist. So now I'm trying to create an action plan for this client that will hopefully help us start to make some real progress in our rankings. This client does not have the budget to wait another six months for some sign of hope, so time is of the essence.
Here are the theoretical action plans I'm choosing from; I'd like the community's input on which, if any, you feel is best (also, if I'm missing something or you have an idea, I'm all ears). Potential action plans:
1. Do nothing: keep building quality links, creating quality content, and monitoring crawl reports/GWT for issues. That strategy is going to win long term.
2. #1, plus create one-page sites on the Crappy URLs, set up GWT for them, and submit sitemaps, forcing Google, OSE, and other web crawlers to index them and thereby removing any potential residual penalties from the 301s. NOTE: currently the Crappy URLs just land on GoDaddy's default landing page, which is of course not being indexed by Google or OSE.
3. #2, plus disavow all the bad links going to the Crappy URLs. Then, once the bad links no longer appear in the OSE profile for each of the Crappy Sites, 301 them again, inheriting the good links but not the bad.
4. #1, plus 301 the Crappy URLs back to the new URL while also disavowing any links going to them. The logic here is that the road back to recovery is going to take a few months no matter what, and when the 301s knocked them back six months ago, no reputable link building was being done. I am cautiously optimistic that the link building we are doing now will eventually offset any penalties coming from the 301s. Plus, we'll then know the Domain Authority of 32 that OSE is giving us is real. Frankly, this is the one I'm leaning towards, because I think it will reduce the recovery time and we'll know fairly quickly (30-60 days) whether it's actually working, whereas options 1-3 could each take 90 days before we know.
So please, if you have any expertise with any of this, your help or advice would be appreciated. I'd rather not share the new URL for obvious reasons, but if you must know, simply message me and, as long as you're legit, I'll share it with you.
Moz Pro | BrianJGomez0
-
Any tool built into Moz that can help tell who the owner of a URL is?
I'd like to know if there's any tool which would let us know who the owner of a web domain is.
Moz Pro | daleseppie0
-
Magento: Moz finding URL and URL?p=1 as duplicate. Solution?
Good day, Mozzers! The Moz bot is finding URLs in the catalogue pages with the formats www.example.com/something and www.example.com/something?p=1 and flagging them as duplicates (since they are the same page). What's the best solution to implement here? Canonical? Any other? Cheers! MozAddict
Moz Pro | MozAddict0
-
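A canonical tag pointing both variants at the parameter-free URL is the usual fix for this. Just to illustrate the normalisation (this is a Python sketch, not Magento code, and `p` is assumed to be a pure pagination parameter), stripping `p=1` collapses the two addresses to one canonical target while leaving other query strings alone:

```python
from urllib.parse import urlparse, parse_qsl, urlencode, urlunparse

def canonical_target(url: str) -> str:
    """Drop a `p=1` pagination parameter so /something?p=1 maps to the
    same canonical URL as /something (other parameters survive)."""
    parts = urlparse(url)
    query = [(k, v) for k, v in parse_qsl(parts.query)
             if not (k == "p" and v == "1")]
    return urlunparse(parts._replace(query=urlencode(query)))
```

Later pages (`?p=2` and up) are left untouched here; whether those should also canonicalise to page one is a separate judgment call.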
Site Explorer issue
Hello, I'm looking to see in Site Explorer the links coming from directories such as BOW, Yahoo, etc. I've been listed there for almost a year and these links are not shown; the same goes for my competitors. Am I missing something? Thank you, Claudio
Moz Pro | SharewarePros0
-
Mozscape API: URL batching limit
Guys, there's an example of batching URLs using PHP: http://apiwiki.seomoz.org/php What is the maximum number of URLs I can add to that batch?
Moz Pro | Srvwiz0
-
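The exact cap is documented in the Mozscape API wiki and may vary by access tier, so I won't quote a number here. Whatever the limit turns out to be, a safe pattern is to chunk your URL list and send one request per chunk. A sketch in Python (the chunk size of 10 below is a placeholder assumption, and `call_mozscape` is a hypothetical stand-in for the real API call):

```python
def batch(urls, size):
    """Yield successive groups of at most `size` URLs,
    one group per batched API request."""
    for i in range(0, len(urls), size):
        yield urls[i:i + size]

# Hypothetical usage with an assumed 10-URL cap:
# for group in batch(all_urls, 10):
#     call_mozscape(group)  # placeholder for the actual API call
```

Chunking client-side like this also makes it easy to add per-request rate limiting if the API enforces one.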
Canonical tags and SEOmoz crawls
Hi there. Recently, we've made some changes to http://www.gear-zone.co.uk/ to add canonical tags to some dynamically generated pages and stop duplicate content issues. Previously, these pages were blocked with robots.txt. In Webmaster Tools, everything looks great: pages crawled has shot up, and overall traffic and sales have seen a positive increase. However, the SEOmoz crawl report is now showing a huge increase in duplicate content issues. What I'd like to know is whether SEOmoz registers a canonical tag as preventing a piece of duplicate content, or just adds it to the notices report. That is, if I have 10 pages of duplicate content, all with correct canonical tags, will I still see 10 errors in the crawl, plus 10 notices showing that a canonical has been found? Or should it be 0 duplicate content errors, but 10 notices of canonicals? I know it's a small point, but it could potentially make a big difference. Thanks!
Moz Pro | neooptic0
-
Tool for scanning the content of the canonical tag
Hey all, a question for you: what is your favorite tool/method for scanning a website for specific tags? Specifically (as my situation dictates now) canonical tags. I am looking for a tool that is flexible, hopefully free, and highly customizable (for instance, you can specify the tag to look for). I like the concept of using Google Docs with the importXML feature, but as you can only use 50 of those commands at a time, it is very limiting (http://www.distilled.co.uk/blog/seo/how-to-build-agile-seo-tools-using-google-docs/). I do have a campaign set up using the tools, which is great! But I need something that returns a response faster and can get data from more than 10,000 links. Our CMS unfortunately puts out some odd canonical tags depending on how a page is rendered, and I am trying to catch them quickly before they get indexed and cause problems. Eventually I would also like to be able to scan for other specific tags, hence the customizability concern. If we have to write a VB script to get it into Excel, I suppose we can do that. Cheers, Josh
Moz Pro | prima-2535090
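For what it's worth, a few lines of Python's standard library can extract canonical tags without importXML's 50-command limit. A rough sketch (you'd still need to fetch each page's HTML yourself, and messy real-world markup may call for a more forgiving parser):

```python
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    """Collect the href of every <link rel="canonical"> in a page."""

    def __init__(self):
        super().__init__()
        self.canonicals = []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)  # attr names arrive lowercased
        if tag == "link" and a.get("rel", "").lower() == "canonical":
            self.canonicals.append(a.get("href"))

def find_canonicals(html: str):
    """Return all canonical hrefs found in an HTML string."""
    parser = CanonicalFinder()
    parser.feed(html)
    return parser.canonicals
```

Swapping the `tag`/`rel` check for another attribute test would cover the "other specific tags" case, and the results could be written out as CSV for Excel instead of going through a VB script.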