Canonical tag on a site with multiple URLs but only one set of pages
-
We have a site, www.mezfloor.com, which has a number of URLs pointing at one site. As the URLs have been in use for many years, there are links from many sources, including good old-fashioned hard-copy advertising. We have now decided it would be better to start porting all sources to the .co.uk version and get that listed as the prime/master site.
A couple of days ago I went through and added canonical tags on all the pages, thinking that would set the priority and also strengthen the pages in terms of trust due to the reduced duplication. However, when I scanned the site in Moz, a warning came up that the page redirects, and I am beginning to think I need to remove all these canonical tags so that search engines do not get into a confused spiral where we lose the little PageRank we have.
Is there a way that I can redirect everything except the target URL without setting up a separate master site just for all the other pages to point at?
-
Yes, it is good when there is a clear Google guideline to follow. I'm happy for your quick win!
-
Thanks
I am pleased I do not have to go through the whole site again, and even more pleased as I have a number of other sites to work on. These could certainly do with a bit of a boost, and this is a quick win.
-
So you want to put a canonical of www.b.co.uk/index.html on a page that can be reached via www.b.co.uk/index.html and you are worried that it will become a loop?
Don't worry. Google specifically thought about the possibility that people might use self-referential canonicals (SEO plugins do it all the time) and engineered it so that this does not cause a loop. (See Matt Cutts on the topic.)
I myself inherited some ugly URLs for which I made nice user-friendly aliases, and I tagged those pages with the friendly canonical. There were no problems, and the pages started doing much better. (In my case it was not cross-domain, but cross-domain canonicals are supported, and in fact I have successfully used them in other situations.)
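To make the idea concrete: every copy of the page, whichever hostname it was reached through, carries the same canonical tag pointing at the preferred domain. Here is a minimal sketch (the `canonical_tag` helper and the domain names are just illustrations borrowed from this thread, not anyone's real setup):

```python
# Every alias domain serving this page emits the same canonical tag,
# so the copy on www.b.co.uk ends up with a "self-referential"
# canonical. Google treats that as a harmless hint, not a redirect,
# so no loop occurs.
CANONICAL_HOST = "www.b.co.uk"

def canonical_tag(path: str) -> str:
    """Return the canonical link element for the given page path."""
    return f'<link rel="canonical" href="http://{CANONICAL_HOST}{path}"/>'

# Whether the request came in on www.a.com, www.a.co.uk, or
# www.b.co.uk, the tag is identical:
print(canonical_tag("/index.html"))
# <link rel="canonical" href="http://www.b.co.uk/index.html"/>
```

The key point is that the tag is a signal for search engines, not a navigation instruction, so a spider does not "follow" it back into the same page.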
-
Hi, thanks for the response.
The issue is we have one set of pages on a server which is addressed through several different URLs.
I never got involved in the server side of things, so I do not know if that was done with redirects at the root URL. Maybe I am trying to add canonical links that just are not required.
Say I have www.a.co.uk/index.html, www.a.com/index.html, and www.b.co.uk/index.html, and I want them all to point to www.b.co.uk/index.html. As index.html is on the server only once, my thought was that the page should contain a canonical link to itself, with www.b.co.uk/index.html as the target. This may be right or wrong, but there is the risk that a spider stops when it reaches the link, goes back to the start of the same page, and repeats that again and again in a loop.
You are of course right that the Googlebot should be OK with this, but the Moz bot stopped in its tracks and asked if I wanted the page indexed, so I had to do this manually.
Gut feel says I should remove the links for now, but I need to understand what we did server-side. Gut feel may be wrong, and I would prefer to do the right thing!
-
Okay, you lost me a little, but let me see if I can help.
First off, the canonical tag: it's fantastic for duplicate content (even across other sites), but not much use if you don't have duplicate content.
301s: very similar to the above. They work well with duplicate content but aren't essential. You can 301 a few pages into one page, so if a user types a URL in (or even has it bookmarked) they will land on the page you want. It's normally a good idea to 301 into similar pages, so you don't get users thinking they are going to buy (e.g.) a pair of boots and landing on a page about t-shirts.
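The "redirect everything except the target" part of the original question usually comes down to a simple host check on the server: if the request arrived on any domain other than the master one, send a 301 to the same path on the master. A rough sketch of that decision logic (the `redirect_for` helper and domains are illustrative, not real server config):

```python
TARGET_HOST = "www.b.co.uk"

def redirect_for(host: str, path: str):
    """Return the 301 target URL for a request, or None if the
    request is already on the master domain (no redirect needed)."""
    if host.lower() == TARGET_HOST:
        return None
    # Preserve the path so /contact.html on an alias domain lands on
    # /contact.html on the master site, not on the homepage.
    return f"http://{TARGET_HOST}{path}"

print(redirect_for("www.a.com", "/index.html"))   # http://www.b.co.uk/index.html
print(redirect_for("www.b.co.uk", "/index.html")) # None
```

In practice you would express the same rule in your server config (e.g. a host-based rewrite in Apache or a server block in nginx), but the logic is this simple: only the target domain escapes the redirect, so no separate "master site" is needed.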
Google getting lost: don't worry about Google getting lost. If a user can get around, so can Google. Plan, plan, and plan again: map it all out (you can even draw flow diagrams) so you know where everything redirects to and from until you are happy. You can also get someone who doesn't know your site to test it and see if they get lost.
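One way to sanity-check that plan before touching the server is to write the redirect map down and trace where each old URL ends up, catching chains and loops early. A small illustrative sketch (the `final_destination` helper is hypothetical, not a feature of any tool; the URLs are the examples from this thread):

```python
def final_destination(url, redirect_map, max_hops=10):
    """Follow a planned redirect map and return (final_url, hop_count),
    raising an error on loops or overly long chains."""
    seen = [url]
    while url in redirect_map:
        url = redirect_map[url]
        if url in seen:
            raise ValueError("Redirect loop: " + " -> ".join(seen + [url]))
        seen.append(url)
        if len(seen) > max_hops:
            raise ValueError("Redirect chain too long")
    return url, len(seen) - 1

# A planned map with an accidental two-hop chain:
plan = {
    "http://www.a.com/index.html": "http://www.a.co.uk/index.html",
    "http://www.a.co.uk/index.html": "http://www.b.co.uk/index.html",
}
dest, hops = final_destination("http://www.a.com/index.html", plan)
print(dest, hops)  # http://www.b.co.uk/index.html 2 (better collapsed to one hop)
```

A hop count above 1 tells you the map should be collapsed so every old URL 301s straight to its final destination rather than through a chain.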
Hope that background helps a bit. You lost me here:
"Is there a way that I can redirect everything except the target URL without setting up a separate master site just for all the other pages to point at?"
Why can't you redirect all your pages to the target URL?
One helpful tool I recommend is Screaming Frog; it can help you pick up redirects, 404s, etc.