What is the difference between rel=canonical and 301s?
-
Hi Guys
I have been told a few times to add the rel=canonical tag to my category pages. However, every category page is actually different from the others - apart from the listings I have for my staff on each page. Some of my staff specialise in areas that cross over into other categories. But really, if I'm pointing, for example, Psychic Readings over to Love and Relationships just because 5 of my staff members are listed in both categories - the actual content delivered, and the depth of each category, where skills are provided at different levels, don't justify me creating a rel=canonical tag from Psychic Readings over to Love and Relationships.
Tell me have I got this right or completely wrong?
Here is an example - the Psychic Readings category: https://www.zenory.com/psychic-readings
And the Love and Relationships category: https://www.zenory.com/love-relationships
Hope this makes sense - I really look forward to your feedback, guys!
Cheers
-
I understand what you mean - to be honest, I don't think this content snippet is generating duplicate content.
However, I don't really understand the mechanism:
At https://www.zenory.com/horoscopes/taurus/day I would expect to find the daily horoscope for Taurus. When I click on Capricorn, I would expect to go to https://www.zenory.com/horoscopes/capricorn/day - however, I remain on the same page and the horoscope is shown in a lightbox. I would rather put it on a separate page (if the horoscopes of all signs are present in the HTML of one sign's page, these pages become quite similar when you look at the source code).
Sounds a bit confusing, but I hope you get what I mean. Rgds,
Dirk
-
Hi Dirk
I wanted to ask you another question with regard to this.
I have horoscope pages that have just been published today.
We offer a daily horoscope for each of the 12 star signs. These are unique and different each day for each star sign; however, there is a weekend love section at the bottom of each sign's page that stays the same for the whole week.
https://www.zenory.com/horoscopes/taurus/day
https://www.zenory.com/horoscopes/aries/day
The links above show a couple of examples of the daily horoscopes. You can see the weekend love section is different between signs - however, it will be the same for the same star sign tomorrow. You can't see this yet, as we only published and released these today, so you will be able to tell the difference once tomorrow's is published. Hopefully I have explained myself well here.
So my question is: half the content on a single page will be duplicate content - everything besides the new daily horoscope entry. I'm wondering if I need to add canonical tags, or if I should create a separate page for the weekend love horoscope of each star sign.
I hope this makes sense!
Thanks again Dirk!
-
That answers my question Dirk, thank you again!!!
-
For the examples you gave I would certainly not use a 301 or a canonical tag. The content is unique, and only a relatively small part (the staff list) is common.
To explain the difference:
A canonical tag is used when you have pages that are identical (or almost identical) and accessible under different URLs. A good example is an e-commerce site with a product listing like mysite.com/umbrellas - if sorting the products changes the URL, e.g. to mysite.com/umbrellas?sort=high, it's best to put a canonical tag on the second URL pointing to the first, so that Google will not index all the variations. A visitor can still access both pages. Googlebot normally respects the canonical - but is not obliged to do so.
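To make that concrete, the canonical is just a link element in the page's head that a crawler can read. Here is a minimal sketch (in Python, with illustrative URLs) of extracting it the way a bot would:

```python
from html.parser import HTMLParser

# Minimal sketch: find the canonical URL declared in a page's <head>.
# The URLs below are illustrative, not taken from a real site.
class CanonicalFinder(HTMLParser):
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "link" and a.get("rel") == "canonical":
            self.canonical = a.get("href")

# A sorted variation of a category page pointing back at the clean URL:
html = """<html><head>
<title>Umbrellas - sorted by price</title>
<link rel="canonical" href="https://mysite.com/umbrellas">
</head><body>...</body></html>"""

finder = CanonicalFinder()
finder.feed(html)
print(finder.canonical)  # → https://mysite.com/umbrellas
```

Every sorted/filtered variation carries the same tag, so the signal (not a directive) tells Google which single URL to index, while visitors can still open any of the variations.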
A 301 is different - in effect you give the browser the message: this page is no longer available at this location but has moved to a new location. It's no longer possible to visit the original page (not for humans and not for bots). Googlebot has to respect this directive.
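At the HTTP level this is just a status code plus a Location header - the old URL no longer serves any content of its own. A minimal sketch of what a server answers (paths are illustrative):

```python
# Sketch of a server's behaviour for a permanently moved page.
# The paths here are made up for illustration.
MOVES = {"/old-category": "/love-relationships"}

def respond(path):
    """Simulate the (status, headers, body) a server returns for a path."""
    if path in MOVES:
        # 301 Moved Permanently: browsers and bots must follow Location;
        # there is no body left to visit at the old URL.
        return 301, {"Location": MOVES[path]}, b""
    return 200, {}, b"<html>page content</html>"

status, headers, body = respond("/old-category")
print(status, headers["Location"])  # → 301 /love-relationships
```

This is why a 301 is the wrong tool when both pages should stay reachable: the redirect removes the old page entirely, whereas a canonical leaves it accessible.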
A last option you can use is "noindex,follow". You normally use this for pages that have very little value for search engines, but where you would still like the bots to follow the links and index the pages that are listed. You can use this for pages of the type blog.com/tag/subject, which generate lists of all the articles tagged with that subject. In general, pages like this are good for cross-linking but have low value for search engines, so it's better not to have them indexed.
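That option is expressed as a robots meta tag in the page's head. A small sketch of generating it (the helper function is just for illustration):

```python
# Sketch: build a robots meta tag such as the one you'd put on a
# low-value tag/listing page. The helper name is illustrative.
def robots_meta(index=True, follow=True):
    directives = [
        "index" if index else "noindex",
        "follow" if follow else "nofollow",
    ]
    return '<meta name="robots" content="{}">'.format(",".join(directives))

# For a blog.com/tag/subject style page: keep it out of the index,
# but let bots follow the article links it contains.
print(robots_meta(index=False, follow=True))
# → <meta name="robots" content="noindex,follow">
```

Note this goes in the page's own metadata, not on the links pointing to it - it tells the bot "don't index me, but do crawl what I link to."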
Hope this clarifies,
Dirk