What is the difference between rel=canonical and 301s?
-
Hi Guys
I have been told a few times to add the rel=canonical tag to my category pages. However, every category page is actually different from the others, apart from the listings I have for my staff on each page. Some of my staff specialise in areas that cross over into other categories. But if, for example, I'm pointing Psychic Readings over to Love and Relationships just because 5 of my staff members are listed in both categories, that doesn't seem right - the actual content, and the depth at which the skills are covered in each category, is different enough that I don't think it justifies a rel=canonical from Psychic Readings to Love and Relationships.
Tell me have I got this right or completely wrong?
Here is an eg: Psychic Readings category https://www.zenory.com/psychic-readings
And love and relationships category - https://www.zenory.com/love-relationships
Hope this makes sense - I really look forward to your feedback, guys!
Cheers
-
I understand what you mean - to be very honest, I don't think this content snippet is generating duplicate content.
However, I don't really understand the mechanism:
On https://www.zenory.com/horoscopes/taurus/day I would expect to find the daily horoscope for Taurus. When I click on Capricorn I would expect to go to https://www.zenory.com/horoscopes/capricorn/day - however, I remain on the same page and the horoscope is shown in a lightbox. I would rather put it on a separate page (if the horoscopes of all signs are present in the HTML of one sign's page, these pages become quite similar when you look at the source code).
Sounds a bit confusing, but I hope you get what I mean. Rgds,
Dirk
-
Hi Dirk
I wanted to ask you another question with regard to this.
I have horoscope pages that have just been published today.
We offer a daily horoscope for each of the 12 star signs. These are unique and different each day for each sign; however, there is a weekend love section at the bottom of each sign's page that stays the same for the whole week.
https://www.zenory.com/horoscopes/taurus/day
https://www.zenory.com/horoscopes/aries/day
The links above show a couple of the daily horoscopes. You can see the weekend love section is different between signs - however, it will be the same for the same star sign tomorrow. You can't see that yet, as we only published and released these today, but you will be able to tell the difference once tomorrow's is published. Hopefully I have explained myself well here.
So my question is: half the content on a single page will be duplicate content, apart from the new daily horoscope entry. I'm wondering if I need to add canonical tags, or if I should create a separate page for the weekend love horoscope of each star sign.
I hope this makes sense!
Thanks again Dirk!
-
That answers my question Dirk, thank you again!!!
-
For the examples you gave I would certainly not use a 301 or a canonical tag. The content is unique, and only a relatively small part (the staff list) is common.
To explain the difference:
A canonical tag is used when you have pages that are identical (or almost identical) and which are accessible under different URLs. A good example is an e-commerce site with a product listing like mysite.com/umbrellas - if sorting the products changes the URL to something like mysite.com/umbrellas?sort=high, it's best to put a canonical tag on the second URL pointing to the first, so that Google will not index all the variations. A visitor can still access both pages. Googlebot normally respects the canonical - but it is a hint, not a directive, so Google is not obliged to follow it.
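To illustrate (the umbrella URLs are just the example from above), the sorted variation would carry a canonical tag in its head pointing at the main listing:

```html
<!-- In the <head> of mysite.com/umbrellas?sort=high -->
<!-- Tells search engines which URL is the preferred version of this content -->
<link rel="canonical" href="https://mysite.com/umbrellas" />
```

Both URLs keep working for visitors; the tag only influences which one search engines pick for the index.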
A 301 is different - in effect you give this message to the browser: this page is no longer available at this location but has moved permanently to a new one. It's no longer possible to visit the original page (not for humans and not for bots - anyone requesting the old URL is sent straight to the new one). Googlebot has to respect this directive.
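As a sketch, assuming an Apache server (just one common setup - the exact syntax depends on your hosting, and the paths here are hypothetical), a 301 can be set up in .htaccess like this:

```apache
# .htaccess - permanently move an old page to a new URL
# /old-page and /new-page are placeholder paths for illustration
Redirect 301 /old-page https://mysite.com/new-page
```

After this, every request for /old-page (from a browser or a bot) receives an HTTP 301 response and is taken to the new URL.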
A last option is "noindex,follow". You normally use this for pages that have very little value for search engines themselves, but where you still want bots to follow the links on them and index the pages that are listed. You can use this for pages of the type blog.com/tag/subject, which generate lists of all the articles tagged with that subject. In general, pages like this are good for cross-linking but have low value in search results, so it's better not to have them indexed.
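In practice that is a meta robots tag in the head of the tag page (the blog.com URL is just the example from above):

```html
<!-- In the <head> of blog.com/tag/subject -->
<!-- Keep this page out of the index, but let bots follow its links -->
<meta name="robots" content="noindex, follow" />
```

The listed articles still get crawled and can rank; only the tag page itself stays out of the index.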
Hope this clarifies,
Dirk