Training events - optimisation and avoiding cannibalisation
This is quite a broad question I’m afraid – any help would be appreciated.
I'm trying to find the best way of optimising our new training pages. These events are aimed at teaching our customers how to use our software to do different tasks. Inevitably, the themes and naming of these training workshops overlap with some of our products: to give a made-up example, a product called 'Keyword Ranker' and an event called 'Keyword Ranker Training'.
Someone has raised the concern that the training pages might start outranking the pages for our main tool, particularly as the training will be heavily promoted via social media, and the on-page content covers similar topics. They've suggested that we use rel=canonical tags pointing from each training page to the related product page to prevent this from happening.
I myself don't think this is a good idea, as this is not what rel=canonical tags are designed for. I think they might prevent the event pages from ranking for any query at all, which is not what we want. I also believe that the training pages and the products are different enough that Google will work out which to rank for relevant queries. Has anyone else had experience of doing this? Are there any approaches that people would recommend? Or is this something we shouldn't be worried about?
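For illustration, the suggestion amounts to putting a tag like this in the <head> of each training page (URLs made up):

<!-- On the hypothetical page /training/keyword-ranker-training/ -->
<link rel="canonical" href="https://example.com/products/keyword-ranker/">

As I understand it, this tells Google to treat the product page as the canonical version, which would typically drop the training page from the results altogether.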
A few other thoughts that I've had:
- Using schema.org Event markup to emphasise what the event pages are about (see the sketch after this list).
- Making sure to remove old events once they have expired. I thought it best to let these 404, as I've read that 301s to a category page can cause Google to penalise the content.
- Putting internal links from the product pages to the relevant training workshop pages.
- Using the unavailable_after robots meta tag on the event pages, so that once an event has happened the page is removed from Google's index (also shown in the sketch).
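To make the first and last of those bullets concrete, here is a minimal sketch of what an event page's <head> could contain: JSON-LD Event markup plus Google's unavailable_after robots directive. Every name, date and URL below is invented for illustration.

<!-- Ask Google to drop this page from its index once the event date has passed -->
<meta name="robots" content="unavailable_after: 2024-06-30">

<!-- schema.org Event markup in JSON-LD; all values are made up -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Event",
  "name": "Keyword Ranker Training",
  "description": "A live workshop teaching customers how to get the most out of the Keyword Ranker tool.",
  "startDate": "2024-06-30T10:00",
  "endDate": "2024-06-30T12:00",
  "location": {
    "@type": "Place",
    "name": "Example Training Centre",
    "address": "1 Example Street, London"
  },
  "organizer": {
    "@type": "Organization",
    "name": "Example Software Ltd",
    "url": "https://example.com/"
  }
}
</script>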
Thanks for your thoughts Linda - much appreciated
A couple of thoughts: yes, that is not what rel=canonical is for. It is meant for identical or nearly identical pages. If you wanted that effect, you could noindex the training pages, but you say you don't want that, so both of those choices are out.
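For reference, noindexing a page is just a robots meta tag in its <head>; a generic example:

<meta name="robots" content="noindex">

Since you want the training pages to rank, you would leave both that and the cross-page canonical out.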
If you have multiple training pages that go to one product, you will presumably have links on those multiple training pages back to that one product page, and that will be a sign to Google that the product page is important.
Also, if the product page stays up and the training pages are up for a short while and then go away, those short-term training pages are unlikely to overtake the product page that remains up and is able to attract links and other positive signals.
Related Questions
Suggested approach (support) for 301 redirects in event of an acquisition
If an agency has recently been acquired by a new organisation, it will need to be redirected to the new organisation's website as soon as possible. We are aware of the need to 301 redirect all pages (domain authority) across to the current domain of the new organisation's website. The new organisation's site has fewer pages than our agency site, however, so we cannot point 301 redirects at page level. Would you therefore advise A, B or C?

A) Redirect all pages, including all blog posts, services pages etc., from the agency site to the new organisation's domain? The new organisation does not have /blog or /services pages. Will we lose authority if redirecting from pages of our agency site to the new organisation's top-level domain?

B) Ensure that the new organisation secures hosting of the agency website, and place a holding page on the agency website directing visitors through to the new organisation in the interim, until we have /blog and /services pages on the new organisation's site?

C) Place 301 redirects from the agency site across to the new organisation, and look, moving forward (when pages have been put in place on the new organisation's website), to retrospectively repoint the 301 redirects from the new organisation's top-level domain to the newly created pages?

Any pointers here would be appreciated. Thanks!
How do I Enable Rich Snippets for an Events Page that is updated weekly?
Hello Moz World! I have a client that has an events page that they update every week. They conduct weekly demos of their software with current and potential customers, and post dates, times and topics for each demo. I'd like to enable event rich snippets for their website (see attached image for an example), but I am unsure exactly how to do that. Do I just need to set up Event schema tags? Does the markup need to be updated manually every week, or is there a software solution? Thanks ahead of time for all the great responses! Cheers, Will H.
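For illustration, the markup being asked about could look something like this JSON-LD sketch, with one Event item per dated demo (all values invented). Each week's items would indeed need regenerating when the schedule changes, either by hand or by the site's templates:

<script type="application/ld+json">
[
  {
    "@context": "https://schema.org",
    "@type": "Event",
    "name": "Weekly Demo: Getting Started",
    "startDate": "2024-07-02T14:00",
    "location": { "@type": "Place", "name": "Online webinar", "address": "Online" }
  },
  {
    "@context": "https://schema.org",
    "@type": "Event",
    "name": "Weekly Demo: Advanced Reporting",
    "startDate": "2024-07-09T14:00",
    "location": { "@type": "Place", "name": "Online webinar", "address": "Online" }
  }
]
</script>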
Avoiding Duplicate Content with Used Car Listings Database: Robots.txt vs Noindex vs Hash URLs (Help!)
Hi Guys, We have developed a plugin that allows us to display used vehicle listings from a centralized, third-party database. The functionality works similar to autotrader.com or cargurus.com, and there are two primary components:

1. Vehicle Listings Pages: this is the page where the user can use various filters to narrow the vehicle listings to find the vehicle they want.
2. Vehicle Details Pages: this is the page where the user actually views the details about said vehicle. It is served up via Ajax, in a dialog box on the Vehicle Listings Pages. Example functionality: http://screencast.com/t/kArKm4tBo

The Vehicle Listings pages (#1) we do want indexed and to rank. These pages have additional content besides the vehicle listings themselves, and those results are randomized or sliced/diced in different and unique ways. They're also updated twice per day.

We do not want to index #2, the Vehicle Details pages, as these pages appear and disappear all of the time based on dealer inventory, and don't have much value in the SERPs. Additionally, other sites such as autotrader.com, Yahoo Autos, and others draw from this same database, so we're worried about duplicate content. For instance, entering a snippet of dealer-provided content for one specific listing that Google indexed yielded 8,200+ results: Example Google query. We did not originally think that Google would even be able to index these pages, as they are served up via Ajax. However, it seems we were wrong, as Google has already begun indexing them. Not only is duplicate content an issue, but these pages are not meant for visitors to navigate to directly! If a user were to navigate to the URL directly from the SERPs, they would see a page that isn't styled right. Now we have to determine the right solution to keep these pages out of the index: robots.txt, noindex meta tags, or hash (#) internal links.

Robots.txt advantages:
- Super easy to implement.
- Conserves crawl budget for large sites.
- Ensures the crawler doesn't get stuck. After all, if our website only has 500 pages that we really want indexed and ranked, and vehicle details pages constitute another 1,000,000,000 pages, it doesn't seem to make sense to make Googlebot crawl all of those pages.

Robots.txt disadvantages:
- Doesn't prevent pages from being indexed, as we've seen, probably because there are internal links to these pages. We could nofollow these internal links, thereby minimizing indexation, but this would mean 10-25 nofollowed internal links on each Vehicle Listings page (will Google think we're pagerank sculpting?).

Noindex advantages:
- Does prevent vehicle details pages from being indexed.
- Allows ALL pages to be crawled (advantage?).

Noindex disadvantages:
- Difficult to implement: the vehicle details pages are served via Ajax, so they have no <head> tag of their own. The solution would have to involve the X-Robots-Tag HTTP header and Apache, sending a noindex based on querystring variables, similar to this stackoverflow solution. This means the plugin functionality is no longer self-contained, and some hosts may not allow these types of Apache rewrites (as I understand it).
- Forces (or rather allows) Googlebot to crawl hundreds of thousands of noindexed pages. I say "force" because of the crawl budget required. The crawler could get stuck/lost in so many pages, and may not like crawling a site with 1,000,000,000 pages, 99.9% of which are noindexed.
- Cannot be used in conjunction with robots.txt. After all, the crawler never reads the noindex meta tag if it is blocked by robots.txt.

Hash (#) URL advantages:
- By using hash (#) hrefs for the links from Vehicle Listings pages to Vehicle Details pages (such as "Contact Seller" buttons), coupled with Javascript, the crawler won't be able to follow/crawl these links. A sketch of this pattern follows this question.
- Best of both worlds: crawl budget isn't overtaxed by thousands of noindexed pages, and the internal links used to index robots.txt-disallowed pages are gone.
- Accomplishes the same thing as "nofollowing" these links, but without looking like pagerank sculpting (?).
- Does not require complex Apache stuff.

Hash (#) URL disadvantages:
- Is Google suspicious of sites with (some) internal links structured like this, since it can't crawl/follow them?

Initially, we implemented robots.txt, the "sledgehammer solution". We figured that we'd have a happier crawler this way, as it wouldn't have to crawl zillions of partially duplicate vehicle details pages, and we wanted it to be as if these pages didn't even exist. However, Google seems to be indexing many of these pages anyway, probably based on internal links pointing to them. We could nofollow the links pointing to these pages, but we don't want it to look like we're pagerank sculpting or something like that.

If we implement noindex on these pages (and doing so is a difficult task in itself), then we will be certain these pages aren't indexed. However, to do so we will have to remove the robots.txt disallowal in order to let the crawler read the noindex tag on these pages. Intuitively, it doesn't make sense to me to make Googlebot crawl zillions of vehicle details pages, all of which are noindexed, and it could easily get stuck/lost. It seems like a waste of resources, and in some shadowy way bad for SEO.

My developers are pushing for the third solution: using the hash URLs. This works on all hosts and keeps all functionality in the plugin self-contained (unlike noindex), and conserves crawl budget while keeping vehicle details pages out of the index (unlike robots.txt). But I don't want Google to slap us 6-12 months from now because it doesn't like links structured like this.

Any thoughts or advice you guys have would be hugely appreciated, as I've been going in circles, circles, circles on this for a couple of days now. Also, I can provide a test site URL if you'd like to see the functionality in action.
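For illustration, the hash (#) URL pattern described above might look something like this; the class name, data attribute and dialog function are invented stand-ins for the plugin's real code:

<!-- The href gives the crawler nothing to follow; JavaScript opens the Ajax dialog -->
<a href="#" class="vehicle-details-link" data-vehicle-id="12345">Contact Seller</a>

<script>
  // Invented stand-in for the plugin's Ajax dialog loader
  function openVehicleDialog(id) {
    console.log('Would load the details dialog for vehicle ' + id);
  }
  document.querySelectorAll('.vehicle-details-link').forEach(function (link) {
    link.addEventListener('click', function (event) {
      event.preventDefault(); // stop the browser jumping to "#"
      openVehicleDialog(link.getAttribute('data-vehicle-id'));
    });
  });
</script>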
How do I optimise my products for best results?
Hi, We have a number of products we want to optimise, for example: Barbeque Boss Double Oven Glove, Black. When performing keyword research, there are a number of generic terms such as "black oven gloves", "barbeque boss" and so on. Now I can write the page and optimise for these keyword phrases, but I am not sure this is the right way to go about it, particularly if I have several products in the range that are "black oven gloves" or "double oven gloves". How do I best structure my meta tags, meta description and descriptions? Should I just use the product title and optimise around this, in the hope that Google displays our page for any search queries containing words in the product title? Any advice would be appreciated. Thanks, Craig
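For illustration only, one way the <head> of such a product page could be structured, leading with the full product title and working one generic phrase into the description naturally (store name and copy invented):

<title>Barbeque Boss Double Oven Glove, Black | Example Store</title>
<meta name="description" content="The Barbeque Boss double oven glove in black: one of our range of black double oven gloves, heat resistant and machine washable.">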
Optimising a Dynamic Website?
A client has bought the Nostalgia WP theme. I've installed Yoast, but because the website is Ajax-based and the content for the pages is dynamically loaded, the plugin won't work. Or at least not to my knowledge? The developer doesn't currently have a solution, which from previous experience means it will never be supported. So I need some possible solutions here. Create a mobile site? Cons: more time, more money etc. Create non-dynamic pages linked in the footer area? Cons: page duplication etc. It's a small niche, so having the basic elements in place is imperative to getting it ranking.
Should we always avoid drop-down menus?
In Google's SEO Guide, they say avoid the use of drop-down menus, page 12: http://static.googleusercontent.com/external_content/untrusted_dlcp/www.google.com/en/us/webmasters/docs/search-engine-optimization-starter-guide.pdf But, is this always true? What if you create the drop down purely using HTML & CSS? Is it fine to use a bit of javascript to create the drop-down menu, or should it only be HTML & CSS?
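For illustration, a drop-down built purely with HTML and CSS keeps every link as a plain, crawlable <a href> in the page source; a minimal sketch (class names and URLs invented):

<style>
  .menu li ul { display: none; }
  .menu li:hover ul { display: block; }
</style>
<ul class="menu">
  <li><a href="/products/">Products</a>
    <ul>
      <li><a href="/products/widget-a/">Widget A</a></li>
      <li><a href="/products/widget-b/">Widget B</a></li>
    </ul>
  </li>
</ul>

Because the sub-menu links are ordinary anchors in the HTML, search engines can follow them whether or not the CSS hides them visually; script-driven menus tend only to become a problem when the links themselves are generated by JavaScript.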
Optimising My Website Link Containers
Hi, I'm looking at my link containers and trying to optimise them. I would be grateful if anyone can give me some feedback on my plan for perfect optimisation. My links are constructed as follows: I have two states:
1/. A Non Hover state which contains an Image and Text
2/. A Hover state which contains a bit more text - I do this as containing the full text in the non-hover state would not be good for users and would look ugly as well. Here's an example block of the HTML - as you can see from the URL, it's quite a deep page level. From the URL and Alt/Titles, the page I am linking to is about: "The Royal Hotel Accommodation New York Holidays". I'm just a bit confused on how I should apply Alt and Title (Titles in particular) attributes given the nested divs etc. - I can apply these at parent level, apply them at all levels, or apply a mix. Also, are there any obvious things you can think of that I am missing that may help on-site SEO? Thanks in advance. CURRENT UNOPTIMISED CODE:
The Royal Hotel
New York Holidays Accommodation
The Royal Hotel
MY OPTIMISED CODE (Adding Title and Alt attributes):
The Royal Hotel
New York Holidays Accommodation
The Royal Hotel
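The HTML of the example blocks above did not survive, leaving only their text, so as a hedged stand-in, the kind of nested link container being described might look like this (URL, class names and structure all invented):

<a href="/new-york/holidays/accommodation/the-royal-hotel/"
   title="The Royal Hotel - New York Holidays Accommodation">
  <img src="/images/royal-hotel.jpg"
       alt="The Royal Hotel - New York Holidays Accommodation">
  <span class="caption">The Royal Hotel</span>
  <span class="hover-caption">New York Holidays Accommodation - The Royal Hotel</span>
</a>

A common rule of thumb is to put the alt attribute on the <img> and, if a title attribute is used at all, on the <a> itself rather than repeating it on every nested element.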
Google Places optimisation for service franchise, 150 franchisees with no physical addresses?
So we have a client who is a plumbing franchise with about 150 franchisees across the country. Because it's a plumbing franchise, the businesses don't have street addresses (apart from the franchisee home addresses, but we don't want to use those). We used to have bulk-uploaded listings for the franchise locations, used the GPO address in each suburb/city as the address, and got away with this fine for years. Google has copped onto this and now asks for reverification of the listings by post. So my question is: what's the best way to optimise Places for 150+ locations? As a quick fix, we're going to add a new Places location as the master franchise HQ office (an address that exists). We can then add all the suburbs/areas serviced into this location, which may or may not show up for local searches in those areas. We could potentially verify all listings by mail by using private mailboxes, but mail verification on that scale is likely to be flaky, not to mention an admin nightmare. Does anyone have experience with this and how they got around it?