Should I use rel=canonical links for a news site with similar articles each year?
-
Our small-town news site covers a lot of seasonal topics, and we're struggling to get the current year's content to rank above previous years'.
For instance, every year we cover the local high school football team and create 2-3 articles per game. We'll also publish some preseason articles with the upcoming schedule and general team "talk".
We've seen articles from past seasons rank higher than the current season's, presumably because the older articles have more links to them from other sources (among other factors). We don't want to delete these old articles and 301 them to the newer ones, since most articles include information and stories about specific players, and their families don't want those articles to ever come down.
Should we rel=canonical the older articles to the newer one, or perhaps to the "high school football" category page? And if to the category page, should we rel=canonical even the new articles to it?
Thanks for the help.
-
Thanks for the feedback; I'm leaning toward the archiving approach mentioned.
I have a follow-up on that, though. In this high school football example, several times a year we'll get a flood of traffic around a former student who did something "big" in college or the pros. Since we have articles that rank well for that person's name, thanks to our coverage of them back in high school, we'll often double or triple our normal monthly pageviews when the student receives national attention.
If I archive those articles (many 4-5 years old), I'm assuming we'll lose the rankings for that former student's name, and therefore lose these bursts of traffic we've seen in the past.
Thoughts?
-
I'd move the older articles into an /archive/ subfolder (keeping a site search function available) and then exclude that archive from indexing. The pages stay up and can be found on-site by interested parties, while newer material gets all the glory.
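One common way to exclude the archive from indexing (assuming you can edit the page templates for anything under /archive/) is a robots meta tag on each archived article. This is a sketch, not a prescription for your CMS:

```html
<!-- In the <head> of each page under /archive/ -->
<!-- "noindex" asks search engines to drop the page from results;
     "follow" still lets them follow its links to newer articles -->
<meta name="robots" content="noindex, follow">
```

Note that for this tag to work, the /archive/ pages must remain crawlable; if you also block /archive/ in robots.txt, crawlers never see the noindex directive.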
-
I would make a permanent page for these. The current year's information would be displayed at the top of the page, with the archive readable below.
No redirects, no rel=canonical, fatter content, links accumulate over time, faster development in subsequent years. Visitors will like it.
-
Typically the canonical tag should be used when content is duplicate, such as when the same article appears at more than one URL or on more than one website. I don't see any reason to use the canonical tag here, since those articles are unique.
Since the Google Panda algorithm might come into play here, though, the older articles may eventually hurt your site's search engine rankings if there are too many of them. You might consider creating an archive of your content, perhaps on archive.yourdomain.com, and then stopping search engines from crawling that archive (while still making the articles available to readers).
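If you go the subdomain route, blocking crawlers comes down to serving a robots.txt at the root of the archive subdomain. A minimal sketch, using the hypothetical archive.yourdomain.com from above:

```
# Served at https://archive.yourdomain.com/robots.txt
# Block all crawlers from the entire archive subdomain;
# readers can still browse the pages normally.
User-agent: *
Disallow: /
```

Each subdomain is treated as its own host for robots.txt purposes, so this file affects only the archive, not your main site.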