2 Similar Pages
-
Hello,
I have two very similar pages. The first is an apartment rental page with a map, rental listings, and some neighborhood data below. The neighborhood data includes useful info about the area: photos, text about the area, crime rates, average rental rates, etc. The second page is a neighborhood guide that includes virtually the same data as the rental page, but in longer form, i.e. more photos, more text, etc.
I want the rental page to rank, while ranking the neighborhood page is not important as it would be used more for link bait. But since the information on the two pages is the same, I don't want them to compete with each other.
I'm thinking of putting a canonical tag on the neighborhood page pointing back to the rental page. Is that the correct thing to do in this instance?
Thanks for your help.
J
-
The other question is how will the "link bait" page be found by potential linkers? It won't show up in search anymore, if you point a canonical to the other page. [Maybe it is just very easy to find from another popular page on your site, which is easily findable?]
-
Hi Jon,
This is exactly the situation the canonical tag is intended for. Have a quick read of what Google says about these cases:
https://support.google.com/webmasters/answer/139066?hl=en
- Consolidating link signals for the duplicate or similar content. It helps search engines to be able to consolidate the information they have for the individual URLs (such as links to them) on a single, preferred URL. This means that links from other sites to http://example.com/dresses/cocktail?gclid=ABCD get consolidated with links to https://www.example.com/dresses/green/greendress.html.
While Google does say it is a recommendation rather than a directive, I find that in situations like this it will work just fine for you. As David said, give it a go; I can't see that you will have any problems. Google is pretty good at sorting these issues out, especially if you tell them which is your preferred page for ranking:
"While we encourage you to use any of these methods, none of them are required. If you don't indicate a canonical URL, we'll identify what we think is the best version or URL."
-Andy
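For reference, here is a minimal sketch of what that tag would look like in the head of the neighborhood guide page. The URLs are hypothetical placeholders, not your actual pages:

```html
<!-- Placed in the <head> of the neighborhood guide (link bait) page,
     telling search engines to treat the rental page as the preferred
     URL to index and rank. Both URLs here are made-up examples. -->
<head>
  <title>Some Neighborhood Guide</title>
  <link rel="canonical" href="https://www.example.com/apartments/some-neighborhood/" />
</head>
```

The tag goes on the page you do NOT want ranking, and the href points at the page you do.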
-
Hi Jon,
A canonical tag can be used as you described, but you may have issues in your case.
The issue is that search engines treat canonical tags as a recommendation, not a directive. This means they can choose to ignore the tag if they decide it isn't appropriate.
In your case, if the canonicalized (link bait) page attracts more external links than the rental page, the chance of the canonical tag being ignored increases, because search engines might see it as the better page.
I think it would be best to have a single, high-quality page, but if you absolutely must keep the pages separate, use the canonical tag and see how things go.
Cheers,
David
Related Questions
-
Ranked page is not desired page
I have a question on a problem I am currently faced with. There is a certain keyword that my employer wants to rank for. The good news is that sometimes it does rank in the top 5 pages of Google. (It drops in and out) The bad news is that it is going to a page that we need to keep, but not the ideal place we want people who are looking for that keyword to go to. I was wondering if anyone has had any experience with this type of situation and what tactic they used to get people to the better page.
On-Page Optimization | trumpfinc1
Similar URLs
I'm making a site of LSAT explanations. The content is very meaningful for LSAT students; I'm less sure the URLs and headings are meaningful for Google. I'll give you an example. Here are the URLs and headings for two separate pages:
http://lsathacks.com/explanations/lsat-69/logical-reasoning-1/q-10/ - LSAT 69, Logical Reasoning I, Q 10
http://lsathacks.com/explanations/lsat-69/logical-reasoning-2/q10/ - LSAT 69, Logical Reasoning II, Q10
There are two logical reasoning sections on LSAT 69. The first URL is for question 10 from the first section; the second URL is for question 10 from the second LR section. I noticed that google.com only displays 23 URLs when I search "site:http://lsathacks.com". A couple of days ago it displayed over 120 (i.e. the entire site). 1. Am I hurting myself with this structure, even if it makes sense for users? 2. What could I do to avoid it? I'll eventually have thousands of pages of explanations. They'll all be very similar in terms of how I would categorize them to a human, e.g. "LSAT 52, logic games question 12". I should note that the content of each page is very different, but the URL, title, and H1 are similar. Edit: I could, for example, add a random keyword to differentiate titles and URLs (but not the H1). For example:
http://lsathacks.com/explanations/lsat-69/logical-reasoning-2/q10-car-efficiency/ - LSAT 69, Logical Reasoning II, Q10, Car efficiency
But the URL is already fairly long as is. Would that be a good idea?
On-Page Optimization | graemeblake0
A question on cached pages
Hi all. I have been testing different ways of writing the text on the homepage of www.document-management-solutions.co.uk because we are ranking very low for the keyword Document Management Software. I've recently learnt about pages being 'cached' and wanted to clarify something. The last time the page was cached was on 26th March. I have made changes to the homepage since then and want to see how they have affected our ranking. Will I not be able to tell whether they have had a positive/negative effect until the next time our page is cached? Thanks
On-Page Optimization | janc0
Too Many On-Page Links
Hi, I ran an SEOmoz campaign and got the results today. One of the warnings is "Too Many On-Page Links", and when I drill down, I see that the count includes internal links. For example, I sell food: my main department page shows 30 products, and each product links to a detailed page about it, so I automatically have 30 links. Not counting all the others on the page, I easily get over 100, and sometimes even 200. Is this a big issue? Does it damage my SEO? If yes, is there a way to write the HTML so that internal links like these won't be counted? Thank you, SEOWiseUs
On-Page Optimization | iivgi0
Editing Author Pages
Hi, Quick question regarding author pages. The blog's author page is set to me, so the URL is: mysite.com/blog/author/miles/ Now, SEOmoz has picked up that my author page is missing a meta description. But this cannot be edited through WordPress, as there is no edit option available. I may really be missing something, but where can I alter the author page? I have a feeling it might be fed from G+, but I'm not really sure what part of G+ is used as the description. Thanks, Miles
On-Page Optimization | easyrider20
On-page keyword usage
SEOmoz gave me all zeros for keyword usage. Why? The site is www.grass2greens.com and the keywords are "Asheville Landscaping Edible." The site includes these words in the page title and throughout the body text. I am not really sure, but one cause for these low keyword usage ratings might be redirects or meta tag issues. Any ideas?
On-Page Optimization | dcaudio0
Should I convert PDFs to pages?
I have a client with a lot of content in PDF files that are linked to from their website. The content on the site itself is quite thin. Should I recommend that they convert at least some of the PDF files into actual pages on their website? That way each could have a title tag, meta description, header tags, etc. associated with the content. What role do PDF files play in SEO? Thanks!
On-Page Optimization | bvalentine0
Avoiding "Duplicate Page Title" and "Duplicate Page Content" - Best Practices?
We have a website with a searchable database of recipes. You can search the database using an online form with dropdown options for:
Course (starter, main, salad, etc)
Cooking Method (fry, bake, boil, steam, etc)
Preparation Time (Under 30 min, 30 min to 1 hour, Over 1 hour)
Here are some examples of how URLs may look when searching for a recipe:
find-a-recipe.php?course=starter
find-a-recipe.php?course=main&preperation-time=30min+to+1+hour
find-a-recipe.php?cooking-method=fry&preperation-time=over+1+hour
There is also pagination of search results, so the URL could also have the variable "start", e.g. find-a-recipe.php?course=salad&start=30. There can be any combination of these variables, meaning there are hundreds of possible search result URL variations. This all works well on the site, but it gives multiple "Duplicate Page Title" and "Duplicate Page Content" errors when crawled by SEOmoz. I've searched online and found several possible solutions, such as: setting a canonical tag; adding these URL variables to Google Webmaster Tools to tell Google to ignore them; changing the title tag in the head dynamically based on which URL variables are present. However, I am not sure which of these would be best. As far as I can tell, the canonical tag should be used when you have the same page available at two separate URLs, but that isn't the case here, as the search results are always different. Adding the URL variables to Google Webmaster Tools won't fix the problem in other search engines, and we would presumably continue to get these errors in our SEOmoz crawl reports. Changing the title tag each time can lead to very long title tags, and it doesn't address the duplicate page content. I had hoped there would be a standard solution for problems like this, as I imagine others have come across them before, but I cannot find the ideal one. Any help would be much appreciated. Kind regards
On-Page Optimization | smaavie