Content Cannibalism Question with example
-
Hi,
Since I love writing and I write a lot, I always find myself worried about hurting my own rankings through content cannibalization.
Yesterday, while reading up on diamonds, I came across a highly ranked website with two pages ranking on the first page simultaneously (4th and 5th). I had never noticed Google doing that before.
The term I googled was "vvs diamonds" and the two pages were: http://bit.ly/1N51HpQ and http://bit.ly/1JefWYS
Two questions:
1. Does that happen often with Google (showing two results from the same site on the first page)?
2. Would it be better practice for the writer to combine them, creating one more powerful page?
Thanks
-
Google will frequently rank two pages from the same site in the same SERP if it feels that both pages serve the user intent of the query. This often happens, as with these two pages, when both cover the same topic but answer slightly different questions, either of which could be what the user is really asking. In your example, the two pages Google is serving answer closely related but slightly different questions: "What is VVS diamond clarity?" and "What is the difference between VS and VVS diamond clarity?"
It might be advisable for this site to combine the two pages if, for example, the wrong page were ranking for the query, or one page were getting all the traffic and the other none. Another solution would be to make them more distinct from each other, rather than tackling two long-tail variations on the same overall topic.
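If the site did decide to combine the two pages, standard practice is to 301-redirect the retired URL to the surviving one so that its links and rankings consolidate there. A minimal sketch as an Apache `.htaccess` fragment; both paths here are hypothetical, since the thread's actual URLs are behind shorteners:

```apacheconf
# Hypothetical paths: permanently redirect the retired
# comparison page to the combined VVS clarity guide.
Redirect 301 /education/vs-vs-vvs-clarity /education/vvs-diamond-clarity
```

A 301 (permanent) rather than 302 (temporary) redirect is the signal that tells search engines the old page has moved for good.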
I would not recommend deliberately creating two pages on long-tail variations of the same topic to try to lock down two spots in a SERP; your time would likely be better spent researching which specific long-tail topics people are searching for, and creating content to serve those users' needs. Umar does have a good point that a SERP with two results from the same domain often presents an opportunity to take one of those spots.
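To find cases like the one in the question at scale, you can group ranking data by query and domain and flag queries where one domain ranks with two or more distinct URLs. A minimal sketch, assuming a hypothetical export of `(query, ranking_url)` rows from a rank tracker or Search Console; the example URLs are made up:

```python
from collections import defaultdict
from urllib.parse import urlparse

def find_cannibalization(rows):
    """Return {(query, domain): urls} where one domain ranks
    with 2+ distinct URLs for the same query."""
    pages = defaultdict(set)
    for query, url in rows:
        domain = urlparse(url).netloc
        pages[(query, domain)].add(url)
    return {
        key: sorted(urls)
        for key, urls in pages.items()
        if len(urls) >= 2
    }

rows = [
    ("vvs diamonds", "https://example.com/what-is-vvs-clarity"),
    ("vvs diamonds", "https://example.com/vs-vs-vvs-clarity"),
    ("vvs diamonds", "https://other.com/vvs-guide"),
    ("diamond cuts", "https://example.com/diamond-cuts"),
]
flagged = find_cannibalization(rows)
# flagged contains one entry: ("vvs diamonds", "example.com")
```

Whether a flagged pair is a problem or an opportunity still takes the human judgment described above: two intents, two pages; one intent, consider merging.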
-
Hey,
I agree. This site could combine both pages, not only to rank better but also to get a better response from users and attract links. I'm not sure, but it seems they may have done it intentionally to rank for different keywords. The idea is a bit old, but it still works.
I'd also say that in this scenario, where Google shows two results from the same site on the first page, it's easy to outrank at least one of those results by producing something more powerful and interesting. I did that once in the past and it worked out.
Umar