How Does This Site Get Away With It?
-
The following site is huge in the movie trailer industry:
It ranks #3 in Google for "Movie Trailers" and has high rankings for multiple other major keywords in the industry.
Here's the thing: virtually all of their movie trailer pages contain content copied and pasted from other sites. The trailer descriptions are the ones supplied by the movie studios, so the same text appears on thousands of websites and blogs.
We all know Google hates duplicate content at the moment... so how does this site get away with it?
Does its root-domain authority keep it up there?
-
I have seen instances where sites that have to use duplicate content put it inside iframes, nofollow the iframes, and tell the robots.txt file to ignore them so the duplicated content isn't analyzed. It actually works, but to me it's sloppy code to have an iframe on every page, sometimes multiple. For that purpose, though, I guess it's better than getting penalized for duplicate content.
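For what it's worth, a minimal sketch of that setup might look like the following; the /embeds/ path and file name are made up for illustration, not taken from any real site:

```html
<!-- Movie page: the studio-supplied description lives in an iframe
     rather than in the main document, so the duplicated text is not
     part of this page's own indexable content.
     The /embeds/ path is a hypothetical example. -->
<iframe src="/embeds/trailer-description-1234.html"
        width="600" height="200"
        title="Studio trailer description"></iframe>
```

```
# robots.txt: keep crawlers out of the embed pages entirely,
# so the duplicated descriptions are never fetched or analyzed.
User-agent: *
Disallow: /embeds/
```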
Have a great night.
Matthew Boley
-
Hey Rhys.
A few questions.
Does this site have an affiliate feed?
For example, are other people copying the content from this site through an affiliate feed of some sort?
That could be part of the explanation here.
(Actually, they do; I just found it.)
In the case of duplicate content, Google judges how trustworthy a site's content is based on its overall ranking. So if this site is putting out content every day and then pinging Googlebot to come crawl the site first, that's all it would need to be verified as the originator of the content. Other sites can then easily copy the content from this site, but as long as it gets indexed on this site first, this site wouldn't really have a problem.
Same thing goes for a blog you write.
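As a rough sketch of that ping step, one common way to do it at the time was Google's sitemap ping endpoint (since deprecated); the sitemap URL below is a placeholder, not the site in question:

```python
import urllib.parse
import urllib.request

# Placeholder: sitemap of the site that just published new content.
SITEMAP_URL = "https://www.example.com/sitemap.xml"

# Asking Google to re-fetch the sitemap prompts a crawl of the new URLs,
# helping the publishing site get indexed (and credited as the
# originator) before scrapers re-post the same text.
ping_url = "https://www.google.com/ping?sitemap=" + urllib.parse.quote(
    SITEMAP_URL, safe=""
)

with urllib.request.urlopen(ping_url) as resp:
    print("Ping response:", resp.status)  # 200 means the ping was accepted
```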
You would have to dig a little deeper into their link structure. Their overall ranking is pretty high; they have tons of spun links pointing at each URL, about 100 linking pages per domain, with a few good links from MTV and VH1 and otherwise mostly blogs.
But yeah, you're right, their search engine traffic is off the charts. They rank really well for some movie names as well.
What is your end goal in running a comparison against them?
Their root domain authority is fairly high, and that plays a big part in how well they rank for a lot of these keywords.
The site is about 5.3 years old, and its domain authority is around 74 to 79, depending on where you look.
It's a PR6, but you might need to dig deeper.
-
It could be that other key factors, like traffic and incoming links, are astronomical, so Google's duplicate-content penalty is outweighed.
It's a case of "doing it better" and not necessarily doing it first. While they may be scraping content and it's all duplicated, they must simply be getting huge traffic numbers and incoming links because all that content is in one place.
I'm just assuming, of course; without doing an audit, who knows. But it must be frustrating for you if you are working on a campaign against them.