What's the extent of the penalization applied by Google?
-
Hi!
I still don't get this website penalization applied by Google due to duplicate content.
My site (a Photoshop website) has many pages that were among the first positions for top keywords.
Those pages were linked by sites like LifeHacker, BoingBoing, Microsiervos, SmashingMagazine, John Nack, and many other well known blogs.
After mid-February 2012, everything went down the drain. I lost half of my traffic, and my well-ranked pages are now almost nowhere to be found.
I have plenty of ads on some pages of my site, and duplicate content (Amazon product descriptions only) on other pages.
So the good-quality pages my site has are no longer considered good quality just because I have some duplicate-content or ad-filled pages?
I'm not complaining. I'm trying to understand this.
Google needs to serve good information to its visitors. But since they found some trash on my site, they decided to remove both the trash and the good information from the search results?
That doesn't sound logical to me. Why don't they just remove the trash and leave the good content?
Of course, I understand that information is added every day and someone may come up with something better than mine, but dropping 40 or more places in the rankings sounds more like a penalty to me.
Again, I'm not complaining (although it sounds like I am!), I just want to understand the reasons behind this.
Thanks,
Enrique
-
Yes, thanks, Anthony. I will post back as soon (soon..?) as I find something.
Enrique
-
Sometimes what you call natural, Google calls spammy or unnatural. Just sayin'. Good luck. Post back with your findings. I'm interested to see how things work out for you. Best regards.
-
Yes, thanks, I will check that. I was planning to add nofollow to the Amazon pages; I will also check the anchors, but since they are all natural, any change I make will look artificial.
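For anyone wanting to audit this, here is a minimal sketch of how the nofollow check could be scripted. It assumes BeautifulSoup is installed (pip install beautifulsoup4); the site_pages folder and the Amazon URL pattern are hypothetical placeholders, not the actual setup described above:

```python
# Scan local HTML files for Amazon links that lack rel="nofollow".
from pathlib import Path
from bs4 import BeautifulSoup

for path in Path("site_pages").glob("*.html"):  # hypothetical export folder
    soup = BeautifulSoup(path.read_text(encoding="utf-8"), "html.parser")
    for link in soup.find_all("a", href=True):
        if "amazon." in link["href"]:
            rel = link.get("rel") or []  # bs4 returns rel as a list
            if "nofollow" not in rel:
                print(f"{path.name}: {link['href']} is missing rel=nofollow")
```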
Enrique
-
Have you tried removing the Amazon data feed from those pages, just to see if that is in fact what is impacting your rankings? What about the thousands of natural links pointing to your site? Are they all using varied anchor text, or is it just five anchors for your five main pages? If it's just five, that could also be affecting your ranking.
-
Yes, I know that's the thing to do, but you must agree with me that it's something unnatural.
I have thousands of incoming links, and I exchanged or asked for fewer than 20 of those. The rest are natural. If I spent time analyzing links, it would be absolutely artificial.
The same goes for quality pages. Let's say that I have four or five pages that are the most referenced in my industry (just an example, of course). Visitors who read those pages get really good, top-class information. But I have an Amazon datafeed on my site.
Suddenly, the information on those top-quality pages is hidden from Google's search results because my site has an Amazon datafeed?
I know it's a simplistic example, but it can be translated as:
"A good article isn't good anymore just because of a site penalty"
It seems that Google is saying something like, "Hey, you can't read this amazing article because it is from a site that has lots of junk. So suck it up and read this article of lesser quality but from a pristine site!"
It is not about my site anymore, but about trying to understand the concept of it all. And of course it is an extreme example, but I think it is relevant.
-
No, Google does care about good-quality pages. It's just that if you throw in a bunch of bad pages, they dilute the goodness of your good pages. Once you clean up duplicate content, I would suggest running a report on your inbound links. Check to see if your anchor text is spammy, or concentrated on only a few choice keywords. When it comes to link building, you want to spread out the keywords so there aren't just one or two money keywords carrying all of the anchor text.
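As a rough illustration of that kind of report, here is a minimal sketch that tallies anchor-text distribution from a CSV export of inbound links (say, from a backlink tool). The inbound_links.csv filename and the anchor_text column are assumptions, not any specific tool's format:

```python
# Tally how often each anchor text appears in a backlink export.
import csv
from collections import Counter

counts = Counter()
with open("inbound_links.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        counts[row["anchor_text"].strip().lower()] += 1

total = sum(counts.values())
for anchor, n in counts.most_common(10):
    # One or two anchors dominating the profile is the pattern to watch for.
    print(f"{anchor!r}: {n} links ({n / total:.1%})")
```

If one or two "money" keywords account for most of the links, that is the kind of concentration described above.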
Also, I would remove any inbound links from questionable directories. Once you do that, I would think you should see some significant gains in rankings.
-
Thanks! So it is clear that Google doesn't care about single good-quality pages with good-quality links.
A good quality page needs a quality site to back it up.
Is that the criteria?
It sounds reasonable to me, but very difficult to repair.
Just for the record, my site isn't trash or low quality, but it is an old site and has some quirks from the old days: lots of directory entries with little content, and datafeeds that used to work very well some years ago.
-
The trash part of your site affects the site as a whole, not just the trash pages themselves. If Google penalized only those pages, you would still benefit from using trash to promote your good pages.
Now, from what I understand about penalties, there is a manual penalty and an algorithmic (or natural) penalty.
The algo penalty can be fixed fairly easily by addressing the issue behind it, which in your case would be duplicate content. Clean up all duplicate content and you will be on your way to flying under the penalty radar, so to speak. However, you will still need to add more quality content to make up for what you removed or cleaned up.
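As a starting point for that cleanup, here is a hedged sketch of one way to flag near-duplicate pages (for example, thin Amazon-description pages) using word shingles and Jaccard similarity. The page texts, URLs, and the 0.8 threshold are illustrative assumptions:

```python
# Flag pairs of pages whose text overlaps heavily.
from itertools import combinations

def shingles(text: str, k: int = 5) -> set:
    """Return the set of k-word shingles in the text."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def jaccard(a: set, b: set) -> float:
    union = a | b
    return len(a & b) / len(union) if union else 0.0

pages = {  # hypothetical url -> extracted body text
    "/product-1": "this amazon product description was copied from the datafeed",
    "/product-2": "this amazon product description was copied from the datafeed",
}

for (u1, t1), (u2, t2) in combinations(pages.items(), 2):
    score = jaccard(shingles(t1), shingles(t2))
    if score > 0.8:  # threshold is a judgment call
        print(f"{u1} and {u2} look like duplicates (Jaccard {score:.2f})")
```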
Once that takes place, you should notice your ranking drop stabilize and, over time, begin the crawl back up. This would be a good time to implement other strategies like social media and quality link building.
Now, if it's a manual penalty, you need to clean up all duplicate content, ask for a manual review, and pray. Manual penalties are hard to overcome and will require much more work. Sometimes it's best to just start from scratch with a new domain.
Hope this helps some.