What's the extent of the penalization applied by Google?
-
Hi!
I still don't understand the website penalization Google applies for duplicate content.
My site (a Photoshop site) has many pages that ranked among the first positions for top keywords.
Those pages were linked by sites like LifeHacker, BoingBoing, Microsiervos, SmashingMagazine, John Nack, and many other well-known blogs.
After mid-February 2012 everything went down the drain. I lost half of my traffic, and my well-ranked pages are now almost nowhere to be found.
I have plenty of ads on some pages of my site, and duplicate content (Amazon product descriptions only) on other pages.
So the good-quality pages my site has are no longer considered good quality just because I have some duplicate content or ad-filled pages?
I'm not complaining. I'm trying to understand this.
Google needs to serve good information to its visitors. But since they found some trash on my site, they decided to remove both the trash and the good information from the search engine?
That doesn't sound logical to me. Why don't they just remove the trash and leave the good content?
Of course, I understand that information is added every day and someone may come up with something better than mine, but dropping 40 or more places in the rankings sounds more like a penalty to me.
Again, I'm not complaining (although it sounds like I am!), I just want to understand the reasons behind this.
Thanks,
Enrique
-
Yes, thanks, Anthony. I will post back as soon (soon..?) as I find something.
Enrique
-
Sometimes what you call natural, Google calls spammy or unnatural. Just sayin'. Good luck. Post back with your findings; I'm interested to see how things work out for you. Best regards.
-
Yes, thanks, I will check that. I was planning to add nofollow to the Amazon pages, and I will also check the anchors (a rough check is sketched below), but since they are all natural, any change I make will look artificial.
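Something like this is what I have in mind for checking the datafeed pages. It's just a rough Python sketch on my side: the URLs are placeholders, and it assumes the requests and beautifulsoup4 libraries are installed.

```python
# Rough sketch: check whether datafeed pages carry a noindex directive
# and whether their Amazon links use rel="nofollow".
# The URLs below are placeholders; pip install requests beautifulsoup4 first.
import requests
from bs4 import BeautifulSoup

DATAFEED_PAGES = [
    "http://www.example.com/datafeed/product-1.html",  # placeholder URLs
    "http://www.example.com/datafeed/product-2.html",
]

for url in DATAFEED_PAGES:
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")

    # Look for <meta name="robots" content="noindex, ..."> in the page head
    robots_meta = soup.find("meta", attrs={"name": "robots"})
    noindexed = robots_meta is not None and "noindex" in robots_meta.get("content", "").lower()

    # Count Amazon links that are missing rel="nofollow"
    missing_nofollow = [
        a["href"]
        for a in soup.find_all("a", href=True)
        if "amazon." in a["href"] and "nofollow" not in (a.get("rel") or [])
    ]

    print(f"{url}: noindex={noindexed}, Amazon links without nofollow={len(missing_nofollow)}")
```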
Enrique
-
Have you tried removing the Amazon data feed from those pages, just to see if that is in fact what is impacting your rankings? What about the thousands of natural links pointing to your site? Are they all using varied anchor text, or is it just five anchors for your five main pages? If it's just five, that could also be affecting your ranking.
-
Yes, I know that's the thing to do, but you must agree with me that it's something unnatural.
I have thousands of incoming links, and I only exchanged or asked for fewer than 20 of those. The rest are natural. If I spent time analyzing links, the result would be something absolutely artificial.
The same goes for quality pages. Let's say I have four or five pages that are the most referenced in my industry (just an example, of course). Visitors who read those pages get really good, top-class information. But I have an Amazon datafeed on my site.
Suddenly, the information on those top-quality pages is hidden from Google's search results because my site has an Amazon datafeed?
I know it's a simplistic example, but it can be translated as:
"A good article isn't good anymore just because of a site penalty"
It seems that Google is saying something like, "Hey, you can't read this amazing article because it comes from a site that has lots of junk. So suck it up and read this lesser-quality article from a pristine site!"
It is not about my site anymore, but about trying to understand the concept of it all. And of course it is an extreme example, but I think it is relevant.
-
No, Google does care about good-quality pages. It's just that if you throw in a bunch of bad pages, they dilute the goodness of your good pages. Once you clean up the duplicate content, I would suggest running a report on your inbound links (a rough way to do that is sketched below). Check whether your anchor text is spammy or concentrated on only a few choice keywords. When it comes to link building, you want to spread out the keywords so there aren't just one or two money keywords carrying all the anchor text.
Also, I would remove any inbound links from questionable directories. Once you do that, I would think you should see some significant gains in rankings.
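For example, a quick pass over an exported backlink report can show whether one or two anchors dominate. This is only a sketch: the file name, the "Anchor Text" column, and the 20% threshold are assumptions, so adjust them to whatever your link tool actually exports.

```python
# Rough sketch: summarize anchor text distribution from an exported
# backlink report (CSV). Column and file names are assumptions.
import csv
from collections import Counter

anchors = Counter()
with open("inbound_links.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        anchors[(row.get("Anchor Text") or "").strip().lower()] += 1

total = sum(anchors.values())
print(f"{total} links, {len(anchors)} distinct anchors")
for anchor, count in anchors.most_common(10):
    share = 100.0 * count / total
    flag = "  <-- heavily concentrated?" if share > 20 else ""  # 20% is an arbitrary cutoff
    print(f"{count:5d}  {share:5.1f}%  {anchor!r}{flag}")
```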
-
Thanks! So it is clear that Google doesn't care about single good-quality pages with good-quality links.
A good quality page needs a quality site to back it up.
Is that the criteria?
It sounds reasonable to me, but very difficult to repair.
Just for the record, my site isn't trash or low quality, but it is an old site and has some quirks from older times: lots of directory entries with little content, and datafeeds that used to work very well some years ago.
-
The trash part of your site affects the site as a whole, not just the trash pages themselves. If Google penalized only those pages, you would still benefit from using trash to promote your good pages.
Now, from what I understand about penalties, there is a manual penalty and an algorithmic (natural) penalty.
The algorithmic penalty can be fixed fairly easily by addressing the issue behind it, which in your case would be duplicate content. Clean up all the duplicate content and you will be on your way to flying under the penalty radar, so to speak. However, you will still need to add more quality content to make up for the duplicate content you removed or cleaned up.
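If you need a starting list of pages to clean up, something like this rough script could flag pages whose visible text is nearly identical. Again, just a sketch: the URLs are placeholders, the 0.9 similarity threshold is an assumption, and it assumes requests and beautifulsoup4 are installed.

```python
# Rough sketch: flag pairs of pages with nearly identical visible text,
# as a starting list for duplicate-content cleanup.
from difflib import SequenceMatcher
from itertools import combinations

import requests
from bs4 import BeautifulSoup

PAGES = [
    "http://www.example.com/products/a.html",  # placeholder URLs
    "http://www.example.com/products/b.html",
    "http://www.example.com/products/c.html",
]

def main_text(url):
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    # Drop scripts/styles and collapse whitespace so only the visible copy is compared
    for tag in soup(["script", "style"]):
        tag.decompose()
    return " ".join(soup.get_text(separator=" ").split())

texts = {url: main_text(url) for url in PAGES}

for (u1, t1), (u2, t2) in combinations(texts.items(), 2):
    ratio = SequenceMatcher(None, t1, t2).ratio()
    if ratio > 0.9:  # near-duplicate threshold (assumption)
        print(f"{ratio:.2f}  {u1}  <->  {u2}")
```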
Once that takes place you should notice your ranking drop stabilize, and over time begin the crawl back up. This would be a good time to implement other strategies like social media and quality link building.
Now, if it's a manual penalty, then you need to clean up all the duplicate content, ask for a manual review, and pray. Manual penalties are hard to overcome and will require much more work. Sometimes it's best to just start over with a new domain from scratch.
Hope this helps some.