What's the extent of the penalization applied by Google?
-
Hi!
I still don't understand the website penalization Google applies for duplicate content.
My site (a Photoshop website) has many pages that ranked among the first positions for top keywords.
Those pages were linked by sites like LifeHacker, BoingBoing, Microsiervos, SmashingMagazine, John Nack, and many other well-known blogs.
After mid-February 2012 everything went down the drain. I lost half of my traffic, and my well-ranked pages are now almost nowhere to be found.
I have plenty of ads on some pages of my site, and duplicate content (Amazon product descriptions only) on other pages.
So the good-quality pages my site has are no longer considered good quality just because I have some duplicate content or ad-filled pages?
I'm not complaining; I'm trying to understand this.
Google needs to serve good information to its visitors. But since they found some trash on my site, they decided to remove both the trash and the good information from the search engine?
That doesn't sound logical to me. Why don't they just remove the trash and leave the good content?
Of course, I understand that information is added every day and someone may come up with something better than mine, but dropping 40 or more places in the rankings sounds more like a penalty to me.
Again, I'm not complaining (although it sounds like I am!), I just want to understand the reasons behind this.
Thanks,
Enrique
-
Yes, thanks, Anthony. I will post back as soon (soon...?) as I find something.
Enrique
-
Sometimes what you call natural, Google calls spammy or unnatural. Just sayin'. Good luck. Post back with your findings. I'm interested to see how things work out for you. Best regards.
-
Yes, thanks, I will check that. I was planning to add nofollow to the Amazon pages. I will also check the anchors, but since they are all natural, any change I make will look artificial.
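In case it helps anyone else, this is roughly the script I have in mind for batch-adding nofollow with Python and BeautifulSoup. It's only a sketch of my own setup: the folder path and the "amazon." check are placeholders, not anything official.

```python
import os
from bs4 import BeautifulSoup

PAGES_DIR = "site/amazon-pages"  # hypothetical local copy of the affected pages

def add_nofollow(html: str) -> str:
    """Add rel="nofollow" to every outbound Amazon link in an HTML document."""
    soup = BeautifulSoup(html, "html.parser")
    for link in soup.find_all("a", href=True):
        if "amazon." in link["href"]:
            # Preserve any rel values already present (e.g., "noopener")
            rel = set(link.get("rel", []))
            rel.add("nofollow")
            link["rel"] = sorted(rel)
    return str(soup)

for name in os.listdir(PAGES_DIR):
    if name.endswith(".html"):
        path = os.path.join(PAGES_DIR, name)
        with open(path, encoding="utf-8") as f:
            updated = add_nofollow(f.read())
        with open(path, "w", encoding="utf-8") as f:
            f.write(updated)
```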
Enrique
-
Have you tried removing the Amazon data feed from those pages, just to see if that is in fact what is impacting your rankings? What about the thousands of natural links pointing to your site? Are they all using varied anchor text, or is it just five anchors for your five main pages? If it's just five, that could also be affecting your rankings.
-
Yes, I know that's the thing to do, but you must agree with me that it's something unnatural.
I have thousands of incoming links, and I only exchanged or asked for fewer than 20 of those. The rest are natural. If I spent time analyzing links, it would be something absolutely artificial.
The same goes for quality pages. Let's say that I have four or five pages that are the most referenced in my industry (just an example, of course). Visitors who read those pages get really good, top-class information. But I have an Amazon datafeed on my site.
Suddenly, the information on those top-quality pages is hidden from Google's search results because my site has an Amazon datafeed?
I know it's a simplistic example, but it can be translated as:
"A good article isn't good anymore just because of a site penalty"
It seems that Google is saying something like, "Hey, you can't read this amazing article because it comes from a site that has lots of junk. So suck it up and read this lesser-quality article from a pristine site!"
It is not about my site anymore, but about trying to understand the concept of it all. And of course it is an extreme example, but I think it is relevant.
-
No, Google does care about good-quality pages. It's just that if you throw in a bunch of bad pages, they dilute the goodness of your good ones. Once you clean up the duplicate content, I would suggest running a report on your inbound links. Check whether your anchor text is spammy or concentrated on only a few choice keywords. When it comes to link building, you want to spread the keywords out so that one or two money keywords aren't carrying all the anchor text (see the sketch at the end of this reply).
Also, I would remove any inbound links from questionable directories. Once you do that, I would think you should see some significant gains in rankings.
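For example, here's a minimal sketch of how you could eyeball anchor-text concentration from a link report export. The file name and the anchor_text column are assumptions; substitute whatever your link tool actually exports.

```python
import csv
from collections import Counter

LINKS_CSV = "inbound_links.csv"  # hypothetical export from your link report tool

# Count how often each anchor text appears across all inbound links
anchors = Counter()
with open(LINKS_CSV, newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        anchors[row["anchor_text"].strip().lower()] += 1

total = sum(anchors.values())
print(f"{total} links, {len(anchors)} distinct anchors")
for anchor, count in anchors.most_common(10):
    # One or two anchors carrying a large share of the links is the red flag
    print(f"{count / total:6.1%}  {anchor}")
```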
-
Thanks! So it is clear that Google doesn't care about single good-quality pages with good-quality links.
A good-quality page needs a quality site to back it up.
Is that the criteria?
It sounds reasonable to me, but very difficult to repair.
Just for the record, my site isn't trash or low quality, but it is an old site and has some quirks from the old days: lots of directory entries with little content, and datafeeds that used to work very well some years ago.
-
The trash parts of your site affect the site as a whole, not just the trash pages themselves. If Google penalized only those pages, you would still benefit from using trash to promote your good pages.
Now, from what I understand about penalties, there is a manual penalty and an algorithmic (or natural) penalty.
The algorithmic penalty can be fixed by addressing the issue that triggered it, which in your case would be duplicate content. Clean up all the duplicate content and you will be on your way to flying under the penalty radar, so to speak. However, you will still need to add more quality content to make up for what you removed or cleaned up.
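If you want a first pass at finding the duplicated pages yourself, here's a rough sketch. It assumes a local dump of page text, difflib is a crude similarity measure, and the 0.9 cutoff is an arbitrary assumption; a dedicated duplicate-content tool will do this better.

```python
import difflib
import itertools
import os

PAGES_DIR = "site-copy"  # hypothetical local dump of page text, one .txt per page
THRESHOLD = 0.9          # arbitrary similarity cutoff; tune for your content

pages = {}
for name in os.listdir(PAGES_DIR):
    if name.endswith(".txt"):
        with open(os.path.join(PAGES_DIR, name), encoding="utf-8") as f:
            pages[name] = f.read()

# Compare every pair of pages and report the near-duplicates.
# This is O(n^2) — fine for a few hundred pages; use shingling/hashing at scale.
for (a, text_a), (b, text_b) in itertools.combinations(pages.items(), 2):
    ratio = difflib.SequenceMatcher(None, text_a, text_b).ratio()
    if ratio >= THRESHOLD:
        print(f"{ratio:.2f}  {a} <-> {b}")
```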
Once that takes place, you should notice the ranking drop stabilize and, over time, begin the crawl back up. This would be a good time to implement other strategies like social media and quality link building.
Now, if it's a manual penalty, then you need to clean up all the duplicate content, ask for a manual review, and pray. Manual penalties are hard to overcome and will require much more work. Sometimes it's best to just start over with a new domain from scratch.
Hope this helps some.