What's the extent of the penalization applied by Google?
-
Hi!
I still don't get this website penalization Google applies for duplicate content.
My site has many pages that were among the first positions for top keywords (a Photoshop site).
Those pages were linked by sites like LifeHacker, BoingBoing, Microsiervos, SmashingMagazine, John Nack, and many other well known blogs.
After mid-February 2012 everything went down the drain. I lost half of my traffic, and my well-ranked pages are now almost nowhere to be found.
I have plenty of ads on some pages of my site, and duplicate content (Amazon product descriptions only) on other pages.
So the good-quality pages my site has are no longer considered good quality just because I have some duplicate content or ad-filled pages?
I'm not complaining. I'm trying to understand this.
Google needs to serve good information to its visitors. But since they found some trash on my site, they decided to remove both the trash and the good information from the search engine?
That doesn't sound logical to me. Why don't they just remove the trash and leave the good content?
Of course, I understand that information is added every day and someone may come up with something better than mine, but dropping 40 or more places in the rankings sounds more like a penalty to me.
Again, I'm not complaining (although it sounds like I am!), I just want to understand the reasons behind this.
Thanks,
Enrique
-
Yes, thanks Anthony. I will post back as soon (soon..?) as I find something.
Enrique
-
Sometimes what you call natural, Google calls spammy or unnatural. Just sayin'. Good luck. Post back with your findings; I'm interested to see how things work out for you. Best regards.
-
Yes, thanks, I will check that. I was planning to add nofollow to the Amazon pages, and I will also check the anchors, but since they are all natural, any change I make will look artificial.
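For the Amazon pages, I mean adding something like this to each affiliate link (the product ID and affiliate tag here are just placeholders, not my real values):

```html
<!-- Affiliate link with rel="nofollow" so it passes no link equity;
     the ASIN and tag below are made-up placeholders -->
<a href="https://www.amazon.com/dp/B00EXAMPLE/?tag=mysite-20" rel="nofollow">
  Example product on Amazon
</a>
```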
Enrique
-
Have you tried removing the Amazon data feed from those pages, just to see if that is in fact what is impacting your rankings? What about the thousands of natural links pointing to your site? Are they all using varied anchor text, or is it just five anchors for your five main pages? If just five, that could also be affecting your rankings.
-
Yes, I know that's the thing to do, but you must agree with me that it's somewhat unnatural.
I have thousands of incoming links, and I exchanged or asked for fewer than 20 of those. The rest are natural. If I spent time engineering those links it would be something absolutely artificial.
The same goes for quality pages. Let's say that I have four or five pages that are the most referenced in my industry (just an example, of course). Visitors who read those pages get really good, top-class information. But I have an Amazon data feed on my site.
Suddenly, the information on those top-quality pages is hidden from Google's search results because my site has an Amazon data feed?
I know it's a simplistic example, but it can be translated as:
"A good article isn't good anymore just because of a site penalty"
It seems that Google is saying something like "Hey, you can't read this amazing article because it is from a site that has lots of junk. So suck it up and read this lesser-quality article from a pristine site!"
It's not about my site anymore, but about trying to understand the concept of it all. And of course it is an extreme example, but I think it is relevant.
-
No, Google does care about good-quality pages. It's just that if you throw in a bunch of bad pages, they dilute the goodness of your good pages. Once you clean up the duplicate content, I would suggest running a report on your inbound links. Check to see if your anchor text is spammy or concentrated on only a few choice keywords. When it comes to link building, you want to spread out the keywords so there aren't just one or two money keywords carrying all the anchor text.
Also, I would remove any inbound links from questionable directories. Once you do that, I would think you should see some significant gains in rankings.
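If it helps, here is a rough sketch of the kind of anchor-text report I mean, assuming you can export your inbound links to a CSV with an `anchor` column (the column name and the 30% threshold are my assumptions, not official Google numbers):

```python
from collections import Counter
import csv

def anchor_distribution(path):
    """Tally inbound-link anchor texts from a CSV export.

    Returns (anchor, count, share, flagged) tuples, most common first.
    An anchor is flagged when it accounts for more than 30% of all
    links -- a rough rule of thumb, not an official threshold.
    """
    counts = Counter()
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            counts[row["anchor"].strip().lower()] += 1
    total = sum(counts.values())
    return [
        (anchor, n, round(n / total, 2), n / total > 0.30)
        for anchor, n in counts.most_common()
    ]
```

Run it over an export from whatever link tool you use; anything flagged is worth a closer look for over-optimized anchors.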
-
Thanks! So it is clear that Google doesn't care about single good-quality pages with good-quality links.
A good quality page needs a quality site to back it up.
Is that the criteria?
It sounds reasonable to me, but very difficult to repair.
Just for the record, my site isn't trash or low quality, but it is an old site and has some quirks from the old days: lots of directory entries with little content, and data feeds that used to work very well some years ago.
-
The trash part of your site affects the site as a whole, not just the trash pages themselves. If Google penalized only those pages, you would still benefit from using trash to promote your good pages.
Now, from what I understand about penalties, there is a manual penalty and an algorithmic (or natural) penalty.
The algorithmic penalty can be fixed fairly easily by addressing the underlying issue, which in your case would be duplicate content. Clean up all the duplicate content and you will be on your way to flying under the penalty radar, so to speak. However, you will still need to add more quality content to make up for the removed or cleaned-up duplicate content.
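For the cleanup itself, on pages that are nothing but an Amazon description you could either keep them out of the index or point a canonical at the preferred version; something along these lines (the URL is a placeholder):

```html
<!-- Option 1: keep the page for visitors but keep it out of Google's index -->
<meta name="robots" content="noindex, follow">

<!-- Option 2: tell Google which URL is the preferred version of this content -->
<link rel="canonical" href="https://www.example.com/preferred-page/">
```

Use one or the other per page, not both pointing different ways.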
Once that takes place you should notice your ranking drop stabilize, and over time begin the crawl back up. This would be a good time to implement other strategies like social media and quality link building.
Now, if it's a manual penalty, then you need to clean up all the duplicate content, ask for a manual review, and pray. Manual penalties are hard to overcome and will require much more work. Sometimes it's best to just start over with a new domain from scratch.
Hope this helps some.