Ads above the fold penalty. Should I request reinclusion?
-
Hi!
My site has been losing traffic slowly for about 18 months. But it was on January 19 that it was hit big time.
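(A quick aside on pinpointing a hit like that: if you export daily sessions from your analytics, a tiny script can flag the day a sharp drop begins. This is only an illustrative sketch with made-up numbers, not a real analytics integration.)

```python
# Hypothetical sketch: find the first day a traffic series drops sharply,
# e.g. to pinpoint an algorithm-update hit like the January 19 one.
# The visit counts below are invented for illustration.

def find_sharp_drop(daily_visits, threshold=0.4):
    """Return the index of the first day whose visits fall more than
    `threshold` (as a fraction) below the previous day's, or None."""
    for i in range(1, len(daily_visits)):
        prev, cur = daily_visits[i - 1], daily_visits[i]
        if prev > 0 and (prev - cur) / prev > threshold:
            return i
    return None

visits = [1000] * 14 + [400] * 14  # steady traffic, then a 60% drop
print(find_sharp_drop(visits))  # → 14 (the first day of the drop)
```

Lining that index up against known update dates is one rough way to tell an algorithmic hit from a slow decline.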
My site has a lot of ads, including two 300x250 above the fold ads that were very lucrative for me.
After January 19, I removed one of those two ads, but no change was reflected in the traffic.
It is obvious that I needed to remove the other ad, but I didn't do it for two reasons.
-
I still earn money from that ad and removing it would result in serious problems.
-
A webmaster friend of mine who was also hit by this penalty removed the ads and tried all sorts of things to regain the lost traffic, with NO LUCK over several months. He has unique, excellent content. So, after seeing his experience, I didn't want to touch my biggest source of income and left it as it is.
My site has other problems that concern Panda and maybe Penguin, and since yesterday I've been fixing them.
Is it a good idea to file a reinclusion request to check whether I was manually penalized, even though GWMT never notified me of any problem with my site?
Thanks in advance,
Enrique
-
-
Yes, I have it all. Not sure about incoming spammy links. I did almost everything to my site (datafeeds, lots of ads, duplicate content, etc.) but never engaged in spammy links.
I will try to find some other way to show ads and see what happens.
Thanks!
-
Google DOES allow ads above the fold. As long as you are not slapping visitors in the face with your ads, and they have zero problems finding your page content without scrolling, ads are allowed.
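To make that "content findable without scrolling" idea concrete, one rough self-check is to estimate how much of the first screenful your ad units cover. This is a hypothetical sketch: the 600x1000px fold and the element rectangles are assumptions for illustration, not anything Google publishes.

```python
# Assumed viewport dimensions (illustrative, not an official standard):
FOLD_HEIGHT = 600      # px
VIEWPORT_WIDTH = 1000  # px

def ad_fraction_above_fold(ad_rects):
    """ad_rects: list of (x, y, width, height) tuples for ad units.
    Returns the fraction of the above-the-fold area they cover,
    ignoring overlaps between ads for simplicity."""
    fold_area = FOLD_HEIGHT * VIEWPORT_WIDTH
    covered = 0
    for x, y, w, h in ad_rects:
        visible_h = max(0, min(y + h, FOLD_HEIGHT) - max(y, 0))
        visible_w = max(0, min(x + w, VIEWPORT_WIDTH) - max(x, 0))
        covered += visible_h * visible_w
    return covered / fold_area

# Two 300x250 ads side by side at the top of the page:
two_ads = [(0, 0, 300, 250), (320, 0, 300, 250)]
print(round(ad_fraction_above_fold(two_ads), 3))  # → 0.25
```

If a quarter of the first screen is ads and the content still starts above the fold, that is a very different page from one where ads push all content below it.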
If Google did not allow ads above the fold then most of the content providers on the web would go bankrupt.
My best guess is that you have a duplicate content, skimpy content, thin affiliate, or links problem.
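For the duplicate-content part of that diagnosis, a quick way to compare two pages is word-shingle (Jaccard) similarity. A minimal sketch, with made-up page texts:

```python
# Compare pages by overlapping k-word shingles (Jaccard similarity).
# The sample texts are invented for illustration.

def shingles(text, k=3):
    words = text.lower().split()
    return {tuple(words[i:i + k]) for i in range(len(words) - k + 1)}

def jaccard(a, b):
    sa, sb = shingles(a), shingles(b)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)

page_a = "our widget is the best widget for cleaning windows fast"
page_b = "our widget is the best widget for cleaning windows quickly"
page_c = "a completely different article about garden furniture care"

print(jaccard(page_a, page_b) > 0.5)  # True: near-duplicates
print(jaccard(page_a, page_c) < 0.1)  # True: unrelated pages
```

Running a check like this across your own URLs (or against scraped copies elsewhere) is one way to quantify how much of the site is effectively duplicated.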
-
Hi Enrique,
Google doesn't give manual penalties for too many ads above the fold. Their manual penalties are for blatant violations of their webmaster guidelines, such as buying links, cloaking, or hidden text.
Although they recommend not putting too many ads above the fold from a user experience perspective, it's certainly not one of their terms and conditions and wouldn't be the cause of a penalty.
If your friend tried removing his ads and saw no recovery, it could be one of several issues:
-
It might not have been the excessive ads that were causing his problem
-
If it was the ads, he may not have removed them for long enough for the Panda update to be refreshed
One very important thing, though: the ads-above-the-fold issue and the Panda issue are the same thing.
It's the same algorithm update that is focused on user experience. It is nothing manual, and the only way to recover is to fix all the issues and wait for the refresh.
If you're fixing the site then that's a great start. With a bit of luck, your new site will regain its rankings with your ads still in place, and then everyone is happy.
Thanks,
David
-
-
Thanks David, and yes, I've been hit by Panda and I know my site's weaknesses (most of them!). But it's difficult to make changes when your site was built 10 years ago with a different web in mind. I'm rebuilding it (a whole new site with THE CURRENT web in mind).
But I was not speaking about Panda specifically, I meant the "Ads above the fold" issue.
On January 19 my site was hit very hard by that update. Very hard. That's why I thought it was a penalty.
You may think I'm dumb or something. I could fix it by just removing the ads, and that's it.
But as I mentioned in my first post, a webmaster I work closely with had the same problem and removing the ads didn't help him a bit.
So that's why I was wondering whether it was a manual penalty, and whether requesting reinclusion would be a way to confirm it.
Thanks!
Enrique
-
Hi Enrique,
Firstly, the Panda update you mention isn't a penalty; it's an algorithmic update that Google implemented with the general aim of reducing the number of poor-quality sites in the search results.
By poor quality, they mean sites that have thin or duplicate content, sites that contain excessive advertising, and sites that are poorly designed or constructed.
If it is Panda (which it sounds like it could be considering the 18 month decline), a re-inclusion request won't help. This update is refreshed periodically (roughly every 4-6 weeks), so if you had fixed the issues on your site you would see your rankings return when it was refreshed.
The main issues to fix are generally around the quality and originality of the content; however, depending on how excessive your advertising is, that might need to be addressed too.
There are some great resources out there for finding out whether it is Panda that's affecting your site, and if so, how to recover. My personal favourite is here on SEOmoz by Cyrus Shepard:
http://www.seomoz.org/blog/beat-google-panda
From speaking to many webmasters, the one thing I have found is that the people who recover are the ones willing to take a critical look at their own website and really own up to its weaknesses. For example, you say your friend has excellent content, but by what standard is it excellent? His own standards might be very different from Google's.
The best way to stay ahead of updates like Panda is to be your own worst critic, constantly challenging yourself to make your website the best it can possibly be.
David