Is Google's reinclusion request process flawed?
-
We have been having a bit of a nightmare with a Google penalty, which has highlighted a slightly alarming aspect of Google's reinclusion process (for background, please see http://www.browsermedia.co.uk/2012/04/25/negative-seo-or-google-just-getting-it-painfully-wrong/ or http://econsultancy.com/uk/blog/10093-why-google-needs-to-be-less-kafkaesque - any thoughts on why we have been penalised would be very, very welcome!).
As far as I can see (using Google Analytics), supporting material prepared as part of a reinclusion request is basically ignored. I have just written an open letter to the search quality team at http://www.browsermedia.co.uk/2012/06/19/dear-matt-cutts/ which gives more detail, but the short story is that the supporting evidence we prepared as part of our request was NOT viewed by anyone at Google.
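For anyone who wants to run the same check, we used Google Analytics on the supporting pages, but the equivalent can be done straight from a server access log. The sketch below is only illustrative - the log path and evidence URLs are placeholders, not our actual files:

```python
# Rough sketch: check whether specific evidence files were ever requested,
# by scanning a standard combined-format access log (Apache/Nginx).
# The log path and evidence URLs are placeholders for illustration only.

EVIDENCE_PATHS = [
    "/reinclusion/link-audit.pdf",
    "/reinclusion/removed-links.csv",
]

LOG_FILE = "/var/log/apache2/access.log"

def find_evidence_hits(log_file, paths):
    """Return every log line that requests one of the evidence paths."""
    hits = []
    with open(log_file, encoding="utf-8", errors="replace") as f:
        for line in f:
            if any(p in line for p in paths):
                hits.append(line.rstrip())
    return hits

if __name__ == "__main__":
    matches = find_evidence_hits(LOG_FILE, EVIDENCE_PATHS)
    if matches:
        print(f"{len(matches)} request(s) for the evidence files:")
        for line in matches:
            print(line)
    else:
        print("No requests for the evidence files were logged.")
```

Filtering the matches by IP range or user agent would narrow things down to views from Google specifically, but even a raw count of zero answers the question.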
Has anyone monitored this before and experienced the same thing? Does anyone have any suggestions regarding how to navigate the treacherous waters of resolving a penalty?
This no doubt sounds like a sob story for us, but I do think that this is a potentially big issue and one that I would love to explore more.
If anyone from the search quality team could contribute, we would love to hear your thoughts!
Cheers,
Joe
-
Thank you for your thoughts.
I agree that they must be swamped, and in my humble opinion most of the sites behind the 'complaints' you see on the Google forums fully deserve to be penalised, but I think that the total lack of communication does more harm than good.
If they want to improve the web, why not give more detail about what is causing the problem? If Google were more transparent and helped webmasters eradicate spammy techniques, everyone would be pushed into improving their sites for all the right reasons.
If they don't have the resources to handle reinclusion requests, then they shouldn't offer them as an option.
I still feel that it is very poor not to even look at the files that were prepared - that shows a lack of respect.
I agree that it is likely to be something simple. The 'spike' theory is still the strongest contender for me, given the timing of events, but if it proves to be true that is alarming, as we were effectively penalised for doing exactly what Google encourages (creating good content that naturally attracts links).
Another possible cause is the number of directory links we have picked up over the years. While I have never considered these to be high-quality links, I have never seen Google say that you shouldn't submit your site to directories (indeed, they used to actively suggest that submitting to Yahoo! was a good idea), and directories are a way for Google to outsource some human assessment of sites (assuming the directories actually vet the sites they list).
If it is the directories, then the door for negative SEO is so wide open that it is alarming. As many have said, completely ignoring such links would be better than penalising you.
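For what it's worth, this is roughly how we have been flagging directory-style domains in our backlink export. It is only a sketch - the CSV column name and the keyword list are my own assumptions, not anything official:

```python
# Rough sketch: flag likely directory links in a backlink export (a CSV with a
# "source_url" column). The column name and the keyword list are assumptions -
# adjust both to whatever your export actually contains.
import csv
from urllib.parse import urlparse

DIRECTORY_HINTS = ("directory", "dir.", "listings", "links.", "webindex")

def flag_directory_links(csv_path):
    """Return a sorted, de-duplicated list of linking domains that look like directories."""
    flagged = []
    with open(csv_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            domain = urlparse(row["source_url"]).netloc.lower()
            if any(hint in domain for hint in DIRECTORY_HINTS):
                flagged.append(domain)
    return sorted(set(flagged))

if __name__ == "__main__":
    for domain in flag_directory_links("backlinks.csv"):
        print(domain)
```

It won't catch every directory, but it produces a short list to review by hand.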
We are still no closer to understanding what we have done wrong, despite every effort to adhere to the guidelines and a lot of work auditing and documenting our link profile. With very little faith in the reinclusion process, where can we possibly turn now?
We will see. There were multiple views of the open letter from Google, so somebody somewhere has seen it, and I just hope there is some form of response.
The irony is that we spend most of our lives defending Google and encouraging clients to improve what they are doing online. On this occasion, I really find it hard to defend them. I appreciate that we are a drop in a mighty ocean, but the principle at stake is an important one, and it is one I will pursue.
Thanks again for your contribution,
Joe
-
I'm not sticking up for them, but you have to appreciate the number of people who would probably try to send them all sorts of viruses any way they can. They also probably don't have - or don't want to take - the time to look at everyone so closely; basically, they will just check whether the offending stuff is still there, and if it is, they won't give you any love.
Honestly, it's probably something so simple that you have overlooked it. A keyword ratio above 6% can be read as spam depending on how much content there is, as can non-relevant links coming in or going out and links from already-penalised sites. Anything unnatural is now what Google is focusing on, not just backlinks. So go through your site - the homepage is obviously the first place to start. If you are really sure there is nothing spammy or over-optimised there, comb through your other pages. The problem will usually be on one of your best-ranked pages - or ex-best-ranked pages, if you have been hit with a Penguin slap.
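If you want a quick sanity check on that keyword ratio, something along these lines will do it. It's only a rough sketch - the URL and keyword are placeholders, and the 6% figure is a rule of thumb rather than anything Google has confirmed:

```python
# Rough sketch: estimate single-word keyword density for a page.
# The URL and keyword are placeholders; 6% is a rule of thumb, not an
# official threshold.
import re
import urllib.request
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collect visible text, skipping <script> and <style> blocks."""
    def __init__(self):
        super().__init__()
        self.chunks = []
        self._skip = False

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip = True

    def handle_endtag(self, tag):
        if tag in ("script", "style"):
            self._skip = False

    def handle_data(self, data):
        if not self._skip:
            self.chunks.append(data)

def keyword_density(url, keyword):
    """Return the keyword's share of all words on the page, as a percentage."""
    html = urllib.request.urlopen(url).read().decode("utf-8", errors="replace")
    parser = TextExtractor()
    parser.feed(html)
    words = re.findall(r"[a-z0-9']+", " ".join(parser.chunks).lower())
    if not words:
        return 0.0
    hits = sum(1 for w in words if w == keyword.lower())
    return 100.0 * hits / len(words)

if __name__ == "__main__":
    print(f"Keyword density: {keyword_density('http://www.example.com/', 'widgets'):.1f}%")
```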