How to remove a Google algorithmic penalty
-
My site has a Google penalty. I seem to be stuck at position 64 for a Google search for my site's name. All the keywords I used to rank well for now sit beyond position 60 in Google.
I have resolved the issue I received the penalty for and I have asked Google for reconsideration. That was about 3 months ago. The penalty is still firmly in place.
I was wondering if anyone else has had a Google algorithmic penalty removed and, if so, how they accomplished it?
-
Hi Todd,
I'm following up on some of the older unanswered questions in Q&A. Is the penalty still in place for you, or did you get it figured out? Are there any lessons learned that you can share with us, or any questions you still have?
Thanks!
-
Hey Todd
I saw a pretty major impact on one site on the 7th of Feb, and it seems a fair few other UK people did as well. I don't think it is a penalty, just an algorithm change of some sort, but I'm still trying to get my head around it.
Can you post the URL and keywords? Happy to take a look.
Cheers
Marcus
-
If you have triggered an algorithmic penalty, there is no way to fix it other than removing the cause of the penalty. Only manually assigned penalties can be lifted by submitting reconsideration requests. Google makes the whole thing that much more difficult by NOT telling you what caused it. One of our clients had a penalty of this type assigned, and we did the following:
1 - Cleaned up backlinks for paid links
2 - Removed variable-driven repetitive text across all pages
It took 4 weeks for re-inclusion. We don't know which one caused it, 1 or 2.
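For point 2, one rough way to spot variable-driven repetitive copy is to look for long text snippets that repeat verbatim on every page. A minimal sketch, purely illustrative and not the method the poster used:

```python
from collections import Counter

def shared_boilerplate(pages, min_len=40):
    """Flag lines of text that repeat verbatim on every page --
    a rough way to spot template-driven repetitive copy.
    `pages` maps URL -> extracted page text. Illustrative only."""
    counts = Counter()
    for text in pages.values():
        # count each sufficiently long line at most once per page
        seen = {line.strip() for line in text.splitlines()
                if len(line.strip()) >= min_len}
        counts.update(seen)
    n = len(pages)
    return sorted(line for line, c in counts.items() if n > 1 and c == n)
```

Anything this flags across a large crawl is a candidate for rewriting or removal; in practice you would also want fuzzy matching, since templated text often varies by a word or two per page.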
-
Hey,
A site I used to manage picked up a penalty, quite innocently as it happens, and we got it removed, and a few weeks later it got it back again. Basically, they had some subdomains for testing client sites; when they removed them in the hosting control panel, the domains ended up pointing at their main site, so they had several URLs all indexed for the same content.
We 301'd all the domains and wrote a reconsideration request, and that did the job. Then, somehow, we found we had missed one and it got penalised again; we repeated the process and it was okay again.
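For reference, the catch-all 301 described above can be sketched as a tiny WSGI app. This is a hedged illustration: `www.example.com` is a placeholder domain, and in practice you would normally do this in the hosting control panel or server config rather than in application code.

```python
def redirect_app(environ, start_response):
    """Minimal WSGI app: 301 every request arriving on a retired
    (sub)domain to the same path on the main site.
    'www.example.com' is a hypothetical domain, not one from the thread."""
    main_host = "www.example.com"
    path = environ.get("PATH_INFO", "/") or "/"
    query = environ.get("QUERY_STRING", "")
    location = f"https://{main_host}{path}" + (f"?{query}" if query else "")
    # 301 (permanent) rather than 302, so search engines consolidate
    # the old URLs onto the main domain
    start_response("301 Moved Permanently", [("Location", location)])
    return [b""]
```

The key detail is using a permanent (301) redirect: a temporary 302 would leave the duplicate URLs in the index.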
As it happens, I have another site that seems to have picked up a penalty on the 7th of Feb, yet there is really nothing wrong with it as far as I can see. To make matters worse, the competition is pretty much breaking Google guidelines left, right, and centre (mostly doorway pages).
I'll show you mine if you show me yours?
Marcus
Related Questions
-
Using Google Reviews for a non-local business
Hi All, We are deciding on what site is best to capture reviews from customers, and I'm just not sure what the ideal option is. We are a SaaS business with multiple offices in different locations, but the specific geographies are not really relevant to our customers.

Is it worth focusing on Google reviews so that when our brand is searched there are plenty of nice shiny stars (plus maybe they can be added into AdWords adverts as well)? Search volume for the keywords we don't yet rank for is not massive, although still important. Alternatively, should we be thinking about something like G2Crowd? None of our competitors are doing anything, so there's no real need to out-muscle them on a review website, and I don't think our end users will visit these sites before buying, but we could point to them and say 'hey, look at all these great reviews'.

Finally, I searched my old company recently and just under the news results were Facebook reviews. Maybe that's another option. All advice welcome.
Industry News | | jafayeh1 -
How do you evaluate algorithm changes?
First, a request: please answer this only if you own/manage more than 5 to 10 sites. Also, please understand I am trying to go beyond speculation.

With the current "Pigeon" update (as coined by SEL), we are seeing a lot of changes, good and not good. An example is within the real estate vertical: there is literally no "pack" whatsoever. Page one for every major real estate search in Houston (houston real estate, homes for sale, townhomes for sale, etc.) is now populated by the major RE vertical players. All of these companies use the real estate rules, hire a broker, then scrape RE data from other MLS brokers. So there is not a single Houston real estate company showing on page one for any of these searches; Zillow, Trulia, HAR, etc. dominate the entire organic page. Within other verticals it is a mixed bag of results: a home service provider shows up with their outlying offices for searches on that service and Houston, and so on.

Just curious how those of you who handle multiple sites analytically approach learning what is going on. I know what we do, but am very interested in others' methodologies.
Industry News | | RobertFisher
Best EDIT for Clarification: I am speaking to how you evaluate the changes (the effects of them) over a fixed period in order to draw a conclusion as to what has changed, how that affects your clients, and what you will do now given the change and the knowledge you have gained.
-
Did Google Search Just Get Crazy Local?
Hey All, I think it's a known fact at this point that when signed into a personal Google account while doing a search, the results are very oriented around keywords and phrases you have already searched for, as well as your account's perceived location; for instance, when I wanted to check one of my own web properties in SE listings I would sign out, or it would likely appear first as a false reading.

Today I noticed something very interesting: even when not signed in, Google's listings were giving precedence to locality, and to a very extreme degree. When searching for "web design," a firm a mile away ranked higher than one 1.5 miles away, and so on. It would seem that the algos having this high a level of location sensitivity and preference would actually be a boon for the little guys, which is, I assume, why it was implemented. However, it brings up a couple of interesting questions for me.

1. How is this going to affect Moz (or any SE ranking platform, for that matter) reports? I assume that Google pulls locations from IP addresses, so would it not simply pull the local results most relevant to the Moz server(s) IP?

2. What can one do to rise above this aggressive level of location-based search? I mean, my site (which has a DA of 37 and a PA of 48) appears above sites like webdesign.org (DA of 82, PA of 85). Not that I'm complaining at the moment, but I could see this being a fairly big deal for larger firms looking to rank on a national level. What gives?

I'd love to get some opinions from the community here if anyone else has noticed this.
Industry News | | G2W1 -
If I have a Google+ Business page, do I need a Google Places page as well?
It seems like the two are redundant? Any official word on this? I'm fairly OCD about things being tidy, and I don't want to split my reviews / shares / etc. between two profiles. Are they not the same thing? I searched for my company, and both my Plus business page and my Places page came up. I attached a screenshot of the situation. placesvplus.png
Industry News | | jonnyholt1 -
Is this still Google?
My niche, my concern.
Industry News | | webfeatus
http://www.google.com/search?q=jimbaran+villa
My site just dropped out of the rankings completely. But if you look at the Google search above you will notice 2 things:
1. First page: 75% of the space above the fold is dedicated to Google making money.
2. Subsequent pages: it is like you don't actually search "Google". If you flip through a few pages, what you actually search is:
agoda.com
flipkey.com
tripadvisor.com
homeaway.com

Do I have a point or am I simply having a cynical day?
-
Google Releases Penguin Update 1.1
So what effect is this having on you guys? http://searchengineland.com/google-pushes-first-penguin-algorithm-update-122518 It was not too nice to me... Why over the holiday, Google, why!? -Chenzo
Industry News | | Chenzo0 -
Google+ profiles and Rel Author. Extensive question
A bit of a mammoth question for discussion here: with the launch of Google+ and profiles, coupled with the ability to link/verify authorship using rel=me to a Google+ profile, a few questions arise with respect to long-term use and impact.

As an individual, I can have a Google+ profile and add links to author pages where I am featured. If rel=me is used back to my G+ profile, Google can recognise me as the writer; no problem with that. However, if I write for a variety of different sites and produce a variety of different content, site owners could arguably become reluctant to link back or accredit me with the rel=me tag. I might be writing for a competitor, for example, or other content in a totally different vertical that is irrelevant.

Additionally, if I write for a company as an employee and the rel=me tag is linked to my G+ profile, my profile (I would assume) is gaining strength from the fact that my work is cited through the link (even if no link juice is passed, my profile link is going to appear in the search results on a query that matches something I have written, and hence possibly drain some "company traffic" to my profile). If I were to then leave the employment of that company and begin writing for a direct competitor, is my profile still benefiting from the old company content I have written?

Given that Google is not allowing pseudonyms or ghost-writer profiles, where do we stand with respect to outsourced content? For example: the company has news written for them by a news supplier (each writer has a name, obviously), but they don't have, or don't want to create, a G+ profile to link to. Is it a case of waiting for Google to come up with company profiles, or using a ghost name and running the gauntlet on G+?

Lastly, and I suppose the bottom line, as a website owner/company director/SEO: is adding rel=me links to all your writers' profiles (given that some might only write 1 or 2 articles, and staff will inevitably come and go) an overall positive for SEO? Or a SERP nightmare if a writer moves on to another company? In essence, are site owners just improving the writers' profiles rather than gaining very much?
Industry News | | IPINGlobal541 -
What is the best method for getting pure JavaScript/AJAX pages indexed by Google for SEO?
I am in the process of researching this further, and wanted to share some of what I have found below. Anyone who can confirm or deny these assumptions or add some insight would be appreciated.

Option 1:
If you're starting from scratch, a good approach is to build your site's structure and navigation using only HTML. Then, once you have the site's pages, links, and content in place, you can spice up the appearance and interface with AJAX. Googlebot will be happy looking at the HTML, while users with modern browsers can enjoy your AJAX bonuses. You can use Hijax to help AJAX and HTML links coexist, and meta nofollow tags etc. to prevent the crawlers from accessing the JavaScript versions of a page.

Currently, webmasters create a "parallel universe" of content. Users of JavaScript-enabled browsers will see content that is created dynamically, whereas users of non-JavaScript-enabled browsers, as well as crawlers, will see content that is static and created offline. In current practice, "progressive enhancement" in the form of Hijax links is often used.

Option 2:
In order to make your AJAX application crawlable, your site needs to abide by a new agreement. This agreement rests on the following:

- The site adopts the AJAX crawling scheme.
- For each URL that has dynamically produced content, your server provides an HTML snapshot, which is the content a user (with a browser) sees. Often, such URLs will be AJAX URLs, that is, URLs containing a hash fragment, for example www.example.com/index.html#key=value, where #key=value is the hash fragment. An HTML snapshot is all the content that appears on the page after the JavaScript has been executed.
- The search engine indexes the HTML snapshot and serves your original AJAX URLs in search results.

In order to make this work, the application must use a specific syntax in the AJAX URLs (let's call them "pretty URLs"; you'll see why below). The search engine crawler will temporarily modify these "pretty URLs" into "ugly URLs" and request those from your server. This request of an "ugly URL" indicates to the server that it should not return the regular web page it would give to a browser, but instead an HTML snapshot. When the crawler has obtained the content for the modified ugly URL, it indexes that content, then displays the original pretty URL in the search results. In other words, end users will always see the pretty URL containing a hash fragment.

Industry News | | webbroi
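The pretty-to-ugly URL translation the agreement describes can be sketched in a few lines. A hedged sketch: it assumes the published scheme's "#!" token (the example above shows a plain "#" fragment), and the URL is a placeholder.

```python
from urllib.parse import quote

def pretty_to_ugly(url):
    """Translate a '#!' pretty AJAX URL into the '_escaped_fragment_'
    ugly URL the crawler requests under the AJAX crawling scheme.
    Sketch only; assumes the '#!' convention from the scheme docs."""
    if "#!" not in url:
        return url  # not an AJAX-crawlable URL; left unchanged
    base, fragment = url.split("#!", 1)
    # append to an existing query string if the base URL already has one
    sep = "&" if "?" in base else "?"
    # the hash fragment is percent-encoded before being sent to the server
    return f"{base}{sep}_escaped_fragment_={quote(fragment, safe='')}"
```

Your server would then detect the `_escaped_fragment_` parameter in incoming requests and return the pre-rendered HTML snapshot instead of the normal page.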
See more in the....... Getting Started Guide. Make sure you avoid this:
http://www.google.com/support/webmasters/bin/answer.py?answer=66355
Here are a few example pages that are mostly JavaScript/AJAX:
http://catchfree.com/listen-to-music#&tab=top-free-apps-tab
https://www.pivotaltracker.com/public_projects
This is what the spiders see: view-source:http://catchfree.com/listen-to-music#&tab=top-free-apps-tab
These are the best resources I have found regarding Google and JavaScript:
http://code.google.com/web/ajaxcrawling/ - step-by-step instructions.
http://www.google.com/support/webmasters/bin/answer.py?answer=81766
http://www.seomoz.org/blog/how-to-allow-google-to-crawl-ajax-content
Some additional Resources: http://googlewebmastercentral.blogspot.com/2009/10/proposal-for-making-ajax-crawlable.html
http://www.google.com/support/webmasters/bin/answer.py?answer=357690