What's the best way for an SEO newbie to analyze and fix a site after being hit by Panda?
-
Hi,
I have a prospective client who was in the top 3 on Google for two of their primary keywords. They fell way back in the rankings immediately after Panda was rolled out on September 27, 2012. Two weeks ago, they were at position #118 for one keyword. When I looked for them in Rank Checker today, they couldn't be found in Google at all.
Here's my question: because of the "bad links" (some pointing to porn sites)... what's the possibility that this situation cannot be fixed?
I don't know... maybe I'm asking an irrelevant question. I'm trying to assess the situation so I can go back and present my findings to the prospective client, and I'm committed to understanding what's going on with their website before I do.
Fixing their problem starts with a correct assessment.
They have a ranking problem, and I know I can fix that... IF all their site needs is white hat SEO.
What I DON'T want is to sell them SEO services, only to find out in 3-6 months that I made an incorrect diagnosis of the problem and therefore sold them the wrong solution. I know I can close the sale if I can show them, with reasonable substantiation, that the damage is not beyond repair.
I'm familiar with the basics of SEO, but I'm unfamiliar with how "bad linking" might affect a long-term commitment to optimization. They're wondering if they should start over on another website. I was attempting to do an assessment to better understand whether my typical approach would be sufficient for this site. Also, I wanted an assessment/report to show them, to substantiate my conclusion(s) about their website.
If Open Site Explorer is sufficient to do the link analysis... great. At least I know I'm working with the right tool. All I have to do is learn how to use the tool quickly. At this point... I'm not sure which tool would be helpful.
So... can you speak to the following 2 questions:
1.) How do you know when a ranking problem is beyond fixing?
2.) What software/tool is ideal for doing link analysis in order to assess the problem and prescribe a solution?
Thanks so much!
Ramon
-
Thank you for your reply.
Ramon
-
Thanks Dave... your post is very helpful.
Ramon
-
Great point/question, Chris. I was referred to them by a CPA whom I helped with his SEO. He had a very basic issue and began to see immediate improvements, which made the introduction a no-brainer for him.
Chris... I appreciate your candor.
Ramon
-
Your first step is to understand: "What is Panda?" and "What is Penguin?"
Go learn that, be able to explain it exceptionally well, then come back here and ask questions.
You need to understand the basics before you can fix another person's website.
-
I'll tell you, I have to wonder why a company with such a major problem is requesting that a "newbie" fix it for them. Nothing against you, Ramon, but my first thought is that they're hoping to get someone to spend a bunch of time on it for cheap, which could have been what got them in trouble in the first place.
If they're coming to you for advice and they know you're new to SEO, be up front with them and say your best advice is that they move all the content (minus any low-quality outbound links) to a new domain and start over. I wouldn't even redirect the old site to the new one. Unless they're going to pay you well for all the time you're going to take learning everything you need to learn to make a serious effort at solving a difficult problem like this, cut to the chase and see what they say.
If they go for that, then you can spend your time fixing the problems you know you can fix and help them build a better link profile. That's my two cents.
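If they do go that route, a quick way to inventory the outbound links you'd want to leave behind is to crawl the existing pages and flag anything external that looks off. Here's a rough sketch in Python (assuming requests and beautifulsoup4 are installed; the page URLs and the suspect-term list below are placeholders, not real data):

```python
# Rough sketch: list external links on a set of pages so low-quality outbound
# links can be reviewed before migrating content to a new domain.
# The PAGES list and SUSPECT_TERMS are placeholder examples.
from urllib.parse import urlparse

import requests
from bs4 import BeautifulSoup

PAGES = [
    "http://www.example-client-site.com/",
    "http://www.example-client-site.com/services/",
]
SUSPECT_TERMS = ("casino", "porn", "pills")  # adjust to whatever looks spammy

def external_links(page_url):
    """Yield (href, anchor text) for every link pointing off the page's own domain."""
    html = requests.get(page_url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    page_domain = urlparse(page_url).netloc
    for a in soup.find_all("a", href=True):
        link_domain = urlparse(a["href"]).netloc
        if link_domain and link_domain != page_domain:
            yield a["href"], a.get_text(strip=True)

for page in PAGES:
    for href, text in external_links(page):
        flag = "REVIEW" if any(t in href.lower() for t in SUSPECT_TERMS) else ""
        print(f"{page}\t{href}\t{text}\t{flag}")
```

That at least gives you a concrete list to go through with the client instead of eyeballing every page by hand.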
-
Hi Ramon,
This is a very difficult question to answer. Before I even try to answer it, I'd like to point out the following:
The first Panda filter came out on Feb 25, 2011. On Feb 24, 2012, Search Engine Roundtable published a survey asking how many webmasters had recovered from the filter one year later. The survey can be found here. As you can see, not many were able to recover from the filter even one year later. This is by no means a conclusive study, but it's really the best we have.
To answer your questions:
1. In my opinion, it's beyond fixing if you lose rankings and you did NOT receive a warning from Google. If you received a warning, consider yourself lucky. Follow Google's instructions and use a service like linkdelete.com to get the bad backlinks removed (or just do it yourself).
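If you handle the cleanup yourself, it helps to keep the reviewed list of bad linking domains in a machine-readable form. As a rough sketch (the file names and the review process are placeholders; the list of domains still has to come from your own analysis), you could turn that list into a disavow-style text file — the plain-text format Google's disavow tool accepts uses one full URL or `domain:` entry per line, with `#` for comments:

```python
# Rough sketch: turn a hand-reviewed list of bad linking domains into a
# disavow-style text file (one "domain:" entry per line, "#" for comments).
# Input/output file names are placeholders.
from datetime import date

BAD_DOMAINS_FILE = "bad_domains_reviewed.txt"  # one domain per line, assumed
OUTPUT_FILE = "disavow.txt"

with open(BAD_DOMAINS_FILE) as f:
    domains = sorted({line.strip().lower() for line in f if line.strip()})

with open(OUTPUT_FILE, "w") as out:
    out.write(f"# Generated {date.today()} after link-removal outreach\n")
    for domain in domains:
        out.write(f"domain:{domain}\n")

print(f"Wrote {len(domains)} domains to {OUTPUT_FILE}")
```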
2. Open Site Explorer works great for identifying bad backlinks. Go through the list of exact match keyword anchor texts and you'll be off to a good start.
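To speed up that review, you can export the inbound links from Open Site Explorer to CSV and tally the anchor texts so the exact-match keyword anchors stand out. A minimal sketch, assuming the export has an "Anchor Text" column (column names vary between exports, so check your actual file):

```python
# Minimal sketch: tally anchor texts from a backlink CSV export so exact-match
# keyword anchors stand out. The file name and the "Anchor Text" column name
# are assumptions -- adjust them to match your actual export.
import csv
from collections import Counter

EXPORT_FILE = "inbound_links_export.csv"  # placeholder file name

anchors = Counter()
with open(EXPORT_FILE, newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        anchors[row.get("Anchor Text", "").strip().lower()] += 1

total = sum(anchors.values())
print(f"{total} links, {len(anchors)} unique anchor texts\n")
for anchor, count in anchors.most_common(20):
    share = 100 * count / total if total else 0.0
    print(f"{count:>5}  {share:5.1f}%  {anchor or '(empty / image link)'}")
```

A profile where one or two exact-match phrases make up a large share of the anchors is usually the first thing to dig into.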
Finally, I'd like to plug my post on this subject:
How-To Recover From Google Penalties & Filters
It's a long one, but I include several pictures and examples to make this process easier for folks in your situation.
Good luck!