Questions created by wiredseo
How Long for Penguin Recovery if All Links Were Removed at Once?
Hi, I was wondering if anyone had thoughts on how long a Penguin recovery takes, assuming you removed all the bad links at once. I have a client with a Penguin penalty: all of the bad links pointed to a separate domain, which then redirected to theirs. So I simply had that domain removed, and it now returns a 404 instead of redirecting to the main site, meaning none of the bad backlinks pass through any more. All of this happened at once about 5 weeks ago. I was glad to discover such an easy solution, but I'm wondering how long something like this should take. My initial thought was 2 or 3 months to see recovery, but I was hoping that since the issue was fixed so quickly, recovery would be faster. Any experience with Penguin recoveries where all the spam links (fewer than 100) were removed at once?
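One thing worth double-checking in a setup like this is that the old domain really returns a 404 and not a lingering 301/302, since a redirect would still forward the spam links. A minimal stdlib sketch (the URL is a placeholder, not the client's actual domain):

```python
import urllib.request
import urllib.error

def fetch_status(url: str, timeout: int = 10) -> int:
    """Return the raw HTTP status code without following redirects."""
    class NoRedirect(urllib.request.HTTPRedirectHandler):
        def redirect_request(self, req, fp, code, msg, headers, newurl):
            return None  # stop at the first response; don't chase redirects
    opener = urllib.request.build_opener(NoRedirect)
    try:
        return opener.open(url, timeout=timeout).status
    except urllib.error.HTTPError as err:
        return err.code  # 3xx/4xx/5xx surface as HTTPError here

def still_forwards_links(status: int) -> bool:
    """A redirect status still passes the bad links on; a 404/410 drops them."""
    return status in (301, 302, 307, 308)

# e.g. still_forwards_links(fetch_status("http://old-domain.example/"))
```

If `still_forwards_links(...)` comes back False with a 404 or 410, the bad links are genuinely cut off and it's just a matter of waiting for a recrawl.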
Link Building | wiredseo
How to rank a Product page over its Resource counterpart?
So, I have a resource page showing up in the SERPs above the product page. Obviously both pages target a lot of the same terms; one explains how to use the product and the other IS the product. What's your take on getting the money page to rank instead of the resource page? The only things I can think of are making sure that all internal anchor-text links power up the product page, adding more content to the product page and its sub-pages, and possibly including the how-to-use information on the product page itself. Any other ideas?
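For the internal-anchor-text angle, a quick way to audit which page is actually getting the internal link weight is to count links and their anchors across the site's HTML. A hypothetical stdlib sketch (the URLs and anchors are made-up examples):

```python
from collections import Counter
from html.parser import HTMLParser

class LinkCounter(HTMLParser):
    """Tally (href, anchor text) pairs found in a page's HTML."""
    def __init__(self):
        super().__init__()
        self.links = Counter()
        self._href = None
        self._text = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self._href = dict(attrs).get("href")
            self._text = []

    def handle_data(self, data):
        if self._href is not None:
            self._text.append(data)

    def handle_endtag(self, tag):
        if tag == "a" and self._href is not None:
            anchor = "".join(self._text).strip()
            self.links[(self._href, anchor)] += 1
            self._href = None

parser = LinkCounter()
parser.feed('<a href="/product">Widget 3000</a> <a href="/how-to">how to use it</a>')
# parser.links now maps (href, anchor) pairs to counts
```

Feeding every page of the site through this and comparing the totals for the product URL vs. the resource URL would show which one the internal linking is really "powering up."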
On-Page Optimization | wiredseo
503 Error or 200 OK??
So, in a Moz crawl and a Screaming Frog crawl, I'm getting 503 Service Unavailable responses on some pages. But when I visit the pages in question, the MozBar shows a 200 OK, and the SEOBook HTTP status checker (http://tools.seobook.com/server-header-checker/) also shows a 200 OK. What gives? The only reason I'm looking at this is because rankings plummeted a couple of weeks ago. Thanks!

UPDATE: I used the MozBar to set my user agent to Googlebot, and when I tried to access the pages in question I received the message below. I don't think this is an issue... anyone else have experience here?

"Your access to this site has been limited. Your access to this service has been temporarily limited. Please try again in a few minutes. (HTTP response code 503) Reason: Fake Google crawler automatically blocked.

Important note for site admins: If you are the administrator of this website note that your access has been limited because you broke one of the Wordfence firewall rules. The reason you access was limited is: "Fake Google crawler automatically blocked". If this is a false positive, meaning that your access to your own site has been limited incorrectly, then you will need to regain access to your site, go to the Wordfence "options" page, go to the section for Firewall Rules and disable the rule that caused you to be blocked. For example, if you were blocked because it was detected that you are a fake Google crawler, then disable the rule that blocks fake google crawlers. Or if you were blocked because you were accessing your site too quickly, then increase the number of accesses allowed per minute. If you're still having trouble, then simply disable the Wordfence firewall and you will still benefit from the other security features that Wordfence provides. If you are a site administrator and have been accidentally locked out, please enter your email in the box below and click "Send". If the email address you enter belongs to a known site administrator or someone set to receive Wordfence alerts, we will send you an email to help you regain access. Please read our FAQ if this does not work."
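The discrepancy can be reproduced directly: fetch the same URL once with a browser User-Agent and once with a Googlebot User-Agent. A firewall like Wordfence can serve 200 to browsers but 503 to anything that *claims* to be Googlebot without coming from Google's own servers, which is exactly what a UA-spoofing crawler or the MozBar does. A sketch, assuming a placeholder URL:

```python
import urllib.request
import urllib.error

GOOGLEBOT_UA = ("Mozilla/5.0 (compatible; Googlebot/2.1; "
                "+http://www.google.com/bot.html)")

def status_for_agent(url: str, user_agent: str) -> int:
    """Fetch url with the given User-Agent and return the HTTP status."""
    req = urllib.request.Request(url, headers={"User-Agent": user_agent})
    try:
        return urllib.request.urlopen(req, timeout=10).status
    except urllib.error.HTTPError as err:
        return err.code

def claims_googlebot(user_agent: str) -> bool:
    """The trait a Wordfence-style rule keys on: the UA string says Googlebot."""
    return "googlebot" in user_agent.lower()

# status_for_agent("https://example.com/", "Mozilla/5.0")   # browser UA
# status_for_agent("https://example.com/", GOOGLEBOT_UA)    # spoofed-Googlebot UA
```

If the first call returns 200 and the second returns 503, the block only affects fake Googlebots; the real Googlebot (verified by reverse DNS) would not be blocked, so Moz/Screaming Frog crawls showing 503 wouldn't by themselves explain a ranking drop.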
Moz Pro | wiredseo
Why are URLs like www.site.com/#something being indexed?
So, everything after a hash (#) is supposed to be ignored by crawlers and not indexed. Has that changed? I see a client's site with all sorts of URLs indexed, like http://www.website.com/#!category/c11f. I thought the above URL was the same as simply http://www.website.com/, but it isn't: these hashbang URLs are getting indexed, and all the content on those pages is getting crawled as well. Thanks!
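The `#!` (hashbang) is the key difference: under Google's AJAX crawling scheme (since deprecated), a `#!` URL signaled that the page had a crawlable equivalent, which the crawler fetched by rewriting the fragment into an `_escaped_fragment_` query parameter. A small sketch of that mapping:

```python
from urllib.parse import quote

def escaped_fragment_url(url: str) -> str:
    """Rewrite a hashbang (#!) URL into the ?_escaped_fragment_= form that
    crawlers using Google's AJAX crawling scheme would actually request.
    Plain # fragments (no !) are simply dropped by crawlers, so the URL
    is returned unchanged."""
    if "#!" not in url:
        return url
    base, fragment = url.split("#!", 1)
    sep = "&" if "?" in base else "?"
    return f"{base}{sep}_escaped_fragment_={quote(fragment)}"

print(escaped_fragment_url("http://www.website.com/#!category/c11f"))
# http://www.website.com/?_escaped_fragment_=category/c11f
```

So the `#!` pages aren't being treated as duplicates of the homepage; each one maps to its own crawlable URL, which is why their content is indexed separately.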
Technical SEO | wiredseo
Will this affect the local SEO listings in Google?
So, I'm trying to list the same exact address across all of a client's local listings. How exactly do the addresses need to match for the listings to be optimized for the 7-pack? Here is an example of what I'm dealing with... 999 Cherry Ln #1, Dallas, TX 75238 is the address on the Google Plus page; 999 Cherry Ln Ste 1, Dallas, TX 75238 is the address on every other business listing. Is this close enough, or is the inconsistency really hurting this client?
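A hypothetical way to frame the question: these two strings differ only in suite notation, so a normalizer that collapses "#" / "Suite" / "Ste." to one form, plus punctuation and whitespace, treats them as the same address. This is a sketch of that comparison, not a claim about how Google canonicalizes addresses:

```python
import re

def normalize(address: str) -> str:
    """Collapse case, punctuation, and suite notation ('#1' vs 'Ste 1')
    so two renderings of the same address compare equal."""
    addr = address.lower().replace(",", " ")
    addr = re.sub(r"#\s*", "ste ", addr)        # "#1"   -> "ste 1"
    addr = re.sub(r"\bsuite\b", "ste", addr)    # "suite" -> "ste"
    return " ".join(addr.split())               # squeeze whitespace

a = normalize("999 Cherry Ln #1 Dallas, TX 75238")
b = normalize("999 Cherry Ln Ste 1, Dallas, TX 75238")
print(a == b)
# True
```

If the two listings normalize to the same string, the mismatch is cosmetic notation rather than a genuinely different address, which is the useful distinction when auditing NAP consistency.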
Technical SEO | wiredseo