Website Not Ranking Organically - Not Links, Not Domain Authority...
-
This is regarding the domain: http://lesliekays.com
We got a new client about 5 months back whose website had been hacked. It took several months, but we've cleaned up most of the spammy links and completely rebuilt their website. With every other client we have ever worked with, we have succeeded in getting them to rank well organically, but we just can't figure this one out. I'm willing to offer a REWARD to the single person who can figure this out for us. (Let's not get carried away, a small reward.) This customer has great quality links to their site, too.
We are looking for answers other than:
1. Backlinks
2. Content
3. Malware
4. Domain or Page authority
5. 404 errors
6. We have utilized Google's Webmaster Tools endlessly
It is something else and we cannot identify it! Let me know what you think! I will give a public shout-out to the person who helps us identify this issue!
-
For anyone stumbling across this: we were able to identify the issue. We had a WordPress installation with a hidden database full of SPAM and bad links/images, etc. Interestingly, the database could not be located in MySQL, so we had the client agree to let us delete ALL content and start from scratch. (They had previously been hacked under another vendor, but owned their WordPress website.) We had been migrating the database because they had hundreds of blog posts, but since those posts weren't doing any good, we all agreed to start from scratch.
We deleted the entire website off the server, reinstalled WordPress, and have since seen a 745% improvement in keyword rankings. We now have over 20 keywords ranking on page 1. Woot!
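For anyone facing something similar, here is a minimal sketch of the kind of content scan that can surface injected spam before you resort to deleting everything. It is illustrated against SQLite with invented sample rows; on a live site you would run the equivalent `LIKE` query against MySQL's `wp_posts` table, and the spam patterns here are just examples:

```python
import sqlite3

# Illustrative stand-in for WordPress's wp_posts table (sample data is invented).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE wp_posts (ID INTEGER, post_title TEXT, post_content TEXT)")
conn.executemany(
    "INSERT INTO wp_posts VALUES (?, ?, ?)",
    [
        (1, "Brake hand position", "Legitimate riding-safety content."),
        (2, "Cheap pills", 'Buy now <a href="http://spam.example/pharma">here</a>!'),
    ],
)

# Typical patterns found in injected spam content.
spam_patterns = ["%pharma%", "%casino%", "%payday%"]
suspect_ids = set()
for pattern in spam_patterns:
    for (post_id,) in conn.execute(
        "SELECT ID FROM wp_posts WHERE post_content LIKE ?", (pattern,)
    ):
        suspect_ids.add(post_id)

print(sorted(suspect_ids))  # post IDs worth manual review
```

Hidden spam often lives in `post_content` or `wp_options` rows rather than a visibly separate database, which is why a pattern scan like this can find what a casual look at phpMyAdmin misses.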
-
Hi Dirk,
Again, I appreciated the second set of eyes on this. Unfortunately, nothing changed; in fact, it got mildly worse with all the changes! I'm at a loss. I'll keep poking around at it. If you have any revelations, please shoot them my way!
-
The 3 errors listed are for a pop-up chat feature we have running from a company called Olark. Can this cause issues with Googlebot? The code is embedded into the footer area of the site.
-
I see errors with Googlebot:
https://www.google.com/webmasters/tools/mobile-friendly/?url=http%3A%2F%2Flesliekays.com
and the Mobile-Friendly Test uses Googlebot. You need to go to Search Console -> Crawl -> Crawl Errors to see what responses are being returned to Googlebot.
-
Thanks Dirk,
Worked on much of what you recommended. We'll wait a week and see what happens. This client had an old blog that we moved over, but it was a mess. She has given us permission to remove these old posts, with their broken links and missing images, from her site. Your time on this was very much appreciated. Let's see what happens! I am marking this as answered.
-
The reward is not important - it's solving your issue that matters. I don't know Wordfence, but I get suspicious when a tool from Google (PageSpeed Insights) renders a 503 error. This could indicate that Googlebot also encounters trouble when checking your site. The best way to check is to go over your logfiles and look at the responses you serve to Googlebot.
Removing the country blocking now seems to let PageSpeed Insights work properly. The score is not great (50/100 on mobile), which more or less confirms the test by WebPageTest.org. Try to enable caching for static resources like images, and enable gzip to compress data transfer. On top of that, compress your images - some of them are really heavy; check http://www.lesliekays.com/wp-content/uploads/2014/09/brake-hand-position.jpg . The other elements are probably more difficult to fix, but these things alone could get your score to around 70%.
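To illustrate the caching and gzip suggestions: on an Apache host (which most WordPress installs use), something along these lines in `.htaccess` is a common starting point. Treat it as a sketch - module availability and exact expiry times depend on your host:

```apache
# Compress text-based responses (requires mod_deflate)
<IfModule mod_deflate.c>
  AddOutputFilterByType DEFLATE text/html text/css application/javascript
</IfModule>

# Cache static resources like images (requires mod_expires)
<IfModule mod_expires.c>
  ExpiresActive On
  ExpiresByType image/jpeg "access plus 1 month"
  ExpiresByType image/png  "access plus 1 month"
  ExpiresByType text/css   "access plus 1 week"
</IfModule>
```

A caching plugin can set up the equivalent for you, but knowing what it writes makes it easier to verify the headers actually go out.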
I would try to keep the Wordfence disabled for a week or two - to see if situation improves. Without the log files it's difficult to be sure - but my gut feeling is that this is the main reason why the site isn't ranking. Working on site performance can only help with this - the suggestions above shouldn't be that difficult to implement.
Some other issues: your internal links are not optimal. You have about 600+ links going to 404s (a lot of them images), 7 with wrongly formatted URLs (missing the .com), and 600+ URLs which are redirected (your internal linking mixes the www and non-www versions), so you might want to clean these up as well. It shouldn't be a reason for not getting indexed, but it's something you should look at, especially since the site has had issues before.
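The mixed www/non-www links and the missing-.com URLs can be caught with a few lines of code once you have a link export from a crawler. A minimal sketch - the canonical host and sample URLs are assumptions for illustration:

```python
from urllib.parse import urlparse

CANONICAL_HOST = "www.lesliekays.com"

def classify(url):
    """Flag internal links that need cleanup."""
    host = urlparse(url).netloc
    if "." not in host:
        return "malformed host (missing TLD?)"
    if host != CANONICAL_HOST:
        return "non-canonical host (triggers a redirect)"
    return "ok"

links = [
    "http://www.lesliekays.com/about/",
    "http://lesliekays.com/blog/",               # non-www: redirected
    "http://lesliekays/classic-car-insurance/",  # .com missing: DNS error
]
for url in links:
    print(url, "->", classify(url))
```

Run against a full crawl export, this kind of check turns "600+ redirected URLs" into a concrete fix list.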
I ran a crawl, but it seemed to go on forever, so it's possible you have some kind of infinite loop as well. I stopped at 1,781 URIs crawled.
I would recommend investing the 99 GBP in Screaming Frog, which would really help you identify and solve these issues.
Dirk
-
Screaming Frog... excellent resource!
-
Hi Dirk,
I have removed the Country Blocking from Wordfence so you can take a closer look. Thank you for your response and time looking into this! Yes, we run Wordfence on all of our sites (over 80 of them) and haven't had any issues with Googlebot accessing them. We started blocking foreign IP addresses about 3 months ago in an effort to curb some of the hacking and brute-force attacks we get.
And I'm serious about this small reward! I value your time.
Taber
-
Really? We have Wordfence running on 80 of our sites and don't see any issues with the others, all the same type of client as well. Do you have a better suggestion besides Wordfence? Perhaps I'll disable it and see if that clears the issue. And are you saying that you are actually seeing an error returned to Googlebot, or just that it CAN cause issues?
-
True - I just joined in, and #WTF at seeing such results.
-
Sorry Peter - I was first - the reward is mine
-
Wordfence? Are you serious?
This plugin can seriously hurt your SEO performance because of its settings. For example, you can end up returning errors to Googlebot...
-
Bonus: you indicate that you don't have problems with 404s, but your site isn't exactly clean concerning internal links. I can't crawl it using Screaming Frog, given the 503 I get, but manually checking a few pages it seems you have malformed links:
Classic Car Insurance (the .com is missing) - it's not a 404 but a DNS lookup error.
I would do a full scan with Screaming Frog to check what other technical issues you have on the site.
-
Just added the screen capture of the PageSpeed Insights result for the sake of completeness.
-
Difficult to check, given the very strict firewall rules you apply (I get a 503 from Belgium - reason: "Access from your area has been temporarily limited for security reasons").
I assume you are sure that your firewall is allowing Googlebot to access your site? (the sentence "For example, if you were blocked because it was detected that you are a fake Google crawler, then disable the rule that blocks fake google crawlers" scares me a bit)
When I use PageSpeed Insights I get the message that it sees a 503 as well (and this is Google, after all). You might want to check your logfiles to verify that Googlebot is indeed able to crawl your pages (with all the IP addresses it uses).
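Checking what Googlebot actually receives takes only a few lines once you have the access logs. A sketch against combined-format log lines (the sample entries below are invented for illustration - run it on your real Apache/Nginx access log instead):

```python
import re
from collections import Counter

# Invented sample lines in combined log format.
log_lines = [
    '66.249.66.1 - - [10/Jun/2015:10:00:00 +0000] "GET / HTTP/1.1" 503 512 "-" '
    '"Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '66.249.66.1 - - [10/Jun/2015:10:01:00 +0000] "GET /blog/ HTTP/1.1" 200 8192 "-" '
    '"Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '203.0.113.7 - - [10/Jun/2015:10:02:00 +0000] "GET / HTTP/1.1" 200 8192 "-" "Mozilla/5.0"',
]

# The status code is the first 3-digit field after the quoted request.
status_re = re.compile(r'" (\d{3}) ')
statuses = Counter()
for line in log_lines:
    if "Googlebot" in line:
        match = status_re.search(line)
        if match:
            statuses[match.group(1)] += 1

print(dict(statuses))  # any 5xx count here means Googlebot is being blocked
```

Note that anyone can claim the Googlebot user agent, so for a definitive answer you would also verify the IPs, but for spotting a firewall serving 503s to Google this tally is enough.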
Other than the config of your firewall, you might want to check load times - according to WebPageTest, your page takes 25 seconds to load, so there is probably some room for improvement. (I know of other cases where very slow sites were the main reason for indexing issues.)
Let me know how to get the reward (if one of these reasons is causing the indexing issues).
Dirk