$100 to whoever discovers why our rankings dropped
-
I'm offering $100 to the SEO who pinpoints why our rankings dropped. Here are the details:
Some very good people have this site:
nlpca(dot)com
and it has dropped for many of its keywords, including
"NLP"
"NLP Training"
and many other keywords.
We dropped from 19th to 42nd for the term "NLP".
Here's what I'm doing about it:
(1) Making sure the keywords in the titles (on all pages) reflect what's in the content, and that each keyword shows up verbatim in the content 3 times or more.
(2) Making sure the keywords in the URLs (on all pages) reflect what's in the content, with the same 3-or-more rule.
(3) Redoing the home page per (1) above.
(4) Fixing the 404s.
(5) Shortening the titles that are too long. We're also thinking of reducing the home page to 3 keyword phrases, although 4 keywords work on all of our other sites where the keywords show up at least 3 times in the content.
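As a rough sketch of how checks (1) and (2) can be automated (illustrative only: the URL-to-keyword map below is a placeholder, not the site's real one, and it assumes Python with the requests and BeautifulSoup libraries installed):

```python
# Hypothetical audit script: for each page, verify its target keywords appear
# in the <title>, in the URL, and at least three times in the visible text.
import requests
from bs4 import BeautifulSoup

PAGES = {  # placeholder URL-to-keyword map for illustration
    "http://nlpca.com/": ["nlp", "nlp training"],
}

for url, keywords in PAGES.items():
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    title = (soup.title.string or "").lower() if soup.title else ""
    text = soup.get_text(" ").lower()
    for kw in keywords:
        kw = kw.lower()
        print(f"{url} {kw!r}: in title={kw in title}, "
              f"in URL={kw.replace(' ', '-') in url.lower()}, "
              f"body count={text.count(kw)} (want >= 3)")
```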
If it is something else, and you pinpoint it, and because of you we rise back up to around 19th (more or less), then we'll give you $100 via PayPal as a thank you.
I'm going to leave this question 'unanswered' until this is resolved.
-
Sorry, I don't remember 100% what I was thinking when I wrote the response, since it was a week ago, but rereading what I wrote, I believe I was talking about how the SERP may have been manually rated. While most SERPs are ranked by the algorithms Google has developed, I've heard and read that a number of them are affected and rated manually by humans. If one of their manual raters reviewed your site, they may have deemed it less "relevant" for the search.
Have you ever seen the "Give us feedback" link at the bottom of the SERPs? Let's say somebody decides your website and the other two competitors are not what they were looking for when they searched "nlp" or "nlp training". They could complain, the complaint could be reviewed by the manual raters (or whoever responds to the complaints), and you could be dropped. Since the drop was before the most recent Panda change, I was speculating that this could have been a cause.
-
"It might be true, but when the drops occur, or when the SERP is manually rated and its makeup changed, it could be that whatever's triggering it was finally re-evaluated at the time you dropped."
Could you explain this, SeattleOrganicSEO? That might be what happened. It looks like there was an algorithm change that affected us and at least two other strong competitors and shifted us all down.
-
It might be true, but when the drops occur, or when the SERP is manually rated and its makeup changed, it could be that whatever's triggering it was finally re-evaluated at the time you dropped.
However, I don't know if I know all the different pieces you do. Even with the description of the issues above, there's potentially a lot more going on that, as "outsiders", we can't help with as much. Even if we knew everything, we still might be clueless. Sorry, but I haven't had this problem with a client before. I know it will sound cocky, but we've only had the opposite problem (well, not a problem): rankings going up. I call it a problem because a ranking improvement doesn't always translate into traffic (or qualified traffic, for that matter). Sorry, going off on a tangent...
-
SeattleOrganicSEO,
That's worth looking at, but I'm pretty sure it's not only competition. We tumbled from 19th to 42nd in just a few days for the term "nlp". We'd been on the second page for many years.
-
I don't see it being the larger problem.
Have you considered that your competitors have stepped up their SEO efforts? Have you been paying attention to their backlinks to see if they've been doing link building on the keywords you're targeting? It's a lot of work, but since you know the two specific SERPs you're targeting, you can keep an eye on what they're doing. Some SEO software out there makes it a bit easier to keep track...
-
I also just realized that we have articles on our website that also appear elsewhere on the web. Always with permission, but could this be a problem?
-
If this occurred around Nov/Dec, then it might not be the Panda changes. I just thought, since you posted recently, that the recent Panda change (3.2) could have been a possibility.
-
In that article, SeattleOrganicSEO, one of the comments is:
"Surviving Panda 3.2 - I will target the right keyword and provide superb content."
This drop in rank occurred around November or December (Panda 3.1?), when I was trying to target several keywords per page and then later adding content to match.
I thought Panda was for scraping and duplicate content problems. Do I need to worry about the appropriateness of keywords? Do I need to target only keywords that the page is already very obviously optimized for? If it's not code errors, could this be why we've had a ranking drop?
-
I'm also a big believer in clean code and crawlability in general.
But I used the Bing SEO Toolkit, which sees the site just how Bing sees it, and I only found one invalid code error and one page with too much CSS. I think the W3C validator picks up a lot of issues that are a bit picky.
That said, I also believe one open tag can mean huge amounts of content are not read as visible content.
This is even more concerning now that we have microdata: one error can mean your whole schema is useless.
I don't like to have any CSS or JS in my HTML; I like to look at my source code and be able to read my content easily.
This is one of the reasons I don't like CMSs.
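To illustrate the open-tag point with a quick, made-up example (Python's BeautifulSoup standing in for a crawler's parser, not any search engine's actual one):

```python
from bs4 import BeautifulSoup

# One missing </div> after "Sidebar text" nests the main content inside the
# sidebar in the parse tree instead of leaving the two divs as siblings.
broken = "<div class='sidebar'>Sidebar text<div class='content'>Main content</div>"
fixed = "<div class='sidebar'>Sidebar text</div><div class='content'>Main content</div>"

for label, html in [("broken", broken), ("fixed", fixed)]:
    soup = BeautifulSoup(html, "html.parser")
    content = soup.find("div", class_="content")
    print(label, "-> parent of content div:", content.parent.name)
# broken -> parent of content div: div        (swallowed by the sidebar)
# fixed  -> parent of content div: [document] (a proper top-level sibling)
```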
-
When did it happen? Any chance it happened around the 18th?
http://searchengineland.com/google-panda-3-2-update-confirmed-109321
-
Those errors are just for the homepage, although the real count may be much lower (once a tag is left open, it tends to really confuse the validator). I'd clean up the whole site for good measure; I'm a big fan of SEO PowerSuite's on-page tools when doing this sort of thing.
The line breaks don't all need to be replaced; the big gaps at the top just seemed a bit excessive. That particular recommendation is based on my own superstitions, and those of others, but the reasoning is this: the first-1/3 rule comes into play so much in SEO (weighting content placed high on a page, early in a tag, etc.) that condensing the header section to a saner level seems sensible. Some SEO auditors, such as WebCEO, will also yell at you if your TITLE tag doesn't immediately follow HEAD, presumably on similar reasoning; although again, that's not as scientific a claim, to my knowledge, as valid code (which absolutely matters).
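If it helps, here's a rough sketch of that check (it mirrors what auditors like WebCEO flag, not their actual code, and assumes Python with requests and BeautifulSoup):

```python
import requests
from bs4 import BeautifulSoup

# Fetch the page and check whether <title> is the first element in <head>.
html = requests.get("http://nlpca.com/", timeout=10).text
soup = BeautifulSoup(html, "html.parser")

head = soup.head
first_tag = head.find(True, recursive=False) if head else None
print("First element in <head>:", first_tag.name if first_tag else None)
print("TITLE immediately follows HEAD:", bool(first_tag) and first_tag.name == "title")

# Rough measure of the "big gaps at the top": blank lines before any content.
leading_blanks = 0
for line in html.splitlines():
    if line.strip():
        break
    leading_blanks += 1
print("Leading blank lines:", leading_blanks)
```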
-
It's been a while since I did code validation, remind me - is that 79 errors just for the home page?
And will the line breaks confuse crawlers?
And remind me what the cleanest thing to replace the line breaks with is.
-
Not necessarily your one path to salvation (and keep your money on this one even if it does help gain some ground), but I'd personally start with cleaning up the source:
http://validator.w3.org/check?uri=http%3A%2F%2Fnlpca.com%2F&charset=(detect+automatically)&doctype=Inline&group=0&user-agent=W3C_Validator%2F1.2
79 validation errors could definitely confuse crawlers about how things are organized, and imply usability issues. I'd also do something about the extreme number of unnecessary line breaks. I recently pushed a legal niche site up from page 5 to page 1 on a very competitive, short-tail phrase with little more than cleaning up ugly code.
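If you'd rather script it than click through pages one at a time, the W3C's newer Nu HTML Checker exposes a JSON API; here's a hedged sketch (endpoint and fields are the public validator.w3.org/nu interface as I understand it, so verify before relying on it):

```python
import requests

def validation_errors(page_url):
    # Ask the Nu HTML Checker to validate the page and return JSON results.
    resp = requests.get(
        "https://validator.w3.org/nu/",
        params={"doc": page_url, "out": "json"},
        headers={"User-Agent": "site-validation-audit"},
        timeout=30,
    )
    messages = resp.json().get("messages", [])
    return [m for m in messages if m.get("type") == "error"]

errors = validation_errors("http://nlpca.com/")
print(len(errors), "errors")
for e in errors[:5]:  # show the first few for a quick look
    print(e.get("lastLine"), e.get("message"))
```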
-
One thing I noticed is your linking structure. This has, I assume, been like this all along, so it wouldn't be the reason for the drop. But your menu is on every page (I'm assuming), meaning that all pages are linked from all pages. This pattern leads to all pages sharing the rank, but what you want is for your landing pages to have most of the PageRank.
You should link to as many pages as you can from the home page, but only link to the home page and landing pages from every other page (where possible, of course). This will shift the PR to those pages. See the link for a simple explanation:
http://thatsit.com.au/seo/tutorials/a-simple-explanation-of-pagerank
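To make the effect concrete, here's a toy power-iteration PageRank over a made-up five-page site, comparing a full site-wide menu with the funnel pattern described above (page names and damping factor are illustrative, not anything specific to nlpca.com):

```python
# Minimal PageRank via power iteration: links maps each page to the
# list of pages it links out to. No dangling pages in these examples.
def pagerank(links, d=0.85, iters=50):
    pages = list(links)
    pr = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iters):
        new = {}
        for p in pages:
            inbound = sum(pr[q] / len(links[q]) for q in pages if p in links[q])
            new[p] = (1 - d) / len(pages) + d * inbound
        pr = new
    return pr

pages = ["home", "landing1", "landing2", "about", "contact"]

# Pattern 1: site-wide menu -- every page links to every other page.
mesh = {p: [q for q in pages if q != p] for p in pages}

# Pattern 2: funnel -- home links to everything; other pages link only
# to the home page and the two landing pages.
funnel = {
    "home": [q for q in pages if q != "home"],
    "landing1": ["home", "landing2"],
    "landing2": ["home", "landing1"],
    "about": ["home", "landing1", "landing2"],
    "contact": ["home", "landing1", "landing2"],
}

for name, links in [("mesh", mesh), ("funnel", funnel)]:
    pr = pagerank(links)
    print(name, {p: round(score, 3) for p, score in sorted(pr.items())})
```

In the funnel version, the home and landing pages end up with a noticeably larger share of the PageRank, while the mesh splits it evenly across all five pages, which is the shift described above.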