$100 to whoever discovers why our rankings dropped
-
I'm offering $100 to the SEO who pinpoints why our rankings dropped. Here are the details:
Some very good people have this site:
nlpca(dot)com
and it has dropped for many of its keywords, including the keywords
"NLP"
"NLP Training"
and many other keywords.
We dropped from 19th to 42nd for the term "NLP".
Here's what I'm doing about it:
(1) Making sure the keywords in every page's title reflect what's in the content, and that each keyword appears verbatim in the content at least 3 times.
(2) Making sure the keywords in every page's URL do the same.
(3) Redoing the home page along the lines of (1) above.
(4) Fixing the 404s.
(5) Shortening titles that are too long. We're also considering cutting the home page down to 3 keyword phrases, although 4 keywords work on all of our other sites where the keywords appear at least 3 times in the content.
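A quick sanity check for points (1) and (2) above can be scripted. Here's a minimal sketch using only the Python standard library; the sample page and keyword list are invented for illustration, and a real audit would need to fetch pages and handle messier markup:

```python
import re
from html.parser import HTMLParser

class PageText(HTMLParser):
    """Collects the <title> text and the visible body text of a page."""
    def __init__(self):
        super().__init__()
        self.title = ""
        self.text = []
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self._in_title = True

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data
        else:
            self.text.append(data)

def keyword_report(html, keywords):
    """Return {keyword: (appears_in_title, occurrences_in_content)}."""
    parser = PageText()
    parser.feed(html)
    content = " ".join(parser.text).lower()
    report = {}
    for kw in keywords:
        kw_l = kw.lower()
        count = len(re.findall(re.escape(kw_l), content))
        report[kw] = (kw_l in parser.title.lower(), count)
    return report

# Hypothetical page standing in for a real one on the site.
page = ("<html><head><title>NLP Training</title></head>"
        "<body>NLP training courses. Learn NLP. NLP seminars.</body></html>")
print(keyword_report(page, ["NLP", "NLP Training"]))
```

Any keyword whose tuple is `(False, …)` or has a count below 3 would fail the checklist above.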
If it's something else and you pinpoint it, and because of you we rise back up to around 19th (more or less), then we'll send you $100 via PayPal as a thank you.
I'm going to leave this question 'unanswered' until this is resolved.
-
Sorry, I don't remember 100% what I was thinking when I wrote the response, since it was a week ago, but rereading what I wrote, I believe I was talking about how the SERP may have been manually rated. While most SERPs are ranked via the algorithms Google has developed, I've heard and read that a number of them are affected and rated manually by humans. If there was any intervention by one of their manual raters, they may have deemed your site less "relevant" for the search.
Have you ever seen the "Give us feedback" link at the bottom of the SERPs? Let's say somebody decides your website and the other 2 competitors are not what they were looking for when they searched "nlp" or "nlp training". They could complain, the complaint could be reviewed by the manual raters (or whoever responds to complaints), and you could be dropped. Since this was before the most recent Panda change, I was speculating that this could have been a cause.
-
It might be true. But when drops occur, or when a SERP is manually rated and its makeup changed, it could be that whatever was triggering it was finally re-evaluated at the time you dropped.<<
Could you explain this, SeattleOrganicSEO? That might be what happened. It looks like there was an algorithm change that affected us and at least 2 other strong competitors and shifted us all down.
-
It might be true. But when drops occur, or when a SERP is manually rated and its makeup changed, it could be that whatever was triggering it was finally re-evaluated at the time you dropped.
However, I don't know if I know all the different pieces you do. Even with the above description of the issues, there's potentially a lot more going on that, as "outsiders", we can't help with as much. Even knowing everything, we might still be clueless. Sorry, but I haven't had this problem with a client before. I know it will sound cocky, but we've only had the opposite "problem": rankings going up. I call it a problem because a ranking improvement doesn't always translate into traffic (or qualified traffic, for that matter). Sorry, going off on a tangent...
-
SeattleOrganicSEO,
That's worth looking at, but I'm pretty sure it's not only competition. We tumbled from 19th to 42nd in just a few days for the term "nlp". We'd been on the second page for many years.
-
I don't see it being the larger problem.
Have you considered that your competitors have stepped up their SEO efforts? Have you been paying attention to their backlinks and seeing if they've been doing a bit of link building on the keywords you're targeting? It's a lot of work, but if you know the 2 specific SERPs you're targeting, perhaps you can pay attention to what they're doing. Some SEO software out there makes it a bit easier to keep track of...
-
I also just realized that we have articles on our website that are elsewhere on the web. Always with permission, but could this be a problem?
-
If this occurred around Nov/Dec, then it might not be the Panda changes. I just thought, since you posted recently, that the recent Panda change (3.2) could have been a possibility.
-
In that article, SeattleOrganicSEO, one of the comments is
Surviving Panda 3.2 - I will target the right keyword and provide superb content.
This drop in rank was occurring around November or December (Panda 3.1?) when I was trying to target several keywords per page and then later adding content to match.
I thought Panda was for scraping and duplicate-content problems; do I need to worry about appropriateness of keywords? Do I need to only target keywords that the page is very obviously already optimized for? If it's not code errors, could this be why we've had a ranking drop?
-
I'm also a big believer in clean code and crawlability in general.
But I used the Bing SEO Toolkit, which sees the site just how Bing sees it, and I only found one invalid code error and one page with too much CSS. I think the W3C validator picks up a lot of issues that are a bit picky.
But I also believe one open tag can mean huge amounts of content are not read as visible content.
This is even more concerning now that we have microdata; one error can mean your whole schema is useless.
I don't like to have any CSS or JS in my HTML; I like to look at my source code and be able to read my content easily.
This is one of the reasons I don't like CMSs.
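The "one open tag" worry above is easy to screen for locally before reaching for a full validator. A rough sketch using Python's standard-library HTML parser; the sample markup is invented, and a real validator will catch far more than this stack-based check:

```python
from html.parser import HTMLParser

# Void elements never take a closing tag, so they never go on the stack.
VOID_TAGS = {"br", "img", "meta", "link", "input", "hr", "area",
             "base", "col", "embed", "source", "track", "wbr"}

class UnclosedTagFinder(HTMLParser):
    """Tracks open tags on a stack and reports any still open at end of input."""
    def __init__(self):
        super().__init__()
        self.stack = []

    def handle_starttag(self, tag, attrs):
        if tag not in VOID_TAGS:
            self.stack.append((tag, self.getpos()))

    def handle_endtag(self, tag):
        # Pop back to the matching open tag, if any (tolerates mis-nesting).
        for i in range(len(self.stack) - 1, -1, -1):
            if self.stack[i][0] == tag:
                del self.stack[i:]
                break

def unclosed_tags(html):
    """Return [(tag, line_number)] for tags never closed."""
    finder = UnclosedTagFinder()
    finder.feed(html)
    finder.close()
    return [(tag, pos[0]) for tag, pos in finder.stack]

# Hypothetical snippet: the <span> is opened but never closed.
sample = "<div><p>one<p>two</div><span>dangling"
print(unclosed_tags(sample))
```

Anything this flags is exactly the kind of error that can make a parser treat large runs of content as belonging to the wrong element.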
-
When did it happen? Any chance it happened around the 18th?
http://searchengineland.com/google-panda-3-2-update-confirmed-109321
-
Those errors are just for the home page, although there may be far fewer than reported (once a tag is left open, it tends to really confuse the validator). I'd clean up the whole site for good measure; I'm a big fan of SEO PowerSuite's on-page tools when doing this sort of thing.
The line breaks don't all need to be totally replaced; the big gaps at the top just seemed a bit excessive. That particular recommendation is based on my own superstitions (and those of others), but here's the reasoning: the first-third rule comes into play so much in SEO (weighting content placed high on a page, early in a tag, etc.) that condensing the header section to a saner level seems sensible. Some SEO auditors, such as WebCEO, will also yell at you if your TITLE tag doesn't immediately follow HEAD, presumably on similar reasoning; although again, that's not as scientific a claim, to my knowledge, as valid code (which absolutely matters).
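That TITLE-after-HEAD rule of thumb can be screened for with a quick script. A rough sketch; regex-based HTML checks are fragile, so treat this purely as illustration (the charset-meta exception is my assumption about what auditors tolerate, not a documented rule):

```python
import re

def title_follows_head(html):
    """True if <title> is the first element inside <head>,
    allowing only whitespace and an optional charset <meta> before it."""
    pattern = r"<head[^>]*>\s*(<meta[^>]*charset[^>]*>\s*)?<title"
    return re.search(pattern, html, re.IGNORECASE) is not None

# Hypothetical snippets for illustration.
good = "<html><head><title>Home</title></head></html>"
bad = "<html><head><script src='a.js'></script><title>Home</title></head></html>"
print(title_follows_head(good), title_follows_head(bad))
```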
-
It's been a while since I did code validation, remind me - is that 79 errors just for the home page?
And will the line breaks confuse crawlers?
And remind me what the cleanest thing to replace the line breaks with is.
-
Not necessarily your one path to salvation (and keep your money on this if it does help gain some ground), but I'd personally start with cleaning up the source:
http://validator.w3.org/check?uri=http%3A%2F%2Fnlpca.com%2F&charset=(detect+automatically)&doctype=Inline&group=0&user-agent=W3C_Validator%2F1.2
79 validation errors could definitely confuse crawlers about how things are organized, and imply usability issues. I'd also do something about the extreme number of unnecessary line breaks. I recently pushed a legal niche site up from page 5 to page 1 on a very competitive, short-tail phrase with not much more than cleaning up ugly code.
-
One thing I noticed is your linking structure. This has presumably been like it is all along, so it wouldn't be the reason for the drop. But your menu is on every page (I'm assuming), meaning that all pages are linked from all pages. This pattern leads to all pages sharing the rank, but what you want is for your landing pages to have most of the PageRank.
You should link to as many pages as you can from the home page, but only link to the home page and landing pages from every other page (where possible, of course). This will shift the PR to those pages. See the link below for a simple explanation.
http://thatsit.com.au/seo/tutorials/a-simple-explanation-of-pagerank
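A toy power-iteration PageRank (the textbook formula, not Google's actual system) makes the funneling effect above concrete. The page names and link graphs are invented for illustration:

```python
def pagerank(links, damping=0.85, iterations=50):
    """Plain power-iteration PageRank over {page: [pages it links to]}."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new = {p: (1.0 - damping) / n for p in pages}
        for p, outs in links.items():
            if not outs:
                continue
            share = damping * rank[p] / len(outs)
            for q in outs:
                new[q] += share
        rank = new
    return rank

# Menu on every page: every page links to every other page.
mesh = {p: [q for q in "ABCD" if q != p] for p in "ABCD"}

# Funnel: home (A) links everywhere; other pages link only to
# home and the landing page (B).
funnel = {"A": ["B", "C", "D"], "B": ["A"], "C": ["A", "B"], "D": ["A", "B"]}

even = pagerank(mesh)
focused = pagerank(funnel)
print(round(even["B"], 3), round(focused["B"], 3))
```

In the full-mesh graph every page ends up with the same rank, while in the funnelled graph the landing page's share rises noticeably, which is the behaviour the advice above is after.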