$100 to whoever discovers why our rankings dropped
-
I'm offering $100 to the SEO who pinpoints why our rankings dropped. Here are the details:
Some very good people have this site:
nlpca(dot)com
and it has dropped for many of its keywords, including the keywords
"NLP"
"NLP Training"
and many other keywords.
We dropped from 19th to 42nd for the term "NLP".
Here's what I'm doing about it:
(1) Making sure the keywords in the titles (on all pages) reflect what's in the content, and that each keyword shows up exactly in the content 3 times or more (see the sketch after this list for the kind of check we mean).
(2) Making sure the keywords in the URLs (on all pages) reflect what's in the content, and that each keyword shows up exactly in the content 3 times or more.
(3) We're redoing the home page per (1) above.
(4) We're fixing the 404s.
(5) We're shortening the titles that are too long, and we're thinking of reducing the home page keyword count to 3 keyword phrases, although 4 keywords work on all of our other sites where the keywords show up at least 3 times in the content.
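For what it's worth, here's the kind of check I mean for (1) and (2) above: a minimal Python sketch, not a real audit tool. The URL and keyword list are placeholders, and the three-occurrence threshold is just our own rule of thumb.

```python
# Rough audit of steps (1) and (2): does each target keyword appear in the
# page <title>, and at least three times in the visible body text?
# The URL and keyword list below are placeholders, not our real keyword map.
import re
import urllib.request
from html.parser import HTMLParser


class TextExtractor(HTMLParser):
    """Collects the <title> text and the visible body text of a page."""

    def __init__(self):
        super().__init__()
        self.in_title = False
        self.skip_depth = 0      # depth inside <script>/<style> blocks
        self.title = ""
        self.text_parts = []

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self.in_title = True
        elif tag in ("script", "style"):
            self.skip_depth += 1

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False
        elif tag in ("script", "style") and self.skip_depth:
            self.skip_depth -= 1

    def handle_data(self, data):
        if self.in_title:
            self.title += data
        elif not self.skip_depth:
            self.text_parts.append(data)


def audit(url, keywords, minimum=3):
    raw = urllib.request.urlopen(url, timeout=10).read()
    parser = TextExtractor()
    parser.feed(raw.decode("utf-8", "replace"))
    body = " ".join(parser.text_parts).lower()
    for kw in keywords:
        pattern = r"\b" + re.escape(kw.lower()) + r"\b"
        count = len(re.findall(pattern, body))
        in_title = kw.lower() in parser.title.lower()
        print(f"{url} | '{kw}': in title: {in_title}, "
              f"body occurrences: {count}, meets minimum: {count >= minimum}")


# Placeholder page/keyword pair, for illustration only.
audit("http://www.nlpca.com/", ["NLP", "NLP training"])
```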
If it is something else, and you pinpoint it, and because of you we rise back up to around 19th (more or less) again, then we'll give you $100 via PayPal as a thank you.
I'm going to leave this question 'unanswered' until this is resolved.
-
Sorry, I don't remember 100% what I was thinking at the time of writing the response, since it was a week ago, but rereading what I wrote, I believe I was talking about how the SERP may have been manually rated. While most SERPs are ranked by the algorithms Google has developed, I've heard and read that a number of them are also affected and rated manually by humans. If there was any human interaction by one of their manual raters, they may have deemed your site less "relevant" for the search.
Have you ever seen the "Give us feedback" link at the bottom of the SERPs? Let's say somebody decides your website and the other 2 competitors are not what they were looking for when it came to the search "nlp" or "nlp training". Well, they could complain, the complaint could be reviewed by the manual raters (or whoever responds to them), and your site could be dropped. Since this was before the most recent Panda change, I was speculating that this could have been a cause.
-
"It might be true, but when the drop occurs, or when the SERP is manually rated and its makeup changes, it could be because whatever's triggering it was finally re-evaluated at the time you dropped."
Could you explain this, SeattleOrganicSEO? That might be what happened. It looks like there was an algorithm change that affected us and at least 2 other strong competitors and shifted us all down.
-
It might be true, but when the drop occurs, or when the SERP is manually rated and its makeup changes, it could be because whatever's triggering it was finally re-evaluated at the time you dropped.
However, I don't know if I know all the different pieces you do. Even with the above description of the issues, there's potentially a lot more going on that, as "outsiders", we can't help with as much. Even if we knew everything, we still might be clueless. Sorry, but I haven't had this problem with a client before. I know it will sound cocky, but we've only had the opposite problem (well, not a problem): the rankings go up. I call it a problem because a ranking improvement doesn't always translate into traffic (or qualified traffic, for that matter). Sorry, going off on a tangent...
-
SeattleOrganicSEO,
That's worth looking at, but I'm pretty sure it's not only competition. We tumbled from 19th to 42nd in just a few days for the term "nlp". We'd been on the second page for many years.
-
I don't see it being the larger problem.
Have you considered that your competitors have stepped up their SEO efforts? Have you been paying attention to their backlinks and seeing if they've been doing a bit of link building on the keywords you're targeting? It's a lot of work, but if you know the 2 specific SERPs you're targeting, perhaps you can pay attention to what they're doing. Some SEO software out there makes it a bit easier to keep track of...
-
I also just realized that we have articles on our website that are elsewhere on the web. Always with permission, but could this be a problem?
-
If this occurred around Nov/Dec, then it might not be the Panda changes. I just thought, since you posted recently, that maybe the recent Panda change (3.2) could have been a possibility.
-
In that article, SeattleOrganicSEO, one of the comments is:
"Surviving Panda 3.2 - I will target the right keyword and provide superb content."
Our drop in rank occurred around November or December (Panda 3.1?), when I was trying to target several keywords per page and then later adding content to match.
I thought Panda was for scraping and duplicate-content problems; do I need to worry about the appropriateness of keywords? Do I need to target only keywords that the page is very obviously already optimized for? If it's not code errors, could this be why we've had a ranking drop?
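For background, nobody outside Google knows how Panda actually scores duplication, but near-duplicate detection is usually explained in terms of shingle overlap. Here's a toy Python sketch of that idea, comparing an original article with a syndicated copy; the texts, the threshold, and the rel=canonical suggestion are illustrative assumptions, not specifics about this site.

```python
# Toy illustration of near-duplicate detection using word shingles and
# Jaccard overlap. This is a textbook technique, not Google's actual Panda
# scoring; the article strings and the 0.5 threshold are made up.

def shingles(text, k=5):
    """Return the set of k-word shingles in a text."""
    words = text.lower().split()
    return {tuple(words[i:i + k]) for i in range(max(len(words) - k + 1, 1))}


def jaccard(a, b):
    """Overlap between the shingle sets of two texts (0.0 to 1.0)."""
    sa, sb = shingles(a), shingles(b)
    return len(sa & sb) / len(sa | sb) if sa | sb else 0.0


original = ("Neuro-linguistic programming training teaches you to model "
            "excellence in communication and change work.")
syndicated = ("Neuro-linguistic programming training teaches you to model "
              "excellence in communication and change work. "
              "Republished here with the author's permission.")

score = jaccard(original, syndicated)
print(f"Shingle overlap: {score:.2f}")
if score > 0.5:
    print("These read as near-duplicates; a cross-domain rel=canonical on "
          "the syndicated copy is one common way to point credit at the original.")
```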
-
I'm also a big believer in clean code and crawlability in general.
But when I used the Bing SEO Toolkit, which sees the site just how Bing sees it, I only found one invalid code error and one page with too much CSS. I think the W3C validator picks up a lot of issues that are a bit picky.
But I also believe one open tag can mean huge amounts of content are not read as visible content.
This is even more concerning now that we have microdata; one error can mean your whole schema is useless.
I don't like to have any CSS or JS in my HTML; I like to look at my source code and be able to read my content easily.
This is one of the reasons I don't like CMSs.
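On the one-open-tag point, here's a rough sketch of the sort of balance check I mean, using only the Python standard library. Treat it as a hint rather than a verdict (real HTML legally omits some end tags), and the URL is a placeholder.

```python
# Rough, standard-library-only check for unclosed or unexpectedly closed
# tags. Real HTML legally omits some end tags (e.g. </p>, </li>), so treat
# this as a hint, not a verdict; the URL is a placeholder.
import urllib.request
from html.parser import HTMLParser

# Void elements never take a closing tag, so they are ignored.
VOID = {"area", "base", "br", "col", "embed", "hr", "img", "input",
        "link", "meta", "param", "source", "track", "wbr"}


class TagBalanceChecker(HTMLParser):
    def __init__(self):
        super().__init__()
        self.stack = []        # (tag, line number) of currently open elements
        self.problems = []

    def handle_starttag(self, tag, attrs):
        if tag not in VOID:
            self.stack.append((tag, self.getpos()[0]))

    def handle_startendtag(self, tag, attrs):
        pass                   # self-closing tags are balanced by definition

    def handle_endtag(self, tag):
        if self.stack and self.stack[-1][0] == tag:
            self.stack.pop()
        else:
            self.problems.append(f"line {self.getpos()[0]}: unexpected </{tag}>")

    def report(self):
        for tag, line in self.stack:
            self.problems.append(f"line {line}: <{tag}> was never closed")
        return self.problems


raw = urllib.request.urlopen("http://www.nlpca.com/", timeout=10).read()
checker = TagBalanceChecker()
checker.feed(raw.decode("utf-8", "replace"))
for problem in checker.report():
    print(problem)
```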
-
When did it happen? Any chance it happened around the 18th?
http://searchengineland.com/google-panda-3-2-update-confirmed-109321
-
Those errors are just for the homepage, although there may be far fewer genuine ones (once a tag is left open, it tends to really confuse the validator). I'd clean up the whole site for good measure; I'm a big fan of SEO PowerSuite's on-page tools when doing this sort of thing.
The line breaks don't all need to be totally replaced; the big gaps at the top just seemed a bit excessive. That particular recommendation is based on my own superstitions, and those of others, but the reasoning is this: the first-1/3 rule comes into play so much in SEO (weighting content placed high on a page, early in a tag, etc.) that condensing the header section to a more sane level seems sensible. Some SEO auditors, such as WebCEO, will also yell at you if your TITLE tag doesn't immediately follow HEAD, presumably for a similar reason; although again, that's not as scientific a claim, to my knowledge, as valid code (which absolutely matters).
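None of this is exact science, but the header-bloat part is at least easy to measure. A quick Python sketch, where the URL is a placeholder and any thresholds you choose are rules of thumb rather than anything Google has published:

```python
# Quick look at "header bloat": how far into the source the <title> tag
# appears, and how many runs of consecutive blank lines pad the document.
# The URL is a placeholder; interpret the numbers as rough signals only.
import re
import urllib.request

raw = urllib.request.urlopen("http://www.nlpca.com/", timeout=10).read()
html = raw.decode("utf-8", "replace")

head = re.search(r"<head[\s>]", html, re.IGNORECASE)
title = re.search(r"<title[\s>]", html, re.IGNORECASE)
if head and title:
    print(f"characters between <head> and <title>: {title.start() - head.start()}")

# Runs of three or more consecutive line breaks (optionally with whitespace).
runs = [len(run.splitlines()) for run in re.findall(r"(?:[ \t]*\r?\n){3,}", html)]
print(f"runs of 3+ consecutive line breaks: {len(runs)}; "
      f"longest run: {max(runs) if runs else 0} lines")
```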
-
It's been a while since I did code validation, so remind me: is that 79 errors just for the home page?
And will the line breaks confuse crawlers?
And remind me what the cleanest thing to replace the line breaks with is.
-
Not necessarily your one path to salvation (and keep your money on this if it does help gain some ground), but I'd personally start with cleaning up the source:
http://validator.w3.org/check?uri=http%3A%2F%2Fnlpca.com%2F&charset=(detect+automatically)&doctype=Inline&group=0&user-agent=W3C_Validator%2F1.2
79 validation errors could definitely confuse crawlers about how things are organized, and imply usability issues. I'd also do something about the extreme number of unnecessary line breaks. I recently pushed a legal niche site up from page 5 to page 1 on a very competitive, short-tail phrase with not a lot more than cleaning up ugly code.
-
One thing I noticed is your linking structure. This has, I assume, been like it is all along and would not be the reason for the drop. But your menu is on every page (I'm assuming), meaning that all pages are linked from all pages. This pattern leads to all pages sharing the rank, but what you want is for your landing pages to have most of the PageRank.
You should link to as many pages as you can from the home page, but only link to the home page and landing pages from every other page (where possible, of course). This will shift the PR to those pages. See the link below for a simple explanation.
http://thatsit.com.au/seo/tutorials/a-simple-explanation-of-pagerank
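To see why that pattern concentrates PageRank, here's a toy power-iteration comparison of the two linking structures on a made-up five-page site. It uses the classic PageRank formula with 0.85 damping, not Google's real calculation, and the page names are placeholders.

```python
# Toy PageRank (power iteration, 0.85 damping) comparing two internal-linking
# patterns on a made-up five-page site: "flat" (a sitewide menu links every
# page to every other page) versus "focused" (home links out to everything,
# inner pages link only to home and to the landing page we care about).
DAMPING = 0.85


def pagerank(links, iterations=50):
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        new = {p: (1 - DAMPING) / len(pages) for p in pages}
        for page, outlinks in links.items():
            share = rank[page] / len(outlinks)
            for target in outlinks:
                new[target] += DAMPING * share
        rank = new
    return rank


pages = ["home", "landing", "about", "blog1", "blog2"]

# Flat: every page links to every other page, as a sitewide menu does.
flat = {p: [q for q in pages if q != p] for p in pages}

# Focused: home links out to everything; every other page links only back to
# home (and, for non-landing pages, to the landing page as well).
focused = {"home": ["landing", "about", "blog1", "blog2"]}
for p in pages[1:]:
    focused[p] = ["home"] if p == "landing" else ["home", "landing"]

for name, graph in (("flat", flat), ("focused", focused)):
    ranks = pagerank(graph)
    print(name, {p: round(r, 3) for p, r in sorted(ranks.items())})
```

With the flat menu every page settles at the same share (0.2 here); with the focused pattern the landing page rises to roughly 0.26 and the home page to roughly 0.40, which is the shift toward landing pages described above. Exact figures depend on the made-up graph.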