Increasing content, adding rich snippets... and losing tremendous amounts of organic traffic. Help!
-
I know dramatic losses in organic traffic are a common occurrence, but having looked through the archives, I'm not sure there's a recent case that replicates my situation. I've been working to increase the content on my company's website and to advise it on online marketing practices. To that end, in the past four months, I've created about 20% more pages, most of which are very high-quality blog posts; adopted some rich snippets (though not all that I would like to see at this point); improved and increased internal links within the site; removed some "suspicious" pages flagged by Moz that had a lot of links on them (although the content was actually genuine navigation); and begun to guest blog. All of the blog content I've written has been connected to my G+ account, including most of the guest blogging.
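(For concreteness, "rich snippets" here means structured data markup that search engines can read. A minimal, hypothetical schema.org sketch for one of those blog posts might look like the following; every value is a placeholder, not our actual markup:)

```json
{
  "@context": "http://schema.org",
  "@type": "BlogPosting",
  "headline": "Example post title (placeholder)",
  "author": { "@type": "Person", "name": "Author Name (placeholder)" },
  "datePublished": "2013-06-01"
}
```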
And... our organic traffic is precipitously declining. Across the board. I'm befuddled. I can see no warnings (redirects, etc.) that would explain this. We haven't changed the site structure much — I think the most invasive thing we did was optimize our title tags! So no URL changes, nothing.
Obviously, we're all questioning all the work I've done. It just seems like we've sunk SO much energy into "doing the right thing" to no effect (this site was slammed before for its shady backlink buying — though not from any direct penalty, just as a result of the Penguin update).
We noticed traffic taking a particular plunge at the beginning of June.
Can anyone offer insights? Very much appreciated.
-
I'm trying to determine right now whether this particular post is the symptom of broader discrimination against our site, or whether new competition has been introduced for this page. All the peaks and valleys of the site's organic traffic exactly match the peaks and valleys of popularity for this post. Graphing our other major organic landing pages (the top three of which have much less traffic than this one stupid page) does not indicate that they have been similarly affected — their popularity is far more undulating and subject to far fewer crazy movements. So I'm pretty sure at this point that it's the one page.
And, yes, this particular blog post accounts for about half of our site's organic traffic. We've reduced the bounce rate on this blog post to the low 80s, percentage-wise, which I think is respectable given what the blog post is and its relationship to the site's purpose as a whole, which is commercial and not closely related to the post's content.
I suppose that's a new question, isn't it? How much should we care about the fortunes of one page that has a high bounce rate? Obviously, we should reduce the bounce rate (and there are some things we haven't done yet on that front), but the nature of this particular post is just not a super strong match for the content and direction of our site. The bounce rate will always be fairly high; it's just the way it is. Yet it has so. much. traffic. Another site I work on has a similar page, similarly somewhat tangential to the site's content: the "when to use spray foam insulation" page. Thus I always want to call these the "spray foam insulation pages."
-
Ahh, I see. I think if I were in that position, I would try to have the dodgy links removed where possible, if you think they might be doing more harm to the site. Remember, just because you've not received a warning notice in Webmaster Tools doesn't mean that these links aren't negatively affecting your site's rankings. It may just be that there aren't enough of them to have triggered a warning message, or, as mentioned before, they've simply been devalued.
What was it that caused the popularity around this particular blog post?
Do you mean that the decline in overall site traffic is down to a decline in traffic to this specific post? Or that it just correlates with the decline?
-
I've got very little information about these backlinks since they precede my time, but I know that there were never any Google warnings about them. I think you're probably right, though — that the effect from the lousy backlinks is ongoing.
I graphed the decline in GA and found that the drop in traffic is exactly mirrored by the fortunes of this one ridiculously popular blog post. So while I continue to root around for confirmation of this, I'm guessing that this particular post has found some new competition on the SERP. Yeesh.
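That "one page drives the whole trend" conclusion can be sanity-checked numerically rather than by eyeballing graphs: correlate the post's daily sessions against the site's totals and against the rest of the site. A rough sketch in Python; the session figures below are invented for illustration, and in practice you would export daily landing-page sessions from Google Analytics:

```python
def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var_x = sum((x - mx) ** 2 for x in xs)
    var_y = sum((y - my) ** 2 for y in ys)
    return cov / (var_x * var_y) ** 0.5

# Made-up daily sessions: the big post is falling, the rest is flat
post_sessions = [450, 480, 430, 200, 190, 170, 150]
other_sessions = [455, 440, 460, 450, 445, 455, 448]
total_sessions = [p + o for p, o in zip(post_sessions, other_sessions)]

# A high correlation with the total but a low one with the rest of the
# site supports "it's this one page" rather than a site-wide penalty.
print(round(pearson(post_sessions, total_sessions), 3))
print(round(pearson(post_sessions, other_sessions), 3))
```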
-
Hi Novos Jay,
Do the shady backlinks you mentioned still exist and point to the site?
Have you used the disavow tool at all? The reason I ask is that it might simply be that the links that were holding the rankings and traffic up previously are now gradually being devalued through various algorithm updates, so in spite of your recent work to do the right thing, there's still an overall negative effect.
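For reference, a disavow file is just a plain-text list, one entry per line, uploaded through Google's disavow tool. A hypothetical sketch (all domains are placeholders, not real offenders):

```text
# Hypothetical disavow.txt; lines starting with # are comments.
# Disavow every link from an entire domain:
domain:spammy-article-directory.example
domain:cheap-links.example
# Or disavow individual URLs:
http://web20-blog.example/page-with-paid-link.html
```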
Perhaps with a little more information about the types of links (the shady ones) and their quantity/percentage of the total backlinks, I or others might be able to give you some more specific ideas about what's happened?
Thanks,
Greg