Traffic down 60% - about to cry, please help
-
Hiya guys and girls,
I've just spent 6 months, a lot of blood, sweat and tears, and money developing www.happier.co.uk.
In the last few weeks the site started to make a trickle of money; still loss-making, but showing green shoots. But then on Friday traffic dropped because my rankings on google.co.uk fell.
Visits:
Thu 25th April = 1950
Fri 26th April = 1284
Sat 27th April = 906
So it looks like I've been hit with some sort of penalty. I did get a warning on the 20th April about an increase in the number of 404 errors, currently showing 77. I've now removed the links to those 404 pages, and I've left the 404 pages as is, as was suggested here: http://www.seomoz.org/blog/how-to-fix-crawl-errors-in-google-webmaster-tools. Could that be the reason?
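In case it helps anyone auditing the same thing, here's a rough sketch of counting which URLs are returning 404s from a server access log. The log lines and paths below are made up for illustration; they are not Happier's real logs.

```python
import re
from collections import Counter

# Hypothetical access-log lines in common log format (illustrative only)
LOG_LINES = [
    '1.2.3.4 - - [26/Apr/2013:10:00:00 +0100] "GET /deals/old-deal-1 HTTP/1.1" 404 512',
    '1.2.3.4 - - [26/Apr/2013:10:00:01 +0100] "GET /codes/john-lewis HTTP/1.1" 200 9214',
    '5.6.7.8 - - [26/Apr/2013:10:00:02 +0100] "GET /deals/old-deal-1 HTTP/1.1" 404 512',
]

LOG_RE = re.compile(r'"GET (?P<path>\S+) HTTP/[\d.]+" (?P<status>\d{3})')


def count_404s(lines):
    """Return a Counter of request paths that returned a 404 status."""
    hits = Counter()
    for line in lines:
        m = LOG_RE.search(line)
        if m and m.group("status") == "404":
            hits[m.group("path")] += 1
    return hits


print(count_404s(LOG_LINES))
```

Sorting that counter by hit count shows which removed pages are still being requested the most, which is a reasonable way to prioritise which internal links to hunt down.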
We have spent a lot of time on site design and content. We think the site is good, though I agree it has a long way to go; without income that is hard, so we have been struggling through. Any ideas on the reason/s for the penalty?
Big thanks,
Julian.
-
Yes that all makes total sense.
It's a real shame that Google is so harsh on new sites. New sites need traffic from Google like a baby needs oxygen, and without it they might not survive.
Happier is self-funded and about 6 months old, and still losing money every week. I've thought about pulling the plug a fair few times. I'm not sure how much more money to invest; I might just be throwing good money after bad. Sorry, whinge over.
-
There's an SEO debate going on right now about whether Panda is less likely to hit established brands. My position is that yes, it is more likely to impact small/new brands and sites. Perhaps once your brand is more established and you have a lot of comments on more of these pages you can try removing the noindex.
Without a full audit I may be missing something, but I hope this helps!
-
Firstly, I would like to say a big thanks; your advice and thoughts are greatly appreciated. It has taken the pressure off me a bit. I was panicking, but now I'm just a little worried.
I will talk to the tech guys today about noindexing the single deal pages and using a similar layout to the code pages, plus the comment layout from Facebook. I don't really want to do it, as those pages are great for the long tail; check out the traffic to hotukdeals.com, who have a single page for every code and every deal, e.g. hotukdeals.com/deals/adidas-vintage-airline-bag-50-off-15-00-adidas-co-uk-1542011
But I can't afford to carry on with the current drop in traffic, so I have to make some changes fast.
Man, do I hate Google at the moment. They rank so much crap and obvious spam in their SERPs, but penalise a site like happier.co.uk that we have spent a lot of time carefully designing, adding quality content to, and playing by the rules with.
-
The easy questions first: no, I don't think meta descriptions have anything to do with it. Google just crawls in spikes, especially if it detects a higher rate of change than usual. It's generally nothing to be concerned about.
As for thin pages, I think it's fine that you make the distinction between deals and codes. As a user, it just feels like "codes" are more valuable. It's a good idea. You have pages for each large brand, which is hugely important. Categories are also fine, as long as you don't multiply them endlessly.
I'm mostly concerned with pages like this:
http://www.happier.co.uk/deals/marks-and-spencer/3-pack-cotton-pyjamas-10-ms
I'd handle all of these as dropdowns like you seem to be doing on the home page, and make it like these pages never existed. I don't think this will hurt you much, either. I doubt anyone Googles, "deals on 3 packs of cotton pyjamas," and even if they do Google will pull up the text on brand pages.
As for the comments on these deals, see how Facebook does them. Load the first 3 most recent, then have an option for "show all comments."
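To make that concrete, here's a minimal sketch of the "first 3 most recent, then show all" behaviour. The field names (`created`, `text`) are assumed shapes for illustration, not Happier's actual schema.

```python
def comments_preview(comments, limit=3):
    """Return the newest `limit` comments plus a flag for a 'show all' link.

    `comments` is a list of dicts with a `created` timestamp (assumed schema).
    """
    newest_first = sorted(comments, key=lambda c: c["created"], reverse=True)
    return {
        "comments": newest_first[:limit],          # what renders immediately
        "show_all_link": len(comments) > limit,    # render the expand link?
        "hidden_count": max(0, len(comments) - limit),
    }


demo = [{"created": i, "text": f"comment {i}"} for i in range(5)]
preview = comments_preview(demo)
# The newest 3 (created = 4, 3, 2) are shown; 2 remain behind "show all".
print(preview["show_all_link"], preview["hidden_count"])
```

The "show all comments" click would then fetch the full list, so the initial page stays light while the content is still reachable.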
I would personally put all NSFW content, or anything close to it, behind an authorization wall, then store the authorization in a cookie. This might hurt your rankings for these types of deals, but I feel like it would be well worth it to avoid being classified as someone dealing in adult content. It's unlikely to happen, but it's totally not worth the risk.
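A framework-agnostic sketch of that cookie-based gate follows. The cookie name and the `nsfw` flag are made-up shapes to show the decision logic, not Happier's real code.

```python
NSFW_COOKIE = "adult_ok"  # hypothetical cookie name set by the auth wall


def can_show_deal(deal, cookies):
    """Decide whether to render a deal or an authorization interstitial.

    `deal` is a dict with an `nsfw` bool; `cookies` is the request's
    cookie dict. Both shapes are assumptions for illustration.
    """
    if not deal.get("nsfw"):
        return True  # SFW deals always render
    # NSFW deals render only once the visitor has clicked through the
    # authorization wall, which sets the cookie.
    return cookies.get(NSFW_COOKIE) == "1"


assert can_show_deal({"nsfw": False}, {}) is True
assert can_show_deal({"nsfw": True}, {}) is False
assert can_show_deal({"nsfw": True}, {NSFW_COOKIE: "1"}) is True
```

The point of the cookie is that the visitor only sees the wall once per session, while crawlers (which don't click through) never reach the NSFW content.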
Again, I think your site will be fine if you keep at it, even in its current shape. You know your industry better than I do, but from an SEO perspective I'd make these changes. Think about them and let me know if you see a flaw in them.
-
Hey Carson,
I really appreciate you taking the time to give such a well-considered answer. It does make total sense. I feel the same: it's less likely to be a problem with links (which have been built naturally and in an ethical way, so it would be really harsh to get a penalty for them).
I agree the problem could be thin pages; however, to add a little more info into the mix:
-
We wanted to only display voucher codes, and not stuff the page with "deals" that add no extra value. We felt only showing codes in the codes category was a better user experience than something like this: vouchercodes.co.uk/johnlewis.com, who just fill the page up with "deals" when there are no codes available. E.g. this is what we currently do: http://www.happier.co.uk/codes/john-lewis. Do you think it would be better to add a few "deals" even though the person visiting this page is looking for "discount codes"?
-
The deal page template is similar to the market leader's, e.g. hotukdeals.com/deals/rowntree-jelly-tots-randoms-160g-64p-pick-mix-150g-74p-fruittella-4pk-55p-mentos-1540682. hotukdeals.com is in the top 100 sites in the UK according to Alexa.
In summary, I had considered the issue of thin pages before designing Happier.co.uk and knew it could be an issue, but there is only so much content users want about a deal or code, and the market leaders seem to rank very well, so I felt it would appear almost spammy to fill the pages up with more content, which I understand is the opposite of the normal advice for webpages.
Not sure about the NSFW content. I know we list sites like Ann Summers, which is a high street brand in the UK that sells sex toys, etc., but as a high street brand I thought they were OK; plus, everyone in this niche lists their codes and deals. But users can add content to happier.co.uk, so I'm not 100% sure whether more NSFW content is listed. I will check that out.
There are some crawl stats from GWT that spike on the same day our traffic dropped, e.g. normal kilobytes downloaded per day is about 15,000, but on 27th April it peaked at 45,164. See attached image.
And finally, I was told that my IT guys fixed a problem with the meta descriptions the day before the drop in traffic, on most pages in the codes category. Previously the meta description was a templated one, while a handwritten one was in the DB but not being used. They changed that, and now the handwritten one is being used. Personally, I can't believe that Google would drop traffic to a site just because the meta descriptions were changed, especially as they are not supposed to use them in ranking a site.
Sorry for the long reply, but sometimes a lot of info is needed; Google is such a tricky beast these days. Here is a summary of my questions:
-
Do you think it would be better to add a few "deals" even though the person visiting this page is looking for "discount codes"?
-
Do the crawl stats, especially the spike in kilobytes per day, point at the problem?
-
Could changing the meta description be the cause?
Thank you for your time,
Julian.
-
-
Hi there,
I'm sorry that your site isn't doing as well as you'd like. Hopefully we can get to the bottom of this!
First, I've dealt with a LOT of link penalty questions, and I don't think that's what's happening. Consider, for example, that you have a ton of links for your Frugal 100 program. If you were slapped by Penguin, you would almost definitely not be ranking for "Frugal 100."
If your traffic is down - and SearchMetrics doesn't see it that way - please remember that it could just be fluctuation and SERP volatility that newer sites are prone to. Google tests, gives, takes away, and then gives back.
If you've been penalized (and I'm kinda doubting it), it's probably because of Panda. You have a LOT of really thin pages with little or no content on them and a heavy template. Can you add more content per page or scale back on the template space for individual deals? Do each of these deals need their very own page? I'm not sure, but a lot of deal sites don't do that.
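To make "really thin pages" concrete, here's a rough heuristic sketch for flagging pages whose unique copy is small in absolute terms or dwarfed by the template. The thresholds (150 words, 25% ratio) are arbitrary assumptions for illustration, not numbers Google has published.

```python
def is_thin(unique_words, template_words, min_words=150, min_ratio=0.25):
    """Flag a page as thin if its unique copy is short, or if boilerplate
    dominates the page. Thresholds are illustrative guesses only."""
    total = unique_words + template_words
    if unique_words < min_words:
        return True  # not enough unique copy, regardless of template size
    # enough words, but still thin if the template swamps the copy
    return total > 0 and unique_words / total < min_ratio


# A deal page with 40 words of copy inside a 600-word template is flagged;
# a brand page with 400 words of copy is not.
print(is_thin(40, 600), is_thin(400, 600))
```

Running something like this over a crawl export would give a rough list of candidate pages to consolidate, noindex, or beef up.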
I also noted that you have some adult deals on your site. Do keep in mind that Google might choose not to show the site for SFW deal-related terms if it thinks your content has a lot of NSFW stuff on it.
I hope this helps. Your site doesn't feel like most sites that get penalized - it seems useful to the people it serves. I think you'll be just fine in the end.
-
I thought it was now normal for Google to send out an "unnatural links" email, e.g. http://searchengineland.com/googles-cutts-on-how-to-locate-unnatural-links-pointing-to-your-web-site-148190
In the article it says, "As you know, Google has been sending unnatural link warnings to Webmasters for about a year now." It's dated 13th Feb 2013.
So can it be a link penalty without a link warning email? By the way the links are natural and certainly not from thousands of sites, GWT is showing links from 292 domains.
-
No, Google won't tell you the specifics, but I agree this is from the quick building of links in a short amount of time.
-
If it was to do with links, would I have got an email saying that they found unnatural links? That would give me a chance to sort the problem out. Because I didn't, I assume that links are not the problem.
-
Google is probably not happy with your link building. I would try to contact some of the sites with links that are not relevant and ask for them to be removed, though I don't know if that will be the best way to spend your time considering the 28,000 new links in 3 months. Maybe some positive social signals would take some of the suspicion down; I think to make 28,000 links seem natural you would have to be trending and going viral on the social networks, but even some small work may help.
-
Vouchercodes and hotukdeals have been around for many years and have therefore built up a considerable amount of authority. What is your traffic like on weekends? I used to run a deals website similar to your own, and traffic was always down on the weekend because people would use it more during the build-up to the weekend.
-
Hi Rich,
28,000 sounds like a lot, but if a site adds us to their blogroll and they have 28,000 pages, that would account for all the links.
We've done the following types of link building:
- Top quality content, e.g:
- http://www.happier.co.uk/blog/apple-ripping-off-everyone-apart-from-americans-average-iphone-5-international-mark-up-32-samsung-s3-11-3153 (got a link from guardian.co.uk)
- http://www.happier.co.uk/blog/the-frugal-100-awards-2013-edition-2741 (this is where a lot of links came from, all from related niche blogs; they either mentioned us in a post or added a badge to their right nav)
- http://www.happier.co.uk/blog/the-uk-economy-a-billion-here-a-billion-there-1530 (this got a few quality links)
- http://www.happier.co.uk/blog/20-of-the-best-positive-psychology-blogs-of-2012-113 (a few quality links)
We have also done some guest posting on related, quality niche blogs, listed the site on some directories (not many), and some general outreach.
-
You appear to have acquired an awful lot of links in a very short time. A look at Majestic SEO tells me you have gained 28,000 in the last 90 days. How have you been going about link building?
-
Hi Andy,
Thanks for the answer.
Yes, it's an affiliate site, but these types of sites can do amazingly well in terms of rankings and traffic. Check these two:
http://www.alexa.com/siteinfo/vouchercodes.co.uk#
http://www.alexa.com/siteinfo/hotukdeals.com
and of course http://www.alexa.com/siteinfo/retailmenot.com
They are all similar sites. Maybe linking out from the homepage was not a great idea, but I thought it was user-friendly: a user sees a code, they click and go directly to the merchant to get the code, instead of going to a deeper page first. In fact, I just checked, and that is just how retailmenot.com works, which to my knowledge is the most popular coupon site in the world.
I'm still dazed and confused.
-
I doubt that is the reason for your drop, in all honesty. I'd suggest, if you can, redirecting to a category to avoid any issues though.
I'd also suggest adding more valuable content to the front page; at the moment it's just a link-out affiliate site, and these don't usually do amazingly well in search. You are adding a blog and forum, which is great, but maybe look at surfacing them on the front page if you can, to add some depth to the site other than via the top nav.
Other than that, without delving into your stats and GWT, I can't say too much.
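A sketch of the "redirect retired deal pages to their category" idea follows. The URL patterns are assumptions based on the paths mentioned in this thread (/deals/&lt;brand&gt;/&lt;deal-slug&gt;), not Happier's actual routing.

```python
def redirect_target(path):
    """Map a retired single-deal URL to its brand page with a 301.

    Assumes paths like /deals/<brand>/<deal-slug>; anything unrecognised
    falls through to the homepage. Purely illustrative routing logic.
    """
    parts = [p for p in path.split("/") if p]
    if len(parts) == 3 and parts[0] == "deals":
        brand = parts[1]
        # Keep the link equity on the brand page rather than serving a 404.
        return 301, f"/deals/{brand}"
    return 301, "/"


print(redirect_target("/deals/marks-and-spencer/3-pack-cotton-pyjamas-10-ms"))
```

In practice this would live in the server config or the CMS's routing layer; the point is just that each dead deal URL should resolve to the most specific surviving page, not a generic 404.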