80% of traffic lost overnight, Google penalty?
-
Hi all.
I have a website called Hemjakt (http://www.hemjakt.se/), which is a search engine for real estate, currently only available on the Swedish market. The application crawls real estate websites and collects all listings in a single searchable application. The site has been live for a few months and had seen steady growth since release, increasing by 20% weekly up to ~900 visitors per day. Three days ago, overnight, I lost 80% of my traffic. Instead of 900 visitors per day I'm at ~100 visitors per day, and when I search for long, specific queries such as "Åsgatan 15, Villa 12 rum i Alsike, Knivsta" (<address> <house type> <rooms> <area> <city>), I'm now only found on the fifth page.
I suspect that I have become the subject of a Google penalty. How do I get out of this mess?
Just like all search engines or aggregators, I do crawl other websites and scrape their content. My content is ~90% unique from the source material, and I do add user value by giving people the possibility to compare houses, far more data to compare pricing and history, extra functionality that the source sites do not offer, and so on. My analytics data show good user engagement.
Here is one example of a Source page and a page at my site:
Source: http://www.hemnet.se/bostad/villa-12rum-alsike-knivsta-kommun-asgatan-15-6200964
My Site: http://www.hemjakt.se/bostad/55860-asgatan-15/
So:
-
How do I actually confirm that this is the reason I lost my traffic? When I search for my branded query, I still get results, and I'm still indexed by Google.
-
If I am penalized: I'm not attempting to do anything black hat, and I really believe that the app gives a lot of value to users. What tweaks or changes to the application do you suggest, so that I can continue running the service in a way that Google is fine with?
-
Because you have almost no backlinks (you have six total), and your site apparently has an incredible amount of duplicate content. After running it through Screaming Frog and Siteliner, I can see you have a ton of broken links as well.
From scanning only 250 pages total, I was able to come up with quite a few huge issues.
Without the www, the site returns a 400 Bad Request.
Top Issues
1,301 pages were not scanned.
772 broken links were found.
Your site is missing an XML Sitemap.
WWW vs. non-WWW: Server error.
Your Pages
250 pages scanned of 1,551 found.
Normal Pages: 67
Skipped, Redirect: 10
Errors: 173
Not Scanned: 1,301
Duplicate Content
Duplicate Content: 15%
Common Content: 59%
Unique Content: 25%
The complete PDF report is here: http://cl.ly/WF4O
http://www.siteliner.com/www.hemjakt.se/omrade/centralt/?siteliner=broken
OSE backlinks http://www.opensiteexplorer.org/links?page=1&site=http%3A%2F%2Fwww.hemjakt.se%2F
After using the elite edition of Ahrefs (the different editions offer different capabilities), here is a summary of links found for www.hemjakt.se:
You have roughly 2 referring domains in total across both tools. I would run it through MajesticSEO as well, but I do not see the point.
You need to fix the problems with your site. I have a guess as to how you were ranking, but I do not believe it was natural.
If you had built that many pages on a website and really gotten only about 5 or 6 backlinks out of it, with only 2 referring domains/IPs linking to your site in total, I would find it impossible to believe those rankings were natural.
You have clearly scraped the content, or somebody else has been doing a lot of scraping that you may not know about.
That's a lot of information; I hope this helps,
Thomas
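P.S. On the "missing XML Sitemap" finding: here is a minimal sketch of how one could be generated from your listing database. The URLs below are just placeholders taken from this thread, and a real version would split into multiple sitemap files once you pass the 50,000-URL limit.
```python
# Minimal XML sitemap generator (sketch). The URL list is a placeholder;
# a real version would pull every listing URL from the site's database.
from xml.sax.saxutils import escape

urls = [
    "http://www.hemjakt.se/",
    "http://www.hemjakt.se/bostad/55860-asgatan-15/",  # example listing page
]

def build_sitemap(urls):
    """Return a minimal sitemap.xml document as a string."""
    entries = "\n".join(
        "  <url><loc>{}</loc></url>".format(escape(u)) for u in urls
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        + entries + "\n</urlset>"
    )

with open("sitemap.xml", "w") as f:
    f.write(build_sitemap(urls))
```
Submit the generated file in Webmaster Tools so Google can find all of your pages.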
-
I would say that your site goes against Google's guidelines regarding thin content: https://support.google.com/webmasters/answer/2604719?hl=en
Part of the description of thin content includes "automatically generated content" and "copied content".
Have you checked your Webmaster Tools console to see if you have been given a thin content penalty? Check Search Traffic --> Manual Actions. If there isn't one though, then Panda is probably still the culprit.
-
If your site is only a couple of months old, it's possible that you initially had a honeymoon boost and now you're seeing the traffic you would normally get. But, I think that there is a more serious issue.
According to Google there are over 72,000 pages of your site indexed. How many of these pages contain useful and UNIQUE content that users can't find anywhere else on the web? If you're crawling other people's sites and publishing their content as yours then you're just asking to be demoted by the Panda algorithm. The only thing that doesn't exactly fit with this is the date. As far as we know the last Panda update was May 20. Still, this is my best guess.
"My content is ~90% unique from the source material and I do add user value..."
I would not agree here. In the examples that you gave me, I see that the majority of the actual content (i.e. the text) is 100% copied from the original source. There are several issues with this:
1. Google will see that the vast majority of your site contains content that did not originate with you. This is a sign of low quality and Google will not want to display your site over the originator of that content.
2. Copyright issues. Do you have permission to scrape these sites?
I haven't had a thorough look but from what I've seen, I don't think that this business model will work.
-
Oh yes - I'm sorry about that. So the duplicate content issue still applies, and now I've tested your website - it is quite slow but not as bad as the example I gave. I'd still recommend improving the speed.
Do you link to the website you're scraping the content from? Do you have permission to publish it?
I'm assuming at this stage you don't have any messages from Google in WMT about penalties. Can you pinpoint any onsite changes you made before the traffic drop?
Considering the duplicate content on your site, I'm surprised you ranked so well and saw the traffic increases you did. With your absence of links, perhaps the website is at its natural position at the moment. I have no experience of real estate websites or the Swedish market and the competition within it, though.
You could consider adding some unique content to the homepage and other location pages - something useful for users. If I was moving to a new city I'd want to know about the different neighbourhoods and which area suited me - which areas are quiet, near to parks? Which have good (public) transport links, might be noisy, have amenities nearby? These could be expanded into really useful standalone location guides, with pictures, maps and more info added.
-
Not sure what you are talking about. There are no links in the footer; there is no footer. Category pages use infinite scroll, and normal pages have no footer at all.
On a normal house page there are only about ~55 <a> tags.
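For anyone who wants to verify, this is roughly how I counted them: a quick sketch assuming the requests and beautifulsoup4 packages are installed.
```python
# Quick link count for a single page (rough sketch).
import requests
from bs4 import BeautifulSoup

url = "http://www.hemjakt.se/bostad/59873-soldatgatan-13-b/"
html = requests.get(url, timeout=10).text
soup = BeautifulSoup(html, "html.parser")

# Count every anchor tag on the page, wherever it appears.
print("Total <a> tags:", len(soup.find_all("a")))
```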
-
Hi Alex, thanks for your response.
That is not my website you are linking to, that is the source website that I crawl the content from in the example above.
The domains are very similar so easy to mix up.
Hemjakt.se is my site.
Hemnet.se is the source that I crawled.
So in this case, if you want to do a Pingdom Speed Test on that page, you should do it on the following URL: http://www.hemjakt.se/bostad/55860-asgatan-15/
But to answer the rest of your question:
- No, there have been no significant speed changes to the website according to the Google Analytics site speed report. The last bump, when load times increased temporarily, was 2 weeks ago and lasted only 1 day.
- The site is hosted on Amazon with a CDN for all images. The site is not perfect, but a lot of time has been spent making sure that the load times are decent and that the site gets an OK score in Google's PageSpeed Insights test.
Here is the Overview page from my Google Analytics to display my Bounce Rate and so on.
Regarding the bounce rate: it was ~66% when I had 0 goals set for the site. When I set up goals such as completing forms, viewing the gallery of each house, etc., the bounce rate dropped to ~38%. That is the reason for the bounce rate drop.
-
Footer links are not usually an issue unless you're using them for keyword spamming, e.g. you have a load of links for different locations: "buy house london", "buy house manchester", "buy house liverpool", etc. You won't get a penalty for having over 100 links on a page, so that's unlikely to be the issue either.
I've had a quick look at your website and there are two major issues I picked up on:
- Duplicate content - you say you crawl websites and your content is "90% unique from the source material". The content would be better if 90% of it was unique to your website only. I searched Google for a block of text from one of the links you provided and 5 websites appeared. Google doesn't like duplicate content - maybe your website does provide value to users but how can Google's algorithm know this if all it sees is duplicate content? Perhaps your site had poor user metrics (e.g. people pressing back to return to the search results page after a short period of time on your website) so Google demoted it. Can you see any evidence of this in Google Analytics?
- I tested http://www.hemnet.se/bostad/villa-12rum-alsike-knivsta-kommun-asgatan-15-6200964 on Pingdom Speed Test and got "Your website is slower than 92% of all tested websites". That's very slow, and if it's a recurring problem it's the first thing I'd fix. Has a recent change caused that slowdown? Is the site registered with Google Webmaster Tools? If so, can you see a change in page speed, crawl rate, etc.? Google Analytics can also report on page speed.
While I was browsing, this happened to me twice: "Just nu har vi tekniska problem" ("Right now we are having technical problems").
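If you want a quick, repeatable check alongside Pingdom, something like this sketch measures the raw HTML response time. It ignores images and scripts, so treat the numbers as a lower bound on what a real browser experiences; the URLs are the ones from this thread.
```python
# Rough server response-time check (sketch). requests' elapsed attribute
# measures the time from sending the request until the response headers
# arrive, so this is only a lower bound on real page-load time.
import requests

pages = [
    "http://www.hemjakt.se/",
    "http://www.hemjakt.se/bostad/55860-asgatan-15/",
]

for url in pages:
    r = requests.get(url, timeout=30)
    ms = r.elapsed.total_seconds() * 1000
    print("%s -> HTTP %d in %.0f ms" % (url, r.status_code, ms))
```
Run it a few times a day; if the times swing wildly, that points at a server or hosting issue rather than the pages themselves.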
-
Remove some footer links. Too many footer links can hurt any kind of website. Use no more than 100 links per page.
-
What? Not sure what you're talking about.
Here is an example of a page on my site:
http://www.hemjakt.se/bostad/59873-soldatgatan-13-b/
-
As far as I know, if you delete the footer links, then I am sure you will rank again.
You are putting more and more links in the footer.
Thanks,
Akhilesh
-
What is your suggestion to solve it?
I truly think my application can add a lot of value to users by giving them the chance to find a lot of data in one spot, instead of having to navigate to a bunch of other sites. Since I also get access to "big data", I can add a lot of value by displaying price history from multiple sources.
I'm not trying to do anything shady or black hat. I'm just trying to create a search engine for real estate. Obviously that will generate a lot of content and many pages in a short amount of time.
What am I doing that is against the Google guidelines? I doubt that "adding pages too fast" is by itself a reason for a penalty; it is more likely suspicious activity that would indicate a closer look by Google is needed, to see if any spam is going on.
-
I mean that your website has no inbound links. I think you created too many pages in a very short time, and that's the reason behind this.
-
I have done no linkbuilding for this site. I have only focused on building a strong crawler and with ~60'000 pages/houses indexed I have been able to rank well on long queries with low competition, and thereby getting good traffic without any linkbuilding.
Moz reports 0 links, and from my Google Analytics report I have only been getting some referral traffic from Reddit, from a post that has now been deleted. All other traffic is organic search traffic. So I highly doubt that there are any "bad links" causing this penalty.
Regarding the crawler. I crawl other websites for real estate objects. I cache the HTML and I then filter out specific information such as price, number of rooms, address and so on.
I then publish this data with my own structure. This makes the data 87-90% unique compared to the source site. You can see an example in the first post of this topic.
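To illustrate, the extraction step looks roughly like this sketch. The CSS selectors are hypothetical, made up for illustration; the real ones differ for each source site.
```python
# Simplified version of the crawler's extraction step (sketch).
# The CSS selectors below are hypothetical placeholders.
import requests
from bs4 import BeautifulSoup

def extract_listing(url):
    """Fetch a listing page and pull out structured fields."""
    html = requests.get(url, timeout=10).text  # in production, read from the HTML cache
    soup = BeautifulSoup(html, "html.parser")

    def field(selector):
        return soup.select_one(selector).get_text(strip=True)

    return {
        "address": field(".listing-address"),   # hypothetical selector
        "price": field(".listing-price"),       # hypothetical selector
        "rooms": field(".listing-rooms"),       # hypothetical selector
        "description": field(".listing-text"),  # the 1-2 copied paragraphs
    }
```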
The 10-13% that is not unique is the description of the real estate object, which is usually a 1-2 paragraph text saying something like "This is an amazing house with 12 rooms with a great view that would be a perfect fit for the large family". This text is copied straight off. Might that be what is triggering the penalty, even though it's only ~10% of the page content? What do you think?
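One crude way to put a number on that overlap per page is a plain text-similarity ratio, something like this sketch. It is only an approximation, and certainly not how Google actually measures duplication.
```python
# Crude per-page duplication estimate (sketch). SequenceMatcher.ratio()
# returns a value in 0..1; this approximates textual overlap only.
from difflib import SequenceMatcher

def overlap(source_text, my_text):
    """Return a 0..1 similarity ratio between two text blocks."""
    return SequenceMatcher(None, source_text, my_text).ratio()

source = "This is an amazing house with 12 rooms with a great view ..."
mine = "Villa, 12 rooms, Alsike. This is an amazing house with 12 rooms with a great view ..."
print("Overlap: %.0f%%" % (overlap(source, mine) * 100))
```
Running this on the full visible text of each listing page against its source would tell you whether the copied description dominates the page more than the 10% figure suggests.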
-
Hi Marcus Lind,
Can you share what SEO activity you have done for this website, and how many anchors and which anchor texts you have created?
After that we can say clearly what the exact issue is.
-
Please note that I have not done any link building at all for the site; it's only 2 months old. There are no unnatural links, no keyword stuffing and no other black hat SEO tactics used. No links have been purchased at all, so Penguin should not be relevant to this site.
I've attached an image of my Google Analytics Overview for the past months, and you can see how it has been naturally steadily growing day by day and then just taking a huge drop the last few days.
-
That's fantastic that you've been keeping track of your Bing queries, and it does sound like you've been hit by a penalty, I am guessing.
I can pull link data for you; however, my girlfriend will kill me if I don't stop answering questions right now, so look for my answer in about six hours. I will use:
Ahrefs, Majestic SEO & Moz
-
Try running your queries in Bing and see how it compares. If you see the terms you ranked for earlier ranking well in Bing, then you know you've been hit by a penalty in Google.
That's what made it clear for a site I'm involved with and have been trying to recover for many months now.
-
I'm very sorry to hear that this has happened to you. Here is some information that I hope is helpful.
The URL below may say that you have to be a member to see the video, but you do not have to be a member to read the transcript. So read the transcript and use the tool that tries to gauge what happened. It is not 100% accurate, but it might give you a lot more insight into what has actually occurred. You will need Moz's Open Site Explorer and another tool like MajesticSEO.com or Ahrefs.com.
I like Ahrefs, but either one will do. Your first step is to Google "fruition Google penalty tool":
https://fruition.net/google-penalty-checker-tool/
That will bring up a cool tool that shows whether or not you've been hit by Google Penguin, because there have been several updates: http://www.quicksprout.com/university/how-to-identify-and-recover-from-a-google-penguin-penalty/
So you're aware, this link shows a video that you may not be able to see as a non-member, but because it is fully transcribed, you have an exact how-to manual right below it.