80% of traffic lost overnight, Google Penalty?
-
Hi all.
I have a website called Hemjakt (http://www.hemjakt.se/), which is a search engine for real estate, currently only available on the Swedish market. The application crawls real estate websites and collects all listings in a single searchable application. The site was released a few months ago and has seen steady growth since then, increasing by about 20% weekly up to ~900 visitors per day.

3 days ago, overnight, I lost 80% of my traffic. Instead of 900 visitors per day I'm at ~100 visitors per day, and when I search for long, specific queries such as "Åsgatan 15, Villa 12 rum i Alsike, Knivsta" (i.e. <address>, <house type>, <rooms>, <area>, <city>), I'm now only found on the fifth page.
I suspect that I have become the subject of a Google penalty. How do I get out of this mess?
Just like all search engines and aggregators, I do crawl other websites and scrape their content. My content is ~90% unique compared to the source material, and I do add user value by giving visitors the possibility to compare houses, access far more data to compare pricing and history, and use extra functionality that the source sites do not offer. My analytics data show good user engagement.
Here is one example of a Source page and a page at my site:
Source: http://www.hemnet.se/bostad/villa-12rum-alsike-knivsta-kommun-asgatan-15-6200964
My Site: http://www.hemjakt.se/bostad/55860-asgatan-15/

So:
-
How do I actually confirm that this is the reason I lost my traffic? When I search for my branded query, I still get results. Also, I'm still indexed by Google.
-
If I am penalized: I'm not attempting to do anything black hat, and I really believe that the app gives a lot of value to the users. What tweaks or changes to the application would you suggest so that I can continue running the service in a way that Google is fine with?
-
You have almost no backlinks (six total), and your site apparently has an incredible amount of duplicate content. After running it through Screaming Frog and Siteliner, I can see you have a ton of broken links as well.
From scanning only 250 pages total I was able to find quite a few huge issues. Without the www, the site returns a 400 Bad Request.
Top Issues
1,301 pages were not scanned.
772 broken links were found.
Your site is missing an XML Sitemap (a sketch of generating one is below).
WWW vs. non-WWW: Server error.

Your Pages
250 pages scanned of 1,551 found.
Normal Pages: 67
Skipped, Redirect: 10
Errors: 173
Not Scanned: 1,301

Duplicate Content
Duplicate Content: 15%
Common Content: 59%
Unique Content: 25%

The complete PDF report is right here: http://cl.ly/WF4O
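On the missing sitemap: here is a minimal sketch of how you could generate one from your list of listing URLs. To be clear, this is my own illustration, not code from your site; the example URLs and the output path are just placeholders taken from this thread.

```python
# Minimal sketch: build an XML sitemap from a list of listing URLs.
# The URLs and output path below are placeholders for illustration only.
import xml.etree.ElementTree as ET

def build_sitemap(urls, out_path="sitemap.xml"):
    # One <url><loc>...</loc></url> entry per page, under a single <urlset> root.
    urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for loc in urls:
        url_el = ET.SubElement(urlset, "url")
        ET.SubElement(url_el, "loc").text = loc
    ET.ElementTree(urlset).write(out_path, encoding="utf-8", xml_declaration=True)

if __name__ == "__main__":
    example_urls = [
        "http://www.hemjakt.se/bostad/55860-asgatan-15/",
        "http://www.hemjakt.se/bostad/59873-soldatgatan-13-b/",
    ]
    build_sitemap(example_urls)
```

In practice you would feed it every listing URL from your database and reference the file in robots.txt, but the structure above is the whole format.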
http://www.siteliner.com/www.hemjakt.se/omrade/centralt/?siteliner=broken
OSE backlinks http://www.opensiteexplorer.org/links?page=1&site=http%3A%2F%2Fwww.hemjakt.se%2F
After using the Elite edition of Ahrefs (the different editions offer different abilities), here is a summary of the links found for www.hemjakt.se:
You have roughly 2 referring domains total across both tools. I would also run it through MajesticSEO, but I do not see the point.
You need to fix the problems with your site. I have a guess as to how you were ranking, but I do not believe it was natural.
If you had built that many pages on a website and really only gotten about 5 or 6 backlinks out of it, with only 2 of them coming from unique domains/IPs, I would find it impossible to believe the ranking was natural.
You have clearly scraped the content, or somebody else has been doing a lot of scraping and you may not know about it.
Here is a lot of information; I hope this helps,
Thomas
-
I would say that your site goes against the Google guidelines in regards to thin content: https://support.google.com/webmasters/answer/2604719?hl=en
Part of the description of thin content includes "automatically generated content" and "copied content".
Have you checked your Webmaster Tools console to see if you have been given a thin content penalty? Check Search Traffic --> Manual Actions. If there isn't one though, then Panda is probably still the culprit.
-
If your site is only a couple of months old, it's possible that you initially had a honeymoon boost and now you're seeing the traffic you would normally get. But, I think that there is a more serious issue.
According to Google there are over 72,000 pages of your site indexed. How many of these pages contain useful and UNIQUE content that users can't find anywhere else on the web? If you're crawling other people's sites and publishing their content as yours then you're just asking to be demoted by the Panda algorithm. The only thing that doesn't exactly fit with this is the date. As far as we know the last Panda update was May 20. Still, this is my best guess.
"My content is ~90% unique from the source material and I do add user value..."
I would not agree here. In the examples that you gave me, I see that the majority of the actual content (i.e. the text) is 100% copied from the original source. There are several issues with this:
1. Google will see that the vast majority of your site contains content that did not originate with you. This is a sign of low quality and Google will not want to display your site over the originator of that content.
2. Copyright issues. Do you have permission to scrape these sites?
I haven't had a thorough look but from what I've seen, I don't think that this business model will work.
-
Oh yes - I'm sorry about that. So the duplicate content issue still applies, and now I've tested your website - it is quite slow but not as bad as the example I gave. I'd still recommend improving the speed.
Do you link to the website you're scraping the content from? Do you have permission to publish it?
I'm assuming at this stage you don't have any messages from Google in WMT about penalties. Can you pinpoint any onsite changes you made before the traffic drop?
Considering the duplicate content on your site I'm surprised you ranked so well and saw the traffic increases you did, so with your absence of links perhaps the website is at its natural position at the moment. I have no experience of real estate websites or the Swedish market and competition within it though.
You could consider adding some unique content to the homepage and other location pages - something useful for users. If I was moving to a new city I'd want to know about the different neighbourhoods and which area suited me - which areas are quiet, near to parks? Which have good (public) transport links, might be noisy, have amenities nearby? These could be expanded into really useful standalone location guides, with pictures, maps and more info added.
-
Not sure what you are talking about. There are no links in the footer; there is no footer at all. On category pages there is infinite scroll, and on normal pages there is no footer either.
On a normal house page there are only about ~55 <a> tags.
-
Hi Alex thanks for your response.
That is not my website you are linking to, that is the source website that I crawl the content from in the example above.
The domains are very similar so easy to mix up.
Hemjakt.se is my site.
Hemnet.se is the source that I crawled.
So in this case, if you want to do a Pingdom Speed Test on that page, you should do it on the following URL: http://www.hemjakt.se/bostad/55860-asgatan-15/
But to answer the rest of your question:
- No, there have been no significant speed changes to the website according to the Google Analytics page speed report. The last bump, when load times increased temporarily, was 2 weeks ago and lasted only 1 day.
- The site is hosted on Amazon with a CDN for all images. The site is not perfect, but a lot of time has been spent on making sure that the load times are decent and that the site gets an OK score in Google PageSpeed Insights.
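For what it's worth, a rough scripted check of response times on a few pages looks something like the sketch below. This is only an illustration of the kind of sanity check I mean, not my actual monitoring setup; requests' elapsed time only approximates time to first byte, so Pingdom and PageSpeed Insights remain the real measurements.

```python
# Rough sketch: check status codes and response times for a few pages.
# requests' elapsed time roughly corresponds to time-to-first-byte.
import requests

urls = [
    "http://www.hemjakt.se/",
    "http://www.hemjakt.se/bostad/55860-asgatan-15/",
]
for url in urls:
    r = requests.get(url, timeout=30)
    print(f"{url} -> {r.status_code}, {r.elapsed.total_seconds():.2f}s")
```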
Here is the Overview page from my Google Analytics to display my Bounce Rate and so on.
Regarding the bounce rate: it was ~66% when I had 0 goals set for the site. When I set up goals such as completing forms, viewing the gallery of each house etc., the bounce rate dropped to ~38%. That is the reason for the bounce rate drop.
-
Footer links are not usually an issue unless you're using them for keyword spamming e.g. you have a load for different locations: "buy house london", "buy house manchester", "buy house liverpool" etc. You won't get a penalty for having over 100 links on a page so that's unlikely to be the issue either.
I've had a quick look at your website and there are two major issues I picked up on:
- Duplicate content: you say you crawl websites and your content is "90% unique from the source material". The content would be better if 90% of it was unique to your website only. I searched Google for a block of text from one of the links you provided and 5 websites appeared (see the overlap sketch at the end of this post). Google doesn't like duplicate content; maybe your website does provide value to users, but how can Google's algorithm know this if all it sees is duplicate content? Perhaps your site had poor user metrics (e.g. people pressing back to return to the search results page after a short time on your website), so Google demoted it. Can you see any evidence of this in Google Analytics?
- I tested http://www.hemnet.se/bostad/villa-12rum-alsike-knivsta-kommun-asgatan-15-6200964 on Pingdom Speed Test and got "Your website is slower than 92% of all tested websites". Has a recent change caused that slowdown? Is the site registered with Google Webmaster Tools? If so, can you see a change in page speed, crawl rate etc.? Google Analytics can also report on page speed. That's very slow, and if it's a recurring problem it's the first thing I'd fix.
While I was browsing this happened to me twice "Just nu har vi tekniska problem".
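To put a rough number on how much of a page's text overlaps its source, a quick sketch like the one below can help. This is just an illustration using word shingles and Jaccard similarity; it is not how Google measures duplicate content, and the sample strings are placeholders.

```python
# Rough sketch: estimate textual overlap between a page and its source using
# word 5-gram shingles and Jaccard similarity. Illustration only; this is not
# how Google's duplicate-content detection works.
import re

def shingles(text, n=5):
    words = re.findall(r"\w+", text.lower())
    return {tuple(words[i:i + n]) for i in range(max(len(words) - n + 1, 1))}

def jaccard(a, b):
    if not a or not b:
        return 0.0
    return len(a & b) / len(a | b)

source_text = "This is an amazing house with 12 rooms and a great view ..."
my_page_text = "This is an amazing house with 12 rooms and a great view ..."

overlap = jaccard(shingles(source_text), shingles(my_page_text))
print(f"Estimated overlap: {overlap:.0%}")  # close to 100% means copied text
```

A high overlap on the descriptive text is exactly the kind of thing that makes a page look like copied content to an algorithm, regardless of the extra data around it.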
-
Remove some footer links. Too many footer links can hurt any kind of website. Keep to no more than 100 links per page.
-
What? Not sure what you're talking about.
Here is an example of a page on my site:
http://www.hemjakt.se/bostad/59873-soldatgatan-13-b/
-
In my opinion, if you delete the footer links, I am sure you will rank again.
You are putting more and more links in the footer.
Thanks,
Akhilesh
-
What is your suggestion to solve it?
I truly think my application can add a lot of value to the users by giving them a chance to find a lot of data in one spot, instead of having to navigate to a bunch of other sites. Since I also get access to "big data", I can add a lot of value to the users by displaying price history from multiple sources.
I'm not trying to do anything shady or black hat. I'm just trying to create a search engine for real estate. Obviously that will generate a lot of content and many pages in a short amount of time.
What am I doing that is against the Google guidelines? I doubt that "adding pages too fast" is by itself a reason for a penalty; at most it is suspicious activity that would indicate a closer look by Google is needed to see if any spam is going on.
-
I mean that your website has no inbound links. I think you created too many pages in a very short time, and that is the reason behind this.
-
I have done no linkbuilding for this site. I have only focused on building a strong crawler and with ~60'000 pages/houses indexed I have been able to rank well on long queries with low competition, and thereby getting good traffic without any linkbuilding.
Moz reports 0 links, and from my Google Analytics report I have only been getting some referral traffic from Reddit, from a post that has now been deleted. All other traffic is organic search traffic. So I highly doubt that there are any "bad links" causing this punishment.
Regarding the crawler. I crawl other websites for real estate objects. I cache the HTML and I then filter out specific information such as price, number of rooms, address and so on.
I then publish this data with my own structure. This makes the data 87-90% unique if compared to the Source site. You can see example in the first post of this topic.
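For context, the pipeline is roughly what the sketch below shows: fetch the page, cache the raw HTML, then filter out specific fields. The CSS selectors and field names here are invented for illustration; they are not the source site's actual markup or my production code.

```python
# Simplified sketch of the crawl-and-extract pipeline described above.
# The CSS selectors below are hypothetical placeholders, not real markup.
import hashlib
import os

import requests
from bs4 import BeautifulSoup

CACHE_DIR = "html_cache"

def fetch_cached(url):
    """Download a page, caching the raw HTML on disk keyed by a hash of the URL."""
    os.makedirs(CACHE_DIR, exist_ok=True)
    path = os.path.join(CACHE_DIR, hashlib.sha1(url.encode()).hexdigest() + ".html")
    if os.path.exists(path):
        with open(path, encoding="utf-8") as f:
            return f.read()
    html = requests.get(url, timeout=30).text
    with open(path, "w", encoding="utf-8") as f:
        f.write(html)
    return html

def extract_listing(html):
    """Filter out specific fields (address, price, rooms) from the cached HTML."""
    soup = BeautifulSoup(html, "html.parser")
    return {
        "address": soup.select_one(".listing-address").get_text(strip=True),
        "price": soup.select_one(".listing-price").get_text(strip=True),
        "rooms": soup.select_one(".listing-rooms").get_text(strip=True),
    }
```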
The 10-13% that is not unique is the description of the real estate object, which is usually a 1-2 paragraph text saying something like "This is an amazing house with 12 rooms with a great view that would be a perfect fit for the large family". This text is copied straight off. Could that be what is giving me the penalty, even though it's only 10% of the page content? What do you think?
-
Hi Marcus Lind,
Can you share what SEO activity you have done for this website, and how many anchors and which anchor texts you have created?
After that we can say clearly what the exact issue is.
-
Please note that I have not done any link building at all to the site, it's only 2 months old. There are no unnatural links, no keyword stuffing and no other black hat seo tactics used. No links have been purchased at all, so Penguin should not be relevant to this site.
I've attached an image of my Google Analytics Overview for the past months, and you can see how it has been naturally steadily growing day by day and then just taking a huge drop the last few days.
-
That's fantastic that you've been keeping track of your Bing queries, and it sounds like you've been hit by a penalty.
I am guessing I can pull link data for you; however, my girlfriend wants to kill me if I don't stop answering questions right now, so look for my answer in about six hours. I will use Ahrefs, Majestic SEO & Moz.
-
Try running your queries in Bing and see how it compares. If you see the terms you ranked for earlier ranking well in Bing, then you know you've been hit by a penalty in Google.
That's what made it clear for a site I'm involved with and have been trying to recover for many months now.
-
I'm very sorry to hear that this has happened to you. Here is some information I hope is helpful.
The page below may say that you have to be a member to see the video, but you do not have to be a member to read the transcript. So read the transcript and use the tool that tries to gauge what happened. It is not 100% accurate, but it might give you a lot more insight into what has actually occurred. You will need Moz's Open Site Explorer and another tool like MajesticSEO.com or Ahrefs.com.
I like Ahrefs, but either one will do. Your first step is to Google "fruition Google penalty tool":
https://fruition.net/google-penalty-checker-tool/
That will bring up a cool tool that shows whether or not you've been hit by Google Penguin, because there have been several updates: http://www.quicksprout.com/university/how-to-identify-and-recover-from-a-google-penguin-penalty/
So you're aware, this link shows a video that you may not be able to see, but because it is fully transcribed you have an exact how-to manual right below it.