80% of traffic lost overnight, Google Penalty?
-
Hi all.
I have a website called Hemjakt (http://www.hemjakt.se/), which is a search engine for real estate, currently only available on the Swedish market. The application crawls real estate websites and collects all estates into a single searchable application. The site has been live for a few months and had seen steady growth since release, increasing by 20% weekly up to ~900 visitors per day.
Three days ago, overnight, I lost 80% of my traffic. Instead of 900 visitors per day I'm at ~100 per day, and when I search for long, specific queries such as "Åsgatan 15, Villa 12 rum i Alsike, Knivsta" (i.e. address, house type, rooms, area, city), I'm now only found on the fifth page.
I suspect that I have become the subject of a Google penalty. How do I get out of this mess?
Just like all search engines or applications, I do crawl other websites and scrape their content. My content is ~90% unique from the source material, and I do add user value by giving users the possibility to compare houses, giving them tons more data to compare pricing and history, giving them extra functionality that the source sites do not offer, and so on. My analytics data show good user engagement.
Here is one example of a Source page and a page at my site:
Source: http://www.hemnet.se/bostad/villa-12rum-alsike-knivsta-kommun-asgatan-15-6200964
My Site: http://www.hemjakt.se/bostad/55860-asgatan-15/
So:
-
How do I actually confirm that this is the reason I lost my traffic? When I search for my branded query, I still get results, and I'm still indexed by Google.
-
If I am penalized: I'm not attempting anything black hat, and I really believe the app gives a lot of value to users. What tweaks or changes to the application do you suggest so that I can continue running the service in a way that Google is fine with?
-
-
You have almost no backlinks (six total), and after running your site through Screaming Frog and Siteliner, it apparently has an incredible amount of duplicate content and a ton of broken links as well.
From scanning only 250 pages total I was able to come up with quite a few huge issues.
Without the www, the site returns a 400 Bad Request.
Top Issues:
1,301 pages were not scanned.
772 broken links were found.
Your site is missing an XML sitemap.
WWW vs. non-WWW: server error.
Your Pages:
250 pages scanned of 1,551 found.
Normal pages: 67
Skipped, redirect: 10
Errors: 173
Not scanned: 1,301
Duplicate Content:
Duplicate content: 15%
Common content: 59%
Unique content: 25%
The complete PDF report is right here: http://cl.ly/WF4O
http://www.siteliner.com/www.hemjakt.se/omrade/centralt/?siteliner=broken
OSE backlinks http://www.opensiteexplorer.org/links?page=1&site=http%3A%2F%2Fwww.hemjakt.se%2F
After using the elite edition of Ahrefs (the different editions offer different capabilities), here is a summary of links found for www.hemjakt.se:
You have roughly 2 referring domains total across both tools. I would run it through MajesticSEO as well, but I do not see the point.
You need to fix the problems with your site. I have a guess as to how you were ranking, but I do not believe it was natural.
If you had built that many pages on a website and really gotten only about 5 or 6 backlinks out of it, with only 2 from distinct domains/IPs linking to your site, I would find that impossible to believe.
You have clearly scraped this content, or somebody else has been doing a lot of scraping that you may not know about.
That's a lot of information; I hope this helps,
Thomas
-
I would say that your site goes against the Google guidelines in regards to thin content: https://support.google.com/webmasters/answer/2604719?hl=en
Part of the description of thin content includes "automatically generated content" and "copied content".
Have you checked your Webmaster Tools console to see if you have been given a thin content penalty? Check Search Traffic --> Manual Actions. If there isn't one though, then Panda is probably still the culprit.
-
If your site is only a couple of months old, it's possible that you initially had a honeymoon boost and now you're seeing the traffic you would normally get. But, I think that there is a more serious issue.
According to Google, over 72,000 pages of your site are indexed. How many of these pages contain useful and UNIQUE content that users can't find anywhere else on the web? If you're crawling other people's sites and publishing their content as yours, then you're just asking to be demoted by the Panda algorithm. The only thing that doesn't exactly fit with this is the date: as far as we know, the last Panda update was May 20. Still, this is my best guess.
"My content is ~90% unique from the source material and I do add user value..."
I would not agree here. In the examples that you gave me, I see that the majority of the actual content (i.e. the text) is 100% copied from the original source. There are several issues with this:
1. Google will see that the vast majority of your site contains content that did not originate with you. This is a sign of low quality and Google will not want to display your site over the originator of that content.
2. Copyright issues. Do you have permission to scrape these sites?
I haven't had a thorough look but from what I've seen, I don't think that this business model will work.
-
Oh yes - I'm sorry about that. So the duplicate content issue still applies, and now I've tested your website - it is quite slow but not as bad as the example I gave. I'd still recommend improving the speed.
Do you link to the website you're scraping the content from? Do you have permission to publish it?
I'm assuming at this stage you don't have any messages from Google in WMT about penalties. Can you pinpoint any onsite changes you made before the traffic drop?
Considering the duplicate content on your site, I'm surprised you ranked so well and saw the traffic increases you did. With your absence of links, perhaps the website is at its natural position at the moment. I have no experience of real estate websites or the Swedish market and its competition, though.
You could consider adding some unique content to the homepage and other location pages - something useful for users. If I was moving to a new city I'd want to know about the different neighbourhoods and which area suited me - which areas are quiet, near to parks? Which have good (public) transport links, might be noisy, have amenities nearby? These could be expanded into really useful standalone location guides, with pictures, maps and more info added.
-
Not sure what you are talking about. There are no links in the footer; there is no footer. Category pages use infinite scroll, and normal pages have no footer at all.
On a normal house page there are only about ~55 <a> tags.
-
Hi Alex thanks for your response.
That is not my website you are linking to, that is the source website that I crawl the content from in the example above.
The domains are very similar, so they're easy to mix up.
Hemjakt.se is my site.
Hemnet.se is the source that I crawled. So in this case, if you want to run a Pingdom speed test on that page, you should do it on the following URL: http://www.hemjakt.se/bostad/55860-asgatan-15/
But to answer the rest of your question:
- No, there have been no significant speed changes to the website according to the Google Analytics site speed report. The last bump, when load times temporarily increased, was 2 weeks ago and lasted only 1 day.
- The site is hosted on Amazon with a CDN for all images. The site is not perfect, but a lot of time has been spent making sure that load times are decent and that the site gets an OK score in Google's PageSpeed Insights test.
Here is the Overview page from my Google Analytics to display my Bounce Rate and so on.
Regarding the bounce rate: it was ~66% when I had no goals set for the site. When I set up goals such as completing forms, viewing the gallery of each house, etc., the bounce rate dropped to ~38%. That is the reason for the bounce rate drop.
-
Footer links are not usually an issue unless you're using them for keyword spamming e.g. you have a load for different locations: "buy house london", "buy house manchester", "buy house liverpool" etc. You won't get a penalty for having over 100 links on a page so that's unlikely to be the issue either.
I've had a quick look at your website and there are two major issues I picked up on:
- Duplicate content - you say you crawl websites and your content is "90% unique from the source material". The content would be better if 90% of it was unique to your website only. I searched Google for a block of text from one of the links you provided and 5 websites appeared. Google doesn't like duplicate content - maybe your website does provide value to users but how can Google's algorithm know this if all it sees is duplicate content? Perhaps your site had poor user metrics (e.g. people pressing back to return to the search results page after a short period of time on your website) so Google demoted it. Can you see any evidence of this in Google Analytics?
- I tested http://www.hemnet.se/bostad/villa-12rum-alsike-knivsta-kommun-asgatan-15-6200964 on the Pingdom Speed Test and got "Your website is slower than 92% of all tested websites". Has a recent change caused that slowdown? Is the site registered with Google Webmaster Tools? If so, can you see a change in page speed, crawl rate, etc.? Google Analytics can also report on page speed. That's very slow, and if it's a recurring problem it's the first thing I'd fix.
While I was browsing, this happened to me twice: "Just nu har vi tekniska problem" ("We are currently having technical problems").
-
Remove some footer links. Too many footer links can hurt any kind of website. Use no more than 100 links per page.
-
What? Not sure what you're talking about.
Here is an example of a page on my site:
http://www.hemjakt.se/bostad/59873-soldatgatan-13-b/
-
From my experience, if you delete the footer links, I am sure you will rank again.
You are putting more and more links in the footer.
Thanks,
Akhilesh
-
What is your suggestion to solve it?
I truly think my application can add a lot of value to users by giving them a chance to find a lot of data in one spot, instead of having to navigate to a bunch of other sites. Since I also get access to "big data", I can add a lot of value to users by displaying price history from multiple sources.
I'm not trying to do anything shady or black hat. I'm just trying to create a search engine for real estate. Obviously that will generate a lot of content and many pages in a short amount of time.
What am I doing that is against the Google guidelines? I doubt that "adding pages too fast" is itself a reason for a penalty; at most it is suspicious activity that would prompt Google to take a closer look to see if any spam is going on.
-
I mean that your website has no inbound links. I think you created too many pages in a very short time, and that's the reason behind this.
-
I have done no linkbuilding for this site. I have only focused on building a strong crawler and with ~60'000 pages/houses indexed I have been able to rank well on long queries with low competition, and thereby getting good traffic without any linkbuilding.
Moz reports 0 links, and from my Google Analytics report I have only been getting some referral traffic from Reddit, from a post that has now been deleted. All other traffic is organic search traffic. So I highly doubt that there are any "bad links" causing this punishment.
Regarding the crawler: I crawl other websites for real estate objects. I cache the HTML and then filter out specific information such as price, number of rooms, address and so on.
I then publish this data with my own structure. This makes the data 87-90% unique compared to the source site. You can see an example in the first post of this topic.
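To make that concrete, here is a minimal sketch of the cache-then-extract step described above. The field patterns and the sample markup are hypothetical illustrations of the idea, not Hemnet's actual HTML, and a production crawler would use a real HTML parser rather than regular expressions:

```python
import re

# Hypothetical field patterns for illustration only; these do not
# reflect the source site's real markup.
FIELD_PATTERNS = {
    "address": r'class="address">([^<]+)<',
    "price":   r'class="price">([^<]+)<',
    "rooms":   r'class="rooms">([^<]+)<',
}

def extract_listing(cached_html):
    """Pull structured fields out of a cached listing page."""
    listing = {}
    for field, pattern in FIELD_PATTERNS.items():
        match = re.search(pattern, cached_html)
        listing[field] = match.group(1).strip() if match else None
    return listing

# Example run against a made-up cached page.
cached = ('<div class="address">Åsgatan 15</div>'
          '<div class="price">4 500 000 kr</div>'
          '<div class="rooms">12 rum</div>')
print(extract_listing(cached))
```

The extracted dictionary can then be rendered with the site's own templates, which is what makes the page structure (though not the description text) differ from the source.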
The 10-13% that is not unique is the description of the real estate object, which is usually a 1-2 paragraph text saying something like "This is an amazing house with 12 rooms with a great view that would be a perfect fit for the large family". This text is copied verbatim. Might that be what's triggering the penalty, even though it's only ~10% of the page content? What do you think?
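One way to sanity-check a figure like "87-90% unique" is to measure word-shingle overlap between a page's text and its source. This is only an illustration of the idea with made-up sample text; it is not how Google measures duplication:

```python
def shingles(text, n=3):
    """Set of overlapping n-word shingles (word n-grams) in the text."""
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def overlap_ratio(page, source, n=3):
    """Fraction of the page's shingles that also appear in the source."""
    page_sh, source_sh = shingles(page, n), shingles(source, n)
    if not page_sh:
        return 0.0
    return len(page_sh & source_sh) / len(page_sh)

# Hypothetical example: the page wraps the source's description text
# in its own structure.
source = "This is an amazing house with 12 rooms with a great view"
page = ("Villa for sale at Asgatan 15. "
        "This is an amazing house with 12 rooms with a great view")
print(f"{overlap_ratio(page, source):.1%} of the page's shingles come from the source")
```

Note that this measures overlap of the visible text only; boilerplate from the page template would dilute the ratio, which may be why a page looks "90% unique" overall while its actual description is 100% copied.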
-
Hi Marcus Lind,
Can you share what SEO activity you have done for this website, and how many anchors and which anchor texts you created?
After that we can say clearly what the exact issue is.
-
Please note that I have not done any link building at all for the site; it's only 2 months old. There are no unnatural links, no keyword stuffing and no other black hat SEO tactics used. No links have been purchased at all, so Penguin should not be relevant to this site.
I've attached an image of my Google Analytics Overview for the past months, and you can see how it has been naturally steadily growing day by day and then just taking a huge drop the last few days.
-
It's fantastic that you've been keeping track of your Bing queries, and it does sound like you've been hit by a Google penalty.
I can pull link data for you, but my girlfriend will kill me if I don't stop answering questions right now, so look for my answer in about six hours. I use
Ahrefs, Majestic SEO & Moz.
-
Try running your queries in Bing and see how it compares. If you see the terms you ranked for earlier ranking well in Bing, then you know you've been hit by a penalty in Google.
That's what made it clear for a site I'm involved with and have been trying to recover for many months now.
-
I'm very sorry to hear that this has happened to you. Here is some information I hope is helpful.
The URL below may say that you have to be a member to see the video, but you do not have to be a member to read the transcript, so read the transcript instead. Then use the tool that tries to gauge which update hit you. It's not 100% accurate, but it might give you a lot more insight into what actually occurred. You'll want Moz's Open Site Explorer plus another tool like MajesticSEO.com or Ahrefs.com.
I like Ahrefs, but either one will do. Your first step is to Google "fruition Google penalty tool":
https://fruition.net/google-penalty-checker-tool/
That will bring up a tool that shows whether or not you've been hit by Google Penguin, because there are several updates: http://www.quicksprout.com/university/how-to-identify-and-recover-from-a-google-penguin-penalty/
So you're aware, that link shows a video that you may not be able to see unless you're a member, but because it is fully transcribed you have an exact how-to manual right below it.