80% of traffic lost overnight, Google penalty?
-
Hi all.
I have a website called Hemjakt (http://www.hemjakt.se/), which is a search engine for real estate, currently only available on the Swedish market. The application crawls real estate websites and collects all listings into a single searchable application. The site was released a few months ago and has seen steady growth since, increasing by 20% weekly up to ~900 visitors per day.
3 days ago, overnight, I lost 80% of my traffic. Instead of 900 visitors per day I'm at ~100 visitors per day, and when I search for long, specific queries such as "Åsgatan 15, Villa 12 rum i Alsike, Knivsta" (<address> <house type> <rooms> <area> <city>), I'm now only found on the fifth page.
I suspect that I have become the subject of a Google penalty. How do I get out of this mess?
Just like all search engines or similar applications, I do crawl other websites and scrape their content. My content is ~90% unique from the source material, and I do add user value by giving users the ability to compare houses, access far more data on pricing and price history, and use extra functionality that the source sites do not offer. My analytics data show good user engagement.
Here is one example of a Source page and a page at my site:
Source: http://www.hemnet.se/bostad/villa-12rum-alsike-knivsta-kommun-asgatan-15-6200964
My Site: http://www.hemjakt.se/bostad/55860-asgatan-15/
So:
-
How do I actually confirm that this is the reason I lost my traffic? When I search for my branded query, I still get results. Also, I'm still indexed by Google.
-
If I am penalized: I'm not attempting to do anything black hat, and I really believe that the app gives a lot of value to users. What tweaks or changes to the application do you suggest so that I can continue running the service in a way that Google is fine with?
-
-
Because you have essentially no backlinks (six total), and after running your site through Screaming Frog and Siteliner it apparently has an incredible amount of duplicate content, and a ton of broken links as well.
From scanning only 250 pages total I was able to come up with quite a few huge issues. Without the www, the site returns a 400 Bad Request.
Top Issues
- 1,301 pages were not scanned.
- 772 broken links were found.
- Your site is missing an XML Sitemap.
- WWW vs. non-WWW: server error.

Your Pages
- 250 pages scanned of 1,551 found.
- Normal pages: 67
- Skipped (redirect): 10
- Errors: 173
- Not scanned: 1,301

Duplicate Content
- Duplicate content: 15%
- Common content: 59%
- Unique content: 25%

The complete PDF report is here: http://cl.ly/WF4O
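One of the flagged issues, the missing XML sitemap, is straightforward to fix. Below is a minimal sketch of generating one with Python's standard library; the listing URLs are placeholders for illustration, not a real crawl of the site:

```python
# Minimal XML sitemap generator using only the standard library.
# The page URLs below are placeholders, not actual Hemjakt pages.
from xml.etree.ElementTree import Element, SubElement, tostring

def build_sitemap(urls):
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = Element("urlset", xmlns=ns)
    for url in urls:
        entry = SubElement(urlset, "url")
        loc = SubElement(entry, "loc")
        loc.text = url
    return ('<?xml version="1.0" encoding="UTF-8"?>\n'
            + tostring(urlset, encoding="unicode"))

pages = [
    "http://www.hemjakt.se/",
    "http://www.hemjakt.se/bostad/55860-asgatan-15/",
]
xml = build_sitemap(pages)
print(xml)
```

In practice you would feed in the full list of indexable pages and submit the file in Webmaster Tools.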
http://www.siteliner.com/www.hemjakt.se/omrade/centralt/?siteliner=broken
OSE backlinks http://www.opensiteexplorer.org/links?page=1&site=http%3A%2F%2Fwww.hemjakt.se%2F
After using the elite edition of Ahrefs (the different editions offer different capabilities), here is a summary of links found for www.hemjakt.se:
You have roughly 2 referring domains total according to both tools. I would run it through MajesticSEO as well, but I do not see the point.
You need to fix the problems with your site. I have a guess as to how you were ranking, but I do not believe it was natural.
If you had built that many pages on a website and really gotten only about 5 or 6 backlinks out of it, with only 2 distinct domains/IPs linking to your site, I would find it impossible to believe those rankings came about naturally.
You have clearly scraped content, or somebody else has been doing a lot of scraping that you may not know about.
That is a lot of information; I hope it helps,
Thomas
-
I would say that your site goes against the Google guidelines in regards to thin content: https://support.google.com/webmasters/answer/2604719?hl=en
Part of the description of thin content includes "automatically generated content" and "copied content".
Have you checked your Webmaster Tools console to see if you have been given a thin content penalty? Check Search Traffic --> Manual Actions. If there isn't one though, then Panda is probably still the culprit.
-
If your site is only a couple of months old, it's possible that you initially had a honeymoon boost and now you're seeing the traffic you would normally get. But, I think that there is a more serious issue.
According to Google there are over 72,000 pages of your site indexed. How many of these pages contain useful and UNIQUE content that users can't find anywhere else on the web? If you're crawling other people's sites and publishing their content as yours then you're just asking to be demoted by the Panda algorithm. The only thing that doesn't exactly fit with this is the date. As far as we know the last Panda update was May 20. Still, this is my best guess.
"My content is ~90% unique from the source material and I do add user value..."
I would not agree here. In the examples that you gave me, I see that the majority of the actual content (i.e. the text) is 100% copied from the original source. There are several issues with this:
1. Google will see that the vast majority of your site contains content that did not originate with you. This is a sign of low quality and Google will not want to display your site over the originator of that content.
2. Copyright issues. Do you have permission to scrape these sites?
I haven't had a thorough look but from what I've seen, I don't think that this business model will work.
-
Oh yes - I'm sorry about that. So the duplicate content issue still applies, and now I've tested your website - it is quite slow but not as bad as the example I gave. I'd still recommend improving the speed.
Do you link to the website you're scraping the content from? Do you have permission to publish it?
I'm assuming at this stage you don't have any messages from Google in WMT about penalties. Can you pinpoint any onsite changes you made before the traffic drop?
Considering the duplicate content on your site I'm surprised you ranked so well and saw the traffic increases you did - so with your absence of links, perhaps the website is at its natural position at the moment. I have no experience of real estate websites or of the Swedish market and competition within it, though.
You could consider adding some unique content to the homepage and other location pages - something useful for users. If I was moving to a new city I'd want to know about the different neighbourhoods and which area suited me - which areas are quiet, near to parks? Which have good (public) transport links, might be noisy, have amenities nearby? These could be expanded into really useful standalone location guides, with pictures, maps and more info added.
-
Not sure what you are talking about. There are no links in the footer; there is no footer. Category pages use infinite scroll, and normal pages have no footer at all.
On a normal house page there are only about ~55 <a> tags.
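For what it's worth, the anchor count is easy to verify. Here is a quick sketch using Python's built-in html.parser to count <a> tags; the HTML snippet is an invented stand-in, not the real page source:

```python
# Count <a> tags in an HTML document with the stdlib parser.
from html.parser import HTMLParser

class AnchorCounter(HTMLParser):
    def __init__(self):
        super().__init__()
        self.count = 0

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self.count += 1

# Stand-in HTML; in practice, feed in the fetched page source.
html_doc = """
<html><body>
  <a href="/bostad/1/">Listing 1</a>
  <a href="/bostad/2/">Listing 2</a>
  <p>No link here</p>
  <a href="/omrade/centralt/">Area page</a>
</body></html>
"""

counter = AnchorCounter()
counter.feed(html_doc)
print(counter.count)
```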
-
Hi Alex thanks for your response.
That is not my website you are linking to, that is the source website that I crawl the content from in the example above.
The domains are very similar, so it's easy to mix them up.
Hemjakt.se is my site.
Hemnet.se is the source that I crawled.
So in this case, if you want to run a Pingdom speed test on that page, you should do it on the following URL: http://www.hemjakt.se/bostad/55860-asgatan-15/
But to answer the rest of your question:
- No, there have been no significant speed changes to the website according to the Google Analytics site speed report. The last bump, when load times increased temporarily, was 2 weeks ago and lasted only 1 day.
- The site is hosted on Amazon with a CDN for all images. The site is not perfect, but a lot of time has been spent on making sure that the load times are decent and that the site gets an OK score in Google's PageSpeed Insights test.
Here is the Overview page from my Google Analytics to display my Bounce Rate and so on.
Regarding the bounce rate: it was ~66% when I had zero goals set for the site. When I set up goals such as completing forms and viewing the gallery of each house, the bounce rate dropped to ~38%. That is the reason for the drop.
-
Footer links are not usually an issue unless you're using them for keyword spamming e.g. you have a load for different locations: "buy house london", "buy house manchester", "buy house liverpool" etc. You won't get a penalty for having over 100 links on a page so that's unlikely to be the issue either.
I've had a quick look at your website and there are two major issues I picked up on:
- Duplicate content - you say you crawl websites and your content is "90% unique from the source material". The content would be better if 90% of it was unique to your website only. I searched Google for a block of text from one of the links you provided and 5 websites appeared. Google doesn't like duplicate content - maybe your website does provide value to users but how can Google's algorithm know this if all it sees is duplicate content? Perhaps your site had poor user metrics (e.g. people pressing back to return to the search results page after a short period of time on your website) so Google demoted it. Can you see any evidence of this in Google Analytics?
- I tested http://www.hemnet.se/bostad/villa-12rum-alsike-knivsta-kommun-asgatan-15-6200964 on Pingdom Speed Test and got "Your website is slower than 92% of all tested websites" - has a recent change caused that slowdown? Is the site registered with Google Webmaster Tools? If so, can you see a change in page speed, crawl rate etc.? Google Analytics can also report on page speed. That's very slow, and if it's a recurring problem it's the first thing I'd fix.
While I was browsing, this happened to me twice: "Just nu har vi tekniska problem" ("We are currently having technical problems").
-
Remove some footer links. Too many footer links can hurt any kind of website. No more than 100 links per page.
-
What? Not sure what you're talking about.
Here is an example of a page on my site:
http://www.hemjakt.se/bostad/59873-soldatgatan-13-b/
-
As far as I know, if you delete the footer links then I am sure you will rank again.
You are putting more and more links in the footer.
Thanks,
Akhilesh
-
What is your suggestion to solve it?
I truly think my application can add a lot of value to users by giving them a chance to find a lot of data in one place, instead of having to navigate to a bunch of other sites. Since I also get access to "big data", I can add a lot of value by displaying price history from multiple sources.
I'm not trying to do anything shady or black hat. I'm just trying to create a search engine for real estate. Obviously that will generate a lot of content and many pages in a short amount of time.
What am I doing that is against the Google guidelines? I doubt that "adding pages too fast" is by itself a reason for a penalty; it's more likely a signal of suspicious activity that would prompt a closer look by Google to see if any spam is going on.
-
I mean your website has no inbound links. I think you created too many pages in a very short time; that's the reason behind this.
-
I have done no link building for this site. I have only focused on building a strong crawler, and with ~60,000 pages/houses indexed I have been able to rank well on long queries with low competition, thereby getting good traffic without any link building.
Moz reports 0 links, and my Google Analytics report shows I have only been getting some referral traffic from Reddit, from a post that has now been deleted. All other traffic is organic search traffic. So I highly doubt that there are any "bad links" causing this penalty.
Regarding the crawler. I crawl other websites for real estate objects. I cache the HTML and I then filter out specific information such as price, number of rooms, address and so on.
I then publish this data with my own structure. This makes the data 87-90% unique compared to the source site. You can see an example in the first post of this topic.
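The filtering step described above can be sketched roughly as follows; the HTML structure, class names, and values here are invented for illustration, not taken from the actual source site:

```python
# Extract structured fields (address, price, rooms) from cached listing HTML.
# The markup and class names below are hypothetical; a real crawler would
# match the source site's actual structure.
import re

cached_html = """
<div class="listing">
  <h1>Åsgatan 15, Alsike, Knivsta</h1>
  <span class="price">4 500 000 kr</span>
  <span class="rooms">12 rum</span>
</div>
"""

def extract_listing(html):
    def first(pattern):
        m = re.search(pattern, html, re.S)
        return m.group(1).strip() if m else None

    return {
        "address": first(r"<h1>(.*?)</h1>"),
        "price": first(r'class="price">(.*?)</span>'),
        "rooms": first(r'class="rooms">(.*?)</span>'),
    }

listing = extract_listing(cached_html)
print(listing)
```

The extracted fields can then be rendered into your own page templates, independent of the source layout.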
The 10-13% that is not unique is the description of the real estate object, which is usually a 1-2 paragraph text saying something like "This is an amazing house with 12 rooms with a great view that would be a perfect fit for the large family". This text is copied verbatim, which might be what's triggering the penalty, even though it's only ~10% of the page content. What do you think?
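One rough way to sanity-check that "87-90% unique" figure is to measure text overlap between a page and its source directly. A quick sketch with Python's difflib; both strings here are invented stand-ins, and a real comparison would use the visible text of both pages:

```python
# Estimate how much of a page's text is shared with its source page.
from difflib import SequenceMatcher

# Invented stand-in for the scraped listing description.
source_text = (
    "This is an amazing house with 12 rooms with a great view "
    "that would be a perfect fit for the large family."
)
# Invented stand-in for the republished page: own structure plus the
# verbatim description.
my_page_text = (
    "Asgatan 15, Alsike, Knivsta. 12 rooms, 340 m2. "
    "Price history and comparisons below. " + source_text
)

ratio = SequenceMatcher(None, source_text, my_page_text).ratio()
print(f"shared text ratio: {ratio:.2f}")
```

A ratio near 1.0 means the two texts are essentially identical; comparing only the description block against the source would show the verbatim copy directly.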
-
Hi Marcus Lind,
Can you share what SEO activity you have done for this website, and how many anchors and which anchor texts you created?
After that we can say clearly what the exact issue is.
-
Please note that I have not done any link building at all to the site; it's only 2 months old. There are no unnatural links, no keyword stuffing and no other black hat SEO tactics used. No links have been purchased at all, so Penguin should not be relevant to this site.
I've attached an image of my Google Analytics Overview for the past months, and you can see how it has been naturally steadily growing day by day and then just taking a huge drop the last few days.
-
That's fantastic that you've been keeping track of your Bing queries, and it sounds like you've been hit by a penalty. I am guessing I can pull link data for you; however, my girlfriend wants to kill me if I don't stop answering questions right now, so look for my answer in about six hours. I use Ahrefs, Majestic SEO & Moz.
-
Try running your queries in Bing and see how it compares. If you see the terms you ranked for earlier ranking well in Bing, then you know you've been hit by a penalty in Google.
That's what made it clear for a site I'm involved with and have been trying to recover for many months now.
-
I'm very sorry to hear that this has happened to you. Here is some information that I hope is helpful.
The link below may say that you have to be a member to see the video, but you do not have to be a member to read the transcript. So read the transcript and use the tool; it is not 100% accurate, but it might give you a lot more insight into what has actually occurred. You need Moz's Open Site Explorer and another tool like MajesticSEO.com or Ahrefs.com.
I like Ahrefs, however either one will do.
Your first step is to Google "fruition Google penalty tool":
https://fruition.net/google-penalty-checker-tool/
That will bring up a cool tool that shows whether or not you've been hit by Google Penguin, since there have been several updates: http://www.quicksprout.com/university/how-to-identify-and-recover-from-a-google-penguin-penalty/
So you're aware, this link shows a video that you may not be able to see, but because it is fully transcribed you have an exact how-to manual right below it.