80% of traffic lost overnight, Google Penalty?
-
Hi all.
I have a website called Hemjakt (http://www.hemjakt.se/), which is a search engine for real estate, currently only available on the Swedish market. The application crawls real estate websites and collects all listings into a single searchable application. The site has been live for a few months and has seen steady growth since release, increasing by 20% weekly up to ~900 visitors per day.
Three days ago, overnight, I lost 80% of my traffic. Instead of 900 visitors per day I'm at ~100 visitors per day, and when I search for long, specific queries such as "Åsgatan 15, Villa 12 rum i Alsike, Knivsta" (<address> <house type> <rooms> <area> <city>), I'm now only found on the fifth page.
I suspect that I have become the subject of a Google penalty. How do I get out of this mess?
Just like all search engines or applications, I do crawl other websites and scrape their content. My content is ~90% unique from the source material, and I add user value by giving users the ability to compare houses, a ton more data to compare pricing and history, extra functionality that the source sites do not offer, and so on. My analytics data show good user engagement.
Here is one example of a Source page and a page at my site:
Source: http://www.hemnet.se/bostad/villa-12rum-alsike-knivsta-kommun-asgatan-15-6200964
My Site: http://www.hemjakt.se/bostad/55860-asgatan-15/
So:
-
How do I actually confirm that this is the reason I lost my traffic? When I search for my branded query, I still get a result, and I'm still indexed by Google.
-
If I am penalized: I'm not attempting to do anything black hat, and I really believe that the app gives a lot of value to users. What tweaks or changes to the application would you suggest so that I can continue running the service in a way that Google is fine with?
-
-
Because you have essentially no backlinks (you have six total), and because, after running your site through Screaming Frog and Siteliner, it apparently has an incredible amount of duplicate content and a ton of broken links as well.
From scanning only 250 pages total I was able to come up with quite a few huge issues.
Without the www., the site returns a 400 Bad Request.
Top Issues
1,301 pages were not scanned.
772 broken links were found. (See the sketch further down for a quick way to re-check these yourself.)
Your site is missing an XML Sitemap.
WWW vs. non-WWW: Server error.

Your Pages
250 pages scanned of 1,551 found.
Normal Pages: 67
Skipped, Redirect: 10
Errors: 173
Not Scanned: 1,301

Duplicate Content
Duplicate Content: 15%
Common Content: 59%
Unique Content: 25%

The complete PDF report is here: http://cl.ly/WF4O
http://www.siteliner.com/www.hemjakt.se/omrade/centralt/?siteliner=broken
OSE backlinks http://www.opensiteexplorer.org/links?page=1&site=http%3A%2F%2Fwww.hemjakt.se%2F
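Since the report flags 772 broken links, here is a minimal sketch of how you could re-check them yourself rather than relying only on Siteliner. This is an assumption-heavy example: it assumes Python with the `requests` and `beautifulsoup4` packages installed, and the starting URL and crawl limit are only illustrative.

```python
# Minimal broken-link checker sketch (assumes `pip install requests beautifulsoup4`).
import requests
from bs4 import BeautifulSoup
from urllib.parse import urljoin, urlparse
from collections import deque

START_URL = "http://www.hemjakt.se/"  # illustrative starting point
MAX_PAGES = 250                       # keep the crawl small, like the Siteliner scan

def check_broken_links(start_url, max_pages):
    domain = urlparse(start_url).netloc
    seen, queue, broken = {start_url}, deque([start_url]), []
    pages_crawled = 0

    while queue and pages_crawled < max_pages:
        url = queue.popleft()
        try:
            resp = requests.get(url, timeout=10)
        except requests.RequestException as exc:
            broken.append((url, str(exc)))
            continue
        if resp.status_code >= 400:
            broken.append((url, resp.status_code))
            continue
        pages_crawled += 1

        # Only parse and follow links that stay on the same domain.
        soup = BeautifulSoup(resp.text, "html.parser")
        for a in soup.find_all("a", href=True):
            link = urljoin(url, a["href"]).split("#")[0]
            if urlparse(link).netloc == domain and link not in seen:
                seen.add(link)
                queue.append(link)
    return broken

if __name__ == "__main__":
    for url, status in check_broken_links(START_URL, MAX_PAGES):
        print(status, url)
```

Running something like this regularly, and fixing or removing what it reports, would deal with one of the report's biggest findings.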
After using the Elite edition of Ahrefs (the different editions offer different capabilities), here is a summary of the links found for www.hemjakt.se:
You have roughly 2 referring domains in total across both tools. I would run it through MajesticSEO as well, but I do not see the point.
You need to fix the problems with your site. I have a guess as to how you were ranking, but I do not believe it was natural.
If you had built that many pages on a website and really gotten only about 5 or 6 backlinks out of it, with only 2 referring domains/IPs linking to your site in total, I would find it hard to believe the rankings were earned.
You have clearly scraped the content, or somebody else has been doing a lot of scraping that you may not know about.
That is a lot of information; I hope this helps,
Thomas
-
I would say that your site goes against the Google guidelines in regards to thin content: https://support.google.com/webmasters/answer/2604719?hl=en
Part of the description of thin content includes "automatically generated content" and "copied content".
Have you checked your Webmaster Tools console to see if you have been given a thin content penalty? Check Search Traffic --> Manual Actions. If there isn't a manual action, though, then the Panda algorithm is probably still the culprit.
-
If your site is only a couple of months old, it's possible that you initially had a honeymoon boost and now you're seeing the traffic you would normally get. But, I think that there is a more serious issue.
According to Google there are over 72,000 pages of your site indexed. How many of these pages contain useful and UNIQUE content that users can't find anywhere else on the web? If you're crawling other people's sites and publishing their content as yours then you're just asking to be demoted by the Panda algorithm. The only thing that doesn't exactly fit with this is the date. As far as we know the last Panda update was May 20. Still, this is my best guess.
"My content is ~90% unique from the source material and I do add user value..."
I would not agree here. In the examples that you gave me, I see that the majority of the actual content (i.e. the text) is 100% copied from the original source. There are several issues with this:
1. Google will see that the vast majority of your site contains content that did not originate with you. This is a sign of low quality and Google will not want to display your site over the originator of that content.
2. Copyright issues. Do you have permission to scrape these sites?
I haven't had a thorough look but from what I've seen, I don't think that this business model will work.
-
Oh yes - I'm sorry about that. So the duplicate content issue still applies, and now I've tested your website - it is quite slow but not as bad as the example I gave. I'd still recommend improving the speed.
Do you link to the website you're scraping the content from? Do you have permission to publish it?
I'm assuming at this stage you don't have any messages from Google in WMT about penalties. Can you pinpoint any onsite changes you made before the traffic drop?
Considering the duplicate content on your site I'm surprised you ranked so well and saw the traffic increases you did - so with your absence of links perhaps the website is at its natural position at the moment. I have no experience of real estate websites or the Swedish market and competition within it though.
You could consider adding some unique content to the homepage and other location pages - something useful for users. If I was moving to a new city I'd want to know about the different neighbourhoods and which area suited me - which areas are quiet, near to parks? Which have good (public) transport links, might be noisy, have amenities nearby? These could be expanded into really useful standalone location guides, with pictures, maps and more info added.
-
Not sure what you are talking about. There are no links in the footer; there is no footer. On category pages there is infinite scroll, and on normal pages there is no footer at all.
On a normal house page there are only about ~55 <a> tags.
-
Hi Alex, thanks for your response.
That is not my website you are linking to, that is the source website that I crawl the content from in the example above.
The domains are very similar, so it's easy to mix them up.
Hemjakt.se is my site.
Hemnet.se is the source that I crawled.
So in this case, if you want to do a Pingdom Speed Test on that page, you should do it on the following URL: http://www.hemjakt.se/bostad/55860-asgatan-15/
But to answer the rest of your question:
- No, there have been no significant speed changes to the website according to the Google Analytics Site Speed report. The last bump, when load times increased temporarily, was 2 weeks ago and lasted only 1 day.
- The site is hosted on Amazon with a CDN for all images. The site is not perfect, but a lot of time has been spent on making sure that the load times are decent and that the site gets an OK score in Google PageSpeed Insights.
Here is the Overview page from my Google Analytics to display my Bounce Rate and so on.
Regarding the bounce rate: it was ~66% when I had 0 goals set up for the site. When I set up goals such as completing forms, viewing the gallery of each house, etc., the bounce rate dropped to ~38%. That is the reason for the bounce rate drop.
-
Footer links are not usually an issue unless you're using them for keyword spamming, e.g. you have a load of links for different locations: "buy house london", "buy house manchester", "buy house liverpool" etc. You won't get a penalty for having over 100 links on a page, so that's unlikely to be the issue either.
I've had a quick look at your website and there are two major issues I picked up on:
- Duplicate content - you say you crawl websites and your content is "90% unique from the source material". The content would be better if 90% of it was unique to your website only. I searched Google for a block of text from one of the links you provided and 5 websites appeared. Google doesn't like duplicate content - maybe your website does provide value to users, but how can Google's algorithm know this if all it sees is duplicate content? (A rough way to quantify that overlap yourself is shown in the sketch at the end of this post.) Perhaps your site had poor user metrics (e.g. people pressing back to return to the search results page after a short period of time on your website) so Google demoted it. Can you see any evidence of this in Google Analytics?
- I tested http://www.hemnet.se/bostad/villa-12rum-alsike-knivsta-kommun-asgatan-15-6200964 on the Pingdom Speed Test and got "Your website is slower than 92% of all tested websites" - has a recent change caused that slowdown? Is the site registered with Google Webmaster Tools? If so, can you see a change in page speed, crawl rate etc.? Google Analytics can also report on page speed. That's very slow, and if it's a recurring problem it's the first thing I'd fix.
While I was browsing, this happened to me twice: "Just nu har vi tekniska problem" ("Right now we are having technical problems").
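For what it's worth, here is a minimal sketch of how you could quantify how much of a page's visible text is shared with its source page. It assumes Python with `requests` and `beautifulsoup4`; the two URLs are just the example pair from this thread, and the resulting ratio is only a rough proxy, not what Google's algorithm actually computes.

```python
# Rough duplicate-content check: compares the visible text of two pages.
# Assumes `pip install requests beautifulsoup4`; URLs are the example pair from this thread.
import difflib
import requests
from bs4 import BeautifulSoup

SOURCE_URL = "http://www.hemnet.se/bostad/villa-12rum-alsike-knivsta-kommun-asgatan-15-6200964"
SCRAPED_URL = "http://www.hemjakt.se/bostad/55860-asgatan-15/"

def visible_text(url):
    """Fetch a page and return its visible text, roughly normalised."""
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    for tag in soup(["script", "style"]):
        tag.decompose()  # drop script/style blocks that are not visible text
    return " ".join(soup.get_text(separator=" ").split())

def similarity(url_a, url_b):
    """Return a 0..1 ratio of how similar the two pages' text content is."""
    return difflib.SequenceMatcher(
        None, visible_text(url_a), visible_text(url_b)
    ).ratio()

if __name__ == "__main__":
    print(f"Text similarity: {similarity(SOURCE_URL, SCRAPED_URL):.0%}")
```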
-
Remove some footer links. Too many footer links can affect any kind of website. No more than 100 links per page.
-
What? Not sure what you're talking about.
Here is an example of a page on my site:
http://www.hemjakt.se/bostad/59873-soldatgatan-13-b/
-
As per my knowledge, if you delete the footer links then I am sure you will rank again.
You are putting more and more links in the footer.
Thanks,
Akhilesh
-
What is your suggestion to solve it?
I truly think my application can add a lot of value to users by giving them a chance to find a lot of data in one spot, instead of having to navigate to a bunch of other sites. Since I also get access to "big data", I can add a lot of value by displaying price history from multiple sources.
I'm not trying to do anything shady or black hat. I'm just trying to create a search engine for real estate. Obviously that will generate a lot of content and many pages in a short amount of time.
What am I doing that is against the Google guidelines? I doubt that "adding pages too fast" is by itself a reason for a penalty; more likely it is a signal that would indicate to Google that a closer look is needed to see if any spam is going on.
-
It means your website has no inbound links. I think you created too many pages in a very short time; that is the reason behind this.
-
I have done no link building for this site. I have only focused on building a strong crawler, and with ~60,000 pages/houses indexed I have been able to rank well on long queries with low competition, thereby getting good traffic without any link building.
Moz reports 0 links, and from my Google Analytics report I have only been getting some referral traffic from Reddit, from a post that has now been deleted. All other traffic is organic search traffic. So I highly doubt that there are any "bad links" causing this punishment.
Regarding the crawler: I crawl other websites for real estate objects. I cache the HTML and then filter out specific information such as price, number of rooms, address and so on.
I then publish this data with my own structure. This makes the data 87-90% unique compared to the source site. You can see an example in the first post of this topic, and a rough sketch of the approach below.
The 10-13% that is not unique is the description of the real estate object, which is usually a 1-2 paragraph text saying something like "This is an amazing house with 12 rooms with a great view that would be a perfect fit for the large family". This text is copied straight over. Might that be what is causing the penalty, even if it's only 10% of the page content? What do you think?
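To make the process concrete, here is a minimal sketch of the kind of extraction step described above, assuming Python with `beautifulsoup4`. The CSS selectors and field names are purely illustrative placeholders, not the crawler's actual ones.

```python
# Sketch of extracting structured listing data from cached HTML.
# Assumes `pip install beautifulsoup4`; selectors and field names are made up for illustration.
from bs4 import BeautifulSoup

def parse_listing(cached_html: str) -> dict:
    """Pull out the structured fields that get republished under our own layout."""
    soup = BeautifulSoup(cached_html, "html.parser")

    def text_of(selector):
        node = soup.select_one(selector)
        return node.get_text(strip=True) if node else None

    return {
        "address": text_of(".listing-address"),  # e.g. "Åsgatan 15"
        "price": text_of(".listing-price"),      # e.g. "4 500 000 kr"
        "rooms": text_of(".listing-rooms"),      # e.g. "12 rum"
        "city": text_of(".listing-city"),        # e.g. "Knivsta"
        # The free-text description is the part that stays verbatim from the source,
        # i.e. the ~10-13% of the page that is not unique.
        "description": text_of(".listing-description"),
    }

if __name__ == "__main__":
    with open("cached_listing.html", encoding="utf-8") as f:
        print(parse_listing(f.read()))
```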
-
Hi Marcus Lind,
Can you share what SEO activity you have done for this website, how many anchors you have built, and which anchor texts you created?
After that we can say clearly what the exact issue is.
-
Please note that I have not done any link building at all for the site; it's only 2 months old. There are no unnatural links, no keyword stuffing and no other black hat SEO tactics used. No links have been purchased at all, so Penguin should not be relevant to this site.
I've attached an image of my Google Analytics Overview for the past months, and you can see how it has been naturally steadily growing day by day and then just taking a huge drop the last few days.
-
That's fantastic that you've been keeping track of your Bing queries, and it does sound like you've been hit by a penalty.
I can pull link data for you, however my girlfriend wants to kill me if I don't stop answering questions right now, so look for my answer in about six hours. I'll use
Ahrefs, Majestic SEO & Moz.
-
Try running your queries in Bing and see how it compares. If you see the terms you ranked for earlier ranking well in Bing, then you know you've been hit by a penalty in Google.
That's what made it clear for a site I'm involved with and have been trying to recover for many months now.
-
I'm very sorry to hear that this has happened to you. Here is some information that I hope is helpful.
The link below may say that you have to be a member to watch the video, but you do not have to be a member to read the transcript. So read the transcript and use the tool; it is not 100% accurate, but it might give you a lot more insight into what has actually occurred. You will need Moz's opensiteexplorer.com and another tool like MajesticSEO.com or Ahrefs.com.
I like Ahrefs, but either one will do. Your first step is to Google "fruition Google penalty tool":
https://fruition.net/google-penalty-checker-tool/
That will bring up a cool tool that will show you whether or not you've been hit by Google Penguin, because there have been several updates: http://www.quicksprout.com/university/how-to-identify-and-recover-from-a-google-penguin-penalty/
So you're aware, that last link shows a video that you may not be able to watch unless you're a member; however, because it is fully transcribed, you have an exact how-to manual right below it.
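In the same spirit as the Fruition tool, here is a minimal sketch of the idea behind it: compare the date of your traffic drop against known Google update dates. The update list below is a tiny illustrative sample (only the May 20 Panda 4.0 date is mentioned elsewhere in this thread); you would fill it in from a maintained algorithm-change history, and the traffic-drop date is a placeholder.

```python
# Sketch: flag Google algorithm updates that fall near a traffic-drop date.
# The update list is a small illustrative sample, not a complete history.
from datetime import date, timedelta

# (date, name) pairs; extend these from a maintained algorithm-change history.
KNOWN_UPDATES = [
    (date(2014, 5, 20), "Panda 4.0"),       # mentioned earlier in this thread
    (date(2014, 5, 16), "Payday Loan 2.0"), # illustrative entry
]

def nearby_updates(drop_date: date, window_days: int = 7):
    """Return updates within `window_days` of the observed traffic drop."""
    window = timedelta(days=window_days)
    return [
        (update_date, name)
        for update_date, name in KNOWN_UPDATES
        if abs(update_date - drop_date) <= window
    ]

if __name__ == "__main__":
    traffic_drop = date(2014, 5, 22)  # placeholder for the day the drop was observed
    for update_date, name in nearby_updates(traffic_drop):
        print(f"{name} on {update_date} is within a week of your drop")
```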