Website penalised, can't find where the problem is. Google went INSANE
-
Hello,
I desperately need a hand here!
Firstly, I just want to say that we have never violated Google's guidelines, as far as we know. I have been in this field for about 6 years and have had success with many websites along the way, relying only on natural SEO, and I was never penalised until now.
The problem is that our website www.turbosconto.it is penalised and we have no idea why (it is not a manual penalty).
The site has been online for more than 6 months and it NEVER started to rank. It gets about 2 organic visits a day at most.
In this time we got several links from good websites related to our topic, which actually keep sending us about 50 visits a day. Nevertheless, our organic visits are still 1 or 2 a day.
All the pages seem to be heavily penalised... when you perform a search for any of our "shops", even including our URL, no entries for the domain appear.
A search example: http://www.turbosconto.it zalando
What I would expect to find as a result: http://www.turbosconto.it/buono-sconto-zalando
The same case repeats for all of the pages for the "shops" we promote.
Searching any of the brands + our domain shows no results, except for "nike" and "euroclinix" (I see no relationship between these 2).
A few days ago, for these same types of searches, Google was showing pages from the domain which we blocked via robots.txt months ago, and which return 404 errors, instead of our optimised landing pages, which cannot be found in the first 50 results. These pages are generated by our rating system...
We already sent requests to de-index all these pages, but they keep appearing for every new page that we create. And the real pages are nowhere to be found...
Here is an example: http://www.turbosconto.it/shops/codice-promozionale-pimkie/rat
You can see how Google indexes that for us in this search: site:www.turbosconto.it rate. Why on earth would Google show a page which is blocked by robots.txt, displaying that the content cannot be retrieved because it is blocked, instead of showing pages which are totally SEO-friendly and content-rich?
TurboSconto runs on the same script that we use for our Spanish version, www.turbocupones.com. With that one we have awesome results, which makes things even weirder...
OK, apart from those weird issues with indexation and robots.txt: we did some research on our backlinks and were surprised to find a few bad links that we never asked for.
Nevertheless, there are just a few of them, and we have many HIGH QUALITY LINKS, which makes it hard to believe that this could be the reason.
Just to be sure, we used the disavow tool for these links. Here are the bad links we submitted 2 days ago:
domain: www.drilldown.it # we did not ask for this
domain: www.indicizza.net # we did not ask for this
domain: urlbook.in # we did not ask for this; moreover it is a spammy one
http://inpe.br.way2seo.org/domain-list-878 # we did not ask for this; moreover it is a spammy one
http://shady.nu.gomarathi.com/domain-list-789 # we did not ask for this; moreover it is a spammy one
http://www.clicdopoclic.it/2013/12/i-migliori-siti-italiani-di-coupon-e.html # we did not ask for this; moreover it is a copy of a post from another blog: http://typo.domain.bi/turbosconto.it
I have no clue what it can be; we have no warning messages in Webmaster Tools or anything.
For me it looks as if Google has a BUG and went crazy judging our Italian website. Or perhaps we are just missing something??? If anyone could throw some light on this, I would be really glad, and willing to pay some compensation for the help provided.
THANKS A LOT!
-
Hi Sebastian,
I was wondering if you have anything new to add to your question, or anything to ask based on the answers received.
If not, I kindly ask you to flag your question as "answered".
Ciao
-
My Italian isn't that great, but I've gathered it's a coupon site. It just looks like a new site to me, though the link profile appears questionable. Turbocupones.com: 370 backlinks from one domain. That looks like a site-wide link, or something else.
I would say the best thing he can do is implement a canonical link element.
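For what it's worth, a canonical link element is just a tag in the head of a duplicate page that points at the version you want indexed. A hypothetical sketch for one of the auto-generated rating pages (the URL patterns are taken from the question, but the exact pairing is my assumption, not the site's actual markup):

```html
<!-- Hypothetical: placed in the <head> of an auto-generated duplicate
     such as /shops/codice-promozionale-pimkie/rate, pointing at the
     optimised landing page that should rank instead -->
<link rel="canonical" href="http://www.turbosconto.it/buono-sconto-zalando" />
```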
But also, followed advertising links are a one-way ticket to penalty town.
-
Just had a quick look and noticed that if you use the following query with the site: operator, you'll get what you expect to see:
site:http://www.turbosconto.it zalando
So I don't think you need to worry that your pages aren't being included in the index. I'm not sure that you're being penalised.
What keywords are you targeting, and how strong is the competition? In Google Webmaster Tools, can you see which queries you're getting impressions for (but not traffic)? Can you check the strength of the competition for these keywords? Is it just a case that your domain is too weak to get much search visibility?
-
Did you receive any message in Google Webmaster Tools about a manual penalty?
OSE is not showing that many links... but check your links carefully.
For instance, I saw that on this site (http://www.solefrutta.it/) the link comes from a banner, and that banner should be nofollowed, IMHO.
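To illustrate what a nofollowed banner would look like (this is placeholder markup, not the actual code on that site; the image path is invented), the anchor simply carries rel="nofollow" so the link passes no equity:

```html
<!-- Hypothetical banner markup; href and image path are placeholders -->
<a href="http://www.turbosconto.it/" rel="nofollow">
  <img src="/banners/turbosconto.png" alt="TurboSconto">
</a>
```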
You should also avoid links from directories, especially Italian SEO directories, which are - I'm Italian and I know the Italian landscape well - a sure way to get penalised partially or completely (check this: http://www.profdirectory.it/aziende-ecommerce/e-commerce/?order=alphabetic).
Also - just to be sure you check every potential backlink issue - nofollow the links to your different language sites (turbocupones and turbocoupons); in fact, it would not be the first time I have seen those kinds of backlinks harm a site due to their site-wide nature.
Regarding the URLs and robots.txt: if a URL was already indexed, blocking Googlebot via robots.txt will not remove it; that URL will still be present in the index.
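As a sketch (the Disallow path is my assumption based on the rating-page URLs in the question), a rule like this only stops Googlebot from fetching the pages; robots.txt blocks crawling, not indexing, so URLs Google already knows can keep appearing in results with a "blocked by robots.txt" snippet:

```
# Hypothetical robots.txt - the path is assumed from the question's URLs
User-agent: *
Disallow: /shops/
```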
If you do a site:turbosconto.it search, you will see that Google is not showing every URL it has in the index, noting that 117 URLs are substantially identical to ones it has already shown.
The easiest way to de-index something is to use the "noindex" meta robots tag.
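Concretely, that is one line in the head of each page you want removed. The catch is that Googlebot has to be able to crawl the page to see the tag, so the URL must not also be blocked in robots.txt, otherwise the noindex will never be read:

```html
<!-- In the <head> of every page to be removed from the index.
     Googlebot must be allowed to crawl the page to see this tag,
     so don't also block the URL in robots.txt. -->
<meta name="robots" content="noindex">
```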
A tip: once you have put the noindex on every URL, create a specific sitemap.xml file containing just those URLs and upload it in GWT. This will speed up the crawling and de-indexation of the URLs you want out of the SERPs.
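A minimal sketch of such a sitemap, following the standard sitemaps.org format (the single entry here is one of the rating URLs mentioned in the question; the full list is an assumption):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- List only the URLs that now carry the noindex tag -->
  <url>
    <loc>http://www.turbosconto.it/shops/codice-promozionale-pimkie/rate</loc>
  </url>
</urlset>
```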
Having said all this, I don't know how useful I have been, partly because the limited time of a Q&A is not enough to understand the reasons behind issues like these.
One thing I am quite sure of: when it comes to on-site factors, Google very rarely makes mistakes, and what may seem like strange behaviour on its side usually is strange... but because we caused it.