Please help with this penalized site!
-
OK, this is slowly frying my brain and I would like some clarification from someone in the know. We have posted multiple reconsideration requests since receiving the standard "site violates Google's quality guidelines" / "look for unnatural links" email back in March 2012.
I came aboard the business in August 2012 to undo the work of bad SEO companies. So far I have filed several disavow requests by domain and cleared over 90% of our backlink profile, which was all directory links, forum spam and the like, pulled from WMT, OSE and Ahrefs and compiled into the disavow tool. In our reconsideration request we also sent a shared Google Docs file listing all the links we were able to remove, plus the disavow file itself; since most of these links were built in 2009/2010, a lot were impossible to remove.
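For anyone following along who hasn't used the tool: a disavow file submitted by domain is just a plain text file, one entry per line, with `#` for comments and a `domain:` prefix to disavow everything on a domain. A minimal sketch (the domains here are made up for illustration, not our actual backlinks):

```
# Spammy directories built by the previous SEO company in 2009/2010
# Removal requested by email, no response
domain:spammy-directory.example.com
domain:forum-spam.example.net

# A single URL we could not get taken down
http://www.example.org/forum/thread?id=123
```

Submitting by `domain:` saves listing every individual spam URL on a directory, which matters when the profile runs to thousands of links.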
We managed to shift about 12-15% of our backlink profile by working very, very hard to remove links. The only links left were quality links, forum posts created by genuine users, and relevant non-spam links. On top of this we now have a high-quality link profile, which has counteracted a lot of the bad "SEO" work done by those previous companies. I explained all of this fully in our reconsideration request, along with a sincere apology on behalf of the work those companies did, and we are STILL getting generic "site violates" messages. So far we have spent in excess of 150 hours trying to get this penalty removed, and Google hasn't even batted an eyelid.
We have worked SO hard to combat this issue that it almost feels personal. If Google read the reconsideration request they would see how much work we have done to resolve it. If anyone can offer any updates or point out anything we have missed, I would appreciate it; I feel like we have covered every base!
Chris
-
Ha, okay, got it!
Go after some long-tail content and start adding some value to the site; you should get a broader scope and some pages that will rank on their own merit. It gives you something constructive to do and some possible link targets!
-
Absolutely. Our rankings have been static since I started, so I hope we will see some results soonish. Will keep you posted.
Chris
PS - Peter owns the business and set our Moz account up! I do the work! Hah
-
Hey Chris (Peter? Sorry)
Yeah, it will be interesting. Every case is unique, I figure, but certainly, once you get rid of the problems you can at least concentrate on moving forwards.
Keep me posted.
-
Thanks Marcus. Whilst trying to remove the bad links we have been replacing them with good ones, white hat all the way, so my biggest worry now is seeing how much recovery actually happens; the general consensus seems to be anywhere between 10 and 90 days!
Chris
-
Hey Peter, that's great news, keep us posted on how that works out from a traffic perspective and would love a poke around in your analytics once the dust has settled.
Cheers,
Marcus
-
Just an update, Marcus - this morning we got our "manual spam penalty revoked" notice, so let's see if rankings return.
-
So far Copyscape/PlagSpotter has shown nothing. The website has been changed since the warning came in and a lot of content was rewritten. I feel one issue might be down to the Enjin.com network of forums, which run one forum but span hundreds of websites with different layouts (horrid spam that wasn't actually created by bad SEOs!). I've been asking for ages to get this removed, and it has all been filed into our disavow. Like I said, I think 90%+ of our backlinks across WMT/Ahrefs/OSE have been requested to disavow.
None of our blog content has been duplicated, but that content is new and didn't exist when the unnatural links warning was issued. It's just so frustrating that I feel we have covered every angle, and yet most of your working life becomes being a minion to the Google algorithm!
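For anyone facing the same job of merging backlink exports from WMT, OSE and Ahrefs into one disavow list, here is a minimal sketch of how the deduplication by domain could be scripted. The URLs, domain names and the `keep_domains` list are made-up illustrations, not real data; each tool's actual export format will need its own parsing first.

```python
from urllib.parse import urlparse

def build_disavow_domains(backlink_urls, keep_domains=()):
    """Collapse a list of backlink URLs into unique 'domain:' disavow
    entries, skipping any domains we want to keep (genuine links)."""
    keep = {d.lower() for d in keep_domains}
    domains = set()
    for url in backlink_urls:
        host = urlparse(url).netloc.lower()
        host = host.split(":")[0]        # drop any port number
        if host.startswith("www."):
            host = host[4:]              # treat www and bare domain as one
        if host and host not in keep:
            domains.add(host)
    return ["domain:" + d for d in sorted(domains)]

# Hypothetical URLs as they might appear across the three exports
urls = [
    "http://www.spammy-directory.example/listing/42",
    "http://spammy-directory.example/listing/99",
    "http://forum-spam.example/thread?id=7",
    "http://genuine-forum.example/post/1",
]
for line in build_disavow_domains(urls, keep_domains=["genuine-forum.example"]):
    print(line)
```

The point of collapsing to domains is that the three tools report different subsets of URLs on the same spam domains, so a URL-level merge badly undercounts what needs disavowing.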
-
Hi Brian, I will definitely look at this, thanks. I know that our duplicate content issues across the site are slim, and the other on-page work - a massive improvement in site loading times etc. - is going well. Thanks
-
Hi Marcus, I agree. Sure, in detail:
-
Built a high-quality natural link profile with varied anchor text since August
-
Provided a text document of removed links (about 12-15% of the WMT/OSE/Ahrefs link profile) in the reconsideration request
-
Provided a list of every directory, bad-looking forum link and irrelevant website in the disavow tool, by domain
-
Made a simple text copy of the disavow file on Google Docs for the reconsideration request
-
Apologised on behalf of the work done by previous companies
-
Promised all future work will be compliant with webmaster guidelines.
I have run several audits since working with Palicomp, everything from canonicalisation to crawl errors to duplicate content, and I've found a small amount of source code duplication (down to CubeCart's awful CMS) but not much else.
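As an illustration of one piece of that canonicalisation audit, here is a sketch that groups crawled URLs which should resolve to a single page (http/https, www vs bare domain, trailing slashes, query strings). The normalisation rules and example URLs are assumptions for illustration; a real site's rules depend on its own URL scheme.

```python
from urllib.parse import urlparse

def canonical_key(url):
    """Reduce a URL to a comparison key so variants that should be one
    page (scheme, www prefix, trailing slash, query string) group together."""
    parts = urlparse(url)
    host = parts.netloc.lower()
    if host.startswith("www."):
        host = host[4:]
    path = parts.path.rstrip("/") or "/"
    return host + path                   # scheme and query string ignored

def find_duplicate_groups(urls):
    """Group crawled URLs by canonical key; any group with more than one
    member is a candidate duplicate-content / missing-canonical issue."""
    groups = {}
    for url in urls:
        groups.setdefault(canonical_key(url), []).append(url)
    return {k: v for k, v in groups.items() if len(v) > 1}

# Hypothetical crawl output: the first three should be one page
crawled = [
    "http://www.shop.example/product/widget",
    "https://shop.example/product/widget/",
    "http://shop.example/product/widget?ref=home",
    "http://shop.example/about",
]
print(find_duplicate_groups(crawled))
```

Any group this flags is a page that either needs a rel=canonical pointing at one preferred URL or a 301 from the variants.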
-
Hey Chris
Often, it would seem they want to see a continued effort on your part to clean up, and the link profile still looks pretty stinky in OSE. That said, I appreciate it's almost impossible to get many links removed, so you can only do so much.
It may be that you just have to keep trying: keep cleaning up, keep emailing, send letters, make phone calls etc. and show even more willing before they will let you out of penalty jail.
Have you had a professional audit done with regard to the problems on the site? Have you looked at duplication etc.?
Can you detail what you have done in your reconsideration requests?
Marcus