Should We Pull The Plug On This Site?
-
I am helping a retailer out with their site. They were hit hard with the Penguin update, and traffic has dropped by about 75%. Here are the stats:
-
It is fairly new; it has been up for about 3 years.
-
Has partial match domain name
-
Is nearly fully indexed with over 4K pages
-
Has NOT received an unnatural link message from Google, so no manual penalty.
-
Has had most keywords BURIED in the search results.
-
Link profile: Has done about 50-100 blog comments, 500 directory submissions, 800 social bookmarks, 5-6 press releases, 300 article submissions (most removed), and about 30-50 guest blog posts.
I am thinking it may have just been hit because of aggressive use of anchor text as opposed to massive spamming. Then again, the site has never really added great content and the product pages have no unique content.
Any thoughts?
-
-
Thanks. I've watched the video before but it's worth reviewing. It still seems a bit strange that someone can violate terms of service which G never bothered to enforce for years and get slammed with "Double Secret Probation," while a malicious site can clean up and eventually get the penalty lifted. No doubt a malicious site's manual penalty should result in a long time in the penalty box, but at least it's obvious what to fix. There doesn't seem to be a reliable consensus, or even many case studies, on garden-variety Penguin recoveries yet. Not knowing what Dean Wormer wants me to change is irritating.
-
InHouseSEO - It's not an e-commerce site. (It's a blog with a couple of hundred posts, many of which need pruning but many of which are highly informative and written by someone with substantial experience in the subject.)
Sounds like you're telling me the best gamble is to put in the work on this blog to try to grow the legit links so that the bad ones dip below the "tipping point" that triggers the Penguin hit. Have you had success with this tactic?
The home page appears to be penalized because of keyword-rich text from relevant blog comments on mostly relevant blogs/pages. (It's also quite possible it's just a rather severe devaluation of 30 or so spots in the SERPs for the EMD keyword.) Other pages are hit or miss, but the stronger pages (high bounce but very high time on page) are beginning to return to some of their former strength (probably 50% of peak traffic).
Site traffic declined just before the 25th (the date associated with Panda 3.5), resulting in a 20% hit. After Panda 3.5, the G traffic dove steadily (which I assume is Penguin added to the mix). Traffic is now off by around 2/3 without excluding the Bing traffic. (Have probably seen a 15-20% improvement recently with no new posts and only one added authoritative directory link: the Nat'l Trade Assoc. picked up the blog.)
I just reread all of the comments in the thread you linked to. (Never received a warning in WMT so I assume the penalty is algo.)
Reading your comments, it sounds like you recommend attempting to remove any blog comments that I created. (I don't expect much success based on what people are sharing.)
If my pet Penguin is algorithmic and isn't scheduled to lift anytime in the next several months, should I try to guest blog my way out of the penalty? (Assume I have access to decent relevant indie blogs that are low authority but extremely legit.)
Thanks for the reminder to re-read the thread with you and Egol.
-
Do you have an e-commerce site? Is the site as a whole hit, or is it certain keywords/pages?
I would be careful with removing links, unless they are really spammy. You might do more harm than good.
I wrote about this here:
http://www.seomoz.org/q/using-dripable-to-build-url-links-too-dilute-link-profile
Anyways, good luck.
-
InHouseSEO - this is a GREAT question. I wish there were more discussion of realistic case studies like this one rather than so much "focus" on negative SEO and a handful of high authority sites that were probably hit by mistake.
The consensus seems to be that you can file for lifting a penalty IF you can show you removed bad links AND document the efforts you made to remove the bad links that remain despite your efforts.
Matt Cutts appears to say you're more screwed if the penalty is algorithmic. Huh? Buy BMR links, remove them, and escape the penalty, while G hammers my site for 50-100 presumably manual and relevant blog comments? Gimme a break!
The 50-100 blog comments are probably going to be the worst of the lot to attempt to remove. Have you had any success removing the trash directories? You might be able to outgrow the penalty by developing new links so that the number of suspicious (or bad) links falls below the tipping point. On a recent WBF, Danny Sullivan opined that Penguin is just a devaluation of the bad links. (Not my opinion, but it's an interesting one.) No one has shared results, but some people have suggested combining link removal with developing new strong links.
Penguin is bizarre. Some of my pages are (very) slowly returning to their former top positions even though some of the bad links still point to them. New pages with extensive content (think 2,000 words of unique/expert content) that were among the first 2-3 to cover an event now rank around 120. (Ouch.)
I share your suspicion that for many of our sites, it's aggressive use of anchor text. Developing non-aggressive links may dig us out. I would love to hear from anyone who has tried this and what results they achieved.
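For anyone trying to gauge how "aggressive" their anchor text really is, one rough approach is to tally the share of each anchor phrase in an exported backlink list (e.g. a CSV from Open Site Explorer). A minimal sketch, assuming a simple list of hypothetical (source URL, anchor text) pairs; the URLs and phrases here are made up for illustration:

```python
from collections import Counter

def anchor_distribution(links):
    """Tally anchor-text usage across (source_url, anchor_text) pairs,
    returning (anchor, count, share) tuples, most common first."""
    counts = Counter(anchor.strip().lower() for _, anchor in links)
    total = sum(counts.values())
    return [(anchor, n, n / total) for anchor, n in counts.most_common()]

# Hypothetical backlink export
links = [
    ("http://blog-a.example/post", "cheap widgets"),
    ("http://dir-b.example/listing", "cheap widgets"),
    ("http://blog-c.example/post", "cheap widgets"),
    ("http://assoc.example/members", "Acme Widget Co."),
]

for anchor, n, share in anchor_distribution(links):
    print(f"{anchor!r}: {n} links ({share:.0%})")
```

A profile where one exact-match phrase dominates (75% in this toy data) is the pattern people suspect Penguin targets; no one knows the actual tipping point, so treat any threshold as a guess.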
-
If it was an algorithmic hit, check out this video.