Penalty or Algorithm hit?
-
After the Google algorithm was updated, my site took a hit in traffic for about a week. The traffic came back a week later and was doing well, and a week AFTER the algorithm change I decided to set up a 301 redirect to make sure I didn't have duplicate content (the www vs. non-www versions of the domain). I called my hosting company (I won't name names, but it rhymes with Low Fatty) and they guided me through the supposedly simple process. Well, they had me create a new (different) IP address and do a domain forward (sorry about the bad terminology) to the www version. This was in effect for approximately two weeks before I discovered it, and it coincided with a massive hit in traffic.
I then corrected the problem (I hope) by restoring the old IP address and setting up the .htaccess file to 301 redirect everything to www. It is now a couple of weeks later and my traffic is still in the dumps. In WMT, instead of getting traffic from 10,000 keywords, I'm getting it from only 2,000.
Is my site the victim of some penalty (I have heard of the sandbox), or is it simply getting less traffic due to the new algorithm? (I checked the analytics data and found that only US traffic is cut by 50%; outside the US it is the same.) Could someone please tell me what is going on?
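(A quick sanity check on the "I hope" part: the sketch below confirms that the non-www variant returns a single 301 to the www version. It assumes Python with the `requests` library installed, and `example.com` is only a stand-in for the actual domain.)

```python
# Quick check that the non-www variant 301-redirects to the canonical www URL.
# "example.com" is a placeholder; swap in the real domain before running.
import requests

CANONICAL = "http://www.example.com/"
variants = ["http://example.com/", CANONICAL]

for url in variants:
    resp = requests.get(url, allow_redirects=False, timeout=10)
    location = resp.headers.get("Location", "")
    print(f"{url} -> {resp.status_code} {location}")
    # Expected: the bare domain returns 301 with Location = CANONICAL,
    # while CANONICAL itself returns 200 (no further redirect hops).
```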
-
Michael,
if you got hit on the 24th of February, that was the Panda algorithm update.
First, are you sure that your content is 100% unique and that this is a high-quality site? If so, I would go to
http://www.google.com/support/forum/p/Webmasters/thread?tid=76830633df82fd8e&hl=en&start=800
That thread is dedicated to people who have a high-quality site that has been negatively affected by this change; a Google employee will take a closer look.
Beyond that, the things you can do to help your site are (this is my opinion; webmasters and SEOs are still trying to figure out how to recover, or which criteria triggered the Panda update on their sites):
- Trustworthy UI (user interface): your website looks dated (it looks like an old one). See if there is a possibility to build a new site on a robust CMS.
- Site speed (a rough way to get a baseline on server response time is sketched below).
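Here is a minimal sketch of one way to baseline server response time, assuming Python with the `requests` library; `example.com` is a placeholder domain and the sample size of 5 is arbitrary. It is not a substitute for a proper page-speed audit.

```python
# Rough baseline of server response time (time until response headers arrive).
# "example.com" is a placeholder; adjust before running.
import requests

URL = "http://www.example.com/"
samples = []

for _ in range(5):  # 5 requests is an arbitrary sample size
    resp = requests.get(URL, timeout=30)
    samples.append(resp.elapsed.total_seconds())

print("Response times (s):", [round(s, 2) for s in samples])
print(f"Average: {sum(samples) / len(samples):.2f}s")
```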
-
Can I post analytics data, or do I have to edit it first?
CLIFFS:
Site traffic drops 50% on Feb 24th and stays down.
Traffic rises back to 100% March 3rd-8th.
Traffic drops back down to 50% on March 9th, the day after the host advised me poorly and changed the IP.
Traffic has stayed at 50% of what it had been for the last few weeks, right up to this day, and it is killing me financially.
-
Okay, if this is the case, what are webmasters recommended to do? Increase site speed? Any links appreciated.
Thanks
-
I think you're talking about askthetrainer.com.
After a short analysis, I am sure that it is not a penalty.
Your site might have been harmed by the Google Panda update.
-
Michael, answering your question fully will require analyzing your site, and probably your traffic and GWT data. If you can post your site URL, I'll take a brief look at it. If you can put together a traffic data graph showing your drops in traffic and how they coincided with the changes you were making to your site, that would be helpful, too.
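(One rough way to put such a graph together, as a sketch only: the Python snippet below assumes a daily-sessions CSV exported from analytics with placeholder column names `date` and `sessions`, and marks the two dates from the timeline above; the 2011 year is an assumption based on the Panda timeframe.)

```python
# Sketch: plot daily sessions and mark the dates when changes were made.
# Assumes an analytics CSV export named "daily_sessions.csv" with
# "date" and "sessions" columns; adjust names to match your real export.
import pandas as pd
import matplotlib.pyplot as plt

df = pd.read_csv("daily_sessions.csv", parse_dates=["date"]).sort_values("date")

# Dates taken from the timeline above (year assumed to be 2011).
events = {
    "2011-02-24": "Algorithm update / first drop",
    "2011-03-09": "Host changed IP / domain forward",
}

fig, ax = plt.subplots(figsize=(10, 4))
ax.plot(df["date"], df["sessions"], label="Daily sessions")

for day, label in events.items():
    ax.axvline(pd.Timestamp(day), linestyle="--", color="red")
    ax.annotate(label, (pd.Timestamp(day), df["sessions"].max()),
                rotation=90, va="top", fontsize=8)

ax.set_xlabel("Date")
ax.set_ylabel("Sessions")
ax.legend()
plt.tight_layout()
plt.show()
```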
Related Questions
-
Covid19 algorithm update
So I've seen a rather large cliff dive on my major keywords, but minor keywords are doing well. I've been told about a COVID-19 update. Does anyone in here have any pointers or knowledge about this? I've dropped around 9 places and, more importantly, off the first page for my important keywords.
-
SEO Myth-Busters -- Isn't there a "duplicate content" penalty by another name here?
Where is that guy with the mustache in the funny hat, and the geek, when you truly need them? SEL (Search Engine Land) said recently that there's no such thing as a "duplicate content" penalty: http://searchengineland.com/myth-duplicate-content-penalty-259657. By the way, I'd love to get Rand or Eric or other Mozzers (aka TAGFEE'ers) to weigh in here on this if possible. The reason for this question is to double-check a possible "duplicate content" type penalty (possibly by another name?) that might accrue in the following situation.
1 - Assume a domain has a Domain Authority of 30 (per OSE).
2 - The site on the current domain has about 100 pages, all hand-coded. It does very well in SEO because we designed it to do so. The site is about 6 years old in its current incarnation, with a very simple e-commerce cart (again, basically hand-coded). I will not name the site for obvious reasons.
3 - Business is good. We're upgrading to a new CMS (hooray!). In doing so we are implementing categories and faceted search, with plans to try to keep the site to under 100 new "pages" using a combination of rel canonical and noindex. I will also not name the CMS for obvious reasons. In simple terms, as the site is built out and launched in the next 60-90 days, assuming we have 500 products and 100 categories, that yields at least 50,000 pages, and with other aspects of the faceted search it could easily create 10X that many.
4 - In Screaming Frog tests of the DEV site, it is quite evident that there are many tens of thousands of unique URLs that are basically the textbook illustration of a duplicate content nightmare. Screaming Frog has also been known to crash while spidering, and we've discovered thousands of URLs of live sites using the same CMS. There is no question that spiders are somehow triggering some sort of infinite page generation, and we can see that both on our DEV site and out in the wild (in Google's supplemental index).
5 - Since there is no "duplicate content penalty" and there never was, are there other risks here caused by infinite page generation, such as burning up a theoretical "crawl budget", having the bots miss pages, or other negative consequences?
6 - Is it also possible that bumping a site that ranks well with 100 pages up to 10,000 pages or more might carry a link juice penalty as a result of all this (honest but inadvertent) duplicate content? In other words, is inbound link juice and ranking power essentially divided by the number of pages on a site? Sure, it may be somewhat mediated by internal linking, but what are the actual big-dog issues here?
So has SEL's "duplicate content myth" truly been myth-busted in this particular situation? Thanks a million!
-
A company claiming to have proprietary software that replicates Google's algorithm?
Hi all, Unfortunately, I'm getting into a bit of a p*ssing match 😞 with a company trying to compete for the business of one of our clients, and I just wanted to get some feedback from the community here. The company competing for the client's business claims to have spent $1 million to replicate Google's algorithm: they create a replica site (not sure I understand this) of the client site, then test and optimize on-page SEO changes in their software to determine whether the on-page changes are ideal. Sounds fishy to me. Thoughts?
-
Have we been hit by an update?
Hi Everyone : ) We have an e-commerce website that has been getting less and less organic traffic each month since October 2015, after it had received a massive spike overnight! I can't see any logical reason for it or find any updates that would have caused this. Has anyone got an idea? I know I am not giving much information out, but I wondered if anyone else had seen a similar issue.
-
Am I being hit by Hummingbird?
Hello, Our website is a private equity firm database, privateequityfirms.com. We rank well for a number of private equity definitions and terms and have been increasing our rank for those terms, but unfortunately we have been losing ranking for our main keyword and URL, "private equity firms". We have ranked as high as 3rd, under Wikipedia, in recent months. The only real changes we have made are to the sitemap, which is auto-generated every time something is changed in the database. Does anyone have any ideas what is going on? Am I being hit by Hummingbird? Thank you!
-
Can I only submit a reconsideration request if I have a penalty?
Hey guys, One of the sites I'm looking after took a hit to its rankings (particularly for one keyword that went from 6/7 to 50+) post-Penguin in May. However, after cleaning up the link profile somewhat, we started to see some slow and steady progression in positions; the keyword that dropped to 50+ was moving back up towards 20. Then, a couple of weeks back, the keyword in question took another slide towards 35-40. I therefore wondered whether it would be best to submit a reconsideration request, even though the site did not receive a manual penalty. The website has a DA of 40, which more than matches a lot of the competitor websites ranking on the first page for the aforementioned keyword. At this stage, I would have expected the site to have returned to its original ranking (four-and-a-half months after Penguin), but it hasn't, so a reconsideration request seemed logical. That said, when I came to go through the process in Webmaster Tools, I was unable to find the option! Has it now been removed for sites that don't receive manual penalties?
-
Content Caching Memory & Removal of 301 Redirect for Relieving Links Penalty
Hi, A client site has a very poor link legacy, stretching back over 5 years. I started the campaign a year ago, providing valuable, good-quality links. Link removals and a disavow file submitted to Google have been done; however, after months and months of waiting, nothing has happened. If anything, results have been further affected after the recent Penguin update. A 301 redirect was undertaken last year, consequently associating those bad links with the new site structure. I have since removed the 301 redirect in an attempt to detach this legacy, but with little success. I have read up on this, and not many people appear to agree on whether it will work.
Therefore, my new decision is to start afresh using a new domain, switching from the .com to the .co.uk version, helping remove all legacy and all association with the spam-ridden .com. My main concern with this is whether Google will forever cache content from the spammy .com and remember it, because the content on the new .co.uk site will be exactly the same (content of great quality, receiving hundreds of visitors each month from the blog section alone). The problem is definitely link related and NOT content related, as I imagine people may first suspect.
This could then cause duplicate content, given that this content pre-existed on another domain. I will implement a robots.txt file blocking all of the .com site, as well as a noindex, nofollow, and I understand you can submit a site removal request to Google within Webmaster Tools to help fast-track the deindexation of the spammy .com. Then, once it has been deindexed, the new .co.uk site will go live with exactly the same content. So my question is whether Google will then completely forget that this content ever existed, allowing me to use exactly the same content on the new .co.uk domain without the threat of a duplicate content issue? Also, any insights or experience with the removal of a 301 redirect, detaching legacy, and its success would be very helpful! Thank you, Denver
-
Google Local Algorithm Changes?
I was wondering if you have heard about any Google Local algorithm changes. We have about 200 franchise locations, and some of them have dropped significantly over the past few weeks. Locations that were showing up in positions 1-3 are now no longer showing on the first page. This is for very relevant phrases for our main line of business (which is also in our business name), such as 'Phrase, CITY NAME'. These locations have plenty of positive Google reviews, and we would typically rank well for a phrase like that based on our relevance. I did some brainstorming; do you think any of the following could have an impact?
Google is all about things looking and feeling natural, including link building. We have used Yext, which made a lot of changes across the web to fix addresses, etc. Do you think Google may be seeing this as unnatural: too many changes at too many sites in too short a period of time? Along the same lines, do you think Google may be penalizing some of our franchise pages for being too 'perfect'? It would be 'natural' for addresses to have some differences across the web, and a bit unnatural to have them all match so perfectly.
I know that Google has always stated the business name should be listed in Google Local the way it is presented to the general public; something like "Business Name Boston" should be listed as "Business Name". Each of our franchise locations is named in house to reflect its geo location: "Business Name Boston", "Business Name St. Louis". Many of our competitors also attach geo terms. Do you think we may be getting hit with a penalty now, even though we have listed things on Google with the geo term for years, and that is how WE refer to each location? Is it possible that by working with Yext we drew attention to this practice? Should we remove the geo term from our local listings on Google Local? How about across the web?
We are in a business that does not require customers to come to our location. Some of our locations have not suppressed the address in their local listings while others have, and many of our competitors have not. Do you think this could play into it? Some of our locations that are not showing in Local have good organic results; have you heard anything about Google dropping Local listings if they show in organic?
I know Google has been looking at social media more and more, and I believe they will continue to do so. If our local pages have no social presence, could this adversely affect things? (I think this is probably not the case, but I wanted to throw it out there.) I have also noticed that in some cases where Local has dropped, we have multiple offices in that metro area. Is it possible that this could affect things?
Have you heard of any Local algorithm changes? I know they are releasing a new dashboard sporadically; could this be in conjunction with a larger Local algorithm change? Our CMS tool does not allow us to change the title/meta per page (I know, terrible!!), so every page has the same title and the same meta description. (We are changing our CMS system! Can't wait!) Could this play into it? Thanks for any feedback!