Tuesday, July 12th: We suddenly lost all our top Google rankings. Traffic was cut in half. Ideas?
-
The attached screenshot shows it all.
The Panda update hit us hard: we lost half our traffic.
Three months later, a Panda tweak gave us our traffic back.
Now, this past Tuesday, we lost half our traffic again and ALL our top-ranking keywords/phrases on Google (our keywords are holding rank fine on all other search engines).
Did they tweak their algorithm again? What are we doing wrong??
-
Couldn't they edit out the links back to your site?
-
Something to throw into the mix, from a guy I know who was also hit hard by the Panda update: the CNAME of the incoming links. Do your incoming links all come from one particular source or host?
-
It's a possibility. There have been ongoing discussions on internal linking on most of the major SEO forums for quite some time.
My personal feeling is that most websites can effectively reduce their internal link count by auditing their link flow, identifying their "ideal" real estate, and ensuring they are not duplicating unnecessary links, which steal juice that might otherwise go to the bigger pages.
Fewer than 5% of visitors ever see the footer on the average website, so my opinion has always been that the footer should contain supporting links that help the user in context. Contact, About, Sitemap, and Investors, for example, are classic links one might find there.
Big real estate - or important pages in the website - should be linked to from your main nav or areas above the fold with lots of user exposure.
Keep in mind that changing and removing links is a process. Do not go in and remove all, or a significant portion, of your links in one week.
Make one or two good changes, wait a week or so, then make other small changes over time.
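A link-flow audit like the one described above starts with simply counting the links on each page. Here is a minimal sketch using only the Python standard library; the sample HTML and paths are made up for illustration, and in practice you would feed it your real page source:

```python
# Count the links on a page so you can see how many slots the nav and
# footer consume relative to links pointing at your "big real estate".
from html.parser import HTMLParser

class LinkCounter(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        # Record the href of every anchor tag encountered.
        if tag == "a":
            self.links.append(dict(attrs).get("href"))

# Hypothetical page fragment -- substitute your own page's HTML here.
sample_page = """
<html><body>
  <nav><a href="/">Home</a> <a href="/products">Products</a></nav>
  <footer><a href="/about">About</a> <a href="/contact">Contact</a>
          <a href="/sitemap">Sitemap</a></footer>
</body></html>
"""

counter = LinkCounter()
counter.feed(sample_page)
print(len(counter.links))  # 5
```

Running this over every template on the site makes it easy to spot pages where a sitewide footer is quietly adding dozens of links apiece.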
Hope this helps.
Todd
www.seovisions.com
-
OK. Well, my suggested strategy would be:
- Go through the list Todd gave to make sure that there is nothing wrong on your site. If not, you can probably assume it was a Google algorithm tweak, so proceed to...
- Work on improving your site, starting with any areas you know of that might be an issue. I would say any content that is not unique would be a good place to start.
-
If you think people are scraping your content, make sure you link back to your own pages within the content itself, so that you always get links back in those cases.
I noticed that when scraper sites pick up SEOmoz content, they don't pick up the footer with the author links back. So make sure to keep links in the actual body text.
I also saw a number of sites that pull SEOmoz content into an iframe on their site. Places like Twitter and Flickr detect when their pages are opened in an iframe and show an error message -> tabs.to/POL-Lp
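One common way sites refuse iframe embedding is the X-Frame-Options response header, which tells browsers not to render the page inside another site's frame. Below is a hedged sketch using Python's stdlib HTTP server; the page body is a placeholder, and real sites like Twitter typically layer JavaScript frame-busting on top of this:

```python
# Sketch: serve pages with an X-Frame-Options header so browsers refuse
# to render them inside a foreign iframe.
from http.server import BaseHTTPRequestHandler

def frame_protection_headers():
    """Headers telling browsers not to render this page in another site's iframe."""
    return {
        "Content-Type": "text/html",
        # SAMEORIGIN: only pages on our own domain may iframe this page.
        # DENY would forbid framing entirely.
        "X-Frame-Options": "SAMEORIGIN",
    }

class NoFrameHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        body = b"<html><body>Original article content.</body></html>"
        self.send_response(200)
        for name, value in frame_protection_headers().items():
            self.send_header(name, value)
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

# To actually serve: http.server.HTTPServer(("", 8000), NoFrameHandler).serve_forever()
```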
-
Thanks - reading up on rel=author now. It looks kind of complicated from the Google instructions...
-
Great advice - I especially like the suspicious inbound links idea; I'd never thought of that.
We have had experts say that our site has too many links per page on average, and that we should consolidate the links in the sitewide footer, since it adds so many links to every page.
Is this a good idea? Are we being hurt by having so many outbound links and such a link-heavy footer?
Thanks
-
Place links within your content too if people are taking your content, and also add the rel=author attribute.
I would also look at the link profile for your site - have you added any dodgy links recently?
Regards.
-
Drops such as the one you have experienced can be difficult to assess. I would advise the following procedures to rule out other issues first. As Egol correctly stated, it is important not to jump straight to fixes, for two reasons. First, the situation could be temporary and could revert. Second, changes you make will obscure the potential issues, making it more difficult to find the problem spots.
1. Check robots.txt to ensure no recent file additions could be blocking major pages.
2. Check your canonical link tags, if you use them, to ensure none point to incorrect URLs.
3. Check inbound links, using both inbound-link tools and Webmaster Tools, for any suspicious bursts of links, or links that look dodgy and that you can't account for.
4. Run a sitewide meta check on all titles and meta descriptions and ensure everything is correct. Several software companies offer fairly inexpensive tools that will spider the entire website relatively quickly. Do this late at night, after the day's traffic swell.
5. Use Xenu to find and fix all broken links, even if there are only a few.
6. Run the Googlebot indexing tool in GWT and check for any instances of odd code or potential problems.
7. Analyze your analytics to determine which keyword clusters lost the most positioning. This can often give you clues as to what might have happened.
Hope this helps.
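Step 1 of the checklist above can be automated with the standard library's robots.txt parser. A minimal sketch, assuming placeholder example.com URLs (substitute your own site's pages and fetch your live robots.txt with `set_url`/`read` in practice):

```python
# Confirm none of your major pages are accidentally blocked for Googlebot.
from urllib import robotparser

rp = robotparser.RobotFileParser()
# Parsing a sample robots.txt inline keeps the check reproducible;
# against a live site you would call rp.set_url(...) then rp.read().
rp.parse("""
User-agent: *
Disallow: /admin/
""".splitlines())

# Hypothetical list of your most important pages.
major_pages = ["https://example.com/", "https://example.com/raised-garden-beds"]

blocked = [url for url in major_pages if not rp.can_fetch("Googlebot", url)]
print(blocked)  # [] -- an empty list means no major page is blocked
```

If a key page ever shows up in `blocked` after a deployment, you have found the kind of accidental robots.txt addition this step is meant to catch.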
Todd
www.seovisions.com
-
Study this... implement carefully....
http://www.google.com/support/webmasters/bin/answer.py?answer=1229920
-
So, in the footer of any article we publish by another author, we would have a link to that author's original article with rel="author" in it?
-
Hold your fire. Your traffic might come back tomorrow.
However, it looks like you are on the edge of whatever Google does not like because you are flashing in and out.
Hang in there, keep working to improve and iron out any problems that you think might be causing this.
-
One thing that I would do is implement the rel="author" attribute in a link to your author pages if you have not already done that.
-
That page has been around for 10 years on our site, and as you can see, many, many people are ripping it off, along with all the other pages on our site.
Would you recommend an aggressive campaign of "cease and desist" emails to these plagiarizers, coupled with adding more unique content?
-
All of our top ranked articles are unique, although many people rip them off and it's hard as heck for us to track them all down and get them to remove our content.
I recently did this for our "Raised Garden Bed" page, which contributes a huge amount of our traffic. It was very labour-intensive.
Our blog has a lot of articles that other authors have written and are republished elsewhere.
-
Adam is asking the right question. On http://eartheasy.com/live_water_saving.htm, I grabbed some text "Many beautiful shrubs and plants thrive with far less watering than other species. Replace herbaceous perennial borders with native plants. Native plants will use less water and be more resistant to local plant diseases." That text appears on a bunch of sites. I tried this with several other phrases from different pages on your site, and almost every time several other sites shared identical text to yours.
Google is penalizing you because your content is identical to a bunch of other sites. The more unique, original content you have, the more you should see your rankings rise.
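The manual check described above (grab a passage, see where else it appears) can be turned into a quick similarity measurement. A sketch with stdlib `difflib`, using an excerpt from the thread and a made-up rewritten version for comparison:

```python
# Compare a passage from your page against text found elsewhere and
# measure how much of it overlaps.
from difflib import SequenceMatcher

your_text = ("Many beautiful shrubs and plants thrive with far less watering "
             "than other species. Replace herbaceous perennial borders with "
             "native plants.")
# A verbatim scrape of the same passage.
scraped_text = ("Many beautiful shrubs and plants thrive with far less watering "
                "than other species. Replace herbaceous perennial borders with "
                "native plants.")
# A hypothetical genuinely rewritten version of the same advice.
rewritten_text = ("Native shrubs usually need much less irrigation and resist "
                  "local plant diseases better than imported species.")

def similarity(a, b):
    """Ratio in [0, 1]; values near 1.0 indicate duplicated content."""
    return SequenceMatcher(None, a, b).ratio()

print(similarity(your_text, scraped_text))   # 1.0 -- an identical copy
print(similarity(your_text, rewritten_text)) # much lower -- unique wording
```

Running phrases from your top pages against text collected from suspect sites gives a rough, repeatable way to prioritize which duplicates to chase down first.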
-
Are your articles unique or are they syndicated?