Tuesday, July 12th: We suddenly lost all our top Google rankings. Traffic cut in half. Ideas?
-
The attached screenshot shows it all.
The Panda update hit us hard: we lost half our traffic.
Three months later, a Panda tweak gave us our traffic back.
Now, this past Tuesday, we lost half our traffic again, along with ALL our top-ranking keywords/phrases on Google (our keywords are holding rank fine on all other search engines).
Did they tweak their algorithm again? What are we doing wrong??
-
Couldn't they edit out the links back to your site?
-
Something to throw into the mix, from a guy I know who was also hit hard by the Panda update: the CNAME of your incoming links. Do your incoming links all come from one particular source or host?
-
It's a possibility. There have been ongoing discussions on internal linking on most of the major SEO forums for quite some time.
My personal feeling is that most websites can effectively reduce their number of internal links by auditing their link flow, determining their "ideal" real estate, and ensuring they are not duplicating unnecessary links, which would steal some of the juice that might otherwise go to the bigger pages.
Fewer than 5% of people ever see the footer on the average website, so my opinion has always been that the footer should contain supporting links that help the user in context. Contact, About, Sitemap, and Investors, for example, are classic links one might find there.
Big real estate - or important pages in the website - should be linked to from your main nav or areas above the fold with lots of user exposure.
Keep in mind that changing and removing links is a process. Do not go in and remove all, or a significant portion, of your links in one week.
Make one or two good changes, wait a week or so, then make other small changes over time.
Hope this helps.
Todd
www.seovisions.com
-
OK. Well, my suggested strategy would be:
- Go through the list Todd gave to make sure there is nothing wrong on your site. If nothing turns up, you can probably assume it was a Google algorithm tweak, so proceed to...
- Work on improving your site, starting with any areas you know of that might be an issue. I would say any content that is not unique would be a good place to start.
-
If you think people are scraping your content, make sure you link back to your own pages from within your content, so that you always get links back in those cases.
I noticed that when scraper sites pick up SEOmoz content, they don't pick up the footer with the author links back. So make sure to keep links in the actual body text.
I also saw a number of sites that pull SEOmoz content into an iframe on their site. Places like Twitter and Flickr detect when their pages are opened in an iframe and show an error message -> tabs.to/POL-Lp
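One low-effort defence is to bake the attribution link into the article body itself at publish time, so any scraper that copies the body carries a link back to the original. A minimal Python sketch of that idea - the function name, markup, and URLs here are all hypothetical, not from any particular CMS:

```python
# Sketch: embed an attribution link inside the article body itself,
# not just the footer, so scrapers that copy the body text carry
# a link back to the original page.

def add_body_attribution(article_html: str, canonical_url: str, author: str) -> str:
    """Append an in-content attribution line to the end of the article body."""
    attribution = (
        f'<p>Originally published by <a href="{canonical_url}" '
        f'rel="author">{author}</a>.</p>'
    )
    return article_html + attribution

body = "<p>Many beautiful shrubs thrive with far less watering.</p>"
print(add_body_attribution(body, "https://example.com/article", "Jane Doe"))
```

Since the link sits inside the body text, it survives even when the scraper drops your footer.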
-
Thanks - reading up on rel=author now. Looks kinda complicated from the Google instructions..
-
Great advice - I especially like the suspicious inbound links idea.. never thought of that.
We have had experts say that we have too many links per page on average, and that we should consolidate the links in the sitewide footer, since it adds so many links to every page.
Is this a good idea? Are we being hurt by having so many outbound links and such a link heavy footer?
Thanks
-
Place links within your content too if people are taking your content, and also add the rel=author attribute.
I would also look at your site's link profile - have you added any dodgy links recently?
Regards.
-
Drops such as the one you have experienced can be difficult to assess. I would advise the following procedures to rule out other issues first. As Egol correctly stated, it is important not to jump to fixes right away, for two reasons. First, the situation could be temporary and could revert. Second, any changes you make will obscure the potential issues, making it more difficult to find problem spots.
1. Check robots.txt to ensure that no recent additions could be blocking major pages.
2. Check the canonical link tag, if you use it, to ensure there are no issues with incorrect URLs.
3. Check inbound links, using both inbound-link tools and Webmaster Tools, for any suspicious bursts of links, or links that look dodgy that you can't account for.
4. Run a sitewide check on all titles and meta descriptions and ensure everything is correct. There are software companies that offer fairly inexpensive options that will spider the entire website relatively quickly. Do this late at night, after peak traffic.
5. Use Xenu to check for broken links and fix them, even if there are only a few.
6. Run the Googlebot fetch tool in GWT and check for any instances of funny code or potential problems.
7. Analyze your analytics to determine which keyword clusters lost the most positioning. This can often give you clues as to what might have happened.
Hope this helps.
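The robots.txt check in step 1 can be automated with Python's standard library. The sketch below parses a robots.txt from a string (no network needed) and flags any important page that Googlebot would be blocked from fetching; the rules and paths are made up for illustration, so substitute your real robots.txt and URL list:

```python
# Sketch: verify that key pages are not blocked by robots.txt rules.
# Uses only the standard library; rules and paths are illustrative.
from urllib.robotparser import RobotFileParser

robots_txt = """\
User-agent: *
Disallow: /tmp/
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

important_pages = ["/", "/raised-garden-beds", "/private/drafts"]
for path in important_pages:
    allowed = parser.can_fetch("Googlebot", path)
    print(f"{path}: {'OK' if allowed else 'BLOCKED'}")
```

Running this against the hypothetical rules above would flag `/private/drafts` as blocked while the other pages pass, which is exactly the kind of accidental block you want to catch after a site change.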
Todd
www.seovisions.com
-
Study this... implement carefully....
http://www.google.com/support/webmasters/bin/answer.py?answer=1229920
-
So, at the footer of any article we publish by another author, we should have a link to that author's original article with rel="author" in it?
-
Hold your fire. Your traffic might come back tomorrow.
However, it looks like you are on the edge of whatever Google does not like because you are flashing in and out.
Hang in there, keep working to improve and iron out any problems that you think might be causing this.
-
One thing that I would do is implement the rel="author" attribute in a link to your author pages if you have not already done that.
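For what it's worth, a minimal example of what such a link could look like - the author name and URL below are hypothetical:

```html
<!-- Hypothetical byline: link from the article to its author page -->
<p>Written by <a href="https://example.com/about/jane-doe" rel="author">Jane Doe</a></p>
```

Google's rel=author documentation covers the full setup, including linking the author page back to a verified profile.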
-
That page has been on our site for 10 years, and as you can see, many, many people are ripping it off, along with every other page on our site.
Would you recommend an aggressive campaign of "cease and desist" emails to these plagiarizers, coupled with adding more unique content?
-
All of our top ranked articles are unique, although many people rip them off and it's hard as heck for us to track them all down and get them to remove our content.
I recently did this for our "Raised Garden Bed" Page, which contributes a huge amount of our traffic. It was very labour intensive.
Our blog has a lot of articles that other authors have written and are republished elsewhere.
-
Adam is asking the right question. On http://eartheasy.com/live_water_saving.htm, I grabbed some text "Many beautiful shrubs and plants thrive with far less watering than other species. Replace herbaceous perennial borders with native plants. Native plants will use less water and be more resistant to local plant diseases." That text appears on a bunch of sites. I tried this with several other phrases from different pages on your site, and almost every time several other sites shared identical text to yours.
Google is penalizing you because your content is identical to a bunch of other sites. The more unique, original content you have, the more you should see your rankings rise.
-
Are your articles unique or are they syndicated?