Tuesday, July 12th: We suddenly lost all our top Google rankings. Traffic cut in half. Ideas?
-
The attached screenshot shows it all.
The Panda update hit us hard: we lost half our traffic.
Three months later, a Panda tweak gave us our traffic back.
Now, this past Tuesday, we lost half our traffic again, along with ALL our top-ranking keywords/phrases on Google (our keywords are holding rank fine on all other search engines).
Did they tweak their algorithm again? What are we doing wrong??
-
Couldn't they edit out the links back to your site?
-
Something to throw into the mix, from a guy I know who was also hit hard by the Panda update: the CNAME of your incoming links. Do your incoming links all come from one particular source or host?
-
It's a possibility. There have been ongoing discussions on internal linking on most of the major SEO forums for quite some time.
My personal feeling is that most websites can effectively reduce their number of internal links by auditing their link flow, determining their "ideal" real estate, and ensuring that they are not duplicating unnecessary links, which steal juice that might otherwise go to the bigger pages.
Fewer than 5% of visitors ever see the footer on the average website, so my opinion has always been that the footer should contain supporting links that help the user in "context". Contact, About, Sitemap, and Investors, for example, are classic links one might find there.
Big real estate (that is, the important pages on your website) should be linked to from your main nav or from above-the-fold areas with lots of user exposure.
Keep in mind that changing and removing links is a process. Do not go in and remove all, or a significant part, of your links in one week.
Make one or two good changes, wait a week or so, then make other small changes over time.
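Before deciding what to trim, it helps to know your actual link counts. Here is a minimal Python sketch (standard library only; the HTML below is a made-up stand-in for a real fetched page) that counts the links on a page:

```python
from html.parser import HTMLParser

class LinkCounter(HTMLParser):
    """Counts <a href> tags on a page, a rough proxy for 'links per page'."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)

# Stand-in for a fetched page; in practice, feed in your real HTML.
sample_page = """
<html><body>
  <nav><a href="/">Home</a><a href="/about">About</a></nav>
  <p>See our <a href="/garden-guide">garden guide</a>.</p>
  <footer><a href="/contact">Contact</a><a href="/sitemap">Sitemap</a></footer>
</body></html>
"""

counter = LinkCounter()
counter.feed(sample_page)
print(len(counter.links))  # 5 links on this sample page
```

Running that across your templates before and after each small change gives you a concrete number to track instead of guessing.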
Hope this helps.
Todd
www.seovisions.com -
OK. Well, my suggested strategy would be:
- Go through the list Todd gave to make sure that there is nothing wrong on your site. If everything checks out, you can probably assume it was a Google algorithm tweak, so proceed to...
- Work on improving your site, starting with any areas you know of that might be an issue. I would say any content that is not unique would be a good place to start.
-
If you think people are scraping your content, make sure you link back to your own pages within the body of your content, so that you always get links back in those cases.
I noticed that when scraper sites pick up SEOmoz content, they don't pick up the footer with the author links back. So make sure to keep links in the actual body text.
I also saw a number of sites that pull SEOmoz content into an iframe on their site. Sites like Twitter and Flickr detect when their pages are opened in an iframe and display an error message -> tabs.to/POL-Lp
-
Thanks - reading up on rel=author now. It looks kind of complicated from Google's instructions...
-
Great advice - I especially like the suspicious inbound links idea... never thought of that.
We have had experts say that we have too many links per page on average, and that we should consolidate the links in our sitewide footer, since it adds so many links to every page.
Is this a good idea? Are we being hurt by having so many outbound links and such a link-heavy footer?
Thanks
-
Place links within your content too if people are taking your content, and also add the rel=author attribute.
I would also look at the link profile for your site: have you added any dodgy links recently?
Regards.
-
Drops such as the one you have experienced can be difficult to assess. I would advise the following procedures to rule out other issues first. As Egol correctly stated, it is important not to jump to fixes right away, for two reasons. First, the situation could be temporary and could revert. Second, changes you make will obscure the potential issues, making it more difficult to find problem spots.
1. Check robots.txt to ensure no recently added rules are blocking major pages.
2. Check your canonical link tags, if you use them, to ensure they don't point to incorrect URLs.
3. Check inbound links, using both inbound link tools and Webmaster Tools, for any suspicious bursts of links, or links that look dodgy that you can't account for.
4. Run a sitewide check on all titles and meta descriptions and ensure everything is correct. Several software companies offer fairly inexpensive options that will spider an entire website relatively quickly. Do this late at night, after peak traffic.
5. Use Xenu to find and fix all broken links, even if there are only a few.
6. Run the Fetch as Googlebot tool in GWT and check for any instances of odd code or potential problems.
7. Analyze your analytics to determine which keyword clusters lost the most positioning. This can often give you clues as to what might have happened.
Hope this helps.
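For step 1, the check can even be scripted. A minimal Python sketch using the standard library's robotparser (the robots.txt rules and URLs below are made up for illustration; paste in your own):

```python
from urllib.robotparser import RobotFileParser

# Paste in your live robots.txt contents; these rules are hypothetical.
robots_txt = """
User-agent: *
Disallow: /admin/
Disallow: /articles/
"""

# The major pages you expect Google to crawl (placeholder URLs).
major_pages = [
    "http://www.example.com/",
    "http://www.example.com/articles/raised-garden-beds",
    "http://www.example.com/contact",
]

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

for url in major_pages:
    if not parser.can_fetch("Googlebot", url):
        print("BLOCKED:", url)  # a recently added rule may be the culprit
```

With the hypothetical rules above, the articles URL would come back blocked, which is exactly the kind of accidental rule this check is meant to catch.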
Todd
www.seovisions.com -
Study this... implement carefully....
http://www.google.com/support/webmasters/bin/answer.py?answer=1229920
-
So in the footer of any article we publish by another author, we should have a link to that author's original article with rel="author" in it?
-
Hold your fire. Your traffic might come back tomorrow.
However, it looks like you are on the edge of whatever Google does not like because you are flashing in and out.
Hang in there, keep working to improve and iron out any problems that you think might be causing this.
-
One thing that I would do is implement the rel="author" attribute in a link to your author pages if you have not already done that.
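If it helps, verifying that the attribute actually made it into the rendered page can be automated. A minimal Python sketch using the standard library's html.parser (the page snippet and author URL below are hypothetical):

```python
from html.parser import HTMLParser

class AuthorLinkFinder(HTMLParser):
    """Collects the hrefs of <a> tags that carry rel="author"."""
    def __init__(self):
        super().__init__()
        self.author_links = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        # rel can hold multiple space-separated tokens, so split it.
        if tag == "a" and "author" in attrs.get("rel", "").split():
            self.author_links.append(attrs.get("href"))

# Hypothetical article footer; in practice, feed in the fetched page HTML.
page = '<p>By Jane. <a rel="author" href="/authors/jane">About Jane</a></p>'

finder = AuthorLinkFinder()
finder.feed(page)
print(finder.author_links)  # ['/authors/jane']
```

Run it over a few article pages after the template change to confirm every one carries the link.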
-
That page has been on our site for 10 years, and as you can see, many, many people are ripping it off, along with all the other pages on our site.
Would you recommend an aggressive campaign of "cease and desist" emails to these plagiarizers, coupled with adding more unique content?
-
All of our top ranked articles are unique, although many people rip them off and it's hard as heck for us to track them all down and get them to remove our content.
I recently did this for our "Raised Garden Bed" Page, which contributes a huge amount of our traffic. It was very labour intensive.
Our blog has a lot of articles that other authors have written and are republished elsewhere.
-
Adam is asking the right question. On http://eartheasy.com/live_water_saving.htm, I grabbed some text "Many beautiful shrubs and plants thrive with far less watering than other species. Replace herbaceous perennial borders with native plants. Native plants will use less water and be more resistant to local plant diseases." That text appears on a bunch of sites. I tried this with several other phrases from different pages on your site, and almost every time several other sites shared identical text to yours.
Google is penalizing you because your content is identical to a bunch of other sites. The more unique, original content you have, the more you should see your rankings rise.
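One low-tech way to repeat that phrase test at scale is to pull a few distinctive sentences from each page and search for them as exact (quoted) queries. A rough Python sketch, assuming you already have the page text; the word-count thresholds are arbitrary choices, not anything Google publishes:

```python
def quoted_queries(page_text, min_words=8, max_phrases=3):
    """Pick a few longer sentences and wrap them in quotes for
    exact-match searches; longer sentences are more distinctive."""
    sentences = [s.strip() for s in page_text.split(".") if s.strip()]
    long_ones = [s for s in sentences if len(s.split()) >= min_words]
    return ['"%s"' % s for s in long_ones[:max_phrases]]

# The sample text quoted earlier in the thread.
text = ("Many beautiful shrubs and plants thrive with far less watering "
        "than other species. Replace herbaceous perennial borders with "
        "native plants. Native plants will use less water and be more "
        "resistant to local plant diseases.")

for q in quoted_queries(text):
    print(q)  # paste each query into Google to spot copies
```

Pasting each printed query into Google shows at a glance which passages have spread to other sites, so you can prioritize which pages to rewrite or chase down first.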
-
Are your articles unique or are they syndicated?