Identifying why my site has a penalty
-
Hi,
My site has been hit with a Google penalty of some sort, but it doesn't coincide with a Penguin or Panda update. I have attached a graph of my visits that demonstrates this.
I have been working on my SEO since the latter part of last year and have been seeing good results; then, all of a sudden, my search referrals dropped by 70%.
Can anyone advise on what it could be?
Thanks!
Will
-
Great. Just audit it, fix problems, audit again, write more great content, and give it time. Even if you fix the problem (assuming it was an on-site problem), it may take some time for Google to show the love again.
-
Oh okay! That makes sense. Found a few issues with my PHP rules that automatically write links on a few of my content pages.
I've learned some valuable tips here; you've been a fantastic help. I'm going to get the new site up in a week or two and we'll see if things change.
I'll keep you updated!
-
OK, so: if a bit of content resides at /bikes/mountain-bikes/ and the menu link I use is /bikes/mountain-bikes/, I'll get a status code 200. There is no added delay, no PageRank lost; 200 == OK. The menu link points directly to the destination content.
Now let's say you've decided to change the location of that content to /bikes/mountain-bikes/index.html.
You do the 301 redirect from the old URL to the new one, THEN you need to update your links to reflect the new location so you're not just pointing at 301 redirects.
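If it helps, here's a minimal PHP sketch of those two halves, using the example paths above (not your real URLs): the old location answers with a single 301, and the menu is edited to output the new URL directly.

```php
<?php
// Sketch only, using the example paths from above.
// 1) The OLD location (/bikes/mountain-bikes/) sends one permanent redirect
//    straight to the new URL -- no chains of hops.
header('Location: /bikes/mountain-bikes/index.html', true, 301);
exit;

// 2) Separately, wherever the menu is generated, the link itself gets updated
//    so visitors and crawlers never touch the redirect at all:
// echo '<a href="/bikes/mountain-bikes/index.html">Mountain Bikes</a>';
```

The redirect is there for anyone still holding the old URL (bookmarks, external links); your own internal links shouldn't rely on it.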
-
Thanks for the table of links. I'll see to it.
I'll work on the code in the new version of the site; it seems pointless to do it now.
I've installed the plugin. How do I change the status code of a page? I don't really understand how it can be anything but 200; if I'm viewing it, it's obviously there! I thought 301s pushed the user to the 200 version of the page and only existed temporarily in the browser? Obviously I'm wrong; perhaps you could explain it for me?
Cheers for the Screaming Frog tool, it looks great.
-
Did you change them? The scan I just did doesn't show them... Maybe your host was getting funky or something, lol.
Get this and click the links on your site. You want to link to status code 200, not 301:
https://chrome.google.com/webstore/detail/server-status-code-inspec/bmngiaijlojlejaiijgedgejgcdnjnpk
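If you'd rather check a batch of URLs in one go instead of clicking through them, here's a rough PHP sketch of the same idea (the URLs are placeholders, swap in your own):

```php
<?php
// Rough sketch: HEAD each URL and print the status code it answers with.
// Redirects are not followed, so 301s and 404s stand out immediately.
$urls = [
    'https://www.example.com/bikes/mountain-bikes/',   // placeholder URLs
    'https://www.example.com/about-us/',
];

foreach ($urls as $url) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_NOBODY, true);          // HEAD request, skip the body
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);  // don't echo the response
    curl_exec($ch);
    echo curl_getinfo($ch, CURLINFO_HTTP_CODE) . '  ' . $url . PHP_EOL;
    curl_close($ch);
}
```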
I wouldn't de-index them; I haven't found a legitimate reason to de-index anything since 2005, but I'm a programmer and normally don't need to patch things. You could probably quickly fix them just by adding some content/images.
I'm going to private-message you another spreadsheet. It should show you the source and destination of your 404s and 301s.
By the way, the spider I'm using is Screaming Frog; it's the best I've found.
-
Just checked the 418s and they do seem to already be redirected with 301s, or are actually in place. What would be the protocol here?
-
Got your message, thank you. What tool did you complete the crawl with? I'm sort of disappointed this stuff didn't come up in my SEOmoz weekly scans.
A few questions:
- How do I know where the 301s are being sent from? So in this chain of events...
Link on a page on my site > routed via a 301 > landing on the desired page
...how do I find the first step in the process? The table you sent me seems to point out only the middle step.
- Yes, the 'About Us' and 'Contact Us' pages are weak. I'm building a new version of the site as we speak and will take care of it then. In the meantime, if I no-index them, is that as good as getting rid of them?
I will now sort the 404s and 418s. Without wanting to sound like a broken record: thanks again! Do let me know if there is anything I can do in return once we've got to the bottom of this.
Will
-
Private-messaged you a Google Doc of the crawl. It looks like pages that no longer exist; they need 301s.
-
Wow, thanks for all this. It's late now in the UK so I'll check it out tomorrow.
Cheers
P.S. Where are my 418s coming from?!
-
My crawl finished. You also have a bunch of 418 "I'm a teapot" status codes. I didn't know what that was, so I looked it up.
Per Wikipedia:
418 I'm a teapot (RFC 2324): This code was defined in 1998 as one of the traditional IETF April Fools' jokes, in RFC 2324, Hyper Text Coffee Pot Control Protocol, and is not expected to be implemented by actual HTTP servers.
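If you want to see exactly what the server sends back for one of those URLs, a raw header dump is a quick way to look. A tiny sketch (the URL is a placeholder for one of your 418 pages):

```php
<?php
// Print every response header line for one of the URLs that returned 418,
// so you can see what is actually emitting it.
$headers = get_headers('https://www.example.com/some-418-page/');
foreach ($headers ?: [] as $line) {
    echo $line . PHP_EOL;
}
```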
-
You'd think so, but 1) we can't fully trust everything Google says, and 2) it could have been something that the algorithm progressively finds and penalizes.
It's possible that this is not related to links or content.
Take care of your RCS (real company shtuff) and make it awesome:
- About Us (under-construction content, not good)
- Contact Us (weak and thin; include social)
- FAQ
- Terms and Conditions (404 error on your site!). I once broke all my footer links on a blog that was getting 5k visits/day and it slammed me down to 600/day almost instantaneously. I've seen other sites with 404 errors survive, and even Cutts has downplayed the issue of 404s, but I believe any 404 can be indicative of a bad user experience. Scan your site for 404s and fix them all.
Also, many of your internal links appear to be pointing to 301 redirects. Update your links to point directly to the destination URL that returns a 200, not through a 301.
In just a quick overview, the above are my notes. This isn't a detailed audit, but you should scan your site for 404 errors and fix them, get your RCS in order, and conduct a full site review looking for anything that may be frowned upon by Google.
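To give a feel for what that kind of scan does (Screaming Frog handles it properly across the whole site), here's a very rough single-page sketch in PHP; the domain and start page are placeholders. It pulls the links off one page and flags anything that doesn't answer with a straight 200, which also shows you the source page for each bad link.

```php
<?php
// Sketch only: fetch one page, pull out its internal links, and report any
// that don't come back with a 200. Domain and start page are placeholders.
$page   = 'https://www.example.com/';
$domain = 'www.example.com';

$html = file_get_contents($page);
if ($html === false) {
    exit('Could not fetch ' . $page . PHP_EOL);
}

$doc = new DOMDocument();
@$doc->loadHTML($html);                         // @ silences warnings from messy markup

foreach ($doc->getElementsByTagName('a') as $a) {
    $href = $a->getAttribute('href');
    if ($href === '' || $href[0] === '#') {
        continue;                               // skip empty and fragment-only links
    }
    // Very naive URL resolution -- enough for a sketch, not a real crawler.
    $url = (strpos($href, 'http') === 0) ? $href : 'https://' . $domain . $href;
    if (strpos($url, $domain) === false) {
        continue;                               // only look at internal links
    }
    $headers = get_headers($url);               // first line is the status, e.g. "HTTP/1.1 301 Moved Permanently"
    $status  = $headers ? $headers[0] : 'no response';
    if (strpos($status, '200') === false) {
        echo $page . '  ->  ' . $url . '  ->  ' . $status . PHP_EOL;
    }
}
```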
-
Thanks devknob,
In answer to your questions:
- It is across all organic traffic and all keywords, to my entire site.
- The content on my site is fairly squeaky clean. I've been using the SEOmoz Pro tool to keep it in check. I use Yoast SEO for WordPress to handle my canonicals and employ no dodgy JS hiding techniques. I did not remove content.
- I haven't been buying links. I do have 20,000+ sitewide links coming from bikingbis.com and 12,000 sitewide links coming from citycyclingedinburgh.info/bbpress/. The ones from bikingbis have been removed, and I have requested removal of the others. Anchor text is varied and is mainly branded keywords.
My question, though, is this: if it's a bad backlink problem, wouldn't it coincide with a Panda or Penguin update?
Thanks again
Will
-
Check your analytics:
- Is it a specific group of keywords?
- Is it organic traffic at all?
- Is it traffic to a specific page or pages?
Check your website:
- Are your link canonicals set up CORRECTLY? (See the sketch after this list.)
- Do you have content that is hidden via CSS/JavaScript and has no mechanism for unhiding?
- Have you changed a lot of links recently and not performed 301 redirects?
- Do you have good content, title tags and meta descriptions?
- Did you remove content?
Check your links:
- Have you been buying links? Check your backlink profile using Open Site Explorer. Is there any unusual activity here?
- Is your anchor text varied?
Have you gotten a notice in Google Webmaster Tools?
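On the canonical point above, this is roughly all a correct setup boils down to on any given page. A PHP sketch with a placeholder URL, since I don't know your real ones:

```php
<?php
// Sketch: a self-referencing canonical in the page <head>. The href must be
// the exact URL you want indexed -- not a variant that itself 301s somewhere.
$canonical = 'https://www.example.com/bikes/mountain-bikes/';
echo '<link rel="canonical" href="' . htmlspecialchars($canonical) . '">' . PHP_EOL;
```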