Identifying why my site has a penalty
-
Hi,
My site has been hit with a Google penalty of some sort, but it doesn't coincide with a Penguin or Panda update. I have attached a graph of my visits that demonstrates this.
I have been working on my SEO since the latter part of last year and have been seeing good results; then, all of a sudden, my search referrals dropped by 70%.
Can anyone advise on what it could be?
Thanks!
Will
-
Great. Just audit it, fix problems, audit again, write more great content, and give it time. Even if you fix the problem (assuming it was an onsite problem), it may take some time for Google to show the love again.
-
Oh okay! That makes sense. I found a few issues with my PHP rules that automatically write links on a few of my content pages.
I've learned some valuable tips here; such fantastic help. I'm going to get the new site up in a week or two and we'll see if things change.
I'll keep you updated!
-
Ok so: if a bit of content resides at /bikes/mountain-bikes/ and the menu link I use is /bikes/mountain-bikes/, I'll get a status code 200. There is no added delay and no PageRank lost; 200 == OK. The menu link points directly to the destination content.
Now let's say you've decided to change the location of that content to /bikes/mountain-bikes/index.html.
You set up 301 redirects from the old URL to the new one, THEN you need to update your links to reflect the new location so you're not just pointing at 301 redirects.
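To illustrate the extra hop, here is a minimal sketch (not from the thread — the redirect map and helper function are hypothetical) of how a crawler or browser resolves a link through 301s:

```python
# Hypothetical redirect map: old URL -> new URL (what a 301 expresses).
REDIRECTS = {
    "/bikes/mountain-bikes/": "/bikes/mountain-bikes/index.html",
}

def resolve(url, redirects, max_hops=10):
    """Follow 301-style redirects until a URL answers 200 (no map entry)."""
    hops = 0
    while url in redirects:
        url = redirects[url]
        hops += 1
        if hops > max_hops:
            raise RuntimeError("possible redirect loop")
    return url, hops

# Old menu link: one extra 301 hop before the content is reached.
print(resolve("/bikes/mountain-bikes/", REDIRECTS))
# ('/bikes/mountain-bikes/index.html', 1)

# Updated menu link: straight to the 200 destination, zero hops.
print(resolve("/bikes/mountain-bikes/index.html", REDIRECTS))
# ('/bikes/mountain-bikes/index.html', 0)
```

Updating the menu link is what takes the hop count back to zero; the 301 stays in place only for old external links and bookmarks.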
-
Thanks for the table of links. I'll see to it.
I'll work on the code on the new version of the site, seems pointless to do it now.
I've installed the plugin. How do I change the status code of a page? I don't really understand how it can be anything but 200; if I'm viewing it, it's obviously there! I thought 301s pushed the user to the 200 version of the page and only existed temporarily in the browser? Obviously I'm wrong; perhaps you could explain it for me?
Cheers for the Screaming Frog tool, looks great.
-
Did you change them? The scan I just did doesn't show them... Maybe your host was getting funky or something, lol.
Get this and click the links on your site. You want to link to status code 200, not 301:
https://chrome.google.com/webstore/detail/server-status-code-inspec/bmngiaijlojlejaiijgedgejgcdnjnpk
I wouldn't de-index them; I haven't found a legitimate reason to de-index anything since 2005, but I'm a programmer and normally don't need to patch things. You could probably quickly fix them just by adding some content/images.
I'm going to private message you another spreadsheet. It should show you the source and destination of your 404s and 301s.
By the way, the spider I'm using is Screaming Frog; it's the best I've found.
-
Just checked the 418s and they do seem to be already redirected with 301s, or are actually in place. What would be the protocol here?
-
Got your message, thank you. What tool did you complete the crawl with? I'm sort of disappointed this stuff didn't come up in my SEOmoz weekly scans.
A few questions:
- How do I know where the 301s are being sent from? So in this chain of events...
Link on a page on my site > routed via a 301 > landing on the desired page
... how do I find the first step in the process? The table you sent me seems to point out only the middle step.
- Yes, the 'about us' and 'contact us' pages are weak. I'm building a new version of the site as we speak and will take care of it then. In the meantime, if I noindex them, is that as good as getting rid of them?
I will now sort the 404s and 418s. Without wanting to sound like a broken record: thanks again! Do let me know if there is anything I can do in return once we've got to the bottom of this.
Will
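The "first step" question above (link on a page > 301 > destination) is exactly what a crawler records: for every page it fetches, which URLs that page links out to. A minimal sketch of that bookkeeping, using made-up pages and a made-up redirect list (not the actual site's data):

```python
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Collect href values from <a> tags in one page's HTML."""
    def __init__(self):
        super().__init__()
        self.links = []
    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)

# Hypothetical crawl results: source URL -> HTML body fetched from it.
PAGES = {
    "/": '<a href="/bikes/mountain-bikes/">Bikes</a>',
    "/about/": '<a href="/contact/">Contact</a>',
}
# URLs the crawl found returning 301.
REDIRECTED = {"/bikes/mountain-bikes/"}

def links_pointing_at_redirects(pages, redirected):
    """Return (source page, link target) pairs where the target is a 301."""
    hits = []
    for source, html in pages.items():
        parser = LinkCollector()
        parser.feed(html)
        for target in parser.links:
            if target in redirected:
                hits.append((source, target))
    return hits

print(links_pointing_at_redirects(PAGES, REDIRECTED))
# [('/', '/bikes/mountain-bikes/')] -> the homepage is the "first step"
```

In Screaming Frog, this same source-to-target mapping is what the "Inlinks" view of a redirected URL shows.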
-
Private messaged you a Google Doc of the crawl. It looks like pages that no longer exist; they need 301s.
-
Wow, thanks for all this. It's late now in the UK so I'll check it out tomorrow.
Cheers
P.S. Where are my 418s coming from!?
-
My crawl finished. You also have a bunch of 418 "I'm a teapot" status codes. I didn't know what this was, so I looked it up.
Per Wikipedia:
418 I'm a teapot (RFC 2324): This code was defined in 1998 as one of the traditional IETF April Fools' jokes, in RFC 2324, Hyper Text Coffee Pot Control Protocol, and is not expected to be implemented by actual HTTP servers.
-
You'd think so, but 1) we can't fully trust everything Google says, and 2) it could have been something that the algorithm progressively finds and penalizes.
It's possible that this is not related to links or content.
Take care of your RCS and make it awesome (Real Company Shtuff):
About us (under-construction content, not good)
Contact us (weak and thin; include social links)
FAQ
Terms and Conditions (404 error on your site!). I once broke all my footer links on a blog that was getting 5k visits/day and it slammed me down to 600/day nearly instantaneously. I've seen other sites with 404 errors survive, and even Cutts has downplayed the issue of 404 errors, but I believe any 404 can be indicative of a bad user experience. Scan your site for 404s and fix them all.
Also, many of your internal links appear to be pointing at 301 redirects. Update your links to point directly at the destination (status 200), not through a 301.
Just a quick overview; the above are my notes. This isn't a detailed audit, but you should scan your site for 404 errors and fix them, get your RCS in order, and conduct a full site review looking for anything that may be frowned upon by Google.
-
Thanks devknob,
In answer to your questions:
-
It is across all organic traffic and all keywords, to my entire site.
-
The content on my site is fairly squeaky clean. I've been using the SEOmoz Pro tool to keep it in check. I use Yoast SEO for WordPress to handle my canonicals and employ no dodgy JS hiding techniques. I did not remove content.
-
I haven't been buying links. I do have 20,000+ sitewide links coming from bikingbis.com and 12,000 sitewide links coming from citycyclingedinburgh.info/bbpress/. The ones from bikingbis have been removed, and I have requested removal of the others. Anchor text is varied and is mainly branded keywords.
My question is, though: if it's a bad backlink problem, wouldn't it coincide with a Panda or Penguin update?
Thanks again
Will
-
Check your analytics
- Is it a specific group of keywords?
- Is it organic traffic at all?
- Is it traffic to a specific page or pages?
Check your website
- Are your link canonicals set up CORRECTLY?
- Do you have content that is hidden via CSS/JavaScript and has no mechanism for unhiding?
- Have you changed a lot of links recently and not performed 301 redirects?
- Do you have good content, title tags, and meta descriptions?
- Did you remove content?
Check your links
- Have you been buying links? Check your backlink profile using Open Site Explorer. Is there any unusual activity here?
- Is your anchor text varied?
Have you gotten a notice in Google Webmaster Tools?
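On the "changed links without 301 redirects" point: the usual fix is to build an old-to-new URL map and generate redirect rules from it. A hypothetical sketch that emits Apache mod_alias `Redirect 301` lines (the paths are made up; adapt the map and output format to your own server):

```python
# Hypothetical map of moved pages: old path -> new path.
MOVED = {
    "/old-bikes/": "/bikes/",
    "/contact.php": "/contact/",
}

def htaccess_rules(moved):
    """Emit one Apache mod_alias 'Redirect 301' line per moved path."""
    return [f"Redirect 301 {old} {new}" for old, new in moved.items()]

for rule in htaccess_rules(MOVED):
    print(rule)
# Redirect 301 /old-bikes/ /bikes/
# Redirect 301 /contact.php /contact/
```

The generated lines would go in the site's .htaccess or vhost config; remember to then update internal links so they point at the new paths directly rather than through the redirects.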