Original content, widely quoted - yet ignored by Google
-
Our website is https://greatfire.org. We are a non-profit working to bring transparency to online censorship in China. By helping us resolve this problem you are helping us in the cause of internet freedom.
If you search for "great firewall" or "great firewall of china", would you be interested in finding a database of which websites and searches are blocked by the Great Firewall of China? We have been running a non-profit project with this objective for almost a year and in doing so have built the largest and most up-to-date database of online censorship in China. Yet, to this day, you cannot find it on Google by searching for any relevant keywords.
A similar website, www.greatfirewallofchina.org, is listed as #3 when searching for "great firewall". Our website provides a more accurate testing tool, as well as historic data. Regardless of whether our service is better, we believe we should at least be included in the top 10.
We have been testing out an AdWords campaign to see whether our website is of interest to users searching with these keywords. For example, users searching for "great firewall of china" browse an average of 2.62 pages and spend 3:18 minutes on the website. This suggests to us that our website is of interest to users searching for these keywords.
Do you have any idea what problem could be grave enough to keep us out of even the top 100 for these keywords?
We recently posted this same question on Google Webmaster Central but did not get a satisfactory answer: http://www.google.com/support/forum/p/Webmasters/thread?tid=5c14a7e16c07cbb7&hl=en&fid=5c14a7e16c07cbb70004b5f1d985e70e
-
Thanks very much for your reply, Jerod!
Google Webmaster Tools is set up and working. Some info:
-
No detected malware
-
1 crawl error (I think this must have been temporary; it was only reported once, and the URL is not in robots.txt now):
- http://greatfire.org/url/190838
- URL restricted by robots.txt
- Dec 10, 2011
-
Pages crawled per day, average: 1102
-
Time spent downloading a page (in milliseconds), average: 2116
The robots.txt is mostly the standard one provided by Drupal. We've added "Disallow: /node/" because every URL worth indexing should have a more readable alias than that. We'll look more into whether this could be the cause.
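One quick way to sanity-check a rule like this is Python's built-in robotparser. A minimal sketch (the rules and URLs below are illustrative, not the site's full robots.txt) showing that "Disallow: /node/" blocks internal node paths but leaves aliased URLs crawlable:

```python
# Sketch: verify that a Drupal-style "Disallow: /node/" rule does not
# block the human-readable URL aliases. Rules here are a simplified
# stand-in for the real robots.txt.
from urllib import robotparser

rules = """\
User-agent: *
Disallow: /node/
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# Internal node paths are blocked for all crawlers...
print(rp.can_fetch("*", "https://greatfire.org/node/123"))    # False
# ...but the aliased URLs remain crawlable.
print(rp.can_fetch("*", "https://greatfire.org/url/190838"))  # True
```

Running this against the live file (RobotFileParser can also fetch it by URL via set_url/read) would quickly confirm whether any rule is accidentally catching URLs that should be indexed.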
Anything else that you notice?
-
-
Hi, GreatFire-
We had a very similar problem with one of the sites we manage, http://www.miwaterstewardship.org/. The website is pretty good and the domain has dozens of high-quality backlinks (including EDU and GOV links), but The Googles were being a real pain and not displaying the website in the SERPs no matter what we did.
Ultimately, we think we found the solution in robots.txt. The entire site had been disallowed for quite a long time (at the client's request) while it was being built and updated. Even after we modified the robots.txt file, made sure Webmaster Tools was up and running, pinged the site several times, and so on, it still wasn't appearing in the SERPs. After two months or more of researching, trying fixes, and working on the issue, the site finally started being displayed. The only thing we can figure is that Google was "angry" at us, so to speak, for leaving the site blocked for so long.
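For anyone hitting the same wall, the before/after looked roughly like this (a hypothetical reconstruction, not the client's actual file):

```
# During development (blocks the entire site from all crawlers):
User-agent: *
Disallow: /

# After launch (site open; only private paths blocked):
User-agent: *
Disallow: /admin/
```

The dangerous part is that a single "Disallow: /" is easy to forget, and recrawling after it is lifted can take Google a long time.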
No one at Google would come out and tell us that this was the case or even that it was a possibility. It's just our best guess at what happened.
I can see that greatfire.org also has a rather substantial robots.txt file in place. Everything in that file looks to be in order, but it might still be causing some trouble.
Is Webmaster Tools set up? Is the site being crawled and indexed properly?
You can read up on our conversation with SEOmoz users here if you're interested: http://www.seomoz.org/q/google-refuses-to-index-our-domain-any-suggestions
Good luck with this. I know how frustrating it can be!
Jerod
-
Hi GreatFire,
With regard to the homepage content - you really don't have much there for the search engines to get their teeth into. I would work on adding a few paragraphs of text explaining what your service does and what benefits it provides to your users.
I disagree that your blog should be viewed as only an extra to your website. It can be a great way to increase your keyword referral traffic, engage with your audience and get picked up by other sites.
Just because Wikipedia has already written about your topic doesn't mean you shouldn't cover the subject in more detail - otherwise no one would have anything to write about!
Since you have deep knowledge of the subject, are involved with it every day, and have a website dedicated to it, you are the perfect candidate to start producing better content and become the 'hub' for all things related to how China uses the internet.
Cheers
Andrew
-
Hi Andrew,
Thank you very much for your response. The two main differences you point out are very useful for us. We will keep working on links and social mentions.
One thing I am puzzled about, though, is the site being labeled as "not having a lot of content". I feel this misunderstands the purpose of the website. The blog is only an extra. What we provide is a means to test whether any URL is blocked in China, as well as its download speed. For each URL in our database, we provide a historical calendar view to help identify when a website was blocked or unblocked in the past.
So our website first and foremost offers a tool and a lot of non-text data. To me, expanding the text content, while I understand the reasoning, sounds like recommending that Google place a long description of what a search engine is on its front page.
If you want to read the history of the Great Firewall of China, you can do it on Wikipedia. I don't see why we should explain it, when they do it better. On the other hand, if you want to know if website X is blocked or not in China, Wikipedia is not practical since it's only manually updated. Our data offers the latest status at all times.
Do you see what I mean? It would be great to hear what you think about this.
-
Hi GreatFire,
Your competitor has a much stronger site in the following two main areas:
- More backlinks (resulting in a higher PR)
- More social mentions
Focus on building more backlinks by researching your competitor's domain with Open Site Explorer and MajesticSEO. Keep up your activity in your social circles, and also get going with Google+ if you haven't already.
You should also fix your title tag to include the target keyword at the start, not at the end. It would read something like 'Great Firewall of China - bringing transparency from greatfire.org'.
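A quick way to audit this across pages is to extract the title and check where the keyword sits. A small sketch using only the standard library (the helper names and sample HTML are my own, for illustration):

```python
# Sketch: extract a page's <title> and check whether it leads with the
# target keyword. TitleParser and the sample markup are hypothetical.
from html.parser import HTMLParser

class TitleParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self.in_title = True

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data

def title_leads_with(html, keyword):
    """True if the page title starts with the target keyword."""
    p = TitleParser()
    p.feed(html)
    return p.title.strip().lower().startswith(keyword.lower())

sample = "<html><head><title>Great Firewall of China - greatfire.org</title></head></html>"
print(title_leads_with(sample, "great firewall of china"))  # True
```

Fed the live homepage HTML instead of the sample string, this would flag any page whose title buries the keyword at the end.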
Looking through your site, you don't appear to have much content (this was also mentioned in your Google Support thread), so I would focus on building out the content on the homepage and further developing your blog. For example, your 'Wukan Blocked only on Weibo' blog post is not really long enough to generate much referral traffic. Larger authority articles of 1,000+ words with richer content (link references, pictures, Google+ author/social connections, etc.) will help you far more.
Conduct the same kind of keyword research for your blog posts as you did for your root domain. This will keep your website niche-focused and generating traffic for lots of similar 'china firewall' terms.
Hope that helps.
Cheers,
Andrew