Original content, widely quoted - yet ignored by Google
-
Our website is https://greatfire.org. We are a non-profit working to bring transparency to online censorship in China. By helping us resolve this problem you are helping us in the cause of internet freedom.
If you search for "great firewall" or "great firewall of china", would you be interested in finding a database of which websites and searches are blocked by this Great Firewall of China? We have been running a non-profit project with this objective for almost a year and in so doing have created the largest and most up-to-date database of online censorship in China. Yet, to date, you cannot find it on Google by searching for any relevant keywords.
A similar website, www.greatfirewallofchina.org, is listed as #3 when searching for "great firewall". Our website provides a more accurate testing tool, as well as historic data. Regardless of whether our service is better, we believe we should at least be included in the top 10.
We have been testing an AdWords campaign to see whether our website is of interest to users searching with these keywords. For example, users who search for "great firewall of china" end up browsing 2.62 pages and spending 3:18 minutes on the website, on average. This suggests to us that our website is of interest to users searching for these keywords.
Do you have any idea what problem could be serious enough to keep us out of even the top 100 for these keywords?
We have recently posted this same question on the Google Webmaster Central but did not get a satisfactory answer: http://www.google.com/support/forum/p/Webmasters/thread?tid=5c14a7e16c07cbb7&hl=en&fid=5c14a7e16c07cbb70004b5f1d985e70e
-
Thanks very much for your reply, Jerod!
Google Webmaster Tools is set up and working. Some info:
-
No detected malware
-
1 crawl error (I think this must have been temporary; it was only reported once, and this URL is not in the robots.txt now):
- http://greatfire.org/url/190838
- URL restricted by robots.txt
- Dec 10, 2011
-
Pages crawled per day, average: 1102
-
Time spent downloading a page (in milliseconds), average: 2116
The robots.txt is mostly the standard one provided by Drupal. We've added "Disallow: /node/" because every interesting URL should have a more descriptive alias than that. We'll look further into whether this could be the cause.
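For what it's worth, robots.txt rules are easy to sanity-check programmatically before waiting on a recrawl. A minimal sketch using Python's standard library, with illustrative rules and URLs (not the actual greatfire.org file):

```python
from urllib.robotparser import RobotFileParser

# Illustrative rules only: a Drupal-style default plus the added
# "Disallow: /node/" line, NOT the live greatfire.org robots.txt.
rules = """\
User-agent: *
Disallow: /admin/
Disallow: /node/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Aliased URLs stay crawlable; raw /node/ paths are blocked.
print(parser.can_fetch("Googlebot", "https://greatfire.org/url/190838"))  # True
print(parser.can_fetch("Googlebot", "https://greatfire.org/node/123"))    # False
```

Running each reported URL through a check like this shows quickly whether the "/node/" rule is catching more than intended.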
Anything else that you notice?
-
-
Hi, GreatFire-
We had a very similar problem with one of the sites we manage at http://www.miwaterstewardship.org/. The website is pretty good, the domain has dozens of super high-quality backlinks (including EDU and GOV links), but The Googles were being a real pain and not displaying the website in a SERP no matter what we did.
Ultimately, we think we found the solution in robots.txt. The entire site had been disallowed for quite a long time (at the client's request) while it was being built and updated. After we modified the robots.txt file, made sure Webmaster tools was up and running, pinged the site several times, etc. it was still being blocked in the SERPs. After two months or more of researching, trying fixes, and working on the issue, the site finally started being displayed. The only thing we can figure is that Google was "angry" (for all intents and purposes) at us for leaving the site blocked for so long.
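For anyone following along, the "under construction" state and the open state differ by a single character in robots.txt. These are two alternative files, not one (sketched from memory, not the actual miwaterstewardship.org file):

```
# While the site was being built: blocks all compliant crawlers entirely
User-agent: *
Disallow: /

# After launch: an empty Disallow matches nothing, so everything is crawlable
User-agent: *
Disallow:
```

It's an easy line to forget after launch, which is why it's worth checking first whenever a site simply won't appear in the index.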
No one at Google would come out and tell us that this was the case or even that it was a possibility. It's just our best guess at what happened.
I can see that greatfire.org also has a rather substantial robots.txt file in place. Everything in that file looks to be in order, but it might still be causing some trouble.
Is Webmaster Tools set up? Is the site being crawled and indexed properly?
You can read up on our conversation with SEOmoz users here if you're interested: http://www.seomoz.org/q/google-refuses-to-index-our-domain-any-suggestions
Good luck with this. I know how frustrating it can be!
Jerod
-
Hi GreatFire,
With regard to the homepage content - you really don't have much there for the search engines to get their teeth into. I would work on adding a few paragraphs of text explaining what your service does and what benefits it provides to your users.
I disagree that your blog should be viewed as only an extra to your website. It can be a great way to increase your keyword referral traffic, engage with your audience and get picked up by other sites.
Just because Wikipedia has already written about your topic doesn't mean you shouldn't cover the subject in more detail - otherwise no one would have anything to write about!
As you have knowledge of the subject, are involved with it every day, and have a website dedicated to it, you are the perfect candidate to start producing better content and to become the 'hub' for all things related to how China uses the internet.
Cheers
Andrew
-
Hi Andrew,
Thank you very much for your response. The two main differences you point out are very useful for us. We will keep working on links and social mentions.
One thing I am puzzled about, though, is the labeling of the site as "not having a lot of content". I feel this misunderstands the purpose of the website. The blog is only an extra. What we provide is a means to test whether any URL is blocked in China, as well as its download speed. For each URL in our database, we provide a historical calendar view to help identify when a website was blocked or unblocked in the past.
So our website first and foremost offers a tool and a lot of non-text data. To me, expanding the text content, while I understand the reasoning, sounds like recommending that Google place a long description of what a search engine is on its front page.
If you want to read the history of the Great Firewall of China, you can do it on Wikipedia. I don't see why we should explain it, when they do it better. On the other hand, if you want to know if website X is blocked or not in China, Wikipedia is not practical since it's only manually updated. Our data offers the latest status at all times.
Do you see what I mean? It would be great to hear what you think about this.
-
Hi GreatFire,
Your competitor has a much stronger site in the following two main areas:
- More backlinks (resulting in a higher PR)
- More social mentions
Focus on building more backlinks by researching your competitor's domain with Open Site Explorer and MajesticSEO. Keep up your activity in your social circles, and also get going with Google+ if you haven't already.
You should also fix your title tag to include the target keyword at the start - not at the end. So it would read something like 'Great firewall of china - bringing transparency from greatfire.org'
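To illustrate, the change might look something like this (the "before" title is assumed for the example, not copied from the live site):

```html
<!-- Before (assumed): target keyword at the end -->
<title>GreatFire.org - Bringing Transparency to the Great Firewall of China</title>

<!-- After: target keyword leads -->
<title>Great Firewall of China - Bringing Transparency from GreatFire.org</title>
```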
Looking through your site, you don't appear to have that much content (this was also mentioned in your Google Support thread), so I would focus on building out the content on the homepage and also further developing your blog. For example, your 'Wukan Blocked Only on Weibo' blog post is not really long enough to generate much referral traffic for you. Longer authority articles of 1,000+ words with richer content (link references, pictures, Google+ author/social connections) will help you far more.
Conduct the relevant keyword research for your blog posts in the same way you did with your root domain. This will keep your website niche focused and generating lots of similar 'china firewall' terms.
Hope that helps.
Cheers,
Andrew
Related Questions
-
What is the "Homepage" for an International Website With Multiple Languages?
BACKGROUND: We are developing a new multi-language website that is going to have:
1. Multiple directories for various languages: /en-us, /de, etc.
2. Hreflang tags
3. Universal footer links so the user can select their preferred language
4. Automatic JS detection of location on the homepage only, so that when the user lands on /, it redirects them to the correct localized version. Currently, the auto JS detection happens only on /, and on no other pages of the website. The user can also always choose to override the auto-detection on the homepage at any time, by using the language-selector links at the bottom.
QUESTION: Should we place a 301 on / pointing to /en-us? Someone recommended this to us, but my thinking is no: we do NOT want to 301 /. Instead, I feel we should allow Google access to /, because that is also the most authoritative page on the website and where all incoming links point. In most cases, users, journalists, and publications are just going to link to /, not dilly-dally around with a language directory. My hunch is to keep / as is, but also to work on helping Google understand the relationship between all of the different language-specific directories. I know that Google officially doesn't advocate meta refresh redirects, but this happens only on the homepage, and we likewise allow the user to override it at any time (and again, the universal footer links will point both search engines and users to all other locations). Thoughts? Thanks for any tips/feedback!
Intermediate & Advanced SEO | | mirabile
-
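For reference, a minimal hreflang block matching the setup described above might look like this (example.com and the exact directory set are placeholders; x-default tells engines which version to show users with no matching language, which fits keeping / accessible rather than 301ing it):

```html
<link rel="alternate" hreflang="en-us" href="https://example.com/en-us/" />
<link rel="alternate" hreflang="de" href="https://example.com/de/" />
<link rel="alternate" hreflang="x-default" href="https://example.com/" />
```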
"No Index, No Follow" or No Index, Follow" for URLs with Thin Content?
Greetings MOZ community: If I have a site with about 200 thin-content pages that I want Google to remove from its index, should I set them to "No Index, No Follow" or to "No Index, Follow"? My SEO firm has advised me to set them to "No Index, Follow", but in a recent MOZ help forum post someone suggested "No Index, No Follow". The MOZ poster said that telling Google the content should not be indexed but the links should be followed was inconsistent and could get me into trouble. That makes a lot of sense. What is proper form? As background, I think I have recently been hit with a Panda 4.0 penalty for thin content. I have several hundred URLs with fewer than 50 words and want them de-indexed. My site is a commercial real estate site and the listings apparently have too little content. Thanks, Alan
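The two variants under discussion differ only in the second directive of the robots meta tag:

```html
<!-- Advised by the SEO firm: de-index the page but let crawlers follow its links -->
<meta name="robots" content="noindex, follow">

<!-- Suggested on the forum: de-index the page and don't follow its links -->
<meta name="robots" content="noindex, nofollow">
```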
Intermediate & Advanced SEO | | Kingalan10 -
Fixed "lower-case/mixed-case" Internal Links causing duplicate- Now What?
Hi, So after a site re-launch, Moz crawled it and reported over 150 duplicate content errors. It was determined that the cause was inconsistent capitalization in internal links. Using Screaming Frog, I found all (500+) internal links and fixed them to match the actual URLs. Now the site is 100% consistent across the board, as best I can tell. I am unsure what to do next, though. We launched the site with all the internal link errors, and now many of the pages that are indexed and ranked are indexed under the incorrect URL form. Some have said to use a canonical tag. But how can I use a canonical tag on a page that doesn't even exist? Same thing with a 301: can I redirect /examplepage to /ExamplePage if only /ExamplePage actually exists? I would really appreciate some advice on what to do. After I fixed the internal links, I waited a week and Moz crawled the site again and reported all the same errors, and then even more - all capitalization. It seems like a mess. After I did another Screaming Frog crawl, it showed no duplicates, so I know I was successful in fixing the internal links. Help!!
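If the site runs on Apache, a 301 from the wrongly-cased URL to the real page is possible even though /examplepage never existed as a file, because redirects match the requested path rather than existing pages. A hedged sketch using the hypothetical paths from the question (RewriteRule patterns are case-sensitive by default, so this matches only the lowercase form):

```apache
# .htaccess: permanently redirect the wrongly-cased URL that got
# indexed to the correctly-cased page that actually exists
RewriteEngine On
RewriteRule ^examplepage$ /ExamplePage [R=301,L]
```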
Intermediate & Advanced SEO | | yogitrout10 -
"Original Content" Dynamic Hurting SEO? -- Strategies for Differentiating Template Websites for a Nationwide Local Business Segment?
The Problem: I have a stable of clients spread around the U.S. in the maid service/cleaning industry. Each client is a franchisee, but their business is truly 'local', with a local service area, local phone/address, unique business name, and virtually complete control over their web presence (URL, site design, content; apart from a few branding guidelines). Over time I've developed a website template with a high lead conversion rate, and I've rolled this website out to 3 or 4 dozen clients. Each client has exclusivity in their region/metro area. Lately my white-hat backlinking strategies have not been yielding the results they were one year ago, including legitimate directories, customer blogging (as compelling as maid service/cleaning blogs can really be!), and some article writing. This is expected, or at least reflected in articles on SEO trends and directory/article strategies. I am writing this question because I see sites with seemingly much weaker backlink profiles outranking my clients (using the SEOmoz toolbar and Site Explorer stats, and factoring in general quality vs. quantity dynamics).
Questions: Assuming general on-page optimization and linking factors are equal:
1. Might my clients be suffering because they're using my oft-repeated template website (albeit with some unique 'content' variables)?
2. If I choose to differentiate each client's website, how much differentiation makes sense? Specifically: even if primary content (copy, essentially) is differentiated, will Google still interpret the matching code structure as 'the same website'?
3. Are images as important as copy in differentiating content? From a 'machine' or algorithm perspective, I wonder whether strategies will be effective such as saving the images in a different format, altering them slightly in Photoshop, or using unique CSS selectors or slightly different table structures for each site (differentiating the code).
Considerations: My understanding of Google's "duplicate content" dynamics is that they mainly apply to de-duping search results at a query-specific level, and to choosing which result to show from a pool of duplicate results. My clients' search terms most often contain client-specific city and state names. Despite the "original content" mantra, I believe my clients, as local businesses who have opted to use a template website (an economical choice), still represent legitimate and relevant matches for their target user searches; it is in this spirit that I ask these questions, not to 'game' Google with malicious intent. In an ideal world my clients would all have their own unique website developed, but these are Main St. business owners balancing solutions with economics, and I'm trying to provide them with scalable solutions. Thank you! I am new to this community; thank you for any thoughts, discussion, and comments!
Intermediate & Advanced SEO | | localizedseo0 -
Best way to de-index content from Google and not Bing?
We have a large number of URLs that we would like to de-index from Google (we have been affected by Panda), but not from Bing. What is the best way to go about doing this?
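One engine-specific mechanism is a crawler-specific robots meta tag: Google honors directives addressed to "googlebot" by name, while Bing reads its own bot name or the generic "robots" tag. A sketch (note the page must remain crawlable, i.e. not blocked in robots.txt, or Googlebot will never see the tag):

```html
<!-- Googlebot is told not to index this page; Bing sees no restriction -->
<meta name="googlebot" content="noindex">
```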
Intermediate & Advanced SEO | | nicole.healthline0 -
Questions regarding Google's "improved url handling parameters"
Google recently posted about improved handling of URL parameters: http://googlewebmastercentral.blogspot.com/2011/07/improved-handling-of-urls-with.html I have a couple of questions: Is it better to canonicalize URLs or to use parameter handling? And will Google inform us if it finds a parameter issue, or should we prepare a list of parameters that need to be addressed?
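For comparison, a canonical tag lives in the page markup, while parameter handling is configured in Webmaster Tools. A hypothetical example for a parameterized URL (example.com and the parameter names are made up):

```html
<!-- Served on /results?sort=price&page=2, pointing engines at the clean URL -->
<link rel="canonical" href="https://example.com/results">
```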
Intermediate & Advanced SEO | | nicole.healthline0 -
Where does "Pages Similar" link text come from?
When I type in a competitor name (in this case "buycostumes"), Google shows several related websites in its "Pages Similar to..." section at the bottom of the page. My question: can anyone tell me where the text comes from that Google uses as the link? Our competitors have nice branded links and ours is just a keyword. I can find nothing on-page that Google is using, so it must be coming from somewhere off-page - but where?
Intermediate & Advanced SEO | | costume0 -
Google.ca vs Google.com Ranking
I have a site I would like to rank high for particular keywords in the Google.ca searches and don't particularly care about the Google.com searches (it's a Canadian service). I have logged into Google Webmaster Tools and targeted Canada. Currently my site is ranking on the third page for my desired keywords on Google.com, but is on the 20th page for Google.ca. Previously this change happened quite quickly -- within 4 weeks -- but it doesn't seem to be taking here (12 weeks out and counting). My optimization seems to be fine since I'm ranking well on Google.com: not sure why it's not translating to Google.ca. Any help or thoughts would be appreciated.
Intermediate & Advanced SEO | | seorm0