Should I noindex user-created fundraising pages?
-
Hello Moz community!
I work for a nonprofit where users can create their own fundraising pages on the website for supporters to donate to directly. Some are rarely used; others get updated frequently by the host. There are likely a ton of these on our site. Moz's crawl says we have ~54K pages, and when I do a "site:[url]" search on Google, 90% of the first 100 results are fundraising pages.
These pages are not controlled by our staff, but I'm wondering whether meta noindexing them could have a big effect on our SEO rankings. Has anyone tried anything similar, or do you know if this strategy could have legs for our site?
My only concern is whether users would still be able to find their fundraising pages through the Google CSE implemented on the website.
Any insight you fine folks could provide would be greatly appreciated!
-
I'd tread very carefully here, as thing #1 and thing #2 seem to contradict each other at face value. You're right that Google can send traffic to a site in ways other than keywords, but it's not the norm.
The next thing I'd look at is how you're tracking keyword rankings. Is it a cloud-based rank tracker that relies on you specifying all of (and all of the right) keywords to track? Most of those trackers follow between 50 and 300 keywords (daily or weekly), but it's not uncommon for such sites to have 10,000+ keywords contributing. If they're not all in there, you're looking at a bad sample.
Connect Google Search Console to Google Analytics, let it run for a few weeks, then analyse the 'search query' data from within Google Analytics (which can be done once it's all hooked up). GSC usually only lets you export 1K keywords (sometimes more), but GA will take 5K, which is much better for your analysis. You might be surprised to find those pages rank for more keywords than you thought: maybe hundreds of little ones instead of a few big ones.
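To make that "bad sample" point concrete, here's a rough Python sketch of the kind of head-vs-long-tail breakdown you might run on exported query data. The (query, clicks) pairs and the threshold are made up for illustration; in practice they'd come from your own GSC/GA export.

```python
def head_vs_long_tail(rows, head_threshold=100):
    """Split (query, clicks) pairs into a few 'head' terms and the long tail.

    rows: iterable of (query, clicks) tuples, e.g. parsed from a GSC export.
    Returns (head_clicks, tail_clicks, tail_query_count).
    """
    head_clicks = tail_clicks = tail_queries = 0
    for query, clicks in rows:
        if clicks >= head_threshold:
            head_clicks += clicks
        else:
            tail_clicks += clicks
            tail_queries += 1
    return head_clicks, tail_clicks, tail_queries

# Invented example: two big keywords plus two hundred small ones.
sample = [("big keyword", 500), ("other big keyword", 150)] + [
    (f"small keyword {i}", 3) for i in range(200)
]
head, tail, n_tail = head_vs_long_tail(sample)
print(head, tail, n_tail)  # 650 clicks from head terms, 600 from 200 tail terms
```

If the tail total rivals the head total, as here, a 50-keyword rank tracker is telling you very little about where your traffic actually comes from.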
-
Effectdigital is right in looking at your analytics and backlinks to help make this decision.
In the Moz case study we referenced earlier, they were getting rid of pages that provided no value to anyone. Those pages probably didn't have any links pointing to them, so it made sense to get rid of them.
Since your pages are providing value (it seems) and you're getting a third of your traffic coming into those pages, we would tread carefully on meta noindexing them.
You might consider meta noindexing only the group of them that hasn't brought in any traffic this whole year and that doesn't have any links pointing to it. That way, you won't lose any existing traffic that you're getting, but you can see if the trimming helps your site's overall traffic and rankings.
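For anyone following along, "meta noindexing" a page just means adding a robots meta tag to that page's `<head>`. A minimal sketch of the markup (the `noindex, follow` variant removes the page from the index while still letting crawlers follow its links):

```html
<!-- Placed in the <head> of each fundraising page you want out of the index. -->
<!-- "noindex, follow" removes the page from Google's index but still lets -->
<!-- crawlers follow its links; use "noindex, nofollow" to stop that too. -->
<meta name="robots" content="noindex, follow">
```

One caveat relevant to the original question: a Google CSE scoped to your own site typically draws on Google's index, so noindexed pages would likely drop out of those on-site search results as well. Worth testing on a small batch first.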
-
Appreciate the word of caution. I'm relatively new and am looking for well-rounded opinions about the repercussions such a massive move could have for our site. In response:
Thing #1: We don't have many fundraising pages that rank highly for keywords, as we're still working on strengthening our regular site pages to improve our SERP results. I was mainly wondering whether the glut of fundraising pages could be harming those results. Some certainly have duplicate content, but that's beyond our control, and I'm not sure whether it could be significantly hurting us. Any thoughts on that?
Thing #2: Great call on checking the data. Year to date, nearly a third of our user sessions have landed on one of these fundraising pages. I'm guessing that's likely either hosts using Google to find their page and then logging in, or friends searching for a page on Google and then navigating to it and donating. We do still have a Google Custom Search Engine on our site; presumably people could find them that way?
If you have any additional opinions or feedback given what I detailed above, I'd very much appreciate it!
-
Be VERY careful
Thing #1) Just because you stop Google from indexing and crawling some pages doesn't mean it will give that same traffic (the keywords sending visitors to those pages) to other URLs on your site. Google may decide that your other URLs don't satisfy the specific queries connecting with the fundraiser URLs.
Thing #2) CHECK. Go into Google Analytics and actually check what percentage of your Google traffic (and overall traffic, I guess) comes specifically through these URLs. If it's 2-3%, no big deal. If most of your traffic lands on these pages, noindexing them all could be the single biggest mistake you ever make.
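That percentage check can be roughed out quickly once you export landing-page data. A minimal sketch, with an invented URL prefix and made-up session counts; substitute whatever path pattern your fundraising pages actually use:

```python
def fundraiser_traffic_share(landing_pages, prefix="/fundraising/"):
    """Return the fraction of sessions landing on fundraiser URLs.

    landing_pages: iterable of (path, sessions) pairs, e.g. from a
    Google Analytics landing-page export. The URL prefix here is a
    guess -- use your site's actual fundraising-page pattern.
    """
    total = fundraiser = 0
    for path, sessions in landing_pages:
        total += sessions
        if path.startswith(prefix):
            fundraiser += sessions
    return fundraiser / total if total else 0.0

# Invented example: 300 of 1,000 sessions land on fundraiser pages.
data = [("/", 500), ("/about", 200), ("/fundraising/jane", 180), ("/fundraising/bob", 120)]
print(round(fundraiser_traffic_share(data), 2))  # 0.3
```

If that number comes back anywhere near the ~33% the asker later reports, blanket noindexing is clearly risky.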
Blog posts and articles are fun, but they're no substitute for checking your own real, actual, factual data. Always, always do that.
-
Thanks! I've been wondering about it for a while and actually stumbled upon this very article today, which prompted the question.
-
Britney Muller, at Moz, did just that when she meta noindexed over 70,000 low-quality profile pages created by users. As a result, Moz saw an almost 9% increase in organic users the following month, and a 13.7% year-over-year lift in organic traffic.
You can read all about it or watch the interview about it here: https://www.getcredo.com/britney-muller/
We think it's worth a try for sure.
Related Questions
-
Block Moz (or any other robot) from crawling pages with specific URLs
Hello! Moz reports that my site has around 380 duplicate page content issues. Most of them come from dynamically generated URLs that have some specific parameters. I have sorted this out for Google in Webmaster Tools (the new Google Search Console) by blocking the pages with these parameters. However, Moz is still reporting the same number of duplicate content pages and, to stop it, I know I must use robots.txt. The trick is that I don't want to block every page, just the pages with specific parameters. I want to do this because among these 380 pages there are some other pages with no parameters (or different parameters) that I need to take care of. Basically, I need to clean this list to be able to use the feature properly in the future. I have read through the Moz forums and found a few topics related to this, but there is no clear answer on how to block only pages with specific URLs. Therefore, I have done my research and come up with these lines for robots.txt:

User-agent: dotbot
Disallow: /*numberOfStars=0

User-agent: rogerbot
Disallow: /*numberOfStars=0

My questions: 1. Are the above lines correct, and would they block Moz (dotbot and rogerbot) from crawling only pages that have the numberOfStars=0 parameter in their URLs, leaving other pages intact? 2. Do I need to have an empty line between the two groups (I mean between "Disallow: /*numberOfStars=0" and "User-agent: rogerbot"), or does it even matter? I think this would help many people, as there is no clear answer on how to block crawling only pages with specific URLs. Moreover, this should be valid for any robot out there. Thank you for your help!
Moz Pro | | Blacktie0
Why does one page rank while a similar page doesn't?
We have a blog post (actually several of them that have the same SEO characteristics) that brings a fair amount of traffic to our site (relatively speaking), and according to Google Webmaster Tools it averages in the top 10 in SERPs for various terms. This page has no external links to it, and very few internal links pointing to it. When I run Moz's On-Page Grader for the various keywords it ranks for, the page gets an F on all of them. It was not optimized for any keyword, and it isn't our best content; it was a blog post written a few years ago, forgotten about, and never promoted in any way. The topic does happen to be about something that people search for frequently. According to the Keyword Difficulty tool, all of the keywords it ranks for have 40-45% difficulty. We have lots of other pages on our site that we have tried to optimize and that get A's and B's in On-Page Grader, that have both internal (from the home page and main menu) and external links pointing at them, etc., but they don't rank well at all. Keyword difficulty for these keywords is in the same range, from 37-53%. Why does this one page rank so well when the other pages don't? Additionally, we have been looking at a competitor who has a page that ranks #1 in universal results for numerous keywords according to SEMrush, yet the page gets an F in On-Page Grader for those keywords. The page has 3 links to it, all from the same domain, and it has very low domain and page authority. The Domain Authority of this page is 47 and the Page Authority is 33 according to Open Site Explorer (compared with our DA of 30 and PA of 1), and its social metrics are a bit higher than ours, but neither has a lot (they may have 15 likes to our 10). Why does this page rank so well for them? How can we get our Page Authority higher? Thanks for any and all help.
Moz Pro | | mukunig1 -
Redirected pages still sending response code 200
The Moz tool reports missing title tags on all the links that have been redirected. E.g. this page: http://www.imoney.my/ms/personal-loan When I check the response code on the page with a redirect checker, it shows code 200 (page exists). Has this happened to anyone else? How can a redirected page send a 200 code?
Moz Pro | | imoney0 -
Benefits of reducing on page links
This is more of a discussion point. What would be the measurable results of reducing the number of on-page links, specifically on a home page? Let's assume that, by way of a large navigation menu, most of the pages have a lot of links. For instance, would any of the stats on the MozBar be affected (let's start with the home page)? Would the Page Authority or MozRank change at all, perhaps because there is less "juice" flowing out of the home page? Thanks! 🙂
Moz Pro | | ntcma0 -
On Page Ranking Tool Giving Weird Reports
My on-page ranking tool is giving two entirely different reports for my website: I get one report for my domain name and a different report for index.html, even though both of those pages are... well... the same. Not sure what's going on; hoping it's not an indication of a more serious issue. I appreciate any help!
Moz Pro | | Virage0 -
What the . . ! Duplicate Pages and Titles WAY up?
My duplicate pages went up by 50-plus in the past week, and my duplicate page titles went up by more than 100. We recently redesigned the website, but it has been up for several weeks now. The only change I made last week or late the week before was setting up my 301 redirects so the www version and the non-www version point to the same place (as well as a couple of other sites that point to it). I'm sure this is not enough info to figure out what went wrong... I'd love some help in figuring this out, though.
Moz Pro | | damon12120 -
Help with On-page Optimization in Campaign Manager
Hi, I don't know if I am incredibly stupid or am just missing something with the on-page optimization report page. I have 11 key phrases set up covering five or so pages. When I grade each key phrase against the page I have optimized for it using the on-page keyword optimization (Term Target) tool, they score A's and B's. The issue I have is that the campaign manager's on-page optimization report only appears to check all the key phrases against the home page, as it shows several F reports. How do I set it up so it grades each key phrase against the correct site page? Thanks, Fraser
Moz Pro | | fraserhannah1