Should I noindex user-created fundraising pages?
-
Hello Moz community!
I work for a nonprofit where users can create their own fundraising pages on our website for supporters to donate to directly. Some are rarely used; others get updated frequently by their hosts. There are likely a ton of these on our site: our Moz crawl says we have ~54K pages, and when I do a "site:[url]" search on Google, 90% of the first 100 results are fundraising pages.
These pages aren't controlled by our staff, but I'm wondering whether meta noindexing them could have a big effect on our SEO rankings. Has anyone tried anything similar, or does anyone know whether this strategy could have legs for our site?
My only concern is that users might no longer be able to find their fundraising pages through the Google Custom Search Engine (CSE) we've implemented on the website.
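For context, "meta noindexing" here just means adding a robots meta tag with a "noindex" value to the page head (or sending an equivalent X-Robots-Tag response header). Below is a rough Python sketch of how one could spot-check whether any of these pages already carry a noindex directive; the URLs are placeholders, and requests plus beautifulsoup4 are assumed to be installed.

```python
# Spot-check whether a sample of fundraising pages already carry a noindex
# directive, either in a robots meta tag or an X-Robots-Tag response header.
# The URLs below are placeholders; requests and beautifulsoup4 are assumed.
import requests
from bs4 import BeautifulSoup

SAMPLE_URLS = [
    "https://www.example.org/fundraise/page-1",
    "https://www.example.org/fundraise/page-2",
]

for url in SAMPLE_URLS:
    resp = requests.get(url, timeout=10)
    header_directive = resp.headers.get("X-Robots-Tag", "")
    meta = BeautifulSoup(resp.text, "html.parser").find("meta", attrs={"name": "robots"})
    meta_directive = meta.get("content", "") if meta else ""
    noindexed = "noindex" in (header_directive + " " + meta_directive).lower()
    print(f"{url}: noindexed={noindexed} (header='{header_directive}', meta='{meta_directive}')")
```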
Any insight you fine folks could provide would be greatly appreciated!
-
I'd tread very carefully here, as thing 1 and thing 2 seem to contradict each other at face value. You're right that Google can send traffic to a site in ways other than keywords, but it's not the norm.
The next thing I'd look at is how you're tracking keyword rankings. Is it an online, cloud-based rank tracker that relies on you specifying all of (and all of the right) keywords to track? Most of those trackers follow between 50 and 300 keywords (daily or weekly), but it's not uncommon for sites like this to have 10,000+ keywords contributing traffic. If they're not all in the tracker, you're looking at a bad sample.
Connect Google Search Console to Google Analytics, let it run for a few weeks, then analyse the 'search query' data from within Google Analytics (which you can do once it's all hooked up). GSC usually only lets you export 1k keywords (sometimes more), but GA will take 5k, and that's much better for your analysis. You might be surprised to find that those pages rank for more keywords than you thought: maybe hundreds of little ones instead of a few big ones.
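And if you ever want to go past those export limits entirely, the Search Console API will hand back the full page/query data set. A minimal Python sketch, assuming you've already set up API credentials for the property (the site URL and the "/fundraise/" URL pattern are placeholders):

```python
# Pull page + query data straight from the Search Console API, paging past
# the UI/export limits, then count how many distinct queries each fundraising
# page ranks for. SITE_URL and the "/fundraise/" pattern are placeholders.
from googleapiclient.discovery import build

SITE_URL = "https://www.example.org/"  # your verified GSC property

def fundraiser_query_counts(creds, start_date, end_date):
    service = build("searchconsole", "v1", credentials=creds)
    rows, start_row = [], 0
    while True:
        body = {
            "startDate": start_date,          # e.g. "2019-01-01"
            "endDate": end_date,              # e.g. "2019-03-31"
            "dimensions": ["page", "query"],
            "rowLimit": 25000,                # API maximum per request
            "startRow": start_row,            # paginate through the full set
        }
        resp = service.searchanalytics().query(siteUrl=SITE_URL, body=body).execute()
        batch = resp.get("rows", [])
        if not batch:
            break
        rows.extend(batch)
        start_row += len(batch)

    queries_per_page = {}
    for row in rows:
        page, query = row["keys"]             # keys follow the dimensions order
        if "/fundraise/" in page:
            queries_per_page.setdefault(page, set()).add(query)
    return {page: len(qs) for page, qs in queries_per_page.items()}
```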
-
Effectdigital is right in pointing you to your analytics and backlinks to help make this decision.
In the Moz case study we referenced earlier, they were removing pages that didn't provide value to anyone. Those pages probably didn't have any links pointing to them either, so it made sense to get rid of them.
Since your pages do seem to be providing value and a third of your traffic is landing on them, we would tread carefully before meta noindexing them.
You might consider meta noindexing only a group of them that haven't brought in any traffic this whole year and that don't have any links pointing to them. That way you won't lose any existing traffic you're getting, but you can see whether the trimming helps your site's overall traffic and rankings.
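If it helps, here's a rough sketch of that triage in Python/pandas. It assumes three CSV exports (a full list of fundraising URLs, a year-to-date GA landing-page report, and a link report from something like Link Explorer); the filenames and column names are placeholders you'd need to adjust.

```python
# Rough triage of noindex candidates: fundraising pages with zero organic
# sessions year-to-date AND no linking root domains. Filenames and column
# names are placeholders for typical GA / link-tool exports.
import pandas as pd

all_pages = pd.read_csv("fundraising_urls.csv")       # column: url
ga = pd.read_csv("ga_landing_pages_ytd.csv")          # columns: landing_page, sessions
links = pd.read_csv("link_explorer_pages.csv")        # columns: url, linking_root_domains

merged = (
    all_pages
    .merge(ga, left_on="url", right_on="landing_page", how="left")
    .merge(links, on="url", how="left")
)
merged["sessions"] = merged["sessions"].fillna(0)
merged["linking_root_domains"] = merged["linking_root_domains"].fillna(0)

candidates = merged[(merged["sessions"] == 0) & (merged["linking_root_domains"] == 0)]
candidates.to_csv("noindex_candidates.csv", index=False)
print(f"{len(candidates)} of {len(merged)} fundraising pages have no YTD traffic and no links")
```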
-
Appreciate the word of caution. I'm relatively new and am looking for well-rounded opinions about the repercussions such a massive move could have for our site. In response:
Thing #1: We don't have many fundraising pages that rank highly for keywords, as we're still working on strengthening our regular site pages to improve our SERP results. I was mainly wondering whether the glut of fundraising pages could be harming those results. Some certainly have duplicate content, but that's beyond our control, and I'm not sure whether it could be significantly harming our rankings. Any thoughts on that?
Thing #2: Great call on checking the data. Year to date, nearly a third of our user sessions have landed on one of these fundraising pages. I'm guessing that's either hosts using Google to find their page and then logging in, or friends searching for it on Google and then navigating to it and donating. We do still have a Google Custom Search Engine on our site; presumably people could find the pages that way?
If you have any additional opinions or feedback given what I detailed above, I'd very much appreciate it!
-
Be VERY careful
Thing #1) Just because you stop Google from indexing some pages, that doesn't mean it will hand that same traffic (the keywords currently leading to those pages) to other URLs on your site. Google may decide that your other URLs don't satisfy the specific keywords connected with the fundraiser URLs.
Thing #2) CHECK. Go into Google Analytics and actually check what percentage of your Google traffic (and overall traffic, I guess) comes specifically through these URLs. If it's 2-3%, no big deal. If most of your traffic lands on these pages, noindexing them all could be the single biggest mistake you'll ever make.
Blog posts and articles are fun, but they're no substitute for checking your own real, actual, factual data. Always, always do that.
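One rough way to run that check, assuming you export the organic landing-page report from Google Analytics as a CSV (the filename, column names, and "/fundraise/" pattern below are placeholders):

```python
# Work out what share of organic sessions land on fundraising pages, from a
# GA landing-page report filtered to organic traffic and exported as CSV.
# Filename, column names, and the "/fundraise/" pattern are placeholders.
import pandas as pd

df = pd.read_csv("ga_organic_landing_pages.csv")       # columns: landing_page, sessions
is_fundraiser = df["landing_page"].str.contains("/fundraise/", na=False)

fundraiser_sessions = df.loc[is_fundraiser, "sessions"].sum()
total_sessions = df["sessions"].sum()

print(f"Fundraising pages: {fundraiser_sessions / total_sessions:.1%} of organic landing sessions")
```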
-
Thanks! I've been wondering about this for a while and actually stumbled upon this very article today, which is what prompted the question.
-
Britney Muller at Moz did just that when she meta noindexed over 70,000 low-quality profile pages created by users. As a result, Moz saw an increase of almost 9% in organic users the following month, followed by a 13.7% year-over-year lift in organic traffic.
You can read all about it or watch the interview here: https://www.getcredo.com/britney-muller/
We think it's worth a try for sure.