Should I noindex user-created fundraising pages?
-
Hello Moz community!
I work for a nonprofit where users are able to create their own fundraising pages on the website for supporters to directly donate. Some of them are rarely used, others get updated frequently by the host. There are likely a ton of these on our site. Moz crawl says we have ~54K pages, and when I do a "site:[url]" search on Google, 90% of the first 100 results are fundraising pages.
These are not controlled by our staff members, but I'm wondering if meta noindexing these pages could have a big effect on our SEO rankings. Has anyone tried anything similar or know if this strategy could have legs for our site?
My only concern is that users might no longer be able to find their fundraising pages through the Google CSE implemented on our website.
Any insight you fine folks could provide would be greatly appreciated!
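For context, the "meta noindex" in question is a robots meta tag placed in each page's head. A minimal sketch of what would go on every fundraising page if you went this route (keeping follow so link equity still flows through the page's links):

```html
<!-- On each fundraising page you want dropped from Google's index -->
<meta name="robots" content="noindex, follow">
```

Note that Googlebot has to be able to crawl the page to see this tag, so these URLs must not also be blocked in robots.txt.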
-
I'd tread very carefully here, as thing #1 and thing #2 seem to contradict each other at face value. You're right that Google can send traffic to a site in ways other than keywords, but it's not the norm. The next thing I'd look at is how you're tracking keyword rankings. Is it an online, cloud-based rank tracker that relies on you specifying all of (and all of the right) keywords to track? Most of those trackers follow between 50 and 300 keywords (daily or weekly), but it's not uncommon for sites like this to have 10,000+ keywords contributing traffic. If they're not all in the tracker, you're looking at a bad sample. Connect Google Search Console to Google Analytics, let it run for a few weeks, then analyse the 'search query' data from within Google Analytics (which can be done once it's all hooked up). GSC usually only lets you export 1K keywords (sometimes it can be more), but GA will take 5K, and that's much better for your analysis. You might be surprised to find that those pages rank for more keywords than you thought: maybe hundreds of little ones instead of a few big ones.
-
Effectdigital is right that looking at your analytics and backlinks will help you make this decision.
In the Moz case study we referenced earlier, they were getting rid of pages that didn't provide value to anyone. Those pages probably didn't have any links pointing to them either, so it made sense to get rid of them.
Since your pages are providing value (it seems) and you're getting 1/3 of your traffic coming into those pages, we would tread carefully on meta noindexing them.
You might consider meta noindexing only the group of them that hasn't brought in any traffic this whole year and that has no links pointing to it. That way, you won't lose any existing traffic that you're getting, but you can see whether the trimming helps your site's overall traffic and rankings.
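To make that triage concrete, here's a minimal sketch in Python. The CSV layout and column names (`sessions`, `linking_domains`) are made up for illustration; substitute whatever your analytics and link-data exports actually contain:

```python
import csv
import io

def noindex_candidates(rows):
    """Return URLs with zero organic sessions and zero linking domains.

    Pages that still earn traffic or links stay indexed.
    """
    return [
        row["url"]
        for row in rows
        if int(row["sessions"]) == 0 and int(row["linking_domains"]) == 0
    ]

# Inline sample standing in for a real analytics + link-metrics export
sample = io.StringIO(
    "url,sessions,linking_domains\n"
    "/fundraise/alice,142,3\n"
    "/fundraise/bob,0,0\n"
    "/fundraise/carol,0,1\n"
)
print(noindex_candidates(csv.DictReader(sample)))  # only /fundraise/bob qualifies
```

Anything this returns has earned no traffic and no links, so noindexing it is comparatively low-risk; everything else stays indexed.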
-
Appreciate the word of caution. I'm relatively new and am looking for well-rounded opinions about the repercussions such a massive move could have for our site. In response:
Thing #1: We don't have many fundraising pages that rank highly for keywords, as we're still working on improving our regular site pages' performance in the SERPs. I was mainly wondering whether the glut of fundraising pages could be harming our SERP results. Some certainly have duplicate content, but that's beyond our control, and I'm not sure whether it could be significantly harming our results. Any thoughts on that?
Thing #2: Great call on checking the data. YTD, nearly 1/3 of our user sessions have landed on one of these fundraising pages. I'm guessing that's likely either hosts using Google to find their page and then logging in, or friends searching for it on Google and then navigating there and donating. We do still have a Google Custom Search Engine on our site, so presumably people could find the pages that way?
If you have any additional opinions or feedback given what I detailed above, I'd very much appreciate it!
-
Be VERY careful
Thing #1) Just because you stop Google indexing and crawling some pages, that doesn't mean Google will send the same traffic (the keywords connecting to those pages) to other URLs on your site. It may decide that your other URLs don't satisfy the specific keywords that connected with the fundraiser URLs.
Thing #2) CHECK. Go into Google Analytics and actually check what percentage of your Google traffic (and overall traffic, I guess) comes specifically through these URLs. If it's like 2-3%, no big deal. If most of your traffic comes to and lands on these pages, noindexing them all could be the single largest mistake you'll ever make.
Blog posts and articles are fun, but they're no substitute for checking your own real, actual, factual data. Always, always do that.
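That check can be as simple as a few lines against a landing-page export. This sketch assumes, hypothetically, that fundraising pages share a /fundraise/ URL prefix; swap in however your site actually identifies them:

```python
def fundraising_share(landing_pages, prefix="/fundraise/"):
    """Fraction of total sessions that land on fundraising URLs."""
    total = sum(sessions for _, sessions in landing_pages)
    fundraising = sum(
        sessions for path, sessions in landing_pages if path.startswith(prefix)
    )
    return fundraising / total if total else 0.0

# (landing path, sessions) pairs standing in for a Google Analytics export
sample = [("/", 5000), ("/about", 800),
          ("/fundraise/alice", 1900), ("/fundraise/bob", 1300)]
print(f"{fundraising_share(sample):.0%} of sessions land on fundraising pages")
# → 36% of sessions land on fundraising pages
```

If that number comes back anywhere near the ~1/3 the original poster reported, blanket noindexing is the wrong move.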
-
Thanks! I've been wondering about this for a while and actually stumbled upon this very article today, which prompted the question.
-
Britney Muller, with Moz, did just that when she meta noindexed over 70,000 low-quality, user-created profile pages. As a result, Moz saw an increase in organic users of almost 9% the following month, followed by a 13.7% year-over-year lift in organic traffic.
You can read all about it or watch the interview about it here: https://www.getcredo.com/britney-muller/
We think it's worth a try for sure.
Related Questions
-
API for On Page tool
I'm looking for a tool similar to On-Page Grader (Moz) or Focus Keyword (Yoast) with an API. We are building out our internal CRM system. Even though none of these tools can replace manual on-page analysis, it will be used as a metric and to catch human mistakes.
-
Filter Pages
Howdy Moz Forum! I have a headache of a job over here in the UK and I'd welcome any advice - it's sunny today, only 1 of 5 days in a year, and I'm stuck on this! I have a client that currently has 22,000 pages indexed by Google, with almost 4,000 showing as duplicate content. The site has a "jobs" and "candidates" list. This can cause all sorts of variations, such as job title, language, location etc. The filter pages all seem to be indexed, plus the static pages are indexed. For example, if there were 100 jobs at Moz being advertised, it displays the jobs on the following URL structure:
/moz
/moz/moz-jobs
/moz/moz-jobs/page/2
/moz/moz-jobs/page/3
/moz/moz-jobs/page/4
/moz/moz-jobs/page/5
...and so on - imagine this with some going up to /page/250. I have checked GA data and can see that although there are tons of pages indexed this way, none of them past the /moz/moz-jobs URL get any sort of organic traffic. So, my first question: should I use rel=canonical tags on all the /page/2, /page/3 etc. results and point them all at the /moz/moz-jobs parent page? The reason for this is that these pages have the same title and content and fall very close to duplicate content, even though they do pull in different jobs. I hope I'm making sense? There are also a lot of pages indexed in a way such as https://www.examplesite.co.uk/moz-jobs/search/page/9/?candidate_search_type=seo-consulant&candidate_search_language=blank-language - these are filter pages and, as far as I'm concerned, shouldn't really be indexed. Second question: should I "nofollow" everything after /page in this instance, to keep things tidy? I don't want all the variations indexed! Any help or general thoughts would be much appreciated! Thanks.
-
Block Moz (or any other robot) from crawling pages with specific URLs
Hello! Moz reports that my site has around 380 duplicate page content issues. Most of them come from dynamically generated URLs that have specific parameters. I have sorted this out for Google in Webmaster Tools (the new Google Search Console) by blocking the pages with these parameters. However, Moz is still reporting the same number of duplicate content pages and, to stop it, I know I must use robots.txt. The trick is that I don't want to block every page, just the pages with specific parameters. I want to do this because among these 380 pages there are some other pages with no parameters (or different parameters) that I need to take care of. Basically, I need to clean this list to be able to use the feature properly in the future. I have read through the Moz forums and found a few topics related to this, but there is no clear answer on how to block only pages with specific URLs. Therefore, I have done my research and come up with these lines for robots.txt:
User-agent: dotbot
Disallow: /*numberOfStars=0
User-agent: rogerbot
Disallow: /*numberOfStars=0
My questions: 1. Are the above lines correct, and would they block Moz (dotbot and rogerbot) from crawling only pages that have the numberOfStars=0 parameter in their URLs, leaving other pages intact? 2. Do I need an empty line between the two groups (between "Disallow: /*numberOfStars=0" and "User-agent: rogerbot"), or does it even matter? I think this would help many people, as there is no clear answer on how to block crawling of only pages with specific URLs. Moreover, this should be valid for any robot out there. Thank you for your help!
-
2 canonical links on 1 page, 1 for print version
Our developer has added a 2nd canonical link for the "print" version of our page. I read on another post that this appears not to be the correct way to do this. Is there a better way? Here is an example of the code:
-
Can't figure out why some of my pages are duplicate content
Within the crawl diagnostics area I'm getting duplicate page content issues on several pages. I don't know why; would anyone be able to tell me how these links are duplicates so I can fix them? http://www.sagenews.ca/Column.asp?id=3010 http://www.sagenews.ca/Column.asp?id=2808 http://www.sagenews.ca/Column.asp?id=2998 http://www.sagenews.ca/Column.asp?id=2837 http://www.sagenews.ca/Column.asp?id=2981
-
Why is my Moz report only crawling 1 page?
I just got this week's Moz report and it states that it has only crawled: Pages Crawled: 1 | Limit: 10,000. It was over 1,000 a couple of weeks ago. We have moved servers recently, but is there anything I have done wrong here? indigocarhire.co.uk Thanks
-
Where can I find the page strength tool?
I have an SEOmoz account and can't seem to find the page strength tool. Is it somewhere in research tools?
-
Keywords in front of the title element and on page keyword optimization
After running one of my landing pages through the SEOmoz on-page keyword optimization tool, I see that my keyword phrase starts 13 characters from the front of the title. My page is already receiving an A letter grade. The 13 characters in front of my keyword phrase are not vital, but they are related to the keyword phrase. Is it that important that I drop the related words in order to get my keyword phrase to the front of the title?