Noindex user profile
-
I have a social networking site with user and company profiles. Some profiles have little to no content. One of the users here at Moz suggested noindexing these profiles. I am still investigating this issue and have some follow-up questions:
- What is the possible gain of noindexing uninteresting profiles? I'm especially interested in this since these profiles do bring in long-tail traffic at the moment.
- How "irreversible" is introducing a noindex directive? Would everything "return to normal" if I remove the noindex directive?
- When determining the threshold for having profiles indexed, how should the following items be weighed?
- Total number of words on the page (drawn from one or more of the following: full name, city, 0 to N company names, bio, activity)
- (Unique) profile picture
- (Nofollowed) links to the user's profiles on social networks or the user's own site
- Embedded Google Map
Thanks!
-
The one thing I would add to your list of criteria, if you choose to go that route, is to look at the landing pages report in Google Analytics and make sure the individual profiles don't receive any inbound search traffic.
-
The gain would be that you don't index a bunch of URLs on your site that contain essentially similar/thin content. I wouldn't necessarily count those that do bring in long-tail traffic among the ones you'd want to noindex. Things will return to normal once you remove the noindex, but unless you have decent links pointing to those profiles, it may take several months for them to be recrawled. I'd weigh most heavily links (followed or nofollowed) to the profiles from decent sites, as well as the activity that shows on the profile page. The rest I wouldn't consider in the threshold calculation.
-
1. Unless you have a big thin-content problem, there is no gain.
2. Completely reversible; just remove the directive and wait.
3. That's for you to decide, but you seem to be on the right track.
4. The question you should have asked: is there any downside to noindexing these pages? Yes, there is. Links pointing to a noindexed page will leak their link juice. Noindex is a last resort; I have never used it.
If you must noindex a page, do it with a meta "noindex, follow" tag. Note that was "follow", not "nofollow": that way your link juice flows into the page and back out again.
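As a sketch of the tag this answer describes, a server-side template helper could switch the directive per profile; the helper name and its boolean parameter are made up for illustration, but the emitted `<meta name="robots">` tag is the standard form.

```python
# Minimal sketch of emitting the robots meta tag described above.
# "noindex, follow" keeps the page out of the index while still letting
# crawlers follow (and pass equity through) its outbound links.

def robots_meta(indexable: bool) -> str:
    directive = "index, follow" if indexable else "noindex, follow"
    return f'<meta name="robots" content="{directive}">'

print(robots_meta(False))  # <meta name="robots" content="noindex, follow">
```

The returned tag belongs in the page's `<head>`; rendering it conditionally means flipping a profile back to indexable is just a data change, which is what makes the noindex decision reversible.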