Noindex user profile
-
I have a social networking site with user and company profiles. Some profiles have little to no content. One of the users here at Moz suggested noindexing these profiles. I am still investigating the issue and have some follow-up questions:
- What is the possible gain of noindexing uninteresting profiles? I'm especially interested in this since these profiles do bring in long-tail traffic at the moment.
- How "irreversable" is introducing a noindex directive? Would everything "return to normal" if I remove te noindex directive?
- When determining the threshold for having profiles indexed, how should the following items be weighed? (A rough scoring sketch of what I mean follows this list.)
- Total number of words on the page (made up of one or more of the following: full name, city, 0 to N company names, bio, activity)
- (unique) Profile picture
- (Nofollowed) links to the user's profiles on social networks or the user's own site.
- Embedded Google Map
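To make the question concrete, here is a rough sketch in Python of the kind of scoring heuristic I have in mind; the field names, weights, and threshold are placeholders, not anything validated:

```python
# Rough sketch only: field names, weights, and the threshold are placeholders.
def profile_score(profile: dict) -> float:
    text_parts = [
        profile.get("full_name", ""),
        profile.get("city", ""),
        " ".join(profile.get("companies", [])),
        profile.get("bio", ""),
        profile.get("activity", ""),
    ]
    word_count = len(" ".join(text_parts).split())

    score = min(word_count, 300) / 10.0  # cap the word-count contribution
    if profile.get("has_unique_photo"):
        score += 10
    if profile.get("external_links"):    # nofollowed links to social profiles / own site
        score += 5
    if profile.get("has_map_embed"):
        score += 2
    return score

INDEX_THRESHOLD = 25  # placeholder; would be tuned against profiles that already earn long-tail traffic

def should_index(profile: dict) -> bool:
    return profile_score(profile) >= INDEX_THRESHOLD
```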
Thanks!
-
The one thing I would add to your list of criteria, if you choose to go that route, is to look at your Google Analytics landing pages and make sure the individual profiles don't receive any inbound search traffic.
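For example, if you export the organic landing-pages report to CSV, a quick script along these lines can flag profiles that still earn search entrances (the 'Landing Page'/'Sessions' column names and the /profile/ path are assumptions; adjust them to your export and URL structure):

```python
import csv

PROFILE_PREFIX = "/profile/"  # assumption: profile URLs live under this path

def profiles_with_search_traffic(report_path: str):
    """List profile landing pages that received at least one organic session.

    Assumes a CSV export with 'Landing Page' and 'Sessions' columns.
    """
    hits = []
    with open(report_path, newline="") as f:
        for row in csv.DictReader(f):
            page = row.get("Landing Page", "")
            sessions = int((row.get("Sessions") or "0").replace(",", ""))
            if page.startswith(PROFILE_PREFIX) and sessions > 0:
                hits.append((page, sessions))
    return hits

# Usage: every page this prints is one you probably want to keep indexed.
# for page, sessions in profiles_with_search_traffic("organic_landing_pages.csv"):
#     print(sessions, page)
```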
-
The gain would be that you don't index a bunch of URLs on your site that contain essentially similar/thin content. I wouldn't necessarily count those that do bring in long-tail traffic as ones you'd want to noindex. Things will return to normal once you remove the noindex, but unless you have decent links pointing to those profiles, it may take several months for them to be recrawled. I'd weigh most heavily links (followed or nofollowed) to the profiles from decent sites, as well as activity that shows on the profile page. The rest I wouldn't consider in the threshold calculation.
-
1. Unless you have a big thin-content problem, there is no gain.
2. Completely reversible; just remove it and wait.
3. You will have to decide; it sounds like you are on the right track.
4. The question you should have asked: is there any downside to noindexing these pages? Yes, there is. All links pointing to a noindexed page will leak all of their link juice. Noindex is a last resort, and one I have never used.
If you must noindex a page, do it with a meta noindex,follow tag. Note that is "follow", not "nofollow"; that way your link juice will flow into the page and back out again.
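For illustration, this is the standard robots meta tag; a small helper along these lines (the function name and flag are just a sketch) keeps the directive in one place when rendering profile pages:

```python
def robots_meta(index_page: bool) -> str:
    """Return the robots meta tag for a profile page.

    "noindex,follow" keeps the page out of the index while still letting
    crawlers follow (and pass equity through) the links on it.
    """
    content = "index,follow" if index_page else "noindex,follow"
    return f'<meta name="robots" content="{content}">'

# e.g. robots_meta(False) -> '<meta name="robots" content="noindex,follow">'
```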
Related Questions
-
NoIndex tag, canonical tag or automatically generated H1s for automatically generated enquiry pages?
What would be better for automatically generated accommodation enquiry pages for a travel company? A noindex tag, a canonical tag, automatically generated H1s, or another solution? This is the homepage: https://www.discoverqueensland.com.au/ You would enquire from a page like this: https://www.discoverqueensland.com.au/accommodation/sunshine-coast/twin-waters/the-sebel-twin-waters This is the enquiry form: https://www.discoverqueensland.com.au/accommodation-enquiry.php?name=The+Sebel+Twin+Waters&region_name=Sunshine+Coast
Technical SEO | Kim_Lazaro
-
Google has deindexed a page it thinks is set to 'noindex', but is in fact still set to 'index'
A page on our WordPress powered website has had an error message thrown up in GSC to say it is included in the sitemap but set to 'noindex'. The page has also been removed from Google's search results. Page is https://www.onlinemortgageadvisor.co.uk/bad-credit-mortgages/how-to-get-a-mortgage-with-bad-credit/ Looking at the page code, plus using Screaming Frog and Ahrefs crawlers, the page is very clearly still set to 'index'. The SEO plugin we use has not been changed to 'noindex' the page. I have asked for it to be reindexed via GSC but I'm concerned why Google thinks this page was asked to be noindexed. Can anyone help with this one? Has anyone seen this before, been hit with this recently, got any advice...?
Technical SEO | d.bird
-
Dynamically Inserting Noindex With JavaScript
Hello, I have a broken plugin creating hundreds of WP-Content directory pages that are being indexed by Google. I cannot access the source code of these pages to add a noindex to them. The page URLs all contain the plugin name. In order to resolve the issue, I wrote a solution with JavaScript to dynamically add a noindex tag to any URL containing the plugin name. Would this noindex be respected by Google, and is there a way to immediately check that it is respected? Currently, I cannot delete the plugin due to issues with its PHP. If you would like to view the code: https://codepen.io/trodrick/pen/Gwwaej?editors=0010 Thanks!
Technical SEO | Tom3_15
-
Include or exclude noindex URLs in sitemap?
We just added noindex tags to our pages with thin content. Should we include or exclude those URLs from our sitemap.xml file? I've read conflicting recommendations.
Technical SEO | vcj
-
Backlink Profile: Should I disavow these links? Auto-Generated Links etc
Hello Moz Community, First I want to say that I really like the Q&A section and have read and learned a lot here, and today it is time for my first own question 😉 I checked our backlink profile recently and found what I think are a few bad/spammy links, most of them auto-generated by picking up some (meta) information from our webpage. Now my question is whether I should disavow these links via Webmaster Tools, or whether they shouldn't matter, as I guess basically every webpage will be picked up by these services. I ask especially because our rankings dropped significantly in recent weeks, but I am not sure whether this can be the real reason. Examples are pages like:
- https://www.askives.com/ - auto-generates, for example, meta descriptions with links
- http://www.websitesalike.com/ - find similar websites
- http://mashrom.ir/ - no idea about this one, really crazy
We are also listed at http://www.europages.com/, which makes sense to me and brings some referral traffic, but they auto-generate links from all their TLDs like .gr / .it / .cn etc. Should I just disavow all TLDs other than .com? Another example would be links from OM services like seoprofiler.com. Moreover, we have a lot of links from different HR portals (including many outdated job postings). Can these links "hurt" as well? Thanks a lot for your help! Greetings, Heiko
Technical SEO | _Heiko_
-
Best way to implement noindex tags on archived blogs
Hi, I have approximately 100 old blog posts that I believe are still of interest to readers, which I'd potentially like to noindex because Google may view them poorly, but which I'd like to keep on our website. A lot of the content in these posts is similar to one another (as we blog about the same topics quite often), which is why I believe it may be in our interest to noindex older posts when we have newer content covering the same ground. Firstly, does that sound like a good idea? Secondly, can I use Google Tag Manager to implement noindex tags on specific blog pages? It's a hassle to get the webmaster to add the code, and I've found no mention on the usual SEO blogs of whether you can implement such tags with Tag Manager. Or is there a better way to implement noindex tags en masse? Thanks!
Technical SEO | TheCarnage
-
Noindex vs. page removal - Panda recovery
I'm wondering whether there is a consensus within the SEO community as to whether noindexing pages vs. actually removing pages is different from Google Panda's perspective. Does noindexing pages have less value when removing poor-quality content than physically removing the page, i.e. either 301ing or 404ing the page being removed and removing the links to it from the site? I presume that removing pages has a positive impact on the amount of link juice that gets to some of the remaining pages deeper in the site, but I also presume this doesn't have any direct impact on the Panda algorithm? Thanks very much in advance for your thoughts, and corrections to my assumptions 🙂
Technical SEO | agencycentral
-
Do user metrics really mean anything?
This is a serious question; I'd also like some advice on my experience so far with Panda. One of my websites, http://goo.gl/tFBA4, was hit on January 19th. It wasn't a massive hit, but it took us from 25,000 to 21,000 uniques per day. It had survived Panda completely before then. The only thing that had changed was an upgrade to the CMS, which caused a lot of duplicate content, i.e. 56 copies of the homepage under various URLs, all of which were indexed in Google. I've heard varying views as to whether this could trigger Panda; I believe so, but I'd appreciate your thoughts on it. There was also the above-the-fold update on the 19th, but we have one ad max on each page, and most pages have none. I hate even having to have one ad. I think we can safely assume it was Panda that did the damage. Jan 18th was the first Panda refresh since we upgraded our CMS in mid-to-late December. As it was nothing more than a refresh, I feel it's safe to assume the website was hit due to something that had changed on the website between the Jan 18th refresh and the one previous. So, aside from fixing the bugs in the CMS, I felt now was a good time to put a massive focus on user metrics, and I have worked hard and continue to spend a lot of time improving them:
- Reduced bounce rate from 50% to 30% (extremely low in the niche)
- Average page views up from 7 to 12
- Average time on site up from 5 to almost 8 minutes
- Plus created a mobile-optimised version of the site
- Page loading speeds slashed
Not only did the above improvements have no positive effect, traffic continued to slide and we're now close to a massive 40% loss. By the way, I realise that neither the mobile site nor page loading speeds are user metrics. I fully appreciate that my website is image-heavy and thin on text, but that is an industry-wide 'issue'. It's not an issue for my users, so it shouldn't be an issue for Google. Unlike our competitors, we actively encourage our users to add descriptions to their content and provide guidelines to assist them in doing so. We have a strong relationship with our artists, as we listen to their needs and develop the website accordingly. Most of the results in the SERPs contain content taken from my website without my permission or the permission of the artist, and they rarely give any credit. If user metrics are so important, why on earth has my traffic continued to slide? Do you have any advice on how I can further improve my chances of recovering from this? Fortunately, despite my artists' download numbers being slashed in half, they've stuck by me and the website, which speaks volumes.
Technical SEO | seo-wanna-bs