Need a suggestion: should user profile links be disallowed in robots.txt?
-
I maintain a MyBB-based forum here. The user profile links look something like this: http://www.learnqtp.com/forums/User-Ankur
Now, in Google Webmaster Tools (GWT), I can see many 404 errors for user profile links. This is primarily because we keep tight control over spam and auto-generated bot profiles: either our moderators or our spam-control software delete such spammy member profiles on a periodic basis, but by then Google has already indexed them.
I am wondering: would it be a good idea to disallow user profile links using robots.txt?
Something like
Disallow: /forums/User-*
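For reference, a minimal robots.txt sketch along those lines (assuming the forum lives under /forums/ as in the example URL above; the trailing * isn't strictly required, since robots.txt rules are prefix matches, though Googlebot does understand the wildcard):
User-agent: *
# Keep crawlers away from member profile pages such as /forums/User-Ankur
Disallow: /forums/User-
One caveat worth noting: Disallow only stops crawling, so profiles that are already indexed, or that pick up links from elsewhere, can still appear in results as URL-only entries.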
-
Maybe you should noindex user profile pages until the accounts behind them have been active on the site long enough that (a) you're sure they're not spam bots and (b) they're genuinely active users with some interesting information to share on their profile page?
Like Shaliendra said, this is your call. It really comes down to your website goals and what those user pages do for the site. Are you getting much organic traffic from user pages? If you aren't, it's probably easiest to just noindex all of them. But if some profiles are bringing in organic traffic, I'd recommend adding a noindex tag to new profiles immediately and removing it once the member has commented a few times.
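To make the mechanics concrete, here is a hypothetical sketch of what the head of the member profile template would carry while an account is still unproven (the placement and any post-count threshold are assumptions on my part, not MyBB specifics):
<!-- Profile pages of new/unproven members: keep them out of the index but let links be followed -->
<meta name="robots" content="noindex, follow">
Once a member has commented a few times, the template simply drops the tag so the profile can be indexed again.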
Does that help?
-
I am only talking about user profile pages, not the links inside them. I have already taken care of those.
I request others to please come up with suggestions as well.
-
If you are talking about user profile pages, it is your call.
If you are talking about links that users post on profile pages, use as signatures, or drop into threads, make sure to mark them nofollow.
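For example, a user-posted link in a signature or profile would be marked up something like this (the URL and anchor text are just placeholders):
<a href="http://example.com/" rel="nofollow">user's link</a>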
Regards
-
Let me rephrase my question a bit. Would it be a good idea to noindex the user profile links in the first place, or should we keep them indexed in Google?
-
Use a robots meta noindex tag (<meta name="robots" content="noindex">) on the user profile pages. It is more effective than a robots.txt disallow, because a page blocked in robots.txt can still end up indexed if it is linked from elsewhere, whereas the noindex tag, once crawled, keeps the page out of the index.
Regards