Do we need to Disallow profiles from discussions or forums?
-
Hi,
We have a forum where users create threads, like any other community (e.g., Moz), and thousands of pages are being created. New threads and comments are fine, as they have relevant content. We are planning to "Disallow" all profile pages in robots.txt, as they do not help with content relevancy, and thousands of such profile pages may dilute link juice. Is this the right way to proceed?
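For illustration, a minimal robots.txt sketch of what we have in mind (the /members/ path is a hypothetical placeholder; the real profile URL pattern may differ):

    User-agent: *
    # Hypothetical path: assumes all profile pages live under /members/
    Disallow: /members/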
Thanks
-
We have never had to nofollow internal links within any of our websites. I am sure there are some circumstances where you could, but since Google's algorithm no longer "penalizes" such links, I don't see a need to, as they are simply ignored. Just avoid spammy tactics like having 100 links in the footer or a bunch of affiliate links, etc.
-
Hi Lure,
Thanks for the answer. We are going to de-index these profile pages. Just wondering about "nofollow", as nofollowing internal links is not always correct. What's your call on this?
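For reference, a minimal sketch of the two options in question (the /members/ path and the template placement are hypothetical):

    <!-- On each profile page: de-index it but still let crawlers follow its links -->
    <meta name="robots" content="noindex, follow">

    <!-- Optionally, nofollow internal links that point to profile pages -->
    <a href="/members/example-user" rel="nofollow">example-user</a>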
-
I would if I were in your shoes. Having all profile and/or comment links be followed ("do-follow") links can attract a lot of spam and could potentially undermine the longevity and credibility of your forum.
-
I would normally say yes, but it really depends on how good your forum is. Sometimes people like to do reputation marketing, and if the site is authoritative you may want to show, for queries on your name, that you contribute to an important website. But if your profile pages are not relevant or of any interest to your users, I don't see why you'd want to let Google index and serve them.
Related Questions
-
Need only tens of pages indexed out of hundreds: is robots.txt okay for Google?
Hi all, We have two subdomains with hundreds of pages, of which only 50 important pages need to get indexed. Unfortunately the CMS of these subdomains is very old and does not support deploying a "noindex" tag at the page level. So we are planning to block the entire sites in robots.txt and allow only the 50 pages we need. But we are not sure if this is the right approach, as Google has been suggesting to rely on "noindex" rather than robots.txt. Please suggest whether we can proceed with the robots.txt file. Thanks
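A minimal sketch of the robots.txt approach being considered (the Allow paths are hypothetical placeholders for the 50 important pages):

    User-agent: *
    # For Google, the more specific Allow rules take precedence over the broad Disallow
    Allow: /important-page-1/
    Allow: /important-page-2/
    Disallow: /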
Algorithm Updates | vtmoz
-
Indexed, though blocked by robots.txt: do we need to bother?
Hi, We have intentionally blocked some website files that had been indexed for years. Now we receive the message "Indexed, though blocked by robots.txt" in GSC. As far as I know, we can ignore it? Are any actions required? We thought of blocking them with meta tags, but these are PDF files. Thanks
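For non-HTML files such as PDFs, one alternative to a meta tag is the X-Robots-Tag HTTP header; a minimal Apache sketch, assuming mod_headers is enabled and the files are unblocked in robots.txt so Google can recrawl them and see the header:

    # Send a noindex directive with every PDF response
    <FilesMatch "\.pdf$">
      Header set X-Robots-Tag "noindex"
    </FilesMatch>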
Algorithm Updates | vtmoz
-
Google Search Console: 404 and soft 404 errors without any backlinks. Redirect needed?
Hi Moz community, We can see 404 and soft 404 errors in Google Search Console. Usually these are non-existing pages that Google found somewhere on the internet. I can see that some of these reported URLs don't have any backlinks (checked in the Ahrefs tool). Do we need to redirect each and every URL reported here, ignore them, or mark them as fixed? Thanks
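Where a redirect does make sense (the old URL has a close live equivalent), a 301 is the usual fix; a minimal Apache .htaccess sketch with hypothetical paths:

    # Permanently redirect a removed page to its closest live equivalent
    Redirect 301 /old-removed-page/ https://www.example.com/replacement-page/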
Algorithm Updates | vtmoz
-
Need to be reindexed quickly - SERP is showing a 404
So a mistake was made where the canonical URL on the pages my company made pointed to a 404. We need to have these pages reindexed quickly. I asked GWT to fetch them and submitted an updated sitemap, but the SERPs are still the same. Does anyone know any tricks that would get us reindexed faster?
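Once corrected, the canonical should point to a live URL that returns a 200 status; a minimal sketch with a hypothetical URL:

    <!-- The canonical target must resolve with a 200 status, not a 404 -->
    <link rel="canonical" href="https://www.example.com/products/widget/">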
Algorithm Updates | mattdinbrooklyn
-
Backlink profile maintenance?
Hello all, I have a large website that generates a few thousand organic links a day per Majestic and has over 2k referring domains. A few months back I did a disavow, but I'm wondering your thoughts on how often I should be looking to get bad links removed and add new links to my disavow list. Also, what is really considered a bad organic link? Are there any companies out there that offer monthly backlink maintenance or something?
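For reference, Google's disavow file is a plain-text list uploaded through the disavow tool; a minimal sketch with hypothetical domains and URLs:

    # Lines starting with a hash are comments
    # Disavow every link from an entire referring domain
    domain:spammy-directory-example.com
    # Or disavow a single URL
    http://another-example.com/paid-links-page.html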
Algorithm Updates | juicyresults
-
A web audit for web traffic? Need answers, please.
Hi, We are a PR agency based in Dubai and we produce a lot of web content. The website is built on Ruby on Rails, and we have implemented keywords and SEO strategies, but sadly the traffic pattern has not changed for the past three years. What surprised us today is that we created a page 2-3 days ago for a client who is participating in Arab Health (a very prestigious healthcare event), and suddenly our page is in the top 3 on google.ae as well as google.com. We are fairly convinced that there is something wrong with our code. Do you think this could be a possibility, and that the lack of change in the traffic pattern might not be an SEO issue but a code issue? What could be the possible reasons for this pattern? In such a scenario, what would experts like you recommend we do: an SEO audit, a web audit, a code audit, or hiring an SEO/web/code consultant? Thanks - helpful answers are really appreciated, and just btw, if anyone feels they could professionally help us out of this mess, we are willing to work with him/her. Thanks in advance
Algorithm Updates | LaythDajani
-
Local search ranking tips needed
Hi there, I've been working on my client's website for a while now. About a month ago I created a local business listing for him in Google. I was wondering if there are any new tips to move his business up the rankings in local search? I've researched and only really found information relevant to the old way Google displayed local search results.
Algorithm Updates | SeoSheikh
-
ECommerce site being "filtered" by last Panda update, ideas and discussion
Hello fellow internet-goers! Just as a disclaimer, I have been following a number of discussions, articles, posts, etc. trying to find a solution to this problem, but have yet to find anything conclusive, so I am reaching out to the community for help. Before I get into the questions, some background: I help a team manage and improve a number of medium-to-large eCommerce websites. Traffic ranges anywhere from 2K to 12K+ per day, depending on the site. Back in March one of our larger sites was "filtered" from Google's search results. I say "filtered" because we didn't receive any warnings and our domain was/is still listed in the first search position. About 2-3 weeks later another site was "filtered", and then 1-2 weeks after that, a third site. We have around ten niche sites in total, and about seven of them share an identical code base (roughly an 80% match). This isn't that uncommon, since we use a CMS platform to manage all of our sites that holds hundreds of thousands of category and product pages. Needless to say, April was a frantic month for us. Many meetings later, we attributed the "filter" to duplicate content stemming from our product database and written content (shared across all of our sites). We decided to use rel="canonical" to address the problem. Exactly 30 days after being filtered, our first site bounced back (as if it had never been "filtered"); however, the other two sites remain "under the thumb" of Google. Now for some questions: Why would only 3 of our sites be affected by this "filter"/Panda if many of them share the same content? Is it a coincidence that it was an exact 30-day "filter"? Why has only one site recovered?
Algorithm Updates | WEB-IRS