Duplicate content because of member-only restrictions on a forum.
-
Our website's Community Forum links to the membership profile pages, which by default are blocked for non-members.
We're getting warnings in Moz for duplicate content (and errors) on these member profile pages.
Any ideas for how we can creatively solve this problem? Should we redirect those pages, or just beef them up with more content? Or should we just ignore it and assume that search spiders will be smart enough to figure it out?
See attached video for further explanation.
-
Awesome. Makes total sense.
I've dropped this into our robots.txt file:
User-agent: *
Disallow: /community/member/
Thanks Ray!
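If you want a quick sanity check that the new rule blocks only the member profiles, Python's standard-library robots.txt parser can simulate a crawler's view of it. This is just a sketch, fed the same two rules that went into robots.txt, against the profile URLs mentioned in this thread:

```python
from urllib.robotparser import RobotFileParser

# Feed the parser the same rules that were added to robots.txt
rp = RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /community/member/",
])

# Member profile pages are blocked from crawling...
print(rp.can_fetch("*", "https://www.foodbloggerpro.com/community/member/1041/"))  # False
print(rp.can_fetch("*", "https://www.foodbloggerpro.com/community/member/2373/"))  # False

# ...while the rest of the site stays crawlable
print(rp.can_fetch("*", "https://www.foodbloggerpro.com/"))  # True
```

One caveat: Disallow only stops crawling; it doesn't by itself remove URLs that are already in the index, which is why the noindex recommendation in Ray's answer still matters.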
-
Googlebot won't be able to see the full community member profiles, since they are hidden behind a login wall. However, the duplicate content errors are an issue you'll want to correct.
- https://www.foodbloggerpro.com/community/member/1041/
- https://www.foodbloggerpro.com/community/member/2373/
- https://www.foodbloggerpro.com/community/member/1301/
All of the community member pages produce an error when viewed logged out. Unfortunately, the URL remains the same and only shows an error message; it does not redirect the user to a dedicated error page. This leaves every community member profile with a unique URL, all serving the same duplicate content.
You should add a noindex tag to the community member pages and, if possible, redirect logged-out users to an error page rather than serving that duplicate content at each unique profile URL.
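As a concrete illustration of the noindex suggestion (the path in the comment is illustrative), adding a robots meta tag to the head of the profile page template tells search engines to drop those pages from the index even if they get crawled:

```html
<!-- In the <head> of the /community/member/... page template -->
<meta name="robots" content="noindex">
```

For the redirect, one option would be a 302 from the profile URL to the login or error page for logged-out visitors, which consolidates all of those unique URLs onto a single page instead of leaving duplicate error content at each one.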
Related Questions
-
To avoid the duplicate content issue I have created new URLs for the specific site I am posting to and redirected each URL to the original on my site. Is this the right way to do it?
I am trying to avoid the duplicate content issue by creating new URLs and redirecting them to the original URL. Is this the proper way of going about it?
On-Page Optimization | yagobi21
-
Should I remove 'local' landing pages? Could these be the cause of traffic drop (duplicate content)?
I have a site that gets most of its traffic from reasonably competitive keywords, each with their own landing page. In order to gain more traffic I also created landing pages for counties in the UK and then towns within each county. Each county has around 12 town landing pages within it. This has meant I've added around 200 extra pages to my site in order to try and generate more traffic from long-tail keywords. I think this may have caused an issue in that it's impossible for me to create unique content for each town/county, and therefore I took a 'shortcut' by creating unique content for each county and using the same content for the towns within it, meaning I have lots of pages with the same content and just slightly different page titles with a variation on the town name. I've duplicated this over about 15 counties, meaning I have around 200 pages with only about 15 actually unique pages among them. I think this may actually be harming my site. These pages have been indexed for about a year, and I noticed about 6 months ago a drop in traffic of about 50%. Having looked at my analytics, these town and county pages actually only account for about 10% of traffic. My question is: should I remove these pages, and by doing so should I expect an increase in traffic again?
On-Page Optimization | SamCUK
-
Duplicate Content - Blog Rewriting
I have a client who has requested a rewrite of 250 blog articles for his IT company. The blogs are dispersed on a variety of platforms: his own website's blog, a business innovation website, and an IT website. He wants to have each article optimised with keyword phrases and then posted onto his new website thrice weekly. All of this is in an effort to attract potential customers to his new site and to establish his company as a leader in its field. To what extent would I need to rewrite each article so as to avoid duplicating the content? Would there even be an issue if I did not rewrite the articles and merely optimised them with keywords? Would the articles need to be completely taken down by all current publishers? Any advice would be greatly appreciated.
On-Page Optimization | StoryScout
-
Duplicate meta tags
We have an e-commerce site with products that are exactly the same but in different sizes, and each size has its own page. We use the same meta tags on all of them; would it be better to use no meta tags at all?
On-Page Optimization | DFC
-
Duplication issue on my website
Hi, I have a CMS website with 2000 pages. My problem is that 1. www.test.com/abc.html and 2. www.test.com/abc.html?gallery?123testing show up as duplicate pages in my SEOmoz error list, but it is a single page. Please suggest a solution.
On-Page Optimization | wmsindia
-
Content by Country
Currently we have a news website aimed at several countries. We want to filter the content of some URLs (home, category pages, ...) using the visitor's country of origin. For example, on the home page we'd have news of global interest, plus a column with news from the visitor's country. Could this affect our rankings or cause a Google penalty? Thank you very much.
On-Page Optimization | promonet
-
Duplicate content on WordPress
I ran a crawl and found errors on many pages with "tag", i.e. http://www.soobumimphotography.com/tag/70-200-2-8-is/ What's the best way to deal with this problem? Is it worth visiting all of them to fix them? Should I delete them? Could you give me some suggestions?
On-Page Optimization | BistosAmerica
-
Is there any benefit in on-site duplicate content?
I have about 50 internal pages on my site that I want to add a "Do it yourself tutorial" to in an effort to build the quality of the pages. Is this going to de-value the content if I put it on all 50 pages? It's difficult to write similar content 50 different ways.
On-Page Optimization | BradBorst