Forum (user-generated content) SEO best practices
-
Hello Moz folks!
For the very first time I'm dealing with a massive community that relies on UGC (user-generated content). Their forum has a great deal of duplicate content, broken links, duplicate titles, and other on-site issues. I have advanced SEO knowledge related to ecommerce and blogging, but I'm new to forums and UGC.
I would really love to learn, or get links to resources, so I can see and understand the best practices in terms of SEO. Any help is greatly appreciated.
Best,
Yan
-
Should logged-in users and anonymous users have the same behavior?
For the most part, yes; however, it depends on the forum you are running. The important piece to understand is that whatever is hidden behind a login wall remains hidden to the search engines. So you have to weigh that factor when deciding which content to display to everyone versus which content to display only to logged-in users.
How do you suggest handling canonicals in a UGC world?
Canonicalization isn't too hard to manage. Your forum software should include canonical URLs; if not, you will want them implemented in the template as soon as possible. The use of the rel=prev and rel=next tags is highly recommended. This allows you to keep the main forum thread as the canonical URL, and Google understands that the subsequent pages are related to the main page and how they add value.
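A minimal sketch of what that head markup could look like on page 2 of a paginated thread (hypothetical URLs; note that Google has since stated it no longer uses rel=prev/next as an indexing signal, so treat that part as historical advice):

```html
<!-- Hypothetical <head> markup for page 2 of a forum thread -->
<!-- Canonical kept on the main thread URL, as the answer describes -->
<link rel="canonical" href="https://forum.example.com/threads/how-to-sharpen-skates" />
<!-- Pagination hints pointing at the surrounding pages -->
<link rel="prev" href="https://forum.example.com/threads/how-to-sharpen-skates" />
<link rel="next" href="https://forum.example.com/threads/how-to-sharpen-skates/page-3" />
```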
Do you have specific editorial guidelines enforced on UGC?
Again, that's up to you and your community. What works editorially for one forum may not be desirable for another (e.g. the use of profanity). As long as the content being added is of value, I consider it good content. With forums, you can be much looser with the guidelines and allow users to interact as they desire.
Don't let your forum become infested with spam or obvious self-promotion threads, and make sure all user-submitted links are nofollowed. Many forums restrict new users with regard to links, and only once users prove themselves can they add links to their posts. Link and spam management are very important for forums.
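As a concrete illustration, a nofollowed user-submitted link looks like this (hypothetical URL):

```html
<!-- rel="nofollow" tells search engines not to pass authority through this link -->
<a href="https://example.com/some-user-submitted-site" rel="nofollow">a user's link</a>
```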
-
Thanks Ray-PP,
Is there anything more specific? For example, should logged-in users and anonymous users have the same behavior? How do you suggest handling canonicals in a UGC world? Do you have specific editorial guidelines enforced on UGC? For example, should we noindex a post with a three-word question and an image?
Cheers,
Yan
-
Hello Yan,
Fortunately, the on-site SEO for UGC is not very different from the on-site SEO for other forms of content. We can still apply those best practices to forums and the UGC within them.
Duplicate content / on-page factors
- Make sure the forum uses proper canonicalization
- Use rel=prev/next for paginated threads
- Apply semantic SEO where appropriate
- Make sure all on-page SEO factors are optimized: title, headings, images, etc. (see the sketch below)
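A minimal sketch of those on-page elements for a hypothetical thread page:

```html
<!-- Unique, descriptive title and meta description for the thread -->
<title>How often should I replace inline skate wheels? - Example Forum</title>
<meta name="description" content="Community answers on replacing and maintaining inline skate wheels." />

<!-- One main heading matching the thread topic -->
<h1>How often should I replace inline skate wheels?</h1>

<!-- User-uploaded images with descriptive alt text -->
<img src="/uploads/skate-wheel.jpg" alt="Close-up of a worn inline skate wheel" />
```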
Broken links
- Use Moz or a tool like Screaming Frog SEO Spider to identify 404 pages. Redirect each important dead page with a 301 to its nearest related page (reserve the effort, and the SEO authority a 301 preserves, for the dead pages that matter).
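For example, on an Apache server a single dead-but-important thread could be redirected like this (hypothetical paths; the exact mechanism depends on your server and forum software):

```apache
# .htaccess sketch: permanently redirect a removed thread
# to its closest related page, preserving most link authority
Redirect 301 /threads/old-deleted-thread /threads/closely-related-thread
```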
Are there more specific issues you are experiencing with the forum?
Related Questions
-
Sitemaps: Best Practice
What should and what shouldn't go in the sitemap? In particular, pages like subscribe to our newsletter / unsubscribe from our newsletter: is there really any benefit in highlighting those pages to the SEs? Thanks for any advice/anecdotes 🙂
Intermediate & Advanced SEO | Fubra
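For what it's worth, the usual advice is that a sitemap lists only the pages you actually want ranked; a minimal sketch with hypothetical URLs:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Pages with search value -->
  <url><loc>https://www.example.com/</loc></url>
  <url><loc>https://www.example.com/guides/getting-started</loc></url>
  <!-- Utility pages like /newsletter/subscribe or /newsletter/unsubscribe
       are normally left out: they have no search value -->
</urlset>
```
-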
Mass URL changes and redirecting those old URLs to the new: what is the SEO risk and what are best practices?
Hello good people of the Moz community, I am looking to do a mass edit of URLs on content pages within our sites. The way these were initially set up, a few years ago, was to make each URL unique by including the date, which can now make evergreen content seem dated. The new URLs would follow a better folder-path naming convention and would be far better URLs overall. Some examples of the **old** URLs would be:
https://www.inlineskates.com/Buying-Guide-for-Inline-Skates/buying-guide-9-17-2012,default,pg.html
https://www.inlineskates.com/Buying-Guide-for-Kids-Inline-Skates/buying-guide-11-13-2012,default,pg.html
https://www.inlineskates.com/Buying-Guide-for-Inline-Hockey-Skates/buying-guide-9-3-2012,default,pg.html
https://www.inlineskates.com/Buying-Guide-for-Aggressive-Skates/buying-guide-7-19-2012,default,pg.html
The new URLs would look like this, which would be a great improvement:
https://www.inlineskates.com/Learn/Buying-Guide-for-Inline-Skates,default,pg.html
https://www.inlineskates.com/Learn/Buying-Guide-for-Kids-Inline-Skates,default,pg.html
https://www.inlineskates.com/Learn/Buying-Guide-for-Inline-Hockey-Skates,default,pg.html
https://www.inlineskates.com/Learn/Buying-Guide-for-Aggressive-Skates,default,pg.html
My worry is that we rank fairly well organically for some of the content and I don't want to anger the Google machine. The process would be to edit the URLs to the new layout, then set up the redirects and push live. Is there a great SEO risk to doing this? Is there a way to do a mass "Fetch as Googlebot" to reindex these if I do, say, 50 a day? I only see the ability to do one URL at a time in the Webmaster backend. Is there anything else I am missing? I believe this change would be good in the long run, but I do not want to take a huge hit initially by doing something incorrectly. This would be done on five to a couple hundred links across the various sites I manage. Thanks in advance,
Chris Gorski
Intermediate & Advanced SEO | kirin44355
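A sketch of what a pattern-based redirect for URLs like these could look like, assuming an Apache front end with mod_alias (the ",default,pg.html" suffix suggests a different platform, so treat this purely as an illustration of the old-to-new mapping):

```apache
# Hypothetical rule: capture the guide name, drop the dated segment,
# and 301 to the new /Learn/ path, e.g.
#   /Buying-Guide-for-Inline-Skates/buying-guide-9-17-2012,default,pg.html
#     -> /Learn/Buying-Guide-for-Inline-Skates,default,pg.html
RedirectMatch 301 ^/(Buying-Guide-[^/]+)/buying-guide-[0-9-]+,default,pg\.html$ /Learn/$1,default,pg.html
```
-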
Having 2 brands with the same content - will this work from an SEO perspective
Hi All, I would love it if someone could help and provide some insights on this. We're a financial institution and have a set of products that we offer. We have recently joined with another brand and will now be offering all our products to their customers. What we are looking to do is have one site that masks the content for both, so it appears as though there are two separate brands with different content; in fact we have a main site and then a sister brand that offers the same products. Is there any way to do this so that when someone searches for a credit card from Brand A it is indexed under Brand A, and likewise when someone searches for a credit card from Brand B it is indexed under Brand B? The one thing is, we would not want to rel=canonical the pages, nor be penalised by Google's latest PR algorithm. Hope someone can help! Thanks Dave
Intermediate & Advanced SEO | CFCU
-
Local SEO - two businesses at same address - best course of action?
Hi Mozzers - I'm working with two businesses at the moment, at the same address; the only difference between the two is the phone number. I could ask to split the business addresses apart, so that NAP (name, address, phone number) is different for each business (only the postcode would be the same). Or I could simply carry on as at the moment, with the Ns and Ps different yet the As the same: the same address for both businesses. I've never experienced this issue before, so I'd value your input. Many thanks, Luke
Intermediate & Advanced SEO | McTaggart
-
CDN for SEO (or not)?
Does a CDN impact SEO or not? There seem to be conflicting ideas as to whether CDNs have a positive or negative impact. I realise that if the page loads quicker this is a good thing for SEO and usability, of course. Does Google see a CDN as cheating, a workaround for not doing the work from the ground up with good hosting, etc.? Do you have any direct experience? All constructive input much appreciated!
Intermediate & Advanced SEO | seoman10
-
Woocommerce SEO & Duplicate content?
Hi Moz fellows, I'm new to Woocommerce and couldn't find help on Google about certain SEO-related things. All my past projects were simple five-page websites plus a blog, so I would just noindex categories, tags, and archives to eliminate duplicate content errors. But with Woocommerce product categories and tags, I've noticed that many e-commerce websites with a high domain authority actually rank for certain keywords just by having their categories/tags indexed. For example, keyword 'hippie clothes' = etsy.com/category/hippie-clothes (fictional example). The problem is that if I have 100 products and 10 categories and tags on my site, it creates THOUSANDS of duplicate content errors; but if I noindex categories and tags, they will never rank well once my domain authority rises... Anyone have experience/comments about this? I use the SEO by Yoast plugin. Your help is greatly appreciated! Thank you in advance. -Marc
Intermediate & Advanced SEO | marcandre
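For reference, noindexing a taxonomy (whether via a Yoast setting or by hand) comes down to roughly this tag in the head of the category or tag archive; "follow" keeps the crawler flowing through to the products (hypothetical placement):

```html
<!-- On a tag or category archive you don't want indexed -->
<meta name="robots" content="noindex, follow" />
```
-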
Avoiding Duplicate Content with Used Car Listings Database: Robots.txt vs Noindex vs Hash URLs (Help!)
Hi Guys, We have developed a plugin that allows us to display used vehicle listings from a centralized, third-party database. The functionality works similarly to autotrader.com or cargurus.com, and there are two primary components:
1. Vehicle Listings Pages: the page where the user applies various filters to narrow the vehicle listings and find the vehicle they want.
2. Vehicle Details Pages: the page where the user actually views the details of a specific vehicle. It is served via Ajax, in a dialog box on the Vehicle Listings Pages. Example functionality: http://screencast.com/t/kArKm4tBo
We do want the Vehicle Listings pages (#1) indexed and ranking. These pages have additional content besides the vehicle listings themselves, those results are randomized or sliced/diced in different and unique ways, and they're updated twice per day.
We do not want to index #2, the Vehicle Details pages, as these pages appear and disappear all the time based on dealer inventory, and don't have much value in the SERPs. Additionally, other sites such as autotrader.com, Yahoo Autos, and others draw from this same database, so we're worried about duplicate content. For instance, entering a snippet of dealer-provided content for one specific listing that Google indexed yielded 8,200+ results: Example Google query.
We did not originally think that Google would even be able to index these pages, as they are served via Ajax. However, it seems we were wrong, as Google has already begun indexing them. Not only is duplicate content an issue, but these pages are not meant to be navigated to directly! If a user were to reach one of these URLs directly from the SERPs, they would see a page that isn't styled right. Now we have to determine the right solution to keep these pages out of the index: robots.txt, noindex meta tags, or hash (#) internal links.
Robots.txt advantages:
- Super easy to implement
- Conserves crawl budget for large sites
- Ensures the crawler doesn't get stuck. After all, if our website only has 500 pages that we really want indexed and ranked, and vehicle details pages constitute another 1,000,000,000 pages, it doesn't seem to make sense to make Googlebot crawl all of those pages.
Robots.txt disadvantages:
- Doesn't prevent the pages from being indexed, as we've seen, probably because there are internal links to them. We could nofollow these internal links, thereby minimizing indexation, but that would mean nofollowing the 10-25 internal links on each Vehicle Listings page (will Google think we're PageRank sculpting?)
Noindex advantages:
- Does prevent vehicle details pages from being indexed
- Allows ALL pages to be crawled (advantage?)
Noindex disadvantages:
- Difficult to implement: the vehicle details pages are served via Ajax, so they have no <head> tag. The solution would have to involve the X-Robots-Tag HTTP header and Apache, sending a noindex header based on query-string variables, similar to this stackoverflow solution. This means the plugin functionality is no longer self-contained, and some hosts may not allow these types of Apache rewrites (as I understand it)
- Forces (or rather allows) Googlebot to crawl hundreds of thousands of noindexed pages. I say "force" because of the crawl budget required. The crawler could get stuck/lost in so many pages, and may not like crawling a site with 1,000,000,000 pages, 99.9% of which are noindexed.
- Cannot be used in conjunction with robots.txt. After all, the crawler never reads the noindex meta tag if it is blocked by robots.txt
Hash (#) URL advantages:
- By using hash (#) URLs for the links from Vehicle Listings pages to Vehicle Details pages (such as "Contact Seller" buttons), coupled with JavaScript, the crawler won't be able to follow/crawl these links. Best of both worlds: crawl budget isn't overtaxed by thousands of noindex pages, and the internal links that got robots.txt-disallowed pages indexed are gone.
- Accomplishes the same thing as nofollowing these links, but without looking like PageRank sculpting (?)
- Does not require complex Apache stuff
Hash (#) URL disadvantages:
- Is Google suspicious of sites with (some) internal links structured like this, since it can't crawl/follow them?
Initially, we implemented robots.txt (the "sledgehammer solution"). We figured we'd have a happier crawler this way, as it wouldn't have to crawl zillions of partially duplicate vehicle details pages, and we wanted it to be as if these pages didn't even exist. However, Google seems to be indexing many of these pages anyway, probably based on internal links pointing to them. We could nofollow the links pointing to these pages, but we don't want it to look like we're PageRank sculpting or something like that.
If we implement noindex on these pages (and doing so is a difficult task in itself), then we will be certain these pages aren't indexed. However, to do so we will have to remove the robots.txt disallowal in order to let the crawler read the noindex tag on these pages. Intuitively, it doesn't make sense to me to make Googlebot crawl zillions of vehicle details pages, all of which are noindexed, and it could easily get stuck/lost. It seems like a waste of resources, and in some shadowy way bad for SEO.
My developers are pushing for the third solution: using the hash URLs. This works on all hosts and keeps all functionality in the plugin self-contained (unlike noindex), and conserves crawl budget while keeping vehicle details pages out of the index (unlike robots.txt). But I don't want Google to slap us 6-12 months from now because it doesn't like links like these.
Any thoughts or advice you guys have would be hugely appreciated, as I've been going in circles, circles, circles on this for a couple of days now. Also, I can provide a test site URL if you'd like to see the functionality in action.
Intermediate & Advanced SEO | browndoginteractive
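Two of the options above, sketched under stated assumptions. First, the X-Robots-Tag approach named in the noindex disadvantages, assuming Apache with mod_rewrite and mod_headers, and a hypothetical vehicleId query-string parameter on detail-page requests:

```apache
# Flag requests whose query string contains the (hypothetical) vehicleId parameter
<IfModule mod_rewrite.c>
  RewriteEngine On
  RewriteCond %{QUERY_STRING} (^|&)vehicleId= [NC]
  RewriteRule .* - [E=NOINDEX_PAGE:1]
</IfModule>
# Send a noindex header only on flagged requests; "follow" lets link equity flow
Header set X-Robots-Tag "noindex, follow" env=NOINDEX_PAGE
```

And the hash-URL option: the fragment never reaches the server, so the crawler has no separate URL to fetch (hypothetical markup; the details dialog would be opened by JavaScript bound to the link):

```html
<a href="#vehicle-12345" class="vehicle-details-trigger">Contact Seller</a>
```
-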
ECommerce product listed in multiple places, best SEO practice?
We have an eCommerce site we have built for a customer, and products are allowed to appear in more than one product category within the site. Now I know this is a bad idea from a duplicate content point of view, but we are going to allow the customer to select which of the multiple categories a product appears in will be its default category. This gives us a way of defining the default URL for a product. So am I correct in thinking that on all the other URLs where the product appears we should add a rel=canonical pointing to the default URL, to stop duplicate content? Is this the best way?
Intermediate & Advanced SEO | spiralsites
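A sketch of the rel=canonical the question describes, with hypothetical URLs (this is indeed the standard way to consolidate a product reachable under multiple categories):

```html
<!-- On https://www.example.com/garden-games/red-widget (a non-default path),
     point at the product's chosen default URL -->
<link rel="canonical" href="https://www.example.com/outdoor-toys/red-widget" />
```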