User comments with page content or as a separate page?
-
With the latest Google updates both cracking down on useless pages and rewarding high-quality content, would it be beneficial to include user-posted comments on the same page as the content, or on a separate page? Having a separate page with enough comments on it might be worth it, especially as extra pages add extra pagerank, but would it be better to include them with the original article/post? Your ideas and suggestions are greatly appreciated.
-
Actually, on second thought, I think the view-all page solution with rel=canonical (http://googlewebmastercentral.blogspot.com/2011/09/view-all-in-search-results.html) might be the smarter choice.
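For anyone unfamiliar with the approach in that post: each paginated comment page declares the single view-all page as its canonical URL, so the engines consolidate signals onto one page. A minimal sketch (the URLs are hypothetical, not from any real site):

```html
<!-- On a paginated comments page, e.g. /article?comments-page=2 -->
<!-- Point the canonical at the single view-all version of the article -->
<link rel="canonical" href="https://www.example.com/article/view-all" />
```

The trade-off is that the view-all page must load reasonably fast, since that is the version Google will tend to show in results.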
-
Hi Peter
That's actually a pretty good idea, I like it!
Only thing I'm not sure about: if we do paginate, the product description should still stay at the top of the page, while only the comments below change. That way we'd get duplicate content, and I guess the paginated pages with the additional comments wouldn't rank well anyway. So using rel=next/prev and rel=canonical might be the right choice, even if that means only the first page of comments will be able to rank?
-
After posting this topic, we found that including all of the comments on the same page helped with long-tail queries and the like. We haven't implemented pagination for the comments, though; I think the most we have on one page is around 120 reasonably lengthy comments. I would add pagination for anything longer than that. You could use the rel="next" and rel="prev" tags on those pages to ensure that the engines group the pages together and know they are the same piece of content. I hope this helps! Let us know what you decide.
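As a sketch of the markup being described (the URLs and query parameter are hypothetical), a middle page in a paginated comment series would link to its neighbours from the document head:

```html
<!-- On /article?comments=2, the middle page of a three-page comment series -->
<link rel="prev" href="https://www.example.com/article?comments=1" />
<link rel="next" href="https://www.example.com/article?comments=3" />
<!-- The first page omits rel="prev"; the last page omits rel="next" -->
```

This tells the engines the pages form one sequence, rather than treating each comment page as standalone near-duplicate content.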
-
I'm wondering about the same thing. Would you actually limit the number of user comments on a page? And if so, would you place the surplus comments on an extra page?
-
You will want the comments on the same page as the actual content, for sure. The UGC on the main page will help keep it fresh, as well as being another possible reason for people to link to it. Asking a user to browse to a second page would also make it that much less likely they would actually comment. Keeping it simple would be best. It's much the same idea as why you would want your blog on the same subdomain as your main site, as in the newest Whiteboard Friday.
-
Comments on a separate page are a PITA.
**"...especially as extra pages add extra pagerank..."** Extra pages have nothing to do with adding PageRank. In fact, the more pages you have, the less PageRank any single page on your site has.