User comments with page content or as a separate page?
-
With the latest Google updates cracking down on thin pages and rewarding high-quality content, would it be better to include user-posted comments on the same page as the content or on a separate page? A separate page might be worth it once there are enough comments, especially if extra pages add extra PageRank, but would it be better to include them with the original article/post? Your ideas and suggestions are greatly appreciated.
-
Actually, on second thought I think the view-all page solution with rel=canonical (http://googlewebmastercentral.blogspot.com/2011/09/view-all-in-search-results.html) might be the smarter choice.
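For reference, the view-all approach pairs each paginated comment page with a rel="canonical" pointing at the view-all URL, so engines consolidate signals there. A minimal sketch, with hypothetical URLs:

```html
<!-- In the <head> of a paginated comments page, e.g. /article/comments?page=2 -->
<!-- Tells engines the view-all version is the preferred URL to index -->
<link rel="canonical" href="https://example.com/article/comments/view-all" />
```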
-
Hi Peter
That's actually a pretty good idea, I like it!
Only thing I'm not sure about: if we do paginate, the product description should still stay at the top of the page, while only the comments below change. That would give us duplicate content, and the paginated pages with the additional comments probably wouldn't rank well anyway, I guess. So using rel=next/prev together with rel=canonical might be the right choice, even if that means only the first page of comments will be able to rank?
-
After posting this topic, we found that including all of the comments on the same page helped with long-tail queries and the like. We haven't implemented pagination for the comments, though; I think the most we have on one page is around 120 reasonably lengthy comments. I would add pagination for anything longer than that. You could use the rel="next" and rel="prev" link tags on those pages so the engines group them together and know they are the same piece of content. I hope this helps! Let us know what you decide.
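The rel="next"/rel="prev" tags mentioned above go in the head of each paginated page. A rough sketch for a middle page of the series, with illustrative URLs:

```html
<!-- Page 2 of a paginated comment series (URLs are hypothetical) -->
<link rel="prev" href="https://example.com/article/comments?page=1" />
<link rel="next" href="https://example.com/article/comments?page=3" />
<!-- Page 1 would carry only rel="next"; the final page only rel="prev" -->
```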
-
I'm wondering about the same thing. Would you actually limit the amount of user comments on a page? And if so, would you place the surplus comments on an extra page?
-
You will want the comments on the same page as the actual content, for sure. The UGC on the main page will help keep it fresh and is another possible reason for people to link to it. Asking a user to browse to a second page would also make it that much less likely they would actually comment. Keeping it simple is best. It's the same idea as why you would want your blog on the same subdomain as your main site, as in the newest Whiteboard Friday.
-
Comments on a separate page are a PITA.
**...especially as extra pages add extra pagerank...** Extra pages have nothing to do with adding PageRank. In fact, the more pages you have, the less PageRank any single page on your site has.
Related Questions
-
Question RE: Links in Headers, Footers, Content, and Navigation
This question is regarding this Whiteboard Friday from October 2017 (https://moz.com/blog/links-headers-footers-navigation-impact-seo). Sorry that I am a little late to the party, but I wanted to see if someone could help out. So, in theory, if header links matter less than in-content links, and links lower on the page have their anchor text value stripped from them, is there any point in linking to an asset in the content that is also in the header, other than for user experience (which I understand should be paramount)? Just want to be clear. Also, if in-content links are better than header links, then hypothetically an industry would want to find ways to organically link to landing pages rather than including those landing pages in the header, no? Again, this is just from a Google link-equity perspective, not a user-experience perspective; just trying to wrap my head around the lesson.
White Hat / Black Hat SEO | 3VE
-
Cloaking for better user experience and deeper indexing - grey or black?
I'm working on a directory that has around 800 results (image-rich results) in the top-level view. This will likely grow over time, so it needs to support thousands. The main issue is that it is built in Ajax, so paginated pages are dynamically generated and look like duplicate content to search engines. If we limit the results, then not all of the individual directory listing pages can be found. I have an idea that serves users and search engines what they want but uses cloaking. Is it grey or black? I've read http://moz.com/blog/white-hat-cloaking-it-exists-its-permitted-its-useful and none of the examples quite apply. To allow users to browse through the results (without having a single page with a slow load time), we include pagination links which are not shown to search engines. This is a positive user experience. For search engines we display all results on a single page (since there is no limit on the number of links, so long as they are not spammy). This requires cloaking, but it is ultimately serving the same content in slightly different ways. 1. Where on the scale of white to black is this? 2. Would you do this for a client's site? 3. Would you do it for your own site?
White Hat / Black Hat SEO | ServiceCrowd_AU
-
Duplicate content or not? If you're using abstracts from external sources you link to
I was wondering if a page (a blog post, for example) that offers links to external web pages along with abstracts from those pages would be considered a duplicate content page and therefore penalized by Google. For example, I have a page that has very little original content (just two or three sentences that summarize or sometimes frame the topic) followed by five references to different external sources. Each reference contains a title, which is a link, and a short abstract, which basically is the first few sentences copied from the page it links to. So, except for a few sentences in the beginning, everything is copied from other pages. Such a page would be very helpful for people interested in the topic, as the sources it links to were analyzed beforehand, handpicked, and placed there to enhance user experience. But will this format be considered duplicate or near-duplicate content?
White Hat / Black Hat SEO | romanbond
-
Switching site content
I have been advised to take a particular path with my domain. To me it seems "black hat", but I'll ask the experts: Is it acceptable, when one owns an exact-match location domain, e.g. london.com, to run it as a tourist information site, gathering links from Wikipedia, the BBC, local paper/radio/sports websites etc., and then after 6-12 months switch the content to a business site? What could the penalties be? Please advise...
White Hat / Black Hat SEO | klsdnflksdnvl
-
Finding out why Bing gave page-level penalty?
In the last couple of weeks Bing has gradually removed 5 pages of my website from their SERPs. The URLs are totally gone. They all had top-5 rankings and were removed out of nowhere. How can I investigate what went wrong with these pages? Are there perhaps experts who are willing to investigate this for a fee? How can I recover from a page-level penalty? I have no messages in my Bing Webmaster Tools account.
White Hat / Black Hat SEO | wellnesswooz
-
Will Google Penalize Content put in a Div with a Scrollbar?
I noticed Moosejaw was adding quite a bit of content to the bottom of category pages via a div tag that makes use of a scroll bar. Could a site be penalized by Google for this technique? Example: http://www.moosejaw.com/moosejaw/shop/search_Patagonia-Clothing____
White Hat / Black Hat SEO | BrandLabs
-
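For context, the technique asked about above is just CSS overflow on a fixed-height container; the text is fully present in the HTML source, so crawlers see it even though users must scroll. A rough sketch (markup is illustrative, not Moosejaw's actual code):

```html
<!-- The content sits in normal HTML; CSS only constrains the visible area -->
<div style="height: 200px; overflow-y: scroll;">
  <p>Long block of category copy... all of it appears in the page source,
     so search engines can read it even though users must scroll the div.</p>
</div>
```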
Single Domain With Different Pages Deep Linking To Different Pages On External Domain
I've been partaking in an extensive trial study and will be releasing the results soon, but I do have quite a strong indication of the answer to this question and would like to see what everyone else thinks first, to see where the common industry mindset is at. Let's say SiteA.com/page1.html is PR5 and links out to SiteB.com/page1.html. This of course would count as a valuable backlink. Now, what would happen if SiteA.com/page2.html, which is also PR5, links out to SiteB.com/page2.html? The link from SiteA is coming from a different page and is also pointing to a different deeplink on SiteB, but it will come from the same IP address. What would the benefit be of having multiple deeplinks in this way (as outlined above; please read it carefully before responding) as opposed to having just a single deeplink from the domain? If a benefit does exist, does it start to become trivial? This has nothing to do with sitewide links. Serious answers only please.
White Hat / Black Hat SEO | stevenheron