User comments with page content or as a separate page?
-
With the latest Google updates cracking down on useless pages and concentrating on high-quality content, would it be beneficial to include user-posted comments on the same page as the content, or on a separate page? Having a separate page might be worth it once there are enough comments on it, especially as extra pages add extra pagerank - but would it be better to include them with the original article/post? Your ideas and suggestions are greatly appreciated.
-
Actually, on second thoughts, I think the view-all page solution with rel=canonical (http://googlewebmastercentral.blogspot.com/2011/09/view-all-in-search-results.html) might be the smarter choice.
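For anyone implementing this, a minimal sketch of the pattern (the URLs are made up for illustration): every paginated comment page declares the view-all page as its canonical, so Google consolidates signals there.

```html
<!-- In the <head> of /article?comments=2 and every other paginated variant;
     the URLs here are hypothetical placeholders -->
<link rel="canonical" href="https://www.example.com/article/view-all" />
```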
-
Hi Peter
That's actually a pretty good idea, I like it!
Only thing I'm not sure about: if we do paginate, the product description should still stay at the top of the page, while only the comments below change. That way we'd get duplicate content, and the paginated pages with the additional comments probably wouldn't rank well anyway, I guess. So using rel=next/prev together with rel=canonical might be the right choice, even if that means only the first page of comments will be able to rank?
-
After posting this topic, we found that including all of the comments on the same page helped with long-tail queries and the like. We haven't implemented pagination with the comments, though; I think the most we have on one page is around 120 reasonably lengthy comments. I would add pagination for anything longer than that - you could use the rel="next" and rel="prev" tags on these pages (see the sketch below) to ensure that the engines group the pages together and know they are the same piece of content. I hope this helps! Let us know what you decide.
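To make that concrete, here's a rough sketch of the head markup for page 2 of a three-page comment set (the URL scheme is made up): the first page would carry only rel="next", the last only rel="prev".

```html
<!-- In the <head> of /article/comments/2 - hypothetical URL scheme -->
<link rel="prev" href="https://www.example.com/article/comments/1" />
<link rel="next" href="https://www.example.com/article/comments/3" />
```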
-
I'm wondering about the same thing. Would you actually limit the number of user comments on a page? And if so, would you place the surplus comments on an extra page?
-
You will want the comments on the same page as the actual content, for sure. The UGC on the main page will help keep it fresh, as well as giving people another possible reason to link to it. Asking a user to browse to a second page would also make it that much less likely they would actually comment. Keeping it simple is best. It's much the same reasoning as keeping your blog on the same subdomain as your main site, as in the newest Whiteboard Friday.
-
Putting comments on a separate page is a PITA.
**"...especially as extra pages add extra pagerank..."** Extra pages have nothing to do with adding PageRank. In fact, the more pages you have, the less PageRank any single page on your site has.
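To see why, look at the normalized form of the classic simplified PageRank formula, where the baseline share handed to each page shrinks as the total page count N grows:

```latex
PR(A) = \frac{1 - d}{N} + d \sum_{i=1}^{n} \frac{PR(T_i)}{C(T_i)}
```

Here d is the damping factor (typically around 0.85), N is the total number of pages, the T_i are the pages linking to A, and C(T_i) is the number of outbound links on T_i. The (1-d)/N baseline means more pages get a smaller slice each, unless the new pages attract links of their own.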
Related Questions
-
What can I put on a 404 page?
When it comes to SEO, what can I put on a 404 page? I want to add content that actually makes the page useful, so visitors will be more likely to stay on the website. Most pages just have a big 404 image and a couple of sentences saying what happened. I am wondering whether Google would like it if there were blog suggestions or navigational functions - something like the sketch below is what I have in mind.
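(All of the copy and paths below are placeholders, just to illustrate the idea; whatever ends up on the page, it should still return an actual 404 status code rather than a 200, so it isn't treated as a soft 404.)

```html
<!-- Illustrative 404 template - headings, links, and paths are placeholders -->
<h1>Sorry, we can't find that page (404)</h1>
<p>The page may have moved, or the link may be broken.</p>
<form action="/search" method="get">
  <input type="search" name="q" placeholder="Search this site" />
</form>
<ul>
  <li><a href="/">Home</a></li>
  <li><a href="/blog/">Latest blog posts</a></li>
  <li><a href="/contact/">Contact us</a></li>
</ul>
```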
-
Separating the syndicated content because of Google News
Dear MozPeople, I am working on rebuilding the structure of a news website. For various reasons we need to keep syndicated content on the site, but at the same time we would like to apply for Google News again (we were accepted in the past but got kicked out because of the duplicate content). So I am facing the challenge of separating the original content from the syndicated content, as requested by Google, but I am not sure which option is better: **A) Put all syndicated content into "/syndicated/", then Disallow /syndicated/ in robots.txt and set a NOINDEX meta on every page.** In this case I am not sure what would happen if we linked to these articles from other parts of the website - we would waste our link juice, right? Also, Google would not crawl these pages, so it would never see the noindex. Is this OK for Google and Google News? **B) NOINDEX meta on every page.** Google would crawl these pages but not show them in the results; we would still lose the link juice from links pointing to these pages, right? So... is there any difference? And should we put a "nofollow" attribute on all the links pointing to the syndicated pages? Is there anything else important? This is the first time I am attempting this kind of "hack", so I am not exactly sure what to do or how to proceed. Thank you!
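To make option B concrete, this is roughly the markup I have in mind (the /syndicated/ path comes from our plan; the article slug is a made-up example):

```html
<!-- In the <head> of each syndicated article, e.g. /syndicated/some-story/ -->
<meta name="robots" content="noindex, follow" />

<!-- On internal links pointing at syndicated pages, if we decide to
     add the nofollow hint as well: -->
<a href="/syndicated/some-story/" rel="nofollow">Syndicated story</a>
```

My understanding is that this is exactly why options A and B shouldn't be combined: if robots.txt disallows the directory, Google never fetches the pages and never sees the noindex meta tag.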
-
Does Duplicate Content Actually "Penalize" a Domain?
Hi all, some co-workers and I were in a conversation this afternoon about whether duplicate content actually causes a penalty on your domain. Reference: https://support.google.com/webmasters/answer/66359?hl=en and http://searchengineland.com/googles-matt-cutts-duplicate-content-wont-hurt-you-unless-it-is-spammy-167459. Neither source from Google says "duplicate content causes a penalty"; however, they do allude to spammy content negatively affecting a website. Why it came up: we were originally talking about syndicated content (the same content across multiple domains, e.g. "5 explanations of bad breath") for the purpose of social media sharing. Imagine if dentists across the nation had access to this piece of content (5 explanations of bad breath) simply for engagement with their audience. They would use it to post on social media and to talk about in the office, but they would not want to rank for that piece of duplicated content. This type of duplicated content would be valuable to dentists in different cities who need engagement with their audience or simply need the content. This is all hypothetical but serious at the same time. I would love some feedback and sourced information / case studies. Is duplicated content actually penalized, or will that piece of content just not rank? (Feel free to reference that example article as a real-world example.) **When I say penalized, I mean "the domain is given a negative penalty for showing up in SERPs" - therefore, the website would not rank for "dentists in san francisco, ca". That is my definition of penalty (feel free to correct it if you disagree).** Thanks all, and I look forward to a fun, resourceful conversation on duplicate content for purposes outside of SEO. Cole
-
Glossary pages - keyword stuffing danger?
I've put together a glossary of terms related to my industry that have SEO value, and I am planning on building out a section on our site with unique pages for each term. However, most of these terms have synonyms or are highly similar to other valuable terms. If I were to make a glossary and, on each page (which will have high-quality, valuable, and accurate definitions and more), wrote something like "{term}, also commonly referred to as {synonym}, {synonym}", would I run the risk of keyword-stuffing penalties? My only other idea, beyond creating a glossary with separate pages defining each synonym, is to use schema.org markup to add synonyms to the HTML of the page, but that could be seen as an even more grey-hat type of keyword stuffing. I guess one other option would be to work the synonyms into the definition so that the presence of the keyword reads more organically. Thanks!
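For the schema.org idea, this is roughly what I'd try - a DefinedTerm with alternateName for the synonyms (the term, description, and URL below are made-up examples, and whether search engines actually use this as a ranking signal is an open question):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "DefinedTerm",
  "name": "Canonical tag",
  "alternateName": ["rel=canonical", "canonical link element"],
  "description": "An HTML element that tells search engines which URL is the preferred version of a page.",
  "url": "https://www.example.com/glossary/canonical-tag"
}
</script>
```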
-
I need a lot of content completed in a short amount of time. Suggestions on where to look?
I'm looking for writers to write content for 1,000+ keywords, at 300-400 words per keyword, and I would like this done by the end of July. Any suggestions or recommendations on where to find a team that can produce quality content in that amount of time? Thank you!
-
Obscene anchor text linking to non-existent pages on my site
My website seems to be rapidly accumulating links from what appear to be reputable websites, all pointing to non-existent pages on my site, and the anchor text of many of these links is obscene. Here is the URL of one of the pages linking to me. I contacted the originating site a couple of weeks ago and they are looking into it, but I've not heard back. I'm guessing the originating sites have been hacked. Should I be concerned? Why are they linking to pages on my site that don't exist? http://www.radicalartistsagency.com/htmlarea/language/0content_abo_utus.html Looking at the page source of that page reveals the hidden links.
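For anyone wondering what "hidden links" means here, the injected markup typically looks something like this (an illustrative reconstruction, not copied from the actual page):

```html
<!-- Hacked pages often carry injected anchors hidden with CSS, e.g.: -->
<div style="display:none">
  <a href="https://victim-site.example/nonexistent-page/">obscene anchor text</a>
</div>
```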
-
Doorway Page? or just a flawed idea?
I have a website on a .co.uk TLD that is primarily focused on the UK. Understandably, I get very little US traffic, even though a lot of the content is applicable to the UK or US, and could be made more so with a little tinkering. The domain has some age to it and ranks quite well for a variety of keywords and phrases, so it seems sensible to keep the site on this domain. The .com version of the domain is no longer available, and the current owner does not seem inclined to sell it to me. So I am considering registering a very similar .com domain and simply using it to drive some traffic to the .co.uk site. To do this, I would have the same category pages and the same (or similar) list of links to the various pages in those categories, but instead of linking to a page on the new .com, it would take visitors to the existing page on the .co.uk. I would make this transparent to visitors ("Take a look at these pages on our sister site bluewidgets.co.uk"), and the .com would have some unique content of its own. Would this be considered some kind of doorway site/page (a content-rich doorway), or is it simply a bad idea that is unlikely to drive any traffic?