User comments on the same page as the content, or on a separate page?
-
With the latest Google updates, both cracking down on useless pages and concentrating on high-quality content, would it be more beneficial to include user-posted comments on the same page as the content, or on a separate page? A separate page might be worth it once there are enough comments, especially as extra pages add extra PageRank, but would it be better to include them with the original article/post? Your ideas and suggestions are greatly appreciated.
-
Actually, on second thought, I think the view-all page solution with rel=canonical (http://googlewebmastercentral.blogspot.com/2011/09/view-all-in-search-results.html) might be the smarter choice.
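For anyone unfamiliar with that markup, here's a minimal sketch of the approach (the URLs are hypothetical): each paginated comment page declares the view-all page as canonical, so ranking signals consolidate there.

```html
<!-- In the <head> of a component page such as
     https://www.example.com/article/comments?page=2 (hypothetical URL):
     point the canonical at the view-all version -->
<link rel="canonical" href="https://www.example.com/article/view-all" />
```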
-
Hi Peter
That's actually a pretty good idea, I like it!
The only thing I'm not sure about: if we do paginate, the product description should still stay at the top of the page while only the comments below change. That way we get duplicate content, and the paginated pages with the additional comments probably wouldn't rank well anyhow, I guess. So using rel=next/prev and rel=canonical might be the right choice, even if, that way, only the first page of comments will be able to rank?
-
After posting this topic, we found that including all of the comments on the same page helped with long-tail queries and the like. We haven't implemented pagination with the comments, though; I think the most we have on one page is around 120 reasonably lengthy comments. I would add pagination for anything longer than that - you could use the rel="next" and rel="prev" tags on those pages to ensure that the engines group the pages together, so they know they are the same piece of content. I hope this helps! Let us know what you decide.
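As a rough illustration of that tagging (again with made-up URLs), page 2 of a paginated comment series would declare its neighbours in the `<head>`:

```html
<!-- In the <head> of https://www.example.com/article/comments?page=2
     (hypothetical URL): declare the previous and next pages in the
     sequence so crawlers treat the series as one piece of content -->
<link rel="prev" href="https://www.example.com/article/comments?page=1" />
<link rel="next" href="https://www.example.com/article/comments?page=3" />
```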
-
I'm wondering about the same thing. Would you actually limit the number of user comments on a page? And if so, would you place the surplus comments on an extra page?
-
You will want the comments on the same page as the actual content, for sure. The UGC on the main page will help keep it fresh, and it gives people another possible reason to link to it. Asking a user to browse to a second page would also make it that much less likely they'd actually comment. Keeping it simple is best. It's the same idea as why you'd want your blog on the same subdomain as your main site, as in the newest Whiteboard Friday.
-
Putting comments on a separate page is a PITA.
**"...especially as extra pages add extra PageRank..."** Extra pages have nothing to do with adding PageRank. In fact, the more pages you have, the less PageRank any single page on your site has.
Related Questions
-
Creating pages as exact-match URLs - good, or an over-optimization indicator?
We all know that exact-match domains are not getting the same results in the SERPs with the algo changes Google has been pushing through. Does anyone have experience, or know whether that also applies to an exact-match URL for a page (not a domain)? Example keyword: "cars that start with A". Which way is better when creating your pages on a non-exact-match domain site: www.sample.com/cars-that-start-with-a/, which targets "cars that start with A", or www.sample.com/starts-with-a/, which again targets "cars that start with A"? Keep in mind that you'll add more pages that start the exact same way, as you want to cover all the letters of the alphabet. So:
www.sample.com/cars-that-start-with-a/
www.sample.com/cars-that-start-with-b/
www.sample.com/cars-that-start-with-c/
or
www.sample.com/starts-with-a/
www.sample.com/starts-with-b/
www.sample.com/starts-with-c/
Hope someone here in the Moz community can help out. Thanks so much!
White Hat / Black Hat SEO | lidush
-
Indexing content behind a login
Hi, I manage a website in the pharmaceutical industry where only healthcare professionals are allowed to access the content. For this reason, most of the content sits behind a login. My challenge is that we have a massive amount of interesting and unique content on the site, and I want healthcare professionals to find it via Google! At the moment, if a user tries to access this content they are prompted to register / log in. My question is: if I look for the Googlebot user agent and allow it to access and index the content, will this be classed as cloaking? I'm assuming that it will. If so, how can I get around this? We have a number of open landing pages, but we're limited in what indexable content we can put on them! I look forward to all of your suggestions, as I'm struggling for ideas now! Thanks, Steve
White Hat / Black Hat SEO | stever999
-
[linkbuilding] Link partner page on a webshop - is it working?
Hello Mozzers, I am wondering about the effect of link building by swapping links between websites and adding a link-partner page, containing hundreds of links, to a web shop. A new competitor has come into the Google SERP for the keywords I am targeting. The competitor has way more links than our web shop: it has a page with hundreds of links to other web shops, which in turn link back to its web shop (not all of them link back, by the way). I always thought there was no use sharing links with other websites this way, creating a huge page with hundreds of links; it benefits neither website. Still, it does seem to work (?), and this strategy is used by a lot of web shops in the Netherlands. How do you guys look at this? Which of you use a strategy like this? Should I pick up this strategy myself?
White Hat / Black Hat SEO | auke1810
-
Site search externally hosted pages - Penguin
Hi all, On the site www.myworkwear.co.uk we have an externally hosted site search that also creates separately hosted pages for popular searches, which rank in Google and bring in traffic. An example is listed below: Google search: blue work trousers (appears on the front page of Google). Site Champion page: http://workwear.myworkwear.co.uk/workwear/Navy%20Blue%20Work%20Trousers. Nearest category page: http://www.myworkwear.co.uk/category/Mens-Work-Trousers-936.htm. Could this be a penalisation or duplication factor? Could these be interpreted as a dodgy link factor? Thanks in advance for your help. Kind regards, Andy Southall
White Hat / Black Hat SEO | MarzVentures
-
Content ideas?
We run a printing company, and we are struggling to come up with unique content people will actually want to read. Is there any way of getting the ball rolling? We were thinking of ideas such as an exhibition guide, but this seems to have been overdone. Any help would be appreciated.
White Hat / Black Hat SEO | BobAnderson
-
Starting every page title with the keyword
I've read everywhere that it's vital to get your target keyword to the front of the title you're writing. Taking into account that Google likes things looking natural, I wanted to check on writing titles like this, for example: "Photographers Miami - Find the Right Equipment and Accessories", repeated for every page (maybe a page on photography in Miami, one on videography in Orlando, etc.). Is that a smart way to write titles, or would clearly stacking keywords at the front of every title be less beneficial than other approaches?
White Hat / Black Hat SEO | xcyte
-
Using Programmatic Content
My company has been approached a number of times by computer-generated content providers (like Narrative Science and Comtex), who supply computer-generated content to a number of big-name websites. Does anyone have any experience working with companies like this? We were burned by the first Panda update because we were using boilerplate forms for content.
White Hat / Black Hat SEO | SuperMikeLewis
-
My attempt to reduce duplicate content got me slapped with a doorway page penalty. Halp!
On Friday, 4/29, we noticed that we suddenly lost all rankings for all of our keywords, including searches like "bbq guys". This indicated to us that we were being penalized for something. We immediately went through the list of things that changed, and the most obvious was that we were migrating domains. On Thursday, we turned off one of our older sites, http://www.thegrillstoreandmore.com/, and 301 redirected each page on it to the same page on bbqguys.com. Our intent was to eliminate duplicate content issues. When we realized that something bad was happening, we immediately turned off the redirects and put thegrillstoreandmore.com back online. This did not unpenalize bbqguys.

We had been looking for two days and had not been able to find what we did wrong, at least not until tonight. I just logged back in to Webmaster Tools to do some more digging, and I saw that I had a new message: "Google Webmaster Tools notice of detected doorway pages on http://www.bbqguys.com/". It is my understanding that doorway pages are pages jammed with keywords and links and devoid of any real content. We don't do those pages. The message does link me to Google's definition of doorway pages, but it does not give me a list of pages on my site that it does not like. If I could see even one or two pages, I could probably figure out what I am doing wrong. I find this most shocking, since we go out of our way not to do anything spammy or sneaky. Since we try hard not to do anything that is even grey hat, I have no idea what could possibly have triggered this message and the penalty.

Does anyone know how to go about figuring out which pages specifically are causing the problem, so I can change them or take them down? We are slowly canonicalizing URLs and changing the way different parts of the sites build links to make them all the same, and I am aware that these things need work. We were in the process of discontinuing some sites and 301 redirecting pages to a more centralized location to try to stop duplicate content. The day after we instituted the 301 redirects, the site we were redirecting all of the traffic to (the main site) got blacklisted. Because of this, we immediately took down the 301 redirects.

Since the Webmaster Tools notifications are different (i.e., "too many URLs" is a notice-level message and "doorway pages" is a separate alert-level message), and the too-many-URLs notice has been triggering for a while now, I am guessing that the doorway pages problem has nothing to do with URL structure. According to the help files, doorway pages are a content problem with a specific page. The architecture suggestions are helpful, and they reassure us that we should be working on them, but they don't help me solve my immediate problem.

I would really be thankful for any help identifying the pages that Google thinks are "doorway pages", since this is what I am being immediately and severely penalized for. I want to stop doing whatever it is I am doing wrong; I just don't know what it is! It feels like we got penalized for trying to do what we think Google wants. If we could figure out what a "doorway page" is, and how our 301 redirects triggered Googlebot into saying we have them, we could more appropriately reduce duplicate content. As it stands now, we are not sure what we did wrong. We know we have duplicate content issues, but we also thought we were following webmaster guidelines on how to reduce the problem, and we got nailed almost immediately when we instituted the 301 redirects. Thanks for any help identifying the problem!
White Hat / Black Hat SEO | CoreyTisdale
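For context on the kind of redirect described above, here is a minimal sketch of a per-page 301 mapping, assuming Apache with mod_rewrite on the old domain; this is an illustration of the technique, not the poster's actual configuration.

```apache
# .htaccess on the old domain (illustrative sketch, not the poster's config)
RewriteEngine On
# 301 redirect every path to the same path on the new domain
RewriteRule ^(.*)$ http://www.bbqguys.com/$1 [R=301,L]
```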