Influence of users' comments on a page (on-page SEO)
-
Do you think that when Google crawls your page, it "monitors" comment updates and uses them as a ranking factor? If Google is looking for social signals, looking at comment updates might be a social signal as well (OK, a lot easier to manipulate, but still social).
Thanks
-
Do you think comments are a ranking factor because of keyword usage, or because search engines can check the comment count, for instance?
How can a search engine determine who commented if it's a local (on-site) comment system? What about Facebook comments?
-
I think comments are a ranking factor. I think we will also see who comments (i.e., author rank) become more important than merely having comments.
-
Ahhh, gotcha! That's not a bad idea. The biggest challenge I see for them is determining which comments are authentic. It seems like the majority of site owners don't understand how ScrapeBox (or Comment Kahuna, Fast Blog Finder, DoFellow) works and will approve SOME spam no matter what.
I would definitely be interested in seeing posts with high-quality dialogue going on, though. Maybe that's another reason they are pushing the +1 button...
-
I was talking about blog comments. In a highly competitive market it might be interesting for Google to list sites with more traffic (or higher user engagement: more likes, tweets, and comment updates) on top.
I had this question because I've just added Facebook comments to my page, and I'm going to use their Graph API to load the comments and render them within the page itself (so Google can crawl the Facebook comments); a rough sketch of what I mean is below.
The only downside of comments is that they are usually very shallow in terms of keyword usage.
Thanks
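For anyone curious, here is a minimal sketch of what I have in mind, assuming a Node/TypeScript back end. The Graph API version, the fields, and the object ID and access token placeholders are assumptions for illustration, so check Facebook's current documentation before relying on them.

```typescript
// Sketch: fetch Facebook comments server-side and render them into the page HTML
// so crawlers see them as regular on-page text instead of a JavaScript widget.
// The endpoint shape, fields, and placeholder values below are assumptions.

const GRAPH_BASE = "https://graph.facebook.com/v19.0"; // assumed API version
const OBJECT_ID = "<your-og-object-or-post-id>";       // hypothetical placeholder
const ACCESS_TOKEN = process.env.FB_ACCESS_TOKEN ?? "";

interface FbComment {
  from?: { name?: string };
  message?: string;
  created_time?: string;
}

async function fetchComments(): Promise<FbComment[]> {
  const url =
    `${GRAPH_BASE}/${OBJECT_ID}/comments` +
    `?fields=from,message,created_time&access_token=${ACCESS_TOKEN}`;
  const res = await fetch(url);
  if (!res.ok) throw new Error(`Graph API request failed: ${res.status}`);
  const body = (await res.json()) as { data?: FbComment[] };
  return body.data ?? [];
}

// Escape user-generated text before embedding it in the page.
function escapeHtml(s: string): string {
  return s
    .replace(/&/g, "&amp;")
    .replace(/</g, "&lt;")
    .replace(/>/g, "&gt;")
    .replace(/"/g, "&quot;");
}

// Build plain HTML that the server embeds in its response at render time.
function renderCommentsHtml(comments: FbComment[]): string {
  const items = comments
    .map(
      (c) =>
        `<li><strong>${escapeHtml(c.from?.name ?? "Anonymous")}</strong>: ` +
        `${escapeHtml(c.message ?? "")}</li>`
    )
    .join("\n");
  return `<ul class="fb-comments-static">\n${items}\n</ul>`;
}

fetchComments()
  .then((comments) => console.log(renderCommentsHtml(comments)))
  .catch((err) => console.error(err));
```

The point is that the rendered list ends up in the HTML the server sends, rather than being injected by Facebook's client-side widget after the page loads.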
-
Are you talking about blog comments or comments on social media wall posts/tweets?
If you were referring to comments within your own domain, search engines can see the last-updated information for the page and may come back to crawl the new comments.
When new content is added through the comments, they can judge the page as more or less relevant via keyword saturation and the anchor text and destinations of outbound links (a toy sketch of the keyword-saturation idea is below). It wouldn't really be a social signal, just one that tells the search engine the page is still being used (or spammed). They probably couldn't determine too much from that!
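To make the keyword-saturation point concrete, here is a toy sketch of the kind of calculation that could run when a page's comment text changes. The page text, keyword, and scoring below are made up for illustration; this is not how any search engine actually scores pages.

```typescript
// Toy illustration: how added comment text can shift a page's keyword saturation.
// Purely a sketch of the idea discussed above, not a real ranking calculation.

function keywordSaturation(text: string, keyword: string): number {
  const words = text.toLowerCase().split(/\W+/).filter(Boolean);
  if (words.length === 0) return 0;
  const kw = keyword.toLowerCase();
  const hits = words.filter((w) => w === kw).length;
  return hits / words.length; // fraction of all words that are the keyword
}

// Hypothetical page body and comments.
const pageBody = "A short post about blog comments and how comments relate to SEO.";
const comments = [
  "Great post, the comments section here is really active.",
  "I never thought comments could matter for SEO, thanks!",
];

const before = keywordSaturation(pageBody, "comments");
const after = keywordSaturation([pageBody, ...comments].join(" "), "comments");

console.log(`Saturation before comments: ${(before * 100).toFixed(1)}%`);
console.log(`Saturation after comments:  ${(after * 100).toFixed(1)}%`);
```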
Related Questions
-
Must scripts not be placed outside the HTML tag? If they are, how does Google treat the page?
Hi, we have recently received the "deceptive content" warning from Google about some of our website pages. We haven't been able to find the exact reason behind it. However, we placed some scripts outside the HTML tag on some pages (not the same pages that received the warning). We wonder whether this caused Google to flag our pages. Please help. Thanks
White Hat / Black Hat SEO
-
How to find a trustworthy SEO specialist?
How do you find a trustworthy SEO specialist if you don't know a lot about SEO yourself?
White Hat / Black Hat SEO
-
Duplicate content warning: same page but different URLs?
Hi guys, a friend of mine has a site, and when I tested it with Moz I noticed 80 duplicate content warnings. For instance, page 1 is http://yourdigitalfile.com/signing-documents.html and the warning page is http://www.yourdigitalfile.com/signing-documents.html. Another example: page 1 is http://www.yourdigitalfile.com/ and the second page is http://yourdigitalfile.com. I noticed that nearly every page on the site has another version at a different URL. Any ideas why the dev would do this? Also, the pages that received the warnings are not redirected to the newer pages, so you can go to either one. Thanks very much
White Hat / Black Hat SEO
-
The use of a ghost site for SEO purposes
Hi guys, we have just taken on a new client (.co.uk domain), and during our research we identified that they also have a .com domain which is a replica of the existing site, but all links lead to the .co.uk domain. As a result, the .com replica is pushing 5,000,000+ links to the .co.uk site. After speaking to the client, it appears they were approached by a company who said they could get the .com site ranking for local search queries and then push all that traffic to .co.uk. From analytics we can see that very little referral traffic is coming from the .com. It sounds remarkably dodgy to us: surely the duplicate site is an issue anyway for obvious reasons, and these links could also be deemed as being created for SEO gain? Does anyone have any experience of this as a tactic? Thanks, Dan
White Hat / Black Hat SEO
-
Duplicate content for product pages
Say you have two separate pages, each featuring a different product. They have so many common features that their content is virtually duplicated when you get to the bullets that break it all down. To avoid a penalty, is it advisable to paraphrase? It seems to me it would benefit the user to see it all laid out the same way, apples to apples. Thanks. I've considered combining the products on one page, but I will be examining the data to see if there's a lost benefit to not having separate pages. Ditto for simply not indexing the one that I suspect may not have much traction (requesting data to see).
White Hat / Black Hat SEO
-
SEO best practice: use tags for SEO purposes? To add or not to add to the sitemap?
Hi Moz community, new to the Moz community and hopefully this is the first post/comment of many to come. I am somewhat new to the industry and have a question that I would like to ask and get your opinions on. It most likely has a very simple answer, but here goes: I have a website for a local moving company (so small amounts of traffic and very few pages) that was built on WordPress. I was told when I first started that I should create tags for some of the cities serviced in the area. I did so and tagged the first blog post with each tag. It turned out to be about 12-15 tags, which in turn created 12-15 additional pages. These tags are listed in the footer area of each page. There are fewer than 20 pages on the website excluding the tags. Now, I know that each of these pages is showing as duplicate content. To me, this just does not seem like best practice. For someone quite new to the industry, what would you suggest I do in order to best deal with this situation? Should I even keep the tags? Should I keep them but not index them? Should I add them to or remove them from the sitemap? Thanks in advance for any help, and I look forward to being a long-time member of SEOmoz.
White Hat / Black Hat SEO
-
My rankings dropped 3 pages on 18 November 2012
Hi there, my site's rankings suddenly dropped today for my main keywords, such as "security companies in london" and "security services in london", from the first page to the 4th-5th page. These keywords ranked for the homepage, http://www.armstrongsecurity.co.uk/. Other keywords ranking for internal pages, such as http://www.armstrongsecurity.co.uk/security-services/event-security-london.html, got hit slightly as well and went a couple of listings down for "event security london" and "event security companies london". The same slight hit happened to the main keywords for http://www.armstrongsecurity.co.uk/bodyguard-for-hire-london.html. Can anyone help me get the rankings back? My site authority is around 60, which is far better than most of the sites now ranking higher than me. These are the problems I understand so far: a keyword-rich anchor text link profile for my main keywords, and over-optimised pages. Let me know if you find anything suspicious on my site that I can fix, either on-site or in my link profile. Looking forward to your help. Thanks, Gill
White Hat / Black Hat SEO
-
Google penalizes too much SEO
I just read an interesting article about a new Google penalty that will roll out in the upcoming weeks/months, as Google makes changes to the algorithm. The penalty will be targeted at websites that are over-optimized or over-SEO'ed. What do you think about this? Is this a good thing or a bad thing for us as SEO marketers? Here's the link: SEL.com/to-much-seo. I'm really curious about your points of view. Regards, Jarno
White Hat / Black Hat SEO