Influence of users' comments on a page (on-page SEO)
-
Do you think that when Google crawls your page, it "monitors" comment updates and uses them as a ranking factor? If Google is looking for social signals, comment updates might be a social signal as well (OK, a lot easier to manipulate, but still social).
thx
-
Do you think comments are a ranking factor because of their keyword usage, or because search engines can check the comment count, for instance?
How can a search engine determine who commented if it's a local comment system? And what about Facebook comments?
-
I think comments are a ranking factor, and I think we will also see who comments (i.e. author rank) become more important than merely having comments.
-
Ahhh, gotcha! That's not a bad idea. The biggest factor I see for them is how to determine which comments are authentic. It seems like the majority of site owners don't understand how ScrapeBox (or Comment Kahuna, Fast Blog Finder, DoFellow) works and will approve SOME spam no matter what.
I would definitely be interested in seeing posts with high-quality dialogue going on, though. Maybe that's another reason they are pushing the +1 button...
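To make the authenticity point concrete: even a crude heuristic catches a lot of automated comment spam, which hints at the kind of signals a search engine could compute at scale. This is a toy sketch; the signals, weights, and threshold are made-up assumptions, not anything Google has confirmed using.
```typescript
// Toy spam heuristic for blog comments. All signals and thresholds here are
// illustrative assumptions, not confirmed ranking inputs.
function spamScore(body: string): number {
  let score = 0;
  const links = (body.match(/https?:\/\//g) ?? []).length;
  score += links * 2;                                      // heavy linking is suspicious
  if (/\b(viagra|casino|payday)\b/i.test(body)) score += 5; // classic spam vocabulary
  if (body.trim().length < 30) score += 1;                 // drive-by "great post!" one-liners
  return score;
}

const verdict = spamScore("Nice! http://spam.example") >= 3 ? "hold for moderation" : "auto-approve";
console.log(verdict); // -> "hold for moderation" (1 link + very short body)
```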
-
I was talking about blog comments. In a highly competitive market it might be interesting for Google to list sites with more traffic (or higher user engagement = # of likes = # of tweets = # of comment updates) on top.
I had this question because I've just added Facebook comments to my page, and I'm going to use their Graph API to load the comments and put them within the page, so Google can crawl the Facebook comments (see the sketch below this post).
The only bad thing about comments is that they are usually very shallow in terms of keyword usage.
thx
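A minimal sketch of the Graph API idea above: fetch the comments for a URL and emit them as plain HTML so a crawler can see them in the page source. The endpoint shape, the `comments?id=` parameter, and the response fields are assumptions based on how the Comments plugin is commonly queried; check the current Graph API reference before relying on them, and the token is a placeholder.
```typescript
// Sketch: fetch Facebook comments for a page URL and emit them as crawlable
// HTML. Endpoint and response fields are assumptions; verify against the
// current Graph API docs. Runs on Node 18+ (built-in fetch).
const PAGE_URL = "https://example.com/my-post"; // hypothetical page
const ACCESS_TOKEN = "YOUR_APP_TOKEN";          // placeholder, not a real token

interface FbComment {
  from?: { name?: string };
  message?: string;
}

async function renderComments(): Promise<string> {
  const api =
    `https://graph.facebook.com/comments?id=${encodeURIComponent(PAGE_URL)}` +
    `&access_token=${ACCESS_TOKEN}`;
  const res = await fetch(api);
  const json = (await res.json()) as { data?: FbComment[] };
  const items = (json.data ?? [])
    .map((c) => `  <li><strong>${c.from?.name ?? "anonymous"}</strong>: ${c.message ?? ""}</li>`)
    .join("\n");
  // Emitting this in the server-rendered page body is what makes the
  // comments visible to crawlers (escape user-supplied text in production!).
  return `<ul class="comments">\n${items}\n</ul>`;
}

renderComments().then(console.log).catch(console.error);
```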
-
Are you talking about blog comments or comments on social media wall posts/tweets?
If you were referring to comments within your own domain, they can see the last-updated info for the page and may come back to crawl the new comments.
When new content is added through the comments, they can determine more or less relevance for the page via keyword saturation and the anchor text and destinations of outbound links. It wouldn't really be a social signal, just one that tells the search engine the page is still being used/spammed. They probably couldn't determine too much from that!
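A toy calculation of the keyword-saturation point: indexed comment text shifts a page's keyword density on a recrawl. The texts and keyword below are made up for illustration.
```typescript
// Toy keyword-density calculation: how comment text shifts a page's
// saturation for a term once a crawler picks the comments up.
function keywordDensity(text: string, keyword: string): number {
  const words = text.toLowerCase().split(/\W+/).filter(Boolean);
  const hits = words.filter((w) => w === keyword.toLowerCase()).length;
  return words.length ? hits / words.length : 0;
}

const article = "Our review of the new console covers the console hardware in depth.";
const comments = "Great console! I bought this console last week.";

console.log(keywordDensity(article, "console"));                  // ~0.167 (2 of 12 words)
console.log(keywordDensity(`${article} ${comments}`, "console")); // 0.2   (4 of 20 words)
```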
Related Questions
-
Nuisance visitors to a non-active page. What's going on?
Hi guys, for the past several months I've been getting a high volume of searches on a non-existent page, /h/9249823.html. These searches come from all over the world, from different domains, and have a zero session duration. They are automatically forwarded to my home page. The source per Google Analytics is 12-reasons-for-seo.com. The full referrer is 12.reasons-for-seo.com/seo2php. Any idea what is provoking this activity? Any chance it's screwing with my legitimate search results or rankings?
White Hat / Black Hat SEO | Lysarden
-
Page plummeting with an optimisation score of 97. HELP
Hi everyone, one of my pages has an optimisation score of 93 but ranks in 50+ place. What on earth can I do to address this? It's a course page, so I've added the 'course' schema. I've added alt tags that include the keyword, and the UX signals aren't bad. The keyword is in the title tag. It has a meta description. I added an extra 7 internal, anchor-rich links pointing at the page this week. Nothing seems to address it. Any ideas? Cheers, Rhys
White Hat / Black Hat SEO | SwanseaMedicine
-
UKBF 'forex' clones appearing
Hi all, just been looking at my referring domains and it seems someone is taking the pleasure of cloning the UK Business Forums website and adding 'forex' based links on all the external anchors. This includes everyone who is listed in their directory. I've put below the domains I know of, but if anyone else knows of more, please add them so we can all get them disavowed.
domain:redwood96.ru
domain:zanier.it
domain:selskie-zori.ru
domain:gabrielloni.it
domain:reserva-ideal.com
domain:imexaf.com
domain:rassemblementpourjouy.com
domain:windsorlegion.ca
domain:powerconector.com
domain:eltallerdelorfebrewd.com
domain:aepedome.net
domain:spkvarc.ru
domain:mtdnk.ru
domain:koning.rs
domain:rassemblementpourjouy.com
domain:imexaf.com
domain:gabrielloni.it
White Hat / Black Hat SEO | phero
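As a side note, a list like this is easy to turn into a disavow file programmatically. The sketch below dedupes the domains above (three appear twice in the thread) and writes them in the `domain:` syntax that Google's disavow tool documents; the file name and comment header are just conventions.
```typescript
// Build a deduplicated disavow.txt from the reported clone domains.
// "domain:" lines and "#" comments are Google's documented disavow syntax.
import { writeFileSync } from "node:fs";

const reported = [
  "redwood96.ru", "zanier.it", "selskie-zori.ru", "gabrielloni.it",
  "reserva-ideal.com", "imexaf.com", "rassemblementpourjouy.com",
  "windsorlegion.ca", "powerconector.com", "eltallerdelorfebrewd.com",
  "aepedome.net", "spkvarc.ru", "mtdnk.ru", "koning.rs",
  "rassemblementpourjouy.com", "imexaf.com", "gabrielloni.it", // repeats from the thread
];

const unique = [...new Set(reported)].sort();
const lines = ["# UKBF 'forex' clone domains", ...unique.map((d) => `domain:${d}`)];

writeFileSync("disavow.txt", lines.join("\n") + "\n");
console.log(`wrote ${unique.length} unique domains to disavow.txt`);
```
-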
Ajax Pagination on Ecommerce category pages - Good or Bad?
We have an ecommerce site. We installed an AJAX feature so that when you scroll down to, say, the end of 6 rows of products, it loads another page below the seam. The question is: is this good or bad for SEO? Any tests you can suggest? Thanks, Ben
White Hat / Black Hat SEO | bjs2010
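A common compromise for the question above (hedged, since what crawlers execute changes over time): keep real paginated URLs that render full HTML on their own, and let the infinite-scroll loader keep the address bar in sync as it appends products. A browser-side sketch, with a hypothetical endpoint and selector:
```typescript
// Infinite scroll that preserves crawlable, paginated URLs. The endpoint,
// query parameters, and selector are hypothetical.
let page = 1;
let loading = false;

async function loadNextPage(): Promise<void> {
  if (loading) return;
  loading = true;
  page += 1;
  const res = await fetch(`/category/grills?page=${page}&fragment=1`); // returns an HTML fragment
  const html = await res.text();
  document.querySelector("#product-grid")?.insertAdjacentHTML("beforeend", html);
  // Keep the URL in sync so each scroll position maps to a real page that
  // also renders fully on a direct (non-AJAX) request.
  history.pushState({ page }, "", `/category/grills?page=${page}`);
  loading = false;
}

window.addEventListener("scroll", () => {
  const nearBottom =
    window.innerHeight + window.scrollY >= document.body.offsetHeight - 400;
  if (nearBottom) void loadNextPage();
});
```
-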
Same content, different target area SEO
So OK, I have a gambling site that I want to target at Australia, Canada, the USA, and England separately, while still having the .com for worldwide (or not; read further). The website's content basically stays the same for all of them, perhaps with small changes to the layout and information order (a different order for the top 10 gambling rooms). My question 1 would be: how should I mark the content for Google and other search engines so that it isn't considered "duplicate content"? As I have mentioned, the content will actually BE duplicate, but I want to target users in different areas, so I believe search engines should have a proper way not to penalize my websites for trying to reach users on their own country TLDs. What I have thought of so far (see the sketch after this question for point 2):
1. A separate Webmaster Tools account for every domain -> we will need to set up user targeting to the specific country in it.
2. Use hreflang tags to indicate that this content is for GB users ("en-GB"), and the same for the other domains. More info about it: http://support.google.com/webmasters/bin/answer.py?hl=en&answer=189077
3. Get a country-specific IP address (the physical location of the server is not hugely important, just the IP).
4. It would be great if the IP address for the .co.uk were from a different C-class than the one for the .com.
Is there anything I am missing here? Question 2: should I target the .com at the USA market, or are there other options? (I'm not based in the USA, so I believe .us is out of the question.) Thank you for your answers. T
White Hat / Black Hat SEO | SEO_MediaInno
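For point 2 above, the hreflang markup can be generated once per variant. This sketch prints the link tags for a set of hypothetical ccTLD domains; every variant should carry the full set, including a self-reference, and `x-default` is the documented catch-all value.
```typescript
// Generate hreflang link tags for each country variant. Domains are
// hypothetical stand-ins for the gambling site's ccTLDs.
const alternates: Record<string, string> = {
  "en-GB": "https://example.co.uk/",
  "en-AU": "https://example.com.au/",
  "en-CA": "https://example.ca/",
  "en-US": "https://example.com/",
  "x-default": "https://example.com/", // fallback for all other visitors
};

const tags = Object.entries(alternates)
  .map(([lang, href]) => `<link rel="alternate" hreflang="${lang}" href="${href}" />`)
  .join("\n");

// Paste the full set into the <head> of every variant, self-reference included.
console.log(tags);
```
-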
Hit by Negative SEO
I've seen some discussion here about whether or not negative SEO is real. I've just spent 6 months recovering from Penguin: rewriting content, removing hundreds of bad links, and seeing our traffic slowly improve. Yesterday we noticed in Google Webmaster Tools that we're ranking for the term "Free Sex". Here: http://screencast.com/t/ezoo2sCRXQ Now we have discovered that thousands of "sex" links have been directed at our improving domain. I am convinced I know who the culprit is. What would you advise a client to do in my situation? Forget about removing these damn links; I don't have the time, money, or energy to go through that again, and I'm sure he can add them much faster than I can ever remove them. Is the disavow tool the best answer in this case? Or is there an international court of SEO justice that I can appeal to?
White Hat / Black Hat SEO | DarrenX
-
How to rank internal pages?
Hello, I have a website about consoles. On the homepage are a few thoughts about what consoles are and a short history. The main attraction is the pages about the Xbox 360, PlayStation 3, Nintendo Wii, and PSP Vita. So, I want to rank my homepage and my internal pages about the consoles, ranking for "xbox 360", "playstation 3", etc., each one on a separate page of course. Basically, I want to rank for brands. My main questions are:
1. How much link building should I do for my homepage, considering that I'm not really interested in ranking it as much as the internal pages? In percentages, how would it look? Random (stupid) example: 60% of links to the homepage, 10% to each internal page?
2. I guess I must build links to the internal pages, since otherwise they won't rank well from links to the homepage alone.
3. Considering the Penguin update, what % of the overall anchors to each internal page should my main keyword account for?
Thank you very much for your help!
White Hat / Black Hat SEO | corodan
-
My attempt to reduce duplicate content got me slapped with a doorway page penalty. Halp!
On Friday, 4/29, we noticed that we suddenly lost all rankings for all of our keywords, including searches like "bbq guys". This indicated to us that we were being penalized for something. We immediately went through the list of things that had changed, and the most obvious was that we were migrating domains. On Thursday, we turned off one of our older sites, http://www.thegrillstoreandmore.com/, and 301 redirected each page on it to the same page on bbqguys.com. Our intent was to eliminate duplicate content issues. When we realized that something bad was happening, we immediately turned off the redirects and put thegrillstoreandmore.com back online. This did not unpenalize bbqguys.
We've been looking for things for two days and had not been able to find what we did wrong, at least not until tonight. I just logged back in to Webmaster Tools to do some more digging, and I saw that I had a new message: "Google Webmaster Tools notice of detected doorway pages on http://www.bbqguys.com/". It is my understanding that doorway pages are pages jammed with keywords and links and devoid of any real content. We don't do those pages. The message does link me to Google's definition of doorway pages, but it does not give me a list of pages on my site that it does not like. If I could see even one or two pages, I could probably figure out what I am doing wrong. I find this most shocking since we go out of our way not to do anything spammy or sneaky. Since we try hard not to do anything that is even grey hat, I have no idea what could possibly have triggered this message and the penalty.
Does anyone know how to go about figuring out which pages specifically are causing the problem, so I can change them or take them down? We are slowly canonicalizing URLs and changing the way different parts of the sites build links to make them all the same, and I am aware that these things need work. We were in the process of discontinuing some sites and 301 redirecting pages to a more centralized location to try to stop duplicate content. The day after we instituted the 301 redirects, the site we were redirecting all of the traffic to (the main site) got blacklisted. Because of this, we immediately took down the 301 redirects.
Since the Webmaster Tools notifications are different (i.e. "too many URLs" is a notice-level message and "doorway pages" is a separate alert-level message), and the too-many-URLs one has been triggering for a while now, I am guessing that the doorway pages problem has nothing to do with URL structure. According to the help files, doorway pages are a content problem with a specific page. The architecture suggestions are helpful, and they reassure us that we should be working on them, but they don't help me solve my immediate problem.
I would really be thankful for any help identifying the pages that Google thinks are "doorway pages", since this is what I am getting immediately and severely penalized for. I want to stop doing whatever it is I am doing wrong; I just don't know what it is! Thanks for any help identifying the problem! It feels like we got penalized for trying to do what we think Google wants. If we could figure out what a "doorway page" is, and how our 301 redirects triggered Googlebot into saying we have them, we could more appropriately reduce duplicate content. As it stands now, we are not sure what we did wrong. We know we have duplicate content issues, but we also thought we were following webmaster guidelines on how to reduce the problem, and we got nailed almost immediately when we instituted the 301 redirects.
White Hat / Black Hat SEO | CoreyTisdale
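For reference, the page-to-page 301s described above look roughly like this in an Express app. The URL pairs are hypothetical placeholders; a real migration would need the complete old-to-new mapping, and 301 is the status code that signals a permanent move.
```typescript
// Sketch of a page-to-page 301 redirect map (Express). URL pairs are
// hypothetical; a real migration would load the full mapping from data.
import express from "express";

const app = express();

const redirectMap: Record<string, string> = {
  "/grills/weber-kettle.html": "https://www.bbqguys.com/grills/weber-kettle",
  "/smokers/offset.html": "https://www.bbqguys.com/smokers/offset",
};

app.use((req, res, next) => {
  const target = redirectMap[req.path];
  if (target) {
    res.redirect(301, target); // permanent: tells crawlers the page moved for good
  } else {
    next();
  }
});

app.listen(3000);
```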