One Blog Comment Now on Many Pages of The Same Domain
-
My question: I left a blog comment on this site http://blogirature.com/2012/07/01/half-of-200-signals-in-googles-ranking-algorithm-revealed/#comment-272
under the name "Peter Rota". For some reason the "recent comments" widget is a sitewide link, so the link to my website is now on pretty much every page of their site. I also noticed that the anchor text for each one of my links says "Peter Rota".
This is my concern: will Google think it's spammy if I'm on a lot of pages of the same site from one blog comment, and will I be penalized for the exact same anchor text on each page? If that's the case, what could I do to try to get the links removed? Thanks.
-
Okay, thanks guys for all your answers; I really appreciate it.
-
Ok, I'm honestly not sure why the comment is appearing on every page of the website... there must be something with the comment plug-in they are using (I may be wrong on this).
But as for your real concern, that Google will see it as spam: I don't think so. One reason is that your anchor text is similar to your domain name, and that is one thing that makes it look natural.
That said, I would recommend not using blog comments as a way to generate links to your site, as Google can count that as spam!
-
In my opinion you will be completely fine.
For starters, the sitewide link is nofollow, just like in the original comment thread. As nofollow links are removed from the link graph, the assumption is they are also ignored by any webspam calculations. It's a discussion I've read a few times around the SEO forums, and most people seem to concur.
In addition, even if the links were _not_ nofollowed, the anchor text here is only a person's name, rather than any money keywords. This would undoubtedly make it far less likely to be a cause for concern in the context of Penguin.
edit: apologies for repeating what Irving said, I didn't see his comment until I'd finished mine.
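If you want to verify for yourself whether a site's links to you are nofollowed, you can inspect the page source for `rel="nofollow"` on the anchor tags. Here's a rough sketch using only Python's standard library; the sample HTML below is made up for illustration, not taken from the actual site:

```python
from html.parser import HTMLParser

class NofollowChecker(HTMLParser):
    """Collects (href, is_nofollow) for every <a> tag in a document."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        # rel can hold multiple space-separated tokens, e.g. "nofollow ugc"
        rel_tokens = (attrs.get("rel") or "").lower().split()
        self.links.append((attrs.get("href"), "nofollow" in rel_tokens))

# Hypothetical snippet of a "recent comments" sidebar widget
sample_html = """
<div class="recent-comments">
  <a href="http://example.com/peter-rota" rel="nofollow">Peter Rota</a>
  <a href="http://example.com/other">Other Commenter</a>
</div>
"""

checker = NofollowChecker()
checker.feed(sample_html)
for href, nofollow in checker.links:
    print(href, "nofollow" if nofollow else "followed")
```

In practice you'd fetch the live page (e.g. with `urllib.request`) and feed its HTML to the parser; if every instance of your link comes back nofollowed, the sitewide placement shouldn't pass any link equity either way.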
-
Ok - an article about being a good SEO citizen, with black hat techniques in the comments section - how funny.
I wouldn't be concerned, since they nofollow the link.
But if they are legit, you should be able to contact them to get the comment removed.