Leaving comments on blogs when HTML is removed
-
I found the following blog. It is PageRank 5, dofollow:
http://www.unssc.org/web1/programmes/rcs/cca_undaf_training_material/teamrcs/forumdetail.asp?ID=32
If you attempt to leave a comment with HTML, the HTML is removed. There is a button that lets you add a link to a comment, but if you use it, the link gets redirected to the blog's domain instead of your site. Yet people are still leaving links that point to their intended sites, as recently as today.
Look at this comment:
Comment posted by: Alex on 09/09/2011 "I love to se percorsi on this site very often"
How is this done, if anyone knows? I got the code down to this.
The important part being mce_real_href.
-
Search Engine People published a fresh post yesterday about blog commenting guidelines that likely mirrors what many people here would say about commenting: http://www.searchenginepeople.com/blog/blog-commenting-guideline.html It's a good read for developing a comment strategy that will work in the long term.
-
I've looked at a few of your questions, and would like to suggest that you read the Beginner's Guide to SEO to get a good idea of the big picture of SEO and a feel for the SEOmoz approach. The people here tend to focus on providing value to the site's visitors and finding the right way to do things so that your results last, rather than chasing shortcuts that will only work for a short while until the next search engine update.
-
A bot? You're scaring me a bit, like the lunch lady in Billy Madison. Are you going to write a bot to spam your links on their comment pages? That sounds fruitless, as these comment pages are already filled from top to bottom with spam.
If you use Firebug in Firefox, or Chrome's developer tools, you can type the text you want in the field and add the link, then inspect the input where the text appears and grab all the HTML that was generated. Then it's just a matter of having a bot inject that HTML into the page. You'll still have to beat their CAPTCHA, though, to fully automate the process.
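For the "grab the generated HTML" step, here is a minimal sketch of parsing the editor's output and pulling out the anchor markup, including the mce_real_href attribute mentioned in the question. The sample snippet and its attribute values are made up for illustration; the real markup would come from inspecting the page as described above.

```python
from html.parser import HTMLParser

class LinkGrabber(HTMLParser):
    """Collect the attributes of every <a> tag in a chunk of HTML."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self.links.append(dict(attrs))  # attrs arrives as (name, value) pairs

# Hypothetical editor output, modeled on the comment quoted in the question:
editor_output = (
    '<p>I love to <a href="http://yourdomain.com" '
    'mce_real_href="http://yourdomain.com">percorsi</a> on this site</p>'
)

grabber = LinkGrabber()
grabber.feed(editor_output)
print(grabber.links)
# → [{'href': 'http://yourdomain.com', 'mce_real_href': 'http://yourdomain.com'}]
```

The mce_real_href attribute is what TinyMCE-style editors use to remember the URL you actually entered, which is presumably why the working comments still carry the intended link.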
-
Another thing: does anyone know how to do this automatically, with a bot, instead of highlighting the string, pressing the link button, and entering the info?
-
I tried it and it worked...
My guess is you're not putting the "http://" in the Link URL field, so the editor treats it as a relative URL and ends up appending whatever you entered to the page's URL. Make sure to put the absolute URL in the Link URL field, starting with "http://", and it should work fine.
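The relative-URL behavior described here is just standard URL resolution; a quick sketch with Python's urljoin shows what the browser does with and without the scheme (yourdomain.com is a placeholder):

```python
from urllib.parse import urljoin

# The forum page the comment lives on:
page = "http://www.unssc.org/web1/programmes/rcs/cca_undaf_training_material/teamrcs/forumdetail.asp?ID=32"

# No scheme: treated as a relative path and resolved against the page's directory
print(urljoin(page, "yourdomain.com"))
# → http://www.unssc.org/web1/programmes/rcs/cca_undaf_training_material/teamrcs/yourdomain.com

# With "http://": an absolute URL that points where you intended
print(urljoin(page, "http://yourdomain.com"))
# → http://yourdomain.com
```

This is exactly why a link entered without "http://" ends up pointing back into the forum instead of at your own site.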
-
OK, I figured it out a bit more: you need to enter http://yourdomain.com, not yourdomain.com, when the pop-up box asks for the URL. My fault. Thanks for the good answer.
-
Yeah, but it puts the following in front of your URL:
http://www.unssc.org/web1/programmes/rcs/cca_undaf_training_material/
which makes it a link to the forum, not your website.
Try it and you will see.
-
You can do it the same way you add links to comments in this Q&A forum: write the anchor text in the comment, select the text, click the link button, and then enter the target URL in the Link URL field in the pop-up that appears.