Is the copy-and-paste function from tynt.com SEO friendly?
-
Hello, you suggest http://www.tynt.com/publisher-tools/copy-and-paste-to-share-content/ in your pro tips, but I wonder why you are not using it on your own blog at seomoz.org.
I noticed that when somebody copies and pastes something from my blog, a strange code is appended to the link. Example: http://www.janik.cc/webdesigner-blog/2011/02/sample/#ixzz1FQvadWNV
Is the #ixzz1FQvadWNV perhaps a problem from an SEO point of view?
-
I don't think they need help circulating their content anymore, as it's already widely distributed. I'm also not completely sure about this, but I think Google ignores the pound sign (#) and everything after it, since it's so commonly used for in-page anchors and for AJAX/JavaScript. Try searching for # in Google.
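How Google treats the fragment is its own call, but the fragment itself never reaches the server, as this minimal Python sketch (using the example URL from the question) illustrates:

```python
from urllib.parse import urldefrag

# The clean post URL and the version Tynt produces when copied text is pasted.
clean = "http://www.janik.cc/webdesigner-blog/2011/02/sample/"
copied = "http://www.janik.cc/webdesigner-blog/2011/02/sample/#ixzz1FQvadWNV"

# urldefrag() splits a URL into everything before the '#' and the fragment.
# Browsers keep the fragment client-side, so the server (and a crawler
# fetching the page) only ever requests the part before the '#'.
base, fragment = urldefrag(copied)

print(fragment)        # ixzz1FQvadWNV
print(base == clean)   # True -- both point at the same resource
```

A self-referencing rel="canonical" tag on the post is a cheap extra safeguard: whatever gets appended to copied links, the clean URL stays the one you are asking engines to index.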
Related Questions
-
Alt Tags - how important for SEO?
Hi, I know alt tags should be on images; however, at the moment I have 23,741 missing on the site. How important are these? It's a big project for someone to update and I need some justification. Thanks, Mozzers 🙂
Technical SEO | BeckyKey
-
SEO for subdomains
I've recently started working on a website that has previously been targeting subdomain pages for its SEO and has some OK rankings. To better explain, let me give an example: a site is called domainname.com and has subdomains that are targeted for SEO (i.e. pageone.domainname.com, pagetwo.domainname.com, pagethree.domainname.com). The site is going through a redevelopment and can reorganise its pages onto other URLs. What would be the best way to approach this situation for SEO? Ideally, I'm tempted to recommend that new targeted pages be created - domainname.com/pageone, domainname.com/pagetwo, domainname.com/pagethree, etc. - and to 301 redirect the old pages to them (a sketch of that redirect mapping follows this question). Does a subdomain page structure (e.g. pageone.domainname.com) have any negative effect on SEO? Also, is there a good way to track rankings? I find that a lot of rank checkers don't pick up subdomains. Any tips on the best approach would be appreciated. Hope I've made sense!
Technical SEO | Gavo
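The approach described above (subfolders plus 301s from the old subdomain URLs) is usually mapped out page by page before touching server config. Purely as an illustration, and using the question's own placeholder hostnames, here is a minimal Python sketch of that one-to-one redirect map; the pairs it prints would then be turned into permanent redirects in whatever server or CMS is in use.

```python
from urllib.parse import urlparse, urlunparse

ROOT_DOMAIN = "domainname.com"

# Placeholder subdomain URLs taken from the question's examples.
OLD_URLS = [
    "http://pageone.domainname.com/",
    "http://pagetwo.domainname.com/",
    "http://pagethree.domainname.com/some-article",
]

def subdomain_to_subfolder(url):
    """Map pageone.domainname.com/path -> domainname.com/pageone/path."""
    parts = urlparse(url)
    subdomain = parts.netloc.replace("." + ROOT_DOMAIN, "")
    new_path = "/" + subdomain + parts.path.rstrip("/")
    return urlunparse((parts.scheme, ROOT_DOMAIN, new_path, "", "", ""))

# One permanent (301) redirect per old URL, page to page -- not everything
# to the homepage -- so existing link equity follows the content.
for old in OLD_URLS:
    print(f"301  {old}  ->  {subdomain_to_subfolder(old)}")
```
-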
Question about creating friendly URLs
I am working on creating new SEO-friendly URLs for my company website. The products are the items with the highest search volume and each is very geo-specific. There is not a high search volume for the geo-location associated with the product, but the searches we do get convert well. Do you think it is preferable to leave the location out of the URL or include it?
Technical SEO | theLotter
-
Adding an SEO Friendly Blog Module
Hi, our website is developed in .NET. We need to add a new out-of-the-box blog module. Is there such a thing as a blog module that is good for SEO - for example, with updatable page titles and title tags? If so, can anyone recommend one? Thanks, Andrew
Technical SEO | Studio33
-
Multiple domain SEO strategy
Hi Mozzers, I'm an AM at a web dev agency. We're building a new site for a client who sells paint to different markets: paint for boats, paint for the construction industry, paint for... well, you get the idea! Would we be better off setting up separate domains - boatpaintxxx.com, housepaintxxx.com, etc. - and treating each as a standalone microsite for its own SEO activity, or having them as individual pages/subdomains of a single domain - paints4all.com or something? From what I've read today, including the excellent Beginner's Guide, I'm guessing there's no definitive answer! Feedback appreciated! Thanks.
Technical SEO | rikmon
-
Pager + SEO - Is it possible?
Hi, I am having this issue. I know that pagination is not a friend of SEO, but I want to know what is best to do in this situation. For example, I work at a news company and have a lot of news articles that are very long, so I use a pager. Here is the problem: suppose the URL is www.mysite.com/news/id/here-comes-the-title. When you enter that URL you are viewing the first page, which has its own meta title, keywords and description. The problem comes when the user goes to page 2 of the article. What should I do? 1. Change the URL, e.g. to www.mysite.com/news/id/here-comes-the-title-PAGE2, www.mysite.com/news/PAGE2-id/here-comes-the-title or www.mysite.com/news/id/PAGE2/here-comes-the-title? 2. Add a meta robots noindex to pages 2, 3, 4, 5...? With option 2 I think I am losing the opportunity to index the body of my article. Is this correct? (A sketch of option 1 follows this question.) Thanks
Technical SEO | informatica810
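As an illustration of option 1 above: one commonly recommended setup has been to give every page of the article its own indexable URL and link the pages together with rel="prev"/rel="next" in the head, rather than noindexing pages 2+ (which, as the question suspects, would keep that body text out of the index). The sketch below is a minimal Python mock-up using the question's -PAGE2 style URLs; the exact pattern matters less than keeping it consistent.

```python
# Sketch of option 1: every page gets its own URL, and rel="prev"/"next"
# links in the <head> tell crawlers the pages form one article.

BASE = "http://www.mysite.com/news/id/here-comes-the-title"
TOTAL_PAGES = 3  # hypothetical article length

def page_url(page):
    """Page 1 keeps the clean URL; later pages use the -PAGEn variant."""
    return BASE if page == 1 else f"{BASE}-PAGE{page}"

def head_links(page):
    """rel prev/next tags for the given page of the article."""
    links = []
    if page > 1:
        links.append(f'<link rel="prev" href="{page_url(page - 1)}">')
    if page < TOTAL_PAGES:
        links.append(f'<link rel="next" href="{page_url(page + 1)}">')
    return links

for p in range(1, TOTAL_PAGES + 1):
    print(page_url(p))
    for tag in head_links(p):
        print("   ", tag)
```
-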
SEO on a .dk site
Hi, my client has asked if we can do SEO for their .dk site. My question is: does all link building and article submission have to be in Danish?
Technical SEO | Westernoriental
-
SEO Friendly Calendar System
Does anyone have a recommendation for a calendar system that is SEO friendly? I have been using Helios Calendar, but the current version lacks proper SEO bones: no canonical URLs, a mini calendar that generates links to empty events from 1950 to 2020, 302 redirects, and thousands of crawl errors in Google Webmaster Tools. The developer has plans to implement some fixes, and I would rather not rip apart what is currently there to fix core issues. I have found that calendars in general are a nightmare. If anyone has any suggestions, or has experience in tidying up Helios, I would be interested. Thanks, Dan
Technical SEO | DanLaBate