Server-Side JavaScript Redirects
-
I would like to use a server-side JavaScript redirect to redirect only visitors referred from a certain site. So, let's say anyone clicking a link to my site's page A from seomoz.org would automatically be redirected to page B. All other users, as well as direct and search engine traffic, would see the regular page A. The reason I am doing this is that the linking site links to page A, which doesn't serve those users the correct content. Rather than contacting the webmaster to change the link to point to page B, I want to redirect them. Is there any danger of Google penalizing this as cloaking? And how would they be able to tell?
-
Ideally, if you do have any contact with the webmaster, you'd be better off asking them to change the link. If you don't, starting to network with people already willing to link to you is a great way to get more links anyway.
If you really don't want to try to speak with them (and again, you really should): I have done something similar with affiliates. When they send a visitor, I serve a co-branded page based on the referral header, and I haven't seen anything bad happen.
Although Google is getting better at reading JavaScript, it still doesn't seem to follow links executed in real time. As you're not intending to deceive the bot, your use of it is fairly legitimate, but I couldn't tell you how it would be interpreted once the datacentres have a look at it.
Here are the guidelines - http://www.google.com/support/webmasters/bin/answer.py?answer=66355 - it's up to you whether you think you're breaking them.
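The referrer-based redirect being discussed can be sketched in plain JavaScript (the paths and the helper name are illustrative assumptions, not the asker's actual setup):

```javascript
// Sketch of the referrer-based redirect described above.
// PAGE_A, PAGE_B and the helper name are illustrative assumptions.
const REFERRER_HOST = 'seomoz.org';
const PAGE_A = '/page-a';
const PAGE_B = '/page-b';

// Decide where a request should land, given its path and Referer header.
function resolveTarget(path, refererHeader) {
  if (path !== PAGE_A) return path;      // only page A is affected
  if (!refererHeader) return PAGE_A;     // direct and search-engine traffic
  try {
    const host = new URL(refererHeader).hostname;
    // Match seomoz.org and any of its subdomains.
    if (host === REFERRER_HOST || host.endsWith('.' + REFERRER_HOST)) {
      return PAGE_B;                     // referred visitors get page B
    }
  } catch (e) {
    // Malformed Referer header: fall through to the regular page.
  }
  return PAGE_A;
}
```

Since a crawler sends no such referrer, it would always see the regular page A; the switch keys only on the Referer header, which is essentially what the affiliate co-branding setup mentioned above does.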
Related Questions
-
Deleting 301 Redirect URLs from the CMS
Hi everyone, would there be a negative SEO effect from deleting pages with 301 redirects in your CMS? Does anyone know the average time it takes for authority to transfer through a redirect? Thanks, Jon
White Hat / Black Hat SEO | JMSCC
.com geotargeting redirect to subdomains - will it affect SEO?
Hi guys, we have a .com domain with geoIP on it, so UK visitors go to .co.uk and US visitors go to .com/us. We're migrating to another platform, so we're thinking of keeping a "dummy" server just to do this geoIP pointing for us. Essentially, .com will just point over to the right place and host a specific .com/abc (which is generic for everyone worldwide).
Current scenario:
.com (Magento + geoIP)
.com/us (US Magento)
.co.uk (UK - geoIP redirect to Shopify)
.com/abc (sits on the Magento server)
Wanted scenario:
.com - used for geoIP and a specific .com/abc (for all users)
.co.uk (UK) - Shopify eCom
.com/us -> migrating to us.xx.com (USA) - Shopify eCom
I just wanted to know if this will affect our rankings on Google? Also, any advice on best practices here would be great. Thanks! Nitesh
White Hat / Black Hat SEO | Infruition
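The geoIP routing this question describes can be sketched minimally like so (the country lookup itself is assumed to come from some geoIP service, and the hostnames are hypothetical stand-ins for the real stores):

```javascript
// Minimal sketch of a geoIP routing table like the one described.
// Hostnames are hypothetical stand-ins, not the poster's real domains.
const GEO_TARGETS = {
  GB: 'https://www.example.co.uk', // UK Shopify store
  US: 'https://us.example.com',    // US Shopify store
};
const DEFAULT_TARGET = 'https://www.example.com'; // everyone else

// Map a visitor's country code (from a geoIP lookup) to a storefront.
function geoTarget(countryCode) {
  return GEO_TARGETS[countryCode] || DEFAULT_TARGET;
}
```

The "dummy" .com server would then issue a redirect to geoTarget(country). One caveat worth noting: Googlebot crawls mostly from US IPs, so a hard geoIP redirect can hide the regional versions from it unless they are also linked (e.g. via hreflang annotations).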
Bad for SEO to have two very similar websites on the same server?
Is it bad for SEO to have two very similar sites on the same server? What's the best way to set this up?
White Hat / Black Hat SEO | WebServiceConsulting.com
Redirecting location-specific domains
I am working on a project for a physician who only cares about reaching patients within a specific geographic region. He has a new technique at his practice and wants to get the word out via radio spots. I want to track the effectiveness of the radio campaigns without using call-tracking numbers or special promo codes. Since the physician's primary domain is very long (but well established), my thought is to register 3-4 short domains referencing the technique and location, so they would be easy for listeners to remember and type in later, and 301 these domains to the relevant landing page on the main domain. As an alternative, each domain could be a single relevant landing page with a link to the relevant procedure on the main site. It's not as if anything deceptive is going on; rather, I would simply be using a domain in place of a call-tracking number. I think I should be able to view the type-in traffic in Analytics, but would Google have an issue with this? Thoughts and suggestions appreciated!
White Hat / Black Hat SEO | SCW
Suggestions for redirecting an old URL to a new URL with parentheses ()?
What should I use in .htaccess to redirect an old URL with parentheses to a new URL, like below?
RedirectMatch 301 http://www.olddomain.com/buy/nike-shoes/kobe(7)/red http://www.newdomain.com/buy/nike-shoes/kobe(7)/red
Or
RedirectMatch 301 http://www.olddomain.com/buy/nike-shoes/kobe\(7\)/red http://www.newdomain.com/buy/nike-shoes/kobe(7)/red
White Hat / Black Hat SEO | esiow2013
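For what it's worth, RedirectMatch treats its first argument as a regular expression, so literal parentheses need escaping. This plain-JavaScript sketch (same path as the question) shows the difference:

```javascript
// RedirectMatch's pattern is a regex, so "(7)" is a capture group
// unless escaped. The path below is the one from the question.
const path = '/buy/nike-shoes/kobe(7)/red';

// Unescaped: (7) is a group that matches the bare character "7".
const unescaped = new RegExp('^/buy/nike-shoes/kobe(7)/red$');
// Escaped: \( and \) match the literal parentheses in the URL.
const escaped = new RegExp('^/buy/nike-shoes/kobe\\(7\\)/red$');

unescaped.test(path); // does not match the literal "(7)" in the path
escaped.test(path);   // matches
```

So the escaped variant is the one that would actually fire in Apache; the unescaped pattern silently matches nothing.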
Removing/redirecting bad URLs from the main domain
Our users create content, which we host at a separate URL for a web version. Originally this was hosted on our main domain. This caused problems, because Google was seeing all these different types of content on our main domain. The page content was all over the place and (we think) may have harmed our main domain's reputation. About a month ago, we added a robots.txt rule to block the URLs in that folder, so that Google doesn't crawl those pages and ignores them in the SERPs. We have now gone a step further and are 301 redirecting all those user-created URLs to a totally new domain (not affiliated with our brand or main domain). This should have been done from the beginning, but it wasn't. Any suggestions on how we can remove all those original URLs and make Google see them as not affiliated with the main domain? Or should we just give it the good ol' time recipe and let it fix itself?
White Hat / Black Hat SEO | redcappi
Separate Servers for Humans vs. Bots with Same Content Considered Cloaking?
Hi, we are considering using separate servers depending on whether a bot or a human lands on our site, to prevent overloading our servers. Just wondering if this is considered cloaking when the content remains exactly the same for both the bot and the human, just served from different servers. And if it isn't considered cloaking, will it affect the way our site is crawled, or hurt rankings? Thanks
White Hat / Black Hat SEO | Desiree-CP
Dust.js Client-side JavaScript Templates & SEO
I work for a commerce company, and our IT team is pushing to switch our JSP server-side templates over to client-side templates using a JavaScript library called Dust.js. Dust.js is a client-side templating solution that separates the presentation layer from the data layer. The problem with front-end solutions like this is that they are not SEO friendly, because all the content is served up with JavaScript. Dust.js can render your client-side content server-side if it detects Googlebot or a browser with JavaScript turned off, but I'm not sold on this being "safe".
Read about LinkedIn switching over to Dust.js:
http://engineering.linkedin.com/frontend/leaving-jsps-dust-moving-linkedin-dustjs-client-side-templates
http://engineering.linkedin.com/frontend/client-side-templating-throwdown-mustache-handlebars-dustjs-and-more
An explanation of this: "Dust.js server side support: if you have a client that can't execute JavaScript, such as a search engine crawler, a page must be rendered server side. Once written, the same dust.js template can be rendered not only in the browser, but also on the server using node.js or Rhino."
Basically, what would happen on the back end of our site is that we would detect the user agent of all traffic and, once we found a search bot, serve our pages server-side instead of client-side so the bots can index the site. Server-side and client-side content will be identical, and there will be NO black-hat cloaking going on. But this technique is cloaking, right? From Wikipedia: "Cloaking is a SEO technique in which the content presented to the search engine spider is different from that presented to the user's browser. This is done by delivering content based on the IP addresses or the User-Agent HTTP header of the user requesting the page. When a user is identified as a search engine spider, a server-side script delivers a different version of the web page, one that contains content not present on the visible page, or that is present but not searchable."
Matt Cutts on cloaking: http://support.google.com/webmasters/bin/answer.py?hl=en&answer=66355
Like I said, our content will be the same, but if you read the very last sentence from Wikipedia, it's the "present but not searchable" part that gets me. If our content is the same, are we cloaking? Should we develop our site like this for ease of development and performance? Do you think client-side templates with server-side fallbacks are safe from getting us kicked out of search engines? Thank you in advance for ANY help with this!
White Hat / Black Hat SEO | Bodybuilding.com
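The user-agent branch this question describes could be sketched like this (the bot patterns are illustrative assumptions, not the actual Dust.js or LinkedIn implementation):

```javascript
// Sketch: decide server-side vs client-side rendering from the
// User-Agent header. The pattern list is illustrative, not exhaustive.
const BOT_PATTERNS = [/googlebot/i, /bingbot/i, /slurp/i, /baiduspider/i];

function shouldRenderServerSide(userAgent) {
  if (!userAgent) return true; // no UA usually means a non-browser client
  return BOT_PATTERNS.some((re) => re.test(userAgent));
}
```

As long as both branches render identical markup from the same template, this matches the Dust.js approach described in the question; the risk being debated is that any drift between the two outputs starts to look like the Wikipedia definition of cloaking quoted above.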