Would using JavaScript onclick functions to override the href target be OK?
-
Hi all,
I am currently working on a new search facility for my ecommerce site, and it has very quickly dawned on me that this new facility is far better than my standard product pages from a user point of view - i.e. lots of product attributes so customers can find what they need faster, the ability to compare products, etc. All in all, just better - but it has no SEO value!
I want to use this search facility instead of my category/product pages. However, as they are search pages, I have them set to robots noindex and don't think it's wise to change that.
I have spoken to the developers of this software, and they suggested I could use some JavaScript in the navigation - an onclick function that takes the user to the search equivalent of the page.
They said that this way my normal pages are still the ones indexed by Google etc., but the user gets the benefit of the improved search pages.
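As I understand their suggestion, a navigation link would end up looking something along these lines (hypothetical URLs - the real markup would come from the cart software):

    <a href="/category/widgets.html"
       onclick="window.location = '/search.html?category=widgets'; return false;">
       Widgets
    </a>

So the href still points at my normal category page, but clicking actually loads the search version.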
This sounds perfect, but it also sounds a little deceptive. I know Google has loads of rules about these kinds of things, and the last thing I want is any kind of penalty or negative reaction from an SEO point of view. I am only considering this because it will improve the user experience on my website.
Can anyone advise whether this is OK or a "no-no"?
P.S. For those wondering, I use an "off the shelf" cart system, and it would cost me an arm and a leg to have these features built into my actual category/product pages.
-
Hello James,
Why do these pages have "no SEO value"? Is it because they are AJAX pages or because you have them noindexed? Or both?
To answer your original question, using an onclick JavaScript event to send a user to a page other than the URL listed in the href attribute is borderline. It goes beyond the risk level I would feel comfortable with on an eCommerce site, but a lot of affiliate sites do this. For instance, all of their links out to merchant sites may go through a directory called /outlink/, so the href might look like .../outlink/link1234 and appear to send the user to another page on their domain, when actually the user gets redirected to the merchant's (e.g. Amazon.com, Best Buy...) website. Sometimes the user is redirected from the /outlink/... URL, and sometimes they never even get that far because the JavaScript sends them to the merchant's URL first.
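To illustrate that affiliate pattern (a made-up example, not anyone's actual code), the markup often looks something like this:

    <a href="/outlink/link1234"
       onclick="window.location = 'http://www.merchant-example.com/product'; return false;">
       Check price
    </a>

The href appears to point at a page on the affiliate's own domain, but the JavaScript sends the user straight to the merchant; if the script doesn't fire, the /outlink/ URL performs a server-side redirect instead.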
It is not cloaking unless you are specifically treating Google differently. If Google doesn't understand your site that is their problem. If you have code that essentially says "IF Google, THEN do this. ELSE do that" it is your problem because you are cloaking. Make sense? There is a very distinct line there.
The bottom line is, if you want to show users a certain page, then you should be showing that page to Google as well. If the problem is that the content on that page doesn't appear for Google (e.g. AJAX), then you should look into optimizing that type of content to the best of your ability. For example, look into the use of hashbangs (#!) as in:
https://developers.google.com/webmasters/ajax-crawling/docs/getting-started
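Very roughly, that scheme works like this (a simplified sketch of the approach in the linked guide - check the doc for the exact requirements):

    Pretty URL the user sees:
        http://www.example.com/products#!category=widgets

    URL Googlebot requests instead:
        http://www.example.com/products?_escaped_fragment_=category=widgets

    <!-- For AJAX pages without a #! in the URL, you opt in with: -->
    <meta name="fragment" content="!">

Your server then returns an HTML snapshot of the AJAX-generated content at the _escaped_fragment_ URL, so Google has something it can actually index.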
-
1. Google understands simple JS that is inline with your HTML. So Google understands that a link whose inline onclick points to domain.com is a link to domain.com.
2. You can obfuscate this further and Google might not understand it. I've not seen Google try to parse or execute JS, but that doesn't mean they can't or won't in the future.
3. Google is very unlikely to spider AJAX. Many AJAX pages don't return any user-readable content (most of mine return things like JSON, which is not for end-user consumption) and, as such, are beyond the scope of indexation. Again, as in #2, you might want this content to be shown elsewhere if you want it indexed. https://developers.google.com/webmasters/ajax-crawling/
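For example (my own illustration, not from the post above), the first of these links is easy for Google to associate with domain.com, while the second hides the destination inside a script Google would have to execute:

    <!-- 1. Simple inline JS - the destination URL is visible in the HTML -->
    <a href="#" onclick="window.location = 'http://domain.com/page.html'; return false;">Page</a>

    <!-- 2. Obfuscated - the URL is assembled inside an external function -->
    <a href="#" onclick="goTo('pg-1234'); return false;">Page</a>

And a typical AJAX response is just data, e.g. {"sku": "1234", "price": 19.99} - there is nothing user-readable there for Google to index.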
-
OK, I am not keen on this approach, and the developers have offered an alternative, but again I'm not sure about it. They have said they can use AJAX to overlay their search results/navigation on top of my current navigation/products on my category/product pages.
This gets rid of having to use JavaScript to send users to a different URL, but above Alan mentions cloaking, which to my understanding is basically serving anything different to a search engine versus a person - and that's what this will do. It serves up a different navigation to people, and the products could be listed in a different order etc.; search engines do not see the AJAX.
Is this any better, or just as negative?
-
Are they identical? You say "the search equivalent" - I just wouldn't treat search engines any differently.
-
Even though the content is identical?
It is only the way that content can then be navigated that is different...
-
Well then, yes, I would be concerned; you are serving up different content to users, and that is cloaking.
-
Hi Alan,
I think I may have explained this incorrectly - my search page does have the meta tag noindex,follow, and it also has a canonical link back to the main search page (i.e. search.html), so I do not think any of the search results will be indexed. So my concern is not duplicate content; that should not happen.
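For reference, the head of each search results page carries something along these lines (simplified, with example.com standing in for my domain):

    <meta name="robots" content="noindex,follow">
    <link rel="canonical" href="http://www.example.com/search.html">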
My concern is the fact that I am using JavaScript to literally divert customers from one page to another. It's almost like the static pages are there only for the benefit of Google, and that's what is concerning me.
-
Google can follow JavaScript links, unless you are very good at hiding them.
I would not worry too much about the duplicate content; don't expect the duplicates to rank, but you're not likely to be penalized for them. You can use a canonical tag to point all search results back to the one page.
I would not noindex any pages; any links pointed to a noindexed page are pouring their link juice away. If you do want to noindex a page, use the meta tag noindex,follow - that way the search engine will still follow the links and the link juice flows back out to your site.
Read about PageRank and how link juice flows:
http://thatsit.com.au/seo/tutorials/a-simple-explanation-of-pagerank