Mask links with JS that point to noindex'ed pages
-
Hi,
In an effort to prepare our site for Panda, we dramatically reduced the number of pages that can be indexed (from 100k down to 4k). All the remaining pages are being equipped with unique and valuable content.
We still have the other pages around, since they represent searches with filter combinations which we deem less interesting to the majority of users (hence they are not indexed). So I am wondering if we should mask links to these non-indexed pages with JS, so that link juice doesn't get lost to them. Currently the targeted pages carry "noindex, follow" - we might de-index them via robots.txt instead, if the "site:" query doesn't show improvements.
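For clarity, this is roughly what the setup described above could look like - a minimal sketch, assuming the filtered search pages live under a /search/filter/ path (the path is made up for illustration):

On each filtered search page that should stay out of the index:
<meta name="robots" content="noindex, follow">

The robots.txt alternative mentioned above would block crawling of those URLs instead:
User-agent: *
Disallow: /search/filter/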
Thanks,
Sebastian
-
Well, we just want to show fewer links to Google than to the user (but the links for Google are still a subset of the links shown to users). The links we'd turn into JS links are those to less frequently applied search filters, which we don't index in order not to spam the search index.
Fortunately, if Google is smart enough to decipher the links, it wouldn't do any harm.
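For illustration, a minimal sketch of what such a JS-masked filter link could look like; the class name, data attribute, and filter URL are made up, not taken from the actual site:

<span class="filter-link" data-target="/search?color=red&size=xl">Red, XL</span>

<script>
// Attach click handlers so users can still navigate,
// while no crawlable <a href> appears in the HTML source.
var links = document.querySelectorAll('.filter-link');
for (var i = 0; i < links.length; i++) {
  links[i].addEventListener('click', function () {
    window.location.href = this.getAttribute('data-target');
  });
}
</script>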
Thanks for your ideas though! Especially the site: issue I had considered myself; it really takes ages until something is de-indexed (for us, using robots.txt sped it up by an order of magnitude).
-
Not to mention Google's ability to decipher JS to one degree or another, and they're working on improving that all the time. I've seen them find content that was supposed to be hidden in JS.
-
First, be aware that the "site:" query won't show improvements for a long time. I had a 15-page website I built for someone get indexed on the dev server by accident. I 301'd every page to the new site's real URL. If I do a site: search on the dev URLs, they are still there, even though they have 301'd for nearly two months. One I did six months ago was only recently removed from the site: search.
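As an aside, a blanket redirect like that can be done with a single rule; a minimal sketch for an Apache .htaccess file on the dev host, with the domain names as placeholders:

RewriteEngine On
# Send every request on the dev host to the same path on the live domain
RewriteCond %{HTTP_HOST} ^dev\.example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]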
If you link to your own pages that are not indexed for whatever reason, you could try to mask them in JavaScript, but just be aware of the fine line you walk. Google does not like anything that misleads them or users. Hiding a link that is visible to users but not to Google is not a good idea, in my opinion. If you have content that isn't worth indexing, it shouldn't be worth linking to anyway.
Related Questions
-
Can I use a 410'd page again at a later time?
I have old pages on my site that I want to 410 so they are totally removed. Later down the road, if I want to utilize that URL again, can I just remove the 410 error code, put new content on that page, and have it indexed again? (A minimal sketch of serving and later removing a 410 follows below.)
Technical SEO | WebServiceConsulting.com
-
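For the 410 question above, a minimal sketch of how such a status could be served and later withdrawn, assuming an Apache .htaccess setup and a made-up URL:

# Serve "410 Gone" for the retired page
Redirect gone /old-page.html

Removing that line and publishing new content at the URL would make it respond with a normal 200 again.
-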
Lots of Pages Dropped Out of Google's Index?
Until yesterday, my website had about 1200 pages indexed in Google. I made lots of changes: removed low-quality content, rewrote passable content to make it better, wrote high-quality content, got lots of likes and shares on social networks, etc. Now this morning I see that out of 1252 pages submitted, only 691 are indexed. Is that a temporary situation related to the recent updates? Is anyone else seeing this? How should I interpret it?
Technical SEO | sbrault74
-
Search result pages - noindex but auto follow?
Hi guys, I don't index my search pages, and currently they are tagged <meta name="robots" content="noindex">. Do I need to specify "follow" or will it automatically be done? Thanks, Cyto
Technical SEO | Bio-RadAbs
-
I am cleaning up a client's link profile and am coming across a lot of directories (no surprise). My question is: if an obvious free-for-all generic directory doesn't look to have been hit by any updates, is it a wise move recommending it for removal?
I am cleaning up a client's link profile and am coming across a lot of directories (no surprise). My question is: if an obvious free-for-all generic directory doesn't look to have been hit by any updates, is it a wise move recommending it for removal on the basis that it is a free-for-all directory and could be hit in the future?
Technical SEO | fazza47
-
Specific Link Page in Domain
Hi everyone: I have seen that many SEO agencies have contacted my business (also SEO, but in-house) in order to exchange links. They have created a specific page on their site labeled "Links" or similar, and on that page they add multiple links to competitors. I have heard that you can only do that if you make sure of two things: the links are nofollowed, and you don't insert links to websites that have nothing to do with your sector. Either way, I have never found this appealing. I always recommend people not to do this, but I have my doubts after all. Could someone give me their opinion? Cheers! (A minimal nofollow example follows below.)
Technical SEO | Tintanus
-
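For the link-exchange question above, a nofollowed link on such a "Links" page would look roughly like this; the URL and anchor text are placeholders:

<a href="http://www.example-agency.com/" rel="nofollow">Example Agency</a>
-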
Webmaster Tools lists a large number (hundreds) of different domains linking to my website, but only a few are reported by SEOMoz. Please explain what's going on?
Google's Webmaster Tools lists hundreds of links to my site, but SEOMoz only reports a few of them. I don't understand why that would be. Can anybody explain it to me? Is there someplace I can go to alert SEOMoz to this issue?
Technical SEO | dnfealkoff
-
How to find links to 404 pages?
I know that I used to be able to do this, but I can't seem to remember how. One of the sites I am working on has had a lot of pages moving around lately, and I am sure some links got lost in the fray that I would like to recover. What is the easiest way to see links going to a domain that point to 404 pages?
Technical SEO | MarloSchneider