Mask links with JS that point to noindex'ed pages
-
Hi,
In an effort to prepare our site for the Panda update, we dramatically reduced the number of indexable pages (from 100k down to 4k). All the remaining pages are being equipped with unique and valuable content.
We still have the other pages around, since they represent searches with filter combinations which we deem less interesting to the majority of users (hence they are not indexed). So I am wondering whether we should mask the links to these non-indexed pages with JS, so that link juice isn't lost to them. Currently the targeted pages are kept out of the index via "noindex, follow"; we might block them with robots.txt instead if the "site:" query doesn't show improvements.
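For reference, the two mechanisms mentioned here behave differently (a minimal sketch, with an illustrative URL path):

```html
<!-- Keeps the page out of the index, but crawlers may still fetch it
     and follow (and pass equity through) its outgoing links. -->
<meta name="robots" content="noindex, follow">
```

A robots.txt rule such as `Disallow: /search/filtered/` blocks crawling of those URLs outright instead. One caveat worth knowing: a page blocked in robots.txt can no longer be crawled, so a noindex tag on it won't be seen, and links on it won't be followed at all.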
Thanks,
Sebastian
-
Well, we just want to show fewer links to Google than to the user (the links Google sees would still be a subset of the links shown to users). The links we'd turn into JS links are those for less frequently applied search filters, which we don't index so as not to spam the search index.
Fortunately, if Google is smart enough to decipher the links, the masking wouldn't do any harm.
Thanks for your ideas though! Especially the "site:" suggestion, which I had considered myself; it really takes ages until something is de-indexed (for us, using robots.txt sped it up by an order of magnitude).
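A minimal sketch of what such JS masking could look like (the element names, data attribute, and base64 scheme are all illustrative assumptions, not something from the thread): the filter links are rendered as plain elements with the target URL encoded in a data attribute, so crawlers that only parse static markup see no `<a href>`, while a click handler decodes the URL and navigates.

```javascript
// Decode a base64-encoded URL. Uses atob() where available (browsers,
// modern Node), falling back to Buffer for older Node versions.
function decodeMaskedHref(encoded) {
  return typeof atob === "function"
    ? atob(encoded)
    : Buffer.from(encoded, "base64").toString("utf8");
}

// Turn every <span data-masked-href="..."> under `root` into a
// clickable pseudo-link. No <a href> ever appears in the markup.
function activateMaskedLinks(root) {
  root.querySelectorAll("span[data-masked-href]").forEach(function (el) {
    el.style.cursor = "pointer";
    el.addEventListener("click", function () {
      window.location.href = decodeMaskedHref(el.dataset.maskedHref);
    });
  });
}
```

As the replies below point out, Google increasingly executes JavaScript, so a scheme like this hides nothing reliably, and deliberately showing crawlers something different from users walks close to cloaking.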
-
Not to mention Google's ability to decipher JS to one degree or another, and they're working on improving that all the time. I've seen them find content that was supposed to be hidden in JS.
-
First, be aware that the "site:" query won't show improvements for a long time. I had a 15-page website I built for someone get indexed on the dev server by accident. I 301'd every page to the real URL on the new site. If I do a site: search on the dev URLs, they are still there, despite the fact that they have 301'd for nearly two months. On another site I did this for 6 months ago, the pages were only recently removed from the site: results.
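For what it's worth, the kind of site-wide 301 described above can be done in a couple of lines of server config; here's a hedged Apache sketch (hostnames are made up, not from this thread):

```apache
# Permanently redirect every dev-server URL to the same path on the
# live site (illustrative hostnames).
<VirtualHost *:80>
  ServerName dev.example.com
  Redirect permanent / https://www.example.com/
</VirtualHost>
```

Even with redirects like this in place, the old URLs can linger in site: results for months, as described above.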
If you link to your own pages that are not indexed for whatever reason, you could try to mask those links in JavaScript, but be aware of the fine line you walk. Google does not like anything that misleads them or users. Hiding a link from Google that is visible to users is not a good idea in my opinion. If content isn't worth indexing, it shouldn't be worth linking to anyway.