Mask links with JS that point to noindex'ed pages
-
Hi,
In an effort to prepare our site for Panda, we dramatically reduced the number of pages that can be indexed (from 100k down to 4k). All the remaining pages are being equipped with unique and valuable content.
We still have the other pages around, since they represent searches with filter combinations which we deem less interesting to the majority of users (hence they are not indexed). So I am wondering if we should mask the links to these non-indexed pages with JS, so that link juice doesn't get lost to them. Currently the targeted pages are de-indexed via "noindex, follow" - we might de-index them with robots.txt though, if the "site:" query doesn't show improvements.
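To make it concrete, here is a rough sketch of the two states (the markup, class names and URLs are made up for illustration, not our real code). The idea is that a crawler which doesn't execute JS sees no followable link, while users get the same navigation:

    <!-- on each filter page itself: keep it out of the index, let PageRank flow -->
    <meta name="robots" content="noindex, follow">

    <!-- current state: a normal, crawlable link to a filter combination -->
    <a href="/search?color=red&size=xl">Red + XL</a>

    <!-- masked variant: no href, the navigation happens in JS -->
    <span class="filter-link" data-target="/search?color=red&size=xl">Red + XL</span>
    <script>
      // turn every masked element into a clickable pseudo-link for users
      document.querySelectorAll('.filter-link').forEach(function (el) {
        el.addEventListener('click', function () {
          window.location.href = el.getAttribute('data-target');
        });
      });
    </script>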
Thanks,
Sebastian
-
Well, we just want to show fewer links to Google than to the user (but the links Google sees are still a subset of the links shown to users). The links we'd turn into JS links are those to less-often-applied search filters, which we don't index so as not to spam the search index.
Fortunately, if Google is smart enough to decipher the links, it wouldn't do any harm.
Thanks for your ideas though! Especially the site: point I had considered myself; it really takes ages until something is de-indexed (for us, using robots.txt sped it up by an order of magnitude).
-
Not to mention Google's ability to decipher JS to one degree or another, and they're working on improving that all the time. I've seen content they found that was supposed to be hidden in JS.
-
First, be aware that the "site:" query won't show improvements for a long time. I had a 15-page website I built for someone get indexed on the dev server by accident. I 301'd every page to the real URL on the new site. If I site-search the dev URLs they are still there, despite the fact that they have 301'd for nearly two months. One I did 6 months ago was only recently removed from the site search.
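In case it helps, the kind of page-by-page 301 I mean can be expressed in an Apache .htaccess along these lines (domains and paths here are made up):

    # .htaccess on the dev server: permanent redirects to the live site
    Redirect 301 /about.html https://www.example.com/about/
    Redirect 301 /contact.html https://www.example.com/contact/
    # one line per page, or a single catch-all for the whole site:
    RedirectMatch 301 ^/(.*)$ https://www.example.com/$1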
If you link to your own pages that are not indexed, for whatever reason, you could try to mask them in JavaScript, but just be aware of the fine line you walk. Google does not like anything that misleads them or users. Showing users a link while hiding it from Google is not a good idea in my opinion. If you have content that isn't worth indexing, it shouldn't be worth linking to anyway.
Related Questions
-
#Page Jump link sharing
Hi, I'm managing an in-house link building campaign to help our ranking for the key search term 'Location Holidays'. We were historically number 1 for this term until a recent re-design in May, where our web design agency butchered our SEO. With all of the main issues fixed, we're now fluctuating between 3rd and 4th on a daily basis. I'm putting together a social share competition to promote through the press in order to boost our backlink profile. We're nesting the competition within the body of the page we want to improve the rankings for, and I will be including a #page jump link to access it quickly, as it will be further down the page. My question is: if we get press to link to http://holidaycompany.com/destination/#comp, will http://holidaycompany.com/destination/ receive the link juice, or will http://holidaycompany.com/destination/#comp be looked upon as a whole new page? Thanks in advance!
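To clarify what I mean by a #page jump, the markup would be along these lines (simplified, illustrative only):

    <!-- further down /destination/: the competition section -->
    <section id="comp">
      <h2>Share to win a holiday</h2>
    </section>

    <!-- the press link we would hand out; the #comp fragment is resolved
         by the browser on the same URL, it is not a separate request -->
    <a href="http://holidaycompany.com/destination/#comp">Enter the competition</a>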
Technical SEO | MattHolidays
-
Deleting old pages and passing on link strength?
We are a printing company thinking of bringing our product range down to 2-3 products rather than the 10+ we currently have. The pages we will be getting rid of are pages such as flyers, booklets etc., so we can concentrate on banners and stickers. Would you suggest 301ing those pages to the home page, or picking specific pages for them to go to? Also, could we expect a decent rise in rankings for the pages we are left with? Thanks, Shaun
Technical SEO | BobAnderson
-
Disavow links and domain of SPAM links
Hi, I have a big problem. For the past month, my company website has been scraped by hackers. This is how they do it:

1. Hack un-monitored sites and/or sites still using old versions of WordPress or other out-of-the-box CMSs.
2. Create spam pages with links to my pages, plus plant a trojan horse and a script to automatically grab resources from my server. Some sites were directly uploaded with pages from my site.
3. Create pages whose title, keywords and description consist of my company brand name.
4. Use the HTTP referrer to redirect Google search results to competitor sites.

What I have done so far:

1. Blocked the identified sites' IPs in my WAF. This prevented those hacked sites from grabbing resources from my site via scripts.
2. Reached out to webmasters and hosting companies to remove the affected sites. So far this has not been very effective, as many of the sites have no webmaster; only a few hosting companies respond promptly, and some don't even reply after a week.

The problem now: by the time I realized what was happening, there were already hundreds if not thousands of sites being used by the hacker. Literally tens of thousands of hacked or scripted pages carrying my company brand in the title, keywords and description have already been indexed by Google. Every day I am routinely removing and disavowing, but there are just so many of them indexed by Google now.

Questions: 1. What is the best way forward for me to resolve this? 2. Disavow links and domains: does disavowing a domain mean all links from that domain are disavowed? 3. Can anyone recommend an SEO company which has dealt with such an issue before and successfully rectified it?

Note: SEAGM is our branded keyword.
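For reference on question 2: in Google's disavow file format, a bare URL disavows only that page, while a domain: line covers every link from that domain. A made-up example of the file:

    # disavow two individual spam URLs
    http://hacked-blog.example.com/spam-page-1.html
    http://hacked-blog.example.com/spam-page-2.html

    # disavow an entire domain - this covers all links from it
    domain:hacked-site.example.com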
Technical SEO | ahming777
-
Spam links - which links are most damaging to my rankings?
I have just started using Open Site Explorer and discovered a lot of spam links to my website. (I have mostly ranked on page one for many years, but in the last two weeks my rankings have dropped to page two.) The links have anchor text such as Scam, Dishonest and Drugs, and most of them are "nofollow". Will links with "nofollow" affect my ranking, and if so, which of the links should I prioritise removing? Do I look at Link Equity, Domain Authority, Page Authority or other criteria? Many thanks, Ronny
Technical SEO | A.Ronny
-
Transferring link juice on a page with over 150 links
I'm building a resource section that will hopefully attract a lot of external links, but the problem is that the main index page will carry a large number of links - around 150 internal links, 120 of them pointing to resource sub-pages and 30 being the site's navigational links - so it will dilute the link juice passed on and possibly waste some of it. Each of those 120 sub-pages will in turn contain about 50-100 external links plus the 30 internal navigational links. To visualise the matter, think of this resource as a collection of hundreds of blogs, categorised by domain on the index page into those 120 sub-pages. The question is how to build the primary page (the one with 150 links) so it passes the most link juice to the site - or do you think this is OK and I shouldn't be worried about it (I know there used to be a rough limit of 100 links per page)? Any ideas? Many thanks
Technical SEO | flo2
-
Page that appears on SERPs is not the page that has been optimized for users
This may seem like a pretty newbie question, but I haven't been able to find any answers to it (I may not be looking correctly). My site used to rank decently for the KW "gold name necklace" with this page in the search results: http://www.mynamenecklace.co.uk/Products.aspx?p=302
This was the page that I was working on optimizing for user experience (load time, image quality, ease of use, etc.), since this was the page users were reaching via search. A couple of months ago the Google SERPs started showing this page for the same query (also ranked a little lower, but that's not important for this specific question): http://www.mynamenecklace.co.uk/Products.aspx?p=314
This is a white gold version of the necklaces, which is not what most users have in mind when searching for a gold name necklace, so it's much less effective and engaging. How do I tell Google to go back to the old page / give preference to the older page / tell them that we have a better version of the page, etc., without having to noindex any of the content? Both of these pages have value and are for different queries, so I can't canonical one to the other. As far as external links go, more links point to the yellow gold version than the white gold one. Any ideas on how to remedy this? Thanks.
Technical SEO | Don34
-
Does rel=canonical combine link juice for 2 pages?
If two pages are very similar, and one should rel=canonical to the other, will the page authority pass from the page carrying the rel=canonical to the target page? Also, what happens when a page rel=canonical's to itself?
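For reference, the canonical relationship is declared with a link element in the head of the duplicate page (the URL below is a placeholder):

    <!-- in the <head> of the near-duplicate page: point at the preferred URL -->
    <link rel="canonical" href="https://www.example.com/preferred-page/">

    <!-- on the preferred page itself, the same tag pointing at its own URL
         (a self-referencing canonical) is valid and commonly used -->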
Technical SEO | SkinLaboratory
-
Question Concerning Pages With Too Many Links
I have run the SEOmoz software on a client's site. It's showing that virtually every single page has too many links. For instance, this URL: http://www.golfthere.com/AboutUs. Am I missing something? I do not see 157 links on this page.
Technical SEO | ToddKing