Can obfuscated JavaScript be used for too many links on a page?
-
Hi mozzers
Just looking for opinions on whether it is ever appropriate to use obfuscated JavaScript on links when a page has many links that need to be there for usability.
It seems grey/black hat to me, as it shows users something different from what Google sees (alarm bells are sounding already!) BUT if the page has that many links, it's leaking juice which could be saved...
Any thoughts appreciated, thanks.
-
Hi John
Thank you for your reply. It's kind of what I was thinking - if it feels grey or iffy... it probably is!
The background is that it's a custom system with a list of car manufacturers and outbound links to their websites. That's 34 external links on every page.
The client didn't develop the system, so I'm trying to work with it and optimise as best I can without them incurring development costs.
Thanks again for your reply.
Trevor
-
Hey Trevor -
A couple of things here. First, I would never recommend using obfuscated JavaScript on links just to make it so the crawlers cannot see them.
Also, I think the "too many links on a page" guideline is not one to follow too strictly. It's not an "error" in the Moz Pro Campaigns because it is a guideline, not a rule. Depending on your site, you can have many more than 100 links on a page and be fine. Or, you can use other methods (iframes, nav behind JavaScript) to keep these links usable for visitors without exposing them to the crawlers.
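To make that concrete, here's roughly the kind of thing we're talking about - a simplified sketch, where the manufacturer URL and class name are made up:

    <!-- A standard crawlable link: users and crawlers both see the href -->
    <a href="http://www.example-manufacturer.com">Example Manufacturer</a>

    <!-- An "obfuscated" JavaScript link: there is no href to follow, and the
         destination only exists once the script runs in a visitor's browser -->
    <span class="outbound" onclick="window.location = 'http://www.' + 'example-manufacturer' + '.com';">Example Manufacturer</span>

Both behave the same for a visitor, but the second leaves nothing for a crawler to follow - which is exactly why it starts to feel iffy.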
Just remember (as I'm sure you do) that links handled this way will not pass any link juice, and you will need other ways to get your pages indexed and to keep a good crawler-friendly architecture.
Just my two cents. I don't think you'd be cloaking, but it's starting to get a bit iffy. I'd steer clear.
Related Questions
-
If I use links in a div tag instead of an "a href" tag, can Google read links inside the div tag?
Hi All, need a suggestion on this. For buttons, I am using links in a div tag instead of an "a href". Can Google read links inside a "div" tag? Does it pass rank juice? It would be great if you could provide a reference, if possible.
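To make the setup concrete, here is roughly what I mean - a simplified sketch, with a made-up class name and path:

    <!-- The current setup: a div styled as a button, with no href for a crawler to read -->
    <div class="btn" onclick="window.location.href = '/pricing';">See pricing</div>

    <!-- The conventional alternative: a real anchor, styled as a button -->
    <a class="btn" href="/pricing">See pricing</a>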
Intermediate & Advanced SEO | pujan.bikroy
-
Insane loss of traffic and indexed pages after June Core Update - what can I do to bring it back?
Hello everybody! After the June Core Update was released, we saw an insane drop in traffic/revenue and in indexed pages on GSC (image attached below).
The biggest problem: the pages that dropped out of the index were shown as "Blocked by robots.txt", and when we ran the "Fetch as Google" tool it said "Crawl Anomaly" - even though our robots.txt is completely clean (without any disallow or noindex rules). So I strongly believe this error pattern is showing up because of the June Core Update.
I've come up with some solutions, but none of them seems to work:
1- Add hreflang to the domain: we have other sites in other countries, and ours seems to be the only one without this tag (see the sketch below for what the tag looks like). The June update was primarily made to limit the SERPs to two results per domain (or more if Google thinks they're relevant). Maybe the other sites have "taken our spot" on the SERPs; our domain is considerably newer than the ones in other countries.
2- Manually index all the important pages that were lost: the idea was to renew the content on each page (title, meta description, paragraphs and so on) and use the manual GSC index tool. But that doesn't seem to work either; all it says is "Crawl Anomaly".
3- Create a new domain: if nothing else works, this should. We would look for a new domain name and treat it as a whole new site. (Frankly, there should be some other way out - this is for an EXTREME case where nobody could help us.)
I'm open to ideas, and as the days go by, our organic revenue and traffic don't seem to be coming back. I'm desperate for a solution.
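For reference, hreflang annotations of the kind mentioned in option 1 look roughly like this (the domains and language codes are made-up examples):

    <!-- In the <head> of each page: one entry per country/language version,
         plus a fallback for visitors who match none of them -->
    <link rel="alternate" hreflang="en-us" href="https://www.example.com/" />
    <link rel="alternate" hreflang="pt-br" href="https://www.example.com.br/" />
    <link rel="alternate" hreflang="x-default" href="https://www.example.com/" />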
Intermediate & Advanced SEO | muriloacct
-
SEO implications of using Marketing Automation landing pages vs on-site content
Hi there, I'm hoping someone can help here... I'm new to a company where, due to the limitations of their WordPress instance, they've been creating what would ordinarily be pages in the standard sitemap as landing pages in their Pardot marketing automation platform. The URL subdomain is slightly different. Just wondering if anybody could quickly outline the SEO implications of doing this externally instead of directly on their site? Hope I'm making some sense... Thanks,
Phil
Intermediate & Advanced SEO | philremington
-
If I deindex a page, will Google stop counting the links pointing to it?
Hey everyone, I am deindexing some posts on my website as I think they are not providing any value to users. My question is: if I deindex a post that has some good-quality links pointing to it, will Google stop counting those links for my website?
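For reference, the standard meta robots pattern for taking a post out of the index looks like this (a minimal sketch):

    <!-- Placed in the <head> of each post that should drop out of the index -->
    <meta name="robots" content="noindex">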
Intermediate & Advanced SEO | Bunnypundir
-
SEO - Massive duplication of the same page, but with different URLs.
Hi!
I'm dealing with a big client whose site has a large amount (approx. 39,000) of duplicates of the "same" page (same content), but each copy has a different URL. The duplicated page is a "become a member" page.
I've checked the backlinks in Google Search Console and there are no sites linking to any of the duplicated pages.
The developers have no clue where or how the pages came to be duplicated, but my guess is that the page gets duplicated every time a new customer sets up an account. The client wants us to just remove the pages and sort out the duplication, but removing the pages might cause a big drop in backlinks/traffic and whatnot. I would much rather redirect the duplicated pages to the original page, but given that there are 39,000 pages it might mess with the site speed. Looking for ideas and suggestions on what the next step should be: remove or redirect.
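For reference, if redirecting 39,000 URLs proves impractical, another pattern commonly used for duplicates is a canonical tag on each copy pointing at the original - a sketch, with a made-up URL:

    <!-- In the <head> of every duplicated "become a member" page -->
    <link rel="canonical" href="https://www.example.com/become-a-member">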
Thanks so much!
Intermediate & Advanced SEO | jennisprints
-
Help needed with a 53-page internal website structure & internal linking
Hey all... I'm designing the structure for a website that has 53 pages. Can you take a look at the attached diagram and see if the website structure is OK?
On the diagram I have numbered the pages from 1 to 53: page 1 is the most important home page; pages 2, 3, 4, 5 are the next most important; pages 6, 7, 8... 15, 16, 17 are the third set of important pages; and pages 18, 19, 20... 51, 52, 53 are the last set, which are the easiest to rank. I have two questions:
1. Is the website structure correct? I have made sure that all pages on the website are reachable.
2. Considering the home page and pages 2, 3, 4, 5 are the most important pages, I am linking out to them from the last set of pages (18, 19, 20... 51, 52, 53). There are 36 pages in the last set, and out of these 36, 24 of them link back to the home page and pages 2, 3, 4, 5. The remaining 8 pages of the 36 will link back to pages 6, 7, 8... 15, 16, 17.
In total, the most important pages will have the following number of internal incoming links:
Home page: 25
Pages 2, 3, 4, 5: 25
Pages 6, 7, 8... 15, 16, 17: 4
Pages 18, 19, 20... 51, 52, 53: 1
Is this OK considering the home page and pages 2, 3, 4, 5 are the most important? Or do you think I should divide the links differently and give more internal links to the other pages?
If you can share any inputs or suggestions on how I can improve this, it will help me greatly. Also, if you know any good guides to the internal linking of websites greater than 50 pages, please share them in the answers. Thank you all!
Regards,
P.S. - The URL for the image is at http://imgur.com/XqaK4
Intermediate & Advanced SEO | arjun.rajkumar81
-
How would you use this broken link building opportunity?
I've found a good opportunity to build some links and I'd love your opinions on my options here.
There's a big event that happens once a year in my city. Let's say the event used to have a website called www.CityEvent.com. The event decided not to use this website anymore and instead put all of their event information on their Facebook page. It looks like they let their domain name expire and someone else snapped it up. It's now sitting as an empty WordPress blog with one line of text. This empty website has 1,300 links pointing to it.
I can see two opportunities here:
1. Write a very thorough article on my website (the one I am trying to build links to) describing the event and giving people all of the information they need to know about it. (The amount of information on the Facebook page is minimal.)
2. Create a new website called www.EventCity.com and put up a static page with all of the information that people need to know. There would be a link on this page pointing to the site that I am trying to rank.
In both cases there would be much more information than is available on the Facebook page, including a collection of YouTube videos about the event and many helpful links for people who are interested in this type of event.
The plan is then to contact the sites that are linking to the dead page and invite them to link to my new page (either on my site or the new site that I could create).
I see a few pros and cons to each method. For option #2, I think people would be more likely to link to a more official-looking page than to an article on a separate website. (My website has information about the city in question but is not closely related to the event at all.) However, I would only be getting one link back to my site. One negative is that the actual organizers of the event may not be pleased that someone has created an official-looking page. But then again, perhaps they would be happy to have a free website.
For option #1, I would possibly get more links from sites that are authoritative in my city pointing directly to the site I am trying to rank. However, people would be less likely to link to us because we are not an official site for the event, just a very good article about it. There are no other good articles for this event ranking on Google.
Hopefully that makes sense. What would you do?
EDIT - Just thought of a third option: try to buy the domain.
Intermediate & Advanced SEO | MarieHaynes
-
How many times should a keyword be used in the body text?
We employ an outside agency to write content for our website, as we do not have the ability in-house to write unique, good-quality content. They have just sent an article of around 300 words. I told them the keyword phrases to use, but when I got the document there was only one instance of the keyword phrase(s) in it. Now, there seems to be a conflict between posts I have read and general SEO advice as to how many times it should be present (SEOmoz indicates 4 times, for instance); our outside agency says it doesn't matter. And if I have a page optimised for 2 keywords this starts getting tricky, and probably looks keyword-stuffed to the reader. Assuming the keywords are present once in the meta tags, H1, meta description and alt text, what do people think is best practice, taking into account the recent Panda updates? Thoughts appreciated. Thanks
Craig
Intermediate & Advanced SEO | Towelsrus