Pin It Button, Too Many Links, & a JavaScript Question...
-
One of the sites I work for has some massive on-page link problems. We've been trying to come up with workarounds to reduce the number of links without making drastic changes to the page design, while staying within SEO best practices. We had originally considered the NoFollow route a few months back, but that's not viable. We changed around some image and text links so they were wrapped together as one link instead of being two separate links to the same place. We're currently running tests on some pages to see how else to handle the issue.
What has me stumped now, though, is that the damned Pinterest Pin It button counts as an external link, and we've added it to every image in our galleries. Originally we found that having a single Pin It button on a page was pulling incorrect images and not listing every possible image on the page... so to make sure a visitor can pin the exact picture they want, we added the button to everything. We've been seeing a huge uptick in Pinterest traffic, so we're definitely happy with that and don't want to get rid of the button. But if we have 300 pictures (which are all links) on a page with Pin It buttons (yet more links), we end up with 600+ links on the page. Here's an example page: http://www.fauxpanels.com/portfolio-regency.php
When talking with one of my coders, he suggested that some form of JavaScript might be capable of turning the button into an event instead of a link, and that this could be a way to keep the Pin It button while lowering the on-page link count. I'm honestly not sure how that would work, whether Google would still count it as a link, or whether it's some form of blackhat cloaking technique we should be wary of.
Do any of you have experience with similar issues/tactics that you could help me with here? Thanks.
TL;DR: Too many on-page links. Coder suggests JavaScript "alchemy" to turn lead into gold (er, button links into events). Would this lower the link count? Or is it bad? Is it a form of cloaking?
-
This test sheds a little light on what typically gets indexed: http://www.seomoz.org/ugc/can-google-really-access-content-in-javascript-really
-
Loading links via JS is a fairly standard technique (see http://sharethis.com/ or http://www.addthis.com/). Google will index some JS-created content, so you may have to delay the link tag creation until a mouseenter event to get the desired effect.
Added bonus: well-written JS can lighten the code weight of the page, allowing it to load faster. Currently, each Pin icon contains a div, a link, and an image tag. With prototyping, JS can rebuild all of that markup from the attributes of the primary image tag very quickly. (I see you already load jQuery, so this task is very easy to accomplish.)
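A rough sketch of how that might look, assuming jQuery is available as noted (the .gallery selector, the attribute names, and the Pinterest share URL are illustrative guesses, not pulled from the live site). No Pin It anchor exists in the initial HTML; it is built from each image's own attributes the first time a visitor hovers, which also covers the mouseenter delay mentioned above:

```javascript
// Sketch only: build the Pin It link on first hover instead of
// hard-coding 300 extra anchors into the page source.
jQuery(function ($) {
  $('.gallery img').one('mouseenter', function () {
    var $img = $(this);
    // Assemble the share URL from the image's own attributes.
    var pinUrl = 'http://pinterest.com/pin/create/button/' +
      '?url=' + encodeURIComponent(window.location.href) +
      '&media=' + encodeURIComponent($img.attr('src')) +
      '&description=' + encodeURIComponent($img.attr('alt') || '');

    // Insert the button next to the image it belongs to.
    $('<a/>', {
      href: pinUrl,
      'class': 'pin-it-button',
      target: '_blank',
      text: 'Pin It'
    }).insertAfter($img);
  });
});
```

Because the anchor only exists after a hover event, a crawler reading the static source sees one link per image instead of two.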
Also, move the rel="words" in the link into the img tag as an alt attribute. Currently the images lack alt attributes, which isn't ideal. Using keywords in the rel attribute isn't correct: it is supposed to mark up the relationship between items, and "Stacked Stone Panels" isn't a relationship. You may have been thinking of the title attribute.
Next, you are loading WAY too many resource files (mainly JS), and a few of them are loaded twice. Try combining them into a few minified files. There is a lot of work that could be done to speed up the site; this test shows over 25 seconds to load: http://www.webpagetest.org/result/130320_PT_12RV/
Think about making a sprite of the images; it would save a ton of requests and downloads. Also, pagination, if done correctly, could save a lot of load time.
-
Thanks guys! My coder is going to look over all of the best possible ways we could implement this and then we're going to see about doing a little testing on one of our galleries. Thanks again.
-
To my knowledge, Google only handles "simple" JavaScript. For instance, a link built with straightforward inline code will still be spidered as a link; if your click event does something more arcane (like calling a function), it won't be. If you want to further obfuscate it from Google, attach your click event with an observer (like jQuery's $().click() function).
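For what it's worth, a minimal sketch of that observer approach, again assuming jQuery (the .pin-trigger class and the data-pin-url attribute are made up for illustration): the element carries no crawlable href at all, and the destination is only read and opened inside the bound handler.

```javascript
// Illustrative only: .pin-trigger and data-pin-url are hypothetical names.
// There is no <a href="..."> in the markup for a spider to follow; the
// destination lives in a data attribute and is opened by the handler.
jQuery(function ($) {
  $('.pin-trigger').on('click', function (e) {
    e.preventDefault();
    var pinUrl = $(this).attr('data-pin-url');
    window.open(pinUrl, 'pinterest', 'width=750,height=400');
  });
});
```

A stricter variant would assemble the Pinterest URL from the neighboring image's src inside the handler, so no share URL appears anywhere in the static markup.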
Google, to my knowledge, has never spidered AJAX, and AJAX responses may not contain any human-readable content anyway.
-
No known negatives associated with doing that? If not, we might give it a test run on one of the galleries.
-
There was no negative impact after the Pin It button was added and effectively doubled the number of on-page links.
As for the AJAX loading idea, that was actually another one of my coder's suggestions, but I wasn't sure what the effect would be on Googlebot indexing and following the images. That said, all the newer photos get added to the top of the page, so they would still be visible if we implemented that.
-
That is definitely a lot of links... but have you noticed a negative SEO impact because of the Pin It buttons? Having that many links isn't ideal, but it probably won't affect your site that much.
Alternatively, you can try loading some of the images via AJAX so that they aren't all displayed at once and only load as the user scrolls down.
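A hedged sketch of that idea, again assuming jQuery; the /gallery-page.php endpoint is purely hypothetical and would need to return the next batch of gallery images as a ready-to-insert HTML fragment:

```javascript
// Sketch only: load the next batch of gallery images when the visitor
// nears the bottom of the page, instead of rendering all 300 up front.
jQuery(function ($) {
  var page = 1;
  var loading = false;

  $(window).on('scroll', function () {
    var nearBottom = $(window).scrollTop() + $(window).height() >
                     $(document).height() - 200;
    if (!nearBottom || loading) { return; }

    loading = true;
    // Hypothetical endpoint returning ready-to-insert HTML for page N+1.
    $.get('/gallery-page.php', { page: page + 1 }, function (html) {
      $('.gallery').append(html);
      page += 1;
      loading = false;
    });
  });
});
```

Keep in mind the caveat raised elsewhere in this thread: content loaded this way may never be seen by Googlebot, so anything that needs to be indexed should still appear in the initial HTML.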
-
In my opinion, the correct implementation is to use the JavaScript event. I've seen it implemented this way on a few ecommerce sites that I know are doing well.