These are the ones I recommend:
Posts made by Cyrus-Shepard
-
RE: Recommended basic credible SEO on youtube
-
RE: Backlinks in client website footers - best strategy?
My only response would be this: sometimes it can work, especially in the short run, but it is often penalized. I would wait until the next Penguin update comes out and take a look at the competitor's site again. Again, not linking site-wide from the footer is not a hard-and-fast rule but more of a general best practice.
The problem is that these are not editorial links per se. As such, they are at constant risk of being devalued or penalized by Google.
You can roll the dice and sometimes see a short-term victory. But more often than not you risk getting burned.
-
RE: Nofollow Outbound Links on Listings from Travel Sites?
Great question! We do often see a positive correlation between the number of followed outbound links and higher rankings (though I'm not sure we've scientifically measured this recently). Anecdotally, we hear this often as well. Most famously, when the NYTimes made its external links "followed," the site saw an increase in traffic/rankings.
-
RE: Nofollow Outbound Links on Listings from Travel Sites?
It's an interesting perspective. Looking at the pages+links, they all look trustworthy and normally I wouldn't see a reason to nofollow them, especially since they are all editorially controlled by you and your team.
Linking equity is a concern, but I honestly doubt you're saving anything by making them nofollow, especially since Google updated how they handle PageRank sculpting back in 2009.
Not that there aren't legitimate ways to preserve and flow link equity (such as including internal links within the main body of text instead of sidebar areas/navigation), but in this case I think leaving the links followed won't hurt at all.
-
RE: Spam score is 7/17
Hi Amelia,
Great question. It doesn't mean your site is spammy; it simply means your link profile looks like those of a lot of sites we see that do happen to be spammy.
My guess is that in order to rank better, you may want to work on increasing your visibility by attracting more external links to your site. Does that sound reasonable? (Additionally, this would eliminate most of the spam flags)
It's a big task, but a few resources that may help:
- https://moz.com/blog/category/link-building
- https://moz.com/beginners-guide-to-link-building
- http://pointblankseo.com/link-building-strategies
Hope that helps!
-
RE: Importance of minimal markup on a page
The study was looking at the correlation between the amount of HTML and higher Google rankings. Although I don't believe it's an actual ranking factor, we typically find a small but positive correlation with longer content pieces. The simple explanation being that longer content has more "stuff" to rank for, and there's a corresponding correlation to longer content and links earned, which also helps with rankings.
-
RE: Bad Backlink?
Interesting note about iframes. Google typically attempts to associate the content of an iframe with the originating/hosting site. How this plays into how Google interprets the "link" from this site, I have no idea, but I doubt it's very harmful.
-
RE: Positions dropping in SERPs after Title and Snippet change
There are a few possible reasons Google might adjust rankings after seeing a change in your title and meta descriptions. Among them: (keep in mind these are only possibilities)
1. The algorithm determines that the page is less relevant to the target query keywords
2. The title change deviates from earlier anchor text pointing at the page, meaning the page might not be as relevant to the query
3. After changing your title+description, you experience a lower CTR in search results. In theory this could lower your rankings. But because you describe the old title/description still showing in SERPs, this is less likely.
4. The drop in rankings is temporary, or is unrelated to any changes you made.
If Google is still showing the old title/description, #4 is a strong possibility. You may want to check Google's cache of the page to see if it's picking up on the changes. Depending on the site, this can take anywhere from a few hours to several weeks.
If nothing else, you can always change the title/description back to the original version and test what happens.
-
RE: Mass HTTP to HTTPs move
We observe only a small correlation between HTTPS sites and higher rankings (like 0.04) - so there's very little apparent pure SEO benefit. It really seems to be the "tie-breaker" Google claims, though this difference may increase in the future.
-
RE: Mass HTTP to HTTPs move
Hi Jason,
I've been involved with a number of migrations (including this site, moz.com) and in my own experience I've seen anywhere from zero traffic change to a loss in the range of 8-9%.
Google says that in theory, you shouldn't lose any traffic, and several large publishers I've spoken with can attest to this.
In practice, HTTPS migrations can be complex, and with more moving parts the potential to do things less than perfectly can escalate. If you mess up your 301s or create confusing redirects, the potential for reduced traffic is real. My general advice is don't let the migration to HTTPS scare you, but proceed with caution. This post may help: https://moz.com/blog/seo-tips-https-ssl
-
RE: Is having a Video important for SEO or is it the time-on-site that's important?
There's a couple of things that can add concrete SEO benefit when adding video to your site:
**1. Video Schema:** This can tell search engines what the video is about and add a ton of relevant information. https://developers.google.com/webmasters/videosearch/schema
2. Video Sitemaps: You can additionally provide some of the schema information here. At the very least, this can help get your videos indexed: https://developers.google.com/webmasters/videosearch/sitemaps
3. Transcriptions: You can see how we do this on Moz with our Whiteboard Friday series. Basically it combines the best of both Video and blog post formats.
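For reference, the markup from item 1 might look something like this JSON-LD sketch (all URLs and values here are placeholders, not from the original thread):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "VideoObject",
  "name": "Example Video Title",
  "description": "What the video covers, in a sentence or two.",
  "thumbnailUrl": "https://example.com/thumbnail.jpg",
  "uploadDate": "2015-06-01",
  "duration": "PT4M30S",
  "contentUrl": "https://example.com/video.mp4"
}
</script>
```

The same properties (title, description, thumbnail, duration) can also be supplied in a video sitemap, per item 2.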
So the benefit is two-fold. When implemented properly, a video can add contextually relevant information for SEO purposes (although "simply adding a video" likely isn't enough). And secondly, the increased user engagement, if/when successful, can add an important boost.
Hope that answers your question. Best of luck with your SEO!
-
RE: Pyramid link structure - how to noindex, nofollow
Hi there,
I see what you're trying to do, and I think I understand it. You're attempting to conserve your link equity and flow it only to the most important pages, or what we used to call "PageRank sculpting."
The good news is you don't really need to worry about it. These days, adding nofollow to your links doesn't really increase the equity flowing through the followed links. And in fact, you could be shooting yourself in the proverbial foot by denying equity passing links to your lower product pages.
The best time to use nofollow for internal pages is typically to increase crawling efficiency, or to prevent bots from visiting pages you don't want indexed anyway. Attempting to sculpt link equity in this way could cause lots of unintended negative consequences, and my advice in most cases would be to let your link equity flow freely throughout your site in a way that's natural to both humans and bots alike.
Best of luck!
-
RE: Some Old date showing in SERP
I'm actually kinda stumped. For whatever reason, Google is ignoring the sitemap date. Here's what I would do:
1. Even though the sitemap is valid, I'm still unclear if Google is reading it. The only way to know for sure is by checking the Sitemap function in Google Search console here and verifying indexation: https://www.google.com/webmasters/tools/sitemap-list
2. You could try to put a date on the page. Something like "Last Updated" at the bottom of the page.
3. A longshot, but you could add "lastReviewed" Schema markup to the page and see if Google honors that.
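If you try option 3, the markup might look something like this (a sketch; `lastReviewed` and `dateModified` are schema.org properties of WebPage, and the date here is a placeholder):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "WebPage",
  "lastReviewed": "2015-06-01",
  "dateModified": "2015-06-01"
}
</script>
```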
If you try any of these, let us know if any of them worked!
-
RE: Traffic has not recovered from https switch a year ago.
We've actually seen Google get harsh on category-type pages across a wide number of industries and sites. It's even happened here at Moz. If your HTTPS is implemented correctly (and sounds like you are reasonably certain it is) you might want to look to other areas.
I'd look at your category pages and make sure:
- Pagination is implemented correctly
- Canonicals are in place, where appropriate
- If possible, each category should have its own introductory text, i.e. https://moz.com/ugc/category/link-building
- Basically, do everything you can to treat your category pages like actual landing pages worthy of search traffic, including unique content, value, title tags, descriptions, etc.
-
RE: Cloudflare - Should I be concerned about false positives and bad neighbourhood IP problems
You may be interested in this post titled "Cloudflare and SEO" : https://blog.cloudflare.com/cloudflare-and-seo/
"We did a couple things. First, we invented a new technology that, when it detects a problem on a site, automatically changes the site's CloudFlare IP addresses to isolate it from other sites. (Think of it like quarantining a sick patient.) Second, we worked directly with the crawl teams at the big search engines to make them aware of how CloudFlare worked. All the search engines had special rules for CDNs like Akamai already in place. CloudFlare worked a bit differently, but fell into the same general category. With the cooperation of these search teams we were able to get CloudFlare's IP ranges listed in a special category within search crawlers. Not only does this keep sites behind them from being clustered to a least performant denominator, or incorrectly geo-tagged based on the DNS resolution IP, it also allows the search engines to crawl at their maximum velocity since CloudFlare can handle the load without overburdening the origin."
-
RE: Some Old date showing in SERP
How odd. I'm not sure of the answer, but before we go any further I was hoping you could verify a couple of things:
1. In Google Search console, can you verify that your sitemaps are submitted and that Google is indexing/reading them? I would think since you have a "last mod" date in your sitemap it would send a signal to Google that the page was more up to date.
2. When looking at the cache of your page in Google, it doesn't look like all the resources are loading. http://webcache.googleusercontent.com/search?q=cache:example.com
Based on this, if you perform a fetch and render in Google console, does it show that you are blocking any resources?
-
RE: Unnamed Update — February 4, 2015
Unfortunately, there's not a lot of good info out there on this update. Can you give us any insight on the keywords/URLs you saw impacted?
-
RE: Accidentally generated navigation links that point to American site from Australian site.
Probably not too harmful. Unfortunately, it's really hard to predict. There are three likely outcomes:
- You get dinged because Google sees these links as manipulative, or alternatively over-optimized.
- The extra linking (and cross-traffic) actually help you
- Nothing. Google often ignores links from sites with an administrative relationship
I seriously doubt this would trigger a long-term Penguin issue, but as long as you're cleaning it up I really don't see what else you can do other than to pre-emptively disavow links from the cross-sites. But before doing this, I'd seriously question if this is the route you want to take, as there could be other ramifications.
-
RE: Keyword Themes - What's in a theme?
Turns out I wrote a post that expanded on this idea of keyword themes: https://moz.com/blog/keywords-to-concepts
Hope that helps! Best of luck with your SEO.
-
RE: Does a non-canonical URL pass link juice?
Complex question! Caveat: I don't work for Google, and the precise workings of the canonical element in Google's algorithm are mostly educated speculation.
The answer is somewhere in-between yes and no. That's because the canonical element means that URL B is treated as URL A. In that sense it really shouldn't pass any direct link authority.
But(!) now let's complicate things. Let's point some links at URL B. (and not at URL A) In theory, those links are then canonicalized to URL A, and that equity passes to your site (yeah!)
So it's not a direct influence, but you can in theory gain link equity from canonicalized versions of URLs that point to your site.
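To make that concrete, a sketch with placeholder URLs:

```html
<!-- On URL B (https://example.com/page-b): -->
<link rel="canonical" href="https://example.com/page-a" />
<!-- External links earned by /page-b should, in theory, be
     consolidated to /page-a rather than passing equity directly
     through /page-b itself. -->
```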
-
RE: How do you check what links there are to a specific page on a site?
Hi Leo,
Sounds like you were doing the right thing. Different tools will show different numbers, as all tools use different link indexes. In general, Moz is a bit more picky about the links it displays - we try to display the most important links that are likely to have an impact on your ranking. The downside of this is that our index can be smaller than some of the others (Ahrefs, Majestic) and you'll often find a bigger volume of links with those other indexes.
Also, for general help in using Open Site Explorer, here's an excellent resource: https://moz.com/help/guides/research-tools/open-site-explorer
-
RE: Similar pages on a site
I'm simply going to re-emphasize what others have said here: it's more important how similar the content is than anything else. "Jumpers" is certainly broad enough that you can attack it from several different content angles. If your website sells jumpers, it's not unusual to have multiple pages about jumpers.
The key is that every page should serve a specific purpose. If this isn't the case, work to find ways to either:
- Consolidate or
- Make the purpose of each page uniquely valuable.
Hope that helps! Best of luck with your SEO.
-
RE: After Server Migration - Crawling Gets slow and Dynamic Pages wherein Content changes are not getting Updated
The good news is, this actually sounds pretty normal. 24 hours to reflect changes in content is better than many sites. I can't account for why it dropped from 4 to 24, but I'd say this is still in the range of "good."
-
RE: After Server Migration - Crawling Gets slow and Dynamic Pages wherein Content changes are not getting Updated
Howdy,
A couple of questions:
1. Are there certain pages that aren't getting updated, or is it your entire site?
2. How often are changes in the pages reflected in Google's cache? Is it a case where Google simply displays old/outdated information all the time?
3. Finally, have you done a "Fetch and Render" check in Google Webmaster Tools?
-
RE: Whether or not to remove a link from a website with high spam score on Open Site Explorer
Anything with "Link Exchange" in the title should be bumped to a Spam Score of 100.
Those are really, really horrible links. I'd probably disavow them just to be safe. On the other hand, unless you've received a manual action notice in Google Webmaster Tools, it's quite possible those links aren't hurting you at all and Google is simply ignoring them.
On the other hand, those links are just awful.
-
RE: Nofollow Outbound Links on Listings from Travel Sites?
Good question.
On one hand, I'm a fan of linking out with followed, equity-passing links. There's a good correlation between linking out and higher rankings (though I don't believe we've ever studied the difference between followed and nofollowed in this regard). I hate to see links "nofollowed" simply to protect against Google actions, but it is a reality of doing business.
To me, it comes down to how many of the sites are actual spam. "Low quality" is certainly different than spam. If it's a handful of sites out of thousands, I wouldn't worry about it too much. Generally, tourism websites are of much more trustworthy quality than sites in the gambling/adult/pharmaceutical verticals.
Now, on the other hand, if you do choose to nofollow the links, you probably won't see too many negative consequences.
In the end, I think you have to gauge how bad the sites are that you're linking to, and make your judgement from there.
-
RE: A competitor's SEO firm is building spammy links to my website.
Typically, spam links don't generate much actual traffic. Is it possible that what you're seeing in GA is referral spam instead? http://www.optimizesmart.com/geek-guide-removing-referrer-spam-google-analytics/
Also, do you see the links in Open Site Explorer? This might help determine if the links are real or if the referrals you are seeing are "phantom."
-
RE: Learn how to use Moz's Spam Score metric to identify high risk links. Get your Daily SEO Fix.
Not quite sure what you'd like to know about Domain Authority, but you can find lots of information about it here: https://moz.com/learn/seo/domain-authority
If there is a specific question we can answer, be sure to let us know!
-
RE: Do Ghost Traffic/Spam Referrals factor into rankings, or do they just affect the CTR and Bounce Rate in Analytics?
Short answer: no (or at least, very unlikely)
Google publicly states they don't pull data from inside GA, and realistically they don't need to. They are only concerned with the performance of their search results, and how that traffic responds to individual results. They also have no reason to lie about not using GA data, as there are so many other sources of information that are better.
"And finally... when is google going to shut these open back doors in Analytics so that Vitaly and his ilk are shut down forever?"
Great question! Hopefully soon.
-
RE: Soft 404s for unpublished & 301'd content
Short answer: create a custom 404 page, not just for these pages, but one that can show for every page on your site.
A few resources:
https://support.google.com/webmasters/answer/93641?hl=en
Example: http://moz.com/sadfklfadsadfjs
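For reference, on Apache the custom error page can be wired up with a single directive (the path here is a placeholder; make sure the error page itself returns a 404 status code rather than redirecting):

```apache
# .htaccess — serve a custom error page for any missing URL,
# while still returning a proper 404 status code
ErrorDocument 404 /custom-404.html
```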
-
RE: Soft 404s for unpublished & 301'd content
Yes, it's possible, but that could be considered cloaking. I'd say best to return a 404.
-
RE: Soft 404s for unpublished & 301'd content
In that case, sounds like you should either:
- 404 them if you have evidence these have hurt your rankings/traffic (have you experienced a dip?)
- Ignore them and go about your day
-
RE: Soft 404s for unpublished & 301'd content
If Google thinks the 301 leads to a page that isn't relevant enough, they may flag it as a "soft 404" even though it returns a 301. That's Google's way of saying they think you should 404 these pages instead.
How much will it hurt you? Probably not much, but it's hard to say.
Let's ask these questions:
- How much traffic goes to these pages? If not much, is it okay to 404 them?
- Are there more relevant pages you could redirect these to? (ideally, something with a similar title as the original page?)
- Have you seen much traffic loss overall? If not, it's likely this isn't hurting you.
Hope this helps! Best of luck with your SEO.
-
RE: Ranking For Synonyms Without Creating Duplicate Content.
There are a hundred different ways to do this, but typically my favorite approach is to work the synonym into the same copy without seeming spammy.
For example, if my primary keyword is "GMO" and my very literal synonym is "Genetically Modified Organism," then I'd try to work both variations into the copy:
<title>GMO Dangers - Knowing the Risks of Genetically Modified Organisms</title>
Here's a great article that goes into depth about the advantages of incorporating multiple variants into your SEO targeting http://cognitiveseo.com/blog/5370/941-traffic-increase-exploiting-the-synonyms-seo-ranking-technique/
-
RE: Http to https question (SSL)
Likely a coincidence, or at least highly probable that there are other circumstances at play.
If you changed platforms, content, links, or architecture at all; if there have been any changes in the backlinks; if the competition has made changes (something you can't control!); or if Google has made algorithm changes, even ones specific to your vertical, then you are bound to see changes in rankings that might be hard to pinpoint or explain.
Attorneys, especially those in certain niches like DUI, are especially tough and prone to fluctuation. Might take some extra investigation on your part.
Regardless, the site looks good and fast. Nice work!
-
RE: Can you nofollow a URL?
Looks like a lot of good information from folks here so I'll be brief.
Technically, there's no practical way to redirect the page without redirecting the links. Unless your page serves a 404 or 410 response code, those links will be associated with your domain.
The only way to disassociate yourself from these links is through use of the Disavow Tool.
-
RE: Does rel="canonical" support protocol relative URL?
Protocol-relative URLs can be used, but it's still better to use absolute URLs to avoid any mistakes down the line, e.g. you miss a 301 redirect on a subdirectory and both HTTP and HTTPS versions resolve.
Protocol-relative URLs can work in a pinch, but aren't recommended.
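To illustrate the difference (example.com is a placeholder):

```html
<!-- Protocol-relative: valid, but inherits whatever protocol
     happened to serve the page, so HTTP and HTTPS can diverge -->
<link rel="canonical" href="//example.com/page" />

<!-- Absolute (recommended): unambiguous even if a redirect
     is missed somewhere -->
<link rel="canonical" href="https://example.com/page" />
```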
-
RE: Domain Migration of high traffic site:
It took us 6-8 months to bounce back.
Yes, I'm actually worried about sending confusing signals to Google about where the content is supposed to live.
I don't like making Google think.
-
RE: Domain Migration of high traffic site:
Keep in mind this is a complex situation with lots of moving parts, so there are no "right" answers.
That said, you're dealing with a lot of potential redirects:
- Site migration
- HTTPS
- .php extension rewriting
On one hand, I'm concerned if you stagger these out, you'll be diluting your authority through several 301 redirects.
In this case, it might be best to do it all at once.
On the other hand, doing it all at once greatly increases the chance of something going wrong. In this case, best to stagger things out.
At Moz, we went with "do it all at once" when we migrated to moz.com. Needless to say, we didn't do everything perfectly, and our organic traffic dropped nearly 30%, instead of the 15% we would have normally expected from a smaller migration.
Regardless, I'd still prefer the "all at once" approach, and just be very diligent with your redirects, making sure not to drop pages, rewrite content, change title tags, etc.
As for moving big assets over ahead of time, I know some people endorse it, but I haven't seen a ton of evidence that it makes a significant difference. If it complicates things, I'd likely avoid it.
Hope this helps! Best of luck with your SEO.
-
RE: I would like to get rid of 300,000+ links, please
Hi Linda,
In the absence of any penalty or rankings dip, there's likely nothing to worry about as this is pretty common. If nothing else, even though Google reports those links in GWT it's very possible they discount them in every other way that counts.
On the other hand, if you are worried about it, there should be no harm in proactively disavowing the domain. Lots of agency folks I know make this a regular practice for any suspect links, and 99.9% of the time, if done correctly, no harm will come to your site simply by filing a disavow file (unless you disavow links that are actually good, but that's another story).
-
RE: SEO Implications For a Technical Functionality Fix
If I understand correctly, you have incoming affiliate links which don't work on HTTP due to Varnish, so you are redirecting them to HTTPS, where they do work. Let me know if I'm missing anything.
Okay, first of all if you are serving up two versions of your site on any page (HTTP and HTTPS) without 301'ing one to the other, you should absolutely have canonical tags pointing to the HTTP. And without the affiliate tracking parameter. (Edit: see thoughts below on NOINDEX)
As for 301 vs 302: technically, to stay within Google's guidelines, affiliate links to your site should be nofollowed. In practice, sometimes they can offer a ranking benefit, but more often than not Google discounts them. Regardless, if you abuse them for linking purposes it can come back to bite you in some instances. There's no clear answer, but keep in mind 302s may very well negate some of the link equity from these affiliate links (which may or may not be a good thing).
NOINDEX - My thought process is, if you don't want the HTTPS URLs indexed, and the link equity from the affiliate links isn't a consideration, then it's likely best to NOINDEX, NOFOLLOW the pages. This ensures they will be kept out of Google's index, keep crawl efficiency optimized, and deliver cleaner results. This also means the canonical tag isn't necessary. (and probably unwanted, as it sends conflicting signals with the NOINDEX tag) Keep in mind this strategy effectively kills any incoming link equity from the affiliate links, but does help keep you within Google's good graces.
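If you go the NOINDEX route, the tag on those HTTPS pages would look like this (a sketch, assuming you truly want these pages kept out of the index):

```html
<!-- On the HTTPS affiliate-landing URLs only -->
<meta name="robots" content="noindex, nofollow" />
<!-- If you use this, remove any rel="canonical" tag from these
     pages so the two signals don't conflict. -->
```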
-
RE: What exactly is an impression in Google Webmaster Tools search queries with the image filter turned on?
Although my best guess tends to agree with Martijn, I'm not absolutely sure. I searched Google Webmaster Forums but couldn't find a definite answer. It might be worth posting there to see if a representative from Google could lay the question to rest: https://productforums.google.com/forum/#!forum/webmasters
That said, I believe image search impressions only count regular image search (not images that appear in regular results) and do not count impressions below the fold that haven't rendered (because that could go on for a very long time and result in 1000s of impressions that never happened).
-
RE: Bingpreview/1.0b Useragent Using Adding Trailing Slash to all URLs
Sounds like a plan. I'd also make every redirect a 301, just in case. Cheers.
-
RE: Bingpreview/1.0b Useragent Using Adding Trailing Slash to all URLs
On one hand I'd agree with you that you shouldn't have to rewrite those URLs on your end. On the other hand, it's usually best practice to make sure both versions of a URL (with slash and/or without) resolve to the same page. The reason for this is that:
- Search bots, including Google, will often "explore" variations of URLs for discoverability reasons - they want to make sure they are discovering all of your available content.
- People will link to you with and without trailing slashes. If they link to you with a trailing slash and your page breaks, you could be wasting link equity, to say nothing of the bad user experience of people visiting your site from the referral links
- For one reason or another it's common to append URLs with various parameters (for tracking reasons, campaigns, etc.), and often these URLs are generated by third-party services when pointing at your site.
For all of these reasons, it's pretty common to either force redirect trailing slashes (via a 301) or make sure both versions resolve to the same content, and use a rel=canonical tag to indicate to search engines that these are indeed meant to be the same page.
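If you go the redirect route on Apache, a sketch of the .htaccess rules might look like this (an assumption-laden example; it treats the non-slash version as canonical and skips real directories, so adjust for your setup):

```apache
# .htaccess sketch: 301-redirect URLs that end in a trailing slash
# to the non-slash version, leaving real directories alone
RewriteEngine On
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule ^(.*)/$ /$1 [R=301,L]
```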
On the other hand, if this is something not feasible and URLs ending in a slash are indeed different pages, you might want to carefully consider what those pages deliver to both humans and bots because it seems inevitable that both will eventually crawl and stumble upon them.
Perhaps not the answer you were looking for, but I hope it helps.
-
RE: What's Moz's Strategy behind their blog main categories?
1. Right, each category is linked to in the author byline at the top of the post. I don't believe the links there carry much weight either, but there are literally 1000s of them throughout the site for each category, and the links are connected to semantically relevant blog posts, as opposed to a topic-agnostic sidebar link.
4. Whoops, I goofed by calling it a "non-html" pull-down (typing too fast). Of course it's HTML. I simply meant we moved them from a static sidebar format into a pull-down, non-linking sidebar format.
Cheers!
-
RE: A few pages deindexed from Google .. PLEASE HELP!
Hi Edmond,
Like Ray pointed out, there could be multiple reasons. If you'd like another set of eyes, feel free to PM me as well.
-
RE: What's Moz's Strategy behind their blog main categories?
Hi David,
Great question. Couple of points I'll go over, and realize the answer only applies specifically to Moz. Others may find different optimal solutions.
1. Crawling categories isn't a problem for us. Each blog post contains several juice-passing category links within the body of the post. And we have 1000s of posts all linking to individual categories.
2. Our reason for putting categories in a drop down was simply to save space. We could have made the links crawlable using pure CSS/HTML, but we suspected it really wouldn't make a difference.
3. Sidebar links have diminished value. It's doubtful much link equity was flowing through them in the first place (for other architectures it may be beneficial to have completely robot friendly sidebar category links, but in our case they were so redundant they weren't really necessary)
4. Finally, when we moved categories into the drop-down, we noticed no change to rankings/traffic to category pages, so we left it as is. Had this been different, we would have reconsidered our strategy.
Again, this strategy is outside traditional "best practices," but practically speaking it works just fine for us; it may be different for newer sites, sites without as much link equity, etc.
Hope this helps! Best of luck.
-
RE: SEO Monthly Strategy
Personally, I try to avoid this approach and focus on quality over quantity.
A single, well written and shared piece of content has the potential to earn many times more links and traffic than 50 purchased or hastily written articles when the quality is in question.
Really, I don't care about how many posts I've written or how many comments I've made (and by directory submissions I hope you mean local directories!). What I do care about is how users engage with those activities, how often my content is shared by authoritative influencers, and how these activities contribute to SEO success.
Sticking to a structure and/or schedule is important, but I would avoid the cookie-cutter approach to SEO. A couple presentations by Rand that may be helpful.
Hope that helps! Best of luck with your SEO.
-
RE: Google crawling different content--ever ok?
This is not the definition of cloaking and I wouldn't worry too much about any penalty.
That said, anytime you redirect googlebot to a different experience than users it's a situation you want to be very careful with, and in most situations avoid. Often this is solved by serving different experiences via javascript. Even though Google is pretty darn good at parsing javascript, they will often interpret the default version of a page as if the javascript is turned off.
Regardless, I'd keep an eye on search results, Google Webmaster Tools, cached versions of your site and make ample use of "Fetch and Render" in GWT to ensure Google interprets your site they way you think it should.
-
RE: Optimizing internal links or over-optimizing?
Let's put it this way...
1. Google expects to see pages like FAQs, About Us, Contact pages, etc. have a high number of internal links.
2. The link equity "leaked" to these pages is usually negligible. Sure, there's a small amount of PageRank, but it's not really considered anything that would influence your on-page optimization.
On the other hand, if the links were in obvious places, and there were a lot of them, I might be tempted to control my link equity as well. It really all depends on where the links are placed and how prominent they are on the page. "How likely is the user to click this link?" is a good question to ask a.k.a. "reasonable surfer"
3. At the same time, Google probably doesn't care too much that you removed those links, either. Very rare that removing links is seen as over-optimization.
4. Google is getting very good at following javascript links. They may not show up in Google Webmaster Tools, but it's likely Google is parsing your javascript anyway. (Unsure how much link equity, if any, flows through this.)
If you're concerned about it, you can simply watch your rankings, traffic, and crawl stats to monitor for problems. If problems appear, simply revert your changes back to the original, and hopefully everything will be good.
Best of luck with your SEO!