AlphaImageLoader filter
-
What is an AlphaImageLoader filter, and what does it do to on-page optimization?
-
It's an old proprietary Microsoft CSS filter that was used to patch rendering in older Internet Explorer browsers (IE 6 and earlier), which didn't properly support alpha transparency in .png images.
Its use is harmful for page optimization because the browser blocks rendering of the rest of the page while the filter is processing each element it's applied to.
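For reference, here's roughly what the old declaration looked like next to its modern replacement. This is just a sketch; the .logo selector and image path are placeholder examples, not taken from any real site.

    /* Legacy IE 6-and-earlier hack (avoid): the page stalls while this filter runs */
    .logo {
      background: none; /* suppress the normal background so the filter's copy isn't doubled */
      filter: progid:DXImageTransform.Microsoft.AlphaImageLoader(src='/images/logo.png', sizingMethod='scale');
    }

    /* Modern replacement: every current browser renders PNG alpha transparency natively */
    .logo {
      background: url('/images/logo.png') no-repeat;
    }

If you still find the progid line in a legacy stylesheet, you can simply delete it and reference the PNG directly, as in the second rule.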
Since most of us no longer worry about supporting IE 6 and older in our site designs, it's not worth using at the expense of all our modern browser users.
Paul
Related Questions
Filter by Category bad for SEO?
Hello everyone! I know that a single product should not have a filter-by-color option, since it will create duplicate content, and you have to use canonical tags to solve it. BUT how about sorting through products via category/brands? Filtering by category changes the URL of the general shop page (e.g. hello.com/Shop/Category1022039). That page only displays the products within it; no content, descriptions, etc., unlike the original category page. Each category/brand already has its own individual page (e.g. hello.com/Shop/A). That is the page that will be optimized for content, FAQs, ranking, etc., unlike the URL created when filtering through the categories. So technically I would have two URLs for each brand/category. Would they compete with each other? What would you suggest? Please advise me on this. Thank you!
On-Page Optimization | Safxmed
ECommerce Filtering's Effect on SEO
I'm building an eCommerce website which has an advanced filter on the left-hand side of the category pages. It allows users to tick boxes for colours, sizes, materials, and so on. When they've made their choices they submit (this will likely be an AJAX thing in a future release, but isn't at time of writing). The new filtered page has a new URL, which is made up of the IDs of the filters they've ticked; it's a bit like /department/2/17-7-4/10/. My concern is that the filtered pages are, for the most part, going to be the same as the parent, which may lead to duplicate content. My other concern is that these two URLs would lead to the exact same page (although the system would never generate the 'wrong' URL): /department/2/17-7-4/10/ and /department/2/10/17-7-4/. But I can't think of a way of canonicalising that automatically. Tricky. So the meat of the question is this: should I worry about this causing issues with SEO, or can I trust Google to work it out?
On-Page Optimization | AndieF
How can I find out which filter my site has fallen under?
How can I find out which filter my site has fallen under? There are only 22 pages (out of 380) that are not under Google filters. Thank you in advance.
On-Page Optimization | andreysmiling1987
Too many on-page links created by filters
I have an ecommerce site, and SEOmoz's "Crawl Diagnostics Summary" points out that I have too many hyperlinks on most of my pages. The most recent thing I've done that could be the culprit is the creation of a number of product filters. Each filter I put on the page creates a hyperlink off that page. As an example, there's a filter available for manufacturers; under that, there are 8 new filter links, thus new hyperlinks. On one category there are 60 new links created because of filters. I feel like these filters have made the user experience on the site better BUT have dramatically increased the number of outbound links off the page. I know keeping it to under 100 is a rule of thumb, but at the same time there must be some validity to trying to limit them. Do you have any recommendation on how I can "have my cake and eat it too"? Thanks for any help!
On-Page Optimization | jake372
Does 'XXX' in a domain get filtered by Google?
I have a friend who has xxx in their domain; they are a religious-based sex/porn addiction company, but they don't show up for the queries they are optimized for. They have a 12+ year old domain and all good health signs: quality links and press from trusted companies. Google sends them adult traffic, mostly 'trolls' and not the users they are looking for. Has anyone experienced domain-word filtering, and do you have a workaround or solution? I posted in the Google Webmaster help forums, and that community seems a little 'high on their horses' and trying too hard to be cool. I am not too religious and don't necessarily support the views of the website; I'm just trying to help a friend of a friend with a topic that I have never encountered. Here is the URL: xxxchurch.com. Thanks, Brian
On-Page Optimization | Add3.com
Swear Filter - SERP Impact
My forum currently has a swear filter in place. While I personally think in most cases there are better alternatives to swearing, the general consensus is that it should be removed, and I've no problem with that in principle. However, my concern is that Google may penalise the site in some way if this is done. I've searched around a fair bit and haven't found any solid info on this, so I'm hoping someone on here may know the answer. The question: can repeated swear words affect rankings or prevent a website displaying in Google if SafeSearch is on? Thanks as always.
On-Page Optimization | Optimise
Anchor text filters
I am using text replacement on a blog to automatically link keywords in posts and pages back to the homepage. Sometimes the same KEYWORD links back 2-3 times in one post. Can this harm rankings or cause an anchor-text filter? Thanks.
On-Page Optimization | babyjane
Filtered Navigation: Duplicate Content Issue on an Ecommerce Website
I have navigation that allows for multiple levels of filtering. What is the best way to prevent the search engine from seeing this duplicate content? Is it a big deal nowadays? I've read many articles and I'm not entirely clear on the solution. For example: you have a page that lists 12 products out of 100, companyname.com/productcategory/page1.htm, and then you filter these products: companyname.com/productcategory/filters/page1.htm. The filtered page may or may not contain items from the original page, but does contain items that are in the unfiltered navigation pages. How do you help the search engine determine where it should crawl and index the page that contains these products? I can't use rel=canonical, because the exact set of products on the filtered page may not be on any other unfiltered page. What about robots.txt to block all the filtered pages? Will that also stop PageRank from flowing? What about the meta noindex tag on the filtered pages? I have also considered removing filters entirely, but I'm not sure if sacrificing usability is worth it in order to remove duplicate content. I've read a bunch of blogs and articles, and seen the whiteboard special on faceted navigation, but I'm still not clear on how to deal with this issue.
On-Page Optimization | 13375auc3