Can anyone recommend a tool that will identify unused and duplicate CSS across an entire site?
-
Hi all,
So far I have found this one: http://unused-css.com/. It looks like it identifies unused CSS, but perhaps not duplicates? It also has a 5,000-page limit and our site is 8,000+ pages, so we really need something that can handle a site larger than that.
I do have Screaming Frog. Is there a way to use Screaming Frog to locate unused and duplicate CSS?
Any recommendations and/or tips would be great. I am also aware of the Firefox extensions, but to my knowledge they will only do one page at a time?
Thanks!
-
I read your post, Mstoic Hemant, and noticed your comment about Firefox 10. Since I couldn't get Dust-Me Spider to work in my current version of Firefox, I tried downloading and installing the older version 10 as you suggested. When I did so, I received a message that Dust-Me Spider was not compatible with that version of Firefox, and it was disabled.
We are considering purchasing the paid version of Unused CSS (http://unused-css.com/pricing). Do you have any experience with the upgraded version? Does it deliver what it promises?
Thanks!
-
Hi Hemant,
I tried using Dust-Me in Firefox, but for some reason it won't work on this sitemap: http://www.ccisolutions.com/rssfeeds/CCISolutions.xml
Could it be that this sitemap is too large? I even tried setting up a local folder to store the data, but every time I run the spider I get the message "The sitemap has no links."
I am using Firefox 27.0.1
-
Hi Dana, did either of these responses help? What did you end up settling on? We'd love an update! Thanks.
Christy
-
I have an article on that here. A Firefox extension called Dust-Me Selectors can help you identify unused CSS across multiple pages. It tracks every page you visit on a site and records which classes and IDs were never used. You can also give it a sitemap, and it will work out which CSS is never used anywhere on the site.
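If the extension won't run in your version of Firefox, the underlying idea is simple enough to script yourself. Here's a rough Python sketch of the same approach using the requests and BeautifulSoup libraries, with hypothetical sitemap and stylesheet URLs; it relies on crude regexes rather than a real CSS parser, so treat it as a starting point, not a finished tool:

```python
import re
import requests
from bs4 import BeautifulSoup

# Hypothetical URLs -- swap in your own sitemap and stylesheet.
SITEMAP_URL = "http://www.example.com/sitemap.xml"
CSS_URL = "http://www.example.com/css/styles.css"

# Pull page URLs out of the sitemap (naive <loc> extraction).
sitemap_xml = requests.get(SITEMAP_URL, timeout=10).text
page_urls = re.findall(r"<loc>(.*?)</loc>", sitemap_xml)

# Collect every class and id actually used on the crawled pages.
used = set()
for url in page_urls:
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    for tag in soup.find_all(True):
        used.update("." + c for c in tag.get("class", []))
        if tag.get("id"):
            used.add("#" + tag["id"])

# Pull class/id selectors out of the stylesheet.
# Crude regex: it will also catch hex colours like #fff, so expect some noise.
css = requests.get(CSS_URL, timeout=10).text
defined = set(re.findall(r"[.#][A-Za-z_-][\w-]*", css))

print("Selectors defined in the CSS but never seen in the crawl:")
for selector in sorted(defined - used):
    print(" ", selector)
```

On an 8,000+ page site you'd want to throttle the requests, and anything added by JavaScript will show up as "unused" here, which is the same caveat that applies to any crawl-based checker.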
-
This sounds like it might just do the trick. You'll need to have Ruby installed for it to work. If you have a Mac, it's already on there. If you're on Windows, you'll need this; it's pretty easy, I installed Ruby on my Windows gaming rig. If you're running a Linux flavor, try this.
Just take the URLs from your site crawl and put them in a txt file, then compare that list against your CSS file. I've never tried it on a large site, so let me know how it goes for you.
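One thing this doesn't cover is the duplicate-CSS half of the original question. Once you have the CSS file to hand, a quick first pass is simply grouping rules whose declaration blocks are identical. Here's a hedged Python sketch of that idea (naive regex splitting, no proper handling of nested @media blocks, hypothetical styles.css filename):

```python
import re
from collections import defaultdict

def find_duplicate_rules(css_text):
    """Group selectors by their (normalised) declaration block.

    Naive parsing: comments are stripped, then every 'selector { ... }'
    pair is treated as a rule. Nested blocks such as @media are not
    handled properly, so treat the output as candidates to review.
    """
    css_text = re.sub(r"/\*.*?\*/", "", css_text, flags=re.S)  # drop comments
    rules = re.findall(r"([^{}]+)\{([^{}]*)\}", css_text)

    by_block = defaultdict(list)
    for selector, block in rules:
        # Normalise declarations so whitespace and ordering differences don't matter.
        decls = tuple(sorted(d.strip() for d in block.split(";") if d.strip()))
        by_block[decls].append(selector.strip())

    # Any declaration block shared by more than one selector is a candidate duplicate.
    return [sels for sels in by_block.values() if len(sels) > 1]

if __name__ == "__main__":
    with open("styles.css") as f:  # hypothetical filename
        for group in find_duplicate_rules(f.read()):
            print("Same declarations:", ", ".join(group))
```

It will miss near-duplicates and overlapping shorthand properties, but it's usually enough to spot copy-pasted rule blocks.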