Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies.
Can anyone recommend a tool that will identify unused and duplicate CSS across an entire site?
-
Hi all,
So far I have found this one: http://unused-css.com/ It looks like it identifies unused CSS, but perhaps not duplicates? It also has a 5,000-page limit and our site is 8,000+ pages, so we really need something that can handle a site larger than that.
I do have Screaming Frog. Is there a way to use Screaming Frog to locate unused and duplicate CSS?
Any recommendations and/or tips would be great. I am also aware of the Firefox extensions, but to my knowledge they only work on one page at a time?
Thanks!
-
I read your post, Mstoic Hemant, and noticed your comment about Firefox 10. Since I couldn't get Dust-Me Spider to work in my current version of Firefox, I tried downloading and installing the older version 10 as you suggested. When I did, I received a message that Dust-Me Spider was not compatible with that version of Firefox, and it was disabled.
We are considering purchasing the paid version of Unused CSS (http://unused-css.com/pricing) - do you have any experience with the upgraded version? Does it deliver what it promises?
Thanks!
-
Hi Hemant,
I tried using Dust-Me in Firefox, but for some reason it won't work on this sitemap: http://www.ccisolutions.com/rssfeeds/CCISolutions.xml
Could it be that this sitemap is too large? I even tried setting up a local folder to store the data, but every time I try the spider I get the message "The sitemap has no links."
I am using Firefox 27.0.1
-
Hi Dana, did either of these responses help? What did you end up settling on? We'd love an update! Thanks.
Christy
-
I have an article on that here. An extension for Firefox called Dust-Me Selectors can help you identify unused CSS across multiple pages. It tracks the pages you visit on a site and records the classes and IDs that are never used. Moreover, you can also give it a sitemap and it will work out which CSS is never used.
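If the spider rejects a sitemap (as one reply above reports with "The sitemap has no links"), a quick first check is whether the file actually exposes plain <loc> URLs. Here's a throwaway Ruby snippet for that check - purely illustrative, and it assumes a standard XML sitemap reachable over HTTP:

```ruby
# Illustrative only: print every <loc> URL a sitemap exposes, as a quick
# sanity check when a spider claims the sitemap has no links.
require 'net/http'
require 'uri'

abort 'usage: sitemap_urls.rb https://example.com/sitemap.xml' if ARGV.empty?
xml = Net::HTTP.get(URI(ARGV[0]))

# Rough regex scan; fine for a sanity check, no XML library needed.
xml.scan(%r{<loc>\s*([^<]+?)\s*</loc>}i) { |(url)| puts url }
```

If that prints nothing, the extension has nothing to crawl and the problem is the sitemap, not the tool.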
-
This sounds like it might just do the trick. You'll need to have Ruby installed for it to work. If you have a Mac, it's already on there. If you're on Windows, you'll need this. It's pretty easy; I installed Ruby on my Windows gaming rig. If you're running a Linux flavor, try this.
Just take the URLs from your site crawl and put them in a txt file. You can then compare that list against your CSS file. I've never tried it on a large site, so let me know how it goes for you.
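To give a rough idea of what that comparison looks like, here's a minimal Ruby sketch - purely illustrative, not the actual tool linked above. It assumes a urls.txt of crawled URLs (e.g. exported from Screaming Frog) and a single stylesheet, and it only does a naive text scan rather than real CSS parsing:

```ruby
#!/usr/bin/env ruby
# Illustrative sketch only -- not the tool linked above. Reads a list of
# crawled URLs and a stylesheet, then reports every .class/#id selector
# in the stylesheet that no crawled page appears to use.
require 'net/http'
require 'uri'
require 'set'

urls_file, css_file = ARGV
abort 'usage: unused_css.rb urls.txt styles.css' unless urls_file && css_file

# Naive scan for .class and #id tokens in the stylesheet. Rough on
# purpose: hex colours like #fff will show up as false positives.
selectors = File.read(css_file).scan(/[.#][A-Za-z_][\w-]*/).to_set

# Fetch each crawled page and record the classes and ids it uses.
used = Set.new
File.readlines(urls_file, chomp: true).each do |url|
  html = Net::HTTP.get(URI(url))
  html.scan(/class\s*=\s*["']([^"']+)["']/i) do |(classes)|
    classes.split.each { |c| used << ".#{c}" }
  end
  html.scan(/id\s*=\s*["']([^"']+)["']/i) { |(id)| used << "##{id.strip}" }
rescue StandardError => e
  warn "skipping #{url}: #{e.message}"
end

# Selectors the stylesheet defines but no crawled page referenced.
puts (selectors - used).sort
```

Run it as `ruby unused_css.rb urls.txt styles.css`. Note this only covers the "unused" half of the question; spotting duplicate rules would be a separate pass over the stylesheet itself.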