I'm wondering if anyone knows of a tool that will aggressively rewrite CSS to compress styles more efficiently. e.g. I'd like:
.foo { color : red; font-size: 16px; height: 20px; }
.bar { color : red; font-size: 16px; height: 30px; }
to be compressed to:
.foo, .bar { color : red; font-size : 16px; }
.foo { height : 20px; }
.bar { height : 30px; }
To be clear, all the minifiers I know of, like YUI Compressor, only remove white-space and possibly join a few properties (like font-family and font-size into font). I'm looking for something that's willing to do a complete re-write of the structure of a file.
Short of that, if anyone knows of any work that has been done on the compression logic behind this, that info would be appreciated. I'm thinking of writing my own if I can't find one, but there are a million things to consider, like margin-top over-writing part of margin, selector specificity and include order, and so on. Then there's the job of how to efficiently compress the info, e.g. will it be more efficient to repeat a selector or a property?
You can also use http://www.minifycss.com/css-compressor/ or http://www.ventio.se/tools/minify-css/
Have you seen YUI Compressor?
I don't know of any aggressive CSS minification tool, but you could use the following approach:
Setup
1. Expand all shorthand properties (e.g. margin:1px 0 0 0; to margin-top:1px; margin-left:0px; ...).
2. Build two sets: A (unique selectors, e.g. div, p > span, #myid) and B (unique properties, e.g. display:block;, color:#deadbeef;), and connect each selector to every property it declares.
3. Define a weight function c for your elements in B. This could be the number of neighbors of a given element b, or accumulated length of properties - accumulated length of selectors. Your choice.
You may notice that by using this approach you'll get a bipartite graph.
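To illustrate the setup, here is a rough Python sketch (the sample rules and the choice of c(b) as the number of neighbouring selectors are assumptions made just for this example):
from collections import defaultdict

# Longhand-expanded rules: selector -> set of "property:value" declarations.
rules = {
    ".foo": {"color:red", "font-size:16px", "height:20px"},
    ".bar": {"color:red", "font-size:16px", "height:30px"},
}

# A = unique selectors, B = unique properties; an edge joins a selector to each
# property it declares, which is the bipartite graph described above.
A = set(rules)
edges = defaultdict(set)   # property -> selectors that declare it
for selector, props in rules.items():
    for prop in props:
        edges[prop].add(selector)
B = set(edges)

def weight(prop):
    # One possible weight c(b): the number of neighbouring selectors.
    return len(edges[prop])

for prop in sorted(B, key=weight, reverse=True):
    print(prop, "->", sorted(edges[prop]))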
Algorithm
Run a greedy loop over the graph: while B is not empty, take the element b of B with the highest weight, collect b (together with any other properties attached to exactly the same selectors) into a set Z, write Z out as a single rule for those selectors (re-combining expanded properties where possible, e.g. margin-top:0px;margin-left:1px), then delete the elements in Z and repeat.
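A rough sketch of such a pass in Python (the grouping of properties with identical selector sets into Z and the output format are illustrative assumptions, not exact pseudo code):
# property -> selectors that declare it, e.g. as produced by the setup above.
edges = {
    "color:red":      {".foo", ".bar"},
    "font-size:16px": {".foo", ".bar"},
    "height:20px":    {".foo"},
    "height:30px":    {".bar"},
}

adj = {prop: set(sel) for prop, sel in edges.items()}
while adj:
    # Take the heaviest remaining property (weight = number of neighbouring selectors).
    best = max(adj, key=lambda p: len(adj[p]))
    selectors = adj[best]
    # Z: the chosen property plus every property attached to exactly the same selectors.
    Z = sorted(p for p, sel in adj.items() if sel == selectors)
    print(", ".join(sorted(selectors)) + " { " + " ".join(p + ";" for p in Z) + " }")
    for p in Z:                      # delete Z; the weights of the rest shrink implicitly
        del adj[p]
On the question's example this prints the grouped rule for .foo and .bar first, then the two separate height rules.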
Remarks
Please note that the actual compression depends on your weight function. As it is a greedy algorithm it will likely return a minified CSS, but I believe someone will post a counterexample. Note also that you have to update your weight function after deleting the elements in Z.
Runtime estimate
The algorithm will always terminate and will run in O(|B|^2 * |A|) if I'm not mistaken. If you use a heap and sort the properties in each adjacency list (setup time O(|B| * |A| * log(|A|))), you'll get O(|B| * |A| * (log(|B|) + log(|A|))).
A project called CSS Tools claims to do this.
CSS Tidy works like a champ!