How can I compress / gzip my minified .js and .css

Posted 2019-04-22 11:10

Question:

I ran Google pagespeed and it suggests compressing my .js and .css

Eliminate render-blocking JavaScript and CSS in above-the-fold content

Enable compression
Compressing resources with gzip or deflate can reduce the number of bytes sent over the network.
Enable compression for the following resources to reduce their transfer size by 210.9KiB (68% reduction).
Compressing http://xx.com/content/bundles/js.min.js could save 157.3KiB (65% reduction).
Compressing http://xx.com/content/bundles/css.min.css could save 35.5KiB (79% reduction).
Compressing http://xx.com/ could save 18.1KiB (79% reduction).

During my publish I have a step that uses Windows PowerShell to move the minified .js and .css bundles to S3, which are then served through CloudFront.

Is there some step I could add in the PowerShell script that would compress the .js and .css files?

Also, once the files are compressed, do I have to do anything more than change the name to tell the browser that it should expect a gzip file?

Answer 1:

You can add the code needed to gzip-compress the files to your upload script.

Here is some example code:

function Gzip-FileSimple
{
    param
    (
        [String]$inFile = $(throw "Gzip-FileSimple: No filename specified"),
        [String]$outFile = $($inFile + ".gz"),
        [switch]$delete # Delete the original file after compressing
    )

    trap
    {
        Write-Host "Received an exception: $_.  Exiting."
        break
    }

    if (! (Test-Path $inFile))
    {
        "Input file $inFile does not exist."
        exit 1
    }

    Write-Host "Compressing $inFile to $outFile."

    # Note: don't call this stream $input -- $input is a reserved automatic
    # variable in PowerShell, and assigning to it breaks the function.
    $inStream = New-Object System.IO.FileStream $inFile, ([IO.FileMode]::Open), ([IO.FileAccess]::Read), ([IO.FileShare]::Read)

    # Read the whole file into memory (fine for bundle-sized assets)
    $buffer = New-Object byte[]($inStream.Length)
    $byteCount = $inStream.Read($buffer, 0, [int]$inStream.Length)

    if ($byteCount -ne $inStream.Length)
    {
        $inStream.Close()
        Write-Host "Failure reading $inFile."
        exit 2
    }
    $inStream.Close()

    # Write the buffer back out through a GzipStream
    $output = New-Object System.IO.FileStream $outFile, ([IO.FileMode]::Create), ([IO.FileAccess]::Write), ([IO.FileShare]::None)
    $gzipStream = New-Object System.IO.Compression.GzipStream $output, ([IO.Compression.CompressionMode]::Compress)

    $gzipStream.Write($buffer, 0, $buffer.Length)
    $gzipStream.Close()

    $output.Close()

    if ($delete)
    {
        Remove-Item $inFile
    }
}

From this site: Gzip creation in Powershell
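
A minimal usage sketch for the function above, assuming the bundle names from the question (paths are illustrative):

# Compress both minified bundles before the S3 upload step;
# -delete removes the uncompressed originals
Gzip-FileSimple -inFile "content\bundles\js.min.js" -delete
Gzip-FileSimple -inFile "content\bundles\css.min.css" -delete

This leaves js.min.js.gz and css.min.css.gz; if you follow the no-rename advice in the answers below, rename them back to the original names before uploading.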



Answer 2:

PowerShell Community Extensions has a cmdlet for gzipping files, and it's very easy to use:

Write-Gzip foo.js # will create foo.js.gz
mv foo.js.gz foo.js -Force

You don't have to rename your files; just add a Content-Encoding header and set it to gzip.
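
If your push step uses the AWS Tools for PowerShell, a sketch of that upload might look like this (the bucket name and key are placeholders; -HeaderCollection is the Write-S3Object parameter that carries extra HTTP headers such as Content-Encoding):

Write-Gzip js.min.js              # creates js.min.js.gz
mv js.min.js.gz js.min.js -Force  # keep the original file name
# Upload with Content-Encoding set so browsers decompress transparently
Write-S3Object -BucketName "my-bucket" -Key "content/bundles/js.min.js" `
    -File "js.min.js" -ContentType "application/javascript" `
    -HeaderCollection @{ "Content-Encoding" = "gzip" }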



Answer 3:

Since Amazon S3 is intended to serve only static files, it doesn't compress files (assets) for you, which is why you need to compress them yourself:

  • Compress your .js and .css with gzip: I don't know how to do it with PowerShell, but I do with Python; I suggest making a Python deployment script (even better, a fabfile) and integrating the compression and push code into it.

  • "Also once the files are compressed then do I have to do anything more than change the name so as to tell my browser that it will need to try and accept a gzip file?": Good question! It is not necessary to change the name of the compressed file; I suggest not renaming it. However, you:

  • MUST also set the header "Content-Encoding: gzip", otherwise browsers will not know how to decode the file.

  • Must set the header 'Content-Type': type (type = 'application/javascript' or 'text/css').

  • Must set the header 'x-amz-acl': 'public-read' to make it publicly accessible.

  • I suggest also setting the header "Cache-Control: max-age=TIME" (example: TIME=31104000, for 360 days) to help browsers cache it (better performance); see the upload sketch after this list.
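
A sketch of an upload that sets all four headers at once, assuming the AWS Tools for PowerShell instead of Python (bucket, key, and max-age are placeholders):

# Upload a pre-gzipped stylesheet with the headers listed above;
# -CannedACLName public-read corresponds to the x-amz-acl header
Write-S3Object -BucketName "my-bucket" -Key "content/bundles/css.min.css" `
    -File "css.min.css" -ContentType "text/css" `
    -CannedACLName public-read `
    -HeaderCollection @{
        "Content-Encoding" = "gzip"
        "Cache-Control"    = "max-age=31104000"
    }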

This will work whether the files are served from the origin or through CloudFront. But remember: if you serve them with CloudFront, you will need to invalidate all the files after each push, otherwise the old version will live for up to 24 hours after the push. Hope this helps; I can provide a Python solution if needed.
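
The invalidation step can be scripted too; a sketch with the AWS Tools for PowerShell (the distribution ID and paths are placeholders):

# Invalidate the pushed bundles so CloudFront stops serving stale copies;
# the caller reference must be unique for each invalidation request
New-CFInvalidation -DistributionId "E1234EXAMPLE" `
    -InvalidationBatch_CallerReference ([Guid]::NewGuid().ToString()) `
    -Paths_Item "/content/bundles/js.min.js", "/content/bundles/css.min.css" `
    -Paths_Quantity 2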