Azure WebJobs Blob Trigger - multiple resizes

Posted 2019-07-25 12:58

Question:

I'm attempting to create a C# Azure WebJob that is triggered when a new blob is created and resizes the uploaded image into three different sizes. I found and followed this great tutorial.

The tutorial has two sections. The first portion "works", but it enters a recursion loop: creating the three new sizes triggers the function again, which creates three more images for each of the three new ones, and so on. This was intentional, to highlight the need for the final implementation.

Here is the initial recursion-loop code which "works", located in the Functions.cs file:

public static void ResizeImagesW800([BlobTrigger("input/{name}.{ext}")] Stream input,
    [Blob("output/{name}-w800.{ext}", FileAccess.Write)] Stream output)
{
    ResizeImage(input, output, 800);
}

public static void ResizeImagesW500([BlobTrigger("input/{name}.{ext}")] Stream input,
    [Blob("output/{name}-w500.{ext}", FileAccess.Write)] Stream output)
{
    ResizeImage(input, output, 500);
}

private static void ResizeImage(Stream input, Stream output, int width)
{
    var instructions = new Instructions
    {
        Width = width,
        Mode = FitMode.Carve,
        Scale = ScaleMode.Both
    };
    ImageBuilder.Current.Build(new ImageJob(input, output, instructions));
}

Here is the code which Visual Studio 2015 gives an error on:

public static void ResizeImagesTask(
    [BlobTrigger("input/{name}.{ext}")] Stream inputBlob,
    string name,
    string ext,
    IBinder binder)
{
    int[] sizes = { 800, 500, 250 };
    var inputBytes = inputBlob.CopyToBytes();
    foreach (var width in sizes)
    {
        var input = new MemoryStream(inputBytes);
        var output = binder.Bind<Stream>(new BlobAttribute($"output/{name}-w{width}.{ext}", FileAccess.Write));

        ResizeImage(input, output, width);
    }
}

private static void ResizeImage(Stream input, Stream output, int width)
{
    var instructions = new Instructions
    {
        Width = width,
        Mode = FitMode.Carve,
        Scale = ScaleMode.Both
    };
    ImageBuilder.Current.Build(new ImageJob(input, output, instructions));
}

The error is thrown at this line:

 var inputBytes = inputBlob.CopyToBytes();

The error is:

CS1061: 'Stream' does not contain a definition for 'CopyToBytes' and no extension method 'CopyToBytes' accepting a first argument of type 'Stream' could be found (are you missing a using directive or an assembly reference?)

I've tried using .NET 3.5, 4.0, 4.5, 4.5.1, 4.5.2, 4.6, 4.6.1 as target frameworks, but all of them throw the same error.

Also, here are the using statements for the Functions.cs file:

using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Text;
using System.Threading.Tasks;
using Microsoft.Azure.WebJobs;
using Microsoft.WindowsAzure.Storage;
using ImageResizer;

What am I doing wrong here? Thanks!

UPDATE 1

using System;
using System.Collections.Generic;
using System.IO;
using System.Web;
using System.Linq;
using System.Text;
using System.Threading.Tasks;
using Microsoft.Azure.WebJobs;
using Microsoft.WindowsAzure.Storage;
using ImageResizer;
using ImageResizer.ExtensionMethods;
using Microsoft.WindowsAzure.Storage.Blob;

namespace HilcoIndustrialAssetApiWebJob
{
    public class Functions
    {
        // output blob sizes
        private static readonly int[] Sizes = { 800, 500, 250 };

        public static void ResizeImagesTask(
        [QueueTrigger("newfileuploaded")] string filename,
        [Blob("input/{queueTrigger}", FileAccess.Read)] Stream blobStream,
        [Blob("output")] CloudBlobContainer container)
        {
            // Extract the filename  and the file extension
            var name = Path.GetFileNameWithoutExtension(filename);
            var ext = Path.GetExtension(filename);

            Console.WriteLine("New Blob name -> " + name);

            // Get the mime type to set the content type
            var mimeType = MimeMapping.GetMimeMapping(filename);

            foreach (var width in Sizes)
            {
                // Set the position of the input stream to the beginning.
                blobStream.Seek(0, SeekOrigin.Begin);

                // Get the output stream
                var outputStream = new MemoryStream();
                ResizeImage(blobStream, outputStream, width);

                // Get the blob reference (ext from Path.GetExtension already includes the leading dot)
                var blob = container.GetBlockBlobReference($"{name}_{width}{ext}");

                // Set the position of the output stream to the beginning.
                outputStream.Seek(0, SeekOrigin.Begin);
                blob.UploadFromStream(outputStream);

                // Update the content type =>  don't know if required
                blob.Properties.ContentType = mimeType;
                blob.SetProperties();
            }
        }

        private static void ResizeImage(Stream input, Stream output, int width)
        {
            var instructions = new Instructions
            {
                Width = width,
                Mode = FitMode.Carve,
                Scale = ScaleMode.Both
            };
            var imageJob = new ImageJob(input, output, instructions);

            // Do not dispose the source object
            imageJob.DisposeSourceObject = false;
            imageJob.Build();
        }
    }
}

Answer 1:

I guess the sample uses the ImageResizer NuGet package. You can install it from VS2015 with the command Install-Package ImageResizer. Then, if you add using ImageResizer.ExtensionMethods; to your code, you'll get the CopyToBytes method extending the Stream object. Hope this helps. Best regards, Stéphane
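To spell that out as code (a minimal sketch: the Install-Package command and the ImageResizer.ExtensionMethods namespace are what the answer refers to, while the plain-BCL MemoryStream fallback is an added alternative, not part of the original answer):

// Package Manager Console:
//   Install-Package ImageResizer

using System.IO;
using ImageResizer.ExtensionMethods; // brings CopyToBytes into scope as a Stream extension method

public static class StreamHelpers
{
    public static byte[] ReadAllBytes(Stream inputBlob)
    {
        // With the using directive above, the original line compiles:
        // var inputBytes = inputBlob.CopyToBytes();

        // Plain-BCL alternative that avoids the extension method entirely:
        using (var buffer = new MemoryStream())
        {
            inputBlob.CopyTo(buffer);
            return buffer.ToArray();
        }
    }
}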



Answer 2:

The Azure WebJobs SDK supports blob binding, so you can bind directly to a blob.

In your context, you want to bind to an input blob and create multiple output blobs.

  • Use the BlobTriggerAttribute for the input.
  • Use the BlobAttribute to bind to your output blobs. Because you want to create multiple output blobs, you can bind directly to the output container.

The code of your triggered function can look like this:

using System.IO;
using System.Web;
using ImageResizer;
using Microsoft.Azure.WebJobs;
using Microsoft.WindowsAzure.Storage.Blob;

public class Functions
{
    // output blob sizes
    private static readonly int[] Sizes = {800, 500, 250};

    public static void ResizeImage(
        [BlobTrigger("input/{name}.{ext}")] Stream blobStream, string name, string ext
        , [Blob("output")] CloudBlobContainer container)
    {
        // Get the mime type to set the content type
        var mimeType = MimeMapping.GetMimeMapping($"{name}.{ext}");
        foreach (var width in Sizes)
        {
            // Set the position of the input stream to the beginning.
            blobStream.Seek(0, SeekOrigin.Begin);

            // Get the output stream
            var outputStream = new MemoryStream();
            ResizeImage(blobStream, outputStream, width);

            // Get the blob reference
            var blob = container.GetBlockBlobReference($"{name}-w{width}.{ext}");

            // Set the position of the output stream to the beginning.
            outputStream.Seek(0, SeekOrigin.Begin);
            blob.UploadFromStream(outputStream);

            // Update the content type
            blob.Properties.ContentType = mimeType;
            blob.SetProperties();
        }
    }

    private static void ResizeImage(Stream input, Stream output, int width)
    {
        var instructions = new Instructions
        {
            Width = width,
            Mode = FitMode.Carve,
            Scale = ScaleMode.Both
        };
        var imageJob = new ImageJob(input, output, instructions);

        // Do not dispose the source object
        imageJob.DisposeSourceObject = false;
        imageJob.Build();
    }
}

Note the use of DisposeSourceObject on the ImageJob object so that we can read the blob stream multiple times.
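If you prefer not to rely on DisposeSourceObject, an alternative sketch (not part of this answer, but close to what the question attempted with IBinder) is to buffer the blob once and give each resize its own MemoryStream; the method name ResizeImageBuffered is hypothetical, and it reuses the Sizes field and ResizeImage helper shown above:

public static void ResizeImageBuffered(
    [BlobTrigger("input/{name}.{ext}")] Stream blobStream, string name, string ext,
    [Blob("output")] CloudBlobContainer container)
{
    // Buffer the input blob once, then hand each resize a fresh stream.
    byte[] inputBytes;
    using (var buffer = new MemoryStream())
    {
        blobStream.CopyTo(buffer);
        inputBytes = buffer.ToArray();
    }

    foreach (var width in Sizes)
    {
        using (var input = new MemoryStream(inputBytes))
        using (var outputStream = new MemoryStream())
        {
            ResizeImage(input, outputStream, width);

            // Upload the resized image to the output container.
            var blob = container.GetBlockBlobReference($"{name}-w{width}.{ext}");
            outputStream.Seek(0, SeekOrigin.Begin);
            blob.UploadFromStream(outputStream);
        }
    }
}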

In addition, you should have a look at the WebJobs documentation about BlobTrigger: How to use Azure blob storage with the WebJobs SDK

The WebJobs SDK scans log files to watch for new or changed blobs. This process is not real-time; a function might not get triggered until several minutes or longer after the blob is created. In addition, storage logs are created on a "best efforts" basis; there is no guarantee that all events will be captured. Under some conditions, logs might be missed. If the speed and reliability limitations of blob triggers are not acceptable for your application, the recommended method is to create a queue message when you create the blob, and use the QueueTrigger attribute instead of the BlobTrigger attribute on the function that processes the blob.

So it could be better to trigger the function from a queue message that just contains the filename; you can then bind the input blob automatically from the queue message:

using System.IO;
using System.Web;
using ImageResizer;
using Microsoft.Azure.WebJobs;
using Microsoft.WindowsAzure.Storage.Blob;

public class Functions
{
    // output blob sizes
    private static readonly int[] Sizes = { 800, 500, 250 };

    public static void ResizeImagesTask1(
        [QueueTrigger("newfileuploaded")] string filename,
        [Blob("input/{queueTrigger}", FileAccess.Read)] Stream blobStream,
        [Blob("output")] CloudBlobContainer container)
    {
        // Extract the filename  and the file extension
        var name = Path.GetFileNameWithoutExtension(filename);
        var ext = Path.GetExtension(filename);

        // Get the mime type to set the content type
        var mimeType = MimeMapping.GetMimeMapping(filename);

        foreach (var width in Sizes)
        {
            // Set the position of the input stream to the beginning.
            blobStream.Seek(0, SeekOrigin.Begin);

            // Get the output stream
            var outputStream = new MemoryStream();
            ResizeImage(blobStream, outputStream, width);

            // Get the blob reference (ext from Path.GetExtension already includes the leading dot)
            var blob = container.GetBlockBlobReference($"{name}-w{width}{ext}");

            // Set the position of the output stream to the beginning.
            outputStream.Seek(0, SeekOrigin.Begin);
            blob.UploadFromStream(outputStream);

            // Update the content type =>  don't know if required
            blob.Properties.ContentType = mimeType;
            blob.SetProperties();
        }
    }

    private static void ResizeImage(Stream input, Stream output, int width)
    {
        var instructions = new Instructions
        {
            Width = width,
            Mode = FitMode.Carve,
            Scale = ScaleMode.Both
        };
        var imageJob = new ImageJob(input, output, instructions);

        // Do not dispose the source object
        imageJob.DisposeSourceObject = false;
        imageJob.Build();
    }
}
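For completeness, something has to put the "newfileuploaded" message on the queue when the blob is uploaded; that part lives outside the WebJob. A minimal producer-side sketch, assuming the uploader uses the same storage account (the Uploader class, its method name, and the connection-string handling are assumptions, not from the original answer):

using System.IO;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;
using Microsoft.WindowsAzure.Storage.Queue;

public static class Uploader
{
    public static void UploadAndNotify(string storageConnectionString, string localPath)
    {
        var account = CloudStorageAccount.Parse(storageConnectionString);
        var filename = Path.GetFileName(localPath);

        // 1. Upload the original image to the "input" container.
        var container = account.CreateCloudBlobClient().GetContainerReference("input");
        container.CreateIfNotExists();
        using (var fileStream = File.OpenRead(localPath))
        {
            container.GetBlockBlobReference(filename).UploadFromStream(fileStream);
        }

        // 2. Enqueue the filename so the QueueTrigger function picks it up.
        var queue = account.CreateCloudQueueClient().GetQueueReference("newfileuploaded");
        queue.CreateIfNotExists();
        queue.AddMessage(new CloudQueueMessage(filename));
    }
}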


Answer 3:

I have created a WebJob project with VS2015 (still on Update 1), using the Azure 2.8 SDK (I will update it to 2.9 soon).

I did a cut & paste of the sample code from your original link. I added the two app.config connection strings (and updated the corresponding connection strings in the Web App's Application Settings). I just added the referenced NuGet package and the missing "using" directive. It works fine.

I have published the sample code on GitHub in case you wish to try it.

https://github.com/stephgou/ImageResizer.git

You just have to update the app config and your web app connection strings.

Hope this helps.

Best regards, Stéphane