This link talks about performance and bypassing the portal. To me, a WCF service that authenticates is similar to a portal.
A lightweight service authenticates the client as needed and then generates a SAS. Once the client receives the SAS, they can access storage account resources directly with the permissions defined by the SAS and for the interval allowed by the SAS. The SAS mitigates the need for routing all data through the front-end proxy service.
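A minimal sketch of that handoff in Python, with hypothetical names (`request_sas` stands in for the call to the lightweight service, and the token values are illustrative): the client asks the service for a SAS, then talks to Blob Storage directly with the returned URL.

```python
from urllib.parse import urlencode

ACCOUNT_URL = "https://myaccount.blob.core.windows.net"  # hypothetical account

def request_sas(file_id: str) -> str:
    """Stand-in for the call to the authenticating front-end service.
    A real SAS token carries at least sv (version), sp (permissions),
    se (expiry, UTC ISO-8601) and sig (HMAC signature)."""
    return urlencode({
        "sv": "2021-08-06",
        "sp": "r",                        # read-only permission
        "se": "2030-01-01T00:00:00Z",     # expiry chosen by the service
        "sig": "FAKE-SIGNATURE",          # signed server-side with the account key
    })

def blob_url(container: str, blob_name: str, file_id: str) -> str:
    """Full URL the client can hit directly, bypassing the service."""
    return f"{ACCOUNT_URL}/{container}/{blob_name}?{request_sas(file_id)}"

url = blob_url("documents", "contract-42.pdf", "42")
# The client now issues the HTTP request straight at this URL; for an
# upload it would PUT with the header x-ms-blob-type: BlockBlob.
```

All file bytes then flow between the client and storage; the service only ever handles the small SAS request.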
The application is a thick .NET WPF client communicating with a WCF service hosted in an Azure App Service, using Azure AD for authentication.
It is a document management application, so there is a lot of file transfer. Search and search results are a relatively small amount of traffic, but search needs to be responsive.
Is it over-optimization to use SAS for file upload and download?
The other option is to upload and download the files via the WCF service.
What would be some gotchas with one approach or the other?
My thought is I would like to keep the files off the WCF service to keep it responsive.
If this should be a separate question then fine. The client gets the results of the search 1000 at a time. Even with a long expiration on the SAS, it could expire if they left the results up for hours. If the SAS is a property binding, how might I detect an expired SAS? Each file has a unique ID in the application. Would it be better to just request the SAS in the getter?
There are cases where a user may access almost every file in the search results, and others where they may access only 1 of 100 based on information in the search results. They may be running some large searches to get counts only and access zero files.
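One way to detect expiry client-side is to parse the token's `se` (signed expiry) parameter, which SAS URLs carry as a UTC ISO-8601 timestamp. A sketch in Python (the URL and signature are illustrative):

```python
from datetime import datetime, timezone
from urllib.parse import urlparse, parse_qs

def sas_expired(sas_url: str, skew_seconds: int = 60) -> bool:
    """True if the SAS URL's 'se' (signed expiry) has passed.
    A safety skew treats tokens about to expire as already expired,
    so an in-flight request doesn't fail mid-transfer."""
    query = parse_qs(urlparse(sas_url).query)
    expiry = query["se"][0]                          # e.g. 2024-05-01T12:00:00Z
    exp = datetime.fromisoformat(expiry.replace("Z", "+00:00"))
    now = datetime.now(timezone.utc)
    return (exp - now).total_seconds() <= skew_seconds

url = ("https://acct.blob.core.windows.net/docs/a.pdf"
       "?sv=2021-08-06&sp=r&se=2020-01-01T00:00:00Z&sig=X")
sas_expired(url)  # → True (expiry is in the past)
```

A binding getter could run this check and transparently re-request a fresh SAS by file ID when it returns True.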
I don't think so. Uploading/downloading files using SAS makes complete sense to me.
The biggest advantage of the SAS-based approach is that you're interacting directly with Azure Storage without routing the data through your WCF service. Thus you can keep your service really lightweight and not put too much infrastructure behind it for scaling purposes. With SAS, the WCF service simply gets a request for a SAS for a blob and returns the SAS URL, which your client application can use to upload or download the files.
With SAS, one point of concern could be sharing of the SAS URL and the link getting into the hands of an unintended audience. However, you can mitigate this by keeping SAS tokens short-lived and applying an IP ACL to the SAS.
Not much information is shared about your application (especially the search part), but I am guessing that information about the files is kept in some kind of relational database and the actual files are kept in blob storage. I would stay away from generating SAS tokens as part of the search results and only generate them on demand. If a user is trying to upload a file, you get a SAS URL for upload just before the actual upload process. Similarly, when a user is downloading a file, you get a SAS URL for the file at that time and do the download.
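A sketch of that on-demand pattern from the client side, in Python with hypothetical names (`fetch_sas_from_service` stands in for the WCF call): request a SAS keyed by the file's unique ID only at the moment of transfer, optionally reusing it while it is still comfortably inside a short lifetime.

```python
import time

SAS_TTL_SECONDS = 300          # hypothetical short-lived token lifetime

def fetch_sas_from_service(file_id: str, permission: str) -> str:
    """Stand-in for the WCF call that authenticates the user and returns
    a SAS URL scoped to exactly this blob and permission ('r' or 'w')."""
    return f"https://acct.blob.core.windows.net/docs/{file_id}?sp={permission}&sig=FAKE"

class SasCache:
    """Hands out a SAS per file ID on demand, reusing it only while it
    is well inside its lifetime, so search results never carry tokens."""
    def __init__(self, ttl: int = SAS_TTL_SECONDS, margin: int = 30):
        self._cache: dict[tuple[str, str], tuple[str, float]] = {}
        self._ttl, self._margin = ttl, margin

    def get(self, file_id: str, permission: str = "r") -> str:
        key = (file_id, permission)
        hit = self._cache.get(key)
        if hit and hit[1] - self._margin > time.monotonic():
            return hit[0]                  # still fresh: reuse, no round trip
        url = fetch_sas_from_service(file_id, permission)
        self._cache[key] = (url, time.monotonic() + self._ttl)
        return url

cache = SasCache()
first = cache.get("42")        # triggers a service call just before download
second = cache.get("42")       # served from cache, no extra round trip
```

This fits the access patterns described above: a count-only search requests zero tokens, and a user who opens 1 of 100 results costs one SAS call instead of 100.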