I have a specific application that requires the use of client certificates for mutual authentication of HTTPS requests. The server has a flexible certificate validation policy that allows it to accept self-signed client certificates that are not present in the server's certificate store. This is known to work just fine using curl as a client.
What I've determined via testing and packet sniffing is that Microsoft's .NET `HttpClient` tries to be overly smart during the SSL/TLS handshake. This particular client will only send a client certificate (from the `WebRequestHandler.ClientCertificates` collection) if that certificate has a chain of trust to one of the roots the server advertises as trusted. What I've observed is that if no certificate in the collection has such a chain of trust, the client simply doesn't send a certificate at all during the handshake.
This is understandable default behavior, but it is overly restrictive and there appears to be no way to turn it off. I have experimented with various other `WebRequestHandler` properties, including `AuthenticationLevel` and `ClientCertificateOptions`, but to no avail.
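For reference, this is a minimal sketch of the setup I'm describing (the certificate path, password, and URL are placeholders for my actual values):

```csharp
using System;
using System.Net.Http;
using System.Security.Cryptography.X509Certificates;
using System.Threading.Tasks;

class Program
{
    static async Task Main()
    {
        var handler = new WebRequestHandler();
        // Explicitly opt into manual certificate selection.
        handler.ClientCertificateOptions = ClientCertificateOption.Manual;
        // Attach the self-signed client certificate (placeholder path/password).
        handler.ClientCertificates.Add(
            new X509Certificate2("client.pfx", "password"));

        using (var client = new HttpClient(handler))
        {
            // Despite the certificate being in the collection, packet capture
            // shows no certificate in the TLS handshake when it doesn't chain
            // to a root the server trusts.
            var response = await client.GetAsync("https://example.com/api");
            Console.WriteLine(response.StatusCode);
        }
    }
}
```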
Is there a way to force `HttpClient` to send a client certificate when one is available in the `ClientCertificates` collection, even though it appears that it will not validate on the server side? I'm open to both straightforward and dirty solutions (reflection hacks) as I really need this client to work.