I have an issue and can't find a definitive answer to it.
I have an ASP.NET MVC 5 application targeting .NET Framework 4.6.1, and its goal is to work with third-party APIs that are secured by the TLS 1.1/TLS 1.2 protocols.
I have run my application in two environments:
- my local machine: Windows 10 with .NET Framework 4.6.2, IIS Express;
- the server machine: Windows Server 2012 with .NET Framework 4.6.1, IIS 8.0.
The issue is that when I start the application locally, the default value of ServicePointManager.SecurityProtocol is Ssl3, Tls, so I can't reach the APIs and have to opt in to TLS 1.1/TLS 1.2 at application start: ServicePointManager.SecurityProtocol = SecurityProtocolType.Tls11 | SecurityProtocolType.Tls12;
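A minimal sketch of that workaround in Global.asax.cs (the |= variant is my own adjustment: it ORs the flags into whatever the runtime already enabled instead of overwriting it):

    using System.Net;
    using System.Web;

    public class MvcApplication : HttpApplication
    {
        protected void Application_Start()
        {
            // Opt in to TLS 1.1/1.2 without dropping protocols the runtime
            // enabled by default (|= rather than =).
            ServicePointManager.SecurityProtocol |=
                SecurityProtocolType.Tls11 | SecurityProtocolType.Tls12;

            // ... the usual MVC registration calls go here ...
        }
    }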
When the application runs on the server, the default value of ServicePointManager.SecurityProtocol is Tls, Tls11, Tls12, so it works well.
According to the documentation, applications running on .NET Framework 4.6 or above should use TLS 1.1/TLS 1.2 by default, which is exactly what happens on the remote machine.
Why are the default values of ServicePointManager.SecurityProtocol different? Is it caused by .NET Framework configuration, or maybe by registry settings? I have searched through both but couldn't find an answer.
We can update the registry as shown below to let the .NET Framework use TLS 1.1/TLS 1.2; a restart is needed. I've tried it, and the value of ServicePointManager.SecurityProtocol on my machine changed from "Ssl3, Tls" to "Tls, Tls11, Tls12":
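The original registry snippet is not reproduced here; to my knowledge, the values that produce exactly this change are the SchUseStrongCrypto flags from Microsoft's TLS guidance, sketched below as a .reg file (the Wow6432Node entry covers 32-bit processes on 64-bit Windows):

    Windows Registry Editor Version 5.00

    ; Tell the .NET Framework 4.x runtime to prefer strong crypto,
    ; which changes the SecurityProtocol default to Tls, Tls11, Tls12.
    [HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\.NETFramework\v4.0.30319]
    "SchUseStrongCrypto"=dword:00000001

    [HKEY_LOCAL_MACHINE\SOFTWARE\Wow6432Node\Microsoft\.NETFramework\v4.0.30319]
    "SchUseStrongCrypto"=dword:00000001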
For reference:
- MSDN: ServicePointManager.SecurityProtocol Property
- MSDN Blogs: Support for SSL/TLS protocols on Windows
- MSDN: Cipher Suites in TLS/SSL (Schannel SSP)
In other words: this is determined by your Windows version and its patch level.
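If you want to verify what a given machine resolves to, a quick check is to print the value at startup (a minimal console sketch; target the same framework version as your web app for a comparable result):

    using System;
    using System.Net;

    class Program
    {
        static void Main()
        {
            // Prints the effective default, e.g. "Ssl3, Tls" or "Tls, Tls11, Tls12".
            Console.WriteLine(ServicePointManager.SecurityProtocol);
        }
    }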
But as @Damien said, why would you care what the default level is?