OpenSSL mission vs legacy systems
I see, time and again, that weak security defaults are protected from change because changing them would break legacy systems, and I have trouble understanding how that squares with the OpenSSL mission:
“We believe everyone should have access to security and privacy tools, whoever they are, wherever they are or whatever their personal beliefs are, as a fundamental human right.”
For example in the case of RSA key lengths:
“However this would break even more legacy use cases. To allow them but not allow generating insecure keys by default these minimum (and possibly maximum) key sizes should be made configurable.”
https://github.com/openssl/project/issues/809
I understand that said legacy systems are the “.., whoever they are, wherever they are or whatever their personal beliefs are,..” part of the mission statement.
But in my mind “.. everyone should have access to security and privacy tools ..” means that we deliver something that actually IS secure to use.
Is there a strategy for dealing with these security issues or are they dealt with case by case?
Am I the only one feeling that we sometimes try to serve use cases which, in reality, don’t make sense?
Michael Baentsch · Mon 23 Mar 2026 5:51AM
I whole-heartedly agree. IMO the OpenSSL Mission should by default be read as "providing access to [arguably needless to say, but by default secure] security and privacy tools".
For those not willing to read a lot of arguments, I'd simply say: Can you really justify not landing https://github.com/openssl/openssl/pull/25094 for years? I couldn't.
With more words: One might argue that the mission only promises to provide some security, particularly to those who don't have any, but I'd think that'd be mincing words... possibly justified by a 1990s stance to overcome US (crypto-as-"weapons") export regulations, but totally outdated given today's plethora of crypto code bases heeding and pushing the state of the art (also in breaking such code bases :).
So, what would be wrong with setting the bar in the default code base as high as the state of the art demands, while still permitting lower bars via explicit defines, e.g. "LEGACY_SECURITY_SETTING_DO_NOT_USE", to cater for the legacy cases? In the quoted RSA key length case, put such a define guard around an `assert` for minimum sensible key lengths?
If this requires more test configs for the commercial OpenSSL users who pay the bills and demand this legacy support, I'd say fine -- why not? Generating more tests (checking legacy settings) doesn't sound like an effort the core team should undertake, but rather something a reasonably well-trained AI could do without much effort (building and testing the system with a define like the one proposed above set), no?
And finally a Fortune Cookie "wisdom": "If you keep doing what you always did, you keep getting what you always got". In this case, being afraid to break legacy code practically guarantees being as broken (also in the cryptographic sense) as such legacy (code and security configs). And that does not sound like something the Mission wants to achieve.
Viktor Dukhovni · Mon 23 Mar 2026 7:07AM
It is worth keeping in mind that security advances primarily by raising the ceiling, rather than the floor. More thoughts on that in https://datatracker.ietf.org/doc/html/rfc7435
Specifically, we advance the mission by adding support for, and preferring stronger algorithms. For example, OpenSSL 3.5 and later prefer X25519MLKEM768 over other TLS key exchange groups and servers will issue a Hello Retry Request (HRR) when a client includes that group in its SupportedGroups TLS extension, but does not speculatively include a corresponding keyshare. OpenSSL 3.5 and later clients do in fact send that keyshare by default.
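As a hedged illustration of "preferring stronger algorithms": a deployment that wants to pin its preferred TLS key-exchange groups system-wide (rather than rely on the library default order) can do so with an openssl.cnf fragment like the one below. The group list is an example, not a recommendation, and assumes an OpenSSL version (3.5+) that knows X25519MLKEM768.

```
# openssl.cnf fragment (illustrative): make the hybrid PQ group the
# first preference for TLS key exchange system-wide.
openssl_conf = openssl_init

[openssl_init]
ssl_conf = ssl_sect

[ssl_sect]
system_default = system_default_sect

[system_default_sect]
Groups = X25519MLKEM768:X25519:P-256
```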
Changes have also been made to avoid SHA1 in TLS, with TLS 1.0 and 1.1 now only supported if the legacy provider is loaded and the cipher security level is explicitly set to `0`.
The default security level 1 is notionally 80-bit or better symmetric equivalent ciphers, and certificate signatures, ... So RSA keys are 1024-bit or higher, and if you want or need security level 2 (IIRC 112-bit or better with 2048-bit RSA), it is available...
See https://github.com/openssl/project/issues/809#issuecomment-4108466004 for some additional comments.
Frederik Wedel-Heinen · Mon 23 Mar 2026 4:51PM
Okay, so from the case you make in the GitHub issue, I understand the OpenSSL mission is to allow users to verify a signature made with an RSA key that might have been compromised. I don’t understand the security in that.
Another security aspect is to limit attack surfaces and limit software complexity.
Again my concerns are not only on RSA, but other aspects of the code base as well.
I think what I am suggesting is to set up criteria like we do for the addition of algorithms and protocols. I’m assuming that a new cryptographic algorithm must be standardized before it is introduced into the codebase?
What are the criteria for removing an algorithm, apart from the API stability guarantees of major versions? Do we have such a policy? I’m asking because the complexity of the codebase is a weakness in terms of both security and maintainability.
I’m trying not to step on toes but I don’t see the clear direction.
We can always find use cases for keeping stuff around, but at some point we need to let go. The alternative always exists: download an old copy or release branch and build your own copy of a once-secure system.
I really think the latest version should be secure to the bone.
Jon Ericson · Mon 23 Mar 2026 7:23PM
There's a human psychology angle to this question. I gotta assume that the vast majority of people who are tasked with maintaining systems that include OpenSSL have no particular knowledge of data security. The OpenSSL Library is just one of innumerable dependencies and they will take the path of least resistance when it comes to cryptography.
So imagine this sort of person upgrading to the latest OpenSSL release and it stops working for them. What we'd like to happen is for them to say "Oh! My system isn't secure so I better do the work to fix that. Thank you OpenSSL for pointing that out!" But we all know that what they will actually say is something like "Oh! OpenSSL is broken. Guess I won't update after all." If that happens, it's sorta the worst of both worlds: they still use broken cryptography and they don't get the benefits from upgrading to the latest OpenSSL version.
That isn't to say we shouldn't remove broken stuff. It's rather to say we need to find ways to bring along people who really don't want to deal with this stuff. I think about how Google pushed a lot of sites (including ones I was working with at the time) from HTTP to HTTPS. It was a combination of carrot and stick. First HTTPS sites were given a boost in the Google search algorithm, and then Chromium began putting in roadblocks to HTTP content. Progress is slow, but there has been progress!
The other important bit is that there's been a lot of education about how and why sites should move to HTTPS. A decade ago I didn't see the value for essentially static sites such as my own blog. Now I'm sorta appalled when I bump into one of the small minority of sites that are holding out. (Mostly, I suspect, because the people who created the sites don't maintain them anymore.) It's also gotten easier to set up HTTPS because it's usually the default anyway these days.
I'm no authority about how to do this in the OpenSSL Library, but I think the path should be:
1. Make it easier to implement secure algorithms.
2. Make it harder to use insecure algorithms.
3. Lay out a transition plan to take into account the apathy a lot of people have about security.
4. Educate, educate, educate.
5. Be aware of factors outside of our control that might limit progress.
I'm personally excited about the education bit, as you might have gathered. Step one could be to put together a list of things that are not secure and are still encountered in the wild. Too-small RSA keys would be on the top of that list.
Frederik Wedel-Heinen · Thu 26 Mar 2026 6:48AM
I’m just advocating that our mission is to provide security tools, so users should feel like they are doing something wrong if their use case is insecure.
What we'd like to happen is for them to say "Oh! My system isn't secure so I better do the work to fix that. Thank you OpenSSL for pointing that out!" But we all know that what they will actually say is something like "Oh! OpenSSL is broken. Guess I won't update after all."
I think the truth lies somewhere in between. A professional would try to understand what went wrong and then decide whether they have to update the system. If the choice is “no”, then it is probably fine, because their use case is no longer secure anyway.
I’ve tried to do a scan of what caught my eyes when inspecting the latest master and here are some findings:
Ciphers:
Blowfish
DES
3DES
RC2
RC4
IDEA
RSA 512, 1024
Hash:
SHA-1
Whirlpool
MD2
MD4
MD5
Protocols:
SSL 3.0
TLS 1.0
TLS 1.1
DTLS 1.0
DTLS1_BAD_VER
TLS features:
Anonymous Diffie-Hellman ciphersuites
TLS compression
They are all insecure to some degree, ranging from “bad idea” through “flat-out broken” to “why are you seriously still doing this”.
There might be more that I missed.
Paul Dale · Thu 26 Mar 2026 6:51AM
Some are still useful even though they are weak. For example, MD5 is still used in non-cryptographic ways. I can't say this about all (or even most) of them of course.
Tomas Mraz · Thu 26 Mar 2026 9:05AM
@Frederik Wedel-Heinen I do not think we want to drop the concept of legacy provider and legacy algorithms. They still have their use cases and should be available (not by default, but if you know what you are doing, you should be able to do it).
BTW SSL 3.0 support was removed in 4.0/master.
We should also keep the principle that we remove things in major versions and not in minor ones.
As for the minimum RSA key lengths or legacy small EC parameters - I think we could do something with them but it requires a solution that won't break legacy use cases. Maybe having these implemented in the legacy provider might work although it would be non-trivial.
Angel Yankov · Thu 26 Mar 2026 9:38AM
Just a datapoint on this. Completely agree with Tomas. Not enabled by default but having the option to do so, if you know what you are doing.
A lot of "broken" ciphers/hashes do indeed keep being used in non-cryptographic ways. Not just MD5. SQL Server requires MD2 support.
More recently: SHA-1 might be broken for most cryptographic purposes, but it will take years to move off from it. Even git still defaults to SHA-1 for commit hashes.
Frederik Wedel-Heinen · Thu 26 Mar 2026 10:37AM
As I state in the original post I do understand that we can’t pull the plug tomorrow but I’m looking for the strategy to support the mission.
I gather from this discussion that: yes, we do care; we have the legacy provider, which supports one kind of segregation, and compile-time configs, which support another.
What I am missing is perhaps an overview of which decisions have been made and which have not. This relates very much to Jon’s comment about transition plans and education.
Would it be worthwhile to have some sort of documentation about this? I can prepare something which could serve as a starting point.
Paul Dale · Mon 23 Mar 2026 4:31AM
I agree. I know the reasons behind the current stance, but, IMO, the project is too conservative about making breaking changes. I wouldn't advocate a free-for-all, but some more flexibility would be beneficial, both for security and for the wider perception of the project.