Items tagged with: jose
One of the criticisms of the JOSE/JWT standards is that they give an attacker too much flexibility, by allowing them to specify how a message should be processed. In particular, the standard “alg” header tells the recipient what cryptographic algorithm was used to sign or encrypt the contents. Letting the attacker choose the algorithm can lead to disastrous security vulnerabilities.
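To see why, consider a deliberately naive verifier (a sketch of the anti-pattern, not any particular library) that dispatches on the header’s “alg” value. An attacker who controls the header controls the whole verification path:

```python
import base64, hashlib, hmac, json

def b64url_decode(s: str) -> bytes:
    # Re-add the padding that JOSE's base64url encoding strips.
    return base64.urlsafe_b64decode(s + "=" * (-len(s) % 4))

def insecure_verify(token: str, key: bytes) -> dict:
    """Anti-pattern: let the message's own header choose the algorithm."""
    header_b64, payload_b64, sig_b64 = token.split(".")
    header = json.loads(b64url_decode(header_b64))

    if header["alg"] == "none":
        # The attacker can simply strip the signature.
        return json.loads(b64url_decode(payload_b64))
    if header["alg"] == "HS256":
        # If `key` is really an RSA *public* key (PEM bytes), anyone who knows
        # it can compute this MAC: the classic RS256 -> HS256 confusion.
        mac = hmac.new(key, f"{header_b64}.{payload_b64}".encode(),
                       hashlib.sha256).digest()
        if hmac.compare_digest(mac, b64url_decode(sig_b64)):
            return json.loads(b64url_decode(payload_b64))
    raise ValueError("invalid token")
```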
In response to this, some people have suggested alternatives such as removing the alg header and instead only allowing a version and a type from a very limited set. While this seems better, it is actually just a variation on the same mistake. If an attacker can still pick the version then they can still potentially cause mischief.
My opinion is that it is better to remove any indicators from the message about how it should be processed – no “alg”, no purpose indicator, no version numbers. Instead the message should (optionally) allow specifying just one hint: a key id (“kid” header in JOSE). The metadata associated with the key should then tell you how to process messages with that key. For instance, the related JWK spec already allows specifying the algorithm that the key is to be used for. The recipient looks up the key and processes the message according to the metadata.
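As a rough sketch of what key-driven processing looks like (the key store, helper and names here are illustrative, not part of any spec or library), the only thing taken from the message is the key id; everything else comes from our own key metadata:

```python
import base64, hashlib, hmac, json

def b64url_decode(s: str) -> bytes:
    return base64.urlsafe_b64decode(s + "=" * (-len(s) % 4))

# Our key store: the metadata on each key, not the message, says how that
# key is to be used. Ids and key bytes are placeholders.
KEYS = {
    "2021-04-hs256": {"alg": "HS256", "k": b"<32 random bytes>"},
}

def verify(token: str) -> dict:
    header_b64, payload_b64, sig_b64 = token.split(".")
    header = json.loads(b64url_decode(header_b64))

    key = KEYS[header["kid"]]          # unknown kid: KeyError, reject
    if key["alg"] != "HS256":          # dispatch only on *our* metadata
        raise ValueError("unsupported key")

    mac = hmac.new(key["k"], f"{header_b64}.{payload_b64}".encode(),
                   hashlib.sha256).digest()
    if not hmac.compare_digest(mac, b64url_decode(sig_b64)):
        raise ValueError("invalid signature")
    return json.loads(b64url_decode(payload_b64))
```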
If you want to change the algorithm then you simply publish a new key with the new algorithm. It is best practice to use a new key when switching algorithms anyway. Even in the case where the old key would be compatible with the new algorithm, security proofs often assume that a key is only used for one algorithm. If you really must use the same key with more than one algorithm, then you could just publish more than one JWK, referencing the same key but with different metadata.
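For that last case, the “two JWKs, one key” approach might look something like the following JWK Set (all values are placeholders):

```python
# Illustrative JWK Set: the same octet key published under two ids, each
# pinned to exactly one algorithm via the JWK "alg" field.
jwk_set = {
    "keys": [
        {"kty": "oct", "kid": "2021-04-hs256", "alg": "HS256",
         "use": "sig", "k": "<base64url-encoded key bytes>"},
        {"kty": "oct", "kid": "2021-04-hs512", "alg": "HS512",
         "use": "sig", "k": "<base64url-encoded key bytes>"},
    ]
}
```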
(Update added 26th April 2021): In the context of a protocol like TLS, where parties do not necessarily know each other’s key identifiers ahead of time, you can imagine baking the algorithm details into the certificate. For example, suppose that a TLS ciphersuite identifier was baked directly into each certificate. The client says which ciphersuites it supports, the server responds with a certificate that matches one of them, and they use the exact ciphersuite named in the certificate. CAs can then perform useful functions like ensuring that deprecated ciphersuites aren’t used in new certificates, or that the same key is not used for multiple ciphersuites in potentially dangerous combinations.
A digression on capability security
I have always found an object-capability (ocap) security analysis to be illuminating, even when not designing an ocap system. For instance, consider these two ways of copying a file on a UNIX system (I can’t remember which paper I read this example in, but it is not my own):
cp from to
cat <from >to
The first (cp) takes an input filename and an output filename and opens the two files for you. The second (cat) instead takes an already opened pair of file descriptors and reads from one and writes to the other. If we think about the permissions that these programs need, the first needs to be able to read and write to any files that you might ask it to, while the second only needs to read and write to the specific file descriptors that you passed it as arguments.
The first approach is much more susceptible to a class of security vulnerabilities known as confused deputy attacks. The original example was a compiler service that accepted programs to compile and an output file to write the compiled object to. By passing in a carefully chosen output file (say /etc/passwd) you can trick the compiler into performing malicious actions that you would not be able to perform by yourself (overwriting sensitive files). Confused deputy attacks are widespread. Both path traversal and CSRF attacks are examples.
By bundling up the permissions to perform an operation with an unforgeable reference to an object to perform those actions on, a capability-based system makes it impossible to even express confused deputy attacks: you cannot forge a writeable reference to a file that you do not yourself have write access to. The same principle is used in CSRF countermeasures (which are in fact largely capability-based) – the anti-CSRF token prevents an attacker forging a request. The ocap literature shows how in this model, secure design closely follows normal best practice software design principles: encapsulation, dependency injection, eliminating global variables and static methods with side-effects, and so on.
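To make the contrast concrete, here is a toy rendering of the two designs (using the compiler service from the confused deputy example; the code itself is just an illustration):

```python
from typing import IO

# Deputy style: the service takes a file *name* and opens it with its own
# authority. A caller can name files it could never write itself
# (say /etc/passwd) and let the deputy do the damage.
def compile_to(source: str, output_path: str) -> None:
    with open(output_path, "w") as out:
        out.write(f"compiled: {source}\n")

# Capability style: the service takes an already-open, writable file object.
# The caller can only hand over files it was able to open itself, so the
# confused deputy attack cannot even be expressed.
def compile_into(source: str, out: IO[str]) -> None:
    out.write(f"compiled: {source}\n")
```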
So how does this relate to JOSE? By allowing the sender of a message to indicate a key to be used separately from specifying the algorithm to use with that key, JOSE is making exactly the same mistake that UNIX does, and opening up implementations to confused deputy attacks. Here the JOSE/JWT library is the deputy, and it can become confused about which algorithm to use with which key, just as a compiler can become confused about which files it should be writing to. If we remove the ability for an attacker to specify a key independently of specifying the algorithm, then we eliminate this whole class of attacks.
If we further only allow specifying the key by id, then the key id (“kid” in JOSE) becomes a capability. That capability can either be public (a simple name) or private (an unguessable random string). In either case, an attacker must use one of our set of valid keys and cannot confuse our deputy by creating new keys or specifying incompatible algorithms.
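Making the kid private is as simple as generating it randomly when the key is created, for example:

```python
import secrets

# An unguessable key id: the kid itself becomes a capability that only
# parties we have shared it with can present.
kid = secrets.token_urlsafe(32)   # 256 bits of randomness
```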
https://neilmadden.blog/2018/09/30/key-driven-cryptographic-agility/
#cryptography #jose #jwt #objectCapabilitySecurity
In the wake of some more recent attacks against popular JSON Web Token (JWT)/JSON Object Signing and Encryption (JOSE) libraries, there has been some renewed criticism of the JWT/JOSE standards themselves (see also discussion on lobste.rs with an excellent comment from Thomas Ptacek summarising some of the problems with the standard). Given these criticisms, should you use JOSE at all? Are articles like my recent “best practices” one just encouraging adoption of bad standards that should be left to die a death?

Certainly, there are lots of potential gotchas in the specs, and it is easy for somebody without experience to shoot themselves in the foot using these standards. I agree with pretty much all of the criticisms levelled against the standards. They are too complicated with too many potentially insecure options. It is far too easy to select insecure combinations or misconfigure them. Indeed, much of the advice in my earlier article can be boiled down to limiting which options you use, understanding what security properties those options do and do not provide, and completely ignoring some of the more troublesome aspects of the spec. If you followed my advice of using “headless” JWTs and direct authenticated encryption with a symmetric key, you’d end up not far off from the advice of just encrypting a JSON object with libsodium or using Fernet.
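For comparison, the Fernet version of that advice really is only a few lines (using the Fernet recipe from the Python cryptography package; the claims shown are illustrative):

```python
import json
from cryptography.fernet import Fernet

key = Fernet.generate_key()          # distribute/store this out of band
f = Fernet(key)

claims = {"sub": "alice", "scope": "read"}
token = f.encrypt(json.dumps(claims).encode())   # opaque, authenticated token

# On receipt, authenticity and an expiry window are checked for you.
received = json.loads(f.decrypt(token, ttl=3600))
```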
So in that sense, I am already advocating for not really using the specs as-is, at least not without significant work to understand them and how they fit with your requirements. But there are some cases where using JWTs still makes sense:
- If you need to implement a standard that mandates their use, such as OpenID Connect. In this case you do not have much of a choice.
- If you need to interoperate with third-party software that is already using JWTs. Again, in this case you also do not have a choice.
- You have complex requirements mandating particular algorithms/parameters (e.g. NIST/FIPS-approved algorithms) and don’t want to hand-roll a message format, or are required to use something backed by a “standard”. In this case, JWT/JOSE is not a terrible choice, so long as you know what you are doing (and I hope you do if you are in this position).
If you do have a choice, then you should think hard about whether you need the complexity of JWTs at all, whether you can use a simpler approach that takes care of most of the choices for you, or whether you can simply store state on the server and use opaque cookies. In addition to the options mentioned in the referenced posts, I would also like to mention Macaroons, which can be a good alternative for some authorization token use-cases, and whose existing libraries tend to build on solid foundations (libsodium/NaCl).
So, should you use JWT/JOSE at all? In many cases the answer is no, and you should use a less error-prone alternative. If you do need to use them, then make sure you know what you are doing.
https://neilmadden.blog/2017/03/15/should-you-use-jwt-jose/
#cryptography #jose #jwt