Hopefully, when OAuth 2.1 is released, OpenID Connect will be updated to be based on OAuth 2.1. This would make some of the useful advice in FAPI (like PKCE) mandatory. A lot of the FAPI stuff that is not included in OAuth 2.1 or the OAuth BCP is just over-engineering by wannabe cryptographers, bad advice, or at least useless advice.
Knowing the OpenID Foundation, this could end up as yet another undocumented errata set, but we can still dream of a better world, can't we? In a better world, instead of "Use 2048-bit RSA keys" the spec will say "Don't use RSA ever."
The advanced FAPI profile has even more directly bad advice, such as requiring PS256 and ES256. Now, these are not as bad as the common RS256 (RSA with PKCS#1 v1.5 padding), but they are still bad algorithms. The only good asymmetric algorithm defined in JWS is EdDSA, which, just like that, is forbidden by OIDC FAPI. So I'm quite happy FAPI is just a profile that would mostly be ignored.
It looks like FAPI 2.0 was finally released in December, and thankfully it killed off most of the excesses of FAPI 1.0 and is better aligned with OAuth 2.0.
At this point the main differences are:
1. PAR (Pushed Authorization Requests): A good idea that should become a part of the OAuth standard, even if it costs an extra RTT. It prevents a pretty large class of attacks (a rough sketch follows this list).
2. "iss" response parameter becomes mandatory (it is a core part of OAuth 2.1, but considered optional). This is a useful measure against mix-up attacks in certain conditions, but the conditions that enable it are less common than the ones
3. Requires either Mutual TLS or DPoP. I am less sold on that.
Mutual TLS is great for high-security contexts, since it prevents man-in-the-middle attacks and generally comes with guaranteed key rotation. But mTLS is still quite troublesome to implement. DPoP is easier, but of more questionable value: it doesn't fully protect against MitM, keys are rarely rotated, it is susceptible to replay attacks unless you take costly measures, and it relies on JWT being implemented securely, by a developer who understands how not to shoot themselves in the foot with their brand new JWT Mark II shotgun (a DPoP proof is sketched after this list).
4. Which brings us to cryptographic algorithm usage guidelines. These are not part of OAuth 2.1, since OAuth does not mandate or rely on any cryptography, with the sole exception of the SHA-256 hash used for PKCE (also shown below).
This is good design. When there is an alternative that doesn't require cryptography (such as stateful tokens or the authorization code flow), it is generally more secure. You have one less algorithm to worry about being broken (e.g. by advances in quantum computing).
For what it's worth, the guidelines are okay, but not good enough. RSA is still allowed. Yes, it requires PSS and 2048-bit keys, but there are knobs left that you can use to generate valid but insecure RSA keys (e.g. a weak exponent). With EdDSA there is no such option: weak keys are impossible to generate (compare the two in the last sketch below). Considering EdDSA is also faster, has smaller signatures and better security, there is no good reason to use RSA (and, to a lesser degree, ECDSA) anymore.
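To make items 1 and 2 concrete, here is a rough Python sketch of a PAR exchange (RFC 9126) followed by the "iss" check (RFC 9207) on the callback. The endpoints, client id and secret are made up, and it leans on the requests library, so treat it as an illustration of the shape of the flow rather than a reference client:

    import secrets
    import requests
    from urllib.parse import urlencode, parse_qs, urlparse

    ISSUER = "https://as.example.com"        # hypothetical authorization server
    PAR_ENDPOINT = ISSUER + "/par"
    AUTHZ_ENDPOINT = ISSUER + "/authorize"

    state = secrets.token_urlsafe(32)

    # 1. Push the real authorization request parameters over the back channel.
    resp = requests.post(
        PAR_ENDPOINT,
        data={
            "client_id": "my-client",
            "response_type": "code",
            "redirect_uri": "https://client.example.com/cb",
            "scope": "openid payments",
            "state": state,
        },
        auth=("my-client", "client-secret"),
    )
    resp.raise_for_status()
    request_uri = resp.json()["request_uri"]  # short-lived, single-use handle

    # 2. The front channel only ever sees the opaque request_uri, so the
    #    browser cannot read or tamper with the actual request parameters.
    redirect_to = AUTHZ_ENDPOINT + "?" + urlencode(
        {"client_id": "my-client", "request_uri": request_uri})

    # 3. On the redirect URI, refuse the response unless "iss" names the
    #    server we actually sent the user to (mix-up attack protection).
    def handle_callback(callback_url):
        params = {k: v[0] for k, v in parse_qs(urlparse(callback_url).query).items()}
        if params.get("iss") != ISSUER:
            raise ValueError("unexpected issuer, possible mix-up attack")
        if params.get("state") != state:
            raise ValueError("state mismatch")
        return params["code"]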
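And here is roughly what a DPoP proof (RFC 9449) looks like on the client side, assuming PyJWT and the cryptography package; the claim names come from the RFC, everything else is illustrative. Note the replay protection only exists if the server bothers to track the jti values, which is exactly the kind of costly measure mentioned above:

    import json
    import time
    import uuid
    import jwt  # PyJWT
    from cryptography.hazmat.primitives.asymmetric import ec

    # Per-client key pair; in practice it tends to live forever, unrotated.
    private_key = ec.generate_private_key(ec.SECP256R1())

    def dpop_proof(method, url):
        public_jwk = json.loads(
            jwt.algorithms.ECAlgorithm.to_jwk(private_key.public_key()))
        return jwt.encode(
            {
                "jti": str(uuid.uuid4()),  # server must remember these to stop replays
                "htm": method,             # HTTP method the proof is bound to
                "htu": url,                # URL the proof is bound to
                "iat": int(time.time()),
            },
            private_key,
            algorithm="ES256",
            headers={"typ": "dpop+jwt", "jwk": public_jwk},
        )

    # Goes out as an extra header on the token request or API call:
    # {"DPoP": dpop_proof("POST", "https://as.example.com/token")}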
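The PKCE hash in question is about as small as cryptography in a protocol gets; a standard-library-only sketch:

    import base64
    import hashlib
    import secrets

    code_verifier = secrets.token_urlsafe(64)       # stays with the client
    code_challenge = base64.urlsafe_b64encode(
        hashlib.sha256(code_verifier.encode("ascii")).digest()
    ).rstrip(b"=").decode("ascii")                  # sent in the authorization request

    # The token request later carries the plain code_verifier and the
    # authorization server just recomputes and compares the hash.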
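The "no knobs" point is easy to see with the cryptography package (plus PyJWT for the JWS part): Ed25519 generation takes no parameters at all, while RSA generation hands you the exact dials the guidelines then have to legislate about.

    import jwt  # PyJWT
    from cryptography.hazmat.primitives.asymmetric import ed25519, rsa

    ed_key = ed25519.Ed25519PrivateKey.generate()   # nothing to configure, nothing to get wrong
    token = jwt.encode({"sub": "123"}, ed_key, algorithm="EdDSA")  # the JWS "EdDSA" alg

    rsa_key = rsa.generate_private_key(
        public_exponent=65537,  # pass 3 here and you still get a "valid" key
        key_size=2048,          # FAPI's floor, and another dial to get wrong
    )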
In short, in an ideal world I think I would just want OAuth 2.1 to incorporate PAR and make the "iss" response parameter mandatory. The cryptographic (JOSE) parts of the specification seem to me like too much added complexity, for too little gain, with too little in the way of making cryptography safe.
The OpenID Foundation seems to have taken the path of making "profiles" like FAPI rather than consolidating, enforcing the best practices, and deprecating the bad ones.
FAPI (Financial-grade API Security Profile 1.0) https://openid.net/specs/openid-financial-api-part-1-1_0.htm...
I hope the community will combine it all at some point and add specifications for proper policy and resources management too by looking at the full lifecycle of modern applications.