I wouldn’t call this “incoherent”; rather, I propose the term “vendor subvertable”.
Yes, any time a software vendor has direct update capabilities, a targeted update can bypass the encryption that software provides.
In practice, we tend to delegate to a third party such as an OS distribution packager, where the delay between a vendor release and packaging gives a malicious update a chance to be discovered.
Another good reason to use open source for core cryptography libraries; any code a vendor supplies should also be open and reproducibly built.
> A cryptosystem is incoherent if its implementation is distributed by the same entity which it purports to secure against.
I don’t see why this would be true. The real problem with the cited examples is that every program downloads the latest implementation every time you want to use it. You could download the software once and then verify it is safe (by auditing the code, etc.) without any problem, and the security of the channel you got it over wouldn’t matter. The issue is that you can’t freeze the state of the application: the server can modify the code that runs every time you use it.
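To make the "download once and then verify" point concrete, here is a minimal sketch (not how any of the cited apps actually work) of pinning fetched code to a digest recorded at audit time, using the browser's Web Crypto API. The URL and digest are placeholders:

    // Minimal sketch: fetch the client code once, then refuse to run it unless
    // its SHA-256 matches a digest pinned at audit time. The digest below is a
    // placeholder, not a real release hash.
    const PINNED_SHA256 =
      "0000000000000000000000000000000000000000000000000000000000000000";

    async function loadPinnedScript(url: string): Promise<string> {
      const response = await fetch(url);
      const bytes = await response.arrayBuffer();

      // Hash the exact bytes we received; once the digest matches, the
      // security of the channel we fetched over no longer matters.
      const digest = await crypto.subtle.digest("SHA-256", bytes);
      const hex = Array.from(new Uint8Array(digest))
        .map((b) => b.toString(16).padStart(2, "0"))
        .join("");

      if (hex !== PINNED_SHA256) {
        throw new Error(`refusing to run: expected ${PINNED_SHA256}, got ${hex}`);
      }
      return new TextDecoder().decode(bytes);
    }

The hard part is exactly what you describe: a normal web app never gives you a point at which to pin, because the server ships fresh code on every load.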
Yes, web crypto requires trusting the server and is not secure if your threat model includes its compromise (or that of the CA). ProtonMail recognizes this[1] and offers native open-source clients. They also mitigate the issue somewhat by using an SPA for the web client, which reduces fetches from the server.
[1] https://vimeo.com/216747532 (2017 presentation by ProtonMail's CTO making essentially this point, at 50:41)
There are some efforts to use extensions to allow signing/verification of web assets (assuming you trust the extension/browser), some via third parties; a rough sketch of the shared idea follows the links below:
https://github.com/tasn/webext-signed-pages
https://github.com/jahed/webverify
https://github.com/facebookincubator/meta-code-verify
There was another one posted here recently, but I'm unable to find it now.
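For a rough idea of the mechanism these extensions share (each project differs in the details, so treat this as an illustration rather than any project's actual code): verify a publisher signature over the page's bytes against a public key pinned in the extension, e.g. via the Web Crypto API:

    // Illustrative only: check a publisher's signature over the received page
    // bytes using a public key that ships pinned inside the extension, so the
    // server that delivered the page cannot also supply the key.
    async function verifyPage(
      pageBytes: ArrayBuffer,
      signature: ArrayBuffer,
      pinnedPublisherKey: ArrayBuffer, // raw ECDSA P-256 public key, bundled with the extension
    ): Promise<boolean> {
      const key = await crypto.subtle.importKey(
        "raw",
        pinnedPublisherKey,
        { name: "ECDSA", namedCurve: "P-256" },
        false,
        ["verify"],
      );
      return crypto.subtle.verify(
        { name: "ECDSA", hash: "SHA-256" },
        key,
        signature,
        pageBytes,
      );
    }

The projects above differ mainly in what exactly gets signed (individual assets vs. a manifest of hashes) and how keys are distributed and rotated.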
Probably WEBCAT from the SecureDrop team: https://securedrop.org/news/introducing-webcat-web-based-cod...
That's the one! Thanks
Possibly https://github.com/freedomofpress/webcat? (I also made one, a long time ago: https://github.com/twiss/hcs-checker-firefox.)
There is also an effort to build this into browsers directly, which we (Proton) are involved with:
https://github.com/twiss/source-code-transparency (old initial proposal)
https://github.com/beurdouche/explainers/blob/main/waict-exp... (more recent proposal)
https://github.com/beurdouche/w3c-waict/blob/main/waict.bs (very early draft specification)
> A cryptosystem is incoherent if its implementation is distributed by the same entity which it purports to secure against.
That's what the recent Signal turmoil is about.
Communication via the Signal app would be safe if you could be sure it was compiled from verified open-source code, but Signal still doesn't provide any way, even in principle, to rule out the possibility that the distributors of client binaries insert a backdoor at the last minute.