
Trust is key.
Determining which software developers can be trusted is challenging.
What makes that determination easier... is when software developers tell you, point blank, that you cannot trust them.
The article goes on to assert which actions from developers would implicitly amount to "them telling you that you cannot trust them". But a trustworthy dev would:
  1. Tell you straight to not trust them, and
  2. Point you to the process they have in place and ask you for feedback on that
The problematic assertion, though, is the first one. "Trust" as depicted here has nothing to do with actual trust; it's about not doing your due diligence. If the (political) behavior of individual developers, or even groups of them, influences your trust in the product, then you're approaching this the wrong way: either you lack the skills, or you don't want to spend the time.
The reason is that trust is earned, and it's therefore more efficient to trust a process rather than an individual, since individuals have higher churn than (established, institutionalized) processes. A process earns trust because you audit it, and in open source you ideally do that by participating in it.
What options does the Bitcoin community have?
Participate in their development process. Don't trust, verify.
100 sats \ 3 replies \ @Row OP 2h
What options does the Bitcoin community have? Participate in their development process. Don't trust, verify.
Check Ken Thompson's "Reflections on Trusting Trust".
Audits are out of the question; the problem is precisely that, since the compiler is self-hosted, such an attack is not easily auditable. The scheme can hide malicious code without requiring any change to be published in the source, so you'd have to audit each and every binary release of the compiler.
It would be kind of solved with a second Rust compiler.
reply
"Reflections on Trusting Trust"
I'm familiar with it.
It would be kind of solved with a second Rust compiler.
Only if you use it.
reply
100 sats \ 1 reply \ @Row OP 2h
I'm familiar with it. It would be kind of solved with a second Rust compiler. Only if you use it.
That's the point: having a second compiler would allow you, as an auditor, to perform the cross-check.
In the current state you can't, and you have to resort to auditing every single binary.
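The cross-check a second compiler enables is known as diverse double-compiling (David A. Wheeler's defense against the Thompson attack): rebuild the compiler through an independent trusted chain and compare against the suspect compiler's self-compilation. A toy model of that check, with compilers and binaries faked as Python values (all names are illustrative, not real toolchain APIs):

```python
# Toy model of diverse double-compiling (DDC). "Binaries" are opaque byte
# strings; run() plays the role of the machine executing a compiler binary
# on some source file. All names here are illustrative.

CLEAN_CC = b"clean-compiler-binary"  # what the compiler source really describes
EVIL_CC = b"evil-compiler-binary"    # Thompson-style self-propagating backdoor
COMPILER_SRC = "compiler-source"     # the (clean-looking) compiler source code

def run(compiler_binary: bytes, source: str) -> bytes:
    """Execute a compiler binary on a source file, yielding an output binary."""
    if compiler_binary == CLEAN_CC:
        # Faithful translation: compiling the clean compiler source
        # reproduces a clean compiler binary.
        return CLEAN_CC if source == COMPILER_SRC else b"bin:" + source.encode()
    if compiler_binary == EVIL_CC:
        # The attack: the binary re-inserts itself whenever it recognizes
        # the compiler's own source, so the published source stays clean.
        if source == COMPILER_SRC:
            return EVIL_CC
        if source == "login-source":
            return b"bin:login-source+BACKDOOR"
        return b"bin:" + source.encode()
    raise ValueError("unknown binary")

def ddc_check(suspect: bytes, trusted: bytes) -> bool:
    """True if the suspect compiler survives diverse double-compiling:
    its self-compilation must bit-for-bit equal the compiler rebuilt
    via the independent trusted chain."""
    stage1 = run(trusted, COMPILER_SRC)       # trusted compiler builds the compiler
    via_trusted = run(stage1, COMPILER_SRC)   # that result rebuilds it again
    via_suspect = run(suspect, COMPILER_SRC)  # suspect compiler self-compiles
    return via_trusted == via_suspect

print(ddc_check(CLEAN_CC, CLEAN_CC))  # True: honest compiler passes
print(ddc_check(EVIL_CC, CLEAN_CC))   # False: the backdoor is exposed
```

In practice the "trusted" compiler can be any independent implementation (e.g. mrustc or gccrs for Rust), and the final comparison only works if builds are reproducible, so that equal inputs give bit-identical outputs.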
reply
I'm not arguing that you're wrong - to the contrary - I'm answering your question about the how: if you really want to use Rust, and there is no second compiler, or at least an -O0-like bootstrap compiler with which to compile a compiler, then you have to build said compiler and use it.
I simply read all this as "in the current state you cannot use it if you have any meaningful security requirement". But the compiler isn't even the biggest problem: Cargo is. If, besides the compiler, you also have to audit every diff of every release of every crate you use, would you still use Rust? Would it still be as great a language as the bird app cult likes us to believe? Have you tried reviewing crates? I have; results vary.
Remember "we have reason to believe that libsecp256k1 is better tested and more thoroughly reviewed than the implementation in OpenSSL", and GMax explaining it a bit. If we take this as the baseline Bitcoin developer mindset, then we can be pretty sure that some effort needs to go into Rust if we really want to use it.
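Tools like cargo-vet and cargo-crev exist for sharing and tracking crate audits, but the mechanical core of per-release review is just diffing unpacked versions. A minimal sketch of that part (helper names are mine, not any tool's API), listing the files an auditor would actually have to read for one version bump:

```python
# Estimate the review burden of one crate version bump: compare two
# unpacked source trees (e.g. extracted crate tarballs) by file hash
# and report every file that changed or was added.
import hashlib
import os

def tree_hashes(root: str) -> dict:
    """Map each file's path (relative to root) to its SHA-256 hex digest."""
    out = {}
    for dirpath, _, files in os.walk(root):
        for name in files:
            path = os.path.join(dirpath, name)
            rel = os.path.relpath(path, root)
            with open(path, "rb") as f:
                out[rel] = hashlib.sha256(f.read()).hexdigest()
    return out

def changed_files(old_root: str, new_root: str) -> list:
    """Files in the new tree that are absent from, or differ in, the old tree."""
    old, new = tree_hashes(old_root), tree_hashes(new_root)
    return sorted(p for p in new if old.get(p) != new[p])
```

Multiply the output of something like this by every crate in `cargo tree` and every release you pull in, and the scale of the auditing problem becomes concrete.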
reply