To an extent, all software is a black box. As a consequence, the commonly made claim that Bitcoin needs to be open source as a means to “free us from government” is false and problematic. It may be true that Bitcoin works best as an open-source system, and some say this allows us to see that Bitcoin is secure (Popper would attest to the impossibility of determining anything absolutely), but this adds no value. That said, Bitcoin is money. The most critical aspect of money is trust, and for trust to hold, people need to understand that there is nothing hidden.
The complete analysis of software is a computationally infeasible problem. Turing proved (with the halting problem) and Dijkstra later argued (testing can show the presence of bugs, never their absence) that the state of a system can never be fully known. You are making presumptions about the level of knowledge an open system offers and about the level of testing performed.
That said, Bitcoin has had bugs from the early code, and even open code is far from perfect. The main problem Bitcoin solved is double spending. The means used to achieve this aim are simple: it is a competitive system. The system is capitalist, but, unlike most capitalist systems that allow the incumbent leader (the dominant company) to seek to stop competition through regulatory capture, the global nature of Bitcoin defines a competitive system.
No system is perfect
Crystal-box testing is a better option for security-critical software (I have published papers on this in the past), but the option is not always (nor truly) available. What is missing is the complexity/simplicity issue. These issues are mixed with the issues of security. It is never possible (nor feasible) to absolutely know the state of an open system. With open code, you merely have a lower cost of testing and rectification.
This itself is critical.
It is not the protocol that needs to be tweaked, it is the software. The protocol is simple. Many do not like this, but it is a part of the beauty of Bitcoin. If you want to do more, you can do so in Script: a script is a predicate, and it can be seen to always terminate.
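The point about termination can be sketched as follows. This is not the real Bitcoin Script interpreter; it is a minimal, hypothetical stack-machine model whose opcodes (`op_equal` here) are illustrative only. Because the program is a finite list of operations with no loop or backward-jump opcodes, evaluation always halts after at most one step per opcode, and the result is a simple true/false predicate.

```python
def evaluate(script, stack=None):
    """Run a finite list of ops; return True iff the top of the
    stack is truthy at the end (the script is a predicate)."""
    stack = list(stack or [])
    for op in script:          # one forward pass, no loops or jumps,
        if callable(op):       # so termination is guaranteed
            op(stack)          # operators mutate the stack
        else:
            stack.append(op)   # data items are pushed as-is
    return bool(stack) and bool(stack[-1])

# A hypothetical opcode for demonstration only
def op_equal(stack):
    b, a = stack.pop(), stack.pop()
    stack.append(a == b)

# A toy "locking script": compare the input with an expected value
script = [42, op_equal]
print(evaluate(script, stack=[42]))  # True  — predicate satisfied
print(evaluate(script, stack=[7]))   # False — predicate fails
```

The design choice matters: a language that cannot loop cannot fail to terminate, which is why complexity can safely be pushed into scripts built on top of a simple base protocol.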
As for DoS (or a DDoS): There is always a way to DoS a system. The issue here is how much evidence you create and why you do it. Hit any system with a sustained attack from 1,000,000,000 bots, and it goes down. End of story.
This is not an argument about complexity impacting security.
What matters in the end is finding the best way to minimise the long-term risk and cost of securing a global money, or, worse still, the cost of failing to create adoption.
There are a number of maxims for the creation of a secure system in information technology. These all apply to Bitcoin. The paper “The Protection of Information in Computer Systems” by J. H. Saltzer and M. D. Schroeder [Proc. IEEE 63, 9 (Sept. 1975), pp. 1278–1308] was the watershed paper on this topic and provided the origins of the maxims that we take for granted today. It should be a guiding principle in coding Bitcoin, and as such, the complexity needs to be left to systems on top of Bitcoin (in Script etc).
These maxims are fundamental to how Bitcoin has been designed. They are:
- Economy of mechanism: keep the design as simple and small as possible.
- Fail-safe defaults: base access decisions on permission rather than exclusion.
- Complete mediation: every access to every object must be checked for authority.
- Open design: the design should not be secret.
- Separation of privilege: where feasible, a protection mechanism that requires two keys to unlock it is more robust and flexible than one that allows access to the presenter of only a single key.
- Least privilege: every program and every user of the system should operate using the least set of privileges necessary to complete the job.
- Least common mechanism: minimize the amount of mechanism common to more than one user and depended on by all users.
- Psychological acceptability: it is essential that the human interface be designed for ease of use, so that users routinely and automatically apply the protection mechanisms correctly.
The last maxim, psychological acceptability, is commonly overlooked. To make a security system work, it needs to be accepted by the people using it. If we make a system too complex, it will fail. If people perceive it as impeding their ability to do their job, they will find a way to bypass it. In Bitcoin Core, the developers never came to understand user acceptance.
These maxims are listed in the section of the paper by Saltzer and Schroeder under Design Principles. This section begins by stating: “Whatever the level of functionality provided, the usefulness of a set of protection mechanisms depends upon the ability of a system to prevent security violations. In practice, producing a system at any level of functionality (except level one) that actually does prevent all such unauthorized acts has proved to be extremely difficult. Sophisticated users of most systems are aware of at least one way to crash the system, denying other users authorized access to stored information. Penetration exercises involving a large number of different general-purpose systems all have shown that users can construct programs that can obtain unauthorized access to information stored within. Even in systems designed and implemented with security as an important objective, design and implementation flaws provide paths that circumvent the intended access constraints. Design and construction techniques that systematically exclude flaws are the topic of much research activity, but no complete method applicable to the construction of large general-purpose systems exists yet. This difficulty is related to the negative quality of the requirement to prevent all unauthorized actions”.
Risk is always relative
Absolute security does not exist, nor can it be achieved. Even ostensibly secure systems are vulnerable, as all systems possess a level of insecurity — an attacker with sufficient resources can always bypass controls. The goal is to ensure that the economic constraints placed upon the attacker exceed the perceived benefits to the attacker in order to mitigate the risk. The difficulty, however, lies in quantifying these risks to information systems.
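The economic framing above can be made concrete with a minimal sketch: an attack is only rational when the attacker's expected benefit exceeds the expected cost of mounting it. The function and all numbers below are hypothetical illustrations, not a published risk model.

```python
def attack_is_rational(expected_gain, attack_cost,
                       success_probability, penalty,
                       detection_probability):
    """Return True if a rational attacker would proceed:
    expected benefit must exceed expected cost."""
    expected_benefit = expected_gain * success_probability
    expected_cost = attack_cost + penalty * detection_probability
    return expected_benefit > expected_cost

# Raising the attacker's cost, or the penalty on detection, above the
# perceived benefit mitigates the risk without ever achieving
# "absolute" security.
print(attack_is_rational(1_000_000, 50_000, 0.10, 2_000_000, 0.30))
# False — the expected cost (650,000) dwarfs the expected gain (100,000)
```

The goal of a defender, then, is not to make attack impossible but to move the attacker's numbers until the inequality fails.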
Relative computer security can be measured using six factors (Aycock, 2006):
1. What is the importance of the information or resource being protected?
2. What is the potential impact, if the security is breached?
3. Who is the attacker likely to be?
4. What are the skills and resources available to an attacker?
5. What constraints are imposed by legitimate usage?
6. What resources are available to implement security?
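One hedged way to read Aycock's six factors is as inputs to a relative, not absolute, risk figure. The weighting scheme below is entirely hypothetical (the source gives no formula); its only purpose is to show that the same threat environment yields different risk for defenders with different resources, which is what makes the neighbour comparison meaningful.

```python
def relative_risk(asset_value, breach_impact, attacker_capability,
                  attacker_resources, usage_constraints, defence_budget):
    """Combine the six factors (each scored 1-10) into a single
    relative-risk figure; higher means comparatively less secure.
    The formula is illustrative, not canonical."""
    exposure = (asset_value * breach_impact
                * (attacker_capability + attacker_resources))
    # Legitimate-usage constraints dilute whatever defence budget exists
    mitigation = defence_budget / max(usage_constraints, 1)
    return exposure / mitigation

us = relative_risk(8, 7, 6, 5, 3, 9)
neighbour = relative_risk(8, 7, 6, 5, 3, 4)
# The meaningful question is comparative: are we more secure than
# our neighbour under the same threat?
print(us < neighbour)  # True — same threat, larger defence budget
```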
The result is that security is a relative risk measure related to organisational economics at the micro level and the economics of national security at the macro level. This works to frame security in terms of one’s neighbour. The question is not, “am I secure?” (Wright & Zia, 2011), but rather, “am I more secure than my neighbour?” In this thesis, it is shown that attackers are rational economic actors. As such, they will maximise their gains and minimise the risk to themselves in seeking gains from the systems they attempt to exploit.
Bitcoin is designed to be simple, open, and safe. In this, it can be trusted. It is cash.
It is locking the original protocol (as closely as we can) and building on top of it that will allow Bitcoin to be cash and to scale.