The Blocksize War – Chapter 2 – March To War
In the early days of Bitcoin, from 2009 to early 2011, the entire ecosystem consisted of just one piece of software, the Bitcoin client. This software existed initially for Microsoft Windows and comprised the wallet, full node and miner. There were no mobile applications, no merchants, no gambling websites, no darknet markets, no exchange traded products, no exchanges, no institutional investors; just one primitive and basic software application. All one could do was mine some coins, send them and receive them. At the time, Bitcoin was pretty useless, and on the surface the system did not appear to have much value or potential. To be interested in the space, one had to have an imagination. One had to see many steps ahead and conceptualise how the system would develop and change over time, building layer upon layer of assumptions with regard to how Bitcoin would evolve. Many of these assumptions had never been tested or comprehensively discussed; they were just taken for granted and accepted. By 2015, Bitcoin had been around for five or six years, and for those dedicated to the space, this was quite a long time to hold an assumption. Many in the community actually had different, conflicting assumptions with respect to how Bitcoin worked, and the extent of these disagreements had never been revealed. Now, these disagreements were bubbling to the surface and, with Bitcoin meaning so much to these people, the results could become ugly and unpredictable.
The Bitcoin price had also appreciated significantly, from essentially a few cents in 2010 to around US$220 a coin by the summer of 2015. Many parties to the conflict had therefore benefited considerably from a financial perspective by investing in Bitcoin early. An unfortunate consequence of this is that some in the community became overconfident, even a little arrogant. For instance, let’s say someone had decided to invest in the early part of 2011, when the price of Bitcoin was under US$1. They may have based this investment on certain assumptions and a particular vision. They could have then continued to hold the coins into 2015, seeing their investment increase by more than 200 times. This is likely to influence one’s psychology: surely the assumptions made in 2011 were correct? After all, they led to such strong gains. This investor is now likely to consider themselves to have a very strong understanding of Bitcoin and to know what is best going forwards, believing that they clearly understood Bitcoin well in 2011 because they made such strong gains. Unfortunately, many people did not quite appreciate that others with very different and conflicting visions had also invested in Bitcoin in early 2011, thereby negating this somewhat flawed and biased logic. Often, it appeared that people just assumed that the other early investors all agreed with them and that those on the other side of them in the blocksize war were newcomers. This goes a long way to explaining how the blocksize war appeared to escalate and become so vicious, so quickly.
It is worth going into a bit of early Bitcoin history at this point. When Bitcoin was released, there was no blocksize limit, although blocks larger than around 32 MB, the maximum size of a network message, would likely have broken the system. The limit was first introduced by Satoshi in the summer of 2010. On July 15, 2010, Satoshi added the following line of code to the software repository:
static const unsigned int MAX_BLOCK_SIZE = 1000000;
The software containing this upgrade was then released on July 19, 2010. The new 1 MB limit did not come into force until September 7, 2010, at block height 79,400 (79,400 blocks since Bitcoin was launched). This type of upgrade is called a softfork: a new rule which tightens the restrictions on block validity. Adding or lowering the limit tightens the rules and is therefore a softfork; increasing the limit would relax the rules and is therefore known as a hardfork, in which everyone needs to upgrade to new software to follow the new chain. However, this softfork/hardfork terminology was not known at the time, and only came into use around April 2012. This blocksize-limit softfork was Bitcoin’s first new rule that had some kind of activation methodology, in this case a flag day, where the new rules became active at a certain block height. Satoshi never provided a clear reason for the blocksize limit at the time. Many large blockers contend that the measure was only temporary, although no notes from the time that I could find indicate this.
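The flag-day softfork mechanics described above can be sketched as follows. This is only an illustrative model, not Bitcoin's actual code; the function and constant names are hypothetical:

```python
# Hypothetical sketch of a flag-day softfork: the 1 MB limit only
# applies from a given block height onward, mirroring how the rule
# came into force at height 79,400. Illustrative, not real Bitcoin code.

MAX_BLOCK_SIZE = 1_000_000   # bytes, the limit Satoshi added
ACTIVATION_HEIGHT = 79_400   # flag day: rule enforced from this height

def block_size_valid(block_size_bytes: int, height: int) -> bool:
    """Return True if a block's size passes the height-dependent rule."""
    if height < ACTIVATION_HEIGHT:
        return True  # old rule: no explicit size limit
    # Tightened rule after the flag day (a softfork, since it only
    # rejects blocks that were previously valid)
    return block_size_bytes <= MAX_BLOCK_SIZE
```

A 2 MB block would pass under the old rules but fail once the flag-day height is reached, which is exactly why old nodes still accept every block the new rule permits: the new rule set is a strict subset of the old one.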
The next key event of interest, and something widely cited by the large blockers, occurred on October 4, 2010. Not even one month after the blocksize limit became active, one of the Bitcoin developers, Jeff Garzik, proposed increasing it. He submitted a software patch removing the 1 MB rule, arguing this would ensure Bitcoin could scale to match PayPal’s transaction rate. Although Jeff knew such capacity wasn’t needed in these early days, he considered it important from a marketing and narrative perspective. Just 15 minutes later, Theymos replied, stating that: “Applying this patch will make you incompatible with other Bitcoin clients.” Satoshi then chimed in:
+1 theymos. Don’t use this patch, it’ll make you incompatible with the network, to your own detriment. We can phase in a change later if we get closer to needing it.
The next day, Satoshi made an additional comment, in what is now one of the most widely quoted statements from the larger blockers:
It can be phased in, like:
if (blocknumber > 115000)
maxblocksize = largerlimit
It can start being in versions way ahead, so by the time it reaches that block number and goes into effect, the older versions that don’t have it are already obsolete.
When we’re near the cutoff block number, I can put an alert to old versions to make sure they know they have to upgrade.
It should be noted that, at the time, the block height was 83,500; therefore, block height 115,000 was 31,500 blocks into the future, approximately seven months later. To the large blockers, Satoshi’s intention here is clear. Satoshi only introduced the limit as a temporary measure and was already providing instructions on how to increase it, with a clear plan in place.
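Satoshi's pseudocode above can be rendered as a runnable sketch. The larger limit here is a hypothetical value chosen purely for illustration:

```python
# A runnable rendering of Satoshi's phase-in sketch: the larger limit
# only takes effect after a future cutover height, giving old software
# time to be retired before the hardfork activates.

OLD_LIMIT = 1_000_000     # bytes
LARGER_LIMIT = 8_000_000  # hypothetical larger limit, for illustration
CUTOVER_HEIGHT = 115_000  # Satoshi's example block number

def max_block_size(height: int) -> int:
    """Return the blocksize limit in force at a given block height."""
    return LARGER_LIMIT if height > CUTOVER_HEIGHT else OLD_LIMIT
```

Because the new rule accepts blocks the old rule rejects, nodes that never upgrade would fall out of consensus at the cutover height; hence Satoshi's suggestion of shipping the change "way ahead" and alerting old versions.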
However, in general, large blockers didn’t always look at the whole picture or context. One can interpret the dialogue as Satoshi opposing the patch to increase the blocksize limit right away, as it would make one incompatible with the network. Satoshi then takes a more cautious stance and goes on to describe how one could increase the limit if one wanted to, with some safety mechanisms to ensure a smooth upgrade. This narrative feels more similar to what the small blockers were saying.
The next quote from Satoshi widely cited by the large blockers is from even earlier, November 2008, before Bitcoin had even launched, where he talks about the network eventually being able to handle as many transactions as Visa, 100 million per day. This quote is very important to the large blockers and clearly aligns strongly with many of their visions for Bitcoin:
Long before the network gets anywhere near as large as that, it would be safe for users to use Simplified Payment Verification (section 8) to check for double spending, which only requires having the chain of block headers, or about 12KB per day. Only people trying to create new coins would need to run network nodes. At first, most users would run network nodes, but as the network grows beyond a certain point, it would be left more and more to specialists with server farms of specialized hardware. A server farm would only need to have one node on the network and the rest of the LAN connects with that one node.
The bandwidth might not be as prohibitive as you think. A typical transaction would be about 400 bytes (ECC is nicely compact). Each transaction has to be broadcast twice, so let’s say 1KB per transaction. Visa processed 37 billion transactions in FY2008, or an average of 100 million transactions per day. That many transactions would take 100GB of bandwidth, or the size of 12 DVD or 2 HD quality movies, or about $18 worth of bandwidth at current prices.
If the network were to get that big, it would take several years, and by then, sending 2 HD movies over the Internet would probably not seem like a big deal.
Of course, the small blockers have a response to even this. They claim that Satoshi was making these comments under the assumption that Simplified Payment Verification (SPV) technology existed. What this means is that light wallets could receive proof of a double spend in an invalid block and would therefore, in normal circumstances, not be required to verify all the transactions. This technology has not yet been developed and may not be possible. Therefore, some small blockers argue, Satoshi’s claims about competing with Visa for throughput no longer apply. This can be considered a somewhat pedantic argument and a narrow interpretation of the meaning of SPV.
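The core SPV idea, that a client holding only block headers can check a Merkle branch linking a transaction to a block's merkle root, can be sketched as follows. This is a simplified illustration: Bitcoin's real Merkle tree has additional rules (for example, duplicating the last node on odd-length levels and hashing little-endian txids) that are omitted here:

```python
# Minimal sketch of SPV-style Merkle proof verification: a light client
# with only block headers folds a Merkle branch up to the root and
# compares it against the merkle root in a header. Simplified vs Bitcoin.
import hashlib

def h(x: bytes) -> bytes:
    """Double SHA-256, as Bitcoin uses for its Merkle tree."""
    return hashlib.sha256(hashlib.sha256(x).digest()).digest()

def merkle_root_from_branch(leaf: bytes, branch: list[bytes], index: int) -> bytes:
    """Fold a Merkle branch up to the root; index gives the leaf's position."""
    node = leaf
    for sibling in branch:
        if index % 2 == 0:
            node = h(node + sibling)   # leaf/node is the left child
        else:
            node = h(sibling + node)   # leaf/node is the right child
        index //= 2
    return node
```

A wallet would accept a payment if the computed root matches the root in a header on the most-work chain; the branch is logarithmic in the number of transactions, which is why header-only validation is so cheap.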
The below reply to Satoshi was in the original email thread where Bitcoin was first announced, a few months before it even launched. The very first reply came from somebody called James A Donald, who was already expressing concern about capacity within just one day of the idea being announced:
To detect and reject a double spending event in a timely manner, one must have most past transactions of the coins in the transaction, which, naively implemented, requires each peer to have most past transactions, or most past transactions that occurred recently. If hundreds of millions of people are doing transactions, that is a lot of bandwidth – each must know all, or a substantial part thereof.
As for Satoshi quotes used by the small blocker side, perhaps the most referenced is when Satoshi referred to a competing client as a “menace to the network” and mentioned how the core design of Bitcoin was “set in stone”, in a discussion with Gavin in June 2010:
The nature of Bitcoin is such that once version 0.1 was released, the core design was set in stone for the rest of its lifetime. Because of that, I wanted to design it to support every possible transaction type I could think of. The problem was, each thing required special support code and data fields whether it was used or not, and only covered one special case at a time. It would have been an explosion of special cases. The solution was script, which generalizes the problem so transacting parties can describe their transaction as a predicate that the node network evaluates. The nodes only need to understand the transaction to the extent of evaluating whether the sender’s conditions are met.
The script is actually a predicate. It’s just an equation that evaluates to true or false. Predicate is a long and unfamiliar word so I called it script.
The receiver of a payment does a template match on the script. Currently, receivers only accept two templates: direct payment and bitcoin address. Future versions can add templates for more transaction types and nodes running that version or higher will be able to receive them. All versions of nodes in the network can verify and process any new transactions into blocks, even though they may not know how to read them.
The design supports a tremendous variety of possible transaction types that I designed years ago. Escrow transactions, bonded contracts, third party arbitration, multi-party signature, etc. If Bitcoin catches on in a big way, these are things we’ll want to explore in the future, but they all had to be designed at the beginning to make sure they would be possible later.
I don’t believe a second, compatible implementation of Bitcoin will ever be a good idea. So much of the design depends on all nodes getting exactly identical results in lockstep that a second implementation would be a menace to the network. The MIT license is compatible with all other licenses and commercial uses, so there is no need to rewrite it from a licensing standpoint.
Satoshi made many comments during the first two years of his involvement in the space, many of which could be said to support either side in this war. In general, it could be said that the quotes indicate that Satoshi seemed to broadly support the large blockers with respect to the narrow issue of the blocksize limit and transaction throughput, but Satoshi seemed somewhat supportive of the smaller block position with respect to their view on the flexibility of the Bitcoin rules. At this point, the debate appears to get almost religious, with both sides poring over every Satoshi quote looking for comments or interpretations supporting their cause.
What Satoshi thought, however, should not be considered as especially important. Many smaller blockers articulated a view that Satoshi was now irrelevant. At least, his views five years ago shouldn’t matter, because a lot has changed since then. We now probably know much more about Bitcoin than Satoshi back then, due to the experience of seeing the network in action. Bitcoin is not a religion and Satoshi is not a prophet, small blockers often contended. Decisions should be made based on scientific merit alone; what Satoshi said makes no difference, they claimed. However, Bitcoin does have some characteristics similar to a religion and this does appear to be how many people felt. After all, religions are very successful; perhaps these characteristics contributed somewhat to Bitcoin’s success.
Satoshi did actually appear to contribute to the debate in 2015. On the day Bitcoin XT was published, an email was sent from one of Satoshi’s email addresses (Satoshi@vistomail.com), articulating the small blocker side of the argument, including a claim that he had changed his mind about scaling:
I have been following the recent block size debates through the mailing list. I had hoped the debate would resolve and that a fork proposal would achieve widespread consensus. However with the formal release of Bitcoin XT 0.11A, this looks unlikely to happen, and so I am forced to share my concerns about this very dangerous fork.
The developers of this pretender-Bitcoin claim to be following my original vision, but nothing could be further from the truth. When I designed Bitcoin, I designed it in such a way as to make future modifications to the consensus rules difficult without near unanimous agreement. Bitcoin was designed to be protected from the influence of charismatic leaders, even if their name is Gavin Andresen, Barack Obama, or Satoshi Nakamoto. Nearly everyone has to agree on a change, and they have to do it without being forced or pressured into it. By doing a fork in this way, these developers are violating the “original vision” they claim to honour.
They use my old writings to make claims about what Bitcoin was supposed to be. However I acknowledge that a lot has changed since that time, and new knowledge has been gained that contradicts some of my early opinions. For example I didn’t anticipate pooled mining and its effects on the security of the network. Making Bitcoin a competitive monetary system while also preserving its security properties is not a trivial problem, and we should take more time to come up with a robust solution. I suspect we need a better incentive for users to run nodes instead of relying solely on altruism.
If two developers can fork Bitcoin and succeed in redefining what “Bitcoin” is, in the face of widespread technical criticism and through the use of populist tactics, then I will have no choice but to declare Bitcoin a failed project. Bitcoin was meant to be both technically and socially robust. This present situation has been very disappointing to watch unfold.
Most large blockers immediately dismissed the email as a fake. However, the email headers seemed to indicate the email did originate from Vistomail. This therefore leaves three possibilities: i. Satoshi’s email account was hacked; ii. Vistomail administrators sent the email; or iii. This email was genuinely from Satoshi. The second possibility seems extremely unlikely, therefore it’s probable that either this message is genuine or the account was hacked. A hacked email account is certainly possible, since Satoshi’s other email account (Satoshi@gmx.com) was hacked when somebody was able to reset the password. Either way, it didn’t really matter. If one individual, such as Satoshi, had so much influence over the system that he could rescue it from this crisis alone, then Bitcoin had failed to move on past the early days of reliance on an individual. Bitcoin had to be robust in and of itself to withstand the immense pressures it would be exposed to as a controversial and revolutionary money system, without reliance on one individual, who could presumably easily be stopped or could vanish at any time. This may be why Satoshi disappeared in the first place. I would like to say this is where the supposed involvement of Satoshi ended in this story. However, unfortunately, Satoshi, or more precisely claims about Satoshi, come into the story again later on.
While, in 2010, there were discussions about these scaling issues, there were no significant disagreements; everyone was just learning. By April 2011, things appeared to have changed a little and a deep disagreement about scaling, transaction fees and long-term Bitcoin mining incentivisation became apparent. Everyone was still civil and polite, but a fundamental difference of opinion seemed to emerge. BitcoinTalk user “Vandroiy” posed a question: he essentially asked how miners would be incentivised once the block subsidy became low and eventually ran out. Of course, everyone knew the answer to this, as the whitepaper says “the incentive can transition entirely to transaction fees”. However, Vandroiy was asking a more challenging question. As Vandroiy put it on April 22, 2011:
Any single, small miner intends to maximize profit. His decision on what transactions to include doesn’t create a big change in the height of fees. Thus, the miner will include all transactions that pay any fee, even very low fees, to have maximum profit. This results in the price for transactions dropping. In turn, those miners who already were hardly profitable have their earnings further reduced and quit. This reduces hashrate, difficulty drops, and the circle repeats. By this reasoning, difficulty is likely to drop close to zero.
Analysing Vandroiy’s point from an economic perspective, he was essentially saying that the marginal cost of including a transaction is near zero and, in a competitive environment, price equals marginal cost. The market would then clear at low prices, sometimes referred to as the “fee death spiral problem”. However, this was not a normal market where the only objective was to reach an equilibrium price and clear; some believed this market had a positive externality, or another objective, as the whitepaper put it, to incentivise miners. Whether or not this was truly a problem for Bitcoin proved highly controversial. From reading this thread, it appears as if roughly half the people thought this was a problem and half did not. Even Mike Hearn initially seemed to agree with the death spiral problem, stating that it “seems plausible”. However, the following day, on April 23, 2011, Mike had, quite legitimately, reconsidered his position, and he no longer considered this a problem:
The death spiral argument assumes that I would include all transactions no matter how low their fee/priority, because it costs me nothing to do so and why would I not take the free money? Yet real life is full of companies that could do this but don’t, because they understand it would undermine their own business.
Most of the people who thought the fee market death spiral was a problem seemed to settle on a proposed solution: the blocksize limit would prevent fees falling too low, as users would have to bid against each other for space in blocks, which would be full. This blocksize limit would therefore create what economists call a producer surplus, which could incentivise miners once the block subsidy ran out. While this disagreement seemed to split the community right down the middle, nobody appeared too worried by the situation. Instead of debating further, there appeared to be little public discussion on the matter for the next few years. The participants in the discussion all seemed to assume that Bitcoin would evolve more in their preferred direction. In 2013, Mike appeared to recognise the fee death spiral as a legitimate problem, but proposed “assurance contracts” as a potential solution, rather than a blocksize limit.
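The producer-surplus argument can be illustrated with a toy fee auction: when blockspace is scarce, a profit-maximising miner fills the block from the highest feerate down, and low bids are simply priced out. The function, names and values below are all illustrative:

```python
# Toy model of the fee auction created by a binding blocksize limit:
# a miner greedily selects transactions by feerate (fee per byte)
# until the block is full, so scarce space forces users to outbid
# each other. Purely illustrative; real block building is more complex.

def select_txs(txs: list[tuple[int, int]], limit_bytes: int) -> list[tuple[int, int]]:
    """txs: list of (fee, size) pairs. Greedy by feerate until full."""
    chosen, used = [], 0
    for fee, size in sorted(txs, key=lambda t: t[0] / t[1], reverse=True):
        if used + size <= limit_bytes:
            chosen.append((fee, size))
            used += size
    return chosen
```

With no limit (or a limit far above demand), every positive-fee transaction fits and fees trend toward the near-zero marginal cost; with a binding limit, only the top bids clear, producing the surplus the small blockers argued could fund security once the subsidy fell away.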
The first public evidence of active campaigning on this blocksize issue was a video produced by Bitcoin developer and small blocker, Peter Todd. In May 2013, he released a professionally-produced video on YouTube. In the video, he argued that it was necessary for the blocksize limit to remain small, such that users could validate all the transactions and keep Bitcoin decentralised. The video even talked about “ignoring anyone trying to change the software you use, to increase the 1 MB blocksize”. The video has been extensively derided by the large blockers, who even eventually took over Peter’s small block campaign website keepbitcoinfree.org, and replaced it with material supporting larger blocks.
Peter Todd also angered many of the large blockers, due to his position as the main proponent of something called “Replace by fee” (RBF). RBF allows users to replace a Bitcoin transaction (before it is confirmed in the blockchain), with a new transaction by spending the same transaction input again, only with a higher fee. Miners adopting this RBF policy would choose to include the higher fee transaction. In contrast, miners not adopting this and instead using the first seen safe (FSS) principle would include the transaction they saw first. In general, Mike, Gavin and large blockers were opposed to RBF, while small blockers tended to support it. A crucial distinction should be made here between this and the blocksize war: the blocksize limit is a part of the Bitcoin protocol, while RBF is only a miner policy. Miners are therefore free to do what they like with respect to RBF and there is no need for consensus. The distinction between a Bitcoin protocol rule and any other aspect of the system, such as RBF, was extremely important to small blockers, while most larger blockers never saw the distinction or agreed that it mattered to the same extent. Some considered it as an arbitrary distinction which smaller blockers had created to get their way. Despite this distinction, the core economic argument around RBF was almost exactly the same as that around the miner fee death spiral.
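The difference between the two mempool policies can be sketched as a one-line decision rule. This is a toy model of the policies as described above, with hypothetical names, not Bitcoin Core's actual implementation:

```python
# Toy sketch of the RBF vs first-seen-safe (FSS) mempool policies.
# A "conflict" means two transactions spending the same input.
# Illustrative only; real replacement rules have extra conditions.

def accept_replacement(existing_fee: int, new_fee: int, policy: str) -> bool:
    """Decide whether a new conflicting tx replaces one already in the mempool."""
    if policy == "FSS":   # first seen safe: always keep the first transaction
        return False
    if policy == "RBF":   # replace by fee: keep whichever pays the higher fee
        return new_fee > existing_fee
    raise ValueError("unknown policy")
```

Because this is purely a local relay/mining policy rather than a consensus rule, different miners can run different policies on the same network without splitting the chain, which is exactly the distinction the small blockers emphasised.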
Opponents of RBF opined that it damaged the user experience and made double spending more likely, while advocates claimed miners were incentivised to choose higher fee transactions anyway, to increase profits, therefore it was inevitable and the software policy might as well align with this economic reality. The large blockers’ retort to this was that miners cared about the user experience too, and therefore why would they damage the user experience of the system they depended on?
It seems to me that the answer to this apparent dilemma depends primarily on the level of competitiveness in the mining industry. If the mining industry were highly concentrated among a few large players, then FSS seems a somewhat logical policy and the fee death spiral argument appears not to apply. This is because the decisions these miners make would have a significant impact on the ecosystem and thus potentially damage their future earnings as miners. If the level of industry concentration is low, then the impact the decisions of individual miners have on the ecosystem is more limited. Miners may instead choose to maximise their short-term profits, rather than care about the long-term user experience, which their action alone would not impact significantly anyway. This problem is sometimes referred to as a tragedy of the commons. If one believes the tragedy of the commons is applicable here, the rational thing may therefore be to apply RBF policies, and the fee death spiral risk seems somewhat plausible.
The arguments over RBF seemed to have very similar inflection points to the blocksize issue:
- Large blockers prioritised the short term, while small blockers focused on the long term;
- Large blockers prioritised the user experience, while small blockers favoured making the system more resilient;
- Large blockers prioritised growth, while small blockers were more concerned about sustainability;
- Large blockers were more pragmatic and business-focused, while small blockers were more scientific and theoretical, typically highly intelligent computer and cryptography boffins.
It was not necessarily the case that either side disagreed on the technical arguments; they just had different preferences and weighted the importance of each component under consideration differently. Unfortunately, this resulted in different conclusions that appeared impossible to reconcile.
On Wednesday, April 15, 2015, there was an official Bitcoin Foundation event in London, called DevCore. Gavin was in attendance, having flown over to deliver his keynote speech entitled “Why we need a bigger chain”. I was also in attendance at the event. Gavin was very approachable and willing to discuss the issue. Gavin emphasised to me that 1 MB was ridiculously small, and that many web pages were larger than that. In his mind, the history of information technology was about exponential growth and things becoming faster and larger. Moore’s law was mentioned repeatedly, used as an example to show how systems improve over time and that eventually Bitcoin would have much larger blocks, into the gigabytes, and there would be no technical scaling issues. Gavin quietly mentioned to me that he favoured a jump to a 20 MB limit, but that he was willing to compromise and maybe change this to 8 MB, if others came on board. A few days later, Mike and Gavin held an evening Q&A session in London. When discussing blocksize, Gavin said the following:
I may just have to throw my weight around and say, this is the way it’s going to be, and if you don’t like it, find another project. Frankly, that is what happened with the P2SH thing; I just kind of said, I have listened to everybody, I have listened to a couple of proposals and this is the way it is going to be
As he said this, I took a quick glance around the room. The majority of people seemed happy with Gavin having this power. However, there was clearly a minority of people, perhaps just five percent or so, who were somewhat angered by this and viewed Gavin as being arrogant in making that comment; they looked quite uncomfortable. To them, Gavin was not in charge of Bitcoin; if he could simply throw his weight around and change the protocol, what exactly was the point of Bitcoin? In mentioning P2SH, Gavin was bringing up a somewhat contentious Bitcoin softfork upgrade in 2012, where there were competing proposals and Gavin had essentially chosen the path forwards. From hanging around after the talk, I got a clear sense that Mike was pushing Gavin to take a tougher and tougher stance on the blocksize, while Gavin was pushing back a little. Mike was even asking if Gavin could boot the other developers out of the main Bitcoin Core repository on GitHub and take full control of the repository. From chatting to them further, it did seem possible that Gavin would eventually join Mike in taking a stronger stance. The two of them clearly thought that, once Gavin had taken that position, it would prove decisive. Quite how and when Gavin would do this, and what specific action he would take, was unclear to me at the time.
On May 4, 2015, Gavin published a blog post entitled “Time to roll out bigger blocks”. This was the first in a series of blogposts where he attempted to address the concerns with larger blocks. Gavin had clearly decided now was the time to make the push for larger blocks. On May 7, 2015, the lead maintainer of the Bitcoin Core project on GitHub, Wladimir Van Der Laan, made the following comment in an email to the Bitcoin mailing list:
I’m weakly against a block size increase in the near future. Some arguments follow. For sake of brevity, this ignores the inherent practical and political issues in scheduling a hardfork.
Bitcoin Core was the name of the reference implementation of Bitcoin and a descendant of the client Satoshi originally created. This client had initially just been called Bitcoin or Bitcoin-QT, however the name Bitcoin Core was adopted in February 2013 after a suggestion from Mike Hearn, which now seems somewhat ironic. Gavin had previously handed the ownership of the Bitcoin repository on GitHub to Wladimir, to enable Gavin to focus more on the research side of Bitcoin. There is also some irony here, as Gavin appeared to have handed over control so that he could research areas like transaction fees and blockspace. This may have seemed like a more important role at the time, compared to the grinding maintenance work of managing the repository; it was not seen as Gavin relinquishing any power. Later on, the larger blockers regarded Gavin’s decision to hand over control to Wladimir as a critical mistake. However, small blockers typically contended that Wladimir had no real power, and that owning the repository was only a janitorial role. The final decision on merging code was made only if there was broad agreement from the group of developers, so ultimately control of the repository did not matter. In addition to this, and crucially, the Bitcoin rules are not determined by changes to the software repository; they are determined by the clients which users are already running. Of course, the repository could publish new versions of the client with protocol changes, but there was no automatic upgrade feature, and nobody was forced to upgrade. This is another example of a distinction which was crucial to small blockers, but which the large blockers simply did not see or agree with. To the large blockers, there was too much power in the hands of Bitcoin Core, therefore it quickly became their main enemy.
Whatever one’s views on the power of the lead maintainer of the software project, the comment from Wladimir about his “weak” opposition to a blocksize increase in the near future, seemed highly significant. It appeared as if the hardfork would not be merged into Bitcoin Core, despite tremendous lobbying from Gavin, and Gavin’s options therefore felt somewhat limited. On May 29, 2015, Gavin gave the strongest hint yet of what he planned to do: that he may switch his support over to Bitcoin XT and throw his weight behind the alternative incompatible Bitcoin protocol. Despite the below email, which was pretty clear, I never really believed it and considered it a threat; I thought it was some kind of negotiating tactic.
If we can’t come to an agreement soon, then I’ll ask for help reviewing/submitting patches to Mike’s Bitcoin-Xt project that implement a big increase now that grows over time so we may never have to go through all this rancor and debate again.
I’ll then ask for help lobbying the merchant services and exchanges and hosted wallet companies and other bitcoind-using-infrastructure companies (and anybody who agrees with me that we need bigger blocks sooner rather than later) to run Bitcoin-Xt instead of Bitcoin Core, and state that they are running it. We’ll be able to see uptake on the network by monitoring client versions.
Perhaps by the time that happens there will be consensus bigger blocks are needed sooner rather than later; if so, great! The early deployment will just serve as early testing, and all of the software already deployed will be ready for bigger blocks.
But if there is still no consensus among developers but the “bigger blocks now” movement is successful, I’ll ask for help getting big miners to do the same, and use the soft-fork block version voting mechanism to (hopefully) get a majority and then a super-majority willing to produce bigger blocks. The purpose of that process is to prove to any doubters that they’d better start supporting bigger blocks or they’ll be left behind, and to give them a chance to upgrade before that happens.
Because if we can’t come to consensus here, the ultimate authority for determining consensus is what code the majority of merchants and exchanges and miners are running.
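The block-version voting Gavin describes in the email above can be sketched in code. This is a minimal illustration, not Gavin’s actual implementation: the window size, the 75 percent threshold and the version value are assumptions chosen to mirror BIP 101-style activation numbers, not details quoted from the email.

```python
# Illustrative sketch of miner block-version voting: miners signal
# support for a rule change by setting a new version number in the
# blocks they produce, and activation triggers once a supermajority
# of a recent window of blocks is signalling. The window, threshold
# and version value below are assumptions for illustration only.

WINDOW = 1000        # look at the last 1,000 blocks
THRESHOLD = 750      # require 750 of 1,000 (75 percent) to activate
NEW_VERSION = 8      # hypothetical version number signalling support

def supermajority_reached(recent_block_versions):
    """Return True if enough blocks in the recent window signal support."""
    window = recent_block_versions[-WINDOW:]
    signalling = sum(1 for v in window if v == NEW_VERSION)
    return signalling >= THRESHOLD

# Example: 80 percent of the last 1,000 blocks signal the new version.
versions = [NEW_VERSION] * 800 + [4] * 200
print(supermajority_reached(versions))  # True
```

The point of such a mechanism, as the email argues, is that a visible, rising share of signalling blocks warns the remaining miners and node operators to upgrade before the larger blocks actually appear.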
On July 21, 2015, Pieter Wuille, another Bitcoin developer who had worked with Mike Hearn at Google in the past, proposed a hardfork blocksize increase. Pieter was regarded as being on the “small block” side of the argument. To me, this appeared to be a compromise proposal, a response to the pressure from Gavin. The proposal was numbered BIP 103 and acknowledged Wladimir van der Laan and a developer called Gregory Maxwell for their suggestions, indicating their potential support. The proposal was for the hardfork to activate in January 2017, at which point the blocksize limit would increase by 17.7 percent per annum until the year 2063. The proposal did not include any activation methodology. It appeared to be meant as a catalyst for further discussion; once agreement had been reached, the activation methodology could then be determined.
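To get a feel for how conservative the BIP 103 schedule was, the growth rate can be worked through numerically. The sketch below is a rough back-of-the-envelope calculation, assuming an illustrative 1 MB starting limit in 2017; the function name and starting value are my assumptions, not part of the proposal text.

```python
# Rough sketch of the growth schedule described for BIP 103:
# a 17.7 percent annual increase from January 2017 until 2063.
# The 1 MB starting point is an assumption for illustration only.

BASE_MB = 1.0          # assumed starting limit in 2017
ANNUAL_GROWTH = 1.177  # 17.7 percent per annum

def limit_in_year(year: int) -> float:
    """Approximate blocksize limit (in MB) implied for a given year."""
    years_elapsed = min(max(year, 2017), 2063) - 2017
    return BASE_MB * ANNUAL_GROWTH ** years_elapsed

for year in (2017, 2021, 2030, 2063):
    print(year, round(limit_in_year(year), 1))
```

At 17.7 percent per annum the limit doubles roughly every four and a quarter years, reaching only around 8 MB by 2030; it is easy to see why the large blockers, who expected transaction demand to grow much faster than that, regarded the schedule as far too slow.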
I considered this offer a significant moment. The blocksize increase schedule did seem a bit conservative; however, I thought it was part of a negotiation. I expected Gavin to react positively to the offer, perhaps provide a counter-offer, and the two sides could gradually move towards each other. It appeared as if we were slowly moving towards a resolution. To my astonishment, Gavin and the large blockers did not react positively to BIP 103 at all. They regarded the proposed increase as so small that it was more of an insult than progress. Unfortunately, BIP 103 did not seem to help. With the 17.7 percent annual increase proposed in BIP 103, it seemed likely that demand for Bitcoin transactions would exceed this level of growth, so the limit would remain a binding constraint. In contrast, the large blockers wanted to ensure exactly the opposite: they wanted the blocksize limit to increase faster than demand. If both sides wanted the opposite, could a compromise really be achieved?
To the large blockers, the priority was the user experience. Avoiding full blocks was key; otherwise, users would have to wait an unpredictable amount of time for their transactions to confirm. What merchant would adopt Bitcoin as a payment method if it was this unreliable? Forcing users to bid against each other in a bidding war for blockspace would, by definition, deny some users the ability to use Bitcoin, driving them away to something else. This was considered a terrible business strategy. What kind of platform succeeds by deliberately driving away its users?
To the small blockers, this was not such a problem. To them, full blocks would not be some kind of crisis; if anything, they were a sign of success. They indicated that Bitcoin was becoming popular, and a new equilibrium level of user adoption would emerge, reflecting the blocksize limit constraint. They occasionally mocked the large blocker argument about fees getting too high and causing users to leave, likening it to the old paradox: “Nobody goes there anymore… it’s too crowded.”
In addition to this, small blockers tended to consider full blocks as both necessary and inevitable in the long term anyway. Full blocks were necessary to prevent a fee-market death spiral once the block subsidy became low, and to ensure miners would still have a reason to move the chain forwards. It was considered vital to always have a surplus of transactions which did not fit in the blocks and were sitting there waiting to be included; that way, miners always had an incentive to build blocks. If there were no full blocks and no surplus transactions, why would a miner even bother mining when there was no revenue? Instead, miners would turn their machines off, save energy costs and wait for a backlog of transactions to build up again after each block. This would greatly reduce network security. Large blockers considered this reasoning highly inappropriate. The block subsidy would be around for decades; why lose customers now for something that might be a problem in 20 to 100 years?
Small blockers also believed full blocks were inevitable anyway. After all, if blockspace was available, why not use it up? Anyone could store anything they liked in the blockchain, for instance their music collection or encrypted documents. Demand for cheap, highly replicated storage was essentially unbounded, they argued. Asking for the limit to increase above expected demand was therefore nonsense. Indeed, one person could easily fill all the space themselves. The retort to this point from the larger blockers circled back to the mining incentive argument: miners would not do this, they claimed; miners would not let this amount of data into the blocks. In addition to this, large blockers argued that blocks had not been full in the first five years of Bitcoin, a characteristic which they said contributed to its success. Why would anyone want to make the risky move of changing that now?
Unfortunately, the community was no closer to agreement, and Gavin pushed ahead with his plan. In July 2015, Gavin is said to have sounded out some of the Chinese miners and mining pools on his proposal. There was a meeting in Beijing, at which the miners are said to have pushed back on Gavin’s proposed increase to 20 MB, as the Chinese communication infrastructure was considered too weak to quickly broadcast blocks of this size. Agreement was therefore said to have been reached on 8 MB blocks. Gavin was preparing behind the scenes for his big planned move in August, now just a few weeks away.
In the Q&A section of the Bitcoin XT website, the following was said:
Decisions are made through agreement between Mike and Gavin, with Mike making the final call if a serious dispute were to arise.
In some sections of the community, this just reinforced the impression that this was all a power grab by Mike. Who was Mike to “make the final call”? It was not that there was anything wrong with Mike; he seemed to be a pretty nice guy. It is just that making this statement so blatantly did not feel like the right approach. Bitcoiners like to feel in control; they want to take ownership and have financial sovereignty. This was not part of the messaging of Bitcoin XT at all, which appeared too centred on Mike personally. Therein lies the second major blunder from the large blocker side: Bitcoin XT was too associated with Mike, rather than being dressed up as more of a grassroots user movement. Even focusing the software around Gavin instead would likely have improved the chances of success.