Scaling Bitcoin without compromising decentralization

Much confusion and disagreement exists about the ability of blockchain technology and Bitcoin to scale, and about the relationship between scaling and decentralization. This article attempts to clear things up. Let us address, one by one, the various misconceptions that are often cited in the scaling debate.

Larger blocks will lead to fewer full nodes

On one hand, larger blocks increase the resource requirements for running a fully validating node. On the other hand, larger blocks mean increased adoption, and an increased number of entities that want to run a fully validating node.

Nobody can predict the future, but it seems much more likely that the second effect is more important and that larger blocks will lead to more full nodes.

If Bitcoin’s throughput were to grow, say, twentyfold, it would be a serious player in the payment industry and there would likely be hundreds of thousands of businesses, analysts, statisticians, financial institutions, banks, developers and enthusiasts eager to run a full node and scrutinize the blockchain, even if it does take some serious hardware investment.

10 terabytes of storage and the additional resource requirements are orders of magnitude cheaper than the value of 10 terabytes of blockchain data.

Everyone must be able to cheaply run a full node

I don't know where this idea comes from, but it is a rather odd requirement. First of all, it lacks realism: the vast majority of end-users are never going to run a full node, regardless of the block size.

More importantly, such a requirement doesn’t match Bitcoin’s architecture. Bitcoin uses a Merkle tree to hash the transactions, which enables end-users to verify the proof-of-work and the inclusion of their own transactions while ignoring other people’s transactions. This is called SPV (simplified payment verification).
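The core of SPV verification can be sketched in a few lines of Python. This is a simplified illustration, not a faithful client implementation: it ignores Bitcoin's internal byte-order conventions and the header chain itself, and only shows how a transaction hash is folded up a Merkle branch to the root committed in a block header.

```python
import hashlib

def dsha256(data: bytes) -> bytes:
    """Bitcoin's hash function: SHA-256 applied twice."""
    return hashlib.sha256(hashlib.sha256(data).digest()).digest()

def merkle_root_from_proof(tx_hash: bytes, branch: list, index: int) -> bytes:
    """Fold a Merkle branch up to the root.

    `branch` holds the sibling hash at each level of the tree, and
    `index` is the transaction's position in the block: its bits tell
    us, level by level, whether our node is the left or right child.
    An SPV client accepts the transaction if the returned root matches
    the Merkle root in a (proof-of-work-checked) block header.
    """
    h = tx_hash
    for sibling in branch:
        if index & 1:                    # we are the right child
            h = dsha256(sibling + h)
        else:                            # we are the left child
            h = dsha256(h + sibling)
        index >>= 1
    return h
```

Note that the proof size grows only logarithmically with the number of transactions in a block, which is why SPV stays cheap even if blocks grow large.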

Although there are differences between running a full node and running an SPV node, there are more similarities: neither is cryptographically trustless, as both rely on trusting miners’ incentives not to collaborate to defraud you. Neither requires trusting a peer.

The main difference in security is that an SPV node is open to some attacks that full nodes are protected against: with SPV, miners can collaborate not just to revert payments, but also to feed users invalid blocks, defrauding them in various other ways. Such attacks, however, cost the miners about $250,000 per block, as other miners and exchanges would reject such blocks, so they are not very relevant for ordinary end users.

From the perspective of an attacker, a valid-block attack is cheaper, more effective and less risky than an invalid-block attack. Hence the effective difference in trust and security between an SPV node and a full node is minimal.

Lightning Network will solve scaling issues

Lightning Network is a decentralized micropayment network being developed as a second-layer solution on top of Bitcoin. Currently various implementations are in alpha testing on testnet, though it must be noted that these use a full message-flooding model; scalable decentralized routing is still in the early design stage.

Some people believe that such a micropayment network can help with decentralized scaling, but this is not likely the case.

Let us assume Lightning Network gains efficient routing, works exactly as advertised, and at some point becomes ubiquitous.

There are still some issues by design:

  • You can only receive payments if your node is running, so everyone who wants to passively receive money must run a node 24/7
  • If a peer is unavailable, you temporarily cannot access part of your money. This problem is amplified by the fact that it is very cheap to “bully” the network by simply connecting to and disconnecting from peers.
  • Lightning Network doesn’t work well with large payments, which means that users will have to manage their channels, topping them up and waiting for those top-ups to be confirmed.

These may seem like minor inconveniences, but they mean that for the regular Joe, it is more convenient, cheaper and easier to simply trust a third-party service that eliminates these inconveniences.

This is in stark contrast with SPV, where trusting a third party is more expensive, and no more convenient or easy, than running your own SPV node, which verifies the proof-of-work and the inclusion of your own transactions while keeping control of your own keys.

The risk of centralization isn’t people running an SPV node instead of a full node. The risk of centralization is people not running a node at all and entrusting their keys and money to a third party. And with Lightning Network’s architecture, 99% of users will not run an LN node, just like they won’t run a full node. They will choose the convenience of a trusted service.

Lightning Network could be interesting as a micropayment network, but for decentralized scaling it is vastly inferior to the SPV/on-chain model as it most likely harms decentralization and certainly doesn’t help it.

With on-chain scaling blocks will become huge

I would certainly hope so, but unfortunately this isn’t likely to happen anytime soon. Let’s put some numbers to this by looking at the growth of the blocks before blocks were full and the limit took effect.

The growth is almost linear at about 200kb per block per year, or 100mb per block in 500 years. If we want to scale, we should be trying to accelerate this; instead, many people worry about how on-chain scaling could ever allow VISA-level throughput. Let’s try to scale our graph to include this target of 150 million transactions per day:

Limiting the block size because of VISA level throughput is scaling without a sense of scale.
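To put a number on that sense of scale, here is a back-of-the-envelope calculation in Python. The ~250 bytes per average transaction is my own assumed round figure, not taken from the data above; 144 blocks per day follows from one block per 10 minutes.

```python
# What block size would VISA-level throughput require, and how long
# would the observed linear growth take to get there?
visa_tx_per_day = 150_000_000
tx_size_bytes = 250            # assumed average transaction size
blocks_per_day = 24 * 6        # one block per 10 minutes

tx_per_block = visa_tx_per_day / blocks_per_day
block_size_mb = tx_per_block * tx_size_bytes / 1e6
print(f"required block size: ~{block_size_mb:.0f} mb")   # roughly 260 mb

# At the observed ~200 kb/block/year growth rate:
years_to_reach = block_size_mb * 1000 / 200
print(f"time at current growth: ~{years_to_reach:.0f} years")  # over a millennium
```

In other words, even the linear trend would take on the order of a thousand years to reach VISA-sized blocks, which is the point of the graph above.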

One of Satoshi Nakamoto's key insights was that transactions are tiny and computers are fast. This becomes evident when we look at the current state of the network. All Bitcoin transactions in the world are propagated in a puny 1.1 mb per 10 min. This is small enough to be processed by a $50 Raspberry Pi!

In terms of bandwidth, this rounds to 0.00mb/sec; even if we include protocol overhead and a margin for the irregularity of block times, it still rounds to 0.00mb/sec.
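The figures above follow from simple division; here is the arithmetic spelled out (the 20% overhead margin is an arbitrary assumption for illustration):

```python
# Average bandwidth needed to receive all transaction data,
# assuming ~1.1 mb of transactions per 10-minute block.
block_mb = 1.1
interval_sec = 10 * 60

rate = block_mb / interval_sec           # ~0.0018 mb/sec
rate_with_margin = rate * 1.2            # +20% overhead margin (assumption)

print(f"{rate:.2f} mb/sec")              # prints 0.00 mb/sec
print(f"{rate_with_margin:.2f} mb/sec")  # still prints 0.00 mb/sec
```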

In practice, well-connected nodes use a lot of bandwidth; this is caused by the noble cause of seeding the data to others. This is akin to BitTorrent: even if I only download a single 4gb movie, my software can easily consume terabytes of data per month due to seeding. This doesn't change the fact that I can download, verify and watch the movie for little more than 4gb, just like I can verify all incoming blockchain data using negligible bandwidth.

Even in the most optimistic growth scenario, Bitcoin's block size will never even get close to outpacing technology and becoming prohibitive.

We shouldn’t be just kicking the can down the road: it will roll by itself and all we need to do is find ways to make it roll faster.

The road is way too long to throw up the barriers at this point.


Maintaining the block size limit destroys Bitcoin, its decentralization, its utility and everything it stands for.

Larger blocks mean increased adoption and an increase in merchants, businesses, miners, pools, chip manufacturers, full nodes and decentralization.

I have one wish for 2018: for all to unite and to embrace the power of on-chain scaling.