Pondering Bitcoin scaling. A response to Roger Ver.


In a recent article, bitcoin.com owner and "Bitcoin Jesus" Roger K. Ver wrote about some of the problems we face in growing Bitcoin.

As the crypto world grows in popularity, with Bitcoin still its front-runner, we have to be brutally honest about the implications of growth. Roger is doing that, and I welcome it. So please don't see this as a debate or a disagreement.

At the same time, if you are not shoulder-deep in crypto, scaling may feel scary. After all, anyone doing the numbers will see that in order to get millions of people on the currency, you have to process all those transactions. And due to the distributed nature of Bitcoin, this can't be done on a server in a datacenter; it is something that each machine running a fully-validating Bitcoin node will have to do.

The good thing is that most people don't realize just how fast modern computers are and how little time it takes to process Bitcoin transactions. I recently wrote a blog post investigating how much we have to invest in technology to support 50 million users each posting one transaction a day. The conclusion is rather optimistic.
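To give a feel for the scale (using my own assumption of roughly 250 bytes per average transaction; the linked post has the full analysis), the back-of-envelope math looks like this:

```python
# Back-of-envelope: 50 million users, one transaction per day each.
# Assumes ~250 bytes per average transaction (my assumption).
users = 50_000_000
tx_per_second = users / 86_400              # ~579 tx/s
tx_per_block = tx_per_second * 600          # ~347,000 tx per 10-min block
block_size_mb = tx_per_block * 250 / 1e6    # ~87 MB per block

print(f"{tx_per_second:.0f} tx/s, ~{block_size_mb:.0f} MB blocks")
```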

But back to the topic at hand: if we want to grow, Roger states there are four negative externalities that result from more users and bigger blocks. And I want to challenge him on that!

1. Additional bandwidth.

This sounds obvious, but it's not. The truth is that Bitcoin has not seen much in the way of smart architecture or engineering, probably ever. Lots of great protocol developers and cryptographers, not so much on the engineering front. And the result of this is that there is room for optimisation. A lot of room.
A research project called "x-thin-blocks" solved the obvious-in-hindsight problem that each node validates and transmits all transactions twice: once when they are sent on the network, and again when they are validated in a block. This new technology brings down the sending of a block from 1MB to maybe 20KB.
It has not been deployed yet, with the result that if we upgrade to something more modern that supports both bigger blocks and this technology, we'll likely see a DROP in bandwidth.
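To illustrate the idea, here is a minimal sketch of thin-block style propagation in Python. This is my own simplification; the actual x-thin-blocks wire format and short-ID scheme differ:

```python
# Sketch of thin-block propagation: a block is announced as a header
# plus short transaction IDs, and the receiver rebuilds it from the
# transactions already sitting in its mempool. Only transactions the
# receiver has never seen must travel twice.
# (Illustrative only; the real x-thin-blocks protocol differs.)

def make_thin_block(header: bytes, txids: list) -> dict:
    # Truncating 32-byte txids to 8 bytes shrinks a 1MB block
    # announcement to a few tens of kilobytes.
    return {"header": header, "short_ids": [t[:8] for t in txids]}

def rebuild_block(thin: dict, mempool: dict):
    by_short = {txid[:8]: tx for txid, tx in mempool.items()}
    found, missing = [], []
    for sid in thin["short_ids"]:
        if sid in by_short:
            found.append(by_short[sid])     # already validated earlier
        else:
            missing.append(sid)             # request these from the peer
    return found, missing
```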

2. Extra CPU Time.

Another point that sounds extremely obvious: add more transactions to process every second and you "waste" CPU power validating them. Well, yes and no. First of all, read my answer to point 1, where we halve the amount of CPU used by not doing the same work twice.
The situation with CPU is actually quite rosy already; the processing power needed to validate is extremely small. If we were to go to 10GB blocks today, your desktop would still have no problem validating all those transactions in under 10 minutes. So the bottom line is, it's not an issue.
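To sanity-check that claim, here is a rough calculation. Every number in it is my own ballpark assumption (transaction size, signature-verification speed, core count), not a measurement:

```python
# Rough check of the "10GB block in under 10 minutes" claim.
# Assumptions: ~250 bytes and ~1 signature check per transaction,
# and a desktop doing ~15,000 secp256k1 verifications per second
# per core on 8 cores.
tx_count = 10e9 / 250                   # ~40 million transactions
verifies_per_second = 15_000 * 8        # ~120,000/s across the machine
minutes = tx_count / verifies_per_second / 60

print(f"~{minutes:.1f} minutes of signature checking")   # ~5.6
```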
Next to that, we have seen various optimizations already. For instance, in Bitcoin Classic I removed the validation of very old transactions: ones that have been confirmed and have a week's worth of proof-of-work to prove they have been validated by numerous nodes before. There are other optimizations that can still be added. But, again, it is hardly an issue to focus on.
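As a sketch of what skipping old validation can look like (my own illustration in Python; the actual Bitcoin Classic code is C++ and differs in detail):

```python
# Sketch: only run the expensive script/signature checks for blocks
# that are less than about a week deep. Anything buried under a week
# of proof-of-work has been validated by numerous nodes already.
# (My illustration; the real Bitcoin Classic code differs in detail.)

BLOCKS_PER_WEEK = 6 * 24 * 7    # ~1008 blocks at one per 10 minutes

def needs_script_validation(block_height: int, chain_tip: int) -> bool:
    confirmations = chain_tip - block_height
    return confirmations < BLOCKS_PER_WEEK

# A block ~5,000 confirmations deep skips the expensive checks:
print(needs_script_validation(415_000, 420_000))   # False
```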

3. Extra Storage Space.

Agreed, this is a real issue, but the problem is at minimum overstated by many. Most people don't realize that it would take 75 years of Bitcoin blocks to fill up a common 3TB hard drive.
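For those who want to check the math (assuming blocks average roughly 0.75MB, my own assumption for this era of not-always-full blocks):

```python
# The arithmetic behind "75 years to fill a 3TB drive", assuming
# blocks average roughly 0.75MB (my assumption; blocks were not
# consistently full when this was written).
mb_per_day = 0.75 * 6 * 24              # ~108 MB/day at 1 block/10min
years = 3_000_000 / (mb_per_day * 365)  # 3TB expressed in MB

print(f"~{years:.0f} years")            # ~76
```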
Naturally, we all want to get to really big blocks in, say, 5 years, so that many, many more people would be able to use Bitcoin on a daily basis. That would make the issue a real thing before long, for sure.
The solution I see is one where we fix this in software. This is actually less hard than most believe, because a node doesn't need to keep more than a day's worth of downloaded blocks on hand. A node never refers to its own old blocks. The only reason to keep old blocks is to share them on the network with nodes that have yet to see them for the first time.
The solution I came up with is a smarter way of pruning. I called it Random Pruning; it uses the network as a big database. The effect is that a node can store only as much data as its owner is willing to store, without this hurting the Bitcoin network.
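Here is a minimal sketch of the Random Pruning idea, under my own assumptions about how blocks are selected; the actual proposal's selection rules may differ:

```python
# Sketch of Random Pruning: each node keeps a random subset of old
# blocks sized to its own disk budget, so the network as a whole still
# stores the full chain. (My illustration; the actual proposal's
# selection mechanics may differ.)
import random

def blocks_to_keep(chain_height: int, budget_mb: int,
                   avg_block_mb: float = 0.75) -> set:
    """Pick a random set of block heights that fits the disk budget."""
    capacity = int(budget_mb / avg_block_mb)
    heights = range(chain_height + 1)
    if capacity >= len(heights):
        return set(heights)
    # Each node draws its own subset, so subsets overlap across the
    # network and every block stays available somewhere.
    return set(random.sample(heights, capacity))

# A node donating 10GB at height 420,000 keeps ~13,000 of the blocks;
# peers fetch any given old block from whichever nodes kept it.
print(len(blocks_to_keep(420_000, budget_mb=10_000)))
```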

4. Fewer Nodes.

I don't really have much to add to Roger's point here. He also concludes this is really not likely to become an issue.
One thing I'd like to add is that when people see censorship happening (for instance, a country banning Bitcoin nodes), the result, in a network of the size and popularity of Bitcoin, has always been that more people start running nodes. Just because they can. Call it a variation of the Streisand effect, if you will.

Naturally, when Roger ends up concluding that we need bigger blocks, then I completely agree. And I'm working towards that end as much as I can.

I think what I'm trying to add is this: the idea that there is somehow a technical reason we can't scale is, in my technical expert opinion, just not true, and it will become less true in the coming year(s) as engineering efforts start up in the Bitcoin space.


And I noticed your post that linked to Steemit disappeared. Did the mods delete it?

Thanks for the notice, updated.

I guess you meant my post on Reddit. It turns out that using the words "I joined steemit" to indicate I got an account here was mostly seen by people as me signing a contract with them. Poor choice of words. So I deleted it (since you can't change topic names there). Maybe I should post it again with a non-confusing title :)

I see, thanks for the explanation.

I really hope that your ideas will be implemented in Bitcoin soon.
Thank you for your great work!
