Let's talk spec: #1 Asymmetric Moving Maxblocksize Based On Median Block Size

Published 2018-11-23 on Honest Cash

This article is part of the Let’s talk spec series. Please read the introduction if you haven’t.

Let us start with a topic that everyone is familiar with: the maximum block size limit. Everyone has an opinion on this one, so this should be exciting!

A new proposal is being written by a person who goes by the name of im_uname, called Asymmetric Moving Maxblocksize Based On Median Block Size. Quite the title, but the concept is simple.

The basic idea is that we look at recent block history, calculate recent usage, and use it to set a new limit that is much higher. We adjust every time a new block is found. With this, blocks are unlikely to ever fill up, even at peak load.

At the same time we get to have a sanity check in place. If someone tries to send us a 1GB block, while all recent blocks have been 5MB, we know it is garbage and need not waste resources downloading or validating it.

At the heart of this proposal, as drafted, is this rule:

We propose a dynamic limit to the block size based on the largest number out of the following:

1. Median block size over the last 12,960 blocks (about three months) multiplied by 10 and calculated when a block is connected to the blockchain.

2. Median block size over the last 52,560 blocks (about one year) multiplied by 10 and calculated when a block is connected to the blockchain.

3. 32,000,000 bytes (32MB).
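In other words, the limit that applies to the next block is simply the largest of those three numbers. Here is a minimal sketch of that calculation in Python; the function and variable names are mine, not from any actual node implementation:

```python
from statistics import median

BLOCKS_3_MONTHS = 12_960      # about three months of blocks
BLOCKS_1_YEAR   = 52_560      # about one year of blocks
FLOOR_BYTES     = 32_000_000  # the static 32MB floor
MULTIPLIER      = 10

def next_max_block_size(block_sizes):
    """block_sizes: sizes in bytes of past blocks, oldest first.
    Returns the limit that applies to the next block."""
    short_window = block_sizes[-BLOCKS_3_MONTHS:]
    long_window  = block_sizes[-BLOCKS_1_YEAR:]
    return max(
        median(short_window) * MULTIPLIER,
        median(long_window) * MULTIPLIER,
        FLOOR_BYTES,
    )
```

This is also where the sanity check from earlier comes from: if the recent median is around 5MB, the limit works out to 50MB, so a 1GB block can be rejected outright without wasting resources on it.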

A similar proposal was written and implemented by BitPay in the midst of the BTC scaling debate. Of course it had no chance back then, given that both a block size increase and protocol upgrades by hard fork in general were off the table in that camp. But now may be a good time to revisit this idea.

So what are the attack scenarios?

A malicious miner tries to increase the limit by throwing a lot of junk into his blocks. In theory this costs him nothing, because the transaction fees flow back to himself. To have any effect on the limit, though, he has to mine a lot of these junk blocks. In practice, such a block will propagate slowly and he may lose blocks in an orphan race. Propagation techniques like Graphene, Xthin and compact blocks will not help here, since they require other nodes to have seen the transactions before the block is mined.
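To get a feel for how resistant the median is to this kind of stuffing, here is a small illustrative simulation. The 1MB honest block size and the 10% attacker share are made-up figures, not data from the network:

```python
from statistics import median

WINDOW         = 12_960      # the three-month window from the proposal
honest_size    = 1_000_000   # assume honest blocks of roughly 1MB
attacker_share = 0.10        # attacker mines 10% of blocks in the window

attacker_blocks = int(WINDOW * attacker_share)
honest_blocks   = WINDOW - attacker_blocks

# The attacker pads every one of his blocks to the full 32MB limit.
sizes = [honest_size] * honest_blocks + [32_000_000] * attacker_blocks

print(median(sizes))                        # 1MB -- the median has not moved
print(max(median(sizes) * 10, 32_000_000))  # the limit stays at 32MB
```

In this simplified setting the median does not move at all until the attacker produces a majority of the blocks in the window, which means sustaining that share for months.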

There is still a limit! An evil actor is still able to fill the blocks and temporarily disrupt the network. The disruption will take the form of other people's transactions not confirming in a timely manner. This will be very costly for the actor in terms of fees, and increasingly costly as the network adapts to a higher limit. It is not a sustainable attack.
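For a rough sense of scale, here is some back-of-the-envelope arithmetic. The 1 satoshi/byte fee rate is just an illustrative assumption:

```python
SAT_PER_BYTE   = 1            # assumed fee rate, purely illustrative
BLOCK_LIMIT    = 32_000_000   # bytes, the current 32MB limit
BLOCKS_PER_DAY = 144          # roughly one block every ten minutes

cost_per_block = BLOCK_LIMIT * SAT_PER_BYTE / 100_000_000  # in BCH
cost_per_day   = cost_per_block * BLOCKS_PER_DAY

print(cost_per_block)  # 0.32 BCH per block of pure junk
print(cost_per_day)    # ~46 BCH per day, growing as the limit adapts upward
```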

The limit may become too high. This can be mitigated by miners choosing to mine smaller blocks than the actual hard limit.

I will weigh in with my opinion on this one. I will support any decent dynamic block size proposal, and this is a decent proposal. It is better than what we have today: a static, but human-configurable, limit.

There is the human side of it too. The best way to agree on a limit, is not to have to agree. Let the system adjust. Keep the humans out of it. We can focus on other fun things.

This change does not need to be permanent. Removing it is a simple soft fork, where we set the static limit back to 32MB.

Finally:

For the finer details, here is the specification. The earlier draft by BitPay is also a good read, and its FAQ covers, among other things, thoughts about miner incentives.

Does this break the economics of Bitcoin? Is it preferable that we manually adjust this limit? Should we aim for this change for May?

This article was originally published on yours.org, which is also where some discussion happened.

This article was also discussed on reddit.
