
Pre-proposal: Protecting the Network Against Flood Attack

Ricardo Temporal

New member
I have designed an economic model that calculates the block size dynamically in order to mitigate attacks against the network.
I intend to submit the proposal to a vote of the masternodes. If the proposal is approved, it does not mean that it will be implemented; it means that the DAO considers the proposal a possible way to go. The vote allows us to know the opinion of the investors without the noise caused by the debate.
The cost of the proposal is just the reimbursement of the fee if the proposal is approved.
The link to the paper follows in another post.
 
Abstract. The block size is analyzed from the perspective of a deliberate attack against the
network. We discuss the naive proposals that do not consider a deliberate attack. We argue for an
economic model that calculates the block size dynamically in order to mitigate such attacks.
 
The focus of this proposal is the defense of the network against its enemies. We take the
assumption that there are enemies out there, and that those enemies may have massive amounts of
resources. Some enemies haven't attacked yet, but we should be prepared in advance.
 
You should post this in the technical discussion section; over here, just tell us why you think your solution is better than the current one, in English, not in maths. Thanks
 
The economic incentives of the current solution are vulnerable. The enemy can drive up the cost of the system until owning a masternode is no longer profitable. Technically, the system can continue to run, but it will not be economically viable in the long term.

My solution keeps the costs low.
 
As pointed out, this should be submitted as a DIP in the technical forums.

Personally, I would prefer a solution for dealing with micro-transactions. Recently, for example, Coinfirm told us they were using the Dash blockchain as a non-financial ledger, issuing thousands of micro-transactions. I support their idea, but it's clear a technical solution is needed to prevent bloat and reduce system load.
 
The demand is unlimited, but the resources are limited. Scarcity is considered 'the basic economic problem'; check the link below for reference. For example: with the current block reward of 1.8 Dash per week, an increase in the block size cannot be allowed to push the cost of ownership above that value, or running a masternode stops being profitable. People presume that the price will go up at the same speed as the demand in order to offset the cost. But that assumption is false, because it does not consider a deliberate attack. The enemy can buy just a small amount of the coin in order to broadcast a massive demand (a rough worked example follows the reference link below).

Perhaps this should be a DIP instead of a proposal. Ok.

https://www.investopedia.com/terms/s/scarcity.asp
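
To make the arithmetic behind this concrete, here is a rough back-of-the-envelope sketch in Python. Every figure in it is a hypothetical placeholder except the 1.8 Dash per week reward mentioned above, and splitting fees evenly across masternodes is a simplifying assumption; it only illustrates the shape of the argument, not numbers from the paper.

# Back-of-the-envelope sketch of the flood-attack economics described above.
# All figures are hypothetical placeholders except the 1.8 Dash/week reward
# quoted in this thread; they only illustrate the shape of the argument.

BLOCKS_PER_WEEK = 7 * 24 * 60 / 2.5      # one block every 2.5 minutes
MN_REWARD_PER_WEEK_DASH = 1.8            # masternode income (from the post)
MASTERNODE_COUNT = 4500                  # hypothetical network size
FEE_PER_KB_DASH = 0.00001                # hypothetical spam fee rate
COST_PER_GB_DASH = 0.05                  # hypothetical storage/bandwidth cost

def weekly_numbers(block_size_mb):
    """Attacker spend, per-masternode fee share, and per-masternode cost (Dash/week)."""
    data_gb = block_size_mb * BLOCKS_PER_WEEK / 1024
    attacker_spend = block_size_mb * 1024 * BLOCKS_PER_WEEK * FEE_PER_KB_DASH
    # Fees are shared across the whole network, but every node bears the
    # full storage/bandwidth burden of the spam.
    fee_share_per_mn = attacker_spend / MASTERNODE_COUNT
    cost_per_mn = data_gb * COST_PER_GB_DASH
    return attacker_spend, fee_share_per_mn, cost_per_mn

for size_mb in (2, 32, 400):
    spend, fee_share, cost = weekly_numbers(size_mb)
    net = MN_REWARD_PER_WEEK_DASH + fee_share - cost
    print(f"{size_mb:>4} MB blocks: attacker pays {spend:8.1f} Dash/week, "
          f"a masternode nets {net:+7.2f} Dash/week")

Under these placeholder numbers the attacker's total spend grows with block size, but so does every node's cost, while the fixed reward does not; past some block size each masternode runs at a loss even though the spam was "paid for".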
 
I agree that the decision whether to implement or not should be made via a DIP. But I read the OP as suggesting that the MNOs vote on this proposal merely to get the opinion/preferences of the MNOs. I see no harm in finding out. I would vote "yes" just out of curiosity.
 
Wouldn't it be the case that this solution is more dangerous for the network than the problem it tries to solve? A large mempool is a problem, but no danger to the blockchain. Unlimited growth of the block size could stop or damage the network. Imagine how many nodes would drop off the network (crash or fork) right now if someone pushed a 1 GB block through every 2.5 minutes.
 
You might have read only the title. I don't blame you; it's my communication mistake, and I must fix the title, sorry. The dynamic part of the block size is about decreasing, not increasing, the block size, exactly to avoid damaging the network when somebody is pushing a 1 GB block through every 2.5 minutes.

In version 12.2, the maximum block size is 2 MB. In this case, the dynamic model would decide whether or not to use that full capacity. The model would only decrease some blocks to 1 MB; it would never increase them to 3 MB, as that is prohibited by the protocol.
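
To illustrate what "decreasing, not increasing" means in practice, here is a minimal sketch. The trigger condition (a large mempool dominated by very cheap transactions) is a hypothetical stand-in, not the model from the paper; only the 1 MB floor and the 2 MB protocol maximum come from the discussion above.

# Minimal sketch of a dynamic block-size cap that can only shrink below the
# protocol maximum, never exceed it. The trigger condition is a placeholder;
# the actual decision model is described in the linked paper.

PROTOCOL_MAX_MB = 2.0   # hard cap in v12.2 (from the post above)
FLOOR_MB = 1.0          # minimum capacity used in the example above

def next_block_cap_mb(mempool_bytes, median_fee_per_kb, cheap_fee_threshold):
    """Return the size cap for the next block, in MB.

    If the mempool is dominated by very cheap transactions (a crude proxy
    for a flood attack), fall back to the floor; otherwise allow the full
    protocol maximum. The result never exceeds PROTOCOL_MAX_MB.
    """
    looks_like_flood = (mempool_bytes > 10 * PROTOCOL_MAX_MB * 1024 * 1024
                        and median_fee_per_kb < cheap_fee_threshold)
    cap = FLOOR_MB if looks_like_flood else PROTOCOL_MAX_MB
    return min(cap, PROTOCOL_MAX_MB)

# A 50 MB mempool of near-zero-fee transactions is throttled to 1 MB,
# while ordinary traffic still gets the full 2 MB.
print(next_block_cap_mb(50 * 1024 * 1024, 0.000001, 0.00001))  # -> 1.0
print(next_block_cap_mb(3 * 1024 * 1024, 0.0001, 0.00001))     # -> 2.0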
 
I've made some changes to the document according to the feedback.

1) Changed the title: "Protecting the Network Against Large Blocks".
2) Explanation: "The block size will vary between the minimum and the maximum capacity of the network...".
3) Explanation: "The enemy can buy just a small amount of coins to broadcast a massive amount of transactions..."

The link is updated.

DIP001 was itself first a proposal approved by the vote before being turned into a DIP. I think the core team should not even consider implementing such a change without the masternodes' approval. I will not do anything for a while; let me improve the communication according to the feedback.
 
I am struggling to see the problem with increasing the block size when the current size fills up.
If the spammer pays for a full block of spam, the fees generated should be enough to pay for the increase in network capacity; that is the whole premise for scaling in the future.
If the fees generated by a full block aren't enough to pay for the network capacity to host it, then the whole blockchain model will break down in a few years anyway and we will have considerably larger problems than spam transactions.
If paying that cost becomes prohibitive to the spammer and they stop the attack, we will just have large, relatively empty blocks, which isn't actually a problem, since large empty blocks don't take more resources to host than smaller blocks.
 
... the fees generated should be enough to pay for the increase in network capacity; that is the whole premise for scaling in the future.

The network capacity is being paid for by inflation; the fees are not enough to replace the inflation. The fees can be increased in the future to keep that premise true, OK. But this is brute force. A model can avoid the problem while keeping the fees low.
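
For a sense of scale, here is a tiny sketch comparing the two funding sources. Both numbers are hypothetical placeholders, not measured values; the only point is that as long as fee revenue per block is a small fraction of the block subsidy, capacity is effectively being funded by inflation.

# Rough comparison of the two funding sources: inflation (the block subsidy)
# versus transaction fees. Both figures are hypothetical placeholders.

BLOCK_SUBSIDY_DASH = 3.3          # hypothetical subsidy per block
AVG_FEES_PER_BLOCK_DASH = 0.02    # hypothetical total fees per block

fee_share = AVG_FEES_PER_BLOCK_DASH / (BLOCK_SUBSIDY_DASH + AVG_FEES_PER_BLOCK_DASH)
print(f"Fees fund about {fee_share:.1%} of each block's reward; "
      f"the rest comes from inflation.")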
 
My point of contention with filtering the blockchain for large blocks of cheap transactions is that it can filter out valid transactions as well. If it is not necessary to do so, I don't think we should ever filter out cheap transactions; simply let the current system of competing fees handle any excess.

If we implemented your system, what would prevent a hypothetical attacker from spending more on an attack in order to get large chunks of cheaper valid transactions thrown out by the algorithm?

I am not actually against the idea if I can see the logic behind filtering out transactions, but as the incentives stand, I don't see how it would be an improvement on our current setup.
 