
Bringing Festy & Dash to the public - Irish Pubs Global Market

@Chuck Williams I appreciate your input, but I'm unsure what your recommendation is. Are you agreeing with Festy that Dash is currently not up for the job / that its future is unclear?

Dash is up for the job of managing the transaction volume, but not the ancillary data, IMHO. Some back-of-the-napkin calcs I shared w/ Graham earlier:

______________________
With InstantSend, the actual volume of transactions that can be carried is higher than 40 tps; I'm asking the team for data. It's also been reactivated with the most recent upgrade.

InstantSend transactions can be "loaded up" to the level of available RAM in the mempool (superfast), then whittled down over time as they're confirmed into blocks (~4K transactions every 2.5 minutes, ~40 tx/s). This will be doubled with the next block upgrade ([5MB] 80 tx/s), but the mempool may be able to handle more than 2x even prior to that upgrade [pending core team confirmation].

One interesting calculation: if 50,000 people showed up at an event, processing 1 transaction for each of them in a span of 10 minutes only takes 83.3 tx/s.
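As a quick sanity check, that figure can be reproduced directly (the attendee count and time window are the hypothetical numbers from this post, not measured data):

```python
# Back-of-the-napkin check: 50,000 attendees, one transaction each,
# all processed within a 10-minute window.
attendees = 50_000
window_s = 10 * 60  # 10 minutes in seconds

tps_needed = attendees / window_s
print(f"{tps_needed:.1f} tx/s")  # 83.3 tx/s
```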

So in determining need, it would be interesting to know how many event-goers could plausibly be transacting simultaneously from a hardware standpoint (POS terminals, or "top-up"/"cash-out" events)...

For instance, I think the average stadium may have 10-20 ticket booth operators and maybe 100-200 operating storefronts. That would be a maximum of 220 simultaneous transactions per second, if each unit were conducting 1 sale per second, right?

...but to get a little more realistic, let's say 1 sale every 15-30 seconds?

That would mean, in the above scenario, you'd need roughly 7-15 transactions per second of capacity (worst case) to accommodate all the hardware.
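The capacity range above falls out of the hypothetical stadium figures (these unit counts and sale rates are assumptions from this thread, not real venue data):

```python
# Hypothetical stadium: 20 ticket booths + 200 storefronts = 220 POS units.
booths, storefronts = 20, 200
pos_units = booths + storefronts

# Absolute ceiling: every unit completes one sale per second.
peak_tps = pos_units / 1.0

# More realistic: one sale every 15-30 seconds per unit.
tps_fast = pos_units / 15  # ~14.7 tx/s
tps_slow = pos_units / 30  # ~7.3 tx/s

print(peak_tps, round(tps_fast, 1), round(tps_slow, 1))  # 220.0 14.7 7.3
```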
Also, in the above scenario, it would take about 2 hours of collective waiting in line for 50,000 people to be processed through 1 transaction at each of those 220 stations at 30 seconds per transaction. That comes out to about a 2-minute wait in each line - sound about right? So in the above scenario you could reasonably run 2 events at 50,000-person capacity, with 220 POS systems at each event running on-chain, and just barely push the limits of InstantSend.
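The queue-clearing estimate works out like this (again using the hypothetical 50,000 / 220 / 30-second figures from this scenario):

```python
# Queueing estimate: 50,000 attendees served by 220 POS units,
# at 30 seconds per transaction.
attendees, pos_units, secs_per_tx = 50_000, 220, 30

people_per_line = attendees / pos_units                 # ~227 per unit
hours_to_clear = people_per_line * secs_per_tx / 3600   # time to serve a full line

print(f"~{people_per_line:.0f} people per line, "
      f"~{hours_to_clear:.1f} h to clear")  # ~227 people per line, ~1.9 h to clear
```

So each station works through its line in roughly 1.9 hours, consistent with the "about 2 hours" figure above.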

That's not accounting for mempool limits. I'm assuming they're greater than on-chain transaction capacity, since storage is slower than memory and data is pretty easy to propagate if you don't have to write or confirm it.

That's also not accounting for bandwidth at each location. But I think that the human limiting factor of the actual transaction (product selection to conversion) is your biggest bottleneck.

However - to maintain control over "...volatility, availability, usability..."

...I think it's a good idea to institute your own control & backup systems that write to the Dash chain only on "significant events"...

Especially since you're doing a "value add" of recoverability of funds / card replacement, etc.
___________________________


So transactions, yes - but where to store the customer data? Credit card transactions for top-ups? Record keeping for secondary API integrations?

Dash doesn't do this right now... but with Evolution - there will likely be a very easy way.

That's why I think the path selected is good, for now. They might be able to dump it in the future and go 100% on-platform - but they'll need something to do business in the meantime. I think a simple sidechain with decent security measures is a good interim solution for the ancillary data.
 
We really appreciate all the comments so far, both positive and negative. This has been an insightful thread, and it opens up some broader questions about the on-chain vs. off-chain debate and the storage of customer data, which is an integral question in the blockchain industry at the moment.

I have made a video replying to some of the most pertinent questions thus far, which you can watch here:

Thank you for the feedback and keep the comments coming!
 