EOS.IO Development Update
Our team has been working around the clock to make our EOS.IO software the best it can be. Those following along on GitHub should see some substantial improvements in the structure of the code as we implement many of the things we discussed in the last update.
EOSIO BIOS
A computer's BIOS is built into the hardware and is the first thing a computer loads prior to starting the operating system. This week we continue with the operating system metaphor to make the EOSIO blockchain bootstrap process as simple as possible, like a computer BIOS. The blockchain now starts up with a very simple initial state:
- a single account (eosio.system)
- a single private key
- a single block producer
This initial account is like the root account on Linux systems: it has unlimited power until it yields that power to a higher-level operating system smart contract. From this initial state, the @eosio.system account will upload the operating system smart contract that implements the following:
- staking for voting, network bandwidth, CPU bandwidth, RAM, and storage
- producer and proxy vote creation
You can view this initial state as an embryonic stem cell capable of adapting an EOSIO-based blockchain to any number of use cases and governance structures, all of which can be updated and tweaked without requiring any hard forks.
This approach also makes the core EOSIO software simpler and easier to test.
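The boot sequence above can be sketched as a tiny state machine. This is only an illustration of the idea, not the real EOSIO data structures: the account name `eosio.system` comes from the post, while the field names and the `set_code` helper are hypothetical.

```python
# Minimal sketch of the BIOS-style boot state described above.
# `eosio.system` is from the post; all other names are illustrative.
state = {
    "accounts": {"eosio.system": {"key": "EOS_INITIAL_PUBLIC_KEY"}},  # single account, single key
    "producers": ["eosio.system"],                                    # single block producer
    "code": {},                                                       # no contracts deployed yet
}

def set_code(state, account, wasm_bytes):
    """The root account uploads the 'operating system' contract; from then on
    staking, voting, etc. are governed by contract code, not the core software."""
    state["code"][account] = wasm_bytes

set_code(state, "eosio.system", b"<system contract wasm>")
```

Because the transition happens through a contract upload rather than hard-coded logic, a chain can later replace or tweak the "operating system" without a hard fork.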
Dynamic Number of Block Producers
The primary outcome is that EOSIO blockchains now support a dynamic number of block producers, which can be changed with a simple update to the @eosio.system smart contract. We will still default this to 21 producers, but it is no longer hard-coded.
The primary reason for making it dynamic is that for many private blockchains, 21 producers is overkill. Enterprises running private blockchains may prefer just a couple of producers, and test networks might want only a single producer.
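A sketch of what "no longer hard-coded" means in practice: the schedule size becomes a parameter of producer selection rather than a compile-time constant. The function and names below are illustrative, not the actual @eosio.system contract interface.

```python
# Illustrative only: producer-schedule size as a contract-level parameter.
DEFAULT_PRODUCER_COUNT = 21  # the default, but changeable by contract update

def top_producers(vote_totals, count=DEFAULT_PRODUCER_COUNT):
    """Select the `count` producers with the most stake-weighted votes.
    A private chain could call this with count=1; a test net with count=3."""
    ranked = sorted(vote_totals, key=vote_totals.get, reverse=True)
    return ranked[:count]

votes = {"alice": 300, "bob": 200, "carol": 100, "dave": 50}
schedule = top_producers(votes, count=2)  # a two-producer private chain
```

Since the count lives in contract state, changing it is a contract update rather than a hard fork.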
Metering
Historically we have indicated that each transaction would have at most 1ms of runtime as measured subjectively by the block producer. We realized there is demand for some transactions that could take up to 50ms to run, and we also want to incentivize efficiency by encouraging developers to design transactions that take less than 50µs to run. Under our original model all transactions consumed the same CPU allowance whether they ran for 50µs or 1ms, meaning there was no incentive to optimize below 1ms.
Because runtime is subjective and can vary depending upon other activities running on the same computer, it is not possible to generate an objective and reproducible measure of runtime.
We realized that at no additional cost, we could modify our existing time-based rate limiter to a limiter that would calculate an objective estimate of the number of WASM instructions executed. This is similar to how Ethereum measures gas consumption. With this new objective measure we can rate limit CPU just like we rate limit bandwidth.
The block producers will use the same "dynamic oversubscription" algorithm for CPU usage that they use with network bandwidth. This means that while the network has spare CPU capacity users can get more CPU-per-staked token than they would be guaranteed to get during full congestion.
The block producers will still implement a subjective runtime limit in addition to the CPU instruction counting. This subjective limit protects the network from those who would abuse the metering algorithm by favoring operations that are cheap in instruction count but expensive in real runtime.
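The two-layer limit described above can be sketched as follows. The budget numbers and function names are illustrative assumptions, not EOSIO values: an objective, reproducible instruction count that all nodes agree on, plus a subjective wall-clock backstop each producer applies locally.

```python
import time

# Illustrative two-layer metering: all numbers are made up for the sketch.
INSTRUCTION_BUDGET = 1_000_000   # objective: deterministic, consensus-visible
SUBJECTIVE_TIME_LIMIT = 0.050    # seconds: producer-local wall-clock backstop

def execute(ops):
    """Run a list of (cost_in_instructions, fn) pairs under both limits.
    The instruction count is reproducible; the time check is not, and
    only protects the producer running this copy of the code."""
    used = 0
    start = time.monotonic()
    for cost, fn in ops:
        used += cost
        if used > INSTRUCTION_BUDGET:
            raise RuntimeError("objective instruction budget exceeded")
        if time.monotonic() - start > SUBJECTIVE_TIME_LIMIT:
            raise RuntimeError("subjective runtime limit exceeded")
        fn()
    return used
```

The objective count is what gets rate-limited like bandwidth; the subjective check simply lets a producer drop a transaction whose real runtime far outstrips its counted cost.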
Separation of CPU and Network Bandwidth
In previous updates we indicated that we would separate out RAM, Storage, and Bandwidth where CPU/Network were both considered part of bandwidth. We realized that some applications, like Steem, might have high network bandwidth (for posts) and low CPU bandwidth, whereas other applications might have low network bandwidth (exchange orders), but higher CPU bandwidth (order matching). This means that one-size-fits-all pricing and/or staking does not make sense.
To keep things simple, the user interface can still bundle these things together for normal users; however, power users now have more price flexibility.
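The split can be pictured as two independently staked resource pools. Account names and field names below are hypothetical; the point is that a post-heavy app weights NET while a compute-heavy app weights CPU, and a wallet UI can still hide the split behind a single bundled stake.

```python
# Illustrative sketch: CPU and network bandwidth staked separately.
def stake(ledger, account, cpu_tokens, net_tokens):
    ledger[account] = {"cpu": cpu_tokens, "net": net_tokens}

def stake_bundled(ledger, account, total_tokens):
    """What a simple UI might do for normal users: split the bundle evenly."""
    stake(ledger, account, total_tokens / 2, total_tokens / 2)

ledger = {}
stake(ledger, "steem.like.app", cpu_tokens=10, net_tokens=90)  # bandwidth-heavy
stake(ledger, "exchange.app",  cpu_tokens=90, net_tokens=10)   # compute-heavy
stake_bundled(ledger, "casual.user", total_tokens=10)
```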
Transaction Compression
In the process of adding support for the C++ standard library (STL) we noticed that smart contracts could get quite large (50kb) and would therefore consume significant network bandwidth. It is conceivable that more complex contracts might grow to be over 200kb. We also realized that many applications, such as Steem, bundle very compressible content into transactions.
We added support for zlib compression of transactions which can provide a 60% or more reduction in bandwidth usage for smart contract uploads and potentially higher for Steem-like content.
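A quick demonstration of why this pays off, using Python's standard `zlib` module on a deliberately repetitive stand-in for contract code. The payload is fabricated for the sketch; a payload this repetitive compresses far beyond the 60% figure quoted for typical uploads.

```python
import zlib

# Stand-in for repetitive contract code or markdown post content.
payload = b"void apply(uint64_t code, uint64_t action) { /* ... */ }\n" * 500

compressed = zlib.compress(payload, level=6)
savings = 1 - len(compressed) / len(payload)  # fraction of bandwidth saved

# Round-trip check: compression must be lossless for transactions.
assert zlib.decompress(compressed) == payload
```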
Network Updates
The P2P network team has been busy updating the code to enhance performance and stability. This week they made significant progress on the following:
- block summary - when a block is broadcast, only the transaction IDs are included rather than retransmitting all the transactions in the block. This will reduce bandwidth usage by almost 50%.
- large message support - broadcasting large messages (like 50kb smart contracts) needs a different network protocol than small messages (like 200 byte transfers).
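The block-summary idea works because peers have usually already received each transaction from the earlier broadcast, so a block only needs IDs that peers resolve against their local pool. A minimal sketch, with all function names assumed for illustration:

```python
import hashlib

def tx_id(tx_bytes):
    """Illustrative transaction ID: a SHA-256 digest of the raw transaction."""
    return hashlib.sha256(tx_bytes).hexdigest()

def block_summary(transactions):
    """Instead of retransmitting full transactions, send only their IDs."""
    return [tx_id(tx) for tx in transactions]

def reassemble(summary, mempool):
    """A peer rebuilds the full block from transactions it already holds."""
    return [mempool[txid] for txid in summary]

txs = [b"transfer alice->bob 10", b"transfer carol->dave 5"]
mempool = {tx_id(tx): tx for tx in txs}           # filled by earlier broadcasts
rebuilt = reassemble(block_summary(txs), mempool)  # full block, IDs only on the wire
```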
Conclusion
Our development team is working to make EOSIO the most efficient, general purpose, and flexible platform to date.
Dan,
Please consider ZStandard, LZ4, and LZMA2 as compression options. All have friendly licenses. At a minimum, ZStandard is faster and more effective than zlib. LZ4 is the fastest and has the smallest memory footprint, and LZMA2 will give you the best compression, which is great for a smart contract that compresses once and decompresses many times.
Having a compression type in the smart contract data structure would at least allow adding new compression types in the future.
We designed it for multiple standards
yeah multiple standards is perfect for this system.
You are very considerate, thanks sir
Designing it for multiple standards is your best bet!
ZSTD is a really strong balance of speed+compression.
Very cool.
Interestingly, EOS and Steem feature among the most promising altcoins of 2018, says Weiss.
Good news!!!
An in-depth look at this issue will be of real value to the ecosystem in general.
A dynamic number of block producers will be valuable for the pace of transactions, which will be needed for a great future for EOS.
Bandwidth should also be given bigger consideration.
Network stability will be a feat towards achieving sustainable value too.
All the best!!!
Exactly bro! You speak my mind!
I gain so much from the awesome post. Thumbs up
Dan, you make it sound all so easy : ) Looking forward to an enlightened society, thanks for putting up the effort!
@Ned needs to start delegating his SP in quantities of 5k to community engagers like I've been doing. Delegating more than that (e.g. 500K) is just too damn much for an individual.
Really a massive effort. He has great ideas.
@dan, this is a race against the clock, at that speed we'll be reaching a zero point in time! ;)
Thanks a lot for all this incredible information and I want to congratulate you and your team for this phenomenal work you are all doing. I also love being educated in the matters at hand. Brilliant wording, equally powerful learning. I can't thank you enough for this either.
Namaste :)
Keep up the good work lads, EOS is one of my very few favorite blockchains and along with Cardano have huge potential to shape the future.
I know my upvote won't affect the reward pool... but you deserve it.
He really does deserve it. Kudos to @dan for this excellent EOS project
Even my upvote isn't going to make much difference... but when someone deserves it, you can't keep your hands away from upvoting... :p
It is really great that he declined the post rewards to the Steemit reward pool... hats off. Great Steemians are like this.
BTW nice work... salute
Ya lafmaya... Che kyo jumpa maermic .. Dont be great ... Just Grow some balls.
Steem is mentioned 4 times in your post, @dan (I include the tag). So it would be safe to conclude that a Steem-like dapp will run on EOS, I guess.
Why would we need a SteemOnEOS when we already have Steem? What would be the advantages?
... if they rebuild steemoneos ... airdrop to eos holders ... it will take advantage of eos distribution ... a lot less whales... votes will be worth more and minnows will now have a higher stake.
I mean, why wouldn't Steem just migrate to Eos? It already has Graphene infrastructure. It makes no sense to build another coin when Steem is already ahead and can be converted faster than a new one can be built..
whaleees
STEEM doesn't come with any EOS with which to board. And STEEM dollars won't be compatible with EOS dollars.
Eos isn't even out yet so idk what that means... It's built on the same infrastructure. I'm not sure I understand what that second part means either.. Why wouldn't you be able to trade steem dollars for Eos dollars on an Eos Dapp? Seems perfectly possible to me. Have you read the whitepapers and know something I don't?
Yeah, that could be possible advantages, but actually I'm really curious about @dan's perspective on this.
Very interesting perspective, I wouldn't be surprised to see it happen.
GREAT answer!!! Thank you, namaste :)
My limited understanding is that EOS is a smart contract platform, I'm not sure if it would be suited for developing a social media app onto... Would be grateful for more input
On EOS you can run multiple Steemit-like platforms, all running as Dapps. EOS can run Ethereum as well without a hitch; you could essentially create an Uber-type platform too, without concern for speed or bandwidth.
I believe it would be possible to create a sophisticated social media app in which you could curb the self-aggrandising that happens on Steem. For example, rewards could be based on a direct transfer of coins (rather than a common pool), and to motivate this you could add a curator reward clause as part of the transaction.
Excellent question! Namaste :)
Ned won't be there
Hey dan,
thanks a lot for the regular updates. It's great to hear about the improvements to the software!
I'd love to hear more on the 'human issues' side, if you find the time. Thomas Cox mentioned you had a conversation on the issue of vote-buying. It'd be great to hear your thoughts on this as well!
Awesome work by Dan and the Block.One team - the dynamic number of block producers is very welcome and a big step in the right direction.