A carousel that loves the wind in its hair...


What if there are projects that actually get enough adoption that they produce 'lift,' and go
to the moon? Wouldn't we all like to know which one? Which one will transform our culture and provide a new path into the future?

But what happens when the design of the system actually ends up changing us in ways we're not expecting? How can we account for that? Cryptography and market design are super important in blockchain technology, but I feel there needs to be more discussion about how the interfaces and services we're designing will actually affect human behavior. It's the same dilemma we face with technologies like VR, psychoactive drugs, or cell phones: how do we protect ourselves from the negative side effects of a technology we don't fundamentally understand?

There could be a way through the dilemma, provided we look at it through the lens of what is more commonly referred to as game theory. Game theory, I believe, is the best science we have of subjective experience, because it looks at behavior, and not just individual behavior but behavior in small groups. Game theory on the scale we expect blockchain tech to hit is going to be a completely different 'ballgame,' since it will be the biggest game in history. Interoperability has been thrown around a lot lately, and I'd have to say it's probably our biggest challenge.
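To make that concrete, here's a minimal sketch of the kind of small-group analysis game theory gives us: a textbook prisoner's dilemma in Python, with a brute-force check for Nash equilibria (outcomes where neither player gains by switching alone). The payoff numbers are the standard classroom ones, nothing blockchain-specific.

```python
# Classic two-player prisoner's dilemma, plus a brute-force Nash check.
# Payoffs are the textbook values, used purely for illustration.
from itertools import product

ACTIONS = ["cooperate", "defect"]

# PAYOFFS[(row_action, col_action)] = (row_payoff, col_payoff)
PAYOFFS = {
    ("cooperate", "cooperate"): (3, 3),
    ("cooperate", "defect"):    (0, 5),
    ("defect",    "cooperate"): (5, 0),
    ("defect",    "defect"):    (1, 1),
}

def is_nash(a_row, a_col):
    """True if neither player can improve by unilaterally switching actions."""
    row_payoff, col_payoff = PAYOFFS[(a_row, a_col)]
    best_row = max(PAYOFFS[(alt, a_col)][0] for alt in ACTIONS)
    best_col = max(PAYOFFS[(a_row, alt)][1] for alt in ACTIONS)
    return row_payoff >= best_row and col_payoff >= best_col

for a_row, a_col in product(ACTIONS, ACTIONS):
    tag = "  <-- Nash equilibrium" if is_nash(a_row, a_col) else ""
    print(f"{a_row:>9} / {a_col:<9} payoffs {PAYOFFS[(a_row, a_col)]}{tag}")
```

The punchline is the familiar one: both players defecting is the only stable outcome even though both cooperating pays everyone more, which is exactly the kind of misaligned-incentive trap I'm worried about at blockchain scale.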

Communication between chains means small groups interacting with other small groups, or big with small, or big with big, each with different incentives and different compatibility. If each individual group has a different rewards system, wouldn't behavior also need to be accounted for in interoperability? For example, take a blockchain that runs a dApp for an online decentralized service marketplace: communication with a different blockchain, say a decentralized currency exchange, means one group's behaviors (like FUD selling) affect the other whenever there are periods of stress or abnormal activity. This can lead to interchain manipulation. But what's worse than manipulation?
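As a purely hypothetical illustration of that spillover (toy numbers, not a model of any real chain), here's a small Python simulation where a FUD sell-off on "Chain A" (the service marketplace) bleeds into "Chain B" (the exchange) through a coupling factor standing in for their bridge or shared liquidity.

```python
# Toy cross-chain contagion sketch. All parameters are made up:
# 'coupling' is how strongly Chain A's sell pressure moves Chain B,
# 'decay' is how quickly the panic subsides.

def simulate(shock, coupling=0.5, decay=0.3, steps=8):
    price_a, price_b = 100.0, 100.0
    pressure_a = shock                       # FUD event hits Chain A's users first
    for t in range(steps):
        pressure_b = coupling * pressure_a   # spillover through the bridge/exchange
        price_a *= (1 - pressure_a)
        price_b *= (1 - pressure_b)
        pressure_a *= (1 - decay)            # panic fades over time
        print(f"t={t}  Chain A={price_a:6.2f}  Chain B={price_b:6.2f}")

simulate(shock=0.05)  # a 5% sell-off that started on Chain A alone
```

Even in a model this crude, Chain B takes a hit from a panic it had nothing to do with, which is the behavior I think interoperability designs need to account for.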

An article on a prototype learning program funded by Elon Musk surprised me the other day: because of how its reward parameters were designed, the AI decided to run around in circles collecting rewards instead of trying to outpace the others. With the craze for cryptocurrencies bringing out the pyramid/Ponzi schemes in us, what if we've made ourselves a 'game' that pushes us to reinforce behaviors that go against our nature? We know money makes some humans do crazy things, so how would we know we're not designing systems that reward bad behaviors? What other incentives besides currency can game theory use to steer us away from long-term failure? I wonder if this is just the beginning...
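Here's a toy version of that "circles for rewards" failure, with made-up numbers: the designer wants the agent to finish the race, but the reward function pays per checkpoint collected, so looping through respawning checkpoints beats actually finishing.

```python
# Toy reward-hacking sketch (hypothetical rewards, not the actual system
# from the article): the exploit scores higher than the intended behavior.

def total_reward(plan, checkpoint_reward=10, finish_bonus=50):
    """Sum the reward an agent collects for a sequence of actions."""
    reward = 0
    for step in plan:
        if step == "checkpoint":
            reward += checkpoint_reward
        elif step == "finish":
            reward += finish_bonus
    return reward

# Intended behavior: hit 3 checkpoints on the way, then finish the race.
race_to_finish = ["checkpoint"] * 3 + ["finish"]

# Exploit: circle endlessly through a cluster of respawning checkpoints.
loop_for_tokens = ["checkpoint"] * 12

print("finish the race:", total_reward(race_to_finish))   # 80
print("loop for tokens:", total_reward(loop_for_tokens))  # 120
```

The agent isn't broken; the reward design is. Swap "checkpoint tokens" for token rewards, airdrops, or referral bonuses and the analogy to crypto incentive design writes itself.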

Anyone have any thoughts? I wonder if this is another aspect of blockchain tech that really needs to be an iterative process, with less pressure on expectations for productivity right out of the gate. Comment below, and thanks for reading!
