CONTEXTUAL THINKING. —「More about the world-is-a-simulation conjecture」

in #steemstem · 7 years ago (edited)



Updated 2018.3.18, 2018.3.21

「What hasn't already been said」

 

You know you want to talk about it. At this point everybody has heard of the world-is-a-simulation conjecture, whether in science fiction, in philosophy, or, in the last few years, in popular conversation. Like many skeptical arguments, it is considered difficult either to support or to refute analytically.

I present some technical points that are relevant. My thoughts after reading a well written post on this topic are these:

 

〈1〉
 

The simulation argument was also presented in a rather famous short story — Vernor VINGE, The cookie monster, Analog 123(10):8–40, 2003.10 — and a well known novel — Robert Charles WILSON, Darwinia, New York: Tor, 1998.

 

〈2〉
 

Much of what inspires the simulation hypothesis is metaphysical disquiet about the foundations of quantum mechanics. If you can imagine that measurement is necessary for existence, then you might be comfortable with the nonlocal (quantum blob) or, equivalently, participatory nature of the universe at the smallest level. After all, both John LOCKE and James HUTTON argued that interaction and measured difference are primary to existence, a point also well known later, for example to Satosi WATANABE (Knowing and guessing, New York: Wiley, 1969). What is not measured at all moves no dial, and therefore does nothing, so what does it mean for it to exist? At the smallest scale, where high frequencies mean large energies by E = hν, if the measuring device doesn't measure the system, then nothing measures it, and we get counterintuitive behavior, because that situation never comes up in the classical macroscale world.
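To make the scale concrete, here is a back-of-envelope computation (my own sketch, with illustrative numbers): by E = hν, a single quantum of visible light carries roughly a hundred times the thermal energy scale at room temperature, so at that level the measuring interaction is never negligible.

```python
# Back-of-envelope scale check for E = h * nu (illustrative values).
h = 6.62607015e-34            # Planck constant, J*s
k_B = 1.380649e-23            # Boltzmann constant, J/K

nu_green = 6.0e14             # green light, ~500 nm, in Hz
E_photon = h * nu_green
E_thermal = k_B * 300         # thermal energy scale at room temperature

print(f"one green photon: {E_photon:.2e} J")    # ~4.0e-19 J
print(f"k_B * 300 K:      {E_thermal:.2e} J")   # ~4.1e-21 J
print(f"ratio:            {E_photon / E_thermal:.0f}")  # ~96
```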

Some people begin thinking of simulations because the future-determining-the-past aspect (made famous by John WHEELER in his lectures on a measurement today determining the state of a star in the past) seems very odd. It seems less odd if you view the world as a set of intersecting experimental setups, each consisting of measuring apparatus distributed over space and time, in which we are simply observing nonlocalities. (Basil HILEY prefers to discuss quantum blobs rather than participations. He showed that the average of even just a few quanta often behaves entirely deterministically as a unit.)

John WHEELER famously argued for over forty years that everything is a statistic of bits, etc. Roger PENROSE and David FINKELSTEIN conjectured, at one point or another, that all bosons are statistics of fermions, and that the classical world emerges from graphs of prespinor measurements that form linked even tuples of spinors. You get the same supermanifolds with Grassmann operators as in string theory, but this develops statistically into a Clifford algebra at the scale of larger particles and then into a Minkowski metric at the largest scales, which is unexpected yet desirable. In that picture, perturbations of quantum events are particles, and fields are monoidal categories with a few basic types of particles/operators and functors between each pair of categories. Spin, charge, and fermions are stable; bosons are unstable; and the stable statistically generate the unstable, and so on.

Bob COECKE and Aleks KISSINGER (Picturing quantum processes, Cambridge: University Press, 2017), for example, are following up on that. Stephen WOLFRAM (A new kind of science, Champaign: Wolfram Media, 2002) has his own thoughts, developed in his excellent book.

Michael ATIYAH (The interaction between geometry and physics, in The unity of mathematics, Boston: Birkhäuser, 2006), Heinz von FOERSTER, and Gordon PASK originally suggested, and it has been periodically repeated, that much of the indeterminism arises because we lack sufficiently accurate knowledge of the past of particles: the past matters more for prediction than we admit. Giacomo M. D'ARIANO, G. CHIRIBELLA, and P. PERINOTTI (Quantum theory from first principles, Cambridge: University Press, 2017) recently published a great book.

And there are other approaches, such as the observation that if we admit general relativity, we get indeterminism for free, from any perspective: a distributed computing system that passes messages, with time required for the messages to travel, can, for example, compute unbounded counts that are random in the Wolfram sense that any statistical analysis would find them random. Such results do not exist in Turing machine models, which may be deterministic, nondeterministic, or random, but never compute unbounded counts.
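A toy illustration of that last point (my own sketch, with threads standing in for spatially separated nodes exchanging messages): the counter below always halts, so each result is computable and finite, yet no bound on it can be stated in advance, because it depends on message latency.

```python
import random
import threading
import time

def run_once() -> int:
    stop = threading.Event()
    count = 0

    def counter():
        nonlocal count
        while not stop.is_set():             # count until the stop message arrives
            count += 1

    def stopper():
        time.sleep(random.uniform(0, 0.01))  # unpredictable message delay
        stop.set()

    threads = [threading.Thread(target=counter), threading.Thread(target=stopper)]
    for t in threads: t.start()
    for t in threads: t.join()
    return count

# Always terminates, always finite, never bounded a priori:
print([run_once() for _ in range(5)])
```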

Many, many conjectures that remove the metaphysical discomfort exist, and they are areas of active research. Foundations of physics and Philosophical transactions of the Royal Society A, for example, are doing quite comfortably as journals.

Yet all the current proposals are highly mathematical. Quite possibly there is nothing really wrong with current quantum mechanics or relativity, other than philosophical dislike of some of the implications. For individuals dissatisfied with the metaphysical implications of quantum mechanics, the simulation hypothesis becomes appealing, because it explains nonlocality and quantum indeterminism and nondeterminism as generation of detail on demand: the world as lazily computed, i.e., only what observers bother to measure is computed. That is plausible considering how we program video games and other applications.
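A minimal sketch of such lazy, on-demand generation, as in procedural games (the function and naming are mine, purely illustrative): a region has no stored contents until something queries it, and once observed its detail is fixed.

```python
import hashlib
from functools import lru_cache

@lru_cache(maxsize=None)        # each region is generated at most once
def region(x: int, y: int) -> str:
    """Deterministic detail derived from coordinates, computed only on demand."""
    digest = hashlib.sha256(f"{x},{y}".encode()).hexdigest()
    return "star" if int(digest, 16) % 10 == 0 else "void"

# Unobserved regions cost nothing; observed ones are computed, then cached.
print(region(3, 7))   # generated at first observation
print(region(3, 7))   # same detail returned from cache, no recomputation
```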

 

〈3〉
 

One issue with the simulation hypothesis is that it's unnecessary, for existing physical theories have no problem dealing with the philosophical issues in quantum mechanics. Noncommutative, and possibly nonassociative (bracketing), aspects are necessary anyway for measurement and sufficient difference to be possible, and moreover, to buffer measurements so that measurements are meaningful. A system in which all velocities and group generators commute has singular commutators. What is the problem with that? The map taking null commutators to nonnull commutators does not preserve isomorphism, which means small errors in measurement can yield nonisomorphic topological skeletons for the universe, due only to measurement noise. Measurements are unstable in that case, both in the sense that we cannot infer whether a different measurement result is due to the system rather than to noise in the measuring device, and in the information theoretic sense related to the existence of primary particles; such a world would not long exist. Approximately correct measurements must approximately describe what does exist, and measurement error should result only in an apparent change of the group producing what is observed. (Irving SEGAL, A class of operator algebras which are determined by groups, Duke mathematical journal 18(1):221–265, 1951.1; D. FINKELSTEIN, J. BAUGH, A. GALIAUTDINOV, M. SHIRI-GARAKANI, Transquantum dynamics, Foundations of physics 33(9):1267–1275, 2003.4.)
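A minimal numeric check of the noncommutativity point (standard Pauli matrices, nothing specific to the cited papers): the order of two spin measurements matters precisely because the commutator is nonzero, and a world where all commutators vanished would have nothing with which to register that difference.

```python
import numpy as np

# Pauli spin matrices: generators whose commutators do not vanish.
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sy = np.array([[0, -1j], [1j, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)

comm = sx @ sy - sy @ sx           # [sigma_x, sigma_y]
print(np.allclose(comm, 2j * sz))  # True: equals 2i * sigma_z, not zero,
                                   # so measuring x then y differs from y then x
```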

If the universe were a simulation, and there were a larger computer outside it simulating it, noncomputable results would occur more frequently than at random (guess and check works), and we do not statistically observe that. Quantum thermodynamics is basically valid. (Hiroomi UMEZAWA, Advanced field theory: micro, macro, and thermal physics, New York: American Institute of Physics, 1993.)

 

〈4〉
 

The simulation hypothesis is plausible but very improbable. Yet most things that don't happen fail to happen not because they are impossible, but because they are sufficiently improbable. That is the basis of thermodynamics.

The simulation hypothesis doesn't necessarily imply anything religious, nor any external observers or simulator makers. The universe, if it is a simulation, might be the result of a short code in John von NEUMANN's sense (The computer and the brain, New Haven: Yale University Press, 1958), running on another universe. A short code is just a higher level programming language plus all the code needed to compile it, packaged as one program: while it requires programming the underlying computer once, all the instructions that follow are shorter, simpler, compressed.
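A toy version of a short code (my own construction; the instruction names are invented for illustration): one compact high-level instruction, interpreted on the underlying machine, expands into many primitive steps.

```python
def interpret(program, memory):
    """Expand each high-level op into primitive steps on the underlying 'hardware'."""
    steps = 0
    for op, var, n in program:
        if op == "SET":            # one short instruction...
            memory[var] = 0        # ...expands into a CLEAR primitive
            steps += 1
            for _ in range(n):
                memory[var] += 1   # ...plus n INC primitives
                steps += 1
    return steps

mem = {}
short_program = [("SET", "x", 5)]          # one high-level instruction
print(interpret(short_program, mem), mem)  # -> 6 primitive steps, {'x': 5}
```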

The computer might be simulating another, and both might be randomly occurring codes that self-propagate and form groups. In that case, the universe we observe can be a simulation resulting from code that formed randomly on an underlying universe and self-propagates, without any programmer. Stephen WOLFRAM (2002), indeed, shows how ergodic systems can generate all possible rules for themselves to follow, and one of those rules might be a simple program: the rule can be a result of randomness at a lower level, yet the simple program codes for a complex simulated universe.
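Wolfram's elementary cellular automata make the point in a few lines (a standard construction, here rule 30): the entire rule fits in eight bits, yet the behavior it codes for stays complex indefinitely.

```python
RULE = 30                 # the whole 'program' is this one 8-bit number
WIDTH, STEPS = 64, 16

row = [0] * WIDTH
row[WIDTH // 2] = 1       # a single seed cell

for _ in range(STEPS):
    print("".join("#" if c else "." for c in row))
    # Each new cell is one bit of RULE, indexed by its 3-cell neighborhood.
    row = [(RULE >> (4 * row[(i - 1) % WIDTH]
                     + 2 * row[i]
                     + row[(i + 1) % WIDTH])) & 1
           for i in range(WIDTH)]
```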

If so, then without any programmer, simulator, or external observer, there are not necessarily any applications to real life.

The string theory gentlemen use moduli spaces, which are all possible local least-energy states, also called vacuum states. Some might belong to systems that underlie our universe, in the sense that n ∼_M m whenever there exists f in M such that n = f(m). If we can separate a physically meaningful group according to vacuum states into independent layers M(1), M(2), . . . , taking these as sets of generators, and we observe only some of them while all appear physically meaningful, we might suspect our reality can be generated in the sense of a simulation: our reality being a short code, not a complete code, compiled and simulated on an underlying system that meshes with it but exists below it.
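In standard notation, the generating relation reads roughly as follows (my own formalization, a sketch rather than anything from the string theory literature):

```latex
\[
  n \preceq_{M} m \;\iff\; \exists\, f \in M \ \text{such that}\ n = f(m),
\]
\[
  \mathcal{M} \;=\; M^{(1)} \times M^{(2)} \times \cdots
\]
% Our observed physics would draw its generators from only some of the
% layers M^{(i)}, each layer remaining physically meaningful on its own.
```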


I usually write stories of 10,000–25,000 words, about 40–100 pages.
        #creativity #fiction #writing #creative #technology #life #scifi
            #thealliance #steemstem #isleofwrite #writersblock

ABOUT ME

I'm a scientist who writes fantasy and science fiction under various names.

The magazines which I currently most recommend:
Magazine of Fantasy and Science Fiction
Compelling Science Fiction
Writers of the Future


PRACTICAL THINKING
BOOK RECOMMENDED — fiction and nonfiction reviewed
FISHING — thinking about tools and technology
TEA TIME — philosophy

©2018 tibra. Creative Commons License
This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License. Illustrations, images: tibra.


All technology is growing up too fast; the human race must adapt.

Yes, the human race must adapt.

I have to admit, this is a lot to take in, so excuse my lack of knowledge on the matter. What are your thoughts on the application of the simulation hypothesis in real life?


(I added the reply to your question to the main text; see 〈4〉 above.)

The hardest part of technology is the speed at which it is developing; keeping up with the pace is a real challenge.
#thealliance
