Simulation hypothesis

Consider all the things that occur within a single square meter of the universe.
What would be the computational power needed to comprehensively simulate everything that occurs within it?

Your computer operates at gigahertz, billions of clock cycles per second. What would be the actual time-step between each "frame" of the universe?

Consider that such a process is occurring throughout all of the known universe.
Why would such power be wasted in such a way?

Each dot in this image represents not a star, but an entire galaxy. The position and state of every particle is updated with exceedingly high frequency.

You would have to have some pretty beefy stuff. You would have to start with quantum foam and work your way up, simulating every law in existence, and I guarantee that if we did it with modern tech it'd be more like 500 cm instead of 1 m. You have to model electron clouds, atoms, and the density (or absence) of the gas. You'd probably have to run a supercomputer, because successfully doing it means you can run a simulation of theoretically /anything/ in that square meter. You could model a full human face in 100% detail, down to the individual workings of synapses. Think that's modern tech?

>What would be the actual time-step between each "frame" of the universe?

The time-evolution operator in QM is [math]e^{-iHt/\hbar}[/math], where H is the Hamiltonian operator. Thus your "clock speed" could be [math]\frac{E}{h}[/math], where E is an energy eigenvalue and h is the Planck constant.
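A quick back-of-the-envelope sketch of that "clock speed" E/h (the electron rest energy is my own example, not from the post):

```python
# Sketch: treating f = E/h as a "frame rate". Example energy is the
# electron rest energy m_e c^2; the post itself names no specific E.

H_PLANCK = 6.62607015e-34     # Planck constant, J*s (exact SI value)
E_ELECTRON_REST = 8.1871e-14  # electron rest energy m_e c^2, J

f = E_ELECTRON_REST / H_PLANCK  # "frames per second" for one electron
print(f"{f:.3e} Hz")  # ~1.2e20 Hz, about 1e11 times a GHz CPU clock
```

Even a single electron's rest energy corresponds to a tick rate eleven orders of magnitude beyond a gigahertz processor.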

You don't have to actually do all that, it just has to look convincing.

This is the right answer.
You don't need to simulate the parts of the universe no one can see.
If you printed that pic out and put it in front of Hubble how long would it be before someone noticed it wasn't evolving right? Probably millions of years. Very little computing power is needed.

...

Occam's razor

In order for this Universe to be a simulation, you're adding a minimum of one extra layer of Universe on top of it from which it is simulated (with the potential of infinite layers of Universes around it, according to the hypothesis).

In order for this Universe to be the real one, you need to add zero additional complexity around it. It is therefore more logical (and probable) to take it as the real, unsimulated Universe.

No, that doesn't mean it's certainly not a simulation just because the second option is the path of least complexity. It means you should stick to that option until you have data that concretely proves otherwise, else you're inserting emotions and desires into science.
And that's really the only reason you want to believe in it - it's a form of escapism where you want to escape the faulty decisions you have made in this life and write them off as "irrelevant", since it is supposedly not the real Universe and therefore nothing matters.

>inserting emotions and desires into science

But isn't that exactly how the simulation works?

> it's certainly not a simulation until you have data that concretely proves otherwise
I do.
And there are not infinite layers of universes within universes.

To make an accurate representation, one would have to simulate every single subatomic particle... And gravity somehow.

The math wouldn't be so bad, I think, since THIS.png exists. You only have to calculate that with an insanely high tickrate (excluding, of course, gravity).

The measuring of the particles would be worse than running the simulation, because one would have to measure all the values of every particle. But when you have a random square meter of an infinite universe, everything would be possible. Therefore, randomly chosen particles can be a thing.

And yes, the computer would have to be amazingly powerful to do every calculation for every particle.

I wonder if a quantum computer would solve that problem.
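The "randomly chosen particles" idea above can be sketched like this - populate the region with random states instead of measuring real ones, then step at a fixed tick. All numbers and names here are invented for illustration, and gravity is excluded exactly as the post says:

```python
import random

# Toy sketch: fill a unit cube with randomly sampled particle states
# (in lieu of measuring a real region) and advance them by free flight.

def random_particle():
    return {
        "pos": [random.uniform(0.0, 1.0) for _ in range(3)],  # meters
        "vel": [random.gauss(0.0, 100.0) for _ in range(3)],  # m/s
    }

def step(particles, dt):
    # Free-flight update only; no gravity, no interactions.
    for p in particles:
        p["pos"] = [x + v * dt for x, v in zip(p["pos"], p["vel"])]

particles = [random_particle() for _ in range(1000)]
step(particles, dt=1e-20)  # an "insanely high tickrate" means a tiny dt
print(len(particles))
```

The expensive part, as the post notes, is the per-particle update: real matter would mean ~10^25 particles per cubic meter, not 1,000.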

What might that data be?

Glitches, bugs and lag.

>muh simulation
It isn't real. Fuck off

>Why would such power be wasted in such a way?
BECAUSE WE USED TO THINK KILOBYTES WERE HUGE.

Also, it took thousands upon thousands of years for humans to evolve to cognition, and then several thousand more before humans came up with language, but within the last 50 we've gone from THINKING KILOBYTES ARE HUGE to wasting unfathomable amounts - trillions, quadrillions - of kilobytes.

Sooner than you think, we'll be angery about only getting a few yottabytes a second.

...

It depends on how much observation is occurring.

The universe simulation doesn't calculate anything until observed.

Why do you assume the simulation has to happen in real time? Those inside the simulation won't be able to tell if their 1 second of time takes 1,000 seconds of base-reality time to run. You could simulate the universe on a Commodore 64, given enough time.
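The point about real time can be shown in a few lines - the simulated clock advances by a fixed dt per step no matter how long the host takes, so inhabitants can't detect the slowdown (function names are my own):

```python
import time

# Sketch: simulated time is decoupled from wall-clock time. Each step
# advances the simulation by dt regardless of host-side cost.

def run(steps, dt, work_per_step=lambda: None):
    sim_time = 0.0
    wall_start = time.perf_counter()
    for _ in range(steps):
        work_per_step()   # arbitrary host-side cost per frame
        sim_time += dt    # the simulated clock ticks uniformly anyway
    wall_elapsed = time.perf_counter() - wall_start
    return sim_time, wall_elapsed

sim_t, wall_t = run(steps=1000, dt=1e-3)
print(round(sim_t, 6))  # 1.0 simulated second, whatever wall_t was
```

Swap in an expensive `work_per_step` and `wall_t` grows while `sim_t` stays exactly the same - that's the Commodore 64 argument.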

>It depends on how much observation is occurring. The universe simulation doesn't calculate anything until observed.

yep. no point in calculating the collapsed quantum state until the user is paying attention to it. Until then everything is just in the quantum superposition with its probabilities stored on file for future reference.
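That lazy-evaluation scheme looks roughly like this in code - keep the probabilities "on file" and only sample a definite outcome the first time someone looks (class and method names are invented for the sketch):

```python
import random

# Sketch: a region stays as a stored probability distribution
# ("superposition") until first observation, then the sampled
# outcome is cached so repeat observations stay consistent.

class LazyRegion:
    def __init__(self, outcomes, probabilities, seed=None):
        self.outcomes = outcomes
        self.probabilities = probabilities  # kept on file until needed
        self.collapsed = None
        self.rng = random.Random(seed)

    def observe(self):
        if self.collapsed is None:  # collapse exactly once
            self.collapsed = self.rng.choices(
                self.outcomes, weights=self.probabilities, k=1)[0]
        return self.collapsed

region = LazyRegion(["spin up", "spin down"], [0.5, 0.5], seed=42)
first = region.observe()
assert region.observe() == first  # the user sees a consistent world
print(first)
```

Until `observe()` is called, the simulator pays only for storing two floats - which is the whole point of the post above.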

>square meter
Zero computational power, since no volume means nothing to compute