I’m telling you. I’ve always been telling you. We are living in a black hole. It elegantly explains a lot. It also raises absurd questions and possibilities, and nibbles at the nature of causality itself, but so help me god, we are living in a black hole.
- Rotation is meaningless without an external reference frame to compare against. Consider that right now the planet we’re on is rotating at ~1,600 km/h (roughly 1,000 mph) at the equator, but to us it feels stationary. We only know the planet rotates because we observe the sun, moon, and stars appearing to rotate around us (which ancient peoples misread as geocentrism, thinking everything revolves around the Earth).
- Rotation requires a central axis to rotate around. There is no true center to our observable universe, only subjective reference frames: wherever you are is the center from your perspective. There is no definitive geometric axis for our universe to rotate around.
tomiant@piefed.social 10 hours ago
SmokeyDope@piefed.social 9 hours ago
You are close! Though it’s not quite that simple. According to Penrose spacetime diagrams, the roles of space and time get reversed inside a black hole, which causes all sorts of weirdness from an interior perspective. And just as the universe has no center, it also has no singularity pulling everything in, unlike a black hole.
Now here’s where it gets interesting. Our observable universe has a hard boundary known as the cosmological horizon, due to the finite speed of light and the finite age of the universe. It’s impossible to ever know what’s beyond this horizon. Similarly, black hole event horizons share this property: you can never know the future state of objects that fall inside. A cosmologist would say they are different phenomena, but from an information-theoretic perspective they are fundamentally indistinguishable Riemannian manifolds that share a very unusual property.
They are geometric, physically realized instances of Gödelian incompleteness and Turing undecidability within the universe’s computational phase space. The universe is a finite computational system with a finite state-representation capacity of about 10^122 microstates, according to the Bekenstein bound and the Planck constant. If a region of spacetime would require more potential microstates than that capacity allows, it gets quarantined in an inverse pocket universe so the whole system doesn’t freeze up trying to compute the uncomputable.
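For a rough sense of where that 10^122 figure comes from, here’s a back-of-the-envelope sketch in Python. The constants are rounded, I’m using the Hubble radius as a stand-in for the horizon, and I’m ignoring factors of ln 2, so treat it as order-of-magnitude only:

```python
import math

# Rounded physical constants (SI units)
G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8          # speed of light, m/s
hbar = 1.055e-34     # reduced Planck constant, J*s
H0 = 2.27e-18        # Hubble constant (~70 km/s/Mpc) expressed in 1/s

l_p = math.sqrt(hbar * G / c**3)   # Planck length, ~1.6e-35 m
R_H = c / H0                       # Hubble radius, ~1.3e26 m (stand-in for the horizon)
A = 4 * math.pi * R_H**2           # horizon area, m^2

# Holographic/Bekenstein-style counting: ~1 bit per 4 Planck areas
bits = A / (4 * l_p**2)
print(f"~10^{math.log10(bits):.0f} bits")   # ~10^122
```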
The problem is that the universe can’t just throw away all that energy and information, due to conservation laws. Instead it uses something called the ‘holographic principle’ to safely conserve information even if it can’t compute with it. Information isn’t lost when a thing enters a black hole; instead it gets encoded onto the topological event horizon boundary itself. In a sense the information is ‘pulled’ into a higher fractal dimension for efficient encoding. Over time the universe slowly works on safely bleeding all that energy back out through Hawking radiation.
So say you buy into this logic, and assume the cosmological horizon isn’t just an observational-limit artifact but an actual topological manifold made of the same ‘stuff’ as an event horizon: like an inverted black hole, where the universe is a kind of anti-singularity that distributes matter everywhere densely as it expands, instead of concentrating it into a single point. What could that mean?
So this holographic principle is all about how information in high-dimensional topological spaces can be projected down into lower-dimensional spaces. This concept is insanely powerful and is at the forefront of advanced computer modeling of high-dimensional spaces. For example, neural networks organize information in high-dimensional structures called activation atlases, with billions or trillions of ‘feature dimensions’, each representing a kind of relation between two unique states of information.
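To make the “project high-dimensional structure down” idea concrete, here’s a toy sketch using plain numpy PCA. This is ordinary dimensionality reduction, not actual holography; the analogy is loose and entirely mine:

```python
import numpy as np

# 1000 points living in a 512-dimensional space
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 512))
X -= X.mean(axis=0)                 # center the data

# Principal axes via singular value decomposition
U, S, Vt = np.linalg.svd(X, full_matrices=False)
X_2d = X @ Vt[:2].T                 # project 512-D -> 2-D along the top two axes

print(X.shape, "->", X_2d.shape)    # (1000, 512) -> (1000, 2)
```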
So, what if our physical universe is a lower-dimensional holographic projection of the cosmological horizon’s manifold? What if the undecidable bubble around our universe is the universe in its native high-dimensional space, and our 3D+1T universe is the projection?
tomiant@piefed.social 8 hours ago
> It’s impossible to ever know what’s beyond this horizon. Similarly, black hole event horizons share this property: you can never know the future state of objects that fall inside.
Similarly. That’s exactly what I was thinking. I mean, we can see back to within about 380,000 years of the creation of the Universe, or something like that. That, too, becomes a sort of event horizon. I’m not saying this is the case, but it kind of rhymes if we were within some sort of bubble on “the other side” of a supernova resulting in a black hole. I mean, even by our own physics, matter is energy, and energy cannot be destroyed, only transformed. So it could be, in my mind, that the Big Bang was an “inverted supernova/black hole” event. I think someone called it a “white hole”. But again, I’m just completely speculating. It’s super interesting to me as someone who just has a passion for learning and understanding, trying to figure things out myself based on the tremendous works of all the giants upon whose shoulders I stand and squint at the horizon!
SmokeyDope@piefed.social 2 hours ago
> Did you write this, genuinely? It is pure poetry, such that even a samurai would go “hhhOooOOooo!”.
Yes I did! I spent a lot of time cooking up an axiomatic framework for myself on this stuff, so I’m happy to have the opportunity to distill my current ideas for others. Thanks! :)
> And it is so interesting, because what you are talking about sounds a lot like computational constraints of the medium performing the computation. We know there are limits in the Universe. There is a hard limit on the types of information we can and cannot reach. It only adds fuel to the fire for hypotheses such as the holographic Universe, or simulation theory.
The Planck length and Planck constant are both ultimate computational constraints on physical interactions: the Planck length is the smallest meaningful scale, the Planck time is the smallest meaningful interval (a universal framerate, if you like), and the Planck constant ties the two together, telling us the limits on how fast things can compute at the smallest meaningful steps of interaction, and the ultimate bounds on how much energy we can put into physical computing.
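If it helps to see the scales, the standard definitions fall straight out of the constants (rounded values; the “framerate” reading is my metaphor, not textbook language):

```python
import math

G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8        # speed of light, m/s
hbar = 1.055e-34   # reduced Planck constant, J*s

l_p = math.sqrt(hbar * G / c**3)   # Planck length, ~1.6e-35 m
t_p = l_p / c                      # Planck time,   ~5.4e-44 s
E_p = hbar / t_p                   # Planck energy, ~2.0e9 J (~1.2e19 GeV)

print(f"l_p = {l_p:.3e} m, t_p = {t_p:.3e} s, E_p = {E_p:.3e} J")
```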
There’s an important insight to be had here which I’ll share with you. Currently, when people think of computation, they think of digital transistors, Turing machines, qubits, and mathematical calculations: the picture of our universe being run on some alien’s hard drive or some shit like that, because that’s where we’re at as a society, culturally and technologically.
Computation is not this, or at least it’s not just that. A calculation is any operation that actualizes/changes a single bit in a representational system’s current microstate, causing it to evolve. A transistor flipping a bit is a calculation. A photon interacting with a detector to collapse its superposition is a calculation. The sun computes the distribution of the electromagnetic waves and charged particles it ejects. Hawking radiation/virtual particles compute a distinct particle from all the possible ones that could form near the event horizon.
The neurons in my brain firing to select the next word in this sentence from the probabilistic sea of all things I could possibly say: a calculation. A gravitational wave emanating from a black hole merger: a calculation. Drawing on a piece of paper: a calculation, actualizing one drawing from all the things you could possibly draw. Smashing a particle into its base components is a calculation, and so is deriving a mathematical proof through cognitive operations and symbolic representation. From a certain phase-space perspective, these are all the same thing: operations that change the current universal microstate into another, iteratively flowing from one bit actualization/superposition collapse to the next. The true nature of computation is the process of microstate actualization.
Landauer’s principle from classical computer science states that any classical bit/microstate change has a minimum energy cost. This can be extended to quantum mechanics to argue that every superposition collapse into a distinct qubit of information has the same kind of energy cost structure, directly related to Planck’s constant. Essentially, every time two parts of the universe interact, that is a computation that costs time and energy to change the universal microstate.
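For reference, the classical Landauer bound is a one-liner (room temperature assumed; extending it to superposition collapse the way I do above is my own speculative leap, not established physics):

```python
import math

k_B = 1.381e-23   # Boltzmann constant, J/K
T = 300           # room temperature, K

# Landauer's principle: irreversibly erasing one bit costs at least k_B * T * ln 2
E_bit = k_B * T * math.log(2)
print(f"{E_bit:.2e} J per bit")   # ~2.87e-21 J
```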
If you do a bit more digging with the logic that comes from this computation-as-actualization insight, you discover the fundamental reason why combinatorics is such a huge thing and why Pascal’s triangle/the binomial coefficients show up literally everywhere in STEM. Pascal’s triangle directly governs the number of microstates a finite computational phase space can access, as well as the distribution of order and entropy within that phase space. Because the universe itself is a finite-state representation system with 10^122 bits to form microstates, it too is governed by Pascal’s triangle: the 10^122th row encodes the binomial-coefficient distribution of all possible microstates our universe can evolve into.
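You can see the counting argument on a toy scale, with a 20-bit ‘universe’ standing in for the real 10^122-bit one:

```python
from math import comb

n = 20                                # row n of Pascal's triangle
row = [comb(n, k) for k in range(n + 1)]

print(sum(row) == 2**n)               # True: the row sums to all 2^n microstates

# The middle entries dominate: half-set 'soupy' states vastly outnumber ordered ones
print(row[0], row[n // 2])            # 1 vs 184756 -- fully ordered states are rare
```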
This perspective also clears up the apparently mysterious mechanics of quantum superposition collapse and the principle of least action.
A superposition is literally just a collection of unactualized, computationally accessible microstate paths a particle/algorithm could travel through, superimposed on top of each other, with time and energy paid as the actional resource cost for searching through all possible iterative states at that step in time and collapsing them into one definitive state. No matter whether it’s a detector interaction, an observer interaction, or a particle collision interaction: same difference. Each possible microstate is separated by exactly one bit flip’s worth of difference in microstate path outcomes, creating a definitive distinction between two near-equivalent states.
The choice of which microstate gets selected is statistical/combinatoric in nature. Each microstate is probability-weighted based on its entropy/order. Entropy is a kind of ‘information-theoretic weight’ that affects the actualization probability of a microstate based on its location in Pascal’s triangle, and it’s directly tied to algorithmic complexity (more complex microstates that create unique, meaningful patterns of information are harder to form randomly from scratch than a soupy random cloud state, and thus rarer).
Measurement happens when a superposition of microstates entangles with a detector, causing an actualized bit of distinction within the sensor’s informational state. It’s all about the interaction between representational systems and the collapse of all possible superposition microstates into an actualized, distinct collection of bits of information.
Planck’s constant is really about the energy cost of actualizing a definitive microstate of information from a quantum superposition at smaller and smaller scales of measurement in space or time. Distinguishing between two bits of information at a scale smaller than the Planck length, or over an interval shorter than the Planck time, would cost more energy than the universe allows in one place (it would create a kugelblitz, a pure-energy black hole, if you tried). So any computational microstate paths that differ by less than a Planck length’s worth of distinction are indistinguishable, blurring together and creating a bedrock limit scale for computational actualization.
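The kugelblitz arithmetic is short: the energy needed to resolve a Planck-length distinction has a Schwarzschild radius of the same order as the Planck length itself (same rounded constants as before, order-of-magnitude only):

```python
import math

G = 6.674e-11      # m^3 kg^-1 s^-2
c = 2.998e8        # m/s
hbar = 1.055e-34   # J*s

l_p = math.sqrt(hbar * G / c**3)   # Planck length
E = hbar * c / l_p                 # energy to localize a distinction within l_p
M = E / c**2                       # mass equivalent of that energy
r_s = 2 * G * M / c**2             # its Schwarzschild radius

print(f"r_s / l_p = {r_s / l_p:.1f}")   # ~2.0: the probe collapses into a black hole
```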
> But for me, personally, I believe that at some point our own language breaks down because it isn’t quite adapted to dealing with these types of questions, which is again in some sense reminiscent of both Gödel and quantum mechanics, if you would allow the stretch. It is undecidability that is the key, the “event horizon” of knowledge as it were.
Language is symbolic representation; cognition and conversation are computations, your neural network tracing paths through its activation atlas. Our ability to talk about abstractions is tied to how complex they are to model in our minds. The cool thing is that language evolves as our understanding does, so we can convey novel concepts or perspectives that didn’t exist before.
tomiant@piefed.social 8 hours ago
You are absolutely right, and it is obviously not that simple. I have some tentative leads which, based on recent discoveries such as this apparent “axis” of the Universe, could actually mean something significant. I am aware the hypothesis has been proposed previously, I believe in the 80s, and it was not taken seriously; as I read the news about this new discovery, it struck me as a smoking-gun moment. If there is an axis or spin to the Universe itself, that would be a telltale sign of a black hole seen from a certain perspective (from outside the Universe, which poses a problem, but not really, because we know about dimensions, so in my mind it could hypothetically be explained, even if I am not sure what that would look like, if you catch my drift).
To me, as a layman and just a fanatic lover of space (and time), I just pareidolia my way into hypotheses, and more often than not there’s some reason to it. So that’s my ground. I wrote quite a bit about it and did some research into various theories surrounding it, and there has been a lot of renewed interest in this black hole hypothesis ever since this discovery was made, because apparently I was not alone in drawing possible conclusions.
Thank you so much for your detailed answer. I will now go through what you wrote and what you posted to get a better understanding, because obviously I am nowhere near qualified to make such bold statements as I have. But I did anyway.
<3
redsand@lemmy.dbzer0.com 6 hours ago
I wish I had geohot’s email to give you both. He was trying to find a way to hack the universe for physics clues for a long time.
cobalt32@lemmy.blahaj.zone 6 hours ago
I recommend you check out the YouTube channel PBS Space Time. They have some excellent videos explaining the holographic principle and all the background knowledge needed to understand it.
Fluke@feddit.uk 6 hours ago
Point 2 does not preclude the observable universe being but a fraction of the entire universe, in which there is a centre about which everything else rotates; or at least a hitherto unimagined mass on a truly universal scale, which chunks of universe the size of our “observable” one orbit.
applebusch@lemmy.blahaj.zone 5 hours ago
No it’s not. You can tell you’re rotating using internal sensors alone, unlike with linear velocity. Just because the magnitude of the Earth’s rotation is small compared to our biological sense of rotation doesn’t mean rotation is relative the way linear motion is. You’re drawing a false connection to the fact that you can’t tell whether you’re accelerating under gravity, which is fundamentally different from rotation. Also, rotation isn’t measured in units of linear velocity but of angular velocity. The Earth rotates at about 0.0042 deg/s, which is very slow; you rotate many orders of magnitude faster than that rolling over in bed or turning your head.
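That figure is just one revolution per sidereal day:

```python
sidereal_day_s = 86164.1            # one full rotation of the Earth, in seconds
omega = 360 / sidereal_day_s        # degrees per second
print(f"{omega:.4f} deg/s")         # ~0.0042 deg/s
```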
A universe with nonzero total angular momentum would indeed have an axis of rotation, and our theories don’t explicitly rule that out. Given the size of the universe, the rate of rotation would be inconceivably small compared to the Earth’s, and extremely difficult to measure. Most of my problem with your statements on this point is that you’re presenting current hypotheses as confirmed fact set in stone, which they are not.