https://github.com/ewdlop/digital-physics-https-en.wikipedia.org-wiki-hypercomputation
https://en.wikipedia.org/wiki/Hypercomputation#Analysis_of_capabilities; The fact that we can do calculation at all is because we are within.
base-e digital-physics hypercomputation malament-hogarth-spacetime plancks-engineering
Last synced: 8 months ago
JSON representation
- Host: GitHub
- URL: https://github.com/ewdlop/digital-physics-https-en.wikipedia.org-wiki-hypercomputation
- Owner: ewdlop
- Created: 2024-11-13T16:19:48.000Z (over 1 year ago)
- Default Branch: main
- Last Pushed: 2025-03-04T03:37:32.000Z (about 1 year ago)
- Last Synced: 2025-03-04T04:20:24.882Z (about 1 year ago)
- Topics: base-e, digital-physics, hypercomputation, malament-hogarth-spacetime, plancks-engineering
- Language: Python
- Homepage:
- Size: 61.5 KB
- Stars: 0
- Watchers: 1
- Forks: 0
- Open Issues: 1
Metadata Files:
- Readme: README.md
README
# Digital-Physics-https-en.wikipedia.org-wiki-Hypercomputation
https://en.wikipedia.org/wiki/Digital_physics
https://en.wikipedia.org/wiki/Hypercomputation
https://www.journals.uchicago.edu/doi/10.1093/bjps/axn031
https://www.youtube.com/watch?v=PFHU5NfvWpo
###
I be professor, did you mean to say this....did you mean...
or damn, I be slowed too!
or someone be curious and be wondering too..however, it is daunting...
and some be trying to figure out how to....for a while now.....(narcissistic ego makes you believe you were the only one, and something kept giving you a nudge, and you be realizing...)
vs
I be thinking, maybe some folks be feeling overwhelmed, like they can't keep up with all the changes. It's like they be trying to catch a speeding train, but it's moving too fast. And then there's those who be wanting to learn, but they don't know where to start, you feel me
~~Oh~~ <--(voice to text correction)
###
(what did AI see?) (what did I think?) (what did AI think?) (what is AI not telling? vs. what is AI not able to tell itself?)
To explore the size of digital information at the Planck scale, we need to consider the fundamental units of space, time, and energy, as defined by Planck units.
---
### **Key Concepts**
1. **Planck Length (\(l_p\))**:
\( l_p = \sqrt{\frac{\hbar G}{c^3}} \approx 1.616 \times 10^{-35} \, \text{meters} \).
This is often taken as the smallest physically meaningful length in quantum mechanics.
2. **Planck Area**:
A two-dimensional slice at the Planck scale has an area \( A_p = l_p^2 \approx 2.612 \times 10^{-70} \, \text{m}^2 \).
3. **Planck Volume**:
\( V_p = l_p^3 \approx 4.22 \times 10^{-105} \, \text{m}^3 \).
This is the smallest meaningful three-dimensional volume.
4. **Digital Information**:
If we were to store digital information (e.g., bits) within a Planck-scale system, each bit would correspond to a region of space or a quantum state.
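The Planck units listed above follow directly from the fundamental constants. A minimal sketch in Python (the repository's language), using CODATA values for \( \hbar \), \( G \), and \( c \):

```python
import math

# Physical constants (CODATA values, SI units)
hbar = 1.054571817e-34  # reduced Planck constant, J*s
G = 6.67430e-11         # gravitational constant, m^3 kg^-1 s^-2
c = 2.99792458e8        # speed of light, m/s (exact by definition)

# Planck length: l_p = sqrt(hbar * G / c^3)
l_p = math.sqrt(hbar * G / c**3)
A_p = l_p**2  # Planck area
V_p = l_p**3  # Planck volume

print(f"Planck length: {l_p:.3e} m")    # ~1.616e-35 m
print(f"Planck area:   {A_p:.3e} m^2")  # ~2.612e-70 m^2
print(f"Planck volume: {V_p:.3e} m^3")  # ~4.22e-105 m^3
```

The printed values reproduce the figures quoted in the list above.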
---
### **Calculation of Bit Density at Planck Scale**
#### Information in a Volume:
Shannon's information theory and Bekenstein's bound imply that the maximum amount of information (in bits) that can be stored in a physical system is related to its area and energy.
1. **Bekenstein Bound**:
Information is proportional to the surface area of the boundary in Planck units:
\[
I_{\text{max}} = \frac{A}{4 l_p^2}
\]
For a Planck-scale area, \( A = l_p^2 \):
\[
I_{\text{max}} = \frac{l_p^2}{4 l_p^2} = \frac{1}{4} \, \text{bits}.
\]
This means, in the strictest sense, 1 bit requires approximately \( 4 l_p^2 \) of area. (Strictly, \( S = A / (4 l_p^2) \) is an entropy in nats; dividing by \( \ln 2 \) converts to bits, which does not change the order of magnitude.)
2. **Planck Volume Capacity**:
A Planck-scale cube of volume \( V_p = l_p^3 \) could store a single bit if each Planck area on its surface contributes \( \frac{1}{4} \) bits, considering the holographic principle.
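The bound above can be sketched numerically. This is a minimal illustration using the document's convention \( I_{\text{max}} = A / (4 l_p^2) \) (which strictly counts nats, not bits); the function name `bekenstein_bits` is chosen here for illustration:

```python
l_p = 1.616255e-35  # Planck length in meters (CODATA)

def bekenstein_bits(area_m2: float, planck_length: float = l_p) -> float:
    """Maximum information for a boundary of the given area,
    in the convention I = A / (4 l_p^2) used in the text."""
    return area_m2 / (4 * planck_length**2)

# For a Planck-scale area A = l_p^2, the bound is exactly 1/4:
print(bekenstein_bits(l_p**2))  # 0.25 -> one bit needs ~4 l_p^2 of area
```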
---
### **Practical Implications**
1. **Bit Size at Planck Scale**:
A single bit requires approximately \( 4 l_p^2 \) of area (2D) or, heuristically, on the order of \( l_p^3 \) of volume (3D).
2. **Density of Information**:
If information is distributed across a larger scale, the density of bits would scale inversely with \( l_p^2 \) (for area) or \( l_p^3 \) (for volume). For example:
\[
\text{Information Density} \propto \frac{1}{l_p^3} \approx 2.37 \times 10^{104} \, \text{bits/m}^3.
\]
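The density estimate above is a one-line computation, sketched here under the simplifying assumption of one bit per Planck volume:

```python
l_p = 1.616255e-35  # Planck length in meters (CODATA)

# Volumetric density if one bit occupied each Planck volume l_p^3
density_bits_per_m3 = 1 / l_p**3
print(f"{density_bits_per_m3:.2e} bits/m^3")  # ~2.37e104 bits/m^3
```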
---
### **Conclusion**
At the Planck scale:
- **One bit of information** can be encoded in approximately \( l_p^3 \approx 4.22 \times 10^{-105} \, \text{m}^3 \) of volume.
- **Information density** at the Planck scale is incredibly high, around \( 10^{104} \, \text{bits/m}^3 \).
This theoretical limit illustrates the quantum mechanical constraints of information storage and processing at the smallest scales of spacetime.
#### \( 10^{104} \) bits correspond to \( 2^{10^{104}} \) possible combinations.