I was playing with the time vector and entropy, which led to the concept of time dilation as a function of temperature, so one might say that time is locally “stopping” for matter at absolute zero.
In a nutshell, the concept means that the propagation of time/information depends not only on the mass of matter but also on its temperature: as we go down toward absolute zero, entropy reduces toward zero as well, and essentially there should be no propagation of information through space. One might also say that once the thermal energy is zero, the speed of light/information for a given non-zero mass goes to zero as well. The phenomenon can be described as an effective temperature time dilation.
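One possible toy parameterization (purely my own illustration; $T_0$ is a free characteristic temperature, not derived from anything here):

$$\frac{d\tau}{dt} = 1 - e^{-T/T_0}, \qquad c_{\mathrm{eff}}(T) = c\left(1 - e^{-T/T_0}\right)$$

so that both the local rate of proper time and the effective speed of information go to zero as $T \to 0$, and recover their ordinary values for $T \gg T_0$.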
I liked the concept as it connects several factors (even though it might contradict some others :)). It could link quantum-level and very-large-scale behaviors.
Searching for macro-level confirmation, I thought that small deviations in the cosmic background temperature could expose the dilation function. This way, high-matter-density galaxy clusters could have a higher background temperature than low-density regions, leading to a smaller time dilation. Similarly for galaxy arms: lower proximity to the galactic center and to the arms could mean a lower background temperature, and thus smaller gradients of change (higher time dilation).
Potentially, the temperature-driven time dilation could explain lensing deviations. Assuming a very large and very cold body such as a black hole (not taking into account the infalling matter of accretion), the lensing effect would then increase in the proximity of the body. Within the model it would also be interesting to observe the impact of the high temperature gradients of active BH-star pairs within the accretion disk.
The low matter density of voids could also bias the time dilation, and thus effectively distort their assessed size.
It would be interesting to examine the temperature time dilation model on the CMB or on clumping. I believe in the beauty of math, but a numerical path can be faster to show whether it is viable or not.
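As a minimal numerical sketch (the function form follows the toy parameterization above; the value of T0 and the sample temperatures are my own rough assumptions), one could scan the dilation factor across typical environments:

```python
import numpy as np

def dilation_factor(T, T0=2.7):
    """Toy temperature-driven time dilation: dtau/dt = 1 - exp(-T/T0).
    T0 is a free characteristic temperature (arbitrarily set here to the
    CMB temperature, ~2.7 K); T is in kelvin."""
    return 1.0 - np.exp(-np.asarray(T, dtype=float) / T0)

# Rough, illustrative environment temperatures (orders of magnitude only)
environments = {
    "near absolute zero": 1e-3,
    "cosmic void":        2.7,
    "galactic arm gas":   1e2,
    "cluster gas":        1e7,
}

for name, T in environments.items():
    print(f"{name:>20}: T = {T:10.3g} K  ->  dtau/dt = {dilation_factor(T):.6f}")
```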
The concept might also resolve other contradictions around Dark Matter. It might explain the correlation between the amount of matter and the temperature in galaxies. Highly dynamic large systems might be explained by large-scale thermodynamics with the application of the model.
Could it be that the expansion of the Universe is biased by its decreasing intergalactic temperature and the associated time dilation?
The concept would reduce the need to introduce a new particle, or the need for a gravitational distortion adjustment.
#Thoughts from physicists’ asylum 😉
There is no standard, agreed-upon naming for generations, nor for their ranges; in the past, naming was driven by major events (e.g. post-WWII). Since we are at Gen Z, which leaves not much alphabet to go, it might be a good strategy to come up with a scalable method.
First, let’s start with the range, as it gives a sense of granularity. For X/Y/Z the range is 15 years, which might make sense as it is the age of reaching maturity, and thus of a changed behavioral pattern. However, taking into account the impact of exponential technological growth on accelerated maturity and on the rate of significant changes, 15-year sampling might be too coarse.
The proposal is to lower it to 10 years, for the reasons above and because it is easier to multiply by.
Now, as for the reference point, we could take 2020, or better the year 2000, as an easy baseline for adding and remembering multiples of 10.
The naming convention would then be a scalable decade ordinal counting from 2000: children born between 2000 and 2009 would be Gen0, those born in 2020-2029 would be Gen2, and so on.
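A minimal sketch of the rule (the function name is mine):

```python
def generation(birth_year: int) -> str:
    """Decade-ordinal generation label, counting from the year 2000.
    Floor division also handles pre-2000 births (e.g. 1995 -> Gen-1)."""
    return f"Gen{(birth_year - 2000) // 10}"

assert generation(2004) == "Gen0"
assert generation(2025) == "Gen2"
```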
Imagination gives us the ability to see ourselves from the outside and realize how we come across. If this kind of amazing recursive perception is (possibly) unique to human beings, it might have been a critical point in the evolution of the humanoid. Imagine what it was like to imagine yourself for the very first time! From there we could imagine ourselves in the past (remember) and then use past-to-present extrapolation to imagine ourselves in the future – the first 2nd-person (point-of-view) dreams.
To do that, we first had to acquire some important abilities: to make the first conscious predictions, and to imagine an alternative reality.
When we dream and extrapolate our imaginary future based on memories, our minds build a time-independent (or scalable), sampled (in the neural system) structure for a subset of subjective realities. Essentially, from this point we could build new alternative realities. At the very beginning it was only an extrapolation of existing reality (to predict and gain an evolutionary competitive advantage), but then extended deviation in prediction produced outliers – imagination of something totally unrelated to reality. What was that first time like? Maybe a two-headed dangerous predator giving birth to a fear of the imaginary? Maybe the desire for a mate producing an image of some beautiful creature – the first dreams of love?
Even today, animals “actively recall themselves” rather than just responding to a recorded past (a “state machine”), as we know happens in dreams: when they imagine themselves running, they move their legs.
And so it begins – the path to a self.
The Scientist’s Oath is about non-disclosure and protection of scientific discoveries from reaching military or other institutions, in case they provide such crucially high power that it could lead to disproportionate domination or discrimination, extinction of the human race, or irreversible damage to the environment.
How slow are humans, and the whole living world, going to be for a coming AI consciousness? Our perception of time is limited by our physiology: the propagation of our sensory and analytical signals has clear time-domain boundaries in resolution and latency. Similarly, for an AI, the perception of time would be limited by the design and architecture of its underlying systems.
Is there a way to calculate the relative perceptual time ratio between us and an AI?
For us, what would that indicator be? A maximum “clock” of neural activity? Atomic propagation of a signal within an ion channel? The width of an action potential (>1 ms)? The resonance of epilepsy (>2 Hz)? A spatial average of signal propagation latency within the brain? And what would it be for an AI? The system clock? Some KPI based on typical workloads?
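A back-of-the-envelope sketch under one arbitrary choice of indicators (action-potential width as the human “tick”, CPU clock as the AI “tick” – both are assumptions for illustration, not established measures):

```python
# Toy perceptual-time ratio: how many AI "ticks" fit into one human "tick".
# Both indicators below are arbitrary assumptions, chosen for illustration.
ACTION_POTENTIAL_WIDTH_S = 1e-3  # human "tick": ~1 ms action-potential width
AI_CLOCK_HZ = 3e9                # AI "tick": ~3 GHz system clock (assumed)

human_tick_rate_hz = 1.0 / ACTION_POTENTIAL_WIDTH_S  # ~1 kHz
ratio = AI_CLOCK_HZ / human_tick_rate_hz             # ~3 million

print(f"AI ticks per human tick: {ratio:.1e}")
```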
Anyway, once a conscious AI arises, we might appear stalled in time and space from its perspective, observed as a large static piece of art.