Naked Science Forum
On the Lighter Side => New Theories => Topic started by: larens on 01/04/2020 03:56:38

Since new theories here are on the "Lighter Side", one can only have general discussions about them. Any realistic new theory is otherwise too mathematically on the "Heavy Side". So let us assume that the next great breakthrough in science is going to transcend Newton's calculus. General relativity and quantum mechanics are based on tensor and complex calculus, respectively. The Standard Model is based on infinite groups on differentiable spaces. All this calculus is eventually based on natural numbers. To work with something truly different we need to go deeper into mathematics and use the sets in the Classification of finite simple groups, along with some logic. Let us assume that the next great breakthrough in science is going to be the greatest since Newton and use this deeper math. How is the person who comes up with this theory ever going to be heard in today's intellectual world?
There were so few students in Newton's day that sometimes he taught classes with no students. Today our degree mills churn out PhDs indoctrinated in the standard assumptions of our calculus-based scientific paradigm. Like a horde of locusts they fill up all the positions for professors, editors, peer reviewers, and grant administrators. Our creative new scientist is not going to have a PhD based on the new theory. How would they have been able to find an advisor? Academia is so specialized today that no one else is going to be knowledgeable in the broad scope of fields that a fundamentally new theory covers. The new scientist will have to have spent a long time as an autodidact to have learned so much on their own. This likely means that they will have to have renounced standard curricula since childhood and studied independently.
Once they come up with the basic theory they will have to overcome the barrier of being denounced as a "crackpot", since they will not have acquired the social status necessary to present a breakthrough theory. Their emails will be ignored. Their posts online will be deleted. Self-appointed gatekeepers will destroy threads that are not deleted by creating a torrent of superfluous objections and questions until almost no one else is willing to even read the threads. People will demand that they present the full theory in a form that can be understood by neophytes within the short attention span characteristic of the Internet. A century ago, the famous Indian autodidact Ramanujan had his work accepted by British mathematicians only because G. H. Hardy had second thoughts about throwing it away and took a second look. Ramanujan's work was broken into small pieces that Hardy could analyze individually. The validity of a new general theory can only be determined by analyzing its entire core, so our creative new scientist does not have Ramanujan's advantage.
The equivalent today of going to Hardy would be to go to Piet Hut, the Head of the Program in Interdisciplinary Studies at the Institute for Advanced Study in Princeton. A major difference is that Hut had to sue to retain his tenured position after string theorists tried to get him fired. They felt that his work was less important than their string theory, which was their pet candidate for the "Theory of Everything". Though Hut gets only minimal resources from the Institute, he has become successful at raising money and creating institutes elsewhere to pursue his work. Does our creative new scientist have to depend on Piet Hut, or are there other potentially successful strategies to be recognized? What do you think?

Since new theories here are on the "Lighter Side", one can only have general discussions about them. Any realistic new theory is otherwise too mathematically on the "Heavy Side".
This is an incorrect interpretation of the “Lighter Side”. It is only called the lighter side because the rules are relaxed compared to the educational side, so new theorists do not have to adhere to current accepted theories.
They are, however, free to provide as much mathematical depth as they wish, but it has been to our great sadness that most seem unwilling, and often unable, to do so, preferring to remain in vague and undefined regions.

There is a bit of humor in my post. When seeing the phrase "On the Lighter Side" I decided to take a new approach. My experience is that directly presenting new theory online to a general scientific audience is rarely satisfactory. I get bombarded by questions where the questioner often displays their inability to even use a search engine effectively. I sometimes tell people that, if I do not give a citation or link, Wikipedia has an adequate explanation.
Professional articles have abstracts. I would like to have feedback at the abstract level on what people think is important to them. As a writer I will have to keep them engaged by knowing this. When presenting material that violates common dogma, I may have to do a little brain-stretching along the lines of Enrico Fermi's famous question to his beginning physics students, "How many blind piano tuners are there in Chicago?"
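For readers unfamiliar with Fermi estimation, the piano-tuner question is answered by chaining rough assumptions. Every number in this sketch is an assumed round figure of my choosing, not data; only the method matters:

```python
# A Fermi estimate in the spirit of the piano-tuner question.
# All inputs below are assumed round numbers, not measurements.
population = 3_000_000            # Chicago, mid-20th century (assumed)
people_per_household = 4          # assumed
piano_fraction = 1 / 20           # fraction of households with a piano (assumed)
tunings_per_piano_per_year = 1    # assumed
tunings_per_tuner_per_day = 4     # assumed
working_days_per_year = 250       # assumed

pianos = population / people_per_household * piano_fraction
tunings_needed = pianos * tunings_per_piano_per_year
tunings_per_tuner = tunings_per_tuner_per_day * working_days_per_year
tuners = tunings_needed / tunings_per_tuner
print(round(tuners))  # an order-of-magnitude answer: a few dozen tuners
```

The point of the exercise is that the chained estimate lands within a factor of a few of reality, even though every input is a guess.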

I may have to do a little brain-stretching along the lines of Enrico Fermi's famous question to his beginning physics students, "How many blind piano tuners are there in Chicago?"
When you say "brain stretching", do you mean getting stuff wrong?
"How many blind piano tuners are there in Chicago?"

So... do you actually have a new idea to share with us?

@BoredChemist  Thank you for pointing out that Fermi's question with the word "blind" is an urban legend. The oral version I got at the University of Chicago several decades ago had already picked up the word "blind". I remember thinking about how "blind" made it a much more tantalizing question. Chicago ranks second only to Caltech among universities with the highest-IQ students. They had changed it, perhaps unconsciously, to the more tantalizing form.
@Kryptid  I presented several new ideas in my post. I have a great number of options for proceeding. What would you be most interested in, e.g., an overall summary, mathematical elaboration, the probable consequences of such a new theory?

Give us a summary with supporting evidence.

I will first give you a summary. The supporting evidence is far too large to fit together with the summary. Because the theory fits together like a large jigsaw puzzle, it will have to be broken down into manageable sections to be fully explained. In the meantime you will just have to take my word that it does correlate well with available data.
The starting point is a natural trinity of categories of existence, i.e., material, mathematics, and consciousness. This arises by breaking the triadic symmetry in the Equivalential Calculus by three relations to time. Material goes forward; mathematics is time invariant; consciousness is governed by forward and backward causality. The basic idea shows up in philosophy from Plato to Piet Hut. The Math > Material relation is basic to Einstein's 1933 lecture on "simplicity". In the 1960s algorithmic simplicity was developed, but did not give unique answers for the values of fundamental physical constants. The next necessary idea was to find privileged positions in the topology of natural numbers that arise from the orders of the sets in the Classification of finite simple groups (FSG). To get unique answers one also needs to use the principle of serendipity, i.e., mathematical expressions that contain an improbable correlation are the ones which occur in nature. The Material <-> Consciousness symmetry is used to match numerical values to observed data. These do not follow the conventional physicalist reductionism: physics > chemistry > biology > psychology > sociology.
The origin of life falls in a neutral position in this last symmetry, so is covered by a separate strongly classical theory that describes the origin of precellular life in a warm spring on a former satellite of the asteroid Vesta. This "blob" life developed cells on its surface, which were then transported to Earth.
The Mathieu group M24 has a privileged position in the FSG Classification. It gives the structure of the two extended Golay error-correcting codes. The binary code relates to a 12D chemistry that projects down onto 3D as quasicrystalline patterns. It explains the stability of the otherwise anomalous mineral icosahedrite. The ternary code relates to 6 revolutionary periods in the development of global culture extending from the Bølling climate warming 14,800 years ago to the present. Plato gave the length of these periods as a riddle in the Republic, which heretofore had not been solved. The correct answer is 21621600 (hours), Ramanujan's 13th superior highly composite number.
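As a purely arithmetic check of the span claimed above (the 365.2425-day calendar year is my assumption), six periods of 21621600 hours do come to roughly 14,800 years:

```python
# Check: do 6 periods of 21,621,600 hours span the ~14,800 years
# claimed, from the Bolling warming to the present?
hours_per_period = 21_621_600
periods = 6

days = periods * hours_per_period / 24
years = days / 365.2425          # Gregorian calendar year (my assumption)
print(years)                     # roughly 14,799.5 years
```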
Dark matter is explained by superfluid axion clouds following the rules of fourth-derivative conformal gravity. These rules also explain the size of galaxies and their clustering. The observed slight decrease in the strength of the electromagnetic and gravitational fields with redshift is explained by their generally irrational values being rational in the present. The topology of the universe is a Poincaré dodecahedral space. The universe is equally geocentric and nongeocentric. The alignment of distant patterns in the universe with the Ecliptic, derogatorily named the Axis of Evil, is the best-recognized geocentric phenomenon today. Much stronger ones are our beautiful solar eclipses. These arise because of the nearly equal angular diameters of the Sun and Moon, which are given by the theory. The psychological time periods of second, hour, day, and week are also given.
I will stop here for today, because any further listing of phenomena and values will be anticlimactic.

Conventional theory is included in the new theory by the principle of plenitude, because the orders of the cyclic groups are the prime numbers, which lie at the foundation of conventional theory. Other than introducing new explanations, the best evidence for a new theory resides in its ability to explain anomalies in older theory without introducing too many new ones. Currently there are two big problems in the theory of the vacuum. The first is resolved by having a derived value for the cosmological constant (dark energy). The second, the lack of stability of the vacuum during the Big Bang, is resolved by having a derived nominal value for the mass of the top quark just at the top of the range of stability. The slightly heavier observed mass can be explained by the finiteness in the new theory. The current epoch is a privileged time in the new theory, which gives the current age of the universe. Thus variable "constants", such as the cosmological constant, are given as their current values. The current entropy of the universe is derived from the Monster group family.
The derivation of quark and lepton masses uses Koide relations. Observed values, however, diverge widely from these relations around one Dalton, so another important principle needs to be discovered.
The five Platonic solids relate to the lengths of C-H and C-C bonds, the density of lithium, and the dalton/electron mass ratio. These, along with the aforementioned 12D projections, give a new, though complicated, basis for chemistry. The atomic mass number of the cold fusion-fission reaction complex appears in the theory, though I have not yet developed a general explanation for condensed matter nuclear reactions.
Through the irregularity of finite sequences the Lie groups of the FSG Classification give rational factors of order one in the derivation of some values, e.g., those for the Sun-Earth-Moon system. I interpret these as a group uncertainty principle arising from the finite bandwidth of observation. A while ago I set aside determining the application of the complete set of factors, but now have enough mathematical tools to take a new look at it.
I think this is enough to chew on for the moment. I have not given links to the important phrases, because I want people to first get a flavor of how the theory works. Most of them can easily be found by a simple search anyway.

The starting point is a natural trinity of categories of existence, i.e. material, mathematics, and consciousness.
OK, that's the first thing you need to provide evidence for.

What would you consider evidence? The natural trinity is one of the early axioms of the theory. I have been showing that they lead to elegant and efficacious science, which is the only way one can evaluate a fundamentally new theory.

The trouble with axioms is that they need to be clearly true (if you are modeling the real world).
It also needs to be clear what they mean.
What would you consider evidence?
Evidence is a reason to think that what you say is true.
What reason is there for me to accept your assertion?

Other than introducing new explanations, the best evidence for a new theory resides in its ability to explain anomalies in older theory without introducing too many new ones.
I'd say the best evidence would be to make falsifiable predictions that were found out to be true. Case in point, the gravitational lensing predicted by relativity.

@BoredChemist  The axioms of quantum mechanics are not "clearly true" to the vast majority of people. Only you can give a reason for which you might accept my assertion. I am not privy to your thoughts.
@Kryptid  One has to convince people of the probable validity of a new theory before they are willing to conduct experiments that may potentially falsify that theory. I have been proposing measuring the mass of the axion by a through-the-wall experiment as the best experiment to potentially falsify (at least that part of) my theory.
@whomever  So to make my summary more elegant, I need to explain my aforementioned phrase "topology of natural numbers". It covers a key piece of math that I found too large to explain early in my summary. Starting from FSG rather than Peano axioms makes the divisor function privileged. Classifying numbers by their divisibility then selects the primes, powers of ten, and superior highly composite (HC) numbers as the least, middle, and most highly divisible. Selecting three numbers from each sequence then makes a 3x3 array. For the first two sequences one just selects the first three, i.e., 2, 3, 5 and 10, 100, 1000. This does not work for the third sequence, because 2 is also a prime. The three criteria that select the three superior HC numbers are:
* 36, the square of 6, is the largest square that is HC.
* 5040 is the largest HC number into which all smaller HC numbers divide.
* 21621600 is the only superior HC number for which the ratio of prime factors (counted with multiplicity) to distinct prime factors is greater than 2.
The number in the middle of the array, 100, is the Goldilocks number, with a number of divisors neither too large nor too small, but just right. It is an argument of exponential functions that give many of the important values in the theory.
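The divisor-count claims above can be checked by brute force. This is a minimal sketch; the divisor-counting helper is my own scaffolding, not part of the theory:

```python
# Brute-force check of the highly composite (HC) number claims.
# An HC number sets a new record for the divisor count d(n).
def num_divisors(n):
    count = 0
    i = 1
    while i * i <= n:
        if n % i == 0:
            count += 2 if i * i != n else 1  # count both i and n//i
        i += 1
    return count

hc, best = [], 0
for n in range(1, 5041):
    d = num_divisors(n)
    if d > best:
        best = d
        hc.append(n)

print(hc)
# [1, 2, 4, 6, 12, 24, 36, 48, 60, 120, 180, 240, 360,
#  720, 840, 1260, 1680, 2520, 5040]
# 36 = 6^2 is on the list, and every smaller HC number divides 5040.
# 21621600 = 2^5 * 3^3 * 5^2 * 7 * 11 * 13 has 13 prime factors with
# multiplicity against 6 distinct primes, a ratio greater than 2.
```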

I have been proposing measuring the mass of the axion by a through-the-wall experiment as the best experiment to potentially falsify (at least that part of) my theory.
What does your model predict the mass of the axion to be?

The proposed mass of the axion is 0.529465 meV/c^2.

The proposed mass of the axion is 0.529465 meV/c^2.
That's extremely similar to the electron mass. One would wonder why we hadn't seen it in particle accelerators if it was in that range (or an equivalent lack of detection; that is, a lot of mass-energy suddenly going missing because it was converted into these axions).

"meV" is the abbreviation for millieV, not megaeV, whose abbreviation is "MeV". The reason that no one has yet detected axions is that no one has yet built a detector in the proper range. Astronomical constraints give a maximum mass slightly larger than my proposed value.

"meV" is the abbreviation for millieV, not megaeV, whose abbreviation is "MeV".
You are correct. My mistake.
Experiments are continuing to look for axions, so we may find out sooner rather than later whether you are right or not: https://phys.org/news/2020-03-axiom-dark-numerical.html

The ADMX collaboration is not relevant to my theory unless they really find something, because they are looking for masses over 100 times smaller than my proposed value. They are just looking under the streetlamp, because the theoretically interesting range is dark for lack of detectors. It requires pushing through-the-wall technology to lower masses by about an order of magnitude.

My most accurate proposed value is 137.0359990621 for the local low energy inverse fine structure constant. The CODATA empirical value is 137.035 999 084(21), so they are off by one standard deviation. I claimed a higher accuracy for the dalton/electron mass ratio, but there was an ambiguity in determining the least significant digits, so I am not now pushing it.
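The "one standard deviation" statement is plain arithmetic against the quoted CODATA value, where the trailing (21) denotes a standard uncertainty of 0.000000021; a minimal check:

```python
# Compare the proposed value with CODATA 137.035999084(21).
proposed = 137.0359990621
codata = 137.035999084
sigma = 0.000000021            # the (21) standard uncertainty

n_sigma = abs(codata - proposed) / sigma
print(n_sigma)                 # close to 1, i.e. about one standard deviation
```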

The axioms of quantum mechanics are not "clearly true" to the vast majority of people.
Nor are the laws of cricket.
But it's possible to deduce them by watching the game (for long enough).
In the same way, we can watch the universe and deduce the laws of QM.
However, your "axioms" don't seem to follow that pattern.
If there is supporting evidence for them, please show it.

My most accurate proposed value is 137.0359990621 for the local low energy inverse fine structure constant. The CODATA empirical value is 137.035 999 084(21), so they are off by one standard deviation. I claimed a higher accuracy for the dalton/electron mass ratio, but there was an ambiguity in determining the least significant digits, so I am not now pushing it.
Please show your working.

Nor are the laws of cricket.
But it's possible to deduce them by watching the game (for long enough).
In the same way, we can watch the universe and deduce the laws of QM.
However, your "axioms" don't seem to follow that pattern.
If there is supporting evidence for them, please show it.
The universe is more complicated than a cricket game. To find its rules one needs to use sophisticated instruments, e.g., atomic microscopes, large telescopes, and large particle colliders. To get started, however, one needs to analyze one's basic experience before going to a cricket game. Descartes pointed the way in saying, "I think, therefore I am." Therefore consciousness is the most certain category of being. Next he observed that his thoughts were orderly. Thus a way of putting things in order is necessary, which I am calling "language". Thirdly, we are constantly engaged with physical reality. This is the simple experience of the natural trinity.
You may ask, "Why don't we just continue and have 10 categories of being, as Aristotle proposed?" One needs to know when to switch to a deeper analysis. Aristotle also threw out the consideration of atoms, and thought that stars existed on a rotating sphere. He had not considered the limited resolution of eyes and the existence of inertial frames of reference. I am reminded of the message of the Muses in Plato's Republic. They say that empiricists are blind to their message.
The evidence for modern physics theories commonly involves extending results to higher accuracy. In previous replies I gave my high-accuracy predictions for the mass of the axion and the value of the inverse fine structure constant. The latter is really a prediction that the observed value is not going to change. I made the prediction many years ago. For several years the CODATA value shifted to a difference greater than two standard deviations. Now it is back to just one standard deviation. This implies that their value was flawed by a systematic error for a while. If this is not evidence, then what is?
My determination of the value of the inverse fine structure constant extends from the use of the number 100 in exponential functions that I mentioned in my last post. Explaining it fully will be fairly long. I need to set up separate threads for such specific explanations. Otherwise this thread will become unmanageable, since it is about a general theory that covers all branches of science. In the meantime I would like other people to address the question, "What form of evidence for this entire theory would I find most convincing?"

In the meantime I would like other people to address the question, "What form of evidence for this entire theory would I find most convincing?"
Since you made a prediction for the axion's mass, the ability to replicate the spectrum of other particle masses would be nice.

I agree that prime numbers have been a fruitful area of investigation for mathematicians since at least the ancient Greeks.
 And it's still important: Many published theorems in mathematics depend on the (as yet unproven) Riemann hypothesis
 The Riemann hypothesis relates to the prime numbers
 One mathematician described this proof as "One of the hardest ways to win a million dollars": There is a prize of 1 million dollars to the first mathematician who proves (or disproves) the Riemann hypothesis.
 Mathematicians think it is true, and I'm sure many hope it is true; their careers were based on this assumption.
All this calculus is eventually based on natural numbers. To work with something truly different we need to go deeper into mathematics
I suggest that a fruitful area of new mathematics has been fractals with their fractional dimensions, and chaos theory, with strange attractors.
We have been able to apply them to things like the length of coastlines and the behavior of heart rhythms. But important applications still exist in economics, ecosystems and climate.
 People tend to think "everything will continue as it has before", but the weight of humanity is now large enough to tip the scales of ecosystems and the climate.
 This is likely to dump us into unfamiliar territory
 So gaining an understanding of chaotic systems is important
See: https://en.wikipedia.org/wiki/Attractor
https://en.wikipedia.org/wiki/Fractal
Since new theories here are on the "Lighter Side"
I agree that the "Lighter Side" is a good place to start a discussion of philosophy.
Do expect that if it is to be accepted as a "Theory of Everything" with concrete predictions, then you will be expected to justify your assumptions.
There were so few students in Newton's day that sometimes he taught classes with no students.
Universities were shut down in 1665-1666 due to the plague.
 A bit like universities today with COVID-19
 The difference is that students can now view the sole lecturer from a remote location, and from a remote time zone.

I tossed out the value of the inverse fine structure constant as bait, since people were not being specific about what they regarded as evidence. I do not appreciate the strategy of people just demanding more and more evidence with no commitment to ever being satisfied.

It would be nice if the thread were not being shadow-banned since reply #20. I tossed out the value of the inverse fine structure constant as bait, since people were not being specific about what they regarded as evidence. I do not appreciate the strategy of people just demanding more and more evidence with no commitment to ever being satisfied.
I was rather specific in reply #24.

@evan_au
* Highly composite numbers have also been important since Ramanujan's 1915 article on them. Robin's Theorem connects them to the Riemann Hypothesis.
* The derivation of the value of the inverse fine structure constant uses fractional dimensions. I have used fractals themselves in interpreting reductionism within my theory.
* The solar system is on the boundary of chaos. While this is probably important at a deeper level, I have instead used semiclassical chemistry to describe the origin of life in the solar system.
* Fractals and chaos are not key to deriving fundamental constants. Finite simple groups are.
* Our main disagreement has been on how to order the justification of my assumptions. A "Theory of Everything" is so large that this is a nontrivial problem. I would start at a middle level of abstraction, then alternate between building superstructure and foundations. This still leaves a lot of choice, e.g., how to order the explanation of different disciplines. When I am presented with a criterion that was accepted in Euclid's day, but not since at least the time of Copernicus, I do not think that person is being intellectually responsible.
* Newton did not start teaching until after the plague.
@Kryptid
I explained in reply #8 that the part of the theory giving particle masses was incomplete, so I set it aside for things that are more accurate. In the limit of free particles, it is fairly accurate. The geometry behind the Koide relations is also unintuitive enough that I would not want to start there.

I tossed out the value of the inverse fine structure constant as bait, since people were not being specific about what they regarded as evidence.
And I asked you how you arrived at the number you got.
My most accurate proposed value is 137.0359990621 for the local low energy inverse fine structure constant. The CODATA empirical value is 137.035 999 084(21), so they are off by one standard deviation. I claimed a higher accuracy for the dalton/electron mass ratio, but there was an ambiguity in determining the least significant digits, so I am not now pushing it.
Please show your working.
I'm still waiting.

The value of the inverse fine structure constant (VIFSC) is a good example of a serendipity, i.e., improbable coincidence. Its derivation includes the quotient of two large numbers minus a privileged number being less than one. It uses five numbers privileged by being in a symmetrical pair or central line of the array I described in reply #13. These are 2, 5, 10, 100, and 1000. The primordial VIFSC is 1000[(product of first 2^5 primes)/(pi^100) - 10]. Pi is paired with e. The ratio of primordial gravitational forces between a pair of reduced Planck masses and a pair of electrons is e^100. The local VIFSC is 137 + 36/1000 - 937900/10^12. The integral part of the primordial VIFSC is 137. The largest highly composite number that is a square is 36. It is also used in the derivation of 937900, the ratio of the speed of light to the unit velocity of the molecular system of units in the theory. The speed of sound in the atmosphere is about one unit velocity. 937900 is derived by an inverse preferred number operation. It has 36 divisors with consecutive divisors nominally differing by a factor of the square root of 2 before rounding. The exponents of the powers of 1000 fall in a sequence of successive doubling. The nominal mass of the heaviest particle, the tritop, in electron masses is 1000^2. The denominator of the last term in the local VIFSC is 1000^4. The minus sign arises because of the relativity of values. My including the associated fundamental constants allows you to see how the theory fits together like a jigsaw puzzle.
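Taken purely as arithmetic, the two expressions above can be evaluated directly. This sketch only reproduces the stated formulas, it does not justify them; the prime-generation loop is my own scaffolding:

```python
import math
from fractions import Fraction

# Primordial VIFSC: 1000 * [(product of first 2^5 primes) / pi^100 - 10]
primes = []
n = 2
while len(primes) < 32:                  # 2^5 = 32 primes
    if all(n % p for p in primes):       # trial division by smaller primes
        primes.append(n)
    n += 1

primorial = math.prod(primes)            # product of the first 32 primes
primordial = 1000 * (primorial / math.pi**100 - 10)
print(int(primordial))                   # integer part is 137, as claimed

# Local VIFSC: 137 + 36/1000 - 937900/10^12, evaluated exactly
local = Fraction(137) + Fraction(36, 1000) - Fraction(937900, 10**12)
print(float(local))                      # 137.0359990621
```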

In the meantime I would like other people to address the question, "What form of evidence for this entire theory would I find most convincing?"
Since you made a prediction for the axion's mass, the ability to replicate the spectrum of other particle masses would be nice.
Wikipedia has a short introduction to the Koide relations:
https://en.wikipedia.org/wiki/Koide_formula
To apply them to quarks and leptons one needs the masses of the top quark and the tau in electron masses. The tritop consists of 3 top quarks, so the top quark mass is 1/3 that of the tritop, which I gave as one million in reply #30.
The tau mass is approximately the sum of the first 2^5 primes, so it pairs with the main factor in the numerator of the primordial VIFSC in my last reply.
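For reference, the Koide relation itself is easy to check against the measured charged-lepton masses; the PDG-style values in MeV below are my inputs, and the relation predicts Q = 2/3:

```python
import math

# Measured charged-lepton masses in MeV/c^2 (approximate PDG values)
m_e, m_mu, m_tau = 0.51099895, 105.6583755, 1776.86

# Koide's formula: Q = (m1 + m2 + m3) / (sqrt(m1) + sqrt(m2) + sqrt(m3))^2
Q = (m_e + m_mu + m_tau) / (math.sqrt(m_e) + math.sqrt(m_mu) + math.sqrt(m_tau))**2
print(Q)  # very close to 2/3 = 0.6667
```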

I suggest that a fruitful area of new mathematics has been fractals with their fractional dimensions, and chaos theory, with strange attractors.
The non-Lie types of finite simple groups (FSG) are fundamentally more important than fractals and chaos, because they allow the derivation of fundamental constants. Lie groups are differentiable, so have an infinite number of points in their manifolds. Infinite sets are usually pathological for deriving constants, because infinity + anything = infinity. In reply #7 I started to discuss how extending the concept of simplicity leads to unique constants for the universe. The basic idea of simplicity is that you describe something large from a minimum number of axioms. FSG have just 6 axioms: closure, associativity, identity, invertibility, finiteness, and simplicity (indivisibility). FSG are a large set of mathematical building blocks. The Classification theorem ensures that one knows of all of them when using them.

I'm still waiting.

I'm still waiting.
You may be waiting a long time. I am not going to explain my entire theory to someone who is not seriously engaged in considering my work and has a strategy of just perfunctorily demanding more.

I'm still waiting.
You may be waiting a long time. I am not going to explain my entire theory to someone who is not seriously engaged in considering my work and has a strategy of just perfunctorily demanding more.
What I am "demanding" is the bare minimum.
Good luck with your studies.
"How is the biggest scientific breakthrough since Newton to be recognized?"
It doesn't look like we will find out any time soon.

What I am "demanding" is the bare minimum.
Once again you are making a perfunctory demand while refusing to define your terms. Supposedly this is the "soft side" of the forum. I have presented enough, however, that someone knowledgeable in fundamental mathematical science should be able to easily understand the essence of my theory. I note a double standard here. Contemporary scientists have been able to derive ZERO fundamental constants. They have been allowed to get away with espousing superficial philosophy, e.g., Tegmark's mathematical universe and the multiverse-based Anthropic Principle.

Unlike with the quarks and leptons, I never showed how to derive the axion mass. It is the mass equivalent to the kinetic energy of one Dalton moving at one unit velocity of the theory, i.e., c/937900, which I explained in reply #30. Neutrino/axion mass ratios are 2, 18, and 98. These values are squares times two (2 = 2x1^2, 18 = 2x3^2, 98 = 2x7^2). Neutrino masses can only be measured as differences of mass squared. The two is necessary to cancel the 1/2 in the kinetic energy formula.
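The recipe above is at least arithmetically reproducible. Using the CODATA mass-energy of the dalton (931.49410242 MeV, my input), the kinetic energy of one dalton at v = c/937900 does come out at the quoted figure:

```python
# Mass-energy of one dalton in eV (CODATA: 931.49410242 MeV/c^2)
dalton_eV = 931.49410242e6

# Non-relativistic kinetic energy at v = c/937900, in eV, with c = 1:
# E = (1/2) * m * (v/c)^2
axion_meV = 0.5 * dalton_eV / 937900**2 * 1000   # convert eV -> meV
print(axion_meV)  # approximately 0.529465, the proposed axion mass in meV/c^2
```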

Once again you are making a perfunctory demand while refusing to define your terms.
Do you really need an explanation of "Please show your working" next to a quote where you tell us your estimate of the fine structure constant? OK, if you insist.
Show us the calculation and the inputs.
As I say, that's the bare minimum that will get you taken remotely seriously.

Show us the calculation and the inputs.
You sound like a math teacher. This is not a math class. Scientific papers do not show arithmetic. They give algebraic expressions and any inputs not standardly known. This is the minimum for scientific papers. They also give a context to support the rationalization for using the given expressions and inputs. All this I minimally gave in reply #30.

All this I minimally gave in reply #30.
You gave numerology.
Were you expecting to be taken seriously?

You gave numerology.
Were you expecting to be taken seriously?
Yes, I was giving numerology. It is the only way one can derive fundamental constants from pure math. Numerology fell into disfavor after early 20th century physicists, particularly Eddington, failed to make it work.
I am used to not being taken seriously, because disfavor has become dogma. That is why I posted my next thread on the origin of life. It uses noncontroversial chemistry, rather than numerology. I still have to overcome the dogma that a wet planet promotes early life, rather than being a toxic environment for it.

Quote from evan_au on 03/04/2020 22:22:42
"I suggest that a fruitful area of new mathematics has been fractals with their fractional dimensions, and chaos theory, with strange attractors."
The Solar system is on the boundary of chaos. A simulation that included the planets and the asteroids Ceres and Vesta was only able to make significant projections 70 million years into the future, because it was chaotic. This is evidence for the mathematization of reality, where the origin of life lies on a neutral point in two dimensions. Firstly, it is the biotic/abiotic point on the scale of complexity running from particle physics to global history. Secondly, it is a periodic/chaotic point in the astronomical realm.
The peculiar interaction between Ceres and Vesta is not necessary for the origin of life. All that is necessary is that Vesta and its satellite are removed from near Earth orbit so that they do not collide with Earth and destroy the life that has been seeded there. This could be accomplished more easily by just colliding with another inner Solar system planet. To have a unique mathematization the periodic/chaotic transition needs to be in a privileged location. Combining it with the biotic/abiotic point logically satisfies this constraint.

Quote from Kryptid on: 02/04/2020 21:07:14
I'd say the best evidence would be to make falsifiable predictions that were found out to be true. Case in point, the gravitational lensing predicted by relativity.
My most accurate proposed value is 137.0359990621 for the local low energy inverse fine structure constant. The CODATA empirical value is 137.035 999 084(21), so they are off by one standard deviation.
Researchers at UW-Seattle and UC-Berkeley have developed better atom interferometers that trap atoms in an optical lattice. This will allow a 16-fold more accurate measurement of the fine structure constant. Since they have already built the instruments, these should be the first tests to potentially falsify one of my predictions. A slightly different result would not falsify my entire prediction, however, just the least significant digits of it. It would show that this part of my theory is at least incomplete. My prediction here is connected to a couple of other results: the primordial value of the fine structure constant, and the value of the Dalton/electron mass ratio.