@BoredChemist - The axioms of quantum mechanics are not "clearly true" to the vast majority of people. Only you can give a reason for which you might accept my assertion. I am not privy to your thoughts.
@Kryptid - One has to convince people of the probable validity of a new theory before they are willing to conduct experiments that may potentially falsify that theory. I have been proposing measuring the mass of the axion by a through-the-wall experiment as the best experiment to potentially falsify (at least that part of) my theory.
@whomever - So to make my summary more elegant, I need to explain my aforementioned phrase "topology of natural numbers". It covers a key piece of math that I found too large to explain early in my summary. Starting from FSG rather than the Peano axioms makes the divisor function privileged. Classifying numbers by their divisibility then selects the primes, powers of ten, and superior highly composite (HC) numbers as the least, middle, and most highly divisible. Selecting three numbers from each sequence then makes a 3x3 array. For the first two sequences one just selects the first three, i.e., 2, 3, 5 and 10, 100, 1000. This does not work for the third sequence, because 2 is also a prime. The three criteria that select the three superior HC numbers are:
* 6 is the square root of 36, which is the largest square that is HC.
* 5040 is the largest HC number into which all smaller HC numbers divide.
* 21621600 is the only superior HC number for which the ratio of prime divisors (counted with multiplicity) to unique prime divisors is greater than 2.
The number in the middle of the array, 100, is the Goldilocks number, with a number of divisors neither too large nor too small, but just right. It is an argument of exponential functions that give many of the important values in the theory.
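The three divisibility facts above are easy to check directly. Here is a minimal Python sketch; the helper functions are my own, not part of the theory:

```python
def num_divisors(n):
    """Count divisors of n by trial division up to sqrt(n)."""
    count, i = 0, 1
    while i * i <= n:
        if n % i == 0:
            count += 1 if i * i == n else 2
        i += 1
    return count

def highly_composite_upto(limit):
    """Highly composite numbers: more divisors than any smaller number."""
    best, hcs = 0, []
    for n in range(1, limit + 1):
        d = num_divisors(n)
        if d > best:
            best = d
            hcs.append(n)
    return hcs

def prime_factor_counts(n):
    """Return (prime factors with multiplicity, distinct prime factors)."""
    total, distinct, p = 0, 0, 2
    while p * p <= n:
        if n % p == 0:
            distinct += 1
            while n % p == 0:
                n //= p
                total += 1
        p += 1
    if n > 1:
        total += 1
        distinct += 1
    return total, distinct

hcs = highly_composite_upto(5040)
print(all(5040 % h == 0 for h in hcs))             # 5040 divisible by every smaller HC number
print(max(h for h in hcs if int(h**0.5)**2 == h))  # 36 is the largest HC square in this range
total, distinct = prime_factor_counts(21621600)    # 21621600 = 2^5 * 3^3 * 5^2 * 7 * 11 * 13
print(total, distinct, total / distinct > 2)       # 13 with multiplicity, 6 distinct, ratio > 2
```

Note the last check only confirms that 21621600 itself has the stated ratio; verifying it is the *only* superior HC number with that property would require scanning the whole sequence.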
What would you consider evidence? The natural trinity is one of the early axioms of the theory. I have been showing that they lead to elegant and efficacious science, which is the only way one can evaluate a fundamentally new theory.
Conventional theory is included in the new theory by the principle of plenitude, because the orders of the cyclic groups are the prime numbers, which lie at the foundation of conventional theory. Other than introducing new explanations, the best evidence for a new theory resides in its ability to explain anomalies in older theory without introducing too many new ones. Currently there are two big problems in the theory of the vacuum. The first is resolved by having a derived value for the cosmological constant (dark energy). The second, the lack of stability of the vacuum during the Big Bang, is resolved by having a derived nominal value for the mass of the top quark just at the top of the range of stability. The slightly heavier observed mass can be explained by the finiteness in the new theory. The current epoch is a privileged time in the new theory, which gives the current age of the universe. Thus variable "constants", such as the cosmological constant, are given as their current values. The current entropy of the universe is derived from the Monster group family.
The derivation of quark and lepton masses uses Koide relations. Observed values, however, diverge widely from these relations around one Dalton, so another important principle needs to be discovered.
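For readers unfamiliar with it, the original Koide relation for the charged leptons is a concrete, checkable statement: Q = (m_e + m_mu + m_tau) / (sqrt(m_e) + sqrt(m_mu) + sqrt(m_tau))^2 is empirically very close to 2/3. A quick check with PDG mass values (which I supply here; the tau mass carries roughly 0.1 MeV of uncertainty):

```python
from math import sqrt

# Charged-lepton masses in MeV (PDG values, supplied by me for illustration)
m_e, m_mu, m_tau = 0.51099895, 105.6583755, 1776.86

# Koide's Q = (sum of masses) / (sum of square roots of masses)^2
Q = (m_e + m_mu + m_tau) / (sqrt(m_e) + sqrt(m_mu) + sqrt(m_tau)) ** 2
print(Q)  # very close to 2/3
```

How the theory extends this pattern to the quarks, and where it breaks down near one dalton, is the part that remains to be worked out.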
The five Platonic solids relate to the lengths of C-H and C-C bonds, the density of lithium, and the dalton/electron mass ratio. These, along with the aforementioned 12-D projections, give a new, though complicated, basis for chemistry. The atomic mass number of the cold fusion-fission reaction complex appears in the theory, though I have not yet developed a general explanation for condensed matter nuclear reactions.
Through the irregularity of finite sequences the Lie groups of the FSG Classification give rational factors of order one in the derivation of some values, e.g., those for the Sun-Earth-Moon system. I interpret these as a group uncertainty principle arising from the finite bandwidth of observation. A while ago I set aside determining the application of the complete set of factors, but now have enough mathematical tools to take a new look at it.
I think this is enough to chew on for the moment. I have not given links to the important phrases, because I want people to first get a flavor of how the theory works. Most of them can easily be found by a simple search anyway.
I will first give you a summary. The supporting evidence is far too large to fit together with the summary. Because the theory fits together like a large jigsaw puzzle, it will have to be broken down into manageable sections to be fully explained. In the meantime you will just have to take my word that it does correlate well with available data.
The starting point is a natural trinity of categories of existence, i.e., material, mathematics, and consciousness. This arises by breaking the triadic symmetry in the Equivalential Calculus by three relations to time. Material goes forward; mathematics is time invariant; consciousness is governed by forward and backward causality. The basic idea shows up in philosophy from Plato to Piet Hut. The Math -> Material relation is basic to Einstein's 1933 lecture on "simplicity". In the 1960s algorithmic simplicity was developed, but did not give unique answers for the values of fundamental physical constants. The next necessary idea was to find privileged positions in the topology of natural numbers that arise from the order of sets in the Classification of finite simple groups (FSG). To get unique answers one also needs to use the principle of serendipity, i.e., mathematical expressions that contain an improbable correlation are the ones which occur in nature. The Material <-> Consciousness symmetry is used to match numerical values to observed data. These do not follow the conventional physicalist reductionism: physics -> chemistry -> biology -> psychology -> sociology.
The origin of life falls in a neutral position in this last symmetry, so is covered by a separate strongly classical theory that describes the origin of precellular life in a warm spring on a former satellite of the asteroid Vesta. This "blob" life developed cells on its surface, which were then transported to Earth.
The group Mathieu-24 has a privileged position in the FSG Classification. It gives the structure of the two extended Golay error-correcting codes. The binary code relates to a 12-D chemistry that projects down onto 3-D as quasicrystalline patterns. It explains the stability of the otherwise anomalous mineral icosahedrite. The ternary code relates to 6 revolutionary periods in the development of global culture extending from the Bølling climate warming 14,800 years ago to the present. Plato gave the length of these periods as a riddle in the Republic, which heretofore had not been solved. The correct answer is 21621600 (hours), Ramanujan's 13th superior highly composite number.
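The arithmetic linking 21621600 hours to the 14,800-year span is easy to check. The hours interpretation is the claim above; the 365.25-day Julian year is my own assumption for the conversion:

```python
hours = 21621600       # Ramanujan's 13th superior highly composite number
days = hours / 24      # 900900 days per period
years = days / 365.25  # one period in Julian years (my convention)
print(years)           # roughly 2466.5 years per period
print(6 * years)       # six periods span roughly 14799 years, near the stated 14,800
```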
Dark matter is explained by superfluid axion clouds following the rules of 4th-derivative conformal gravity. These rules also explain the size of galaxies and their clustering. The observed slight decrease in the strength of the electromagnetic and gravitational fields with redshift is explained by their generally irrational values being rational in the present. The topology of the universe is a Poincaré Dodecahedral Space. The universe is equally geocentric and nongeocentric. The alignment of distant patterns in the universe with the Ecliptic, derogatorily named the Axis of Evil, is the best recognized geocentric phenomenon today. Much stronger ones are our beautiful solar eclipses. These arise because of the nearly equal angular diameters of the Sun and Moon, which are given by the theory. The psychological time periods of second, hour, day, and week are also given.
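The near-equality of the two angular diameters is standard astronomy and can be confirmed from mean distances. The figures below are textbook mean values that I supply for illustration, not outputs of the theory:

```python
from math import degrees

# Mean diameters and distances in km (standard textbook values)
SUN_DIAM, SUN_DIST = 1.3914e6, 1.496e8
MOON_DIAM, MOON_DIST = 3474.8, 3.844e5

# Small-angle approximation: angular size ~ diameter / distance (radians)
sun_deg = degrees(SUN_DIAM / SUN_DIST)
moon_deg = degrees(MOON_DIAM / MOON_DIST)
print(sun_deg, moon_deg)  # both about half a degree
print(sun_deg / moon_deg) # ratio about 1.03 at mean distances
```

Because the Moon's orbit is eccentric, the instantaneous ratio swings through 1, which is why both total and annular eclipses occur.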
I will stop here for today, because any further listing of phenomena and values will be anticlimactic.
@BoredChemist - Thank you for pointing out that Fermi's question with the word "blind" is an urban legend. The oral version I got at the University of Chicago several decades ago had already picked up the word "blind". I remember thinking about how "blind" made it a much more tantalizing question. Chicago is said to rank second only to Caltech among universities in the IQ of its people. They had changed the question, perhaps unconsciously, to the more tantalizing form.
@Kryptid - I presented several new ideas in my post. I have a great number of options for proceeding. What would you be most interested in, e.g., an overall summary, mathematical elaboration, the probable consequences of such a new theory?
There is a bit of humor in my post. When seeing the phrase "On the Lighter Side" I decided to take a new approach. My experience is that directly presenting new theory online to a general scientific audience is rarely satisfactory. I get bombarded by questions where the questioner often displays their inability to even use a search engine effectively. I sometimes tell people that, if I do not give a citation or link, Wikipedia has an adequate explanation.
Professional articles have abstracts. I would like to have feedback at the abstract level on what people think is important to them. As a writer I will have to keep them engaged by knowing this. When presenting material that violates common dogma, I may have to do a little brain-stretching along the lines of Enrico Fermi's famous question to his beginning physics students, "How many blind piano tuners are there in Chicago?"
Since new theories here are on the "Lighter Side", one can only have general discussions about them. Any realistic new theory is otherwise too mathematically on the "Heavy Side". So let us assume that the next great breakthrough in science is going to transcend Newton's calculus. General relativity and quantum mechanics are based on tensor and complex calculus, respectively. The Standard Model is based on infinite groups on differentiable spaces. All this calculus is eventually based on natural numbers. To work with something truly different we need to go deeper into mathematics and use the sets in the Classification of finite simple groups, along with some logic. Let us assume that the next great breakthrough in science is going to be the greatest since Newton and use this deeper math. How is the person who comes up with this theory ever going to be heard in today's intellectual world?
There were so few students in Newton's day that sometimes he taught classes with no students. Today our degree mills churn out PhDs indoctrinated in the standard assumptions of our calculus-based scientific paradigm. Like a horde of locusts they fill up all the positions for professors, editors, peer reviewers, and grant administrators. Our creative new scientist is not going to have a PhD based on the new theory. How would they have been able to find an advisor? Academia is so specialized today that no one else is going to be knowledgeable in the broad scope of fields that a fundamentally new theory covers. The new scientist will have to have spent a long time as an autodidact to have learned so much themselves. This likely means that they will have to have renounced standard curricula since childhood and studied on their own.
Once they come up with the basic theory they will have to overcome the barrier of being denounced as a "crackpot", since they will not have acquired the social status necessary to present a breakthrough theory. Their emails will be ignored. Their posts online will be deleted. Self-appointed gatekeepers will destroy threads that are not deleted by creating a torrent of superfluous objections and questions until almost no one else is willing to even read the threads. People will demand that they present the full theory in a form that can be understood by neophytes within the short attention span characteristic of the Internet. A century ago the famous Indian autodidact Ramanujan only had his work accepted by British mathematicians because G. H. Hardy had second thoughts about throwing it away and took a second look. Ramanujan's work was broken into small pieces that Hardy could analyze individually. The validity of a new general theory can only be determined by analyzing its entire core, so our creative new scientist does not have Ramanujan's advantage.
The equivalent today of going to Hardy would be to go to Piet Hut, the Head of the Program in Interdisciplinary Studies at the Institute for Advanced Study in Princeton. A major difference is that Hut had to sue to retain his tenured position after string theorists tried to get him fired. They felt that his work was less important than their string theory, which was their pet candidate for the "Theory of Everything". Though Hut gets only minimal resources from the Institute, he has become successful at raising money and creating institutes elsewhere to pursue his work. Does our creative new scientist have to depend on Piet Hut or are there other potentially successful strategies to be recognized? What do you think?