You know

I actually like my third definition, where we find light to be a constant as locally measured, but from a 'container reasoning' also find its speed to depend on mass. That will give you time dilations when comparing between frames of reference, but constants locally. What one can notice from such a reasoning is that a 'propagation' becomes a very complex behavior, especially if we consider the experiments done by NIST. Also that there are several possible ways to define what a frame of reference would be, macroscopically versus microscopically. It still should place a 'real' (as in a proven 'twin experiment') time dilation as an effect between frames of reference to make sense though, as it presumes your local clock and 'c' to be one and the same, everywhere. And it also points out the difference between using a 'container model' versus a local interpretation. If you instead of this propagation postulate a field with excitations it should become different though, giving us a propagation as locally measurable, but theoretically becoming an expression of 'circumstances' and probabilities, defining the existence of a locally measured 'excitation' at some position in this field (time and space). The 'circumstances' involve both SpaceTime and the nature of the experiment, what it is defined to measure.
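As a hedged aside (my numbers, not part of the argument above): the NIST optical-clock results are usually summarized with the weak-field formula Δf/f ≈ gΔh/c², and a few lines show just how small a centimeter-scale dilation is. The heights are example values of my own, with 0.33 m roughly matching the height difference in the 2010 NIST clock comparison.

```python
# Sketch: weak-field gravitational time dilation between two clocks
# separated by a small height difference, delta_f/f ~ g*h/c^2.
# Heights are illustrative; 0.33 m is roughly the NIST 2010 setup.

g = 9.81            # m/s^2, surface gravity
c = 2.99792458e8    # m/s, speed of light

def fractional_shift(height_m: float) -> float:
    """Fractional rate difference of a clock raised by height_m."""
    return g * height_m / c**2

for h in (0.33, 1.0, 1000.0):
    print(f"h = {h:8.2f} m -> delta_f/f ~ {fractional_shift(h):.2e}")
```

At 0.33 m the shift is a few parts in 10^17, which is why it took optical clocks to see it.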

=

But I would expect us to need to change our ideas of what a 'propagation' is. What I mean by stating that you can define a frame of reference differently is the difference between an ideal description, as with me having my ideal 'local clock', versus one where we microscopically define frames of reference as consisting of quanta or 'bits', as when I use the Planck scale to define a smallest theoretical 'unit' of time. Both are in some sense ideal descriptions, but a 'quantum of time' states that going past it should change the nature of the things defined. Furthermore, those 'quanta' should then be the constants creating this SpaceTime macroscopically. I'm not sure if this changes loops and strings, unless you define a local arrow to 'each one' of them? If you do, then time should become a smooth phenomenon, with the Planck scale as a first ordered pattern.
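To put a number on that smallest 'unit': a minimal sketch (my addition, standard CODATA constants, not part of the argument itself) computing the Planck time t_P = sqrt(ħG/c⁵):

```python
import math

# Sketch: the Planck time, the scale treated above as a candidate
# smallest 'unit' of time. Constants are CODATA recommended values.

hbar = 1.054571817e-34   # J*s, reduced Planck constant
G = 6.67430e-11          # m^3 kg^-1 s^-2, gravitational constant
c = 2.99792458e8         # m/s, speed of light

t_planck = math.sqrt(hbar * G / c**5)
print(f"Planck time ~ {t_planck:.2e} s")  # ~5.39e-44 s
```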

Actually I don't think you need to do that. If you define it the first way, our type of 'local arrow' ending at the Planck scale, you come to a symmetry break. Or maybe you do; it's in a sense two ways of looking at time, coexisting. One makes the arrow we measure, as described over a 'universal container'. The other is not an arrow in that sense, more like some 'keeping of a beat' to me. If you look at some 'SpaceTime unfolding' that way, it's more like static sheets of patterns, each one lit up momentarily, 'flickering past' through that beat, the one that makes the anchor for our repeatable experiments. That one depends on what you expect an arrow to be.

You can think of the one assuming a speed to automatically 'change' due to mass as equivalent to the way your local arrow always tells you the same about your local life span; it never changes. If you do so, a 'curved space' becomes a really tricky proposition, as we find gravitational time dilations at centimeters. Then weigh that against the fact that you only can measure light once. Using the description of a 'light path' is an assumption we make.

And yes, it gives us two ways to define this universe of ours. One macroscopic, the other microscopic. One consisting of a light sphere, defining a time of the universe. The other describing it through scaling. From scaling, the symmetry break is constantly here; it never went anywhere, it's just scales and patterns. From the other, a 'universal time exists' and you can prove it astronomically, anywhere you go.

and you definitely need 'patterns & sheets' to describe it; put them together and you gain your 'field'. The 'field' is an assumption made, resting on the way we observe dimensions existing, and assume a universe with a past, a present, and a future to exist. The 'sheet' is each 'instant of existence' that we can measure in. If it 'flickers' through that beat you won't notice it.

this type of universe sounds very deterministic, doesn't it?

But that's not what probabilities tell us. It's still probabilistic, in each measurement at a position in time and space. And from that you then may want to define something more, 'containing' all probabilities, or do as I do and end it at the Planck scale, defining what's behind that to have it all.

Finally, this should then be what makes the 'dimensions' we measure in. Myself, I've never been fully comfortable with the idea of 'premade dimensions' from which a SpaceTime unfolds, and if you look at Einstein, his ideas tell us that the universe actually is observer dependent. It can 'shrink' in your direction of 'motion'. Looking at it from a 'field', it stops having to do with some 'energy' needed to contract it (a whole universe) for the observer. It's still energy needed as a coin of exchange naturally, but it changes our definition of what this 'infinite universe' is. It's not a container; it's more a matter of 'limits'. Although to anyone consciously existing and comparing inside it, it will have all the qualities we expect a container to have: volumes and time, dimensions. The dimensions should be created through 'c' communicating, also making a local arrow in which each one of us unfolds. But this last is a really difficult thing to define.

=

What one really needs to understand is the difference between a container model and defining it strictly locally. Strictly locally, 'c' is 'c'. You can split any acceleration infinitesimally, and if using the Planck scale the way I do, presumably stop there. That becomes each 'static sheet' of pattern if we now instead use a global model. There the acceleration disappears, as in becoming unmeasurable, although we have to assume its 'property' to still exist. Doing so, 'c' will hold everywhere. And locally defined, you can ignore any ideas of a 'global container'.

So locally, 'c' is always 'c', using those definitions. In a normal container model you step away from that minuscule scale into a macroscopic definition of an arrow. That one is defined by ideals, like 'the universal time defining a Big Bang' and 'frames of reference'. There you will find Lorentz contractions and time dilations. There you can define 'light paths', 'SpaceTime curvatures', and an 'infinite' but still 'contained' universe. And in that universe, giving light a path 'interacting' with mass, you also can define it as 'slowing down', as you might do in an acceleration. It's all in the equivalence principle.
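For concreteness, the container-model bookkeeping of those Lorentz contractions and time dilations reduces to the Lorentz factor γ = 1/√(1 − v²/c²). A short sketch (the example speeds are mine, chosen for illustration):

```python
import math

# Sketch: for a frame moving at speed v, the Lorentz factor gamma gives
# both the time dilation (a moving clock runs slow by gamma) and the
# length contraction (lengths shrink by 1/gamma along the motion).

C = 2.99792458e8  # m/s, speed of light

def gamma(v: float) -> float:
    """Lorentz factor for speed v (v < C)."""
    return 1.0 / math.sqrt(1.0 - (v / C)**2)

for beta in (0.1, 0.5, 0.9, 0.99):
    g = gamma(beta * C)
    print(f"v = {beta:.2f}c  gamma = {g:6.3f}  "
          f"1 s local -> {g:.3f} s observed, 1 m -> {1/g:.3f} m")
```

Locally nothing changes, exactly as the text argues; gamma only shows up when one frame describes another.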

but if you also find an equivalence between the local arrow and that speed, it doesn't matter. 'c' is a constant both ways.

(hmm, that shouldn't be read as me considering mass to be an expression of 'slow time' though; you just need to consider observer dependencies and uniform motion to see where such an idea takes you. It's just a statement that 'c' and your local clock are equivalent. So, no matter how a far observer defines my clock, locally it will behave as always, as will everything 'at rest' with me. And so will my local measurement of 'c' still be 'c'.)

Although, find a way to redefine what happens in uniform motions, their time dilations and Lorentz contractions, to fit the idea of gravitational opposites, and I might become interested. But you need to do it from this equivalence of a local clock to a local speed of light in a vacuum. And using 'light clocks' may make sense geometrically, but I think you need something different here.

To see it my way is really easy, as soon as you accept that your local clock equals 'c'. What might not be so easy to accept is the way I define 'c' to be a constant everywhere, 'ignoring' mass and accelerations, and use it as a stepping stone from which to define a universe.