And yet you said that a metre bar would be larger yet measure the same. If all the aspects of the universe are expanding in lockstep such that any distance appears constant, then redshift caused by expansion is impossible.
If the increasing distance between atoms is unmeasurable, then the increasing distance between galaxies must be equally undetectable.
LIGO can detect strain (fractional changes of length) on the order of 10⁻²¹, and its effective arm length should be increasing by about 2×10⁻¹² m/s, yet I don't see any mention of a large interferometer measuring anything but gravitational waves, and I don't see any large time-dependent component in LIGO's systematic error data.
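Back-of-the-envelope, that 2×10⁻¹² m/s figure drops out of multiplying the Hubble constant by LIGO's optical path. The ~1100 km path length here is an assumption for illustration (light bounces a few hundred times in the 4 km Fabry-Perot arms):

```python
# Sketch: how fast LIGO's optical path would stretch if it took part
# in Hubble expansion. The ~1100 km effective path is an assumed
# illustrative figure, not an official LIGO spec.
H0_km_s_Mpc = 70.0                  # Hubble constant, approximate
Mpc_in_m = 3.086e22                 # metres per megaparsec
H0 = H0_km_s_Mpc * 1e3 / Mpc_in_m   # ~2.3e-18 per second

effective_length = 1.12e6           # m, assumed effective optical path
rate = H0 * effective_length        # m/s
print(f"stretch rate ~ {rate:.1e} m/s")
# -> stretch rate ~ 2.5e-12 m/s
```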
We also can measure the increasing distance of galaxies via redshift, so unless you can explain how light from galaxies is different from the light in a large interferometer, I must conclude that the interferometers aren't expanding at the same rate as the observed expansion of the universe.
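For comparison, here's the low-redshift version of how redshift maps to recession velocity and distance (v = cz, d = v/H₀; the z = 0.01 galaxy is a made-up example, not a real measurement):

```python
# Sketch: cosmological redshift as a distance probe under Hubble's
# law, low-z approximation v = c*z, d = v/H0. Example numbers are
# illustrative only.
c = 2.998e5                # speed of light, km/s
H0 = 70.0                  # Hubble constant, km/s/Mpc
z = 0.01                   # hypothetical measured shift, dlambda/lambda

v = c * z                  # recession velocity, km/s
d = v / H0                 # distance, Mpc
print(f"v ~ {v:.0f} km/s, d ~ {d:.0f} Mpc")
# -> v ~ 2998 km/s, d ~ 43 Mpc
```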
We aren't expanding like the universe is expanding.
Actually that's a good point about interferometers: the only detectable change would be the difference between the two arms' lengths, since a uniform expansion of both arms cancels in the output.
Gravitational waves do propagate like EM waves; we've seen a neutron star merger nearly simultaneously in gravity and light. If there were a speed difference, one signal would lag behind the other.
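Rough numbers on how tight that constraint is (the ~130 Mly distance and ~1.7 s lag are the commonly quoted GW170817 figures; any intrinsic emission delay at the source is ignored here):

```python
# Sketch: a fractional speed difference between gravity and light
# accumulates as an arrival-time lag over the travel time, so the
# observed lag bounds |v_gw - c|/c. Rough numbers, GW170817-style.
year_s = 3.156e7                   # seconds per year
travel_time = 1.3e8 * year_s       # ~130 Mly in flight, ~4.1e15 s
observed_lag = 1.7                 # s between GW and gamma-ray arrival

frac_speed_diff = observed_lag / travel_time
print(f"|v_gw - c|/c <~ {frac_speed_diff:.1e}")
# -> |v_gw - c|/c <~ 4.1e-16
```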
How exactly would we measure an absolute value of distance? The whole point of general relativity is that everything is relative. If everything were scaled up such that the fine-structure constant stayed the same, we wouldn't be able to measure a difference.
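The reason the fine-structure constant is the right thing to hold fixed: it's dimensionless, so no choice of units (or uniform rescaling of lengths and times) can change it. A quick check with CODATA values:

```python
# Sketch: alpha = e^2 / (4 pi eps0 hbar c) has no units, so a
# universe uniformly "scaled up" in a way that preserves alpha is
# indistinguishable from the inside.
import math

e = 1.602176634e-19       # elementary charge, C
eps0 = 8.8541878128e-12   # vacuum permittivity, F/m
hbar = 1.054571817e-34    # reduced Planck constant, J s
c = 2.99792458e8          # speed of light, m/s

alpha = e**2 / (4 * math.pi * eps0 * hbar * c)
print(f"alpha ~ 1/{1/alpha:.1f}")
# -> alpha ~ 1/137.0
```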
Which brings us back to the question I have with your model: how can a changing distance be measured by light to be the same (metre bar) but also different (redshift)? If light is scaling with the rest of the universe, it shouldn't get shifted. This is the crux of my confusion.