Tuesday, October 12, 2010

The Train Chugs Along

Poor Graham D has discovered the sudden silence that ensues when one shouts out loudly that the standard cosmology is completely wrong. He should try giving seminars on the subject, sporting feminine summer attire, and see the tortured faces of the men who feel professionally obliged to offer harsh criticism despite not having actually understood anything. Oh well, those years are long gone now.

Meanwhile, there really isn't much news from down here ... at least, not that I'm telling you about yet!

22 comments:

  1. Standard cosmology is not completely wrong. There are of course problems, and there are wrong ideas, such as the horribly ugly inflationary scenario, which misinterprets the flatness of 3-space, an experimental fact. Flatness is a signal of an extremely beautiful and predictive concept: quantum criticality on cosmological scales.

    The notion of the Higgs is not completely wrong: the Higgs probably exists, but what is probably wrong is the description of massivation in terms of the Higgs alone.

    Supersymmetry is not completely wrong: what is probably wrong is its particular mathematical realization, in particular the realization in superstring models, which leads to high-D target spaces and the landscape misery.

    Superstrings are not completely wrong: what is wrong is the failure to realize that strings are idealizations of 3-D string-like objects.

    QCD is not completely wrong: what is wrong is the misinterpretation of quark color as a spin-like quantum number, leading to non-conservation of baryon number in GUTs and endless fine-tunings to keep the proton long-lived enough.

    I could continue this list, but I think the main message has become clear: the basic problem of theory building is that people are too impatient in their eagerness to build up a CV to check thoroughly whether their assumptions are really the minimal ones, and the only ones, needed to explain what is observed.

  2. Matti, if you had spent as much time around trendy cosmologists as I have, and if you had a more category-like perspective, you would probably be just as happy as I am to use the words Completely Wrong.

  3. Can I ask here? Matti, once again, how would you briefly describe the massivation, the Higgs potential and the Mexican hat? Where does the p-adic come in?

    Assumption: antimatter is a dualistic mirror of matter, just like the lightlike cone, with an intermediate phase (the zero) that may be a bosonic (magnetic body?) 'field'.

  4. I have to second Matti's assessment about cosmology. On a gross or coarse-grained level it is on track. Even inflationary cosmology is correct in its main features. The questions come with how the details of inflation occur, such as the conditions for the 63 e-folds, and how this emerged from a quantum process. This process is likely a quantum tunneling, a quantum critical point, or a phase transition.

  5. In the TGD framework p-adic thermodynamics gives the dominating contribution to fermion masses, which is something completely new. In the case of gauge bosons the thermodynamic contribution is small, since the inverse integer-valued p-adic temperature is T=1/2 for bosons rather than T=1 for fermions.

    Whether the Higgs can contribute to masses is not completely clear. I take the Mexican hat potential as a mere trick.

    I have not yet worked out the implications of the weak form of electric-magnetic duality, which implies that each wormhole throat carrying fermionic quantum numbers is accompanied by a second wormhole throat carrying opposite magnetic charge and a neutrino pair screening weak isospin, thus making gauge bosons massive. The following view looks the most plausible one at this moment.

    a) The Higgs can develop a coherent state, meaning a vacuum expectation value, and this is naturally proportional to the inverse of the p-adic length scale, as are boson masses. This contribution can be assigned to the magnetic flux tube mentioned above, since it screens the weak force, or equivalently makes the weak bosons massive. The Higgs expectation would not cause boson massivation; rather, both the massivation and the Higgs vacuum expectation would be caused by the presence of the magnetic flux tubes. The standard model would then suffer from a causal illusion. But it is better not to be too sure.

    b) The "stringy" magnetic flux tube connecting the fermionic wormhole throat and the wormhole throat containing the neutrino pair would give a small contribution to the vacuum conformal weight, and therefore to the mass squared of both fermions and gauge bosons (the dominating contribution for the latter). This contribution would be small in the p-adic sense, proportional to 1/p^2 rather than 1/p (see the estimate after this list). I cannot calculate this "stringy" contribution, but a stringy mass formula at the weak scale is very suggestive.

    c) In the case of light fermions and massless gauge bosons the stringy contribution must vanish and must therefore correspond to the n=0 string excitation (the string does not vibrate at all): otherwise the mass of the fermion would be of the order of the weak boson mass. For weak bosons n=1 would be the natural identification: weak bosons would therefore vibrate (note that the tiny vibrating superstrings associated with elementary particles in string models, which induce feelings of awe in string theorists, would not vibrate at all!). n>1 excited states of both fermions and bosons, with masses above the weak boson masses, are predicted and would mean new physics possibly becoming visible at the LHC. I have talked about this physics on my blog.
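    To make the smallness of the stringy contribution concrete, here is a minimal estimate. It assumes the p-adic picture sketched above, in which the dominating thermal contribution to mass squared scales as 1/p and the stringy one as 1/p^2, together with the usual p-adic length scale hypothesis assigning p = M_127 = 2^127 - 1 to the electron:

    \[ \frac{m^2_{\text{stringy}}}{m^2_{\text{thermal}}} \sim \frac{1/p^2}{1/p} = \frac{1}{p} \approx 2^{-127} \approx 6 \times 10^{-39}, \]

    so for light fermions the stringy piece is utterly negligible, while for weak bosons, whose thermal contribution is suppressed by T = 1/2, it can dominate.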

  6. Well, my opinion happens to be different. Since I do not believe that Quantum Gravity is based on classical symmetry principles, I have no reason to take the Higgs boson seriously. Moreover, I am well aware of its effective role in a local theory, but that theory simply does not apply to quantized masses ... just like Newton's laws break down if you push them too far.

    Lawrence, the claim is that you, like almost everyone, are Completely Wrong about the cosmology. There is no inflation. Louise's infinite $c$ condition explains the horizon problem perfectly well, and fits in with thousands of my other theoretical prejudices, including ... heh, I guess you think it's just crackpotism ... but those are PREDICTIONS.

  7. Matti,

    The stringy wormhole you talk about might be related to some aspects of the Taub-NUT spacetime bolted onto a D3 or D5-brane. I can send you some interesting work I and others have done on this.

    Kea,

    The big bang cosmology and inflation do manage to fit the data so far. The WMAP data are a pretty decent confirmation of the e-fold stretching of anisotropies.

    I am not sure what is meant by an infinite c condition. I am not a partisan of the idea of a variable speed of light. I am not familiar with Louise (last name?).

    Gravitation is maybe at best semi-classical or quantizable to a few loops. I tend to think there is an underlying Fermi-Dirac field theory underneath that is quantized. I am not sure if this is at all related to what you are referring to here.

  8. No, the standard big bang cosmology does not fit the data. First, look at the work of Starkman et al. on galactic-plane subtractions from the WMAP data, which suggests zero angular correlation beyond an angle of roughly 60 degrees, in agreement with Riofrio's cosmology. Then we have the Dark Force problem: a cosmological constant is not an explanation of anything. It was a mistake, just like Einstein said it was.

    The fact that they get an expanding universe cosmology is hardly commendable, since we have had that for almost 100 years now.

  9. The cosmological constant is a constant of integration. The covariant conservation nabla_a T^{ab} = 0 is automatically satisfied by both G^{ab} and Λg^{ab}, so integrating the field equations up leaves the combination G^{ab} + Λg^{ab}; Λ enters as a constant of integration. Einstein adjusted it to a particular value in order to get a static universe. Back then that was the idea which made sense. Hubble blew all of that away. However, it is back, as a type of stress-energy term from the quantum vacuum, entering through

    G^{ab} + Λg^{ab} = (8πG/c^4) T^{ab},    T^{ab} = (ρ + p)U^aU^b + pg^{ab},

    with the vacuum contribution corresponding to the equation of state p = -ρ.

    I checked out Riofrio's idea. Let me assume the equation GM = tc^3. If I define r = ct this looks like half the Schwarzschild factor. So I would interpret this as a relationship between the length a photon crosses in a time t and the mass: the longer it takes a photon to cross the length, the larger the mass. However, writing this as some time variation of c can't work. For one thing, the speed of light is a conversion factor. Secondly, if you look up the various equations that depend on c, in particular the Planck units, and attempt to vary c, everything changes in such a way that no apparent change in c could ever be detected. Saying this is an equation about the variation of c is about equivalent to saying the following: I have F = ma, hence the mass is m = F/a, and in a situation where the acceleration goes to zero I have an infinite mass if the force is still present; therefore a mass sitting on the ground has an infinite mass.
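    Spelling out that reading of the equation, with the stated assumption r = ct:

    \[ GM = tc^3 = (ct)\,c^2 = rc^2 \quad\Rightarrow\quad r = \frac{GM}{c^2} = \frac{r_s}{2}, \qquad r_s \equiv \frac{2GM}{c^2}. \]

    Numerically, taking t ≈ 4.3 x 10^17 s (roughly the age of the universe) gives M = tc^3/G ≈ 1.7 x 10^53 kg, which is the right order of magnitude for standard estimates of the mass of the observable universe.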

  10. I am not arguing with the mathematics, but with the physical meaning. The static universe was a mistake then, and the Dark Force is a mistake now.

    Riofrio has unearthed much evidence that $c$ has varied. Of course it is a conversion factor ... and $\hbar$ must also vary. And you are beginning to see that this $c$ variation is a statement about mass. In fact, there is a conformal boundary so that we can say $M = t$ in Planck units, where $t$ is a cosmic time parameter which one sets to zero in the Big Bang epoch. Alternatively, Riofrio's law is a form of Kepler's law, the general validity of which you probably do not doubt.
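    One hedged way to spell out the Kepler connection: for a circular orbit Kepler's law reads GM = v^2 r, and if one takes the orbit to be light-like, v = c, at radius r = ct, then

    \[ GM = c^2(ct) = tc^3, \]

    which is Riofrio's law; in Planck units (G = c = 1) it reduces to M = t, as stated above. Whether this light-like-orbit reading is the intended one is an interpretive assumption here.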

  11. Lawrence,

    what is known is that 3-space is flat to a good approximation. The inflationary scenario is a possible explanation for this, but it is plagued by extraordinarily clumsy Higgs potentials and a loss of predictivity.

    The TGD explanation is based on quantum criticality. At quantum criticality there are no scales, and this translates to the vanishing of the 3-D curvature tensor, that is, the flatness of 3-space during the phase transition, which corresponds to an increase of the Planck constant scaling up quantum scales. Cosmological expansion would indeed occur as rapid quantum phase transitions in the TGD Universe rather than as smooth continuous expansion.

    The condition of flatness is extremely powerful when combined with embeddability in M^4xCP_2. The embeddable cosmologies with critical mass density are parameterized by a single parameter, identifiable as the duration of the critical phase during which the phase transition occurs.
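    For reference, the standard part of this statement is just that criticality forces the spatially flat (k = 0) Robertson-Walker form. In terms of the light-cone proper time coordinate a used in the comments below,

    \[ ds^2 = g_{aa}\,da^2 - a^2\left(dr^2 + r^2 d\Omega^2\right), \]

    with a flat Euclidean 3-metric on the constant-a sections; the TGD-specific content, the imbedding constraint fixing g_aa and hence the single duration parameter, is not reproduced here.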

    This cosmology has very nice features.

    a) At the moment of the Big Bang the mass density behaves as 1/a^2, so that the mass of a comoving volume (proportional to a^3/a^2 = a) vanishes in this limit: a silent whisper amplified to a quite big bang, but not an infinitely big one.

    b) The pressure term in the Einstein tensor is negative, so that one automatically obtains accelerated expansion. The expansion is not due to negative kinetic energy as in the quintessence scenario, nor due to the addition of a cosmological constant term to Einstein's equations, but simply due to the constraint force guaranteeing that the space-time surface remains a surface in M^4xCP_2. One cannot of course exclude phenomenological modeling in terms of a cosmological constant.

    c) TGD-based cosmology is fractal, containing cosmologies within cosmologies on all scales, and a scaled version of the same universal critical cosmology also allows one to model what happens at RHIC. It would be important to test this cosmology, since it is universal, unlike the inflationary scenarios.

  12. To Kea:

    I have applied category-theoretical ideas to TGD, but from experience I know that a physicist must develop the mathematics to be used on the basis of physical intuition. The motivations of category theory come basically from mathematics, not from physics.


    A comment about Riofrio's idea. There is physical evidence for a varying c. The Earth-Moon distance is apparently varying, as the reflection of laser light demonstrates. The challenge is to define what this variation means.

    The problem with Riofrio's idea is that it does not make sense in the GRT framework. General Coordinate Invariance does not tolerate the introduction of a formula into which one just puts some time coordinate without precisely specifying which kind of coordinate system is in question. c = constant is a sensible statement in special relativity, where one has a family of preferred coordinates. In general relativity one can only say that light moves along light-like geodesics; the value of the conversion factor c depends on what coordinates one happens to use. Riofrio's formula also leads to nonsensical results, since it implies a time variation of elementary particle masses, as Kea noticed a couple of years ago.


    In the TGD framework the sub-manifold geometry allows one to speak about a varying c; call this velocity c#. The times taken by a topologically condensed particle to move from point A to point B along two different space-time sheets, measured using for instance the Minkowski time coordinate of M^4xCP_2, can be compared, and these times are in general different, since motion along a light-like geodesic of the space-time surface takes in general longer than along a light-like geodesic of the imbedding space and depends on how wiggly the space-time sheet is. For Robertson-Walker cosmology the value of c# equals c# = sqrt(g_aa) and increases as space-time approaches flatness: I think that just the opposite is predicted by Riofrio's proposal.
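    A minimal sketch of where c# = sqrt(g_aa) comes from, under the assumptions above: for a radial light-like curve on the space-time surface, with dl the proper spatial distance, the induced metric gives

    \[ g_{aa}\,da^2 - dl^2 = 0 \quad\Rightarrow\quad \frac{dl}{da} = \sqrt{g_{aa}}, \]

    while for light-like geodesics of the imbedding space (the empty cone, with g_aa = 1) one has dl/da = 1. Hence the effective velocity c# = sqrt(g_aa) ≤ 1 approaches the imbedding space value as the metric flattens.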

    The apparent variation of the Earth-Moon distance can be understood from the fact that c# is different for the very large cosmological space-time sheets defining the reference frame assignable to, say, the galaxy and for the reference frame assignable to the solar system. The apparent change of the Moon-Earth distance is due to the variation of the "meter stick" defined by distant stars, since the ratio of the c#:s associated with this system and the solar system changes. The prediction for the apparent variation of the Earth-Moon distance is expressible in terms of the basic parameters of cosmology and turns out to be correct. What is highly non-trivial is the direct coupling of cosmology to solar system physics, in principle allowing one to determine cosmological parameters with high precision. This is one of the strong pieces of evidence for sub-manifold cosmology.

  13. OK, Matti, I deleted some of that ... FAR TOO LONG!

    First point: my motivations for category theory definitely come from physics, not mathematics. I started out in life as a real experimental physicist. And I agree that most of what Baez writes about physics is garbage. That is one of my problems. But you should not assume that everyone who uses category theory is doing the same thing.

    Regarding varying $c$: I don't think Louise's picture is so different from yours, really. Remember that we are always mixing up several notions of time here. Your picture of increasing $c$ could be viewed as a forward time evolution in Louise's picture. Since I take stringy dualities seriously, this actually makes sense.

  14. The truly quantum mechanical characteristics, non-locality and action at a distance, are forgotten? No time, no distance? Then it doesn't matter whether lightspeed is this or that, or whether time is past or future.

  15. Ulla, non-locality is very much a feature of the categorical picture, starting with entanglement in ordinary quantum mechanics. Action at a distance requires a deeper discussion. The idea of a changing $c$ is supposed to be an observer-dependent one ... us being the observers. So in that sense there is no absolute time and distance.

  16. To continue, I need to compare these with other constants. We need to look at their dimensionality. In naturalized units ħ and c have dimension zero. The speed of light c is a conversion factor between ruler and clock measures, and the Planck constant is a conversion factor between momentum and position measures. I can't say whether on some ultimate level there is a variation of these involving structures we have no knowledge of. Maybe in the Tegmark world, a putative sort of Platonic math-scape of reality, there is such a variation. I do not know, so at this stage I have to have some rock-hard basis to anchor any theoretical thought. So I maintain, based in part on the arguments above, that c and ħ are absolutely constant until demonstrated otherwise at a later time.

    Now there are other units. The most salient one is the fine structure constant α = e^2/ħc. This is unitless in any system of units, being the ratio of a naturalized unitless gauge coupling to the product of two naturalized unitless constants. If there are variations in the electric charge unit, this will result in a variation of the fine structure constant. For this reason any putative variation can be benchmarked accordingly. In fact α ~ 1/137 is renormalized to α ~ 1/128 for ~1 TeV transverse momentum processes. So in this setting we have gauge coupling terms that renormalize with energy. The gauge coupling parameters for the other gauge forces are similar and scale with transverse momentum in scattering.
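    For concreteness, the standard one-loop form behind numbers like these: the electromagnetic coupling runs with momentum transfer as

    \[ \alpha(Q^2) \simeq \frac{\alpha(0)}{1 - \dfrac{\alpha(0)}{3\pi}\displaystyle\sum_f N_c^f q_f^2 \,\ln\!\frac{Q^2}{m_f^2}}, \]

    where the sum runs over charged fermions with charge q_f and color factor N_c^f. With α(0) ≈ 1/137.036 this drives 1/α down to roughly 128-129 around the electroweak scale, consistent with the value quoted above.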

    There is then the remaining coupling parameter, which is the Newtonian gravitational constant. In naturalized units this has units of length squared. The Planck length is defined in terms of G as L_p = sqrt(Għ/c^3). In D dimensions we have G ≈ g^2 L_s^{D-2}, where L_s = sqrt(α') is the string length, for α' the string parameter. So this has units which involve "area", and it is a feature of the entropy formula S = kA/4L_p^2, for A the area of a black hole.
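    As a quick numerical anchor for these conventions, a short script (constants are rounded CODATA-level values) computing the Planck length and the area unit appearing in the entropy formula:

        import math

        G = 6.674e-11      # Newton's constant, m^3 kg^-1 s^-2
        hbar = 1.055e-34   # reduced Planck constant, J s
        c = 2.998e8        # speed of light, m/s

        # Planck length L_p = sqrt(G*hbar/c^3)
        L_p = math.sqrt(G * hbar / c**3)
        print(f"Planck length: {L_p:.3e} m")      # ~1.616e-35 m

        # the area unit 4*L_p^2 entering S = kA/(4 L_p^2)
        print(f"4*L_p^2 = {4 * L_p**2:.3e} m^2")  # ~1.04e-69 m^2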

    I have to wrap this up, and unfortunately I could go further, much further in fact. It is my sense that the Planck constant and the speed of light are absolute constants. After all, if you have a range of observables or objects with a set of dependencies, you must have a set of constants; otherwise your system is underdetermined. So physics over the foreseeable future will probably demand that c and ħ be constant.

  17. Lawrence, your first comment was deleted. If you want to lecture people in a condescending manner, do it on your own blog.

  18. "After all, if you have a range of observables or objects that have a set of dependencies you must have a set of constants. Otherwise your system is under determined."

    This is true. What is a constant and what is a variable MUST be determined by whichever choice leads to the simplest theory. This is where Louise Riofrio has an advantage.

  19. Lawrence, Louise usually discusses her theory in the context of General Relativity, with which I suspect you are not arguing. In my opinion, once we start talking about a varying $\hbar$ (keeping alpha constant) we are outside General Relativity, as a mathematical formalism. However, even Einstein believed that 19th century Riemannian geometry was inadequate to capture the underlying principles of relativity. Here we have two basic options: (i) Louise's varying $c$ rule or (ii) the favoured Dark Force within conventional geometry, fixing $c$ arbitrarily. The claim is that (i) is far, far simpler and more natural.

    We realise that most professionals believe otherwise, and I have a great deal of personal experience of this, but their objections rarely display much comprehension of the argument ...

  20. The Albrecht-Moffat-Magueijo (AMM) types of models propose a variable c, which is tantamount to saying that the 4th dimension of spacetime, with x_4 = ict, has some additional change to it. This might occur in extreme transverse momentum scattering, such as with Horava's suggestion of a Landau triple point. However, these constructions most likely point to some phase structure in the transition to quantum gravity. This can lead to a breakdown of Lorentz symmetry, which means quantum gravity is something else besides a canonical quantization of the Einstein field equations. On that front I can talk to you, for this connects up with a Thirring model of quaternions for the conformal structure of AdS_2 and higher AdS_n. I think that general relativity is a classical emergent theory, which is not fully quantized, and which emerges from an underlying structure. However, this most likely pertains to the earliest universe or the singularities of black holes, where curvature terms are of the order 1/L_s^2. The onset of inflationary cosmology is later than this, and occurs on a scale ~ 10^4 L_s. This is not likely something as simple as c = c(t) based on GM = rc^2 with r = ct, which is frankly a rather childish suggestion. The critical parameter in connection with this possible breaking of Lorentz symmetry is frankly not time, but energy or transverse momentum.
    Inflationary cosmology is just the de Sitter vacuum realization for a large cosmological constant that renormalizes in response to a V(φ) in a scalar field theory with V(φ) ~ φ^2. During inflationary cosmology spacetime is classical and the universe has passed the quantum gravity phase, so none of the putative quantum gravity possibilities I suggest above apply there. The AMM models have been falsified by the Fermi (GLAST) detection of GRBs in the distant universe: there is no dispersion, or related spacetime physics with a variable c, either c = c(t) or the reciprocal c = c(ν), of the sort which would accompany those models. Consequently, spacetime appears to exhibit none of these behaviors.
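    For the record, the Fermi constraint referred to here can be summarized in one line. In models where the photon speed runs linearly with energy, c(E) ≈ c(1 - E/E_QG), two photons emitted together from a source at distance D arrive separated by roughly

    \[ \Delta t \;\approx\; \frac{D}{c}\,\frac{\Delta E}{E_{\rm QG}} \]

    (neglecting cosmological-expansion corrections to D), and the absence of any such lag in high-energy GRB photons pushes E_QG above the Planck energy for the linear case.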

  21. ... which means quantum gravity is something else besides a canonical quantization of the Einstein field equations

    Indeed.

  22. Lawrence's argument that hbar and c are absolute constants is correct only in the GRT framework and in standard quantum mechanics. It happens more often than not that lecturers forget the implicit assumptions of the school books;-)

    As I explained, the measured c, call it c#, has c only as an upper bound in sub-manifold gravitation. The argument is childishly simple and explains the apparent variation of the Earth-Moon distance, which is something GRT people should take with the utmost seriousness.

    In the case of the Planck constant, TGD predicts that the Planck constant for a single sheet of the covering is an n-fold multiple of the ordinary one. If one treats the whole bundle of sheets as a single object it is the ordinary one, but then one has to find a manner of treating the system non-perturbatively. This is by no means prevented by the conversion-factor identification. Also the more radical option, in which the hierarchy of Planck constants does not follow from basic TGD, is completely consistent with the conversion-factor interpretation.

