Monday, August 22, 2016

Does GRT really allow gravitational radiation?

In a Facebook discussion Niklas Grebäck mentioned the Weyl tensor and I learned something that I should have noticed a long time ago. The Wikipedia article lists the basic properties of the Weyl tensor as the traceless part of the curvature tensor, call it R. The Weyl tensor C vanishes for conformally flat space-times. In dimensions D=2,3 the Weyl tensor vanishes identically so that these space-times are always conformally flat: this obviously makes the dimension D=3 for space very special. Interestingly, one can have non-flat space-times with non-vanishing Weyl tensor but with vanishing Schouten/Ricci/Einstein tensor and thus also with vanishing energy momentum tensor.

The rest of the curvature tensor R can be expressed in terms of the so-called Kulkarni-Nomizu product P• g of the Schouten tensor P and the metric tensor g: R=C+P• g. This identity can also be read as a definition of the Weyl tensor, with the curvature tensor defined in terms of Christoffel symbols taken as the fundamental notion. The Kulkarni-Nomizu product • of two symmetric 2-tensors is built from their tensor product in such a way that the result has the algebraic symmetries of the curvature tensor: antisymmetry within the first and the second index pair and symmetry under the exchange of the two pairs.
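
For reference, in explicit component form (with subscripts written on the line; this is the standard form given e.g. in the Wikipedia articles on the Weyl and Schouten tensors, added here only for convenience) the Kulkarni-Nomizu product and the resulting decomposition read

    (P• g)abcd = Pac gbd + Pbd gac - Pad gbc - Pbc gad ,

    R = C + P• g .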

The Schouten tensor P is expressible as a combination of the Ricci tensor Ric, defined by the contraction of R over its first and third indices, and the metric tensor g multiplied by the curvature scalar s (denoted by s rather than R in order to use index-free notation without confusion with the curvature tensor). The expression reads as

P = [1/(D-2)]×[Ric - (s/(2(D-1)))×g] .

Note that the coefficients of Ric and g differ from those in the Einstein tensor. By Einstein's equations the Einstein tensor (and hence the Ricci tensor) is determined by the energy momentum tensor, so the non-Weyl part P• g of the curvature tensor is the part dictated by matter.

The Weyl tensor is associated with gravitational radiation in GRT. What I see as a serious interpretational problem is that by Einstein's equations gravitational radiation would carry no energy and momentum in the absence of matter. One could argue that there are no free gravitons in GRT if this interpretation is adopted! This could be seen as a further argument against GRT besides the problems with the notions of energy and momentum: I had not realized this earlier.

Interestingly, in the TGD framework the so-called massless extremals (MEs) (see this and this) are four-surfaces, extremals of Kähler action, whose Weyl tensor equals the full curvature tensor, and they would therefore have an interpretation in terms of gravitons. These extremals are however non-vacuum extremals.

  1. Massless extremals correspond to graphs of possibly multi-valued maps from M4 to CP2. CP2 coordinates are arbitrary functions of the variables u=k• m and w= ε• m (here "•" denotes the M4 inner product). k is a light-like wave vector and ε a space-like polarization vector orthogonal to k, so that the interpretation in terms of a massless particle with polarization is possible. In the most general case an ME describes a wave packet preserving its shape and propagating with maximal signal velocity along a kind of tube analogous to a wave guide, so that MEs are ideal for precisely targeted communications and are central in TGD inspired quantum biology. MEs do not have Maxwellian counterparts. For instance, MEs can carry light-like gauge currents parallel to them: this is not possible in Maxwell's theory.

  2. I have discussed a generalization of this solution ansatz in which the directions defined by the light-like vector k and the polarization vector ε orthogonal to it are no longer constant but define a slicing of M4 by orthogonal curved surfaces (analogs of string world sheets and space-like surfaces orthogonal to them). MEs, at least in their simplest form, are minimal surfaces and actually extremals of practically any general coordinate invariant action principle. For instance, this is the case if the volume term suggested by the twistorial lift of Kähler action (see this) and identifiable in terms of the cosmological constant is added to Kähler action.

  3. MEs carry non-trivial induced gauge fields and gravitational fields identified in terms of the induced metric. I have identified them as correlates for particles, which correspond to pairs of wormhole contacts between two space-time sheets such that at least one of the sheets is an ME. MEs would classically accompany both gravitational radiation and other forms of radiation and serve as their correlates. For massless extremals the metric tensor is of the form

    g = m + a ε⊗ε + b k⊗k + c(ε⊗k + k⊗ε) ,

    where m is the metric of empty Minkowski space. The curvature tensor is necessarily quadrilinear in the polarization vector ε and the light-like wave vector k (light-like for both the M4 metric and the ME metric), and from the general expression of the Weyl tensor C in terms of R and g it is equal to the curvature tensor: C=R.

    Hence the interpretation as a graviton solution conforms with the GRT interpretation. Now however the energy momentum tensor for the induced Kähler form is non-vanishing and bilinear in the light-like wave vector k, so the interpretational problem is avoided. A small symbolic check of a simplified special case is sketched below.
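
As a sanity check of the role of the light-like wave vector (my own addition, not part of the original argument): for the simplest pp-wave type special case g = m + H k⊗k with k light-like, i.e. keeping only the k⊗k term of the ME metric above, a short sympy computation confirms that the Ricci tensor has a single non-vanishing component proportional to k⊗k, while the full curvature (and hence the Weyl part) is generically non-zero. The coordinates and conventions below are my own choices.

    import sympy as sp

    # pp-wave type metric ds^2 = H(u,x,y) du^2 + 2 du dv + dx^2 + dy^2,
    # i.e. flat metric in light-cone coordinates plus an H k⊗k term with light-like k = du
    u, v, x, y = sp.symbols('u v x y')
    X = [u, v, x, y]
    H = sp.Function('H')(u, x, y)

    g = sp.Matrix([[H, 1, 0, 0],
                   [1, 0, 0, 0],
                   [0, 0, 1, 0],
                   [0, 0, 0, 1]])
    gi = g.inv()

    # Christoffel symbols Gamma^a_{bc}
    Gamma = [[[sp.simplify(sp.Rational(1, 2) * sum(
        gi[a, d] * (sp.diff(g[d, b], X[c]) + sp.diff(g[d, c], X[b]) - sp.diff(g[b, c], X[d]))
        for d in range(4))) for c in range(4)] for b in range(4)] for a in range(4)]

    # Riemann tensor R^a_{bcd} and Ricci tensor R_{bd} = R^a_{bad}
    def riemann(a, b, c, d):
        r = sp.diff(Gamma[a][b][d], X[c]) - sp.diff(Gamma[a][b][c], X[d])
        r += sum(Gamma[a][c][e] * Gamma[e][b][d] - Gamma[a][d][e] * Gamma[e][b][c] for e in range(4))
        return sp.simplify(r)

    ricci = sp.Matrix(4, 4, lambda b, d: sp.simplify(sum(riemann(a, b, a, d) for a in range(4))))
    print(ricci)  # only the uu entry is non-zero and it is proportional to (H_xx + H_yy),
                  # i.e. the Ricci (and Einstein) tensor is of the form (scalar)× k⊗k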

See the article Does GRT really allow gravitational radiation?. For background see the chapter Basic extremals of the Kähler action.

For a summary of earlier postings see Latest progress in TGD.

Articles and other material related to TGD.

Friday, August 19, 2016

New findings about high-temperature super-conductors

Bozovic et al have reported rather interesting new findings about high Tc super-conductivity: for over-critical doping the critical temperature is proportional to the density of what is identified as Cooper pairs of the electronic super-fluid. Combined with the earlier findings that super-conductivity is lost - not by splitting of Cooper pairs but by a reduction of the scale of quantum coherence - and that below a minimal doping fraction the critical temperature goes abruptly to zero, this allows one to add details to the earlier TGD inspired model of high Tc super-conductivity. Super-conductivity would indeed be lost by the reconnection of flattened square shaped long flux loops to the shorter loops of the pseudogap phase. Quantum coherence would be reduced to a smaller scale as heff is reduced. Transversal flux tube "sound waves" would induce the reconnections. Electrons at flux loops would stabilize them by contributing to the energy density and thus to the inertia, increasing the string tension so that the average amplitude squared of the oscillations is reduced and the critical temperature increases with electron density.

For details see the chapter Quantum Model for Bio-Superconductivity: I of "TGD and EEG" or the article New findings about high-temperature super-conductors.

For a summary of earlier postings see Latest progress in TGD.

Articles and other material related to TGD.

Tuesday, August 16, 2016

Combinatorial Hierarchy: two decades later

The Combinatorial Hierarchy (CH) proposed by Noyes and Bastin is a hierarchy consisting of the Mersenne integers defined recursively by M(n) = 2^M(n-1) - 1, that is, each M(n) is the Mersenne number with the previous member as exponent, starting from M(1)=2. The first members of the hierarchy are 2, 3, 7, 127, M127 = 2^127 - 1 and are all primes. The conjecture attributed to Catalan is that the hierarchy continues to yield primes up to some finite level. It was proposed by Pierre Noyes and Ted Bastin that the first levels of the hierarchy up to M127 are important physically and correspond to various interactions (see this). I have proposed that the levels of CH define a hierarchy of codes containing the genetic code corresponding to M7 and also the memetic code assignable to M127 (see this).
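
As a quick check of the numbers (a minimal sketch added by me; sympy's isprime serves only as a convenient primality test):

    import sympy as sp

    # Combinatorial Hierarchy: M(1) = 2, M(n) = 2^M(n-1) - 1
    levels = [2]
    for _ in range(4):
        levels.append(2 ** levels[-1] - 1)

    for n, M in enumerate(levels, start=1):
        label = M if M < 10 ** 6 else f"2^{levels[n - 2]} - 1 ({len(str(M))} digits)"
        print(f"M({n}) = {label}, prime: {sp.isprime(M)}")

    # output: 2, 3, 7, 127 and the 39-digit Mersenne prime 2^127 - 1 are all prime;
    # the next candidate, 2^(2^127 - 1) - 1, is far beyond any direct primality test.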

Pierre Noyes and Ted Bastin also proposed an argument for why CH contains only the levels mentioned above. This argument has not been part of the TGD view about CH: instead of it I have considered the possibility that CH does not extend beyond M127. With the inspiration coming from an email discussion I tried to understand the argument stating that CH contains M127 as the highest level and ended up with a possible interpretation of the condition. Zero energy ontology (ZEO) and the representation of quantum Boolean statements A→ B as fermionic parts of the positive and negative energy parts of zero energy states are essential. This led to several interesting new results.

  1. To my best understanding the original argument of Noyes and Bastin does not allow the M127 level whereas the prime property does. States at the M127 level cannot be mapped to zero energy states at the M7 level. Allowing a wild association with Gödel's theorem, one could say that there is a huge number of truths at the M127 level not realizable as theorems at the M7 level.

    A possible interpretation is that the M127 level corresponds to the next level in the abstraction hierarchy defined by CH and to the transition from the imbedding space level to the level of the "world of classical worlds" (WCW) in TGD. The possible non-existence of higher levels (perhaps implied if the next member 2^M127 - 1 is not prime) could perhaps be interpreted by saying that there is no "world of WCWs"!

  2. Rather remarkably, for M7, which corresponds to the genetic code (see this), the inequality serving as a consistency condition is saturated. One can say that any set of 64 mutually consistent statements at the M7 level can be represented in terms of 64 Boolean maps at the M3 level representable in terms of zero energy states. One obtains an explicit identification of the Boolean algebras involved in terms of the spin and isospin states of fermions in the TGD framework at the M7 level, so that the genetic code seems to be realized at the fundamental elementary particle level thanks to the dimension D=8 of the imbedding space. Even more, the level M127 corresponding to the memetic code emerges in the second quantization of fermions at the M7 level. Here the color triplet property of quarks, the color singletness of leptons, and the identification of elementary particles as pairs of wormhole contacts play an essential role.

The conclusion would be that in the TGD Universe the genetic code and its memetic counterpart are realized at the level of fundamental particles. Already earlier I have ended up with alternative realizations at the level of dark nucleons and sequences of 3 dark nucleons (see this).

For details see the chapter Genes and Memes of "Genes and Memes" or the article Combinatorial Hierarchy: two decades later.

For a summary of earlier postings see Latest progress in TGD.

Articles and other material related to TGD.

Monday, August 15, 2016

Two styles of theorizing: conservative and radical

The Facebook discussions with Andrei Patrascu and others have created a need to emphasize that there are two styles of working in theoretical physics. These approaches might be called conservative and radical.

  1. The conservative approach takes some axiomatics as a starting point and deduces theorems. In the highly competitive academic environment this approach is good for survival. The best juggler wins, but theorems about the physics of, say, K3 surfaces are not very useful physically.

  2. The second approach might be called radical. The rebel tries to detect the weak points of existing paradigms and to propose new visions generalizing the old ones. This approach is intellectually extremely rewarding but does not lead to academic success. There is a continual battle between the two approaches and the first approach has been the winner for about four decades since the establishment of the standard model (and has perhaps led to the recent dead end).

Personally I belong to the camp of rebels. The work that I have done during the last decades is an attempt to clean what I see as the stables of Augeas by going beyond quantisation, which I see as just a set of rules producing amplitudes, mathematically and conceptually highly unsatisfactory.

The rough physical idea was that in the classical theory space-times are 4-surfaces in H=M4× CP2 satisfying some variational principle: this allows one to lift Poincare invariance to the level of H and to have Poincare charges as Noether charges, so that the energy problem of general relativity is circumvented. My personal conviction is that the loss of Noether charges is the deep reason for the failure to quantize general relativity: the success of string models would reflect the fact that in string models Poincare invariance is obtained.

The challenge is to quantize this theory.

  1. I of course started by trying to apply canonical quantisation taking GRT as a role model. After its failure I tried the path integral approach, which was in fashion at that time: every young theoretician wanted to perform a really difficult path integral explicitly and become the hero! It turned out that these methods failed completely because of the non-linearity of any general coordinate invariant variational principle dictating the dynamics of the space-time surface.

    I had good luck since this failure forced me quite soon into deeper waters: at this point my path deviated radically from that still followed by colleagues. Note that canonical quantization, like the notion of unitary time evolution, also relies on Newtonian time, and this is conceptually highly unsatisfactory in the new framework.

  2. Around 1985 I indeed realized that a much more radical approach is required, and around 1990 I had finally solved the problem at the general level. TGD must be formulated as a geometrization not only of classical physics but also of quantum theory by geometrizing the infinite-D "world of classical worlds" (WCW) consisting of 3-surfaces.

    WCW must be endowed with a Kähler geometry - already in the case of the much simpler loop spaces Freed showed that the Kähler geometry is unique. Physics would be unique by the mere mathematical existence of the WCW geometry!

    Also superstringers had this idea for some time, but during the wandering in the landscape they began to see the total loss of predictivity as a blessing. After LHC they speak only about formal string theory, so that the conservative approach has taken the lead.

    The Kähler function would correspond to the action for a preferred extremal of Kähler action, which would have an interpretation as an analog of a Bohr orbit. Classical physics in the sense of Bohr orbitology would be an exact part of quantum theory rather than a limit. This follows from general coordinate invariance (GCI) alone.

  3. Physical states would correspond to spinor fields in WCW: for a given 3-surface they correspond to fermionic Fock states. An important point is that WCW spinor fields are formally purely classical: "quantization without quantization", as Wheeler would say.

    Induced spinor fields are quantized at the space-time surface and the gamma matrices of WCW are linear combinations of fermionic oscillator operators, so that at the space-time level only the quantization of free fermions is needed, and anticommutativity has a geometric interpretation at the WCW level: fermionic statistics is geometrized.

  4. The generalisation of the S-matrix in zero energy ontology (ZEO) - also needed - is associated with the modes of the WCW Dirac operator satisfying the analog of the Dirac equation, formulated as conditions stating super-symplectic invariance (formally analogous to Super Virasoro conditions in string models). One just solves the free Dirac equation in WCW!

    Childishly simple at the general conceptual level, but the practical construction of the S-matrix - whose coefficients are read off from a zero energy state identified as a bilinear formed by positive and negative energy states - is a really hard task!

The only way to proceed is to identify general principles.
  1. General Coordinate Invariance is what led to the idea that WCW geometry assigns to a 3-surface (more precisely, to a pair of 3-surfaces at the boundaries of a causal diamond) a unique space-time surface as a preferred extremal of Kähler action (the most plausible guess). The Kähler function is the value of Kähler action for the regions of the space-time surface with Euclidian signature of the induced metric, while the Minkowskian regions give an imaginary contribution as an analog of the QFT action. The mathematically ill-defined path integral transforms into a well-defined functional integral over 3-surfaces.

  2. Infinite-dimensional WCW geometry requires maximal symmetries. Four-dimensionality of the M4 factor and of the space-time surface is a necessary condition, since 3-D light-like surfaces by their metric 2-dimensionality allow an extension of the ordinary 2-D conformal invariance. M4 and CP2 are unique in that they allow twistor spaces with Kähler structure. The twistorial lift of TGD predicts a cosmological term as an additional term besides Kähler action in the dimensional reduction of the 6-D Kähler action, and also predicts the value of the Planck length as the radius of the sphere serving as the fiber of the twistor bundle with the space-time surface as base. I am still not sure whether the twistorial lift is really necessary or whether the cosmological constant and the gravitational constant emerge in some other manner.

  3. An infinite number of conditions stating the vanishing of classical super-symplectic Noether charges (not all of them) is satisfied and guarantees the strong form of holography (SH) implied by the strong form of general coordinate invariance (SGCI): space-time dynamics is coded by 2-D string world sheets and partonic 2-surfaces serving as "space-time genes", and a close connection with string models is obtained. These conditions are satisfied also by the quantal counterparts of the super-symplectic charges and there is a strong formal resemblance with Super Virasoro conditions. These conditions include also the analog of the Dirac equation in WCW.

  4. Feynman/twistor/scattering diagrammatics (the last one being the best choice) is something real and must be generalized, and the diagrams have a concrete geometric and topological interpretation at the space-time level: it must be emphasized and thickly underlined that also this is something completely new. By the strong form of holography this diagrammatics reduces to a generalization of string diagrammatics, and an 8-D generalization of twistor diagrammatics based on the octonionic representation of sigma matrices is highly suggestive. Recall that massive particles are a nuisance for the twistor approach, and massive momenta in the 4-D sense would correspond to massless momenta in the 8-D sense.

    The twistor approach suggests a Yangian generalization of the super-symplectic symmetries with poly-local generators (poly-local with respect to partonic 2-surfaces). Yangian symmetries should dictate the S-matrix as in the twistor Grassmann approach. The symmetries are therefore monstrous and the formidable challenge is to understand them mathematically.

  5. One has half a dozen new deep principles: physics as WCW spinor geometry and quantum physics formulated in terms of modes of classical WCW spinor fields, so that the only genuinely quantal aspect of quantum theory would be state function reduction; SGCI implying SH; the maximal isometry group of WCW guaranteeing the existence of its Kähler geometry, and its extension to Yangian symmetry; ZEO; the number theoretic vision and the extension of physics to adelic physics by number theoretic universality; the hierarchy of Planck constants assignable to the fractal hierarchy of super-symplectic algebras and inclusions of hyperfinite factors of type II1.

    I also learned that one cannot proceed without a proper quantum measurement theory and this led to a theory of consciousness and applications in quantum biology: the most important implication is the new view about the relationship of experienced and geometric time.

    What is remarkable is that all this has followed during almost four decades from a childishly simple observation: the notion of energy in GRT is ill-defined since Noether's theorem does not apply, and one can cure the situation by assuming that space-times are 4-surfaces in an imbedding space having M4 as a Cartesian factor. This should show how important it is for a theoretician to be keenly aware of the weak points of the theory.

I believe that I have now identified the general principles (axiomatics) but a huge collective effort would be needed to deduce the detailed rules. There are of course many open problems and questions about details. In any case, this involves a lot of new mathematics and mathematicians are needed to build it.

To get some perspective, a comparison with what most mathematical physicists are doing is in order. They start from a given axiomatics and deduce theorems, aiming to be the most skillful juggler and get the reward. My goals have been different. I have used these years to identify the axioms of a theory allowing one to lift quantum theory to a real unified theory of fundamental interactions. I have also been forced to use every bit of experimental information, whereas a mathematical physicist could not care less about anomalies.

For a summary of earlier postings see Latest progress in TGD.

Have magnetic monopoles been detected?

LNC scientists report that they have discovered magnetic monopoles (see this and this). The claim that free monopoles have been discovered is in my opinion too strong, at least in the TGD framework.

TGD allows monopole fluxes but no free monopoles. Wormhole throats however behave effectively like monopoles when looked at from either space-time sheet, A or B. The first TGD explanation that comes to mind is in terms of 2-sheeted structures with wormhole contacts at the ends and monopole flux tubes connecting the wormhole throats at A and B, so that a closed monopole flux is the outcome. All elementary particles are predicted to be structures of this kind in the scale of the Compton length. The first wormhole throat carries the elementary particle quantum numbers and the second throat carries a neutrino pair neutralizing the weak isospin, so that the weak interaction is finite ranged. The Compton length scales like heff and can be nanoscopic or even larger for large values of heff. Also for an abnormally large p-adic length scale, implying a different mass scale for the particle, the size scale increases.

How to explain the observations? Throats with opposite apparent quantized magnetic charges at a given space-time sheet should move effectively like independent particles (although connected by a flux tube) in opposite directions, giving rise to an effective monopole current accompanied by an opposite current at the other space-time sheet. This is like having balls at the ends of very soft strings at the two sheets. One must assume that only the current at a single sheet is detected. It is mentioned that the ohmic component corresponds to effectively free monopoles (already having long flux tubes connecting the throats with small magnetic string tension). In strong magnetic fields shorter monopole pairs are reported to become "ionized" and to give rise to a current increasing exponentially as a function of the square root of the external magnetic field strength. This could correspond to a phase transition increasing heff with no change in particle mass. This would increase the length of the monopole flux tube and the throats would be effectively free magnetic charges over a much longer Compton scale. For elementary fermions the space-time sheet at which the throat carrying the quantum numbers of the fermion resides would be the preferred one.

The analog of color de-confinement comes to mind and one cannot exclude the color force, since a non-vanishing Kähler field is necessarily accompanied by non-vanishing classical color gauge fields. Effectively free motion below the length scale of the wormhole contact would correspond to asymptotic freedom. Amusingly, one would have a zoomed up representation of the dynamics of colored objects! One can also consider an interpretation in terms of Kähler monopoles: the induced Kähler form corresponds to a classical electroweak U(1) field coupling to weak hypercharge, but asymptotic freedom need not fit this interpretation. Induced gauge fields are however strongly constrained: the components of the color gauge fields are proportional to the Hamiltonians of color rotations and to the induced Kähler form. Hence it is difficult to draw any conclusions.

For a summary of earlier postings see Latest progress in TGD.

Wigner's friend and Schrödinger's cat

I encountered the Wigner's friend paradox in a Facebook discussion (see this and this). Wigner leaves his friend in the laboratory together with Schrödinger's cat and the friend measures the state of the cat: the outcome is "dead" or "alive". Wigner returns and learns from his friend what the state of the cat is. The question is: was the state of the cat fixed already earlier, or only when Wigner learned it from his friend? In the latter case the state of friend plus cat would have been a superposition of pairs in which the cat was alive and the friend knew this, and in which the cat was dead and the friend knew this. The entanglement between the cat and the poison bottle would have been transferred to entanglement between cat plus bottle and Wigner's friend. Recall that this kind of information transfer occurs in quantum computation, and that quantum teleportation allows one to transfer an arbitrary quantum state but destroys the original.

The original purpose of Wigner was to demonstrate that consciousness is involved with the state function collapse.
The TGD view is that state function collapse can be seen as a moment of consciousness. More precisely, a self as a conscious entity corresponds to a sequence of repeated state function reductions to the same boundary of a causal diamond (CD). One might say that the self is a generalized Zeno effect in Zero Energy Ontology (ZEO). The first reduction to the opposite boundary of the CD means the death of the self and re-incarnation at the opposite boundary as a time reversed self. The experienced flow of time corresponds to the shift of the non-fixed boundary of the CD, reduction by reduction, farther from the fixed boundary - also the state at the non-fixed boundary changes. Thus subjective time as a sequence of reductions is mapped to clock time, identifiable as the temporal distance between the tips of the CD. The arrow of time is generated but changes in death-reincarnation.

In the TGD inspired theory of consciousness the intuitive answer to Wigner's question looks obvious. If the friend measured the state of the cat, it was indeed dead or alive already before Wigner arrived. What remains is the question of what it means for Wigner, the "ultimate observer", to learn about the state of the cat from his friend. The question is about what conscious communications are.

Consider first the situation in the framework of standard quantum information theory.

  1. Quantum teleportation could make it possible to transfer an arbitrary quantum state from the brain of Wigner's friend to Wigner's brain. Quantum teleportation involves the generation of a Bell state of qubits assignable to Wigner's friend (A) and Wigner (B).

  2. This quantum state can be constructed by a joint measurement of spin components at A and B. One of the four eigenstates of (by convention) the operator Qz = Jx(1)⊗ Jy(2) - Jy(1)⊗ Jx(2) is the outcome. For spinors the actions of Jx and Jy change the sign of the Jz eigenvalue, so that it becomes possible to construct Bell-type maximally entangled states as eigenstates of Qz (a small numerical check is sketched right after this list).

  3. After that Wigner's friend measures jointly the qubit representing the cat's state, which is to be communicated, and the qubit at A. This measurement does not allow one to predict the state at B. Wigner's friend communicates the two bits resulting from this measurement to Wigner classically. On the basis of these two classical bits Wigner performs a unitary operation on the qubit at his end (B) and transforms it into the qubit that was to be communicated.
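
As a small numerical illustration of the operator introduced in point 2 (my own sketch; the J's are taken to be the standard spin-1/2 matrices σ/2 and the phase conventions are mine):

    import numpy as np

    # Spin-1/2 generators J = sigma/2
    Jx = 0.5 * np.array([[0, 1], [1, 0]], dtype=complex)
    Jy = 0.5 * np.array([[0, -1j], [1j, 0]], dtype=complex)

    # Qz = Jx(1) ⊗ Jy(2) - Jy(1) ⊗ Jx(2)
    Qz = np.kron(Jx, Jy) - np.kron(Jy, Jx)

    def entanglement_entropy(psi):
        # von Neumann entropy (in bits) of the reduced single-qubit state
        rho = np.outer(psi, psi.conj()).reshape(2, 2, 2, 2)
        rho_A = np.trace(rho, axis1=1, axis2=3)
        p = np.linalg.eigvalsh(rho_A)
        p = p[p > 1e-12]
        return float(-(p * np.log2(p)).sum())

    vals, vecs = np.linalg.eigh(Qz)
    for k in range(4):
        print(round(vals[k].real, 3), round(entanglement_entropy(vecs[:, k]), 3))

    # The eigenvectors with eigenvalues +-1/2 come out maximally entangled (entropy 1 bit),
    # i.e. Bell-type states; the zero eigenvalue is doubly degenerate and its eigenspace
    # contains both entangled and product states.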


This allows one to communicate the qubit representing the measurement outcome (alive/dead). But what about meaning? What guarantees that the meaning of the bit representing the state of the cat is the same for Wigner and his friend? One can also ask how the joint measurement can be realized: it seems to require the presence of a system containing A⊗ B. To answer these questions one must introduce some notions of the TGD inspired theory of consciousness: the self hierarchy and the identification subself = mental image.

The TGD inspired theory of consciousness predicts that during communication Wigner and his friend form a larger entangled system: this makes the sharing of meaning possible. Directed attention means that subject and object are entangled. The magnetic flux tubes connecting the two systems would serve as a correlate for the attention. This mechanism would be at work already at the level of molecular biology. Its analog would be the wormholes in the ER-EPR correspondence proposed by Maldacena and Susskind. Note that directed attention brings to mind the generation of the Bell entangled pair A-B. It would also make quantum teleportation possible.

Wigner's friend could also symbolize the "pointer of the measurement apparatus" constructed to detect whether cats are dead or alive. Consider this option first. If the pointer is a subsystem defining a subself of Wigner, it would represent a mental image of Wigner and there would be no paradox. If a qubit in the brain of Wigner's friend replaces the pointer of the measurement apparatus, then during communication Wigner and his friend form a larger entangled system experiencing this qubit. Perhaps this temporary fusion of selves allows one to answer the question of how a common meaning is generated. Note that this would not require the quantum teleportation protocol but would allow it.

Negentropically entangled objects are key entities in the TGD inspired theory of consciousness and the challenge is to understand how they could be constructed and what their properties could be. These states are diametrically opposite to unentangled eigenstates of single particle operators, usually elements of the Cartan algebra of a symmetry group. The entangled states should result as eigenstates of poly-local operators. Yangian algebras involve a hierarchy of poly-local operators, and twistorial considerations inspire the conjecture that the Yangian counterparts of the super-symplectic and other algebras, made poly-local with respect to partonic 2-surfaces or the end-points of the boundaries of string world sheets at them, are symmetries of quantum TGD. Could Yangians allow one to understand maximal entanglement in terms of symmetries?

  1. In this respect the construction of maximally entangled states using the bi-local operator Qz = Jx⊗ Jy - Jy⊗ Jx is highly interesting, since entangled states would result by state function reduction. A single particle operator like Jz would generate unentangled states. The states obtained as eigenstates of this operator have permutation symmetries. The operator can be expressed as Qz = fzij Ji⊗ Jj, where fABC denotes the structure constants of SU(2), and could be interpreted as the co-product associated with the Lie algebra generator Jz. Thus it would seem that unentangled states correspond to eigenstates of Jz and the maximally entangled states to eigenstates of the co-generator Qz. A kind of duality would be in question.

  2. Could one generalize this construction to n-fold tensor products? What about other representations of SU(2)? Could one generalize from SU(2) to an arbitrary Lie algebra by replacing the Cartan generators with suitably defined co-generators and the spin 1/2 representation with the fundamental representation? The optimistic guess would be that the resulting states are maximally entangled and excellent candidates for states for which negentropic entanglement is maximized by NMP.

  3. A co-product is needed and there exists a rich spectrum of algebras with co-product (quantum groups, bialgebras, Hopf algebras, Yangian algebras). In particular, Yangians of Lie algebras are generated by ordinary Lie algebra generators and their co-generators subject to constraints. The outcome is an infinite-dimensional algebra analogous to one half of a Kac-Moody algebra, with the analog of conformal weight N counting the number of tensor factors. Witten gives a nice concrete description of the Yangian, for which the co-generators of the generators TA are given as QA = ∑i<j fABC TBi⊗ TCj, where the summation is over discrete ordered points, which could now label partonic 2-surfaces, points of them, or points of a string like object (a minimal numerical construction of this operator is sketched after this list). For a practically totally incomprehensible description of the Yangian one can look at the Wikipedia article.

  4. This would suggest that the eigenstates of the Cartan algebra co-generators of the Yangian could define an eigenbasis of the Yangian algebra dual to the basis defined by the totally unentangled eigenstates of the generators, and that the quantum measurement of poly-local observables defined by the co-generators creates entangled and perhaps even maximally entangled states. A duality between totally unentangled and completely entangled situations is suggestive and analogous to that encountered in the twistor Grassmann approach, where conformal symmetry and its dual are involved. A beautiful connection between generalizations of Lie algebras, quantum measurement theory, and quantum information theory would emerge.
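
To make Witten's formula quoted in point 3 concrete, here is a minimal sketch generalizing the two-qubit check given earlier to three ordered sites, with spin-1/2 generators standing in for the TA (nothing TGD-specific is implemented; the sites are just an ordered index set):

    import numpy as np
    from itertools import combinations

    # Spin-1/2 generators J = sigma/2 at each site
    Jx = 0.5 * np.array([[0, 1], [1, 0]], dtype=complex)
    Jy = 0.5 * np.array([[0, -1j], [1j, 0]], dtype=complex)

    def site_op(op, i, n):
        # embed a single-site operator at position i of an n-site tensor product
        out = np.array([[1.0 + 0j]])
        for k in range(n):
            out = np.kron(out, op if k == i else np.eye(2, dtype=complex))
        return out

    def Q_z(n):
        # z component of the co-generator: Q^z = sum_{i<j} (Jx_i Jy_j - Jy_i Jx_j),
        # i.e. the A = z case of QA = sum_{i<j} fABC TB_i TC_j for SU(2)
        dim = 2 ** n
        Q = np.zeros((dim, dim), dtype=complex)
        for i, j in combinations(range(n), 2):
            Q += site_op(Jx, i, n) @ site_op(Jy, j, n) - site_op(Jy, i, n) @ site_op(Jx, j, n)
        return Q

    vals = np.linalg.eigvalsh(Q_z(3))
    print(np.round(vals, 4))   # spectrum of the poly-local observable for three spins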

For a summary of earlier postings see Latest progress in TGD.

Saturday, August 13, 2016

What does one mean with quantum fluctuations in TGD framework?

The notion of quantum fluctuation is in my opinion far from being well-defined. Often the Uncertainty Principle is associated with quantum fluctuations. An eigenstate of, say, momentum is necessarily maximally delocalized, and one could say that there are quantum fluctuations in position. The same applies to eigenstates of energy. Personally I would prefer a stronger notion, for which quantum fluctuations would be analogous to thermodynamical fluctuations at criticality.

The path integral formalism provides a stronger definition. In the path integral formalism one interprets the transition amplitude as a sum over all paths - not only the classical one - leading from a given initial state to a given final state. In the stationary phase approximation, which makes the perturbative approach possible, one performs the path integral around the classical path. The paths which differ from the classical path correspond naturally to quantum fluctuations. For interacting quantum field theories (QFTs) the expansion in powers of the coupling constant - say α = e^2/(4π hbar) - would give radiative corrections identifiable as being due to quantum fluctuations. The problem of the path integral formalism is that the path integral does not exist in a strict mathematical sense.

Some relevant ideas about TGD

To discuss the situation in the TGD framework one must first introduce some key ideas of TGD related to the path and functional integrals. The first thing to notice is of course that the path integral and the functional integral might be an unnecessary historical load in the definition of scattering amplitudes in the TGD framework. Scattering diagrams are however central in QFT and also in the dramatically simpler twistorial approach. In ZEO one indeed expects that diagrams in some form are present. A functional integral over 3-surfaces replaces the path integral over all 4-surfaces - the first naive guess for quantization, which completely fails in the TGD framework. The recent view is indeed that the scattering amplitudes reduce to discrete sums of generalized scattering diagrams.

  1. In the TGD framework the path integral is replaced with a functional integral over pairs of 3-surfaces at the boundaries of causal diamonds (CDs; CDs form a scale hierarchy). The functional integral is weighted by a vacuum functional, which is a product of two exponentials. The real exponential guarantees the convergence of the functional integral and hopefully makes it mathematically well-defined. The exponential of an imaginary phase is analogous to the action exponential appearing in QFTs and gives rise to the crucial interference effects. In Zero Energy Ontology (ZEO) one can say that TGD is a complex square root of thermodynamics with the partition function replaced with the exponential of a complex quantity, so that a fusion of QFT and thermodynamics is obtained.

  2. There is no integration over paths, and the interpretation is in terms of holography implying that a pair of 3-surfaces is accompanied by a highly unique space-time surface analogous to a Bohr orbit.

    Holography follows from general coordinate invariance (GCI): the definition of the WCW Kähler geometry must assign to a given 3-surface(!) a hopefully unique space-time surface for 4-D(!) general coordinate transformations to act on. This space-time surface corresponds to a preferred extremal of Kähler action.

  3. One can strengthen the notion of holography by demanding that it does not matter whether one identifies the 3-surfaces as pairs of space-like 3-surfaces at the boundaries of the causal diamond CD or as the light-like 3-surfaces between Euclidian and Minkowskian space-time regions defining "orbits of partonic 2-surfaces". 2-D string world sheets carrying induced fermion fields and partonic 2-surfaces would carry all the information needed to construct physical states, and one would have the strong form of holography (SH). This would mean an almost string model like description of TGD. Preferred extremals satisfy at their ends an infinite number of conditions analogous to Super Virasoro conditions, defining the analog of the Dirac equation at the level of WCW. This is what makes the situation almost - or effectively - 2-dimensional.

    The localization of the modes of the induced spinor fields to 2-surfaces follows from the physically well-motivated requirement that the modes have a well-defined eigenvalue of em charge. This demands that the induced W gauge potentials vanish: the condition requires a 2-D CP2 projection and is in the generic situation satisfied at string world sheets. Also number theoretic arguments favor the condition: in particular, the idea that string world sheets are either complex or co-complex 2-surfaces of a quaternionic space-time surface is highly attractive. The boundaries of the string world sheets would in turn be real/co-real (imaginary) "surfaces" of the string world sheets. It is not clear how unique the choice of the string world sheets is.

  4. It would be very nice if preferred extremals were unique. In fact, the extended number theoretic vision suggests that this might not be the case. There could be a kind of gauge symmetry analogous to that encountered in M-theory, where two different Calabi-Yau geometries would describe the same physics.

    The number theoretic vision states that space-time surfaces are correlates for sequences of algebraic operations transforming an incoming collection of algebraic objects into an outgoing collection of them. There would be an infinite number of equivalent computations, and in the absence of some natural cutoff this in turn would suggest that an infinite number of space-time surfaces - generalized scattering diagrams - correspond to the same scattering amplitude.

    This would extend the old-fashioned string model duality to an infinite number of dualities allowing one to transform all loopy diagrams to braided tree diagrams, as in a QFT without interactions. The functional integration over WCW would not involve a summation over the different topologies of generalized scattering diagrams; a choice of gauge would select one of them: in a similar manner, in the hadronic string model one does not sum separately over s-channel and t-channel exchanges.

    It must be however emphasized that these loops are topological, and include, besides stringy loops (which have a different physical interpretation in TGD: the particle just travels along different paths as in a double slit experiment), also a new kind of loops due to the new kind of vertices analogous to those of ordinary Feynman diagrams. The new elements are that the lines of Feynman diagrams become 4-D orbits of 3-surfaces, and at generalized vertices these generalized lines meet at their ends. At this kind of vertex, not encountered in string models, the 4-surface is locally singular although the 3-surface at the vertex is non-singular. In string models the string world sheets are non-singular but the strings are singular at vertices (an eye-glass type closed string is the basic example).

  5. There is also a functional integral over small deformations of a diagram with a given topology (which could be chosen to be the tree topology). Quantum criticality suggests that coupling constants do not evolve locally: being analogous to critical temperature, they change in a phase transition like manner as the character of quantum criticality changes. Also p-adic considerations suggest that coupling constant evolution reduces to a discrete p-adic coupling constant evolution. Coupling constants would be piecewise constant and depend only on the p-adic length scale and the value of the Planck constant heff=n× h. The theory would be as near as possible to a physically trivial theory in which coupling constants do not evolve at all. The local vanishing of radiative corrections would guarantee the absence of divergences and would have an interpretation in terms of the integrability of TGD.

  6. By SH the functional integral should reduce to one over string world sheets and partonic 2-surfaces assignable to special preferred extremals. For them the vacuum functional would receive contributions from Euclidian and Minkowskian regions and have maximum modulus and stationary phase. The functional integral over deformations of these 2-surfaces would reduce to an exactly calculable Gaussian. As a matter of fact, the Gaussian determinant and the metric determinant from the WCW geometry should cancel, so that one would have only a discrete sum of action exponentials - perhaps only a single one for a given diagram topology, the different topologies being, by the generalization of string model duality, equivalent as representations of equivalent series of algebraic computations. This would be a highly desired result from the point of view of number theoretic universality.

One could of course challenge the entire idea of a functional integral. Why not just replace the functional integral with a sum over amplitudes assignable to preferred extremals corresponding to the maximum modulus/stationary phase 3-surfaces, weighted by the exponential of Kähler action? Classically the theory would effectively be an on mass shell theory. This would automatically give number theoretic universality. If the generalization of the duality symmetry inspired by the idea of scattering as computation holds, then one could include only braided tree diagrams.

Quantum fluctuations in ZEO

What could quantum fluctuations mean in the TGD Universe? Here the quantum criticality of TGD suggests that the interesting quantum fluctuations are associated with quantum criticality.

  1. One can start from a straightforward generalization of the definition of quantum fluctuations suggested by the path integral approach. Holography would suggest that quantum fluctuations correspond to a delocalization in the space of highly correlated pairs of 3-surfaces. This is a nice definition, consistent also with the weakest definition relying on the Uncertainty Principle, but it looks somewhat trivial.

  2. Quantum criticality is a key feature of TGD and would suggest that quantum fluctuations are analogous to thermodynamical fluctuations at criticality and thus involve long range correlations and non-determinism. Thermodynamical fluctuations induce phase transitions. The same should apply to quantum critical quantum fluctuations.

    In the adelic approach to TGD the p-adic primes and the values of heff correspond to various quantum phases of matter. In ZEO a phase transition should correspond, for a particle space-time surface, to a situation in which the two ends of the CD correspond to different phases, that is to different values of the p-adic prime p and/or heff and other collective parameters: note that algebraic extensions of rationals define an evolutionary hierarchy and should also appear as parameters of this kind.

    A zero energy state would have well-defined values of the prime p and/or heff at the passive boundary of the CD, which is left unchanged by a sequence of repeated state function reductions (generalized Zeno effect). At the active end of the CD, which changes during the sequence and at which the members of the state pairs change, one would have a quantum superposition of phases with different values of p and/or heff. This conforms with the idea of what quantum critical fluctuations should mean. The passive end would not fluctuate but the active end would. Quantum fluctuations would become part of the definition of the quantum state in ZEO. The state function reduction to the opposite boundary of the CD would change the roles of the active and passive boundaries of the CD. This would have an interpretation as a quantum phase transition leading to a well-defined phase at the formerly active boundary. Hence the notion of quantum phase transition would also become precisely defined.

For a summary of earlier postings see Latest progress in TGD.

Tuesday, August 09, 2016

Misbehaving b-quarks and proton's magnetic body

Science News tells about misbehaving bottom quarks (see also the ICHEP talk). Or perhaps one should talk about misbehaving b-hadrons - hadrons containing b-quarks. The misbehavior appears in proton-proton collisions at the LHC. This is not the only anomaly associated with the proton. The spin of the proton is still poorly understood and the proton charge radius is not quite what it should be. Now we learn that there are more b-containing hadrons (b-hadrons) in directions deviating considerably from the direction of the proton beam than expected: the discrepancy factor is of order two.

How could this reflect the structure of the proton? Color magnetic flux tubes are the new TGD based element in the model of the proton: could they help? I assign to the proton color magnetic flux tubes with a size scale much larger than the proton size - something like the electron Compton length: most of the mass of the proton is the color magnetic energy associated with these tubes, and they define the non-perturbative aspect of hadron physics in the TGD framework. For instance, constituent quarks would be valence quarks plus their color flux tubes. Current quarks are just the quarks whose masses give a rather small contribution to the proton mass.

What happens when two protons collide? In the cm system the dipolar flux tubes get contracted in the direction of motion by Lorentz contraction. Suppose b-hadrons tend to leave the proton along the color magnetic flux tubes (also ordinary em flux tubes could be in question). The Lorentz contraction of the flux tubes means that they tend to leave in directions orthogonal to the collision axis. Could this explain the misbehavior of b-hadrons?

But why should only b-hadrons or some fraction of them behave in this manner? Why not also the lighter hadrons containing c and s? Could this relate to the much smaller size of the b-quark defined by its Compton length λ = hbar/m(b), m(b) = 4.2 GeV, which is much shorter than the Compton length of the u-quark (the mass of the constituent u quark is something like 300 MeV and the mass of the current u quark is a few MeV)? Could it be that lighter hadrons do not leave the proton along flux tubes? Why? Are these hadrons or the corresponding quarks too large to fit (topologically condense) inside the protonic flux tube? The b-quark is much more massive and has a considerably smaller size than, say, the c-quark with mass m(c) = 1.5 GeV, and could be able to topologically condense inside the protonic flux tube. The c quark would be too large, which suggests that the radius of the flux tubes is larger than the proton Compton length. This picture conforms with the view of perturbative QCD in which the primary processes take place at the parton level. The hadronization would occur in a longer time scale and generate the magnetic bodies of the outgoing hadrons. The alternative idea that also the color magnetic body of the hadron should fit inside the protonic color flux tube is not consistent with this view.
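
For orientation, here are the reduced Compton lengths entering the size comparisons above (a back-of-the-envelope computation added by me, using the rough mass values quoted in the text):

    # reduced Compton length: lambda = hbar/(m c) = (hbar c)/(m c^2)
    hbar_c = 0.1973  # GeV * fm
    masses_GeV = {
        "b quark": 4.2,
        "c quark": 1.5,
        "constituent u quark": 0.3,
        "proton": 0.938,
        "electron": 0.000511,
    }
    for name, m in masses_GeV.items():
        print(f"{name:>20s}: {hbar_c / m:9.4f} fm")

    # b: ~0.05 fm, c: ~0.13 fm, constituent u: ~0.66 fm, proton: ~0.21 fm, electron: ~386 fm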

For a summary of earlier postings see Latest progress in TGD.

Friday, August 05, 2016

Is the new physics really so elementary as believed?

Last night I was thinking about the situation in particle physics. The inspiration of course comes from the 750 GeV particle, which officially does not exist anymore. I am personally puzzled. Various bumps, about which Lubos has kept count, fit nicely into the spectrum of mesons of the M89 hadron physics (almost-)predicted by TGD (see this, this, this, and this). They have precisely the predicted masses, differing by a factor 512 from those of M107 hadron physics, the good old hadron physics. Is it really possible that the Universe has conspired to create so many statistical fluctuations in just the correct places? Could it be that something is wrong in the basic philosophy of experimental particle physics, something which leads to a loss of information?
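
The factor 512 quoted above is just the p-adic mass scale ratio between the Mersenne primes M89 and M107. A one-line sanity check, together with a couple of purely illustrative scalings of ordinary hadron masses (standard PDG values; no claim is made here about which bump corresponds to which state):

    # p-adic mass scale ratio between M89 and M107 hadron physics: 2^((107-89)/2) = 2^9
    scale = 2 ** ((107 - 89) // 2)
    print(scale)   # 512

    # a hadron of mass m in ordinary (M107) hadron physics would have an M89
    # counterpart at roughly 512*m (purely illustrative numbers)
    for name, m_GeV in [("pi0", 0.135), ("eta", 0.548), ("proton", 0.938)]:
        print(name, round(scale * m_GeV, 1), "GeV")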

First of all, it is clear that new physics is badly needed to solve various theoretical problems, such as the fine tuning problem for the Higgs mass, to say nothing about the problem of understanding particle mass scales. New physics is necessary but it has not been found. What goes wrong? Could it be that we are trying to discover the wrong type of new physics?

Particle physics is thought to be about elementary objects. There would be no complications like those appearing in condensed matter physics: criticality or even quantum criticality, exotic quasiparticles, ... This simplifies the situation enormously, but still one is dealing with a gigantic complexity. The calculation of scattering rates is technically extremely demanding but basically an application of well-defined algorithms; Monte Carlo modelling of the actual scattering experiments, such as high energy proton-proton collisions, is also needed. One must also extract the signal from a gigantic background. These are extremely difficult challenges and the LHC is a marvellous achievement of collaboration and coherence: like a string quartet but with 10,000 players.

What one does is however not just to look at what is there. There is no label on the particle telling "I am the exotic particle X that you are searching for". What one can do is to check whether the small effects - signatures - caused by a given particle candidate can be distinguished from the background noise. Finding a needle in a haystack is child's play compared with what one must achieve. If some totally new physics not fitting the basic paradigms behind the search algorithms is there, it is probably lost.

Returning to the puzzle under consideration: the alarming fact is that the colliding protons at the LHC form a many-particle system! Could it be that the situation is even more complex than believed, and that phenomena like emergence and criticality encountered in condensed matter physics are present and make life even more difficult?

As a matter of fact, already the phase transition from the confined phase to perturbative QCD, involving thermodynamical criticality, would be an example of this complexity. The surprise from RHIC and later the LHC was that something indeed happened, but it was different from what was expected. The transition did not seem to lead to perturbative QCD, which predicts thermal "forgetfulness" and isotropic particle distributions from the QCD plasma as black body radiation. For peripheral collisions - colliding particles just touching - indications of string like objects emerged. The notion of color glass condensate was introduced and even AdS/CFT was tried (strings in 10-D space-time!) but without considerable success. It is as if a new kind of hadron physics with long range correlations in the proton scale but with an energy scale of hundreds of proton masses had been present. This is mysterious since the Compton lengths for this kind of objects should be of the order of the weak boson Compton length.

In the TGD Universe this new phase would be M89 hadron physics with a large value of heff = n×h, with n = 512 scaling up the M89 hadron Compton length to the proton size scale, which gives long range correlations and fluctuations in the proton scale characterizing quantum criticality. The instanton density I ∝ E• B for the colliding protons would appear as a state variable analogous to, say, pressure in condensed matter and would be large just for the peripheral collisions. The production amplitude for the pseudoscalar mesons of the new hadron physics would by anomaly arguments be obtained as the Fourier transform of I. The value of I would be essentially zero for head-on collisions and large only for peripheral collisions - particles just touching - in regions where E and B tend to be parallel. This would mean criticality. There could be a similar criticality with respect to energy. If the experimenter imposes kinematical cuts - say, pays attention only to collisions that are not too peripheral - the signal is lost.

This would not be new. Already in the seventies anomalous production of electron-positron pairs, perhaps resulting from a pseudoscalar state created near the collision energy allowing to overcome the Coulomb wall, was reported: criticality again. The TGD model was in terms of leptopions (electro-pions) (see this), and later evidence for their muonic and tau counterparts has been reported. The model of course had a bad problem: the mass of the leptopion is essentially twice that of the lepton, and one expects that the colored lepton is also light. Weak boson decay widths do not allow this. If the leptopions are dark in the TGD sense, the problem disappears. These exotic bumps were later forgotten: a good reason for this is that they are not allowed by the basic paradigms of particle physics, and if they appear only at criticality they are bound to suffer the fate of being labelled as statistical fluctuations.

This has served as an introduction to a heretical question: could it be that the LHC did not detect 750 GeV bosons because the kinematical cuts of the analysis eliminate the peripheral collisions in which the protons just touch each other? Could these candidates for the pseudoscalars of M89 hadron physics be created by the instanton anomaly mechanism and only in peripheral collisions? And more generally, should particle physicists consider the possibility that they are no longer studying collisions of simple elementary systems?

To find M89 pseudoscalars one should study peripheral collisions in which the protons do not collide quite head-on and in which M89 pseudoscalars could be generated by the em instanton mechanism. In the peripheral situation it is easy to measure the energy emitted as particles since strong interactions are effectively absent - only the E•B interaction plus the standard em interaction remain if the TGD view is right (note that for neutral vector mesons the generalization of vector meson dominance, based on an effective action coupling the neutral vector boson linearly to the em gauge potential, is highly suggestive). Unfortunately, peripheral collisions are undesired since the beams are deflected from their head-on course! These events are detected, but the data tend to end up in the trash bin, as do the deflected protons!! Luckily, Risto Orava's team (see this and this) is studying just those p-p collisions which are peripheral! It would be wonderful if they found Cernettes and maybe also other M89 pseudoscalars in the trash bin!

A large statistical fluctuation certainly occurred. The interpretation of the large statistical fluctuation giving rise to the Cernette boom could be the occurrence of an unusually large portion of peripheral events allowing the production of M89 mesons, in particular Cernettes.

To sum up, the deep irony is that particle physicists are desperately trying to find new physics although it was found long ago but swept under the rug since it did not conform with QCD and the standard model. The reductionistic dogma dictates that acceptable new physics must be consistent with the standard model: no wonder that everything indeed continues to be miraculously consistent with the standard model and no new physics is found! The same is true in the gravitational sector: reductionism demands that string models lead to GRT, and the various anomalies challenging GRT are simply forgotten.

For details see the earlier blog post, the chapter New Physics predicted by TGD: part I of "p-Adic Physics" or the article M89 hadron physics and quantum criticality.

For a summary of the earlier postings see Latest progress in TGD.

Cosmic redshift but no expansion of receding objects: one further piece of evidence for TGD cosmology

"Universe is Not Expanding After All, Controversial Study Suggests" was the title of very interesting Science News article telling about study which forces to challenge Big Bang cosmology. The title of course involve the typical popular exaggeration.

The idea behind the study was simple. If the Universe expands, one expects that also astrophysical objects - such as stars and galaxies - should participate in the expansion and increase in size. The observation was that this does not happen! One however observes the cosmic redshift, so it is quite too early to start burying Big Bang cosmology. The finding is however a strong objection against the strongest version of the expanding Universe. That objects like stars do not participate in the expansion was actually already known when I started to develop TGD inspired cosmology a quarter century ago, and the question is whether GRT based cosmology can model this fact naturally or not.

The finding supports TGD cosmology based on many-sheeted space-time. Individual space-time sheets do not expand continuously. They can however expand in a jerk-wise manner via quantum phase transitions increasing the p-adic prime characterizing the space-time sheet of the object by, say, a factor of two, or increasing the value of heff=n× h for it. This phase transition could change the properties of the object dramatically. If the object and its suddenly expanded variant are not regarded as states of the same object, one would conclude that astrophysical objects do not expand but only comove. The sudden expansions should however be observable and should happen also for the Earth. I have proposed a TGD variant of the Expanding Earth hypothesis along these lines (see this).

When one approximates the many-sheeted space-time of TGD with GRT space-time, one compresses the sheets to a single region of a slightly curved piece of M4, and the gauge potentials and the deviations of the induced metrics from the M4 metric are replaced with their sums over the sheets to get the standard model plus GRT description. This operation leads to a loss of information about many-sheetedness. Many-sheetedness demonstrates its presence only through anomalies, such as the different values of the Hubble constant in scales of the order of large voids and in cosmological scales (see this), the arrival of neutrinos and gamma rays from supernova SN1987A as separate bursts (see this), and the above observation.

One can of course argue that the cosmic redshift is a strong counter-argument against TGD: the conservation of energy and momentum implied by Poincare invariance at the level of the imbedding space M4× CP2 does not seem to allow cosmic redshift. This is not the case. Photons arrive from the source without losing their energy. The point is that the properties of the imagined observer change as its distance from the source increases! The local gravitational field defined by the induced metric induces a Lorentz boost of the M4 projection of the tangent space of the space-time surface, so that the tangent spaces at the source and at the receiver are boosted with respect to each other: this causes the gravitational redshift as an analog of the Doppler effect in special relativity. This is also a strong piece of evidence for the identification of space-time as a 4-surface in M4× CP2.

For details see the chapter More about TGD inspired cosmology of "Physics in Many-sheeted Space-time" or the article Some astrophysical and cosmological findings from TGD point of view.

For a summary of earlier postings see Latest progress in TGD.