Is the new physics really as elementary as believed?
Last night I was thinking about the situation in particle physics. The inspiration of course comes from the 750 GeV particle, which officially does not exist anymore. I am personally puzzled. Various bumps, about which Luboš has kept count, fit nicely the spectrum of mesons of M89 hadron physics (almost-)predicted by TGD (see this, this, this, and this). They have precisely the predicted masses, differing by a factor 512 from those of M107 hadron physics, the good old hadron physics. Is it really possible that the Universe has conspired to create so many statistical fluctuations in just the right places? Or could something be wrong in the basic philosophy of experimental particle physics, something that leads to a loss of information?
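The factor 512 is just 2^9: as I understand the TGD p-adic mass-scale rule, mass scales behave like 2^(-k/2), so the ratio between the k = 89 and k = 107 physics is 2^((107-89)/2). A back-of-the-envelope sketch in Python; the PDG meson masses and the η(1475) comparison are my illustrative additions, not claims from the post:

```python
# The claimed mass-scale ratio between M107 (ordinary) and M89 hadron physics:
# p-adic mass scales go like 2^(-k/2), so the ratio is 2^((107-89)/2).
ratio = 2 ** ((107 - 89) / 2)
print(ratio)  # 512.0

# Scale a few ordinary light-meson masses (PDG values, GeV) up by 512.
m107_mesons = {"pi0": 0.135, "eta": 0.548, "rho": 0.775, "eta'": 0.958}
for name, m in m107_mesons.items():
    print(f"M89 {name}: {m * ratio:.0f} GeV")

# Conversely, a 750 GeV state would descend from an ordinary meson of mass
print(f"750 GeV / 512 = {750 / ratio:.2f} GeV")  # ~1.46 GeV, near eta(1475)
```

So under this scaling the M89 pion would sit near 69 GeV, and a 750 GeV bump would correspond to an ordinary meson around 1.46 GeV.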
First of all, it is clear that new physics is badly needed to solve various theoretical problems, such as the fine-tuning problem for the Higgs mass, to say nothing of the problem of understanding particle mass scales. New physics is necessary, but it is not found. What goes wrong? Could it be that we are trying to discover the wrong type of new physics?
Particle physics is thought to be about elementary objects. There would be no complications like those appearing in condensed matter physics: criticality or even quantum criticality, exotic quasiparticles, and so on. This simplifies the situation enormously, but one is still dealing with gigantic complexity. The calculation of scattering rates is technically extremely demanding but basically an application of well-defined algorithms; Monte Carlo modelling of the actual scattering experiments, such as high-energy proton-proton collisions, is also needed. One must also extract the signal from a gigantic background. These are extremely difficult challenges, and the LHC is a marvellous achievement of collaboration and coherence: like a string quartet, but with 10,000 players.
What one does, however, is not simply to look at what is there. There is no label on the particle saying "I am the exotic particle X that you are searching for". What one can do is check whether the small effects - signatures - caused by a given particle candidate can be distinguished from the background noise. Finding a needle in a haystack is child's play compared with what one must achieve. If some totally new physics not fitting the basic paradigms behind the search algorithms is there, it is probably lost.
Returning to the puzzle under consideration: the alarming fact is that the colliding protons at the LHC form a many-particle system! Could it be that the situation is even more complex than believed, and that phenomena like emergence and criticality, familiar from condensed matter physics, are present and make life even more difficult?
As a matter of fact, already the phase transition from the confined phase to perturbative QCD, involving thermodynamical criticality, would be an example of this complexity. The surprise from RHIC and later the LHC was that something indeed happened, but it was different from what was expected. The transition did not seem to lead to perturbative QCD, which predicts thermal "forgetfulness" and isotropic particle distributions from the QCD plasma, as for black-body radiation. For peripheral collisions - colliding particles just touching - indications of string-like objects emerged. The notion of color glass was introduced, and even AdS/CFT was tried (strings in 10-D space-time!), but without notable success. It is as if a new kind of hadron physics with long-range correlations at the proton scale, but with an energy scale of hundreds of proton masses, had been present. This is mysterious, since the Compton lengths of objects of this kind should be of the order of the weak boson Compton length.
In the TGD Universe this new phase would be M89 hadron physics with a large value of Planck constant heff = n×h, with n = 512, scaling the M89 hadron Compton length up to the proton size scale so as to give the long-range correlations and fluctuations at the proton scale characterizing quantum criticality. The instanton density I ∝ E·B of the colliding protons would appear as a state variable analogous to, say, pressure in condensed matter, and would be large just for the peripheral collisions. By anomaly arguments, the production amplitude for pseudoscalar mesons of the new hadron physics would be obtained as the Fourier transform of I. The value of I would be essentially zero for head-on collisions and large only for peripheral collisions - particles just touching - in regions where E and B tend to be parallel. This would mean criticality. There could be a similar criticality with respect to energy. If the experimenter imposes kinematic cuts - say, pays attention only to collisions that are not too peripheral - the signal is lost.
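A minimal numeric check of the Compton-length argument, assuming the reduced Compton length λ = ħ_eff/(mc) and taking an M89 hadron to be 512 times heavier than its ordinary counterpart (the choice of the proton as the example is my illustrative pick). The point is that n = 512 cancels the mass ratio exactly, landing the dark Compton length back at the ordinary hadronic scale:

```python
# Compton-length scaling: with heff = n*h, lambda = heff/(m*c) is scaled
# up by n = 512, exactly undoing the factor-512 increase in hadron mass.
hbar_c = 0.19733  # GeV*fm (standard value of hbar*c)

m_p = 0.938            # proton mass, GeV
m_p89 = 512 * m_p      # corresponding M89 hadron mass, ~480 GeV
n = 512                # heff = n*h, as stated in the text

lam_ordinary = hbar_c / m_p89      # Compton length with ordinary h
lam_dark = n * lam_ordinary        # Compton length with heff = 512*h

print(f"{lam_ordinary:.2e} fm")    # ~4.1e-04 fm: far below hadronic scales
print(f"{lam_dark:.3f} fm")        # ~0.210 fm: back at the proton scale
assert abs(lam_dark - hbar_c / m_p) < 1e-12  # n cancels the mass ratio
```

So a dark M89 hadron would have the same Compton length as the ordinary proton, which is what the long-range-correlation argument needs.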
This would not be new. Already in the seventies, anomalous production of electron-positron pairs was reported, perhaps resulting from a pseudoscalar state created near the collision energy allowing the Coulomb wall to be overcome: criticality again. The TGD model was in terms of leptopions (electropions) (see this), and later evidence for their muonic and tau counterparts has been reported. The model of course had a bad problem: the mass of the leptopion is essentially twice that of the lepton, and one expects the colored lepton to be light as well. Weak boson decay widths do not allow this. If the leptopions are dark in the TGD sense, the problem disappears. These exotic bumps were later forgotten: a good reason for this is that they are not allowed by the basic paradigms of particle physics, and if they appear only at criticality, they are bound to suffer the fate of being labelled as statistical fluctuations.
This has served as an introduction to a heretic question: could it be that the LHC did not detect 750 GeV bosons because the kinematic cuts of the analysis eliminate the peripheral collisions in which the protons just touch each other? Could these candidates for pseudoscalars of M89 hadron physics be created by the instanton anomaly mechanism, and only in the periphery? And more generally, should particle physicists consider the possibility that they are no longer studying collisions of simple elementary systems?
To find M89 pseudoscalars one should study peripheral collisions, in which the protons do not collide quite head-on and in which M89 pseudoscalars could be generated by the electromagnetic instanton mechanism. In the peripheral situation it is easy to measure the energy emitted as particles, since strong interactions are effectively absent - only the E·B interaction plus the standard em interaction, if the TGD view is right (note that for neutral vector mesons a generalization of vector meson dominance, based on an effective action coupling the neutral vector boson linearly to the em gauge potential, is highly suggestive). Unfortunately, peripheral collisions are undesired, since the beams are deflected from their head-on course! These events are detected, but the data tend to end up in the trash bin, as do the deflected protons! Luckily, Risto Orava's team (see this and this) is studying just those p-p collisions which are peripheral! It would be wonderful if they found Cernettes - and maybe also other M89 pseudoscalars - in the trash bin!
A large statistical fluctuation certainly occurred. It could be interpreted as the occurrence of an unusually large fraction of peripheral events, allowing the production of M89 mesons - in particular Cernettes - and giving rise to the Cernette boom.
To sum up, the deep irony is that particle physicists are desperately trying to find new physics, although it was found long ago but swept under the rug because it did not conform with QCD and the standard model. The reductionistic dogma dictates that acceptable new physics must be consistent with the standard model: no wonder that everything indeed continues to be miraculously consistent with the standard model and no new physics is found! The same is true in the gravitational sector: reductionism demands that string models lead to GRT, and the various anomalies challenging GRT are simply forgotten.
For a summary of the earlier postings see Latest progress in TGD.