Monday, November 03, 2008

More about CDF

The weekend has been rather busy and emotional, to say the least. Just a week or two ago I compared the situation in the world economy to that in theoretical particle physics and found many similarities (See Serious scientist syndrome and Two Richistans). I also speculated about the necessity of a New Deal, also in theoretical particle physics. Maybe this New Deal will be realized in particle physics much sooner than I ever dared to dream. I believe that the CDF anomaly will not respect the sociology of science, and everything I know about the anomaly (at my non-specialist level, which forces me to concentrate on bare essentials) is consistent with what TGD predicts. In particular, the correct prediction for the lifetime of the new particle without bringing in any new parameters is something which is really difficult to dismiss. Also the predictions for the masses of τ-pions are testable, and τ-baryons should be there as well. Similar predictions follow for the electronic and muonic variants of leptohadron physics.

The response in blogs created mixed feelings in me. Many participants continued to behave as if I did not exist at all and responded only to the hand-wavings of the big names. There were also the usual arrogant claims that my theory does not predict anything! Still! After these 31 years and 15 books! Someone even censored out my posting! There were also people realizing how far-reaching the implications of these successful predictions are. I am especially grateful to Lubos and Tommaso for adding a link to my blog.

I paste below two responses to blogs. I have taken the freedom to edit and add something, since the originals can be found in the blogs. The first one is to Tommaso Dorigo in his blog and tries to explain how the basic predictions come out. The second one is to Jester in the Resonaances blog and tries to make clear that the notion of prediction means much more than doing Monte Carlo calculations. Here I have attached some new arguments to the end of the response.

My hope is that I could make clear that theoretical physics is very conceptual stuff at the fundamental level, and that this conceptualization is far from being funny word salad, since words with precise meanings provide a higher-level language with which to communicate complex ideas. This is something which quite too many theoretical physicists, restricting their activities to the mere application of methods, have forgotten. I am of course not referring to Tommaso and Jester here. I must however say that in my own country the situation in this respect is rather gloomy, to put it mildly.

Response to Tommaso Dorigo

Dear Tommaso,

the model explains the following observations without further calculations.

  1. Jets result from the same mechanism as in QCD. A coherent state of τ-pions is generated in the non-orthogonal E and B fields of the colliding protons; it is then heated to a QCD-plasma-like state decaying to colored lepton jets, which produce leptohadrons in turn decaying to ordinary leptons. I have considered in detail the decay mechanism leading from colored excitations to the ground state here.

  2. Muons dominate if the masses of τ and τc (ν and νc) are near each other, because there is very little phase space for τ final states. The near equality of the masses is motivated by the fact that it holds true for ec and μc. In the case of the electron the small electron mass reduces the phase space. In the case of the neutrino the equality has not been checked: it would predict that the charged electropion has a mass nearly equal to that of the electron. I have very probably discussed this point in the above link: this would of course relate to the anomalous production of electron-positron pairs discovered already in the seventies but for some reason forgotten since then.

  3. The numbers of anomalous muons with opposite charges are the same, since the neutral τ-pion initiates the jets.

  4. One can also calculate the lifetimes of the neutral and charged τ-pions just by inserting the appropriate masses into the standard formulas for the lifetimes of ordinary pions. The charged pion decays weakly and everything is calculable using the PCAC hypothesis (pion field proportional to the axial current); the neutral pion decays dominantly via the two-photon channel, and anomaly considerations fix the Lagrangian to the product of the τ-pion field and E·B with a coefficient involving only the axial coupling f(πτ) = x·m(πτ), the fine structure constant, and m(πτ). The prediction is correct if x is scaled down from that for the ordinary pion by a factor of 0.4 (see the numerical sketch after this list). I would not like to sound like a teacher, but this result should really wake up everyone in the audience.

  5. The prediction for the neutral leptopion mass is 3.6 GeV, the same as in the paper of the CDF collaboration [13], which had appeared on the arXiv Monday morning, as I learned from Tommaso's blog. The masses suggested in the article were 3.6 GeV, 7.3 GeV, and 15 GeV. The p-adic length scale hypothesis predicts that the allowed mass scales come as powers of sqrt(2), and these masses come in good approximation as powers of 2. Several p-adic scales appear in low-energy hadron physics for quarks, and this replaces the Gell-Mann formula for low-lying hadron masses. Therefore one can ask whether these masses correspond to the neutral τ-pion with p = Mk = 2^k - 1, k = 107, and its scaled-up variants with p ≈ 2^k, k = 105 and k = 103 (also prime). The prediction for the masses would be 3.6 GeV, 7.2 GeV, 14.4 GeV (see the sketch after this list).

  6. Also the total rate for virtual leptopion production is calculable using the product of the leptopion field and the instanton action density E·B. This requires a model for the collision. The simplest thing to do is to start with a free collision parameterized by impact parameter and velocity, and to integrate the differential cross section over impact parameter values up to an infrared cutoff, which must be imposed in order to have a finite result. This was done in the case of electropion production using a classical model for the orbits of the ions and the resulting E and B. In the case of the electropion, atomic size was the first guess for the cutoff; now the τ-pion Compton length is the first guess. From this one can estimate the rate for the production of leptopions and, since this is the rate-determining step, the total rate for the production of anomalous muons via the jet mechanism.
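To make points 4 and 5 concrete, here is a minimal numerical sketch of the scalings involved. It is my own illustration rather than the calculation referred to above: it assumes the textbook two-photon anomaly width Γ = α²m³/(64π³f²) with the axial coupling written as f(π) = x·m(π), and the p-adic rule that mass scales come as powers of sqrt(2); the value x(πτ) = 0.4·x(π) simply implements the scaling factor quoted in point 4.

```python
# Sketch of points 4 and 5. Assumptions (mine): the neutral tau-pion decays via
# the standard two-photon anomaly formula Gamma = alpha^2 m^3 / (64 pi^3 f^2),
# the axial coupling is written as f = x*m, and the p-adic mass scale goes as
# 1/sqrt(p) ~ 2^(-k/2), so decreasing k by 2 doubles the mass.
import math

ALPHA = 1.0 / 137.036      # fine structure constant
HBAR_GEV_S = 6.582e-25     # hbar in GeV*s

def gamma_two_photon(m_gev, f_gev):
    """Two-photon anomaly width Gamma = alpha^2 m^3 / (64 pi^3 f^2), in GeV."""
    return ALPHA**2 * m_gev**3 / (64.0 * math.pi**3 * f_gev**2)

# Sanity check with the ordinary neutral pion: m = 0.135 GeV, f_pi = 0.092 GeV.
m_pi0, f_pi = 0.135, 0.092
gamma_pi0 = gamma_two_photon(m_pi0, f_pi)
print(f"pi0:    Gamma = {gamma_pi0 * 1e9:.1f} eV, "
      f"lifetime = {HBAR_GEV_S / gamma_pi0:.2e} s")   # about 7.8 eV, 8.4e-17 s

# Neutral tau-pion: x scaled down from the pion value x_pi = f_pi/m_pi0 by 0.4.
m_taupi = 3.6                     # GeV, the k = 107 prediction quoted above
x_taupi = 0.4 * (f_pi / m_pi0)    # illustrative choice of x
gamma_taupi = gamma_two_photon(m_taupi, x_taupi * m_taupi)
print(f"taupi0: Gamma = {gamma_taupi:.2e} GeV, "
      f"lifetime = {HBAR_GEV_S / gamma_taupi:.2e} s")

# The k = 105 and k = 103 variants of the 3.6 GeV state are scaled up by
# factors of 2 and 4, as in point 5.
for k in (107, 105, 103):
    print(f"k = {k}: m = {m_taupi * 2 ** ((107 - k) / 2):.1f} GeV")
```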

Response to Jester

Dear Jester,

a further comment about what it means to predict.

TGD predicts colored leptons from extremely general premises. TGD was born as a proposal for how to solve the problem of General Relativity caused by the fact that energy is not well defined there. Space-time as a 4-D surface in H = M4×CP2 predicts the standard model symmetries, and that color is not a spin-like quantum number but corresponds to CP2 partial waves. Leptohadron physics is just one of the many predicted deviations from the standard model leading to a plethora of testable predictions. TGD of course actually predicts an infinite number of colored excitations of quarks and leptons, so that we would only be beginning to see the tip of an iceberg.

A second example is the nuclear string model. Nucleons bind to strings, with the nucleons connected by color bonds having scaled variants of quark and antiquark at their ends. A fractal hierarchy of QCD-like physics is also a prediction of the notion of induced gauge field, which leads to a geometrization of the known gauge fields and predicts extremely strong correlations between the classical electroweak and color gauge fields, both expressible in terms of the four CP2 coordinates. In particular, the em field is accompanied by a classical color gauge field for non-vacuum extremals. Therefore em flux tubes must have a quark and an antiquark at their ends serving as the sources of the field.

Besides neutral bonds there can be charged color bonds. This predicts a large number of new nuclear states. Also for these there is empirical evidence: some of it appeared just some time ago, and some of it has been collected during the last 30 years by the Russian physicist Shnoll. Nuclear decay rates correlate with the distance from the Sun, and the explanation is that X rays from the Sun induce transitions from the ground states of nuclei to excited states containing charged color bonds. This is testable in the laboratory by irradiating nuclei with X rays.

TGD also predicts the spectrum of elementary particle masses with one per cent accuracy, besides the mass scales, from extremely general assumptions: super-conformal invariance (its generalization, actually) and p-adic thermodynamics. Masses are exponentially sensitive to the integer k in the prime p ∼ 2^k, so that there is no hope of getting the masses correctly by fitting. The detailed calculations are in 4 chapters of the book containing the leptopion chapter. There is a large number of testable predictions: new exotic states, and the existence of scaled-up variants of quarks and neutrinos, and also of electrons, with mass scales scaled up by a power of the square root of 2.
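As a quick illustration of this exponential sensitivity (a sketch under the assumption, stated above, that the mass scale behaves as 1/sqrt(p) ∼ 2^(-k/2)):

```python
# Sketch: if a mass scale goes as m(k) ~ m_ref * 2^(-k/2) (i.e. ~ 1/sqrt(p) for
# p ~ 2^k), shifting the integer k by even one unit rescales the mass by a
# factor of sqrt(2), about 41 percent, so k cannot be tuned continuously to fit.
for dk in (1, 2, 4, 10):
    print(f"k -> k + {dk}: mass is multiplied by 2^(-{dk}/2) = {2 ** (-dk / 2):.4f}")
```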

One problem is that colleagues typically confuse prediction with mere numerics, which can be done by any gifted graduate student. Before you can concentrate on numerics or have general recipes for calculating Feynman graphs, a huge amount of conceptualization is needed, starting from ontological questions. What does the theory predict to exist? This is the first question, and it must be answered before detailed Monte Carlo calculations of particle distributions. Someone must do it, and it has been my fate.

In TGD this has led to a totally new ontology that I call zero energy ontology, which is consistent with the crossing symmetry of QFTs but provides a radically new view about quantum states. A new view about time has been necessary to understand the formulation of the M-matrix, which generalizes the notion of the S-matrix and fuses thermodynamics with quantum theory: it is essentially the real square root of a density matrix, analogous to the modulus of a wave function, multiplied by a unitary "phase" represented by the S-matrix. The Connes tensor product fixes the M-matrix highly uniquely, so that the task is to calculate. Also the notion of a Feynman graph generalizes. Measurement resolution becomes a key notion, and one could even say that its mathematical representation fixes the M-matrix. And so on.
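The following toy sketch (my own illustration in my own notation, not the actual TGD construction) makes the phrase "real square root of a density matrix multiplied by a unitary phase" concrete: with a diagonal thermal-like ρ and a random unitary S, the product M = sqrt(ρ)·S reproduces ρ as M·M†, so the statistical and unitary parts coexist in a single object.

```python
# Toy model of an "M-matrix" M = sqrt(rho) @ S: the square root of a density
# matrix (thermodynamics) times a unitary "phase" S, so that M @ M^dagger = rho.
import numpy as np

rng = np.random.default_rng(0)
n = 4

# A thermal-like density matrix: diagonal, positive, trace one.
p = np.exp(-np.arange(n, dtype=float))
rho = np.diag(p / p.sum())

# A random unitary "S-matrix" from the QR decomposition of a complex matrix.
S, _ = np.linalg.qr(rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n)))

M = np.sqrt(rho) @ S  # square root of the density matrix times the unitary part

print(np.allclose(S.conj().T @ S, np.eye(n)))   # True: S is unitary
print(np.allclose(M @ M.conj().T, rho))         # True: M M^dagger recovers rho
```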

This conceptualization period is accompanied (I first wrote "is followed", which is of course not true) by a period during which you gradually quantify your predictions, starting from simple yes/no predictions which could kill your theory. You also busily explain anomalies: if nothing else, this prevents them from being swept completely under the carpet by the mainstream. Unfortunately the electropion anomaly, discovered already in the seventies, suffered this fate.

I hasten to admit that TGD is far from precise Feynman rules that any grad student could apply. For instance, during the last weeks I have been working on an application of category theory in order to formulate precisely the generalized Feynman rules of TGD in terms of the N-point functions of conformal QFT. Or rather, those of symplecto-conformal QFT. Symplectic QFT is analogous to conformal QFT, and I managed to solve its N-point functions from associativity conditions in terms of the operad notion, giving an infinite hierarchy of discrete symplectic field algebras. Discreteness is a correlate for finite measurement resolution and is realized in terms of number-theoretic braids, which also emerge from totally different premises. Very beautiful new mathematics emerges, allowing one to formulate the notion of finite measurement resolution.

From this it is still a long way to practical calculations, since one must deduce the long length scale limit of the theory in order to use continuum mathematics, and specify precisely which of the very many candidate conformal field theories applies in a specific situation. Also the very notion of a conformal field theory generalizes for light-like 3-surfaces.

The second problem is that the extreme arrogance of particle physics colleagues has made the communication of TGD impossible. It is of course the censors who suffer most from the censorship in the long run. As a consequence I am doomed to be 31 years ahead of the community, which is still trying to make sense of string models, refusing to realize that an extremely beautiful generalization, obtained by replacing strings with light-like 3-surfaces, exists and predicts among other things the space-time dimension correctly and deduces the standard model symmetries from number theory.
