Saturday, July 26, 2014

About the origin of the Born rule

Lubos has again been aggressive. This time Sean Carroll became the victim of Lubos's verbal attacks. Lubos got angry about the articles by Carroll and his student attempting to derive the Born rule from something deeper: this "something deeper" was proposed to be the many-worlds fairy tale, as Lubos expresses it. I agree with Lubos about the impossibility of deriving the Born rule in the context of wave mechanics - here the emphasis is on "wave mechanics". I also share his view of the many-worlds interpretation - at least I have not been able to make any mathematical sense of it.

Lubos does not miss the opportunity to personally insult people who write about their scientific work on blogs. Lubos does not realize that this is really the only communication channel for many scientists. For the outlaws of the academic world, blogs, home pages, some archives, and some journals (of course not read by the "real" researchers enjoying a monthly salary) provide the only way to communicate their work. The superstring hegemony did a good job of eliminating people who did not play the only game in town: I too had the opportunity to learn this.

Ironically, Lubos is also an outlaw, probably due to his overly aggressive blog behavior in the past. Perhaps Lubos does not see this as a personal problem since - according to his own words - he has decided not to publish anything without financial compensation, because doing so would make him a communist.

Concerning the Born rule I dare to hold a different opinion than Lubos. I need not be afraid of Lubos's insults, since Lubos, as a brahmin of science, refuses to comment on anything written by inferior human beings like me and even refuses to mention their names: maybe Lubos is afraid that doing so might somehow infect him with the thoughts of the casteless.


Without going into the details of quantum measurement theory, one can say that the Born rule is a bilinear expression in the initial and final states of the quantum mechanical transition amplitude. Bilinearity is certainly something deep, and I will return to it below. Certainly the Born rule gives the most natural expression for the transition amplitude: demonstrating this is of course not a derivation of it.

  1. One could invent formal expressions for the transition amplitude that are non-linear in the normalized initial and final states. One can however argue that acceptable expressions must be symmetric in the initial and final states.

  2. The condition that the transition amplitude conserves quantum numbers associated with symmetries strongly suggests that the transition amplitude is a function of the bilinear transition amplitude between initial and final states and of the norms of the initial and final states. The standard form for non-normalized states - the inner product divided by the product of the square roots of the norms - is indeed of this form. For instance, one could add exponentials of the norms of the initial and final states.

  3. Projective invariance of the transition amplitude - the independence of the transition probabilities from normalization - implies that only the standard transition amplitude multiplied by a function - say an exponential - of the modulus squared of the standard amplitude (the transition probability in the standard approach) remains to be considered.

  4. One could however still consider the possibility that the probabilities given by the Born rule are replaced by a function of them: pij → f(pij)pij. Unitarity poses strong constraints on f, and my guess is that f=1 is the only possibility.
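To make the unitarity constraint in point 4 concrete, here is a minimal numerical sketch (my own illustration, not from the text): for any unitary matrix the Born probabilities pij = |Uij|^2 in each row sum to one, but a modified assignment f(pij)pij with f different from 1 generically breaks this normalization.

```python
import math

# A 2x2 unitary (a real rotation) parametrized by theta.
def born_probs(theta):
    c, s = math.cos(theta), math.sin(theta)
    U = [[c, -s], [s, c]]                        # rows are orthonormal
    return [[x * x for x in row] for row in U]   # p_ij = |U_ij|^2

theta = 0.7
p = born_probs(theta)

# Born rule: each row of probabilities sums to 1 (up to rounding).
print([sum(row) for row in p])

# Modified rule p -> f(p)*p with f(p) = exp(p): the row sums differ
# from 1 for a generic theta, so unitarity excludes such an f.
q = [[math.exp(x) * x for x in row] for row in p]
print([sum(row) for row in q])
```

The demo uses an exponential f only as one example; any f that is not identically 1 fails in the same way for generic theta.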

Sidestep: To make this more concrete, the proponents of so-called weak measurement theory propose modifying the formula for the matrix element of an operator A to ⟨i|A|f⟩/⟨i|f⟩. The usual expression contains the product of the square roots of the norms instead of ⟨i|f⟩. This is complete nonsense, since for nearly orthogonal states the expression can become infinite, and for A=I, the unit matrix, it gives the same matrix element between any two states. For some mysterious reason the notion of weak measurement - to be sharply distinguished from interaction-free measurement - has ended up in Wikipedia, and popular journals comment on it enthusiastically as a new revolution in quantum theory.
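Both pathologies mentioned in the sidestep are easy to check numerically. A small sketch (my own illustration, with arbitrary example states):

```python
# Weak-value style expression <i|A|f> / <i|f> for 2-component states.
def inner(u, v):
    return sum(a.conjugate() * b for a, b in zip(u, v))

def weak_value(i, A, f):
    Af = [sum(A[r][c] * f[c] for c in range(2)) for r in range(2)]
    return inner(i, Af) / inner(i, f)

I2 = [[1, 0], [0, 1]]   # identity operator
A  = [[0, 1], [1, 0]]   # a nontrivial operator (sigma_x)

i1, f1 = [1, 0], [0.6, 0.8]
i2, f2 = [0.8, 0.6], [1, 0]

# For A = identity the expression equals 1 for ANY pair of states:
print(weak_value(i1, I2, f1), weak_value(i2, I2, f2))

# For nearly orthogonal i and f the expression blows up like 1/eps:
eps = 1e-8
print(abs(weak_value([1, 0], A, [eps, 1])))
```

For exactly orthogonal states the denominator vanishes and the expression is undefined, which is the point of the objection above.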

Consider now the situation in TGD framework.

  1. In the TGD framework the configuration space, the "World of Classical Worlds" (WCW), consisting of pairs of 3-surfaces at opposite boundaries of causal diamonds (CDs), is infinite-dimensional, and this sharply distinguishes TGD based quantum theory from wave mechanics. More technically, hyperfinite factors of type II (and possibly also III) replace the factors of type I in the mathematical formulation of the theory.

    Finite measurement resolution is unavoidable and is represented elegantly in terms of inclusions of hyperfinite factors. This means that a single ray of state space is replaced with an infinite-dimensional subspace whose states cannot be distinguished from each other within the given measurement resolution. The infinite-dimensional character of WCW makes the definition of the inner product for WCW spinor fields extremely delicate.

    Note that WCW spinor fields are formally classical at the WCW level, and state function reduction remains the only genuinely quantal aspect of TGD. At the space-time level one must perform a second quantization of the induced spinor fields to build WCW gamma matrices in terms of fermionic oscillator operators.

  2. WCW spinors are fermionic Fock states associated with a given 3-surface. There are good reasons to believe that the usual bilinear inner product, defined by integration of the spinor bilinear over space (with Euclidian signature), generalizes, but under extremely restrictive conditions. The spinor bilinear is replaced with the fermionic Fock space inner product, and this bilinear is integrated over WCW.

    The integration over WCW makes sense only if WCW allows a metric which is invariant under a maximal group of isometries - this fixes WCW, and thus physics, almost uniquely. To avoid divergences one must also assume that the Ricci scalar vanishes and empty-space Einstein equations hold true. The metric determinant is ill-defined and must be cancelled by the Gaussian determinant coming from the exponent of the vacuum functional, which is the exponent of Kähler action if the WCW metric is Kähler, as required by the geometrization of hermitian conjugation, a basic operation in quantum theory. One could however still consider the possibility that the probabilities given by the Born rule are replaced by functions of them, pij → f(pij)pij, but unitarity excludes this. Infinite-dimensionality alone is thus not quite enough: something more is needed unless one assumes unitarity.

  3. Zero Energy Ontology (ZEO) brings in the needed further input. In ZEO the transition amplitudes correspond to time-like entanglement coefficients of the positive and negative energy parts of zero energy states located at the opposite light-like boundaries of the causal diamond. The deep principle is that zero energy states code for the laws of physics as expressed by the S-matrix and its generalizations in ZEO.

    This implies that the transition amplitude is automatically bilinear with respect to the positive and negative energy parts of the zero energy state, which correspond to the initial and final states of positive energy ontology. The question "why just the Born rule?" disappears in ZEO.

That ZEO also gives a justification for the Born rule is nice, since it has produced solutions to many other fundamental problems of quantum theory. Consider only the basic problem of quantum measurement theory due to the determinism of the Schrödinger equation contra the non-determinism of state function reduction: Bohr's solution was to give up ontology entirely and take QM as a mere toolbox of calculational rules.

There is also the problem of the relationship between geometric time and experienced time, which ZEO allows one to solve, leading to a much more detailed view of what happens in state function reduction. The most profound consequences are at the level of consciousness theory, which is essentially a generalization of ordinary quantum measurement theory making the observer part of the system by introducing the notion of self. ZEO also makes physical theories testable: any quantum state can be created from vacuum in ZEO, whereas in standard positive energy ontology conservation laws make this impossible, so that in principle the theory cannot be tested without additional assumptions.

Thursday, July 17, 2014

Has the decay of dark photons to visible photons been observed at cosmological scales?

There is interesting news in New Scientist shedding new light on the puzzles of dark matter. It has been found that the Universe is too bright: there are too many high energy UV photons in the spectrum. Model calculations also suggest that this excess brightness has emerged recently and was not present in the early Universe. Intergalactic space contains more neutral hydrogen, and thus also more ionized hydrogen, than previously thought, and it was hoped that the ionized hydrogen could explain the excess brightness. It is now however clear that 5 times more ionized hydrogen would be required than theory allows if one accepts the experimental data.


The question is whether dark matter could explain the anomaly.

  1. The usual dark matter candidates have by definition extremely weak interactions - not only with ordinary matter but also with themselves. Therefore it is not easy to explain the finding in terms of ordinary dark matter. The idea of dark matter as a remnant from early cosmology does not fit naturally with the finding that the surplus UV radiation does not seem to be present in the early Universe.

  2. In TGD dark matter is ordinary matter with large heff=n×h and has just the ordinary interactions with itself but no direct interactions with visible matter. These interactions produce dark radiation with visible and UV energies but with probably much lower frequencies (from E=heff×f). The energy-preserving transformations of dark photons to ordinary photons are an obvious candidate for explaining the surplus UV light.

  3. These transitions are fundamental in the TGD inspired model of quantum biology. Biophotons are in the visible and UV range and are identified as decay products of dark photons in living matter. The fact that the surplus has appeared recently would conform with the idea that higher levels of the dark matter hierarchy have also appeared recently. Could the appearance of UV photons relate to the generation of dark matter responsible for the evolution of life? And could the surplus ionization of hydrogen also relate to this? Ionization is indeed one of the basic characteristics of living matter and makes possible charge separation (see this), which is also a crucial element of TGD inspired quantum biology (see this).
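The relation E = heff×f in point 2 can be illustrated with a quick order-of-magnitude computation (my own numerical sketch; the value of n is an assumption chosen only to land in the visible range):

```python
h = 6.626e-34        # Planck constant, J*s
eV = 1.602e-19       # J per eV

f_cyclotron = 10.0   # Hz, a representative low cyclotron frequency
n = 5.0e13           # assumed heff/h; illustrative value only

# Dark photon energy E = heff * f = n * h * f
E = n * h * f_cyclotron
print(E / eV)        # about 2 eV, i.e. visible red light
```

The point is only that a frequency of a few Hz, hopeless quantum-mechanically with ordinary h, reaches visible-light energies once heff/h is of order 10^13-10^14.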

Do electrons serve as nutrients?

The New Scientist article about bacteria using electrons as nutrients is very interesting reading, since the reported phenomenon might serve as a test of the TGD inspired idea of metabolism as a transfer of negentropic entanglement (NE) at the fundamental level (see this and this).

  1. NE is always between two systems: the nutrient and something, call it X. The proposal, inspired by a numerical coincidence, was that X could be what I have called Mother Gaia. X could also be something else, say the personal magnetic body. The starting point was the claim that the anomalously high mass of the electronic Cooper pair in a rotating superconductor (slightly larger than the sum of the electron masses!) could be due to a gravimagnetic effect, which is however too strong by a factor 10^28. This claim was made by a respected group of scientists. Since the effect is proportional to the gravimagnetic Thomson field, itself proportional to the square of Planck constant, the obvious TGD inspired explanation would be heff ≈ 10^14 (see this and this).

  2. The gravitational Planck constant hgr=GMm/v0, with v0 a typical velocity in a system consisting of masses M>>m and m, was introduced originally by Nottale. I proposed that it is a genuine Planck constant assignable to flux tubes mediating the gravitational interaction between M and m. In the recent case v0 could be the rotational velocity of Earth around its axis at the surface of Earth.

  3. For electrons, ions, molecules, ..., the value of hgr would be of the order of 10^14 required by the gravimagnetic anomaly and is also of the same order as the heff=n×h needed by the hypothesis that the cyclotron energies of these particles are universal (no mass dependence) and in the visible and UV range assigned to biophotons. Biophotons would result from dark photons via a phase transition. This leads to the hypothesis heff=hgr, unifying the two proposals for the hierarchy of Planck constants, at least in microscopic scales.


    Thanks to the Equivalence Principle, which implies that the gravitational Compton length does not depend on the particle's mass, Nottale's findings can be understood if the hgr hypothesis holds true only in microscopic scales. This would mean that gravitation in the planetary system is mediated by flux tubes attached to particles rather than to the entire planet, say. One non-trivial implication is that graviton radiation is dark, so that a single graviton carries much larger energy than in a GRT based theory. The decay of dark gravitons to ordinary gravitons would produce bunches of ordinary gravitons rather than a continuous stream: maybe this could serve as an experimental signature. Gravitational radiation from pulsars is just at the verge of detection if it is what GRT predicts. TGD would predict a pulsed character, and this might prevent its identification if the analysis is based on a GRT based belief system.

  4. In the recent case the model would say that the electrons serving as nutrients have this kind of negentropic entanglement with Mother Gaia. hgr=heff would be of order 10^8. Also in nutrients the electrons would be the negentropically entangled entities. If the model is correct, nutrient electrons would be dark and could also form Cooper pairs. This might serve as the eventual test.
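The formula hgr = GMm/v0 in point 2 can be evaluated directly (a hedged numerical sketch of my own; the choice v0 = 465 m/s, Earth's equatorial rotation speed, is one illustrative reading of "rotational velocity at the surface of Earth"):

```python
G  = 6.674e-11       # m^3 kg^-1 s^-2
M  = 5.972e24        # kg, Earth mass
m  = 9.109e-31       # kg, electron mass
h  = 6.626e-34       # J*s

# v0: Earth's equatorial rotation speed (illustrative choice).
v0 = 465.0           # m/s

h_gr = G * M * m / v0
print(h_gr / h)      # a huge effective Planck constant, ~1e15 here
```

Since hgr is linear in 1/v0, the orders of magnitude 10^14-10^15 quoted in the text correspond to modestly different choices of v0; the qualitative point - an enormous heff for the Earth-electron system - is insensitive to that choice.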

Electrons are certainly fundamental for living matter in TGD Universe.

  1. The cell membrane is assumed to be a high Tc electronic superconductor (see this). The members of Cooper pairs reside at flux tubes carrying opposite magnetic fields, so that the magnetic interaction energy produces a very large binding energy for the large values of heff involved: of the order of electron volts! This is also the TGD based general mechanism of high Tc superconductivity: it is now accepted that antiferromagnetism is crucial, and flux tubes carrying fluxes in opposite directions are indeed a very antiferromagnetic kind of thing.

  2. The Josephson energy, proportional to the membrane voltage (EJ=2eV), is just above the thermal energy at room temperature, meaning minimal metabolic costs.

  3. The electron's secondary p-adic time scale is 0.1 seconds, the fundamental biorhythm, which corresponds to the 10 Hz alpha resonance.
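The comparison in point 2 is simple arithmetic. A sketch with an assumed resting membrane potential of about 60 mV (a typical textbook value, not taken from the text):

```python
k_B = 1.381e-23      # Boltzmann constant, J/K
e   = 1.602e-19      # elementary charge, C

V = 0.06             # assumed resting membrane potential, ~60 mV
T = 310.0            # body temperature, K

E_J = 2 * e * V      # Josephson energy of a Cooper pair, E_J = 2eV
E_T = k_B * T        # thermal energy scale

print(E_J / e)       # 0.12 eV
print(E_T / e)       # ~0.027 eV
```

With these numbers EJ is a few times the thermal energy: above thermal noise, but not wastefully far above it, which is the "minimal metabolic cost" point.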

Wednesday, July 16, 2014

What is self?

The concept of self seems to be absolutely essential for understanding the macroscopic and macro-temporal aspects of consciousness and would be the counterpart of the observer in quantum measurement theory.

The original proposal was that self is a conscious entity.

  1. Self corresponds to a subsystem able to remain unentangled under the sequential informational 'time evolutions' U. Exactly vanishing entanglement is practically impossible in ordinary quantum mechanics, and it might be that 'vanishing entanglement' in the condition for self-property should be replaced with 'subcritical entanglement'. If space-time decomposes into p-adic and real regions, and if entanglement between regions representing physics in different number fields vanishes, space-time indeed decomposes into selves in a natural manner. Causal diamonds would form natural imbedding space correlates for selves, and their hierarchy would correspond to the self hierarchy.

  2. The intuitive idea, inspired by the formation of bound states of particles, was that self corresponds somehow to an integration of quantum jumps into a single coherent whole. Later I gave up this idea since it was difficult to understand how the integration could take place.

  3. The next suggestion was that quantum jumps as such correspond to selves. It was however difficult to assign a definite geometric time duration to selves identified in this manner. It is an empirical fact that this kind of duration can be assigned to mental images (identified as subselves).

  4. One could also introduce self as a subsystem which is only potentially conscious, and here the notion of negentropic entanglement suggests an obvious approach based on interaction-free measurement. Negentropy Maximization Principle (NMP) implies that the Universe is like a library with new books continually emerging on its shelves. This would explain evolution. One can however argue that negentropic entanglement - "Akashic records" - gives rise only to a self model rather than a self.

  5. The approach which seems the most convincing relies on the observation that in ZEO sequences of ordinary state function reductions leaving the state unchanged are replaced with sequences in which the part of the zero energy state associated with a fixed boundary of CD remains unchanged in state function reduction whereas the state at the other end of CD changes. This is something new and explains the arrow of time and its flow; self could be understood as a sequence of quantum jumps at a fixed boundary of CD (with the average location of the second boundary shifting towards the geometric future, as in dispersion). Amusingly, this is in accordance with the original proposal except that the state function reductions take place at the same boundary of CD.

    This view is extremely attractive since it implies that the act of free will, interpreted as a genuine state function reduction, must mean a reversal of the direction of geometric time at some level of the hierarchy of selves. The proposal has indeed been that sensory perception and motor action are time reversals of each other and that motor action involves sending negative energy signals to the geometric past.

For details and background see the chapter "Self and binding" of "TGD inspired theory of consciousness".

Saturday, July 12, 2014

Post-empirical science or an expansion of scope: which would you choose?

Bee has very interesting comments about the thoughts of Richard Dawid on what the assessment of a physical theory is. Dawid sees us making a transition to post-empiricism, in which criteria other than empirical facts increasingly serve to decide whether a physical theory is useful.

Post-empirical science is not an attractive vision of the future of science. For instance, the standard claim during the string theory hegemony has been that string theories are totally exceptional and that the usual criteria do not apply to them. Bee also comments on the notion of "usefulness", which has sociological aspects in the cruel academic world in which we have to live.


People participating in the discussion seem to agree that theory assessment has become increasingly difficult. Philosopher Richard Dawid suggests what I would call giving up.

Why has theory assessment become so difficult? Is this really true? Or could it be that some wrong belief in our scientific belief system has caused this?

Could it be that our idea about what a unified physical theory should be able to describe is badly wrong? When we speak about unification, we take naive length scale reductionism for granted. We want to believe that everything physical above the weak boson length scale is understood and that the next challenge is to jump directly to the Planck scale (itself a notion based on naive dimensional analysis, which could lead onto a totally wrong track concerning the ultimate nature of gravitation!).

In practice this means that we drop from the field of attention entire fields of natural science such as biology and neuroscience - to say nothing of consciousness (conveniently reduced to physics in the materialistic dogma). These fields provide a rich repertoire of what could be seen as anomalies of existing physical theory, provided we give up the dogma of length scale reductionism and see these anomalies as what they really are: phenomena about whose physical description or correlates we actually don't have the slightest clue. Admitting that we do not actually understand could be the way out of the blind alley.

This kind of expansion of the view about what a theory should explain might be extremely useful and open up new worlds for the theoretician to understand. Theory could no longer degenerate into the question of what happens at the Planck length scale, and it would have a huge number of observations to explain. What are the basic new principles needed? This would become the basic question. One candidate is obviously fractality, possibly replacing naive length scale reductionism. This would also bring in philosophy, but in a good sense rather than as an attempt to authorize a theory which has turned out to be incapable of saying anything interesting about the observed world.

Thursday, July 10, 2014

Does DNA understand speech or should you sing to it?

There is an interesting popular web article about the work of Peter Gariaev, with whom I have written a couple of articles. One of the findings of Gariaev's group is that the intronic portion of DNA has a statistical resemblance to the structure of language (words of language correspond to DNA codons, and Zipf's law appears to be obeyed). The question whether introns could code language at the molecular level comes to mind.

It is also reported that the connection with language is much more concrete: the words of spoken language generate a response at the level of DNA. DNA "hears" and maybe understands language (or is it we who understand language in this manner?). If one accepts that even water has memory and reacts to signals inducing emotions in living organisms, this would not be so surprising. In fact, in the TGD framework water would be a primitive life form with dark DNA consisting of protonic strings such that the proton states would be in 1-1 correspondence with DNAs, RNAs, amino acids and perhaps even tRNAs (see this). The vertebrate genetic code follows from natural assumptions about the correspondence between the dark counterparts of DNAs and amino acids.

So the claim is that spoken language modulating em radiation has an effect on DNA. In the standard physics context it is difficult to see how this could make sense: the energies of phonons at audible frequencies are simply so low that understanding the effect in terms of phonons does not seem possible. Could it make sense in TGD inspired quantum biology? One can at least try, and this is what is done in the sequel. The explanation relies on the basic assumptions of TGD inspired quantum biology distilled during the last 10 years.

  1. Dark matter corresponds to a hierarchy of phases labelled by the values of the effective Planck constant heff=n×h (see this). This hypothesis can be reduced to the failure of strict determinism for the basic variational principle of TGD and is consistent with the notion of gravitational Planck constant hgr=GMm/v0, where v0 is a characteristic velocity assignable to the two-particle system consisting of masses m and M (see this). This formula holds true at flux tubes mediating the gravitational interaction in terms of gravitonic "massless extremals" (MEs) topologically condensed at them.

    For elementary particles, ions, atoms, even biomolecules this formula is consistent with heff=hgr. The Equivalence Principle implies that the formula for hgr must be assumed only for them in order to explain the approximate Bohr orbitology for planetary orbits. For the Earth-charged-particle system the formula predicts a Planck constant for which dark cyclotron photon energies in endogenous magnetic fields are in the visible and UV range, where biophoton energies also lie. The gravitational Compton length does not depend on the mass of the particle - essential for macroscopic quantum coherence and consistent with the Equivalence Principle. For the Earth-Sun system the gravitational Compton length is of the order of the Earth radius, which suggests that at the dark matter level Earth is a macroscopic quantum system.

  2. This picture conforms with the hypothesis that biophotons are ordinary photons resulting from an heff changing phase transition (see this). Since the energy levels of biomolecules are in the visible and UV range, dark photons could control biochemistry via dark-to-bio-photon transitions. This would provide the missing interaction link between biochemistry and the magnetic body. The standard hypothesis is that biophotons are side products of biochemistry: in the TGD Universe biophotons would become active controllers of biochemistry, used by the magnetic body.

  3. Living matter as a random soup of biomolecules is replaced with a highly organized structure. Dark matter can be seen as a library of Akashic records realized in terms of negentropic entanglement (see this). Each dark particle, atom, molecule, etc. resides at its own magnetic flux tube characterized by heff=hgr. One can say that each book in the Akashic library resides neatly on its own bookshelf labelled by the value of the magnetic field strength and heff. Communication between the levels of the dark matter hierarchy (bookshelves) would take place via heff changing transitions of dark photons having a universal energy spectrum independent of the particle mass and depending on the strength of the magnetic field at the flux tube. Visible photons correspond to a single energy octave, which suggests a connection with music, discussed in an earlier posting (see this).
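The mass-independence of the gravitational Compton length claimed in point 1 follows directly from the formula, since lambda_gr = hgr/(mc) = GM/(v0 c). A hedged numerical sketch (the choice of v0 as Earth's orbital velocity is my illustrative assumption):

```python
G = 6.674e-11        # m^3 kg^-1 s^-2
c = 2.998e8          # m/s

def gravitational_compton_length(M, v0):
    # lambda_gr = h_gr / (m c) = G*M*m / (v0 * m * c) = G*M / (v0 * c):
    # the particle mass m cancels, as the text states.
    return G * M / (v0 * c)

# Earth-Sun system: M = solar mass, v0 = Earth's orbital velocity
# (an illustrative choice for v0).
lam = gravitational_compton_length(1.989e30, 2.98e4)
print(lam)           # ~1.5e7 m, a couple of Earth radii
```

With this v0 the result is about twice the Earth radius, consistent with the "of the order of the Earth radius" statement above; other reasonable v0 choices shift it by a small factor.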

In this framework it is not too difficult to understand how DNA could "hear" and maybe even "understand".
  1. DNA codons carry -2 units of em charge per nucleotide due to the presence of one diphosphate in the sugar backbone. The ratio Qtot/Mtot = 2N(tot)e/Mtot = 2e/M(ave), to which the cyclotron frequency is proportional, is inversely proportional to the average mass M(ave) of the unit of the DNA sequence. Hence DNA sequences are coded by cyclotron frequencies, and to "wake up" a given unit of DNA it is enough to irradiate it with dark photons at this cyclotron frequency. For long DNA sequences the cyclotron frequency becomes essentially constant if the DNAs obey a statistical distribution with a single Gaussian peak. One can consider the possibility that the distribution is many-peaked and fractal.

    This is not the only option that one can imagine. The cyclotron frequencies could also be assignable - not to DNA itself but - to charged particles at the flux tubes associated with the basic units of DNA.

  2. There are two ways to "wake up" DNA: frequency resonance at the level of dark matter and energy resonance at the level of visible matter. The first is a transformation of the acoustic signal to dark photons at cyclotron frequencies which are also cyclotron frequencies of DNA molecules. DNA units would be analogous to the frequency-specific hair cells in the cochlea; the TGD inspired model of hearing indeed assumes that the hair cells carry out this transformation. The second is to transform the dark photons first to biophotons carrying a transition energy of the DNA molecule, thus inducing the chemical transition. These dark photons could thus excite the DNAs either resonantly at cyclotron frequencies or at chemical transition energies after transformation to biophotons. This second mechanism breaks quantum coherence.

    If the excited DNAs correspond to genes or to a portion of DNA inducing gene expression, an acoustic signal (say speech) would be transformed to genetic expression and thus generate a physiological response. Introns could also generate em signals transformed to acoustic signals, giving eventually rise to internal speech. Here the cyclotron resonance mechanism could be at work. This mechanism respects quantum coherence.

  3. The "right brain sings - left brain talks" metaphor suggests an interpretation for these two mechanisms: for the singing right brain the cyclotron resonance of dark photons could dominate, while for the talking left brain the chemical excitation using biophotons could dominate.
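The cyclotron coding in point 1 can be made quantitative with f_c = qB/(2*pi*m) and q/m = 2e/M(ave). A sketch (the average nucleotide mass of ~330 Da and the field strength B_end = 0.2 Gauss are my assumed representative values):

```python
import math

e = 1.602e-19        # elementary charge, C
u = 1.661e-27        # atomic mass unit, kg
B = 2e-5             # T, i.e. the "endogenous" field 0.2 Gauss

# Average nucleotide mass ~330 Da (assumed representative value);
# each nucleotide carries charge 2e in the model above.
M_ave = 330 * u

# Cyclotron frequency f_c = q * B / (2 * pi * m) with q/m = 2e/M(ave)
f_c = 2 * e * B / (2 * math.pi * M_ave)
print(f_c)           # ~1.9 Hz: a low frequency coded by nucleotide mass
```

The frequency is inversely proportional to M(ave), so different base compositions are coded by slightly different cyclotron frequencies, which is the coding mechanism proposed above.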

Gariaev's experiments suggest that amplitude modulation of a light signal by an acoustic signal, say speech, is enough.
  1. A carrier wave with a single frequency, modulated by a single frequency, would consist of a superposition of signals with frequencies corresponding to the sum and difference of the two frequencies involved. These could naturally correspond to parallel space-time sheets (MEs) (but this is not necessary): a test particle touching both sheets indeed experiences the sum of the effects caused by the two signals. The naive expectation would be that these signals are detected as such. This would not however allow the proposed mechanism.

    Another possibility is that the resulting photons at either or both space-time sheets, having the frequency and energy of (say) visible photons, are transformed to dark photons with the frequency of a phonon in the frequency range of speech. This condition fixes the value of heff to be essentially the ratio of the visible and audible carrier frequencies, and it also fixes the value of the endogenous magnetic field strength from the condition that the cyclotron energy scale equals the energy of the visible photon. The MEs in question should be topologically condensed at the magnetic flux tubes.

  2. These dark photons transform to biophotons, inducing a response both at the level of biochemistry and at the level of DNA sub-units (talking and singing): if the heff in question is correct, the DNA sub-unit corresponding to flux tubes with the value of heff associated with the dark photons is excited and can induce protein translation or some other form of gene expression, so that the incoming signal finds expression.

  3. One can also consider acoustic signals transformed directly to dark photon electromagnetic signals propagating along flux tube-massless extremal pairs to DNA, since living matter consists of piezoelectrets performing these transformations. These would correspond to communication by "singing": singing could correspond basically to a frequency modulation induced by the modulation of the magnetic field strength ("whale's song"). The variation of the membrane voltage by waves and by nerve pulses induces a similar frequency modulation.

For details see the chapter "Quantum model for hearing" of "TGD and EEG" or the article "Pythagoras, music, sacred geometry, and genetic code".

Wednesday, July 09, 2014

TGD view about homeopathy, water memory, and evolution of immune system

The following is an attempt at a brief sketch of the TGD based model of water memory and homeopathy as it stands after the input from Pollack's findings and the heff=hgr=hem hypothesis.

Summary of the basic facts and overall view

A concise summary of the basic qualitative facts about homeopathy (see this) could be the following.

  1. The manufacture of homeopathic remedies consists of repeated dilution and agitation of a water sample containing the molecules causing the effect which the remedy is intended to heal. This paradoxical looking healing method is based on the "like cures like" rule. This rule brings to mind vaccination, which causes the immune system to develop resistance. The procedure seems to somehow store information about the presence of the molecules, and this information induces an immune response. Usually it is the organisms or molecules causing the disease that induce the immune response.

  2. The ultra-naive and simplistic objection of the skeptic is that the repeated dilution involved in the preparation of a homeopathic remedy implies that the density of the molecules is so small that they can have absolutely no effect. Despite the fact that we live in an information society, this is still the standard reaction of a typical skeptic.

  3. A lot of research has been done starting from the natural idea that the electromagnetic fields associated with the invader molecules (or more complex objects) represent the needed information and that water somehow gets imprinted by these fields. This could for instance mean that water clusters learn to reproduce radiation at frequencies characterizing the invader molecule. Benveniste is one of the most outstanding pioneers in the field. Benveniste et al (see this) even managed to record the VLF frequency fingerprint of some bio-active molecules in binary form, and playing the recording back yielded the same effect as the real bio-active molecule. Benveniste was labelled a fraud. The procedure used by the journal Nature to decide whether Benveniste was a swindler or not brings to mind the times of the inquisition. It tells a lot about the attitudes of skeptics that the magician Randi was a member of the jury!

  4. Benveniste's work has been continued, and recently HIV Nobelist Montagnier produced what might be regarded as remote replication of DNA using a method very similar to that used in the manufacture of homeopathic remedies (see this and this).

The general conclusion is that the em frequencies possibly providing a representation of the molecules are rather low - in the VLF region - so that frequencies assignable to molecular transitions are not in question. Cyclotron frequencies assignable to the molecules are the most natural candidates concerning physical interpretation. The corresponding photon energies, calculated from the E=hf formula of standard quantum mechanics, are extremely low, so that quantal effects do not seem to be possible in the framework of standard quantum theory.
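This last point is simple arithmetic. A minimal check with standard constants (the 10 kHz frequency is an illustrative VLF choice, the temperature is body temperature):

```python
h = 6.626e-34   # Planck constant, J*s
k = 1.381e-23   # Boltzmann constant, J/K

f_vlf = 1.0e4   # illustrative VLF frequency, 10 kHz
T = 310.0       # body temperature, K

E_photon = h * f_vlf    # photon energy from E = h*f
E_thermal = k * T       # thermal energy scale k*T

# the VLF photon energy is roughly nine orders of magnitude below k*T
print(E_photon / E_thermal)
```

With ordinary h the ratio comes out around 10^-9, which is why no quantal effect can survive thermal noise in the standard framework.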

My personal interest in water memory was sparked by the work of Cyril Smith, about which I learned in the CASYS 2001 conference years ago. What I learned was what might be called the scaling law of homeopathy (see this). Somehow low frequency radiation seems to be transformed to high frequency radiation, and fh/fl ≈ 2×10^11 seems to be the favored frequency ratio.

These two basic findings suggest what looks now like a rather obvious approach to homeopathy in the TGD framework. The basic physical objects are the magnetic bodies of the invader molecule and of the water molecule cluster, or whatever it is that mimics the invader molecule. The information about the magnetic body is represented by dark cyclotron radiation generated by the invader with frequency fl. This dark radiation is transformed to ordinary photons with frequency fh and energy heff×fl = h×fh, which is above thermal energy, most naturally in the range of bio-photon energies, so that the radiation can directly induce transitions of bio-molecules. The analogs of the EZs discovered by Pollack are obvious candidates for the "water molecule clusters".
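Combined with Smith's scaling law, the frequency and energy bookkeeping goes as follows (a sketch: the VLF input frequency is an illustrative choice, and heff/h is taken to be the favored ratio 2×10^11 quoted above):

```python
h = 6.626e-34    # Planck constant, J*s
eV = 1.602e-19   # joules per electron volt
ratio = 2.0e11   # heff/h, taken equal to the favored ratio fh/fl
f_l = 1.0e4      # illustrative VLF cyclotron frequency, 10 kHz

f_h = ratio * f_l        # heff*fl = h*fh implies fh = (heff/h)*fl
E_h = h * f_h / eV       # energy of the resulting ordinary photon, in eV

print(f_h, E_h)  # ~2e15 Hz and ~8 eV, i.e. in the UV end of the bio-photon range
```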

The following summarizes this overall picture in more detail.

Dark photon-bio-photon connection

The idea that bio-photons are decay products of dark photons emerged from the model of EEG (see this) in terms of dark photons with energies above thermal energy. The dark photons in question would be emitted as cyclotron radiation by various particles and molecules, perhaps even by macromolecules like DNA sequences. Also the cell membrane would emit dark photons with frequencies which in good approximation correspond to differences of cyclotron energies for a large value of heff=n×h (see this and this).

  1. Bio-photons, which have a spectrum in the visible and UV, would be decay products of dark cyclotron photons. If the heff of a particle is proportional to its mass, then the cyclotron energy spectrum is universal and does not depend on the mass of the particle at all. The original model of EEG achieved this by assuming that heff is proportional to the mass number of the atomic nucleus associated with the ion.

  2. The ideas about dark matter involve two threads: the heff=n×h thread motivated by biology, and the thread based on the notion of gravitational Planck constant, inspired by the observation that planetary orbits seem to obey Bohr rules. hgr = GMm/v0 is assigned to the pairs of gravimagnetic flux tubes and massless extremals making possible the propagation of dark gravitons. The realization was that the two threads can be combined into a single thread: by Equivalence Principle the hgr hypothesis is needed only for microscopic objects, and in this case heff=hgr makes sense and predicts that dark photon energies and dark particle Compton lengths do not depend on the particle, and that the bio-photon energy spectrum is universal and in the desired range, if one assumes that hgr is associated with the particle-Earth pair with v0 the rotational velocity at the surface of Earth. Even the heff=hem=hgr hypothesis makes sense. hem=hgr is also a very natural assumption for ATP synthase, which can be regarded as a molecular motor whose rotation velocity appears in the formula for hem.

  3. The prediction would be that any charged system connected to Earth by flux tubes generates cyclotron dark photons decaying to bio-photons. Bio-photons in turn induce transitions in biomolecules because the energy range is in visible and UV. Magnetic bodies can control biochemistry via resonant coupling with bio-photons.
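A small numerical sketch of the universality claim: with hgr = GMm/v0 the dark cyclotron energy E = hgr×fc = GMqB/(2π·v0) is independent of the particle mass, since fc = qB/(2πm). The parameter values below (Earth's surface rotation speed and a 0.2 gauss field) are illustrative assumptions, not values fixed by the text:

```python
import math

G = 6.674e-11        # gravitational constant, SI units
M_earth = 5.972e24   # Earth mass, kg
v0 = 465.0           # Earth's surface rotation speed, m/s (assumption)
B = 2.0e-5           # illustrative magnetic field, 0.2 gauss
e = 1.602e-19        # elementary charge, C

def dark_cyclotron_energy(m):
    h_gr = G * M_earth * m / v0       # gravitational Planck constant
    f_c = e * B / (2 * math.pi * m)   # cyclotron frequency
    return h_gr * f_c                 # the mass m cancels in the product

E_electron = dark_cyclotron_energy(9.109e-31)
E_proton = dark_cyclotron_energy(1.673e-27)
print(E_electron, E_proton)  # equal: the energy does not depend on the particle
```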

Molecular recognition mechanism as basic building brick of primitive immune system

The reconnection of U-shaped magnetic flux tubes emanating from a system makes possible a recognition mechanism involving, besides reconnection, also a resonant interaction via cyclotron radiation, which can also induce biochemical transitions if the heff=hgr hypothesis holds true.

  1. Molecules have U-shaped flux tube loops with fluxes going in opposite directions. This makes possible also super-conductivity with the members of a Cooper pair at parallel flux tubes carrying magnetic fluxes in opposite directions, since the magnetic fields now stabilize Cooper pairs rather than tending to destroy them.

  2. The flux loops associated with systems - call them A and B - can reconnect, and this leads to the formation of 2 parallel flux tubes connecting A and B. Stable reconnection suggests that the magnetic field strengths must be the same at the flux tube pairs associated with A and B. This implies the same cyclotron frequencies and a resonant interaction. This would define a molecular mechanism of recognition and of sensing the presence of invader molecules - even conscious directed attention might be involved.

  3. Systems with a magnetic body could be constantly varying the thicknesses of at least some of their flux tubes in order to reconnect with the magnetic body of a possible invader. This activity could be behind the evolution of the immune system.
The question is how the system or its sub-system could stabilize itself so that it would receive signals only from one kind of molecule specified by its cyclotron frequency spectrum.
  1. If the flux tubes carry monopole flux (this is possible in the TGD framework and requires that the flux tube cross section is a closed 2-surface), stabilization of the flux tube thickness stabilizes the magnetic field strength. How could the stabilization of the flux tube thickness have been achieved?

    Pollack's negatively charged EZs with dark protons at magnetic flux tubes giving rise to dark nuclei identifiable as dark proton sequences suggests an answer. Maybe the presence of dark proton sequences could stabilize the flux tube thickness. Dark proton sequences have also interpretation as dark DNA/RNA/amino-acid sequences (see this).

A further question is whether the magnetic body of the prebiotic cell identified as an EZ could use the information about the invader molecule to represent its magnetic body, either concretely or perhaps even symbolically, and regenerate the concrete representation when needed.
  1. The concrete representation could be in terms of dark proteins whose folding would represent the topology of the invader molecule, and the symbolic representation in terms of dark DNA transcribed to dark protein. If the dark protein has the same knotting topology, it could more easily attach to the invader molecule and make it harmless. Note that the invaders are naturally other dark DNAs and proteins, just as in living matter. The higher purpose behind this cold war would be the stimulation of mimicry - emulation in computer science terms - leading to the generation of cognitive representations and negentropic entanglement.

  2. Not only the representation of the 3-D magnetic body but also of its behavior is possible. In ZEO the representation of the dynamical evolution of the magnetic body becomes possible, since the basic objects are pairs of 3-surfaces at the future and past boundaries of a causal diamond. The challenge is to represent the topological time development of the magnetic body - its 2-braiding - first concretely by mimicking it, and then symbolically in terms of DNA coding for the proteins doing the mimicry. The obvious representation for the behavior of the magnetic body of the invader molecule would be in terms of folding and unfolding of the protein representing it.

  3. The question of how the symbolic representation could have emerged leads to a vision about how the genetic code emerged. The model for the living system as a topological quantum computer utilizing 2-braiding for string world sheets in 4-D space-time leads to the idea that 3-D coordinate grids formed by flux tubes are central for TQC: each node of the grid is characterized by 6 bits telling about the topology of the node concerning 2-braiding. Could the 6 bits of dark DNA code for the local topology of the invader molecule and the flux tube complex mimicking it?

  4. This raises the possibility that DNA strands - one for each coordinate line in, say, the z-direction - could code for the 2-braiding of the 3-D coordinate grid and in this manner code for the magnetic template of the invader molecule and also that of the biological body. Therefore the genetic code would code both for the basic building bricks of the biological body and for the 4-D magnetic body serving as a template for the development of the biological body.
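The counting behind the 6-bit proposal is worth making explicit: 6 bits per grid node give exactly as many alternatives as there are DNA codons, which is what makes the suggested coding numerically possible at all.

```python
bits_per_node = 6
node_topologies = 2 ** bits_per_node   # 2-braiding alternatives per grid node
codons = 4 ** 3                        # DNA codons: 4 letters, words of length 3

print(node_topologies, codons)  # 64 64
```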

One can imagine how the biochemical evolution after this stage might have taken place.
  1. At the next step the chemical representation of the genetic code would have emerged. Dark proteins learned to attach to real proteins, real proteins to other proteins and to DNA, and bio-catalysis became possible.

  2. The transformation of the ordinary photons emitted in the transitions of biomolecules to dark photons made possible the recognition of invader molecules using ordinary photons emitted in their molecular transitions.

  3. Magnetic bodies learned to control biochemical reactions by using dark cyclotron radiation transformed to bio-photons.

  4. Gradually dark and ordinary proteins developed a rich repertoire of functions relying on reconnection, communication by dark photons, and attachment to invader molecules. Proteins began to serve as building bricks and bio-catalysts, to promote the replication of DNA, to respond to stimuli, and to serve as receptors.

Possible mechanism of water memory and homeopathy

The general vision about prebiotic evolution described above suggests that the mechanisms of water memory and homeopathy are basically the same as those underlying the workings of the immune system.

  1. Exclusion zones could define primordial life forms with a genetic code. They are able to detect the presence of an invader molecule from its cyclotron frequency spectrum.

  2. Dark proteins can form concrete memory representations of the invader molecules in terms of dark proton sequences defining dark proteins. The folding of these dark proteins mimics the behavior of the magnetic bodies of the invaders. These dark proteins can attach to the magnetic body of the invader molecule to make it harmless. Even symbolic representations in terms of dark DNA, allowing transcription and translation to the concrete dark protein representation, could be involved. The procedure involved in the manufacture of a homeopathic remedy could be seen as a series of "environmental catastrophes" driving the evolution of dark primordial life by feeding in metabolic energy and generating new EZs, which mimic the invader molecules and the existing EZs mimicking them.

  3. In the organism the dark DNA representing the invader molecule would generate ordinary genes coding for ordinary proteins attaching to the invader molecules by the attachment of ordinary DNA nucleotides to them. The attachment would involve a heff reducing phase transition reducing the length of the connecting flux tube.

  4. Later the dark genetic code was transformed to the chemical genetic code as dark DNA strands were formed around dark double strands, and a large number of other biological functions emerged besides the immune response.

  5. The mechanical agitation in the manufacturing of homeopathic remedy generates exclusion zones and new primitive life forms by providing the needed energy. These in turn recognize and memorize invader molecules and their already existing representations as EZs.

For details see the article TGD view about homeopathy, water memory, and evolution of immune system.

Sunday, July 06, 2014

Could vacuum expectation value of Higgs have TGD counterpart?


Although it seems that the Higgs mechanism is replaced by p-adic thermodynamics in the TGD framework, one can ask whether the analog of the Higgs vacuum expectation somehow emerges. This is expected to be the case if one believes that the QFT limit of TGD makes sense (for CMAP representations about TGD see this).

  1. TGD allows Higgs like particles, and it would be very difficult to develop a convincing argument for their non-existence. The model for elementary particles as string like objects consisting of "ur-fermions" with the electro-weak quantum numbers of the electron and quark allows also scalar particles.

  2. The notion of Higgs vacuum expectation as a fundamental mechanism of particle massivation is disfavored by the following argument.


    1. Higgs would correspond to a CP2 vector field, most naturally behaving like an electroweak doublet in complex coordinates for CP2.

    2. This field should be covariantly constant but CP2 does not allow covariantly constant vector fields. This could conform with the view that p-adic thermodynamics describes particle massivation.

  3. One can consider a loophole to this argument. Could it happen that the Higgs field is covariantly constant only with respect to the induced spinor connection? For instance, at the string world sheets associated with fermions, or at the orbits of string ends? Or is there some other way for the Higgs vacuum expectation to creep in?
Quite recent modest progress in the understanding of the Kähler-Dirac equation (see this) suggests that the analog of the Higgs vacuum expectation might make sense and could code for the masses of the particles in space-time geometry, maybe in the geometry of the space-time sheets at which the particle is topologically condensed. Particle masses would follow from p-adic thermodynamics (see this and this).
  1. The Kähler-Dirac action for the modes of induced imbedding space spinor field reduces to a vanishing contribution in the interior of space-time surface and by weak form of electric magnetic duality (WFEMD) to a boundary term given by Chern-Simons action.

  2. For the light-like partonic orbits associated with wormhole contacts these terms must vanish by conservation of fermionic charges and one obtains just algebraic form of M4 Dirac action at the partonic 2-surface so that massless fermionic propagators are obtained. There is nothing Higgs like in propagators and this conforms nicely with the stringy version of twistor Grassmannian approach which applies in TGD.

  3. At the space-like 3-surfaces defining the ends of space-time at boundaries of CDs one obtains algebraic massless Dirac plus Chern-Simons term.

  4. Could this term play the role of Higgs vacuum expectation? Does this term give only a small additional contribution to the masses of particles, besides that given by p-adic thermodynamics, for wormhole contacts and for string like objects in the electro-weak or even Compton length scale?

  5. What happens at the ends of string world sheets? Here the Higgs term would correspond to the normal (time-like) component of the Kähler-Dirac gamma matrix whose CP2 part would give the needed non-tachyonic contribution to the mass. One would get the algebraic form of the M4 Dirac equation: pkγkΨ = ΓnΨ, where Γα = Tαkγk is the modified gamma matrix defined as a contraction of the canonical momentum currents for Kähler action with the imbedding space gamma matrices γk. Note that this Kähler-Dirac gamma matrix could be covariantly constant along the boundary of the string world sheet and perhaps even inside the string world sheet.

  6. Note that the Minkowskian part of Γn would give a tachyonic contribution, which might relate to the still poorly understood question of how the tachyonic ground states of super-conformal representations with half integer conformal weight emerge.

Saturday, July 05, 2014

W boson excess at LHC and ATLAS: explanations?

Tommaso Dorigo told about a diboson excess (W boson pairs) reported both by ATLAS and CMS in 7- and 8-TeV collision data. ATLAS measures the cross section to be 71.4 +/- 7.5 pb, which is higher than the theory prediction of 57.3 pb. "Two sigma excess" is the technical term. 2 sigma alone is not convincing, but the fact that both ATLAS and CMS have measured the excess forces one to consider the situation more seriously.
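The quoted "two sigma" can be checked directly from the numbers above:

```python
measured = 71.4      # ATLAS cross section, pb
uncertainty = 7.5    # pb
predicted = 57.3     # theory prediction, pb

n_sigma = (measured - predicted) / uncertainty
print(round(n_sigma, 2))  # 1.88, i.e. roughly a 2 sigma excess
```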

A rather simple SUSY model has been proposed to explain the deviations, and the mathematical measure allowing one to compare explanations favors the SUSY model over the standard model. The model involves the stop as the lightest squark, a neutralino and a chargino, and the bino as a spartner of the weak bosons. The bino is assumed to be the lightest supersymmetric particle. W pairs are produced in a decay chain initiated by the production of a stop pair. The stop then decays to a b quark and a chargino, and the chargino to a neutralino plus a W boson. The same happens for the antistop, so that one obtains a W pair. The model has as parameters M(stop) and M(neutralino). The model produces the values M(stop)=200 GeV for the stop mass and M(neu)=150 GeV for the neutralino mass, which are not in the already excluded region of the parameter space. Also the production of neutralino pairs is predicted and might serve as a test for the model.

This is certainly not the only model that one can imagine. Especially so in the TGD framework, which predicts a lot of new physics, part of which might already be visible.

  1. M89 hadron physics (see this) could explain the findings about heavy ion and proton heavy ion collisions at RHIC and LHC in terms of mesonlike states of M89 - string like objects for low energy M89 hadron physics - decaying to ordinary hadrons.

  2. TGD predicts a version of SUSY in which covariantly constant right-handed neutrinos generate the SUSY; this version is consistent with the separate conservation of B and L (see this and also this). p-Adic thermodynamics provides an elegant mechanism predicting the masses of spartners: the mass formula is the same as for the partners, but the p-adic length scale can be different. Unfortunately it is not possible to predict the p-adic length scale associated with the right-handed neutrino.

  3. TGD predicts also higher generations of gauge bosons, which could form effectively SU(3) octet whereas fermion generations would effectively form SU(3) triplets (see this).
The decays of the new particles predicted by these scenarios could generate an excess of W bosons. W bosons are favored over Z bosons for any production mechanism involving a decay chain in which a pair of new particles produces quark pairs whose members, U or D type quarks, decay to lighter D or U type quarks by the emission of a W boson. That the decay to lighter quarks can involve W boson emission but no Z emission is due to the basic properties of CKM mixing: there are no flavor changing neutral currents at tree level.

Friday, July 04, 2014

Pythagoras, music, sacred geometry, and genetic code

The idea that the 12-note scale could allow a mapping to a closed path covering all 12 vertices of the icosahedron without intersecting itself is attractive. Also the idea that the triangles defining the faces of the icosahedron could have an interpretation as 3-chords defining the notion of harmony for a given scale deserves study. The paths in question are known as Hamiltonian cycles, and there are 1024 of them. These paths can be classified topologically by the numbers of triangles containing 0, 1, or 2 edges belonging to the cycle representing the scale. Each topology corresponds to a particular notion of harmony, and there are several topological equivalence classes.

The vision about DNA and amino-acids as analogs of notes, with the music piece produced by the translation machinery from mRNA, is also attractive and induces the idea that the 20 amino-acids could somehow correspond to the 20 triangles of the icosahedron. The combination of this idea with the idea of mapping the 12-tone scale to a Hamiltonian cycle on the icosahedron leads to the question whether amino-acids could be assigned to a topological equivalence class of Hamiltonian cycles and whether the topological characteristics could correspond to physical properties of amino-acids. It turns out that the identification of the 3 basic polar amino-acids with triangles containing no edges of the scale path, the 7 polar and acidic polar amino-acids with those containing 2 edges of the scale path, and the 10 non-polar amino-acids with triangles containing 1 edge on the scale path would be consistent with the constraints on the Hamiltonian cycles. One could of course criticize the lumping of acidic polar and polar amino-acids into the same group.
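The consistency constraint can be checked by direct enumeration. The sketch below (standard golden-ratio coordinates for the icosahedron) finds one Hamiltonian cycle and classifies the 20 faces by how many of their edges lie on the cycle; for any such cycle the face counts (n0, n1, n2) must satisfy n0+n1+n2 = 20 and n1+2·n2 = 24, since each of the 12 cycle edges lies on exactly two faces. The partition (3, 10, 7) quoted above satisfies both constraints.

```python
import itertools

phi = (1 + 5 ** 0.5) / 2
# the 12 icosahedron vertices: cyclic permutations of (0, +-1, +-phi)
V = []
for a in (1.0, -1.0):
    for b in (phi, -phi):
        V += [(0.0, a, b), (a, b, 0.0), (b, 0.0, a)]
n = 12

def d2(p, q):
    return sum((x - y) ** 2 for x, y in zip(p, q))

# edges join vertices at the minimal squared distance, which is 4 here
adj = {i: {j for j in range(n) if j != i and abs(d2(V[i], V[j]) - 4) < 1e-9}
       for i in range(n)}

# the 20 triangular faces are the mutually adjacent vertex triples
faces = [t for t in itertools.combinations(range(n), 3)
         if t[1] in adj[t[0]] and t[2] in adj[t[0]] and t[2] in adj[t[1]]]

def find_cycle(path, visited):
    # depth-first search for one closed path through all 12 vertices
    v = path[-1]
    if len(path) == n:
        return path if path[0] in adj[v] else None
    for w in adj[v]:
        if w not in visited:
            found = find_cycle(path + [w], visited | {w})
            if found:
                return found
    return None

cycle = find_cycle([0], {0})
cycle_edges = {frozenset((cycle[i], cycle[(i + 1) % n])) for i in range(n)}

# classify each face by the number of its edges lying on the cycle (0, 1 or 2)
counts = [0, 0, 0]
for t in faces:
    on_cycle = sum(frozenset(e) in cycle_edges
                   for e in itertools.combinations(t, 2))
    counts[on_cycle] += 1

print(len(faces), counts)  # 20 faces; the counts sum to 20 and n1 + 2*n2 = 24
```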

The number of DNAs coding for a given amino-acid could also be seen as such a physical property. The model for dark nucleons leads to the vertebrate genetic code with the correct numbers of DNAs coding for the amino-acids. It is however far from clear how to interpret DNAs geometrically, and the problem of whether one could understand the genetic code geometrically remains open.

For details see the chapter Quantum model for hearing of "TGD and EEG" or the article Pythagoras, music, sacred geometry, and genetic code.

Wednesday, July 02, 2014

More precise view about high Tc superconductivity taking into account recent experimental results

I have developed the model of high Tc superconductivity during the last twenty years (see this). This model is especially relevant in TGD inspired quantum biology, where high Tc superconductivity is proposed to be in a key role. The basic new concepts are the magnetic body and the identification of dark matter as phases with effective Planck constant heff=n×h.

There are more recent results allowing one to formulate more precisely the idea about the transition to high Tc super-conductivity as a percolation type phenomenon. Let us first summarize very briefly the relevant aspects of high Tc superconductors.

  1. A 2-dimensional phenomenon is in question: the supra current flows along preferred lattice planes, and the superconductivity is of type II. The proper sizes of Cooper pairs (coherence lengths) are ξ = 1-3 nm. The magnetic penetration length λ is longer than ξ/√2.

  2. The mechanism for the formation of Cooper pairs is the same water bed effect as in the case of ordinary superconductivity. Phonons are only replaced with spin-density waves for electrons, with a periodicity in general not that of the underlying lattice. Spin density waves relate closely to the underlying antiferromagnetic order and appear near the phase transition to antiferromagnetism.

  3. The relative orbital angular momentum of the Cooper pair is L=2 (x2-y2 wave), and the pair wave function vanishes at the origin, unlike for ordinary s wave superconductors. The spin of the Cooper pair vanishes.
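As a quick sanity check of the quoted type II criterion λ > ξ/√2: with coherence lengths of 1-3 nm and penetration depths of the order of 100 nm (the λ value below is an illustrative assumption, not a number from the text) the condition is satisfied by a wide margin.

```python
import math

xi = 2.0e-9      # coherence length, m: within the quoted 1-3 nm range
lam = 2.0e-7     # penetration depth, m: a typical cuprate scale (assumption)

is_type_II = lam > xi / math.sqrt(2)
print(is_type_II)  # True: well inside the type II regime
```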
Consider now the translation of this picture to TGD language. The basic notions are the following.
  1. Magnetic flux tubes and possibly also dark electrons forming Cooper pairs.

  2. The appearance of spin waves means sequences of electrons with opposite spins. The magnetic field associated with them can form a closed flux tube containing both spins. Assume that the spins are orthogonal to the lattice plane in which the supra current flows. Assume also that the flux tube associated with an electron with a given spin branches so that it is shared with both neighboring electrons.

  3. Electrons of opposite spins at the two portions of the closed flux tube have magnetic interaction energy. The total energy is minimal when the spins are in opposite directions. Thus the closed flux tube tends to favor formation of Cooper pairs.

  4. Since the magnetic interaction energy is proportional to heff=n×h, it is expected to stabilize the Cooper pairs at high temperatures. For ordinary super-conductivity magnetic fields tend to de-stabilize the pairs by trying to force the spins of the spin singlet pair to the same direction.

  5. This does not yet give super-conductivity. The closed flux tubes associated with paired spins can however reconnect so that longer closed flux tubes are formed. If this occurs for entire sequences, one obtains two flux tubes containing electrons with opposite spins forming Cooper pairs: this would be the "highway", and the proposed percolation would correspond to this process. The pairs would form supra currents in longer scales.

  6. The phase transition generating the reconnections could be a percolation type phase transition.

This picture might apply also in TGD based model of bio-superconductivity.
  1. The stability of dark Cooper pairs assumed to reside at magnetic flux tubes is a problem also now. Fermi statistics favors opposite spins, but this means that the magnetic field tends to split the pairs if the members of the pair are at the same flux tube.

  2. If the members of the pair are at different flux tubes, the situation changes. One can have L=1 and S=1 with parallel spins (ferromagnetism-like situation, fluxes in the same direction) or an L=2 and S=0 state (anti-ferromagnetism-like situation with opposite fluxes). L>0 is necessary since the electrons must reside at separate flux tubes.

  3. Note that the phase transition liberates energy if the Cooper pairs remain at rest. The energy liberated would be rather large if the value of heff is so large that EEG photon energies are in the range of bio-photon energies. The binding energy of the Cooper pairs would be in the eV range too, as would the interaction energy of spin 1 Cooper pairs in the magnetic field. Also a spontaneous magnetization of the magnetic body with a liberation of a large energy can be considered, and the claimed spontaneous acceleration of rotating magnetic systems could rely on this mechanism. Even ATP synthase acting as a motor could get its angular momentum and rotational energy via this kind of process. The rotation of the magnetic system could also drive charged particles to the magnetic body by centrifugal acceleration.

To conclude, the notions of magnetic flux tube and dark matter in the TGD sense allow one to understand high Tc superconductivity.