### Does the flow of subjective time correspond to the increase of the average value of effective Planck constant?

I like answering questions. It gives a lot of meaning to the life of a theoretician who is not allowed to enjoy the pleasures of academic existence. A career builder would of course argue that writing similar answers again and again is a waste of time: I should instead be building social networks to important people. This activity however allows me to make important observations and little discoveries. This time I answered questions relating to the non-determinism of Kähler action. How does this non-determinism relate to quantum non-determinism? How does the non-determinism in elementary particle scales relate to that in biology?

The unexpected fruit was a little discovery: the mechanism generating the arrow of geometric time in zero energy ontology might rely in a crucial manner on a sequence of phase transitions increasing the value of Planck constant h_{eff}/h=n and hence the size of the causal diamond (CD) characterized by the quantum average temporal distance between its tips. Since one boundary of the CD is fixed, the other moves to the future in an average sense: hence the flow of experienced time and its arrow. Conscious entities become more intelligent as they age! It also became clear that large h_{eff}/h characterizes a macroscopically quantum coherent many-particle system rather than a single particle. This leads to a view in which intelligent consciousness, involving the experience of the flow of time, emerges as the complexity of the system, measured by the number of fundamental particles, increases.

**1. The non-determinism of Kähler action and quantum non-determinism**

The first question was about the relationship between the non-determinism of preferred extremals and quantum non-determinism. As a matter of fact, I prefer the phrase "partial failure of determinism for Kähler action" to "non-determinism of Kähler action".

A possible interpretation would be as a correlate for quantum non-determinism. A second interpretation would be in terms of quantum criticality implying non-determinism. I do not know whether the two interpretations are actually equivalent.

I certainly do not believe that one could get rid of quantum non-determinism, and there is no need for it. The generalisation of quantum-classical correspondence is however natural in ZEO, where the basic objects are 4-D surfaces: classical time evolutions serving as space-time correlates for quantal evolutions.

The origin of the non-determinism is the following. Kähler action has a huge vacuum degeneracy. For instance, the Kähler action vanishes for space-time surfaces which are maps from M^{4} to an at most 2-D Lagrangian manifold of CP_{2}, having by definition a vanishing induced Kähler form (configuration space and momentum space are Lagrangian manifolds in the context of classical mechanics). These vacuum extremals define an analog of the gauge degeneracy of the Maxwell action. For non-vacuum extremals the degeneracy is expected to be lifted at least partially. Hence 4-dimensional spin glass degeneracy is a more appropriate analogy. One could say that classical gravitation breaks the analog of gauge invariance for non-vacuum extremals.

For CP_{2} type vacuum extremals one also has non-determinism, which corresponds directly to Virasoro conditions expressing the light-likeness of the 1-D M^{4} projection of the CP_{2} type vacuum extremal. In this case the induced Kähler form does not vanish.

Zero energy ontology (ZEO) and causal diamond (CD) are essential notions for the interpretation, but I will not try to explain them here and leave this as an exercise for the reader. The ends of a vacuum extremal at the light-like boundaries of CD are connected by an infinite number of vacuum extremals. One expects that some vacuum degeneracy is present also for non-vacuum extremals. Part of this degeneracy must be analogous to gauge degeneracy since, by the strong form of general coordinate invariance (GCI) implying the strong form of holography, only the partonic 2-surfaces and their 4-D tangent space data fix the physics: the WCW metric depends only on this data. Hence the interiors of 3-surfaces carry very little information about quantum states.

**2. Identification of gauge degeneracy as hierarchy of broken conformal gauge invariances**

The conjecture is that conformal symmetries acting as partially broken gauge symmetries realize this vision. TGD allows several kinds of conformal symmetries, a huge generalisation of the string model conformal symmetries (including Kac-Moody), but I will not go into this here. Suffice it to say that the generalization of conformal symmetries means the replacement of AdS/CFT correspondence with a correspondence which looks intuitively much more realistic (see this).

Classical conformal charges would vanish for a sub-algebra for which the conformal weights are multiples of some integer n, n=1,2,…. These conditions would give the long-sought-for precise content to the notion of preferred extremal. They would be the classical counterparts of the corresponding quantum conditions and define a Bohr orbitology. This hierarchy would correspond to the hierarchy of Planck constants h_{eff}= n× h and to the hierarchy of dark matters. There would be an infinite number of hierarchies (1, n_{1}, n_{2}, …, n_{i},…) such that n_{i} divides n_{i+1}. They would correspond to the hierarchies of inclusions of hyper-finite factors of type II_{1} (HFFs). The included algebra defines the measurement resolution, which would thus be realized as conformal gauge symmetries. Evolution would correspond to a sequence of symmetry breakings: this is not a new idea but emerges naturally if n serves as a quantum "IQ".
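The divisibility structure of these hierarchies is concrete enough to state in code. A minimal sketch (the function name is my own, purely illustrative):

```python
def is_hierarchy(chain):
    """Check that each n_i divides n_{i+1} along the chain (1, n_1, n_2, ...)."""
    return all(b % a == 0 for a, b in zip(chain, chain[1:]))

print(is_hierarchy([1, 2, 6, 12, 60]))   # True: each member divides the next
print(is_hierarchy([1, 2, 5]))           # False: 2 does not divide 5
```

Each valid chain would label a sequence of inclusions of HFFs, with the included algebra acting as conformal gauge symmetries defining the measurement resolution.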

The proposal is that there is a finite number n=h_{eff}/h of conformal equivalence classes of four-surfaces with fixed 3-D ends at the opposite boundaries of CD, so that the non-determinism after gauge fixing would be finite and would correspond to the hierarchy of Planck constants and to the hierarchy of conformal symmetry breakings defined by the hierarchy of sub-algebras of various conformal algebras with weights coming as integer multiples of an integer n=1,2,…. These n surfaces would be analogous to Gribov copies for gauge conditions in non-Abelian gauge theories.

**3. The non-determinisms of particle physics and biology**

There was also a question about the non-determinism of particle physics contra that of biology, where it manifests itself as partial free will.

**3.1. NMP**

Before continuing it is good to make clear that a new principle is involved: Negentropy Maximization Principle (NMP). Also a new kind of entanglement entropy, based on the p-adic norm, is involved. This entanglement entropy is negative, unlike ordinary entanglement entropy, and characterizes a two-particle system rather than a single particle system. By consistency with quantum measurement theory it corresponds to identical entanglement probabilities p_{i}=1/n. This entanglement is assumed to be associated with (at least) the n-sheeted coverings defined by the space-time surfaces in the n conformal equivalence classes associated with n=h_{eff}/h and connecting the same 3-surfaces at the ends of the space-time surface. Two systems of this kind can entangle negentropically. The unitary entanglement matrix associated with quantum computation gives rise to negentropic entanglement. Also n-partite negentropic entanglement makes sense.

**3.2. What could be common for particle physics and biology?**

Basically the non-determinism of particle physics and that of biology could be essentially the same thing, but for living matter, whose behavior is dictated by dark matter, the value of h_{eff}/h=n would be large and make possible macroscopic quantum coherence in spatio-temporal scales which are longer by a factor n. Note that n could characterize a macroscopic quantum phase rather than a single particle system: this distinction is important, as will be found.

The hierarchy of CDs brings in an additional spatio-temporal scale identified as the secondary p-adic scale characterising the minimal size of CD (that for n=1). This size scales like h_{eff}/h=n, and one can think of a superposition of CDs with different values of n such that the average value of n, measuring the age of self, increases during the sequence of quantum jumps. Since n is a kind of IQ, NMP says that conscious entities should become wiser as they get older: maybe this is too optimistic a hypothesis in the case of humankind, but maybe electrons are different!;-) I swear that this interpretation is not due to the fact that I have passed the magic threshold of 60 years, when one begins to feel that ageing means growing wisdom;-). I must confess that the interpretation of experienced time flow in terms of increasing h_{eff}/h characterizing the CD scaling has not come to my mind earlier. One could even consider the possibility that there is no superposition - just a sequence of phase transitions increasing h_{eff}/h (in an average sense), a kind of spiritual growth even at the level of elementary particles.
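The statement that only the average value of n grows can be mimicked by a toy random process (entirely my own illustration, not a derivation from the theory): individual steps may scale the CD down, but a bias toward scalings up produces a drifting average, analogous to diffusion.

```python
import random

def simulate_cd_scalings(steps, seed=0):
    """Toy model: in each 'quantum jump' the scale n = h_eff/h of the CD is
    multiplied or divided by 2, with a bias toward increase, so that n (and
    hence the CD size and geometric time span) grows only in an average sense."""
    rng = random.Random(seed)
    n = 1.0
    history = [n]
    for _ in range(steps):
        factor = 2.0 if rng.random() < 0.6 else 0.5  # biased multiplicative step
        n = max(1.0, n * factor)                      # n stays >= 1
        history.append(n)
    return history

h = simulate_cd_scalings(1000)
print(h[-1] > h[0])  # the scale drifts upward over many jumps
```

The 0.6/0.4 bias is an arbitrary choice; any bias toward growth reproduces the qualitative point that ageing is statistical, with occasional reductions of the CD size allowed.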

For instance, for the electron, characterised by the Mersenne prime M_{127}=2^{127}-1, the minimal CD time scale is .1 seconds (note that it defines a fundamental biorhythm of 10 Hz) and thus macrotemporal. The corresponding size scale is of the order of the Earth's circumference. This size scale could characterize quite generally the magnetic body of the elementary particle or the magnetic body at which a macroscopic quantum phase of particles resides. In both cases there would be a direct connection between elementary particle physics and macroscopic physics becoming manifest in living matter via the alpha rhythm for instance. Only the interpretation in terms of a macroscopic quantum phase seems to make sense.
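The .1 second claim can be checked at the order-of-magnitude level (my own back-of-envelope estimate; I assume the secondary p-adic time scale is roughly sqrt(p) times the electron's Compton time, and ignore numerical factors of order 2π that depend on conventions):

```python
# Back-of-envelope check of the .1 second scale (assumed conventions;
# numerical factors of order 2*pi are ignored here).
hbar = 1.0545718e-34       # J*s
m_e = 9.1093837e-31        # electron mass, kg
c = 2.99792458e8           # m/s

compton_time = hbar / (m_e * c**2)        # electron Compton time, ~1.3e-21 s
p = 2**127 - 1                             # Mersenne prime M_127
secondary_time = compton_time * p**0.5     # ~1e-2 s, same order as .1 s

light_distance = c * 0.1                   # ~3e7 m, of the order of the
print(secondary_time, light_distance)      # Earth's circumference (~4e7 m)
```

The estimate lands within an order of magnitude of .1 s, and c × .1 s is indeed comparable to the Earth's circumference, consistent with the size scale quoted above.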

**3.3. What distinguishes between particle physics and biology?**

There are essential differences between elementary particle physics and biology. The first difference comes from quantum measurement theory in ZEO.

- The repeated state function reduction does nothing to the state in standard ontology. In TGD the state is invariant only at the boundary at which the reduction occurs. At the second boundary of CD the average value of n increases. This gives rise to the experienced flow of geometric time and to the arrow of time. Self exists as long as reductions take place on the same boundary of CD and dies as the first reduction to the opposite boundary is forced by NMP.

- In the particle physics context one expects that the duration of self, identified as a sequence of state function reductions at the same boundary of CD, is much shorter than in living matter. Otherwise one would have too strong a breaking of reversibility in elementary particle time scales.

A phase transition increasing h_{eff}/h so that particles darken gradually should have been observed long ago, since reaction rates are independent of the Planck constant only in the lowest order in h_{eff}, that is in the classical approximation. The attempt to circumvent this objection leads to two crucial questions.

- Does h_{eff} characterize an elementary particle (or fundamental fermion), or the magnetic/field body of a physical system, which could also be a many-particle system?

If h_{eff}/h=n corresponds to an n-sheeted covering which becomes singular at the ends of the space-time surface so that the sheets co-incide at the partonic 2-surfaces representing particles, it seems that large h_{eff} is a phenomenon assignable to the field/magnetic body inside CD rather than to the particle identified as a partonic 2-surface or 3-surface at the end of CD. If so, large h_{eff} effects would relate to the dynamics associated with the magnetic/field bodies carrying dark matter.

- Is darkness a single particle phenomenon or a many-particle phenomenon? For the latter option elementary particle physics would not pose any challenge, so it looks like the reasonable option. Note that negentropic entanglement requires at least one pair of (say) electrons and suggests a macroscopic quantum phase - say high-T_{c} super-conductivity or super-fluidity.

The idea about the evolution of many-electron systems at the dark magnetic body generating an increasing value of h_{eff} makes sense, and would conform with the observation that the electron's secondary p-adic time scale defines a fundamental bio-rhythm. Dark magnetic bodies carrying dark particles are indeed in a key role in TGD inspired quantum biology. Bose-Einstein condensates and spontaneously magnetized dark phases at magnetic bodies would conform with the idea that dark matter is a many-particle phenomenon.

Large h_{eff} would not be seen in elementary particle physics. This challenges the idea that sparticles in TGD SUSY might have the same p-adic mass scale as particles but be more stable in the dark phase (this would be due to the scaling up of the size of CD) (see this). Note however that in TGD already elementary particles are many-fermion systems, so that it might be possible to circumvent this objection.

- The original formulation of darkness was at the single particle level so that h_{eff} characterizes elementary particles rather than many-particle systems. In elementary particle reactions the particles in the same vertex would always have the same value of h_{eff}/h. It was assumed that h_{eff} can change only in a 2-vertex analogous to a mass insertion vertex.

The previous arguments suggest that darkness makes sense only for many-particle systems, so that the mass insertion vertex becomes a phase transition. These phase transitions would occur routinely in living matter, but as phase transitions involving a large number of particles. For instance, bio-photons would result from dark photons in this manner. This picture seems to make sense at least at the level of many-particle systems but not necessarily for Feynman graphs.

This many-particle aspect would explain at a very general level why the search for dark particles has been fruitless.

This time scale could give an idea about the geometric duration of the elementary particle self (the growth of the temporal distance between the tips of CD during the sequence of reductions, or equivalently the increase of n). If this picture really makes sense, elementary particles would get more and more intelligent in the TGD Universe, and stable elementary particles like the electron would be real sages! Could this relate to the fact that the minimal CD size for the electron defines the fundamental biorhythm of 10 Hz? Strangely, I find it easier to regard the electron as an intelligent creature than my working desk or a typical academic decision maker. For holographists it should also be relatively easy to think that electrons could serve as conscious holograms.

**3.4. Could one regard elementary particle as a conscious entity?**

The previous considerations support the view that it is the macroscopic quantum phases of particles at magnetic flux tubes which can be seen as conscious and intelligent evolving entities experiencing the flow of time. In the case of a single elementary particle the previous arguments would suggest that only a single state function reduction occurs at a given boundary of CD, so that the lifetime of the elementary particle self would have zero duration! This is in accordance with the absence of the arrow of time at the elementary particle level. Strictly speaking this does not exclude consciousness, but it excludes intelligence and the experience of time flow.

Could systems with a small particle number already be conscious entities and develop - not necessarily large - h_{eff}/h>1? Hadrons consist of quarks, and I have considered the possibility that valence quarks and gluons at the color magnetic body are dark. Also nuclei as many-nucleon systems could be dark. In TGD even elementary particles consist of fundamental fermions, so that one can ask whether elementary particles possess some elementary aspects of consciousness identified as the possibility of a non-vanishing "biological" life-time. This kind of picture would conform with the idea about consciousness as something emerging as the complexity of the system increases.

The average lifetime of an elementary particle as a conscious entity cannot be longer than the lifetime of the particle in the sense of particle physics. In the case of the electron, having an infinite lifetime as an elementary particle, the "biological" lifetime must be finite since otherwise the irreversibility would manifest itself as a breaking of time reversal invariance in the electron scale. The time scale of CD characterising the dimensions of the magnetic body of the elementary particle gives a first order of magnitude estimate for the lifetime of the elementary particle self. The "biological death" of the electron means a state function reduction in the sense of ordinary quantum measurement theory, implying for instance a localization of the electron or giving an eigenstate of spin in a given quantization direction, and these quantum jumps, meaning re-incarnations of the electron, certainly occur.

One expects that the increase Δn of n during the sequence of reductions is by NMP rather small for single-particle systems.

**3.5. Could thermodynamical breaking of T symmetry relate to the CP/T breaking in particles physics?**

Could the "thermodynamical" breaking of time reflection symmetry (T) correspond to the breaking of T as observed for elementary particles such as the neutral kaon? I think that most colleagues tend to be skeptical about this kind of identification, and so do I.

The point is that the particle physicist's T breaking could be purely geometric, whereas the thermodynamical breaking of T involves the notion of subjective time, state function reduction, and consciousness. One could however ask whether the particle physicist's T could serve as a space-time correlate for the thermodynamicist's T, and whether systems exhibiting CP breaking could be seen as conscious entities in a very primitive sense of the word (n_{f}/n_{i}>1 but small). An important point is that the time evolution for CDs corresponds to scalings, so that the usual exponential decay laws are replaced with their hyperbolic variants. Hyperbolic decay laws become an important signature of consciousness. For instance, bio-photon intensity decays in a hyperbolic manner.
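The distinction between the two decay laws is easy to make concrete. A sketch under my own parametrization: if the natural time variable is the logarithm of time, an exponential decay in log-time becomes a hyperbolic (power-law) decay in ordinary time, with a far longer tail:

```python
import math

def exponential_decay(t, tau):
    """Ordinary exponential decay law."""
    return math.exp(-t / tau)

def hyperbolic_decay(t, tau):
    """exp(-log(1 + t/tau)) = 1/(1 + t/tau): exponential in logarithmic time."""
    return 1.0 / (1.0 + t / tau)

tau = 1.0
for t in [1.0, 10.0, 100.0]:
    print(t, exponential_decay(t, tau), hyperbolic_decay(t, tau))
# At t = 100*tau the exponential has fallen to ~4e-44 while the
# hyperbolic law is still ~1e-2: a long-tailed signature.
```

The particular 1/(1 + t/τ) form is only one representative; any exponential law in log-time gives a power law in ordinary time, which is the observable signature claimed for bio-photon intensity.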

The mean lifetimes of the long-lived and short-lived neutral kaons are τ_{L}= 1.2 × 10^{-8} seconds and τ_{S}= 8.9× 10^{-11} seconds: the ratio of the time scales is roughly 2^{7}. This does not conform with the naivest guess that the size of CD gives an estimate for the duration of the elementary particle self (increase of the temporal distance between the tips of CD): the estimate would be τ_{L}= 10^{-7} seconds from the fact that the mass of the neutral kaon is roughly 10^{3} times the electron mass. This is not too far from the lifetime of K^{0}_{L} but is about 2^{7} times longer than the lifetime of the short-lived kaon. Why would K_{S} be so short-lived? Could the lifetime be dictated at the quark level? The longer time scale could be assigned as the secondary p-adic time scale with the p-adic prime p≈ 2^{k}, k=104, characterising the b quark. Could the short lifetime be understood in terms of loops involving heavier quarks with shorter lifetimes as conscious entities: they indeed appear in the description of CP/T breaking?
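The quoted numbers can be verified by simple arithmetic (the lifetimes are the standard values; the τ_L estimate assumes, as my reading of the text, that the CD time scale scales like p ∝ 1/m², starting from the electron's .1 s):

```python
import math

tau_L = 1.2e-8     # mean lifetime of long-lived neutral kaon K0_L (s)
tau_S = 8.9e-11    # mean lifetime of short-lived neutral kaon K0_S (s)

ratio = tau_L / tau_S
print(ratio, math.log2(ratio))   # ~135, i.e. roughly 2^7 = 128

# The tau_L ~ 1e-7 s estimate: CD time scale ~ p ~ 1/m^2, scaled from
# the electron's .1 s using the kaon/electron mass ratio (~10^3).
m_K = 497.611      # neutral kaon mass (MeV)
m_e = 0.511        # electron mass (MeV)
estimate = 0.1 * (m_e / m_K)**2
print(estimate)    # ~1.1e-7 s, close to tau_L
```

The 2^7 ≈ 128 mismatch for K_S comes out directly; it is this factor that the quark-level argument in the text tries to explain.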

For background see the chapter About nature of time. See also the article Does the flow of time correspond to the increase of the effective Planck constant?.

## 10 Comments:

I thought the CD was made of entanglement, not of perturbation.

If time is created as some clockwork of quantum jumps (phase shifts), then should also thermodynamics, as a result of that clock, be linked to the growth of CD?

The clock defining experienced time - or correlating subjective time and geometric time defined by the temporal distance between the tips of CD - reduces to phase transitions increasing the value of h_eff/h=n. These are just macroscopic variants of quantum jumps for a single particle.

Even elementary particles are in TGD many-fermion systems, and one can consider the possibility that they enjoy a few moments of time flow before they reincarnate by a quantum jump changing their arrow of time.

Consciousness with experience of time flow and with self emerges gradually as the complexity of systems increases: as such a familiar and even trivial looking statement, but now on a firm basis.

The growth of CD proceeds by scalings. This implies an important prediction: exponential decay rates are replaced with hyperbolic ones, since the natural time variable becomes the logarithm of the usual time. This becomes a signature of living systems.

Biophoton intensity provides an important example of a hyperbolic decay law and can be understood if biophotons are produced from dark photons by a phase transition.

This is talking of time as a ZEO :) and weak measurements.

Why is a infinity of superposition always collapsed to the same on/off 50-50% states? We see here 90-10% prediction of collapses by going backwards in time (negative energy measurements) as history, memory, also the possibility to make a measurement in the future and change the past. This you have talked of.

If this CD is made of scaled-up 'collapses' or quantum jumps, how can then this cause - effect happen? You cannot 'collapse' back the quantum jumps. Not change the past.

This text says the collapsed states are irreversibly changed. Only by using a quantum state of time this change of past is possible.

Also, what is the SIZE of the quantum state? Is it bigger than after the jump, or smaller? What are the boundaries of 'virtual' particles or dark matter particles? How is the feedback done?

http://theory-of-thought.com/blog/can-symbols-produce-time-dilation-within-mindspace/

This is a brilliant insight, about the logarithm of time

http://arxiv.org/pdf/1408.4632v2.pdf

Four-dimensional quaternion-Kähler metrics with an isometry are determined by solutions to the SU(∞) Toda equation. We set up a perturbation theory to construct approximate solutions to the latter which can be interpreted as membrane instanton corrections to the moduli space metric of the universal hypermultiplet. We compute one such solution exactly up to the five-instanton level, including all perturbative fluctuations about the instantons. The result shows a pattern that allows us to determine the asymptotic behaviour of all higher instanton corrections within this solution. We find that the generating function for the leading terms of the latter is given by the Lambert W-function.

The Lambert W function is necessarily a factor of a control strategy involving 'disturbance', which perturbations are, either in theory or practice (simulation).

It's due to Ross' π lemma.

http://en.wikipedia.org/wiki/Ross%27_%CF%80_lemma

--maplenut@nym.hush.com

To Ulla:

Terminological warnings;-).

*I talk about interaction-free quantum measurement. This is an established concept. Weak quantum measurement is complete nonsense and I wonder why it is in Wikipedia. For a layman it is of course easy to confuse these notions.

*State function collapse is only one kind of state function reduction, occurring in position measurement.

I did not understand the question "Why is a infinity..." Ordinary state function reductions occur in TGD just like usually for h_eff/h=1 states. Probabilities of outcomes are determined by the density matrix describing statistically the entanglement with the measurement apparatus and can be anything.

For negentropic entanglement the probabilities are identical and equal to p=1/n, since the density matrix is proportional to the n×n unit matrix. Unitary entanglement assigned to quantum computation gives rise to this kind of entanglement.

These states are almost stable against NMP: if something happens, it either increases the entanglement negentropy or the negentropic entanglement is "stolen" by another subsystem. This is what we living systems are doing all the time - and in a very brutal manner, just eating the innocent victim;-). The total amount of NE is not reduced, but its owner changes!

*You say that it is not possible to change the past.

-This is true in the sense that I cannot change the *subjective past* defined by the sequence of quantum jumps that have already occurred.

-I can however change the *geometric past* and also the future, and I do it in every quantum jump!

Here is the crucial difference between **geometric time and subjective time.** I have noticed that it is very, very difficult to get rid of the conditionings related to the notion of time. We have learned to identify these two times although the differences are obvious even to a child. Maybe only a mathematician can get rid of this conditioning by a concentrated effort. I do not really know whether anyone except me has hitherto understood the difference! And mathematically it is almost a triviality!

*The sequence of state function reductions leaving the state at the second boundary of CD invariant is *analogous* to a discretized unitary evolution. It is not a continuous unitary evolution but corresponds to the unitary evolution of, say, quantum computation. Self is resting after a good lunch, sipping good wine, concentrating on experiencing the time flow, with no need to perform volitional acts. Eventually the quantum jump occurs and - well - self dies and reincarnates at the opposite boundary of CD!;-) This is not a correct description at the level of detail but gives the idea of what a conscious self is. In the next approximation one takes into account that self has mental images - subselves - which are born and dying, and self loses his peace of mind over all the sorrow that its mind children are producing. Self tries to meditate and does all kinds of things to get rid of these mental images;-)

To Ulla:

If one wants to assign a size to the quantum state, it would naturally be the size of the CD: characterized by the temporal distance T between its tips or the distance L=c*T that light travels during this time. CD becomes larger in a statistical sense. It can occasionally get reduced in size. This is like diffusion: the average distance of a diffusing particle from the origin increases with time. You can see what diffusion is by looking at what happens near a trash bin: somehow the trash that should be inside it appears outside it and gradually at increasing distances.

Increase of h_eff, ageing, would mean getting "wiser", spiritual growth. Not necessarily at the level of the individual, of course. h_eff/h=n would measure the size of CD, giving a measure for "spiritual size". Eastern idealists probably cannot tolerate this kind of mixing of the quantitative with the spiritual. But I believe that everything has "real world" correlates.

There is no need to associate feedback with state function reduction. Feedback is a higher level concept involved with self-organization. Sarfatti tried to bring feedback to the level of quantum dynamics in his attempt to revive Bohm's old vision of wave mechanics without state function reduction.

To Anonymous:

Thank you for the link. The article is about the construction of 4-D quaternion-Kähler metrics with an isometry: they are determined by the so-called SU(∞) Toda equation. Also the Lambert W-function (I know that you like it;-) appears in the construction. I tried to see whether quaternion-Kähler manifolds could be relevant for TGD.

From Wikipedia I learn that a QK manifold is characterized by its holonomy, a subgroup of Sp(n)·Sp(1): the tangent space carries a 3-D space of endomorphisms identifiable as imaginary quaternions. CP_2 is one example of a QK manifold, with non-vanishing constant curvature. QKs are Einstein manifolds: the Einstein tensor is proportional to the metric.

What is really interesting from the TGD point of view is that twistor theory allows one to show that one can assign to a QK manifold a special kind of twistor space (a twistor space in the mildest sense requires only orientability). Wiki tells that if the Ricci curvature is positive, this (6-D) twistor space is what is called a projective Fano manifold with a holomorphic contact structure: unfortunately the Wiki definition of a Fano manifold says absolutely nothing to me.

How does this relate to TGD?

* In the twistor formulation of TGD the space-time surfaces are base spaces of 6-D twistor spaces in the Cartesian product of the 6-D twistor spaces of M^4 and CP_2, which are the only twistor spaces with Kaehler structure. In the TGD framework space-time regions can have either Euclidian or Minkowskian signature of the induced metric. The lines of generalized Feynman diagrams are Euclidian.

*Could the twistor spaces associated with the lines of generalized Feynman diagrams be projective Fano manifolds? Could QK structure characterize the Euclidian regions of preferred extremals of Kaehler action? I do not know whether a generalization to Minkowskian signature might exist.

*Why a QK structure that is projective contact Fano?

-CP_2 allows both projective and contact structures. Delta M^4_+xCP_2 allows it too - I call it loosely a symplectic structure. Also the 3-D light-like orbits of partonic 2-surfaces allow a contact structure.

-Both the contact structure and the projectivity of CP_2 would be inherited if the QK property holds true. Contact structures at the orbits of partonic 2-surfaces would extend to the Euclidian regions of the space-time surface representing lines of generalized Feynman diagrams. Projectivity of the Fano space would in turn be inherited from CP_2 and its twistor space SU(3)/U(1)xU(1).

*Could the isometry (or possibly isometries) of QK be seen as a remnant of the color symmetry or the rotational symmetries of the M^4 factor of the imbedding space?

Yes, I have theorized that subjective alterations in the perception of time are correlated with the degree of information exchanged.
