Thursday, July 17, 2014

Has the decay of dark photons to visible photons been observed at cosmological scales?

There is interesting news in New Scientist shedding new light on the puzzles of dark matter. It has been found that the Universe is too bright: there are too many high-energy UV photons in the spectrum. The model calculations also suggest that this excess brightness has emerged recently and was not present in the early Universe. Intergalactic space contains more neutral hydrogen, and thus also more ionized hydrogen, than previously thought, and it was hoped that the ionized hydrogen could explain the excess brightness. It is now clear, however, that 5 times more ionized hydrogen would be required than theory allows if the experimental data are accepted.


The question is whether dark matter could explain the anomaly.

  1. The usual dark matter candidates have by definition extremely weak interactions - not only with ordinary matter but also with themselves. Therefore it is not easy to explain the finding in terms of ordinary dark matter. The idea of dark matter as a remnant from the early cosmology does not fit naturally with the finding that the surplus UV radiation does not seem to be present in the early Universe.

  2. In TGD dark matter is ordinary matter with large heff = n×h and has just the ordinary interactions with itself but no direct interactions with visible matter. These interactions therefore produce dark radiation with visible and UV energies but with much lower frequencies: since E = heff×f = n×h×f, a dark photon carries the energy of an ordinary UV photon at a frequency n times lower (see the sketch after this list). The energy-preserving transformations of dark photons to ordinary photons are an obvious candidate for explaining the surplus UV light.

  3. These transitions are fundamental in the TGD inspired model of quantum biology. Biophotons are in the visible and UV range and are identified as decay products of dark photons in living matter. The fact that the surplus has appeared recently would conform with the idea that higher levels of the dark matter hierarchy have also appeared recently. Could the appearance of UV photons relate to the generation of dark matter responsible for the evolution of life? And could the surplus ionization of hydrogen also relate to this? Ionization is indeed one of the basic characteristics of living matter and makes possible charge separation (see this), which is also a crucial element of TGD inspired quantum biology (see this).
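To make the frequency scaling in point 2 concrete, here is a minimal numerical sketch (in Python, which the post itself does not use); the value n = 10^12 is purely a hypothetical illustration, not a number taken from the post, and the script only evaluates E = h×f for an ordinary UV photon and the n times lower frequency of a dark photon carrying the same energy.

```python
# Minimal sketch: a dark photon with heff = n*h carrying the same energy as an
# ordinary UV photon has an n times lower frequency (E = heff*f = n*h*f).
# The value n = 1e12 is purely illustrative, not taken from the post.

PLANCK_H = 6.62607015e-34   # Planck constant, J*s
EV = 1.602176634e-19        # one electron volt in joules

def dark_photon_frequency(energy_ev, n):
    """Frequency of a dark photon with heff = n*h and energy E = heff*f."""
    return energy_ev * EV / (n * PLANCK_H)

uv_energy_ev = 10.0                         # representative UV photon energy, eV
f_ordinary = uv_energy_ev * EV / PLANCK_H   # ordinary photon: E = h*f
f_dark = dark_photon_frequency(uv_energy_ev, n=1e12)

print(f"ordinary UV photon frequency: {f_ordinary:.3e} Hz")
print(f"dark photon, same energy:     {f_dark:.3e} Hz")
```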

4 comments:

Ulla said...

http://arxiv.org/abs/1407.3823
https://medium.com/the-physics-arxiv-blog/d0758c7c88b5

begin to sound TGDish :)

Matti Pitkanen said...


These endless debates about black holes, white holes, firewalls, etc. are a sign that the space-time of general relativity is no longer an appropriate notion. Hawking draws quite a correct conclusion but is of course not able to jump out of the system and propose what this new thing is.

Strings brought nothing essentially new in trying to replace space-time, since the only manner to make string theory in some sense predictive is to *assume* that Einstein-YM type QFT is its long length scale limit. Just an assumption to get at least something out of the mill. Black holes are transformed to possibly higher-dimensional black holes. Nothing really new at the level of ideas, but a lot of profound and unsolvable difficulties such as the landscape catastrophe.

The correct question is simple: what is the microscopic theory giving general relativity as its long length scale limit?

It is certainly not the string model, since the basic objects must be 3-D. Once this question is asked, one can make progress. Of course, many other questions lead to the same outcome expressed in terms of three letters;-).

To put it bluntly, if colleagues are interested in making progress, there is only one, and formidably ego-unfriendly, option to follow (after these 37 years of ridicule). The many-sheeted space-time of TGD means extreme simplicity at the level of a single space-time sheet because of imbeddability to the 8-D imbedding space; both Euclidian and Minkowskian signatures are present, providing a totally new view of material objects, leaving no room for black holes at the fundamental level, and the theory is mathematically well-defined at the quantum level.


The fundamental question seems to be: is the communication of a new vision in the present guru-dominated Big Science possible at all? Putting it in the language used by computer scientists: is the task of getting a signal through the barrier formed by academic egos NP-hard? If it is, then the lifetime of the Universe is a reasonable estimate for the breakthrough time of TGD;-). If so, this frustrating tinkering with black holes might still continue for decades.

Ulla said...

‘Groupthink is a type of thought exhibited by group members who try to minimize conflict and reach consensus without critically testing, analyzing, and evaluating ideas. Individual creativity, uniqueness, and independent thinking are lost in the pursuit of group cohesiveness, as are the advantages of reasonable balance in choice and thought that might normally be obtained by making decisions as a group. During groupthink, members of the group avoid promoting viewpoints outside the comfort zone of consensus thinking. A variety of motives for this may exist such as a desire to avoid being seen as foolish, or a desire to avoid embarrassing or angering other members of the group. Groupthink may cause groups to make hasty, irrational decisions, where individual doubts are set aside, for fear of upsetting the group’s balance.’ http://en.wikipedia.org/wiki/Groupthink

Anonymous said...

Very interesting: the Lambert W function arises in "fixing" some of the ancient (1905) molecular hydrogen ionization spectra (see my other comments on how the Lambert W is almost surely involved in the proof of the Riemann hypothesis).

http://arxiv.org/abs/1408.3999

"Corless and his co-workers [8] mentioned that there was an anomaly in molecular physics. When physicist tried to calculate the eigenstates of the hydrogen molecular ion (H+
2 ), the results were not matching with the predictions [12].
The problem was that – being unable to solve their equations – the physicists
used numerical approximations. This problem originally emerged in 1956
and was solved by Scott and his coworkers [26] just in 1993. In the analytic
solution the Lambert function appears which helped to take exponentially subdominant
terms into account in the solution which could explain the anomaly."
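As a side note on the Lambert W function mentioned above, here is a minimal sketch using SciPy's lambertw that only illustrates the defining relation W(z)·exp(W(z)) = z; it does not attempt to reproduce the H2+ eigenvalue calculation discussed in the quoted paper.

```python
# Minimal illustration of the Lambert W function: W(z)*exp(W(z)) = z.
# This demonstrates only the defining property, not the H2+ calculation.
import numpy as np
from scipy.special import lambertw

z = 2.5
w = lambertw(z)                 # principal branch (k = 0); returns a complex value
print(f"W({z}) = {w.real:.6f}")
print(f"W*exp(W) = {(w * np.exp(w)).real:.6f}")   # should reproduce z

# Typical use: solve x*exp(x) = 5 for x, i.e. x = W(5)
x = lambertw(5).real
print(f"x = {x:.6f}, check x*exp(x) = {x * np.exp(x):.6f}")
```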