Monday, August 31, 2009

A resolution of cosmological entropy paradox

Anonymous asked an interesting question about how TGD solves the cosmological entropy paradox. The initial state of cosmology seems to be a maximum entropy state, and if the second law holds, the present state should have even larger entropy. One can however argue that this is not the case.

The TGD inspired proposal is that the resolution of cosmological entropy paradox relates to the relationship between subjective and geometric time.

  1. It is subjective time with respect to which the second law holds true. It corresponds to the geometric time of the observer only locally.

  2. One can apply the second law only to what happens inside the 4-D causal diamond (CD) corresponding to the time scale of observations: in positive energy ontology the second law is applied at a fixed value of geometric time and this leads to problems. In cosmology the relevant CD extends from the moment of the big bang to the present time or even farther into the geometric future. The idea that entropy grows as a function of cosmic time is simply wrong if one accepts zero energy ontology.

More concretely:

  1. In each quantum jump re-creating the entire 4-D Universe, the entire geometric future and past change.

  2. The initial state of the big bang in the geometric sense(!) - the zero energy states associated with small CDs near the light-cone boundary corresponding to the Big Bang - is replaced by a new one at every moment of subjective time. Hence the "subjectively recent" initial state of the Big Bang can be assumed to have maximum entropy, as can the states after it when the time scale of observations (the size of the CD) is the age of the universe. Gradually the entire geometric past ends up in a maximum entropy state in time scales below the time scale characterizing the observations. Thermal equilibrium in the 4-D sense rather than the 3-D sense results, and the paradox disappears.

Note: The breaking of strict classical determinism of Kähler action, allowing the CDs-within-CDs picture, is an essential mathematical prerequisite: otherwise this picture does not make sense. It also makes possible space-time correlates for quantum jump sequences rather than only for quantum states.

Note: One proposal for the resolution of the entropy paradox could rely on the generation of black holes with large entropy. In the TGD framework this does not work since for the gravitational Planck constant the value of black hole entropy is ridiculously small.

Friday, August 28, 2009

The planet that should not exist

There is an interesting news story about a planet that should not exist. Thanks to Kea also for additional references: see this and this. The exoplanet is so close to its star that it should have spiralled into it long ago. For some reason this has not happened.

The finding brings to mind a more than hundred-year-old problem: why does the electron orbiting the atomic nucleus not spiral into it? The solution of the puzzle was provided by the discovery of quantum theory. The postulate was that the electron moves on Bohr orbits and can make only transitions between Bohr orbits, emitting light in these transitions. There is a minimum value for the radius of a Bohr orbit. Later wave mechanics emerged from the Bohr model.

The TGD view about dark matter suggests an analogous solution to the astrophysical variant of this puzzle. Planets correspond to Bohr orbits but for a gigantic value of Planck constant whose value is dictated by the Equivalence Principle to a high degree. This Planck constant could be assigned to the space-time sheet mediating the gravitational interaction or even to matter itself. This means astroscopic quantum coherence, and the interpretation is that this coherence is associated with dark matter around which visible matter condenses, making the quantum character of dark matter visible in this manner.

That the planet does not spiral into the star means smallness of dissipation, and this is guaranteed by the large value of hbar. The naive estimate is that the dissipation rate is proportional to the inverse of hbar. As Donkerheid noticed, Mars and Phobos form a similar mysterious system, and the explanation would be the same.
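The Bohr-orbit picture can be given a rough numerical check. A sketch under the assumption hbar_gr = GMm/v0 with v0/c ≈ 2^(-11) (the value quoted later in this post for the inner planets): Bohr quantization of angular momentum then gives circular orbit radii r_n = n^2 GM/v0^2, with the planet mass m dropping out as the Equivalence Principle requires. The planetary semi-major axes below are my approximate inputs, not part of the original argument.

```python
import math

# Sketch: Bohr quantization m*v*r = n*hbar_gr with hbar_gr = G*M*m/v0
# and circular orbits v^2 = G*M/r gives r_n = n^2 * G*M / v0^2.
GM_sun = 1.327e20          # G*M for the Sun, m^3/s^2
c = 2.99792458e8           # m/s
v0 = c / 2**11             # v0/c ~ 2^-11 for the inner solar system
r1 = GM_sun / v0**2        # radius of the lowest Bohr orbit, meters

# approximate semi-major axes of the inner planets in meters
planets = {"Mercury": 5.79e10, "Venus": 1.08e11,
           "Earth": 1.50e11, "Mars": 2.28e11}
n_values = {name: round(math.sqrt(a / r1)) for name, a in planets.items()}
```

With these round numbers the inner planets fall near n = 3, 4, 5, 6, which is the kind of fit the Bohr-orbit interpretation relies on.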

A more refined view about the situation is in terms of light-like 3-surfaces, which are the basic dynamical objects of quantum TGD. At the elementary particle level their size is about the CP2 size (about 10^4 Planck lengths). Also macroscopic and even astroscopic sizes are possible, and this would be the case for dark matter, for which the Planck constant and thus also quantum scales are scaled up. Note that light-like 3-surfaces are boundaries between regions of space-time with Euclidian and Minkowskian signature of the metric. The recent TGD inspired vision is of a Universe as a kind of Indra's net formed by light-like 3-surfaces appearing in all length scales and having extremely complex topology. For details see the chapter Anyons and Quantum Hall Effect of "Towards M-matrix" explaining the Quantum Hall Effect in terms of macroscopic light-like 3-surfaces and suggesting that this kind of anyonic phases are realized also in astrophysical scales for dark matter.

Amusingly, the counterpart of the Planck length, scaling as (hbar G)^(1/2), is apart from a numerical constant equal to v0^(-1/2) GM (2GM is the Schwarzschild radius) if one assumes that hbar = GM^2/v0 is associated with an astrophysical system with mass M: v0/c ≈ 2^(-11) holds true for the inner planets in the solar system. The Planck length would be orders of magnitude larger than the Schwarzschild radius so that Planck scale physics would be scaled up to astrophysical length scales! Black hole entropy, which is proportional to 1/hbar, is of order unity and would be extremely small for the ideally dark black hole. Obviously M-theorists would be forced to reconsider the physical significance of their black hole entropy calculations if this picture is correct.
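The scalings above are easy to verify numerically. In c=1 units, hbar = GM^2/v0 turns the Planck length (hbar G)^(1/2) into GM/v0^(1/2), so the ratio to the Schwarzschild radius 2GM is the M-independent factor 1/(2 v0^(1/2)). The sketch below uses the Sun as an example system (my choice) and evaluates the Bekenstein-Hawking entropy with hbar replaced by the gravitational Planck constant, which gives S = 4*pi*v0.

```python
import math

# All lengths in meters; c = 1 is handled by working with GM/c^2.
GM_over_c2 = 1.477e3       # GM/c^2 for the Sun, ~1.477 km
v0 = 2.0**-11              # v0/c for the inner solar system

planck_length_scaled = GM_over_c2 / math.sqrt(v0)   # (hbar_gr G)^(1/2) in c=1 units
schwarzschild = 2 * GM_over_c2                      # 2GM/c^2

ratio = planck_length_scaled / schwarzschild        # = 1/(2*sqrt(v0)), mass independent
entropy = 4 * math.pi * v0                          # S = 4*pi*G*M^2/hbar_gr, in k_B units
```

For v0 = 2^(-11) the scaled Planck length exceeds the Schwarzschild radius by a factor of about 23 for any mass, and the entropy is far below unity, in line with the claim that the ideally dark black hole would have ridiculously small entropy.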

The interested reader can consult the chapter Quantum Astrophysics of "Physics in Many-Sheeted Space-time" and the chapter Anyons and Quantum Hall Effect of "Towards M-matrix".


Did Boltzmann understand all about time?

Lubos Motl wrote a pedagogical review about the notion of time. The title of the posting is "The arrow of time: understood for 100 years". As a conservative, Lubos believes that everything interesting about time was said by Boltzmann already before the birth of quantum theory. The second law would summarize all that is interesting. Lubos is also impatient about the fact that there are still people who feel that the nature of time is not fully understood.

The core of the Boltzmannian view about time is simple to summarize.

  1. The time development can be seen as analogous to a Markov process. To continue, let us introduce discrete time t_n = n×Δt. One could regard this as a technical simplification, but we will find that Δt has an interpretation as the time scale of quantum coherence.

  2. Physical events are identified as transitions from state i to state j taking place during the time interval Δt and characterized by the probabilities P(i,j). Once the probabilities p(j,0) for the initial states are given, the probability p(i,t_n) of state i at time t=t_n is obtained by summing the probabilities over all paths j→k→...→i leading from j to i and averaging over the initial states:

    p(i,t_n) = ∑_j (P^n)(i,j) × p(j,0).

    This formula leads to the second law stating the increase of entropy defined by the Shannon formula.
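The entropy increase implied by the iteration above is easy to demonstrate numerically. A minimal sketch with an arbitrary 3-state doubly stochastic matrix P (my choice; double stochasticity, which holds when the microscopic dynamics is reversible, is what guarantees that Shannon entropy never decreases and approaches the uniform maximum entropy state):

```python
import math

# Doubly stochastic transition matrix P(i,j): rows and columns sum to 1.
P = [[0.8, 0.1, 0.1],
     [0.1, 0.8, 0.1],
     [0.1, 0.1, 0.8]]

def step(p):
    # p(i, t_{n+1}) = sum_j P(i,j) * p(j, t_n)
    return [sum(P[i][j] * p[j] for j in range(3)) for i in range(3)]

def shannon(p):
    # Shannon entropy, with the convention 0*log(0) = 0
    return -sum(x * math.log(x) for x in p if x > 0)

p = [1.0, 0.0, 0.0]            # sharply known initial state: zero entropy
entropies = [shannon(p)]
for n in range(30):
    p = step(p)
    entropies.append(shannon(p))
```

The entropy sequence rises monotonically from 0 toward log 3, the maximum for three states, which is the content of the second law in this discretized picture.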

There are several unpleasant questions that Lubos leaves out of consideration.

  1. Boltzmann's approach was developed before the advent of quantum mechanics but involves a classical probabilistic approach not very natural in classical physics. Quantum theory indeed allows one to calculate the transition probabilities P(i,j) from first principles. There are however deep interpretational problems. Can one say when the quantum transition - state function reduction - breaking the deterministic Schrödinger evolution really takes place? How long is the period of non-determinism if it occurs? Is the non-determinism really assignable to geometric time, or could it be that it is the relationship between geometric time and experienced time that we do not actually understand? These are the basic problems of quantum measurement theory, which is a useful bag of calculational recipes but does not really deserve to be called a theory.

  2. At the technical level these problems are avoided by assuming that transitions take place during an infinitely long time interval. This idealization means that quantum coherence is present over an infinitely long time scale. The resulting probability, proportional to the square of an energy conserving delta function, is transformed to a rate by dividing by the length of this infinitely long time interval, which divides away one delta function expressing the conservation of energy. One then feeds the resulting rates into kinetic equations and assumes that quantum coherence is present only over infinitely short time scales. Do not get frustrated: this is the situation! One can guess that there must be some finite time scale within which quantum coherence and the sum over amplitudes make sense, while over longer time scales one must use Boltzmann's discretized approach and sum over probabilities.

There are also problems related to the justification of the probabilistic approach.

  1. To some degree this picture could be formally justified by using the path integral formulation of quantum field theory. What one must do is to replace the modulus squared of the path integral, approximated as a sum over discrete paths, with a sum in which all interference terms are neglected. This amounts to de-coherence. This kind of approximation can be defended by appealing to the small value of Δt in the example discussed. But do we really understand the origin and mechanisms of de-coherence? Could de-coherence have a more detailed description involving perhaps new physics?

  2. The de-coherence assumption is rather strong in the many-sheeted space-time of TGD. The TGD based view about dark matter as a hierarchy of phases partially labeled by Planck constant predicts macroscopic quantum coherence even in astrophysical time and length scales, so that the Markovian view can be used only if one restricts consideration to processes in a definite time scale below the natural time scale characterizing the time intervals during which observations are made.

The basic problem of this approach is that the observer is not part of the Universe. In classical physics the observer was a complete outsider, and in quantum measurement theory the situation remains the same although the measurement interaction leading to state function reduction affects the measured system.

TGD inspired theory of consciousness can be seen as a generalization of quantum measurement theory which resolves its basic paradox by making the observer part of the Universe via the notion of self, and which aims to understand the differences and the relation between the time of physics (geometric time) and experienced time by identifying the chronon of the latter as the quantum jump defining a moment of consciousness.

  1. The outcome is what I call zero energy ontology. Zero energy states are pairs of positive and negative energy states localizable to the upper and lower boundaries of causal diamonds, defined as intersections of future and past directed light-cones of Minkowski space (taking the Cartesian product with CP2). There are CDs within CDs and they form a hierarchy.

  2. The hierarchy of CDs is also a correlate for a hierarchy of conscious entities which I refer to as selves. A CD represents the perceptive field of a self. A CD is the correlate for the quantum jump identified as the chronon of experienced time, and there is an infinite hierarchy of chronons. A CD also defines the quantum coherence region inside which the sum over probabilities must be replaced with the sum over amplitudes, so that Boltzmann's kinetic description fails.

  3. The notion of time measurement resolution reduces to the time scale of the CD. If this time scale comes as powers of 2, the p-adic length scale hypothesis follows. One would have a hierarchy of physics realized in different p-adic length and time scales characterized by primes near integer powers of 2.

  4. Although one has quantum coherence in a given time scale (CD), it is possible to have de-coherence in shorter time scales (sub-CDs). The description of hadronic reactions in terms of quarks and gluons using kinetic distributions defined in relatively short time and length scales, and the description of hadrons using wave functions defined in considerably longer scales, is a good example of de-coherence within coherence.

  5. This gives hopes of an improved understanding of the second law in living matter. The essentially new notion is that of scale: when one speaks about the second law one must specify the time scale in which it is applied. The description works only if the applier is a CD modeling what happens in an ensemble of sub-CDs. If one tries to understand what happens in CDs characterized by time scales longer than the natural time scale of the observation, the approach fails. These CDs are expected to be highly relevant in biology.
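As an aside, the "primes near integer powers of 2" of point 3 are easy to inspect numerically. A sketch using a standard Miller-Rabin primality test (nothing TGD specific), checking that the Mersenne numbers M_89 = 2^89 - 1 and M_107 = 2^107 - 1 appearing later in these postings, as well as M_127 (assigned to the electron in p-adic mass calculations), are indeed prime, while e.g. 2^11 - 1 is not:

```python
# Standard Miller-Rabin primality test; with this fixed set of small-prime
# bases it is reliable for the numbers checked below (primes never fail it).
def is_prime(n, bases=(2, 3, 5, 7, 11, 13, 17, 19, 23, 29, 31, 37)):
    if n < 2:
        return False
    d, s = n - 1, 0
    while d % 2 == 0:
        d //= 2
        s += 1
    for a in bases:
        if a % n == 0:
            continue
        x = pow(a, d, n)
        if x in (1, n - 1):
            continue
        for _ in range(s - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False
    return True

# Mersenne numbers 2^k - 1: k = 89, 107, 127 give primes; 2^11 - 1 = 23*89 does not.
results = {k: is_prime(2**k - 1) for k in (11, 89, 107, 127)}
```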

I will not continue here further but give instead a link to the article About the Nature of Time and also a link to a video summarizing the recent view about the relation between geometric and subjective time: this includes an explanation for the emergence of the arrow of time and for the fact that the contents of sensory experience are about a very narrow time interval although one would expect the entire CD to determine also the contents of sensory experience. I hope that I do not sound too authoritative and that my badly broken English is not too painful an experience;-).


Thursday, August 27, 2009

Is N=8 supergravity finite?

Kram sent me a link to a highly interesting popular article about N=8 supergravity. Zvi Bern and collaborators have been able to make progress in the attempt to prove the finiteness of N=8 supergravity. The work has been going on for a decade: good to learn in this age of hyper-hype physics that work on this time scale is still done. For some reason the work has not been commented on by Lubos or by others. If the finiteness is true, one can only admire the incredible power of Einstein's conceptualization.

I do not have anything interesting to say about the topic but I can give a link to Vanquishing Infinity: Old Methods Lead To New Approach To Finding Quantum Theory Of Gravity.

Monday, August 24, 2009

Three new physics realizations of the genetic code and the role of dark matter in bio-systems

TGD inspired quantum biology leads naturally to the idea that several realizations of the genetic code exist. Besides the realizations based on temporal patterns of electromagnetic fields, I have considered three different new physics realizations of the genetic code based on the notions of many-sheeted space-time, magnetic body, and the hierarchy of Planck constants explaining dark matter in the TGD framework.

  1. The first realization - proposed in the model for DNA as a topological quantum computer (tqc) - maps the nucleotides A, G and T, C to dark quarks u, d and their antiquarks, assignable to the ends of magnetic flux tubes representing braid strands and connecting nucleotides to the lipids of the cell membrane.

  2. The second realization was discovered in the model of dark nuclei as strings of dark baryons. Dark baryons realize codons in terms of quantum entanglement and without decomposition into letters. Dark baryons are strings of 3 quarks connected by two color flux tubes. The neutral states of the dark baryon predicted by the model are in 1-1 correspondence with DNA, RNA, and amino acids. Candidates for the counterparts of tRNA anticodons are also obtained if one accepts that the genetic code actually decomposes into 2 steps, 64→40→20, such that there are 40 dark baryon counterparts for tRNA anticodons. The amazing finding is that the vertebrate genetic code comes out correctly.

  3. The third realization is a physical realization of the divisor code proposed by Khrennikov and Nilsson. The realization relies on two integers labeling magnetic flux tubes containing dark matter. The dark magnetic flux tubes assignable to DNA codons and amino acids could be labeled by these integers, providing a representation of the genetic code consistent with the divisor code. Also a physical mechanism implying the physical equivalence of the dark baryon code and the divisor code can be imagined.

The basic proposal is that dark baryon counterparts of basic bio-molecules and the genetic code were present from the beginning and gave rise to pre-biotic life at the magnetic flux tubes, so that the evolution of biological life meant the development of translation and transcription mechanisms allowing the transformation of dark baryon variants of the codons into their chemical variants. These mechanisms would still be at work inside the living cell and allow living matter to perform genetic engineering. This proposal is consistent with recent findings about large variations of genomes inside an organism.

There is a strange experimental finding by a group led by HIV Nobel laureate Montagnier giving support for this picture. A water solution containing human cells infected by bacteria is sterilized by a filtering procedure and healthy cells are added to the filtrate. Within a few weeks the infected cells re-appear. A possible explanation is that the dark baryon variant of the bacterial genome, realized as nano-sized particles, remains in the solution despite the filtering.

The codes are discussed from the point of view of the DNA as tqc hypothesis and the model for protein folding and bio-catalysis. The basic selection rules of bio-catalysis could be based on the two integers assignable to the dark magnetic flux tubes. Only bio-molecules whose dark magnetic bodies contain a layer characterized by the same integers can be connected by dark magnetic flux tubes. The reconnection of the dark magnetic flux tubes, selecting the bio-molecules participating in the catalytic reaction, and the contraction of these flux tubes, induced by a phase transition reducing Planck constant and forcing the bio-molecules near each other, would represent basic mechanisms of bio-catalysis.

For background see the new chapter Three new physics realizations of the genetic code and the role of dark matter in bio-systems of "Genes and Memes".

Monday, August 10, 2009

In what sense c could be changing in solar system?

There have been continual claims that the speed of light in the solar system is decreasing. The latest paper about this is by Sanejouand and in my opinion must be taken seriously. The situation is summarized by an excerpt from the abstract of the article:

The empirical evidences in favor of the hypothesis that the speed of light decreases by a few centimeters per second each year are examined. Lunar laser ranging data are found to be consistent with this hypothesis, which also provides a straightforward explanation for the so-called Pioneer anomaly, that is, a time-dependent blue-shift observed when analyzing radio tracking data from distant spacecrafts, as well as an alternative explanation for both the apparent time-dilation of remote events and the apparent acceleration of the Universe.

Before one can speak seriously about a change of c, one must specify precisely what the measurement of the speed of light means. In the GRT framework the speed of light is by definition a constant in local Minkowski coordinates. It seems very difficult to make sense of a varying speed of light since c is a purely locally defined notion.

  1. In the TGD framework space-time as an abstract manifold is replaced by a 4-D surface in H=M^4×CP2 (forgetting complications due to the hierarchy of Planck constants). This brings in something new: the sub-manifold geometry allows one to look at space-time surfaces "from outside", from the H-perspective. The shape of the space-time surface appears as new degrees of freedom. This leads to the explanation of standard model symmetries and elementary particle quantum numbers and to the geometrization of classical fields, the dream of Einstein. Furthermore, the CP2 length scale provides a universal unit of length, and the p-adic length scale hypothesis brings in an entire hierarchy of fixed meter sticks defined by p-adic length scales. The presence of the imbedding space M^4×CP2 brings in the light-like geodesics of M^4, for which c is maximal and could by a suitable choice of units be taken to be c=1.

  2. In the TGD framework the operational definition of the speed of light at a given space-time sheet is in terms of the time taken for light to propagate from point A to point B along the space-time sheet. This can occur via several routes because of the many-sheeted structure, and each sheet gives its own value for c. Even if the space-time surface is only warped (no curvature), this time is longer than along a light-like geodesic of M^4(×CP2), and the speed of light measured in this manner is reduced from its maximal value. The light-like geodesics of M^4 serve as universal comparison standards when one measures the speed of light - something which GRT does not provide.

What does TGD then predict?

  1. TGD inspired cosmology predicts that c measured in this manner increases in cosmological scales, just the opposite of what Louise Riofrio claims. The reason is that strong gravitation makes the space-time surface strongly curved and it takes more time to travel from A to B in the early cosmology. This means that the TGD based explanation has different cosmological consequences than that of Riofrio. For instance, the Hubble constant depends on the space-time sheet in the TGD framework.

  2. The paradox however disappears since local systems like the solar system do not normally participate in cosmic expansion, as predicted by TGD. This is known also experimentally. In the TGD Universe local systems could however participate in cosmic expansion in an average sense via phase transitions increasing the Planck constant of the appropriate space-time sheet and thus increasing its size. The transition would occur in relatively short time scales: this provides new support for the expanding Earth hypothesis needed to explain the fact that the continents fit nicely together to form a single super continent covering the entire Earth if the radius of Earth is smaller by a factor 1/2 than its present radius (see this).

  3. If one measures the speed of light in a local system and uses its cosmic value, taken constant by definition (fixing a particular coordinate time), then one indeed finds that the speed of light is decreasing locally and the decrease should be expressible in terms of the Hubble constant.

  4. The TGD based explanation of the Pioneer anomaly can be based on completely analogous reasoning.
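Point 3 above admits a quick numerical check: if the apparent local decrease of c proceeds at the Hubble rate, dc/dt ≈ H0 c, the yearly change is indeed a few centimeters per second, matching the magnitude quoted in the Sanejouand abstract. H0 ≈ 70 km/s/Mpc is my assumed standard value.

```python
# Back-of-envelope: dc/dt = H0 * c, evaluated over one year.
c = 2.99792458e8           # speed of light, m/s
Mpc = 3.0857e22            # megaparsec in meters
H0 = 70e3 / Mpc            # Hubble constant, ~70 km/s/Mpc, in 1/s
year = 3.15576e7           # Julian year in seconds

dc_per_year = c * H0 * year    # m/s of change per year, ~2 cm/s
```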

Addition: I added a videoclip about varying light velocity here.

For background see for instance the chapter TGD and Astrophysics of "p-Adic length Scale Hypothesis and Dark Matter Hierarchy".

Tuesday, August 04, 2009

Indications for excited states of Z0 boson

Tommaso Dorigo is a highly inspiring physics blogger since he writes from the point of view of an experimental physicist without the burden of theoretical dogmas and does not behave aggressively;-). I share with him also the symptoms of a personality split into fluctuation-enthusiast and die-hard skeptic. This makes life interesting but not easy. This time Tommaso told about the evidence for new neutral gauge boson states in high energy ppbar collisions. The title of the posting was A New Z' Boson at 240 GeV? No, Wait, at 720!?

1. The findings

The title tells that the tentative interpretation of these states is as excited states of the Z0 boson and that the masses of the states are around 240 GeV and 720 GeV. The evidence for the new states comes from electron-positron pairs in a relatively narrow energy interval produced by the decays of the might-be-there gauge boson. This kind of decay is an especially clean signature since strong interaction effects are not present and it appears at a sharp energy.

The 240 GeV bump was reported by CDF last year in ppbar collisions at cm energy s^(1/2)=1.96 TeV. The probability that it is a fluctuation is 0.6 per cent. What is encouraging is that also D0 found the same bump. Tommaso explains the experimental side much better, so I need not lose that little credibility that I might still have. If the particle in question is analogous to Z0, it should decay also to muons. CDF checked this and found a negative result. This made Tommaso rather skeptical.

Also indications for a 720 GeV resonance (this is just a nominal value, the mass could be somewhere between 700-800 GeV) were reported by the D0 collaboration: the report is titled Search for high-mass narrow resonances in the di-electron channel at D0. There are just 2 events above 700 GeV but the background is small;-): just three events above 600 GeV. It is easy to guess what a skeptic would say.

Before continuing I want to make clear that I refuse to be a blind believer or a die-hard skeptic and that I am unable to say anything serious about the experimental side. I am just interested in seeing whether these events might be interpreted in the TGD framework. TGD indeed predicts - or should I say strongly suggests - a lot of new physics above the intermediate boson length scale.

Are exotic Z0 bosons p-adically scaled up variants of ordinary Z0 boson?

The p-adic length scale hypothesis allows the p-adic length scale characterized by prime p ≈ 2^k to vary since k can have several integer values. The TGD counterpart of the Gell-Mann-Okubo mass formula involves a varying value of k for quark masses. Several anomalies reported by Tommaso over the years could be resolved if k can have several values. The latest anomaly was the discovery that the Ωb baryon containing two strange quarks and a bottom quark seems to appear with two masses differing by about 110 MeV. TGD explains the mass difference correctly by assuming that the second strange quark has a mass two times the ordinary one. The predicted mass difference is 100 MeV.

One can ask whether the p-adic length scale hypothesis could explain the masses of the exotic Z0 candidates as half-octave multiples of the Z0 mass, which is 91 GeV. k=3 would give 257 GeV, not too far from 240 GeV. k=6 would give 728 GeV, consistent with the nominal value of the mass. Also other masses are predicted and this could serve as a test of the theory. This option does not however explain why muon pairs are not produced in the case of the 240 GeV resonance.
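The half-octave mass ladder is a one-liner to tabulate. A sketch listing m(k) = 91 GeV × 2^(k/2) for small k, reproducing the 257 GeV and 728 GeV figures quoted above:

```python
# Half-octave ladder above the Z0 mass: m(k) = m_Z * 2**(k/2).
m_Z = 91.0  # Z0 mass in GeV
masses = {k: m_Z * 2 ** (k / 2) for k in range(1, 8)}
# k=3 -> ~257 GeV, k=6 -> 728 GeV; the other entries are further predictions.
```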

Support for topological explanation of family replication phenomenon?

An improved explanation is based on the TGD view of the family replication phenomenon.

  1. In TGD the explanation of family replication is in terms of the genus of the 2-dimensional partonic surface representing the fermion. Fermions correspond to an SU(3) triplet of a dynamical symmetry assignable to the three lowest genera (sphere, torus, sphere with two handles). Bosons as wormhole contacts have two wormhole throats carrying fermion numbers and correspond to an SU(3) singlet and octet. Sooner or later the members of the octet - presumably heavier than the singlet - should be observed (maybe this has been done now;-)).

  2. The exchange of these particles predicts also charged flavor changing currents respecting the conservation of the corresponding "isospin" and "hypercharge". For instance, the lepton quark scattering e+s → μ+d would be possible. The most dramatic signature of these states is the production of muon-positron pairs (for instance) via decays.

  3. Since the Z0 or photon like boson in question has vanishing "isospin" and "hypercharge", it must be orthogonal to the ordinary Z0, which couples identically to all families. There are particles of this kind and they correspond to superpositions of fermion pairs of different generations in the TGD framework. For the two bosons - very optimistically identified as the 240 GeV and 720 GeV Z0 - orthogonality requires that the phase factors in the superposition of pairs adjust themselves properly. Also mixing effects breaking the color symmetry are possible and expected to occur since the SU(3) in question is not an exact symmetry. Hence the exotic Z0 bosons could couple preferentially to some fermion generation. This kind of mixing might be used to explain the absence of the muon pair signal in the case of the 240 GeV resonance.

  4. The prediction for the masses is the same as for the first option if the octet and singlet bosons have identical masses for the same p-adic mass scale, so that the mass splitting between different representations would take place via the choice of the mass scale alone.

Could a scaled up copy of hadron physics be involved?

One can also ask whether these particles could come from the decays of hadrons of a scaled up copy of hadron physics strongly suggested by the p-adic length scale hypothesis.

  1. The various hadron physics would correspond to Mersenne primes: standard hadron physics to M_107 and the new hadron physics to the Mersenne prime M_89 = 2^89 - 1. The first guess for the mass scale of "light" M_89 hadrons would be 2^((107-89)/2) = 512 times that for ordinary hadrons. The electron pairs might result from the decay of a scaled up variant of the pseudoscalar mesons π, η, or η', or of the spin one ρ and ω mesons with nearly the same mass. Only scaled up ρ and ω mesons remain under consideration if one assumes spin 1.

  2. The scaling of the pion mass of about 140 MeV gives 72 GeV. This is three times smaller than 240 GeV but this is an extremely rough estimate. Actually it is the p-adic mass scale of the quarks involved which matters, rather than that of the hadronic space-time sheet characterized by M_89. The naive scaling of the mass of the η meson, 548 MeV, would give about 281 GeV. η' would give 490 GeV. The ρ meson would give about 396 GeV. The estimates are just order of magnitude estimates since the mass splitting between a pseudoscalar and the corresponding vector meson is sensitive to the quark mass scale.

  3. This option does not provide any explanation for the lack of muon pairs in the decays of the 240 GeV resonance.
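A sketch of the naive scaling estimates in point 2 above. The meson masses are my approximate PDG inputs (η' at 958 MeV and ρ at 775 MeV are assumptions not stated in the text); everything is multiplied by the factor 2^((107-89)/2) = 512:

```python
# Naive M_89 scaling: multiply ordinary meson masses by 2**((107-89)/2) = 512.
scale = 2 ** ((107 - 89) // 2)          # = 512
mesons_MeV = {"pi": 140, "eta": 548, "eta_prime": 958, "rho": 775}  # approx masses
scaled_GeV = {name: m * scale / 1000.0 for name, m in mesons_MeV.items()}
# pi -> ~72 GeV, eta -> ~281 GeV, eta' -> ~490 GeV, rho -> ~397 GeV
```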

To conclude, the family replication phenomenon for gauge bosons is consistent with the claimed masses, and also the absence of muon pairs might be understood. It remains to be seen whether only statistical fluctuations are in question.

For background see the chapter p-Adic Particle Massivation: New Physics of "p-Adic length Scale Hypothesis and Dark Matter Hierarchy".

Monday, August 03, 2009

Why viXra?

viXra is a new electronic e-print archive (not a mirror site of arXiv.org;-)) giving hope to people like me in attempts to overcome the censorship wall that makes it impossible to communicate using ordinary channels. The following quote summarizes the reasons for viXra.
In 1991 the electronic e-print archive, now known as arXiv.org, was founded at Los Alamos National Laboratories. In the early days of the World Wide Web it was open to submissions from all scientific researchers, but gradually a policy of moderation was employed to block articles that the administrators considered unsuitable. In 2004 this was replaced by a system of endorsements to reduce the workload and place responsibility of moderation on the endorsers. The stated intention was to permit anybody from the scientific community to continue contributing. However many of us who had successfully submitted e-prints before then found that we were no longer able to. Even those with doctorates in physics and long histories of publication in scientific journals can no longer contribute to the arXiv unless they can find an endorser in a suitable research institution. The policies of Cornell University who now control the arXiv are so strict that even when someone succeeds in finding an endorser their e-print may still be rejected or moved to the "physics" category of the arXiv where it is likely to get less attention. Those who endorse articles that Cornell find unsuitable are under threat of losing their right to endorse or even their own ability to submit e-prints. Given the harm this might cause to their careers it is no surprise that endorsers are very conservative when considering articles from people they do not know. These policies are defended on the arXiv's endorsement help page. A few of the cases where people have been blocked from submitting to the arXiv have been detailed on the Archive Freedom website, but as time has gone by it has become clear that Cornell have no plans to bow to pressure and change their policies. Some of us now feel that the time has come to start an alternative archive which will be open to the whole scientific community. That is why viXra has been created. viXra will be open to anybody for both reading and submitting articles. 
We will not prevent anybody from submitting and will only reject articles in extreme cases of abuse, e.g. where the work may be vulgar, libellous, plagiaristic or dangerously misleading. It is inevitable that viXra will therefore contain e-prints that many scientists will consider clearly wrong and unscientific. However, it will also be a repository for new ideas that the scientific establishment is not currently willing to consider. Other perfectly conventional e-prints will be found here simply because the authors were not able to find a suitable endorser for the arXiv or because they prefer a more open system. It is our belief that anybody who considers themselves to have done scientific work should have the right to place it in an archive in order to communicate the idea to a wide public. They should also be allowed to stake their claim of priority in case the idea is recognised as important in the future.

Many scientists argue that if arXiv.org had such an open policy then it would be filled with unscientific papers that waste people's time. There are problems with that argument. Firstly, there is already a large number of submissions that do get into the archive which many people consider to be rubbish, but they don't agree on which ones they are. If you removed them all, the arXiv would be left with only safe papers of very limited interest. Instead of complaining about the papers they don't like, researchers need to find other ways of selecting the papers of interest to them. arXiv.org could help by providing technology to help people filter the article lists they browse.

It is also often said that the arXiv.org exclusion policies don't matter because if an amateur scientist were to make a great discovery, it would certainly be noticed and recognised. There are two reasons why this argument is wrong and unhelpful. Firstly, many amateur scientists are just trying to do ordinary science.
They do not have to make the next great paradigm shift in science before their work can be useful. Secondly, the best new ideas do not follow from conventional research, and it may take several years before their importance can be appreciated. If such a discovery cannot be put in a permanent archive it will be overlooked, to the detriment of both the author and the scientific community.

Another argument is that anybody can submit their work to a journal where it will get an impartial review. The truth is that most journals are now more concerned with the commercial value of their impact factor than with the advance of science. Authors without a good affiliation to a research institution find it very difficult to get their papers published. Their work is often returned with an unhelpful note saying that it will not be passed on for review because it does not meet the criteria of the journal.

In part viXra.org is a parody of arXiv.org to highlight Cornell University's unacceptable censorship policy. It is also an experiment to see what kind of scientific work is being excluded by the arXiv. But most of all it is a serious and permanent e-print archive for scientific work. Unlike arXiv.org, it is truly open to scientists from all walks of life. You can support this project by submitting your articles.
I love physics but - to be honest - find it very difficult to say the same about physicists in general. I used to think that people with the highest academic ranks would behave like civilized human beings, but my fate has been to learn, gradually, that too often a scientist is what you obtain by subtracting from a human being the very thing that makes us human: ethical behavior. Just a few days ago I learned that a new queen of a beehive mercilessly kills all the other candidate queens. This could be understood as basic biology: Dawkins might speak about selfish genes. It brought to my mind the behavior of dictators like Stalin, and what quite generally happens when a new idea is transformed into an ideology. As superstring theory - just one example of a winner meme - became an ideology, it used its fanatic proponents as instruments to kill competing memes through censorship and funding policy. This behavior reminds me of a small child who thinks that he is the center of the world but slowly learns that other human beings also have desires and goals. Humankind has gone through a long cultural evolution and developed something called ethics, and this kind of 'ideology justifies anything' and 'winner takes all' behavior means regression from culture back to mere biology. My hope is that viXra could help to create a new culture of science in which intellectual freedom would prevail, and in which barbaric crackpot labeling and exclusion from the scientific community would no longer be the everyday way of treating those who have the courage and ability to develop new views of reality.