
The most important events in our and your superstringy Universe - May 2006 backup of physics blog

Friday, May 05, 2006

Why particles can't be octopi

A reader has pointed out that the idea that particles are different octopi is becoming popular. Apparently, almost no one knows why an electron can't be either a mammal or at least a reptile. ;-)

There are several basic defining properties that distinguish elementary fermions - quarks and leptons - from other objects such as dogs:
  • they are quantum mechanical entities that are able to interfere in double-slit experiments; they are indistinguishable
  • these particles satisfy the rules of special relativity and the relativistic dispersion relations
  • they carry spin and are created by quantum fields with a spinor index
  • quarks carry color and they transform as a representation of a gauge group
  • consequently, they interact through interactions mediated by Yang-Mills gauge fields
These features are not details that can be ignored: they are what distinguishes elementary particles from other objects, such as supermarkets, weapons of mass destruction, or extraterrestrial aliens. These features underlie our description of everything we know about the particles - as encoded in quantum field theory.

Elementary particles can't be dogs or octopi because dogs and octopi violate every single property listed above. If a proposal violates all of these requirements, its author has made no (0) progress in describing these elementary particles. Let us see why the octopi violate these requirements one by one.

Uniqueness

Dogs and octopi can't realistically generate interference patterns in double-slit experiments. It is because they carry a huge entropy: a dog can be represented by many different microstates and these microstates don't interfere with each other. It is absolutely necessary for the quantum state describing an electron to be unique - determined by the position (or momentum) and spin only - in order for the electron to be able to interfere.

Elementary particles of the same kind must be indistinguishable and the fermions such as electrons must moreover obey the Pauli principle; otherwise chemistry could not work. Mammals violate these rules because their complicated internal structure makes them distinguishable.

A necessary condition for this uniqueness that is required for the existence of interference is the uniqueness of the vacuum itself.

Unusual theories that imagine a chaotic discrete structure at every point of "empty" space violate this rule maximally. A double-slit experiment would never create an interference pattern if vacuum had a large degeneracy, as envisioned by theories of "discrete gravity" and similar hopeless approaches to physics. The vacuum must be a completely pure, unique state, and the one-electron states must also be pure, well-defined states in the Hilbert space that are described by the position/momentum and the spin only.

No chaos is acceptable for particle physics. Some people, especially cranks, imagine that the deepest idea that they can dedicate to the world of physics is to make things complex and to give the particles complicated internal structure. But the experiments offer clean outcomes and require just the opposite: the vacuum and the particles must be very clean, unique, and if there is any internal structure, there must exist reasons why the structure is always found in the same state.

The difficult task for those who probe physics beyond quantum field theory is not how to make things complex, foggy, and chaotic: it is, on the contrary, how to make things pure, well-defined, and consistent with the very sharp and nearly exact laws of Nature that we have already understood, experimentally verified, and summarized in the Standard Model and General Relativity.

Relativity

A related point is that the elementary particles can't be identified with distortions of some "underlying structure" that is not unique because such a picture would violate the fact that the elementary particles follow the rules of special relativity. Elementary particles can't be excitations of an aether - a discredited notion from the 19th century physics that some people are trying to revive under the name "spin network" - because such an aether would break the Lorentz invariance. We know from experiments that the Lorentz invariance is either exact or an extremely accurate approximation.

All people who say that the Lorentz invariance is something that can be ignored or something that is cheap to obtain are crackpots because special relativity is one of the five most important and most constraining discoveries of 20th century physics.

Octopi that swim in the ocean do not respect the rules of special relativity (unless you include the details of the ocean into your description): water spontaneously breaks this symmetry. Everything else that resembles a complex object on a generic complex background is all but guaranteed to do the same thing. Particle physics follows very different rules from the ocean.

Spinors

Elementary fermions are excitations of spin-1/2 fields in spacetime - quantum fields that transform as a spinor under the Lorentz group. Again, this is not a detail. It is a defining feature of the leptons and quarks. Octopi and dogs don't transform as a spinor. A naive classical picture of octopi can't be compatible with the spinorial gauge theory of an electron.

Note that the correct spin of particles can be extracted from string theory in all of its realizations. For example, particles can be viewed as excited strings and the excitations themselves transform as the appropriate representation of the Lorentz group. It is because the excitations of the "minimal energy string" are operators themselves and they naturally transform as representations of various groups. A random octopus embedded in a complex environment won't transform as a representation of the Lorentz group - this group will be broken.

Color

Quarks also transform as the three-dimensional representation of a group, namely the colorful SU(3) group. The word "group" really means a "symmetry". A symmetry is a set of transformations that transform one object into another object. It is something that shows that these two or more objects related by symmetries actually have identical physical properties.

If you draw three octopi that differ in details, they can't have the same physical properties, and consequently they cannot form a representation of a group. If three "colors" of quarks don't have physically indistinguishable properties, you won't ever be able to find an SU(3) theory that creates consistent forces between them. They will never have the right interaction terms with the gauge fields, even if you believe that such a gauge field could also be found.

Strings are the most complicated objects that can behave as elementary particles with the right properties. Branes of other dimensions are in principle capable of doing the same thing. But it is hard to quantize branes of other dimensions directly - we know how to define quantum theories describing the internal dynamics of branes using open strings stretched between them: a brane on which an open string ends is called a D-brane.

Generic animals can't play the role of elementary particles for all the reasons above. Any paradigm that is meant to be treated seriously by theoretical physicists must explain why it reproduces the same kind of Lagrangians - or equations of motion - that we know from the Standard Model and/or General Relativity: quantum fields with Lorentz or spinor indices, color indices, and their right products in the Lagrangian. This task is very nontrivial, which is why string theory is the only known way (and probably the only mathematically possible way) to describe the observed physics by something other than point-like particles or quantum fields defined at each point of a spacetime.

Al Gore, Kyoto, and Canada

The Canadian conservative government of Stephen Harper has made one of the obvious rational decisions: they have erased the Kyoto protocol from their federal budget and slashed funding for the greenhouse gas programs, bringing their country closer to the rest of the civilized North America. I always believed that they would be trying to act in this direction. Many people including Steve McIntyre did not believe me.

The Tories care about the environment but they also care about common sense. They prefer solutions that make sense and have a high enough chance to work. They will introduce tax breaks to support public transportation, among other things. Alberta in particular is rather happy.
Note that according to Kyoto, Canadians would eventually have to pay about $600 per family for carbon dioxide credits. It's not a devastating amount but still, it is silly to throw money away for such entirely useless things and it is even sillier to torture yourself even if you know that you will have to pay anyway.

When the Tories planned to return rational thinking to the environmental policymaking, they had to think about possible criticism. But I guess that there exists no real threat to be afraid of. Who are the critics? They're people like Al Gore. If you have forgotten, Al Gore is a megalomaniacal eco-prophet who, six years ago, had plans to control the entire territory of the United States of America. Now he tries to promote his movie full of convenient untrue statements (they're convenient for the producers' pockets):
The movie argues that the planet will face a catastrophe in 10 years (...) unless the instructions of the prophet and narrator Gore are followed. Moreover, Gore is now explaining that climate change is no longer a political issue: it has become a spiritual (religious) issue! Wow. The debate is over and a new era of crucifixion of those who find the prophet intellectually challenged is getting started.



When he read the "made-in-Canada" quotes from Environment Minister Rona Ambrose, Gore rolled his eyes and made a flag-waving gesture with his hand. During his stay in Toronto, Gore also claimed that the government had no mandate to make any decisions about the environment. I guess that Gore believes that he has a mandate himself - a direct mandate from the God of climate change. ;-)

Star Trek technology stolen

David Goss has pointed out an article by Nicolae Nicorovici and Graeme Milton, who consider objects called "superlenses" that hypothetically create an appropriate electromagnetic resonance that cancels the light reflected from a speck of dust - and, maybe, from a spaceship in the future - rendering the speck invisible. ;-) Alternatively, you can say that the superlens forces the speck of dust to emit no light. Sounds intriguing and bizarre.

Glenn Reynolds vs. Sean Carroll

Glenn Reynolds has suggested a solution of oil problems that should not be unexpected from an instant right-wing pundit:
  • Of course, if we seized the Saudi and Iranian oil fields and ran the pumps full speed, oil prices would plummet, dictators would be broke, and poor nations would benefit from cheap energy. But we'd be called imperialist oppressors, then.
Sean Carroll disagrees. He thinks that Reynolds has squeezed five units of wrongness into four statements. Sean predicts that Reynolds' blog will collapse into a black hole.

I happen to think that Sean's prediction is a flawed prediction and Reynolds is closer to the truth than Carroll. Let's analyze the statements one by one.

Prices would plummet

Sean thinks that they won't plummet because the oil fields are essentially running at full capacity. Sean has a naive idea about the driving forces behind these prices. In 2002, the oil price was $18 instead of $70. Does it mean that the oil fields were running at a much-higher-than-full capacity?

The oil price is a very volatile quantity that sensitively responds to many different factors. The consumers are ready to pay higher prices because they feel that oil is something valuable that can cease to be available tomorrow. OPEC's statements have a dramatic impact on the price. If there were real competition, the prices could drop. Of course, the conflicts started by September 2001 did not really move the oil industry in the right direction.

Dictators would be broke

Sean thinks that dictators are rich even without oil. In principle, dictators can be rich even without natural resources, but it is naive to claim that oil does not make them richer. In Reynolds's article, many figures are listed that show how the regimes of oil-rich countries financially benefit from the higher prices.

Poor nations would benefit

Sean argues that oil price is more important for rich countries because they consume most of oil. That's completely unrealistic, much like the previous points. Rich nations and people may consume more oil, but oil is still a smaller percentage of the money that they must spend and a change of the price has smaller consequences. Poor countries are affected by higher prices more than the rich ones. This is why many officials have proposed an IMF-backed fund to help the poor countries hit by oil price volatility. See, for example, BBC or Sinha's calls.

We would be called imperialist oppressors

I think it is obvious that even if a fuller control by U.S. capitalism led to a smaller influence of dictators, lower prices, and stronger growth of the economies, especially the poor ones, the U.S. would be blamed as an imperialist oppressor. Even Sean Carroll agrees that this would be the case. But he disagrees that it would be inappropriate to blame the U.S. for such changes. Well, if governments and political systems impose things such as affirmative action, stifling political correctness, nationalization of corporations, or huge redistribution plans, the far left-wing blogs offer their support. If someone thinks about government plans that would actually make things better, not worse, and cheaper, not more expensive, the far left-wing bloggers complain about imperialist oppression.

The far left-wing approach is counterproductive for everyone who actually wants to live in a better world.

And that's the memo.

Thursday, May 04, 2006

Crackpot papers on the arXiv

We have some good news for those people who complain that the physics mafia does not allow the fans of alternative physics to submit their work: the arXiv is now apparently fully open to crackpots.

Tonight, there are at least four crackpot papers on gr-qc and hep-th.

On gr-qc, an author from Chicago with a dot-com address derives the masses of all elementary particles. His groundbreaking idea is based on Kaluza-Klein theory but he only cites Dirac and Georgi+Glashow. The physicist calculates the masses of all elementary fermions using a simple square-root formula. Because the results disagree, among many other things, with the known properties of the s,c,b,t quarks, the author predicts that these quarks probably don't exist.

On hep-th, there are two papers about an unusual mechanism to generate masses for non-Abelian gauge bosons. One of them is short and the following one is longer. The author writes a non-local action for the massive gauge boson involving the inverse box. For U(1) gauge bosons, it is a standard textbook trick that creates an equivalent action. For non-Abelian gauge groups, one needs the inverse covariant box which obviously leads to a non-polynomial and non-local theory that breaks down exactly where you expect problems with the unitarity of the WW-WW scattering.
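For orientation, here is the Abelian version of that textbook trick, as a minimal sketch; the non-Abelian case would replace 1/box by the inverse covariant box, which is exactly where the non-polynomial, non-local structure comes from:

```latex
% Gauge-invariant but non-local mass term for a U(1) gauge field (standard textbook form):
\mathcal{L} \;=\; -\tfrac{1}{4}\,F_{\mu\nu}F^{\mu\nu}
   \;-\; \frac{m^2}{4}\,F_{\mu\nu}\,\frac{1}{\Box}\,F^{\mu\nu}
\;=\; -\tfrac{1}{4}\,F_{\mu\nu}F^{\mu\nu}
   \;+\; \frac{m^2}{2}\,A_\mu\!\left(g^{\mu\nu}-\frac{\partial^\mu\partial^\nu}{\Box}\right)\!A_\nu ,
% which reduces to the Proca term (m^2/2) A_\mu A^\mu in the Lorenz gauge \partial\cdot A = 0.
```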

It is standard material from graduate courses in quantum field theory that one can make gauge bosons massive with extra Goldstone bosons that live in the group manifold. However, the non-linear sigma model is not renormalizable and breaks down at energies comparable to 4.pi.f where f is the decay constant. If we want the theory to be valid at higher energies, we must complete it and the Higgs mechanism is the only perturbative way to do it. The exchange of additional fields such as the Higgs helps to keep the WW to WW scattering unitary.
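To see the unitarity problem quantitatively, here is the standard schematic estimate via the Goldstone equivalence theorem; for this sketch I identify f with the electroweak scale v of roughly 246 GeV:

```latex
% Without a Higgs, the longitudinal amplitude grows with energy:
\mathcal{A}(W_L W_L \to W_L W_L)\big|_{\text{no Higgs}} \;\sim\; \frac{s}{v^2},
\qquad
a_0 \;\sim\; \frac{s}{16\pi v^2}
\;\;\Rightarrow\;\; \text{tree unitarity is lost for } \sqrt{s} = \mathcal{O}(4\pi v).
% Higgs exchange cancels the growing piece and leaves a bounded amplitude:
\mathcal{A}\big|_{\text{with Higgs}} \;\longrightarrow\; -\,\frac{m_h^2}{v^2}
\qquad (s \gg m_h^2).
```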

The statements in the paper claiming that the strange new theory can be proved to be perturbatively renormalizable must be incorrect. The microscopic source of the confusion is probably that the author does not appreciate how difficult it is to invert the covariant box. The only way to complete the theory into a renormalizable one is to effectively create new particles corresponding to the non-invertible modes of the covariant box - but these particles will interact just like Higgses, and one obtains a theory equivalent to the standard spontaneous symmetry breaking.

In another paper, an author proposes a list of generic predictions of quantum gravity. It would be more accurate to call it a list of misconceptions inspired by sloppy thinking about quantum gravity - and it would be even more accurate to call it a list of reasons why all "alternative" attempts to define quantum gravity must be inconsistent. None of the bizarre effects is predicted by anything that could be called a theory of quantum gravity - and most likely, none of them is even consistent with quantum gravity.

The list includes doubly special relativity, something that is known to be inconsistent with locality and the additivity of energy, even in their approximate versions. Similar considerations show that doubly special relativity leads to the so-called soccer ball problem (thanks to Sabine for explaining the terminology to me): you can't kick a soccer ball if its total energy (including its rest energy) is going to exceed the Planck energy - the equivalent of roughly twenty micrograms. In fact, such a soccer ball couldn't exist.

The second "general prediction" is that elementary particles are "coherent excitations of quantum geometry" that probably refers to a recent kindergarden theory that elementary particles are different octopi. Well, in reality, gravitons (and perhaps KK-photons) are coherent excitations of quantum geometry while other particles are excitations of something more general - and whether or not you call this more general thing (string theory) "geometry" is a matter of terminology. The third "general prediction" is that "locality is disordered".

The author also repeats many misconceptions about the quasinormal modes - such as the fantasy that they have something to do with the black hole entropy counting in loop quantum gravity. This fantasy has been known to be patently false for more than two years. In fact, it is wrong on all sides: the quasinormal frequencies are generally not what they needed to be according to the hypothesis; the entropy predicted by loop quantum gravity is not what it needed to be either; and the link between these two is completely unphysical.

The author of this particular paper always tells me that he understands my explanations why these things are safely known not to work. Then he returns home and writes another silly paper claiming that they do work. Sigh.

Wednesday, May 03, 2006

New critic of evolution and string theory

Apparently, a classmate of two other famous critics. Moreover, all of them will probably believe the theory of global warming.

Via Rae Ann.

Predict your climate

Via Steve McIntyre's blog (posting by John A.)

Dave Stockwell has created a script whose source is found here (mirror) and described here. If you click the image below, it opens in a separate window: you probably need to click because the image does not quite fit here. Every time you reload the image, the calculation starts from the beginning.



Although Dave offers an explanation, let me offer you mine, too.

The blue graph shows temperatures from 1856 to 1994 or so measured by the CRU thermometers - the array is called "cru" - and these real numbers are used to make predictions from 1995 to 2094 with the important help of a random generator: the predicted temperatures for the period 1995-2094 depend on random numbers as well as the CRU data from the past.

The eleven temperatures from the period 1995-2005 are known from the CRU data, but they are also predicted using the random forecasting algorithm. These eleven years are used to calculate the verification statistics - a kind of score that is used to evaluate how much you should believe the prediction: statistical skill.

How are the random predictions made?

Weighted random data

The temperatures predicted for the years 1995-2094 are calculated using the array called "fcser" that later becomes the second part of the array "graphValues"; the role of the "fcser" array is to emulate the temperature persistence. These "predictions" are "calculated" from the "series" as follows: the temperatures in the "temp" array are calculated as
  • the CRU temperature from 1994 plus a random number between -0.5 and +0.5 plus "fcser" for the given year

where fcser for the given year is a weighted average of the values of "temp" from previous years (for the years up to 1994, the real CRU data are used): the weights, defined in the array "weight", are a particular decreasing function of the time delay. If you care, "weight[y]" for the delay of "y" years is recursively calculated by

  • weight[1] = 1/2
  • weight[y] = weight[y-1] * (y-1.5)/y

For a long time delay, you see that "dw/dy = -1.5 w/y" which means that the weight goes like "y^{-1.5}", a power law. All the numerical constants are variables in the script that can be modified if you wish. The formula for the weights has the interesting feature that they automatically sum to one, in fact for a general value of "d":

  • weight[1] = d
  • weight[y] = weight[y-1] * [1 - (1+d)/y]
I leave you the proof as a homework exercise. The value of "d" leading to the most reasonable color of the noise is clearly related to the critical exponents encoding the temperature autocorrelation.
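Since the script itself is only linked above rather than quoted, here is a minimal Python re-implementation of the recursion and of the random forecast as I read the description; the names weight, fcser, temp, cru, and d follow the text, while the helper functions and everything else are my assumptions about the original script.

```python
import random

def weights(d, n):
    """weight[1] = d, weight[y] = weight[y-1] * (1 - (1+d)/y), for y = 1..n."""
    w = [d]
    for y in range(2, n + 1):
        w.append(w[-1] * (1.0 - (1.0 + d) / y))
    return w

# The weights indeed sum to (almost) one; the neglected tail falls off like y**(-1-d):
print(sum(weights(0.5, 100000)))   # close to 1

def forecast(cru, years_ahead, d=0.5):
    """Random 'prediction': the CRU value from the last observed year plus uniform
    noise in [-0.5, 0.5] plus the persistence term fcser, where fcser is the
    weighted average of the temperatures of all previous years."""
    temp = list(cru)                         # up to 1994: the real CRU data
    w = weights(d, len(cru) + years_ahead)
    predicted = []
    for _ in range(years_ahead):
        # weight[1] multiplies last year's value, weight[2] the year before it, etc.
        fcser = sum(wy * ty for wy, ty in zip(w, reversed(temp)))
        t = cru[-1] + random.uniform(-0.5, 0.5) + fcser
        temp.append(t)
        predicted.append(t)
    return predicted
```

Feeding it the 1856-1994 CRU anomalies and asking for 100 more years should reproduce the qualitative behavior of the graph above.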

Verification statistics

Two verification statistics are calculated to quantify the agreement between the observed CRU temperatures and the randomly predicted temperatures in the 1995-2005 interval:
  • r2 - or "r squared"
  • re - or "reduction of error"

Here, "r2" is the usual correlation coefficient squared - something that measures the correlation between eleven numbers "x_i" (CRU temperatures) and eleven numbers "y_i" (randomly predicted temperatures). The correlation coefficient is a number between -1 and 1 calculated as follows:

  • [Sum(xy) - 11 Average(x) Average(y)] / sqrt(Variance(x) Variance(y))

where "Variance(x) = Sum[(x_i-Average(x))^2]" and similarly for "y". This "r2" statistics is normally used to evaluate statistical skill, and you may see that this number is extremely close to zero whenever you reload the picture; they're much smaller than one. This smallness tells you that the random numbers (of course) are statistically insignificant and the prediction is not trustworthy. The "hockey stick graph" of the past temperatures gives you a tiny "r2", too.

On the other hand, "re" is the reduction of error. You usually get high numbers around 0.5; the Mann-Bradley-Hughes reconstruction gives a rather high verification statistic, too. Because in this experiment you see that "re" is high even though the prediction is based on random data - i.e. on complete garbage - it shows that a high "re" can't be trusted. This "re" is calculated as follows:

  • re = 1 - SumVariances/SumVariancesRef

where "SumVariances" is the sum of "(cru-predictedtemp)^2" over the eleven years while "SumVariancesRef" is the sum of "(cru-averagecru)^2" where "cru" are the actually measured temperatures in the eleven years of the verification period. In other words, the number "re" is a number between 0 and 1 that tells you by how much your prediction is better from the assumption of a simple "null hypothesis" that the temperature is constant over the 11-year period.

This particular program predicts the 1995-2094 temperatures as random data with a particular power law encoding the noise at different time scales, but otherwise oscillating around constant data (the 1994 temperature). You could modify the "predictions" by any kind of bias you want - global warming or global cooling - and the statistical significance of your results would not change much. Also, the M&M mining effect is still not included: if you allow your algorithm to choose the "best trees", you can increase your verification statistics even though the data is still noise.

The punch line is that the reconstructions that imply that the 20th century climate was unprecedented are as statistically trustworthy as sequences of random numbers. If you want to verify the hypotheses, you must actually pay attention to the "r2" statistic. With this method, you can see that the randomly generated predictions are garbage, much like various existing "hockey stick" graphs whose goal was to "prove" that the 20th century climate was unprecedented.

Tuesday, May 02, 2006

Ari Pakman and Rajesh Gopakumar

Short comments about two interesting physicists (and speakers) who visited Harvard.

Ari

Ari Pakman et al. have done something that should have been calculated eight years ago or so: they have verified the three-point functions of the chiral primary operators in the AdS3/CFT2 correspondence. Recall that the same task in AdS5/CFT4 was solved by Lee, Minwalla, Rangamani, and Seiberg in 1998.

The calculation of Ari and his company starts with the Wess-Zumino-Witten models for the groups SU(2) and SL(2,R) that are combined in various ways. The intermediate results depend on the double gamma function and similar monsters. But all these complicated functions eventually cancel between the SU(2) and SL(2,R) parts of the theory to give you a very simple result (essentially equal to one) - one that matches the correlators in the symmetric orbifold CFTs that describe the boundary conformal field theory - correlators calculated by Lunin and Mathur, among others. I can't tell you more details because the paper by Ari et al. is yet to be published.

When Ari visited Harvard at the end of 2004, he showed a picture of Che Guevara on one of his transparencies. At that time, I did not know that particular communist bastard, so I asked Ari who it was - and Ari answered that it was a Czech dissident. Ari assumed that I was joking - because we certainly had to hear about Che all the time, he thought - but I was not joking and in fact the Czechoslovak communists did not tell us a single nice word about Che. He was never popular in Czechoslovakia and as far as I can tell, the Czechoslovak communists did not trust allies such as Che.

Rajesh

Rajesh Gopakumar is continuing with his program to derive the worldsheet theory of a string from the known gauge theory on the boundary of the AdS spacetime. He has a sophisticated sequence of steps to translate the diagrams in gauge theory - and he considers free diagrams in the free gauge theory only. By imagining that the worldsheet is discretized in a particular way, he can construct the hypothetical worldsheet correlators that indeed lead, after an integral over the worldsheet positions (and perhaps other moduli, if you considered string loops), to the simple power-law correlators of the chiral primaries on the CFT side.

The worldsheet correlators satisfy all the usual properties that you expect from a CFT, and as Davide Gaiotto has pointed out, they resemble powers of correlators of spin fields in the Ising model. Indeed, it is not unnatural to expect that the vertex operators for physical states in the hypothetical CFT are represented by some kind of spin fields.

Polar bears thriving

Polar bears are often used as symbols of victims of the so-called global warming. Dr. Mitchell Taylor, a polar bear biologist, debunks various convenient lies of this kind:

11 out of 13 Canadian populations of polar bears are stable or growing. One Southern population could actually be over-abundant.

Monday, May 01, 2006

How to spend 6 billion dollars - results

After 151 votes, the percentages seem to have stabilized, so let me close the polls. The results are the following:
  1. ILC: the linear collider, 47%
  2. Millions of PCs for kids, 31%
  3. Two weeks of Kyoto, 11%
  4. One month of war in Iraq, 8%
  5. Ten space shuttle flights, 3%
The message is that the Pentagon and especially NASA should either improve their public relations or modify their military or research goals because their result is worse than the support for the mad agreement to prevent the climate from changing.

On the contrary, the voters have shown that a new linear collider should be built, and to a lesser extent, they have also demonstrated a pretty good support for the MIT plan to produce millions of $100 laptops for the kids in the third world.

Blayne Heckel: torsion balance tests of gravity

  • Update - van der Waals and Casimir: Nima told me an obviously true statement whose validity I did not appreciate although I should have known it because it is apparently explained in the Landau-Lifshitz books. I thought that there were two problematic forces competing with gravity: the Casimir force and the van der Waals forces. In fact, they're the same one-loop effect. The Casimir force is a macroscopic description of the overall effect of van der Waals forces between the atoms of the metallic plates.
Blayne Heckel from University of Washington gave a colloquium about the submillimeter tests of gravity. He started with motivation for these experiments - with a review of Newton's theory, general relativity, old large dimensions, and warped dimensions, among other possibilities inspired by string theory.

What do you need in this experiment? You need a one-meter-long fiber and a hanging gadget whose size is seven centimeters. There are mirrors on the gadget: because they reflect a laser beam, you can measure the orientation of the gadget.

On that gadget, you find a horizontal disk with many holes that respect a Z_{21} symmetry. This disk can rotate relative to another disk beneath it. In the experiments, it rotates very slowly, with a period comparable to several hours. The holes exert a gravitational torque on the gadget whose magnitude oscillates periodically. This effect caused by gravity is still pretty large. Because you want to study deviations from the "1/r^2" force law, you add another disk with holes whose torques exactly cancel the previously discussed torques assuming that Newton's law is exact.

That's a method to measure the hypothetical deviations only. You must be careful about many details - for example, you must insert a thin conducting foil in between the different layers to screen the electromagnetic effects including the Casimir forces. This foil is really thin because eventually you are able to measure the forces at distances slightly below 100 microns.

The deviations are conveniently parameterized as a Yukawa force that is, relatively to Newton's force, suppressed by the factor "alpha.exp(-r/r0)". You measure the angular orientation of your gadget and finally you evaluate the data: you allow the coefficient "alpha", the distance scale of the new force "r0", as well as various parameters of your gadget's mass distribution to vary, and you calculate the best fit.
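Written out, the parameterization being fitted is the usual Yukawa modification of the Newtonian potential; in the notation of the text, "r0" is the range of the hypothetical new force and "alpha" its strength relative to gravity:

```latex
V(r) \;=\; -\,\frac{G\, m_1 m_2}{r}\left[\,1 + \alpha\, e^{-r/r_0}\,\right].
```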

After having made several versions of their experiment with slightly modified details, they end up with
  • alpha = (-0.7 +- 0.9) x 10^{-4}
which means that "alpha" is really zero and you can't therefore determine "r0" at all. What the value of alpha above shows that what we see is less than a one-sigma effect because 0.7 is less than the standard deviation of 0.9: while that would be enough in climate science to prove a new kind of looming catastrophe, it means "no discovery" in physics.

The intermediate results indicated some effects - equivalent to an additional repulsive force - but these were just 3 sigma effects that disappeared when they tried to do things slightly differently.

They also test for a possible existence of a preferred reference frame and other unusual terms and they can show that the coefficients of the terms responsible for these effects are at least 100,000 times weaker than what you would normally expect if you assumed that CPT and the Lorentz invariance are broken at the Planck scale. While Lee Smolin waits for GLAST to prove his unusual theories that the normal Lorentz invariance is broken, I think that these terrestrial experiments have already falsified these bizarre theories, proving that they were indeed (easily) falsifiable.

Clifford Johnson accepts Jesus Christ (almost)

Clifford Johnson from Cosmic Variance has been interacting with Christians lately.

He has learned that the Christians can be great people and in principle, they could even become scientists. One can talk to them, smile at them, and respect them as human beings. They can write and they do write lovelier articles about Clifford than the left-wing atheists. Such an experience makes a difference.

Indeed, Clifford visited a church service, and it went wonderfully: singing with the piano, clapping, preaching. He has found out that the Christians can be more human and more friendly people than the officers of the PC police. Moreover, some verses from the Bible resonated with the message he wanted to give.

Of course, the idea of Clifford Johnson in the church was rather controversial at Cosmic Variance. Sean Carroll views religion as the source of all lies in the world. He emphasizes that religion is not necessarily evil: it is just false. And one must do everything to fight it; see, for example, these 172 pages.

More seriously, there are some scientifically strange things that many Christians believe. But there are also many scientifically strange things that left-wingers such as Sean Carroll believe. I have discussed the fact that the color or the amount of religion in some ideas cannot universally predict their scientific strength.

Moreover, I still view religion as the basis of moral principles in our society. Yes, I am primarily talking about the Judeo-Christian tradition. But more generally, religions have shown their power to give our lives a direction. Religions can't provide us with the final word about difficult scientific questions; but they have been and they are a part of the transformation of skillful monkeys into human beings.

Science and religion were born into the same cradle. Their diversification only occurred when the human civilization made many other important steps.

Although it has always been clear that I would remain an infidel, the Christian environment is something that many of us are able to live with. If we had to spend years with extraterrestrial aliens, it could be difficult - but if they were Christians, things could simplify dramatically. ;-)