Arcadian Functor

occasional meanderings in physics' brave new world

Name: Marni D. Sheppeard
Location: New Zealand

Thursday, April 19, 2007

What's New

Thanks to Marcus at PF for pointing out the abstract of the new paper by Dr S. D. M. White, with the provocative title "Fundamentalist physics: why Dark Energy is bad for Astronomy". Amusingly, despite the title, the author actually thinks Dark Energy is an acceptable theoretical idea.

Meanwhile, I have run down to the library to get hold of Heisenberg's original article Zs. Phys. 33 (1925) 879-893 in English translation (thank you, anonymous) from the volume Sources of Quantum Mechanics, edited by van der Waerden and published by North-Holland in 1967.

As anonymous has noted, Heisenberg was indeed thinking in terms of Fourier transforms, not necessarily commutative, and his original approach to quantum mechanics takes the measurement geometry philosophy seriously. To quote: "instead it seems more reasonable to try to establish a theoretical quantum mechanics, analogous to classical mechanics, but in which only relations between observable quantities occur." Heisenberg's relations look very categorical in nature. His first example is the category of emission frequencies for an electron. In two short pages he argues that the phases are of just as much significance in the quantum case as in the classical. In terms of the real part of expressions $U(n, n - a) \exp(i \omega (n, n - a) t)$ he says: "Only the origin of the time scale and hence a phase factor common to all the $U$ is arbitrary and accordingly devoid of physical significance, but the phases of the individual $U$ enter in an essential manner into the [induced scattering moment]."
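
For readers without the van der Waerden volume to hand, the noncommutativity is already visible in Heisenberg's rule for multiplying two sets of transition amplitudes, which replaces the ordinary commutative multiplication of classical Fourier coefficients. In modern notation (my paraphrase, not a quotation from the translation) the rule reads

$(UV)(n, n - \beta) = \sum_{\alpha} U(n, n - \alpha) V(n - \alpha, n - \beta)$

so amplitudes compose along chains of transitions $n \to n - \alpha \to n - \beta$, just like arrows in a category, and in general $UV \neq VU$.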

6 Comments:

Blogger CarlBrannen said...

Ouch. From the White paper on dark energy and astronomy: "The fact that it is hard to imagine an enthusiastic amateur community devoted to high-energy physics is another indicator of the cultural differences between the two fields."

April 19, 2007 2:56 PM  
Blogger Kea said...

Carl, LOL! But note that White seems to think that DE has solid foundations.

April 19, 2007 3:05 PM  
Blogger nige said...

Carl, that quotation you give may* be the proof of parallel universes! Amateurs aren't in the same universe as White is ...

_____
*An alternative theory to explain White's remark is it may indicate something about the way arXiv.org encourages contributions from 'amateurs' in parallel universes, but not in this one.

April 19, 2007 8:50 PM  
Blogger nige said...

"But note that White seems to think that DE has solid foundations." - Kea

Even Dr Woit might agree with White, because anything based on observation seems more scientific than totally abject speculation.

If you assume the Einstein field equation is a good description of cosmology, with no errors or omissions of physics, then the observation that distant supernovae aren't slowing down does indeed force you to accept a small positive cosmological constant, with a corresponding 'dark energy' powering just enough long-range repulsion to cancel the gravitational retardation of those supernovae.

The mainstream supposes that quantum gravity only affects general relativity on extremely small distance scales, ie in extremely strong gravitational fields.

According to the uncertainty principle, for virtual particles acting as gauge bosons in a quantum field theory, the energy is related to the duration of existence by: (energy)*(time) ~ h-bar.

Since time = distance/c,

(energy)*(distance) ~ c*h-bar.

Hence,

(distance) ~ c*h-bar/(energy)
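
As a quick numerical illustration of that relation, here is a Python sketch (hbar*c ~ 197.327 MeV fm is just the standard conversion constant, not something extra assumed by the argument):

hbar_c = 197.327  # hbar*c in MeV*fm

# Range r ~ hbar*c / E for a few energies: 1 MeV, 1 GeV, and roughly
# the Planck energy (~1.22e19 GeV).
for E_MeV in (1.0, 1.0e3, 1.22e22):
    r_fm = hbar_c / E_MeV
    print(f"E = {E_MeV:.3g} MeV  ->  r ~ {r_fm:.3g} fm")

# The last line gives r ~ 1.6e-20 fm, ie about 1.6e-35 m, the Planck length.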

Very small distances therefore correspond to very big energies. Since quantum gravity is assumed to be mediated by gravitons capable of graviton-graviton interactions (photons, by comparison, don't interact with one another), the quantum gravity theory in its simplest form is non-renormalizable: at small distances the gravitons would have very great energies and would interact strongly with one another, unlike the photon force mediators in QED, where renormalization works. So the whole problem for quantum gravity has been renormalization, assuming that gravitons (which are unobserved) do indeed cause gravity. This is where string theory goes wrong: it solves a 'problem' which might not even be real, by coming up with a renormalizable quantum gravity based on gravitons, which is then hyped as a 'prediction of gravity'.

The correct thing to do is first to ask how renormalization could work in gravity. In the standard model, renormalization works because each force has distinct charges, so the virtual charges become polarized in the field around a real charge and shield it. That is what renormalization describes: the observable charge as seen from great distances (low-energy interactions) is modified from the charge existing near the bare core at very short distances, well within the pair-production range (high-energy interactions).

The problem is that gravity has only one type of 'charge', mass. There's no anti-mass, so in a gravitational field everything falls one way only, even antimatter. So you can't get polarization of virtual charges by a gravitational field, even in principle. This is why renormalization doesn't make sense for quantum gravity: you can't have a different bare core (high energy) gravitational mass from the long range observable gravitational mass at low energy, because there's no way that the vacuum can be polarized by the gravitational field to shield the core.

This is the essential difference between QED, which is capable of vacuum polarization and charge renormalization at high energy, and gravitation which isn't.

However, in QED there is renormalization of both the electric charge and the electron's inertial mass. Since, by the equivalence principle, inertial mass = gravitational mass, there really does seem to be evidence that mass is renormalizable, with the effective bare core mass higher than the mass observed at low energy (great distances) by the same ratio that the bare core electric charge is higher than the screened electronic charge measured at low energy.

This implies (because gravity can't be renormalized by polarization of charges in a gravitational field) that the renormalization of the electric charge and of the electron's inertial mass in QED both arise because the mass of an electron is external to the electron core, and is associated with the core by the core's electric field. This is why the shielding which reduces the effective electric charge seen at large distances also reduces the observable mass by the same factor. In other words, if there were no polarized vacuum of virtual particles shielding the electron core, the stronger electric field would give it a correspondingly larger inertial and gravitational mass.

Penrose claims in his book 'The Road to Reality' that the bare core charge of the electron is 'probably' (137.036^0.5)*e = 11.7e.

In getting this he uses Sommerfeld's fine structure parameter,

alpha = (e^2)/(4*Pi*permittivity of free space*c*h-bar) = 1/137.036...

Hence, e^2 is proportional to alpha, so you'd expect from dimensional analysis that electric charge shielding should be proportional to (alpha)^0.5.
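
As a numerical sanity check on those figures, here is a quick Python sketch (the SI constants are just CODATA values, nothing special to the argument):

from math import pi

e    = 1.602176634e-19   # elementary charge, C
eps0 = 8.8541878128e-12  # vacuum permittivity, F/m
hbar = 1.054571817e-34   # reduced Planck constant, J*s
c    = 2.99792458e8      # speed of light, m/s

# Sommerfeld's fine structure constant, alpha = e^2/(4*pi*eps0*hbar*c)
alpha = e**2 / (4 * pi * eps0 * hbar * c)
print(1 / alpha)           # ~137.036
print((1 / alpha) ** 0.5)  # ~11.7, the sqrt(137.036) factor Penrose quotes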

However, this is wrong physically.

From the uncertainty principle, the range r of a gauge boson is related to its energy E by:

E = hc/(2*Pi*r).

Since the force exerted is F = E/r (from: work = force times distance moved in the direction of the applied force), we get

F = E/r = [hc/(2*Pi*r)]/r

= hc/(2*Pi*r^2)

= (1/alpha)*(Coulomb's law for electrons)
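
To spell out that last step (my reconstruction of the arithmetic, using the definition of alpha above): note that hc/(2*Pi*r^2) = h-bar*c/r^2, while the Coulomb force between two electrons is

F_Coulomb = e^2/(4*Pi*permittivity of free space*r^2).

Dividing the two,

F/F_Coulomb = 4*Pi*permittivity of free space*c*h-bar/(e^2) = 1/alpha,

independently of r, which is where the 1/alpha factor comes from.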

Hence, the electron's bare core charge really has the value e/alpha, not e/(alpha^0.5) as Penrose guessed from dimensional analysis. This 'leads to predictions of masses'.

It's really weird that this simple approach to calculating the total amount of vacuum shielding for the electron core is so ignorantly censored out. It's published in an Apr. 2003 Electronics World paper, and I haven't found it elsewhere. It's a very simple calculation, so it's easy to check both the calculation and its assumptions, and it leads to predictions.

I won't repeat at length the argument that dark energy is a false theory. Let's just say that over cosmological distances, all radiation, including gauge bosons, gets stretched and degraded in frequency and hence in energy. Thus the exchange radiation which causes gravity is weakened by redshift due to expansion over large distances, and when you include this effect on the gravitational coupling parameter G in general relativity, general relativity then predicts the supernova redshifts correctly. Instead of inventing an additional unobservable, dark energy, to offset an unobserved long-range gravitational retardation, you simply have no long-range gravitational deceleration in the first place: there is no outward acceleration needed to offset inward gravity at long distances. The universe is simply flat on large scales because gravity is weakened by the redshift of gauge bosons exchanged over great distances in an expanding universe where gravitational charges (masses) are receding from one another. Simple.
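
One crude way to put numbers on that claim, in Python (the assumption that the effective coupling simply scales as G/(1+z), with z the redshift of the exchanged gauge bosons, is my own illustration; nothing that specific is established above):

G = 6.674e-11  # gravitational constant, m^3 kg^-1 s^-2

# Illustrative assumption ONLY: exchange radiation redshifted by (1+z)
# weakens the effective coupling to G_eff = G/(1+z).
for z in (0.0, 0.5, 1.0, 2.0):
    G_eff = G / (1.0 + z)
    print(f"z = {z}:  G_eff/G = {G_eff/G:.2f}")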

Another problem with general relativity as currently used is the T_{ab} tensor, which is usually given a smooth source for the gravitational field, such as a continuum of uniform density.

In reality, the whole idea of density is a statistical approximation, because matter consists of particles of very high density distributed in the vacuum. So the idea that general relativity shows spacetime is flat on small distance scales is just bunk: it's based on the false statistical approximation (which holds on large scales, not on small scales) that you can represent the source of gravity (ie, quantized particles) by a continuum.
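
To see how extreme that statistical averaging is, here is a quick Python illustration (standard proton mass and charge radius; the numbers are only order-of-magnitude):

import math

m_nucleon = 1.67e-27   # kg
r_nucleon = 0.84e-15   # m, approximate proton charge radius

# Density of a single nucleon versus an everyday continuum density.
rho_nucleon = m_nucleon / ((4.0 / 3.0) * math.pi * r_nucleon**3)
print(f"nucleon density ~ {rho_nucleon:.1e} kg/m^3")  # ~7e17 kg/m^3
print("water density   ~ 1.0e3 kg/m^3")               # ~15 orders of magnitude less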

So the maths used to make T_{ab} generate solvable differential equations is an approximation which is correct at large scales (after you make allowances for the mechanism of gravity, including redshift of gauge bosons exchanged over large distances), but is inaccurate in general on small scales.

General relativity doesn't prove a continuum exists; it requires a continuum, because it's based on continuously variable differential tensor equations which don't easily model the discontinuities in the vacuum (ie, real quantized matter). So the nature of general relativity forces you to use a continuum as an approximation.

Sorry for the length of comment, feel free to delete.

April 19, 2007 10:24 PM  
Blogger L. Riofrio said...

Remember that the big science HEP community can't fund their increasingly expensive experiments, and DE hasn't caught on with the public at all. Hundreds of papers about DE just decrease the chance of any one of them being right. They are losing.

April 20, 2007 4:48 AM  
Blogger Doug said...

I am uncertain about dark energy. I do suspect that there is untapped kinetic energy associated with the three ordinary types of neutrinos.

BUT look at this apparently eclipsed black hole in the NGC 1365 AGN, captured by Chandra, slide #3 of 10 from April 18:

http://www.usatoday.com/tech/space/twis07/flash.htm

The red square is in slide #1 of 10.

April 20, 2007 1:25 PM  
