Physical Insights

An independent scientist’s observations on society, technology, energy, science and the environment. “Modern science has been a voyage into the unknown, with a lesson in humility waiting at every stop. Many passengers would rather have stayed home.” – Carl Sagan

Posts Tagged ‘particle physics’

Detecting a nuclear fission reactor at the centre of the Earth.


Some months ago I wrote a post discussing Marvin Herndon’s controversial theory regarding a nuclear fission reactor, a “georeactor”, at the molten core of the Earth. One point I noted is that it should be entirely practical to test such a theory – to confirm or rule out the existence of such a reactor, and to study its characteristics – simply by measuring the flux of neutrinos (electron antineutrinos, specifically) coming from inside the Earth.

Here’s an interesting paper I found discussing just that.

Of course, the usual cautions regarding ArXiv preprint material apply – it is not peer-reviewed, and should be treated with skeptical scrutiny and caution.

Ultimately, the best possible site for a geo-reactor search is Hawaii (panel d) in Fig. 1). This option requires, however, construction of a new excavated laboratory. In Hawaii, situated entirely on the oceanic crust with very low geo-U/Th, only the small signal from U/Th deep in the Mantle is visible. The remoteness from populated continents on either side reduces the power reactor signals to a comfortably low level.

With these considerations in mind, I wonder whether the IceCube experiment at the South Pole wouldn’t be a much more useful detector site – offering perhaps the best isolation from man-made reactors, and from crustal U/Th radioactivity, that you could possibly get. Not to mention that the detector itself already exists, and doesn’t need to be constructed.
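For a rough sense of the magnitude of the signal being discussed, here’s a back-of-the-envelope sketch (in Python; the numbers are entirely my own illustrative assumptions). The 3 TW power figure is simply picked from the range Herndon has suggested, and the ~200 MeV and ~6 antineutrinos per fission are standard rules of thumb for a fission reactor.

```python
# A rough back-of-envelope sketch (mine, not from the post or the paper):
# estimate the electron-antineutrino flux at the Earth's surface from a
# hypothetical georeactor of a few TW at the centre of the Earth.

import math

POWER_W = 3e12                        # assumed georeactor thermal power, ~3 TW
E_PER_FISSION_J = 200e6 * 1.602e-19   # ~200 MeV released per fission
NU_PER_FISSION = 6                    # ~6 antineutrinos per fission (fission-product beta decays)
R_EARTH_M = 6.371e6                   # mean radius of the Earth, metres

fission_rate = POWER_W / E_PER_FISSION_J        # fissions per second
nu_rate = NU_PER_FISSION * fission_rate         # antineutrinos emitted per second
flux = nu_rate / (4 * math.pi * R_EARTH_M**2)   # flux at the surface, per m^2 per second

print(f"{flux:.1e} antineutrinos per m^2 per second")   # ~1e9 /m^2/s, i.e. ~1e5 /cm^2/s
```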

Written by Luke Weston

October 14, 2008 at 5:04 pm

What really goes on at the Large Hadron Collider – Part 2: or, How I Learned to Stop Worrying and Love the Black Hole


No, black holes produced in the Large Hadron Collider (LHC) aren’t going to kill you, eat Geneva, or destroy the Earth. If those black holes are formed, it will be really quite fantastic, and it will represent everything that we’ve hoped for from LHC and more.

I’ll take a couple of hours from my Saturday afternoon to write this, and hopefully it will reduce, even if just infinitesimally, the number of stupid conversations I have to listen to where people’s entire knowledge of particle physics comes from the Herald Sun. I consider that a worthwhile investment of my time, and if it means that I don’t have to hear about teenage girls tragically taking their own lives for absolutely no reason at all because of such nonsense, then I feel that that’s a really good thing, too.

If the centre-of-mass energy of two colliding elementary particles (a maximum of 14 TeV for pp collisions in the Large Hadron Collider) reaches the fundamental Planck scale, \mathrm{E_{D}}, and their impact parameter, b, is smaller than the corresponding Schwarzschild radius, \mathrm{R_{H}}, then a black hole will indeed be produced. However, the energy corresponding to the conventional Planck scale, \mathrm{E_{Pl} \sim 10^{28} eV}, is a colossal amount of energy – entirely outside the reach of any experimental physicist. So, surely, generation of microscopic black holes (hereafter, \mathrm{\mu BH}) at the Large Hadron Collider (LHC) has got to be impossible – doesn’t it?

According to the Standard Model of particle physics, \mathrm{\mu BH} generation in a particle collider is indeed impossible at the TeV-scale energies associated with the current generation of high energy experimental particle physics endeavours, such as the LHC. The much-publicised speculations regarding the possibility of \mathrm{\mu BH} formation at the LHC are based on speculative hypotheses derived from theoretical models of cosmology and particle physics beyond the Standard Model (“new physics”).
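Just to illustrate how hopeless this is under ordinary four-dimensional gravity, here’s a quick check (entirely my own, purely illustrative): the classical Schwarzschild radius corresponding to the entire 14 TeV collision energy.

```python
# An illustrative check (mine) of why, with ordinary four-dimensional gravity,
# black hole production at the LHC is hopeless: the Schwarzschild radius of
# the entire 14 TeV collision energy is absurdly small, some fifteen orders
# of magnitude below even the Planck length.

G = 6.674e-11          # gravitational constant, m^3 kg^-1 s^-2
C = 2.998e8            # speed of light, m/s
EV_J = 1.602e-19       # joules per eV

m_kg = 14e12 * EV_J / C**2        # 14 TeV expressed as an equivalent mass
r_s = 2 * G * m_kg / C**2         # classical Schwarzschild radius

print(r_s)             # ~3.7e-50 m, versus a Planck length of ~1.6e-35 m
```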

Certain models put forward some years ago by theoretical physicists offer a seemingly neat and efficient route to answering questions of great interest to particle physicists, such as the hierarchy problem, and they involve the existence of higher spatial dimensions.

The novelty of these higher-dimensional models lies in the fact that it is no longer necessary to assume that these extra dimensions have sizes close to the Planck length (\mathrm{ \sim 10^{-35}\ m}). Rather, large extra dimensions could be as large as around a millimetre, if we suppose that the ‘fields of matter’ – those fields of relevance to electroweak interactions and QCD, for example – ‘live’ in the 3+1 dimensional hypersurface of our 3-brane – our familiar 3+1 dimensional world – and that only the gravitational field can interact across the higher-dimensional universe. Experiments involving the direct measurement of Newtonian gravity put upper bounds on the size of the extra dimensions at less than a few hundred microns. Under such an approach, the traditional Planck scale, corresponding to \mathrm{E_{Pl} \sim 10^{28} eV}, is no more than an effective scale, and the real fundamental Planck scale in D dimensions is given by \mathrm{E_{D} = (E_{Pl}^{2} / V_{D-4})^{1/(D-2)}}, where \mathrm{V_{D-4}} is the volume associated with the D - 4 extra dimensions. In 10 dimensions, with the radii of the extra dimensions at the Fermi scale, we find \mathrm{E_{D} \sim TeV}.
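To put some rough numbers on that last formula, here’s a minimal sketch (mine, in natural units, with one representative choice of parameters: D = 10 and extra-dimension radii of about a femtometre, the ‘Fermi scale’ mentioned above).

```python
# A minimal back-of-envelope sketch (not from reference [1]) of the formula
# E_D = (E_Pl^2 / V_{D-4})^(1/(D-2)) quoted above, in natural units
# (hbar = c = 1), converting lengths via hbar*c ~ 0.197 GeV.fm.

E_PL_GEV = 1.22e19        # conventional four-dimensional Planck energy, GeV
HBARC_GEV_FM = 0.197      # hbar*c in GeV.fm

def effective_planck_scale(D, radius_fm):
    """Effective D-dimensional Planck scale E_D, in GeV."""
    r_natural = radius_fm / HBARC_GEV_FM     # extra-dimension radius in GeV^-1
    volume = r_natural ** (D - 4)            # V_{D-4}, in GeV^-(D-4)
    return (E_PL_GEV ** 2 / volume) ** (1.0 / (D - 2))

# D = 10 with Fermi-scale (~1 fm) extra dimensions: E_D comes out at roughly
# 10^4 GeV, i.e. of order ten TeV rather than 10^19 GeV.
print(effective_planck_scale(10, 1.0))
```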

OK, now, if you’re thinking that that all sounds horribly complicated, I understand. I don’t understand it all, either. Allow me to try and re-explain that.

Imagine two particles being smashed together within the collider. As they come together, their gravitational interaction increases according to the familiar inverse square law of classical gravitational physics. The formation of a black hole, at least the astrophysical ones with which we’re familiar, is a phenomenon which is all about gravity. For the force of gravity to become strong enough to create a black hole in our proton-proton system, these protons would have to be brought together within a distance on the scale of the Planck length – \mathrm{1.6 \times 10^{-35}\ m}. The energy corresponding to such a distance scale is \mathrm{\hbar c / l_{Pl} \approx 1.2 \times 10^{28}\ eV} – an energy that dwarfs the most energetic cosmic rays known – which themselves dwarf the proudest achievements of Earthly accelerator physicists.

But this all assumes that the gravitational force exists only within the familiar world of three dimensions. If gravity extends across the higher spatial dimensions, the force of gravitational interactions increases much more rapidly with increasing proximity of the particles, and at very small length scales associated with high energy experimental particle physics, it’s just barely possible that such phenomena can start to become relevant.

So suppose when the particles interact with these very high energies, they’re interacting on the scale of the higher-dimensional world. As gravity “becomes strong”, black hole formation could start to become relevant on a length scale on the order of \mathrm{10^{-19}\ m}. That is still an inconceivable scale compared to everyday experience, but it is a factor of ten thousand trillion times closer to our reach than the Planck scale in a three-dimensional universe. And this length scale corresponds to energies on the order of only a couple of teraelectron volts. It is these TeV-scale energies that are well within the grasp of the LHC.
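Here’s a quick numerical check (my own arithmetic, nothing more) of those two length-to-energy conversions: the Planck length, and the ~10^-19 m scale at which TeV-gravity models would become relevant.

```python
# A quick numerical check (mine, not the author's) of the two length-to-energy
# conversions used above: E = hbar*c / l, for the Planck length and for the
# ~1e-19 m scale at which TeV-scale gravity could become relevant.

HBAR = 1.054571e-34     # J.s
C = 2.998e8             # m/s
EV = 1.602e-19          # joules per eV

def energy_eV(length_m):
    """Energy scale (in eV) corresponding to a given length scale."""
    return HBAR * C / length_m / EV

print(energy_eV(1.616e-35))   # ~1.2e28 eV  (Planck length)
print(energy_eV(1e-19))       # ~2e12 eV, i.e. a couple of TeV
```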

NB: For an extremely readable introduction to these possibly daunting concepts of theoretical particle physics and cosmology, which is fully accessible to all readers, with no prior knowledge of advanced physics required, Lisa Randall’s Warped Passages is suggested reading.

If such models have any meaning, a fundamental Planck scale in the TeV range is effectively a natural choice (and not an arbitrary one based on phenomenological motivations), because it essentially presents a resolution of the hierarchy problem. [1]

If the signature of the decay of a microscopic black hole is observed within the LHC’s detectors, then on that day these fascinating models cease to be speculative theoretical physics – they become our best empirically-motivated description of what nature is.

If the fundamental Planck scale is thus brought down into the TeV range, then \mathrm{\mu BH} generation in TeV-scale particle collider experiments becomes possible – maybe. The 14 TeV centre-of-mass energy of the LHC could then allow it to become a black hole factory, with a \mathrm{\mu BH} production rate as high as perhaps about \mathrm{1\ s^{-1}}.

Many studies are underway to make a precise evaluation of the cross-section for the creation of black holes via parton collisions, but it appears that the naive geometric approximation, \mathrm{\sigma \sim \pi R_{H}^{2}}, is quite reasonable for setting the orders of magnitude. [1]
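As an illustration of those orders of magnitude, here’s a small sketch (mine, not taken from reference [1]) that applies the naive geometric cross-section to an assumed horizon radius of ~10^-19 m, and converts it into an event rate using the LHC’s nominal design luminosity of 10^34 cm^-2 s^-1 – both numbers being assumptions for illustration only.

```python
# A small order-of-magnitude sketch (not from reference [1]): the naive
# geometric cross-section sigma ~ pi * R_H^2 for an assumed horizon radius
# of ~1e-19 m, and the corresponding event rate at the LHC's nominal design
# luminosity of 1e34 cm^-2 s^-1.

import math

R_H_CM = 1e-17                      # assumed horizon radius: 1e-19 m, in cm
LUMINOSITY = 1e34                   # nominal LHC design luminosity, cm^-2 s^-1
BARN_CM2 = 1e-24                    # 1 barn = 1e-24 cm^2

sigma_cm2 = math.pi * R_H_CM ** 2       # geometric cross-section, cm^2
sigma_nb = sigma_cm2 / BARN_CM2 * 1e9   # expressed in nanobarns
rate_per_s = sigma_cm2 * LUMINOSITY     # events per second

# ~0.3 nb and a few events per second: the same order as the ~1/s figure above.
print(f"sigma ~ {sigma_nb:.2f} nb, rate ~ {rate_per_s:.1f} per second")
```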

The possibility of the presence of large extra dimensions would be doubly favourable for the production of black holes. The key point is that it allows the Planck scale to be reduced to accessible values, but additionally, it also allows the Schwarzschild radius to be significantly increased, thus making the \mathrm{b < R_{H}} condition for black hole formation distinctly easier to satisfy.

One notable property of any microscopic black holes that could result from LHC collisions is that these black holes will have radii corresponding to the TeV scale – \mathrm{10^{-19}\ m} – much smaller than the size of the large extra dimensions. Hence, any \mathrm{\mu BH} can be considered as totally immersed in a D-dimensional space (which has, to a good approximation, a time dimension and D-1 large, non-compact spatial dimensions). It follows that such a \mathrm{\mu BH} does indeed rapidly ‘evaporate’ into fundamental particles, with an extremely short lifetime, on the order of \mathrm{10^{-26}\ s}. [1] This argument is exclusively based on the same theoretical physics that predicts the possibility of microscopic black hole formation at TeV-scale energies, and is completely independent of the oft-cited argument regarding Hawking radiation.

The temperature of the \mathrm{\mu BH}, typically about 100 GeV under such conditions, is much lower than it would be for a black hole of the same mass in a four-dimensional space, but nevertheless, the black hole retains the expected characteristic quasithermal radiation spectrum corresponding to its temperature.

In the case of the hypothetical microscopic black holes, if they can be produced in the collisions of elementary particles, they must also be able to decay back into elementary particles. Theoretically, it is expected that microscopic black holes would indeed decay via Hawking radiation, a mechanism based on fundamental physical principles, for which there is general consensus as to the validity.

It is well established in the literature that a \mathrm{\mu BH} produced at the LHC could not possibly accrete matter in a potentially Earth-destroying fashion, even if, somehow, it turned out that such a black hole were unexpectedly stable. [3] [4] If the physical models that provide a basis for considering the possibility of \mathrm{\mu BH} formation in TeV-scale particle collisions are indeed valid, then microscopic black holes would be produced not only in our particle collider experiments, but also in high energy cosmic ray interactions, and some of those black holes would have been stopped inside the Earth or other astronomical bodies. The continued stability of these astronomical bodies demonstrates that such black holes, if they exist, cannot possibly present any credible danger of planetary destruction.

Familiar astrophysical black holes have very large masses, ranging from several solar masses to perhaps as high as a billion solar masses for the largest supermassive black holes. On the other hand, the maximum centre-of-mass energy of pp collisions in the LHC corresponds to an equivalent mass of the order of \mathrm{10^{-53}} solar masses. If a microscopic black hole is produced in the LHC, it will have a mass far, far, far smaller than any black hole with which astrophysicists may be familiar, and possibly markedly different characteristics as well.
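That \mathrm{10^{-53}} figure is easy to check for yourself; here’s the arithmetic as a trivial sketch (mine):

```python
# A quick arithmetic check (mine) of the "~1e-53 solar masses" figure:
# convert the LHC's 14 TeV centre-of-mass energy into an equivalent mass
# and compare it with the mass of the Sun.

E_EV = 14e12                 # 14 TeV, in eV
EV_J = 1.602e-19             # joules per eV
C = 2.998e8                  # speed of light, m/s
M_SUN_KG = 1.989e30          # solar mass, kg

mass_kg = E_EV * EV_J / C**2
print(mass_kg)               # ~2.5e-23 kg
print(mass_kg / M_SUN_KG)    # ~1.3e-53 solar masses
```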

The rate at which any stopped black hole would accrete the matter surrounding it and grow in mass is dependent on how it is modeled. Several scenarios for hypothetical matter accretion into a terrestrial black hole have been studied and reported in the literature, where well-founded macroscopic physics has been used to establish conservative or worst-case-scenario limits to the rate of accretion of matter into a terrestrial black hole.

In the extra-dimensional scenarios that motivate the possibility of \mathrm{\mu BH} formation at TeV-scale energies (but which also motivate the extreme instability of those black holes), the rate at which the \mathrm{\mu BH} would accrete matter would be so slow, in a ten-dimensional universe, that the Earth would survive for billions of years before any harm befell it – a limiting time scale for Earth’s survival which is comparable to that already set by the finite main-sequence lifetime of the Sun.

At the intersection of astrophysics and particle physics, cosmology and field theory, quantum mechanics and general relativity, the possibility of \mathrm{\mu BH}-production in particle collider experiments such as the LHC opens up new fields of investigation and could constitute an invaluable pathway towards the joint study of gravitation and high-energy physics. Their possible absence already provides much information about the early universe; their detection would constitute a major advance. The potential existence of extra dimensions opens up new avenues for the production of black holes in colliders, which would become, de facto, even more fascinating tools for penetrating the mysteries of the fundamental structure of nature. [1]

The production of microscopic black holes at the LHC is just barely possible, perhaps. These microscopic black holes do not represent some kind of doomsday scenario for the Earth – quite the opposite, in fact.
They represent, arguably, some of the most incredible insights into exciting new physics that we could wish to take away from the LHC, with profoundly interesting implications for the future of physics.

There have also been suggestions over the years that there may exist magnetic monopoles, particles with non-zero free “magnetic charge”. As was originally established by Dirac, any free magnetic charge on a monopole will be quantised, as is electric charge, and necessarily much larger in magnitude than the elementary quantum of electric charge. For this reason, past efforts to look for evidence of a magnetic monopole have looked for strongly ionising particles with quantised magnetic charge.
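As a small aside for the numerically inclined, here’s a minimal sketch (mine) of just how large Dirac’s minimum magnetic charge is, working in Gaussian-style units where the fine-structure constant is α = e²/ħc:

```python
# A minimal sketch (mine) of Dirac's quantisation condition: in Gaussian-style
# units the smallest allowed magnetic charge is g = e / (2 * alpha), roughly
# 68.5 times the elementary electric charge, which is why monopoles would be
# such strongly ionising particles.

ALPHA = 1 / 137.036          # fine-structure constant

g_over_e = 1 / (2 * ALPHA)   # ratio of the Dirac charge to the elementary electric charge
print(g_over_e)              # ~68.5
```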

In some grand unified theories, though not in the Standard Model of particle physics, magnetic monopoles are predicted to possibly be able to catalyse the decay of the protons comprising ordinary baryonic matter, into leptons and unstable mesons. If this is the case, successive collisions between a monopole and atomic nuclei could release substantial amounts of energy. Magnetic monopoles which may possess such properties are predicted to have masses as high as \mathrm{10^{24}\ eV/c^{2}}, [4] or higher, making them far too massive to be produced at the LHC.

The impact of such magnetic monopoles interacting with the Earth has been presented and quantitatively discussed in the literature [2], where it was concluded that the potential for a monopole to catalyse nucleon decay would allow the monopole to destroy nothing more than a microscopic quantity of matter – around \mathrm{10^{18}} nucleons, roughly a microgram – before exiting the Earth.

Independent of this conclusion, if magnetic monopoles could be created in collisions in the LHC, high-energy cosmic rays would already be creating numerous magnetic monopoles, with many of them striking the Earth and other astronomical bodies. Due to their large magnetic charges and strong ionising effect, any magnetic monopoles generated in this way would be rapidly stopped within the Earth.

The continued existence of the Earth and other astronomical bodies over billions of years of bombardment with ultra-high-energy cosmic rays demonstrates that any such magnetic monopoles can not catalyse proton decay at any appreciable rate. If particle collisions within the LHC could produce any dangerous magnetic monopoles, high-energy cosmic ray processes would already have done so. [3] [4]

As with the consideration of the possibility of the formation of black holes in TeV-scale particle collisions at the LHC, the continued existence of the Earth and of other astronomical bodies such as the Sun demonstrates that any magnetic monopoles produced by high-energy processes – be they in a particle collider or in a high-energy cosmic ray interaction – must be harmless.

As with the case of black hole production within the LHC, these arguments are not to say that such a process could not occur – only that such processes are not dangerous.

Whilst the existence of magnetic monopoles is viewed with the most extreme skepticism by the overwhelming majority of physicists, perhaps there is just the remotest possibility that they exist. The formation of a magnetic monopole at the LHC will not endanger the Earth, but if such a monopole were to be detected, we would of course find ourselves faced with the prospect of re-writing two hundred years of physics.

As with the formation of a microscopic black hole, the formation of a magnetic monopole does not represent some kind of doomsday scenario – it represents one of the most incredible revelations we could possibly hope to find from experiments at the LHC.

References and suggested reading:

[1] Aurelien Barrau and Julien Grain.
The case for mini black holes.
CERN Courier, 2004.
http://cerncourier.com/cws/article/cern/29199

[2] J. P. Blaizot, John Iliopoulos, J. Madsen, Graham G. Ross, Peter Sonderegger, and Hans J. Specht.
Study of potentially dangerous events during heavy-ion collisions at the LHC: report of the LHC Safety Study Group.
CERN, Geneva, 2003.

[3] Savas Dimopoulos and Greg L. Landsberg.
Black holes at the LHC.
Phys. Rev. Lett., 87:161602, 2001.

[4] John Ellis, Gian Giudice, Michelangelo Mangano, Igor Tkachev, and Urs Wiedemann (LHC Safety Assessment Group).
Review of the safety of LHC collisions.
Journal of Physics G: Nuclear and Particle Physics, 35(11):115004, 2008.

Written by Luke Weston

September 15, 2008 at 7:57 am

What really goes on at the Large Hadron Collider?


What really goes on at the Large Hadron Collider?

What are the questions that it actually seeks to answer? I’ll begin by outlining these questions.

The Higgs mechanism: Can the Higgs boson be empirically detected? What if it doesn’t actually exist?

(A bit of trivia about the Higgs boson: when Leon Lederman wrote The God Particle and coined that phrase – which journalists love and scientists hate – he originally wanted to call it the ‘goddamn particle’, but the publisher wouldn’t allow it.)

Even if a Higgs boson is not found, at the TeV-scale energies being probed here, there will be an extremely good chance of getting some insights into whatever process is responsible for things like electroweak symmetry breaking, if it’s not the Higgs mechanism.

The Hierarchy Problem: Why is gravity such a very weak force, compared to the other fundamental forces? Is the answer related to higher spatial dimensions, or to supersymmetry?

Higher dimensions: Can we empirically “see” evidence of higher spatial dimensions in the universe in high-energy particle interactions? What are those dimensions like? What are their properties?

CP-violation: How did CP-violation arise in the primordial universe, to the degree that it did? What mechanism is responsible for CP-violation, and hence for the resultant amount of matter in the cosmos?

Dark matter: It’s quite well accepted today that there is indeed “dark matter”, and that it is composed of weakly-interacting particles that have a significant amount of mass. Exactly what are these particles? Could the lightest supersymmetric particles, such as the neutralino, if they exist, be consistent with the dark matter?

Black holes: If “micro black holes” really could be produced in LHC interactions, do they behave as predicted by Hawking, evaporating and radiating Hawking radiation?

Supersymmetry: If supersymmetric particles can be observed, what is the mechanism responsible for supersymmetry breaking, making the s-particles so massive, compared to the familiar standard model particles? Can the existence of supersymmetry explain the strength of gravity, or the composition of dark matter?

In every one of these areas, questions are potentially going to be answered – or the research considerably furthered – by work at the LHC.

An entire book (or many) could be written on each and every one of these things. For now, though, I’ll elaborate a little on just one of these areas of interest – CP-violation and its connection to cosmology.

One of the great open problems in physics at the moment is the question of why the Universe has so much matter in it, and essentially no antimatter.

If matter and antimatter (quarks and antiquarks, fundamentally) were created in equal amounts following the Big Bang, then all the matter and antimatter would annihilate, and the matter-filled Universe we see would not exist.

Something that we might expect, perhaps somewhat naively, from the laws of physics is CP-symmetry – that is, that the laws of physics are ‘symmetrical’ under the combined CP transformation (C and P being the charge-conjugation and parity operators). In other words, the laws of physics would treat matter and antimatter symmetrically, since CP-symmetry is the symmetry between matter and antimatter, and we might expect particles and antiparticles to behave ‘symmetrically’ in every way.

However, as per the above, this isn’t true. At least, it’s not always true. There exists some mechanism whereby CP-symmetry is violated, just a tiny bit – and it was violated just enough in the early universe to create the universe that we see. This is CP-violation, an example of a symmetry violation in physics.

The CP operator is the product of two: C for charge conjugation, which transforms a particle into its antiparticle, and P for parity, which creates the mirror image of a physical system. The strong interaction and electromagnetic interaction seem CP-invariant, but a slight degree of CP-violation is observed in weak interactions under certain conditions.

The greater the degree of CP-violation present in the early Universe, the greater the amount of matter left over in the Universe. Thus, the understanding of CP-violation plays an important role in physical cosmology, in explaining just how much matter the universe ended up with.

Quoteth Wikipedia a bit because I’m getting sick of writing and can’t remember exact dates and all the names:

Until 1956, parity conservation was believed to be one of the fundamental geometric conservation laws (along with conservation of energy and conservation of momentum). However, in 1956 a careful critical review of the existing experimental data by theoretical physicists Tsung-Dao Lee and Chen Ning Yang revealed that while parity conservation had been verified in decays by the strong or electromagnetic interactions, it was untested in the weak interaction. They proposed several possible direct experimental tests. The first test based on beta decay of Cobalt-60 nuclei was carried out in 1956 by a group led by Chien-Shiung Wu, and demonstrated conclusively that weak interactions violate the P symmetry or, as the analogy goes, some reactions did not occur as often as their mirror image.

Only a weaker version of the symmetry could be preserved by physical phenomena, which was CPT-symmetry. Besides C and P, there is a third operation, time reversal (T), which corresponds to reversal of motion. Invariance under time reversal implies that whenever a motion is allowed by the laws of physics, the reversed motion is also an allowed one. The combination of CPT is thought to constitute an exact symmetry of all types of fundamental interactions. Because of the CPT-symmetry, a violation of the CP-symmetry is equivalent to a violation of the T-symmetry. CP violation implied nonconservation of T, provided that the long-held CPT theorem was valid. In this theorem, regarded as one of the basic principles of quantum field theory, charge conjugation, parity, and time reversal are applied together.

From that last paragraph, of course, we arrive at the seemingly incredible idea, explained perhaps most famously by Richard Feynman, that an antiparticle behaves, mathematically, exactly like the corresponding particle travelling backwards in time. That is indeed how quantum field theory predicts the existence of antiparticles.

The \mathrm{K^{0}} (neutral K) meson (or kaon) consists of a down quark and a strange antiquark – \mathrm{d\bar{s}} – and its corresponding antiparticle \mathrm{\bar{K}^{0}} is of course made up of a strange quark and a down antiquark, \mathrm{s\bar{d}}.

Similarly, the \mathrm{B^{0}} is \mathrm{d\bar{b}}, and the \mathrm{\bar{B}^{0}} is \mathrm{b\bar{d}}. (B mesons, by definition, contain a b quark or antiquark, which is why they’re named thus, and kaons contain a strange quark combined with a non-strange quark.)

These mesons can ‘oscillate’ back and forth – a particle spontaneously turning into its antiparticle, and vice versa, via second-order weak interactions (the ‘box diagram’ processes of the Feynman-diagram picture).

But the transitions between particle and antiparticle, and between antiparticle and particle, don’t occur at quite the same rate – because of the CP-violating term!

CP-violation was first discovered experimentally in the neutral kaon system, in 1964 – but today, most experimental studies of CP-violation deal with the B-mesons.

Two of today’s best known particle physics experiments investigating CP-violation in the decays of B-mesons are the Belle and BaBar experiments, in which B mesons are produced in electron-positron collisions – the latter at the Stanford Linear Accelerator Center, and the former at an electron-positron collider at KEK in Japan. The interaction points are surrounded by optimised detectors to watch the decay of the B-mesons created. When, say, a \mathrm{B^{0}} decays into some final state – say a \mathrm{K^{0}} and a couple of leptons – the anti-reaction, a \mathrm{\bar{B}^{0}} decaying into the corresponding antiparticles, will also occur, but at a slightly different rate.
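To make concrete what ‘at a different rate’ means in practice, here’s a minimal illustrative sketch (entirely mine – the event counts are invented, not real data) of the kind of raw asymmetry such experiments effectively measure:

```python
# A minimal illustrative sketch (not from the post): the raw CP asymmetry that
# experiments like Belle, BaBar and LHCb effectively measure is the normalised
# difference between the B0-bar and B0 decay rates into a given final state
# and its CP conjugate. The event counts below are hypothetical.

def raw_cp_asymmetry(n_bbar, n_b):
    """A_CP = (N(B0bar -> fbar) - N(B0 -> f)) / (N(B0bar -> fbar) + N(B0 -> f))."""
    return (n_bbar - n_b) / (n_bbar + n_b)

# Made-up event counts for a pair of CP-conjugate decay channels:
print(raw_cp_asymmetry(n_bbar=10400, n_b=9600))   # ~ +0.04
```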

The observation of these decay events inside these detector experiments, like LHCb and Belle, provides insights into the mechanism by which the symmetry is broken to the degree that it is.
The LHCb detector experiment on the LHC is intended to be very similar in nature to these existing experiments – with similar goals.

Not Even Wrong: LHC and Doomsday

Yes, you’ve heard it all before. Black holes, strangelets, and even attack from Nibiru.

I will not spend any time entertaining such things, which we’ve thoroughly dismissed as nonsense already, other than to look at the latest in a long line of rubbish anti-science claims.

Yesterday, a group of LHC critics filed a suit against CERN in the European Court of Human Rights in Strasbourg. The authors of the suit are physicists, professors and students, largely from Germany and Austria, who feel that the operation of the $10 billion Large Hadron Collider near Geneva poses grave risks for the safety and well-being of the 27 member states of the European Union and their citizens.

Who are these authors? What are their qualifications? What are their arguments? Does their science stack up?

Bosenovas are a new risk theory in the suit, besides the better known Strangelets and Lowered Vacuum State theories. Unlike the others there is some experimental evidence for a Bosenova, but this phenomenon of implosion/explosion has only been produced in small groups of atoms of Rubidium-85 in an ultracold state, a Bose-Einstein Condensate.

What might occur at the LHC, is a new type of Bosenova from what amounts to a BEC used there as a coolant, an ultracold Superfluid Helium II, of about 60 metric tonnes in the LHC ring, and a further 60 tonnes of somewhat warmer Superfluid Helium I in refrigeration plants on the surface connected to the subterranean main ring. Whether possible or not is unknown, no experiments having been done by CERN to rule out the possibility, nor any theoretical model studies.

A bosenova is a very small, supernova-like explosion, which can be induced in a Bose–Einstein condensate (BEC) by changing the magnetic field in which the BEC is located, so that the BEC quantum wavefunction’s self-interaction becomes attractive. This is a poorly understood, very, very interesting phenomenon – but it’s not dangerous, and it’s of little relevance to the LHC.

This stuff – the quoted material above, and the further excerpts below – is taken from this page.

But superfluid Helium II BEC is being used in great quantities as a coolant in certain nuclear reactors and particle accelerators.

The possibilities of a giant BEC bosenova produced in superfluid Helium II haven’t been investigated. The matter is urgent as 120 T of superfluid Helium II are being used at the Large Hadron Collider at Geneva, whose energies far surpass any other collider’s, not only beam energies, but RF applied, extreme Tesla Fields by superconducting magnets, and electrical energies equivalent to the consumption of Geneva…

“superfluid Helium II BEC”…. well, it’s just liquid helium. Liquid helium’s quantum-hydrodynamical properties are really cool indeed, but it’s just liquid helium. These days, liquid helium is routine technology.

We use LHe cryostats every day in scientific research and in technical applications: cryostats for scientific experiments, superconducting niobium RF resonator cavities for particle accelerators, superconducting magnets for nuclear magnetic resonance spectrometers, and the MRI machines at most major hospitals.

These are all based around liquid helium cryostats, and none of them ever lead to “Bosenova” explosions, despite being constantly irradiated by cosmic ray particle showers.

“Extreme Tesla fields”?? It’s called a magnetic field. (Usually, when you build a magnet, you do tend to get a magnetic field.) Why is a magnetic field referred to as a “Tesla field”? Is it an attempt to invoke the aura of mystical, magical, pseudoscience, superstition and suspicion that surrounds Tesla’s name?

Also… there’s no such thing as a liquid-helium-cooled nuclear reactor. Unless you’re talking about superconducting magnets in ITER… and it’s not even built yet.

What happens next at the LHC will be the next big experiment in a superfluid Helium II BEC. It’s not part of the design parameters, as physicists assume that the helium will be stable based on its use in the much smaller, much less powerful, up to 250 GeV per beam, RHIC collider in Long Island, NY. CERN’s interests lie in producing the Higgs boson at the LHC, perhaps micro black holes and quark-gluon plasma. Even in the much awaited CERN safety study released last month, there’s absolutely nothing on a possible bosenova implosion/explosion. Of course to test the safety of the enormous LHC to handle foreseen and unforeseen events you’d need another disposable one. But at least it is possible to subject Helium II to some of these high energies and hadron beams as a test. Not at the low energies of the RHIC, but at Fermilab’s Tevatron, currently the most energetic collider with 0.9 TeV per beam, though still far short of the power of the monster LHC at ordinary operating conditions of 7 TeV and ultimately 1,150 TeV collisions of lead ions at nearly twice light speed. Helium II could simply be used as a target by Tevatron beams to see what would happen, besides being exposed to high and fluctuating Tesla fields, ionized by electrical currents, subjected to some of the extreme conditions anticipated at the LHC.

Superconducting cables, superconducting magnets, and superconducting niobium RF cavities in LHe cryostats are already in widespread use around the world. They’re just auxiliary pieces of technology associated with the particle accelerator. The liquid helium is only needed to keep those superconductors at the appropriate temperature – it has got nothing to do with the particle interactions themselves, and it’s nowhere near the actual particle collisions.

Deliberately and carefully tweaking the wavefunction of a rubidium-85 BEC in order to induce a “bosenova” on purpose is a far cry from saying that a magnet inside a helium dewar can spontaneously explode.

There’s still a suit in the Hawaii courts to delay LHC startup because of safety concerns like black hole and strangelet production. Lately and since I first considered the possible dangers of superfluid helium in my article of March 7, 2008, ‘The Almost Thermonuclear LHC’, the plaintiffs, Dr Walter Wagner and Luis Sancho have announced they will seek an addendum to their suit to include bosenova risks at the LHC.

Oh yes, because they’re so credible. I’m surprised they don’t file an addendum to their lawsuit to include the risk of disruption of the van Allen belts, allowing the return of the Annunaki from the planet Nibiru.

Good references:

CERN’s official LHC backgrounder, brochure and FAQ
http://cdsweb.cern.ch/record/1092437/files/CERN-Brochure-2008-001-Eng.pdf

I will probably post some more later. Until then, your comments and questions are fully welcomed, and will likely determine the direction of the next post.

Written by Luke Weston

September 10, 2008 at 7:02 am
