
Book: "Intelligence Behind the Universe!"

Author: Ronald D. Pearson B.Sc (Hons)

Availability: From Michael Roll


 

- Chapter 7 -

The Nature of Subatomic Particles

___________________________________________

7.1 The Particle-Sequence          

          The phenomenon of wave-particle duality also introduces another puzzle. Why is it necessary for particles to be controlled by wave functions when they have inertia to guide them? Surely they could simply travel in straight lines, as Newton proposed, unless acted upon by an impressed force. For real particles, real in the sense of being permanent, nature appears to like both belt and braces. Why does Nature need two such totally separate mechanisms to perform one task?

There is one beautifully simple and elegant explanation: no particles can be truly real! If all sub-atomic particles, such as the electrons whose orbitals define the size of atoms, or the quarks and gluons making up sub-nuclear particles, are assumed to be similar to the virtual particles of space, the whole system becomes explicable. The only difference now between the virtual particles of space and the constituents of atoms is permanence of energy. Those of space have no permanent energy and so must have a life limited by Heisenberg's "uncertainty principle" (202) (208), or by an instability which somehow has a similar mathematical description. This is a matter crying out for mathematical investigation, but so far none has been attempted. If the "permanent" particles are similarly limited but have permanent energy associated with them, then they must continually arise in one place, complete with their previous kinetic energy and momentum, as they vanish from the previous place. The energy is to be imagined as a mercurial fluid, indestructible but able to move anywhere instantly along the Grid filaments. This energy and the virtual particles of space would be the raw material from which the universe was formed. To make atoms these needed organisation: hence the Grid, acting for the most part autonomously as an ultra-fast computer. The latter provides a blueprint, as the sums of wave functions, confining each re-materialisation within prescribed limits so that atoms and motion can be organised.

But with the mesh of a Grid obstructing space, the only way the motion of atoms through it could be simulated would be by computer control of a sequence of sub-atomic particles made to appear as real atoms. Each member of the sequence exists for a fleeting instant and is then reconstructed in a different place. The places of re-appearance need to be so organised that the shape of an atom is produced. It follows that the inertia of complete atoms could not be used for control. The sub-atomic particles obey Newton's classical laws of motion whilst they live, but then their energy can jump to any place. The Grid so limits the places where replacements are allowed that the motion of objects consisting of myriads of particles obeys the same classical laws of motion. It can do this by measuring the position, energy, speed and direction of a particle whilst it exists. Then it can so limit the places allowable for replacement production that, on average, the simulated real particle appears to move as if it existed as a permanent structure. This is consistent with de Broglie's postulate of the wavelength of a particle being given by h/p, but this is now seen as a deliberately introduced mathematical contrivance. Universes on this basis appear as carefully contrived illusions!
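
As a concrete illustration of the h/p relation just mentioned, the short calculation below (a sketch only; the electron speed is an arbitrary example, chosen at roughly the speed of an electron in the innermost Bohr orbit) gives a de Broglie wavelength of about 3 x 10^-10 metres, comparable with atomic dimensions:

# Illustrative check of the de Broglie relation: wavelength = h / p.
# The speed is an example only, roughly that of an electron in the
# innermost Bohr orbit of hydrogen.
h = 6.626e-34      # Planck's constant, J s
m_e = 9.109e-31    # electron mass, kg
v = 2.2e6          # example speed, m/s
p = m_e * v        # non-relativistic momentum, kg m/s
print("de Broglie wavelength: %.2e m" % (h / p))   # about 3.3e-10 m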

It is possible that the Creator was restricted by another consideration. At the design stage it would have been discovered that all particles needed to be formed as composites of positive and negative energies with one form dominant. Unfortunately they could not be made stable.

They had to be like the temporary atoms we can make in the laboratory called "positronium". These are atoms consisting of an electron and its anti-particle, the positron, in mutual orbit. They last only a fleeting fraction of a second before mutually annihilating to collapse into a pair of photons, which then shoot away in opposite directions. What has happened here is that positive and negative electric charge has mutually annihilated, not the energy, and the two must not be confused. Sub-atomic particles will be similarly limited to short life-spans, with the difference that on mutual annihilation only the energy imbalance remains. This residual needs to be absorbed by the Grid to be made immediately available for new particle construction. With this refined model the particle factories do not make matter as a single creative act and then stop working.

So the particle factories fixed to the Grid filaments have to be capable of mass production at a phenomenal rate in order to maintain a quasi-steady state in which losses are continually replaced by new production. This is not a random process. The places where new production is to occur have to be controlled by the Grid itself. This uses its wave-function-numbers to control production rates at every tiny factory location by enabling new positive and negative energy forms to be simultaneously created and added to the residuals from previously expired particles. Only in such a way can structures be formed having the required specialised properties.

What can be termed the "permanent" net positive energy of matter would, in the clumped matter state, be permanent in one sense but not in another. This would be the net permanent energy measured by weighing. It would be balanced, however, by a tenuous halo of negative energy stretching out for about a billion light years in all directions. It will be remembered that this halo provided the primary cause of gravitation. If it could be collected together at the place where the matter it just balances exists, then it would mutually annihilate with that matter. In this way the energy is not really permanent. It is, however, permanent for all practical purposes because it is not possible to gather together the negative energy of the halo. This, then, is the basis on which the First Law of Thermodynamics rests. It simply postulates that energy can be neither created nor destroyed. The truth of this statement can now be related to a wider context in which it appears simply as a special case.

But all sub-atomic particles need to have a negative energy component built into their structure. For example, the nuclei of atoms consist of charged protons and uncharged neutrons having about equal net positive energy. But each is composed of sub-nuclear components. It is generally thought that each has three so-called "quarks" very tightly bound by "gluons", which are responsible for the strong nuclear force. Since the latter produce an attractive force, they need to transfer negative momentum and therefore need to consist of negative energy. The latter has to be balanced by positive energy in addition to the measured net positive energy of the nucleon. Hence to build sub-atomic particles it is first necessary to have available the permanent net positive energy requirement; then, onto this, equal amounts of positive and negative energy, spontaneously generated from pure nothingness, need to be built.

The whole system described may seem unacceptably complex, with the number of particle factories required being of "googol" proportions. (A googol, a one followed by a hundred noughts, is a number so large as to be practically meaningless to us. But it is not infinite. An infinite number is totally undefinable, but a googol seems the same to our limited perception.) But what is the alternative? Is it more reasonable to think that complex particles, like electrons with sophisticated means for producing and interacting with tagged mediators, are simply created from nothing automatically in space? Despite the sheer daunting scale of the numbers and sizes involved, at both the very large and the very small ends, this scenario is not fundamentally impossible to imagine, and it requires no resort to an unacceptable type of analogue.

This refinement also eliminates one of the major objections physicists bring up against the possible existence of negative energy states. They say that particles made from both would be unstable and so would mutually annihilate. We agree. This is the reason apparently real particles have very short lives and need to be continually reconstructed.

If particles are being continually re-created and the places of new appearance are chosen at random, then how could they simulate permanent entities moving along? This question needs to be answered at this point. The answer is that the particles are too small to be observed directly. Their motion can only be inferred. It is possible to have many particles in motion and shine a light upon them. Then, from the way the light is scattered, motion can be inferred. Unfortunately such methods are unable to discriminate between motion in pure straight lines and that produced by a particle which jumps about at random from one alternative path to another.
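
The point can be made more concrete with a toy calculation (purely illustrative; the speed, jump interval and scatter used below are arbitrary assumptions, not figures taken from this book). A simulated particle which re-appears at a randomly scattered point centred on its classically extrapolated position still traces out, on average, the straight-line motion a permanent Newtonian particle would show, and the scatter of the individual jumps averages away:

import random

# Toy particle-sequence: at each re-materialisation the new position is the
# classically extrapolated one plus a small random scatter.  All values are
# arbitrary illustrative assumptions.
v = 1.0          # nominal speed (arbitrary units)
dt = 0.001       # interval between re-materialisations
scatter = 0.005  # allowed random displacement at each re-appearance

x = 0.0
for step in range(10000):
    x += v * dt + random.uniform(-scatter, +scatter)

# After many jumps the net motion closely follows x = v * t.
print("final position:", round(x, 2), "  classical expectation:", round(v * dt * 10000, 2))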

The precise compound measurement involved in locating the speed and position of a particle appears to contradict Heisenberg's uncertainty principle. This states that if the position is measured accurately, then the velocity (defined as speed and direction combined), or alternatively the momentum, cannot be accurately known. However, this principle is based on the assumption that only photons are available for measurement, and in using one to determine position the resulting momentum exchange would alter the speed of the particle under study. But this limitation does not apply to the Grid because it need not use photons as its measuring tool. It could sample the virtual photons emitted during the brief life of the particle. Such a passive measurement could determine all properties exactly and simultaneously. Precise information can therefore be made available for planning the next re-appearance.
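
For comparison, the standard statement of the principle referred to here is that the product of the uncertainties in position and momentum cannot fall below h/4pi. The short sketch below (the confinement length of 10^-10 metres is simply an example, chosen as roughly one atomic diameter) shows the size of the trade-off for an electron:

# Position-momentum trade-off: delta_x * delta_p >= hbar / 2.
# The confinement length is an example only.
hbar = 1.055e-34     # reduced Planck constant, J s
m_e = 9.109e-31      # electron mass, kg
delta_x = 1.0e-10    # position uncertainty, m (about one atomic diameter)

delta_p = hbar / (2.0 * delta_x)   # minimum momentum uncertainty
print("minimum delta_p : %.2e kg m/s" % delta_p)               # about 5.3e-25
print("velocity spread : %.0f km/s" % (delta_p / m_e / 1000))  # about 580 km/s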

Such a model gives a neat explanation for wave-particle duality. Instead of a particle existing as a wave-ghost as it travels, it keeps sparking in and out of existence, with random spins and other properties arising at each materialisation. The Grid keeps track, making sure that any corresponding particle has matching values on average. Alternatively, the Grid ensures that the same direction of spin is maintained at each re-materialisation, just as it needs to conserve kinetic energy. The wave functions ensure that re-materialisation occurs within the volumes of space allowed by all alternative paths. In this way particles can travel all alternative paths simultaneously by jumping about randomly from one path to any other which the wave functions allow.

Young's two-slit experiment is illustrated in FIG.1. This can now be interpreted in a new way. The regions of greatest probability arise where constructive interference occurs. These are shown as thick lines. A single particle travelling from slits to screen would keep re-materialising a little nearer the screen each time, but it could be anywhere along each thick line derived from the constructive interference of waves all at the same distance from the slits. These regions are indicated by the five bands of thickened lines. Distance from the slits would increase progressively with time until finally the particle hit the screen at any point, chosen at random, where the thick lines occur.
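
The picture just described, of individual arrivals landing at random but confined to the bands of constructive interference, can be imitated numerically. In the sketch below the wavelength, slit spacing and screen distance are arbitrary illustrative values, and each simulated hit is placed at random with a probability proportional to the familiar two-slit fringe intensity; a histogram of many such hits reproduces the banded pattern that FIG.1 shows schematically:

import math, random

# Two-slit fringe pattern: in the small-angle approximation the intensity on
# the screen is proportional to cos^2(pi * d * x / (wavelength * L)).
# The parameter values are arbitrary illustrative choices.
wavelength = 500e-9   # m
d = 1e-4              # slit separation, m (fringe spacing = wavelength * L / d = 5 mm)
L = 1.0               # slit-to-screen distance, m

def intensity(x):
    return math.cos(math.pi * d * x / (wavelength * L)) ** 2

# Accumulate individual "hits" by rejection sampling over a 25 mm wide strip.
hits = []
while len(hits) < 2000:
    x = random.uniform(-0.0125, 0.0125)
    if random.random() < intensity(x):
        hits.append(x)

print("first few hit positions (mm):", [round(h * 1000, 2) for h in hits[:5]])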

Clearly this model also matches the description of the way mirrors and diffraction gratings work. It answers Richard Feynman's comment about the working of the universe being crazy. The photons keep rematerialising at any place where they are permitted to do so by non-cancellation of abstract wave functions.

Young's two-slit experiment works just as well with electrons as with the photons of light. In the Copenhagen interpretation the electrons exist only as waves whilst in transit. But to exhibit electromagnetic properties they need to be sophisticated little machines, capable of manufacturing labelled mediators which they throw off continuously in all directions. They also need to be able to recognise and interact with similar mediators arising from any other source. It is readily proved that whilst in motion electrons act this way, because they are deflected by static electric and magnetic fields. As fuzzy waves this would be impossible, because they could not produce or absorb mediators. Hence the Copenhagen interpretation cannot explain all observations.

This problem is avoided by the Many Universes hypothesis, because electrons then follow all possible paths without needing to travel only as waves. However, in the new theory of quantum gravitation it is shown that virtual particles take up a sizeable proportion of the total volume of space. Hence the total number of universes which could exist has some maximum value, since there is some finite limit to the number of interpenetrating systems of matter which can be accommodated. The Many Universes proposal does not therefore stand much chance of being correct, because an infinite number is said to be needed by both Everett and Wheeler.

7.2 A Model of the Atom

A very simple conceptualisation of the hydrogen atom is also furnished. To remind the reader, the accepted model is that given by application of Schrodinger's wave equation. It is thought of as being a proton, forming the simplest of all nuclei, surrounded by an electron somehow "smeared out" in the surrounding space to form a fuzzy ball. Yet the electrons are confined to precise energy levels by wave functions. The waves fit round the atom to form "energy shells" and each shell is defined by a precise integral number of wavelengths.

But the virtual particle idea fits perfectly! Heisenberg's uncertainty principle would limit the duration of each manifestation of the electron to about four-millionths of the time it would take to orbit the proton at the so-called "Bohr radius". The first model of the atom, due to Bohr, envisaged it like a miniature solar system, with electrons orbiting the proton nucleus like planets going round the sun. This model was later abandoned and replaced by the one produced by Schrodinger, in which the electron's position was controlled by wave functions. With the new concept the electron would dodge about at random, within a ball-shaped orbital, giving rise to the impression of a distributed cloud.

If the life at each manifestation is indeed predicted by Heisenberg's uncertainty principle, then there would be about 250,000 appearances in the time taken for a permanent electron to make a single orbit. At each appearance the now temporary electron would execute part of an orbit, being attracted by the nucleus yet having a tangential component of velocity due to the conserved kinetic energy. The trajectory executed in the time available would be so short, however, that it would hardly seem to move. Hence atoms in a crystal would look like those depicted in FIG.2, using a short timeframe which allows only a few re-materialisations to be recorded.
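
The two figures quoted, four-millionths of an orbit and about 250,000 appearances per orbit, are consistent with one another and can be checked on the assumption (not stated explicitly here) that the lifetime of each manifestation is taken as the energy-time uncertainty limit for the electron's rest energy, hbar/(2 m c^2), and that the orbit is the classical Bohr orbit:

import math

# Rough check of the figures quoted above.  The lifetime of each manifestation
# is assumed to be the energy-time uncertainty limit for the electron's rest
# energy, tau = hbar / (2 * m * c^2); the orbit is the classical Bohr orbit.
hbar = 1.055e-34      # J s
m_e = 9.109e-31       # kg
c = 2.998e8           # m/s
a0 = 5.292e-11        # Bohr radius, m
alpha = 1 / 137.036   # fine-structure constant

tau = hbar / (2.0 * m_e * c ** 2)            # about 6.4e-22 s per manifestation
T_orbit = 2.0 * math.pi * a0 / (alpha * c)   # about 1.5e-16 s per Bohr orbit

print("lifetime / orbit period : %.1e" % (tau / T_orbit))   # about 4e-6
print("appearances per orbit   : %.0f" % (T_orbit / tau))   # about 240,000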

If hydrogen atoms were being viewed, then with a very short exposure time a single electron would be seen somewhere within the volume of the orbital. As the exposure time is increased, with images stored for this period, more electrons would appear to exist, dotted about at random within the orbital. The exposure time could be steadily increased until it equalled that needed for a permanent electron to orbit the nucleus. Then about 250,000 images would be observed dotted around within the orbital. The effect of a diffuse electron cloud would be produced. The density distribution of this cloud would exactly match the Schrodinger specification.
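
This build-up of the cloud can be imitated numerically. For the hydrogen ground state the Schrodinger radial probability density is proportional to r^2 exp(-2r/a0), which happens to be a gamma distribution, so successive "exposures" can be drawn from it directly (a sketch only; the number of samples is arbitrary):

import random

# Successive electron "appearances" for the hydrogen ground state.  The
# Schrodinger radial density P(r) ~ r^2 * exp(-2r/a0) is a gamma distribution
# with shape 3 and scale a0/2, so radial distances can be drawn directly.
a0 = 5.292e-11   # Bohr radius, m
samples = [random.gammavariate(3.0, a0 / 2.0) for _ in range(250000)]

mean_r = sum(samples) / len(samples)
print("mean radius        : %.2e m (theory: 1.5 * a0 = %.2e m)" % (mean_r, 1.5 * a0))
print("fraction inside a0 : %.2f" % (sum(r < a0 for r in samples) / len(samples)))  # about 0.32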

Only the total energy of the electron is permanent. It will obey classical Newtonian laws whilst it lives as an electron, exhibiting the property of momentum and so explaining why it interacts as a solid little object when colliding with other particles such as photons. On expiry and absorption of its net positive energy by the Grid a different kind of physics needs to exist. The energy can no longer exhibit inertia since it has to be able to jump to any specified new location along the Grid filaments at speeds many thousands of times the speed of light.

All other sub-atomic particles including photons will be governed the same way and will obey the conservation laws of energy and momentum whilst they live. Then interactions between such particles will translate to yield the extended Newtonian laws of physics when viewed at the macroscopic scale. Countless billions of particles are now involved so the transient nature of each will not be apparent.

This interpretation is consistent with observation and satisfies one important aim set out in Chapter 3. This objective was to reconcile the Schrodinger model with the concept of an electron being real, so that it existed permanently.

There is a problem that during each brief life the electron, possessing electric charge, will experience a strong acceleration toward the charged nucleus. An accelerating charge radiates photons, and so the atom should radiate energy at a prodigious rate. But this does not happen. Only relatively minute amounts of energy are radiated, and then only when the atom is vibrating. Consequently a problem exists. It is common also to the Bohr and Schrodinger descriptions. Originally this was one of the reasons the Bohr model had to be discarded. A little consideration shows, however, that the difficulty did not vanish with its replacement by the Schrodinger model. The latter specifies where the electrons are most likely to be found. So in this model they do exist as particles. Whilst they live they will be accelerated, hence radiation ought to occur.

However, now that a Grid exists, a possible solution to the difficulty can be advanced. The Grid computes the natural acceleration for the electron and provides an inhibiting command, so that only differences in acceleration from this state cause photon emission. It seems most probable that the photons will be emitted directly from the Grid filaments at the same time as the electron is re-created, in order to satisfy the energy and momentum balance. This would seem easier to organise than providing the electron itself with a control system.

It may also be objected that if atoms are organised by abstract waves, to make the electron dodge about all over the interior of its orbital, then real electric forces are not needed for holding the atom together. So why does nature take such trouble to confer real electric forces on her particles? The most probable answer is that residual electrical and magnetic forces are needed to cause atoms to stick together to make liquids and solids. The abstract organising power could not on its own provide real binding forces between atoms or allow the interchange of physical forms of energy. Hence real electric forces have to be present as well as the abstract copy.

It seems clear that complete sub-atomic particles must in reality be particle sequences joined end to end in time, though not joined in position. But each is a composite structure of both kinds of energy with the positive form dominant. The question now needing to be asked is: "Are the sub-units of nucleons and electrons permanent for the lifetime of the composite?"

It is not possible to answer this question absolutely. They could be arranged as similar particle sequences controlled by the Grid using wave-numbers. In this case the Grid would need to be even more fine-grained than ever: it would have to be fine-grained even on a sub-electron scale. In my opinion this is unlikely, but I could be wrong. The alternative is that the quarks which make up the nucleons, or the as yet unnamed bits of the electron, are permanent until the composites collapse in on themselves and are destroyed. Then the motion of all sub-units will be of the orbital kind, like planets going round the Sun, for the lifetime of the sub-atomic particle. The gluons which bind the permanent parts will be virtual. This means they exist on borrowed energy, vanishing to nothingness after a short distance of travel. Since many of them exist simultaneously, however, a permanent negative energy is represented by this mediator cloud. It will neutralise a large part of the positive inertia of the complete assembly. In this case the pitch of the Grid filaments would need to be no finer than about 5% of the radius of an atom. This option seems the more reasonable, and there is in fact some supporting evidence for this model.

The nucleons and electrons have fixed amounts of "spin". This means they rotate on their axes at a definite fixed speed and have a fixed "angular momentum". The latter is defined as momentum multiplied by a radius of action. It can be written as the product "mass x radius x tangential velocity", i.e. "mvr". Orbiting models can be devised which describe such behaviour very accurately, as shown in Chapter T.S.3. It is difficult to see how a similar explanation could arise from the alternative picture.
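
As a numerical illustration of the "mvr" product just defined (using the familiar Bohr-orbit figures purely as convenient values, not as the detailed orbiting model of Chapter T.S.3, which is not reproduced here), an electron circling at the Bohr radius with the Bohr-orbit speed carries an angular momentum equal to Planck's constant divided by 2 pi:

# Angular momentum L = m * v * r for an electron on the innermost Bohr orbit.
m_e = 9.109e-31    # electron mass, kg
v1 = 2.188e6       # Bohr-orbit speed, m/s
a0 = 5.292e-11     # Bohr radius, m
hbar = 1.055e-34   # reduced Planck constant, J s

L = m_e * v1 * a0
print("m*v*r = %.3e J s   (hbar = %.3e J s)" % (L, hbar))   # the two agree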

7.3 Non-Locality

Non-locality is an associated mysterious phenomenon. It arose as the famous EPR paradox(107). Einstein was not happy with quantum theory because it did not fit in with relativity. So he suggested a "gedanken" (thought) experiment intended to expose its absurdity. In a simpler alternative devised by David Bohm in 1952, a pair of particles, such as electrons, were to be created spinning in opposite directions. Spin is like that of a top flying through the air, with the spin axis not necessarily lined up with the motion. Looked at in the flight direction, "right spin" is clockwise and "left spin" anticlockwise. They would start out with equal and opposite spins so that their net angular momentum would be zero. On catching one of them and measuring the direction of spin, even at a great distance, the direction of spin of the other would be instantly known. This was so because angular momentum had to be conserved. So finding the direction of spin of one would determine that of the other. The particle-pair could only arise with their spins balanced. But this was not quantum theory.

In quantum theory at that time, the particles travelled as unresolved wave functions containing both spins at once in limbo. Only one spin would develop by chance on collapse of the wave function into a real particle. On collapse of one wave to a particle the other would have to do the same by an impossible process of "action at a distance". If it did not, then angular momentum would not be conserved. Either way quantum theory would be confounded.

A sophisticated theory called "Bell's theorem" was developed from a quantum base and this suggested that there would indeed be a correlation, so that action at a distance would occur. It seemed to imply that pairs of sub-atomic particles, arising together like identical twins, can affect each other instantly, at any distance of separation.
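
For reference, the quantitative content of Bell's theorem is usually expressed through the CHSH combination of correlations taken at two measurement settings on each side. Any local hidden-variable account keeps the magnitude of this combination at or below 2, whereas quantum mechanics predicts values up to 2 times the square root of 2 for the spin-balanced pairs described above. The sketch below simply evaluates the quantum prediction at the standard angles; it is a statement of textbook quantum mechanics, not of the Grid model:

import math

# CHSH combination for a spin-zero particle pair.  The quantum correlation
# between spin measurements along directions a and b is E(a, b) = -cos(a - b).
def E(a, b):
    return -math.cos(a - b)

# Standard measurement angles (radians) giving the maximum quantum violation.
a, a2 = 0.0, math.pi / 2
b, b2 = math.pi / 4, 3 * math.pi / 4

S = E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2)
print("quantum |S| = %.3f   (local hidden-variable bound: 2)" % abs(S))   # 2.828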

A most amazingly difficult experiment was carried out by Alain Aspect(201) to test the theory. Instead of employing the spin of electrons, the property of "polarisation" of photons was adopted for the investigation. It will be remembered that light waves have an oscillating transverse component. In a new interpretation described in Chapter T.S.1 this sideways motion is induced in the virtual electrons and positrons of space as photons are absorbed, and this in turn causes the photon paths to be snake-like. The direction of polarisation is defined by this motion.

In the Aspect experiment polarised photon-pairs were produced by the decay of excited calcium atoms. An excited atom has absorbed a photon of incident radiation by a process called "pumping". The photon has been absorbed by one of the electrons, which then has greater energy. Such an electron is displaced to a greater distance from the nucleus so that it can exist at this unnaturally high energy state. It can only remain there for a short time because the state is unstable. At some time it must fall back, emitting the energy previously absorbed. A pair of photons are then emitted with exactly balanced properties.

In the experiment such twins were emitted in opposite directions and passed through screens of polarised glass placed several metres away. The predictions were supported. So action at a distance really did seem to take place.

The conventional interpretation is that space is so curved in higher dimensions that it can fold back on itself. Then two points which are widely separated from our perspective can be adjacent to one another in the other dimension. Hence, by a short-cut, one particle can appear to act upon the other at a distance.

This interpretation cannot be allowed in the new theory, because only three spatial dimensions are admitted and geometry is strictly Euclidean, so straight lines are always straight. But now the existence of a Grid has been deduced. The fundamental propagation speed for information along Grid filaments needs to be so high as to appear infinite to us. The same wave functions control each particle of a pair having a simultaneous origin. They would be linked by the Grid to control overall conservation of both linear and angular momentum. The particles are really sequences in which re-materialisations continually spark in and out of existence at places organised by wave function control. But the laws of conservation of angular momentum and other properties are deliberately incorporated to help control the motions of matter. Hence the Grid keeps count of the properties arising by chance at each manifestation and ensures that on average the programmed laws are obeyed.

A simpler and more probable explanation can, however, be advanced.

In this the Grid ensures that at each new manifestation the replacement particle is produced with the same direction of spin or polarisation as its predecessor. The latter seems the more likely option since, for the theory to work, it is essential that certain other properties like kinetic energy and momentum be conserved in any particle-sequence.

At least these last two explanations are imaginable without recourse to an impossible-to-visualise curved-space analogy. In any case, now that a solution of the problem of quantum gravitation is available, and since this only requires three dimensions plus time and energy, there is no longer any justification for seeking solutions in higher dimensions.

7.4 Psychic Energy

Now the point has been reached at which a meaning can be found for the intangible idea of psychic energy. Most people are content to regard "force fields" or "energy fields" as something mysterious which certain objects or materials give out. These can then be picked up by people having the required sensitivity, such as water-diviners. Such energies are also claimed to exist around the sites of ancient shrines. They are invoked to explain healing and telepathic communication. Some people are said to have negative energies. This seems to be merely a psychological description, however, and it will not be considered here as one of the manifestations of the psychic kind of energy.

Pierre Teilhard(122) proposed that energy exists in two forms in order to explain spirituality and psychic phenomena. In his description there is the energy associated with the matter of the universe and a separate psychic form of energy. One he considers "radial", the other "tangential", so they are perpendicular to one another. This is not a concept to which I am able to relate, as the reason for such a geometry is not explained. Also, psychic energy remains undefined but is postulated to account for paranormal effects. This idea seems partially correct and is to a degree compatible with those adopted in this book. It cannot explain the origin of everything from nothing, however, nor is the psychic form of energy defined in a manner which gives any insight into its nature. Perhaps he was feeling his way, and this was a long time ago. He was not a physicist and so probably did not know about wave-particle duality. Without this key, further progress according to the new interpretation would be very difficult, if not impossible, to achieve.

In the new concept both positive and negative forms of physical energy exist as compact particles. The particles have short lives, but a replacement is instantly reconstructed by the Grid from the residual energy of the old. But the place at which the new one appears is governed by wave-functions manipulated like numbers. This is the clue. The psychic energy form is this control system! It is an abstract form of energy, more akin to computing power. It is not really a true energy form at all. It is the controlling influence which forms the plan for organising both matter and its motion.

The effects previously considered, such as telepathy, then appear as minor residuals of a major driving force of the universe. Without the psychic element physical energy could not be structured to provide systems of matter capable of building universes or capable of motion. Psychic energy is:- 

The Intelligence Behind the Universe!

It is the number-crunching power of the Grid acting as an ultra-fast computer. Consequently psychic energy is not a true energy form at all because it has an abstract number-like quality. Because of this it cannot be measured in the manner of physical energies.

A point has now been reached at which an attempt can be made to explain the whole spectrum of psychic effects in terms of extended physics. Before attempting to do so, however, the solution built up will be summarised in the next chapter. A small amount of new material will, however, be incorporated.

 
