Science Feed

The Top 30 Problems with the Big Bang


[reprinted from Meta Research Bulletin 11, 6-13 (2002)]
Abstract. Earlier, we presented a simple list of the top ten problems with the Big Bang. [[1]] Since that publication, we have had many requests for citations and additional details, which we provide here. We also respond to a few rebuttal arguments to the earlier list. Then we supplement the list based on the last four years of developments – with another 20 problems for the theory.

(1) Static universe models fit observational data better than expanding universe models.

 
Static universe models match most observations with no adjustable parameters. The Big Bang can match each of the critical observations, but only with adjustable parameters, one of which (the cosmic deceleration parameter) requires mutually exclusive values to match different tests. [[2],[3]] Without ad hoc theorizing, this point alone falsifies the Big Bang. Even if the discrepancy could be explained, Occam’s razor favors the model with fewer adjustable parameters – the static universe model.

(2) The microwave “background” makes more sense as the limiting temperature of space heated by starlight than as the remnant of a fireball.

  
The expression “the temperature of space” is the title of chapter 13 of Sir Arthur Eddington’s famous 1926 work. [[4]] In it, Eddington calculated the minimum temperature to which any body in space would cool, given that it is immersed in the radiation of distant starlight. With no adjustable parameters, he obtained 3°K (later refined to 2.8°K [[5]]), essentially the same as the observed, so-called “background” temperature. A similar calculation, although with less certain accuracy, applies to the limiting temperature of intergalactic space because of the radiation of galaxy light. [[6]] So the intergalactic matter is like a “fog”, and would therefore provide a simpler explanation for the microwave radiation, including its blackbody-shaped spectrum.
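
To see the style of estimate involved, here is a minimal numerical sketch (not Eddington’s original arithmetic): it converts an assumed starlight energy density, of the order Eddington adopted, into the equivalent blackbody temperature via u = aT^4.

# Minimal sketch, not Eddington's original arithmetic: convert an assumed
# starlight energy density into an equivalent blackbody temperature (u = a*T^4).
a_rad = 7.566e-16        # radiation constant, J m^-3 K^-4
u_starlight = 7.7e-14    # assumed energy density of starlight, J/m^3
T = (u_starlight / a_rad) ** 0.25
print(f"{T:.2f} K")      # about 3.2 K, comparable to the 3°K quoted above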

Such a fog also explains the otherwise troublesome ratio of infrared to radio intensities of radio galaxies. [[7]] The amount of radiation emitted by distant galaxies falls with increasing wavelengths, as expected if the longer wavelengths are scattered by the intergalactic medium. For example, the brightness ratio of radio galaxies at infrared and radio wavelengths changes with distance in a way which implies absorption. Basically, this means that the longer wavelengths are more easily absorbed by material between the galaxies. But then the microwave radiation (between the two wavelengths) should be absorbed by that medium too, and has no chance to reach us from such great distances, or to remain perfectly uniform while doing so. It must instead result from the radiation of microwaves from the intergalactic medium. This argument alone implies that the microwaves could not be coming directly to us from a distance beyond all the galaxies, and therefore that the Big Bang theory cannot be correct.

None of the predictions of the background temperature based on the Big Bang, which ranged between 5°K and 50°K, was close enough to qualify as a success, the worst being Gamow’s upward-revised estimate of 50°K made in 1961, just two years before the actual discovery. [[8]] Clearly, without a realistic quantitative prediction, the Big Bang’s hypothetical “fireball” becomes indistinguishable from the natural minimum temperature of all cold matter in space. And the Big Bang offers no explanation for the kind of intensity variations with wavelength seen in radio galaxies.

(3) Element abundance predictions using the Big Bang require too many adjustable parameters to make them work.


The universal abundances of most elements were predicted correctly by Hoyle in the context of the original Steady State cosmological model. This worked for all elements heavier than lithium. The Big Bang co-opted those results and concentrated on predicting the abundances of the light elements. Each such prediction requires at least one adjustable parameter unique to that element prediction. Often, it’s a question of arguing why the element was created, destroyed, or both to some degree following the Big Bang. When you take away these degrees of freedom, no genuine prediction remains. The best the Big Bang can claim is consistency with observations using the various ad hoc models to explain the data for each light element. Examples: [[9],[10]] for helium-3; [[11]] for lithium-7; [[12]] for deuterium; [[13]] for beryllium; and [[14],[15]] for overviews. For a full discussion of an alternative origin of the light elements, see [[16]].

(4) The universe has too much large scale structure (interspersed “walls” and voids) to form in a time as short as 10-20 billion years.


The average speed of galaxies through space is a well-measured quantity. At those speeds, galaxies would require roughly the age of the universe to assemble into the largest structures (superclusters and walls) we see in space [[17]], and to clear all the voids between galaxy walls. But this assumes that the initial directions of motion are special, e.g., directed away from the centers of voids. To get around this problem, one must propose that galaxy speeds were initially much higher and have slowed due to some sort of “viscosity” of space. To form these structures by building up the needed motions through gravitational acceleration alone would take in excess of 100 billion years. [[18]]

(5) The average luminosity of quasars must decrease with time in just the right way so that their average apparent brightness is the same at all redshifts, which is exceedingly unlikely.


According to the Big Bang theory, a quasar at a redshift of 1 is roughly ten times as far away as one at a redshift of 0.1. (The redshift-distance relation is not quite linear, but this is a fair approximation.) If the two quasars were intrinsically similar, the high redshift one would be about 100 times fainter because of the inverse square law. But it is, on average, of comparable apparent brightness. This must be explained as quasars “evolving” their intrinsic properties so that they get smaller and fainter as the universe evolves. That way, the quasar at redshift 1 can be intrinsically 100 times brighter than the one at 0.1, explaining why they appear (on average) to be comparably bright. It isn’t as if the Big Bang has a reason why quasars should evolve in just this magical way. But that is required to explain the observations using the Big Bang interpretation of the redshift of quasars as a measure of cosmological distance. See [[19],[20]].
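
The arithmetic behind that claim is easy to check. The sketch below assumes, as the paragraph above does, that distance is roughly proportional to redshift; the redshifts are taken from the text.

# Illustrative arithmetic only, assuming distance roughly proportional
# to redshift (a fair approximation, per the text).
z_near, z_far = 0.1, 1.0
flux_ratio = (z_near / z_far) ** 2   # inverse-square law: flux ~ 1/d^2
print(f"{flux_ratio:.2f}")           # 0.01 -> the z = 1 quasar appears ~100x fainter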

By contrast, the relation between apparent magnitude and distance for quasars is a simple, inverse-square law in alternative cosmologies. In [20], Arp shows great quantities of evidence that large quasar redshifts are a combination of a cosmological factor and an intrinsic factor, with the latter dominant in most cases. Most large quasar redshifts (e.g., z > 1) therefore have little correlation with distance. A grouping of 11 quasars close to NGC 1068, having nominal ejection patterns correlated with galaxy rotation, provides further strong evidence that quasar redshifts are intrinsic. [[21]]

(6) The ages of globular clusters exceed the age of the universe.


Even though the data have been stretched in the direction toward resolving this since the “top ten” list first appeared, the error bars on the Hubble age of the universe (12±2 Gyr) still do not quite overlap the error bars on the oldest globular clusters (16±2 Gyr). Astronomers have studied this for the past decade, but resist the “observational error” explanation because that would almost certainly push the Hubble age older (as Sandage has been arguing for years), which creates several new problems for the Big Bang. In other words, the cure is worse than the illness for the theory. In fact, a new, relatively bias-free observational technique has gone the opposite way, lowering the Hubble age estimate to 10 Gyr, making the discrepancy worse again. [[22],[23]]
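
A quick way to see the stated discrepancy is to compare the two error intervals directly; this sketch simply restates the numbers quoted above.

# Quick check of the age intervals quoted above (values from the text).
hubble_lo, hubble_hi = 12 - 2, 12 + 2      # Hubble age, Gyr
cluster_lo, cluster_hi = 16 - 2, 16 + 2    # oldest globular clusters, Gyr
overlap = min(hubble_hi, cluster_hi) - max(hubble_lo, cluster_lo)
print(overlap)   # 0 -> the ranges only touch at 14 Gyr; they do not overlap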

(7) The local streaming motions of galaxies are too high for a finite universe that is supposed to be everywhere uniform.


In the early 1990s, we learned that the average redshift for galaxies of a given brightness differs on opposite sides of the sky. The Big Bang interprets this as the existence of a puzzling group flow of galaxies relative to the microwave radiation on scales of at least 130 Mpc. Earlier, the existence of this flow led to the hypothesis of a "Great Attractor" pulling all these galaxies in its direction. But in newer studies, no backside infall was found on the other side of the hypothetical feature. Instead, there is streaming on both sides of us out to 60-70 Mpc in a consistent direction relative to the microwave "background". The only Big Bang alternative to the apparent result of large-scale streaming of galaxies is that the microwave radiation is in motion relative to us. Either way, this result is trouble for the Big Bang. [[24],[25],[26],[27],[28]]

(8) Invisible dark matter of an unknown but non-baryonic nature must be the dominant ingredient of the entire universe.


The Big Bang requires sprinkling galaxies, clusters, superclusters, and the universe with ever-increasing amounts of this invisible, not-yet-detected “dark matter” to keep the theory viable. Overall, over 90% of the universe must be made of something we have never detected. By contrast, Milgrom’s model (the alternative to “dark matter”) provides a one-parameter explanation that works at all scales and requires no “dark matter” to exist at any scale. (I exclude the additional 50%-100% of invisible ordinary matter inferred to exist by, e.g., MACHO studies.) Some physicists don’t like modifying the law of gravity in this way, but a finite range for natural forces is a logical necessity (not just theory) spoken of since the 17th century. [[29],[30]]

Milgrom’s model requires nothing more than such a finite range. Milgrom’s is an operational model rather than one based on fundamentals. But it is consistent with more complete models invoking a finite range for gravity. So Milgrom’s model provides a basis to eliminate the need for “dark matter” in the universe at any scale. This represents one more Big Bang “fudge factor” no longer needed.
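
For readers unfamiliar with Milgrom’s proposal, the sketch below illustrates its one-parameter character: in the deep low-acceleration regime it predicts flat rotation curves satisfying v^4 = G·M·a0, where a0 is the single new constant. The galaxy mass used here is an assumed, round-number example, not a value from the text.

# Sketch of the one-parameter idea behind Milgrom's model (MOND): in the
# deep low-acceleration regime, the predicted flat rotation speed satisfies
# v^4 = G * M * a0, with a0 the single adjustable parameter.
G  = 6.674e-11    # gravitational constant, m^3 kg^-1 s^-2
a0 = 1.2e-10      # Milgrom's acceleration constant, m/s^2
M  = 1e41         # assumed galaxy mass, kg (roughly 5e10 solar masses)
v_flat = (G * M * a0) ** 0.25
print(f"{v_flat / 1000:.0f} km/s")   # ~170 km/s, a typical flat rotation speed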

(9) The most distant galaxies in the Hubble Deep Field show insufficient evidence of evolution, with some of them having higher redshifts (z = 6-7) than the highest-redshift quasars.


The Big Bang requires that stars, quasars and galaxies in the early universe be “primitive”, meaning mostly metal-free, because it requires many generations of supernovae to build up metal content in stars. But the latest evidence suggests lots of metal in the “earliest” quasars and galaxies. [[31],[32],[33]] Moreover, we now have evidence for numerous ordinary galaxies in what the Big Bang expected to be the “dark age” of evolution of the universe, when the light of the few primitive galaxies in existence would be blocked from view by hydrogen clouds. [[34]]

(10) If the open universe we see today is extrapolated back near the beginning, the ratio of the actual density of matter in the universe to the critical density must differ from unity by just one part in 10^59. Any larger deviation would result in a universe already collapsed on itself or already dissipated.


Inflation failed to achieve its goal when many observations went against it. To maintain consistency and salvage inflation, the Big Bang has now introduced two new adjustable parameters: (1) the cosmological constant, which has a major fine-tuning problem of its own because theory suggests it ought to be of order 10^120, and observations suggest a value less than 1; and (2) “quintessence” or “dark energy”. [[35],[36]] This latter theoretical substance solves the fine-tuning problem by introducing invisible, undetectable energy sprinkled at will as needed throughout the universe to keep consistency between theory and observations. It can therefore be accurately described as “the ultimate fudge factor”.
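
As a rough illustration of where a number like 10^59 comes from: in standard Friedmann models the deviation of the density ratio from unity grows with time (roughly proportionally to t during the radiation era), so any present-day deviation, extrapolated back to very early epochs, must have started out fantastically small. The epochs below are assumed round numbers for illustration; the exact exponent depends on the era chosen.

# Rough sketch of the flatness fine-tuning arithmetic, assuming the
# radiation-era scaling |Omega - 1| proportional to t.
t_now   = 4e17    # assumed present age of the universe, seconds
t_early = 1e-43   # assumed very early epoch (near the Planck time), seconds
deviation_now = 1.0                                 # order-unity deviation today
deviation_early = deviation_now * (t_early / t_now)
print(f"{deviation_early:.1e}")   # 2.5e-61: the early deviation must be tiny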


Anyone doubting the Big Bang in its present form (which includes most astronomy-interested people outside the field of astronomy, according to one recent survey) would have good cause for that opinion and could easily defend such a position. This is a fundamentally different matter than proving the Big Bang did not happen, which would be proving a negative – something that is normally impossible. (E.g., we cannot prove that Santa Claus does not exist.) The Big Bang, much like the Santa Claus hypothesis, no longer makes testable predictions wherein proponents agree that a failure would falsify the hypothesis. Instead, the theory is continually amended to account for all new, unexpected discoveries. Indeed, many young scientists now think of this as a normal process in science! They forget or were never taught that a model has value only when it can predict new things that differentiate the model from chance and from other models before the new things are discovered. Explanations of new things are supposed to flow from the basic theory itself with at most an adjustable parameter or two, and not from add-on bits of new theory.

Of course, the literature also contains the occasional review paper in support of the Big Bang. [[37]] But these generally don’t count any of the prediction failures or surprises as theory failures as long as some ad hoc theory might explain them. And the “prediction successes” in almost every case do not distinguish the Big Bang from any of the four leading competitor models: Quasi-Steady-State [16,[38]], Plasma Cosmology [18], Meta Model [3], and Variable-Mass Cosmology [20].

For the most part, these four alternative cosmologies are ignored by astronomers. However, one web site by Ned Wright does try to advance counterarguments in defense of the Big Bang. [[39]] But his counterarguments are mostly old objections long since defeated. For example:


(1) In “Eddington did not predict the CMB”:


a. Wright argues that Eddington’s argument for the “temperature of space” applies at most to our Galaxy. But Eddington’s reasoning applies also to the temperature of intergalactic space, for which a minimum is set by the radiation of galaxy and quasar light. The original calculations half-a-century ago showed this limit probably fell in the range 1-6°K. [6] And that was before quasars were discovered and before we knew the modern space density of galaxies.


b. Wright also argues that dust grains cannot be the source of the blackbody microwave radiation because there are not enough of them to be opaque, as needed to produce a blackbody spectrum. However, opaqueness is required only in a finite universe. An infinite universe can achieve thermodynamic equilibrium (the actual requirement for a blackbody spectrum) even if transparent out to very large distances because the thermal mixing can occur on a much smaller scale than quantum particles – e.g., in the light-carrying medium itself.


c. Wright argues that dust grains do not radiate efficiently at millimeter wavelengths. However, efficient or not, if the equilibrium temperature they reach is 2.8°K, they must radiate away the energy they absorb from distant galaxy and quasar light at millimeter wavelengths. Temperature and wavelength are correlated for any bodies in thermal equilibrium.
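
The correlation invoked here is Wien’s displacement law; a minimal sketch, using the equilibrium temperature from the text:

# Wien's displacement law: peak wavelength of blackbody emission at 2.8 K.
b = 2.898e-3                 # Wien displacement constant, m*K
T = 2.8                      # equilibrium temperature from the text, K
wavelength_peak = b / T      # meters
print(f"{wavelength_peak * 1000:.2f} mm")   # about 1 mm: millimeter wavelengths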


(2) About Lerner’s argument against the Big Bang:


a. Lerner calculated that the Big Bang universe has not had enough time to form superclusters. Wright calculates that all the voids could be vacated and superclusters formed in less than 11-14 billion years (barely). But that assumes that almost all matter has initial speeds headed directly out of voids and toward matter concentrations. Lerner, on the other hand, assumed that the speeds had to be built up by gravitational attraction, which takes many times longer. Lerner’s point is more reasonable because doing it Wright’s way requires fine-tuning of initial conditions.


b. Wright argues that “there is certainly lots of evidence for dark matter.” The reality is that there is no credible observational detection of dark matter, so all the “evidence” is a matter of interpretation, depending on theoretical assumptions. For example, Milgrom’s Model explains all the same evidence without any need for dark matter.


(3) Regarding arguments against “tired light cosmology”:


a. Wright argues: “There is no known interaction that can degrade a photon's energy without also changing its momentum, which leads to a blurring of distant objects which is not observed.” While it is technically true that no such interaction has yet been discovered, reasonable non-Big-Bang cosmologies require the existence of entities many orders of magnitude smaller than photons. For example, the entity responsible for gravitational interactions has not yet been discovered. So the “fuzzy image” argument does not apply to realistic physical models in which all substance is infinitely divisible. By contrast, physical models lacking infinite divisibility have great difficulties explaining Zeno’s paradoxes – especially the extended paradox for matter. [3]

b. Wright argues that the stretching of supernovae light curves is not predicted by “tired light”. However, one cannot measure the stretching effect directly because the time under the lightcurve depends on the intrinsic brightness of the supernovae, which can vary considerably. So one must use indirect indicators, such as rise time only. And in that case, the data does not unambiguously favor either tired light or Big Bang models.

c. Wright argued that tired light does not produce a blackbody spectrum. But this is untrue if the entities producing the energy loss are many orders of magnitude smaller and more numerous than quantum particles.

d. Wright argues that tired light models fail the Tolman surface brightness test. This ignores that realistic tired light models must lose energy in the transverse direction, not just the longitudinal one, because light is a transverse wave. When this effect is considered, the predicted loss of light intensity goes as (1+z)^-2, which is in good agreement with most observations without any adjustable parameters. [2,[40]] The Big Bang, by contrast, predicts a (1+z)^-4 dependence, and must therefore invoke special ad hoc evolution (different from that applicable to quasars) to close the gap between theory and observations.
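
To make the two competing dimming laws concrete, this sketch simply tabulates the (1+z)^-2 and (1+z)^-4 surface-brightness factors named above at a few redshifts; it takes no position on which law the data favor.

# Tabulate the two surface-brightness dimming laws discussed above.
for z in (0.5, 1.0, 2.0):
    sb_tired_light = (1 + z) ** -2   # the article's tired-light prediction
    sb_big_bang    = (1 + z) ** -4   # standard expanding-space (Tolman) dimming
    print(f"z={z}: (1+z)^-2 = {sb_tired_light:.3f}, (1+z)^-4 = {sb_big_bang:.4f}")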

By no means is this “top ten” list of Big Bang problems exhaustive – far from it. In fact, it is easy to argue that several of these additional 20 points should be among the “top ten”:


· "Pencil-beam surveys" show large-scale structure out to distances of more than 1 Gpc in both of two opposite directions from us. This appears as a succession of wall-like galaxy features at fairly regular intervals, the first of which, at about 130 Mpc distance, is called "The Great Wall". To date, 13 such evenly-spaced "walls" of galaxies have been found! [[41]] The Big Bang theory requires fairly uniform mixing on scales of distance larger than about 20 Mpc, so there apparently is far more large-scale structure in the universe than the Big Bang can explain.


· Many particles are seen with energies over 60×10^18 eV. But that is the theoretical energy limit for anything traveling more than 20-50 Mpc because of interaction with microwave background photons. [[42]] However, this objection assumes the microwave radiation is as the Big Bang expects, instead of a relatively sparse, local phenomenon.
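
The threshold arithmetic behind that limit can be sketched roughly as follows (a head-on collision with a typical microwave photon is assumed; averaging over photon energies and angles lowers the effective cutoff toward the figure quoted above).

# Rough sketch of the energy limit quoted above: the threshold proton energy
# for pion production on a typical microwave background photon (head-on case).
m_p   = 938.272e6    # proton rest energy, eV
m_pi  = 134.977e6    # neutral pion rest energy, eV
E_cmb = 6e-4         # assumed typical CMB photon energy, eV
E_threshold = m_pi * (m_p + m_pi / 2) / (2 * E_cmb)
print(f"{E_threshold:.1e} eV")   # ~1.1e20 eV, the order of the limit in the text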


· The Big Bang predicts that equal amounts of matter and antimatter were created in the initial explosion. Matter dominates the present universe, apparently because of some form of asymmetry, such as CP violation, that caused most antimatter to annihilate with matter but left a surplus of matter. Experiments are searching for evidence of this asymmetry, so far without success. Other galaxies can’t be antimatter because that would create a matter-antimatter boundary with the intergalactic medium that would generate gamma rays, which are not seen. [[43],[44]]


· Even a small amount of diffuse neutral hydrogen would produce a smooth absorbing trough shortward of a QSO’s Lyman-alpha emission line. This is called the Gunn-Peterson effect, and is rarely seen, implying that most hydrogen in the universe has been re-ionized. A hydrogen Gunn-Peterson trough is now predicted to be present at a redshift z ≈ 6.1. [[45]] Observations of high-redshift quasars near z = 6 briefly appeared to confirm this prediction. However, a galaxy lensed by a foreground cluster has now been observed at z = 6.56, prior to the supposed reionization epoch and at a time when the Big Bang expects no galaxies to be visible yet. Moreover, if only a few galaxies had turned on by this early point, their emission would have been absorbed by the surrounding hydrogen gas, making these early galaxies invisible. [34] So the lensed galaxy observation falsifies this prediction and the theory it was based on. A related example: Quasar PG 0052+251 is at the core of a normal spiral galaxy. The host galaxy appears undisturbed by the quasar radiation, which, in the Big Bang, is supposed to be strong enough to ionize the intergalactic medium. [[46]]


· An excess of QSOs is observed around foreground clusters. Lensing amplification caused by foreground galaxies or clusters is too weak to explain this association between high- and low-redshift objects. This apparent contradiction has no solution under Big Bang premises that does not create some other problem. In particular, dark matter solutions would have to be centrally concentrated, contrary to observations that imply that dark matter increases away from galaxy centers. The high-redshift and low-redshift objects are probably actually at comparable distances, as Arp has maintained for 30 years. [[47]]


· The Big Bang violates the first law of thermodynamics, that energy cannot be either created or destroyed, by requiring that new space filled with “zero-point energy” be continually created between the galaxies. [[48]]


· In the Las Campanas redshift survey, statistical differences from a homogeneous distribution were found out to a scale of at least 200 Mpc. [[49]] This is consistent with other galaxy catalog analyses that show no trends toward homogeneity even on scales up to 1000 Mpc. [[50]] The Big Bang, of course, requires large-scale homogeneity. The Meta Model and other infinite-universe models expect fractal behavior at all scales. Observations remain in agreement with that.


· Elliptical galaxies supposedly bulge along the axis of the most recent galaxy merger. But the angular velocities of stars at different distances from the center are all different, making an elliptical shape formed in that way unstable. Such velocities would shear the elliptical shape until it was smoothed into a circular disk. Where are the galaxies in the process of being sheared?


· The polarization of radio emission rotates as it passes through magnetized extragalactic plasmas. Such Faraday rotations in quasars should increase (on average) with distance. If redshift indicates distance, then rotation and redshift should increase together. However, the mean Faraday rotation is less near z = 2 than near z = 1 (where quasars are apparently intrinsically brightest, according to Arp’s model). [[51]]


· If the dark matter needed by the Big Bang exists, microwave radiation fluctuations should have “acoustic peaks” on angular scales of 1° and 0.3°, with the latter prominent compared with the former. By contrast, if Milgrom’s alternative to dark matter (Modified Newtonian Dynamics) is correct, then the latter peak should be only about 20% of the former. Newly acquired data from the Boomerang balloon-borne instruments clearly favors the MOND interpretation over dark matter. [[52]]


· Redshifts are quantized for both galaxies [[53],[54]] and quasars [[55]]. So are other properties of galaxies. [[56]] This should not happen under Big Bang premises.


· The number density of optical quasars peaks at z = 2.5-3, and declines toward both lower and higher redshifts. At z = 5, it has dropped by a factor of about 20. This cannot be explained by dust extinction or survey incompleteness. The Big Bang predicts that quasars, the seeds of all galaxies, were most numerous at earliest epochs. [[57]]


· The falloff of the power spectrum at small scales can be used to determine the temperature of the intergalactic medium. It is typically inferred to be 20,000°K, but there is no evidence of evolution with redshift. Yet in the Big Bang, that temperature ought to adiabatically decrease as space expands everywhere. This is another indicator that the universe is not really expanding. [[58]]


· Under Big Bang premises, the fine structure constant must vary with time. [[59]]


· Measurements of the two-point correlation function for optically selected galaxies follow an almost perfect power law over nearly three orders of magnitude in separation. However, this result disagrees with n-body simulations in all the Big Bang’s various modifications. A complex mixture of gravity, star formation, and dissipative hydrodynamics seems to be needed. [[60]]


· Emission lines for z > 4 quasars indicate higher-than-solar quasar metallicities. [[61]] The iron to magnesium ratio increases at higher redshifts (earlier Big Bang epochs). [[62]] These results imply substantial star formation at epochs preceding or concurrent with the QSO phenomenon, contrary to normal Big Bang scenarios.


· The absorption lines of damped Lyman-alpha systems are seen in quasar spectra. However, the HST NICMOS spectrograph has searched for these objects directly in the infrared and failed, for the most part, to detect them. [[63]] Moreover, the relative abundances have surprising uniformity, unexplained in the Big Bang. [[64]] The simplest explanation is that the absorbers are in the quasar’s own environment, not at their redshift distance as the Big Bang requires.


· The luminosity evolution of brightest cluster galaxies (BCGs) cannot be adequately explained by a single evolutionary model. For example, BCGs with low x-ray luminosity are consistent with no evolution, while those with high x-ray luminosity are brighter on average at high redshift. [[65]]


· The fundamental question of why bound aggregates of order 100,000 stars (globular clusters) were able to form at early cosmological times remains unsolved in the Big Bang. It is no mystery in infinite universe models. [[66]]


· Blue galaxy counts show an excess of faint blue galaxies by a factor of 10 at magnitude 28. This implies that the volume of space is larger than in the Big Bang, where it should get smaller as one looks back in time. [[67]]

Perhaps never in the history of science has so much quality evidence accumulated against a model so widely accepted within a field. Even the most basic elements of the theory, the expansion of the universe and the fireball remnant radiation, remain interpretations with credible alternative explanations. One must wonder why, in this circumstance, four good alternative models are not even being comparatively discussed by most astronomers.

Acknowledgments


Obviously, hundreds of professionals, both astronomers and scientists from other fields, have contributed to these findings, although few of them stand back and look at the bigger picture. It is hoped that many of them will add their comments and join as co-authors in an attempt to persuade the upcoming generation of astronomers that the present cosmology is headed nowhere, and to join the search for better answers.

References
[[1]] T. Van Flandern (1997), MetaRes.Bull. 6, 64; <http://metaresearch.org>, “Cosmology” tab, “Cosmology” sub-tab.
[[2]] T. Van Flandern, “Did the universe have a beginning?”, Apeiron 2, 20-24 (1995); MetaRes.Bull. 3, 25-35 (1994); http://metaresearch.org, “Cosmology” tab, “Cosmology” sub-tab.
[[3]] T. Van Flandern (1999), Dark Matter, Missing Planets and New Comets, North Atlantic Books, Berkeley (2nd ed.).
[[4]] Sir Arthur Eddington (1926), “The temperature of space”, Internal constitution of the stars, Cambridge University Press, reprinted 1988, chapter 13.
[[5]] Regener (1933), Zeitschrift für Physik; confirmed by Nernst (1937).
[[6]] Finlay-Freundlich (1954).
[[7]] E.J. Lerner, (1990), “Radio absorption by the intergalactic medium”, Astrophys.J. 361, 63-68.
[[8]] T. Van Flandern, “Is the microwave radiation really from the big bang 'fireball'?”, Reflector (Astronomical League) XLV, 4 (1993); and MetaRes.Bull. 1, 17-21 (1992).
[[9]] (2002), Nature 415, vii & 27-29 & 54-57.
[[10]] (1997), Astrophys.J. 489, L119-L122.
[[11]] (2000), Science 290, 1257.
[[12]] (2000), Nature 405, 1009-1011 & 1025-1027.
[[13]] (2000), Science 290, 1257.
[[14]] (2002), Astrophys.J. 566, 252-260.
[[15]] (2001), Astrophys.J. 552, L1-L5.
[[16]] F. Hoyle, G. Burbidge, J.V. Narlikar (2000), A Different Approach to Cosmology, Cambridge University Press, Cambridge, Chapter 9: “The origin of the light elements”.
[[17]] (2001), Science 291, 579-581.
[[18]] E.J. Lerner (1991), The Big Bang Never Happened, Random House, New York, pp. 23 & 28.
[[19]] T. Van Flandern (1992), “Quasars: near vs. far”, MetaRes.Bull. 1, 28-32; <http://metaresearch.org>, “Cosmology” tab, “Cosmology” sub-tab.
[[20]] H.C. Arp (1998), Seeing Red, Apeiron, Montreal.
[[21]] (2002), Astrophys.J. 566, 705-711.
[[22]] (1999), Nature 399, 539-541.
[[23]] (1999); Sky&Tel. 98 (Oct.), 20.
[[24]] D.S. Mathewson, V.L. Ford, & M. Buchhorn (1992), Astrophys.J. 389, L5-L8.
[[25]] D. Lindley (1992), Nature 356, 657.
[[26]] (1999), Astrophys.J. 512, L79-L82.
[[27]] (1993), Science 257, 1208-1210.
[[28]] (1996), Astrophys.J. 470, 49-55.
[[29]] T. Van Flandern (1996), “Possible new properties of gravity”, Astrophys.&SpaceSci. 244, 249-261; MetaRes.Bull. 5, 23-29 & 38-50; <http://metaresearch.org>, “Cosmology” tab, “Gravity” sub-tab.
[[30]] T. Van Flandern (2001), “Physics has its principles”, Redshift and Gravitation in a Relativistic Universe, K. Rudnicki, ed., Apeiron, Montreal, 95-108; MetaRes.Bull. 9, 1-9 (2000).
[[31]] (2001), Astron.J. 122, 2833-2849.
[[32]] (2001), Astron.J. 122, 2850-2857.
[[33]] (2002), Astrophys.J. 565, 50-62.
[[34]] (2002), <http://www.ifa.hawaii.edu/users/cowie/z6/z6_press.html>.
[[35]] (2000), Astrophys.J. 530, 17-35.
[[36]] (1999), Nature 398, 25-26.
[[37]] (2000), Science 290, 1923.
[[38]] (1999), Phys.Today Sept, 13, 15, 78.
[[39]] E.L. Wright (2000), <http://www.astro.ucla.edu/~wright/errors.html>.
[[40]] (2001), Astron.J. 122, 1084-1103.
[[41]] H. Kurki-Suonio (1990), Sci.News 137, 287.
[[42]] C. Seife (2000), “Fly’s Eye spies highs in cosmic rays’ demise”, Science 288, 1147.
[[43]] (2000), Sci.News 158, 86.
[[44]] (1997), Science 278, 226.
[[45]] (2000), Astrophys.J. 530, 1-16.
[[46]] (2002), <http://oposite.stsci.edu/pubinfo/PR/96/35/A.html>.
[[47]] (2000), Astrophys.J. 538, 1-10.
[[48]] B.R. Bligh (2000), The Big Bang Exploded!, <brbligh@hotmail.com>.
[[49]] (2000), Astrophys.J. 541, 519-526.
[[50]] (1999), Nature 397, 225.
[[51]] (1998), Seeing Red, H. Arp, Apeiron, Montreal, 124-125.
[[52]] McGaugh (2001), Astronomy 29#3, 24-26.
[[53]] (1992), Astrophys.J. 393, 59-67.
[[54]] Guthrie & Napier (1991), Mon.Not.Roy.Astr.Soc. 12/1 issue.
[[55]] (2001), Astron.J. 121, 21-30.
[[56]] (1999), Astron.&Astrophys. 343, 697-704.
[[57]] (2001), Astron.J. 121, 54-65.
[[58]] (2001), Astrophys.J. 557, 519-526.
[[59]] (2001), Phys.Rev.Lett. 9/03 issue.
[[60]] (2001), Astrophys.J. 558, L1-L4.
[[61]] (2002), Astrophys.J. 565, 50-62.
[[62]] (2002), Astrophys.J. 565, 63-77.
[[63]] (2002), Astrophys.J. 566, 51-67.
[[64]] (2002), Astrophys.J. 566, 68-92.
[[65]] (2002), Astrophys.J. 566, 103-122.
[[66]] (2002), Astrophys.J. 566, L1-L4.
[[67]] (1992), Nature 355, 55-58.


Fukushima and the death of the Pacific Ocean - a list of links

Scientists are “baffled”, “befuddled”, “concerned”, and “curious” about the die-off of the Pacific Ocean.  Almost no one mentions the F word, Fukushima.  I put together these links from Enenews.com to show a direct correlation between Fukushima and the die-off, even if scientists refuse to admit it.  Every story comes from a major publication.  These stories have been out since the beginning of the meltdowns.  There is not one scientist studying this who does not have access to this information.  Either they chose to look the other way or they are incompetent researchers who don’t really want to know the truth.

There are a thousand links here, all in chronological order.  If you are pressed for time, skip to the back and read backwards to see how bad it really is.  I put this file together so that when someone says prove it, at least there are 50 plus pages of links that prove that Fukushima is killing the North Pacific Ocean.  Humans are next.



30 Minutes Exposure To 4G Cell Phone Radiation Affects Brain

The peer-reviewed journal Clinical Neurophysiology has just published research showing that 30 minutes of exposure to LTE cell phone radiation affects brain activity on both sides of the brain.

Researchers exposed the right ear of 18 participants to LTE radio frequency radiation for 30 minutes. The absorbed amount of radiation in the brain was well within international (ICNIRP) cell phone legal limits and the source of the radiation was kept 1 cm from the ear. To eliminate study biases the researchers employed a double blind, crossover, randomized design, exposing participants to real and sham exposures.

The resting state brain activity of each participant was measured by functional magnetic resonance imaging (fMRI) twice, once after exposure to LTE radio frequency radiation, and then again after a sham exposure.

The results demonstrate that radio frequency radiation from LTE 4G technology affects brain neural activity both in the region closer to the phone and in more remote regions, including the left hemisphere of the brain.

LTE Fastest Developing Mobile System Technology Ever

This study is important for two reasons. Firstly because it is the first one to be carried out on the short-term effects of Long Term Evolution (LTE), fourth generation (4G) cell phone technology. Secondly, because of the rapid rate of adoption of this technology.

According to the Global mobile Suppliers Association “LTE is the fastest developing mobile system technology ever”.  The United States is the largest LTE market in the world. By March 2013 the global total of LTE subscriptions was already 91 million subscribers. Over half of these, 47 million, were American 4G subscribers.


Cell Phone Exposures and Disease

This study establishes that short-term exposure to LTE radio frequency radiation affects brain activity. The long-term effects of these exposures have yet to be studied but there is already considerable evidence linking these exposures to a myriad of adverse biological effects including:

Sperm damage

DNA breaks

Increased glucose in the brain

Weakened bones

Genetic stress

Immune system dysfunction

Effects on unborn children

More worrying is the link between these exposures and a long list of diseases such as:

Alzheimer’s disease

Autism

Brain Tumors

Breast cancer

Brain cancer

More research is needed on the effects of LTE and other forms of cell phone radiation but the evidence is already compelling.  Many scientific and medical experts are sounding the alarm.

via 30 Minutes Exposure To 4G Cell Phone Radiation Affects Brain.


CDC Admits 98 Million Americans Received Polio Vaccine Contaminated With Cancer Virus

The CDC has quickly removed a page from their website, which is now cached here, admitting that more than 98 million Americans received one or more doses of polio vaccine within an 8-year span when a proportion of the vaccine was contaminated with a cancer-causing polyomavirus called SV40. It has been estimated that 10-30 million Americans could have received an SV40-contaminated dose of the vaccine.

SV40 is an abbreviation for Simian vacuolating virus 40 or Simian virus 40, a polyomavirus that is found in both monkeys and humans. Like other polyomaviruses, SV40 is a DNA virus that has been found to cause tumors and cancer.

SV40 is believed to suppress the transcriptional properties of the tumor-suppressing genes in humans through the SV40 Large T-antigen and SV40 Small T-antigen. Mutated genes may contribute to uncontrolled cellular proliferation, leading to cancer.

Michele Carbone, Assistant Professor of Pathology at Loyola University in Chicago, has recently isolated fragments of the SV-40 virus in human bone cancers and in a lethal form of lung cancer called mesothelioma. He found SV-40 in 33% of the osteosarcoma bone cancers studied, in 40% of other bone cancers, and in 60% of the mesothelioma lung cancers, writes Geraldo Fuentes.

Dr. Michele Carbone openly acknowledged HIV/AIDS was spread by the hepatitis B vaccine produced by Merck & Co. during the early 1970s. It was the first time since the initial transmissions took place in 1972-74 that a leading expert in the field of vaccine manufacturing and testing had openly admitted Merck & Co.’s liability for AIDS.

The matter-of-fact disclosure came during discussions of polio vaccines contaminated with SV40 virus which caused cancer in nearly every species infected by injection. Many authorities now admit much, possibly most, of the world’s cancers came from the Salk and Sabin polio vaccines, and hepatitis B vaccines, produced in monkeys and chimps. 

It is said mesothelioma is a result of asbestos exposure, but research reveals that 50% of the current mesotheliomas being treated no longer occur due to asbestos but rather to the SV-40 virus contained in the polio vaccination. In addition, according to researchers from the Institute of Histology and General Embryology of the University of Ferrara, SV-40 has turned up in a variety of other tumors. By the end of 1996, dozens of scientists reported finding SV40 in a variety of bone cancers and a wide range of brain cancers, which had risen 30 percent over the previous 20 years.

The SV-40 virus is now being detected in tumors removed from people never inoculated with the contaminated vaccine, leading some to conclude that those infected by the vaccine might be spreading SV40.

Soon after its discovery, SV40 was identified in the oral form of the polio vaccine produced between 1955 and 1961 produced by American Home Products (dba Lederle).

Both the oral, live virus and injectable inactive virus were affected.  It was found later that the technique used to inactivate the polio virus in the injectable vaccine, by means of formaldehyde, did not reliably kill SV40.

Just two years ago, the U.S. government finally added formaldehyde to a list of known carcinogens and admitted that the chemical styrene might cause cancer.  Yet, the substance is still found in almost every vaccine.

According to the Australian National Research Council, fewer than 20% but perhaps more than 10% of the general population may be susceptible to formaldehyde and may react acutely at any exposure level. More hazardous than most chemicals in 5 out of 12 ranking systems, on at least 8 federal regulatory lists, it is ranked as one of the most hazardous compounds (worst 10%) to ecosystems and human health (Environmental Defense Fund).

In the body, formaldehyde can cause proteins to irreversibly bind to DNA. Laboratory animals exposed to doses of inhaled formaldehyde over their lifetimes have developed more cancers of the nose and throat than are usual. 

Facts Listed on The CDC Website about SV40

-SV40 is a virus found in some species of monkey.

-SV40 was discovered in 1960. Soon afterward, the virus was found in polio vaccine.

-SV40 virus has been found in certain types of cancer in humans.


Additional Facts

-In the 1950s, rhesus monkey kidney cells, which contain SV40 if the animal is infected, were used in preparing polio vaccines.

-Not all doses of IPV were contaminated. It has been estimated that 10-30 million people actually received a vaccine that contained SV40.

-Some evidence suggests that receipt of SV40-contaminated polio vaccine may increase risk of cancer.


A Greater Perspective on Aerial Spraying and SV40

On September 23, 1998, the Defense Sciences Office of the Pathogen Countermeasures Program funded the University of Michigan’s principal investigator, Dr. James Baker, Jr., Director of the Michigan Nanotechnology Institute for Medicine and Biological Sciences, under several DARPA grants. Dr. Baker’s work focused on preventing pathogens from entering the human body, a major goal in the development of countermeasures to biological warfare. This research project sought to develop a composite material that will serve as a pathogen avoidance barrier and post-exposure therapeutic agent to be applied in a topical manner to the skin and mucous membranes. The composite is modeled after the immune system in that it involves redundant, non-specific and specific forms of pathogen defense and inactivation. This composite material is now utilized in many nasal vaccines and vector control through the use of hydro-gel, nanosilicon gels and actuator materials in vaccines.

Through his research at the University of Michigan, Dr. Baker developed dendritic polymers and their application to medical and biological science. He co-developed a new vector system for gene transfer using synthetic polymers. These studies have produced striking results and have the potential to change the basis of gene transfer therapy. Dendrimers are nanometer-sized water-soluble polymers that can conjugate to peptides or carbohydrates to act as decoy molecules to inhibit the binding of toxins and viruses to cells. They can also complex with and stabilize genetic material for prolonged periods of time, as in a “time released or delayed gene transfer”. Through Dr. Baker’s groundbreaking research, many pharmaceutical and biological pesticide manufacturers can use these principles in DNA vaccine applications that incorporate the Simian Monkey Virus SV40.

WEST NILE VIRUS SPRAYING

In 2006 Michael Greenwood wrote an article for the Yale School of Public Health entitled, “Aerial Spraying Effectively Reduces Incidence of West Nile Virus (WNV) in Humans.” The article stated that the incidence of human West Nile virus cases can be significantly reduced through large scale aerial spraying that targets adult mosquitoes, according to research by the Yale School of Public Health and the California Department of Public Health.

Under the mandate for aerial spraying for specific vectors that pose a threat to human health, aerial vaccines known as DNA Vaccine Enhancements and Recombinant Vaccine against WNV may be tested or used to “protect” the people from vector infection exposures. DNA vaccine enhancements specifically use Epstein-Barr viral capsids with multi human complement class II activators to neutralize antibodies. The recombinant vaccines against WNV use Rabbit Beta-globulin or the poly (A) signal of the SV40 virus. In early studies of DNA vaccines it was found that the negative result studies would go into the category of future developmental research projects in gene therapy. During the studies of poly (A) signaling of the SV40 for WNV vaccines, it was observed that WNV will lie dormant in individuals who were exposed to chicken pox, thus upon exposure to WNV aerial vaccines the potential for the release of chicken pox virus would cause a greater risk to having adult onset Shingles.

CALIFORNIA AERIAL SPRAYING for WNV and SV40

In February 2009 to present date, aerial spraying for the WNV occurred in major cities within the State of California. During spraying of Anaheim, CA a Caucasian female (age 50) was exposed to heavy spraying, while doing her daily exercise of walking several miles. Heavy helicopter activity occurred for several days in this area. After spraying, she experienced light headedness, nausea, muscle aches and increased low back pain. She was evaluated for toxicological mechanisms that were associated with pesticide exposure due to aerial spraying utilizing advanced biological monitoring testing. The test results, which included protein band testing utilizing Protein Coupled Response (PCR) methods, were positive for KD-45. KD-45 is the protein band for SV-40 Simian Green Monkey virus. Additional tests were performed for Epstein-Barr virus capsid and Cytomegalovirus, which are used in bioengineering for gene delivery systems through viral protein envelope and adenoviral protein envelope technology. The individual was positive for both; indicating a highly probable exposure to a DNA vaccination delivery system through nasal inhalation.

The question of the century is how many other viruses and toxins are within current day vaccines that we’ll only find out about in a few decades?

Dave Mihalovic is a Naturopathic Doctor who specializes in vaccine research, cancer prevention and a natural approach to treatment.

Source: Prevent Disease



 


Earth in perspective

“You think man can destroy the planet? What intoxicating vanity. Let me tell you about our planet. Earth is four and a half billion years old. There's been life on it for nearly that long, 3.8 billion years. Bacteria first; later the first multicellular life, then the first complex creatures in the sea, on the land. Then finally the great sweeping ages of animals, the amphibians, the dinosaurs, at last the mammals, each one enduring millions on millions of years, great dynasties of creatures rising, flourishing, dying away -- all this against a background of continuous and violent upheaval. Mountain ranges thrust up, eroded away, cometary impacts, volcano eruptions, oceans rising and falling, whole continents moving, an endless, constant, violent change, colliding, buckling to make mountains over millions of years. Earth has survived everything in its time. It will certainly survive us. If all the nuclear weapons in the world went off at once and all the plants, all the animals died and the earth was sizzling hot for a hundred thousand years, life would survive, somewhere: under the soil, frozen in Arctic ice. Sooner or later, when the planet was no longer inhospitable, life would spread again. The evolutionary process would begin again. It might take a few billion years for life to regain its present variety. Of course, it would be very different from what it is now, but the earth would survive our folly, only we would not. If the ozone layer gets thinner, ultraviolet radiation sears the earth, so what? Ultraviolet radiation is good for life. It's powerful energy. It promotes mutation, change. Many forms of life will thrive with more UV radiation. Many others will die out. Do you think this is the first time that's happened? Think about oxygen. Necessary for life now, but oxygen is actually a metabolic poison, a corrosive gas, like fluorine. When oxygen was first produced as a waste product by certain plant cells some three billion years ago, it created a crisis for all other life on earth. Those plants were polluting the environment, exhaling a lethal gas. Earth eventually had an atmosphere incompatible with life. Nevertheless, life on earth took care of itself. In the thinking of the human being a hundred years is a long time. A hundred years ago we didn't have cars, airplanes, computers or vaccines. It was a whole different world, but to the earth, a hundred years is nothing. A million years is nothing. This planet lives and breathes on a much vaster scale. We can't imagine its slow and powerful rhythms, and we haven't got the humility to try. We've been residents here for the blink of an eye. If we're gone tomorrow, the earth will not miss us.”

Water, Consciousness & Intent

Water is the source of life.  This point was brought home to me in a sweatlodge that I attended a few days ago.  I do not normally drink much before going into a sweat, and by the time I was done with this one, I had come to appreciate water as never before.  At one point the shaman told us to drink it in as he poured precious pitchers of clean water onto the earth floor.   "Think about all of the people who have no access to clean water," he told us.  "Think about the people who have to walk hours just for a bucket of dirty water."  With each door and each song celebrating the sacred water, we became thirstier than ever before.

"Drink with gratitude," he said - and I did and have ever since.

I have seen this video before, but a recent sighting on Zengardner inspired me to post it here.  Our oceans are being decimated.  Not only is the Macondo well leaking again (or an area nearby - BP won't give out any info), but the oceans are being contaminated from Fukushima.  The sea animals are dying.  The marine life is radioactive.  We need to do something to heal the planet and good intentions will pave the way.

Please pray for the water.


Twenty Terrific Facts About Hemp!

by Becca Wolford

Hemp has been grown for over 10,000 years as food, medicine, and textiles. I have composed a list of terrific facts about this amazing plant we know as hemp!

1. The original Levi Strauss jeans were made from hemp fibers.

2. The fiber from 1 acre of hemp (which matures in 100 days) is equivalent to 4 acres of trees (which take decades to mature).

3. Most of the bird seed sold in the U.S. has hemp seed in it.

4. HEMP IS THE NUMBER ONE biomass producer on the planet. It produces ten tons per acre in about a four month period. It is a woody plant containing 77% cellulose (wood has 60% cellulose).

5. All products manufactured from hemp are biodegradable.

6. A hemp shirt in 1776 cost 50 cents to one dollar; a cotton shirt cost 100-200 dollars.

7. The first Bibles were made from hemp.

8. Vincent Van Gogh’s paintings were done on hemp canvas.

9. Hemp is an amazing medicine, treating disorders and diseases such as Crohn’s, Tourette’s, Parkinson’s, depression, bipolar, diabetes, cancer, asthma, heart disease, and many more.

10. Hemp fuel emits 80% less carbon dioxide than fossil fuels, and almost 100% less sulfur dioxide.

11. Hemp is 35% dietary fiber, the highest of all flour grains (and is gluten-free).

12. Hemp is mold-, mildew-, pest-, and insect-repellent.

13. Hemp fabric is UV resistant and will not break down in sunlight.

14. Hemp is the richest source of polyunsaturated fatty acids, with a content of approximately 80%; it is also high in essential amino acids and GLA.

15. A historical note: “Even in England, the much-sought-after prize of full British citizenship was bestowed by a decree of the crown on foreigners who would grow cannabis, and fines were often levied against those who refused.” (The Emperor Wears No Clothes)

16. Alice In Wonderland was originally printed on hemp paper.

17. The hardiness of the hemp plant and the nutritional benefits could eradicate world starvation.

18. Bio-diesel emissions have lower levels of PAH and nitrited PAH compounds (polycyclic aromatic hydrocarbons), which are shown to possibly cause cancer.

19. Hemp is a non-toxic alternative cleaner compared to chemical-based household cleansers.

20. Of the millions of edible plants on the planet earth, no other can compare nutritionally to the value of hempseeds.

About the Author

Becca Wolford is a writer, entrepreneur, artist, reiki practitioner, and hemp activist. She has experienced first-hand the nutritional and healing benefits of hemp and her passion is learning, writing, and educating others about the benefits of hemp – benefits that encompass nutritional health for humans, a healthy environment, and a healthier economy.

Becca also distributes Versativa, an amazing raw, clean, hemp-based nutritional supplement, and Restoration90, a raw, clean, nutritional product with marine phytoplankton, hemp, and essential nutrients for optimum health. Please support her at her excellent blog Hemphealer.com.

This article is offered under Creative Commons license. It’s okay to republish it anywhere as long as attribution bio is included and all links remain intact.


Neurosurgeon Shows How Low Levels of Radiation Such As Wi-Fi, Smart Meters And Cell Phones Cause The Blood Brain Barrier To Leak

by Marco Torres
PreventDisease.com

Neurosurgeon and researcher Dr. Leif Salford has conducted many studies on radio frequency radiation and its effects on the brain. Dr. Salford called the potential implications of some of his research "terrifying." Some of the most concerning conclusions result from the fact that the weakest exposure levels to wireless radiation caused the greatest effect in causing the blood brain barrier to leak. 

Since he began this line of research in 1988, Dr. Leif Salford and his colleagues at Lund University Hospital in Sweden have exposed over 1,600 experimental animals to low-level radiation. Their results were consistent and worrisome: radiation, including that from cell phones, caused the blood-brain barrier--the brain's first line of defense against infections and toxic chemicals--to leak.

Researchers in 13 other laboratories in 6 different countries had reported the same effect, but no one had proven whether it would lead to any damage in the long term. In a study published June 2003 in Environmental Health Perspectives, Salford's team repeated the experiment on 32 additional animals, but this time waited eight weeks before sacrificing them and examining their brains. In those animals that had been exposed to a cell phone, up to two percent of the neurons in all areas of the brain were shrunken and degenerated. 

Salford, chairman of the Department of Neurosurgery at his institution, called the potential implications "terrifying." "We have good reason to believe," he said, "that what happens in rats' brains also happens in humans." Referring to today's teenagers, the study's authors wrote that "a whole generation of users may suffer negative effects, perhaps as early as in middle age." 


Dr. Devra Davis, author of "Disconnect" explains the science of cell phone radiation in a very comprehensive way. For example she shows photos of two cells, one whose DNA has been damaged by "gamma" radiation (which is what was emitted in Hiroshima) and another cell damaged by low level pulsing non ionizing radiation (from a cell phone). Both cells look very damaged compared to a normal cell; but she even goes on to say the DNA from the cell exposed to the cell phone radiation looks worse. She also discusses the campaign to discredit reputable scientists and their studies--some of these reputable studies having been around since 1972 (Frey). 


In May of 2011, the World Health Organization officially recognized that wireless radiation such as emitted by "smart meters" is a possible carcinogen. After decades of corporately-funded, biased research being held up as "industry-standard", there are hundreds of independent peer-reviewed scientific studies now showing there is a clear health hazard with technology emitting wireless radiation in the range that "smart meters" do. Meanwhile, tens of thousands of people with a "smart meter" installed, have contracted illness, insomnia, rashes, headaches, and worse. And many have been forced to leave their homes entirely, due to health effects. What's more, in apartment buildings where 30+ "smart meters" are installed in a single electrical room, the dangers are even higher. There have been no long-term health studies done on this high level of Electromagnetic Radiation.


PBS Interview with Dr. Keith Black (neurosurgeon) regarding WHO's classification of RF electromagnetic radiation as a 2B possible carcinogen. "We haven't had any good studies in the pediatric population. A child's skull is much thinner. . . .and the amount of radiation that goes into the pediatric brain is much higher than in an adult. So we should be cautious with how we allow our children to use a cell phone. They're going to be the ones that not only are going to use it at a much younger age but at a much longer duration."


Let's start connecting the dots and end this madness to our health and the health of future generations. 

Marco Torres is a research specialist, writer and consumer advocate for healthy lifestyles. He holds degrees in Public Health and Environmental Science and is a professional speaker on topics such as disease prevention, environmental toxins and health policy.

Products That Contain Triclosan

via Federal Jack

by Dr. Ben Kim   

If you are not yet aware of the potential dangers of triclosan, you should know that this antibacterial agent has been strongly linked to the following effects on human health:

* Abnormalities with the endocrine system, particularly with thyroid hormone signaling
* Weakening of the immune system
* Birth defects
* Uncontrolled cell growth
* Unhealthy weight loss

Although triclosan is best known for its presence in many brands of antibacterial soap, it is also found in a wide variety of personal care and household products. According to BeyondPesticides.org, triclosan is found in the following products:

Soaps:

* Dial® Liquid Soap
* Softsoap® Antibacterial Liquid Hand Soap
* Tea Tree Therapy™ Liquid Soap
* Provon® Soap
* Clearasil® Daily Face Wash
* Dermatologica® Skin Purifying Wipes
* Clean & Clear Foaming Facial Cleanser
* DermaKleen™ Antibacterial Lotion Soap
* Naturade Aloe Vera 80® Antibacterial Soap
* CVS Antibacterial Soap
* pHisoderm Antibacterial Skin Cleanser

Dental Care:

* Colgate Total®
* Breeze™ Triclosan Mouthwash
* Reach® Antibacterial Toothbrush
* Janina Diamond Whitening Toothpaste

Cosmetics:

* Supre® Café Bronzer™
* TotalSkinCare Makeup Kit
* Garden Botanika® Powder Foundation
* Mavala Lip Base
* Jason Natural Cosmetics Blemish Cover Stick
* Movate® Skin Litening Cream HQ
* Paul Mitchell Detangler Comb
* Revlon ColorStay LipSHINE Lipcolor Plus Gloss
* Dazzle

Deodorant:

* Old Spice High Endurance Stick Deodorant
* Right Guard Sport Deodorant
* Queen Helene® Tea Tree Oil Deodorant and Aloe Deodorant
* Nature De France Le Stick Natural Stick Deodorant
* DeCleor Deodorant Stick
* Epoch® Deodorant with Citrisomes
* X Air Maximum Strength Deodorant

Other Personal Care Products:

* Gillette® Complete Skin Care MultiGel Aerosol Shave Gel
* Murad Acne Complex® Kit®
* Diabet-x™ Cream
* T.Taio™ sponges and wipes
* Aveeno Therapeutic Shave Gel

First Aid:

* SyDERMA® Skin Protectant plus First Aid Antiseptic
* Solarcaine® First Aid Medicated Spray
* Nexcare™ First Aid Skin Crack Care
* First Aid/Burn Cream
* HealWell® Night Splint
* 11-1X1: Universal Cervical Collar with Microban

Kitchenware:

* Farberware® Microban Steakknife Set and Cutting Boards
* Franklin Machine Products FMP Ice Cream Scoop SZ 20 Microban
* Hobart Semi-Automatic Slicer
* Chix® Food Service Wipes with Microban
* Compact Web Foot® Wet Mop Heads

Computer Equipment:

* Fellowes Cordless Microban Keyboard and Microban Mouse Pad

Clothes:

* Merrell Shoes
* Sabatier Chef’s Apron
* Dickies Socks
* Fruit of the Loom Socks
* Biofresh® Socks

Children’s Toys:

* Playskool®:
o Stack ‘n Scoop Whale
o Rockin’ Radio
o Hourglass
o Sounds Around Driver
o Roll ‘n’ Rattle Ball
o Animal Sounds Phone
o Busy Beads Pal
o Pop ‘n’ Spin Top
o Lights ‘n’ Surprise Laptop

Other:

* Bionare® Cool Mist Humidifier
* Microban® All Weather Reinforced Hose
* Thomasville® Furniture
* Deciguard AB Ear Plugs
* Bauer® 5000 Helmet
* Aquatic Whirlpools
* Miller Paint Interior Paint
* QVC® Collapsible 40-Can Cooler
* Holmes Foot Buddy™ Foot Warmer
* Blue Mountain Wall Coverings
* California Paints®
* EHC AMRail Escalator Handrails
* Dupont™ Air Filters
* Durelle™ Carpet Cushions
* Advanta One Laminate Floors
* San Luis Blankets
* J Cloth® towels
* JERMEX mops

Source: BeyondPesticides.org

Please share this information with your families and groups of friends, particularly those that include young children and pregnant women.

SEE ALSO: Antibacterial Soap Weakens Heart and Muscle Function, Study Says

http://drbenkim.com/articles/triclosan-products.htm


The Heart Has Its Own “Brain” and Consciousness

via Waking Times

Many believe that conscious awareness originates in the brain alone. Recent scientific research suggests that consciousness actually emerges from the brain and body acting together. A growing body of evidence suggests that the heart plays a particularly significant role in this process.

Far more than a simple pump, as was once believed, the heart is now recognized by scientists as a highly complex system with its own functional “brain.”

Research in the new discipline of neurocardiology shows that the heart is a sensory organ and a sophisticated center for receiving and processing information. The nervous system within the heart (or “heart brain”) enables it to learn, remember, and make functional decisions independent of the brain’s cerebral cortex. Moreover, numerous experiments have demonstrated that the signals the heart continuously sends to the brain influence the function of higher brain centers involved in perception, cognition, and emotional processing.

In addition to the extensive neural communication network linking the heart with the brain and body, the heart also communicates information to the brain and throughout the body via electromagnetic field interactions. The heart generates the body’s most powerful and most extensive rhythmic electromagnetic field. Compared to the electromagnetic field produced by the brain, the electrical component of the heart’s field is about 60 times greater in amplitude, and permeates every cell in the body. The magnetic component is approximately 5000 times stronger than the brain’s magnetic field and can be detected several feet away from the body with sensitive magnetometers.
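
For readers who think in engineering terms, amplitude ratios like these are conventionally expressed in decibels. A small sketch of that standard conversion, using only the 60x and 5000x figures quoted above:

```python
import math

# Amplitude ratios quoted above (heart field vs. brain field)
electrical_ratio = 60    # electrical component, ~60x greater in amplitude
magnetic_ratio = 5000    # magnetic component, ~5000x stronger

# Standard conversion of an amplitude ratio to decibels: dB = 20 * log10(ratio)
electrical_db = 20 * math.log10(electrical_ratio)   # ~35.6 dB
magnetic_db = 20 * math.log10(magnetic_ratio)       # ~74.0 dB

print(f"Electrical: {electrical_db:.1f} dB, Magnetic: {magnetic_db:.1f} dB")
```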

The heart generates a continuous series of electromagnetic pulses in which the time interval between each beat varies in a dynamic and complex manner. The heart’s ever-present rhythmic field has a powerful influence on processes throughout the body. We have demonstrated, for example, that brain rhythms naturally synchronize to the heart’s rhythmic activity, and also that during sustained feelings of love or appreciation, the blood pressure and respiratory rhythms, among other oscillatory systems, entrain to the heart’s rhythm.

We propose that the heart’s field acts as a carrier wave for information that provides a global synchronizing signal for the entire body. Specifically, we suggest that as pulsing waves of energy radiate out from the heart, they interact with organs and other structures. The waves encode or record the features and dynamic activity of these structures in patterns of energy waveforms that are distributed throughout the body. In this way, the encoded information acts to in-form (literally, give shape to) the activity of all bodily functions—to coordinate and synchronize processes in the body as a whole. This perspective requires an energetic concept of information, in which patterns of organization are enfolded into waves of energy of system activity distributed throughout the system as a whole.
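
The "carrier wave" language is borrowed from radio engineering, where a slow information signal is impressed on a faster carrier by amplitude modulation. A minimal sketch of that textbook mechanism, with an assumed ~1 Hz carrier standing in for the heartbeat; this illustrates the engineering concept being invoked, not a validated physiological model:

```python
import numpy as np

fs = 250.0                           # sample rate in Hz (assumed)
t = np.arange(0, 10, 1 / fs)         # 10 seconds of samples

carrier_hz = 1.0                     # ~1 Hz, roughly a resting heartbeat (assumption)
info_hz = 0.1                        # slow "information" signal (assumption)

carrier = np.sin(2 * np.pi * carrier_hz * t)
info = 0.5 * (1 + np.sin(2 * np.pi * info_hz * t))   # normalized to [0, 1]

# Amplitude modulation: the carrier's envelope now encodes the information signal.
# Recovering the envelope (rectify, then low-pass filter) reads it back out.
am_signal = (1 + 0.5 * info) * carrier
```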

Basic research at the Institute of HeartMath shows that information pertaining to a person’s emotional state is also communicated throughout the body via the heart’s electromagnetic field. The rhythmic beating patterns of the heart change significantly as we experience different emotions. Negative emotions, such as anger or frustration, are associated with an erratic, disordered, incoherent pattern in the heart’s rhythms. In contrast, positive emotions, such as love or appreciation, are associated with a smooth, ordered, coherent pattern in the heart’s rhythmic activity. In turn, these changes in the heart’s beating patterns create corresponding changes in the structure of the electromagnetic field radiated by the heart, measurable by a technique called spectral analysis.
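
For those curious what "spectral analysis" involves here, it is the standard technique of decomposing a time series into its frequency components. A minimal Python sketch of how an ordered versus an erratic heart-rhythm series separates in the frequency domain; the synthetic data and the 0.1 Hz rhythm are assumptions for illustration, not HeartMath's published protocol:

```python
import numpy as np

# Synthetic heart-rhythm (RR-interval) series in seconds, resampled at 4 Hz (assumed)
fs = 4.0
t = np.arange(0, 120, 1 / fs)        # two minutes of samples

# A "coherent" rhythm: heart rate oscillating smoothly at ~0.1 Hz
coherent = 0.85 + 0.05 * np.sin(2 * np.pi * 0.1 * t)

# An "incoherent" rhythm: the same mean with irregular, noisy variation
rng = np.random.default_rng(0)
incoherent = 0.85 + 0.05 * rng.standard_normal(len(t))

def power_spectrum(x, fs):
    """Return frequencies and power of the detrended signal."""
    x = x - x.mean()
    power = np.abs(np.fft.rfft(x)) ** 2
    freqs = np.fft.rfftfreq(len(x), 1 / fs)
    return freqs, power

freqs, p_coh = power_spectrum(coherent, fs)
_, p_inc = power_spectrum(incoherent, fs)

# The coherent series concentrates its power in a narrow peak near 0.1 Hz,
# while the incoherent series spreads power across many frequencies.
peak = freqs[np.argmax(p_coh)]
print(f"Dominant frequency of the coherent rhythm: {peak:.2f} Hz")
```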

More specifically, we have demonstrated that sustained positive emotions appear to give rise to a distinct mode of functioning, which we call psychophysiological coherence. During this mode, heart rhythms exhibit a sine wave-like pattern and the heart’s electromagnetic field becomes correspondingly more organized.

At the physiological level, this mode is characterized by increased efficiency and harmony in the activity and interactions of the body’s systems. [1]

Psychologically, this mode is linked with a notable reduction in internal mental dialogue, reduced perceptions of stress, increased emotional balance, and enhanced mental clarity, intuitive discernment, and cognitive performance.

In sum, our research suggests that psychophysiological coherence is important in enhancing consciousness—both for the body’s sensory awareness of the information required to execute and coordinate physiological function, and also to optimize emotional stability, mental function, and intentional action. Furthermore, as we see next, there is experimental evidence that psychophysiological coherence may increase our awareness of and sensitivity to others around us. The Institute of HeartMath has created practical technologies and tools that all people can use to increase coherence.

Heart Field Interactions Between Individuals

Most people think of social communication solely in terms of overt signals expressed through language, voice qualities, gestures, facial expressions, and body movements. However, there is now evidence that a subtle yet influential electromagnetic or “energetic” communication system operates just below our conscious awareness. Energetic interactions likely contribute to the “magnetic” attractions or repulsions that occur between individuals, and also affect social exchanges and relationships. Moreover, it appears that the heart’s field plays an important role in communicating physiological, psychological, and social information between individuals.

Experiments conducted at the Institute of HeartMath have found remarkable evidence that the heart’s electromagnetic field can transmit information between people. We have been able to measure an exchange of heart energy between individuals up to 5 feet apart. We have also found that one person’s brain waves can actually synchronize to another person’s heart. Furthermore, when an individual is generating a coherent heart rhythm, synchronization between that person’s brain waves and another person’s heartbeat is more likely to occur. These findings have intriguing implications, suggesting that individuals in a psychophysiologically coherent state become more aware of the information encoded in the heart fields of those around them.

The results of these experiments have led us to infer that the nervous system acts as an "antenna," which is tuned to and responds to the electromagnetic fields produced by the hearts of other individuals. We believe this capacity for exchange of energetic information is an innate ability that heightens awareness and mediates important aspects of true empathy and sensitivity to others. Furthermore, we have observed that this energetic communication ability can be intentionally enhanced, producing a much deeper level of nonverbal communication, understanding, and connection between people. There is also intriguing evidence that heart field interactions can occur between people and animals.

In short, energetic communication via the heart field facilitates development of an expanded consciousness in relation to our social world.

The Heart’s Field and Intuition

There are also new data suggesting that the heart’s field is directly involved in intuitive perception, through its coupling to an energetic information field outside the bounds of space and time. Using a rigorous experimental design, we found compelling evidence that both the heart and brain receive and respond to information about a future event before the event actually happens. Even more surprising was our finding that the heart appears to receive this “intuitive” information before the brain. This suggests that the heart’s field may be linked to a more subtle energetic field that contains information on objects and events remote in space or ahead in time. Called by Karl Pribram and others the “spectral domain,” this is a fundamental order of potential energy that enfolds space and time, and is thought to be the basis for our consciousness of “the whole.” (See heartmath.org for further detail.)

Social Fields

In the same way that the heart generates energy in the body, we propose that the social collective is the activator and regulator of the energy in social systems.

A body of groundbreaking work shows how the field of socioemotional interaction between a mother and her infant is essential to brain development, the emergence of consciousness, and the formation of a healthy self-concept. These interactions are organized along two relational dimensions—stimulation of the baby’s emotions, and regulation of shared emotional energy. Together they form a socioemotional field through which enormous quantities of psychobiological and psychosocial information are exchanged. Coherent organization of the mother-child relations that make up this field is critical. This occurs when interactions are charged, most importantly, with positive emotions (love, joy, happiness, excitement, appreciation, etc.), and are patterned as highly synchronized, reciprocal exchanges between these two individuals. These patterns are imprinted in the child’s brain and thus influence psychosocial function throughout life. (See Allan Schore, Affect Regulation and the Origin of the Self.)

Moreover, in a longitudinal study of 46 social groups, one of us (RTB) documented how information about the global organization of a group—the group’s collective consciousness—appears to be transmitted to all members by an energetic field of socio-emotional connection. Data on the relationships between each pair of members were found to provide an accurate image of the social structure of the group as a whole. Coherent organization of the group’s social structure is associated with a network of positively charged emotions (love, excitement, and optimism) connecting all members. This network of positive emotions appears to constitute a field of energetic connection into which information about the group’s social structure is encoded and distributed throughout the group. Remarkably, an accurate picture of the group’s overall social structure was obtained from information only about relationships between pairs of individuals. We believe the only way this is possible is if information about the organization of the whole group is distributed to all members of the group via an energetic field. Such correspondence in information between parts and the whole is consistent with the principle of holographic organization. [2]
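
For the quantitatively minded, the idea that a group's overall structure can be read off from pairwise data alone has a conventional counterpart in network analysis: the pairwise ratings form a matrix, and global properties fall out of that matrix's algebra. A minimal sketch with invented ratings; this is ordinary linear algebra, not the study's actual method:

```python
import numpy as np

# Hypothetical pairwise positive-affect ratings among 4 group members (0..1)
ratings = np.array([
    [0.0, 0.9, 0.8, 0.2],
    [0.9, 0.0, 0.7, 0.1],
    [0.8, 0.7, 0.0, 0.3],
    [0.2, 0.1, 0.3, 0.0],
])

# The leading eigenvector of the (symmetric) ratings matrix gives each
# member's "centrality" in the network of positive connections.
vals, vecs = np.linalg.eigh(ratings)
centrality = np.abs(vecs[:, -1])   # eigenvector for the largest eigenvalue

print(centrality)  # members 0-2 form a tight cluster; member 3 is peripheral
```

On this toy matrix, the leading eigenvector assigns high centrality to the three mutually well-connected members and low centrality to the outlier, recovering a "global" picture from purely pairwise numbers.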

Synthesis and Implications

Some organizing features of the heart field, identified in numerous studies at HeartMath, may also be shared by those of our hypothesized social field. Each is a field of energy in which the waveforms of energy encode the features of objects and events as energy moves throughout the system. This creates a nonlocal order of energetic information in which each location in the field contains an enfolded image of the organization of the whole system at that moment. The organization and processing of information in these energy fields can best be understood in terms of quantum holographic principles. [3]

Another commonality is the role of positive emotions, such as love and appreciation, in generating coherence both in the heart field and in social fields. When the movement of energy is intentionally regulated to form a coherent, harmonious order, information integrity and flow are optimized. This, in turn, produces stable, effective system function, which enhances health, psychosocial well-being, and intentional action in the individual or social group.

Heart coherence and social coherence may also act to mutually reinforce each other. As individuals within a group increase psychophysiological coherence, psychosocial attunement may be increased, thereby increasing the coherence of social relations. Similarly, the creation of a coherent social field by a group may help support the generation and maintenance of psychophysiological coherence in its individual members. An expanded, deepened awareness and consciousness results—of the body’s internal physiological, emotional, and mental processes, and also of the deeper, latent orders enfolded into the energy fields that surround us. This is the basis of self-awareness, social sensitivity, creativity, intuition, spiritual insight, and understanding of ourselves and all that we are connected to. It is through the intentional generation of coherence in both heart and social fields that a critical shift to the next level of planetary consciousness can occur—one that brings us into harmony with the movement of the whole.

For more information on the Institute of HeartMath’s research and publications, please visit www.heartmath.org.

Footnotes

1. Correlates of physiological coherence include: increased synchronization between the two branches of the autonomic nervous system, a shift in autonomic balance toward increased parasympathetic activity, increased heart-brain synchronization, increased vascular resonance, and entrainment between diverse physiological oscillatory systems.

2. Holographic organization is based on a field concept of order, in which information about the organization of an object as a whole is encoded as an interference pattern in energy waveforms distributed throughout the field. This makes it possible to retrieve information about the object as a whole from any location within the field.

3. The term “quantum,” as used in quantum holography, does not mean that this kind of energetic information processing is understood in terms of the principles of quantum physics. Rather, quantum holography is a special, nondeterministic form of holographic organization based on a discrete unit of energetic information called a logon or a “quantum” of information.


97 Percent of Our DNA Has a Higher Purpose And Is Not ‘Junk’ As Labeled By Scientists

by Micheal Forrester, Prevent Disease
Waking Times

After thousands of years of being disconnected from higher dimensional frequencies, our DNA is finally breaking free from old patterns which have been stuck in a universal time matrix. However, humans will soon know and understand why 97% of our DNA has a higher purpose and why its transformation is leading us into an awakening that we never could have imagined.

The human genome is packed with at least four million gene switches that reside in bits of DNA once dismissed as “junk,” but it turns out that this so-called junk DNA plays critical roles in controlling how cells, organs and other tissues behave. The discovery, considered a major medical and scientific breakthrough, has enormous implications for human health and consciousness, because many complex diseases appear to be caused by tiny changes in hundreds of gene switches.

As scientists delved into the “junk” — parts of the DNA that are not actual genes containing instructions for proteins — they discovered a complex system that controls genes. At least 80 percent of this DNA is active and needed. Another 15-17 percent has higher functions that scientists are still decoding.

The result of the work is an annotated road map of much of this DNA, noting what it is doing and how. It includes the system of switches that, acting like dimmer switches for lights, control which genes are used in a cell and when they are used, and determine, for instance, whether a cell becomes a liver cell or a neuron.
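
The "dimmer switch" picture lends itself to a toy model: treat each regulatory element as a dial between 0 and 1 that scales a gene's expression level, so different switch settings yield different cell types from the same genome. All gene and switch names below are invented for illustration; real regulation is far more intricate:

```python
# Toy model: regulatory "switches" scale a gene's expression like dimmers.
# Every name and number here is hypothetical.

def expression(base_level, switch_settings):
    """Multiply a gene's base level by each regulatory dimmer (0.0-1.0)."""
    level = base_level
    for setting in switch_settings.values():
        level *= setting
    return level

# The same genome, two different switch configurations:
liver_switches = {"enhancer_A": 1.0, "silencer_B": 0.9, "enhancer_C": 0.8}
neuron_switches = {"enhancer_A": 0.1, "silencer_B": 0.2, "enhancer_C": 0.05}

albumin_base = 100.0  # arbitrary units
print(expression(albumin_base, liver_switches))   # high: liver-like cell
print(expression(albumin_base, neuron_switches))  # near zero: neuron-like cell
```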

There is evidence for a whole new type of medicine in which DNA can be influenced and reprogrammed by words and frequencies WITHOUT cutting out and replacing single genes.

The Russian researchers’ findings and conclusions are simply revolutionary! According to them, our DNA is not only responsible for the construction of our body but also serves as data storage and in communication. The Russian linguists found that the genetic code, especially in the apparently useless junk DNA, follows the same rules as all our human languages. To this end they compared the rules of syntax (the way in which words are put together to form phrases and sentences), semantics (the study of meaning in language forms) and the basic rules of grammar. They found that the alkalines of our DNA follow a regular grammar and do have set rules just like our languages. So human languages did not appear coincidentally but are a reflection of our inherent DNA.
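
The kind of comparison described, checking whether DNA "text" shows the statistical regularities of human language, can be sketched with a rank-frequency (Zipf) analysis of short substrings. The toy sequence and word sample below are made up for illustration; this is a generic technique, not the Russian team's actual procedure:

```python
from collections import Counter

def zipf_ranks(tokens):
    """Return (rank, frequency) pairs, most frequent first."""
    counts = Counter(tokens)
    freqs = sorted(counts.values(), reverse=True)
    return list(enumerate(freqs, start=1))

# Treat overlapping 3-letter "words" (codon-sized k-mers) as the DNA vocabulary
dna = "ATGGCGTACGATCGATCGGCTAGCTAGGATCGATCGTACGATCG" * 10  # toy sequence
kmers = [dna[i:i + 3] for i in range(len(dna) - 2)]

text = "the quick brown fox jumps over the lazy dog the fox".split()

# In a Zipf-like distribution, frequency falls off roughly as 1/rank;
# one can compare the two rank-frequency curves on a log-log plot.
print(zipf_ranks(kmers)[:5])
print(zipf_ranks(text)[:5])
```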

The Russian biophysicist and molecular biologist Pjotr Garjajev and his colleagues also explored the vibrational behavior of the DNA. The bottom line was: “Living chromosomes function just like solitonic/holographic computers using the endogenous DNA laser radiation.” This means that they managed, for example, to modulate certain frequency patterns onto a laser ray and with it influence the DNA frequency and thus the genetic information itself. Since the basic structure of DNA-alkaline pairs and that of language (as explained earlier) are the same, no DNA decoding is necessary.

This finally and scientifically explains why affirmations, autogenic training, hypnosis and the like can have such strong effects on humans and their bodies. It is entirely normal and natural for our DNA to react to language. While western researchers cut single genes from the DNA strands and insert them elsewhere, the Russians enthusiastically worked on devices that can influence the cellular metabolism through suitably modulated radio and light frequencies and thus repair genetic defects.

Garjajev’s research group succeeded in proving that, with this method, chromosomes damaged by X-rays, for example, can be repaired. They even captured the information patterns of one DNA sample and transmitted them onto another, thus reprogramming cells to another genome. They successfully transformed, for example, frog embryos into salamander embryos simply by transmitting the DNA information patterns! In this way the entire information was transmitted without any of the side effects or disharmonies encountered when cutting out and reintroducing single genes from the DNA. This represents an unbelievable, world-transforming revolution and sensation! All this by simply applying vibration and language instead of the archaic cutting-out procedure! This experiment points to the immense power of wave genetics, which obviously has a greater influence on the formation of organisms than the biochemical processes of alkaline sequences.

This discovery also points to the significance of sound frequencies and vibrations in the origin of human life and the possibility that creation was generated by waves of consciousness. The Phantom DNA effect is a case in point: the energy field of a DNA sample remains detectable by laser light even when the physical sample is removed. At a fundamental level, man is pure energy. In Wave Genetics, the junk DNA functions at a rich infrastructure level of super codes and wave communication, realized in material form as crystalline structures – dynamic gene-holograms in liquid crystals of the chromosome continuum. What this model suggests is that the human gene is part of larger holograms (multiverse) of wave information reality. Hyper-communication, in the form of remote sensing, remote healing and telepathy, is definitely a part of the human protocol.

Scientists are aware that 97% of our DNA is what they call “junk DNA”. They call it junk because they don’t see that we have any use for it. Only 3% of our DNA is wrapped up in the spiraling double helix strand. During the part of the 75,000-year cycle when we are exposed to the most torsion energy waves, our DNA is affected: the 97% “junk” DNA is reorganized from a 2-strand double helix into a 12-strand helix, advancing man in a leap of evolution.

Empty space is not really empty but filled with invisible torsion wave energy at different degrees of concentration. Thus, as the stars and planets drift through the galaxy they pass through different concentrations in very exact intervals of time, with precise cycles that can vary in length from thousands to millions of years. As planets move through periods of high concentration of these torsion waves, a transformation affects the DNA structure on the planet, causing more highly evolved forms of life to replicate more rapidly than less evolved forms. Ample evidence of this is seen in our fossil records, which show evolution occurring in sudden jolts rather than as a gradual process. Mainstream biologists have named the effect “punctuated equilibrium”.


My Comment to the Nuclear Regulatory Commission

We've arranged a global civilization in which most crucial elements profoundly depend on science and technology. We have also arranged things so that almost no one understands science and technology. This is a prescription for disaster. We might get away with it for a while, but sooner or later this combustible mixture of ignorance and power is going to blow up in our faces. ~ Carl Sagan

I tried to get away from the news, and yet I was sucked into the internet today.  There was a 7.6 earthquake in Costa Rica that shook our building, making everyone sort of woozy.  Naturally, I had to spend half the day looking at reports, since the epicenter is near a dear place of mine.  Several of my friends live there.  Between earthquake stories, I checked out Enenews.com, the ultimate source of doom and gloom.  It's where you find stories like Tepco Makes “Critical” Admission: Fukushima Unit 4 quake testing “does not take horizontal shaking into equation” — Claims it can withstand a “6+” quake only apply to VERTICAL shaking and Asahi: Tepco didn’t have enough cash to buy essential supplies as Fukushima reactors melted down and Asahi: White blood cell count spikes in Fukushima worker — 3,000+ may not have used dosimeters at plant just after 3/11, or in case you missed it, Appalling: Tepco admits there’s no prepositioned chemicals at Fukushima plant in event water drains from fuel pool after quake — “They had not even considered it”.

When I read American Students Featured in Fukushima Propaganda Film: The cherries are delicious — The nuclear power plant has brought us together — “Fukushima is here, unchanged to this day”, I thought I had entered the Twilight Zone.  One of the comments took me to http://nukeprofessional.blogspot.com/, which led me to the NRC website.  The only way I could relieve the pressure in my head was to share in the public comments section.  My letter is awaiting moderation and has a zero chance of making it onto their website, despite all this talk of "Open Government".  I signed my name, as I always do.  I do not believe in anonymity when expressing my opinions, and I use my real name whenever possible.  I don't care if a prospective employer checks me out.  (I cannot imagine working for someone else again!)  I stand by my beliefs, even if it puts me closer to Obama's Enemy Kill List.

Here is my letter.  There are no other comments on this blog post.

http://public-blog.nrc-gateway.gov/2012/08/22/taking-the-next-step-building-a-21st-century-digital-government/comment-page-1/#comment-15787

Kelly Thomas September 5, 2012 at 11:01 pm Your comment is awaiting moderation.

The only way you can “build a better platform to better serve the American people” is to create a platform that dismantles all nuclear power plants. How are you serving young Americans right now, the ones who will be forced to maintain these aging nuclear power plants once they have surpassed their lifespans? These plants have a 40-year shelf life, and most are expiring soon. Then what? Is it the responsibility of the next 5,000 generations to maintain these expired nuclear sites? How are you serving the American people by allowing these Extinction Level Event disasters to litter the American landscape?

Fukushima is already an ELE. Three of the cores have melted through the containment vessels and into the Earth. TEPCO says it doesn’t know where they are (NASA can easily spot them, but refuses to show the pictures to the public), but not to worry, because they have everything under control and they have achieved cold shutdown. Never mind that pesky little leak or two and the Photoshopped image of #4, and please take our word for it that we are not killing the Pacific Ocean by continuously pouring waste to sea 24/7. Oh, and let’s pretend that the yellow cloud from the explosion at #2 was hydrogen, even though hydrogen burns white and the explosion was obviously plutonium. How sad that I know such a thing and the NRC doesn’t.

We are facing the worst environmental disaster of humankind, a disaster that IS DESTROYING THE PLANET every passing second, and the NRC is pushing for MORE nuclear power plants! Do you guys have a death wish? Do you believe that Armageddon is around the corner, so what the hell, why not build more nuclear power plants because the world is going to end in some Daliesque radioactive landscape? How can you possibly keep a straight face while putting forth such useless garbage touting Obama’s “Open Government” (the biggest oxymoron in the world)?

Has the NRC been open about the effects of radiation? Of course not. You guys compare the radiation from Fukushima to that of a banana or a plane ride. You ignore the contamination of the US food supply. Meanwhile in Japan, THE CORIUM HAS MELTED THROUGH THE CONTAINMENT VESSELS! There is nothing underneath. The reactors are on a major fault line. This is a freaking disaster!!! If I can figure this out, why can’t the NRC? It is what the common folk refer to as “The China Syndrome.”

The fourth building is near collapse, and when – not if – it does, it is “Adios!” to life on this planet. It’s already happening. The entire Northern Hemisphere has been contaminated by radiation and almost all of the food supply has been affected. There has been a huge spike in stillbirths, miscarriages and mutant babies in the Pacific Northwest and California. Seals are dying. Insects are dying. Plants are changing. The United States is doused in radiation and RadNet is…I am not sure where it is. Last I heard it was checking quarterly instead of daily. We do not know what the true levels of radiation are because this government isn’t an open government, no matter what your PR rep wrote.

I do not know what the solution is, but I know that nuclear energy is not part of the equation. Your new platform must rebrand the agency as the Nuclear Decommissioning Agency – and do it soon, before the US and Japanese economies and governments collapse, because then there will be no one left to fund the decommissioning of these reactors. And god forbid solar activity takes out part of the grid and these plants do not have enough energy to maintain hot and/or cold shutdown. But you guys probably have a plan for this, some super secret plan that does not fall within the parameters of Obama’s “Open Government,” right?

If the agency, its employees, and the PR shills who work for them collectively believe that nuclear power is safe and Fukushima radiation is just bananas, then I propose you put your money where your mouth is and take a little field trip to Fukushima – without radiation gear or even a Geiger counter (because nuclear energy is SAFE!) – and eat everything grown in the Fukushima prefecture. You can put it live on YouTube for all to see! Or you could even do pay-per-view – I think it would be a revenue maker for the federal government! I would pay to watch every GS-13 and up (if they are still called that) and every manager have a picnic outside of Fukushima Daiichi (but don’t sit too close to #4 in case it collapses). Until you are willing to expose yourselves to Fukushima (not that you haven’t been for a year and a half), I think you need to adopt a new platform that better serves Americans and dismantles all nuclear facilities.

I know that signing my name to this will likely put me on some sort of enemy list, if I am not already on one. So be it. At least the person in charge of my dossier will be forced to confront the reality of the situation and to realize just who it is they are serving when they collect their paychecks. At least I cannot be accused of committing a crime against humanity.