Saturday, December 12, 2009

Don't Blame Cows for Climate Change

Global warming (GW) is a well-proven complex of climate changes, which has manifested mainly in the expansion of deserts and in glacier melting over the last 50–70 years.

Because the thermal capacity of oceanic water is roughly a thousand times higher than that of the atmosphere, the global mean temperature of the atmosphere is much less relevant than the global mean temperature of the ocean. And this temperature is still rising in accordance with the increase of carbon dioxide concentration, despite the slight slowdown of atmospheric warming in the recent decade.
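As a rough sanity check of this ratio, here is a back-of-the-envelope computation with round literature values for the masses and specific heats (these numbers are my assumptions, not taken from the post):

```python
# Rough heat-capacity comparison of the ocean vs. the atmosphere.
# All constants are round literature values (assumptions):
ocean_mass = 1.4e21    # kg, total mass of the oceans
ocean_cp = 3990.0      # J/(kg*K), specific heat of seawater
atmos_mass = 5.1e18    # kg, total mass of the atmosphere
atmos_cp = 1005.0      # J/(kg*K), specific heat of dry air

ratio = (ocean_mass * ocean_cp) / (atmos_mass * atmos_cp)
print(f"ocean/atmosphere heat capacity ratio ~ {ratio:.0f}")  # ~1090
```

Whatever the exact factor, the point stands: the ocean, not the air, is where the heat bookkeeping of the planet is done.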

Other phenomena, like the rise of the global temperature of the atmosphere, are less relevant for GW, because they follow the El Niño / La Niña cycle and the solar cycle driven by the wobble of the center of mass of the Sun–Jupiter system (because the mean solar cycle is faster than Jupiter's orbital period, it's probable that other planets affect it too). The center of mass controls the direction of the global current of plasma beneath the surface of the Sun, which affects the frequency of sunspots (i.e. magnetic bubbles in plasma rising to the surface) and solar eruptions. The charged particles of the solar wind penetrating the atmosphere of Earth serve as condensation nuclei for fog and snow, making the Earth's surface more reflective.
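The size of the Sun–Jupiter wobble itself is easy to estimate; with round orbital values (assumed literature numbers, not from the post) the barycentre lands just outside the solar surface:

```python
# Distance of the Sun-Jupiter barycentre from the Sun's centre,
# in units of the solar radius. Round literature values (assumptions):
M_SUN = 1.989e30   # kg
M_JUP = 1.898e27   # kg
A_JUP = 7.785e11   # m, Jupiter's semi-major axis
R_SUN = 6.957e8    # m, solar radius

d = A_JUP * M_JUP / (M_SUN + M_JUP)
print(f"barycentre at {d / R_SUN:.2f} solar radii")  # ~1.07, just above the surface
```

So the Sun really does orbit a point slightly outside its own body, which is the wobble referred to above.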

The effect of ocean heating is not so straightforward. In general, it increases the number of convective cells in our atmosphere and the frequency of their switching, in a similar way to heating water in an open vessel. Note that this change increases the continental character of the weather, so that above continents global warming may even lead to temperature records at both ends of the temperature scale.
At the moment when convective circulation switches from horizontal to vertical, an ice-age period may occur, because the Earth becomes intensively cooled. This is reinforced by hysteresis, because the snow-white surface of the Earth becomes more reflective at the same time. Only after the oceans cool down (which takes some time due to their thermal capacity) is the warm period restored. There are some indications that the onset of an ice age can be very fast (compare the disaster movie The Day After Tomorrow), and a period of fast-paced global warming preceded such an event in the Younger Dryas period, so maybe we are facing an ice age soon.
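The albedo hysteresis invoked here resembles the classic zero-dimensional Budyko/Sellers energy-balance picture, which really does admit two stable climate states. A minimal sketch, with all constants being illustrative assumptions:

```python
import numpy as np

S = 342.0          # mean solar flux per unit area, W/m^2
SIGMA = 5.67e-8    # Stefan-Boltzmann constant
EPS = 0.58         # effective emissivity (crude greenhouse factor, assumed)

def albedo(T):
    # frozen planet is bright (0.6), warm planet is dark (0.3)
    return 0.6 - 0.3 * np.clip((T - 260.0) / 30.0, 0.0, 1.0)

T = np.linspace(220.0, 320.0, 100001)
net = S * (1.0 - albedo(T)) - EPS * SIGMA * T**4   # net heating, W/m^2
roots = T[:-1][np.sign(net[:-1]) != np.sign(net[1:])]
print(roots)   # three equilibria: stable cold, unstable middle, stable warm
```

With these numbers the model settles near either the icehouse state or the warm state depending on where it starts, and that bistability is exactly what "hysteresis" means here.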

Concerning the hypothesis of man-made global warming, it has been shown statistically that people make the weather warmer and drier on a per-week basis (1, 2), so no further evidence is necessary: we can just extrapolate these weekend fluctuations to decades. During past warming periods the rise of carbon dioxide followed warming with a delay of many decades, in contrast to the present situation, so we can see this argument of many skeptics rather as another piece of evidence for the man-made origin of global warming. In addition, we can consider, for example, the September 11, 2001 climate impact study. Measurements showed that without contrails, the local diurnal temperature range (the difference between day and night temperatures) was about 1 degree Celsius higher than immediately before.

In my opinion, human activity has started an irreversible process, which cannot be reversed easily due to the hysteresis described above. Nevertheless, we should use the money from carbon dioxide taxes for faster research into alternative energy sources, to replace fossil fuels as soon as possible. This would be useful with respect to both the prevention of an ice-age period and the prevention of a further rise of carbon dioxide concentration. Carbon dioxide dissolves the shells of coral and plankton, thus destroying fishing grounds and the diversity of the biosphere.

But the main risk of fossil fuel depletion is a global nuclear war over the remaining sources. It's generally ignored that behind the recent oil & food price crisis was always a lost USA war. These wars are very expensive, and in the case of a global nuclear conflict things would get even way, way worse.

Concerning the rise of carbon dioxide assigned to the farming of poor countries, an often-neglected point is that many animals are able to collect proteins from their environment more efficiently than agricultural plants using solar radiation, because they can consume even plants growing in the wild, which people cannot. This is the reason why people in rain forests, deserts or arctic areas are fed preferably by meat: the farming of moose is apparently more economical, and therefore more ecological(!), there than the growing of plants.

For example, the production of one ton of rice requires 2552 m³ of water, whereas the production of one ton of poultry requires 3809 m³. Therefore the consumption of poultry may sound like an ineffective waste of water to someone, but the protein content of rice is nearly ten times lower than that of chicken meat! This explains why people from the deserts of Chad or Mongolia live from pasturage instead of agriculture. I even suspect that livestock farming is more ecological than agriculture as a whole, provided it doesn't use agricultural products (which it usually does). The methane released by cows on pastures is negligible compared to the amount of methane released by the annual decomposition of plants without cows.
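Using the post's water figures and typical dry-matter protein contents (the protein fractions below are my assumptions), the comparison per ton of protein goes the other way:

```python
# Water footprint per ton of protein, not per ton of product.
water_per_ton = {"rice": 2552.0, "poultry": 3809.0}  # m^3 of water per ton (from the post)
protein_frac = {"rice": 0.07, "poultry": 0.20}       # protein content, assumed typical values

water_per_protein = {food: water_per_ton[food] / protein_frac[food]
                     for food in water_per_ton}
for food, w in water_per_protein.items():
    print(f"{food}: {w:,.0f} m^3 of water per ton of protein")
```

Even with a conservative 7% protein figure for rice, poultry needs roughly half the water per ton of protein delivered.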

"Those who do not think about the future, do not deserve to have one."

AWT and cyclic evolution

This post is a reaction to a recent study of thousands of species of plants and animals, which suggests that new species may arise from rare events instead of through an accumulation of small changes made in response to changes in the environment. This is basically an emergent mechanism of AWT, so we can find many analogies to this finding in both social and physical systems.

It's well known that human society evolves "in circles", too. Many human inventions (the Antikythera mechanism and gear mechanisms), theories (plenum or Aether theory) and social arrangements (constitutions and voting systems) were forgotten, reinvented and subsequently abandoned again. We can see it as an analogy to the spreading of energy in a particle system, for example in ripples at the water surface, which transform gradually from longitudinal (Brownian noise) into transversal (capillary waves) and back into longitudinal (gravity waves) again.

A dual model consists in nested particle condensation: the effects of tiny density fluctuations accumulate in an emergent way until they transform into a new homogeneous phase, in which the density of fluctuations increases gradually until it forms a nearly homogeneous phase in which... etc. Note that we can observe dark matter or supersymmetry phenomena here: the chaotic portion of quantitative models can often serve for the formulation of new qualitative models and vice versa, in a similar way to how the electric field transforms into the magnetic one and back again as a light wave spreads through the vacuum.
This transform corresponds to the incidence structure of Loop Quantum Gravity (LQG) related to the Möbius transform, which is isomorphic to the restricted Lorentz group; so if the Universe or a black hole appears like a doughnut, its symmetry becomes broken (i.e. sliced) in the recursive way of a Möbius strip or Klein bottle. We can meet this geometry in the Möbius strip structure of the electron, or inside atomic nuclei in the structure of hadrons (see the previous post). In archetypal and sacred geometry we can meet this concept in the Ouroboros archetype (supposedly based on observations of the Cordylus giganteus skink).

For example, mainstream physics is now in a phase when the number of various theories developed as formal models in a rather ad-hoc way has increased above a critical level, so that new meta-theories have started to emerge. These meta-theories are now in their protoscience state, like vague density fluctuations forming inside a dense gas, but we shouldn't throw the baby out with the bathwater too soon, because the formal approach has apparently reached its limits. We can call it an informational singularity, but I'd prefer a more gradualistic view of this process, in which the phase-shifted boundaries of both formal and non-formal approaches are fuzzy and travel together in duality, like a zone of crystallization through the block universe (1, 2), or, even better, like a light wave spreading through the vacuum.

The model of EM wave spreading is consistent with the Red Queen theory of co-evolution, based on a constant evolutionary arms race between competing species. For example, the high number of insect or deep-sea species can be explained by adaptation to predators or parasites specialized to their prey, where we can find many examples of co-evolution. During this process genotypes oscillate over time in waves phase-shifted by half their period, as if they were "running" in circles, and informational event horizons are formed in accordance with the paleobiological theory of punctuated equilibrium. It means that from a local perspective we can see certain steps in gradualist evolution, which correspond to nested phase transforms in AWT.
This model has its analogies in social systems, too. The frozen plasticity theory (article in Czech), which is based on game theory and the selfish gene model, considers that only a portion of a population can evolve freely; after a certain time it becomes unable to evolve further, and after a catastrophic change it will not survive. This character of evolution, which occurs when natural conditions change fast (as the result of asteroid impacts, global volcanic activity or man-made fossil fuel burning), could be explained by a reservoir of sleeping genes in the so-called "junk" DNA, which are activated in poor life situations of organisms (i.e. infection by virulent agents), where horizontal gene transfer via RNA takes place.

The basic point here is that "junk" DNA is not junk at all; it doesn't serve for the production of proteins, but of various RNAs, which act both as enhancers and suppressors of transcription of proximal genes and which are used by the immune system for the production of antibodies, for example. This is particularly because phylogenetic evolution is too slow to accommodate changes in the environment represented by various infections and parasites. We can expect that at the very beginning of fast-paced organic life, the whole genetic variability was represented just by RNA, as DNA is more advanced stuff. This flexibility can explain Lamarckian-style fast adaptation of offspring to large infections or environmental catastrophes.

Because these events can repeat, we can find many traits of cyclic evolution in the repetitive occurrence of many genes observed in "junk" DNA. Recently, experimental results by Gariaev et al. indicate that some, and perhaps important, aspects of genetic regulation are mediated at a quantum level (1, 2; possibly via a quantum mirage mechanism).

AWT and quark model of hadrons

Using precise data recently gathered at three different laboratories and some new theoretical tools, Gerald A. Miller, a UW physics professor, has found that the neutron has a negative charge both in its inner core and its outer edge, with a positive charge sandwiched in between to make the particle electrically neutral.

This finding can be explained easily by the particle model of AWT, in which the more energetic/massive down quarks (3.5–6.0 MeV/c²) are concentrated below the up quark (1.5–3.3 MeV/c²) near the center of the neutron, like inside a gravitationally coupled Efimov state of three massive bodies of different mass, predicted in 1970. We can consider it a quantum gravity effect at low scale; compare the AWT knot model of the neutron and proton. The same structure, just inverted, is relevant for the proton, where the uncompensated isospin charge of the up quarks manifests itself as electrostatic charge at a distance.

Efimov states exist at every dimensional scale, for example inside hadrons, boson condensates or superconductors. Note that the Efimov trimer state becomes flat when all particles involved are of the same mass, so it's responsible for the fractional charge (quantum Hall effect) in thin layers of graphene or in Han purple BaCuSi2O6, where the path of electrons is geometrically degenerated (frustrated) into a flat structure by an external magnetic field. Analogously, a higher, just less stable/probable, Efimov state exists in four-body systems of boson condensates.

Efimov trimers are an analogy of the chaotic double pendulum, the rods of which are mediated by gravitational force (the N-body problem). It just illustrates the limits of formal math in describing even conceptually quite simple systems, which belong to the realm of Aether theory and must be solved by particle simulations in an iterative/recursive way.
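A minimal example of such an iterative particle simulation is a planar three-body integrator; the masses, initial conditions and time step below are illustrative assumptions (G = 1 units):

```python
import numpy as np

G = 1.0
m = np.array([1.0, 1.0, 1.0])                          # three equal masses
pos = np.array([[-1.0, 0.0], [1.0, 0.0], [0.0, 0.5]])
vel = np.array([[0.0, -0.3], [0.0, 0.3], [0.3, 0.0]])

def acc(pos):
    """Pairwise Newtonian gravitational accelerations."""
    a = np.zeros_like(pos)
    for i in range(len(m)):
        for j in range(len(m)):
            if i != j:
                r = pos[j] - pos[i]
                a[i] += G * m[j] * r / np.linalg.norm(r) ** 3
    return a

dt = 1e-3
a = acc(pos)
for _ in range(10_000):        # kick-drift-kick leapfrog steps
    vel += 0.5 * dt * a
    pos += dt * vel
    a = acc(pos)
    vel += 0.5 * dt * a
print(pos)                     # positions after 10 time units of chaotic motion
```

No closed-form solution exists for this system in general; the orbit is obtained only by stepping the recursion, which is the point being made above.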

Tuesday, December 08, 2009

Does weak equivalence break down at the quantum level?

This post is motivated by a PhysOrg comment on the article Light-pulse atom interferometry in microgravity. It's surprising how deeply scientists are surprised by the fact that a pair of dual theories (relativity and quantum mechanics) are mutually inconsistent. Especially when they already know that these theories give quite different predictions concerning the energy density of the vacuum or the cosmological constant. I mean different by more than one hundred orders of magnitude.

Weak equivalence is indeed violated by the Casimir force, which is proportional to the cross-sectional area of massive objects instead of their mass, so the equivalence principle of general relativity doesn't apply here, and no great speculations, let alone question marks, are required about it.

This insight basically means that the quantum scale begins at the Casimir force scale, which roughly corresponds to the wavelength of the cosmic microwave background radiation (CMB), which in turn roughly corresponds to the size of transversal waves inside the human brain. Photons of CMB are a manifestation of gravitational waves, which are of longitudinal character, so that their shielding, resulting in the Casimir force, is proportional to the cross-sectional area (compare the Fatio de Duillier / Le Sage theory of gravitation). Note that the violation of the equivalence principle is a manifestation of the violation of the dimensionality of 4D space-time, i.e. a manifestation of extra dimensions and a nonzero rest mass of the photon at the same moment. This force is in fact a supersymmetric effect of relativity, i.e. a quantum mechanics effect, too.

Friday, November 13, 2009

AWT and supersymmetry

This post is motivated by a recent New Scientist article about the possibility of validating string theory by the observation of staus or other supersymmetric particles. But such an interpretation is not exact: 4D space-time superalgebra was first discovered by the Soviet physicists Yuri Gol'fand and E. Likhtman, who extended the Poincaré algebra into a superalgebra and discovered supersymmetry in four spacetime dimensions in 1970 (published in 1971), together with Akulov and Volkov (1971/72), independently of string theory. In the same year, 1971, Pierre Ramond, André Neveu and John Schwarz developed a string theory with fermions and bosons, and Gervais and Sakita recognized a version of 2D world-sheet supersymmetry in the new fermionic string theory, i.e. a supersymmetry algebra in two dimensions. This led to Wess and Zumino rediscovering 4D supersymmetry in 1973 (consider this concise review of SUSY history). In general, superalgebra is a spinor extension of quantum mechanics which can be incorporated into whatever other quantum field theory, including the Standard Model and LQG (for example, Lee Smolin promoted it for an advanced version of loop quantum gravity, LQG II), which effectively means it cannot serve as evidence of any particular theory, including the large group of various string theories.

In the context of AWT, supersymmetry is special stuff; it shouldn't be confused with symmetry as such. We can observe it at the water surface, just in quite a limited scope. At the water surface, transversal waves (so-called capillary waves) disperse gradually, thus changing into longitudinal waves (so-called gravity waves; don't confuse them with gravitational waves, although they have a similar nature in AWT). We can see how undulation in one plane shears in a complex way, until it becomes undulation in a perpendicular complex plane. This rotation is closely related to Poincaré transforms in relativity, the Wick rotation in quantum mechanics and Weyl spinors in Cartan's composite geometry.
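The capillary-to-gravity crossover on water sits at a definite scale, which follows from the standard deep-water dispersion relation ω² = gk + (σ/ρ)k³ (textbook physics; the material constants below are assumed values for clean water):

```python
import numpy as np

g = 9.81        # m/s^2
sigma = 0.072   # N/m, surface tension of clean water
rho = 1000.0    # kg/m^3

lam = np.logspace(-3, 0, 2000)          # wavelengths from 1 mm to 1 m
k = 2 * np.pi / lam
c = np.sqrt(g / k + sigma * k / rho)    # phase speed of deep-water waves
i = np.argmin(c)
print(f"slowest wave: lambda = {lam[i] * 100:.1f} cm, c = {c[i]:.2f} m/s")
# shorter waves behave as capillary waves, longer ones as gravity waves
```

The phase-speed minimum near 1.7 cm marks the boundary between the two wave regimes discussed above.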

Whereas waves in one plane could be considered bosons, the other waves resulting from dispersion are fermions, and vice versa. Note that the dispersion is symmetric with respect to distance scale around the CMB wavelength: the very small waves of Brownian noise are longitudinal too! Therefore we can postulate a general gauge group which transforms bosons into fermions and vice versa, infinitely on both sides of the dimensional scale. In the real world the SUSY gauge symmetry remains heavily broken due to the dispersion and subsequent loss of information, though, so we can observe only a few members of it. More illustratively, you wouldn't see very much of the longitudinal waves at the water surface while observing it via transversal waves, and vice versa.

If the surface waves couldn't disperse into density fluctuations of water, then the bosons and fermions (energy and matter carriers) would be destined to remain forever distinct. But in 1975 the Haag–Lopuszanski–Sohnius theorem, named after Rudolf Haag, Jan Lopuszanski and Martin Sohnius, pointed out that if one allows anticommuting operators as generators of the symmetry group, then there is a possibility of unification of internal and space-time symmetries. Such a symmetry is called supersymmetry by now, and it constitutes a large part of current research in particle physics. It means SUSY is just another case of Aether dispersion phenomena at short scales.

In addition, the supersymmetry gauge group is closely connected with the E8 Lie group and Garrett Lisi's famous E8 theory: every energy wave exchanged between a pair of particles (i.e. density fluctuations of foam) behaves like a more or less dense blob of foam, i.e. like a gauge boson particle. Every boson can exchange its energy with other particles, including other gauge bosons, thus forming another generation of intercalated s-particles. The E8 Lie group then solves the nontrivial question: "Which structure should the tightest lattice of particles, exchanged/formed by other particles, have?"

In AWT supersymmetry could be based on the idea that inside a gradient-driven reality every gradient has its mass. When we pile up a huge number of lightweight particles, such a pile would have a larger mass than the simple sum of the original particles, because it creates a more pronounced gradient of mass density/space-time curvature along the surface of the resulting pile. The difference can be assigned to virtual particles, whose nature depends on the composition of the original clusters. For example, surface waves on a large droplet of neutrinos will be formed by so-called neutralinos. If we broke the resulting cluster, we wouldn't find them in their individual state, as they evaporate into gravitational waves, i.e. tachyons.

The same result follows from relativity theory as well, if we think a bit about it. From GR it follows that every curvature of space has its own energy density; this is basically what Einstein's field equations are about. But as we know from the E=mc² formula, every energy density can be assigned a corresponding mass density, which should exhibit its own additional gravitational field and a resulting additional curvature of space. This idea can be applied ad infinitum to the resulting solution, which would make relativity a recursively nested, implicit theory of geometrodynamics. The supersymmetry concept is just a small-distance-scale application of the above implicit property of general relativity. It could be demonstrated that an analogous recursive principle can be applied to quantum mechanics too, and the resulting fractal foam solution would be quite similar, forming a general solution of quantum gravity.
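The recursion described above can be caricatured as a fixed-point iteration: a bare mass m0 sources field energy, which itself gravitates and adds k·m at each order. The coupling k below is a made-up number; the only point is that for |k| < 1 the nesting converges geometrically instead of running away:

```python
m0, k = 1.0, 0.1        # bare mass and a made-up self-coupling
m = m0
for order in range(50):
    m = m0 + k * m      # add the gravitating field-energy term once more
print(m, m0 / (1 - k))  # the iteration converges to the closed-form sum
```

So "applied ad infinitum" need not mean divergent: the infinitely nested corrections can still sum to a finite, well-defined result.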

Aether is not supposed to replace the quantum and/or relativity religion. In addition, there are many concepts and models which could be derived from the Aether concept in a much more straightforward way than just the SUSY concept. It took me some time before I realized what SUSY is all about. In this way, SUSY theory is an achievement of mainstream physics, although the Aether concept could help substantially in understanding it. SUSY theory has its analogies even in the context of biological sciences. Specialized parasites and predators could be considered s-particles related to host organisms in the process of energy dissipation. Whenever the political situation changes, a new social layer of people emerges. These people follow the newly formed gradient of energy density, thus blocking its further evolution, as they play preferably for themselves. These conjunctural zealots have antigravity behavior, and their fanaticism discourages other people who could be interested in the new idea, no matter how useful it could be. It's an analogy of dark matter particles, which surround large particle clusters (strangelets) or galaxies, thus repulsing ordinary matter on behalf of antimatter.

In AWT supersymmetric particles are surface waves of rather large, dense clusters of ordinary particles, which are stabilized by their surface tension. For example, inside neutron stars neutrons are stabilized against their decay into protons and electrons by huge hydrostatic pressure. But the same pressure exists inside atomic nuclei, which behave like tiny dense droplets. It's well known that inside tiny water droplets a high pressure exists due to the surface tension of the highly curved surface. In this way, dense clusters of elementary particles can be stabilized against decay in a similar way as inside quark stars made of strange matter, and they can merge with other particles of ordinary matter into further strangelets via an avalanche-like mechanism. IMO the top quark Yukawa coupling used for Higgs boson detection, nucleon pairing inside atomic nuclei, the observation of pentaquarks, glueballs and/or indications of tetraneutron formation are all stuff of the same category, and it could be attributed to SUSY. The recent spooky observation of muon pair formation well outside of the collider tube at Fermilab could serve as an indication of the formation of strangelets and/or smuons as well.

For example, dark matter is generally believed to be composed of so-called WIMPs, some of which are supposed to be supersymmetric particles predicted by SUSY. But such s-particles must remain very stable to be able to form dark matter, and we haven't observed them in accelerators yet. This is strange, especially in connection with the arguments that the LHC is indeed safe because much more energetic cosmic rays don't form s-particles either. Many people argue against the risk of black hole formation at the LHC by the fact that cosmic rays can be way, way more energetic, and we still haven't found any trace of a black hole during cosmic ray events. But this argument can be reversed easily. If the long-lived stau does exist, it should already have been found in secondary cosmic rays. It hasn't, so it probably does not exist, or the LHC safety argument is wrong... ;-) Why are we expecting the formation of staus at the LHC, then?

The tauon is an ultraheavy lepton, composed in the AWT picture of a pair of strange quarks (1/3 and 2/3 of the electron charge). Analogously to the muon, it could catalyze high-temperature fusion of lithium and beryllium nuclei, so it was proposed as an explanation of the seemingly missing lithium problem in the Big Bang model. This prediction means that if we collect a sufficient amount of tau particles, the resulting cluster of tauons could survive for minutes, thus becoming a strangelet. AWT proposes an explanation based on the dense droplet model of strangelet formation. Cosmic rays are always individual particles, mostly protons, whereas an LHC jet is a dense stream of particles, enabling the piling of particles and the formation of microscopic black holes and strangelets. Energy density isn't the only criterion of strangelet formation here: the particle mass density and the collision geometry play a significant role too. BTW IMO there are better candidates for strangelet formation, composed of neutral and more stable particles, than just tauons (compare the recent observation of muon pairs at Fermilab, which could be attributed to smuons).

The problem with the stau s-particle is that its strangelet should be very stable, being formed by the heaviest dense leptons known so far, but the precursor (i.e. the tauon) is extremely unstable stuff. The optimal approach should balance the stability of both the strangelet and its precursors. From this perspective the smuon is a better candidate for SUSY detection, and in fact it was observed already, in the form of spooky muon pairs well outside of the collider tube at the Tevatron a year ago, i.e. in a similar way to the top-quark pairs in 2008, which could serve as evidence of a heavy Higgs. As we can see, formal theory is one thing; the understanding of where to look for its confirmation is another. The interesting point of these extrapolations (predictions of postdictions) is that I'm foreshadowing the future interpretation of past events. Couldn't it be an example of a situation in which the future affects the past, which was predicted by some quantum theorists recently (compare the critique here)?

The AWT connection of SUSY to strangelets brings another problem to popular dark matter models: WIMP particles would form the surface waves of these droplets, so they shouldn't exist independently of these strangelets. If dark matter is full of WIMPs, it should contain many strangelets as well. Such models really exist, for example in the context of string theory: Randall–Sundrum braneworld models consider the existence of primordial microscopic black holes, which could play the role of strangelets here. But strangelets aren't very stable in general, and currently the only stable strangelets known so far are atomic nuclei. So we can expect either that dark matter contains atomic nuclei in accordance with Alfvén's plasma universe model and the WIMP models of dark matter are BS, or that strangelets aren't related to the AWT model in any way. As we can see, there are still a lotta strange concerns about SUSY.

SUSY is quite a general geometrical concept, which could be expressed in proverbs: "Ne quid nimis" (Nothing in excess) or "The road to hell is paved with good intentions" (El infierno está empedrado de buenas intenciones), and it has many social and political analogies. It means that when we advance in technology too fast, we can surpass our social and moral ability to handle it. Then the technology won't help us; on the contrary. We should always balance the practical pros and cons. At the contemporary level of technology and environmental pollution, we should orient ourselves toward cold fusion or room-temperature superconductivity research ASAP, because it can save us from the geopolitical crisis resulting from the fight over the remnants of fossil fuel supplies and the consequences of global warming droughts. At this moment, LHC research is an expensive and dangerous luxury, whose goals could be achieved in a much safer and more effective way in cosmic space. In addition, we could save money on vacuum pumps, refrigerators, magnets keeping particles on a curved path, isolation against noise, etc.

The stance of contemporary science seems quite irresponsible to me. Scientists are like children who want their toy right now, although they have no idea how to use it and how dangerous it really may be. We have already collected a long enough list of experimental evidence that we are rather close to the point of spontaneous strangelet formation from many particle types, from gluons or quarks to neutrons. In this way, the success of human civilization, led by mainstream science, in SUSY detection could become its very last achievement at the same moment.

Friday, October 16, 2009

Can Time-travelling Higgs sabotage the LHC?

This post is motivated by a recent paper by H. Nielsen and M. Ninomiya and the related NewScientist article and New York Times essay, in which an organized effort to find the Higgs boson would be inherently predestined to be unsuccessful by the laws of thermodynamics and quantum mechanics. The article proposes an explanation of why the USA Congress stopped funding for the USA's SSC in 1993, and why the LHC itself suffered an embarrassing meltdown shortly after starting up last year, just by this aspect of time-travel behavior. This story illustrates that in contemporary science every nonsense can be promoted, provided it's supported by formal math, thus evading an accusation of crackpotism, which obliged some formally thinking bloggers to vindicate this generally accepted difference between speculation and crackpottery. Anyway, as the result of the ongoing discussion, arXiv has reclassified the related papers to the "less serious" General Physics section.

The problem with the commonly used reasoning about physical models via abstract math and/or even computer simulations is indeed the violation of causal hierarchy, in which formal models are always based on predicate logic, not vice versa. Therefore if the underlying model is proven logically wrong, then all the formal derivations based on it become wrong as well, as the destiny of some formally brilliant, though logically misunderstood, models has demonstrated clearly (the hollow Earth theory, the geocentric model of epicycles, the interpretation of the luminiferous Aether model by the Michelson–Morley experiments, etc.). In Aether theory the Higgs model plays no significant role as a causal background, because AWT assumes there are infinitely many levels of space-time compactification, which manifest in the real world in many complex high-dimensional interactions inside complex ecosystems, like the Borneo jungle or human society. Constrained string theory models of ten or twenty-six dimensions cannot be considered the ultimate causal background of the Universe for practical reasons, the Higgs boson background of the Standard Model the less, because the observable world is apparently richer and more dimensional than these models consider.

In addition, the Higgs model is too vague to be considered seriously, because it has more than a single formulation: the Higgs model in classical physics is based on different phenomena than the Higgs–Anderson model in boson condensates, and its technical derivation consists in a mere reshuffling of degrees of freedom by transforming the Higgs Lagrangian in a gauge-invariant manner. The well-known "hierarchy problem" implies that quantum corrections can make the mass of the Higgs particle arbitrarily large, since virtual particles with arbitrarily large energies are allowed in quantum mechanics. Therefore in my opinion physicists are just mixing various concepts and mechanisms together at each level of physical model derivation, from phenomenological to formal, which effectively leads to the prediction of many types of Higgs bosons of different rest mass and behavior, thus making such a hypothesis untestable.

We face this conceptual confusion clearly at the moment when mainstream physics presents some discrete predictions about the Higgs boson. Each particle that couples to the Higgs field has a Yukawa coupling, too. The mass of a fermion is proportional to its Yukawa coupling, meaning that the Higgs boson will couple most strongly to the most massive particle. This means that the most significant corrections to the Higgs mass will originate from the heaviest particles, most prominently the top quark. From the Standard Model it follows that the top quark, whose Yukawa coupling to the Higgs boson is the largest, has nearly the same rest mass (173.1±1.3 GeV/c²) as that predicted for the Higgs boson (178.0±4.3 GeV/c²). We can compare the way in which the Higgs is supposed to be proved and detected at the LHC:

And the way in which the formation of top-quark pairs was already evidenced and detected at Fermilab:

Because the observation agrees well both in the expected Higgs mass and in the expected decay mechanism, it basically means the Higgs boson was already observed as a dilepton channel of top-quark pair decay, and no further research is necessary, investments into LHC experiments the less, from the perspective of evidence for this particular Higgs boson model, which indeed falsifies the above hypothesis of Nielsen & Ninomiya as well. Of course, the conflict of many research interests with the needs of society keeps these connections secret more effectively than any model of time-traveling Higgs thinkable can do. In another way, physicists didn't recognize the duality of the heaviest particle of matter (the top quark) and the Higgs boson, in a similar way to how they didn't recognize the duality of the most lightweight photons and gravitational waves at the opposite side of the energy density spectrum.
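For what it's worth, the closeness of the two mass values quoted above can be expressed in units of their combined uncertainty:

```python
import math

m_top, dm_top = 173.1, 1.3       # GeV/c^2, top quark mass quoted above
m_higgs, dm_higgs = 178.0, 4.3   # GeV/c^2, Higgs estimate quoted above

diff = abs(m_higgs - m_top)
combined = math.hypot(dm_top, dm_higgs)   # uncertainties added in quadrature
print(f"difference = {diff:.1f} GeV/c^2, about {diff / combined:.1f} sigma")
```

The two numbers sit about one standard deviation apart, which is why the coincidence looks so suggestive.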

This stance is nothing very new in contemporary physics, which often looks for evidence at incorrect places, while neglecting or even refusing clear evidence from the dual view of AWT. We can compare it to the search for the event horizon during travel into a black hole, while it's evident from a more distant/general perspective that we crossed it already. The "unsuccessful" search for the luminiferous Aether, while ignoring the dense Aether model, is the iconic case of this confusion, but we can find many other analogies here. For example, scientists are looking for evidence of Lorentz symmetry violation and hidden dimensions in violations of the gravitational law, while ignoring the Casimir force, or they try to search for gravitational waves while filtering out noise from detectors, just because they don't understand their subject at a transparent, intuitive level.

Apparently, the additional cost of research and the general confusion of lay society are the logical consequences of this collective ignorance, while it keeps many scientists in their safe jobs and salaries in the same way as the mysticism of the Catholic Church did in the medieval era - so I don't believe in comprehension and subsequent atonement in real time.

Monday, October 12, 2009

Rachel Bean: GR is probably (98%) wrong

This post is motivated by a recent finding of Rachel Bean, who found that various WMAP, 2MASS, SDSS and COSMOS data concerning the Sachs-Wolfe effect, galaxy distributions, the weak lensing shear field and the cosmic expansion history don't fit the general theory of relativity (GR for short). The reactions of Sean Carroll and/or Lubos Motl are careful, as one may expect: "well, this could be challenging - but probably irrelevant, because GR has proved itself so many times, but science should care about such details, mumbojumbo..."

Jeez - but how was GR derived eighty years ago? This theory puts an equivalence between the curvature of space and the spatial distribution of the energy of the gravitational potential, as borrowed from Newton's theory (because we really have no better source for the function of gravitational potential with distance than the centuries-old gravitational law). So, if we know the mass of an object, we can compute the spatial distribution of its potential energy, so we can compute the spatial distribution of space-time curvature - end of story (of GR). Or not?

Not at all, because from the very same theory it follows that energy density is equivalent to mass density by the E=mc² formula - so we are facing a new distribution of matter in space, which should lead to another distribution of space-time curvature and gravitational potential energy, which leads to another distribution of matter, and so on - recursively. Such an implicit character of GR was never mentioned in the classical field theory of GR and the corresponding textbooks - so it's nothing strange that it conflicts with the observations available by now. But it's still a prediction of GR postulates and it fits well with the fractal, implicit character of the Universe and AWT - it just requires deriving Einstein's field equations more consequently and thoroughly.
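The recursion described above can be illustrated with a toy fixed-point iteration. The quadratic self-energy term and the coupling constant k below are illustrative choices of mine (Newtonian field energy scales with the square of the source mass), not a derivation from the field equations:

```python
def effective_mass(m_bare, k=0.05, tol=1e-12, max_iter=1000):
    """Toy model of the recursive correction: the field energy of the source
    (crudely modeled as k*m**2) is fed back as additional source mass
    until the value becomes self-consistent."""
    m = m_bare
    for _ in range(max_iter):
        m_next = m_bare + k * m * m
        if abs(m_next - m) < tol:
            return m_next
        m = m_next
    raise RuntimeError("no convergence: coupling too strong")

print(effective_mass(1.0))  # ~1.0557: the field's own energy adds mass
```

For a weak coupling the loop converges to a finite "dressed" mass slightly above the bare one; for a strong coupling it runs away, which is the toy analogue of the self-consistency problem sketched above.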

Wow, this could be a real breakthrough in physics and a challenging task for a new Einstein - or not? Of course not - and here we come to the real problem of contemporary science - because such an approach is fifty years old already and it's even used in dark matter theory, in fact. Such a modification would lead to the quantization of gravity and the long-awaited quantum gravity - the only problem for formally thinking physmatics is that it brings quantum chaos into the ordered world of formal relativity too, as there is a (nearly) infinite number of ways to derive it - and all of them are still only approximations of the real situation. The names like Cartan, Evans, Heim, Yilmaz, J. Bekenstein or Rudi V. Nieuwenhove are all dealing with this approach in more or less straightforward form - but this cannot change the thinking of incompetent, though loudly blogging people, who invested two or more years of their life into learning GR derivations until they became "productive" with them (as measured by the number of articles published) - so now they simply have no time and/or mental capacity to understand something new, still less to extrapolate it.

Of course it's not just a problem of a few disoriented bloggers, but of the inertia of the whole mainstream community, whose size prohibits the introduction of new ideas and which has chosen the formal approach to classical theories as a salary generator for its safe life. In this way, every new idea or derivation is simply forgotten, until it's revealed again in another, slightly different connection, when everyone appears surprised: how is it possible that GR isn't working properly?

Tuesday, September 29, 2009

AWT and gravitational waves

The concept of gravitational waves (GWs) belongs among the subjects where AWT can bring substantial insight and testable predictions immediately, even from a purely qualitative point of view. This is particularly because GWs were a subject of controversy from their very beginning, more than fifty years ago. Even Albert Einstein believed in neither the black hole concept nor GWs very much (1, 2), and now, after seventy years, we still have no direct observational evidence of either phenomenon. Who is responsible for it - or isn't the whole truth a bit different? AWT proposes a simple explanation of this paradox, following the analogy of the vacuum with a water surface, where the transversal light wave corresponds to the surface wave and GWs correspond to the longitudinal waves spreading through the hidden dimensions of the underwater. This analogy explains, too, why we haven't observed gravitational waves already, while ignoring the CMB noise in the role of GWs.

This post was motivated by recent articles announcing a null result in the search for primordial GWs at frequencies around 100 hertz at LIGO and VIRGO (1, 2, 3). These GWs should have been created during the first instants of the Universe's existence. It's somewhat surprising - but still logical with respect to the common disbelief in the Aether model - that while the microwave photons of the cosmic microwave background (CMB) were correctly recognized in the famous expanding balloon analogy as a remnant of the primordial gamma ray photons covering the surface of our Universe, expanded during inflation, the same photons weren't considered primordial GWs collapsed during inflation from the supersymmetric perspective of AdS/CFT duality.

Such duality points directly to the equivalence of primordial photons and gravitons, which was transformed into a duality of CMB photons and gravitational waves. It means the primordial gravitational waves being searched for are just the tiny density fluctuations of vacuum, responsible for the CMB noise which was laboriously filtered out from the signal in the LIGO/VIRGO detectors! This is a funny situation, indeed. The equivalence of CMB photons and gravitational waves could be understood as a manifestation of GUT, too, because Big Bang theory assumes that during the Universe's formation all interactions were merged into a single one. So it's not so strange that we are observing both primordial gravitons and primordial gravitational waves like primordial photons, i.e. like the microwave background (CMB) - every other result would violate GUT.
There are at least three levels of dumbness:
  1. The first level is to spend money while trying to find artifacts which cannot be observed by their very definition. But this is basically what research is about, according to R. Feynman ("Research is what I do when I don't know what I'm doing").
  2. The second level is to spend money while trying to find artifacts which everyone can detect in his TV antenna (i.e. the background noise in GW detectors).
  3. The third level of stupidity is to intentionally ignore / filter / wipe out just those results of observations which are supposed to be detected.
It's not the dumbness of the scientists involved, though, because these scientists aren't spending their own money: this money comes from our taxes. It's just our dumbness.

Gravitational waves are deformations of space-time curvature, i.e. they manifest as density fluctuations of space. They shouldn't be confused with CMB photons - a CMB photon is a component of a density fluctuation which propagates at light speed, but over short distances only. Gravitational waves forming the CMB noise at human scale are dual to gravitons at Planck scale: we can say they're gravitons expanded during the Big Bang, in a similar way as CMB photons are a form of gamma radiation expanded during the inflation of the early universe. In addition, GWs shouldn't be confused with gravity waves, which are a product of gravity and EM interaction coupling in a material environment. A direct analogy of GWs in the AWT model of the water surface are underwater sound waves, which spread in extra dimensions with respect to the surface waves. Due to the low compressibility and high density of water, such underwater waves can be observed at the water surface only during the most intensive explosions, as in the case of underwater nuclear explosions (compare YouTube videos 1, 2, 3). Because sound spreads in a higher number of dimensions than the surface waves, it suffers faster dispersion in comparison with them.

Inside the vacuum - which is a much denser phase of Aether than water - such effects are even more pronounced and gravitational waves disperse there at the distance scale corresponding to the wavelength of the CMB (~2 cm). The Casimir force can serve as evidence of GW dispersion: it can be detected at micrometer scale and its distance dependence corresponds to longitudinal wave shielding in six dimensions, thus violating the equivalence principle of general relativity, being proportional to the area, not to the inertial mass, of objects. The shielding effect of gravitational waves from the CMB manifests even at cosmological scale as the anomalous deceleration of Pioneer and other space probes (1, 2), which violates the equivalence principle, too.
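For reference, the micrometer-scale detectability of the Casimir force mentioned above can be quantified with the standard ideal parallel-plate result P = π²ħc/(240 d⁴). This is the textbook formula, not the six-dimensional shielding model discussed here:

```python
from math import pi

HBAR = 1.054571e-34   # J*s, reduced Planck constant
C = 2.99792458e8      # m/s, speed of light

def casimir_pressure(d):
    """Attractive pressure (Pa) between ideal parallel plates separated by d meters."""
    return pi**2 * HBAR * C / (240 * d**4)

# At 1 micrometer the pressure is ~1.3 mPa; at 10 nm it reaches ~130 kPa,
# which is why the force is only measurable at (sub)micrometer separations.
print(casimir_pressure(1e-6))
print(casimir_pressure(1e-8))
```

The steep 1/d⁴ dependence is what confines the effect to the micrometer scale the text refers to.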

From the analogy with underwater wave spreading it follows that GWs are far faster than light waves: they would be able to cross the whole observable Universe in a moment, in the same way as a light wave is able to cross a black hole of average size. So GWs violate the causality of information spreading mediated by light waves (the radiative time arrow) and they're inherently chaotic, so they interact with chaotic matter only, i.e. boson condensates (compare the gravitational wave reflection and shielding during Podkletnov's antigravity experiments with rotating superconductors). Being tachyons, gravitational waves are expected to be primarily responsible for the entanglement and "action at a distance" phenomena of quantum mechanics.
The highly dimensional character of the gravity interaction is the main reason the intensity of gravitational waves decreases much faster with distance, being dispersed by the membranes of quantum foam, because they're spreading across quantum foam bubbles as longitudinal waves. This effectively means gravitational waves are of dispersive character, so they cannot be observed at a distance even in the case of quite energetic events, as in the case of black hole and pulsar mergers. This doesn't mean that dispersion of energy doesn't exist here, so we can still consider the theory of binary pulsar orbital changes relevant to gravitational radiation.

While the AWT explanation of GWs is quite simple and intuitive, the general disbelief in the Aether concept prohibited scientists from thinking in such a straightforward way for many years, until recently some of them changed their opinion in relation to the brane world paradigm and the so-called holographic principle. Accordingly, some superluminal GW models were presented in peer-reviewed journals (1, 2) recently, together with an observation of random changes in the level of gravitational noise, which is attributed to the holographic model of GWs (the so-called holographic noise). Note that a holographic projection at Universe scale requires a superluminal speed of GWs to be able to work at all, thus violating the general relativity theory of GWs from its very beginning. But it's not quite clear to me why just the noise was considered here. If we found a harmonic gravitational wave, it could be interpreted as a part of a giant hologram as well. In this way, the finding of gravitational noise appears rather invariant to the holographic theory to me, and it can still have a more robust and consistent explanation in the context of AWT.

Steven Weinberg: "An expert is a person who avoids the small errors while sweeping on to the grand fallacy."

Sunday, September 20, 2009

AWT and Big Bang theory.

In AWT the relevance of Big Bang theory is closely related to the concept of the "observable universe" and the "edge of observable space-time", which depends (as everything in AWT) on the duality between the intrinsic and extrinsic perspective. According to the theory of cosmic inflation and its founder, Alan Guth, the entire universe could be (at least) 10^23 to 10^26 times as large as the observable universe, which roughly corresponds to the speed of gravitational wave propagation through the observable Universe.

From the intrinsic perspective we can apply the principle "Similia similibus observatur" (only things of similar nature can interact mutually in observation) - so we can consider the Universe to be composed of many different combinations of Aether states, whereas we are composed of a limited number of such states, being only a part of the Universe. If the probability of occurrence of our particular combination of states decreases with distance, then the probability that we could interact with the rest of the Universe decreases with distance as well. Therefore we can observe the "edge of space", but we cannot reach it, because we would evaporate first in the unfriendly ("hot") vacuum around it.

This is basically an anthropocentric "black hole model" of Universe formation, which introduces an evolutionary absolute reference frame: if we evolved in a certain part of the Universe, it's because the conditions were relatively favorable for us here, and if we travelled outside of this pretty place, we could only face problems there. It's evident that every surface of matter, like a hot star or black hole, forms such a natural boundary of the observable Universe for us, and it's even possible that matter in remote galaxies evolved into exotic forms of matter which would annihilate (i.e. explode) more or less completely in direct contact with us.

The extrinsic perspective of the Universe is less real but more optimistic, and it has no apparent boundary: the probability of our particular combination of states decreases, but the number of new combinations increases even faster with distance, so we can always find some friendly combinations of states there. As long as space-time is formed by "friendly combinations" of states, it means we are observing free space, so we can travel through it without apparent limit and danger - but there's no certainty we aren't just moving in circles during this, because the concepts of distance and free space are tautological here. From AWT it follows that the real appearance of the Universe is formed by a nested foam mixture of intrinsic and extrinsic perspectives: there are many places where we could finish our travel prematurely (planets, stars and black holes) - but in general the Universe appears as empty free space due to Olbers' paradox.

Olbers' paradox is a consequence of the cosmic microwave background radiation (CMBR), which behaves like subtle density fluctuations of space (i.e. gravitational waves) violating Lorentz and Poincaré symmetry, which results in a gradual dispersion of light into the hidden dimensions of space-time, in analogy to the spreading of splash ripples at the water surface. This analogy follows directly from the dense Aether concept of AWT, but it's not completely new, as it was proposed independently by James Clifford Cranwell between the years 2001 and 2007. Mr. Cranwell's concept was bright, but it illustrates clearly that without active (self-)promotion no idea has a chance to become famous in the fast-expanding Internet space if it has no immediate usage for a sufficiently large group of people - no matter how brilliant it is. Unfortunately Mr. Cranwell didn't recognize the power of the dense Aether concept as such and he turned his attention to the derived concept of omni-directional expansion of space-time, which is apparently more abstract and can be interpreted in a dual way by AWT.

For an observer of transversal waves at the water surface, the speed of ripples increases fast with distance (they change into gravity waves), which can be interpreted from the perspective of this observer as an omni-directional expansion of space-time with distance. The same situation is relevant for observation of distant parts of our Universe via light waves.
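The growth of ripple speed with wavelength invoked above is the ordinary deep-water dispersion relation, v(λ) = sqrt(gλ/2π + 2πσ/ρλ); a sketch using standard textbook values for gravity, surface tension and density of water:

```python
from math import pi, sqrt

G = 9.81        # m/s^2, gravitational acceleration
SIGMA = 0.0728  # N/m, surface tension of water
RHO = 1000.0    # kg/m^3, density of water

def phase_speed(wavelength):
    """Deep-water phase speed: the gravity term grows with wavelength,
    the capillary term dominates only at millimeter scales."""
    return sqrt(G * wavelength / (2 * pi) + 2 * pi * SIGMA / (RHO * wavelength))

# Longer ripples travel faster, so a spreading wave packet stretches
# with distance, which is the effect the analogy above relies on.
for lam in (0.01, 0.1, 1.0, 10.0):
    print(f"{lam:6.2f} m -> {phase_speed(lam):.2f} m/s")
```

Because phase speed grows with wavelength in the gravity regime, the long components of a splash outrun the short ones, which is the dispersive stretching the water-surface analogy appeals to.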

At sufficient distance from the observer, the wavelength of transversal waves goes to zero, which can be interpreted by this observer as the initial singularity of Big Bang theory, or as the surface (event horizon) of the black hole (cosmic void) in which the observer occurs. From AWT it follows that these models may be of certain relevance from the perspective of numerical models - but they're all invariant to the observer's location. We can compare this situation to the observation of a landscape under haze - the visibility scope will be limited by the dispersion of light, but it will move according to the actual location of the observer.

Because this effect is apparently nonlinear, we can explain the dark energy phenomena in the same way. From outside we can see these ripples get denser gradually, until they disperse completely. The same effect appears at cosmological scale as a slowing of the speed of light in the dense environment surrounding every source of radiation, so it can be used for the explanation of the cold portion of dark matter and its connection to the Hubble constant as well. Whereas from the perspective of an intrinsic observer, the Universe appears as if we were travelling through a density gradient forming the event horizon of a black hole (some string theorists are talking about the "throat of a dark brane" in this connection).

From the more general view of AWT this evolution is just an illusion which follows from the intrinsic observational perspective inside the fractal Aether foam, because we are connecting the observation of the microscopic scale with the future of the Universe's expansion, and the past with the observation of vast cosmic space. The evolution of large objects is the slower, the larger these objects are, and the Universe as a whole doesn't really evolve at the general scale. The only question is whether such a general perspective is achievable for human creatures by experiments, i.e. in a different way than just by pure human imagination.

Nevertheless, from AWT it follows that Big Bang theory will suffer from fundamental problems in the near future, which are of both practical and philosophical (ontological) nature. The observational problem with the Big Bang is that we can observe well developed and separated galaxies in the Hubble ultra-deep field from the time when the Universe was just 2 - 3 billion years old. And the Milky Way galaxy is more than ten billion years old, so these ancient galaxies would not have had enough time to separate and develop. This conclusion was supported by the recent observation of well developed galaxies (with a very low rate of star formation) in the ultra-deep field of the refurbished Hubble telescope, in contradiction to the standard cosmological model based on Big Bang theory, in which star formation couldn't occur during the dark era of universe formation.

The ontological problem of Big Bang theory is that it brings more questions than answers - not to mention the problem of the initial singularity. It can explain the red shift, but it cannot explain dark energy. And it requires inflation, which appears an ad hoc concept from the contemporary perspective, although it can easily be interpreted as a phase transition of Aether and reconciled with ekpyrotic cosmology in such a way. But from the general perspective it seems all these models are just a plural result of the dispersive nature of the Aether environment. The analogy of the Universe's evolution with stellar and galactic evolution is just apparent, because from a very general and remote perspective the Universe behaves like atemporal stuff (the perceived mass/energy density of Aether increases ad infinitum).

Monday, September 14, 2009

Michelson–Morley experiment is best yet

Physicists in Germany have performed the most precise Michelson-Morley experiment to date, confirming that the speed of light is the same in all directions. The experiment, which involves rotating two optical cavities, is about 10 times more precise than previous experiments - and a hundred million times more precise than Michelson and Morley's famous 1887 experiment (MMX).

This null result still cannot be interpreted as an absence of the Aether environment for light spreading (the so-called luminiferous Aether), though. This is because no environment (or its motion/reference frame) can be detected locally by its own waves, and there is nothing strange about it. Can we observe the water surface by the water surface waves themselves? Indeed not - with respect to these waves the water surface is just a void, empty space. If we could observe something, it would be an obstacle for the surface waves, not an environment anymore. This is a trivial geometrical insight, independent of the character of the light wave or even of the validity of Aether theory: no local object can be observed from the intrinsic and extrinsic perspective at the same moment, i.e. locally.

This still doesn't mean such an environment doesn't exist from a nonlocal and/or higher-dimensional perspective, mediated by tachyons like gravitational waves, for example. And because every laser is of finite length, we should observe nonlocal effects, too. The truth is, due to quantum phenomena no object is completely local, and for the microwaves of the cosmic microwave background radiation (CMB) Lorentz symmetry would remain fulfilled even in the case when these objects remained relatively large - in the distance/size range of the CMB wavelength (~2.64 cm at 2.73 K black body temperature), which roughly corresponds to human scale (the size of brain waves). Therefore for the CMB, Lorentz and Poincaré symmetry remains violated only at the cosmological scale (the Doppler anisotropy of CMB).

For photons of observable light we should detect a gradient of CMB photon density (the dark matter effect, dual to the Casimir force) and the Lense-Thirring effect (frame dragging) in the proper orientation of the laser, perpendicular to the gravitational field gradient. The latter effect will lead to antigravity effects at speeds higher than 0.57c, thus switching the sign of the dark matter effect. Such insights render the warped space around the moving Earth in a rather complex way. The nonzero rest mass of photons would complicate the result of MMX even more for light of shorter wavelengths. For example, gamma ray photons are intrinsically slow and could be outdistanced even by the lightweight part of observable matter (aka neutrinos), whereas photons of wavelengths longer than the CMB are effectively tachyons and undergo fast dispersion in the CMB field.

Sunday, September 13, 2009

Multidimensional character of emergent perspective

This post is just a copy of a few silly comments in the ongoing discussion about the concept of minimal length in quantum gravity and Lorentz symmetry violation. AWT enables us to separate the subtleties of particular quantum field theories from the general problem consisting in the subconscious mixing of intrinsic and extrinsic perspectives.

In AWT, Lorentz symmetry (LS) is a direct consequence of observational perspective. When we are observing a low dimensional space (like a 2D+1T water surface) from a strictly 2D+1T perspective, LS is indeed maintained. When we are observing the same situation from a higher dimensional perspective, LS can be violated and there is nothing very special about it. The AWT stance is that every space-time is completely homogeneous from its own perspective by definition, and its LS cannot be violated. At the moment when we are discussing some inhomogeneities in it, we are applying a higher dimensional perspective, which enables LS to become violated. "An outside view" is always of higher dimensionality than the insider's view, so its LS can be violated by definition. If it weren't, we couldn't distinguish it from the inside view, after all.

In AWT the concept of minimal length doesn't exist from the global perspective, because even the tiniest density fluctuations can be formed by some smaller ones without apparent constraints. But there exists a limit to the observability of the smallest density fluctuations from the perspective of larger density fluctuations (like humans) or of the instrumentation used for their detection. An Aether fluctuation at a particular dimensional scale cannot interact directly with fluctuations at all remaining dimensional scales, in accordance with the principle "Similia similibus observatur". If we used a more sensitive/larger apparatus, the limit of fluctuations on both sides of the dimensional scale would increase accordingly and we would observe our Universe larger and quantum fluctuations smaller - but some general limit still persists here.

The philosophical question is whether such a dimensional scale is real for people, because it's always interpreted by an apparatus. Science answered such a question positively already in the time of Galilei and van Leeuwenhoek. The main gnoseologic problem is that the outside perspective remains undetectable for insiders, so we are always talking about a somewhat abstract phenomenon which can be proven only by the higher dimensional emergent approach, i.e. by the coincidence of two or more indirect evidences - but not by direct observation. The whole evidence of the emergent Aether concept is about it, after all.

It should be pointed out that the existence of space-time at sub-Planck scale (i.e. the existence of a "subminimal length") lies outside the scope of the observational perspective of insiders too, so we are relating the existence of one unprovable phenomenon (Lorentz symmetry violation) to the existence of another one (sub-Planck length).

Proclamatively rigorous people, who preferably work with the intrinsic perspective, can easily say both ideas are BS - whereas other people, who know that more is different and how emergent phenomena work, can expect that a combination of two or more undetectable phenomena (assumptions) could still lead to new observable (i.e. testable) predictions, thus fulfilling the utilitarian perspective of further evolution. After all, the renormalization procedure is a quite similar approach based on emergence, because it extrapolates a singular function by a pair of its derivatives from both sides of the divergence. In this way, modern physicists just replaced wide-scale philosophical extrapolations by these less visible, formalized ones.

Saturday, August 29, 2009

Continental Europe bans USA invention

Starting from Tuesday, September 1st, 2009, the European Union is banning the production of incandescent light bulbs above 80 watts in a bid to introduce compact fluorescent models, widely known as energy-saving bulbs. In 2012, only "efficient" light bulbs will be allowed and by 2016 they want to ban even halogen lamps. The EU contends that the average family will save $64 per year on electric bills, and carbon emissions could be cut by 15 million tons. On the flip side, some 3,000 jobs could be lost, since most incandescent bulbs sold in Europe are made in the region, while the fluorescent variety comes from elsewhere.

This can be perceived as a temporary victory of energy over matter, as the compact fluorescent models are five to seven times more energy efficient than incandescent light bulbs. But this balance can be easily reversed in the near future, because fluorescent lamps are more demanding on irrecoverable sources in the form of rare earth elements (REEs), used in luminophore production. 95% of the production output of rare earth elements comes from China, and China is now considering a ban on certain rare earth elements. The solution may be the organized recycling of these luminophores, or the replacement of rare elements by other ones, or the increased usage of LED-based sources for illumination. This example illustrates that the replacement of a power-hungry solution is always followed by an increasing consumption of material sources, thus demonstrating a universal matter-energy duality.
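The claimed $64 yearly saving can be sanity-checked with rough household figures. The bulb count, wattages, daily usage and electricity price below are illustrative assumptions of mine, not numbers from the EU study:

```python
def yearly_saving(n_bulbs=6, w_old=60, w_new=12, hours_per_day=3, price_per_kwh=0.20):
    """Annual cost difference from replacing incandescent bulbs with CFLs.
    Defaults: 6 bulbs, 60 W -> 12 W (a 5x efficiency gain, consistent with
    the 'five to seven times' figure above), 3 h/day, $0.20 per kWh."""
    kwh_saved = n_bulbs * (w_old - w_new) / 1000 * hours_per_day * 365
    return kwh_saved * price_per_kwh

print(f"${yearly_saving():.2f} per year")  # ~$63, in the ballpark of the $64 claim
```

So the headline figure is plausible for a household of about half a dozen frequently used bulbs; the result scales linearly with each assumed parameter.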

Because the younger son of Czech president Vaclav Klaus is a top manager of CEZ, the main energy company of the Czech Republic, his wife, economist Livie Klaus, was a member of the CEZ supervisory board until 2002, and another son got a four million euro donation from CEZ for his private school last year, it's logical that Vaclav Klaus himself is a well known lobbyist of the CEZ company and a promoter of the energy dependence of the Czech Republic on Russian fossil fuel imports. Therefore it's not very surprising that Vaclav Klaus boycotts the environmental politics of the EU and openly promotes the consumption of energy hungry incandescent light bulbs in public.

AWT and cosmological time arrow

Contemporary physics distinguishes many time arrows, which are mutually related, so that no discussion about time can have a deeper meaning without specification of the particular time arrow. Contemporary physics has no generally acceptable model for these time arrows and it doesn't understand the time concept from a general perspective. In particular, the description of relativistic space-time from the macroscopic perspective remains separated from the description of time at the microscale, where the thermodynamical time arrow is applied. Without consideration of the concept of Aether emergence, these time arrows cannot be reconciled at the predicate logic level.

In AWT the most general time arrow is the so-called cosmological time arrow related to the omnidirectional Universe expansion, which is a manifestation of the dispersive character of energy spreading. While the general understanding is that the Universe is facing thermodynamical death, it's not true at all, as such a conclusion is observer-dependent. It's the observer, not the Universe, who suffers from entropic processes, and the entropy of the Universe as a whole remains constant from the extrinsic perspective. The thermodynamical death of the Universe is just a consequence of the Similia similibus observatur principle.

We cannot neglect the fact that one half of the Universe evaporates and separates by antigravity (radiation pressure), while the second one agglomerates by gravity. In AWT the boundary between the intrinsic and extrinsic observational perspective is divided by the observer distance scale, which corresponds to the wavelength of the cosmic microwave background (CMB scale) at 1.73 cm. Above such a scale the thermodynamic time arrow for a material object becomes reversed and driven by gravity. So what we are observing are two thermodynamical processes separated by the CMB/human scale into the extrinsic and intrinsic perspective. Material objects which are larger than 1.73 cm tend to agglomerate in their gravity field into larger ones. This is essentially a negentropic process, related to the inverse time arrow, whereas objects smaller than CMB photons are evaporating into radiation, which is indeed the common entropic process. For particles of energy the whole situation remains reciprocal: large photons are dissolving like tachyons in the CMB, while smaller ones are condensing into solitons, i.e. material particles. No process violating CPT symmetry was observed so far.

If we consider material particles as the only observable part of the Universe, the thermodynamical time arrow becomes dual for 3D space-time, so we can propose a more general, cosmological time arrow, which is independent of the entropy of the Universe (which remains the same in this case), being defined instead by the combination of the above processes. At the moment when we would observe the separation of large objects while the smaller ones would condense, we could say not just the thermodynamical, but the cosmological time arrow got reversed, too. It corresponds to the propagation of observable objects across the space-time brane, composed of mutually interacting gradients of Aether foam density, so that entropic processes are always balanced by the negentropic ones. From the macroscopic (the past) or microscopic (the future of space-time expansion) perspective the Universe behaves like a randomly undulating Aether gas, where the formation of density fluctuations balances their dissolution and the Universe appears atemporal, albeit still full of random motion. Every time arrow observed is therefore a local effect only.

While the thermodynamic time arrow appears broken above the CMB scale for material objects, this is just an effect of the inverse geometry in which dispersion of information occurs; the thermodynamic time arrow still remains valid there due to the dispersive nature of energy spreading. In AWT gravity is just thermalization as observed from the extrinsic perspective of the Le Sage Aether model. All forces are of dispersive nature, behaving like the shielding Fatio-Le Sage force from the dual perspective (the force dual to gravity is radiation pressure, i.e. the only force that can defy gravity). But we observe the Universe from both perspectives, so we shouldn't omit gravity when talking about entropy from the general perspective. Note that near the CMB scale the inverse square law for gravity becomes violated by the cosmic microwave radiation, and gravity effectively becomes a repulsive force below this scale because of the prevailing pressure of CMB radiation. This manifests as a violation of the equivalence principle in 3D and as the weak deceleration assigned to dark matter. Again, it's just a result of perspective inversion: as finite-size fluctuations of Aether, we observe the same Le Sage gravitation "from inside". From the local perspective the time arrow and the sign of gravity are mutually related by the trivial projective geometry of mutual interactions of density fluctuations via transverse waves.

Monday, August 24, 2009

AWT and GRB090510 photon controversy

This story was discussed extensively on Bee's and LuMo's blogs. In brief, a recent observation of a very remote (12.8+ Glyrs) gamma ray burst, GRB090510, observed by the Fermi observatory (formerly the GLAST satellite), was accompanied by a lone gamma ray photon of extraordinarily high energy (31 GeV), detected by the terrestrial observatory MAGIC (Major Atmospheric Gamma Imaging Cherenkov) in the same moment (a six-second window of the whole three-minute burst).

Because string theory (ST) is based on LS, this result was interpreted by Motl as a confirmation of ST, although in fact it confirms the validity of only one of string theory's postulates. Although gamma ray dispersion was considered one of the main tests of quantum gravity theories and the picture below was presented in many places, LQG in its current state of development has very little to say about this result, because it maintains LS in 3D in the same way as string theory, as Lee Smolin explained. This picture is a snapshot of the Scientific American article (p. 59) "Atoms of Space and Time", which is definitely worth reading by itself - but it contains misinformation concerning LQG predictions.

The phenomenological explanation of this controversy is simple in AWT, and it's based on the fact that LS is valid only in strictly 3D space, whereas cosmic space is filled by gravitons expanded into gravitational waves during inflation, i.e. the tiny density fluctuations responsible for the cosmic microwave background (CMB). Therefore cosmic space isn't completely "flat" and it contains "traces of higher dimensions". While LS is indeed valid for all higher hyperspaces, their projection into 3D space isn't invariant with respect to LS anymore. From AWT it follows that only microwaves propagate through the vacuum like a harmonic wave, thus fulfilling LS over long distances, while longer waves propagate like tachyons and shorter waves are always composed of photons, which propagate at subluminal speed. This dispersion can be observed in the GZK limit for gamma ray photons, and it manifests as a delay of gamma ray photons during weak (short distance) gamma ray bursts, like the MKN501 event observed last year. In accordance with this explanation, the dispersion of closer gamma ray flashes is usually much more pronounced than in the case of the remote ones.
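For a rough sense of the scales involved, the standard first-order quantum-gravity dispersion estimate, Δt ≈ (E/E_QG)·(D/c), can be evaluated for the numbers quoted in this post. This is a back-of-the-envelope sketch, not an AWT calculation: setting E_QG to the Planck energy is an assumption, and the distance is simply the 12.8 Glyr figure cited above.

```python
# Order-of-magnitude arrival delay from a linear Lorentz-violating
# dispersion relation: dt ~ (E / E_QG) * (D / c).
# Assumptions: E_QG taken at the Planck energy; D taken as the 12.8 Glyr
# light-travel distance quoted in the post for GRB090510.

E_photon_GeV = 31.0            # energy of the lone high-energy photon
E_planck_GeV = 1.22e19         # Planck energy in GeV
D_lyr = 12.8e9                 # distance in light years
SECONDS_PER_YEAR = 3.156e7

travel_time_s = D_lyr * SECONDS_PER_YEAR          # light-travel time, seconds
delay_s = (E_photon_GeV / E_planck_GeV) * travel_time_s

print(f"first-order dispersion delay: {delay_s:.2f} s")  # roughly 1 s
```

A delay of about one second accumulated over the whole trip is exactly why the six-second coincidence window quoted above makes this burst a sensitive probe of Planck-scale dispersion.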

A pronounced example of light dispersion during spreading through a compactified gradient is the Hawking radiation of black holes, which can be interpreted as light escaping from a glass sphere: long wavelengths and gravitational waves penetrate it freely, whereas shorter wavelengths are reflected back again by the total reflection mechanism.

The reason why the GRB090510 burst (and some others, like GRB 080916C from September 2008) didn't exhibit a pronounced dispersion lies in the fact that such bursts were very remote and as such quite energetic - an energy corresponding to the mass of the Sun is released in the form of gamma photons in a brief moment! In AWT the dynamic mass of photons manifests as real mass with gravitational effects, not just a combination of momentum and kinetic energy, as presented by mainstream propaganda. This is because in AWT a photon has a nonzero rest mass, albeit quite a minute one. Therefore the gamma ray burst propagates through the vacuum like a dense cluster of photons, tied together by their own gravity, or like a soliton, similar to the vortex rings which can propagate through fluids and gases without dispersion.
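The rest-mass claim can also be put into numbers. For a massive photon with mc² ≪ E the velocity deficit is Δv/c ≈ (mc²/E)²/2, giving an arrival delay Δt ≈ (D/c)·(mc²/E)²/2. The mass value below is purely hypothetical, set at the current laboratory upper bound of roughly 10⁻¹⁸ eV/c²; the energy and distance follow the post.

```python
# Arrival delay of a photon with a hypothetical nonzero rest mass m:
# dt ~ (D / c) * (m c^2 / E)^2 / 2, valid for m c^2 << E.
# m is set at the ~1e-18 eV/c^2 experimental upper bound, an assumption
# for illustration only; E and D are the figures quoted in the post.

m_eV = 1e-18                   # hypothetical photon rest mass (eV/c^2)
E_eV = 31e9                    # 31 GeV photon energy in eV
D_lyr = 12.8e9                 # distance in light years
SECONDS_PER_YEAR = 3.156e7

travel_time_s = D_lyr * SECONDS_PER_YEAR
delay_s = 0.5 * travel_time_s * (m_eV / E_eV) ** 2

print(f"rest-mass delay: {delay_s:.1e} s")
```

At that mass bound the delay comes out around 10⁻⁴⁰ s, far below anything observable, so in this picture the gravitational self-binding of the burst would have to come mainly from the photons' dynamic (energy) mass rather than from the rest mass itself.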

The slowing of light in boson condensates is based on the fact that the EM wave spreads in the form of much heavier electrons, which revolve around atoms most of the time. Nevertheless, the appearance of single photons in vacuum should correspond to the appearance of solitons (quantum vortices) in boson condensates.

Because a wave packet cannot contain only a fraction of a wave (it would violate the eigenvalue postulate of quantum mechanics), both the wavelength of light and the size of the wave packet are compressed in the same way, and the shape of vortices in boson condensates is pretty close to the appearance of real photons in vacuum - it's just greatly reduced in scale along the direction of propagation.

The cluster of photons (a "photoball") is analogous to the glueballs known from the strong force scale, and it can serve as a prototype of heavier elementary particles. It's formed by a dense swarm of photons, where the most energetic and heaviest photons propagate at the center, while the lightweight ones revolve around the center of the soliton along a substantially longer path, which corresponds to the segregation of matter by particle density in the case of massive bodies. This could have a testable impact on the distribution of energies along the time axis of a gamma ray flash: heavy photons should appear at the center of the Gaussian curve representing the gamma burst observation.

Because photons influence each other at a distance in this model, it may even be possible that the lone photon observed in GRB090510 was actually trapped into the gamma ray flash during its travel through vast cosmic space, or it could serve as its condensation nucleus, in a similar way like a particle of dust enables molecules of water to condense into a droplet. It would mean that the occurrence of photons of unexpectedly high energy density inside gamma ray flashes isn't accidental at all, and such a model leads to other testable predictions concerning gamma ray photon distributions. The "snowball" mechanism of avalanche-like photon trapping has its analogies in rain or snow condensation, laser pumping, or the rise of Hitler's power before WWII.

Because the photons inside move independently of the motion of the soliton, they effectively propagate in hidden dimensions: we can say that the higher dimensionality of space-time, i.e. the symmetry breaking of mass density (inhomogeneity), converts into a higher dimensionality of particle motion over a sufficiently large space-time interval, i.e. into the symmetry breaking of energy density (dispersion). The same mechanism of composite particle formation can be applied to every other heavier particle, or even to objects in social systems. All elementary particles propagate through space like solitons composed of smaller bosons, which can be illustrated for example by the relation of the spin projection onto the axis of motion to the speed of the particle. The escape of particles through the polar jets of black holes can be considered an exaggerated case of the soliton mechanism.

Concerning LS violation, we aren't disputing lone photons, whose exact paths are unavailable to us - but the cluster of photons as a whole, which is indeed quite a different situation: within such a cluster individual photons may move randomly along different paths, while still keeping the shape of the cluster as a whole. Therefore LS remains maintained at the cluster level with respect to dispersion, thus leaving the postulates of string or LQG theory intact - but the whole cluster still moves at subluminal speed with respect to lone microwave photons, so that even lightweight neutrinos can move faster in certain cases. The atemporal logic of the formal math used in these theories cannot handle such a situation easily, because of the collective motion of many objects at the same moment - although it's still quite trivial to understand. For example, while theories like DSR/DSR2 proposed by Smolin and Magueijo consider the violation of LS more or less successfully in 3D, they still cannot explain the "violation of LS violation" at both large distance and large energy density scales, which is indeed the case of gamma ray propagation across the whole Universe.

String theory could easily model the violation of Lorentz symmetry in inhomogeneous 4D space-time simply by declaring it a higher-dimensional flat space, in the same way as LQG - the only problem is that scientists on both sides of the ST/LQG duality still haven't realized it, while they keep seeking signs of both extra dimensions and Lorentz symmetry violation - although they have them before their eyes all the time. In addition, an interesting deadlock mechanism exists here: string theorists could introduce Lorentz symmetry (LS) violation by considering extra dimensions, but they hesitate to propose it, because LS belongs among the ST postulates in 4D space-time, while LQG proponents could introduce extra dimensions by considering Lorentz symmetry violation, but they grudge against it, because they originally proposed LQG as a purely "4D theory".

At the moment, when each side is earning half of the grant support, no one wants to start the reconciliation of the two theories by considering the ideas of the dual theory. In this way both sides are effectively locked inside the ivory towers of their own prejudices. I presume this example explains a lot about how symmetry breaking occurs at the phenomenological level, and it illustrates clearly why theoretical physicists should be paid for reconciling existing theories by decreasing the number of postulates, instead of for developing new ones by increasing the number of existing postulates - because the divergent character of their formal thinking prevents them from reconciling existing theories.