Thursday, January 29, 2009

AWT and definition of intelligence

By AWT, a correct - i.e. physically relevant - definition of intelligence is rather important, as it can give us a clue about the direction of the psychological time arrow.

From a certain perspective every free particle appears like quite an intelligent "creature", because it can unmistakably find the path of the optimal potential gradient even inside a highly dimensional field where the interactions of many particles overlap mutually. Whereas a single particle is rather "silly" and can follow just a narrow density gradient, complex multidimensional fluctuations of Aether can follow complex gradients and can even avoid a wrong path or obstacles to a certain extent. They're "farseeing" and "intelligent". Note that the traveling of a particle along a density gradient leads to its gradual dissolving and "death": the same forces which keep the particle in motion will lead to its gradual disintegration.
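The gradient-following behaviour described above can be loosely illustrated by a plain numerical gradient-descent sketch. This is only a toy analogy with a made-up potential function, not anything derived from AWT itself:

```python
def potential(x, y):
    # Toy 2-D "potential landscape" with a single minimum at (3, -2);
    # purely illustrative
    return (x - 3.0) ** 2 + (y + 2.0) ** 2

def gradient(f, x, y, h=1e-6):
    # Numerical gradient via central differences
    dfdx = (f(x + h, y) - f(x - h, y)) / (2 * h)
    dfdy = (f(x, y + h) - f(x, y - h)) / (2 * h)
    return dfdx, dfdy

def descend(f, x, y, step=0.1, iters=200):
    # A "particle" that only ever follows the local gradient
    for _ in range(iters):
        gx, gy = gradient(f, x, y)
        x, y = x - step * gx, y - step * gy
    return x, y

x, y = descend(potential, 0.0, 0.0)
print(round(x, 3), round(y, 3))  # → 3.0 -2.0
```

The point of the sketch: a purely local rule ("step downhill") finds the minimum without any global view of the landscape, which is the sense in which a particle can look "intelligent" while remaining blind to everything but the nearest gradient.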

The ability of people to make correct decisions in such a fuzzy environment is usually connected with social intelligence. We can say the motion of a particle is fully driven by its "intuition". Particles can react fast in many time dimensions symmetrically (congruently), whereas their ability to interact with the future (i.e. the ability of prediction) still remains very low, in accordance with the low (but nonzero) memory capacity of a single gradient particle. Nested clusters of many particles are the more clever, the more hidden dimensions they are formed by. Electrochemical waves of neural system activity should form a highly nested system of energy density fluctuations.


Nevertheless, if we consider intelligence as "an ability to obtain new abilities", then the learning ability and memory capacity of single-level density fluctuations still remain very low. Every particle has a surface gradient from the perspective of a single level of particle fluctuations, so it has a memory (compacted space-time dimensions) as well. Therefore, for a single object we can postulate the number of nested dimensions inside the object as a general criterion of intelligence. The highly compactified character of the neural network enables people to handle a deep level of mutual implications, i.e. manifolds of causal space defined by implication tensors of high order. Such a definition remains symmetrical, i.e. invariant both to intuitive behaviour driven by parallel logic and to conscious behaviour driven by sequential logic.

Every highly condensed system becomes chaotic, because the intelligent activities of individual particles are temporal and compensate each other mutually. In this way the behavior of human civilization doesn't differ very much from the behavior of a dense gas, as we can see from the history of wars and economic crises, for instance. The ability of people to drive the evolution of their own society is still quite limited in general. We can consider such an ability a criterion of social self-awareness. The process of phase transition corresponds to the learning phase of a multi-particle system.

An interesting point is that individual members of such systems may not be aware of the incoming phase transition, because their space-time expands (the environment becomes more dense) together with these intelligent artifacts. At a certain moment the environment becomes more conscious (i.e. negentropic) than the particle system formed by it, and a phase transition will occur. The well-known superfluidity and superconductivity phenomena, followed by the formation of a boson condensate, can serve as a physical analogy of sectarian community formation, separated from the needs/feedback of the rest of society. Members of such a community can be characterized internally by their high level of censorship (a total reflection phenomenon with respect to information spreading) and, from the outside perspective, by a superfluous homogeneity of individual stance distribution, followed by rigidity and fragility of their opinions (i.e. by the duality of odd and even derivations in space and time).

AWT explains how even subtle forces of interest between individuals crowded around common targets gradually cumulate into the emergence of irrational behavior. Because such an environment becomes more dense, space-time dilatation occurs here and everything seems OK from the intrinsic perspective. As a result, nobody from the sectarian community will realize he has just lost control over the situation.

For example, the people preparing LHC experiments cannot be accused of evil motives - they just want to do some interesting measurements on the LHC, finish their dissertations, make some money in an attractive job, raise children, learn French, and so on… Just innocent wishes all the time, am I right? But as a whole, their community has omitted serious precautionary principles in the hope that a successful end justifies the means.

For example, nobody in this community has taken care of the difference between charged and neutral black holes in their ability to swallow surrounding matter. As a result, no member of such a community realizes the consequences of his behavior until the very end.

And this is quite silly and unconscious behavior, indeed.

AWT and LHC safety risk

The LHC "black hole" issue disputed (1, 2, 3) and recently reopened (1, 2, 3) is a manifestation of the previously disputed fact that every closed community undeniably becomes sectarian and separated from the needs of the rest of society, like a singularity, by a total reflection mechanism. The ignorance of fundamental ideas (Heim theory) or discoveries (cold fusion, surface superconductivity, "antigravity") on behalf of risky and expensive LHC experiments illustrates the increasing gap between the priorities of the physics community and the interests of the rest of society.

The power of human inquisitiveness is the problem here: as we know from history, scientists as a whole never care about morality, just about technical difficulties. If they can do something, then they will do it - sooner or later, undeniably. No matter whether it's a nuclear weapon, a genetically engineered virus and/or a collider. What makes trouble at the moment is that the results of such experiments can threaten the whole civilization. We should know about this danger of human nature and we should be prepared to suffer the consequences. Max Tegmark's "quantum suicide" experiment doesn't say how large a portion of the original system can survive the experiment.

So, what's the problem with the planned LHC experiments? Up to this day, no relevant analysis evaluating all possible risks and their error bars is publicly available. The existing safety analyses and reports (1, 2) are very rough and superficial, as they don't consider important risk factors and scenarios, like the formation of charged black holes or surface tension phenomena of dense particle clusters. There's an obstinate tendency to start LHC experiments without such an analysis and to demonstrate the first successful results even without a thorough testing phase. Because the load of the accelerator was impatiently increased over 80% of nominal capacity during the first days, a substantial portion of the cooling system crashed due to a massive spill (100 tons) of expensive helium, and the monitoring systems of the whole LHC are undergoing extensive upgrade and replacement to avoid avalanche propagation of the same problem over the whole accelerator tube in the future.

Up to these days, the public has no relevant and transparent data about the probability of supercritical black hole formation during the expected period of the LHC lifetime, or about the main factors which can increase the total risk above an acceptable level, in particular the risks associated with:

  1. The extreme asymmetry of head-to-head collisions, during which zero-momentum/speed black holes can be formed, so they would have a lot of time to interact with Earth in comparison with natural protons from cosmic rays. This collision geometry has no counterpart in nature, as it's a product of long-term human evolution, not natural processes.

  2. The avalanche-like character of multi-particle collisions. When some piece of matter appears in the accelerator line, the whole content of the LHC will feed it with new matter incoming from both directions at nearly luminal speed, i.e. much faster compared to the collisions of natural cosmic rays appearing in the stratosphere.

  3. The proximity of a dense environment. Compared to stratospheric collisions of gamma rays, the metastable products of LHC collisions can be trapped by the gravitational field of Earth and interact with it in a long-term fashion. Some models consider that a black hole could move in the Earth's core for years without notice, thus changing the Earth into a time bomb for future generations.

  4. The formation of charged and magnetic black holes. As we know from theory, real black holes should always exhibit nonzero charge and magnetic field as the result of their fast surface rotation. While the force constant of the electromagnetic force is about 10^39 times stronger than that of the gravitational interaction (and the force constant of the nuclear force is even much higher), the omission of such a possibility from the safety analysis is just an illustration of the deep incompetence of high energy physics, and it looks rather like intention than mere omission. It's not so surprising, as every introduction of such a risk into the safety analysis would increase the LHC risk estimations by many orders of magnitude, making them unfeasible in the eyes of society.

  5. The formation of dense clusters of quite common neutral particles which are stable well outside of the LHC energy range (presumably neutrons). This risk is especially relevant for the ALICE experiment, consisting of head-to-head collisions of heavy atom nuclei, during which a large number of free neutrons can be released in the form of a so-called neutron fluid. The signs of tetraneutron existence apparently support this hypothesis. The neutron fluid would stabilize neutrons against decay through its strong surface tension, in an analogous way to the neutrons inside neutron stars. The risk of neutron fluid formation is connected to a possible tendency to expel protons from atom nuclei in contact with the neutron fluid, thus changing them into droplets of another neutron fluid by the avalanche-like mechanism which was originally proposed for the strangelet risk of the LHC.

  6. Surface tension effects of large dense particle clusters, like the various gluonium and quarkonium states, which CAN stabilize even unstable forms of matter, like neutral mesons and other hadrons, up to the level where they can interact with ordinary matter by the mechanism described above under formation of other dense particle clusters, so-called strangelets (a sort of tiny quark star, originally proposed by Ed Witten). The evidence of these states was recently confirmed for tetra- and pentaquark exotic states. By AWT the surface tension phenomena are related to the dark matter and supersymmetry effects observed unexpectedly at Fermilab (the formation of dimuon states well outside of the collider pipe), as we can explain later. If this connection is confirmed, we aren't expected to worry about strangelet formation anymore - simply because we have observed it already!
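The ~10^39 electromagnetic-to-gravitational ratio quoted in point 4 can be checked from textbook constants. The sketch below assumes an electron-proton pair, which is the pairing that actually yields roughly 10^39 (two protons give closer to 10^36):

```python
# Coulomb-to-gravitational force ratio for an electron-proton pair.
# Since both forces scale as 1/r^2, the ratio is independent of distance:
# F_C / F_G = k * e^2 / (G * m_e * m_p)
k   = 8.9875e9      # Coulomb constant, N m^2 / C^2
G   = 6.674e-11     # gravitational constant, N m^2 / kg^2
e   = 1.602e-19     # elementary charge, C
m_e = 9.109e-31     # electron mass, kg
m_p = 1.673e-27     # proton mass, kg

ratio = (k * e**2) / (G * m_e * m_p)
print(f"{ratio:.2e}")  # roughly 2.3e39
```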

Compared to black hole formation, the risks of strangelets and neutron fluid aren't connected to the collapse of Earth into a gravitational singularity, but to the release of a vast amount of energy (comparable to that of thermonuclear fusion), during which most of the matter would be vaporized and expelled into cosmic space by the pressure of a giant flash of accretion radiation.

As I explained already, cosmic ray arguments aren't very relevant to the highly asymmetric LHC collision geometry, so it has no meaning to repeat them again and again. This geometry - not the energy scale - is what makes the LHC collisions so unique and orthogonal to extrapolations based on highly symmetrical thermodynamics. It's a product of very rare human evolution. The whole of AWT is just about the probability of various symmetries.

So we are required to reconsider the LHC experiments in a much deeper, publicly available and peer-reviewed safety analysis. We should simply apply the scientific method even to the safety analysis of scientific experiments - no less, no more. In my opinion, these objections are trivial and mostly evident - but no safety analysis has considered them so far, for an apparent reason: not to threaten the launch of the LHC. So now we can just ask: who is responsible for this situation, and for the lack of persons responsible for a relevant safety analysis of the LHC project, with its 7 billion € total cost?

Safety is the main concern of the LHC experiments. You can be perfectly sure the LHC experiments are safe, because of many theories. After all, the main purpose of these experiments is to verify these theories.

Isn't the only purpose of the LHC to verify its own safety at the very end? Is that really enough for everybody?

Tuesday, January 27, 2009

AWT and Bohmian mechanics

This post is a reaction to recent comments by L. Motl (1, 2, reactions) concerning the Bohm interpretation of quantum mechanics (QM), the concept of Louis de Broglie's pilot wave in particular (implicate/explicate order is disputed here). Bohm's holistic approach (he was a proponent of Marxist ideas) enabled him to see the general consequences of this concept a way deeper than de Broglie, with his aristocratic origin. It's not surprising that Bohm's interpretation has a firm place in AWT interpretations of various concepts, the causal topology of implications and the famous double slit experiment in particular. After all, we have a mechanical analogy of the double slit experiment (DSE) presented already (videos), therefore it's evident QM can be interpreted by classical wave mechanics without problem.

Single-particle interference observed for macroscopic objects

AWT considers the pilot wave an analogy of the Kelvin waves formed during object motion through a particle environment. The original AWT explanation of the double slit experiment is that every fast-moving particle creates undulations of the vacuum foam around it, in the same way as a fish swimming beneath the water surface, in analogy to the de Broglie wave.


These undulations are oriented perpendicular to the particle's motion direction and they can interfere with both slits whenever the particle passes through one of them. The Aether foam temporarily gets more dense under shaking, thus mimicking the mass/energy equivalence of relativity and the probability density function of quantum mechanics at the same moment. The constructive interference makes flabelliform paths of more dense vacuum foam, which the particle wave preferably follows, being focused by the denser environment, thus creating interference patterns at the target.
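The interference pattern itself follows from ordinary classical wave mechanics. A minimal sketch of the standard two-slit intensity formula, with illustrative values for the wavelength and slit separation (the single-slit diffraction envelope is ignored):

```python
import math

# Classical two-slit intensity (Fraunhofer approximation):
# I(theta) ∝ cos^2(pi * d * sin(theta) / wavelength)
wavelength = 500e-9   # m, illustrative (green light)
d          = 2e-6     # slit separation, m, illustrative

def intensity(theta):
    phase = math.pi * d * math.sin(theta) / wavelength
    return math.cos(phase) ** 2

# Central maximum, then the first minimum where the
# path difference between the slits equals wavelength / 2
print(intensity(0.0))                        # → 1.0
first_min = math.asin(wavelength / (2 * d))
print(round(intensity(first_min), 6))        # → 0.0
```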

By AWT the de Broglie wave, or even the quantum wave itself, is a real physical artifact. The fact that they cannot be observed directly by using a light wave follows from Bose statistics: the surface waves penetrate each other mutually, so they cannot be observed mutually. But by Hardy's theorem a weak (gravitational or photon coupling) measurement of object location without violating the uncertainty principle is possible. What we can observe is just a gravitational lensing effect of the density gradients (as described by the probability function) induced by these waves in the vacuum foam through the thickening effect during shaking.

Another question is whether the pilot wave concept supplies a deeper insight, or even other testable predictions, than for example the time-dependent Schrödinger equation does. In my opinion it doesn't, or it's even a subset of the information contained in the classical QM formalism. This doesn't mean that in certain situations the pilot wave formalism cannot supply a useful shortcut for a formal solution (in the same way as, for example, Bohr's atom model) - whereas in other cases it can become more difficult to apply than other interpretations.

Monday, January 26, 2009

AWT and definition of observable reality

When comparing contemporary physical theories, a natural question can emerge immediately: if AWT is proclaimed more general than, for example, the various quantum field or quantum gravity theories, shouldn't it lead to even more solutions than these theories can supply? And if vagueness is the main objection against these theories, why should we care about AWT, then?

The truth is, AWT can lead to a virtually infinite number of solutions, because even in a quite limited particle system the number of possible states increases extremely fast. But AWT introduces a gradient-driven reality concept, which is probability driven. Many results of particle-particle collisions simply aren't probable, because they're too rare. Therefore we can see only density gradients inside a dense particle system, not the particles or intermediate states as such. The concept of gradient-driven reality is apparently anthropocentric, but it can be derived from the AWT concept independently, because only artifacts which were created by the long-term evolution of a high number of mutations, i.e. by causal time events, can interact with reality in a gradient-driven way.

The probability-based approach based on particle statistics brings a rather strict restriction on the number of possible solutions of every fuzzy theory. String theorists are aware of this opportunity, so they're trying to apply a statistical approach to the landscape of string theory predictions as well. But because the number of predictions of string theory (~10^500) roughly corresponds to the number of particle states inside the observable portion of the Universe, such an approach is phenomenologically identical to AWT if we simply omit the whole intermediate step related to the tedious string theory formalism (which serves like a random number generator only) - and if we apply Boltzmann statistics to these states directly.
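How quickly particle-state counts reach landscape-like magnitudes can be sketched with elementary combinatorics (the numbers below are illustrative only, not a physical estimate):

```python
import math

# N distinguishable elements, each with s accessible states,
# give s**N configurations. Work in log10 to avoid overflow.
def log10_states(particles, states_per_particle):
    return particles * math.log10(states_per_particle)

# Even tiny systems reach the ~10^500 magnitude quoted
# for the string theory landscape:
print(log10_states(500, 10))   # → 500.0, i.e. 10^500 configurations
print(log10_states(1662, 2))   # just over 500 even with binary states
```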

In this way AWT wins over formal theories in simplicity (i.e. by the Occam's razor criterion), just because it introduces a gradient-driven definition of observable reality into physics, thus reducing the number of possible observable states in it: every object can be observed if and only if it contains some space-time gradient from a sufficiently general perspective. For example, the (movement of) density gradients inside condensing supercritical vapor can be observed, while the molecules (motion) itself cannot. The single Aether concept, i.e. the material conditional (antecedent), is sufficient for such a decision if we apply an observability criterion (consequent), thus introducing the basic implication vector which AWT is based on: if the Universe is formed by a chaotic/particle environment, then every fluctuation evolved/emerged in it via a (number of) causal events would see only the (same number of) causal gradients of it (... and we can predict the appearance of this observable reality in a unique way). In this way, we can always see exactly the part of the Universe which has served for our evolution (space-time emergence), and the observable scope of reality expands gradually. This is how Bohm's implicate/explicate order may be understood in the context of AWT, because the implication vector defines a time arrow of causal space-time curvature and its subsequent compactification.

The testability of the AWT intrinsic perspective is provided by a nonscalar implication vector, which is based on a nonsingular (zero or infinite) order of axiomatic tensor. Outside of this perspective AWT remains inherently a tautology, which is given by the fact that no assumption can consider itself, or less generally, that no object of observation can serve both as the means and as the subject of the same observation at the same time and space point. The Aether concept itself remains a tautology, as it cannot be proven by observation and causal logic without violating this logic in a less or more distant perspective, in the same way as the God concept.

It can be demonstrated easily that many conceptual problems of contemporary science simply follow from the fact that scientists have no clue what is observable and what is not, because of the lack of a relevant definition of observable reality. In this way, many possible combinations would simply disappear from testable predictions if we applied the gradient-driven statistics, or the Lagrange/Hamilton mechanics which is based on it. In particular, the misinterpretation of the results of the M-M experiment just follows from the fact that scientists didn't realize the motion of an environment isn't observable by the waves of this environment. The refusal of de Broglie/Bohmian mechanics is a misunderstanding of the same category: scientists didn't realize the de Broglie wave cannot be observed by a light wave (so easily), being a wave of the same environment, so the lack of experimental evidence of the de Broglie wave cannot serve as evidence against Bohmian mechanics.

AWT, emergence and Hardy's paradox

Recently, fundamental experimental evidence of Hardy's paradox was given, which basically means quantum mechanics isn't a purely statistics-based theory following Bell inequalities anymore. The non-formal understanding of this paradox is easy: if every combination of mutually non-commuting quantities cannot be measured with certainty, how can we be sure about that? What if some combination exists which violates such uncertainty? In this way, the uncertainty principle of quantum mechanics violates itself in the background, thus enabling so-called "weak" measurements.

This was demonstrated recently for the case of entangled photon pairs - it can serve as evidence that even photons have a distinct "shape", which is a manifestation of the rest mass of the photon. This is because the explicit formulation of quantum mechanics neglects gravity phenomena and the rest mass concept in the background: by the Schrödinger equation every particle should dissolve into the whole Universe gradually - which violates everyday observations, indeed. Such behavior is effectively prohibited by the acceleration following from omni-directional Universe expansion, i.e. the gravity potential, so that every locatable particle has a nonzero surface curvature and is conditionally stable at the human scale. From the nested character of Aether fluctuations it follows that not only a single level of "weak" measurement should be achievable here. After all, the fact that we can interact with other people and objects without complete entanglement can serve as evidence that "weak" observation is very common at the human scale.
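The gradual "dissolving" predicted by the Schrödinger equation is the textbook spreading of a free Gaussian wave packet. A small sketch with illustrative masses and initial widths shows why this matters for an electron but is utterly negligible for a macroscopic grain:

```python
import math

hbar = 1.0546e-34  # reduced Planck constant, J s

def packet_width(sigma0, mass, t):
    # Standard QM result for a free Gaussian wave packet:
    # sigma(t) = sigma0 * sqrt(1 + (hbar * t / (2 * m * sigma0^2))^2)
    return sigma0 * math.sqrt(1 + (hbar * t / (2 * mass * sigma0**2)) ** 2)

m_e = 9.109e-31   # electron mass, kg
# An electron localized to ~atomic size spreads to km scale within a second:
print(f"{packet_width(1e-10, m_e, 1.0):.2e} m")

m_dust = 1e-12    # illustrative ~picogram dust grain, kg
# The same formula leaves a micron-wide dust grain essentially unchanged:
print(f"{packet_width(1e-6, m_dust, 1.0):.6e} m")  # → 1.000000e-06 m
```

The contrast comes entirely from the mass in the denominator, which is the conventional reason macroscopic objects don't visibly delocalize.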

By AWT every strictly causal theory violates itself in a less or more distant perspective, due to emergence phenomena. While the classical formulation of general relativity remains seemingly self-consistent (being strictly based on a single causality arrow), a deeper analysis reveals that the derivation of the Einstein field equations neglects a stress-energy tensor contribution (Yilmaz, Heim, Bekenstein and others), which is the result of mass-energy equivalence. This approach makes relativity an implicit and infinitely fractal theory, in the same way as quantum mechanics (which is its AdS/CFT dual theory). For example, gravitational lensing, the multiple event horizons of charged black holes and/or dark matter phenomena can serve as evidence of spontaneous symmetry breaking of time arrows and a manifestation of quantum uncertainty and supersymmetry in relativity. This uncertainty leads to a landscape of many solutions for every quantum field or quantum gravity theory based on a combination of mutually inconsistent (i.e. different) postulates.

Such behavior follows Gödel's incompleteness theorems, by which a formal proof of rules valid for sufficiently large natural number sets becomes more difficult than the rules themselves - thus remaining unresolvable by their very nature. This is a consequence of emergence, which introduces a principal dispersion into the observation of large causal objects and/or phenomena, which cannot be avoided, or such artifacts wouldn't be observable anymore. In this way, every strictly formal (i.e. sequential logic based) proof of a natural law becomes violated in a less or more distant perspective, and it follows the "More is Different" theorem. AWT demonstrates that this emergence is followed by causal (i.e. transversal wave based) energy spreading through a large system of scale-invariant symmetry fluctuations (unparticles), which behave like soap foam with respect to light spreading, and they enable us to observe the universe (and all objects inside it) both from the extrinsic and from the intrinsic perspective simultaneously. The mutual interference of these two perspectives leads to the quantization of observable reality, which is intrinsically chaotic and extrinsically causal by its very nature.

In this connection it's useful (...and sometimes entertaining) to follow the deductions of formally thinking theorists, like Lubos Motl, whose strictly formal thinking undeniably leads him into deep contradiction/confrontation with common sense and occasionally with the whole rest of the world. It may appear somewhat paradoxical that just a fanatic proponent of string theory - which has introduced the duality concept into physics - has such a deep problem with dual/plural thinking. This paradox is still logical, though, if we realize how complex string theory is and how strictly formal the thinking it requires for its comprehension.

In this way, the "emergence group" of dense Aether theory makes understanding observable reality quite a transparent and easy task at a sufficiently general level. It still doesn't mean there isn't a lot of things to understand at the deeper levels, dedicated to the individual formal theories.