
textwallplz

Textwall PLZ
15 Watchers · 1 Deviation · 13.7K Pageviews

The words plz by textwallplz, literature

ask-jeff-teh-killer
creavey
Redfoxbennington
fuwa-moi
Spirit-Whispers
NekoMewMix
n0irex
digimagicnb
RaincloudProductions
fraggedICE
luigirules64
thredith
desudesuplz
Savay

Collection

Favourites
  • United States
  • Deviant for 17 years
Badges
Llama: Llamas are awesome! (5)

hi

0 min read
It's textwallplz (https://www.deviantart.com/textwallplz). Without Savay (https://www.deviantart.com/savay) there would be no textwallplz. D: Since many a foo wants to know what's on this emote, it's in the gallery.

Profile Comments 90

A meme (/ˈmiːm/; MEEM)[1] is "an idea, behavior or style that spreads from person to person within a culture." A meme acts as a unit for carrying cultural ideas, symbols or practices, which can be transmitted from one mind to another through writing, speech, gestures, rituals or other imitable phenomena. Supporters of the concept regard memes as cultural analogues to genes in that they self-replicate, mutate and respond to selective pressures. The word meme is a shortening (modeled on gene) of mimeme (from Ancient Greek μίμημα, pronounced [míːmɛːma], mīmēma, "something imitated", from μιμεῖσθαι mimeisthai, "to imitate", from μῖμος mimos, "mime"), and it was coined by the British evolutionary biologist Richard Dawkins in The Selfish Gene (1976) as a concept for discussing evolutionary principles in explaining the spread of ideas and cultural phenomena. Examples of memes given in the book included melodies, catch-phrases, fashion and the technology of building arches. Proponents theorize that memes may evolve by natural selection in a manner analogous to that of biological evolution. Memes do this through the processes of variation, mutation, competition and inheritance, each of which influences a meme's reproductive success. Memes spread through the behaviors that they generate in their hosts. Memes that propagate less prolifically may become extinct, while others may survive, spread and (for better or for worse) mutate. Memes that replicate most effectively enjoy more success, and some may replicate effectively even when they prove detrimental to the welfare of their hosts. A field of study called memetics arose in the 1990s to explore the concepts and transmission of memes in terms of an evolutionary model. Criticism from a variety of fronts has challenged the notion that academic study can examine memes empirically. However, developments in neuroimaging may make empirical study possible.
Some commentators question the idea that one can meaningfully categorize culture in terms of discrete units. Others, including Dawkins himself, have argued that this usage of the term is the result of a misunderstanding of the original proposal. The word meme originated with Dawkins' 1976 book The Selfish Gene. To emphasize commonality with genes, Dawkins coined the term "meme" by shortening "mimeme", which derives from the Greek word mimema ("something imitated"). He said that he wanted "a monosyllable that sounds a bit like 'gene'". Dawkins wrote that evolution depended not on the particular chemical basis of genetics, but only on the existence of a self-replicating unit of transmission—in the case of biological evolution, the gene. For Dawkins, the meme exemplified another self-replicating unit with potential significance in explaining human behavior and cultural evolution. Dawkins used the term to refer to any cultural entity that an observer might consider a replicator. He hypothesised that one could view many cultural entities as replicators, and pointed to melodies, fashions and learned skills as examples. Memes generally replicate through exposure to humans, who have evolved as efficient copiers of information and behaviour. Because humans do not always copy memes perfectly, and because they may refine, combine or otherwise modify them with other memes to create new memes, they can change over time. Dawkins likened the process by which memes survive and change through the evolution of culture to the natural selection of genes in biological evolution. Dawkins defined the meme as a unit of cultural transmission, or a unit of imitation and replication, but later definitions would vary. Memes, analogously to genes, vary in their aptitude to replicate; memes which are good at getting themselves copied tend to spread and remain, whereas the less good ones have a higher probability of being ignored and forgotten. Thus "better" memes are selected. 
The lack of a consistent, rigorous, and precise understanding of what typically makes up one unit of cultural transmission remains a problem in debates about memetics. In contrast, the concept of genetics gained concrete evidence with the discovery of the biological functions of DNA. Meme transmission does not necessarily require a physical medium, unlike genetic transmission. Life-forms can transmit information both vertically (from parent to child, via replication of genes) and horizontally (through viruses and other means). Malcolm Gladwell wrote, "A meme is an idea that behaves like a virus--that moves through a population, taking hold in each person it infects." Memes can replicate vertically or horizontally within a single biological generation. They may also lie dormant for long periods of time. Memes spread by the behaviors that they generate in their hosts. Imitation counts as an important characteristic in the propagation of memes. Imitation often involves the copying of an observed behaviour of another individual, but memes may transmit from one individual to another through a copy recorded in an inanimate source, such as a book or a musical score. McNamara has suggested that memes can thereby be classified as either internal or external memes (i-memes or e-memes). Researchers have observed memetic copying in just a few species on Earth, including hominids, dolphins and birds (which learn how to sing by imitating their parents or neighbors). Some commentators have likened the transmission of memes to the spread of contagions. Social contagions such as fads, hysteria, copycat crime, and copycat suicide exemplify memes seen as the contagious imitation of ideas. Observers distinguish the contagious imitation of memes from instinctively contagious phenomena such as yawning and laughing, which they consider innate (rather than socially learned) behaviors. Aaron Lynch described seven general patterns of meme transmission, or "thought contagion":
1. Quantity of parenthood: an idea that influences the number of children one has. Children respond particularly receptively to the ideas of their parents, and thus ideas that directly or indirectly encourage a higher birthrate will replicate themselves at a higher rate than those that discourage higher birthrates.
2. Efficiency of parenthood: an idea that increases the proportion of children who will adopt ideas of their parents. Cultural separatism exemplifies one practice in which one can expect a higher rate of meme-replication—because the meme for separation creates a barrier from exposure to competing ideas.
3. Proselytic: ideas generally passed to others beyond one's own children. Ideas that encourage the proselytism of a meme, as seen in many religious or political movements, can replicate memes horizontally through a given generation, spreading more rapidly than parent-to-child meme-transmissions do.
4. Preservational: ideas that influence those that hold them to continue to hold them for a long time. Ideas that encourage longevity in their hosts, or leave their hosts particularly resistant to abandoning or replacing these ideas, enhance the preservability of memes and afford protection from the competition or proselytism of other memes.
5. Adversative: ideas that influence those that hold them to attack or sabotage competing ideas and/or those that hold them. Adversative replication can give an advantage in meme transmission when the meme itself encourages aggression against other memes.
6. Cognitive: ideas perceived as cogent by most in the population who encounter them. Cognitively transmitted memes depend heavily on a cluster of other ideas and cognitive traits already widely held in the population, and thus usually spread more passively than other forms of meme transmission. Memes spread in cognitive transmission do not count as self-replicating.
7. Motivational: ideas that people adopt because they perceive some self-interest in adopting them. Strictly speaking, motivationally transmitted memes do not self-propagate, but this mode of transmission often occurs in association with memes self-replicated in the efficiency parental, proselytic and preservational modes.
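The virus analogy running through the passage above can be made concrete with a toy simulation. The following is a minimal sketch, assuming a simple SIR-style compartment model in which "susceptible" hosts adopt a meme on contact with active spreaders and some spreaders lose interest each step; the function name and the rates are illustrative choices, not drawn from any of the authors cited.

```python
def simulate_meme_spread(population=1000, initial_adopters=5,
                         contact_rate=0.3, forgetting_rate=0.05,
                         steps=100):
    """Toy SIR-style model of a meme spreading through a population.

    s: never encountered the meme ("susceptible")
    i: actively spreading it ("infected")
    r: lost interest and no longer spreads it ("recovered")
    """
    s = population - initial_adopters
    i = initial_adopters
    r = 0
    history = []
    for _ in range(steps):
        # Each spreader exposes susceptibles in proportion to how many remain.
        new_adopters = min(s, int(contact_rate * i * s / population))
        # A fixed fraction of spreaders drops out each step.
        new_dropouts = int(forgetting_rate * i)
        s -= new_adopters
        i += new_adopters - new_dropouts
        r += new_dropouts
        history.append((s, i, r))
    return history

history = simulate_meme_spread()
s, i, r = history[-1]
print(f"final: susceptible={s}, spreading={i}, dropped out={r}")
```

With these illustrative rates the meme first spreads roughly exponentially, then saturates as susceptibles run out, mirroring the "thought contagion" patterns described above.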
Richard Dawkins initially defined meme as a noun that "conveys the idea of a unit of cultural transmission, or a unit of imitation". John S. Wilkins retained the notion of meme as a kernel of cultural imitation while emphasizing the meme's evolutionary aspect, defining the meme as "the least unit of sociocultural information relative to a selection process that has favourable or unfavourable selection bias that exceeds its endogenous tendency to change." The meme as a unit provides a convenient means of discussing "a piece of thought copied from person to person", regardless of whether that thought contains others inside it, or forms part of a larger meme. A meme could consist of a single word, or a meme could consist of the entire speech in which that word first occurred. This forms an analogy to the idea of a gene as a single unit of self-replicating information found on the self-replicating chromosome. While the identification of memes as "units" conveys their nature to replicate as discrete, indivisible entities, it does not imply that thoughts somehow become quantized or that "atomic" ideas exist that cannot be dissected into smaller pieces. A meme has no given size. Susan Blackmore writes that melodies from Beethoven's symphonies are commonly used to illustrate the difficulty involved in delimiting memes as discrete units. She notes that while the first four notes of Beethoven's Fifth Symphony form a meme widely replicated as an independent unit, one can regard the entire symphony as a single meme as well. The inability to pin an idea or cultural feature to quantifiable key units is widely acknowledged as a problem for memetics. It has been argued, however, that the traces of memetic processing can be quantified using neuroimaging techniques which measure changes in the connectivity profiles between brain regions.
Blackmore meets such criticism by stating that memes compare with genes in this respect: that while a gene has no particular size, nor can we ascribe every phenotypic feature directly to a particular gene, it has value because it encapsulates that key unit of inherited expression subject to evolutionary pressures. To illustrate, she notes evolution selects for the gene for features such as eye color; it does not select for the individual nucleotide in a strand of DNA. Memes play a comparable role in understanding the evolution of imitated behaviors. The 1981 book Genes, Mind, and Culture: The Coevolutionary Process by Charles J. Lumsden and E. O. Wilson proposed the theory that genes and culture co-evolve, and that the fundamental biological units of culture must correspond to neuronal networks that function as nodes of semantic memory. They coined their own term, "culturgen", which did not catch on. Coauthor Wilson later acknowledged the term meme as the best label for the fundamental unit of cultural inheritance in his 1998 book Consilience: The Unity of Knowledge, which elaborates upon the fundamental role of memes in unifying the natural and social sciences. Richard Dawkins noted the three conditions that must exist for evolution to occur:
1. variation, or the introduction of new change to existing elements;
2. heredity or replication, or the capacity to create copies of elements;
3. differential "fitness", or the opportunity for one element to be more or less suited to the environment than another.
Dawkins emphasizes that the process of evolution naturally occurs whenever these conditions co-exist, and that evolution does not apply only to organic elements such as genes. He regards memes as also having the properties necessary for evolution, and thus sees meme evolution as not simply analogous to genetic evolution, but as a real phenomenon subject to the laws of natural selection. Dawkins noted that as various ideas pass from one generation to the next, they may either enhance or detract from the survival of the people who obtain those ideas, or influence the survival of the ideas themselves. For example, a certain culture may develop unique designs and methods of tool-making that give it a competitive advantage over another culture. Each tool-design thus acts somewhat similarly to a biological gene in that some populations have it and others do not, and the meme's function directly affects the presence of the design in future generations. In keeping with the thesis that in evolution one can regard organisms simply as suitable "hosts" for reproducing genes, Dawkins argues that one can view people as "hosts" for replicating memes. Consequently, a successful meme may or may not need to provide any benefit to its host. Unlike genetic evolution, memetic evolution can show both Darwinian and Lamarckian traits. Cultural memes will have the characteristic of Lamarckian inheritance when a host aspires to replicate the given meme through inference rather than by exactly copying it. Take for example the case of the transmission of a simple skill such as hammering a nail, a skill that a learner imitates from watching a demonstration without necessarily imitating every discrete movement modeled by the teacher in the demonstration, stroke for stroke. Susan Blackmore distinguishes the difference between the two modes of inheritance in the evolution of memes, characterizing the Darwinian mode as "copying the instructions" and the Lamarckian as "copying the product." 
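Dawkins' three conditions — variation, heredity, and differential fitness — are exactly the ingredients of a generic evolutionary algorithm, so they can be illustrated with a few lines of code. This is a hypothetical sketch, not anything from Dawkins: "memes" are four-letter strings, fitness is an arbitrary similarity score to the string "meme", and all parameter values are invented for the example.

```python
import random

def evolve_memes(generations=50, pop_size=100, mutation_rate=0.1, seed=0):
    """Toy replicator model showing Dawkins' three conditions:
    heredity (copying), variation (mutation), differential fitness (selection)."""
    rng = random.Random(seed)
    target = "meme"  # arbitrary fitness criterion for the illustration
    alphabet = "abcdefghijklmnopqrstuvwxyz"

    def fitness(s):
        # Number of positions matching the target string.
        return sum(a == b for a, b in zip(s, target))

    def mutate(s):
        # Variation: each character may flip to a random letter.
        return "".join(rng.choice(alphabet) if rng.random() < mutation_rate else c
                       for c in s)

    # Start from random strings (no initial resemblance to the target).
    population = ["".join(rng.choice(alphabet) for _ in range(len(target)))
                  for _ in range(pop_size)]
    for _ in range(generations):
        # Differential fitness: fitter memes are copied more often.
        weights = [fitness(m) + 1 for m in population]
        parents = rng.choices(population, weights=weights, k=pop_size)
        # Heredity with variation: each copy may carry mutations.
        population = [mutate(p) for p in parents]
    return max(population, key=fitness)

best = evolve_memes()
print(best, "fitness:", sum(a == b for a, b in zip(best, "meme")))
```

Because all three conditions co-exist, the population drifts toward high-fitness strings without any explicit design step, which is the point Dawkins makes about evolution applying beyond organic genes.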
Clusters of memes, or memeplexes (also known as meme complexes or as memecomplexes), such as cultural or political doctrines and systems, may also play a part in the acceptance of new memes. Memeplexes comprise groups of memes that replicate together and coadapt. Memes that fit within a successful memeplex may gain acceptance by "piggybacking" on the success of the memeplex. As an example, John D. Gottsch discusses the transmission, mutation and selection of religious memeplexes and the theistic memes contained. Theistic memes discussed include the "prohibition of aberrant sexual practices such as incest, adultery, homosexuality, bestiality, castration, and religious prostitution", which may have increased vertical transmission of the parent religious memeplex. Similar memes are thereby included in the majority of religious memeplexes, and harden over time; they become an "inviolable canon" or set of dogmas, eventually finding their way into secular law. This could also be referred to as the propagation of a taboo. The discipline of memetics, which dates from the mid 1980s, provides an approach to evolutionary models of cultural information transfer based on the concept of the meme. Memeticists have proposed that just as memes function analogously to genes, memetics functions analogously to genetics. Memetics attempts to apply conventional scientific methods (such as those used in population genetics and epidemiology) to explain existing patterns and transmission of cultural ideas. Principal criticisms of memetics include the claim that memetics ignores established advances in other fields of cultural study, such as sociology, cultural anthropology, cognitive psychology, and social psychology. Questions remain whether or not the meme concept counts as a validly disprovable scientific theory. This view regards memetics as a theory in its infancy: a protoscience to proponents, or a pseudoscience to some detractors. 
An objection to the study of the evolution of memes in genetic terms (although not to the existence of memes) involves a perceived gap in the gene/meme analogy: the cumulative evolution of genes depends on biological selection pressures neither too great nor too small in relation to mutation rates. There seems no reason to think that the same balance will exist in the selection pressures on memes. Luis Benitez-Bribiesca M.D., a critic of memetics, calls the theory a "pseudoscientific dogma" and "a dangerous idea that poses a threat to the serious study of consciousness and cultural evolution". As a factual criticism, Benitez-Bribiesca points to the lack of a "code script" for memes (analogous to the DNA of genes), and to the excessive instability of the meme mutation mechanism (that of an idea going from one brain to another), which would lead to a low replication accuracy and a high mutation rate, rendering the evolutionary process chaotic. British political philosopher John Gray has characterized Dawkins' memetic theory of religion as "nonsense" and "not even a theory... the latest in a succession of ill-judged Darwinian metaphors", comparable to Intelligent Design in its value as a science. Another critique comes from semiotic theorists such as Deacon and Kull. This view regards the concept of "meme" as a primitivized concept of "sign". The meme is thus described in memetics as a sign lacking a triadic nature. Semioticians can regard a meme as a "degenerate" sign, which includes only its ability of being copied. Accordingly, in the broadest sense, the objects of copying are memes, whereas the objects of translation and interpretation are signs. Fracchia and Lewontin regard memetics as reductionist and inadequate. Burman, by contrast, has shown that the misunderstanding that memes are "real" is a result of a popularization based on a confused interpretation of Dawkins' The Selfish Gene.
Instead, for him, the idea of an "infectious idea" can be a useful conceit if used under certain conditions. He explained this in a subsequent discussion regarding his article:
...you can't take the meme seriously as "a thing that jumps." You can only ask what insights are derived if we adopt a stance in which we accept jumping as a shortcut to get to the more interesting problem. Memes, in this sense, are a philosophical method; they aren't a scientific object.
In his chapter titled "Truth" published in the Encyclopedia of Phenomenology, Dieter Lohmar questions the memeticists' reduction of the highly complex body of ideas (such as religion, politics, war, justice, and science itself) to a putatively one-dimensional series of memes. He sees memes as an abstraction and such a reduction as failing to produce greater understanding of those ideas. The highly interconnected, multi-layering of ideas resists memetic simplification to an atomic or molecular form; as does the fact that each of our lives remains fully enmeshed and involved in such "memes". Lohmar argues that one cannot view memes through a microscope in the way one can detect genes. The leveling-off of all such interesting "memes" down to some neutralized molecular "substance" such as "meme-substance" introduces a bias toward "scientism" and abandons the very essence of what makes ideas interesting, richly available, and worth studying. Opinions differ as to how best to apply the concept of memes within a "proper" disciplinary framework. One view sees memes as providing a useful philosophical perspective with which to examine cultural evolution. Proponents of this view (such as Susan Blackmore and Daniel Dennett) argue that considering cultural developments from a meme's-eye view—as if memes themselves respond to pressure to maximise their own replication and survival—can lead to useful insights and yield valuable predictions into how culture develops over time. Others such as Bruce Edmonds and Robert Aunger have focused on the need to provide an empirical grounding for memetics to become a useful and respected scientific discipline. A third approach, described as "radical memetics", seeks to place memes at the centre of a materialistic theory of mind and of personal identity. 
Prominent researchers in evolutionary psychology and anthropology, including Scott Atran, Dan Sperber, Pascal Boyer, John Tooby and others, argue the possibility of incompatibility between modularity of mind and memetics. In their view, minds structure certain communicable aspects of the ideas produced, and these communicable aspects generally trigger or elicit ideas in other minds through inference (to relatively rich structures generated from often low-fidelity input) and not high-fidelity replication or imitation. Atran discusses communication involving religious beliefs as a case in point. In one set of experiments he asked religious people to write down on a piece of paper the meanings of the Ten Commandments. Despite the subjects' own expectations of consensus, interpretations of the commandments showed wide ranges of variation, with little evidence of consensus. In another experiment, subjects with autism and subjects without autism interpreted ideological and religious sayings (for example, "Let a thousand flowers bloom" or "To everything there is a season"). People with autism showed a significant tendency to closely paraphrase and repeat content from the original statement (for example: "Don't cut flowers before they bloom"). Controls tended to infer a wider range of cultural meanings with little replicated content (for example: "Go with the flow" or "Everyone should have equal opportunity"). Only the subjects with autism—who lack the degree of inferential capacity normally associated with aspects of theory of mind—came close to functioning as "meme machines". In his book The Robot's Rebellion, Stanovich uses the memes and memeplex concepts to describe a program of cognitive reform that he refers to as a "rebellion". Specifically, Stanovich argues that the use of memes as a descriptor for cultural units is beneficial because it serves to emphasize transmission and acquisition properties that parallel the study of epidemiology. 
These properties make salient the sometimes parasitic nature of acquired memes, and as a result individuals should be motivated to reflectively acquire memes using what he calls a "Neurathian bootstrap" process. Although social scientists such as Max Weber sought to understand and explain religion in terms of a cultural attribute, Richard Dawkins called for a re-analysis of religion in terms of the evolution of self-replicating ideas apart from any resulting biological advantages they might bestow. As an enthusiastic Darwinian, I have been dissatisfied with explanations that my fellow-enthusiasts have offered for human behaviour. They have tried to look for 'biological advantages' in various attributes of human civilization. For instance, tribal religion has been seen as a mechanism for solidifying group identity, valuable for a pack-hunting species whose individuals rely on cooperation to catch large and fast prey. Frequently the evolutionary preconception in terms of which such theories are framed is implicitly group-selectionist, but it is possible to rephrase the theories in terms of orthodox gene selection. He argued that the role of key replicator in cultural evolution belongs not to genes, but to memes replicating thought from person to person by means of imitation. These replicators respond to selective pressures that may or may not affect biological reproduction or survival. In her book The Meme Machine, Susan Blackmore regards religions as particularly tenacious memes. Many of the features common to the most widely practiced religions provide built-in advantages in an evolutionary context, she writes. For example, religions that preach of the value of faith over evidence from everyday experience or reason inoculate societies against many of the most basic tools people commonly use to evaluate their ideas. 
By linking altruism with religious affiliation, religious memes can proliferate more quickly because people perceive that they can reap societal as well as personal rewards. The longevity of religious memes improves with their documentation in revered religious texts. Aaron Lynch attributed the robustness of religious memes in human culture to the fact that such memes incorporate multiple modes of meme transmission. Religious memes pass down the generations from parent to child and across a single generation through the meme-exchange of proselytism. Most people will hold the religion taught them by their parents throughout their life. Many religions feature adversarial elements, punishing apostasy, for instance, or demonizing infidels. In Thought Contagion Lynch identifies the memes of transmission in Christianity as especially powerful in scope. Believers view the conversion of non-believers both as a religious duty and as an act of altruism. The promise of heaven to believers and threat of hell to non-believers provide a strong incentive for members to retain their belief. Lynch asserts that belief in the Crucifixion of Jesus in Christianity amplifies each of its other replication advantages through the indebtedness believers have to their Savior for sacrifice on the cross. The image of the crucifixion recurs in religious sacraments, and the proliferation of symbols of the cross in homes and churches potently reinforces the wide array of Christian memes. Although religious memes have proliferated in human cultures, the modern scientific community has been relatively resistant to religious belief. Robertson (2007) reasoned that if evolution is accelerated in conditions of propagative difficulty, then we would expect to encounter variations of religious memes, established in general populations, addressed to scientific communities. Using a memetic approach, Robertson deconstructed two attempts to privilege religiously held spirituality in scientific discourse. 
Advantages of a memetic approach, as compared to more traditional "modernization" and "supply side" theses, in understanding the evolution and propagation of religion were explored. In Cultural Software: A Theory of Ideology, Jack Balkin argued that memetic processes can explain many of the most familiar features of ideological thought. His theory of "cultural software" maintained that memes form narratives, networks of cultural associations, metaphoric and metonymic models, and a variety of different mental structures. Balkin maintains that the same structures used to generate ideas about free speech or free markets also serve to generate racist beliefs. To Balkin, whether memes become harmful or maladaptive depends on the environmental context in which they exist rather than on any special source or manner of their origination. Balkin describes racist beliefs as "fantasy" memes that become harmful or unjust "ideologies" when diverse peoples come together, as through trade or competition. In A Theory of Architecture, Nikos Salingaros speaks of memes as “freely propagating clusters of information” which can be beneficial or harmful. He contrasts memes to “patterns” and true knowledge, characterizing memes as “greatly simplified versions of patterns” and as “unreasoned matching to some visual or mnemonic prototype”. Referring to Dawkins, Salingaros emphasizes that they can be transmitted due to their own communicative properties, that “the simpler they are, the faster they can proliferate”, and that the most successful memes “come with a great psychological appeal”. According to Salingaros, architectural memes can have destructive power.
“Images portrayed in architectural magazines representing buildings that could not possibly accommodate everyday uses become fixed in our memory, so we reproduce them unconsciously.” He lists various architectural memes that have circulated since the 1920s and which, in his view, have led to contemporary architecture becoming quite decoupled from human needs. They lack connection and meaning, thereby preventing “the creation of true connections necessary to our understanding of the world”. He sees them as no different from antipatterns in software design – as solutions that are false but are re-utilized nonetheless.

The term "Internet meme" refers to a concept that spreads rapidly from person to person via the Internet, largely through Internet-based email, blogs, forums, imageboards, social networking sites, instant messaging and video streaming sites such as YouTube.

One technique of meme mapping represents the evolution and transmission of a meme across time and space. Such a meme map uses a figure-8 diagram (an analemma) to map the gestation (in the lower loop), birth (at the choke point), and development (in the upper loop) of the selected meme. Such meme maps are non-scalar, with time mapped onto the y-axis and space onto the x-axis transect. One can read the temporal progress of the mapped meme from south to north on such a meme map. Paull has published a worked example using the "organics meme" (as in organic agriculture). Robertson (2010) used a second technique of meme mapping to create two-dimensional representations of the selves of eleven participants drawn from both individualist and collectivist cultures. Participant narratives were transcribed, segmented and coded using a method similar to grounded theory. Coded segments exhibiting referent, connotative, affective and behavioral dimensions were declared to be memes. Memes that shared connotative, affective or behavioral qualities were linked.
All of the maps in Robertson's sample evidenced volition, constancy, uniqueness, production, intimacy, and social interest. This method of mapping the self was successfully used in therapy to treat a youth who had attempted suicide on five occasions (Robertson 2011). The youth and psychotherapist co-constructed a plan to change the youth's presenting self, and her progress in making those changes was tracked in subsequent self-maps.
Quantum mechanics, also known as quantum physics or quantum theory, is a branch of physics providing a mathematical description of the dual particle-like and wave-like behavior and interaction of matter and energy. Quantum mechanics describes the time evolution of physical systems via a mathematical structure called the wave function. The wave function encapsulates the probability that the system will be found in a given state at a given time. Quantum mechanics also allows one to calculate the effect on the system of making measurements of properties of the system by defining the effect of those measurements on the wave function. This leads to the well-known uncertainty principle as well as the enduring debate over the role of the experimenter, epitomised in the Schrödinger's cat thought experiment.
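The ideas in the paragraph above correspond to standard textbook formulas; as a sketch, the time evolution of the wave function, the probability rule it encodes, and the uncertainty principle read:

```latex
i\hbar\,\frac{\partial}{\partial t}\Psi(x,t) = \hat{H}\,\Psi(x,t)
\qquad \text{(Schr\"odinger equation: time evolution)}

P(a \le x \le b) = \int_a^b \left|\Psi(x,t)\right|^2 dx
\qquad \text{(Born rule: probability of finding the system in } [a,b]\text{)}

\Delta x\,\Delta p \ge \frac{\hbar}{2}
\qquad \text{(Heisenberg uncertainty principle)}
```

Here \(\hat{H}\) is the Hamiltonian (energy) operator and \(\hbar\) the reduced Planck constant.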

Quantum mechanics differs significantly from classical mechanics in its predictions when the scale of observations becomes comparable to the atomic and sub-atomic scale, the so-called quantum realm. However, many macroscopic properties of systems can only be fully understood and explained with the use of quantum mechanics. Phenomena such as superconductivity, the properties of materials such as semiconductors and nuclear and chemical reaction mechanisms observed as macroscopic behaviour, cannot be explained using classical mechanics.

The term was coined by Max Planck, and derives from the observation that some physical quantities can be changed only by discrete amounts, or quanta, as multiples of the Planck constant, rather than being capable of varying continuously or by any arbitrary amount. For example, the angular momentum, or more generally the action, of an electron bound into an atom or molecule is quantized. Although an unbound electron does not exhibit quantized energy levels, one which is bound in an atomic orbital has quantized values of angular momentum. In the context of quantum mechanics, the wave–particle duality of energy and matter and the uncertainty principle provide a unified view of the behavior of photons, electrons and other atomic-scale objects.
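The discreteness described above can be written out explicitly. For a quantum of radiation of frequency \(\nu\), and for the orbital angular momentum of an electron bound in an atomic orbital with quantum numbers \(l\) and \(m_l\), the standard results are:

```latex
E = h\nu = \hbar\omega
\qquad \text{(energy comes in multiples of the Planck constant } h\text{)}

|\mathbf{L}| = \sqrt{l(l+1)}\,\hbar, \qquad
L_z = m_l\,\hbar, \quad m_l \in \{-l, -l+1, \dots, l\}
```

The allowed values of \(L_z\) are thus separated by whole units of \(\hbar\), which is exactly the "discrete amounts" the paragraph refers to.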

The mathematical formulations of quantum mechanics are abstract. Similarly, the implications are often counter-intuitive in terms of classical physics. The centerpiece of the mathematical formulation is the wavefunction (defined by Schrödinger's wave equation), which describes the probability amplitude of the position and momentum of a particle. Mathematical manipulations of the wavefunction usually involve the bra-ket notation, which requires an understanding of complex numbers and linear functionals. The wavefunction treats the object as a quantum harmonic oscillator and the mathematics is akin to that of acoustic resonance.

Many of the results of quantum mechanics do not have models that are easily visualized in terms of classical mechanics; for instance, the ground state in the quantum mechanical model is a non-zero energy state that is the lowest permitted energy state of a system, rather than a traditional classical system that is thought of as simply being at rest with zero kinetic energy.

Fundamentally, it attempts to explain the peculiar behaviour of matter and energy at the subatomic level—an attempt which has produced more accurate results than classical physics in predicting how individual particles behave. But many unexplained anomalies remain.

Historically, the earliest versions of quantum mechanics were formulated in the first decade of the 20th century, around the time that atomic theory and the corpuscular theory of light as interpreted by Einstein first came to be widely accepted as scientific fact; the latter theories can be viewed as quantum theories of matter and electromagnetic radiation, respectively.

Following Schrödinger's breakthrough in deriving his wave equation in the mid-1920s, quantum theory was significantly reformulated away from the old quantum theory, towards the quantum mechanics of Werner Heisenberg, Max Born, Wolfgang Pauli and their associates, becoming a science of probabilities based upon the Copenhagen interpretation of Niels Bohr. By 1930, the reformulated theory had been further unified and formalized by the work of Paul Dirac and John von Neumann, with a greater emphasis placed on measurement, the statistical nature of our knowledge of reality, and philosophical speculations about the role of the observer.

The Copenhagen interpretation quickly became (and remains) the orthodox interpretation. However, due to the absence of conclusive experimental evidence there are also many competing interpretations.

Quantum mechanics has since branched out into almost every aspect of physics, and into other disciplines such as quantum chemistry, quantum electronics, quantum optics and quantum information science. Much 19th century physics has been re-evaluated as the classical limit of quantum mechanics and its more advanced developments in terms of quantum field theory, string theory, and speculative quantum gravity theories.

[b]History[/b]
The history of quantum mechanics dates back to the 1838 discovery of cathode rays by Michael Faraday. This was followed by the 1859 statement of the black body radiation problem by Gustav Kirchhoff, the 1877 suggestion by Ludwig Boltzmann that the energy states of a physical system can be discrete, and the 1900 quantum hypothesis of Max Planck. Planck's hypothesis that energy is radiated and absorbed in discrete "quanta", or "energy elements", precisely matched the observed patterns of black body radiation. According to Planck, each energy element E is proportional to its frequency ν:

E = h[i]ν[/i]

where h is Planck's constant. Planck cautiously insisted that this was simply an aspect of the processes of absorption and emission of radiation and had nothing to do with the physical reality of the radiation itself. However, in 1905 Albert Einstein interpreted Planck's quantum hypothesis realistically and used it to explain the photoelectric effect, in which shining light on certain materials can eject electrons from the material.
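As a quick numeric sketch of the Planck relation above (the frequency is an illustrative choice corresponding roughly to green light, not a value from the text):

```python
# Numeric illustration of E = h*nu. The constants are the exact SI values;
# the frequency is an assumed example (roughly green visible light).
h = 6.62607015e-34           # Planck constant, in J*s (exact SI value)
nu = 5.6e14                  # example frequency, in Hz (illustrative)

E = h * nu                   # energy of a single quantum, in joules
E_eV = E / 1.602176634e-19   # the same energy expressed in electronvolts

print(f"E = {E:.3e} J = {E_eV:.2f} eV")
```

A single quantum of visible light thus carries only a few electronvolts of energy, which is why the granularity of radiation went unnoticed in classical experiments.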

The foundations of quantum mechanics were established during the first half of the twentieth century by Niels Bohr, Werner Heisenberg, Max Planck, Louis de Broglie, Albert Einstein, Erwin Schrödinger, Max Born, John von Neumann, Paul Dirac, Wolfgang Pauli, David Hilbert, and others. In the mid-1920s, developments in quantum mechanics led to its becoming the standard formulation for atomic physics. In the summer of 1925, Bohr and Heisenberg published results that closed the "Old Quantum Theory". In recognition of their particle-like behaviour, light quanta came to be called photons (1926). From Einstein's simple postulation was born a flurry of debating, theorizing and testing. Thus the entire field of quantum physics emerged, leading to its wider acceptance at the Fifth Solvay Conference in 1927.

The other exemplar that led to quantum mechanics was the study of electromagnetic waves such as light. When it was found in 1900 by Max Planck that the energy of waves could be described as consisting of small packets or quanta, Albert Einstein further developed this idea to show that an electromagnetic wave such as light could be described as a particle - later called the photon - with a discrete quantum of energy that was dependent on its frequency. This led to a theory of unity between subatomic particles and electromagnetic waves, called wave–particle duality, in which particles and waves were neither one nor the other, but had certain properties of both.

While quantum mechanics traditionally described the world of the very small, it is also needed to explain certain recently investigated macroscopic systems such as superconductors and superfluids.

The word quantum derives from Latin, meaning "how great" or "how much". In quantum mechanics, it refers to a discrete unit that quantum theory assigns to certain physical quantities, such as the energy of an atom at rest (see Figure 1). The discovery that particles are discrete packets of energy with wave-like properties led to the branch of physics dealing with atomic and sub-atomic systems which is today called quantum mechanics. It is the underlying mathematical framework of many fields of physics and chemistry, including condensed matter physics, solid-state physics, atomic physics, molecular physics, computational physics, computational chemistry, quantum chemistry, particle physics, nuclear chemistry, and nuclear physics. Some fundamental aspects of the theory are still actively studied.

Quantum mechanics is essential to understand the behavior of systems at atomic length scales and smaller. For example, if classical mechanics governed the workings of an atom, electrons would rapidly travel towards and collide with the nucleus, making stable atoms impossible. However, in the natural world the electrons normally remain in an uncertain, non-deterministic "smeared" (wave–particle wave function) orbital path around or through the nucleus, defying classical electromagnetism.

Quantum mechanics was initially developed to provide a better explanation of the atom, especially the differences in the spectra of light emitted by different isotopes of the same element. The quantum theory of the atom was developed as an explanation for the electron remaining in its orbit, which could not be explained by Newton's laws of motion and Maxwell's laws of classical electromagnetism.

[b]Mathematical formulations[/b]
In the mathematically rigorous formulation of quantum mechanics developed by Paul Dirac and John von Neumann, the possible states of a quantum mechanical system are represented by unit vectors (called "state vectors"). Formally, these reside in a complex separable Hilbert space (variously called the "state space" or the "associated Hilbert space" of the system), well defined up to a complex number of norm 1 (the phase factor). In other words, the possible states are points in the projective space of a Hilbert space, usually called the complex projective space. The exact nature of this Hilbert space is dependent on the system; for example, the state space for position and momentum states is the space of square-integrable functions, while the state space for the spin of a single proton is just the product of two complex planes. Each observable is represented by a maximally Hermitian (precisely: by a self-adjoint) linear operator acting on the state space. Each eigenstate of an observable corresponds to an eigenvector of the operator, and the associated eigenvalue corresponds to the value of the observable in that eigenstate. If the operator's spectrum is discrete, the observable can only attain those discrete eigenvalues.
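The spin-1/2 case mentioned above is small enough to check directly. A minimal NumPy sketch (the choice of the z-spin observable, represented by the standard Pauli matrix, is illustrative):

```python
import numpy as np

# The z-spin observable for a spin-1/2 system, in units of hbar/2: the
# Pauli matrix sigma_z, a self-adjoint operator on the two-dimensional
# complex state space.
sigma_z = np.array([[1, 0],
                    [0, -1]], dtype=complex)

# Self-adjointness: the operator equals its own conjugate transpose.
assert np.allclose(sigma_z, sigma_z.conj().T)

# Its eigenvalues are the possible measured values of the observable.
# Because the operator is self-adjoint, they are necessarily real, and
# because the spectrum is discrete, only these two outcomes can occur.
eigenvalues, eigenvectors = np.linalg.eigh(sigma_z)
print(eigenvalues)
```

The two eigenvalues, ±1 (times ħ/2), are exactly the discrete outcomes observed in a Stern–Gerlach-type spin measurement.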

In the formalism of quantum mechanics, the state of a system at a given time is described by a complex wave function, also referred to as a state vector in a complex vector space. This abstract mathematical object allows for the calculation of probabilities of outcomes of concrete experiments. For example, it allows one to compute the probability of finding an electron in a particular region around the nucleus at a particular time. Contrary to classical mechanics, one can never make simultaneous predictions of conjugate variables, such as position and momentum, with arbitrary accuracy. For instance, electrons may be considered to be located somewhere within a region of space, but with their exact positions remaining unknown. Contours of constant probability, often referred to as "clouds", may be drawn around the nucleus of an atom to conceptualize where the electron might be located with the most probability. Heisenberg's uncertainty principle quantifies the inability to precisely locate the particle given its conjugate momentum.

According to one interpretation, as the result of a measurement the wave function containing the probability information for a system collapses from a given initial state to a particular eigenstate. The possible results of a measurement are the eigenvalues of the operator representing the observable — which explains the choice of Hermitian operators, for which all the eigenvalues are real. We can find the probability distribution of an observable in a given state by computing the spectral decomposition of the corresponding operator. Heisenberg's uncertainty principle is represented by the statement that the operators corresponding to certain observables do not commute.
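Both statements above, that outcome probabilities follow from the spectral decomposition and that incompatible observables fail to commute, can be verified numerically for the spin-1/2 system. A sketch (the particular state and Pauli matrices are standard illustrative choices):

```python
import numpy as np

# Two spin-1/2 observables that do not commute: sigma_x and sigma_z.
sigma_x = np.array([[0, 1], [1, 0]], dtype=complex)
sigma_z = np.array([[1, 0], [0, -1]], dtype=complex)

# A nonzero commutator means no state has definite values of both.
commutator = sigma_x @ sigma_z - sigma_z @ sigma_x
print(commutator)

# Probability distribution of sigma_z outcomes in the sigma_x eigenstate
# (1, 1)/sqrt(2), computed from the spectral decomposition of sigma_z:
# each probability is the squared modulus of the overlap with an eigenvector.
psi = np.array([1, 1], dtype=complex) / np.sqrt(2)
eigvals, eigvecs = np.linalg.eigh(sigma_z)
probs = np.abs(eigvecs.conj().T @ psi) ** 2
print(probs)
```

The resulting 50/50 distribution is the expected one: a state with definite x-spin is maximally uncertain in z-spin, a direct expression of the non-commutation.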

The probabilistic nature of quantum mechanics thus stems from the act of measurement. This is one of the most difficult aspects of quantum systems to understand. It was the central topic in the famous Bohr-Einstein debates, in which the two scientists attempted to clarify these fundamental principles by way of thought experiments. In the decades after the formulation of quantum mechanics, the question of what constitutes a "measurement" has been extensively studied. Newer interpretations of quantum mechanics have been formulated that do away with the concept of "wavefunction collapse"; see, for example, the relative state interpretation. The basic idea is that when a quantum system interacts with a measuring apparatus, their respective wavefunctions become entangled, so that the original quantum system ceases to exist as an independent entity. For details, see the article on measurement in quantum mechanics. Generally, quantum mechanics does not assign definite values. Instead, it makes predictions using probability distributions; that is, it describes the probability of obtaining possible outcomes from measuring an observable. Naturally, these probabilities will depend on the quantum state at the "instant" of the measurement. Hence, uncertainty is involved in the value. There are, however, certain states that are associated with a definite value of a particular observable. These are known as eigenstates of the observable ("eigen" can be translated from German as meaning inherent or characteristic).

In the everyday world, it is natural and intuitive to think of everything (every observable) as being in an eigenstate. Everything appears to have a definite position, a definite momentum, a definite energy, and a definite time of occurrence. However, quantum mechanics does not pinpoint the exact values of a particle's position and momentum (since they are conjugate pairs) or its energy and time (since they too are conjugate pairs); rather, it provides only a range of probabilities for where that particle might be found. Therefore, it is helpful to use different words to describe states having uncertain values and states having definite values (eigenstates). Usually, a system will not be in an eigenstate of the observable we are interested in. However, if one measures the observable, the wavefunction will instantaneously become an eigenstate (or generalised eigenstate) of that observable. This process is known as wavefunction collapse, a controversial and much debated process. It involves expanding the system under study to include the measurement device. If one knows the corresponding wave function at the instant before the measurement, one will be able to compute the probability of collapsing into each of the possible eigenstates. For example, a free particle will usually have a wavefunction that is a wave packet centered around some mean position x0, neither an eigenstate of position nor of momentum. When one measures the position of the particle, it is impossible to predict with certainty the result. It is probable, but not certain, that it will be near x0, where the amplitude of the wave function is large. After the measurement is performed, having obtained some result x, the wave function collapses into a position eigenstate centered at x.
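The collapse postulate described above can be mocked up in a few lines. This is a toy sketch, not any library's API; the `measure` helper is hypothetical, and the seeded generator, state, and observable are illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(0)  # seeded for reproducibility (illustrative)

def measure(psi, observable):
    """Toy projective measurement: sample an eigenvalue with Born-rule
    probability, then collapse the state onto the matching eigenvector."""
    eigvals, eigvecs = np.linalg.eigh(observable)
    probs = np.abs(eigvecs.conj().T @ psi) ** 2
    k = rng.choice(len(eigvals), p=probs / probs.sum())
    return eigvals[k], eigvecs[:, k]

sigma_z = np.array([[1, 0], [0, -1]], dtype=complex)
psi = np.array([1, 1], dtype=complex) / np.sqrt(2)  # no definite z-spin

outcome, psi_after = measure(psi, sigma_z)
# The collapsed state IS an eigenstate, so repeating the same measurement
# immediately afterwards reproduces the same outcome with certainty.
outcome2, _ = measure(psi_after, sigma_z)
print(outcome, outcome2)
```

The first outcome is genuinely random; the repetition is not, which is exactly the distinction between measurement and ordinary time evolution drawn in the surrounding text.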

The time evolution of a quantum state is described by the Schrödinger equation, in which the Hamiltonian (the operator corresponding to the total energy of the system) generates time evolution. The time evolution of wave functions is deterministic in the sense that, given a wavefunction at an initial time, it makes a definite prediction of what the wavefunction will be at any later time.

During a measurement, on the other hand, the change of the wavefunction into another one is not deterministic; it is unpredictable, i.e. random. Wave functions can change as time progresses. An equation known as the Schrödinger equation describes how wavefunctions change in time, playing a role similar to that of Newton's second law in classical mechanics. The Schrödinger equation, applied to the aforementioned free particle, predicts that the center of a wave packet will move through space at a constant velocity, like a classical particle with no forces acting on it. However, the wave packet will also spread out as time progresses, which means that the position becomes more uncertain. This also has the effect of turning position eigenstates (which can be thought of as infinitely sharp wave packets) into broadened wave packets that are no longer position eigenstates.
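The deterministic evolution and the spreading of the wave packet can both be demonstrated numerically. A sketch assuming ħ = m = 1, with a finite-difference free-particle Hamiltonian on a 1-D grid (grid size, evolution time and packet width are illustrative choices):

```python
import numpy as np

# A Gaussian wave packet on a 1-D grid.
N, L = 400, 40.0
x = np.linspace(-L/2, L/2, N)
dx = x[1] - x[0]

# Free-particle Hamiltonian -1/2 d^2/dx^2 by central differences.
H = (np.diag(np.full(N, 1.0)) - 0.5*np.diag(np.ones(N-1), 1)
     - 0.5*np.diag(np.ones(N-1), -1)) / dx**2

psi0 = np.exp(-x**2 / 2) * (1 + 0j)
psi0 /= np.linalg.norm(psi0)

# Schrodinger evolution psi(t) = exp(-iHt) psi(0) via eigendecomposition.
E, V = np.linalg.eigh(H)
def evolve(psi, t):
    return V @ (np.exp(-1j * E * t) * (V.conj().T @ psi))

def width(psi):
    """Standard deviation of position in the state psi."""
    p = np.abs(psi)**2
    mean = (p * x).sum()
    return np.sqrt((p * (x - mean)**2).sum())

psi_t = evolve(psi0, 4.0)
print(width(psi0), width(psi_t))   # the packet has spread out
print(np.linalg.norm(psi_t))       # while the norm is preserved
```

The evolution is unitary, so the total probability stays exactly 1 at all times; only the distribution broadens, as the text describes.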

Whoever reads this gets a free llama badge!

Some wave functions produce probability distributions that are constant, or independent of time; in a stationary state of constant energy, for example, time drops out of the absolute square of the wave function. Many systems that are treated dynamically in classical mechanics are described by such "static" wave functions. For example, a single electron in an unexcited atom is pictured classically as a particle moving in a circular trajectory around the atomic nucleus, whereas in quantum mechanics it is described by a static, spherically symmetric wavefunction surrounding the nucleus (Fig. 1). (Note that only the lowest angular momentum states, labeled s, are spherically symmetric.)

The Schrödinger equation acts on the entire probability amplitude, not merely its absolute value. Whereas the absolute value of the probability amplitude encodes information about probabilities, its phase encodes information about the interference between quantum states. This gives rise to the wave-like behavior of quantum states. It turns out that analytic solutions of Schrödinger's equation are only available for a small number of model Hamiltonians, of which the quantum harmonic oscillator, the particle in a box, the hydrogen molecular ion and the hydrogen atom are the most important representatives. Even the helium atom, which contains just one more electron than hydrogen, defies all attempts at a fully analytic treatment. There exist several techniques for generating approximate solutions. For instance, in the method known as perturbation theory one uses the analytic results for a simple quantum mechanical model to generate results for a more complicated model related to the simple model by, for example, the addition of a weak potential energy. Another method is the "semi-classical equation of motion" approach, which applies to systems for which quantum mechanics produces weak deviations from classical behavior. The deviations can be calculated based on the classical motion. This approach is important for the field of quantum chaos.
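The particle in a box, one of the exactly solvable models listed above, has the well-known energy levels E_n = n²π²ħ²/(2mL²). A sketch checking this analytic formula against a finite-difference Hamiltonian (assuming ħ = m = 1 and box width L = 1; the grid resolution is an illustrative choice):

```python
import numpy as np

# Interior grid points of a box [0, L] with hard walls (Dirichlet
# boundaries are imposed implicitly by truncating the matrix).
N, L = 500, 1.0
dx = L / (N + 1)

# Kinetic-energy operator -1/2 d^2/dx^2 by central differences.
H = (np.diag(np.full(N, 1.0)) - 0.5*np.diag(np.ones(N-1), 1)
     - 0.5*np.diag(np.ones(N-1), -1)) / dx**2

# Lowest three numerical eigenvalues vs. the analytic levels
# E_n = n^2 pi^2 / 2 (with hbar = m = L = 1).
numeric = np.linalg.eigvalsh(H)[:3]
analytic = np.array([n**2 * np.pi**2 / 2 for n in (1, 2, 3)])
print(numeric)
print(analytic)
```

The two sets agree to a small fraction of a percent at this resolution, and the n² spacing of the levels is the signature of the box's discrete spectrum.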

There are numerous mathematically equivalent formulations of quantum mechanics. One of the oldest and most commonly used formulations is the transformation theory proposed by Cambridge theoretical physicist Paul Dirac, which unifies and generalizes the two earliest formulations of quantum mechanics, matrix mechanics (invented by Werner Heisenberg) and wave mechanics (invented by Erwin Schrödinger). In this formulation, the instantaneous state of a quantum system encodes the probabilities of its measurable properties, or "observables". Examples of observables include energy, position, momentum, and angular momentum. Observables can be either continuous (e.g., the position of a particle) or discrete (e.g., the energy of an electron bound to a hydrogen atom). An alternative formulation of quantum mechanics is Feynman's path integral formulation, in which a quantum-mechanical amplitude is considered as a sum over histories between initial and final states; this is the quantum-mechanical counterpart of action principles in classical mechanics.