Threesology Research Journal
Let's Talk Peace
page 44

http://cenocracy.org




Note: the contents of this page, as well as those which precede and follow it, must be read as a continuation and/or overlap, so that the continuity of the discussion is not lost: namely, the relationship between the dichotomous arrangement of ideas by which one might talk seriously about peace from a different perspective (war being frequently used to describe an absence of peace, and vice versa) and the typical dichotomous assignments of Artificial Intelligence (such as the zeros and ones used in computer programming). However, if your mind is prone to being distracted by timed or untimed commercialization (such as that seen in various types of American-based television, radio, news media and magazine publishing... not to mention the average classroom, which carries over into the everyday workplace), you may be unable to sustain prolonged exposure to divergent ideas about a singular topic without becoming confused, unless the information is provided in a very simplistic manner.




Let's face it: humanity has a lousy definition of peace, and an equally lousy practice and analysis of it.



The topic of Information Theory is of some importance in a discussion of Peace (not to mention its "War" obverse), since both peace and war represent types of "expression", or, if you prefer, forms of communication about prevailing circumstances. Because neither peace nor war is long sustained, they are like conversations which begin and end, though remnants of what was, is, or will be said can linger in a potential state of existence and predispose the occurrence of social/behavioral (mental/emotional) "oscillations", reverberations, or echoes which may take a long while to diminish.

The sustained peace sought by many has not been achieved, but sporadic efforts toward such an end are rewarded by the Nobel Prize Committee, whose actions appear to result from not actually appreciating what Peace is, because it has never been accurately discussed from a variety of perspectives rather than through traditional definitions comprising wishful thinking. In other words, it is easy to reward Peace when the applied definition is severely restricted... and an appraisal based on superficiality is permitted when anything more substantial for consideration is lacking. For one reason or another, while one person is saying "Peace" another person is hearing "War", or vice versa. From this perspective, we can consider that there is a communications problem between one party and another, or an intermediary... unless both are products of a miscommunication, in that some other state (neither peace nor war) is being conveyed and is routinely misunderstood because of too much noise or interference. But let us begin the present selection by providing a definition of Information Theory culled from the Britannica, followed by additional information that may be of interest to one or another reader, so as to provide a better understanding of the current application of Information:

Information Theory

Introduction:

(Information Theory is) a mathematical representation of the conditions and parameters affecting the transmission and processing of information. Most closely associated with the work of the American electrical engineer Claude Shannon in the mid-20th century, information theory is chiefly of interest to communication engineers, though some of the concepts have been adopted and used in such fields as psychology and linguistics. Information theory overlaps heavily with communication theory, but it is more oriented toward the fundamental limitations on the processing and communication of information and less oriented toward the detailed operation of particular devices.

Historical background:

Interest in the concept of information grew directly from the creation of the telegraph and telephone. In 1844 the American inventor Samuel F.B. Morse built a telegraph line between Washington, D.C., and Baltimore, Maryland. Morse encountered many electrical problems when he sent signals through buried transmission lines, but inexplicably he encountered fewer problems when the lines were suspended on poles. This attracted the attention of many distinguished physicists, most notably the Scotsman William Thomson (Baron Kelvin). In a similar manner, the invention of the telephone in 1875 by Alexander Graham Bell and its subsequent proliferation attracted further scientific notables, such as Henri Poincaré, Oliver Heaviside, and Michael Pupin, to the problems associated with transmitting signals over wires. Much of their work was done using Fourier analysis, a technique described later in this article, but in all of these cases the analysis was dedicated to solving the practical engineering problems of communication systems.

The formal study of information theory did not begin until 1924, when Harry Nyquist, a researcher at Bell Laboratories, published a paper entitled "Certain Factors Affecting Telegraph Speed." Nyquist realized that communication channels had maximum data transmission rates, and he derived a formula for calculating these rates in finite bandwidth noiseless channels. Another pioneer was Nyquist's colleague R.V.L. Hartley, whose paper "Transmission of Information" (1928) established the first mathematical foundations for information theory.

The real birth of modern information theory can be traced to the publication in 1948 of Claude Shannon's "A Mathematical Theory of Communication" in the Bell System Technical Journal. A key step in Shannon's work was his realization that, in order to have a theory, communication signals must be treated in isolation from the meaning of the messages that they transmit. This view is in sharp contrast with the common conception of information, in which meaning has an essential role. Shannon also realized that the amount of knowledge conveyed by a signal is not directly related to the size of the message. A famous illustration of this distinction is the correspondence between French novelist Victor Hugo and his publisher following the publication of Les Misérables in 1862. Hugo sent his publisher a card with just the symbol "?". In return he received a card with just the symbol "!". Within the context of Hugo's relations with his publisher and the public, these short messages were loaded with meaning; lacking such a context, these messages are meaningless. Similarly, a long, complete message in perfect French would convey little useful knowledge to someone who could understand only English.

Shannon thus wisely realized that a useful theory of information would first have to concentrate on the problems associated with sending and receiving messages, and it would have to leave questions involving any intrinsic meaning of a message—known as the semantic problem—for later investigators. Clearly, if the technical problem could not be solved—that is, if a message could not be transmitted correctly—then the semantic problem was not likely ever to be solved satisfactorily. Solving the technical problem was therefore the first step in developing a reliable communication system.

It is no accident that Shannon worked for Bell Laboratories. The practical stimuli for his work were the problems faced in creating a reliable telephone system. A key question that had to be answered in the early days of telecommunication was how best to maximize the physical plant—in particular, how to transmit the maximum number of telephone conversations over existing cables. Prior to Shannon's work, the factors for achieving maximum utilization were not clearly understood. Shannon's work defined communication channels and showed how to assign a capacity to them, not only in the theoretical sense where no interference, or noise, was present but also in practical cases where real channels were subjected to real noise. Shannon produced a formula that showed how the bandwidth of a channel (that is, its theoretical signal capacity) and its signal-to-noise ratio (a measure of interference) affected its capacity to carry signals. In doing so he was able to suggest strategies for maximizing the capacity of a given channel and showed the limits of what was possible with a given technology. This was of great utility to engineers, who could focus thereafter on individual cases and understand the specific trade-offs involved.
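
The bandwidth formula referred to here is generally known today as the Shannon–Hartley theorem, C = B log2(1 + S/N), where C is the channel capacity in bits per second, B the bandwidth in hertz, and S/N the signal-to-noise ratio. As a minimal sketch of how it behaves (the 3 kHz bandwidth and 30 dB signal-to-noise figures below are illustrative assumptions of our own, not values taken from the Britannica article):

    import math

    def channel_capacity(bandwidth_hz, snr_linear):
        """Shannon-Hartley capacity in bits per second:
        C = B * log2(1 + S/N), with S/N as a linear (not decibel) ratio."""
        return bandwidth_hz * math.log2(1 + snr_linear)

    # Example: a ~3 kHz telephone voice channel with a 30 dB signal-to-noise ratio.
    snr_linear = 10 ** (30 / 10)               # 30 dB corresponds to a power ratio of 1000
    print(channel_capacity(3000, snr_linear))  # roughly 29,900 bits per second

Doubling the bandwidth doubles the capacity, while doubling the signal power yields only a modest, logarithmic gain; this is the kind of trade-off the formula made explicit for engineers.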

Shannon also made the startling discovery that, even in the presence of noise, it is always possible to transmit signals arbitrarily close to the theoretical channel capacity. This discovery inspired engineers to look for practical techniques to improve performance in signal transmissions that were far from optimal. Shannon's work clearly distinguished gains that could be realized by adopting a different encoding scheme from gains that could be realized only by altering the communication system itself. Before Shannon, engineers lacked a systematic way of analyzing and solving such problems.

Shannon's pioneering work thus presented many key ideas that have guided engineers and scientists ever since. Though information theory does not always make clear exactly how to achieve specific results, people now know which questions are worth asking and can focus on areas that will yield the highest return. They also know which sorts of questions are difficult to answer and the areas in which there is not likely to be a large return for the amount of effort expended.

Since the 1940s and '50s the principles of classical information theory have been applied to many fields. The section Applications of information theory surveys achievements not only in such areas of telecommunications as data compression and error correction but also in the separate disciplines of physiology, linguistics, and physics. Indeed, even in Shannon's day many books and articles appeared that discussed the relationship between information theory and areas such as art and business. Unfortunately, many of these purported relationships were of dubious worth. Efforts to link information theory to every problem and every area were disturbing enough to Shannon himself that in a 1956 editorial titled "The Bandwagon" he issued the following warning:

I personally believe that many of the concepts of information theory will prove useful in these other fields—and, indeed, some results are already quite promising—but the establishing of such applications is not a trivial matter of translating words to a new domain, but rather the slow tedious process of hypothesis and experimental verification.

With Shannon's own words in mind, we can now review the central principles of classical information theory.

Classical information theory

Shannon's communication model

(Figure: Shannon's communication model: message source, encoder, channel with noise source, decoder, message receiver.)

As the underpinning of his theory, Shannon developed a very simple, abstract model of communication, as shown in the figure. Because his model is abstract, it applies in many situations, which contributes to its broad scope and power.

The first component of the model, the message source, is simply the entity that originally creates the message. Often the message source is a human, but in Shannon's model it could also be an animal, a computer, or some other inanimate object. The encoder is the object that connects the message to the actual physical signals that are being sent. For example, there are several ways to apply this model to two people having a telephone conversation. On one level, the actual speech produced by one person can be considered the message, and the telephone mouthpiece and its associated electronics can be considered the encoder, which converts the speech into electrical signals that travel along the telephone network. Alternatively, one can consider the speaker's mind as the message source and the combination of the speaker's brain, vocal system, and telephone mouthpiece as the encoder. However, the inclusion of “mind” introduces complex semantic problems to any analysis and is generally avoided except for the application of information theory to physiology.

The channel is the medium that carries the message. The channel might be wires, the air or space in the case of radio and television transmissions, or fibre-optic cable. In the case of a signal produced simply by banging on the plumbing, the channel might be the pipe that receives the blow. The beauty of having an abstract model is that it permits the inclusion of a wide variety of channels. Some of the constraints imposed by channels on the propagation of signals through them will be discussed later.

Noise is anything that interferes with the transmission of a signal. In telephone conversations interference might be caused by static in the line, cross talk from another line, or background sounds. Signals transmitted optically through the air might suffer interference from clouds or excessive humidity. Clearly, sources of noise depend upon the particular communication system. A single system may have several sources of noise, but, if all of these separate sources are understood, it will sometimes be possible to treat them as a single source.

The decoder is the object that converts the signal, as received, into a form that the message receiver can comprehend. In the case of the telephone, the decoder could be the earpiece and its electronic circuits. Depending upon perspective, the decoder could also include the listener's entire hearing system.

The message receiver is the object that gets the message. It could be a person, an animal, or a computer or some other inanimate object.

Shannon's theory deals primarily with the encoder, channel, noise source, and decoder. As noted above, the focus of the theory is on signals and how they can be transmitted accurately and efficiently; questions of meaning are avoided as much as possible.
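
As a toy illustration of how abstract the model is (a minimal sketch of our own, not something drawn from the Britannica article), the encoder, channel, noise source, and decoder can each be expressed as a small interchangeable function:

    import random

    def encode(message):
        # Encoder: map each character of the message to 8 signal bits (ASCII).
        return ''.join(format(ord(c), '08b') for c in message)

    def channel(signal, error_rate=0.0):
        # Channel with an optional noise source: each bit may flip in transit.
        return ''.join(b if random.random() >= error_rate else str(1 - int(b))
                       for b in signal)

    def decode(signal):
        # Decoder: regroup the received bits into characters for the message receiver.
        chunks = (signal[i:i + 8] for i in range(0, len(signal), 8))
        return ''.join(chr(int(chunk, 2)) for chunk in chunks)

    print(decode(channel(encode("peace"), error_rate=0.0)))  # noiseless case: "peace"

Swapping in a different encoder (Morse pulses, optical levels) or a noisier channel changes nothing else in the pipeline, which is precisely the power the passage attributes to the abstraction.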

Four types of communication

There are two fundamentally different ways to transmit messages: via discrete signals and via continuous signals. Discrete signals can represent only a finite number of different, recognizable states. For example, the letters of the English alphabet are commonly thought of as discrete signals. Continuous signals, also known as analog signals, are commonly used to transmit quantities that can vary over an infinite set of values—sound is a typical example. However, such continuous quantities can be approximated by discrete signals—for instance, on a digital compact disc or through a digital telecommunication system—by increasing the number of distinct discrete values available until any inaccuracy in the description falls below the level of perception or interest.
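
The digital-compact-disc point can be made concrete in a few lines (a minimal sketch under assumptions of our own; the 8-level quantizer is arbitrarily coarse, whereas a compact disc uses 65,536 levels per sample):

    import math

    def quantize(x, levels):
        """Map a continuous value in [-1, 1] onto one of `levels` evenly spaced steps."""
        step = 2.0 / (levels - 1)
        return round(x / step) * step

    # A continuous, sound-like signal sampled at sixteen points...
    samples = [math.sin(2 * math.pi * t / 16) for t in range(16)]
    coarse = [quantize(s, 8) for s in samples]      # 8 levels: large, audible error
    fine = [quantize(s, 65536) for s in samples]    # 16-bit, CD-style: error below perception
    print(max(abs(a - b) for a, b in zip(samples, coarse)))
    print(max(abs(a - b) for a, b in zip(samples, fine)))

The worst-case error printed shrinks as the number of discrete levels grows, which is exactly the sense in which a continuous signal "can be approximated by discrete signals".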

Communication can also take place in the presence or absence of noise. These conditions are referred to as noisy or noiseless communication, respectively.

All told, there are four cases to consider: discrete, noiseless communication; discrete, noisy communication; continuous, noiseless communication; and continuous, noisy communication. It is easier to analyze the discrete cases than the continuous cases; likewise, the noiseless cases are simpler than the noisy cases. Therefore, the discrete, noiseless case will be considered first in some detail, followed by an indication of how the other cases differ.

Discrete, noiseless communication and the concept of entropy

From message alphabet to signal alphabet

As mentioned above, the English alphabet is a discrete communication system. It consists of a finite set of characters, such as uppercase and lowercase letters, digits, and various punctuation marks. Messages are composed by stringing these individual characters together appropriately. (Henceforth, signal components in any discrete communication system will be referred to as characters.)

For noiseless communications, the decoder at the receiving end receives exactly the characters sent by the encoder. However, these transmitted characters are typically not in the original message's alphabet. For example, in Morse Code, appropriately spaced short and long electrical pulses, light flashes, or sounds are used to transmit the message. Similarly, today many forms of digital communication use a signal alphabet consisting of just two characters, sometimes called bits. These characters are generally denoted by 0 and 1, but in practice they might be different electrical or optical levels.

A key question in discrete, noiseless communication is deciding how to most efficiently convert messages into the signal alphabet.
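
The classic constructive answer to that question, arrived at by David Huffman in 1952 (a few years after Shannon's paper, and offered here only as an illustrative sketch, not as part of the Britannica article), assigns short bit strings to frequent characters and longer ones to rare characters:

    import heapq
    from collections import Counter

    def huffman_codes(text):
        """Build a prefix-free binary code in which frequent characters
        tend to receive short bit strings and rare characters longer ones."""
        # Heap entries: (combined frequency, tiebreaker, {character: code so far}).
        heap = [(freq, i, {ch: ''}) for i, (ch, freq) in enumerate(Counter(text).items())]
        heapq.heapify(heap)
        count = len(heap)
        while len(heap) > 1:
            f0, _, left = heapq.heappop(heap)
            f1, _, right = heapq.heappop(heap)
            merged = {ch: '0' + code for ch, code in left.items()}
            merged.update({ch: '1' + code for ch, code in right.items()})
            heapq.heappush(heap, (f0 + f1, count, merged))
            count += 1
        return heap[0][2]

    codes = huffman_codes("peace and war are forms of communication")
    print(codes[' '], codes['a'])  # frequent characters: short codes
    print(codes['w'], codes['f'])  # infrequent characters: longer codes

Morse himself anticipated the principle informally by giving "e", the most common English letter, the shortest signal: a single dot.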

Discrete, noisy communication and the problem of error

In the real world, transmission errors are unavoidable—especially given the presence in any communication channel of noise, which is the sum total of random signals that interfere with the communication signal. In order to take the inevitable transmission errors of the real world into account, some adjustment in encoding schemes is necessary. The figure shows a simple model of transmission in the presence of noise, the binary symmetric channel. Binary indicates that this channel transmits only two distinct characters, generally interpreted as 0 and 1, while symmetric indicates that errors are equally probable regardless of which character is transmitted. The probability that a character is transmitted without error is labeled p; hence, the probability of error is 1 - p.

Consider what happens as zeros and ones, hereafter referred to as bits, emerge from the receiving end of the channel. Ideally, there would be a means of determining which bits were received correctly. In that case, it is possible to imagine two printouts:

10110101010010011001010011101101000010100101—Signal
00000000000100000000100000000010000000011001—Errors

Signal is the message as received, while each 1 in Errors indicates a mistake in the corresponding Signal bit. (Errors itself is assumed to be error-free.)
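
A binary symmetric channel of this kind is straightforward to simulate (a minimal sketch under assumptions of our own; the value p = 0.9 and the sample bit string are arbitrary):

    import random

    def binary_symmetric_channel(bits, p):
        """Transmit bits through a binary symmetric channel: each bit arrives
        intact with probability p and flipped with probability 1 - p.
        Returns (received signal, error indicator), as in the printouts above."""
        received, errors = [], []
        for b in bits:
            flipped = random.random() > p
            received.append(str(int(b) ^ flipped))
            errors.append('1' if flipped else '0')
        return ''.join(received), ''.join(errors)

    random.seed(1)
    signal, errors = binary_symmetric_channel('1011010101001001', p=0.9)
    print(signal, '- Signal')
    print(errors, '- Errors')

In practice, of course, the receiver sees only the Signal line; the entire problem of error-correcting codes is to let it reconstruct the Errors line from redundancy deliberately added before transmission.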

Shannon showed that the best method for transmitting error corrections requires an average length of:

E = p log2(1/p) + (1 - p) log2(1/(1 - p))

bits per error correction symbol. Thus, for every bit transmitted at least E bits have to be reserved for error corrections. A reasonable measure for the effectiveness of a binary symmetric channel at conveying information can be established by taking its raw throughput of bits and subtracting the number of bits necessary to transmit error corrections. The limit on the efficiency of a binary symmetric channel with noise can now be given as a percentage by the formula 100 × (1 - E). Some examples follow.

Suppose that p = 1/2, meaning that each bit is received correctly only half the time. In this case E = 1, so the effectiveness of the channel is 0 percent. In other words, no information is being transmitted. In effect, the error rate is so high that there is no way to tell whether any symbol is correct—one could just as well flip a coin for each bit at the receiving end. On the other hand, if the probability of correctly receiving a character is .99, E is roughly .081, so the effectiveness of the channel is roughly 92 percent. That is, a 1 percent error rate results in the net loss of about 8 percent of the channel's transmission capacity.
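
Both worked examples can be checked directly (a minimal sketch; the function below simply evaluates the formula for E given above):

    import math

    def correction_overhead(p):
        """E = p*log2(1/p) + (1 - p)*log2(1/(1 - p)): the average number of
        error-correction bits required per transmitted bit."""
        if p in (0.0, 1.0):  # a certain outcome requires no correction bits
            return 0.0
        return p * math.log2(1 / p) + (1 - p) * math.log2(1 / (1 - p))

    for p in (0.5, 0.99):
        E = correction_overhead(p)
        print("p = %.2f: E = %.3f, effectiveness = %.0f%%" % (p, E, 100 * (1 - E)))
    # p = 0.50: E = 1.000, effectiveness = 0%
    # p = 0.99: E = 0.081, effectiveness = 92%

Readers familiar with information theory will recognize E as the binary entropy function, the same quantity that measures the information content of a biased coin flip.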

One interesting aspect of Shannon's proof of a limit for minimum average error correction length is that it is nonconstructive; that is, Shannon proved that a shortest correction code must always exist, but his proof does not indicate how to construct such a code for each particular case. While Shannon's limit can always be approached to any desired degree, it is no trivial problem to find effective codes that are also easy and quick to decode.

Continuous communication and the problem of bandwidth

Continuous communication, unlike discrete communication, deals with signals that have potentially an infinite number of different values. Continuous communication is closely related to discrete communication (in the sense that any continuous signal can be approximated by a discrete signal), although the relationship is sometimes obscured by the more sophisticated mathematics involved.

Source: "Information Theory." Encyclopædia Britannica Ultimate Reference Suite, 2013.

With respect to human communication, let us look at a brief appraisal of human vocalization:


The first of the two basic sounds made by infants includes all those related to crying; these are present even at birth. A second category, described as cooing, emerges at about eight weeks and includes sounds that progress to babbling and ultimately become part of meaningful speech. Almost all children make babbling sounds during infancy, and no relationship has been established between the amount of babbling during the first six months and the amount or quality of speech produced by a child at age two. Vocalization in the young infant often accompanies motor activity and usually occurs when the child appears excited by something he sees or hears. Environmental influences ordinarily do not begin to influence vocalization seriously before two months of age; in fact, during the first two months of postnatal life, the vocalizations of deaf children born to deaf parents are indistinguishable from those of infants born to hearing parents. Environmental effects on the variety and frequency of the infant's sounds become more evident after roughly eight weeks of age. The use of meaningful words differs from simple babbling in that speech primarily helps to obtain goals, rather than simply reflecting excitement.

Source: "Human Behaviour." Encyclopædia Britannica Ultimate Reference Suite, 2013.

We can either agree with this "two" assessment, or say that such a statement reveals more about the interpreter's orientation... particularly if we were to claim otherwise, since we have an appreciation of the different patterns that exist with respect to human physiology. Also, if we say that infants are conduits or echo chambers of the Universe's development, the recurrence of the cry as a vocalization may be a microscopic representation of a larger macroscopic event... in that the presumed Big Bang would have been quite loud, had we been within earshot of it... and the background "noise" of the Cosmos may be an echo, rendered as a metaphor, passing through the medium of the human body used as a primitive reception and translation station... just like the three-patterned "messages" of DNA and RNA, whose after-effects we call life.


Newborns typically sleep for about 16–18 hours a day, but the total amount of time spent sleeping gradually decreases to about 9–12 hours a day by age two years. At birth infants display a set of inherited reflexes involving such acts as sucking, blinking, grasping, and limb withdrawal. Infants' vision improves from 20/800 (in Snellen notation) among two-week-olds to 20/70 in five-month-olds to 20/20 at five years. Even newborns are sensitive to certain visual patterns, chiefly movement and light/dark contrasts, and show a noticeable preference for gazing at the human face; by the first or second month they can discriminate between different faces, and by the third they can identify their mother by sight. Young infants also show a predilection for the tones of their mother's voice, and they manifest a surprising sensitivity to the tones, rhythmic flow, and sounds that together make up human speech.

Crying is basic to infants from birth, and the cooing sounds they have begun making by about eight weeks progress to babbling and ultimately become part of meaningful speech. Virtually all infants begin to comprehend some words several months before they themselves speak their first meaningful words. By 11 to 12 months of age they are producing clear consonant-vowel utterances such as "mama" or "dada." The subsequent expansion of vocabulary and the acquisition of grammar and syntax mark the end of infancy and the beginning of child development.

Source: "Infancy." Encyclopædia Britannica Ultimate Reference Suite, 2013.

Human Evolution

Speech and symbolic intelligence

The origin and development of human culture—articulate spoken language and symbolically mediated ideas, beliefs, and behaviour—are among the greatest unsolved puzzles in the study of human evolution. Such questions cannot be resolved by skeletal or archaeological data. Research on the behaviour and cognitive capabilities of apes, monkeys, and other animals and on cognitive development in human children provides some clues, but extrapolating this information back through time is tenuous at best. Complicating the scenario further, it may be that today's chimpanzees, bonobos, and other anthropoid primates have more sophisticated cognitive capabilities and behavioural skills than those of some early hominins, because they and their ancestors have had several million years to overcome many challenges and perhaps have become more advanced in the process. Speech has been inferred by some investigators on the basis of certain internal skull features, for example, in H. habilis, but jaw shape and additional traits suggest otherwise. Still other researchers claim that human speech was not even fully developed in early members of anatomically modern Homo sapiens, because of the simplicity of their tool kits and art before the Late Paleolithic.

It is impossible to assess linguistic competency by observing the insides of reassembled fossil craniums that are incomplete, battered, and distorted—and in any case the brains probably did not fit snugly against the walls of the braincase. The apparent cerebral expansion in H. habilis and H. rudolfensis may imply a general increase in cognitive abilities, manipulative skill, or other factors besides speech. Particularly unreliable are claims that the specific internal cranial impression of a Broca cap is evidence of speech. Prominent Broca caps exist among some chimpanzees, yet no ape has uttered a word, despite laborious attempts to get them to speak.

A humanoid vocal tract is undetectable in fossils because it comprises only soft tissues and leaves no bony landmarks. Although versatile human speech is reasonably linked to a relatively spacious pharynx and mobile tongue, the absence of such features is not a compelling reason to deny some form of vocal language in ancestral hominins. It is argued that articulate human speech is impossible without a lowered voice box (larynx) and an expanded region above it. If this presumption were true, even Neanderthals would be inept vocally and probably also quite primitive cognitively as compared with Late Paleolithic Homo sapiens populations such as the Cro-Magnons. Gibbons and great apes do not speak, yet they have throat traits concomitant with speech, albeit to a lesser degree than humans'. The calls of gibbons are wonderfully varied in pitch and pattern, and, if such sounds were broken into discrete bits with consonants, they could emulate words. The same may be said for great apes. Orangutans, chimpanzees, and bonobos have sufficiently mobile lips and tongues; they simply lack neural circuitry for speech.

Conversely, if the theory that different abilities are governed by distinct and separate forms of intelligence (multiple intelligences) is correct, much of tool-using behaviour and artistic ability would have to be based upon neurological structures fundamentally different from those that support verbal ability. Human children begin to use language before they become sophisticated tool users. Similarly, a form of speech might have preceded forms of tool behaviour that are symbolically mediated. Visual arts such as painting and sculpture are expressions of spatial intelligence, which is centred principally in areas of the brain different from those related to speech. Therefore, one cannot expect the problem of language origins or language competence to be clarified by studying Paleolithic symbolism and imagery, despite the awesome array of cave art and polished bone, antler, ivory, stone, and shell artifacts associated with the period. Yet if the stunning proliferation and stylistic variability of tools, bodily ornaments, and artistic works during the Paleolithic do not point unequivocally to the specific use of speech, the presence of these symbolically mediated artifacts—among the earliest of which are shell beads found in Morocco and made about 82,000 years ago—does indicate that early humans were capable of complex conceptual and abstract thought.

Historically, all human groups manifest rich symbolically mediated language, religion, and social, political, and economic systems, even in the absence of elaborate material culture. The demands on the social intelligence of peoples who live in environments with relatively few artifacts are similar to demands placed upon those who depend upon complex technological gadgets and shelters for comfort. Consequently, prehistoric Homo sapiens cannot be regarded as cognitively less capable than ourselves, and it is impossible to state which hominin species were "fully human" as symbol users. As a case in point, meticulously documented language studies of captive bonobos and chimpanzees demonstrate that they have the capability to comprehend and use symbols in order to communicate with humans and with one another, but the use of this potential in the wild remains to be demonstrated. Perhaps the human capacity symbolically to represent feelings, situations, objects, and ideas developed before being commandeered by the several intelligences and before it became a boon to vocal communication.

Archaeological evidence indicates that, like at least some of their Pliocene predecessors, the most recent hominins were probably omnivorous, though how much meat was in their diets and whether they obtained it by scavenging, hunting, or both are poorly documented until about 200–100 kya. Stone tools and cut marks on bones at archaeological sites attest to a long history of meat eating in the tribe Hominini, but this practice could have existed long before stone tools were invented. Like chimpanzees, bonobos, baboons, capuchins, and other primates, early Pliocene hominins may have killed and fragmented vertebrate prey with only their hands and jaws instead of tools. The extent to which our ancestors' hunting, scavenging, or other activities were communal and coordinated via symbolic communication has not been determined.

There is no valid way to estimate group size and composition because there is little evidence of movement patterns, shelters, and graves until the Late Paleolithic. Archaeological traces of human-made shelters occur rarely from 60 kya, then become more common, particularly in regions with notable seasons of inclement weather. The first appearances and development of symbolically based spirituality are also highly elusive because they left no morphological or unarguable archaeological trace until the innovation of writing and ritual paraphernalia. Although some Neanderthals buried their dead, there is little evidence of mortuary ceremony in their graves. Graves of Homo sapiens from 40 kya sometimes contain grave goods.

Russel Howard Tuttle: Professor of Anthropology, of Evolutionary Biology, of the Biological and Social Sciences, University of Chicago, Illinois. Author of Apes of the World: Their Social Behavior, Communication, Mentality and Ecology and others.

Source: "Human Evolution." Encyclopædia Britannica Ultimate Reference Suite, 2013.

And if there was a mechanical vocalization difference between present-day humans and early ancestors such as the Cro-Magnons, or even those living just before pre-history, we might also want to consider that the vocalizations of their infant populations were different as well. Whereas there may have been a similarity in crying and cooing behavior, there may have been stark differences in the area of babbling. Hence, consonant-and-vowel pairs may never have been articulated... however crudely. Thus, if there were no orally induced exchanges of a dichotomous design (such as vowels/consonants), is the present peace/war articulation one which illustrates a pre-vocal pattern of contrasts? In other words, when we say "peace/war", are we actually referring to a pre-vocal influence that originated in the deep distant past, related to a non-vocal event and stimulus?

— End of page 44 —



Date of Origination: Tuesday, 28-March-2017... 05:49 AM
Date of initial posting: Tuesday, 11-April-2017... 1:57 PM
Updated posting: Saturday, 31-March-2018... 12:29 PM