Wednesday, October 24, 2007

Eight Tips for Managing Children's Exposure to Media Violence

  1. Set family ground rules about what is and is not appropriate to watch. Apply those rules to all media: TV, cartoons, videos, films, video games, magazines, and comics.
  2. Help your child choose media programs that fit the rules agreed on within the family. Seek out positive programs while cutting back on negative ones.
  3. Be careful when your child watches outside the home. Communicate your standards to neighbors, grandparents, babysitters, and anyone else who looks after your child, and ask for their cooperation.
  4. Set a good example yourself in choosing which media programs to watch.
  5. Watching together with your child from time to time is genuinely useful; help them make sense of what they are watching. Share with your children what they may and may not see or hear in mass-media programming.
  6. Where possible, use recorded video for programs your children will have to watch on their own.
  7. Support children's activities and pastimes other than watching TV, videos, or video games.
  8. Talk about the role and function of the media with other parents; share tips and challenge parents who are skeptical about media criticism.


Monday, October 22, 2007

Children And Media Violence

  • By the time a child turns eighteen, he or she will have witnessed, at average viewing levels, about 200,000 acts of violence on television, including 40,000 murders (Huston et al., 1992).
  • Children ages 8 to 18 spend more time (44.5 hours per week, roughly 6½ hours daily) in front of computer, television, and game screens than on any other activity in their lives except sleeping (Kaiser Family Foundation, 2005).
  • Since the 1950s, more than 1,000 studies have been done on the effects of violence in television and movies. The majority of these studies conclude that children who watch significant amounts of television and movie violence are more likely to exhibit aggressive behavior, attitudes, and values (Senate Committee on the Judiciary, 1999).
  • Media violence affects children's behavior, according to the American Medical Association, the American Academy of Pediatrics, the American Psychological Association, the American Academy of Family Physicians, and the American Academy of Child & Adolescent Psychiatry (Congressional Public Health Summit, 2000).
  • Children are affected at any age, but young children are most vulnerable to the effects of media violence (Bushman, 2001). Young children:
    • are more impressionable.
    • have a harder time distinguishing between fantasy and reality.
    • cannot easily discern motives for violence.
    • learn by observing and imitating.
  • Young children who see media violence have a greater chance of exhibiting violent and aggressive behavior later in life than children who have not seen violent media (Congressional Public Health Summit, 2000).
  • Violent video games can cause people to have more aggressive thoughts, feelings, and behaviors; and decrease empathetic, helpful behaviors with peers (Anderson, 2004; Gentile, 2003).
  • Children who watch more TV and play more video games are not only exposed to more media violence, but are also more likely to act aggressively toward peers and to assume the worst in their interactions with them (Buchanan et al., 2002).
  • Violence (homicide, suicide, and trauma) is a leading cause of death for children, adolescents and young adults, more prevalent than disease, cancer or congenital disorders (American Academy of Pediatrics, 2001).

What’s Happening

Six prominent medical groups (American Academy of Pediatrics, American Academy of Child & Adolescent Psychiatry, American Psychological Association, American Medical Association, American Academy of Family Physicians and the American Psychiatric Association) warn of these effects of media violence on children:

  • Children will show increased antisocial and aggressive behavior.
  • Children may become less sensitive to violence and those who suffer from violence.
  • Children may view the world as violent and mean, becoming more fearful of being a victim of violence.
  • Children will desire to see more violence in entertainment and real life.
  • Children will view violence as an acceptable way to settle conflicts.
    (Congressional Public Health Summit, 2000)

Many factors in the portrayal of media violence contribute to its effect on children and teens (Comstock, 1994; Huesmann, 2001):

  • What are the consequences of aggressive behavior? Is it rewarded or punished? Aggressive behavior on screen that lacks consequences, is portrayed as justified, or is rewarded will have a greater effect on children.
  • When the violence is committed by an attractive or charismatic hero, with whom the child identifies, the effect of that violence will be greater.
  • When the child's attention is focused on the violence on the screen, causing the child to be engaged or aroused, the impact is greater.
  • If the child sees the violence in the show as being realistic, reflecting real life, the impact will be greater.

Tuesday, October 09, 2007

Statement Rejecting the Practice of the Death Penalty

I hereby declare that, in my personal capacity, I:

1. firmly reject the practice of the death penalty in Indonesia

2. urge that the legislation governing death sentences be amended

3. call for the death penalty to be replaced with other, more humane forms of punishment

4. urge the government to establish and adapt maximum-security, remote prisons or correctional institutions for those sentenced to death

Ideology and Signs in Advertising

The main aim of persuasive advertisements is to manipulate the consumer/receiver into taking a certain action. The advertisement reflects the time and spatial setting of its origin. It also reflects the social relationships prevalent within that culture, and forms values that establish a range of ideological references.

The term ideology was first used in 1796 to refer to a science of ideas (Nöth 1991: 377). Three versions of the concept exist: the value-neutral concept, the pejorative sense and the universalistic sense. The first concept refers to ideology as any system of norms, values, beliefs or Weltanschauungen directing the social and political attitudes of a group. The second type refers to a system of false ideas, representing the false consciousness of a social class. The third concept identifies ideology with the sphere of ideas in general (Nöth 1991: 377-8). This view coincides with the view of psychologists that ideology is the way that attitudes are organised into a coherent pattern (Fiske 1982: 144).

Bakhtin (1990: 378) links ideology with semiotics when he writes:

Everything ideological […] is a sign; without signs, there is no ideology. The domain of ideology coincides with the domain of signs. Wherever a sign is present, ideology is present too. Everything ideological possesses semiotic value.

Advertisements become a vehicle for ideology by reflecting ideas, beliefs and opinions that are a reflection of the society within a specific culture. This idea is echoed by semioticians such as Fiske (1982: 145) who argue that ideology is determined by society, not by the individual’s set of attitudes. According to Maria Campos (1983: 978) "ideology attains material existence. It becomes concretised, it acts surreptitiously, it never presents itself as ‘being ideological’". The ideology is generated by the signs that represent the advertisement and its message.

In other words, the ideology-semiotic relationship is established when the ideology makes use of signs to convey its message; thus the ideology precedes the signs. "It becomes communicable as it is turned into a code" (Campos 1983: 978). This code then relates the ideology by portraying certain deeds, habits or institutions.

Advertisements make the consumer/receiver believe that they reflect reality, but in fact they only create a world which makes allusions to reality. The same can be said of ideology. Ideology becomes the category of illusions and false consciousness, according to Fiske (1982: 145). As Campos (1983: 978) says, "Ideology takes certain elements as a starting point but changes them at the moment of expression." The falseness of the ideology is used by the ruling class to maintain dominance over the working class. This can clearly be seen in the advertising campaigns used in African countries, of which South Africa is no exception.

Multinational companies propagate their products and the virtues and lifestyles of the capitalist American Dream by reflecting these and many other values and ideologies in their advertising. Examples include motor cars (Ford), clothing (Levi) and perfume (Calvin Klein). The underlying message is that all that is American is good and should be followed.

A Discursive-Semiotic Approach to Cultural Aspects in Persuasive Advertisements

INTRODUCTION

The combination of discourse analysis and semiotic analysis brings together two disciplines that have not traditionally been used by translators to deal with the transference of cultural aspects in translation.

Much confusion still prevails amongst theorists regarding the exact definitions of discourse and semiotics. Discourse is often seen as only referring to the spoken word. Stubbs (1983: 9) differentiates between text and discourse: "one talks of written text versus spoken discourse".

Semiotics is often confused with semiosis. In both cases a distinct definition will be presented for these terms to avoid any confusion or ambiguity. These two disciplines will be discussed separately, starting with discourse.

Definitions of Discourse

Various views on the term "discourse" will be compared, as well as various views on discourse analysis. Often theorists use text and discourse interchangeably; others define discourse as spoken words only, and text as written words. In both instances context is seen as a separate function. A distinction is made in this dissertation between these terms and their function, and a working definition of the different terms will be formulated for use and application.

Traditionally, discourse has been treated as "a continuous stretch of (especially spoken) language larger than a sentence…a discourse is a behavioural unit which has a pre-theoretical status in linguistics…" (Crystal 1991: 106). According to this definition discourse is primarily seen as spoken language (a language act: parole).

Discourse covers a vast field and definitions abound. This can be illustrated by the opinions of various theorists. Yule and Brown (1987: 1) state that "the analysis of discourse is, necessarily, the analysis of language in use. As such, it cannot be restricted to the descriptions of linguistic forms independent of the purposes or functions which these forms are designed to serve in human affairs".

It becomes clear that the production of discourse is a social act and therefore written discourse is the representation of this social act. This social act implies that communication takes place. This feature will be further discussed under the communicative function of discourse.

Some theorists distinguish between text and discourse as two separate terms and concepts, an opinion that will be refuted later. Salkie (1995: ix) states that "text or a discourse is a stretch of language that may be longer than one sentence. Text and discourse analysis is about how sentences combine to form texts by means of cohesiveness and coherence".

Widdowson (1983: 9) also distinguishes "textual cohesion, recognizable in surface lexis, grammar and propositional development, from discourse coherence which operates between underlying speech acts".

Newmark’s (1988: 54) definition is similar to Salkie’s definition, in that he states:

The analysis of texts beyond and ‘above’ the sentence – the attempt to find linguistic regularities in discourse…its main concepts are cohesion – the features that bind sentences to each other grammatically and lexically – and coherence – which is the notional and logical unity of a text.

Two important aspects (standards of textuality), coherence and cohesion, are mentioned in the above definitions. Coherence refers to those elements that make a text hang together, and refers to textual and contextual aspects of discourse. A coherent text is "a text whose constituent parts (episodes, sentences) are meaningfully related so that the text as a whole ‘makes sense’, even though there may be relatively few markers…" (Fairclough 1992: 83).

Cohesiveness or cohesion refers to "how clauses are linked together into sentences, and how sentences are in turn linked together to form larger units in texts" (Fairclough 1992: 77). This can be achieved by repetition, conjunctive words, near-synonyms or vocabulary from a common semantic field. Cohesion deals with the textual aspect of discourse.

The relevance of these two aspects is that they are important in text production, and thus in discourse analysis. Should one or both of these features be absent, the text would not be able to function as a meaningful whole. This in turn would have an impact on the context of the discourse, and thus have many ramifications for the translator of the text who would have to make sense of disjointed elements in the advertisement to be translated.

A shortcoming of Salkie’s definition is that it does not account for texts that are shorter than a sentence and consist of only one or two words, or for spoken language. In the case of advertisements, especially print advertisements, there is often little or no text and the emphasis falls on the visual material, supported by very little text. For instance, a print advertisement could consist of only visual material and one word, such as a brand name or an exclamation. This shows that discourse does not necessarily have to consist of lengthy sentences. The coherence would be brought about by the interaction between the word(s) and the visual material, but there would be very little or no cohesion due to the lack of text.

Newmark’s definition suggests that discourse is an all-inclusive term for the written and spoken language used in a social act. Stubbs (1983: 1) points out "that language and situation are inseparable". The situation forms the basis of the context. It follows thus that context and text are two inseparable aspects that work together to constitute the discourse. Thus a working definition for discourse as perceived in this dissertation can be formulated.

Working definition of discourse

Text refers to all linguistic aspects in written or spoken natural language, i.e. the words used to form the utterance or written text. It could be a word, a sentence, a paragraph, or a longer stretch of language, in other words any length of words used to create text. In semiotic terms language represents a sign system. In other words, language is a linguistic sign system creating meaning in a given context.

The information provided by the text must be related to the discourse as a whole, that is, to the text as a coherent collection of semantic relations, in other words "…the quality of perceived purpose, meaning and connection…" (Cook 1994: 25).

The text takes place within a given situation or context. Context consists of various factors, not all of which always appear at once in a given situation.

According to Cook (1992: 1) context includes:

substance - the physical material which carries or relays text;

music and pictures;

paralanguage - meaningful behaviour accompanying language, such as voice quality, gestures, facial expressions and touch (in speech) and choice of typeface and letter sizes (in writing);

situation - the properties and relations of objects and people in the vicinity of the text, as perceived by the participants;

co-text - text which precedes or follows that under analysis, and which participants judge to belong to the same discourse;

intertext - text which the participants perceive as belonging to other discourse, but which they associate with the text under consideration, and which affects their interpretation;

participants - they are described as senders, addressers, addressees and receivers; and

function - what the text is intended to do by the senders and addressers, or perceived to do by the receivers and addressees. (This element will be dealt with separately.)

For the purposes of this dissertation this definition of context suffices and can be used as such. In persuasive advertisements, usually more than one of these aspects works together to form the context in which text production takes place. In semiotic terms, the different aspects create or represent signs (context) that generate meaning to perform a persuasive function together with the linguistic signs (text).

Therefore, advertisement discourse is defined as text occurring within a specific context.

Discourse analysis

It is vital for the translator to keep in mind that text cannot exist without context and vice versa. The main assumption is that, in persuasive advertisements, the text (language) is subject and sensitive to the context. Context includes knowledge of elements existing outside the text (knowledge of the world) as well as how these elements contribute to create a certain frame of reference and/or a cultural identity.

The culture in which a certain advertisement is created forms part of the context. Schiffrin (1987: 4) confirms this view by saying that "… language always occur(s) in a context, but its patterns – of form and function, and at surface and underlying levels – are sensitive to features of that context". When translating a persuasive advertisement, the translator has to be sensitive to this because "language is potentially sensitive to all of the contexts in which it occurs, and, even more strongly, language reflects those contexts because it helps to constitute them" (Schiffrin 1987: 5).

Advertisements always rely on the relation between the text and its context; the one cannot survive without the other. The receiver senses this relationship and decodes the message accordingly. The context of the advertisement determines how the receivers will perceive the message. The context is embedded in a specific culture, whether it is a language-related culture or a sub-culture.

The task of discourse analysis is to identify the cultural aspects and determine their role in persuasive advertisements, with a view to transferring them in the translation process.

Knowledge of discourse analysis is important for the translator to:

  • identify the text and context;
  • isolate and describe the inherent elements in the text and context;
  • determine how these elements interact in the discourse;
  • identify cultural aspects; and
  • determine how the above-mentioned points function in the communication process.

ADVERTISEMENTS AS DISCOURSE

Advertisement discourse challenges the translator more than any other discourse because of its very nature and the multitude of elements that constitute its existence. Cook (1992: 4) states that there are hundreds of discourse types "which merge into each other and defy exact definition". This is particularly relevant to the nature of advertisements: an advertisement could be several types at once. For instance, a persuasive advertisement could display characteristics of a joke, a song and a cartoon at the same time. In an attempt to deal with the translation of cultural aspects in advertisements, the characteristics and the function of this communicative event have to be discussed. However, it would not be reasonable or justified to formulate one definitive meaning of what constitutes advertisements: the definition would be limiting - a contradiction in terms.

Characteristics

The various characteristics of advertising identified by Cook (1992: 214) apply to the broad spectrum of advertisements, in whatever form. They cover the most important features inherent in all forms of advertising. The translator can use these guidelines to determine whether a discourse is an advertisement if it displays one or more of these characteristics.

The features below are prototypical of advertisements rather than definitive. (They have been arranged in order of importance as viewed by this study. The characteristics from number 26 onwards are the author’s additions.)

  1. They have the typical restless instability of a new discourse type.
  2. They seek to alter addressees’ behaviour. (Persuasive advertisements are prime examples.)
  3. They change constantly. (Advertisements for a specific product change intermittently.)
  4. They are a discourse on the periphery of attention. (Advertisements are not regarded as being "serious".)
  5. They are unsolicited by their receivers. (Advertisements appear in the media, e.g. on television.)
  6. They are parasitic: appropriating and existing through the voices of other discourse types. (In magazines, newspapers and on television and radio.)
  7. They merge the features of public and private discourse, and the voices of intimacy and authority, exploiting the features common to both. (Private conversation and public addresses can be used.)
  8. They use various substances for discourse (e.g. a perfume strip in a magazine).
  9. They are embedded in an accompanying discourse (e.g. in a newspaper).
  10. They provoke social, moral and aesthetic judgements, either positive or negative.
  11. They are often heard in many contradictory ways simultaneously.
  12. Advertisements provoke controversy (e.g. Benetton advertisements).
  13. They are multi-modal and can use pictures, music and language, either singly or in combination (e.g. television commercial).
  14. They are multi-submodal in their use of language and can use writing, speech and song (e.g. radio advertisement).
  15. They contain and foreground extensive and innovative use of paralanguage (e.g. body language in a television commercial).
  16. They foreground connotational, indeterminate and metaphorical meaning, thus creating fusion between disparate spheres (e.g. Mercedes-Benz with luxury).
  17. They make dense use of intra-modal and inter-modal parallelisms.
  18. They use a heteroglossic narrative.
  19. They make extensive use of intra- and inter-discoursal allusion.
  20. They are presented in short bursts (e.g. television commercial of 30 seconds).
  21. They follow a principle of reversal, causing them to change many features, as soon as they become established, to the opposite.
  22. They are identified by their position in an accompanying discourse.
  23. They use their space and time to give pleasure.
  24. They use code-play.
  25. They answer a need for display and repetitive language.
  26. Advertisements, as verbal art, are detrimentally constrained by the need to fulfil the wishes of their clients.
  27. They infiltrate new technology and media (e.g. on the Internet).
  28. They (unnecessarily) create need.
  29. They sell a lifestyle. (Fun-loving people smoke Peter Stuyvesant cigarettes and visit exotic holiday destinations.)
  30. They are a form of mass communication.

A persuasive advertisement could contain one, all or a combination of these characteristics. There are no set rules which determine that persuasive advertisements that use a specific medium should display certain characteristics. Advertisements are in a constant state of flux. Although the message of two products might be the same, different mediums could change the characteristics of the two advertisements.

Function

In the broadest sense advertisements either persuade or inform receivers in terms of their functionality. The main function of a persuasive advertisement is to persuade the receiver to take a specific action, in other words the receiver is directly manipulated to change or modify his/her (consumer) behaviour. Elements of information can also be present. The intended function can only take place if the discourse fulfils its communicative role.

Friday, August 10, 2007

Statement of Position: Rejecting and Condemning Book Burning!!

I, in my personal capacity, declare that I:


1. reject the process and the way of thinking that lead to the burning of books

2. strongly condemn the negation of intellectual life carried out by burning books

3. encourage empowerment and rational scholarly discussion at the public level

4. work to expose the mask of deliberate stupefaction that acts of book burning represent

Tuesday, July 31, 2007

Propaganda Techniques

Assertion:

Assertion is commonly used in advertising and modern propaganda. An assertion is an enthusiastic or energetic statement presented as a fact, although it is not necessarily true. Assertions often imply that the statement requires no explanation or backing, but that it should merely be accepted without question. Examples of assertion, although somewhat scarce in wartime propaganda, can be found often in modern advertising propaganda. Any time an advertiser states that their product is the best without providing evidence for this, they are using an assertion. The subject, ideally, should simply agree to the statement without searching for additional information or reasoning. Assertions, although usually simple to spot, are often dangerous forms of propaganda because they often include falsehoods or lies.

Bandwagon:

Bandwagon is one of the most common techniques in both wartime and peacetime and plays an important part in modern advertising. Bandwagon is also one of the seven main propaganda techniques identified by the Institute for Propaganda Analysis in 1938. Bandwagon is an appeal to the subject to follow the crowd, to join in because others are doing so as well. Bandwagon propaganda is, essentially, trying to convince the subject that one side is the winning side, because more people have joined it. The subject is meant to believe that since so many people have joined, victory is inevitable and defeat impossible. Since the average person always wants to be on the winning side, he or she is compelled to join in. However, in modern propaganda, bandwagon has taken a new twist. The subject is to be convinced by the propaganda that since everyone else is doing it, they will be left out if they do not. This is, effectively, the opposite of the other type of bandwagon, but usually provokes the same results. Subjects of bandwagon are compelled to join in because everyone else is doing so as well. When confronted with bandwagon propaganda, we should weigh the pros and cons of joining in independently of the number of people who have already joined, and, as with most types of propaganda, we should seek more information.

Card stacking:

Card stacking, or selective omission, is one of the seven techniques identified by the IPA, or Institute for Propaganda Analysis. It involves only presenting information that is positive to an idea or proposal and omitting information contrary to it. Card stacking is used in almost all forms of propaganda, and is extremely effective in convincing the public. Although the majority of information presented by the card stacking approach is true, it is dangerous because it omits important information. The best way to deal with card stacking is to get more information.

Glittering Generalities:

Glittering generalities was one of the seven main propaganda techniques identified by the Institute for Propaganda Analysis in 1938. It also occurs very often in politics and political propaganda. Glittering generalities are words that have different positive meanings for individual subjects, but are linked to highly valued concepts. When these words are used, they demand approval without thinking, simply because such an important concept is involved. For example, when a person is asked to do something in "defense of democracy" they are more likely to agree. The concept of democracy has a positive connotation to them because it is linked to a concept that they value. Words often used as glittering generalities are honor, glory, love of country, and, especially in the United States, freedom. When coming across glittering generalities, we should especially consider the merits of the idea itself when separated from the specific words.

Lesser of Two Evils:

The "lesser of two evils" technique tries to convince us of an idea or proposal by presenting it as the least offensive option. This technique is often implemented during wartime to convince people of the need for sacrifices or to justify difficult decisions. This technique is often accompanied by adding blame on an enemy country or political group. One idea or proposal is often depicted as one of the only options or paths. When confronted with this technique, the subject should consider the value of any proposal independently of those it is being compared with.

Name Calling:

Name calling occurs often in politics and wartime scenarios, but very seldom in advertising. It is another of the seven main techniques designated by the Institute for Propaganda Analysis. It is the use of derogatory language or words that carry a negative connotation when describing an enemy. The propaganda attempts to arouse prejudice among the public by labeling the target something that the public dislikes. Often, name calling is employed using sarcasm and ridicule, and shows up often in political cartoons or writings. When examining name calling propaganda, we should attempt to separate our feelings about the name and our feelings about the actual idea or proposal.

Pinpointing the Enemy:

Pinpointing the enemy is used extremely often during wartime, and also in political campaigns and debates. This is an attempt to simplify a complex situation by presenting one specific group or person as the enemy. Although there may be other factors involved, the subject is urged to simply view the situation in terms of clear-cut right and wrong. When coming in contact with this technique, the subject should attempt to consider all other factors tied into the situation. As with almost all propaganda techniques, the subject should attempt to find more information on the topic. An informed person is much less susceptible to this sort of propaganda.

Plain Folks:

The plain folks propaganda technique was another of the seven main techniques identified by the IPA, or Institute for Propaganda Analysis. The plain folks device is an attempt by the propagandist to convince the public that his views reflect those of the common person and that he is also working for the benefit of the common person. The propagandist will often attempt to use the accent of a specific audience as well as using specific idioms or jokes. Also, the propagandist, especially during speeches, may attempt to increase the illusion through imperfect pronunciation, stuttering, and a more limited vocabulary. Errors such as these help add to the impression of sincerity and spontaneity. This technique is usually most effective when used with glittering generalities, in an attempt to convince the public that the propagandist's views about highly valued ideas are similar to their own and therefore more valid. When confronted by this type of propaganda, the subject should consider the proposals and ideas separately from the personality of the presenter.

Simplification (Stereotyping):

Simplification is extremely similar to pinpointing the enemy, in that it often reduces a complex situation to a clear-cut choice involving good and evil. This technique is often useful in swaying uneducated audiences. When faced with simplification, it is often useful to examine other factors and pieces of the proposal or idea, and, as with all other forms of propaganda, it is essential to get more information.

Testimonials:

Testimonials are another of the seven main forms of propaganda identified by the Institute for Propaganda Analysis. Testimonials are quotations or endorsements, in or out of context, which attempt to connect a famous or respectable person with a product or item. Testimonials are very closely connected to the transfer technique, in that an attempt is made to connect an agreeable person to another item. Testimonials are often used in advertising and political campaigns. When coming across testimonials, the subject should consider the merits of the item or proposal independently of the person or organization giving the testimonial.

Transfer:

Transfer is another of the seven main propaganda terms first used by the Institute for Propaganda Analysis in 1938. Transfer is often used in politics and during wartime. It is an attempt to make the subject view a certain item in the same way as they view another item, to link the two in the subject's mind. Although this technique is often used to transfer negative feelings for one object to another, it can also be used in positive ways. By linking an item to something the subject respects or enjoys, positive feelings can be generated for it. However, in politics, transfer is most often used to transfer blame or bad feelings from one politician to another of his friends or party members, or even to the party itself. When confronted with propaganda using the transfer technique, we should question the merits or problems of the proposal or idea independently of our convictions about other objects or proposals.

Bibliography

The Science of Modern Propaganda. http://www.propaganda101.com/ Last Visited: August, 2001.

Lee, Alfred McClung; Lee, Elizabeth Briant. Propaganda Analysis. http://carmen.artsci.washington.edu/ (subdirectory). Last Visited: August, 2001.

Dorje, Carl. Propaganda Techniques. http://serendipity.magnet.ch/more/propagan.html Last Visited: August, 2001.

Propaganda

From Wikipedia, the free encyclopedia

Explanation of Propaganda

An appeal to one's emotions is perhaps the most obvious propaganda method, but there are various other, more subtle and insidious forms. A common characteristic of propaganda is volume (in the sense of a large amount). Individually, propaganda functions as self-deception. Culturally, it works within religions, politics, and economic entities, such as those that both favour and oppose globalization. Commercially, it works within the (mass) market in free-market societies.

Propaganda shares techniques with advertising and public relations. In fact, advertising and public relations can be thought of as propaganda that promotes a commercial product or shapes the perception of an organization, person or brand. A number of techniques which are based on research are used to generate propaganda. Many of these same techniques can be found under logical fallacies, since propagandists use arguments that, while sometimes convincing, are not necessarily valid. A few examples are: Flag-waving, Glittering generalities, Intentional vagueness, Oversimplification, Rationalization, Red herring, Slogans, Stereotyping, Testimonial, Unstated assumption, and bandwagon.

In order to solidify the meaning of propaganda as distinct from certain types of advertising and public relations, it is helpful to mention Sheryl Tuttle Ross’s Epistemic Merit Model of propaganda. For example, a political campaign ad that in some way alludes to local racial politics visually, without making any overt statements about those politics, would still be considered epistemically defective and therefore might count as propaganda, even though nothing false or inflammatory was said. Since propaganda can sometimes be subtle and slippery, using the Epistemic Merit Model can aid in analysis and in forming a personal opinion.

In the East, the term propaganda now overlaps with distinct terms like indoctrination (ideological views established by repetition rather than verification) and mass suggestion (broader strategic methods). In practice, the terms are often used synonymously. Historically, the most common use of the term propaganda started to be in the religious context of the Catholic Church and evolved to be more common in political contexts, in particular to refer to certain efforts sponsored by governments, political groups, but also often covert interests. In the early 20th century the term propaganda was also used by the founders of the nascent public relations industry to describe their activities; this usage died out around the time of World War II, as the industry started to avoid the word, given the pejorative connotation it had acquired.

Literally translated from the Latin gerundive as "things which must be disseminated," in some cultures the term is neutral or even positive, while in others it has acquired a strong negative connotation. Its connotations can also vary over time. For example, in Portuguese and in some Spanish-speaking countries, particularly in the Southern Cone, the word "propaganda" usually means the most common manipulation of information, namely "advertising". In English, "propaganda" was originally a neutral term used to describe the dissemination of information in favor of any given cause. During the 20th century, however, the term acquired a thoroughly negative meaning in western countries, equating it with the intentional dissemination of false, but perhaps "compelling", claims supporting or justifying nefarious political ideologies. This redefinition arose because both the Soviet Union and Germany's government under Hitler admitted explicitly to using propaganda favoring, respectively, communism and fascism, in all forms of public expression. As these ideologies were antipathetic to English-language and other western societies, the negative feelings toward them came to be projected onto the word "propaganda" itself. Nowadays nobody admits to doing propaganda; instead, everybody accuses the opponent of using propaganda, whenever there is an opponent in question.

Whether on the left, the right, or in the mainstream, propaganda knows no borders, as Roderick Hindery details. Hindery further argues that debates about most social issues can be productively revisited in the context of asking "what is or is not propaganda?" Not to be overlooked is the link between propaganda, indoctrination, and terrorism/counterterrorism. Mere threats to destroy are often as socially disruptive as physical devastation itself.

U.S. propaganda poster, which warns against civilians sharing information on troop movements (National Archives)

Etymology

Brochure of the Peoples Temple, portraying cult leader Jim Jones as the loving father of the "Rainbow Family."

Swedish Anti-Euro propaganda for the referendum of 2003.

Soviet Propaganda Poster during the Great Patriotic War. The text reads "Red Army Soldier - SAVE US!"

"Propaganda" is a form of the classical Latin verb "propagare," which means "to propagate, to extend, to spread." In 1622, shortly after the start of the Thirty Years' War, Pope Gregory XV founded the Congregatio de Propaganda Fide ("Congregation for Spreading the Faith"), a committee of Cardinals with the duty of overseeing the propagation of Christianity by missionaries sent to non-Catholic countries. Therefore, the term itself originates with this Roman Catholic Sacred Congregation for the Propagation of the Faith (sacra congregatio christiano nomini propagando or, briefly, propaganda fide), the department of the pontifical administration charged with the spread of Catholicism and with the regulation of ecclesiastical affairs in non-Catholic countries (mission territory). The actual Latin stem propagand- conveys a sense of "that which ought to be spread". Originally the term was not intended to refer to misleading information. The modern sense dates from World War I, when it evolved to the field of politics, and was not originally pejorative.

Purpose of propaganda

The aim of propaganda is to influence people's opinions or behaviors actively, rather than merely to communicate the facts about something. For example, propaganda might be used to garner either support or disapproval of a certain position, rather than simply to present the position, or to try to convince people to buy something, rather than simply to let them know it is on the market. What separates propaganda from "normal" communication is the way in which the message attempts to shape opinion or behavior, which is often subtle and insidious among other characteristics. For example, propaganda is often presented in a way that attempts to deliberately evoke a strong emotion, especially by suggesting illogical (or non-intuitive) relationships between concepts or objects (for instance between a "good" car and an attractive woman or a sex symbol). An appeal to one's emotions is perhaps the most obvious and most common propaganda method, but there are other, more subtle and insidious forms.

For instance, propaganda may be transmitted indirectly or implicitly, through an ostensibly fair and balanced debate or argument. This can be done to great effect in conjunction with a broadly targeted broadcast news format. In such a setting, techniques like the "red herring" and other ploys (such as ignoratio elenchi) are often used to divert the audience from a critical issue, while the intended message is suggested through indirect means. This sophisticated type of diversion uses the appearance of lively debate within what is actually a carefully focused spectrum to generate and justify deliberately conceived assumptions. It avoids the distinctively biased appearance of one-sided rhetoric and works by presenting a contrived premise for an argument as if it were a universally accepted and obvious truth, so that the audience naturally assumes it to be correct. By maintaining the range of debate in such a way that it appears inclusive of differing points of view, so as to suggest fairness and balance, the suppositions suggested become accepted as fact.

Here is an example of a hypothetical situation in which opposing viewpoints are supposedly represented: the hawk (see: hawkish) says, "We must stay the course," and the dove says, "The war is a disaster and a failure," to which the hawk responds, "In war things seldom go smoothly and we must not let setbacks affect our determination," and the dove retorts, "Setbacks are setbacks, but failures are failures." The actual validity of the war is never discussed and never in contention. One may naturally assume that the war was not fundamentally wrong, but merely the result of miscalculation, and therefore an error rather than a crime. Thus, by maintaining the appearance of equitable discourse in such debates, and through continuous inculcation, such focused arguments succeed in compelling the audience to deduce that the presuppositions of the debate are unequivocal truisms of the given subject.

The method of propaganda is essential to the word's meaning as well. A message does not have to be untrue to qualify as propaganda.

In fact, the message in modern propaganda is often not blatantly untrue. But even if the message conveys only "true" information, it will generally contain partisan bias and fail to present a complete and balanced consideration of the issue. Another common characteristic of propaganda is volume (in the sense of a large amount). For example, a propagandist may seek to influence opinion by attempting to get a message heard in as many places as possible, and as often as possible. The intention of this approach is to a) reinforce an idea through repetition, and b) exclude or "drown out" any alternative ideas.

U.S. Propaganda from WWII, urging citizens to increase production. The heads that appear are those of Adolf Hitler and Hideki Tojo

In English, the word "propaganda" now carries strong negative (as well as political) connotations, although it has not always done so. It was formerly common for political organizations, as it had started to be for the advertising and public relations industry, to refer to their own material as propaganda. Because of the negative connotations the word has gained, nowadays nobody admits to doing propaganda; instead, everybody accuses the opponent of doing propaganda, whenever there is an opponent in question. Other languages, however, do not necessarily regard the term as derogatory, and hence its usage may lead to misunderstanding in communications with non-native English speakers. For example, in Portuguese and in some Spanish-speaking countries, particularly in the Southern Cone, the word "propaganda" usually means "advertising" (the most common manipulation of information).

Famed public relations pioneer Edward L. Bernays in his classic studies eloquently describes propaganda as the purpose of communications. In Crystallizing Public Opinion, for example, he dismisses the semantic differentiations (“Education is valuable, commendable, enlightening, instructive. Propaganda is insidious, dishonest, underhanded, misleading.”) and instead concentrates on purposes. He writes (p. 212), “Each of these nouns carries with it social and moral implications. . . . The only difference between ‘propaganda’ and ‘education,’ really, is in the point of view. The advocacy of what we believe in is education. The advocacy of what we don’t believe in is propaganda.”

The reason propaganda exists and is so widespread is because it serves various social purposes, necessary ones, often popular yet potentially corrupting. Many institutions such as media, private corporations and government itself are literally propaganda-addicts, co-dependent on each other and the fueling influence of the propaganda system that they help create and maintain. Propagandists have an advantage through knowing what they want to promote and to whom, and although they often resort to various two-way forms of communication this is done to make sure their one-sided purposes are achieved.

Types of propaganda

Propaganda shares techniques with advertising and public relations. In fact, advertising and public relations can be thought of as propaganda that promotes a commercial product or shapes the perception of an organization, person or brand, though in post-WWII usage the word "propaganda" more typically refers to political or nationalist uses of these techniques or to the promotion of a set of ideas, since the term had gained a pejorative meaning which commercial and government entities could not accept. The same avoidance eventually appeared in politics itself, with ‘political marketing’ and other designations substituted for ‘political propaganda’.

Propaganda also has much in common with public information campaigns by governments, which are intended to encourage or discourage certain forms of behavior (such as wearing seat belts, not smoking, not littering and so forth). Again, the emphasis is more political in propaganda. Propaganda can take the form of leaflets, posters, TV and radio broadcasts and can also extend to any other medium.

In the case of the United States, there is also an important legal (imposed by law) distinction between advertising (a type of overt propaganda) and what the Government Accountability Office (GAO), an arm of the United States Congress, refers to as "covert propaganda."

Journalistic theory generally holds that news items should be objective, giving the reader an accurate background and analysis of the subject at hand. Advertising, on the other hand, has evolved from traditional commercial advertisements to include a new type in the form of paid articles or broadcasts disguised as news. These generally present an issue in a very subjective and often misleading light, primarily meant to persuade rather than inform. Normally they use only subtle propaganda techniques and not the more obvious ones used in traditional commercial advertisements. If the reader believes that a paid advertisement is in fact a news item, the message the advertiser is trying to communicate will be more easily "believed" or "internalized." Such advertisements are considered obvious examples of "covert" propaganda because they take on the appearance of objective information rather than the appearance of propaganda, which is misleading. Federal law specifically mandates that any advertisement appearing in the format of a news item must state that the item is in fact a paid advertisement. The Bush Administration has come under fire for allegedly producing and disseminating covert propaganda in the form of television programs, aired in the United States, which appeared to be legitimate news broadcasts and did not include any information signifying that the programs were not generated by a private-sector news source.[1]

A series of American propaganda posters during World War II appealed to servicemen's patriotism to protect themselves from venereal disease. The text at the bottom of the poster reads, "You can't beat the Axis if you get VD".

Propaganda, in a narrower use of the term, connotes deliberately false or misleading information that supports or furthers a political (but not only political) cause or the interests of those with power. The propagandist seeks to change the way people understand an issue or situation for the purpose of changing their actions and expectations in ways that are desirable to the interest group. Propaganda, in this sense, serves as a corollary to censorship, in which the same purpose is achieved, not by filling people's minds with approved information, but by preventing people from being confronted with opposing points of view. What sets propaganda apart from other forms of advocacy is the willingness of the propagandist to change people's understanding through deception and confusion rather than persuasion and understanding. The leaders of an organization know the information to be one-sided or untrue, but this may not be true for the rank-and-file members who help to disseminate the propaganda.

More in line with the religious roots of the term, it is also used widely in the debates about new religious movements (NRMs), both by people who defend them and by people who oppose them. The latter pejoratively call these NRMs cults. Anti-cult activists and countercult activists accuse the leaders of what they consider cults of using propaganda extensively to recruit followers and keep them. Some social scientists, such as the late Jeffrey Hadden, and CESNUR affiliated scholars accuse ex-members of "cults" who became vocal critics and the anti-cult movement of making these unusual religious movements look bad without sufficient reasons.[2][3]

Propaganda is a mighty weapon in war. In this case its aim is usually to dehumanize and create hatred toward a supposed enemy, either internal or external. The technique is to create a false image in the mind. This can be done by using special words, special avoidance of words or by saying that the enemy is responsible for certain things he never did. Most propaganda wars require the home population to feel the enemy has inflicted an injustice, which may be fictitious or may be based on facts. The home population must also decide that the cause of their nation is just.

Propaganda is also one of the methods used in psychological warfare, which may also involve false flag operations.

The term propaganda may also refer to false information meant to reinforce the mindsets of people who already believe as the propagandist wishes. The assumption is that, if people believe something false, they will constantly be assailed by doubts. Since these doubts are unpleasant (see cognitive dissonance), people will be eager to have them extinguished, and are therefore receptive to the reassurances of those in power. For this reason propaganda is often addressed to people who are already sympathetic to the agenda. This process of reinforcement uses an individual's predisposition to self-select "agreeable" information sources as a mechanism for maintaining control.

US Office of War Information propaganda message: working less helps our enemies

Propaganda can be classified according to the source and nature of the message. White propaganda generally comes from an openly identified source, and is characterized by gentler methods of persuasion, such as standard public relations techniques and one-sided presentation of an argument. Black propaganda is identified as being from one source, but is in fact from another. This is most commonly to disguise the true origins of the propaganda, be it from an enemy country or from an organization with a negative public image. Grey propaganda is propaganda without any identifiable source or author. In scale, these different types of propaganda can also be defined by the potential of true and correct information to compete with the propaganda. For example, opposition to white propaganda is often readily found and may slightly discredit the propaganda source. Opposition to grey propaganda, when revealed (often by an inside source), may create some level of public outcry. Opposition to black propaganda is often unavailable and may be dangerous to reveal, because public cognizance of black propaganda tactics and sources would undermine or backfire the very campaign the black propagandist supported.

Propaganda may be administered in very insidious ways. For instance, disparaging disinformation about the history of certain groups or foreign countries may be encouraged or tolerated in the educational system. Since few people actually double-check what they learn at school, such disinformation will be repeated by journalists as well as parents, thus reinforcing the idea that the disinformation item is really a "well-known fact," even though no one repeating the myth is able to point to an authoritative source. The disinformation is then recycled in the media and in the educational system, without the need for direct governmental intervention on the media.

Such permeating propaganda may be used for political goals: by giving citizens a false impression of the quality or policies of their country, they may be incited to reject certain proposals or certain remarks or ignore the experience of others.

See also: black propaganda, marketing, advertising

Techniques of propaganda transmission

United States Army 312th PSYOP Company passes out leaflets and broadcasts messages in Al Kut, Iraq on May 2, 2003.

Common media for transmitting propaganda messages include news reports, government reports, historical revision, junk science, books, leaflets, movies, radio, television, and posters. In the case of radio and television, propaganda can exist on news, current-affairs or talk-show segments, as advertising or public-service announcement "spots" or as long-running advertorials. The magazine Tricontinental, issued by the Cuban OSPAAAL organization, folds propaganda posters and places one in each copy, allowing a very broad distribution of pro-Fidel Castro propaganda.

Ideally a propaganda campaign will follow a strategic transmission pattern to fully indoctrinate a group. This may begin with a simple transmission such as a leaflet dropped from a plane or an advertisement. Generally these messages will contain directions on how to obtain more information, via a web site, hotline, radio program, et cetera (as is also seen for selling purposes, among other goals). The strategy intends to move the individual from information recipient to information seeker through reinforcement, and then from information seeker to opinion leader through indoctrination. A successful propaganda campaign includes this cyclical meme-reproducing process.

Techniques of propaganda generation

A number of techniques which are based on social psychological research are used to generate propaganda. Many of these same techniques can be found under logical fallacies, since propagandists use arguments that, while sometimes convincing, are not necessarily valid.

An Italian poster from World War II using the image of Jesus to elicit support for the fascist cause from the largely Catholic population. The portrayal of an African American US Army soldier desecrating a church fosters racist sentiment.

Some time has been spent analyzing the means by which propaganda messages are transmitted. That work is important but it is clear that information dissemination strategies only become propaganda strategies when coupled with propagandistic messages. Identifying these messages is a necessary prerequisite to study the methods by which those messages are spread. That is why it is essential to have some knowledge of the following techniques for generating propaganda:

* Ad Hominem: A Latin phrase which has come to mean attacking one's opponent rather than the opponent's argument.

* Appeal to authority: Appeals to authority cite prominent figures to support a position, idea, argument, or course of action.

* Appeal to fear: Appeals to fear seek to build support by instilling fear in the general population; for example, Joseph Goebbels exploited Theodore Kaufman's Germany Must Perish! to claim that the Allies sought the extermination of the German people.

* Appeal to Prejudice: Using loaded or emotive terms to attach value or moral goodness to believing the proposition. "A reasonable person would agree that our income tax is too low."

* Argumentum ad nauseam: Uses tireless repetition. An idea, once repeated enough times, is taken as the truth. This works best when media sources are limited and controlled by the propagator.

* Bandwagon: Bandwagon and inevitable-victory appeals attempt to persuade the target audience to take the course of action that "everyone else is taking."

* Inevitable victory: invites those not already on the bandwagon to join those already on the road to certain victory. Those already or at least partially on the bandwagon are reassured that staying aboard is their best course of action.

* Join the crowd: This technique reinforces people's natural desire to be on the winning side. This technique is used to convince the audience that a program is an expression of an irresistible mass movement and that it is in their best interest to join.

* Black-and-White fallacy: Presenting only two choices, with the product or idea being propagated as the better choice (e.g., "You are either with us or with the evil enemy").

* Common man: The "'plain folks'" or "common man" approach attempts to convince the audience that the propagandist's positions reflect the common sense of the people. It is designed to win the confidence of the audience by communicating in the common manner and style of the target audience. Propagandists use ordinary language and mannerisms (and clothe their message in face-to-face and audiovisual communications) in attempting to identify their point of view with that of the average person.

* Demonizing the “enemy”: Projecting a person or idea as the "enemy" through suggestion or false accusations.

World War I poster by Winsor McCay, urging Americans to buy Liberty Bonds

* Direct order: This technique hopes to simplify the decision making process. The propagandist uses images and words to tell the audience exactly what actions to take, eliminating any other possible choices. Authority figures can be used to give the order, overlapping it with the Appeal to authority technique, but not necessarily. The Uncle Sam "I want you" image is an example of this technique.

* Euphoria: The use of an event that generates euphoria or happiness, or the staging of a celebratory event such as a holiday or parade, to boost morale, cover up bad news, or take people's minds off a worse feeling.

* Falsifying information: The creation or deletion of information from public records for the purpose of making a false record of an event or of a person's actions, whether during a court session, a battle, or elsewhere. Pseudoscience is often used in this way.

* Flag-waving: An attempt to justify an action on the grounds that doing so will make one more patriotic, or in some way benefit a group, country, or idea. The feeling of patriotism which this technique attempts to inspire may diminish or entirely displace one's capacity for rational examination of the matter in question.

* Glittering generalities: Glittering generalities are emotionally appealing words applied to a product or idea, but which present no concrete argument or analysis. A famous example is the campaign slogan "Ford has a better idea!"

* Intentional vagueness: Generalities are deliberately vague so that the audience may supply its own interpretations. The intention is to move the audience by the use of undefined phrases rather than an explicit idea; in trying to "figure out" the propaganda, the audience foregoes judging the validity, reasonableness, and applicability of the ideas presented.

Saddam Hussein pictured as a decisive war leader in an Iraqi propaganda picture.

* Obtain disapproval or Reductio ad Hitlerum: This technique is used to persuade a target audience to disapprove of an action or idea by suggesting that the idea is popular with groups hated, feared, or held in contempt by the target audience. Thus if a group which supports a certain policy is led to believe that undesirable, subversive, or contemptible people support the same policy, then the members of the group may decide to change their original position.

* Oversimplification: Favorable generalities are used to provide simple answers to complex social, political, economic, or military problems.

* Quotes out of Context: Selective editing of quotes which can change meanings. Political "documentaries" often make use of this technique.

* Rationalization: Individuals or groups may use favorable generalities to rationalize questionable acts or beliefs. Vague and pleasant phrases are often used to justify such actions or beliefs.

* Red herring: Presenting data that is irrelevant, then claiming that it validates your argument.

* Scapegoating: Assigning blame to an individual or group that isn't really responsible, thus alleviating feelings of guilt from responsible parties and/or distracting attention from the need to fix the problem for which blame is being assigned.

* Slogans: A slogan is a brief, striking phrase that may include labeling and stereotyping. Although slogans may be enlisted to support reasoned ideas, in practice they tend to act only as emotional appeals. For example, "blood for oil" and "cut and run" are slogans used by those who view the USA's current situation in Iraq with disfavor. Similarly, the names of military campaigns, such as "enduring freedom" or "just cause", may also be regarded as slogans, devised to prevent free thought on the issues.

* Stereotyping or Name Calling or Labeling: This technique attempts to arouse prejudices in an audience by labeling the object of the propaganda campaign as something the target audience fears, hates, loathes, or finds undesirable. For instance, reporting on a foreign country or social group may focus on the stereotypical traits that the reader expects, even though they are far from being representative of the whole country or group; such reporting often focuses on the anecdotal.

* Testimonial: Testimonials are quotations, in or out of context, especially cited to support or reject a given policy, action, program, or personality. The reputation or the role (expert, respected public figure, etc.) of the individual giving the statement is exploited. The testimonial places the official sanction of a respected person or authority on a propaganda message. This is done in an effort to cause the target audience to identify itself with the authority or to accept the authority's opinions and beliefs as its own. See also: damaging quotation.

Soldier loads a "leaflet bomb" during the Korean war.

* Transfer: Also known as Association, this is a technique of projecting positive or negative qualities (praise or blame) of a person, entity, object, or value (an individual, group, organization, nation, patriotism, etc.) to another to make the second more acceptable or to discredit it. It evokes an emotional response, which stimulates the target to identify with recognized authorities. Often highly visual, this technique often utilizes symbols (for example, the Swastika used in Nazi Germany, originally a symbol for health and prosperity) superimposed over other visual images. An example of common use of this technique in America is for the President to be filmed or photographed in front of the American flag.

* Unstated assumption: This technique is used when the propaganda concept that the propagandist intends to transmit would seem less credible if explicitly stated. The concept is instead repeatedly assumed or implied.

* Virtue words: These are words in the value system of the target audience which tend to produce a positive image when attached to a person or issue. Peace, happiness, security, wise leadership, freedom, etc. are virtue words. See "Transfer".

See also: doublespeak, meme, cult of personality, spin, demonization, factoid

The propaganda model

The propaganda model is a theory advanced by Edward S. Herman and Noam Chomsky that alleges systemic biases in the mass media and seeks to explain them in terms of structural economic causes.

First presented in their 1988 book Manufacturing Consent: the Political Economy of the Mass Media, the propaganda model views the private media as businesses selling a product — readers and audiences (rather than news) — to other businesses (advertisers). The theory postulates five general classes of "filters" that determine the type of news that is presented in news media. These five are:

1. Ownership of the medium

2. Medium's funding sources

3. Sourcing

4. Flak

5. Anti-communist ideology

The first three (ownership, funding, and sourcing) are generally regarded by the authors as being the most important.

Although the model was based mainly on the characterization of United States media, Chomsky and Herman believe the theory is equally applicable to any country that shares the basic economic structure and organizing principles which the model postulates as the cause of media biases. After the disintegration of the Soviet Union, Chomsky stated that the new filter replacing communism would be terrorism and Islam.[1]

History of propaganda

Propaganda has been a human activity as far back as reliable recorded evidence exists.

Ancient propaganda

The Behistun Inscription (c. 515 BCE), detailing the rise of Darius I to the Persian throne, can be seen as an early example of propaganda.

The Arthashastra written by Chanakya (c. 350 - 283 BCE), a professor of political science at Takshashila University and a prime minister of the Maurya Empire, discusses propaganda in detail, such as how to spread propaganda and how to apply it in warfare. His student Chandragupta Maurya (c. 340 - 293 BCE), founder of the Maurya Empire, employed these methods during his rise to power.[4]

The writings of Romans such as Livy (c. 59 BCE - 17 CE) are considered masterpieces of pro-Roman statist propaganda.

19th and 20th centuries

U.S. propaganda from WWII, depicting Hitler as foolish.

Gabriel Tarde's Laws of Imitation (1890) and Gustave Le Bon's The Crowd: A Study of the Popular Mind (1895) were two of the first codifications of propaganda techniques, and they influenced many later writers, including Sigmund Freud. Hitler's Mein Kampf is heavily influenced by Le Bon's theories. The journalist Walter Lippmann, in Public Opinion (1922), also worked on the subject, as did the psychologist Edward Bernays, a nephew of Freud, early in the 20th century. During World War I, Lippmann and Bernays were hired by then United States President Woodrow Wilson to participate in the Creel Commission, the mission of which was to sway popular opinion in favor of entering the war on the side of the United Kingdom. The Creel Commission provided themes for speeches by "four-minute men" at public functions and also encouraged censorship of the American press. The Commission was so unpopular that after the war, Congress closed it down without providing funding to organize and archive its papers.

The war propaganda campaign of Lippmann and Bernays produced within six months such an intense anti-German hysteria as to permanently impress American business (and Adolf Hitler, among others) with the potential of large-scale propaganda to control public opinion. Bernays coined the terms "group mind" and "engineering consent", important concepts in practical propaganda work.

The current public relations industry is a direct outgrowth of Lippmann's and Bernays' work and is still used extensively by the United States government. For the first half of the 20th century Bernays and Lippmann themselves ran a very successful public relations firm.

World War II saw continued use of propaganda as a weapon of war, both by Hitler's propagandist Joseph Goebbels and the British Political Warfare Executive, as well as the United States Office of War Information.

In the early 2000s, the United States government developed and freely distributed a video game known as America's Army. The stated intention of the game is to encourage players to become interested in joining the U.S. Army. According to a poll by I for I Research, 30% of young people who had a positive view of the military said that they had developed that view by playing the game.[citation needed]

Russian revolution

Russian revolutionaries of the 19th and 20th centuries distinguished two different aspects covered by the English term propaganda. Their terminology included two terms: Russian агитация (agitatsiya), or agitation, and Russian пропаганда (propaganda); see agitprop. (Agitprop is not, however, limited to the Soviet Union: before the October Revolution it was considered one of the fundamental activities of any Marxist activist, and this importance of agitprop in Marxist theory can still be observed today in Trotskyist circles, which insist on the importance of leaflet distribution.)

Soviet propaganda meant dissemination of revolutionary ideas, teachings of Marxism, and theoretical and practical knowledge of Marxist economics, while agitation meant forming favorable public opinion and stirring up political unrest. These activities did not carry the negative connotations they usually do in English and were encouraged. To expand the reach of state propaganda, the Bolsheviks actively used trains, aircraft, and other means of transportation.

Josef Stalin's regime built the largest fixed-wing aircraft of the 1930s, the Tupolev ANT-20, exclusively for this purpose. Named after the famous Soviet writer Maxim Gorky, who had recently returned from fascist Italy, it was equipped with a powerful radio set called "Voice from the sky", printing and leaflet-dropping machinery, radio stations, a photographic laboratory, a film projector with sound for showing movies in flight, a library, and more. The aircraft could be disassembled and transported by rail if needed, and it set a number of world records.

The GPU thunderbolt strikes the counter-revolutionary saboteur

"Long Live World October (revolution)!"

Bolshevik propaganda train. 1923

ANT-20 "Maxim Gorky" propaganda aircraft in the Moscow sky

Nazi Germany

Most propaganda in Germany was produced by the Ministry for Public Enlightenment and Propaganda (Propagandaministerium, or "Promi" (German abbreviation)). Joseph Goebbels was placed in charge of this ministry shortly after Hitler took power in 1933. All journalists, writers, and artists were required to register with one of the Ministry's subordinate chambers for the press, fine arts, music, theater, film, literature, or radio.

The Nazis believed in propaganda as a vital tool in achieving their goals. Adolf Hitler, Germany's Führer, was impressed by the power of Allied propaganda during World War I and believed that it had been a primary cause of the collapse of morale and revolts on the German home front and in the Navy in 1918 (see also: Dolchstoßlegende). Hitler would meet nearly every day with Goebbels to discuss the news and Goebbels would obtain Hitler's thoughts on the subject; Goebbels would then meet with senior Ministry officials and pass down the official Party line on world events. Broadcasters and journalists required prior approval before their works were disseminated.

Along with posters, the Nazis produced a number of films and books to spread their beliefs.

Poster depicting Allied "liberators" as monsters.

"Mother and Child" poster for charity subscription.

"All 10-year-olds to us."

"The Eternal Jew" poster for a movie.

"Mothers Fight for your Children."

Invites Dutchmen to join the SS.

Promotes Eugenics.

Nazi poster portraying Adolf Hitler. Text: "Long Live Germany!"

Cold War propaganda

Soviet propaganda poster of Lenin from 1967

The United States and the Soviet Union both used propaganda extensively during the Cold War. Both sides used film, television, and radio programming to influence their own citizens, each other, and Third World nations. The United States Information Agency operated the Voice of America as an official government station. Radio Free Europe and Radio Liberty, which were in part supported by the Central Intelligence Agency, provided grey propaganda in news and entertainment programs to Eastern Europe and the Soviet Union respectively. The Soviet Union's official government station, Radio Moscow, broadcast white propaganda, while Radio Peace and Freedom broadcast grey propaganda. Both sides also broadcast black propaganda programs in periods of special crises. In 1948, the United Kingdom's Foreign Office created the IRD (Information Research Department), which took over from wartime and early post-war departments such as the Ministry of Information and dispensed propaganda via various media such as the BBC and publishing.[5][6]

Large image of Joseph Stalin looms over Soviets.

The ideological and border dispute between the Soviet Union and People's Republic of China resulted in a number of cross-border operations. One technique developed during this period was the "backwards transmission," in which the radio program was recorded and played backwards over the air. (This was done so that messages meant to be received by the other government could be heard, while the average listener could not understand the content of the program.)
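The mechanics of that trick are simple enough to sketch in a few lines of code. The following is a minimal illustration of the idea only, not a description of the actual Cold War equipment; it assumes a 16-bit PCM WAV recording, uses Python with NumPy, and the file names are hypothetical.

# Sketch of the "backwards transmission" idea: reverse a recording so a
# casual listener hears gibberish, while the intended recipient simply
# plays the tape backwards to recover the original message.
# Assumes a 16-bit PCM WAV input file.
import wave
import numpy as np

src = wave.open("broadcast.wav", "rb")        # hypothetical input file
params = src.getparams()
frames = src.readframes(params.nframes)
src.close()

# Interpret the raw bytes as 16-bit samples, group them one frame per
# channel, then reverse the order of the frames.
samples = np.frombuffer(frames, dtype=np.int16).reshape(-1, params.nchannels)
reversed_samples = samples[::-1]

dst = wave.open("broadcast_reversed.wav", "wb")   # hypothetical output file
dst.setparams(params)
dst.writeframes(reversed_samples.tobytes())
dst.close()

Playing the output file "backwards" (reversing it again) restores the original audio, which is the whole point of the technique.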

When describing life in capitalist countries, particularly the USA, Soviet propaganda tended to portray it as worse than it was. It claimed, for example, that most rich people in the West had earned their large fortunes by "robbing" ordinary workers: paying them the lowest wages possible, cutting their social benefits (such as sick pay and the right to free medical care), and obstructing trade unions in their work to protect workers' rights. (Soviet criticism of actions taken against unions or workers in the USA and other countries rested on the premise that the working class of workers and peasants was "ideologically close" and therefore to be sympathized with, alongside blacks and women fighting for their rights.) At times the emphasis was placed on the gap between the richest and the poorest, while the Soviet Union was presented as a society in which everyone was equal and free. Another claim was that the main source of income for the rich was the production and sale of weapons, which supposedly gave arms manufacturers an interest in starting or supporting another war, as was said of the war in Vietnam. In its criticism of the USA, Soviet propaganda also made use of incidents of racism and neo-fascism in the United States.

In the Americas, Cuba served as a major source and a target of propaganda from both black and white stations operated by the CIA and Cuban exile groups. Radio Habana Cuba, in turn, broadcast original programming, relayed Radio Moscow, and broadcast The Voice of Vietnam as well as alleged confessions from the crew of the USS Pueblo.

One of the most insightful authors of the Cold War era was George Orwell, whose novels Animal Farm and Nineteen Eighty-Four are virtual textbooks on the use of propaganda. Though not set in the Soviet Union, these books are about totalitarian regimes in which language is constantly corrupted for political purposes. These novels were, ironically, used for explicit propaganda. The CIA, for example, secretly commissioned an animated film adaptation of Animal Farm in the 1950s with small changes to the original story to suit its own needs.[2] Another source of irony is the fact that Orwell himself was a socialist and did not just have left-wing totalitarian regimes in mind when he wrote 1984.

Afghanistan

In the 2001 invasion of Afghanistan, psychological operations tactics were employed to demoralize the Taliban and to win the sympathies of the Afghan population. At least six EC-130E Commando Solo aircraft were used to jam local radio transmissions and transmit replacement propaganda messages.

Leaflets were also dropped throughout Afghanistan, offering rewards for Osama bin Laden and other individuals, portraying Americans as friends of Afghanistan, and emphasizing various negative aspects of the Taliban. One leaflet showed a picture of Mohammed Omar in a set of crosshairs with the words "We are watching".

Iraq

In 2003 the American administration led by George W. Bush selectively cited intelligence reports indicating that the Iraqi military possessed what the administration referred to as weapons of mass destruction. These reports were later found to be incorrect, and it eventually emerged that intelligence experts had raised timely objections to the misinformation. This faulty information, and its dissemination through the American media, was sufficient to draw America into its largest military operation in decades. The motives behind this selective misinformation campaign, and behind the subsequent military action, remain disputed.

U.S.PSYOP pamphlet disseminated in Iraq. Text: "This is your future al-Zarqawi" and shows al-Qaeda terrorist al-Zarqawi caught in a rat trap.

During the 2003 invasion of Iraq, the Iraqi Information Minister Mohammed Saeed al-Sahaf repeatedly claimed that Iraqi forces were decisively winning every battle. Even up to the overthrow of the Iraqi government in Baghdad, he maintained that the United States would soon be defeated, in contradiction with all other media. Because of this, he quickly became a cult figure in the West and gained recognition on the website WeLoveTheIraqiInformationMinister.com.[7] Many Iraqis, misled by his propaganda, were by contrast shocked when Iraq was defeated.

In November 2005, various media outlets, including The Chicago Tribune and the Los Angeles Times, alleged that the United States military had manipulated news reported in Iraqi media in an effort to cast a favorable light on its actions while demoralizing the insurgency. Lt. Col. Barry Johnson, a military spokesman in Iraq, said the program is "an important part of countering misinformation in the news by insurgents", while a spokesman for Defense Secretary Donald H. Rumsfeld said the allegations of manipulation were troubling if true. The Department of Defense has confirmed the existence of the program. More recently, The New York Times (see external links below) published an article about how the Pentagon has started to use contractors with little experience in journalism or public relations to plant articles in the Iraqi press. These articles are usually written by US soldiers without attribution or are attributed to a non-existent organization called the "International Information Center." Planting propaganda stories in newspapers was done by both the Allies and Central Powers in the First World War and the Axis and Allies in the Second; this is the latest version of this technique.[8][9][10]

Children and propaganda

Of all the potential targets of propaganda, children are the most vulnerable: they are the least prepared to apply the critical reasoning and contextual comprehension required to determine whether a message is propaganda or not.

Vulnerabilities (Theoretical)

Children's vulnerability to propaganda is rooted in developmental psychology. Because children are still building their understanding of the world, the close attention they pay to their environment leads them to absorb propaganda indiscriminately. Children are also highly imitative: studies by Albert Bandura, Dorothea Ross, and Sheila A. Ross in the 1960s indicated that children will imitate filmed representations of behavior. Television is therefore of particular interest with regard to children's vulnerability to propaganda.

Another vulnerability is the influence that children's peers have over their behavior. According to Judith Rich Harris's group-socialization theory, children acquire most of what they do not inherit genetically from their peer groups rather than from their parents. The implication is that if peer groups can be indoctrinated through propaganda at a young age to hold certain beliefs, the group will then self-regulate the indoctrination, since new members will adapt their beliefs to fit the group's.

Method

To a degree, socialization, formal education, and standardized television programming can be seen as using propaganda for the purpose of indoctrination. Schools built around dogmatic, frozen world-views often resort to propagandist curricula that indoctrinate children. The use of propaganda in schools was highly prevalent in Germany during the 1930s and 1940s, as well as in Stalinist Russia.

In Nazi Germany, the education system was thoroughly co-opted to indoctrinate the German youth with anti-Semitic ideology. This was accomplished through the National Socialist Teachers’ Union, of which 97% of all German teachers were members in 1937. It encouraged the teaching of “racial theory.” Picture books for children such as Don’t Trust A Fox in A Green Meadow Or the Word of A Jew, The Poisonous Mushroom, and The Poodle-Pug-Dachshund-Pincher were widely circulated (over 100,000 copies of Don’t Trust A Fox… were circulated during the late 1930s) and contained depictions of Jews as devils, child molesters, and other morally charged figures. Slogans such as “Judas the Jew betrayed Jesus the German to the Jews” were recited in class.[11] The following is an example of a propagandistic math problem recommended by the National Socialist Essence of Education.

The Jews are aliens in Germany—in 1933 there were 66,060,000 inhabitants in the German Reich, of whom 499,682 were Jews. What is the per cent of aliens?[12]
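For reference, the answer the textbook expects works out to 499,682 ÷ 66,060,000, roughly 0.76 per cent; the arithmetic is trivial, and the propaganda lies in framing Jewish citizens as "aliens".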
