Library of Professor Richard A. Macksey in Baltimore


Friday, April 22, 2011

911-Hunt the Rubble!

 http://www.acebaker.com/9-11/HTR/web-content/Pages/HTRHome.html

What became of the floor assemblies? If these two quarter-mile-high buildings really just "fell down," as we are supposed to believe, wouldn't we expect to find the concrete slabs, steel pans, and floor trusses stacked up like a 110-story sandwich? Fractured and compacted together, yes, but still in some recognizable form? I would. The official story is that everything fell straight down, remember?
Well, so far I can't find anything, anywhere, that looks even a bit like a floor assembly, or any office contents. Please help me . . .


Aerial Photo of Intact Towers

This aerial photo was taken from the southwest looking northeast, at an altitude of perhaps 2,500 feet. This angle is about 30° away from the angle in the GZ photo, which is from the south looking north.

This photo was chosen because it was taken from about the same altitude and about the same angle looking down as was the GZ photo.

In order to match the scales, I first added just the foreground buildings. This makes the Banker's Trust building visible twice, as you will see in the image of the intact towers with the GZ foreground.


Just Plain Big

Now I substitute in the picture of Ground Zero. Appreciate the size of these twin giants. Picture the 220 floor assemblies, acre-sized square donuts of steel-reinforced concrete, falling down on each other.

Even if the floor assemblies had squeezed together into a block with no air in it, that block would still be about 5 stories tall.
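A rough back-of-the-envelope check (my own round numbers, not anyone's official figures): each tower had 110 floor assemblies, each roughly a 4-inch concrete slab on a steel pan with trusses. Suppose, generously, that each assembly compacts down to about half a foot of solid material. Then, per tower,

\[
110~\text{floors} \times 0.5~\tfrac{\text{ft}}{\text{floor}} = 55~\text{ft},
\qquad
\frac{55~\text{ft}}{12~\text{ft per story}} \approx 4.6~\text{stories},
\]

which is consistent with the "about 5 stories" estimate above. Even with different assumptions about slab thickness or compaction, the answer stays in the range of several solid stories of material per footprint.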


Ground Zero

It doesn't look like buildings fell down here; it looks like they disappeared. The Banker's Trust building is the black one in the foreground. Each twin tower was 3 1/2 times the size of Banker's Trust.
What I do see is a smoldering flat area with thousands of short sticks of steel. I see what looks a lot like a depression or crater where the North Tower stood. The footprint of the South Tower is partially obscured by the Banker's Trust building.
Looks like we are going to have to get a lot closer in to find any of those floor assemblies.


Round Holes
These are views straight down on Ground Zero, cropped from the hi-res NOAA photo. The building at the top right is what's left of WTC6; center right is WTC5. While we don't discover any floor assemblies or stacked-up concrete, we do see something very odd.
Check out the sharp round holes. Notice the large one down through WTC5. Half a circle is cut out of the building; the other half is carved into the debris lying next to the building, 9 stories below.
See the three holes side by side near the bottom of the picture. It looks as though a giant drill press was used. There are quite a few of these oddities at Ground Zero.
Can falling debris cause this effect?
The second picture to the left is WTC6. This 8-story building appears to have been hollowed out down to ground level. Notice all the round, vertical holes. The entire hollowed-out section has a scalloped edge, as if it were carved with a giant drill. Even down at ground level, in the dust and shredded steel, we can see round imprints.
While we have to look elsewhere to discover any floor assemblies, this is proof that the rubble hunt can be very interesting, even if we haven't found our holy grail.

FOUND!
Here are the floor assemblies from WTC2. According to calculations, the amount of dust produced by the "collapsing" twin towers was enough to account for the building contents, including the 200,000 tons of concrete.






IMPORTANT-Planting misinformation in the human mind

Planting misinformation in the human mind: A 30-year investigation of the malleability of memory

Elizabeth F. Loftus
Department of Psychology and Social Behavior, University of California–Irvine, Irvine, California 92697-7085, USA

Abstract

The misinformation effect refers to the impairment in memory for the past that arises after exposure to misleading information. The phenomenon has been investigated for at least 30 years, as investigators have addressed a number of issues. These include the conditions under which people are especially susceptible to the negative impact of misinformation and, conversely, when they are resistant. Warnings about the potential for misinformation sometimes work to inhibit its damaging effects, but only under limited circumstances. The misinformation effect has been observed in a variety of human and nonhuman species. And some groups of individuals are more susceptible than others. At a more theoretical level, investigators have explored the fate of the original memory traces after exposure to misinformation appears to have made them inaccessible. This review of the field ends with a brief discussion of the newer work involving misinformation that has explored the processes by which people come to believe falsely that they experienced rich complex events that never, in fact, occurred.
In 2005 the journal Learning & Memory published the first experimental work using neuroimaging to reveal the underlying mechanisms of the “misinformation effect,” a phenomenon that had captured the interest of memory researchers for over a quarter century (Okado and Stark 2005). These new investigators used a variation of the standard three-stage procedure typical in studies of misinformation. Their subjects first saw several complex events, for example one involving a man stealing a girl's wallet. Next some of the subjects got misinformation about the event, such as the fact that the girl's arm was hurt in the process (rather than her neck). Finally the subjects were asked to remember what they saw in the original event. Many claimed that they saw the misinformation details in the original event. For example, they remembered seeing the arm being hurt, not the neck. Overall, the misinformation was remembered as being part of the original event about 47% of the time. So, expectedly, a robust impairment of memory was produced by exposure to misinformation— the misinformation effect. But the researchers' new work had a twist: They went on to show that the neural activity that occurred while the subjects processed the events and later the misinformation predicted whether a misinformation effect would occur.
In an essay that accompanied the Okado and Stark findings, I placed their results within the context of 30 years of research on behavioral aspects of the misinformation effect (Loftus 2005). Their work received much publicity, and boosted public interest in the misinformation effect as a scientific phenomenon. For example, WebMD (Hitti 2005) touted the new findings showing that brain scans can predict whether the memories would be accurate or would be infected with misinformation. And the Canadian press applauded the study as being the first to investigate how the brain encodes misinformation (Toronto Star 2005).
So what do we know about the misinformation effect after 30 years? The degree of distortion in memory observed in the Okado and Stark neuroimaging study has been found in hundreds of studies involving a wide variety of materials. People have recalled nonexistent objects such as broken glass. They have been misled into remembering a yield sign as a stop sign, hammers as screwdrivers, and even something large, like a barn, that was not part of the bucolic landscape by which an automobile happened to be driving. Details have been planted into memory for simulated events that were witnessed (e.g. a filmed accident), but also into memory for real-world events such as the planting of wounded animals (that were not seen) into memory for the scene of a tragic terrorist bombing that actually had occurred in Russia a few years earlier (Nourkova et al. 2004). The misinformation effect is the name given to the change (usually for the worse) in reporting that arises after receipt of misleading information. Over its now substantial history, many questions about the misinformation effect have been addressed, and findings bearing on a few key ones are summarized here.
  1. Under what conditions are people particularly susceptible to the negative impact of misinformation? (The When Question)
  2. Can people be warned about misinformation, and successfully resist its damaging influence?
  3. Are some types of people particularly susceptible? (The Who Question)
  4. When misinformation has been embraced by individuals, what happens to their original memory?
  5. What is the nature of misinformation memories?
  6. How far can you go with people in terms of the misinformation you can plant in memory?

The When Question

Long ago, researchers showed that certain experimental conditions are associated with greater susceptibility to misinformation. So, for example, people are particularly prone to having their memories be affected by misinformation when it is introduced after the passage of time has allowed the original event memory to fade (Loftus et al. 1978). One reason this may be true is that with the passage of time, the event memory is weakened, and thus, there is less likelihood that a discrepancy is noticed while the misinformation is being processed. In the extreme, with super-long time intervals between an event and subsequent misinformation, the event memory might be so weak that it is as if it had not been presented at all. No discrepancy between the misinformation and original memory would be detected, and the subject might readily embrace the misinformation. These ideas led to the proposal of a fundamental principle for determining when changes in recollection after misinformation would occur: the Discrepancy Detection principle (Tousignant et al. 1986). It essentially states that recollections are more likely to change if a person does not immediately detect discrepancies between misinformation and memory for the original event. Of course, it should be kept in mind that false memories can still occur even if a discrepancy is noticed. The rememberer sometimes thinks, “Gee, I thought I saw a stop sign, but the new information mentions a yield sign, I guess I must be wrong and it was a yield sign.” (Loftus and Hoffman 1989).
The other important time interval is the period between the misinformation and the test. One study asked subjects to say whether a key item was part of the event only, part of the misinformation, in both parts, or in neither. Misinformation effects occur when subjects say that the item is part of the event only, or that the item was in both parts. Overall, subjects were slightly more likely to say “both” (22%) than “event only” (17%). But the timing of the test affected these ratios. With a short interval between the misinformation and the test, subjects are less likely to claim that the misinformation item was in the event only (Higham 1998). This makes sense. If subjects have recently read the misinformation they might well remember doing so when tested and at the same time might also incorrectly believe that they also saw the misinformation detail during an original event.
Temporarily changing someone's state can increase misinformation effects. So for example, if people are led to believe that they have drunk alcohol, they are more susceptible (Assefi and Garry 2002), and when people are hypnotized, they are more susceptible (Scoboria et al. 2002). These temporary states may have the effect of disrupting the ability of subjects to detect discrepancies between the misinformation and what remains of their original memory.

Warnings

Long ago, researchers showed that warning people about the fact that they might in the future be exposed to misinformation sometimes helps them resist the misinformation. However, a warning given after the misinformation had been processed did not improve the ability to resist its damaging effects (Greene et al. 1982). The lack of effectiveness of post-misinformation warnings presumably occurred because the misinformation had already been incorporated into the memory and an altered memory now existed in the mind of the individual. The research on warnings fits well with the Discrepancy Detection principle. If people are warned prior to reading post-event information that the information might be misleading, they can better resist its influence, perhaps by increasing the likelihood that the person scrutinizes the post-event information for discrepancies.
More recent work suggests that warning people that they may have in the past been exposed to misinformation (post-misinformation warnings) may have some success, but only in limited circumstances. In one study, an immediate post-misinformation warning helped subjects resist the misinformation, but only when the misinformation was in a relatively low state of accessibility. With highly accessible misinformation, the immediate post-misinformation warnings didn't work at all. (The accessibility of misinformation can be enhanced by presenting it multiple times versus a single time). Moreover, it didn't seem to matter whether the warning was quite general or item-specific (Eakin et al. 2003). The general warning informed subjects that the narrative they had read referred to some objects and events from the slides in an inaccurate way. The specific warning explicitly mentioned the misleading details (e.g., they would be told the misinformation was about the tool). Eakin et al. explained these results with several hypotheses. They favored a suppression hypothesis, which states that when people get a warning, they suppress the misinformation and it has less ability to interfere with answering on the final test. Moreover they suggested that the entire context of the misinformation might be suppressed by the warning. Suppression might have more trouble working when misinformation is too accessible. Also, highly accessible misinformation might distract the subject from thinking to scrutinize the misinformation for discrepancies from some presumably overwhelmed original event memory.

The Who Question

Misinformation affects some people more than others. For one thing, age matters. In general young children are more susceptible to misinformation than are older children and adults (see Ceci and Bruck 1993). Moreover, the elderly are more susceptible than are younger adults (Karpel et al. 2001; Davis and Loftus 2005). These age effects may be telling us something about the role of cognitive resources, since we also know that misinformation effects are stronger when attentional resources are limited. In thinking about these age effects, it should probably be emphasized that suggestion-induced distortion in memory is a phenomenon that occurs with people of all ages, even if it is more pronounced with certain age groups.
In terms of personality variables, several have been shown to be associated with greater susceptibility to misinformation such as empathy, absorption, and self-monitoring. The more one has self-reported lapses in memory and attention, the more susceptible one is to misinformation effects. So, for example, Wright and Livingston-Raper (2002) showed that about 10% of the variance in susceptibility to misinformation is accounted for by dissociation scores that measure the frequency of such experiences as how often a person can't remember whether he did something or just thought about doing that thing (see Davis and Loftus 2005 for a review of these personality variables).
Interestingly, misinformation effects have also been obtained with some unusual subject samples, including three-month-old infants (Rovee-Collier et al. 1993), gorillas (Schwartz et al. 2004), and even with pigeons and rats (Harper and Garry 2000; M. Garry and D.N. Harper, in prep.). One challenging aspect of these studies is finding ways to determine that misinformation had taken hold in species that are unable to explicitly say so. Take pigeons, for example. They have an amazing ability to remember pictures that they were shown as long as two years earlier (Vaughan and Greene 1983, 1984). But their otherwise good memory can be disrupted by misinformation. In two different studies, Harper and Garry examined misinformation effects in pigeons by using an entirely visual paradigm (see also M. Garry and D.N. Harper, in prep.). First, the pigeons saw a light (let's say a red light). They had been trained over many many trials to peck the light to show that they had paid attention to it. After they pecked the light, it turned off. After a delay, the pigeons were exposed to post-event information, where they saw either the same colored light or a different colored light. They had to peck this light, too. Then came the test: The pigeons saw the original light and a novel colored light. If they pecked the originally correct color, they got food. If they pecked the novel color, they got no food. The pigeons were more accurate when the post-event experience did not mislead them. Moreover, like humans, pigeons are more susceptible to the misinformation if it occurs later in the original–final test interval than if it occurs early in that interval. M. Garry and D.N. Harper (in prep.) make the point that knowing that pigeons and humans respond the same way to misleading information provides more evidence that the misinformation effect is not just a simple matter of retrograde interference. Retrograde interference is a mere disruption in performance, not a biasing effect. That is, it typically makes memory worse, but does not pull for any particular wrong answer. But for pigeons, like humans, who use the misinformation differentially depending on when they are exposed to it, the misinformation appears to have a specific biasing effect too. The observation of a misinformation effect in nonverbal creatures also suggests that the misinformation effects are not a product of mere demand characteristics. That is, they are not produced by “people” who give a response just to please the experimenter, even when it is not the response they think they should give.

The fate of the original memory?

One of the most fundamental questions one can ask about memory is the question about the permanence of our long-term memories. If information makes its way into our long-term memories, does it stay there permanently even when we can't retrieve it on demand? Or do memory traces once stored become susceptible to decay or damage or alteration? In this context, we can pose the more specific question: When misinformation is accepted and incorporated into a person's recollection, what happens to their original memory? Does the misinformation impair the original memory, perhaps by altering the once-formed traces? Or does the misinformation cause retrieval impairment, possibly by making the original memory less accessible?
A lively debate developed in the 1980s when several investigators rejected the notion that misinformation causes any type of impairment of memory (McCloskey and Zaragoza 1985). Instead, they explicitly and boldly pressed the idea that misinformation had no effect on the original event memory. Misinformation, according to this view, merely influences the reports of subjects who never encoded (or for other reasons can't recall) the original event. Instead of guessing at what they saw, these subjects would be lured into producing the misinformation response. Alternatively, the investigators argued that misinformation effects could be arising because subjects remember both sources of information but select the misleading information because, after deliberation, they conclude it must be correct.
To support their position, McCloskey and Zaragoza (1985) devised a new type of test. Suppose the subjects saw a burglar pick up a hammer and received the misinformation that it was a screwdriver. The standard test would allow subjects to select between a hammer and a screwdriver. On the standard test, control subjects who had not received the misinformation would tend to select the hammer. Many subjects exposed to misinformation (called misled subjects) would, of course, select the screwdriver, producing the usual misinformation effect. In the new test, called the "Modified Test," the misinformation option is excluded as a response alternative. That is, the subjects have to choose between a hammer and a totally novel item, a wrench. With the modified test, subjects were very good at selecting the original event item (hammer, in this example), leading McCloskey and Zaragoza to argue that it was not necessary to assume any memory impairment at all—neither impairment of traces nor impairment of access to traces. Yet later analyses of a collection of studies using the modified test showed that small misinformation effects were obtained even when these unusual types of tests were employed (Ayers and Reder 1998), and even when nonverbal species were the subjects of the experiments.
While space is too limited to present the myriad paradigms that were devised by investigators wishing to explore the fate of the original memory (e.g., Wagenaar and Boer 1987; Belli 1989; Tversky and Tuchin 1989), suffice it to say that the entire debate heightened appreciation for the different ways by which people come to report a misinformation item as their memory. Sometimes this occurs because they have no original memory (it was never stored or it has faded). Sometimes this occurs because of deliberation. And sometimes it appears as if the original event memories have been impaired in the process of contemplating misinformation. Moreover, the idea that you can plant an item into someone's memory (apart from whether you have impaired any previous traces) was downright interesting in its own right.

The nature of misinformation memories

Subjectively, what are misinformation memories like? One attempt to explore this issue compared the memories of a yield sign that had actually been seen in a simulated traffic accident, to the memories of other subjects who had not seen the sign but had it suggested to them (Schooler et al. 1986). The verbal descriptions of the “unreal” memories were longer, contained more verbal hedges (I think I saw...), more references to cognitive operations (After seeing the sign the answer I gave was more of an immediate impression...), and fewer sensory details. Thus statistically a group of real memories might be different from a group of unreal ones. Of course, many of the unreal memory descriptions contained verbal hedges and sensory detail, making it extremely difficult to take a single memory report and reliably classify it as real or unreal. (Much later, neurophysiological work would attempt to distinguish real from unreal memories, a point we return to later).
A different approach to the nature of misinformation memories came from the work of Zaragoza and Lane (1994), who asked this question: Do people confuse the misleading suggestions for their "real memories" of the witnessed event? They asked this question because of the real possibility that subjects could be reporting misinformation because they believed it was true, even if they had no specific memory of seeing it. After numerous experiments in which subjects were asked very specific questions about their memory for the source of suggested items that they were embracing, the investigators concluded that misled subjects definitely do sometimes come to remember seeing things that were merely suggested to them. They referred to the phenomenon as the "source misattribution effect." But they also noted that the size of the effect can vary, and emphasized that source misattributions are not inevitable after exposure to suggestive misinformation.

How much misinformation can you plant in one mind?: Rich false memories

It is one thing to change a stop sign into a yield sign, to make a person believe that a crime victim was hurt in the arm instead of the neck, or to add a detail to an otherwise intact memory. But it is quite another thing to plant an entire memory for an event that never happened. Researchers in the mid-1990s devised a number of techniques for planting whole events, or what have been called “rich false memories.” One study used scenarios made up by relatives of subjects, and planted false memories of being lost for an extended time in a shopping mall at age 6 and rescued by an elderly person (Loftus 1993; Loftus and Pickrell 1995). Other studies used similar methods to plant a false memory that as a child the subject had had an accident at a family wedding (Hyman Jr. et al. 1995), had been a victim of a vicious animal attack (Porter et al. 1999), or that he or she had nearly drowned and had to be rescued by a lifeguard (Heaps and Nash 2001).
Sometimes subjects will start with very little memory, but after several suggestive interviews filled with misinformation they will recall the false events in quite a bit of detail. In one study, a subject received the suggestion that he or she went to the hospital at age 4 and was diagnosed as having low blood sugar (Ost et al. 2005). At first the subject remembered very little: “... I can't remember anything about the hospital or the place. It was the X general hospital where my mum used to work? She used to work in the baby ward there... but I can't... no. I know if I was put under hypnosis or something I'd be able to remember it better, but I honestly can't remember.” Yet in the final interview in week 3, the subject developed a more detailed memory and even incorporated thoughts at the time into the recollection: “... I don't remember much about the hospital except I know it was a massive, huge place. I was 5 years old at the time and I was like `oh my God I don't really want to go into this place, you know it's awful'... but I had no choice. They did a blood test on me and found out that I had a low blood sugar...”
Taken together these studies show the power of this strong form of suggestion. It has led many subjects to believe or even remember in detail events that did not happen, that were completely manufactured with the help of family members, and that would have been traumatic had they actually happened.
Some investigators have called this strong form of suggestion the “familial informant false narrative procedure” (Lindsay et al. 2004); others find the term awfully cumbersome, and prefer to simply call the procedure the “lost-in-the-mall” technique, after the first study that used the procedure. Across many studies that have now utilized the “lost-in-the-mall” procedure, an average of ∼30% of subjects have gone on to produce either partial or complete false memory (Lindsay et al. 2004). Other techniques, such as those involving guided imagination (see Libby 2003 for an example), suggestive dream interpretation, or exposure to doctored photographs, have also led subjects to believe falsely that they experienced events in their distant and even in their recent past (for review, see Loftus 2003).
Figure 1. Fake advertisements showing Bugs Bunny at a Disney resort, used to plant false beliefs in Braun et al. (2002) and Braun-LaTour et al. (2004).
A concern about the recent work showing the creation of very rich false beliefs and memories is that these might reflect true experiences that have been resurrected from memory by the suggestive misinformation. To counter that concern, some investigators have tried to plant implausible or impossible false memories. In several studies subjects were led to believe that they met Bugs Bunny at a Disney Resort after exposure to fake ads for Disney that featured Bugs Bunny. An example of an ad containing the false Bugs Bunny information is shown in Figure 1; subjects simply evaluate the ad on a variety of characteristics. In one study, the single fake ad led 16% of subjects to later claim that they had met him (Braun et al. 2002), which could not have occurred because Bugs Bunny is a Warner Brothers character and would not be seen at a Disney resort. Later studies showed even higher rates of false belief, and that the ads that contained a picture of Bugs produced more false memories than ads that contained only a verbal mention (Braun-LaTour et al. 2004). While obviously less complex, these studies dovetail nicely with real-world examples in which individuals have come to develop false beliefs or memories for experiences that are implausible or impossible (e.g., alien abduction memories, as studied by McNally and colleagues 2004).

Concluding remarks

Misinformation can cause people to falsely believe that they saw details that were only suggested to them. Misinformation can even lead people to have very rich false memories. Once embraced, people can express these false memories with confidence and detail. There is a growing body of work using neuroimaging techniques to assist in locating parts of the brain that might be associated with true and false memories, and these reveal the similarities and differences in the neural signatures (e.g., Curran et al. 2001; Fabiani et al. 2000). Those with strong interests in neuroscience will find interesting the recent neuroimaging and electrophysiological studies suggesting that sensory activity is greater for true recognition than false recognition (Schacter and Slotnick 2004). These studies suggest, more explicitly, that the hippocampus and a few other cortical regions come into play when people claim to have seen things that they didn't see. But, keep in mind that for the most part these studies are done with relatively pallid sorts of true and false memories (e.g., large collections of words or simple pictures). With the Okado and Stark (2005) neuroimaging investigation of misinformation we are one step closer to developing some techniques that might enable us to use neural activity to tell whether a report about a complex event is probably based on a true experience or whether it is based on misinformation. We are still, however, a long way from a reliable assessment when all we have is a single memory report to judge.
In the real world, misinformation comes in many forms. When witnesses to an event talk with one another, when they are interrogated with leading questions or suggestive techniques, when they see media coverage about an event, misinformation can enter consciousness and can cause contamination of memory. These are not, of course, the only source of distortion in memory. As we retrieve and reconstruct memories, distortions can creep in without explicit external influence, and these can become pieces of misinformation. This might be a result of inference-based processes, or some automatic process, and can perhaps help us understand the distortions we see in the absence of explicit misinformation (e.g., Schmolck et al.'s [2000] distortions in recollections of the O.J. Simpson trial verdict).
An obvious question arises as to why we would have evolved to have a memory system that is so malleable in its absorption of misinformation. One observation is that the “updating” seen in the misinformation studies is the same kind of “updating” that allows for correction of incorrect memories. Correct information can supplement or distort previously stored error, and this, of course, is a good thing. Whatever the misinformation reveals about normal memory processes, one thing is clear: the practical implications are significant. The obvious relevance to legal disputes, and other real-world activities, makes it understandable why the public would want to understand more about the misinformation effect and what it tells us about our malleable memories.

Footnotes

  • Article published online ahead of print. Article and publication date are at http://www.learnmem.org/cgi/doi/10.1101/lm.94705.


IMPORTANT-Disinfo (911 & generally)


"The likelihood of one individual being right increases in direct proportion to the intensity to which others are trying to prove him wrong."

-- Harry Segall





Twenty-Five Ways To Suppress Truth: The Rules of Disinformation
(Includes the 8 Traits of a Disinformationalist)

by H. Michael Sweeney

Built upon Thirteen Techniques for Truth Suppression by David Martin, the following may be useful to the initiate in the world of dealing with veiled truths and half-truths, lies, and suppression of truth when serious crimes are studied in public forums. This, sadly, includes everyday news media, one of the worst offenders with respect to being a source of disinformation. Where the crime involves a conspiracy, or a conspiracy to cover up the crime, there will invariably be a disinformation campaign launched against those seeking to uncover and expose the truth and/or the conspiracy. There are specific tactics which disinfo artists tend to apply, as revealed here. Also included with this material are seven common traits of the disinfo artist which may also prove useful in identifying players and motives.

 The more a particular party fits the traits and is guilty of following the rules, the more likely they are a professional disinfo artist with a vested motive. People can be bought, threatened, or blackmailed into providing disinformation, so even "good guys" can be suspect in many cases.

A rational person participating as one interested in the truth will evaluate that chain of evidence and conclude either that the links are solid and conclusive, that one or more links are weak and need further development before a conclusion can be arrived at, or that one or more links can be broken, usually invalidating (but not necessarily so, if parallel links already exist or can be found, or if a particular link was merely supportive, but not in itself key to) the argument. The game is played by raising issues which either strengthen or weaken (preferably to the point of breaking) these links. It is the job of a disinfo artist to interfere with these evaluations... to at least make people think the links are weak or broken when, in truth, they are not... or to propose alternative solutions leading away from the truth. Often, by simply impeding and slowing down the process through disinformation tactics, a level of victory is assured because apathy increases with time and rhetoric.

It would seem true in almost every instance, that if one cannot break the chain of evidence for a given solution, revelation of truth has won out. If the chain is broken either a new link must be forged, or a whole new chain developed, or the solution is invalid and a new one must be found... but truth still wins out. There is no shame in being the creator or supporter of a failed solution, chain, or link, if done with honesty in search of the truth. This is the rational approach. While it is understandable that a person can become emotionally involved with a particular side of a given issue, it is really unimportant who wins, as long as truth wins. But the disinfo artist will seek to emotionalize and chastise any failure (real or false claims thereof), and will seek by means of intimidation to prevent discussion in general.

The disinfo artist, and those who may pull their strings (those who stand to suffer should the crime be solved), MUST seek to prevent rational and complete examination of any chain of evidence which would hang them. Since fact and truth seldom fall on their own, they must be overcome with lies and deceit. Those who are professional in the art of lies and deceit, such as the intelligence community and the professional criminal (often the same people, or at least working together), tend to apply fairly well-defined and observable tools in this process. However, the public at large is not well armed against such weapons, and is often easily led astray by these time-proven tactics. Remarkably, even media and law enforcement have NOT BEEN TRAINED to deal with these issues. For the most part, only the players themselves understand the rules of the game.

For such disinformationalists, the overall aim is to avoid discussing links in the chain of evidence which cannot be broken by truth, but at all times, to use clever deceptions or lies to make select links seem weaker than they are, create the illusion of a break, or better still, cause any who are considering the chain to be distracted in any number of ways, including the method of questioning the credentials of the presenter. Please understand that fact is fact, regardless of the source. Likewise, truth is truth, regardless of the source. This is why criminals are allowed to testify against other criminals. Where a motive to lie may truly exist, only actual evidence that the testimony itself IS a lie renders it completely invalid. Were a known 'liar's' testimony to stand on its own without supporting fact, it might certainly be of questionable value, but if the testimony (argument) is based on verifiable or otherwise demonstrable facts, it matters not who does the presenting or what their motives are, or if they have lied in the past or even if motivated to lie in this instance -- the facts or links would and should stand or fall on their own merit and their part in the matter will merely be supportive.

Moreover, particularly with respect to public forums such as newspaper letters to the editor, and Internet chat and news groups, the disinfo type has a very important role. In these forums, the principal topics of discussion are generally attempts by individuals to cause other persons to become interested in their own particular position, idea, or solution -- very much in development at the time. People often use such mediums as a sounding board and in hopes of pollination to better form their ideas. Where such ideas are critical of government or powerful, vested groups (especially if their criminality is the topic), the disinfo artist has yet another role -- the role of nipping it in the bud. They also seek to stage the concept, the presenter, and any supporters as less than credible should any possible future confrontation in more public forums result due to their early successes. You can often spot the disinfo types at work here by the unique application of "higher standards" of discussion than necessarily warranted. They will demand that those presenting arguments or concepts back everything up with the same level of expertise as a professor, researcher, or investigative writer. Anything less renders any discussion meaningless and unworthy in their opinion, and anyone who disagrees is obviously stupid -- and they generally put it in exactly those terms.

So, as you read any such discussions, particularly so in Internet news groups (NG), decide for yourself when a rational argument is being applied and when disinformation, psyops (psychological warfare operations), or trickery is the tool. Accuse those guilty of the latter freely. They (both those deliberately seeking to lead you astray, and those who are simply foolish or misguided thinkers) generally run for cover when thus illuminated, or -- put in other terms, they put up or shut up (a perfectly acceptable outcome either way, since truth is the goal). Here are the twenty-five methods and seven traits, some of which don't apply directly to NG application. Each contains a simple example in the form of actual comments (some paraphrased for simplicity) from NG discussions of commonly known historical events, and a proper response. [Examples & responses: http://www.proparanoid.com/truth.html]

Accusations should not be overused -- reserve them for repeat offenders and those who use multiple tactics. Responses should avoid falling into emotional traps or informational sidetracks, unless it is feared that some observers will be easily dissuaded by the trickery. Consider quoting the complete rule rather than simply citing it, as others will not have the reference. Offer to provide a complete copy of the rule set upon request.

Twenty-Five Rules of Disinformation
Note: The first rule and last five (or six, depending on situation) rules are generally not directly within the ability of the traditional disinfo artist to apply. These rules are generally used more directly by those at the leadership, key players, or planning level of the criminal conspiracy or conspiracy to cover up.

1. Hear no evil, see no evil, speak no evil. Regardless of what you know, don't discuss it -- especially if you are a public figure, news anchor, etc. If it's not reported, it didn't happen, and you never have to deal with the issues.

2. Become incredulous and indignant. Avoid discussing key issues and instead focus on side issues which can be used to show the topic as being critical of some otherwise sacrosanct group or theme. This is also known as the 'How dare you!' gambit.

3. Create rumor mongers. Avoid discussing issues by describing all charges, regardless of venue or evidence, as mere rumors and wild accusations. Other derogatory terms mutually exclusive of truth may work as well. This method works especially well with a silent press, because the only way the public can learn of the facts is through such 'arguable rumors'. If you can associate the material with the Internet, use this fact to certify it a 'wild rumor' from a 'bunch of kids on the Internet' which can have no basis in fact.
[add: Use the derogatory terms 'space beams' and 'rabid no-planers', then associate these with the terms 'wild accusations' and 'ad hominem attacks'. (JW, 2007)]

4. Use a straw man. Find or create a seeming element of your opponent's argument which you can easily knock down to make yourself look good and the opponent look bad. Either make up an issue you may safely imply exists, based on your interpretation of the opponent/opponent arguments/situation, or select the weakest aspect of the weakest charges. Amplify their significance and destroy them in a way which appears to debunk all the charges, real and fabricated alike, while actually avoiding discussion of the real issues.
[add example: 'But space beams don't explain the presence of sulfur', (JW, 2007)]

5. Sidetrack opponents with name calling and ridicule. This is also known as the primary 'attack the messenger' ploy, though other methods qualify as variants of that approach. Associate opponents with unpopular titles such as 'kooks', 'right-wing', 'liberal', 'left-wing', 'terrorists', 'conspiracy buffs', 'radicals', 'militia', 'racists', 'religious fanatics', 'sexual deviates', and so forth. This makes others shrink from support out of fear of gaining the same label, and you avoid dealing with issues.
[add: Use names like 'rabid no-planers', 'space beams', 'space beamers'. (JW, 2007)]

6. Hit and Run. In any public forum, make a brief attack on your opponent or the opponent's position and then scamper off before an answer can be fielded, or simply ignore any answer. This works extremely well in Internet and letters-to-the-editor environments where a steady stream of new identities can be called upon without having to explain criticism or reasoning -- simply make an accusation or other attack, never discussing issues, and never answering any subsequent response, for that would dignify the opponent's viewpoint.

7. Question motives. Twist or amplify any fact which could be taken to imply that the opponent operates out of a hidden personal  agenda or other bias. This avoids discussing issues and forces the accuser on the defensive.

8. Invoke authority. Claim for yourself or associate yourself with authority and present your argument with enough 'jargon' and 'minutia' to illustrate you are 'one who knows', and simply say it isn't so without discussing issues or demonstrating concretely why or citing sources.

9. Play Dumb. No matter what evidence or logical argument is offered, avoid discussing issues except with denials they have any credibility, make any sense, provide any proof, contain or make a point, have logic, or support a conclusion. Mix well for maximum effect.
[add example: "I haven't seen any evidence of pulverization on Judy Wood's web site." (JW, 2007)]

10. Associate opponent charges with old news. A derivative of the straw man -- usually, in any large-scale matter of high visibility, someone will make charges early on which can be, or were already, easily dealt with (a kind of investment for the future should the matter not be so easily contained). Where it can be foreseen, have your own side raise a straw man issue and have it dealt with early on as part of the initial contingency plans. Subsequent charges, regardless of validity or new ground uncovered, can usually then be associated with the original charge and dismissed as simply being a rehash without need to address current issues -- so much the better where the opponent is or was involved with the original source.

11. Establish and rely upon fall-back positions. Using a minor matter or element of the facts, take the 'high road' and 'confess' with candor that some innocent mistake, in hindsight, was made -- but that opponents have seized on the opportunity to blow it all out of proportion and imply greater criminalities which 'just isn't so.' Others can reinforce this on your behalf, later, and even publicly 'call for an end to the nonsense' because you have already 'done the right thing.' Done properly, this can garner sympathy and respect for 'coming clean' and 'owning up' to your mistakes without addressing more serious issues.

12. Enigmas have no solution.  Drawing upon the overall umbrella of events surrounding the crime and the multitude of players and events, paint the entire affair as too complex to solve. This causes those otherwise following the matter to begin to lose interest more quickly without having to address the actual issues.
[add example: 'Thermite is available on eBay and it is untraceable, so I guess we'll never know who did it.' (JW, 2007)]

13. Alice in Wonderland Logic. Avoid discussion of the issues by reasoning backwards or with an apparent deductive logic which forbears any actual material fact.

14. Demand complete solutions. Avoid the issues by requiring opponents to solve the crime at hand completely, a ploy which works best with issues qualifying for rule 10.
[add example: 'Exactly how much energy would be required to pulverize the WTC Towers?' The authors of the DEW paper are asked this on a regular basis as if there is a question as to whether or not the WTC was destroyed. But, those with other theories who ask this question feel no need to answer the same question themselves. (JW, 2007)]

15. Fit the facts to alternate conclusions. This requires creative thinking unless the crime was planned with contingency conclusions in place.
[add: Show a photo of a toasted car on FDR Drive and then emphasize how well "thermite fits the data". (JW, 2007)]

16. Vanish evidence and witnesses.  If it does not exist, it is not fact, and you won't have to address the issue.
[add: We are frequently reminded to pine for the evidence we don't have (the missing steel) instead of looking at the evidence we do have (photos). (JW, 2007)]

17. Change the subject. Usually in connection with one of the other ploys listed here, find a way to side-track the discussion with abrasive or controversial comments in hopes of turning attention to a new, more manageable topic. This works especially well with companions who can 'argue' with you over the new topic and polarize the discussion arena in order to avoid discussing more key issues.

18. Emotionalize, Antagonize, and Goad Opponents. If you can't do anything else, chide and taunt your opponents and draw them into emotional responses which will tend to make them look foolish and overly motivated, and generally render their material somewhat less coherent. Not only will you avoid discussing the issues in the first instance, but even if their emotional response addresses the issue, you can further avoid the issues by then focusing on how 'sensitive they are to criticism.'

19. Ignore proof presented, demand impossible proofs. This is perhaps a variant of the 'play dumb' rule. Regardless of what material may be presented by an opponent in public forums, claim the material irrelevant and demand proof that is impossible for the opponent to come by (it may exist, but not be at his disposal, or it may be something which is known to be safely destroyed or withheld, such as a murder weapon). In order to completely avoid discussing issues, it may be required that you categorically deny and be critical of media or books as valid sources, deny that witnesses are acceptable, or even deny that statements made by government or other authorities have any meaning or relevance.
[add example: Thermite cannot explain the cylindrical holes in WTC6 and the toasted cars, so that data must be ignored. (JW, 2007)]

20. False evidence. Whenever possible, introduce new facts or clues designed and manufactured to conflict with opponent presentations -- as useful tools to neutralize sensitive issues or impede resolution. This works best when the crime was designed with contingencies for the purpose, and the facts cannot be easily separated from the fabrications.
[add example: 'We have new data (from mysterious and secret samples and test methods) that show strong evidence of possible ___'. (JW, 2007)]

21. Call a Grand Jury, Special Prosecutor, or other empowered investigative body. Subvert the (process) to your benefit and effectively neutralize all sensitive issues without open discussion. Once convened, the evidence and testimony are required to be secret when properly handled. For instance, if you own the prosecuting attorney, it can ensure a Grand Jury hears no useful evidence and that the evidence is sealed and unavailable to subsequent investigators. Once a favorable verdict is achieved, the matter can be considered officially closed. Usually, this technique is applied to find the guilty innocent, but it can also be used to obtain charges when seeking to frame a victim.

22. Manufacture a new truth. Create your own expert(s), group(s), author(s), leader(s) or influence existing ones willing to forge new ground via scientific, investigative, or social research or testimony which concludes favorably. In this way, if you must actually address issues, you can do so authoritatively.
[add example: 'If not able to take over 'Scholars for 9/11 Truth,' then start a new group, 'Scholars for 9/11 Truth and Justice' and establish a new "truth." (But, isn't truth its own defense?)]

23. Create bigger distractions. If the above does not seem to be working to distract from sensitive issues, or to prevent unwanted media coverage of unstoppable events such as trials, create bigger news stories (or treat them as such) to distract the multitudes.
[add example: Why would a group of folks want to destroy the organization, Scholars for 9/11 Truth and drag it out for many months with multiple emails a day proposing endless negotiations with no intention of following through on any of them? (JW, 2007)]

24. Silence critics. If the above methods do not prevail, consider removing opponents from circulation by some definitive solution so that the need to address issues is removed entirely. This can be by their death, arrest and detention, blackmail, or destruction of their character by release of blackmail information, or merely by destroying them financially, emotionally, or by severely damaging their health.
[add example: Murder the student of a prominent 9/11 researcher. (JW, 2007)]

25. Vanish. If you are a key holder of secrets or otherwise overly illuminated and you think the heat is getting too hot, to avoid the issues, vacate the kitchen.


Note: There are other ways to attack truth, but these listed are the most common, and others are likely derivatives of these. In the end, you can usually spot the professional disinfo players by one or more of seven (now 8) distinct traits:

Amendments
by Judy Wood

26. Isolate and intimidate. On a group email, the troublemaker is replied to individually in an intimidating tone. This includes removing certain "troublemakers" from a group emailing that is designed to promote propaganda that the troublemakers can easily disprove.


Eight Traits of the Disinformationalist 
by H. Michael Sweeney


1) Avoidance. They never actually discuss issues head-on or provide constructive input, generally avoiding citation of references or credentials. Rather, they merely imply this, that, and the other. Virtually everything about their presentation implies their authority and expert knowledge in the matter without any further justification for credibility.

2) Selectivity. They tend to pick and choose opponents carefully, either applying the hit-and-run approach against mere commentators supportive of opponents, or focusing heavier attacks on key opponents who are known to directly address issues. Should a commentator become argumentative with any success, the focus will shift to include the commentator as well.
[add example: Hold "conferences" but do not invite key opponents. Decline all invitations to events where key opponents will be present. (JW, 2007)]

3) Coincidental. They tend to surface suddenly and somewhat coincidentally with a new controversial topic with no clear prior record of participation in general discussions in the particular public arena involved. They likewise tend to vanish once the topic is no longer of general concern. They were likely directed or elected to be there for a reason, and vanish with the reason.
[add example: Manage a Journal where content is carefully managed. Reject submissions by opponents and accept ad hominem hit pieces attacking opponents, regardless of how much they may undermine the credibility of the Journal. (JW, 2007)]

4) Teamwork. They tend to operate in self-congratulatory and complementary packs or teams. Of course, this can happen naturally in any public forum, but there will likely be an ongoing pattern of frequent exchanges of this sort where professionals are involved. Sometimes one of the players will infiltrate the opponent camp to become a source for straw man or other tactics designed to dilute opponent presentation strength.

5) Anti-conspiratorial. They almost always have disdain for 'conspiracy theorists' and, usually, for those who in any way believe JFK was not killed by LHO. Ask yourself why, if they hold such disdain for conspiracy theorists, they focus on defending a single topic discussed in a NG focusing on conspiracies? One might think they would either be trying to make fools of everyone on every topic, or simply ignore the group they hold in such disdain. Or, one might more rightly conclude they have an ulterior motive for their actions in going out of their way to focus as they do.
[add example: Omit including the "official government story" of 9/11 as a conspiracy theory. (JW, 2007)]

6) Artificial Emotions. An odd kind of 'artificial' emotionalism and an unusually thick skin -- an ability to persevere and persist even in the face of overwhelming criticism and unacceptance. This likely stems from intelligence community training: no matter how condemning the evidence, deny everything, and never become emotionally involved or reactive. The net result for a disinfo artist is that emotions can seem artificial. Most people, if responding in anger, for instance, will express their animosity throughout their rebuttal. But disinfo types usually have trouble maintaining the 'image' and are hot and cold with respect to pretended emotions and their usually more calm or unemotional communications style. It's just a job, and they often seem unable to 'act their role in character' as well in a communications medium as they might be able in a real face-to-face conversation/confrontation. You might have outright rage and indignation one moment, ho-hum the next, and more anger later -- an emotional yo-yo. With respect to being thick-skinned, no amount of criticism will deter them from doing their job, and they will generally continue their old disinfo patterns without any adjustments to criticisms of how obvious it is that they play that game -- where a more rational individual who truly cares what others think might seek to improve their communications style, substance, and so forth, or simply give up.

7) Inconsistent. There is also a tendency to make mistakes which betray their true self/motives. This may stem from not really knowing their topic, or it may be somewhat 'Freudian', so to speak, in that perhaps they really root for the side of truth deep within.

I have noted that often, they will simply cite contradictory information which neutralizes itself and the author. For instance, one such player claimed to be a Navy pilot, but blamed his poor communicating skills (spelling, grammar, incoherent style) on having only a grade-school education. I'm not aware of too many Navy pilots who don't have a college degree. Another claimed no knowledge of a particular topic/situation but later claimed first-hand knowledge of it.

8) BONUS TRAIT: Time Constant. Recently discovered, with respect to News Groups, is the response-time factor. There are three ways this can be seen to work, especially when the government or other empowered player is involved in a cover-up operation:
1) ANY NG posting by a targeted proponent of truth can result in an IMMEDIATE response. The government and other empowered players can afford to pay people to sit there and watch for an opportunity to do some damage. SINCE DISINFO IN A NG ONLY WORKS IF THE READER SEES IT - FAST RESPONSE IS CALLED FOR, or the visitor may be swayed towards truth.
2) When dealing in more direct ways with a disinformationalist, such as email, DELAY IS CALLED FOR - there will usually be a minimum of a 48-72 hour delay. This allows a sit-down team discussion on response strategy for best effect, and even enough time to 'get permission' or instruction from a formal chain of command.
3) In the NG example 1) above, it will often ALSO be seen that bigger guns are drawn and fired after the same 48-72 hour delay - the team approach in play. This is especially true when the targeted truth seeker or their comments are considered more important with respect to potential to reveal truth. Thus, a serious truth sayer will be attacked twice for the same sin.


I close with the first paragraph of the introduction to my unpublished book, Fatal Rebirth:

Truth cannot live on a diet of secrets, withering within entangled lies. Freedom cannot live on a diet of lies, surrendering to the veil of oppression. The human spirit cannot live on a diet of oppression, becoming subservient in the end to the will of evil. God, as truth incarnate, will not long let stand a world devoted to such evil. Therefore, let us have the truth and freedom our spirits require... or let us die seeking these things, for without them, we shall surely and justly perish in an evil world.





************************************************************************
SYMPTOMS OF PATHOLOGICAL SKEPTICISM (c)1996 William J. Beaty 
************************************************************************

Many members of the mainstream scientific community react with extreme hostility when presented with certain claims. This can be seen in their emotional responses to current controversies such as UFO abductions, Cold Fusion, cryptozoology, psi, and numerous others. The scientists react not with pragmatism and a wish to get to the bottom of things, but instead with the same tactics religious groups use to suppress heretics: hostile emotional attacks, circular reasoning, dehumanizing of the 'enemy', extreme closed-mindedness, intellectually dishonest reasoning, underhanded debating tactics, negative gossip, and all manner of name-calling and character assassination.
Two can play at that game! Therefore, I call their behavior "Pathological Skepticism," a term I base upon skeptics' assertion that various unacceptable ideas are "Pathological Science." Below is a list of the symptoms of pathological skepticism I have encountered, and examples of the irrational reasoning they tend to produce.
(Note: all the quotes are artificial examples)

  1. Belief that theories determine phenomena, rather than the reverse. "The phenomenon you have observed is impossible, crazy stuff. We know of no mechanism which could explain your results, so we have grave suspicions about the accuracy of your report. There is no room for your results in modern theory, so they simply cannot exist. You are obviously the victim of errors, hoaxers, or self-delusion. We need not publish your paper, and any attempts at replicating your results would be a waste of time. Your requests for funding are misguided, and should be turned down."


  2. Erecting barriers against new ideas by constantly altering the requirements for acceptance. (A practice called "moving the goalposts.") "I'll believe it when 'X' happens" (but when it does, this is immediately changed to: "I'll believe it when 'Y' happens.")
    Example:
    "I won't believe it until major laboratories publish papers in this field. They have? That means nothing! Major labs have been wrong before. I'll believe it when stores sell products which use the effect. They do? That means nothing, after all, stores sell magic healing pendants and Ouija boards. I'll believe it when a Nobel Prize winning researcher gets behind that work. One has? Well that means nothing! That person is probably old and dotty like Dr. Pauling and his vitamin-C..." etc.


  3. Belief that fundamental concepts in science rarely change, coupled with a "herd following" behavior where the individual changes his/her opinions when colleagues all do, all the while remaining blind to the fact that any opinions ever changed. "The study of (space flight, endosymbiosis, drillcore bacteria, child abuse, cold fusion, etc.) has always been a legitimate pursuit. If scientists ever ridiculed the reported evidence or tried to stop such research, it certainly was not a majority of scientists. It must have been just a few misguided souls, and must have happened in the distant past."


  4. Belief that science is guided by consensus beliefs and majority rule, rather than by evidence. Indulging in behavior which reinforces the negative effects of consensus beliefs while minimizing the impact of any evidence which contradicts those beliefs. "I don't care how good your evidence is, I won't believe it until the majority of scientists also find it acceptable. Your evidence cannot be right, because it would mean that hundreds of textbooks and thousands of learned experts are wrong."


  5. Adopting a prejudiced stance against a theory or an observed phenomenon without first investigating the details, then using this as justification for refusing to investigate the details. "Your ideas are obviously garbage. What, try to replicate your evidence? I wouldn't soil my hands. And besides, it would be a terrible waste of time and money, since there's no question about the outcome."


  6. Maintaining an unshakable stance of hostile, intolerant skepticism, and when anyone complains of this, accusing them of paranoid delusion. Remaining blind to scientists' widespread practice of intellectual suppression of unorthodox findings, and to the practice of "expulsion of heretics" through secret, back-room accusations of deviance or insanity. "You say that no one will listen to your ideas, and now the funding for your other projects is cut off for no reason? And colleagues are secretly passing around a petition demanding that you be removed? If you're thinking along THOSE lines, then you obviously are delusional and should be seeking professional help."


  7. Ignoring the lessons of history, and therefore opening the way for repeating them again and again. "Scientists of old ridiculed the germ theory, airplanes, space flight, meteors, etc. They were certain that science of the time had everything figured out, and that major new discoveries were no longer possible. Isn't it good that we researchers of today are much more wise, and such things can no longer happen!"


  8. *Denial* of the lessons of history. An inability to admit that science has made serious mistakes in the past. Maintaining a belief that good ideas and discoveries are never accidentally suppressed by closed-mindedness, then revising history to fit this belief. "Throughout history, the *majority* of scientists never ridiculed flying machines, spacecraft, television, continental drift, reports of ball lightning, meteors, sonoluminescence, etc. These discoveries are not examples of so-called 'paradigm shifts', they are obvious examples of the slow, steady, forward progress made by science!"


  9. Using circular arguments to avoid accepting evidence which supports unusual discoveries, or to prevent publication of this evidence. "I do not have to inspect the evidence because I know it's wrong. I know it's wrong because I've never seen any positive evidence."
    "We will not publish your paper, since these results have not been replicated by any other researchers.We will not publish your paper, since it is merely a replication of work which was done earlier, by other researchers."


  10. Accusing opponents of delusion, lying, or even financial fraud, where no evidence for fraud exists other than the supposed impossibility of evidence being presented. "Don't trust researchers who study parapsychology. They constantly cheat and lie in order to support their strange worldviews. Very few of them have been caught at it, but it's not necessary to do so, since any fool can see that the positive evidence for psi can only be created by people who are either disturbed or dishonest."


  11. Unwarranted confidence that the unknown is in the far distance, not staring us in the face. "Your evidence cannot be real because it's not possible that thousands of researchers could have overlooked it for all these years. If your discovery was real, the scientists who work in that field would already know about it."


  12. Belief that certain fields of science are complete, that scientific revolutions never happen, and that any further progress must occur only in brushing up the details. "Physics is a mature field. Future progress can only lie in increasing the energies of particle accelerators, and in refining the precision of well-known measurements. Your discovery cannot be true, since it would mean we'd have to throw out all our hard-won knowledge about physics."


  13. Excusing the ridicule, trivialization, and the scorn which is directed at 'maverick' ideas and at anomalous evidence. Insisting that sneering and derisive emotional attacks constitute a desirable and properly scientific natural selection force. "It is right that new discoveries be made to overcome large barriers. That way only the good ideas will become accepted. If some important discoveries are suppressed in this process, well, that's just the price we have to pay to defend science against the fast-growing hordes of crackpots who threaten to destroy it."


  14. Justifying any refusal to inspect evidence by claiming a "slippery slope." Using the necessary judicious allocation of time and funding as a weapon to prevent investigation of unusual, novel, or threatening ideas. "If we take your unlikely discovery seriously, all scientists everywhere will have to accept every other crackpot idea too, and then we'll waste all of our time checking out crackpot claims."


  15. A blindness to phenomena which do not fit the current belief system, coupled with a denial that beliefs affect perceptions. "Thomas Kuhn's 'paradigm shifts' and sociology's 'cognitive dissonance' obviously do not apply to average, rational scientists. Scientists are objective, so they are not prone to the psychological failings which plague normal humans. Scientists always welcome any data which indicates a need to revise their current knowledge. Their 'beliefs' don't affect their perceptions; scientists don't have 'beliefs'; science is not a religion!"


  16. A belief that all scientific progress is made by small, safe, obvious steps, that widely-accepted theories are never overturned, and that no new discoveries come from observed anomalies. "All your observations are obviously mistakes. They couldn't possibly be real, because if they were real, it would mean that major parts of current science are wrong, and we would have to rewrite large portions of what we know about physics. This never occurs. Science proceeds by building on earlier works, never by tearing them down. Therefore it is right that we reject evidence which contradicts contemporary theory, and recommend that funding of such research not be continued."


  17. Hiding any evidence of personal past ridicule of ideas which are later proved valid. Profound narcissism; an extreme need to always be right, a fear of having personal errors revealed, and a habit of silently covering up past mistakes. "'X' is obviously ridiculous, and its supporters are crackpots who are giving us a bad name and should be silenced."
    But if X is proved true, the assertion suddenly becomes:
    "Since 'X' is obviously true, it follows that..."


  18. Belief in the lofty status of modern science but with consequent blindness to, and denial of, its faults. A tendency to view shameful events in the history of modern science as being beneficial, and a lack of any desire to fix contemporary problems. "It was right that Dr. Wegener's career was wrecked; that he was treated as a crackpot, ridiculed, and died in shame. His evidence for continental drift convinced no one. And besides, he did not propose a mechanism to explain the phenomenon."


  19. A belief that Business and the Press have no tendency towards closed-mindedness and suppression of novelty, and that their actions are never guided by the publicly-expressed judgement of scientists. "If the Wright Brothers' claims were true, we would be reading about it in all the papers, and flying-machine companies would be springing up left and right. Neither of these is occurring, therefore the Wrights' claims are obviously a lie and a hoax."


  20. Refusing to be swayed when other researchers find evidence supporting unconventional phenomena or theories. If other reputable people change sides and accept the unorthodox view, this is seen as evidence of their gullibility or insanity, not as evidence that perhaps the unconventional view is correct. "I'll believe it when someone like Dr. P believes it."
    But when Dr. P changes sides, this becomes:
    "Dr. P did some great work in his early years, but then he destroyed his career by getting involved with that irrational crackpot stuff."


  21. Elevating skepticism to a lofty position, yet indulging in hypocrisy and opening the way to pathological thinking by refusing to ever cast a critical, SKEPTICAL eye upon the irrational behavior of scoffers. "Criticizing skeptics is never beneficial. It even represents a danger to science. One should never criticize science, it just gives ammunition to the enemy; it aids the irrational, anti-science hordes who would destroy our fragile edifice."


  22. Belief that modern scientists as a group lack faults, and therefore clinging to any slim justifications in order to ignore the arguments of those who hope to eliminate the flaws in Science. "I think we can safely ignore Thomas Kuhn's THE STRUCTURE OF SCIENTIFIC REVOLUTIONS. Despite his physics training we can see that Kuhn was an outsider to science; he obviously doesn't have a good grasp on real science. Outsiders never can see things in the proper positive light; it takes a working scientist to see the real situation. Also, he stressed his central themes way too much, so I think we can ignore him as simply being a sensationalist. And besides, if he's digging up dirt regarding science, then he must have a hidden agenda. I bet we'll find that he's a Christian or something, probably a creationist."


  23. Blindness to the widespread existence of the above symptoms. Belief that scientists are inherently objective, and rarely fall victim to these faults. Excusing the frequent appearance of these symptoms as being isolated instances which do not comprise an accumulation of evidence for the common practice of Pathological Skepticism.

    "This 'Pathological Skepticism' does not exist. Kooks and crackpots deserve the hostile mistreatment we give them, but anyone who does similar things to skeptics is terribly misguided. Those who criticize skeptics are a danger to Science itself, and we must stop them."
See also:
Zen and the art of debunkery, Dan Drasin
http://amasci.com/pathskep.html
Seven Warning Signs of Bogus Skepticism
http://mathpost.la.asu.edu/~boerner/seven%20warning%20signs.html




Zen and the art of debunkery
by Dan Drasin

http://amasci.com/pathskep.html

ZEN. . . AND THE ART OF DEBUNKERY

(C) 1993 by Daniel Drasin. All rights reserved. May not be reproduced without permission. May be posted electronically provided that it is transmitted unaltered, in its entirety, and without charge. File begins and ends with ####, and totals 27,251 bytes. Also available as a handy, attractive booklet for $6.50 each, postpaid. Send checks to Daniel Drasin, P.O. Box 1772, Boulder, CO 80306. Allow 2-3 weeks for shipment, as I am often out of town. Order a dozen as gifts for your skeptical friends!

---------------------------------------------------------

So you've had a close encounter with a UFO or its occupants. Or a serious interest in the subject of extramundane life. Or a passion for following clues that seem to point toward the existence of a greater reality. Mention any of these things to most working scientists and be prepared for anything from patronizing skepticism to merciless ridicule. After all, science is supposed to be a purely hardnosed enterprise with little patience for "expanded" notions of reality. Right?

Wrong. Like all systems of truth seeking, science, properly conducted, has a profoundly expansive, spiritual impulse at its core. This "Zen" in the heart of science is revealed when the practitioner sets aside arbitrary beliefs and cultural preconceptions, and approaches the nature of things with "beginner's mind." When this is done, reality can speak freshly and freely, and can be heard more clearly. Appropriate testing and objective validation can--indeed, *must*--come later.

Seeing with humility, curiosity and fresh eyes was once the main point of science. But today it is often a different story. As the scientific enterprise has been bent toward exploitation, institutionalization, hyperspecialization and new orthodoxy, it has increasingly preoccupied itself with disconnected facts in a spiritual, psychological, social and ecological vacuum. Virtually gone from the scene is the philosopher-scientist, to whom meaning and context were once the very fabric of a multi-level universe. Today's mainstream science tends, instead, to deny or disregard entire domains of reality, and satisfies itself with reducing all of life and consciousness to a dead physics.

As we approach the end of the millennium, science seems in many ways to be treading the weary path of the religions it presumed to replace. Where free, dispassionate inquiry once reigned, emotions now run high in the defense of a fundamentalized "scientific truth." As anomalies mount up beneath a sea of denial, defenders of the Faith and the Kingdom cling with increasing self-righteousness to the hull of a sinking paradigm. Faced with provocative evidence of things undreamt of in their materialist philosophy, many otherwise mature scientists revert to a kind of skeptical infantilism characterized by blind faith in the absoluteness of the familiar. Small wonder that, after more than half a century, the UFO remains shrouded in superstition, ignorance, denial, disinformation, taboo . . . and debunkery.

What is "debunkery?" As intended here, it is the attempt to *debunk* (invalidate) new information and insight by substituting scient*istic* propaganda for scient*ific* method. To throw this kind of pseudoscientific behavior into bold--if somewhat comic--relief, I have assembled below a useful "how-to" guide for aspiring debunkers, with a special section devoted to debunking the UFO--perhaps the most aggressively debunked subject in the whole of modern history.
As will be obvious to the reader, I have carried a few of these debunking strategies over the threshold of absurdity for the sake of making a point. As for the rest, their inherently fallacious reasoning, twisted logic and sheer goofiness will sound frustratingly familiar to those who have dared explore beneath the ocean of denial and attempted in good faith to report back about what they found there. So without further ado . . .

== HOW TO DEBUNK JUST ABOUT ANYTHING ==

*PART 1: GENERAL DEBUNKERY*

<> Before commencing to debunk, prepare your equipment. Equipment needed: one armchair.

<> Put on the right face. Cultivate a condescending air that suggests that your personal opinions are backed by the full faith and credit of God. Employ vague, subjective, dismissive terms such as "ridiculous" or "trivial" in a manner that suggests they have the full force of scientific authority.

<> Portray science not as an open-ended process of discovery but as a holy war against unruly hordes of quackery-worshipping infidels. Since in war the ends justify the means, you may fudge, stretch or violate scientific method, or even omit it entirely, in the name of defending scientific method.

<> Keep your arguments as abstract and theoretical as possible. This will "send the message" that accepted theory overrides any actual evidence that might challenge it--and that therefore no such evidence is worth examining.

<> Reinforce the popular misconception that certain subjects are inherently unscientific. In other words, deliberately confuse the *process* of science with the *content* of science. (Someone may, of course, object that science must be neutral to subject matter and that only the investigative *process* can be scientifically responsible or irresponsible. If that happens, dismiss such objections using a method employed successfully by generations of politicians: simply reassure everyone that "there is no contradiction here.")

<> Arrange to have your message echoed by persons of authority. The degree to which you can stretch the truth is directly proportional to the prestige of your mouthpiece.

<> Always refer to unorthodox statements as "claims," which are "touted," and to your own assertions as "facts," which are "stated."

<> Avoid examining the actual evidence. This allows you to say with impunity, "I have seen absolutely no evidence to support such ridiculous claims!" (Note that this technique has withstood the test of time, and dates back at least to the age of Galileo. By simply refusing to look through his telescope, the ecclesiastical authorities bought the Church over three centuries' worth of denial free and clear!)

<> If examining the evidence becomes unavoidable, report back that "there is nothing new here!" If confronted by a watertight body of evidence that has survived the most rigorous tests, simply dismiss it as being "too pat."

<> Equate the necessary skeptical component of science with *all* of science. Emphasize the narrow, stringent, rigorous and critical elements of science to the exclusion of intuition, inspiration, exploration and integration. If anyone objects, accuse them of viewing science in exclusively fuzzy, subjective or metaphysical terms.

<> Insist that the progress of science depends on explaining the unknown in terms of the known. In other words, science equals reductionism. You can apply the reductionist approach in any situation by discarding more and more and more evidence until what little is left can finally be explained entirely in terms of established knowledge.

<> Downplay the fact that free inquiry, legitimate disagreement and respectful debate are a normal part of science.

<> At every opportunity reinforce the notion that what is familiar is necessarily rational. The unfamiliar is therefore irrational, and consequently inadmissible as evidence.

<> State categorically that the unconventional arises exclusively from the "will to believe" and may be dismissed as, at best, an honest misinterpretation of the conventional.

<> Maintain that in investigations of unconventional phenomena, a single flaw invalidates the whole. In conventional contexts, however, you may sagely remind the world that, "after all, situations are complex and human beings are imperfect."

<> "Occam's Razor," or the "principle of parsimony," says the correct explanation of a mystery will usually involve the simplest fundamental principles. Insist, therefore, that the most familiar explanation is by definition the simplest! Imply strongly that Occam's Razor is not merely a philosophical rule of thumb but an immutable law.

<> Discourage any study of history that may reveal today's dogma as yesterday's heresy. Likewise, avoid discussing the many historical, philosophical and spiritual parallels between science and democracy.

<> Since the public tends to be unclear about the distinction between evidence and proof, do your best to help maintain this murkiness. If absolute proof is lacking, state categorically that there is no evidence.

<> If sufficient evidence has been presented to warrant further investigation of an unusual phenomenon, argue that "evidence alone proves nothing!" Ignore the fact that preliminary evidence is not supposed to prove *anything*.

<> In any case, imply that proof precedes evidence. This will eliminate the possibility of initiating any meaningful process of investigation--particularly if no criteria of proof have yet been established for the phenomenon in question.

<> Insist that criteria of proof cannot possibly be established for phenomena that do not exist!

<> Although science is not supposed to tolerate vague or double standards, always insist that unconventional phenomena must be judged by a separate, yet ill-defined, set of scientific rules. Do this by declaring that "extraordinary claims demand extraordinary evidence"--but take care never to define where the "ordinary" ends and the "extraordinary" begins. This will allow you to manufacture an infinitely receding evidential horizon, i.e., to define "extraordinary" evidence as that which lies just out of reach at any point in time.

<> Practice debunkery-by-association. Lump together all phenomena popularly deemed paranormal and suggest that their proponents and researchers speak with a single voice. In this way you can indiscriminately drag material across disciplinary lines or from one case to another to support your views as needed. For example, if a claim having some superficial similarity to the one at hand has been (or is popularly assumed to have been) exposed as fraudulent, cite it as if it were an appropriate example. Then put on a gloating smile, lean back in your armchair and just say "I rest my case."

<> Use the word "imagination" as an epithet that applies only to seeing what's *not* there, and not to denying what *is* there.

<> If a significant number of people agree that they have observed something that violates the consensus reality, simply ascribe it to "mass hallucination." Avoid addressing the possibility that the consensus reality, which is routinely observed by millions, might itself constitute a mass hallucination.

<> Ridicule, ridicule, ridicule. It is far and away the single most chillingly effective weapon in the war against discovery and innovation. Ridicule has the unique power to make people of virtually any persuasion go completely unconscious in a twinkling. It fails to sway only those few who are of sufficiently independent mind not to buy into the kind of emotional consensus that ridicule provides.

<> By appropriate innuendo and example, imply that ridicule constitutes an essential feature of scientific method that can raise the level of objectivity, integrity and dispassionateness with which any investigation is conducted.

<> Imply that investigators of the unorthodox are zealots. Suggest that in order to investigate the existence of something one must first believe in it absolutely. Then demand that all such "true believers" know all the answers to their most puzzling questions in complete detail ahead of time. Convince people of your own sincerity by reassuring them that you yourself would "love to believe in these fantastic phenomena." Carefully sidestep the fact that science is not about believing or disbelieving, but about finding out.

<> Use "smoke and mirrors," i.e., obfuscation and illusion. Never forget that a slippery mixture of fact, opinion, innuendo, out-of-context information and outright lies will fool most of the people most of the time. As little as one part fact to ten parts B.S. will usually do the trick. (Some veteran debunkers use homeopathic dilutions of fact with remarkable success!) Cultivate the art of slipping back and forth between fact and fiction so undetectably that the flimsiest foundation of truth will always appear to firmly support your entire edifice of opinion.

<> Employ "TCP": Technically Correct Pseudo-refutation. Example: if someone remarks that all great truths began as blasphemies, respond immediately that not all blasphemies have become great truths. Because your response was technically correct, no one will notice that it did not really refute the original remark.

<> Trivialize the case by trivializing the entire field in question. Characterize the study of orthodox phenomena as deep and time-consuming, while deeming that of unorthodox phenomena so insubstantial as to demand nothing more than a scan of the tabloids. If pressed on this, simply say "but there's nothing there to study!" Characterize any serious investigator of the unorthodox as a "buff" or "freak," or as "self-styled"--the media's favorite code-word for "bogus."

<> Remember that most people do not have sufficient time or expertise for careful discrimination, and tend to accept or reject the whole of an unfamiliar situation. So discredit the whole story by attempting to discredit *part* of the story. Here's how: a) take one element of a case completely out of context; b) find something prosaic that hypothetically could explain it; c) declare that therefore that one element has been explained; d) call a press conference and announce to the world that the entire case has been explained!

<> Engage the services of a professional stage magician who can mimic the phenomenon in question; for example, ESP, psychokinesis or levitation. This will convince the public that the original claimants or witnesses to such phenomena must themselves have been (or been fooled by) talented stage magicians who hoaxed the original phenomenon in precisely the same way.

<> Find a prosaic phenomenon that resembles, no matter how superficially, the claimed phenomenon. Then suggest that the existence of the commonplace look-alike somehow forbids the existence of the genuine article. For example, imply that since people often see "faces" in rocks and clouds, the enigmatic Face on Mars must be a similar illusion and therefore cannot possibly be artificial.

<> When an unexplained phenomenon demonstrates evidence of intelligence (as in the case of the mysterious crop circles) focus exclusively on the mechanism that might have been wielded by the intelligence rather than the intelligence that might have wielded the mechanism. The more attention you devote to the mechanism, the more easily you can distract people from considering the possibility of nonphysical or nonterrestrial intelligence.

<> Accuse investigators of unusual phenomena of believing in "invisible forces and extrasensory realities." If they should point out that the physical sciences have *always* dealt with invisible forces and extrasensory realities (gravity? electromagnetism? . . . ) respond with a condescending chuckle that this is "a naive interpretation of the facts."

<> Insist that western science is completely objective, and is based on no untestable assumptions, covert beliefs or ideological interests. If an unfamiliar or inexplicable phenomenon happens to be considered true and/or useful by a nonwestern or other traditional society, you may therefore dismiss it out of hand as "ignorant misconception," "medieval superstition" or "fairy lore."

<> Label any poorly-understood phenomenon "occult," "paranormal," "metaphysical," "mystical" or "supernatural." This will get most mainstream scientists off the case immediately on purely emotional grounds. If you're lucky, this may delay any responsible investigation of such phenomena by decades or even centuries!

<> Ask questions that appear to contain generally-assumed knowledge that supports your views; for example, "why do no police officers, military pilots, air traffic controllers or psychiatrists report UFOs?" (If someone points out that they do, insist that those who do must be mentally unstable.)

<> Ask unanswerable questions based on arbitrary criteria of proof. For example, "if this claim were true, why haven't we seen it on TV?" or "in this or that scientific journal?" Never forget the mother of all such questions: "If UFOs are extraterrestrial, why haven't they landed on the White House lawn?"

<> Remember that you can easily appear to refute anyone's claims by building "straw men" to demolish. One way to do this is to misquote them while preserving that convincing grain of truth; for example, by acting as if they have intended the extreme of any position they've taken. Another effective strategy with a long history of success is simply to misreplicate their experiments--or to avoid replicating them at all on grounds that to do so would be ridiculous or fruitless. To make the whole process even easier, respond not to their actual claims but to their claims as reported by the media, or as propagated in popular myth.

<> Insist that such-and-such unorthodox claim is not scientifically testable because no self-respecting grantmaking organization would fund such ridiculous tests.

<> Be selective. For example, if an unorthodox healing method has failed to reverse a case of terminal illness you may deem it worthless, while taking care to avoid mentioning any of the shortcomings of conventional medicine.

<> Hold claimants responsible for the production values and editorial policies of any media or press that reports their claim. If an unusual or inexplicable event is reported in a sensationalized manner, hold this as proof that the event itself must have been without substance or worth.

<> When a witness or claimant states something in a manner that is scientifically imperfect, treat this as if it were not scientific at all. If the claimant is not a credentialed scientist, argue that his or her perceptions cannot possibly be objective.

<> If you're unable to attack the facts of the case, attack the participants--or the journalists who reported the case. Ad-hominem arguments, or personality attacks, are among the most powerful ways of swaying the public and avoiding the issue. For example, if investigators of the unorthodox have profited financially from activities connected with their research, accuse them of "profiting financially from activities connected with their research!" If their research, publishing, speaking tours and so forth, constitute their normal line of work or sole means of support, hold that fact as "conclusive proof that income is being realized from such activities!" If they have labored to achieve public recognition for their work, you may safely characterize them as "publicity seekers."

<> Fabricate supportive expertise as needed by quoting the opinions of those in fields popularly assumed to include the necessary knowledge. Astronomers, for example, may be trotted out as experts on the UFO question, although course credits in ufology have never been a prerequisite for a degree in astronomy.

<> Fabricate confessions. If a phenomenon stubbornly refuses to go away, set up a couple of colorful old geezers to claim they hoaxed it. The press and the public will always tend to view confessions as sincerely motivated, and will promptly abandon their critical faculties. After all, nobody wants to appear to lack compassion for self-confessed sinners.

<> Fabricate sources of disinformation. Claim that you've "found the person who started the rumor that such a phenomenon exists!"

<> Fabricate entire research projects. Declare that "these claims have been thoroughly discredited by the top experts in the field!" Do this whether or not such experts have ever actually studied the claims, or, for that matter, even exist.

*PART 2: DEBUNKING THE UFO*

<> Point out that an "unidentified" flying object is just that, and cannot be automatically assumed to be extraterrestrial. Do this whether or not anyone involved *has* assumed it to be extraterrestrial.

<> Equate nature's laws with our current understanding of nature's laws. Then label all concepts such as antigravity or interdimensional mobility as mere flights of fancy "because obviously they would violate nature's laws." Then if a UFO is reported to have hovered silently, made right-angle turns at supersonic speeds or appeared and disappeared instantly, you may summarily dismiss the report.

<> Declare that there is no proof that life can exist in outer space. Since most people still behave as if the Earth were the center of the universe, you may safely ignore the fact that Earth, which is already in outer space, has abundant life.

<> Point out that the government-sponsored SETI program assumes in advance that extraterrestrial intelligence can only exist light-years away from Earth. Equate this a-priori assumption with conclusive proof; then insist that this invalidates all terrestrial reports of ET contact.

<> When someone produces purported physical evidence of alien technology, point out that no analysis can prove that its origin was extraterrestrial; after all, it might be the product of some perfectly ordinary, ultra-secret underground government lab. The only exception would be evidence obtained from a landing on the White House lawn--the sole circumstance universally agreed upon by generations of skeptics as conclusively certifying extraterrestrial origin!

<> If photographs or other visual media depicting a UFO have been presented, argue that since images can now be digitally manipulated they prove nothing. Assert this regardless of the vintage of the material or the circumstances of its acquisition. Insist that the better the quality of a UFO photo, the greater the likelihood of fraud. Photos that have passed every known test may therefore be held to be the most perfectly fraudulent of all!

<> If you can't otherwise destroy the credibility of a UFO photo, plant a small model of the alleged craft near the photographer's home where it can be conveniently discovered and whisked off to the local media. The model need not resemble the original too closely; as long as the press says it's a dead ringer nobody will question the implication of fraud.

<> Argue that all reports of humanoid extraterrestrials must be bogus because the evolution of the humanoid form on Earth is the result of an infinite number of accidents in a genetically isolated environment. Avoid addressing the logical proposition that if interstellar visitations have occurred, Earth cannot be considered genetically isolated in the first place.

<> Argue that extraterrestrials would or wouldn't, should or shouldn't, can or can't behave in certain ways because such behavior would or wouldn't be logical. Base your notions of logic on how terrestrials would or wouldn't behave. Since terrestrials behave in all kinds of ways you can theorize whatever kind of behavior suits your arguments.

<> Stereotype contact claims according to simplistic scenarios already well established in the collective imagination. If a reported ET contact appears to have had no negative consequences, sarcastically accuse the claimant of believing devoutly that "benevolent ETs have come to magically save us from destroying ourselves!" If someone claims to have been traumatized by an alien contact, brush it aside as "a classic case of hysteria." If contactees stress the essential humanness and limitations of certain ETs they claim to have met, ask "why haven't these omnipotent beings offered to solve all our problems for us?"

<> Ask why alleged contactees and abductees haven't received alien infections. Reject as "preposterous" all medical evidence suggesting that such may in fact have occurred. Categorize as "pure science-fiction" the notion that alien understandings of immunology might be in advance of our own, or that sufficiently alien microorganisms might be limited in their ability to interact with our biological systems. Above all, dismiss anything that might result in an actual investigation of the matter.

<> Travel to China. Upon your return, report that "nobody there told me they had seen any UFOs." Insist that this proves that no UFOs are reported outside countries whose populations are overexposed to science fiction.

<> Where hypnotic regression has yielded consistent contactee testimony in widespread and completely independent cases, argue that hypnosis is probably unreliable, and is always worthless in the hands of non-credentialed practitioners. Be sure to add that the subjects must have been steeped in the UFO literature, and that, whatever their credentials, the hypnotists involved must have been asking leading questions.

<> If someone claims to have been emotionally impacted by a contact experience, point out that strong emotions can alter perceptions. Therefore, the claimant's recollections must be entirely untrustworthy.

<> Maintain that there cannot possibly be a government UFO coverup . . . but that it exists for legitimate reasons of national security!

<> Accuse conspiracy theorists of being conspiracy theorists and of believing in conspiracies! Insist that only *accidentalist* theories can possibly account for repeated, organized patterns of suppression, denial and disinformational activity.

<> Argue that since theoretically there can be no press censorship in the United States, there is no press censorship in the United States.

<> In the event of a worst-case scenario--for example, one in which the UFO is suddenly acknowledged as a global mystery of millennial proportions--just remember that the public has a short memory. Simply say dismissively, "Well, everyone knows this is a monumentally significant issue. As a matter of fact, my colleagues and I have been remarking on it for years!"

* * *

Daniel Drasin is a media producer, writer, musician and award-winning cinematographer with a passionate interest in the field of New Science. He lives in Boulder, Colorado and chases flying saucers in his spare time.

* * *