Posts Tagged ‘WMD’s’

CNN | Mar 22, 2010

By Tom Watkins

About 1 million children in the United States and about 30 million worldwide have gotten Rotarix vaccine, the FDA says.

(CNN) — Federal health authorities recommended Monday that doctors suspend using Rotarix, one of two vaccines licensed in the United States against rotavirus, saying the vaccine is contaminated with material from a pig virus.

“There is no evidence at this time that this material poses a safety risk,” Food and Drug Administration Commissioner Dr. Margaret Hamburg told reporters in a conference call.

Rotarix, made by GlaxoSmithKline, was approved by the FDA in 2008. The contaminant material is DNA from porcine circovirus 1, a virus from pigs that is not known to cause disease in humans or animals, Hamburg said.

About 1 million children in the United States and about 30 million worldwide have gotten Rotarix vaccine, she said.

Raw Story | Apr 2nd, 2010

In what is being hailed as a major victory for workers in the biotech and nanotech fields, a former scientist with pharmaceutical firm Pfizer has been awarded $1.37 million for being fired after raising the alarm over researchers being infected with a genetically engineered “AIDS-like” virus.

Becky McClain, a molecular biologist from Deep River, Connecticut, filed a lawsuit against Pfizer in 2007, claiming she had been wrongly terminated for complaining about faulty safety equipment that allowed a “dangerous lentivirus” to infect her and some of her colleagues.

The Hartford Courant describes the virus as “similar to the one that can lead to acquired immune deficiency syndrome, or AIDS.” Health experts testified that the virus has affected the way McClain’s body processes potassium, which they say causes McClain to suffer complete paralysis as often as a dozen times per month, the Courant reports.

McClain’s lawsuit (PDF) asserted that Pfizer had interfered with her right to free speech, and that she should have been protected from retaliation by whistleblower legislation.

Government Computer News:

Technology has always been essential to military strength, but breakthroughs developed within the military often are not limited to weapons. This special report introduces some of the Pentagon’s most advanced information technology projects, in the context of their relation to commercial products and battlefield necessities.

The Defense Advanced Research Projects Agency has fostered technologies ranging from the Internet to artificial intelligence research. Nowadays, the scientists it supports are pushing IT ever closer to achieving the processing power and cognitive awareness of living beings. Even as DARPA applies technology to the pressing threats posed by current conflicts, the agency is sponsoring more than a dozen innovative projects, including a bid to perfect cheap, extremely accurate and nonradioactive atomic clocks for use in battlefield systems.

Advances in the mathematical algorithms for cryptography and the processing muscle behind them soon will transform the platforms that handle cascades of classified data, for example. National Security Agency officials characterize their work as a process of continuous ploy and counterploy in the rarefied realms of logic and computing.

The Grand Challenge of bringing practical, remotely piloted or autonomous land vehicles into use also is advancing via the competitive work of several teams. And in its approach to supercomputing, the Defense Department could be changing the way high-performance systems are measured, developed and purchased.

Mutating threats shape DARPA’s research in a wide range of new technologies

In a conflict where the biggest threats to soldiers often are low-tech, homemade explosives, it might not be obvious why troops need a more precise atomic clock to support their efforts. But the Defense Advanced Research Projects Agency is working to deliver such precision, along with 13 other Future Icon technologies that span a range of science and technology, from networking to air vehicles, biology and lasers, DARPA Director Tony Tether said.

The Chip Scale Atomic Clocks (CSACs), for instance, would perform key control functions throughout Pentagon networks and also could help warfighters detect an enemy’s presence.

All the Future Icon projects involve applying computing resources to present and future defense missions, and some directly attack the problem of improving information technology performance, both for existing systems and for futuristic computer architectures.

And they are the types of projects whose impact often extends beyond their original scope, affecting the development of technologies used elsewhere in government and commercially.

“They are tremendously difficult technical challenges that will be hard to solve without fundamentally new approaches — ones which may require bringing multiple disciplines to bear and perhaps even result in entirely new disciplines,” Tether said in testimony submitted recently to the House Armed Services Subcommittee on Terrorism, Unconventional Threats and Capabilities.

One of the most ambitious of the futuristic computer design efforts is a five-year project to build a system modeled on the human brain, one that would reflect and incorporate human assessments of people’s roles and intentions (see sidebar).

Shape shifters
The research agency is also probing highly advanced IT challenges such as the Programmable Matter project, which aims to develop software that would allow physical objects to change their size, shape, color and other attributes to fulfill changing functions within, say, a military communications system.

CSACs would tackle more immediate concerns in defense networks and in helping soldiers detect enemy vehicles and facilities, according to a leading scientist at the National Institute of Standards and Technology who is researching the technology with DARPA support.

DARPA’s research is honing computer-based methods of detecting purposely hidden or naturally elusive enemy targets underground or on the high seas.

The CSAC project has been driven by the increasing need to reliably assure continual synchronization of systems linked via the Global Information Grid, said Thomas O’Brian, chief of the Time and Frequency Division at NIST’s laboratory in Boulder, Colo. The lab receives DARPA funding to support the development of chip-scale atomic clocks.

The tiny clocks could be deployed in hundreds of systems that military organizations at all levels rely on, including not only radios but also radars, sensors and location units that use the Global Positioning System, O’Brian said in an interview. The atomic clocks promise to make GPS systems more reliable while using little power, along with providing other helpful features, such as low weight and small size, he continued.

The CSACs “are significantly more accurate than the quartz crystal units, which have been the standard” for such timekeeping, O’Brian said. The new generation of small clocks relies on the vibration frequency of elements such as cesium and rubidium to maintain steady timekeeping and does not involve radioactive materials.

The tiny clocks can operate for as long as two days or more using the power available in a AA battery, O’Brian said.
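To put such timekeeping in perspective, a rough calculation shows how little error an atomic clock accumulates per day compared with quartz. The fractional stability figures below are assumptions typical of published chip-scale atomic clock and quartz oscillator specifications, not numbers from the article:

```python
# Back-of-envelope for why atomic timekeeping matters for synchronization.
# The stability figures are assumed (typical published specs), not from
# the article: ~1e-11 for a chip-scale atomic clock, ~1e-8 for good quartz.

SECONDS_PER_DAY = 86_400

def drift_per_day(fractional_stability):
    """Worst-case accumulated time error over one day, in microseconds."""
    return fractional_stability * SECONDS_PER_DAY * 1e6

print(f"atomic clock (~1e-11): {drift_per_day(1e-11):.2f} microseconds/day")
print(f"quartz       (~1e-8):  {drift_per_day(1e-8):.0f} microseconds/day")
```

Under these assumed figures, the atomic clock accumulates under a microsecond of error per day, roughly a thousandfold improvement, which is what makes tight network synchronization and GPS-denied navigation plausible.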

“Another aspect of these devices is that they can serve as magnetometers,” he added. As such, the CSACs could sense the presence of metallic objects, such as mines or tanks. “You could scatter them across a wide area so when a Jeep or tank drives over, they might detect it,” O’Brian said. “Or they could detect the presence of ventilating fans in [al Qaeda caves] in Tora Bora [Afghanistan].”

CSACs already have proved themselves in demonstrations using GPS devices, and the technology showed that it could help navigation units function when satellite signals aren’t available, O’Brian said.

Some of the main tasks remaining before the CSACs reach routine use include:

  • Developing efficient, low-cost mass-production methods.
  • Improving the small clocks’ resistance to field conditions such as vibration, temperature and pressure variations and shock.
  • Reducing power consumption.

O’Brian expressed confidence that researchers could soon achieve those improvements.

The research agency’s push in the fields of “detection, precision identification, tracking and destruction of elusive targets” has spawned several research projects. One group of them aims to improve methods for finding and investigating caves, and another centers on tracking seaborne vessels.

The cave research has gained momentum partly from the response of adversary countries’ forces to the success of the Pentagon’s spy satellite technology. Countries such as Iran and North Korea reportedly have built extensive underground facilities to conceal some of their nuclear-weapon production facilities from orbiting sensors.

The underground research spurred by such strategic threats also has led DARPA to study how better cave technology can aid tactical operations, such as by helping soldiers discover enemy troops and weapons lurking in small caves and by helping detect cross-border smuggling tunnels.

The Counter-Underground Facilities program aims at developing sensors, software and related technology to:

  • Pinpoint the power, water, airflow and exhaust vents of cave installations.
  • Evaluate the condition of underground facilities before and after attacks.
  • Monitor activities within cave structures during attacks.

According to DARPA procurement documents, the Pentagon’s cave program began by developing methods to learn about those conditions and other features of caves via Measurement and Signature Intelligence (Masint) technology.

Masint methods involve the use of extremely sophisticated and highly classified technology that can integrate information gathered by various types of sensors, including acoustic, seismic, electromagnetic, chemical, multispectral and gravity-sensing devices.
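While the actual Masint integration methods are classified, the basic idea of combining evidence from several independent sensors can be sketched simply. The sensor names and confidence values below are illustrative assumptions, and the "noisy-OR" rule is just one textbook way to fuse independent detections:

```python
# Sketch of multi-sensor fusion in the spirit described above: combine
# independent per-sensor detection confidences into one overall score.
# Sensor names and numbers are illustrative assumptions, not Masint data.

def fuse_detections(confidences):
    """P(at least one sensor is right), assuming independent sensors (noisy-OR)."""
    p_all_wrong = 1.0
    for p in confidences.values():
        p_all_wrong *= (1.0 - p)
    return 1.0 - p_all_wrong

readings = {
    "acoustic": 0.30,
    "seismic": 0.40,
    "electromagnetic": 0.25,
}

# No single sensor is confident, but together they tell a stronger story.
print(f"fused confidence: {fuse_detections(readings):.3f}")
```

The point of the sketch is that three weak, independent signals (none above 0.4) combine into a substantially stronger indication, which is the core appeal of fusing heterogeneous sensors.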

DARPA’s underground facility research project also involves investigation of the effluents coming from vents connected to cave complexes. Effluents for Vent Hunting research can involve the computerized evaluation of smoke to distinguish, for example, between decoy cooking fires and real cooking fires in an area where hostile forces may be roaming.

On the high seas, the Predictive Analysis for Naval Deployment Activities (PANDA) project is refining its existing technology to track the location and patterns of more than 100,000 vessels and to detect when ships and boats deviate from normally expected behavior.

Suspicious behavior
As such, the PANDA research is similar to other systems that use exception detection to pinpoint unusual behavior by people in airports or train stations. Developers of those counterterrorism systems have carved out the task of teaching systems what types of events to watch for among the countless mundane activities observed via video cameras in the transportation hubs.

Like the PANDA system, the exception-detection software for airports flags unusual events — such as an errant freighter in one case or an unattended satchel in the other — and brings them to the attention of human analysts.
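The exception-detection idea common to PANDA and the airport systems can be sketched in a few lines: learn what normal looks like, then flag large deviations for a human analyst. This is an illustrative toy using a z-score over vessel speeds, not an actual DARPA algorithm:

```python
# Minimal exception-detection sketch: learn a baseline of normal behavior,
# then flag readings that deviate strongly from it. Illustrative only --
# not the actual PANDA system. Speeds are invented example data.
from statistics import mean, stdev

def fit_baseline(observations):
    """Learn the mean and spread of normal behavior (e.g., speeds in knots)."""
    return mean(observations), stdev(observations)

def is_exception(value, baseline, threshold=3.0):
    """Flag a reading more than `threshold` standard deviations from normal."""
    mu, sigma = baseline
    return abs(value - mu) / sigma > threshold

normal_speeds = [12.1, 11.8, 12.4, 12.0, 11.9, 12.3, 12.2, 11.7]
baseline = fit_baseline(normal_speeds)

print(is_exception(12.0, baseline))  # typical speed: not flagged
print(is_exception(25.0, baseline))  # sudden sprint: flagged for an analyst
```

Real systems learn far richer baselines (routes, schedules, co-movement), but the division of labor is the same: software screens the countless mundane events, and humans examine the exceptions.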

At the edges of computer science, DARPA is approaching the problem of attracting and cultivating talent in the field partly by asking promising students to choose projects that strike them as interesting and attractive.

“One of the ideas the students liked is Programmable Matter,” Tether told the congressional subcommittee members. “It is an important idea that is of significant relevance to DOD. The challenge is to build a solid object out of intelligent parts that could be programmed so that it can transform itself into other physical objects in three dimensions. It would do this by changing its color, shape or other characteristics.”

The programmable matter project could, for instance, lead to the invention of a malleable antenna that could change its shape depending on the radio or radar to which it is connected, Tether said.

“The computer science challenges are to identify the algorithms that would allow each element of the object to do its job as the object changes, while staying well coordinated with the other elements and functioning as an ensemble,” he added.

DARPA throws down the challenge on cognitive computing

The Defense Advanced Research Projects Agency’s research in the field of cognitive computing could progress to the point of a Grand Challenge that would pit alternate methods of building brainlike systems against one another.

The agency’s Biologically-Inspired Cognitive Architecture program is pushing artificial intelligence in the direction of building software that mimics human brain functions.

BICA relies on recent advances in cognitive psychology and the science of the human brain’s biological structure to build software that comes much closer to human abilities than previous AI. The research agency’s Information Processing Technology Office is leading the BICA research process by funding research teams based mainly at universities.

AI traces its roots back to designs such as expert systems and neural networks, familiar since the 1980s, which held out the promise of transforming information technology by adopting human learning and thinking methods. Those classic AI approaches proved to be useful in some commercial and government systems but were less effective than conventional IT architectures for most uses.

BICA’s leaders note that AI progress has been slow and steady in recent decades. “However, we have fallen short of creating systems with genuine artificial intelligence — ones that can learn from experience and adapt to changing conditions in the way that humans can,” according to DARPA. “We are able to engineer specialized software solutions for almost any well-defined problem, but our systems still lack the general, flexible learning abilities of human cognition.”

The BICA program has completed its first phase, which commissioned eight research teams to combine recent findings in brain biology and psychology to help build blueprints for functioning computers that could learn and understand like people. In the second phase of the five-year BICA program, which is now under way, the military research agency is seeking proposals for vendor teams to develop and test models of human cognition, or thinking, based on the architectures built in the program’s first year.

DARPA has not yet announced plans for a grand challenge competition to pit the resulting AI-like systems against one another. But vendor documents submitted in response to BICA’s first phase refer to an anticipated challenge stage of the program.

The University of Maryland at College Park provided one of the computer architectures for the first phase of the BICA program, basing some of its research on methods of designing a mobile system that could learn the various skills DARPA seeks in a cognitive system. “We are ultimately interested in [designing] an agent that captures many of the abilities of a child, and thus do not focus on a large initial knowledge base,” the University of Maryland computer scientists wrote.

“We keep the environment and input/output to the system relatively simple so that we can focus on the primary issue of integrating those components and not the important but low-level details that will eventually need to be addressed,” according to their blueprint.

The 14 Future Icon technology areas, as described in testimony by Defense Advanced Research Projects Agency Director Tony Tether before a House committee:

Networks: Self-forming, robust, self-defending networks at the strategic and tactical level are the key to network-centric warfare.

Chip-Scale Atomic Clock: Miniaturizing an atomic clock to fit on a chip to provide very accurate time as required, for instance, in assured network communications.

Global War on Terrorism: Technologies to identify and defeat terrorist activities such as the manufacture and deployment of improvised explosive devices and other asymmetric activities.

Air Vehicles: Manned and unmanned air vehicles that quickly arrive at their mission station and can remain there for very long periods.

Space: The U.S. military’s ability to use space is one of its major strategic advantages, and DARPA is working to ensure the United States maintains that advantage.

High-Productivity Computing Systems: DARPA is working to maintain the U.S. global lead in supercomputing, which is fundamental to a variety of military operations, from weather forecasting to cryptography to the design of new weapons.

Real-Time Accurate Language Translation: Real-time machine language translation of text and speech with near-expert human translation accuracy.

Biological Warfare Defense: Technologies to dramatically accelerate the development and production of vaccines and other medical therapeutics from 12 years to only 12 weeks.

Prosthetics: Developing prosthetics that can be controlled and perceived by the brain, just as with a natural limb.

Quantum Information Science: Exploiting quantum phenomena in the fields of computing, cryptography and communications, with the promise of opening new frontiers in each area.

Newton’s Laws for Biology: DARPA’s Fundamental Laws of Biology program is working to bring deeper mathematical understanding and accompanying predictive ability to the field of biology, with the goal of discovering fundamental laws of biology that extend across all size scales.

Low-Cost Titanium: A completely revolutionary technology for extracting titanium from ore and fabricating it promises to dramatically reduce the cost for military-grade titanium alloy, making it practical for many more applications.

Alternative Energy: Technologies to help reduce the military’s reliance on petroleum.

High-Energy Liquid Laser Area Defense System: Novel, compact, high-power lasers making practical small-size and low-weight speed-of-light weapons for tactical mobile air and ground vehicles.

NSA pushes for adoption of elliptic-curve encryption, whose greater security and shorter key lengths will help secure small, mobile devices

The cryptographic security standards used in public-key infrastructures, RSA and Diffie-Hellman, were introduced in the 1970s. And although they haven’t been cracked, their time could be running out.

That’s one reason the National Security Agency wants to move to elliptic-curve cryptography (ECC) for cybersecurity by 2010, the year the National Institute of Standards and Technology plans to recommend all government agencies move to ECC, said Dickie George, technology director at NSA’s information assurance directorate.

Another reason is that current standards would have to continually extend their key lengths to ensure security, which increases processing time and could make it difficult to secure small devices. ECC can provide greater security with shorter keys, experts say.

The switch to ECC will be neither quick nor painless. It will require mass replacement of hardware and software to be compatible with ECC and new NSA cybersecurity standards.

In fact, the 2010 goal might not be realistic for NSA, where more than a million different pieces of equipment will need to be moved to ECC, George said. NSA’s move could take as long as 10 years to complete, given the project’s complexity and scope. The agency has not set a specific deadline for completing its Cryptographic Modernization initiative, which started in 2001, and it recognizes that cybersecurity will always be a moving target, he said. The move to ECC is part of that initiative.

ECC, a complex mathematical algorithm used to secure data in transit, will replace RSA and Diffie-Hellman because it can provide much greater security at a smaller key size. ECC takes less computational time and can be used to secure information on smaller machines, including cell phones, smart cards and wireless devices.

The specifics of the changeover were announced in 2005 with NSA’s release of its Suite B Cryptography standards. Suite B falls under NSA’s Cryptographic Modernization initiative and details ECC usage for public keys and digital signatures. The announcement, the first related to cryptographic standards in 30 years, was a watershed event, said Bill Lattin, chief technology officer at Certicom, a pioneer in ECC.

NSA has licensed approximately 25 of Certicom’s ECC patents for use by the government and vendors that develop defense products.

The move to ECC represents a new way of doing business for the NSA. The Cryptographic Modernization initiative “is not just replacing the old with the new. We are upgrading the entire way we do communications,” George said.

Interoperability is the core of the new communications program and the reason for the modernization initiative. NSA plans to work closely with other governments, U.S. departments and agencies, first responders, and the commercial sector, George said. To do so, the agency needs public-key algorithms to securely transmit information among all parties, he said.

“If you go back 30 years, things weren’t nearly as interoperable as they are now. In today’s world, everything is being networked. We have to allow interoperability. And the cryptography has to match [among devices] because if it doesn’t, it is not going to be interoperable,” George said.

These interoperability goals will most likely extend across federal, state and local governments in addition to law enforcement agencies nationwide.

Although RSA and Diffie-Hellman are both public-key algorithms, experts say they don’t scale well for the future. To make RSA and Diffie-Hellman keys, which now can go to 1,024 bits, secure for the next 10 to 20 years, organizations would have to expand to key lengths of at least 2,048 bits, said Stephen Kent, chief scientist at BBN Technologies. Eventually, key sizes would need to expand to 4,096 bits. “That’s enormous keys. To do the math operations underlying the keys takes longer and is more computationally intensive,” Kent said.
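Kent’s point about key growth can be made concrete with the comparable key strengths NIST publishes in SP 800-57: to reach a given security level, RSA and Diffie-Hellman moduli must grow far faster than ECC keys. A short sketch of that table:

```python
# NIST SP 800-57 comparable key strengths: bits of security vs. key sizes.
# Shows why RSA/Diffie-Hellman keys balloon while ECC keys stay small.
EQUIVALENT_STRENGTHS = [
    # (security bits, RSA/DH modulus bits, ECC key bits)
    (80,  1024,  160),
    (112, 2048,  224),
    (128, 3072,  256),
    (192, 7680,  384),
    (256, 15360, 512),
]

for security, rsa_bits, ecc_bits in EQUIVALENT_STRENGTHS:
    ratio = rsa_bits / ecc_bits
    print(f"{security:>3}-bit security: RSA/DH {rsa_bits:>5} bits, "
          f"ECC {ecc_bits:>3} bits ({ratio:.1f}x larger)")
```

At the 256-bit security level, an RSA or Diffie-Hellman key is 30 times larger than its ECC counterpart, and the modular arithmetic on those huge moduli is what makes long-term RSA impractical for small devices.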

Hence NSA’s decision to move to ECC, which appears to be the only option. Experts agree that there is no new technology comparable to ECC. Although there are a number of protocols, there are only two basic technology approaches, George said: integer-based cryptography, used by RSA and Diffie-Hellman, and ECC.

“ECC is the only impressive thing out there,” Kent said. “People don’t get excited every time a new thing comes along. We wait several years and let people try to crack it first. ECC definitely passed the test in this regard.”

NIST, which develops government-wide cybersecurity standards, also sees a need to move to ECC, although its recommendations are less stringent than NSA’s, whose ECC guidelines are a subset of NIST’s.

“I’m pretty sure [RSA and Diffie-Hellman] will be broken within a decade or so,” said Bill Burr, manager of NIST’s security technology group. “We are trying to end the use for most purposes of RSA and Diffie-Hellman with 1,000-bit keys by the end of 2010. And if you are real conservative, we are late.”

“NSA has been fairly aggressive to standardize on ECC,” Burr said. “We are slower, partly because we think it will naturally happen anyhow.”

John Pescatore, vice president and analyst at Gartner, does not see a need for the average user to switch to ECC unless it is to take advantage of its smaller size, such as securing cell phones and smart cards. With NSA, those technologies might include “things that a soldier carries around…and [has] strict limits on power consumption,” Pescatore said.

Burr expects ECC to become a universal standard by 2020, when most ECC patents owned by Certicom expire. “If it’s not a big problem today, it may be hard for the CIO to motivate people to transition to ECC,” said Kent.

DARPA’s Grand Challenge moves downtown, where teams will test their vehicles against city traffic

The Defense Advanced Research Projects Agency’s competition for autonomous vehicles has seen great leaps forward in its first two incarnations. This year, the ride could get rather bumpy, as the Grand Challenge moves from the expanses of the desert to the mean streets of the city.

The competition, called the Urban Challenge for 2007, is no mere sporting event. DARPA’s goal is to use the challenge to help develop technologies for self-guiding military vehicles that could reduce the deadly toll of vehicular-related battlefield casualties among U.S. military personnel.

Approximately half the U.S. soldiers killed to date in Iraq have died in enemy attacks on vehicles, whether by live enemy fire or by improvised explosive devices or, to a lesser extent, in vehicular accidents.

Based on results from the two previous Grand Challenges and a preliminary look at the entrants in DARPA’s Urban Challenge contest now under way, “we think that over time we will be able to build vehicles that will be able to drive as well as humans in certain situations,” said Norman Whitaker, program manager for DARPA’s Urban Challenge.

In May, DARPA trimmed the roster of teams competing in the Urban Challenge from 89 to 53 and will further narrow the field to 30 semifinalists this week based on scores issued during site visits DARPA officials have been conducting since May. The agency also will name this week the location of the competition’s Qualification Event scheduled for Oct. 26 to 31 and the location for the final contest Nov. 3.

To date, DARPA has said only that both events will take place in the western United States. The contest’s placement in a simulated urban combat zone has become the theme of this year’s competition and has considerably upped the ante for the level of vehicle proficiency required to complete the contest’s 60-mile course in six hours.

The complexities of a city environment and the introduction this year of other moving vehicles along the course dramatically increase the sophistication required of the sensing, data processing and guidance technologies, Whitaker said.

DARPA’s goal in its successive challenges is to raise the bar each time, he said, although the addition of moving traffic represents the biggest obstacle ever added to the contest.

The first Grand Challenge in 2004 ran over a 142-mile course in the desert, but the competition looked more like the Keystone Cops than Knight Rider — no vehicle made it past the eight-mile mark. Still, DARPA officials said they saw promise, which came to fruition in 2005, when four vehicles covered a 132-mile desert course. With those results, the decision was made to take the Grand Challenge downtown.

With an urban setting and traffic, vehicles “have to make decisions fast, so we’ve speeded up the timeframe” in which vehicles must receive sensor data, process it and respond, all without human intervention, Whitaker said. “As usual, we’ve taken it to the nth degree and said we want full autonomy. By [asking for an extreme], we get a lot of the middle ground covered.”

The placement of this year’s contest in a dynamic setting creates challenges unheard of in previous challenges and requires technological advancements that will bring self-guided vehicles to a near reality, participants say.

“This year we have moving objectives and that dynamic interaction is new and very difficult,” said Gary Schmiedel, vice president of the advanced product engineering group at Oshkosh Truck, one of the corporate entrants in this year’s Urban Challenge and one of the teams that successfully completed the 132-mile course in 2005. “This brings us much closer to a real-world application of the technology and means that we have to build a truck that’s as versatile as you or I would be.”

At the level of sophistication that will be required in this year’s contest, “this is really a software competition, not a hardware competition,” said David Stavens, a doctoral candidate at Stanford University who’s working on Stanford’s entry in the Urban Challenge and was a co-creator of Stanley, the modified Volkswagen Touareg sport utility vehicle that won DARPA’s 2005 Grand Challenge for Stanford University.

The Stanford team, consequently, is spending much of its time this year working on probabilistic algorithms and machine learning capabilities and is tackling the problem with help from the Stanford Artificial Intelligence Laboratory, Stavens said. Probabilistic algorithms will help this year’s Stanford entry, Junior, a Volkswagen Passat station wagon, deal with uncertainties along the course, while machine learning will enable the team to program the car with human-like driving skills.
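A minimal example of the kind of probabilistic reasoning involved is a Bayesian update of the belief that an obstacle is present, given noisy sensor readings. This sketch is illustrative only; it is not the Stanford team’s software, and the sensor reliability figures are assumed:

```python
# Minimal illustration of probabilistic handling of sensor uncertainty:
# a Bayesian update of P(obstacle) after each noisy detection.
# Illustrative only -- not Junior's actual software; the 0.9/0.2 sensor
# reliability figures are assumptions.

def bayes_update(prior, hit, p_hit_given_obstacle=0.9, p_hit_given_clear=0.2):
    """Update P(obstacle) after one sensor reading (hit=True/False)."""
    if hit:
        like_obstacle, like_clear = p_hit_given_obstacle, p_hit_given_clear
    else:
        like_obstacle = 1 - p_hit_given_obstacle
        like_clear = 1 - p_hit_given_clear
    numerator = like_obstacle * prior
    return numerator / (numerator + like_clear * (1 - prior))

belief = 0.5                               # initially unsure the lane is blocked
for reading in [True, True, False, True]:  # mostly positive, one missed return
    belief = bayes_update(belief, reading)

print(f"P(obstacle) after readings: {belief:.2f}")
```

The single missed return lowers the belief but does not erase it, which is the practical benefit over a brittle yes/no sensor threshold: uncertainty is carried along and resolved by accumulating evidence.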

“By driving other roads, you can gain enough knowledge that the robot will be able to handle the Urban Challenge course just fine,” Stavens said. “This is a very rich subset of the skills that you and I would use when we jump in our own cars and go driving, but this type of technology can save our soldiers’ lives in the battlefield and save lives in the civilian world.”

After this year’s challenge, DARPA will evaluate whether the contests have advanced the technology enough to make commercial production of autonomous vehicles for the military feasible and economically practical, Whitaker said. After an experiment along the lines of the challenges, “there’s an intermediate phase before [the military] goes out and starts buying systems. It could also be that we’ll need to see more work on the commercial side,” he said.

Teams build on technologies from past challenges

As the agency that created the Internet and nurtured it through its early years, the Defense Advanced Research Projects Agency has a long history of transferring its technical innovations from military to civilian use. The Grand Challenge will likely prove to be another example.

Although the challenge’s primary goal is developing driverless military vehicles, DARPA has organized the competitions with the expectation that technologies created for them will be applied in the private sector, too.

Many of the corporate Grand Challenge participants, in fact, look at it as an opportunity to test and perfect — in demanding military conditions — technologies they will later adapt for industrial or civilian use.

Velodyne Acoustics, a maker of high-fidelity stereo and home theater equipment, entered the 2005 Grand Challenge and invented laser-based sensors for its vehicle that it has now sold to participants in the 2007 Urban Challenge.

The company also is marketing its invention to prospects in several industries, said Michael Dunbar, Velodyne’s business development manager.

David Hall, the company’s founder, chief executive officer and chief engineer, along with his brother, Bruce, Velodyne’s president, entered a vehicle in the 2005 Grand Challenge as Team DAD (for Digital Audio Drive). While working on the project, they identified shortcomings with the laser-based light detection and ranging (Lidar) scanners used alone or in combination with cameras as the eyes in the guidance systems of autonomous vehicles, Dunbar said. Lidar systems available on the market at the time scanned for objects only along a single, fixed line of sight.

In response to those limitations, David Hall, an avid inventor, created his own Lidar scanner consisting of an assembly of 64 lasers spinning at 300 to 900 rotations per minute, capable of detecting objects anywhere in a 360-degree horizontal field. The Velodyne Lidar assembly produces 1 million data points per second, compared with the 5,000 data points per second of earlier systems.
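The throughput jump is easy to quantify from the figures above:

```python
# Rough arithmetic on the Lidar throughput jump described above,
# using only the figures given in the article.
old_rate = 5_000       # points/sec, single fixed line-of-sight scanners
new_rate = 1_000_000   # points/sec, Velodyne's 64-laser spinning assembly

improvement = new_rate / old_rate
print(f"{improvement:.0f}x more range measurements per second")

# Spread across the 64 lasers, each one accounts for thousands of
# returns per second while the whole assembly sweeps 360 degrees:
per_laser = new_rate / 64
print(f"~{per_laser:,.0f} points/sec per laser")
```

That 200-fold increase in raw measurements is what lets some teams drop other sensor types entirely, as Dunbar notes below.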

Velodyne doesn’t have a vehicle in this year’s Urban Challenge but has sold its HDL-64 Lidar scanner to 10 Challenge participants that have included it on their vehicles, either alone or in conjunction with optical sensors, Dunbar said. “Some of the teams can use our sensor and eliminate other types of sensors so [the sensor data] is much easier for them to manipulate,” he said.

By setting its own benchmarks for supercomputing systems, DOD gets better performance — and might change how HPC systems are procured

Twice a year, work being done by the world’s fastest supercomputers comes to a screeching halt so the systems can run a benchmark called Linpack to determine how fast they are, at least in relation to one another. Linpack — which measures how many trillions of floating-point operations per second the machine is capable of executing — is the benchmark used to rank the fastest supercomputers in the world, in the twice-annual Top 500 List.
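For readers curious what Linpack actually measures: it times the solution of a dense system of linear equations and divides the standard operation count by the elapsed time. The real HPL benchmark is distributed across thousands of nodes and far more elaborate; the following is only a single-node toy illustration of the metric itself, using NumPy’s dense solver.

```python
# Toy, single-node illustration of the Linpack metric: time a dense
# solve of Ax = b and convert the standard operation count to GFLOPS.
import time
import numpy as np

def linpack_like_gflops(n=2000, seed=0):
    rng = np.random.default_rng(seed)
    a = rng.standard_normal((n, n))
    b = rng.standard_normal(n)
    start = time.perf_counter()
    np.linalg.solve(a, b)                     # LU factorization + solves
    elapsed = time.perf_counter() - start
    flops = (2.0 / 3.0) * n**3 + 2.0 * n**2   # standard HPL operation count
    return flops / elapsed / 1e9

print(f"{linpack_like_gflops():.1f} GFLOPS (toy problem)")
```

A Top 500 machine runs the same idea at a problem size tuned to fill its entire memory, which is exactly why the number says little about any particular real workload.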

As an exercise in flexing muscle, Linpack is about as useful as any other benchmark. But as a tool for judging supercomputing systems in a procurement process, it is limited at best. The Defense Department, through its High Performance Computing Modernization Program, is shaking up the supercomputing world by applying a more disciplined approach to purchasing big iron.

Instead of using a generic benchmark to compare models, the program issues a set of metrics that carefully codifies its own workload. Program leaders then ask vendors to respond with the best — yet most cost-effective — systems they can provide to execute such a workload.

“We don’t specify how big the machine is,” said Cray Henry, head of the program. “We will run a sample problem of a fixed size, and call the result our target time. We then put a bid on the street and say we want you to build a machine that will run this twice as fast.” It is up to the vendor to figure out how that machine should achieve those results.

Sounds simple, but in the field of supercomputers, this common-sense approach is rather radical.

“It’s a well-oiled process,” agreed Alison Ryan, vice president of business development at SGI. She said that for vendors, “this kind of procurement is actually difficult. It takes a lot of nontrivial work. It’s easier to do a procurement based on Linpack.” But in the end, the work is worthwhile for both DOD and the vendor, because “it’s actually getting the right equipment for your users.”

“They’ve done a great job on the program in institutionalizing the [request for proposal] process,” said Peter Ungaro, chief executive officer at supercomputer company Cray.

DOD created HPCMP in 1994 as a way to pool resources for supercomputing power. Instead of having each of the services buy supercomputers for its own big jobs, the services could collectively buy an array of machines that could handle a wider variety of tasks, including large tasks.

On the rise
Today, the program has an annual budget of about $250 million, including $50 million for procuring two new supercomputers. Eight HPCMP shared-resource centers, which house the systems, tackle about 600 projects submitted by 4,600 users from the military services, academia and industry.

As of December 2006, the program had control of machines that could do a total of 315.5 teraflops, and that number grows by a quarter each year, as the oldest machines are replaced or augmented by newer technologies.

And over the years, the program has developed a painstakingly thorough process of specifying what kind of systems it needs.

What makes HPCMP so different? It defines its users’ actual workload rather than using a set of generic performance goals.

Henry said that most of the workloads on the program’s systems can fall into one of about 10 categories, such as computational fluid dynamics, structural mechanics, chemistry and materials science, climate modeling and simulation, and electromagnetics. Each job has a unique performance characteristic and can be best run on a unique combination of processors, memory, interconnects and software. “This is better because it gauges true workload,” Ryan said.

To quantify these types of jobs, HPCMP came up with a computer program called the linear optimizer, which calculates the overall system performance for handling each of these jobs. It weights each job by how often it is executed. It also factors in the price of each system and existing systems that can already execute those tasks.
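The article does not publish the optimizer itself, so the following is only a hypothetical sketch of the idea it describes: score each candidate system by its throughput across the job categories, weighted by how often each category runs, then normalize by price. All category names, throughput numbers and prices below are invented for illustration.

```python
# Hypothetical sketch of the "linear optimizer" idea: weighted
# throughput per dollar. All names and numbers are illustrative.

def weighted_price_performance(job_weights, throughput, price):
    """job_weights: fraction of the workload per category (sums to 1).
    throughput: jobs-per-day the proposed system achieves per category.
    price: system cost in dollars.
    Returns weighted throughput per dollar (higher is better)."""
    score = sum(job_weights[job] * throughput[job] for job in job_weights)
    return score / price

weights  = {"cfd": 0.4, "structural": 0.3, "climate": 0.2, "em": 0.1}
system_a = {"cfd": 120, "structural": 90, "climate": 60, "em": 200}
system_b = {"cfd": 150, "structural": 70, "climate": 80, "em": 110}

print(weighted_price_performance(weights, system_a, price=40e6))
print(weighted_price_performance(weights, system_b, price=45e6))
```

The real program also folds in existing capacity and usability factors, which a one-line score like this cannot capture.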

Once numbers have been generated for each proposed system, the program takes usability into consideration. Henry admitted that usability is hard to quantify, but it includes factors such as what third-party software is available for the platform and what compilers, debuggers and other development tools are available.

Once these performance and usability numbers are calculated, they are weighed against the past performance of the vendors. From there, the right system may be obvious, or the decision may come down to a narrow choice among a handful of systems.

“It’s not often they need the same type of system year after year,” Ungaro said.

Bottom line
Although DOD generally is well-represented on the twice-annual list of the world’s fastest computers — it had 11 in the June 2007 Top 100 ranking, for instance — the true beneficiaries are the researchers who can use the machines. The biggest benefit? “Time to solution,” Henry said.

DOD might need to know the performance characteristics of an airplane fuselage, for example. A sufficiently accurate simulation saves the money and time of testing physical fuselages.

“Typically, the kind of equations we’re trying to solve require from dozens to thousands of differential calculations,” Henry said. And each equation “can require a tremendous number of iterations.”

Imagine executing a single problem a million or even tens of millions of times at once, with each execution involving thousands of calculations. That’s the size of the job these systems usually handle.

DOD has many problems to test against. Programs track how toxic gas releases spread across an environment. They help develop better algorithms for tracking ground targets from moving radars. They speed development of missiles. In one example, supercomputing shortened the development time of the Hellfire missile to just 13 months, allowing it to be deployed in Iraq two years earlier than otherwise would have been possible.

By providing the fastest computing power available, the program in its modest way helps ensure the Defense Department stays ahead of the enemy.

By IgnoranceIsntBliss

A mere two-letter acronym (DU) is all it takes to completely destroy Al Gore and his ‘projected’ divine destiny to save the earth from man-made environmental doomsday. But this lesson in hypocrisy doesn’t end there.

I hadn’t actually thought of the linchpin point until I recently finished Al Gore’s book The Assault on Reason. It should have been titled The Assault on Bush, as he joked about it not being in a video interview I saw somewhere online, because there’s hardly a page in it that doesn’t mention GWB and his minions. I counted 105 of 273 pages that included attacks on just Bush’s Iraq / foreign-policy record, for example.

One could hardly articulate a more sophisticated blackballing, yet Al Gore did, while bending over backwards not to step into the “conspiracy theorist” world. His trouncing of the Bush Junta is well deserved, and I do recommend all people read this book for that reason, as well as for his coverage of democracy, American history and the media, but perhaps most importantly for a lesson in observing left/right political bias.

It’s important that Gore didn’t go full blown conspiracy theorist as that would obviously implicate him with things like 9/11 and Depleted Uranium.

Dennis Kucinich is one of the few with the cojones to address the DU issue.

In page after page Al attacks Bush’s very essence, in particular Bush’s environmental policies and the Iraqi invasion / occupation. Al slams Dubya on Iraq while patting himself on the back for the Balkans sectarian conflict that his own administration engineered. The most notable common thread between these two conflicts and these two men is the Depleted Uranium issue. Others include the engineering of conflicts, sectarian-conflict scenarios, the use of Al Qaeda as an instrument of proxy wars and related tactical subversiveness, and even oil (pipelines).

It turns out that DU has a 4.5 BILLION year half-life. The “Clinton-Gore Administration”, as Al calls it at every chance in his book, used DU munitions in a conflict that spanned virtually the entire 8-year reign of that administration. To add insult to injury, Gore campaigned in 2000 on (imperialist) nation building, but then blasts Bush for his hegemonic Iraqi power grab, when the only difference is that Americans also die this time around in an engineered sectarian conflict.
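For the math-inclined, the half-life arithmetic is a one-liner. U-238, the main isotope in depleted uranium, is where the 4.5-billion-year figure comes from; the sketch below shows how little of the material decays over any human-scale timespan.

```python
# Half-life decay arithmetic for U-238, the main isotope in DU.
HALF_LIFE_YEARS = 4.5e9

def fraction_remaining(years):
    """Fraction of the original material left after `years`."""
    return 0.5 ** (years / HALF_LIFE_YEARS)

for t in (100, 10_000, 1_000_000):
    print(f"after {t:>9,} years: {fraction_remaining(t):.8f} of the DU remains")
```

Even after a million years, well under a thousandth of the material has decayed.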

As a side-note, perhaps Al Gore inherited his stance on the use of uranium as a weapon from his father.

In the late 1950s, Al Gore’s father, the senator from Tennessee, proposed dousing the demilitarized zone in Korea with uranium as a cheap failsafe against an attack from the North Koreans.

After the Gulf War, Pentagon war planners were so delighted with the performance of their radioactive weapons that they ordered a new arsenal and, under Bill Clinton’s orders, fired them at Serb positions in Bosnia, Kosovo and Serbia. More than 100 DU bombs have been used in the Balkans over the last six years.

This is what Depleted Uranium does to people:

In all fairness, GWB has become history’s master of the use of these weapons.

Now we have Shrub using the same stuff, which is absolutely sure to contaminate the environment until what could be considered the end of time (astrophysical models predict that the Sun and life on Earth will expire well before the DU munitions reach even their first half-life), but oddly enough Al Gore somehow managed to forget to mention this little tidbit in his all-out partisan Bush-crucifixion hit-piece book, which focuses on both the Iraq imperial power move and the environment.

But the hypocrisy doesn’t end there. “The Assault on Reason” (emphasis his) includes an entire chapter on “The Politics of Fear”, which impressively includes the sort of neuro-psychological descriptions of mind matters many would expect from my own writings here at this blog. Now I present to you clear and obvious terrormongering pieces of video propaganda:

Example 1: Look at the language: “IT WILL SHAKE YOU TO YOUR CORE”, “BY FAR, THE MOST TERRIFYING FILM YOU WILL EVER SEE”, “Think of the impact of a couple hundred thousand refugees, and then imagine a hundred million”, “NOTHING IS SCARIER”.

Note the use of New Orleans Katrina footage, and the fact that the vast majority of damage was due to flooding that was entirely mankind’s fault.

And then note how at 2:17 of the trailer there is a 3-frame burst of a nuclear bomb explosion, which is entirely out of context with the presentation. In fairness, he does give the same clip (arguably still scaremongering even there) a worthy context in the actual film, but in the context of the preview, which potentially millions watched, it doesn’t fit.

Anyone with even a self-prescribed education in socio-psychological propaganda can quickly tell, from Al Gore’s book and the online video interviews addressing it, that Gore is clearly trained in socio-psychological propaganda techniques. Because of this, there is no excuse for the out-of-context terrormongering highlighted in that single clip, which continues into the follow-up TV commercial:

Example 2: Added terrormonger language: “Grabs you like a thriller with an ending that will haunt your dreams”, “You will be captivated… then riveted… then scared out of your wits”. That commercial also includes about 1-2 frames of the same nuclear bomb mushroom cloud used in the An Inconvenient Truth trailer-terror piece.

To emphasize this point about Gore’s hypocrisy, it’s best I point out that there are many pages in his book with entire paragraphs about the Bush Junta‘s selective cherry-picking use of facts and scientific information, most particularly in regard to the Iraq War effort and environmental issues.

Gore often claims that “the debate is over” about “Global Warming”, but it turns out that the debate is in fact not over (see here for some debate). But to directly address his self-described (on the back of the DVD package) “persuasive” propaganda piece, his central argument is the 650,000-year ice-core record.

It’s just too bad that it’s well established that temperature always leads in the “complex relationships” that Gore presented as if CO2 drove the temperature:

Ice cores show CO2 increases lag behind temperature

This isn’t even merely a matter of him misreading some graphs. He went to the extent of calling the ice-core specialist his special friend, and made it look cozy. How could Gore have not understood which came first? How would you not look to see which happens first before making such strong statements? To wave his 650,000-year-resolution graph around (off of a scissor lift) like some sort of gospel is Hollywood trickery at ‘best’.

This scenario presents us with 2 options, that I can think of:
1) He deliberately engaged in disinfo.
2) He’s too incompetent to hold, and so effectively propagate, such a staunch belief.

In either case, this is but one of the many examples of Gore using sketchy science during his terrormonger campaign. For a complete lesson in Gore trickery, have a look at A Skeptic’s Guide to An Inconvenient Truth.

Moving on: even though the DU weapons record by itself destroys Gore’s claim to be trying to save the environment, this lesson in environmental hypocrisy doesn’t end there. For starters, in the AIT film and book Al asks, “Are you ready to change your life?” It’s bad enough that YOU changing your life is the primary solution Al gives us in his film presentation, but then there’s the fact that he doesn’t practice what he preaches.

While he may buy “carbon offsets” each time he travels, that doesn’t change the fact that he flies more in one year than most humans will in their entire lives. He doesn’t fly only in relation to his global-warming activism either, but in that regard he claims in his film that he’s done over 1,000 talks since 2001.

Sean Hannity’s excellent piece on Al Gore and jet travel.
(Although I loathe that man too.)

But perhaps his travels can somehow be ignored in light of what he tells us we should do. Unfortunately, we can’t rationalize his home habits as being part of the effort to save the earth.

According to Schweizer, the Gores own three homes: a 10,000-square-foot home in Nashville, Tennessee; a 4,000-square-foot home in Arlington, Virginia (across the Potomac River from Washington, DC); and a third home of undisclosed size in Carthage, Tennessee. Neutral Source has verified the Gores own a 2.1 acre property at 312 Lynnwood Blvd. in the Belle Meade section of Nashville, Tennessee (Parcel ID 11611005600). Its assessed value in January 2005 was $3 million, but we have not been able to validate Schweizer’s claims about its size. So we performed a search and found 15 single-family homes on the market in the Gores’ Nashville neighborhood with asking prices of $3 million or more. Of the 298 neighborhood properties on the market, only three are listed at $3 million or more. According to MLS data, these houses are 9,727, 7,340, and 9,878 square feet respectively. So we can confirm that it’s quite plausible that the Gores’ Nashville home is, as Schweizer claims, 10,000 square feet.

So what does that mean?

Still, a rough approximation of the Gores’ residential CO2 emissions can be obtained by assuming that CarbonCounter’s “average” really means “median.” This yields 26 tons x 9.2 = 239 tons CO2 per year. CarbonCounter will “offset” the Gores’ CO2 emissions for a contribution of $10 per ton, and certainly they can afford the $2,390 contribution that CarbonCounter says will buy residential “carbon neutrality.” But actually making a large reduction in CO2 emissions from 17,000 square feet of residential living space would be both very challenging technically, and much more expensive. The Gores are easily capable of being CO2 Pragmatists with respect to residential carbon neutrality, but they cannot be CO2 Puritans without completely abandoning their lifestyle.

In plain English:

Gore’s home uses more than 20 times the national average

So here we have Al Gore telling us that we need to change our lives, while he’s using 20+ times the energy of something like 90+% of the rest of the world’s population.
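The arithmetic behind those numbers is easy to reproduce. The sketch below simply restates Neutral Source’s own figures (26 tons of CO2 per year as the household baseline, their 9.2× scaling for the Gores’ square footage, and $10-per-ton offsets):

```python
# Reproducing the residential-emissions arithmetic quoted above.
# All three constants come from the Neutral Source passage.
AVERAGE_HOUSEHOLD_TONS = 26     # tons CO2/year, their "average" household
SCALE_FACTOR = 9.2              # their scaling for the Gores' square footage
OFFSET_PRICE_PER_TON = 10       # dollars per ton, CarbonCounter's price

estimated_tons = AVERAGE_HOUSEHOLD_TONS * SCALE_FACTOR
offset_cost = round(estimated_tons) * OFFSET_PRICE_PER_TON

print(f"{estimated_tons:.0f} tons CO2/year, ${offset_cost}/year to offset")
```

That reproduces the 239 tons and $2,390 figures in the quote, which is exactly the point: the offset is pocket change, while actually cutting the emissions would mean changing the lifestyle.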

While we’re on the energy topic perhaps it’s best to point out that Gore has a record of relations with oil ‘interests’.

Al Gore: The Other Oil Candidate
Corpwatch, August 29th, 2000
For thousands of years, the Kitanemuk Indians made their home in the Elk Hills of central California. Come February 2001, the last of the 100 burial grounds, holy places and other archaeological sites of the Kitanemuks will be obliterated by the oil drilling of Occidental Petroleum Company. Oxy’s plans will “destroy forever the evidence that we once existed on this land,” according to Dee Dominguez, a Kitanemuk whose great grandfather was a signatory to the 1851 treaty that surrendered the Elk Hills.

Occidental’s planned drilling of the Elk Hills doesn’t only threaten the memory of the Kitanemuk. Environmentalists say a rare species of fox, lizard and the kangaroo rat would also be threatened by Oxy’s plans. A lawsuit has been filed under the Endangered Species Act. But none of that has given pause to Occidental or the politician who helped engineer the sale of the drilling rights to the federally-owned Elk Hills. That politician is Al Gore.

So in light of that, along with the Clinton-Gore-engineered Bosnia-Kosovo proxy war for oil pipelines, it seems obvious that Gore is a tad more oil-friendly than one would expect. It should also be pointed out that Gore was one of only 10 Democrats to break ranks and support the Persian Gulf War, which is interesting considering the history of that conflict. That’s all too bad, because US foreign policy and wars in general are probably the biggest contributors to the environmental footprint of us human beings.

Ron Paul: the CIA / Foreign Policy contributing to Global Warming; at time 2:25.
Al Gore supports nation building:

Q. Bush made nation-building a point of difference with you [in the Oct. 3 debate].
A. I think that phrase taps into a legitimate concern about how far we should go and how long we should be involved. But it’s not a new mission. The Marshall Plan was about nation-building. And the generation that won World War II, having seen the catastrophe of the interwar period in the 20’s and 30’s, wisely decided that nation-building was a preferable alternative to World War III. And it was a stunning success.

As proven under Clinton-Gore and Bush-Cheney, “nation building” also means war, and that brings us to the next big Bush-Gore parallel of hypocrisy. It turns out that both Bush and Gore brag about their Christian faith, yet each is somehow engaged in the same goal: a de facto ‘god on earth’ artificial intelligence system.

While Bush is driving this initiative through excessive militarism, Al Gore is driving the same goal with his friends at NASA and Google. Gore is arguably the main man behind this effort, sitting at the center of the whirlwind at the main Googleplex HQ, which is situated right next door to the NASA Ames Research Center in Silicon Valley, where NASA and Google in 2005 entered a partnership to develop cognitive artificial intelligence.

Click for the full story.

It’s hard to imagine why a professed ‘man of the faith’ would take part in such an operation, but then again maybe his words on the Larry King Show -in reference to “global warming”- may shed some light on this contradiction: “It’s really a spiritual issue.”

In closing, RAGE said it best:

The UK Ministry of Defence recently released its future global / military forecast document. It highlights key areas such as Artificial Intelligence, Transhumanism, climate change, globalization, and so on. It’s more or less a doomsday-scenario narrative, much of it marked as “probable”. This is of little surprise considering my usual reporting, combined with forecasts such as the estimate that humans have a 50/50 chance of surviving the 21st Century.

GuardianUK covered it and mentioned many of the key topics; another outlet then covered it with the expected “new world order” spin. Not surprisingly, neither source even touched the A.I. subject, nor did they provide the link to the official document, which would have allowed ‘casual’ readers to easily click through and potentially find those sections themselves.

“All findings within Strategic Trends are presented with an indication of confidence.”

“Having established trend-based outcomes of varying probability, Strategic Trends articulates a number of specific Risks associated with each dimension to highlight the way some of the more adverse consequences could manifest themselves and affect Defence business.”

“The Strategic Trends approach starts by identifying the major trends in each of these dimensions and analyses ways in which these trends are likely to develop and interact during the next 30 years, in order to establish a range of Probable Outcomes. Nothing in the future is guaranteed, of course, and Strategic Trends varies the strength of its assessments to highlight sets of Alternative Outcomes that, while less probable, are nonetheless highly plausible.”

More Excerpts:

“Erosion of Civil Liberties. Technology will enable pervasive surveillance in response to terrorism, rising transnational crime and the growing capability of disparate groups or individuals to inflict catastrophic damage or disruption. Coupled with intrusive, highly responsive and accessible data-bases, the emergence of a so-called ‘surveillance society’ will increasingly challenge assumptions about privacy, with corresponding impacts on civil liberties and human rights. These capabilities will be deployed by the private as well as the public sector.”

“Confronted with few direct threats and declining populations, most affluent societies will attempt to minimize their Defence burden by investing in conflict prevention and, for as long as it is in their interest to do so, participating in alliances, forming communities of interest and contracting out security. The US will be the exception, making by far the greatest commitment to Defence throughout the period, consistent with its economic power and technological advantage.”

“Deliberate Collateral Casualties. Both state and non-state actors may target commercial and industrial installations to inflict mass casualties, in breach of international law, as an intended primary or secondary effect. The potential impact may be reinforced by increasing industrialization in developing countries, a possible resurgence in nuclear power plant construction, and the progressive concentration of populations in urban areas.”

Globalization / Revolution:

“By 2010, most people (above 50%) will be living in urban rather than rural environments. Poor housing, weak infrastructure and social deprivation will combine with low municipal capacity to create a range of new instability risks in areas of rapid urbanization, especially in those urban settlements that contain a high proportion of unplanned and shanty development.”

“During the next 30 years, every aspect of human life will change at an unprecedented rate, throwing up new features, challenges and opportunities. Three areas of change, or Ring Road issues, will touch the lives of everyone on the planet and will underpin these processes: climate change, globalization and global inequality (see panels below).”

“While material conditions for most people are likely to improve over the next 30 years, the gap between rich and poor will probably increase and absolute poverty will remain a global challenge. Despite their rapid growth, significant per capita disparities will exist in countries such as China and India and smaller, but traditionally more affluent Western economies. In some regions – notably areas of Sub-Saharan Africa – a fall in poverty may be reversed. Differentials in material well-being will be more explicit through globalization and increased access to more readily and cheaply available telecommunications. Disparities in wealth and advantage will therefore become more obvious, with their associated grievances and resentments, even among the growing numbers of people who are likely to be materially more prosperous than their parents and grandparents. Absolute poverty and comparative disadvantage will fuel perceptions of injustice among those whose expectations are not met, increasing tension and instability, both within and between societies and resulting in expressions of violence such as disorder, criminality, terrorism and insurgency. They may also lead to the resurgence of not only anti-capitalist ideologies, possibly linked to religious, anarchist or nihilist movements, but also to populism and the revival of Marxism.”

“Alternatively, a less even process of globalization may lead to lower-density settlement patterns, with people straddling rural and urban-based livelihoods, resulting in extensive browning of the countryside.”

“Competition for resources of all kinds will intensify.”

“Economic growth and increased consumption will result in greater demand and competition for essential resources. Demand for energy is likely to grow by more than half again by 2035 and fossil fuels will have to meet more than 80% of this increase. Major reserves are in politically unstable regions and primary consumer nations are likely to be increasingly reluctant to trust security of supply to market forces and the integrity of the international system.”

“Globalization will result in critical interdependencies that will link members of a globalized society that includes a small super-rich elite and a substantial underclass of slum and subsistence dwellers, who will make up 20% of the world population in 2020.”

“Declining youth populations in Western societies could become increasingly dissatisfied with their economically burdensome ‘baby-boomer’ elders, among whom much of societies’ wealth would be concentrated. Resentful at a generation whose values appear to be out of step with tightening resource constraints, the young might seek a return to an order provided by more conservative values and structures. This could lead to a civic renaissance, with strict penalties for those failing to fulfil their social obligations. It might also open the way to policies which permit euthanasia as a means to reduce the burden of care for the elderly.”

“The middle classes could become a revolutionary class, taking the role envisaged for the proletariat by Marx. The globalization of labour markets and reducing levels of national welfare provision and employment could reduce peoples’ attachment to particular states. The growing gap between themselves and a small number of highly visible super-rich individuals might fuel disillusion with meritocracy, while the growing urban under-classes are likely to pose an increasing threat to social order and stability, as the burden of acquired debt and the failure of pension provision begins to bite. Faced by these twin challenges, the world’s middle-classes might unite, using access to knowledge, resources and skills to shape transnational processes in their own class interest.”

“A growing Hispanic population in the US might lead to increasing social tensions, possibly resulting in an aggressive separatist movement. Unlike the Black Power militants of the 1960s, this movement might focus on geographically-based self-determination as its aim, threatening secession by Hispanic-majority states. Confronted by this threat, the US might become increasingly introspective, withdrawing from all non-essential overseas commitments. In the wider world, other states and non-state actors could take advantage of the US withdrawal or break-up, using violence to pursue objectives that, otherwise, might have provoked a US military response.”

“Economic globalization and indiscriminate migration may lead to levels of international integration that effectively bring interstate warfare to an end; however, it will also result in communities of interest at every level of society that transcend national boundaries and could resort to the use of violence. Operating within a globalized system, states might not be willing or able to regulate these groups’ activities, concentrating on containing the risk and diverting their activities elsewhere according to their interests. In addition, rivalries between interest groups that cannot gain economic and information leverage might increasingly resort to violence and coercion, evolving loose arrangements and networks similar to those currently used by criminal organizations.”

“In a globalized environment, military technologies will be developed at an accelerating pace, some of which might have the potential to render established capabilities obsolete. For example, a cheap, simple-to-make and easy-to-use weapon might be invented that is effective against a wide range of targets and against which established countermeasures are ineffective.”

“The US position as the world’s most indebted nation makes it vulnerable to stock market collapse, currency runs and economic crisis, as well as global currency manipulation. The most likely cause of crisis would be energy market instability or volatility leading to a loss of market confidence. Also, failure to continue to support or service its debt in these circumstances would put US creditors and commodity suppliers at risk, possibly causing a global economic downturn.”

“Key natural resources, especially oil, gas and minerals of strategic value, will continue to be sourced from unstable areas and unreliable regions. Maintaining access and containing instability risks in these areas is therefore likely to increase in importance, alongside wider developmental and stabilization roles. Where oil and gas sources are located in areas of doubtful security, military intervention may be used to protect the integrity of sites and to secure investments.”

“The middle class will be more vulnerable to economic and social volatility. This may trigger a rise in political engagement and may encourage a resort to either communitarian solutions or extremist politics. While the immediate risk may exist at the national level, exposure to globalized economic forces may cause a reaction to globalization and ultimately fuel tension and difficulties at international levels.”

“Social transformation arising from globalization, demographic imbalances and economic shifts will result in wide-ranging, often intense, instability risks, whose impacts will be transmitted beyond their immediate point of origin. These features will demand sensitive warning, strong governance and responsive containment arrangements. In an unstable economic environment or in the event of social crisis, an increase in militancy and activism, possibly based on a declining middle-class, is likely to fuel extremist politics in some societies, possibly characterized by resurgent nationalism and authoritarianism.”

“Going Underground. All likely future opponents will have recognized the advantages of going underground if they wish to avoid the surveillance, targeting and penetrative capabilities of sophisticated military forces, particularly those deploying air platforms and systems. In future, states will seek to site most of their major nodes and the majority of their decisive fighting power underground or among civilian infrastructure that it is illegal or unethical to target. Similarly, irregular opponents will base themselves in underground networks, both for offence and defence, especially in complex urban spaces.”

“In a fast-changing area, it is difficult and foolish, outside the realms of science fiction, to forecast in any depth technological breakthroughs or their likely applications. Many of the interrelated effects of globalization, including market-manipulation by existing stakeholders, the unpredictability of consumer demand and complex routes to market, will make predictions for the future even less certain. Many issues, including control regimes, will have to be addressed as they arise, although it might be anticipated that some issues will become highly charged.”

Artificial Intelligence / Transhumanism:

“Human Nature of War Challenged by Technology. Increasing pervasiveness and exploitation of technology at all levels of warfare will increase the distance between ‘the point of the spear’ and the point of interaction for most personnel. Such reliance on technology and unmanned, remote options is likely to lead to increasing vulnerability to a resurgence in traditional, mass warfighting and irregular activity. Ethical questions regarding the accountability for automated actions are also likely to increase.”

“Cognitive Science – Routes to the direct application of advances in cognitive science are less clear than nanotechnology or biotechnology; however, indications are that interdisciplinary advances involving cognitive science are likely to enable us more effectively to map cognitive processes. Soft Artificial Intelligence is already well established, with self-diagnosing and self-reconfiguring networks in use and self-repairing networks likely in the next 10 years. Mapping of human brain functions and the replication of genuine intelligence is possible before 2035.”

“Advances in social science, behavioural science and mathematical modelling will combine, leading to more informed decision making. Advanced processing and computational power will permit a new level of pattern recognition (Combinatronics), enabling the decoding of previously unrecognised or undecipherable systems and allowing the modelling of a range of processes, from biological to social, political and economic. As a result, simulation and representation will have a significant and widespread impact on the future and will become an increasingly powerful tool to aid policy and decision makers. It will also blur the line between illusion and reality.”

“AI. The progressive introduction of ‘soft’ AI and further simplification of the Human Computer Interface (HCI) is likely to change the emphasis in training from technical aspects of system operation to the application of judgement in the employment of systems and the conduct of operations. This will stimulate a cultural change with significant effects on the requirements for manpower, command structures and training.”

“The application of advanced genetics could challenge current assumptions about human nature and existence. Initially employed for medical purposes, breakthroughs in these areas could be put to ethically questionable uses, such as the super-enhancement of human attributes, including physical strength and sensory perception. Extreme variation in attributes could arise between individuals, or where enhancement becomes a matter of fashion, between societies, creating additional reasons for conflict.”

“Developments in genetics might allow treatment of the symptoms of ageing, and this would result in greatly increased life expectancy for those who could afford it. The divide between those who could afford to ‘buy longevity’ and those who could not, could aggravate perceived global inequality. Dictatorial or despotic rulers could potentially also ‘buy longevity’, prolonging their regimes and the associated international security risks.”

“A more permissive R&D environment could accelerate the decline of ethical constraints and restraints. The speed of technological and cultural change could overwhelm society’s ability to absorb the ethical implications and to develop and apply national and international regulatory and legal controls. Such a regulatory vacuum would be self-reinforcing as states and commercial organizations race to develop and exploit economic, political and military advantage. The nearest approximation to an ethical framework could become a form of secular utilitarianism, in an otherwise amoral scientific culture.”

“The Role of Artificial Intelligence. The simulation of cognitive processes using Artificial Intelligence (AI) is likely to be employed to manage knowledge and support decision-making, with applications across government and commercial sectors. Reliance on AI will create new vulnerabilities that are likely to be exploited by criminals, terrorists or other opponents.”

“Unmanned Technologies. Advances in autonomous systems, which promise to reduce substantially the physical risks to humans and mitigate some of their weaknesses, will allow the wider exploration and exploitation of extreme or hazardous environments such as deep sea, underground, contaminated areas and outer space. Furthermore, these technologies will allow increased Defence exploitation in all environments with a correspondingly reduced risk to military personnel and an expanded range of capabilities. AI and the effective replication of human judgement processes, when combined with autonomous systems, particularly robotics, are likely to enable the application of lethal force without human intervention, raising consequential legal and ethical issues.”

“By 2035, an implantable information chip could be developed and wired directly to the user’s brain. Information and entertainment choices would be accessible through cognition and might include synthetic sensory perception beamed direct to the user’s senses. Wider related ICT developments might include the invention of synthetic telepathy, including mind-to-mind or telepathic dialogue. This type of development would have obvious military and security, as well as control, legal and ethical, implications.”

“While it will be difficult to predict particular breakthroughs, trend analysis indicates that the most substantial technological developments will be in: ICT, biotechnology, energy, cognitive science, smart materials and sensor/network technology. Advanced nanotechnology will underpin many breakthroughs (see text box). Developments in these areas are likely to be evolutionary, but where disciplines interact, such as in the combination of Cognitive Science and ICT to produce advanced decision-support tools, developments are likely to be revolutionary, resulting in the greatest opportunities for novel or decisive application. Most technological breakthroughs will be positive; however, many advances will also present potential threats, either through perverse applications, such as the use of genetic engineering to produce designer bio-weapons or unstable substances, or through the unanticipated consequences of experimental technological innovation.”

“Greater connectivity and accessibility to information through the proliferation of ICT will stimulate intensifying international debate on ethics, regulation and law, and will cause religious, ethical and moral concerns and disputes. The pace and diffusion of R&D and the operation of commercial imperatives will make global regulation difficult and will increase the opportunities for unethical or irresponsible actors to evade control. In addition, the effectiveness of regulation is likely to vary by culture, region or country, with an uneven application of, and access to, innovation. These issues are likely to be highly politicized and, on past evidence, to cause localized disorder and possibly organized violence.”

“Scientific breakthroughs are likely to have the potential to improve the quality of life for many, for example in the safe genetic modification of crops or through stem cell research. However, market pricing or ethically based regulation may obstruct access by those who might wish or need to benefit most, thereby reinforcing inequality and a sense of grievance.”

“Conversely, it is possible that innovation will take place even more rapidly than is anticipated. Breakthroughs such as the early development of quantum computing will add significant impetus to the pace of technological change and information processing. Specific advances may also have significant geopolitical impacts. For example, a breakthrough in energy technology will alter the global significance of the Middle East, reducing Western strategic dependence on an unstable and volatile area.”

“By the end of the period it is likely that the majority of the global population will find it difficult to ‘turn the outside world off’. ICT is likely to be so pervasive that people are permanently connected to a network or two-way data stream, with inherent challenges to civil liberties; being disconnected could be considered suspicious.”

Technology / Weapons:

“Innovation is likely to continue at an unprecedented rate and there is likely to be a multiplicity of sources of innovation and production. Making predictions about how novel and emerging technologies will be exploited and applied will be difficult and imprecise. The rate of change, tempo and unpredictability of innovation and development will challenge decision-makers, who will have to anticipate and respond to direct and indirect outcomes. Notwithstanding this, trends indicate that the most rapid technological advances are likely in: ICT, energy, biotechnology, cognitive science, sensors and networks, and smart materials. Nanotechnology is likely to be an important enabler for other developments, for example in electronics, sensors and commodity manufacture. Whilst technology will benefit many people, its application and integration will continue to be unequal, reinforcing differences in understanding, advantage and opportunity between the haves and have-nots.”

“Technology and Fighting Power. Successful exploitation of new technology, such as Directed Energy Weapons, will depend on the users’ understanding of both the advantages and the limitations of its application across the physical, conceptual and moral components of fighting power. Those who fail to do so are likely to risk defeat by those who achieve a better component mix, by those who target components to which technological advantage does not apply, or by those who employ technologies such as Electromagnetic Pulse (EMP) to neutralize a more sophisticated adversary’s capability. Small incremental changes in technology are also likely to lead to disproportionately large increases in warfighting capability and effectiveness. This is likely to lead to the reduction of transitional concept-to-capability timescales and increase the scope for technology leakage and more discriminating use of Off-The-Shelf (OTS) applications, especially in the areas of nano- and bio-technology.”

“Perverse Application of Technology. The development of technologies that have hitherto been considered benign may be subverted for hostile use. For example, biotechnology and genetic engineering may be combined to create ‘designer’ bio-weapons to target crops, livestock, or particular ethnic groups.”

“Given current multi-lateral agreements and technical factors, the effective weaponization of space is unlikely before 2020. However, nations will seek to inhibit the use of space by opponents through a combination of electromagnetic manipulation, hard-kill from ground-based sensor and weapon systems, the targeting of supporting ground-based infrastructure and a range of improvised measures. At its most extreme, the weaponization of space may eventually include the development of space-based strike weapons capable of attacking ground-based and other space targets; for example, solid metal projectiles travelling at orbital velocities, so-called ‘rods from the gods’. However, this will remain extremely unlikely without the prospect of sustained and extreme deterioration in international relationships and will be technically difficult to achieve before 2020.”

“Innovation, research and development will originate from more international and diffuse sources and will proliferate widely, making regulation and control of novel technologies more challenging. The exploitation of these may have catastrophic results, especially those associated with nanotechnology, biotechnology and weapon systems. These may be unintended, for example ‘runaway’ nanotechnology or biotechnology, or intended, such as the development and use of directed energy or electromagnetic-pulse weapons.”

“Access to technology that enables the production and distribution of Chemical, Biological, Radiological and Nuclear (CBRN) weapons is likely to increase. A critical indicator of risk is contained in the examples of North Korea and Iran – both in obtaining or seeking nuclear weapons and in exploiting their putative possession for political and economic advantage. In future, much proliferation and threat will be manifest in the ungoverned space between legality and realpolitik, together with the distinct possibility of the acquisition of CBRN material by non-state and rogue elements.”

“In the use of violence and the threat of force, military and civil distinctions will become blurred and weapons and technologies will be more widely available to potential combatants and individuals. The greatest risks of large-scale conflict will be in areas of economic vulnerability, poor governance, environmental and demographic stress and enduring inequality and hardship, especially where there has been a history of recurring conflict (see Figure 2). Most conflicts will be societal, involving civil war, intercommunal violence, insurgency, pervasive criminality and widespread disorder. However, in areas subject to significant demographic and wealth imbalances, there will be a risk of large-scale cross-border migration and exogenous shock. Finally, a trend towards societal conflict will be reflected in the continuing prevalence of civilian casualties, as it takes place in increasingly urbanized situations and human networks.”

“Arms Rivalry. Increasing strategic and possibly inter-bloc competition is likely as a result of the emergence of major new powers. This may stimulate intensive arms races, for example between China and the US, or between regional rivals such as India and Pakistan, reducing resources available for peaceful economic development. The increase in arms spending would probably extend beyond immediate rivals to include their neighbours and partners, thus intensifying regional tensions and increasing the chances of conflict.”

“At the most serious level, space systems could be destroyed or disabled by a burst of solar energy or a natural fluctuation. Similarly, satellites and space platforms could be destroyed or damaged in a deliberate hostile attack, or by being struck by space debris, causing a cascade of collateral damage to other space-based platforms. The damage could be amplified if an element of the chain explodes and emits an electromagnetic pulse. The consequences might include catastrophic failures of critical space-enabled utilities, triggering widespread mass-transport accidents, multiple military and public service system failures and the collapse of international financial systems.”

“Electromagnetic Pulse (EMP) capabilities will probably become operational during the period out to 2035. These could be used to destroy all ICT devices over selected areas, while limiting wider physical and human damage. While military and other high-value networks may be hardened against this threat, most networks and communities on which societies depend will not. The employment of an EMP weapon against a ‘World-City’ (for example, an international business-service hub) would have significant impact beyond the country against which it was targeted. It might even reduce political and business confidence in globalized economic processes to the point that concern about national economic resilience reverses internationally integrative trends, leading to a world increasingly characterized by protection, control and isolationism.”

“The political purpose most commonly envisaged for nuclear weapons has been to deter nuclear attack by, or to offset the conventional superiority of, a potential adversary. Future concerns will centre on the potential acquisition of nuclear weapons by terrorists and other irregular entities, for coercive purposes or to inflict massive casualties. In addition, existing assumptions about the employment of nuclear weapons may be challenged in still more radical ways, including the exploration of neutron possibilities. The ability to inflict organic destruction, while leaving infrastructure intact, might make it a weapon of choice for extreme ethnic cleansing in an increasingly populated world. Alternatively, it might be considered as a basis for a new era of deterrence, both in outfacing irresponsible nuclear powers and in opposing demographically strong nations.”

Doomsday Scenario
“Many of the concerns over the development of new technologies lie in their safety, including the potential for disastrous outcomes, planned and unplanned. For example, it is argued that nanotechnology could have detrimental impacts on the environment, that genetic modification could spiral out of control and that AI could surpass human intelligence without the restraining effect of human social conditioning. Various doomsday scenarios arising in relation to these and other areas of development present the possibility of catastrophic impacts, ultimately including the end of the world, or at least of humanity.”

NOTE: All emphasis and formatting theirs!

by Sherwood Ross
Global Research, February 5, 2007

Although no foreign power has threatened a bioterror attack against America, since 9/11 the Bush administration has allocated a stunning $43 billion to “defend” against one. Critics are now saying, however, that Bush’s newest “biodefense” initiative is both offensive and illegal.

The latest development, according to the Associated Press, is that the U.S. Army is replacing its Medical Research Institute of Infectious Diseases at Fort Detrick, Md., “with a new laboratory that would be a component of a biodefense campus operated by several agencies.” The Army told AP the laboratory is intended to continue research that is only meant for defense against biological threats.

But University of Illinois international law professor Francis Boyle charged the Fort Detrick work will include “acquiring, growing, modifying, storing, packaging and dispersing classical, emerging and genetically engineered pathogens.” Those activities, as well as planned study of the properties of pathogens when weaponized, “are unmistakable hallmarks of an offensive weapons program.”

Boyle submitted his comments to Fort Detrick as part of its environmental impact assessment of the new facility. He pointed out in his letter that he authored the 1989 U.S. law, enacted by Congress, that criminalized violations of the Biological Weapons Convention (BWC).

The Fort Detrick expansion is but one phase of a multi-billion-dollar biotech buildup going forward in 11 agencies, sparked by the unsolved October 2001 anthrax attacks on Congress that claimed five lives and sickened 17.

The attacks, and ensuing panic, led to passage of the BioShield Act of 2004. There is strong evidence, though, the attacks were not perpetrated by any foreign government or terrorist band but originated at Fort Detrick, the huge, supposedly super-safe biotechnology research center. Despite an intensive FBI investigation, no one has been charged with a crime.

Referring to the work undertaken at Fort Detrick, Mark Wheelis, Senior Lecturer in the Section of Microbiology at the University of California, Davis, told Global Security Newswire (GSN) as far back as June 30, 2004, “This is absolutely without any question what one would do to develop an offensive biological weapons capability.”

“We’re going to develop new pathogens for various purposes. We’re going to develop new ways of packaging them, new ways of disseminating them. We’re going to harden them to environmental degradation. We’ll be prepared to go offensive at the drop of a hat if we so desire,” he told GSN.

Alan Pearson, director of the chemical and bioweapons control program at the Center for Arms Control and Non-Proliferation in Washington, told the Baltimore Sun that government scientists must tread carefully lest they wind up “in essence creating new threats that we’re going to have to defend ourselves against.”

Richard Novick, a New York University microbiology professor, has stated, “I cannot envision any imaginable justification for changing the antigenicity of anthrax as a defensive measure.” (That is, to create a new strain for which there is no known vaccine.)

Milton Leitenberg, a University of Maryland arms control advocate, told The Washington Post last July 30th, “If we saw others doing this kind of research [at Fort Detrick], we would view it as an infringement of the bioweapons treaty. You can’t go around the world yelling about Iranian and North Korean programs, about which we know very little, when we’ve got all this going on.”

One alarming example of such federally funded research, reported in the October 2003 issue of New Scientist, is the creation of “an extremely deadly form of mousepox, a relative of the smallpox virus, through genetic engineering.”

The publication warned such research “brings closer the prospect of pox viruses that cause only mild infections in humans being turned into diseases lethal even to people who have been vaccinated.”

Edward Hammond, director of The Sunshine Project of Austin, Texas, a non-profit working for transparency in biological research, said the recreation of the deadly 1918 “Spanish flu” germ that killed an estimated 40 million people worldwide means “the possibility of man-made disaster, either accidental or deliberate, has risen for the entire world.”

Richard H. Ebright, a Rutgers University chemist who tracks arms control issues, told The Baltimore Sun the government’s tenfold expansion of Biosafety Level-4 laboratories, such as those at Fort Detrick, raises the risk of accidents or the diversion of dangerous organisms. “If a worker in one of these facilities removes a single viral particle or a single cell, which cannot be detected or prevented, that single particle or cell can form the basis of an outbreak,” he said.

The current expansion at Fort Detrick flows from a paper penned by President Bush. His Homeland Security Presidential Directive, HSPD-10, written April 28, 2004, states, “Among our many initiatives we are continuing to develop more forward-looking analyses, to include Red Teaming efforts, to understand new scientific trends that may be exploited by our adversaries to develop biological weapons and to help position intelligence collectors ahead of the problem.”

Boyle said the Bush paper is “a smoking gun” fired at the BWC. “Red Teaming means that we actually have people out there on a Red Team plotting, planning, scheming and conspiring how to use biowarfare.”

Boyle traces advocacy for aggressive biowarfare back to the neo-conservative Project for a New American Century (PNAC), whose members, including Paul Wolfowitz, later influenced President George Bush’s military and foreign policy. Before assuming his current post as World Bank head, Wolfowitz served Bush as deputy secretary of defense.

Before the anthrax attacks on Congress, PNAC had asserted that “advanced forms of biological warfare that can ‘target’ specific genotypes may transform biological warfare from the realm of terror to a politically useful tool,” Boyle wrote in “Biowarfare and Terrorism” (Clarity Press).

Biological warfare involves the use of living organisms for military purposes. Such weapons can be viral, bacterial, and fungal, among other forms, and can be spread over a large geographic terrain by wind, water, insect, animal, or human transmission, according to Jeremy Rifkin, author of “The Biotech Century” (Penguin).

Rifkin has written that “it is widely acknowledged that it is virtually impossible to distinguish between defensive and offensive research in the field.” And Jackie Cabasso, of Western States Legal Foundation of Oakland, Calif., noted, “With biological weapons, the line between offense and defense is exceedingly difficult to draw. In the end, secrecy is the greatest enemy of safety.”

She added, “The U.S. is now massively expanding its biodefense program, mostly in secretive facilities. Other countries are going to be suspicious. This bodes badly for the future of biological weapons control.”

Critics following the biowarfare trail at Fort Detrick are wondering if President Bush — who scrapped the nuclear proliferation treaty and then had the Pentagon design new nuclear weapons — isn’t also ignoring the BWC in order to create new germ warfare pathogens.

(Sherwood Ross is an American reporter and columnist. He worked for the Chicago Daily News and has written for wire services and national magazines. Reach him at

Brad Blog:

Guest Blogged by Brad Jacobson of MediaBloodhound

While cable news dutifully devotes nonstop coverage to the latest random criminal cases — kidnappings, shootouts, murderous love triangles, car chases — it’s telling when a supposed break in one of the biggest manhunts in FBI history, for a terrorist who murdered and poisoned multiple American citizens with anthrax, takes a back seat to nearly every other story. That is, if it’s mentioned at all.

Even as details, leaks, and a burgeoning list of questions bubbled to the surface last week, demanding serious scrutiny, the big three broadcast networks were equally blasé. Some nights they skipped mention of the unfolding story altogether, as did last Tuesday’s editions of CBS Evening News and ABC World News (though both that evening reported the eminently newsworthy story of a thrill-seeking English couple who married while strapped to the outsides of separate airplanes). On the same night, Brian Williams afforded 39 precious seconds to the anthrax investigation on NBC Nightly News.

In covering one of the most historic criminal investigations in our nation’s history, the worst bioterrorism attack on U.S. soil, the overall tenor and quality of network reporting (as well as much of the work in mainstream print media) has been nothing short of disgraceful. What America saw, instead, was a dearth of circumspection and a paucity of competent investigative work that mirrors the most feckless moments of the last eight years…

This coverage, delivered in an Orwellian bubble world where our brazenly criminal administration still earns the benefit of the doubt, is all the more indefensible when you factor in the reality that this is a Bush administration investigation, one which had already dragged on for almost seven years, during which time the government was forced to cough up nearly $6 million to settle with a previously wrongly accused man whose reputation and personal life it had destroyed.

As the story unraveled, coverage almost invariably failed to address questions that would be obvious to fictional adolescent sleuths Nancy Drew and the Hardy Boys, and instead showcased a breathless zeal to help the Department of Justice prosecute Ivins through unfiltered and uncorroborated leaks. First came the accusations of “therapist” Jean Duley, who claimed Ivins was a homicidal killer who threatened her life and planned to kill all of his colleagues in a final “blaze of glory.” Duley is known to have a fairly lengthy police record — news that failed to reach national mainstream outlets until the day the FBI/DOJ publicly aired their case, before disappearing again; to my knowledge, Duley’s police record has yet to receive network airtime. Her depth of experience appeared at least suspect: she was still attending Hood College as of last year, and while various media reports called her a “psychiatrist,” “psychologist,” or “social worker,” it turns out Duley is actually an “addictions counselor.” Her affidavit, including the misspelling “theripist” and manic, haphazard penmanship, appears as if it were written by either a second grader or an unstable adult (investigative journalist Larisa Alexandrovna has more on Duley). Then came a leak last Monday courtesy of the Associated Press — quickly and largely debunked by an update of the same article and then further dispelled by a New York Times piece Tuesday — which claimed that, around the time of the anthrax attacks, Ivins had been visiting and harassing members of a Princeton University sorority located near one of the mailboxes used to send the envelopes. Finally, another leak portrayed him as both a porn-obsessed sicko, because he received adult videos at a P.O. box, and a raging alcoholic who nonetheless managed to retain his security clearance to work with some of the most lethal substances on the planet.

While ABC World News ignored the case on Tuesday’s August 5 broadcast, its previous night’s coverage proved that no report might be preferable to a poor one. A segment called “A Closer Look” (video of this segment online included the headline “Closing the Anthrax Case”) focused on the break in the anthrax investigation. It’s a piece of journalism that might be described as anti-investigative work. As the online headline suggested — with the exception of a one-sentence quote from New Jersey Representative Rush Holt (“After seven years of blind alleys and false accusations, we have to ask, well, has the FBI once again let their zeal replace evidence”) — this “closer look” was nothing more than a stenographic replay of the FBI’s storyline, including those damning quotes from Ms. Duley: a present wrapped in a bow to the FBI, the Department of Justice and the Bush administration, but a grave disservice to journalism, victims of the anthrax attacks, the American people, and, quite possibly, the Ivins family. There was nothing remotely closer about this look.

Then there’s those 39 seconds NBC Nightly News dedicated to the Ivins case the following evening. Another example of a report imparting more heat than light, complete with an exclusive leak to NBC News from the Justice Department, seamlessly delivered by Brian Williams:

BRIAN WILLIAMS: Federal officials are telling our justice correspondent, Pete Williams, they will reveal a possible motive tomorrow as to why they believe Dr. Bruce Ivins, the former Ft. Detrick bioweapons expert, sent the anthrax letters, including the one here to NBC. They say he felt badly stung by the criticism that the anthrax vaccine he helped develop for the armed forces back in the first Gulf War could’ve contributed to what’s now known as Gulf War Syndrome. He may have sent the deadly letters, they believe, to generate renewed interest in anthrax as a threat, which would cause demand for an approved vaccine, one that he later, by the way, worked on.

Neither Brian Williams nor his justice correspondent posed any questions regarding this fresh allegation. Failing to demand evidence supporting this new leak or to question its legitimacy before passing it on to millions of viewers and the rest of the media, the dynamic Williams duo acted not as responsible journalists who either considered or cared that government officials might be using them — something any competent and ethical journalist must be on guard against in such situations — but as willing mouthpieces, blithely abdicating their role as members of the Fourth Estate, no more circumspect than White House spokespeople.

Even New York Times journalist Scott Shane, one of the more reliable reporters covering this case, had an odd appearance when he visited PBS’ NewsHour on Monday’s August 4 broadcast. (Yet it was arguably as much or more the fault of NewsHour senior correspondent Margaret Warner.) Earlier in the day, Shane published a Times article with the headline “Anthrax Evidence Is Said to Be Circumstantial” (later edited online to “Anthrax Evidence Called Mostly Circumstantial”), in which he reported in the opening paragraph that “a person who has been briefed on the investigation said on Sunday” that the “evidence amassed by F.B.I. investigators against Dr. Bruce E. Ivins … was largely circumstantial.” But somehow in a lengthy discussion with Shane, neither he nor Warner raised this highly relevant point, each with ample opportunity to do so.

While possible, it seems unlikely that the central finding of a major article Shane published that morning — that the case being brought against Ivins would be predominantly circumstantial — would completely slip his mind later the very same day. What’s more, as regular newscast segments go, Warner conducted a fairly extensive interview. So even if, for the sake of argument, Warner failed to do her homework prior to the interview and missed Shane’s article (the more believable scenario), one would still expect Shane to point out the case’s top-heavy circumstantial nature, if not immediately, then at some point during the discussion. Did NewsHour censor Shane? Did they agree beforehand not to mention that, by Sunday, August 3, the case against Ivins was already believed — by a very credible source close to the investigation — to be built upon “largely” or “mostly” circumstantial evidence? It’s certainly a curious omission, one that, intentionally or not, helped buy the government more time to leak negative information about Ivins before playing its hand on Wednesday.

As it turned out, when the Justice Department held its big press conference two days later, it confirmed Shane’s Monday scoop had been correct. If anything, the report’s characterization of the evidence seeming “mostly” or “largely” circumstantial turned out to be generous. The case against Ivins appears, thus far, completely circumstantial: they couldn’t tie him directly to the anthrax envelopes, prove he made the trip to Princeton around the time the envelopes were mailed, detect the type of anthrax mailed on his body or in his home or car, present any eyewitness accounts putting Ivins in his lab on those nights in late September and early October, or confirm many other colleagues hadn’t used the same flask that federal prosecutors call “effectively the murder weapon.”

Following this far from airtight presentation, journalism professor and author Ted Gup wrote in the Washington Post:

Such evidence, even when seemingly overwhelming and conclusive, is the very sort of circumstantial argument that pegged Richard Jewell as the Atlanta bomber, that linked Oregon attorney Brandon Mayfield to the Madrid bombings, that fingered Los Alamos scientist Wen Ho Lee as a spy, and that cast biodefense expert Steven Hatfill as the original anthrax suspect. In each of those investigations, the news media were largely complicit, conveying incriminating details of the government’s case as if they were the gospel. And yet, in each of those cases, the government was wrong — shaking public confidence even as it eroded individual civil liberties, produced groundless prosecutions and diverted precious time and resources in pursuit of bogus cases.
In June, the government agreed to a settlement with Hatfill valued at $5.8 million. Neither it nor the press, which was only too eager to link arms with the Justice Department in carrying the stories that stripped Hatfill of everything he had, has offered an apology or conceded wrongdoing.

Against this background, who could be blamed for imagining that an innocent Ivins was hounded to his death? Can we discount the accounts that suggest the government repeatedly harassed Ivins’s family, offering his son a reward and sports car if he would turn his father in?

Gup went on to say:

To their credit, in reporting the Ivins case, the media now appear somewhat chastened and more inquisitive than inquisitorial. It may well be that, absent a trial, it will fall to reporters to aggressively test the solidity of the case against Ivins. Perhaps they can restore a measure of credibility to their profession and to the government.

Hopefully he was not holding his breath.

If you turned on CNN and MSNBC the day after Wednesday’s FBI/DOJ presentation, you would’ve found no mention of the Ivins case. Paris Hilton’s scantily clad political spoof? Yes. A child kidnapping ring? You bet. Brett Favre’s trade to the NY Jets? Touchdown. Questions about a case involving the worst bioterrorism attack in U.S. history? Nothing.

On the same Thursday afternoon, a look at their websites found the Ivins case only made MSNBC’s “Other Top Stories,” coming in fourth behind — you guessed it — Brett Favre’s trade to the NY Jets. Of CNN’s 18 top stories, the Ivins case was absent — of course, Favre’s trade is there, as is “Did Caylee’s mom pose as mystery sitter?”; “Owners cuddle, dress pets…then fry them”; “Paris did ad in 4 takes — from memory!”; “McCain, Obama agree on ‘Dark Knight'”; and “Lawyer: Morgan Freeman, wife divorcing.”

And while ABC, CBS and NBC national nightly newscasts covered the DOJ’s case against Ivins on Wednesday, they hardly appeared “chastened” or felt compelled to “restore a measure of credibility to their profession.”

In the CBS Evening News report, introduced with a graphic of a Justice Department file opened to an illustrated report titled “Anthrax Case CLOSED,” anchor Katie Couric and justice correspondent Bob Orr repeated the pattern of laying out the government’s case with little or no questioning of the quality of evidence provided.

Orr framed his segment, saying, “Newly released FBI evidence makes a strong circumstantial case that bioweapons researcher Ivins was a delusional sociopath who had the opportunity, motive and means to be the 2001 anthrax killer.” Interspersed with U.S. Attorney Jeffrey Taylor’s comments from the press conference, Orr’s performance is closer to that of a co-prosecutor acting on the DOJ’s behalf than to that of a journalist assessing the strengths or weaknesses of the evidence, including the flask on which the alleged matching anthrax spores were found: “The most damning evidence,” asserted Orr, “a flask of anthrax spores recovered in 2004 from Ivins’ personal workspace at Ft. Detrick, the Army weapons lab where he worked.”

Yet he failed to mention the gaping hole in this “most damning evidence”: it was already known by then that many of Ivins’ colleagues also had access to the same flask. Moreover, on the day of the FBI/DOJ’s press conference, Paul Kemp, Ivins’ attorney, told the media that the number of people with access to it was far greater than previously reported — not 10 or 20 or 30 people but hundreds. The government soon admitted, by its own count, that more than 100 people could’ve used the flask.

Orr similarly treated other weak strands of the DOJ’s circumstantial evidence, including the alleged “striking” likeness between the threatening letter sent with the anthrax envelopes and the email Ivins wrote to a friend. Orr called Ivins’ email “chilling.” But Ivins’ words aren’t chilling. Nearly everyone in the Bush administration and in the GOP-led Congress, as well as many in the media, often made similar post-9/11 comments. Rather, it’s what Ivins believed Osama Bin Laden might do (“…Bin Laden terrorists for sure have anthrax and sarin gas…”), based on what Bin Laden had said (“…he [Bin Laden] just decreed death to all Jews and all Americans”), that might instill fear. Without further proof, it’s a specious piece of semantic contortion and misappropriation that crumbles under scrutiny.

Moreover, Orr omitted the obvious: Where’s the handwriting analysis? And if one was performed, why aren’t the results being presented to us?

After Orr’s de facto co-prosecution, he ended his report with what should’ve been his lede:

ORR: While the FBI believes it’s now solved the case, the evidence does not directly connect Ivins to the anthrax letters and does not directly tie him to the New Jersey postbox where they were sent out. But with the suspect now dead, the government will never have to prove that case in court.

Which is exactly why Ted Gup noted in his WaPo op-ed, “It may well be that, absent a trial, it will fall to reporters to aggressively test the solidity of the case against Ivins.” Imagine how Professor Gup would grade Orr, Couric and CBS for this report.

NBC Nightly News justice correspondent Pete Williams framed his report somewhat more responsibly, noting upfront, “But this is a circumstantial case with no absolute proof that he did it.” Yet he prefaced this comment with an FBI assertion that, according to the evidence presented, is false on its face: “Amy, the FBI says it can trace the anthrax used in the attacks directly to Dr. Ivins and it says he repeatedly tried to mislead investigators.” Whether or not he misled investigators (also unproven in the evidence proffered), again, the flask sitting in Ivins’ workspace in a shared lab three years later, to which so many colleagues had access — including former employees, like Philip Zack, who were no longer employed at Ft. Detrick when they frequented the lab and worked on unsanctioned, unknown projects — does not “directly” link Ivins to the anthrax used in the attacks. Like CBS’ Orr, Williams then proceeded to state the other main points of the Justice Department’s case without question.

Inclusion of a statement from Ivins’ lawyer was the only substantive difference in this report: “Tonight, a lawyer for Dr. Ivins says the FBI never found anthrax in his house or in his car or anything else directly linking him to the mailings.” But Pete Williams, presumably an expert in covering federal criminal cases, offered no educated assessments of his own on the government’s evidence. Like Orr, Williams did little more than parrot the FBI/DOJ presentation, in a segment edited in a way that only added coherence and credibility to the government’s case. Similar to Orr as well (and to Couric’s “Anthrax Case CLOSED” opening graphic), he punctuated his report with an air of futility and premature closure: “And without a trial, we’ll never hear what Dr. Ivins would’ve said in his own defense.”

Essentially identical to Orr’s and Williams’ reports was justice correspondent Pierre Thomas’ segment on the FBI/DOJ’s presentation for ABC World News, another reiteration of the evidence edited in such a way as to lend more heft and seamlessness to the government’s case while omitting obvious disconnects and holes.

To its credit, however (if we were grading on effort and not execution), World News then followed this segment with another titled “Anthrax Investigation Debunked,” in which anchor Charles Gibson spoke with legal correspondent Jan Crawford-Greenberg:

CHARLES GIBSON: Well, with Ivins’ death, this case will actually never go into a court of law. But would all that evidence have stood up in court? Our legal correspondent, Jan Crawford Greenburg, is joining us from Washington. And Jan, I know you’ve seen the evidence. I want to read you part of a statement that came from lawyers today. They said what the FBI presented with that evidence was all heaps of innuendo, contorted to create the illusion of guilt. How conclusive was it?

JAN CRAWFORD-GREENBERG: Well, Charlie, certainly, there was enough evidence to get an indictment from a grand jury, as Pierre just reported. You know, we saw that he had control over that [sic] anthrax spores, had been linked to a flask in his lab through all of that scientific – that new scientific testing. That we saw his behavior growing increasingly erratic. And of course, he even tried to mislead investigators to say another researcher had control over that anthrax. But this was not an open or shut case by any means. Defense lawyers would have had a lot to work with. For example, there was no DNA, actual DNA, linking Ivins to the anthrax on those letters, his own DNA on those letters. You know and then even when you look at the scientific evidence in that flask, the anthrax spores that were in that flask, Charlie, a lot of researchers in that lab also had access to it.

Yet, once again, there’s no direct evidence Ivins “had control” over those specific anthrax spores or that he solely “had been linked” to that flask in his lab. Quite the opposite. In fact, Crawford-Greenberg went on to contradict the strength of this evidence and her own act of inflating its worth by subsequently noting “even when you look at the scientific evidence in that flask, the anthrax spores that were in that flask, Charlie, a lot of researchers in that lab also had access to it.”

Gibson then posed a question that can’t be asked too much, but his legal correspondent’s response could’ve come straight from the FBI or DOJ:

CHARLES GIBSON: So it might have been a dicey case for the FBI and for prosecutors in court. But whether or not he could have been convicted, this was obviously a rather quirky fellow. What was he doing dealing with deadly toxins?

JAN CRAWFORD-GREENBERG: Well, Charlie, this was someone who had worked in this lab nearly 30 years. He was highly respected, highly regarded by his colleagues. It was only in the later years that his behavior became more erratic. Now we saw some congressmen today calling for more screenings of scientists who handle these dangerous drugs, but there’s no indication that that would have picked up any of his erratic behavior at all.

With so much of the government’s circumstantial evidence resting on Ivins’ alleged ever-deteriorating mental state, purportedly going back at least as far as July 2000 and maybe even to his undergraduate college days, it’s hard to believe his colleagues and supervisors (not to mention his friends and family) would’ve remained so oblivious or unconcerned about such a chronic basket case, especially one whose job entailed handling substances that could potentially unlock a genocidal Pandora’s Box. Moreover, according to the case against him, “his behavior became more erratic” seven years before they revoked his security clearance.

Maybe World News deserves some credit for actually attempting to give this evidence “a closer look” this time. Or maybe it intended only to appear as if it were doing so. Regardless, Crawford-Greenberg’s responses did more to muddle the government’s evidence against Ivins than they did to present viewers with a clear and candid legal assessment.

Compare Gibson and Crawford-Greenberg’s discussion to MSNBC’s Countdown segment aired on the same night, in which investigative journalist Gerald Posner, speaking with host Keith Olbermann, exposed many aspects of the government’s case without mincing words or glossing over its discrepancies and disconnects.

OLBERMANN: The flask of anthrax with identical spores, ostensibly, their strongest piece of evidence. What do you make of this?

POSNER: That’s what they make it sound like, but it’s not. Let me tell you, the lay public hears this, they think that’s the evidence. Those are the spores that got people sick, sent out from the envelopes, not true. That was liquid anthrax in that flask.

Even if the FBI can tie it to that flask, they can’t explain how it was then made into this extremely sophisticated type of weapon with small milligramage with electric charges to it, with polyglass on top of the coating, all to go deep inside the lungs, to spray into the air. This was weaponized, military anthrax. They cannot explain how it went from that glass flask in a liquid form into the form that was sent out in the envelopes. That they don’t have the evidence on.

OLBERMANN: What, if anything they presented today, is the strongest evidence? What do they got going for them?

POSNER: Well, they threw out this machine, what they called the lyophilizer, they say that can make wet anthrax into dry anthrax, but I talked to six different microbiologists today and people involved formerly in weapons programs in the United States and in Russia, who say that the machine that the FBI talks about can’t do that. [What a novel journalistic technique — speaking with other experts to confirm the credibility of the government’s case.]

The strongest evidence they have going for them is also their Achilles’ heel and that’s his psychological profile. The fact that he’s very unstable, that he was someone who was an alcoholic, that he might have wanted to have the vaccine continue to go along, but that’s also the fact that he could have been set up as a cutout, a patsy, or used by a group of people who wanted the anthrax out there.

They also knew about his weak psychological profile. How was he employed with the most secret biological warfare lab in the United States with this type of background that we now hear about that they should have known about from day one? The Defense Department should hang its head in shame.

OLBERMANN: Right. Thirty-five years of murderous intent and nobody knew about it, and they let him in to the germ warfare lab. As to motive, they mentioned it but almost as if it were in passing. Is that a weak part of the case? Do they offer anything that made any sense?

POSNER: Boy, I’ll tell you, I thought it was a weak part of the case. I listened to the press conference today and then sort of at the end as though they thought they had to throw something out, they said, “Oh, by the way, let’s give you the reasons to why we think he sent out and went on this homicidal rage.”

And the motive they said was, “Well, he helped develop a vaccine for anthrax, he probably wanted to continue to see that developed so that by killing people, by having come up with some unknown way of this high military grade anthrax. We would keep the vaccine program going.”

That was pretty weak, and, you know, I thought they just literally were fishing. They don’t have a good motive, unfortunately, for them and their prosecution. But as you said in the lead into this, they don’t need to because the primary suspect, the only suspect, is dead. They’re going to close this case.

OLBERMANN: But the declaration that he is the only, it’s not just a question of proving a dead man did this or was part of this, but the insistence is he did by himself, the lone, mad scientist thing. Did they get anywhere near confirming that?

POSNER: No. As a matter of fact, Keith, that’s my major problem with this. You know, if you look at it and you say, “He’s involved, he’s got a role in it, he’s done something.” That, the evidence, I’m waiting to see that and they may nail that down. But I spoke to enough experts in the last few days who have convinced me, who know how this process works, that these spores that were sent out, were not the work of one lone scientist and that, I believe, is the case.

Nevertheless, this story disappeared from network news studios by the following morning. No mention on TV Thursday on CNN or MSNBC, nor on NBC, CBS or ABC’s national nightly newscasts. Nor did it warrant any further network coverage Friday, Saturday or Sunday.

Dr. Bruce Ivins is dead. He may have been the anthrax killer and acted alone. He may have acted with others. (Based on the known evidence, both of these two scenarios seem less likely with each passing day.) He may have just been a convenient fall guy. (As Gerry Andrews, microbiologist and former longtime colleague of Ivins, wrote in a New York Times editorial yesterday: “After the anthrax attack, Dr. Ivins himself worked directly with the evidence. The F.B.I. asked Dr. Ivins to help them with the forensics in the case by analyzing the contents of suspicious letters. And he did so for years, until the authorities began to suspect that the anthrax spores used in the mailings might have originated from his lab. [Awfully convenient, no?] Dr. Ivins, for instance, was asked to analyze the anthrax envelope that was sent to Mr. Daschle’s office on Oct. 9, 2001. When his team analyzed the powder, they found it to be a startlingly refined weapons-grade anthrax spore preparation, the likes of which had never been seen before by personnel at Fort Detrick.”) The person or persons who murdered and poisoned Americans with those anthrax letters may even have framed him. The FBI may have also driven Ivins to take his own life after relentlessly hounding him and his family for a crime he never committed.

But the FBI and DOJ wanted this case closed. Now. And in one of the most important criminal investigations in our nation’s history, for the deadliest bioterrorism attack on U.S. soil — which our government, with help from Brian Ross and ABC News’ curiously sourced false reporting, initially used to build support for invading Iraq — the networks (Olbermann’s Countdown coverage notwithstanding) have thus far refused to substantively question this historically corrupt government’s circumstantial case against a dead man who will never have his day in court.

By the way, have you heard that John Edwards cheated on his wife?

Cross-posted at Media Bloodhound…


Brad Jacobson, a Brooklyn-based freelance writer, media critic, independent journalist and satirist, is the founding editor and writer of MediaBloodhound.


Previously Related at The BRAD BLOG: