Posts Tagged ‘WMD’s’

CNN | Mar 22, 2010

By Tom Watkins

(CNN) — Federal health authorities recommended Monday that doctors suspend using Rotarix, one of two vaccines licensed in the United States against rotavirus, saying the vaccine is contaminated with material from a pig virus.

“There is no evidence at this time that this material poses a safety risk,” Food and Drug Administration Commissioner Dr. Margaret Hamburg told reporters in a conference call.

Rotarix, made by GlaxoSmithKline, was approved by the FDA in 2008. The contaminant material is DNA from porcine circovirus 1, a virus from pigs that is not known to cause disease in humans or animals, Hamburg said.

About 1 million children in the United States and about 30 million worldwide have gotten Rotarix vaccine, she said.

Raw Story | Apr 2nd, 2010

In what is being hailed as a major victory for workers in the biotech and nanotech fields, a former scientist with pharmaceutical firm Pfizer has been awarded $1.37 million for being fired after raising the alarm over researchers being infected with a genetically engineered “AIDS-like” virus.

Becky McClain, a molecular biologist from Deep River, Connecticut, filed a lawsuit against Pfizer in 2007, claiming she had been wrongly terminated for complaining about faulty safety equipment that allowed a “dangerous lentivirus” to infect her and some of her colleagues.

The Hartford Courant describes the virus as “similar to the one that can lead to acquired immune deficiency syndrome, or AIDS.” Health experts testified that the virus has affected the way McClain’s body processes potassium, which they say causes McClain to suffer complete paralysis as often as a dozen times per month, the Courant reports.

McClain’s lawsuit (PDF) asserted that Pfizer had interfered with her right to free speech, and that she should have been protected from retaliation by whistleblower legislation.

Government Computer News:

Technology has always been essential to military strength, but breakthroughs developed within the military often are not limited to weapons. This special report introduces some of the Pentagon’s most advanced information technology projects, in the context of their relation to commercial products and battlefield necessities.

The Defense Advanced Research Projects Agency has fostered technologies ranging from the Internet to artificial intelligence research. Nowadays, the scientists it supports are pushing IT ever closer to achieving the processing power and cognitive awareness of living beings. At the same time that DARPA is applying technology to the pressing threats posed by current conflicts, the agency is sponsoring more than a dozen innovative projects, including a bid to perfect cheap, extremely accurate and nonradioactive atomic clocks for use in battlefield systems.

Advances in the mathematical algorithms for cryptography and the processing muscle behind them soon will transform the platforms that handle cascades of classified data, for example. National Security Agency officials characterize their work as a process of continuous ploy and counterploy in the rarefied realms of logic and computing.

The Grand Challenge of bringing practical, remotely piloted or autonomous land vehicles into use also is advancing via the competitive work of several teams. And in its approach to supercomputing, the Defense Department could be changing the way high-performance systems are measured, developed and purchased.

Mutating threats shape DARPA’s research in a wide range of new technologies

In a conflict where the biggest threats to soldiers often are low-tech, homemade explosives, it might not be obvious why troops need a more precise atomic clock to support their efforts. But the Defense Advanced Research Projects Agency is working to deliver such precision, along with 13 other Future Icons that span a range of science and technology, from networking to air vehicles, biology and lasers, DARPA Director Tony Tether said.

The Chip Scale Atomic Clocks (CSACs), for instance, would perform key control functions throughout Pentagon networks and also could help warfighters detect an enemy’s presence.

All the Future Icon projects involve applying computing resources to present and future defense missions, and some directly attack the problem of improving information technology performance for existing systems and futuristic computer architectures.

And they are the types of projects whose impact often extends beyond their original scope, affecting the development of technologies used elsewhere in government and commercially.

“They are tremendously difficult technical challenges that will be hard to solve without fundamentally new approaches — ones which may require bringing multiple disciplines to bear and perhaps even result in entirely new disciplines,” Tether said in testimony submitted recently to the House Armed Services Subcommittee on Terrorism, Unconventional Threats and Capabilities.

One of the most ambitious of the futuristic computer design projects is a five-year project to build a system modeled on the human brain, which would reflect and incorporate human assessments of the roles and intentions of people (see sidebar).

Shape shifters
The research agency is also probing highly advanced IT challenges such as the Programmable Matter project, which aims to develop software that would allow physical objects to change their size, shape, color and other attributes to fulfill changing functions within, say, a military communications system.

CSACs would tackle more immediate concerns in defense networks and in helping soldiers detect enemy vehicles and facilities, according to a leading scientist at the National Institute of Standards and Technology who is researching the technology with DARPA support.

DARPA’s research is honing computer-based methods of detecting purposely hidden or naturally elusive enemy targets underground or on the high seas.

The CSAC project has been driven by the increasing need to reliably assure continual synchronization of systems linked via the Global Information Grid, said Thomas O’Brian, chief of the Time and Frequency Division at NIST’s laboratory in Boulder, Colo. The lab receives DARPA funding to support the development of chip-scale atomic clocks.

The tiny clocks could be deployed in hundreds of systems that military organizations at all levels rely on, including not only radios but also radars, sensors and location units that use the Global Positioning System, O’Brian said in an interview. The atomic clocks promise to make GPS systems more reliable while using little power, along with providing other helpful features, such as low weight and small size, he continued.

The CSACs “are significantly more accurate than the quartz crystal units, which have been the standard” for such timekeeping, O’Brian said. The new generation of small clocks relies on the vibration frequency of elements such as cesium and rubidium to maintain steady timekeeping and does not involve radioactive materials.

The tiny clocks can operate for two days or more on the power available in an AA battery, O’Brian said.
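
To get a feel for the numbers involved, here is a rough back-of-the-envelope sketch in Python. The fractional-frequency stabilities used (about 1e-8 for a free-running quartz oscillator, about 1e-11 for a chip-scale atomic clock) are representative assumptions for illustration, not figures from the article.

```python
# Accumulated timing error for two clock technologies over one day,
# and the GPS-style ranging error that drift would imply.
# Stability figures below are illustrative assumptions.

C = 3.0e8        # speed of light, m/s: 1 ns of clock error ~ 0.3 m of range error
DAY = 86_400     # seconds

def drift_seconds(fractional_stability: float, elapsed_s: float) -> float:
    """Worst-case time error accumulated at a constant fractional frequency offset."""
    return fractional_stability * elapsed_s

for name, stability in [("quartz (~1e-8)", 1e-8), ("CSAC (~1e-11)", 1e-11)]:
    err = drift_seconds(stability, DAY)
    print(f"{name}: {err * 1e6:9.3f} microseconds/day "
          f"-> {err * C / 1000:8.1f} km equivalent ranging error")
```

The three orders of magnitude between the two lines are why a CSAC can hold a useful timing reference through a GPS outage while a quartz oscillator cannot.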

“Another aspect of these devices is that they can serve as magnetometers,” he added. As such, the CSACs could sense the presence of metallic objects, such as mines or tanks. “You could scatter them across a wide area so when a Jeep or tank drives over, they might detect it,” O’Brian said. “Or they could detect the presence of ventilating fans in [al Qaeda caves] in Tora Bora [Afghanistan].”

CSACs already have proved themselves in demonstrations using GPS devices, and the technology showed that it could help navigation units function when satellite signals aren’t available, O’Brian said.

Some of the main tasks remaining before the CSACs reach routine use include:

  • Developing efficient, low-cost mass-production methods.
  • Improving the small clocks’ resistance to field conditions such as vibration, temperature and pressure variations and shock.
  • Reducing power consumption.

O’Brian expressed confidence that researchers could soon achieve those improvements.

The research agency’s push in the fields of “detection, precision identification, tracking and destruction of elusive targets” has spawned several research projects. One group of them aims to improve methods for finding and investigating caves, and another centers on tracking seaborne vessels.

The cave research has gained momentum partly from the response of adversary countries’ forces to the success of the Pentagon’s spy satellite technology. Countries such as Iran and North Korea reportedly have built extensive underground facilities to conceal some of their nuclear-weapon production facilities from orbiting sensors.

The underground research spurred by such strategic threats also has led DARPA to study how better cave technology can aid tactical operations, such as by helping soldiers discover enemy troops and weapons lurking in small caves and by helping detect cross-border smuggling tunnels.

The Counter-Underground Facilities program aims at developing sensors, software and related technology to:

  • Pinpoint the power, water, airflow and exhaust vents from cave installations.
  • Evaluate the condition of underground facilities before and after attacks.
  • Monitor activities within cave structures during attacks.

According to DARPA procurement documents, the Pentagon’s cave program began by developing methods to learn about those conditions and other features of caves via Measurement and Signature Intelligence (Masint) technology.

Masint methods involve the use of extremely sophisticated and highly classified technology that can integrate information gathered by various types of sensors, including acoustic, seismic, electromagnetic, chemical, multispectral and gravity-sensing devices.

DARPA’s underground facility research also involves investigating the effluents coming from vents connected to cave complexes. This Effluents for Vent Hunting research can involve the computerized evaluation of smoke to distinguish, for example, between decoy cooking fires and real cooking fires in an area where hostile forces may be roaming.

On the high seas, the Predictive Analysis for Naval Deployment Activities (PANDA) project is refining its existing technology to track the location and patterns of more than 100,000 vessels and to detect when ships and boats deviate from normally expected behavior.

Suspicious behavior
As such, the PANDA research is similar to other systems that use exception detection to pinpoint unusual behavior by people in airports or train stations. Developers of those counterterrorism systems have carved out the task of teaching systems what types of events to watch for among the countless mundane activities observed via video cameras in the transportation hubs.

Like the PANDA system, the exception-detection software for airports flags unusual events — such as an errant freighter in one case or an unattended satchel in the other — and brings them to the attention of human analysts.
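
For readers unfamiliar with exception detection, a minimal sketch of the core idea follows, in Python with invented vessel data: learn a normal band from history, then flag observations that fall far outside it. PANDA's actual models are, of course, far richer than this.

```python
# Toy "exception detection": flag a vessel whose observed speed deviates
# sharply from its learned history. Data values are invented.
from statistics import mean, stdev

historical_speeds_knots = [11.8, 12.1, 12.4, 11.9, 12.0, 12.2, 12.3, 11.7]
mu, sigma = mean(historical_speeds_knots), stdev(historical_speeds_knots)

def is_exception(observed_knots: float, threshold_sigmas: float = 3.0) -> bool:
    """True if the observation falls outside the learned normal band."""
    return abs(observed_knots - mu) > threshold_sigmas * sigma

print(is_exception(12.1))  # False: routine transit, ignored
print(is_exception(2.5))   # True: an errant, loitering freighter, sent to an analyst
```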

At the edges of computer science, DARPA is approaching the problem of attracting and cultivating talent for the field partly by asking promising students to choose projects that strike them as interesting and attractive.

“One of the ideas the students liked is Programmable Matter,” Tether told the congressional subcommittee members. “It is an important idea that is of significant relevance to DOD. The challenge is to build a solid object out of intelligent parts that could be programmed so that it can transform itself into other physical objects in three dimensions. It would do this by changing its color, shape or other characteristics.”

The programmable matter project could, for instance, lead to the invention of a malleable antenna that could change its shape depending on the radio or radar to which it is connected, Tether said.

“The computer science challenges are to identify the algorithms that would allow each element of the object to do its job as the object changes, while staying well coordinated with the other elements and functioning as an ensemble,” he added.

DARPA throws down the challenge on cognitive computing

The Defense Advanced Research Projects Agency’s research in the field of cognitive computing could progress to the point of a Grand Challenge that would pit alternate methods of building brainlike systems against one another.

The agency’s Biologically-Inspired Cognitive Architecture program is pushing artificial intelligence in the direction of building software that mimics human brain functions.

BICA relies on recent advances in cognitive psychology and the science of the human brain’s biological structure to build software that comes much closer to human abilities than previous AI. The research agency’s Information Processing Technology Office is leading the BICA research process by funding research teams based mainly at universities.

AI traces its roots back to designs such as expert systems and neural networks, familiar since the 1980s, which held out the promise of transforming information technology by adopting human learning and thinking methods. Those classic AI approaches proved to be useful in some commercial and government systems but were less effective than conventional IT architectures for most uses.

BICA’s leaders note that AI progress has been slow and steady in recent decades. “However, we have fallen short of creating systems with genuine artificial intelligence — ones that can learn from experience and adapt to changing conditions in the way that humans can,” according to DARPA. “We are able to engineer specialized software solutions for almost any well-defined problem, but our systems still lack the general, flexible learning abilities of human cognition.”

The BICA program has completed its first phase, which commissioned eight research teams to combine recent findings in brain biology and psychology to help build blueprints for functioning computers that could learn and understand like people. In the second phase of the five-year BICA program, which is now under way, the military research agency is seeking proposals for vendor teams to develop and test models of human cognition, or thinking, based on the architectures built in the program’s first year.

DARPA has not yet announced plans for a grand challenge competition to pit the resulting AI-like systems against one another. But vendor documents submitted in response to BICA’s first phase refer to an anticipated challenge stage of the program.

The University of Maryland at College Park provided one of the computer architectures for the first phase of the BICA program, basing some of its research on methods of designing a mobile system that could learn the various skills DARPA seeks in a cognitive system. “We are ultimately interested in [designing] an agent that captures many of the abilities of a child, and thus do not focus on a large initial knowledge base,” the University of Maryland computer scientists wrote.

“We keep the environment and input/output to the system relatively simple so that we can focus on the primary issue of integrating those components and not the important but low-level details that will eventually need to be addressed,” according to their blueprint.

The 14 Future Icon technology areas, as described in testimony by Defense Advanced Research Projects Agency Director Tony Tether before a House committee:

Networks: Self-forming, robust, self-defending networks at the strategic and tactical level are the key to network-centric warfare.

Chip-Scale Atomic Clock: Miniaturizing an atomic clock to fit on a chip to provide very accurate time as required, for instance, in assured network communications.

Global War on Terrorism: Technologies to identify and defeat terrorist activities such as the manufacture and deployment of improvised explosive devices and other asymmetric activities.

Air Vehicles: Manned and unmanned air vehicles that quickly arrive at their mission station and can remain there for very long periods.

Space: The U.S. military’s ability to use space is one of its major strategic advantages, and DARPA is working to ensure the United States maintains that advantage.

High-Productivity Computing Systems: DARPA is working to maintain the U.S. global lead in supercomputing, which is fundamental to a variety of military operations, from weather forecasting to cryptography to the design of new weapons.

Real-Time Accurate Language Translation: Real-time machine language translation of text and speech with near-expert human translation accuracy.

Biological Warfare Defense: Technologies to dramatically accelerate the development and production of vaccines and other medical therapeutics from 12 years to only 12 weeks.

Prosthetics: Developing prosthetics that can be controlled and perceived by the brain, just as with a natural limb.

Quantum Information Science: Exploiting quantum phenomena in the fields of computing, cryptography and communications, with the promise of opening new frontiers in each area.

Newton’s Laws for Biology: DARPA’s Fundamental Laws of Biology program is working to bring deeper mathematical understanding and accompanying predictive ability to the field of biology, with the goal of discovering fundamental laws of biology that extend across all size scales.

Low-Cost Titanium: A completely revolutionary technology for extracting titanium from ore and fabricating it promises to dramatically reduce the cost for military-grade titanium alloy, making it practical for many more applications.

Alternative Energy: Technologies to help reduce the military’s reliance on petroleum.

High-Energy Liquid Laser Area Defense System: Novel, compact, high-power lasers making practical small-size and low-weight speed-of-light weapons for tactical mobile air and ground vehicles.

NSA pushes for adoption of elliptic-curve encryption, whose greater security and shorter key lengths will help secure small, mobile devices

The cryptographic security standards used in public-key infrastructures, RSA and Diffie-Hellman, were introduced in the 1970s. And although they haven’t been cracked, their time could be running out.

That’s one reason the National Security Agency wants to move to elliptic-curve cryptography (ECC) for cybersecurity by 2010, the year the National Institute of Standards and Technology plans to recommend all government agencies move to ECC, said Dickie George, technology director at NSA’s information assurance directorate.

Another reason is that current standards would have to continually extend their key lengths to ensure security, which increases processing time and could make it difficult to secure small devices. ECC can provide greater security with shorter keys, experts say.

The switch to ECC will be neither quick nor painless. It will require mass replacement of hardware and software to be compatible with ECC and new NSA cybersecurity standards.

In fact, the 2010 goal might not be realistic for NSA, where more than a million different pieces of equipment will need to be moved to ECC, George said. NSA’s move could take as long as 10 years to complete, given the project’s complexity and scope. The agency has not set a specific deadline for completing its Cryptographic Modernization initiative, started in 2001, and recognizes that cybersecurity will always be a moving target, he said. The move to ECC is part of that initiative.

ECC, a complex mathematical algorithm used to secure data in transit, will replace RSA and Diffie-Hellman because it can provide much greater security at a smaller key size. ECC takes less computational time and can be used to secure information on smaller machines, including cell phones, smart cards and wireless devices.

The specifics of the changeover were announced in 2005 with NSA’s release of its Suite B Cryptography standards. Suite B falls under NSA’s Cryptographic Modernization initiative and details ECC usage for public keys and digital signatures. The announcement, the first related to cryptographic standards in 30 years, was a watershed event, said Bill Lattin, chief technology officer at Certicom, a pioneer in ECC.

NSA has licensed approximately 25 of Certicom’s ECC patents for use by the government and vendors that develop defense products.

The move to ECC represents a new way of doing business for the NSA. The Cryptographic Modernization initiative “is not just replacing the old with the new. We are upgrading the entire way we do communications,” George said.

Interoperability is the core of the new communications program and the reason for the modernization initiative. NSA plans to work closely with other governments, U.S. departments and agencies, first responders, and the commercial sector, George said. To do so, the agency needs public-key algorithms to securely transmit information among all parties, he said.

“If you go back 30 years, things weren’t nearly as interoperable as they are now. In today’s world, everything is being networked. We have to allow interoperability. And the cryptography has to match [among devices] because if it doesn’t, it is not going to be interoperable,” George said.

These interoperability goals will most likely extend across federal, state and local governments in addition to law enforcement agencies nationwide.

Although RSA and Diffie-Hellman are both public-key algorithms, experts say they don’t scale well for the future. To make RSA and Diffie-Hellman keys, which now can go to 1,024 bits, secure for the next 10 to 20 years, organizations would have to expand to key lengths of at least 2,048 bits, said Stephen Kent, chief scientist at BBN Technologies. Eventually, key sizes would need to expand to 4,096 bits. “That’s enormous keys. To do the math operations underlying the keys takes longer and is more computationally intensive,” Kent said.
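
To make the gap concrete, here is a minimal sketch using the Python cryptography package (my choice of tool, not one named in the article). NIST guidance rates a 256-bit curve such as P-256, one of the Suite B curves, as comparable in strength to roughly 3,072-bit RSA.

```python
# Generating comparable-strength key pairs: ~3,072-bit RSA versus 256-bit ECC.
from cryptography.hazmat.primitives.asymmetric import ec, rsa

rsa_key = rsa.generate_private_key(public_exponent=65537, key_size=3072)
ecc_key = ec.generate_private_key(ec.SECP256R1())   # NIST P-256, a Suite B curve

print(rsa_key.key_size)         # 3072-bit modulus
print(ecc_key.curve.key_size)   # 256 bits, roughly equivalent security
```

The smaller key means smaller signatures, less bandwidth and far cheaper math, which is exactly the property that matters for cell phones, smart cards and radios.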

Thus, NSA’s decision to move to ECC, which appears to be the only option. Experts agree that there is no new technology comparable to ECC. Although there are a number of protocols, there are only two basic technology approaches, George said: integer-based cryptography, used by RSA and Diffie-Hellman, and ECC.

“ECC is the only impressive thing out there,” Kent said. “People don’t get excited every time a new thing comes along. We wait several years and let people try to crack it first. ECC definitely passed the test in this regard.”

NIST, which develops governmentwide cybersecurity standards, also sees a need to move to ECC, although its recommendations are less stringent than NSA’s, whose ECC guidelines are a subset of NIST’s.

“I’m pretty sure [RSA and Diffie-Hellman] will be broken within a decade or so,” said Bill Burr, manager of NIST’s security technology group. “We are trying to end the use for most purposes of RSA and Diffie-Hellman with 1,000-bit keys by the end of 2010. And if you are real conservative, we are late.”

“NSA has been fairly aggressive to standardize on ECC,” Burr said. “We are slower, partly because we think it will naturally happen anyhow.”

John Pescatore, vice president and analyst at Gartner, does not see a need for the average user to switch to ECC unless it is to take advantage of its smaller size, such as securing cell phones and smart cards. With NSA, those technologies might include “things that a soldier carries around…and [has] strict limits on power consumption,” Pescatore said.

Burr expects ECC to become a universal standard by 2020, when most ECC patents owned by Certicom expire. “If it’s not a big problem today, it may be hard for the CIO to motivate people to transition to ECC,” said Kent.

DARPA’s Grand Challenge moves downtown, where teams will test their vehicles against city traffic

The Defense Advanced Research Projects Agency’s competition for autonomous vehicles has seen great leaps forward in its first two incarnations. This year, the ride could get rather bumpy, as the Grand Challenge moves from the expanses of the desert to the mean streets of the city.

The competition, called the Urban Challenge for 2007, is no mere sporting event. DARPA’s goal is to use the challenge to help develop technologies for self-guiding military vehicles that could reduce the deadly toll of vehicular-related battlefield casualties among U.S. military personnel.

Approximately half the U.S. soldiers killed to date in Iraq have died in enemy attacks on vehicles, whether by live enemy fire or by improvised explosive devices or, to a lesser extent, in vehicular accidents.

Based on results from the two previous Grand Challenges and a preliminary look at the entrants in DARPA’s Urban Challenge contest now under way, “we think that over time we will be able to build vehicles that will be able to drive as well as humans in certain situations,” said Norman Whitaker, program manager for DARPA’s Urban Challenge.

In May, DARPA trimmed the roster of teams competing in the Urban Challenge from 89 to 53 and will further narrow the field to 30 semifinalists this week based on scores issued during site visits DARPA officials have been conducting since May. The agency also will name this week the location of the competition’s Qualification Event scheduled for Oct. 26 to 31 and the location for the final contest Nov. 3.

To date, DARPA has said only that both events would take place in the western United States. The contest’s placement in a simulated urban combat zone has become the theme of this year’s competition, and it has considerably upped the ante for the level of vehicle proficiency required to successfully complete the contest’s 60-mile course in six hours.

The complexities of a city environment and the introduction this year of other moving vehicles along the course exponentially increase the sophistication of the sensing, data processing and guidance technologies required, Whitaker said.

DARPA’s goal in its successive challenges is to raise the bar each time, he said, although the addition of moving traffic represents the biggest obstacle ever added to the contest.

The first Grand Challenge in 2004 ran over a 142-mile course in the desert, but the competition looked more like the Keystone Cops than Knight Rider — no vehicle made it past the eight-mile mark. Still, DARPA officials said they saw promise, which came to fruition in 2005, when four vehicles covered a 132-mile desert course. With those results, the decision was made to take the Grand Challenge downtown.

With an urban setting and traffic, vehicles “have to make decisions fast, so we’ve speeded up the timeframe” in which vehicles must receive sensor data, process it and respond, all without human intervention, Whitaker said. “As usual, we’ve taken it to the nth degree and said we want full autonomy. By [asking for an extreme], we get a lot of the middle ground covered.”

The placement of this year’s contest in a dynamic setting creates demands unheard of in previous challenges and requires technological advancements that will bring self-guided vehicles close to reality, participants say.

“This year we have moving objectives and that dynamic interaction is new and very difficult,” said Gary Schmiedel, vice president of the advanced product engineering group at Oshkosh Truck, one of the corporate entrants in this year’s Urban Challenge and one of the teams that successfully completed the 132-mile course in 2005. “This brings us much closer to a real-world application of the technology and means that we have to build a truck that’s as versatile as you or I would be.”

At the level of sophistication that will be required in this year’s contest, “this is really a software competition, not a hardware competition,” said David Stavens, a doctoral candidate at Stanford University who’s working on Stanford’s entry in the Urban Challenge and was a co-creator of Stanley, the modified Volkswagen Touareg sport utility vehicle that won the 2005 Grand Challenge.

The Stanford team, consequently, is spending much of its time this year working on probabilistic algorithms and machine learning capabilities and is tackling the problem with help from the Stanford Artificial Intelligence Laboratory, Stavens said. Probabilistic algorithms will help this year’s Stanford entry, Junior, a Volkswagen Passat station wagon, deal with uncertainties along the course, while machine learning will enable the team to program the car with human-like driving skills.
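
As a flavor of what "probabilistic algorithms" means here, the sketch below shows a generic one-dimensional Kalman-style update that fuses noisy range readings into a running estimate with explicit uncertainty. It is a textbook illustration, not the Stanford team's code.

```python
# One-dimensional Kalman-style measurement update: each noisy reading
# nudges the estimate in proportion to how uncertain we currently are.

def kalman_update(est: float, est_var: float, meas: float, meas_var: float):
    """Blend prior estimate and new measurement, weighted by confidence."""
    gain = est_var / (est_var + meas_var)
    return est + gain * (meas - est), (1.0 - gain) * est_var

estimate, variance = 0.0, 1000.0               # start nearly ignorant
for reading in [8.9, 9.4, 9.1, 9.0, 9.2]:      # invented noisy ranges, meters
    estimate, variance = kalman_update(estimate, variance, reading, meas_var=0.5)
    print(f"estimate = {estimate:5.2f} m, variance = {variance:6.3f}")
```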

“By driving other roads, you can gain enough knowledge that the robot will be able to handle the Urban Challenge course just fine,” Stavens said. “This is a very rich subset of the skills that you and I would use when we jump in our own cars and go driving, but this type of technology can save our soldiers’ lives in the battlefield and save lives in the civilian world.”

After this year’s challenge, DARPA will evaluate whether the contests have advanced the technology enough to make commercial production of autonomous vehicles for the military feasible and economically practical, Whitaker said. After an experiment along the lines of the challenges, “there’s an intermediate phase before [the military] goes out and starts buying systems. It could also be that we’ll need to see more work on the commercial side,” he said.

Teams build on technologies from past challenges

As the agency that created the Internet and nurtured it through its early years, the Defense Advanced Research Projects Agency has a long history of transferring its technical innovations from military to civilian use. The Grand Challenge will likely prove to be another example.

Although the challenge’s primary goal is developing driverless military vehicles, DARPA has organized the competitions with the expectation that technologies created for them will be applied in the private sector, too.

Many of the corporate Grand Challenge participants, in fact, look at it as an opportunity to test and perfect — in demanding military conditions — technologies they will later adapt for industrial or civilian use.

Velodyne Acoustics, a maker of high-fidelity stereo and home theater equipment, entered the 2005 Grand Challenge and invented laser-based sensors for its vehicle that it has now sold to participants in the 2007 Urban Challenge.

The company also is marketing its invention to prospects in several industries, said Michael Dunbar, Velodyne’s business development manager.

David Hall, the company’s founder, chief executive officer and chief engineer, along with his brother, Bruce, Velodyne’s president, entered a vehicle in the 2005 Grand Challenge as Team DAD (for Digital Audio Drive). While working on the project, they identified shortcomings with the laser-based light detection and ranging (Lidar) scanners used alone or in combination with cameras as the eyes in the guidance systems of autonomous vehicles, Dunbar said. Lidar systems available on the market at the time scanned for objects only along a single, fixed line of sight.

In response to those limitations, David Hall, an avid inventor, created his own Lidar scanner consisting of an assembly of 64 lasers spinning at 300 to 900 rotations per minute, capable of detecting objects anywhere in a 360-degree horizontal field. The Velodyne Lidar assembly produces 1 million data points per second, compared with the 5,000 data points a second of earlier systems.
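
Those two figures imply a useful back-of-the-envelope resolution estimate; the sketch below assumes a mid-range spin rate of 600 RPM, since the article gives only the 300-to-900 band.

```python
# What 1 million points/second spread over 64 lasers buys you per rotation.
POINTS_PER_SECOND = 1_000_000
LASERS = 64
SPIN_HZ = 600 / 60                 # assumed 600 RPM -> 10 rotations per second

points_per_rotation = POINTS_PER_SECOND / SPIN_HZ      # 100,000
points_per_laser = points_per_rotation / LASERS        # ~1,562
azimuth_step_deg = 360 / points_per_laser              # ~0.23 degrees

print(f"{points_per_rotation:,.0f} points per revolution, "
      f"{points_per_laser:,.0f} per laser, "
      f"~{azimuth_step_deg:.2f} degree azimuth spacing")
```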

Velodyne doesn’t have a vehicle in this year’s Urban Challenge but has sold its HDL-64 Lidar scanner to 10 Challenge participants that have included it on their vehicles, either alone or in conjunction with optical sensors, Dunbar said. “Some of the teams can use our sensor and eliminate other types of sensors so [the sensor data] is much easier for them to manipulate,” he said.

By setting its own benchmarks for supercomputing systems, DOD gets better performance — and might change how HPC systems are procured

Twice a year, work being done by the world’s fastest supercomputers comes to a screeching halt so the systems can run a benchmark called Linpack to determine how fast they are, at least in relation to one another. Linpack — which measures how many trillions of floating-point operations per second the machine is capable of executing — is the benchmark used to rank the fastest supercomputers in the world, in the twice-annual Top 500 List.

As an exercise in flexing muscle, Linpack is about as useful as any other benchmark. But as a tool for judging supercomputing systems in a procurement process, it is limited at best. The Defense Department, through its High Performance Computing Modernization Program, is shaking up the supercomputing world by applying a more disciplined approach to purchasing big iron.

Instead of using a generic benchmark to compare models, the program issues a set of metrics that carefully codifies its own workload. Program leaders then ask vendors to respond with the best — yet most cost-effective — systems they can provide to execute such a workload.

“We don’t specify how big the machine is,” said Cray Henry, head of the program. “We will run a sample problem of a fixed size, and call the result our target time. We then put a bid on the street and say we want you to build a machine that will run this twice as fast.” It is up to the vendor to figure out how that machine should achieve those results.

Sounds simple, but in the field of supercomputers, this common-sense approach is rather radical.

“It’s a well-oiled process,” agreed Alison Ryan, vice president of business development at SGI. She said that for vendors, “this kind of procurement is actually difficult. It takes a lot of nontrivial work. It’s easier to do a procurement based on Linpack.” But in the end, the work is worthwhile for both DOD and the vendor, because “it’s actually getting the right equipment for your users.”

“They’ve done a great job on the program in institutionalizing the [request for proposal] process,” said Peter Ungaro, chief executive officer at supercomputer company Cray.

DOD created HPCMP in 1994 as a way to pool resources for supercomputing power. Instead of having each of the services buy supercomputers for its own big jobs, the services could collectively buy an array of machines that could handle a wider variety of tasks, including large tasks.

On the rise
Today, the program has an annual budget of about $250 million, including $50 million for procuring two new supercomputers. Eight HPCMP shared-resource centers, which house the systems, tackle about 600 projects submitted by 4,600 users from the military services, academia and industry.

As of December 2006, the program had control of machines that could do a total of 315.5 teraflops, and that number grows by a quarter each year, as the oldest machines are replaced or augmented by newer technologies.

And over the years, the program has developed a painstakingly thorough process of specifying what kind of systems it needs.

What about HPCMP is so different? It defines its users’ workload rather than using a set of generic performance goals.

Henry said that most of the workloads on the program’s systems can fall into one of about 10 categories, such as computational fluid dynamics, structural mechanics, chemistry and materials science, climate modeling and simulation, and electromagnetics. Each job has a unique performance characteristic and can be best run on a unique combination of processors, memory, interconnects and software. “This is better because it gauges true workload,” Ryan said.

To quantify these types of jobs, HPCMP came up with a computer program called the linear optimizer, which calculates the overall system performance for handling each of these jobs. It weights each job by how often it is executed. It also factors in the price of each system and existing systems that can already execute those tasks.

Once numbers have been generated for each proposed system, the program takes usability into consideration. Henry admitted that usability is hard to quantify, but it includes factors such as what third-party software is available for the platform and what compilers, debuggers and other development tools are available.

Once these performance and usability numbers are calculated, they are weighted against the past performance of the vendors. From there, the right system may be obvious, or the decision may come down to a narrow choice among a handful of systems.
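
The sketch below illustrates the general shape of such a workload-weighted evaluation in Python. The job categories echo those Henry lists, but every weight, benchmark time and price is invented; the program's actual linear optimizer and its inputs are not public.

```python
# Toy workload-weighted scoring of candidate systems, HPCMP-style.
workload_weights = {                  # fraction of total runtime by job category
    "fluid_dynamics": 0.35,
    "structural_mechanics": 0.25,
    "climate_modeling": 0.25,
    "electromagnetics": 0.15,
}

candidates = {                        # hours per category benchmark, price in $M
    "system_a": ({"fluid_dynamics": 3.0, "structural_mechanics": 4.5,
                  "climate_modeling": 2.8, "electromagnetics": 5.0}, 18.0),
    "system_b": ({"fluid_dynamics": 2.4, "structural_mechanics": 5.5,
                  "climate_modeling": 3.1, "electromagnetics": 4.2}, 22.0),
}

def weighted_time(times: dict) -> float:
    """Expected benchmark hours under the site's actual job mix."""
    return sum(workload_weights[cat] * t for cat, t in times.items())

for name, (times, price_musd) in candidates.items():
    wt = weighted_time(times)
    print(f"{name}: {wt:.2f} weighted hours, "
          f"time-x-cost score {wt * price_musd:.1f} (lower is better)")
```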

“It’s not often they need the same type of system year after year,” Ungaro said.

Bottom line
Although DOD generally is well-represented on the twice-annual list of the world’s fastest computers — it had 11 in the June 2007 Top 100 ranking, for instance — the true beneficiaries are the researchers who can use the machines. The biggest benefit? “Time to solution,” Henry said.

DOD might need to know the performance characteristics of an airplane fuselage. Using a very accurate simulation saves money and time from testing actual fuselages.

“Typically, the kind of equations we’re trying to solve require from dozens to thousands of differential calculations,” Henry said. And each equation “can require a tremendous number of iterations.”

Imagine executing a single problem a million or even tens of millions of times at once, with each execution involving thousands of calculations. That’s the size of the job these systems usually handle.
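
A quick scale check with the article's own capacity figure (the job dimensions below are illustrative):

```python
# How long 10 million concurrent executions of a 1,000-operation problem
# would take at the program's December 2006 aggregate capacity,
# ignoring memory and communication costs entirely.
executions = 10_000_000            # "tens of millions ... at once"
ops_per_execution = 1_000          # "thousands of calculations"
capacity_flops = 315.5e12          # 315.5 teraflops, program-wide

seconds = executions * ops_per_execution / capacity_flops
print(f"{seconds * 1e3:.3f} ms per sweep at full aggregate capacity")
```

The raw arithmetic is the easy part; real jobs iterate such sweeps millions of times and are limited by memory and interconnect, which is exactly why the program benchmarks whole workloads rather than peak flops.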

DOD has many problems to test against. Programs track toxic gas releases as they spread across an environment. They help develop better algorithms for tracking targets on the ground from moving radars. They speed development of missiles. In one example, supercomputing shortened the development time of the Hellfire missile to just 13 months, allowing it to be deployed in Iraq two years earlier than otherwise would have been possible.

By providing the fastest computing power available, the program in its own modest way can help ensure the Defense Department stays ahead of the enemy.

By IgnoranceIsntBliss

A mere two-letter abbreviation (DU) is all it takes to completely destroy Al Gore and his ‘projected’ divine destiny to save the earth from man-made environmental doomsday. But this lesson in hypocrisy doesn’t end there.

I hadn’t actually thought of the lynchpin point until I recently finished Al Gore’s book The Assault on Reason. It should have been titled The Assault on Bush, as he himself joked in a video interview I saw somewhere online, because there’s hardly a page in it that doesn’t mention GWB and his minions. I counted 105 of 273 pages that included attacks on just the issue of Bush’s Iraq / foreign policy, for example.

One could hardly articulate a more sophisticated blackballing; Al Gore managed it while bending over backwards not to step into the “conspiracy theorist” world. His trouncing of the Bush Junta is well deserved, and I recommend all people read this book for that reason, as well as for his coverage of democracy, American history and the media, but perhaps most importantly for a lesson in observing left/right political bias.

It’s important that Gore didn’t go full blown conspiracy theorist as that would obviously implicate him with things like 9/11 and Depleted Uranium.


Dennis Kucinich is one of the few with the cojones to address the DU issue.

In page after page Al attacks Bush’s very essence, and in particular Bush’s environmental policies and the Iraqi invasion / occupation. Al slams Dubya on Iraq while patting himself on the back for the Balkans sectarian conflict that his own administration engineered. The most notable common thread between these two conflicts and these two men is the Depleted Uranium issue. Others include the engineering of conflicts, sectarian-conflict scenarios, the use of Al Qaeda as an instrument of proxy wars and related tactical subversiveness, and even oil (pipelines).

It turns out that DU has a 4.5 BILLION year half-life. The “Clinton-Gore Administration”, as Al calls it at every chance in his book, used DU munitions in a conflict that began during and ran through virtually the entire 8-year reign of that administration. To add insult to injury, Gore campaigned in 2000 on (imperialist) nation building, but then blasts Bush for his hegemonic Iraqi power grab, when the only difference is that Americans also die this time around in an engineered sectarian conflict.

As a side note, perhaps Al Gore inherited his stance on the use of uranium as a weapon from his father.

In the late 1950s, Al Gore’s father, the senator from Tennessee, proposed dousing the demilitarized zone in Korea with uranium as a cheap failsafe against an attack from the North Koreans.

After the Gulf War, Pentagon war planners were so delighted with the performance of their radioactive weapons that they ordered a new arsenal and, under Bill Clinton’s orders, fired them at Serb positions in Bosnia, Kosovo and Serbia. More than 100 of the DU bombs have been used in the Balkans over the last six years.
http://www.counterpunch.org/du.html

This is what Depleted Uranium does to people:

In all fairness, GWB has become history’s master of the use of these weapons.

Now we have Shrub using the same stuff, which is absolutely sure to contaminate the environment until what could be considered the end of time (astro-science models predict that the Sun and life on Earth will expire well before the DU munitions reach even their first half-life), but oddly enough Al Gore somehow managed to forget to mention this little tidbit in his all-out partisan Bush-crucifixion hit-piece book, which focuses on both the Iraq imperial power move and the environment.

But the hypocrisy doesn’t end there. “The Assault on Reason” (emphasis his) includes an entire chapter on “The Politics of Fear”, which impressively includes the sort of neuro-psychological descriptions of mind matters many would expect from my own writings here at this blog. Now I present to you clear and obvious terrormongering pieces of video propaganda:

Example 1: Look at the language: “IT WILL SHAKE YOU TO YOUR CORE”, “BY FAR, THE MOST TERRIFYING FILM YOU WILL EVER SEE”, “Think of the impact of a couple hundred thousand refugees, and then imagine a hundred million”, “NOTHING IS SCARIER”.

Note the use of New Orleans Katrina footage, and the fact that the vast majority of damage was due to flooding that was entirely mankind’s fault.

And then note how at 2:17 of the trailer there is a 3-frame burst of a nuclear bomb explosion, which is entirely out of context of the presentation. In fairness, he does give this same clip (arguably scaremongering even there) a worthy context in the actual film, but in the context of the preview, which potentially millions watched, it doesn’t fit.

Anyone with even a self-prescribed education in socio-psychological propaganda can quickly tell, from Al Gore’s book and the online video interviews addressing it, that Gore is clearly trained in socio-psychological propaganda techniques. Because of this fact there is no excuse for the out-of-context terrormongering highlighted in that single clip, which continues on into the follow-up TV commercial:

Example 2: Added terrormonger language: “Grabs you like a thriller with an ending that will haunt your dreams”, “You will be captivated… then riveted… then scared out of your wits”. That commercial also includes about 1-2 frames of the same nuclear bomb mushroom cloud used in the An Inconvenient Truth trailer-terror piece.

To “emphasize this point” about Gore’s hypocrisy, it’s best I point out that there are many pages in his book with entire paragraphs about the Bush Junta‘s selective, cherry-picking use of facts and scientific information, particularly in regard to the Iraq War effort and environmental issues.

Gore often claims that “the debate is over” about “Global Warming”, but it turns out that the debate is in fact not over (see here for some debate). And to directly address his self-described (on the back of the DVD package) “persuasive” propaganda piece: his central argument is the 650,000-year ice-core record.

It’s just too bad that it’s well established that temperature changes always lead, rather than follow, in the “complex relationships” that Gore presented as if the CO2 drove the temperature:

Ice cores show CO2 increases lag behind temperature

This isn’t even merely a matter of him misreading some graphs. He went to the extent of calling the ice core specialist his special friend, and made it look cozy. How could Gore have not understood which came first? How would you not look to see which happens first before making such strong statements? To wave his 650,000-year-resolution graph around, off of a scissor lift, like some sort of gospel is Hollywood trickery at ‘best’.

This scenario presents us with 2 options that I can think of:
1) He deliberately engaged in disinfo.
2) He’s too incompetent to hold, and so effectively propagate, such a staunch belief.

In either case, this is but one of many examples of Gore using sketchy science during his terrormonger campaign. For a complete lesson in Gore trickery have a look at A Skeptic’s Guide to An Inconvenient Truth.

Moving on: even though the DU weapons fact by itself destroys Gore’s claim to be trying to save the environment, this lesson in environmental hypocrisy doesn’t end there. For starters, in the AIT film and book Al asks, “Are you ready to change your life?”. It’s bad enough that YOU changing your life is the primary solution Al gives us in his film presentation, but then there’s the fact that he doesn’t practice what he preaches.

While he may buy “carbon offsets” each time he travels, it doesn’t change the fact that he flies more in one year than most humans will in their entire lives. He doesn’t just fly in relation to his global warming activism either, but in that regard he claims in his film that he’s done over 1,000 talks since 2001.


Sean Hannity’s excellent piece on Al Gore and jet travel.
(Although I loathe that man too.)

But perhaps his travels can be somehow ignored in light of what he tells us we should do; unfortunately, we can’t rationalize his home habits as being part of the effort to save the earth.

According to Schweizer, the Gores own three homes: a 10,000-square-foot home in Nashville, Tennessee; a 4,000-square-foot home in Arlington, Virginia (across the Potomac River from Washington, DC); and a third home of undisclosed size in Carthage, Tennessee. Neutral Source has verified the Gores own a 2.1 acre property at 312 Lynnwood Blvd. in the Belle Meade section of Nashville, Tennessee (Parcel ID 11611005600). Its assessed value in January 2005 was $3 million, but we have not been able to validate Schweizer’s claims about its size. So we performed a search and found 15 single-family homes on the market in the Gores’ Nashville neighborhood with asking prices of $3 million or more. Of the 298 neighborhood properties on the market, only three are listed at $3 million or more. According to MLS data, these houses are 9,727, 7,340, and 9,878 square feet respectively. So we can confirm that it’s quite plausible that the Gores’ Nashville home is, as Schweizer claims, 10,000 square feet.

So what does that mean?

Still, a rough approximation of the Gores’ residential CO2 emissions can be obtained by assuming that CarbonCounter’s “average” really means “median.” This yields 26 tons x 9.2 = 239 tons CO2 per year. CarbonCounter will “offset” the Gores’ CO2 emissions for a contribution of $10 per ton, and certainly they can afford the $2,390 contribution that CarbonCounter says will buy residential “carbon neutrality.” But actually making a large reduction in CO2 emissions from 17,000 square feet of residential living space would be both very challenging technically, and much more expensive. The Gores are easily capable of being CO2 Pragmatists with respect to residential carbon neutrality, but they cannot be CO2 Puritans without completely abandoning their lifestyle.
http://neutralsource.org/content/blog/detail/598/
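
The quoted arithmetic is easy to verify (the source rounds to 239 tons before pricing, hence $2,390):

```python
# Reproducing Neutral Source's back-of-the-envelope offset estimate.
average_home_tons_co2 = 26          # CarbonCounter's "average" residence
gore_multiple = 9.2                 # quoted ratio for ~17,000 sq ft of housing
offset_price_per_ton = 10           # dollars per ton, per CarbonCounter

tons = round(average_home_tons_co2 * gore_multiple)   # 239 tons CO2/year
print(f"{tons} tons/year -> ${tons * offset_price_per_ton:,} to 'offset'")
```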

In plain English:

Gore’s home uses more than 20 times the national average

So here we have Al Gore telling us that we need to change our lives, meanwhile he’s using 20+ times the energy of something like 90+% of the rest of the world’s population.

While we’re on the energy topic perhaps it’s best to point out that Gore has a record of relations with oil ‘interests’.

Al Gore: The Other Oil Candidate | Corpwatch, August 29th, 2000
For thousands of years, the Kitanemuk Indians made their home in the Elk Hills of central California. Come February 2001, the last of the 100 burial grounds, holy places and other archaeological sites of the Kitanemuks will be obliterated by the oil drilling of Occidental Petroleum Company. Oxy’s plans will “destroy forever the evidence that we once existed on this land,” according to Dee Dominguez, a Kitanemuk whose great grandfather was a signatory to the 1851 treaty that surrendered the Elk Hills.

Occidental’s planned drilling of the Elk Hills doesn’t only threaten the memory of the Kitanemuk. Environmentalists say a rare species of fox, lizard and the kangaroo rat would also be threatened by Oxy’s plans. A lawsuit has been filed under the Endangered Species Act. But none of that has given pause to Occidental or the politician who helped engineer the sale of the drilling rights to the federally-owned Elk Hills. That politician is Al Gore.

So in light of that, along with the Clinton-Gore engineered Bosnia-Kosovo proxy war for oil pipelines, it seems obvious that Gore is a tad more oil friendly than one would expect. It should also be pointed out that Gore was one of only 10 Democrats to break ranks and support the Persian Gulf War, which is interesting considering the history of that conflict. That’s all too bad, because US foreign policy and wars in general are probably the biggest contributors to the environmental footprint of us human beings.


Ron Paul: the CIA / Foreign Policy contributing to Global Warming; at time 2:25.
Al Gore supports nation building:

Q. Bush made nation-building a point of difference with you [in the Oct. 3 debate].
A. I think that phrase taps into a legitimate concern about how far we should go and how long we should be involved. But it’s not a new mission. The Marshall Plan was about nation-building. And the generation that won World War II, having seen the catastrophe of the interwar period in the 20’s and 30’s, wisely decided that nation-building was a preferable alternative to World War III. And it was a stunning success.
source

As proven under Clinton-Gore and Bush-Cheney, “nation building” also means war, and that brings us to the next big Bush-Gore parallel of hypocrisy. It turns out that both Bush and Gore brag about their Christian faiths, yet somehow each is engaged in the same goal: a de facto ‘god on earth’ artificial intelligence system.

While Bush is driving this initiative through excessive militarism, Al Gore is driving the same goal with his friends at NASA and Google. Gore is arguably the main man behind this effort, sitting right in the eye of the whirlwind down at the main Googleplex HQ, which is situated next door to the NASA Ames Research Center in Silicon Valley, where NASA and Google in 2005 entered a partnership in the development of cognitive artificial intelligence.


It’s hard to imagine why a professed ‘man of the faith’ would take part in such an operation, but then again his words on the Larry King Show, in reference to “global warming”, may shed some light on this contradiction: “It’s really a spiritual issue.”

In closing, RAGE said it best:

The UK Ministry of Defence recently released its future global / military forecast document. It highlights key areas such as Artificial Intelligence, Transhumanism, climate change, globalization, and so on. It’s more or less a doomsday-scenario narrative, much of it marked as “probable”. This is of little surprise considering my usual reporting, combined with forecasts such as the one that humans have a 50/50 chance of surviving the 21st Century.

GuardianUK covered it and mentioned many of the key topics. Then Prisonplanet.com covered it with the expected “new world order” spin. Not surprisingly, neither source even touched the A.I. subject, nor did they provide the link to the official document, which would have allowed ‘casual’ readers to easily click through and potentially find those sections themselves.

“All findings within Strategic Trends are presented with an indication of confidence.”

“Having established trend-based outcomes of varying probability, Strategic Trends articulates a number of specific Risks associated with each dimension to highlight the way some of the more adverse consequences could manifest themselves and affect Defence business.”

“The Strategic Trends approach starts by identifying the major trends in each of these dimensions and analyses ways in which these trends are likely to develop and interact during the next 30 years, in order to establish a range of Probable Outcomes. Nothing in the future is guaranteed, of course, and Strategic Trends varies the strength of its assessments to highlight sets of Alternative Outcomes that, while less probable, are nonetheless highly plausible.”

More Excerpts:

“Erosion of Civil Liberties. Technology will enable pervasive surveillance in response to terrorism, rising transnational crime and the growing capability of disparate groups or individuals to inflict catastrophic damage or disruption. Coupled with intrusive, highly responsive and accessible data-bases, the emergence of a so-called ‘surveillance society’ will increasingly challenge assumptions about privacy, with corresponding impacts on civil liberties and human rights. These capabilities will be deployed by the private as well as the public sector.”

“Confronted with few direct threats and declining populations, most affluent societies will attempt to minimize their Defence burden by investing in conflict prevention and, for as long as it is in their interest to do so, participating in alliances, forming communities of interest and contracting out security. The US will be the exception, making by far the greatest commitment to Defence throughout the period, consistent with its economic power and technological advantage.”

“Deliberate Collateral Casualties. Both state and non-state actors may target commercial and industrial installations to inflict mass casualties, in breach of international law, as an intended primary or secondary effect. The potential impact may be reinforced by increasing industrialization in developing countries, a possible resurgence in nuclear power plant construction, and the progressive concentration of populations in urban areas.”

Globalization / Revolution:

“By 2010, most people (above 50%) will be living in urban rather than rural environments. Poor housing, weak infrastructure and social deprivation will combine with low municipal capacity to create a range of new instability risks in areas of rapid urbanization, especially in those urban settlements that contain a high proportion of unplanned and shanty development.”

“During the next 30 years, every aspect of human life will change at an unprecedented rate, throwing up new features, challenges and opportunities. Three areas of change, or Ring Road issues, will touch the lives of everyone on the planet and will underpin these processes: climate change, globalization and global inequality (see panels below).”

“While material conditions for most people are likely to improve over the next 30 years, the gap between rich and poor will probably increase and absolute poverty will remain a global challenge. Despite their rapid growth, significant per capita disparities will exist in countries such as China and India and smaller, but traditionally more affluent Western economies. In some regions – notably areas of Sub-Saharan Africa – a fall in poverty may be reversed. Differentials in material well-being will be more explicit through globalization and increased access to more readily and cheaply available telecommunications. Disparities in wealth and advantage will therefore become more obvious, with their associated grievances and resentments, even among the growing numbers of people who are likely to be materially more prosperous than their parents and grandparents. Absolute poverty and comparative disadvantage will fuel perceptions of injustice among those whose expectations are not met, increasing tension and instability, both within and between societies and resulting in expressions of violence such as disorder, criminality, terrorism and insurgency. They may also lead to the resurgence of not only anti-capitalist ideologies, possibly linked to religious, anarchist or nihilist movements, but also to populism and the revival of Marxism.”

“Alternatively, a less even process of globalization may lead to lower-density settlement patterns, with people straddling rural and urban-based livelihoods, resulting in extensive browning of the countryside.”

“Competition for resources of all kinds will intensify.”

“Economic growth and increased consumption will result in greater demand and competition for essential resources. Demand for energy is likely to grow by more than half again by 2035 and fossil fuels will have to meet more than 80% of this increase. Major reserves are in politically unstable regions and primary consumer nations are likely to be increasingly reluctant to trust security of supply to market forces and the integrity of the international system.”
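
For a rough sense of scale (our illustration, not the report’s), that claim can be turned into back-of-the-envelope Python. The baseline figure is our assumption, roughly in line with mid-2000s estimates of world primary energy demand; the report itself gives no numbers:

    # Back-of-the-envelope sketch of the report's energy claim.
    # The baseline is an assumed, illustrative figure (~mid-2000s
    # world primary energy demand); the report supplies no baseline.
    baseline_mtoe = 11_400            # assumed 2005 demand, Mtoe
    growth = 0.5                      # "more than half again" by 2035
    fossil_share_of_increase = 0.8    # "more than 80% of this increase"

    increase = baseline_mtoe * growth
    demand_2035 = baseline_mtoe + increase
    fossil_increase = increase * fossil_share_of_increase

    print(f"2035 demand:          {demand_2035:,.0f} Mtoe")
    print(f"Increase over 2005:   {increase:,.0f} Mtoe")
    print(f"Met by fossil fuels:  {fossil_increase:,.0f} Mtoe")

On those assumptions the world would need roughly 5,700 Mtoe of new supply by 2035, about 4,600 Mtoe of it fossil, which is why the report dwells on where those reserves sit.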

“Globalization will result in critical interdependencies that will link members of a globalized society that includes a small super-rich elite and a substantial underclass of slum and subsistence dwellers, who will make up 20% of the world population in 2020.”

“Declining youth populations in Western societies could become increasingly dissatisfied with their economically burdensome ‘baby-boomer’ elders, among whom much of societies’ wealth would be concentrated. Resentful of a generation whose values appear to be out of step with tightening resource constraints, the young might seek a return to an order provided by more conservative values and structures. This could lead to a civic renaissance, with strict penalties for those failing to fulfil their social obligations. It might also open the way to policies which permit euthanasia as a means to reduce the burden of care for the elderly.”

“The middle classes could become a revolutionary class, taking the role envisaged for the proletariat by Marx. The globalization of labour markets and declining levels of national welfare provision and employment could reduce people’s attachment to particular states. The growing gap between themselves and a small number of highly visible super-rich individuals might fuel disillusion with meritocracy, while the growing urban under-classes are likely to pose an increasing threat to social order and stability, as the burden of acquired debt and the failure of pension provision begins to bite. Faced by these twin challenges, the world’s middle-classes might unite, using access to knowledge, resources and skills to shape transnational processes in their own class interest.”

“A growing Hispanic population in the US might lead to increasing social tensions, possibly resulting in an aggressive separatist movement. Unlike the Black Power militants of the 1960s, this movement might focus on geographically-based self-determination as its aim, threatening secession by Hispanic-majority states. Confronted by this threat, the US might become increasingly introspective, withdrawing from all non-essential overseas commitments. In the wider world, other states and non-state actors could take advantage of the US withdrawal or break-up, using violence to pursue objectives that, otherwise, might have provoked a US military response.”

“Economic globalization and indiscriminate migration may lead to levels of international integration that effectively bring interstate warfare to an end; however, it will also result in communities of interest at every level of society that transcend national boundaries and could resort to the use of violence. Operating within a globalized system, states might not be willing or able to regulate these groups’ activities, concentrating on containing the risk and diverting their activities elsewhere according to their interests. In addition, interest groups that cannot gain economic and information leverage might increasingly resort to violence and coercion, evolving loose arrangements and networks similar to those currently used by criminal organizations.”

“In a globalized environment, military technologies will be developed at an accelerating pace, some of which might have the potential to render established capabilities obsolete. For example, a cheap, simple-to-make and easy-to-use weapon might be invented that is effective against a wide range of targets and against which established countermeasures are ineffective.”

“The US position as the world’s most indebted nation makes it vulnerable to stock market collapse, currency runs and economic crisis, as well as global currency manipulation. The most likely cause of crisis would be energy market instability or volatility leading to a loss of market confidence. Also, failure to continue to support or service its debt in these circumstances would put US creditors and commodity suppliers at risk, possibly causing a global economic downturn.”

“Key natural resources, especially oil, gas and minerals of strategic value, will continue to be sourced from unstable areas and unreliable regions. Maintaining access and containing instability risks in these areas is therefore likely to increase in importance, alongside wider developmental and stabilization roles. Where oil and gas sources are located in areas of doubtful security, military intervention may be used to protect the integrity of sites and to secure investments.”

“The middle class will be more vulnerable to economic and social volatility. This may trigger a rise in political engagement and may encourage a resort to either communitarian solutions or extremist politics. While the immediate risk may exist at the national level, exposure to globalized economic forces may cause a reaction to globalization and ultimately fuel tension and difficulties at international levels.”

“Social transformation arising from globalization, demographic imbalances and economic shifts will result in wide-ranging, often intense, instability risks, whose impacts will be transmitted beyond their immediate point of origin. These features will demand sensitive warning, strong governance and responsive containment arrangements. In an unstable economic environment or in the event of social crisis, an increase in militancy and activism, possibly based on a declining middle-class, is likely to fuel extremist politics in some societies, possibly characterized by resurgent nationalism and authoritarianism.”

“Going Underground. All likely future opponents will have recognized the advantages of going underground if they wish to avoid the surveillance, targeting and penetrative capabilities of sophisticated military forces, particularly those deploying air platforms and systems. In future, states will seek to site most of their major nodes and the majority of their decisive fighting power underground or among civilian infrastructure that is illegal or unethical to target. Similarly, irregular opponents will base themselves in underground networks, both for offence and defence, especially in complex urban spaces.”

“In a fast-changing area, it is difficult and foolish, outside the realms of science fiction, to forecast in any depth technological breakthroughs or their likely applications. Many of the interrelated effects of globalization, including market-manipulation by existing stakeholders, the unpredictability of consumer demand and complex routes to market, will make predictions for the future even less certain. Many issues, including control regimes, will have to be addressed as they arise, although it might be anticipated that some issues will become highly charged.”

Artificial Intelligence / Transhumanism:

“Cognitive Science – Routes to the direct application of advances in cognitive science are less clear than those in nanotechnology or biotechnology; however, indications are that interdisciplinary advances involving cognitive science are likely to enable us more effectively to map cognitive processes. Soft Artificial Intelligence is already well established, with self-diagnosing and self-reconfiguring networks in use and self-repairing networks likely in the next 10 years. Mapping of human brain functions and the replication of genuine intelligence are possible before 2035.”

“Advances in social science, behavioural science and mathematical modelling will combine, leading to more informed decision-making. Advanced processing and computational power will permit a new level of pattern recognition (combinatorics), enabling the decoding of previously unrecognised or undecipherable systems and allowing the modelling of a range of processes, from the biological to the social, political and economic. As a result, simulation and representation will have a significant and widespread impact on the future and will become an increasingly powerful tool to aid policy and decision makers. It will also blur the line between illusion and reality.”
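
As a toy illustration (ours, not the report’s) of the kind of socio-political modelling being described, a logistic-diffusion model of an idea spreading through a population takes only a few lines of Python; every parameter here is invented, and real decision-support tools would be vastly richer:

    # Toy logistic-diffusion model: an idea spreads through contact
    # between adopters and non-adopters. A caricature of the social
    # modelling the report envisages; all parameters are invented.
    population = 1_000_000
    adopters = 1_000              # assumed initial adopters
    contact_rate = 0.4            # assumed adoptions per adopter per year

    for year in range(1, 16):
        susceptible = population - adopters
        new_adopters = contact_rate * adopters * susceptible / population
        adopters += new_adopters
        print(f"year {year:2d}: {adopters / population:6.1%} of population")

The point is not the numbers but the shape: slow uptake, sudden acceleration, then saturation, which is exactly the kind of pattern such tools would be asked to anticipate.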

“AI. The progressive introduction of ‘soft’ AI and further simplification of the Human Computer Interface (HCI) is likely to change the emphasis in training from technical aspects of system operation to the application of judgement in the employment of systems and the conduct of operations. This will stimulate a cultural change with significant effects on the requirements for manpower, command structures and training.”

“The application of advanced genetics could challenge current assumptions about human nature and existence. Initially employed for medical purposes, breakthroughs in these areas could be put to ethically questionable uses, such as the super-enhancement of human attributes, including physical strength and sensory perception. Extreme variation in attributes could arise between individuals, or where enhancement becomes a matter of fashion, between societies, creating additional reasons for conflict.”

“Developments in genetics might allow treatment of the symptoms of ageing, and this would result in greatly increased life expectancy for those who could afford it. The divide between those who could afford to ‘buy longevity’ and those who could not could aggravate perceived global inequality. Dictatorial or despotic rulers could also ‘buy longevity’, prolonging their regimes and the international security risks they pose.”

“Human Nature of War Challenged by Technology. Increasing pervasiveness and exploitation of technology at all levels of warfare will increase the distance between ‘the point of the spear’ and the point of interaction for most personnel. Such reliance on technology and unmanned, remote options is likely to lead to increasing vulnerability to a resurgence in traditional, mass warfighting and irregular activity. Ethical questions regarding the accountability for automated actions are also likely to increase.”

“A more permissive R&D environment could accelerate the decline of ethical constraints and restraints. The speed of technological and cultural change could overwhelm society’s ability to absorb the ethical implications and to develop and apply national and international regulatory and legal controls. Such a regulatory vacuum would be reinforcing as states and commercial organizations race to develop and exploit economic, political and military advantage. The nearest approximation to an ethical framework could become a form of secular utilitarianism, in an otherwise amoral scientific culture.”

“The Role of Artificial Intelligence. The simulation of cognitive processes using Artificial Intelligence (AI) is likely to be employed to manage knowledge and support decision-making, with applications across government and commercial sectors. Reliance on AI will create new vulnerabilities that are likely to be exploited by criminals, terrorists or other opponents.”

“Unmanned Technologies. Advances in autonomous systems, which promise to reduce substantially the physical risks to humans and mitigate some of their weaknesses, will allow the wider exploration and exploitation of extreme or hazardous environments such as deep sea, underground, contaminated areas and outer space. Furthermore, these technologies will allow increased Defence exploitation in all environments with a correspondingly reduced risk to military personnel and an expanded range of capabilities. AI and the effective replication of human judgement processes, when combined with autonomous systems, particularly robotics, are likely to enable the application of lethal force without human intervention, raising consequential legal and ethical issues.”

“By 2035, an implantable information chip could be developed and wired directly to the user’s brain. Information and entertainment choices would be accessible through cognition and might include synthetic sensory perception beamed direct to the user’s senses. Wider related ICT developments might include the invention of synthetic telepathy, including mind-to-mind or telepathic dialogue. This type of development would have obvious military and security, as well as control, legal and ethical, implications.”

“While it will be difficult to predict particular breakthroughs, trend analysis indicates that the most substantial technological developments will be in: ICT, biotechnology, energy, cognitive science, smart materials and sensor/network technology. Advanced nanotechnology will underpin many breakthroughs (see text box). Developments in these areas are likely to be evolutionary, but where disciplines interact, such as in the combination of Cognitive Science and ICT to produce advanced decision-support tools, developments are likely to be revolutionary, resulting in the greatest opportunities for novel or decisive application. Most technological breakthroughs will be positive; however, many advances will also present potential threats, either through perverse applications, such as the use of genetic engineering to produce designer bio-weapons or unstable substances, or through the unanticipated consequences of experimental technological innovation.”

“Greater connectivity and accessibility to information through the proliferation of ICT will stimulate intensifying international debate on ethics, regulation and law, and will cause religious, ethical and moral concerns and disputes. The pace and diffusion of R&D and the operation of commercial imperatives will make global regulation difficult and will increase the opportunities for unethical or irresponsible actors to evade control. In addition, the effectiveness of regulation is likely to vary by culture, region or country, with an uneven application of, and access to, innovation. However, these issues are likely to be highly politicized and, on past evidence, to cause localized disorder and possibly organized violence.”

“Scientific breakthroughs are likely to have the potential to improve the quality of life for many, for example in the safe genetic modification of crops or through stem cell research. However, market pricing or ethically based regulation may obstruct access by those who might wish or need to benefit most, thereby reinforcing inequality and a sense of grievance.”

“Conversely, it is possible that innovation will take place even more rapidly than is anticipated. Breakthroughs such as the early development of quantum computing will add significant impetus to the pace of technological change and information processing. Specific advances may also have significant geopolitical impacts. For example, a breakthrough in energy technology will alter the global significance of the Middle East, reducing Western strategic dependence on an unstable and volatile area.”

“By the end of the period it is likely that the majority of the global population will find it difficult to ‘turn the outside world off’. ICT is likely to be so pervasive that people are permanently connected to a network or two-way data stream, with inherent challenges to civil liberties; being disconnected could be considered suspicious.”

Technology / Weapons:

“Innovation is likely to continue at an unprecedented rate and there is likely to be a multiplicity of sources of innovation and production. Making predictions about how novel and emerging technologies will be exploited and applied will be difficult and imprecise. The rate of change, tempo and unpredictability of innovation and development will challenge decision-makers, who will have to anticipate and respond to direct and indirect outcomes. Notwithstanding this, trends indicate that the most rapid technological advances are likely in: ICT, energy, biotechnology, cognitive science, sensors and networks and smart materials. Nanotechnology is likely to be an important enabler for other developments, for example in electronics, sensors and commodity manufacture. Whilst technology will benefit many people, its application and integration will continue to be unequal, reinforcing differences in understanding, advantage and opportunity between the haves and have-nots.”

“Technology and Fighting Power. Successful exploitation of new technology, such as Directed Energy Weapons, will depend on the users’ understanding of both the advantages and the limitations to its application across physical, conceptual and moral components of fighting power. Those who fail to do so are likely to risk defeat by those who achieve a better component mix, by those who target components to which technological advantage does not apply, or by those who employ technologies such as Electromagnetic Pulse (EMP) to neutralize a more sophisticated adversary’s capability. Small incremental changes in technology are also likely to lead to disproportionately large increases in warfighting capability and effectiveness. This is likely to lead to the reduction of traditional concept-to-capability timescales and increase the scope for technology leakage and more discriminating use of Off-The-Shelf (OTS) applications, especially in the areas of nano- and bio-technology.”

“Perverse Application of Technology. The development of technologies that have hitherto been considered benign may be subverted for hostile use. For example, biotechnology and genetic engineering may be combined to create ‘designer’ bio-weapons to target crops, livestock, or particular ethnic groups.”

“Given current multi-lateral agreements and technical factors, the effective weaponization of space is unlikely before 2020. However, nations will seek to inhibit the use of space by opponents through a combination of electromagnetic manipulation, hard-kill from ground-based sensor and weapon systems, the targeting of supporting ground-based infrastructure and a range of improvised measures. At its most extreme, the weaponization of space may eventually include the development of space-based strike weapons capable of attacking ground-based and other space targets; for example, solid metal projectiles travelling at orbital velocities, so-called ‘rods from the gods’. However, this will remain extremely unlikely without the prospect of sustained and extreme deterioration in international relationships and will be technically difficult to achieve before 2020.”
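
The physics behind ‘rods from the gods’ is straightforward kinetic energy, KE = ½mv², and a quick calculation (our illustration, not the report’s) shows the appeal; the rod mass and velocity below are assumed figures, not specifications from the report:

    # Kinetic energy of a hypothetical orbital kinetic projectile.
    # Mass and velocity are illustrative assumptions, not report figures.
    mass_kg = 8_000           # assumed solid tungsten rod, ~8 tonnes
    velocity_ms = 8_000       # roughly low-Earth-orbit velocity, m/s

    energy_joules = 0.5 * mass_kg * velocity_ms ** 2
    tnt_tonnes = energy_joules / 4.184e9    # 1 tonne of TNT = 4.184e9 J

    print(f"Kinetic energy: {energy_joules:.2e} J")
    print(f"TNT equivalent: {tnt_tonnes:,.0f} tonnes")

On those assumptions a single rod delivers on the order of 60 tonnes of TNT equivalent with no warhead at all, which is why the report flags the idea despite rating it technically difficult and extremely unlikely.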

“Innovation, research and development will originate from more international and diffuse sources and will proliferate widely, making regulation and control of novel technologies more challenging. The exploitation of these may have catastrophic results, especially those associated with nanotechnology, biotechnology and weapon systems. These may be unintended, for example ‘runaway’ nanotechnology or biotechnology, or intended, such as the development and use of directed energy or electromagnetic-pulse weapons.”

“Access to technology that enables the production and distribution of Chemical, Biological, Radiological and Nuclear (CBRN) weapons is likely to increase. A critical indicator of risk is contained in the examples of North Korea and Iran – both in obtaining or seeking nuclear weapons and in exploiting their putative possession for political and economic advantage. In future, much proliferation and threat will be manifest in the ungoverned space between legality and realpolitik, together with the distinct possibility of the acquisition of CBRN material by non-state and rogue elements.”

“In the use of violence and the threat of force, military and civil distinctions will become blurred and weapons and technologies will be more widely available to potential combatants and individuals. The greatest risks of large-scale conflict will be in areas of economic vulnerability, poor governance, environmental and demographic stress and enduring inequality and hardship, especially where there has been a history of recurring conflict (See Figure 2). Most conflicts will be societal, involving civil war, intercommunal violence, insurgency, pervasive criminality and widespread disorder. However, in areas subject to significant demographic and wealth imbalances, there will be a risk of large scale cross-border migration and exogenous shock. Finally, a trend towards societal conflict will be reflected in the continuing prevalence of civilian casualties, as it takes place in increasingly urbanized situations and human networks.”

“Arms Rivalry. Increasing strategic and possibly inter-bloc competition is likely as a result of the emergence of major new powers. This may stimulate intensive arms races, for example between China and the US, or between regional rivals such as India and Pakistan, reducing resources available for peaceful economic development. The increase in arms spending would probably extend beyond immediate rivals to include their neighbours and partners, thus intensifying regional tensions and increasing the chances of conflict.”

“At the most serious level, space systems could be destroyed or disabled by a burst of solar energy or a natural fluctuation. Similarly, satellites and space platforms could be destroyed or damaged in a deliberate hostile attack, or by being struck by space-debris, causing a cascade of collateral damage to other space-based platforms. The damage could be amplified if an element of the chain explodes and emits an electromagnetic pulse. The consequences might include catastrophic failures of critical space-enabled utilities, triggering widespread mass-transport accidents, multiple military and public service system failures and the collapse of international financial systems.”
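
The cascade dynamic described here is usually called the Kessler syndrome: each collision creates fragments that make further collisions more likely. The feedback loop can be caricatured in a few lines of Python (our illustration, not the report’s); every parameter below is invented, and this is in no sense a real orbital-debris model:

    # Crude toy of a debris cascade: fragments from each collision
    # raise the expected number of future collisions. All parameters
    # are invented; real debris models track orbits, sizes and decay.
    objects = 10_000                  # assumed objects in one orbital shell
    collision_rate = 2e-8             # assumed collisions per object-pair/year
    fragments_per_collision = 150     # assumed trackable fragments per hit

    for year in range(1, 31):
        pairs = objects * (objects - 1) / 2
        collisions = pairs * collision_rate       # expected collisions this year
        objects += collisions * fragments_per_collision
        if year % 5 == 0:
            print(f"year {year:2d}: {objects:,.0f} objects, "
                  f"{collisions:.2f} expected collisions")

Because the number of colliding pairs grows with the square of the object count, the process feeds on itself: the debris population climbs slowly at first and then accelerates.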

“Electromagnetic Pulse (EMP) capabilities will probably become operational during the period out to 2035. They could be used to destroy all ICT devices over selected areas, while limiting wider physical and human damage. While military and other high-value networks may be hardened against this threat, most networks and communities on which societies depend will not be. The employment of an EMP weapon against a ‘World-City’ (for example, an international business-service hub) would have significant impact beyond the country against which it was targeted. It might even reduce political and business confidence in globalized economic processes to the point that concern about national economic resilience reverses internationally integrative trends, leading to a world increasingly characterized by protection, control and isolationism.”

“The political purpose most commonly envisaged for nuclear weapons has been to deter nuclear attack by, or to offset the conventional superiority of, a potential adversary. Future concerns will centre on the potential acquisition of nuclear weapons by terrorists and other irregular entities, for coercive purposes or to inflict massive casualties. In addition, existing assumptions about the employment of nuclear weapons may be challenged in still more radical ways, including the exploration of neutron possibilities. The ability to inflict organic destruction, while leaving infrastructure intact, might make it a weapon of choice for extreme ethnic cleansing in an increasingly populated world. Alternatively, it might be considered as a basis for a new era of deterrence both in outfacing irresponsible nuclear powers and in opposing demographically strong nations.”

Doomsday Scenario
“Many of the concerns over the development of new technologies lie in their safety, including the potential for disastrous outcomes, planned and unplanned. For example, it is argued that nanotechnology could have detrimental impacts on the environment, that genetic modification could spiral out of control and that AI could become superior to human intelligence, but without the restraining effect of human social conditioning. Various doomsday scenarios arising in relation to these and other areas of development present the possibility of catastrophic impacts, ultimately including the end of the world, or at least of humanity.”

NOTE: All emphasis and formatting theirs!