Here’s How We Could Brighten Clouds to Cool the Earth - IEEE Spectrum



"Ship tracks" over the ocean reveal a new strategy to fight climate change

An effervescent nozzle sprays tiny droplets of saltwater inside the team's testing tent.

As we confront the enormous challenge of climate change, we should take inspiration from even the most unlikely sources. Take, for example, the tens of thousands of fossil-fueled ships that chug across the ocean, spewing plumes of pollutants that contribute to acid rain, ozone depletion, respiratory ailments, and global warming.

The particles produced by these ship emissions can also create brighter clouds, which in turn can produce a cooling effect via processes that occur naturally in our atmosphere. What if we could achieve this cooling effect without simultaneously releasing the greenhouse gases and toxic pollutants that ships emit? That's the question the Marine Cloud Brightening (MCB) Project intends to answer.

Scientists have known for decades that the particulate emissions from ships can have a dramatic effect on low-lying stratocumulus clouds above the ocean. In satellite images, parts of the Earth's oceans are streaked with bright white strips of clouds that correspond to shipping lanes. These artificially brightened clouds are a result of the tiny particles produced by the ships, and they reflect more sunlight back to space than unperturbed clouds do, and far more than the dark blue ocean underneath. Because these "ship tracks" block some of the sun's energy from reaching Earth's surface, they prevent some of the warming that would otherwise occur.

The formation of ship tracks is governed by the same basic principles behind all cloud formation. Clouds naturally appear when the relative humidity exceeds 100 percent, initiating condensation in the atmosphere. Individual cloud droplets form around microscopic particles called cloud condensation nuclei (CCN). Generally speaking, an increase in CCN increases the number of cloud droplets while reducing their size. Through a phenomenon known as the Twomey effect, this high concentration of droplets boosts the clouds' reflectivity (also called albedo). Sources of CCN include aerosols like dust, pollen, soot, and even bacteria, along with man-made pollution from factories and ships. Over remote parts of the ocean, most CCN are of natural origin and include sea salt from crashing ocean waves.
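The Twomey effect can be put into rough numbers. A standard textbook approximation says that, with all else held constant, cloud albedo A changes with droplet number N as dA/d(ln N) ≈ A(1 − A)/3. Here is a minimal sketch of that relation; the starting albedo and droplet concentrations are illustrative values assumed for the example, not measurements:

```python
import math

def albedo_change(albedo_0, n_old, n_new):
    """Estimate the change in cloud albedo from a change in droplet
    number, using the standard Twomey susceptibility approximation
    dA/d(ln N) = A(1 - A)/3 (all else held constant)."""
    return (albedo_0 * (1 - albedo_0) / 3) * math.log(n_new / n_old)

# Illustrative (assumed) numbers: a cloud deck with albedo 0.5 and a
# droplet concentration raised from 100 to 350 per cubic centimeter.
delta_a = albedo_change(0.5, 100, 350)
print(f"Estimated albedo increase: {delta_a:.3f}")  # ~0.10
```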

Satellite imagery shows "ship tracks" over the ocean: bright clouds that form because of particles spewed out by ships. Jeff Schmaltz/MODIS Rapid Response Team/GSFC/NASA

The aim of the MCB Project is to consider whether deliberately adding more sea salt CCN to low marine clouds would cool the planet. The CCN would be generated by spraying seawater from ships. We expect that the sprayed seawater would instantly dry in the air and form tiny particles of salt, which would rise to the cloud layer via convection and act as seeds for cloud droplets. These generated particles would be much smaller than the particles from crashing waves, so there would be only a small relative increase in sea salt mass in the atmosphere. The goal would be to produce clouds that are slightly brighter (by 5 to 10 percent) and possibly longer lasting than typical clouds, resulting in more sunlight being reflected back to space.

"Solar climate intervention" is the umbrella term for projects such as ours that involve reflecting sunlight to reduce global warming and its most dangerous impacts. Other proposals include sprinkling reflective silicate beads over polar ice sheets and injecting materials with reflective properties, such as sulfates or calcium carbonate, into the stratosphere. None of the approaches in this young field are well understood, and they all carry potentially large unknown risks.

Solar climate intervention is not a replacement for reducing greenhouse gas emissions, which is imperative. But such reductions won't address warming from existing greenhouse gases that are already in the atmosphere. As the effects of climate change intensify and tipping points are reached, we may need options to prevent the most catastrophic consequences to ecosystems and human life. And we'll need a clear understanding of both the efficacy and risks of solar climate intervention technologies so people can make informed decisions about whether to implement them.

Our team, based at the University of Washington, the Palo Alto Research Center (PARC), and the Pacific Northwest National Laboratory, comprises experts in climate modeling, aerosol-cloud interactions, fluid dynamics, and spray systems. We see several key advantages to marine cloud brightening over other proposed forms of solar climate intervention. Using seawater to generate the particles gives us a free, abundant source of environmentally benign material, most of which would be returned to the ocean through deposition. Also, MCB could be done from sea level and wouldn't rely on aircraft, so costs and associated emissions would be relatively low.

The effects of particles on clouds are temporary and localized, so experiments on MCB could be carried out over small areas and brief time periods (maybe spraying for a few hours per day over several weeks or months) without seriously perturbing the environment or global climate. These small studies would still yield significant information on the impacts of brightening. What's more, we can quickly halt the use of MCB, with very rapid cessation of its effects.

Solar climate intervention is the umbrella term for projects that involve reflecting sunlight to reduce global warming and its most dangerous impacts.

Our project encompasses three critical areas of research. First, we need to find out if we can reliably and predictably increase reflectivity. To this end, we'll need to quantify how the addition of generated sea salt particles changes the number of droplets in these clouds, and study how clouds behave when they have more droplets. Depending on atmospheric conditions, MCB could affect things like cloud droplet evaporation rate, the likelihood of precipitation, and cloud lifetime. Quantifying such effects will require both simulations and field experiments.

Second, we need more modeling to understand how MCB would affect weather and climate both locally and globally. It will be crucial to study any negative unintended consequences using accurate simulations before anyone considers implementation. Our team is initially focusing on modeling how clouds respond to additional CCN. At some point we'll have to check our work with small-scale field studies, which will in turn improve the regional and global simulations we'll run to understand the potential impacts of MCB under different climate change scenarios.

The third critical area of research is the development of a spray system that can produce the size and concentration of particles needed for the first small-scale field experiments. We'll explain below how we're tackling that challenge.

One of the first steps in our project was to identify the clouds most amenable to brightening. Through modeling and observational studies, we determined that the best target is stratocumulus clouds, which are low altitude (around 1 to 2 km) and shallow; we're particularly interested in "clean" stratocumulus, which have low numbers of CCN. The increase in cloud albedo with the addition of CCN is generally strong in these clouds, whereas in deeper and more highly convective clouds other processes determine their brightness. Clouds over the ocean tend to be clean stratocumulus clouds, which is fortunate, because brightening clouds over dark surfaces, such as the ocean, will yield the highest albedo change. They're also conveniently close to the liquid we want to spray.

In the phenomenon called the Twomey effect, clouds with higher concentrations of small particles have a higher albedo, meaning they're more reflective. Such clouds might be less likely to produce rain, and the retained cloud water would keep albedo high. On the other hand, if dry air from above the cloud mixes in (entrainment), the cloud may produce rain and have a lower albedo. The full impact of MCB will be the combination of the Twomey effect and these cloud adjustments. Rob Wood

Based on our cloud type, we can estimate the number of particles to generate to see a measurable change in albedo. Our calculation involves the typical aerosol concentrations in clean marine stratocumulus clouds and the increase in CCN concentration needed to optimize the cloud brightening effect, which we estimate at 300 to 400 per cubic centimeter. We also take into account the dynamics of this part of the atmosphere, called the marine boundary layer, considering both the layer's depth and the roughly three-day lifespan of particles within it. Given all those factors, we estimate that a single spray system would need to continuously deliver approximately 3×10¹⁵ particles per second to a cloud layer that covers about 2,000 square kilometers. Since it's likely that not every particle will reach the clouds, we should aim for an order of magnitude or two more than that.
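That figure can be checked with back-of-envelope arithmetic from the numbers above. The boundary-layer depth used below (about 1 kilometer) is an assumed typical value, not a figure quoted in this article:

```python
# Back-of-envelope check of the spray-rate estimate.
target_concentration = 350e6   # particles per m^3 (300-400 per cm^3)
area = 2000e6                  # m^2 (2,000 km^2 of cloud deck)
depth = 1000.0                 # m (assumed marine boundary-layer depth)
lifetime = 3 * 24 * 3600       # s (roughly three-day particle lifespan)

# Particles needed in the layer, divided by how long each one lasts,
# gives the steady-state generation rate.
rate = target_concentration * area * depth / lifetime
print(f"Required rate: {rate:.1e} particles per second")  # ~2.7e15
```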

We can also determine the ideal particle size based on initial cloud modeling studies and efficiency considerations. These studies indicate that the spray system needs to generate seawater droplets that will dry to salt crystals of just 30–100 nanometers in diameter. Any smaller than that and the particles will not act as CCN. Particles larger than a couple hundred nanometers are still effective, but their larger mass means that energy is wasted in creating them. And particles that are significantly larger than several hundred nanometers can have a negative effect, since they can trigger rainfall that results in cloud loss.

We need a clear understanding of both the efficacy and risks of solar climate intervention technologies so people can make informed decisions about whether to implement them.

Creating dry salt crystals of the optimal size requires spraying seawater droplets of 120–400 nm in diameter, which is surprisingly difficult to do in an energy-efficient way. Conventional spray nozzles, where water is forced through a narrow orifice, produce droplets with diameters from tens of micrometers to several millimeters. To decrease the droplet size by a factor of ten, the pressure through the nozzle must increase more than 2,000 times. Other atomizers, like the ultrasonic nebulizers found in home humidifiers, similarly cannot produce small enough droplets without extremely high frequencies and power requirements.
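The wet-to-dry conversion follows from the salt content of seawater: the dry crystal's diameter scales as the cube root of the salt volume fraction. A minimal sketch, using typical textbook values for salinity and densities (assumptions, not figures from our spray work):

```python
# Why 120-400 nm droplets dry into roughly 30-100 nm salt crystals.
SALINITY = 0.035          # kg salt per kg seawater (typical value)
RHO_SEAWATER = 1025.0     # kg/m^3
RHO_SALT = 2160.0         # kg/m^3 (solid NaCl)

# Volume of salt per volume of seawater
salt_volume_fraction = SALINITY * RHO_SEAWATER / RHO_SALT  # ~0.017

def dry_diameter(wet_diameter_nm):
    """Diameter of the salt crystal left after a droplet evaporates."""
    return wet_diameter_nm * salt_volume_fraction ** (1 / 3)

for d in (120, 400):
    print(f"{d} nm droplet -> {dry_diameter(d):.0f} nm crystal")
# 120 nm droplet -> ~31 nm crystal; 400 nm droplet -> ~102 nm crystal
```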

Solving this problem required both out-of-the-box thinking and expertise in the production of small particles. That's where Armand Neukermans came in.

Kate Murphy leads the engineering effort for the MCB project at PARC, the Xerox research lab in Silicon Valley. Christopher Michel

Armand Neukermans brought his expertise in ink jet printers to bear on the quest to make nozzles that could efficiently and reliably spray tiny droplets of seawater. Christopher Michel

After a distinguished career at HP and Xerox focused on production of toner particles and ink jet printers, in 2009 Neukermans was approached by several eminent climate scientists, who asked him to turn his expertise toward making seawater droplets. He quickly assembled a cadre of volunteers, mostly retired engineers and scientists, and over the next decade these self-designated "Old Salts" tackled the challenge. They worked in a borrowed Silicon Valley laboratory, using equipment scrounged from their garages or purchased out of their own pockets. They explored several ways of producing the desired particle size distributions with various tradeoffs between particle size, energy efficiency, technical complexity, reliability, and cost. In 2019 they moved into a lab space at PARC, where they have access to equipment, materials, facilities, and more scientists with expertise in aerosols, fluid dynamics, microfabrication, and electronics.

The three most promising techniques identified by the team were effervescent spray nozzles, spraying salt water under supercritical conditions, and electrospraying to form Taylor cones (which we'll explain later). The first option was deemed the easiest to scale up quickly, so the team moved forward with it. In an effervescent nozzle, pressurized air and salt water are pumped into a single channel, where the air flows through the center and the water swirls around the sides. When the mixture exits the nozzle, it produces droplets with sizes ranging from tens of nanometers to a few micrometers, with the overwhelming number of particles in our desired size range. Effervescent nozzles are used in a range of applications, including engines, gas turbines, and spray coatings.

The key to this technology lies in the compressibility of air. As a gas flows through a constricted space, its velocity increases as the ratio of the upstream to downstream pressures increases. This relationship holds until the gas velocity reaches the speed of sound. As the compressed air leaves the nozzle at sonic speeds and enters the environment, which is at much lower pressure, the air undergoes a rapid radial expansion that explodes the surrounding ring of water into tiny droplets.
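The gas dynamics here are standard. For air, flow through the nozzle throat chokes (reaches Mach 1) once the upstream pressure exceeds the downstream pressure by a factor of about 1.9, as this sketch using the usual isentropic relations shows. The throat temperature is an assumed ambient value, not a design figure:

```python
import math

# Standard isentropic-flow relations for air (gamma = 1.4).
GAMMA = 1.4
R_AIR = 287.0  # J/(kg*K), specific gas constant for air

# Critical ratio of upstream (stagnation) to throat pressure at Mach 1
critical_ratio = ((GAMMA + 1) / 2) ** (GAMMA / (GAMMA - 1))
print(f"Choking pressure ratio: {critical_ratio:.2f}")  # ~1.89

# Speed of sound at an assumed throat temperature of 15 degrees C
T = 288.15  # K
a = math.sqrt(GAMMA * R_AIR * T)
print(f"Sonic speed: {a:.0f} m/s")  # ~340 m/s
```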

Coauthor Gary Cooper and intern Jessica Medrado test the effervescent nozzle inside the tent. Kate Murphy

Neukermans and company found that the effervescent nozzle works well enough for small-scale testing, but the efficiency—the energy required per correctly sized droplet—still needs to be improved. The two biggest sources of waste in our system are the large amounts of compressed air needed and the large fraction of droplets that are too big. Our latest efforts have focused on redesigning the flow paths in the nozzle to require smaller volumes of air. We're also working to filter out the large droplets that could trigger rainfall. And to improve the distribution of droplet size, we're considering ways to add charge to the droplets; the repulsion between charged droplets would inhibit coalescence, decreasing the number of oversized droplets.

Though we're making progress with the effervescent nozzle, it never hurts to have a backup plan. And so we're also exploring electrospray technology, which could yield a spray in which almost 100 percent of the droplets are within the desired size range. In this technique, seawater is fed through an emitter—a narrow orifice or capillary—while an extractor creates a large electric field. If the electrical force is of similar magnitude to the surface tension of the water, the liquid deforms into a cone, typically referred to as a Taylor cone. Over some threshold voltage, the cone tip emits a jet that quickly breaks up into highly charged droplets. The droplets divide until they reach their Rayleigh limit, the point where charge repulsion balances the surface tension. Fortuitously, surface seawater's typical conductivity (4 siemens per meter) and surface tension (73 millinewtons per meter) yield droplets in our desired size range. The final droplet size can even be tuned via the electric field down to tens of nanometers, with a tighter size distribution than we get from mechanical nozzles.
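To get a feel for the numbers, the Rayleigh limit can be computed directly from the standard criterion q = 8π√(ε₀γr³). In the sketch below, the surface tension is the value quoted above, while the droplet radius is an assumed example:

```python
import math

EPS0 = 8.854e-12   # F/m, vacuum permittivity
GAMMA_SW = 0.073   # N/m, seawater surface tension (from the article)

def rayleigh_limit_charge(radius_m):
    """Maximum charge a droplet can hold before charge repulsion
    overcomes surface tension (the Rayleigh criterion)."""
    return 8 * math.pi * math.sqrt(EPS0 * GAMMA_SW * radius_m ** 3)

# Assumed example: a droplet of 200 nm radius (400 nm diameter)
q = rayleigh_limit_charge(200e-9)
print(f"Rayleigh limit: {q:.2e} C "
      f"(~{q / 1.602e-19:.0f} elementary charges)")
```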

This diagram (not to scale) depicts the electrospray system, which uses an electric field to create cones of water that break up into tiny droplets. Kate Murphy

Electrospray is relatively simple to demonstrate with a single emitter-extractor pair, but one emitter only produces 10⁷ to 10⁹ droplets per second, whereas we need 10¹⁶ to 10¹⁷ per second. Producing that amount requires an array of up to 100,000 by 100,000 capillaries. Building such an array is no small feat. We're relying on techniques more commonly associated with cloud computing than actual clouds. Using the same lithography, etch, and deposition techniques used to make integrated circuits, we can fabricate large arrays of tiny capillaries with aligned extractors and precisely placed electrodes.
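The array sizing is simple arithmetic, shown here with the most demanding combination of the figures above:

```python
import math

# How many emitters does the electrospray array need?
total_rate = 1e17   # droplets/s, upper end of the requirement
per_emitter = 1e7   # droplets/s, lower end of a single emitter's output

emitters = total_rate / per_emitter
side = math.isqrt(int(emitters))  # emitters per side of a square array
print(f"{emitters:.0e} emitters -> roughly {side:,} x {side:,} array")
# 1e+10 emitters -> roughly 100,000 x 100,000 array
```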

Images taken by a scanning electron microscope show the capillary emitters used in the electrospray system. Kate Murphy

Testing our technologies presents yet another set of challenges. Ideally, we would like to know the initial size distribution of the saltwater droplets. In practice, that's nearly impossible to measure. Most of our droplets are smaller than the wavelength of light, precluding non-contact measurements based on light scattering. Instead, we must measure particle sizes downstream, after the plume has evolved. Our primary tool, called a scanning electrical mobility spectrometer, measures the mobility of charged dry particles in an electric field to determine their diameter. But that method is sensitive to factors like the room's size and air currents and whether the particles collide with objects in the room.
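The principle behind the instrument can be sketched in a few lines: a particle's electrical mobility Z relates to its diameter d through the standard aerosol-science relation Z = neC_c/(3πμd), where C_c is the Cunningham slip correction, so a measured mobility can be inverted numerically to recover the diameter. The constants below are typical sea-level values assumed for illustration:

```python
import math

MU_AIR = 1.81e-5       # Pa*s, air viscosity near 20 C (assumed)
MFP = 66e-9            # m, mean free path of air at sea level (assumed)
E_CHARGE = 1.602e-19   # C, elementary charge

def slip_correction(d):
    """Cunningham slip correction for particles near the mean free path."""
    kn = 2 * MFP / d
    return 1 + kn * (1.257 + 0.4 * math.exp(-1.1 / kn))

def mobility(d, n_charges=1):
    """Electrical mobility Z of a particle of diameter d (meters)."""
    return (n_charges * E_CHARGE * slip_correction(d)
            / (3 * math.pi * MU_AIR * d))

def diameter_from_mobility(z, n_charges=1, guess=100e-9):
    """Invert the mobility relation by damped fixed-point iteration."""
    d = guess
    for _ in range(100):
        d_new = (n_charges * E_CHARGE * slip_correction(d)
                 / (3 * math.pi * MU_AIR * z))
        d = 0.5 * (d + d_new)  # damping keeps the iteration stable
    return d

z = mobility(60e-9)  # mobility of a 60 nm particle
print(f"Recovered diameter: {diameter_from_mobility(z) * 1e9:.1f} nm")
# Recovered diameter: 60.0 nm
```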

To address these problems, we built a sealed 425 cubic meter tent, equipped with dehumidifiers, fans, filters, and an array of connected sensors. Working in the tent allows us to spray for longer periods of time and with multiple nozzles, without the particle concentration or humidity becoming higher than what we would see in the field. We can also study how the spray plumes from multiple nozzles interact and evolve over time. What's more, we can more precisely mimic conditions over the ocean and tune parameters such as air speed and humidity.

Part of the team inside the test tent; from left, "Old Salts" Lee Galbraith and Gary Cooper, Kate Murphy of PARC, and intern Jessica Medrado. Kate Murphy

We'll eventually outgrow the tent and have to move to a large indoor space to continue our testing. The next step will be outdoor testing to study plume behavior in real conditions, though not at a high enough rate that we would measurably perturb the clouds. We'd like to measure particle size and concentrations far downstream of our sprayer, from hundreds of meters to several kilometers, to determine if the particles lift or sink and how far they spread. Such experiments will help us optimize our technology, answering such questions as whether we need to add heat to our system to encourage the particles to rise to the cloud layer.

The data obtained in these preliminary tests will also inform our models. And if the results of the model studies are promising, we can proceed to field experiments in which clouds are brightened sufficiently to study key processes. As discussed above, such experiments would be performed over a small and short time so that any effects on climate wouldn't be significant. These experiments would provide a critical check of our simulations, and therefore of our ability to accurately predict the impacts of MCB.

It's still unclear whether MCB could help society avoid the worst impacts of climate change, or whether it's too risky, or not effective enough to be useful. At this point, we don't know enough to advocate for its implementation, and we're definitely not suggesting it as an alternative to reducing emissions. The intent of our research is to provide policymakers and society with the data needed to assess MCB as one approach to slow warming, providing information on both its potential and risks. To this end, we've submitted our experimental plans for review by the U.S. National Oceanic and Atmospheric Administration and for open publication as part of a U.S. National Academy of Sciences study of research in the field of solar climate intervention. We hope that we can shed light on the feasibility of MCB as a tool to make the planet safer.

Kate Murphy is a senior member of the research staff at PARC, a Xerox company, working at the intersection of mechanical engineering and materials science. She currently leads the engineering efforts for the Marine Cloud Brightening project.

Gary Cooper received his PhD from the University of Chicago in 1973. After a career in the pharmaceutical industry, he retired in 2009 and joined a volunteer group of Silicon Valley physicists and engineers working on the task of generating aerosols needed for the Marine Cloud Brightening project. The volunteer team now works out of the Palo Alto Research Center (PARC).

Sarah Doherty is a senior research scientist at the Cooperative Institute for Climate, Ocean, and Ecosystem Studies at the University of Washington, and is the program manager for the Marine Cloud Brightening project. Her research focuses on aerosols in the atmosphere and how they interact with sunlight and clouds to affect climate change.

Robert Wood is a professor of atmospheric sciences at the University of Washington. Wood's research focuses on understanding the processes controlling clouds in the Earth's atmosphere and the roles that clouds play in determining climate variability and change. He is the primary investigator on the Marine Cloud Brightening project.

Another link to the project: https://mcbproject.org/ .

An interesting article, as is the cautionary note from Mr. Holeman. We have been modifying the climate since the beginning of the industrial revolution. It is quite possible, in my personal opinion, that it may not be sufficient to establish agreements between nations to taper the insult to the planet. Active remediation will be necessary, again in my personal opinion. In this case, as in medicine, first do no harm. Test, tweak, and scale.

I calculate that your proposed system would inject seawater into the atmosphere at a rate of about 2.3 liters per minute at 10^17 droplets per second having a diameter of 400 nm.

One liter of seawater contains about 10 billion virus particles.

https://en.wikipedia.org/wiki/Marine_viruses

Your proposed system would therefore inject about 23 billion virus particles per minute high into the atmosphere.

While most of these marine viruses infect marine plankton others are a threat to fish. Consider, for example, rhabdoviruses "which are distinct from, but related to rabies virus. At least nine types of rhabdovirus cause economically important diseases in species including salmon, pike, perch, sea bass, carp and cod." [ibid]

Marine mammals are also at risk of marine viral infection. "In 1988 and 2002, thousands of harbour seals were killed in Europe by phocine distemper virus.[67] Many other viruses, including caliciviruses, herpesviruses, adenoviruses and parvoviruses, circulate in marine mammal populations.[68]" [ibid]

Besides 10 billion virus particles, a liter of seawater also contains a billion bacteria.

https://ocean.si.edu/ocean-life/microbes/marine-microbes

Virtually every pathogen (virus, bacterium, prion, or fungus) that gets flushed down to the sea would also be catapulted into the atmosphere by your scheme.

You assume the rapid evaporation of seawater that would leave dry particles of sea salt. However, your system of high shear is the same sort of mechanism used to disrupt cell membranes and to create encapsulated nanoparticles for drug delivery systems.

https://www.sciencedirect.com/topics/biochemistry-genetics-and-molecular-biology/cell-disruption

Encapsulation is, of course, a means to stabilize and protect biological materials from desiccation, oxidation, and so forth.

https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2865805/

Am I alone in thinking that your scheme could make the planet less, not more, safe?

Your weekly selection of awesome robot videos

Video Friday is your weekly selection of awesome robotics videos, collected by your friends at IEEE Spectrum robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please send us your events for inclusion.

Chengxu Zhou from the University of Leeds writes, “we have recently done a demo with one operator teleoperating two legged manipulator for a bottle opening task.”

If you like this and are in the market for a new open source quadruped controller, CMU’s got that going on, too.

A bolt-on 360 camera kit for your drone that costs $430.

I think I may be too old to have any idea what’s going on here.

I’m not the biggest fan of the way the Stop Killer Robots folks go about trying to make their point, but they have a new documentary out, so here you go.

Error-riddled astronomical tables inspired the first computer—and the first vaporware

Allison Marsh is a professor at the University of South Carolina and codirector of the university's Ann Johnson Institute for Science, Technology & Society. She combines her interests in engineering, history, and museum objects to write the Past Forward column, which tells the story of technology through historical artifacts.

During Charles Babbage’s lifetime, this 2,000-part clockwork was as near to completion as his Difference Engine ever got.

It was an idea born of frustration, or at least that’s how Charles Babbage would later recall the events of the summer of 1821. That fateful summer, Babbage and his friend and fellow mathematician John Herschel were in England editing astronomical tables. Both men were founding members of the Royal Astronomical Society, but editing astronomical tables is a tedious task, and they were frustrated by all of the errors they found. Exasperated, Babbage exclaimed, “I wish to God these calculations had been executed by steam.” To which Herschel replied, “It is quite possible.”

Babbage and Herschel were living in the midst of what we now call the Industrial Revolution, and steam-powered machinery was already upending all types of business. Why not astronomy too?

Babbage set to work on the concept for a Difference Engine, a machine that would use a clockwork mechanism to solve polynomial equations. He soon had a small working model (now known as Difference Engine 0), and on 14 June 1822, he presented a one-page “Note respecting the Application of Machinery to the Calculation of Astronomical Tables” to the Royal Astronomical Society. His note doesn’t go into much detail—it’s only one page, after all—but Babbage claimed to have “repeatedly constructed tables of squares and triangles of numbers” as well as of the very specific formula x² + x + 41. He ends his note with much optimism: “From the experiments I have already made, I feel great confidence in the complete success of the plans I have proposed.” That is, he wanted to build a full-scale Difference Engine.

Perhaps Babbage should have tempered his enthusiasm. His magnificent Difference Engine proved far more difficult to build than his note suggested.

It wasn’t for lack of trying, or lack of funds. For Babbage managed to do something else that was almost as unimaginable: He convinced the British government to fund his plan. The government saw the value in a machine that could calculate the many numerical tables used for navigation, construction, finance, and engineering, thereby reducing human labor (and error). With an initial investment of £1,700 in 1823 (about US $230,000 today), Babbage got to work.

The 19th-century mathematician Charles Babbage’s visionary contributions to computing were rediscovered in the 20th century. The Picture Art Collection/Alamy

Babbage based his machine on the mathematical method of finite differences, which allows you to solve polynomial equations in a series of iterative steps that compare the differences in the resulting values. This method had the advantage of requiring simple addition only, which was easier to implement using gear wheels than one based on multiplication and division would have been. (The Computer History Museum has an excellent description of how the Difference Engine works.) Although Babbage had once dreamed of a machine powered by steam, his actual design called for a human to turn a crank to advance each iteration of calculations.
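The method is easy to mimic in software. Here is a minimal sketch, not a model of the engine's mechanism, that tabulates Babbage's example polynomial x² + x + 41 using nothing but addition:

```python
def difference_engine(initial_values, steps):
    """Tabulate a polynomial using only addition, via the method of
    finite differences. `initial_values` holds the function value and
    its finite differences at x = 0, highest-order difference last."""
    row = list(initial_values)
    results = []
    for _ in range(steps):
        results.append(row[0])
        # Update each value by adding the next-higher-order difference.
        for i in range(len(row) - 1):
            row[i] += row[i + 1]
    return results

# f(x) = x^2 + x + 41, the formula in Babbage's 1822 note:
# f(0) = 41, first difference f(1) - f(0) = 2, second difference = 2.
print(difference_engine([41, 2, 2], 8))
# [41, 43, 47, 53, 61, 71, 83, 97]
```

Each turn of the crank corresponded to one pass through that inner loop, rippling the differences upward to produce the next tabulated value.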

Difference Engine No. 1 was divided into two main parts: the calculator and the printing mechanism. Although Babbage considered using different numbering systems (binary, hexadecimal, and so on), he decided to stick with the familiarity of the base-10 decimal system. His design in 1830 had a capacity of 16 digits and six orders of difference. Each number value was represented by its own wheel/cam combination. The wheels represented only whole numbers; the machine was designed to jam if a result came out between whole numbers.

As the calculator cranked out the results, the printing mechanism did two things: It printed a table while simultaneously making a stereotype mold (imprinting the results in a soft material such as wax or plaster of paris). The mold could be used to make printing plates, and because it was made at the same time as the calculations, there would be no errors introduced by humans copying the results.

Difference Engine No. 1 contained more than 25,000 distinct parts, split roughly equally between the calculator and the printer. The concepts of interchangeable parts and standardization were still in their infancy. Babbage thus needed a skilled craftsman to manufacture the many pieces. Marc Isambard Brunel, part of the father-and-son team of engineers who had constructed the first tunnel under the Thames, recommended Joseph Clement. Clement was an award-winning machinist and draftsman whose work was valued for its precision.

Babbage and Clement were both brilliant at their respective professions, but they often locked horns. Clement knew his worth and demanded to be paid accordingly. Babbage grew concerned about costs and started checking on Clement’s work, which eroded trust. The two did produce a portion of the machine [shown at top] that was approximately one-seventh of the complete engine and featured about 2,000 moving parts. Babbage demonstrated the working model in the weekly soirees he held at his home in London.

The machine impressed many of the intellectual society set, including a teenage Ada Byron, who understood the mathematical implications of the machine. Byron was not allowed to attend university due to her sex, but her mother supported her academic interests. Babbage suggested several tutors in mathematics, and the two remained correspondents over their lifetimes. In 1835, Ada married William King. Three years later, when he became the first Earl of Lovelace, Ada became Countess of Lovelace. (More about Ada Lovelace shortly.)

Despite the successful chatter in society circles about Babbage’s Difference Engine, trouble was brewing—cost overruns, political opposition to the project, and Babbage and Clement’s personality differences, which were causing extreme delays. Eventually, the relationship between Babbage and Clement reached a breaking point. After yet another fight over finances, Clement abruptly quit in 1832.

Ada Lovelace championed Charles Babbage’s work by, among other things, writing the first computer algorithm for his unbuilt Analytical Engine. Interim Archives/Getty Images

Despite these setbacks, Babbage had already started developing a more ambitious machine: the Analytical Engine. Whereas the Difference Engine was designed to solve polynomials, this new machine was intended to be a general-purpose computer. It was composed of several smaller devices: one to list the instruction set (on punch cards popularized by the Jacquard loom); one (called the mill) to process the instructions; one (which Babbage called the store but we would consider the memory) to store the intermediary results; and one to print out the results.

In 1840 Babbage gave a series of lectures in Turin on his Analytical Engine, to much acclaim. Italian mathematician Luigi Federico Menabrea published a description of the engine in French in 1842, “Notions sur la machine analytique.” This is where Lady Lovelace returns to the story.

Lovelace translated Menabrea’s description into English, discreetly making a few corrections. The English scientist Charles Wheatstone, a friend of both Lovelace and Babbage, suggested that Lovelace augment the translation with explanations of the Analytical Engine to help advance Babbage’s cause. The resulting “Notes,” published in 1843 in Richard Taylor’s Scientific Memoirs, was three times the length of Menabrea’s original essay and contained what many historians consider the first algorithm or computer program. It is quite an accomplishment to write a program for an unbuilt computer whose design was still in flux. Filmmakers John Fuegi and Jo Francis captured Ada Lovelace’s contributions to computing in their 2003 documentary Ada Byron Lovelace: To Dream Tomorrow. They also wrote a companion article published in the IEEE Annals of the History of Computing, entitled “Lovelace & Babbage and the Creation of the 1843 ‘Notes’.”

Although Lovelace’s translation and “Notes” were hailed by leading scientists of the day, they did not win Babbage any additional funding. Prime Minister Robert Peel had never been a fan of Babbage’s; as a member of Parliament back in 1823, he had been a skeptic of Babbage’s early design. Now that Peel was in a position of power, he secretly solicited condemnations of the Difference Engine. In a stormy meeting on 11 November 1842, the two men argued past each other. In January 1843, Babbage was informed that Parliament was sending the finished portion of Difference Engine No. 1 to the King’s College Museum. Two months later, Parliament voted to withdraw support for the project. By then, the government had spent £17,500 (about US $3 million today), waited 20 years, and still didn’t have a working machine. You could see why Peel thought it was a waste.

But Babbage, perhaps reinvigorated by his work on the Analytical Engine, decided to return to the Difference Engine in 1846. Difference Engine No. 2 required only 8,000 parts and had a much more elegant and efficient design. He estimated it would weigh 5 tons and measure 11 feet long and 7 feet high. He worked for another two years on the machine and left 20 detailed drawings, which were donated to the Science Museum after he died in 1871.

In 1985, a team at the Science Museum in London set out to build the streamlined Difference Engine No. 2 based on Babbage’s drawings. The 8,000-part machine was finally completed in 2002. Science Museum Group

Although Difference Engine No. 2, like all the other engines, was never completed during Babbage’s lifetime, a team at the Science Museum in London set out to build one. Beginning in 1985, under the leadership of Curator of Computing Doron Swade, the team created new drawings adapted to modern manufacturing techniques. In the process, they sought to answer a lingering question: Was 19th-century precision a limiting factor in Babbage’s design? The answer is no. The team concluded that if Babbage had been able to secure enough funding and if he had had a better relationship with his machinist, the Difference Engine would have been a success.

That said, some of the same headaches that plagued Babbage also affected the modern team. Despite leaving behind fairly detailed designs, Babbage left no introductory notes or explanations of how the pieces worked together. Much of the groundbreaking work interpreting the designs was done by Australian computer scientist and historian Allan G. Bromley, beginning in 1979. Even so, the plans had dimension inconsistencies, errors, and entire parts omitted (such as the driving mechanism for the inking), as described by Swade in a 2005 article for the IEEE Annals of the History of Computing.

The team had wanted to complete the Difference Engine by 1991, in time for the bicentenary of Babbage’s birth. They did finish the calculating section by then. But the printing and stereotyping section—the part that would have alleviated all of Babbage’s frustrations in editing those astronomical tables—took another nine years. The finished product is on display at the Science Museum.

A duplicate engine was built with funding from former Microsoft chief technology officer Nathan Myhrvold. The Computer History Museum displayed that machine from 2008 to 2016, and it now resides in the lobby of Myhrvold’s Intellectual Ventures in Bellevue, Wash.

The title of the textbook for the very first computer science class I ever took was The Analytical Engine. It opened with a historical introduction about Babbage, his machines, and his legacy. Babbage never saw his machines built, and after his death, the ideas passed into obscurity for a time. Over the course of the 20th century, though, his genius became more clear. His work foreshadowed many features of modern computing, including programming, iteration, looping, and conditional branching. These days, the Analytical Engine is often considered an invention 100 years ahead of its time. It would be anachronistic and ahistorical to apply today’s computer terminology to Babbage’s machines, but he was clearly one of the founding visionaries of modern computing.

Part of a continuing series looking at photographs of historical artifacts that embrace the boundless potential of technology.

An abridged version of this article appears in the June 2022 print issue as “The Clockwork Computer.”
