Whatever Happened to the Global Warming “Pause”?

It is only September, but, absent a massive volcanic eruption or asteroid impact, 2015 will be, by far, the hottest year on the instrumental record. The culprit is a massive El Niño that is compounding the warming effects of rising greenhouse gas emissions. This year’s scorching heat will mean that the three hottest years on record will have occurred within the same five-year stretch: 2010, 2014, and 2015.

In June, scientists with the National Oceanic and Atmospheric Administration (NOAA) published a major article that measures the rate of global warming in light of this recent heat. The article uses a suite of new sources on land and across the oceans to provide a comprehensive temperature reconstruction from 1880 to the present. Many of these sources provide meteorological data of unprecedented accuracy. The new temperature reconstruction includes especially substantial corrections to previous global sea surface temperature statistics for the years 1880 to 1940. It also makes minor adjustments to temperature reconstructions of the last twenty years. Using their new data, the NOAA scientists conclude that global temperatures have increased steadily since 1950. This is a major finding that contradicts the popular idea of a “pause” or “hiatus” in recent warming.

According to climatologist Gavin Schmidt, director of the Goddard Institute for Space Studies, “the fact that such small changes to the analysis make the difference between a hiatus or not . . . underlines how fragile a concept [the pause] was in the first place.” So how did that pause become such an influential idea? Given the forces behind record-setting temperatures this year, it is not surprising that the answer must begin with another major El Niño, eighteen years ago.

El Niños compared. Source: NASA JPL, Bay Area News Group.

From 1997 to 1998, that record-breaking El Niño slowly uncoiled across the tropical Pacific. In an El Niño, trade winds that normally blow from east to west slow down, lowering the thermocline: the layer of water that divides the upper from the lower ocean. That, in turn, reduces the efficiency of cool, deep-water upwelling. Warmer sea surface temperatures in the equatorial Pacific extend far to the east and reshape atmospheric circulation around the world. In 1998, global temperatures rose sharply. Temperature records are usually set by a hundredth of a degree Celsius, but temperatures in 1998 exceeded the previous high-water mark by a tenth of a degree. It was the meteorological equivalent of Usain Bolt’s performance at the 2008 Olympics.

The mechanism by which El Niño affects the world’s climate. “Atmospheric bridge” by Giorgiogp2 – Own work. Licensed under CC BY-SA 3.0 via Commons. https://goo.gl/EMTxYU

Looking back in 2001, the Intergovernmental Panel on Climate Change (IPCC) concluded that the 1998 spike in Earth’s temperature was an “extreme event,” even in the context of human-caused climate change. Inevitably, global temperatures cooled in the immediate wake of the great El Niño, although they remained much higher than the twentieth-century average. Still, to some, the warnings of climate scientists had suddenly become incompatible with the recent instrumental record. The idea of the “pause” was born.

Global warming sceptics quickly sought to popularize the concept. Naomi Oreskes and Erik Conway recently chronicled how some activist scientists have, for decades, derailed public discourse on science that appears to challenge the unfettered workings of the free market. They tried to undermine, for example, the science that links cigarette smoke to lung cancer, and chlorofluorocarbons to the ozone hole. They lost those battles, but found greater success in their fight against climate science. At every step, they have deployed the weapon of doubt. If the science that supports regulation can be made to sound uncertain, it can hardly serve as a foundation for government policy.

As early as 2006, for instance, palaeontologist Robert Carter insisted that temperatures had remained steady for eight years, since 1998. In his telling, the science behind global warming appeared anything but settled. Carter, a well-known denier of anthropogenic warming, co-authored a notorious paper in 2009, which argued that El Niño events account for most of the changes in global temperature over the past fifty years. Three years later, Carter admitted that he received a monthly salary from the Heartland Institute, an influential organization committed to denying global warming. Regardless, Carter made an error that is also committed by better-intentioned sceptics of anthropogenic climate change. He identified a climatic trend by cherry-picking a start date – an abnormally warm year – that allowed him to argue for recent cooling.
Even many scientists who agreed with the scholarly consensus on global warming believed that the steady rise in Earth’s temperature had slowed – but not stopped – since 1998. As I have reported on this website, most of their research tied this supposed slowdown to one of two sets of variables. Some scientists argued that entirely natural causes had temporarily reduced the amount of solar radiation reaching the Earth; one influential article, for example, blamed a recent spate of small volcanic eruptions. Other scientists more persuasively concluded that different parts of the climate system appeared to absorb more heat than had been thought possible. The oceans were widely cited as the destination for warmth that might otherwise have reached our atmosphere.

Nevertheless, the so-called “pause” in global warming barely registered in the public consciousness until early 2013. In January of that year, an article by lead author James Hansen – perhaps the most responsible and vocal voice in global warming scholarship – began with alarming news. In 2012, it announced, temperatures the world over had been 0.56 degrees Celsius warmer than the 1951–1980 base average. This was in spite of a La Niña that had chilled the waters of the equatorial Pacific, and therefore had the opposite effect of an El Niño on global temperatures.

Unfortunately, Hansen’s report contained a misleading and ultimately disastrous headline. Under “Global Warming Standstill,” the report noted that the five-year running mean of global temperature “has been flat for the past decade,” despite a steady (if slowing) increase in greenhouse gas emissions. The report speculated about possible reasons, and concluded that the likeliest culprit was El Niño conditions early in the period, followed by La Niña episodes in more recent years. In years less troubled by these conditions, temperatures continued to rise at the rate they had in the previous three decades.
Google searches for “Global Warming Pause,” as of September 30th, 2013. Source: Google Trends/Maggie Severns.

In March, an influential piece in The Economist misinterpreted this part of Hansen’s article. It began by declaring that “over the past 15 years, air temperatures at the Earth’s surface have been flat while greenhouse-gas emissions have continued to soar.” The article did not deny the existence of warming, and it concluded that the recent pause was “hardly reassuring.” Nevertheless, it suggested that Earth might be less sensitive to our carbon emissions than scientists had previously believed. The Hansen article, remember, had reached entirely different conclusions.

The Economist article provided an opening for sceptics of climate change to push the narrative of the unexplained pause. Meanwhile, as chronicled by Chris Mooney, the IPCC prepared to release its fifth assessment report. In August, a draft was leaked to the press. It contained descriptions of the pause that, when read in isolation, appeared to confirm the Economist article. According to one of the report’s lead authors, Dennis Hartmann of the University of Washington, public pressure convinced the scientists that they “had to say something” about the pause. They were hesitant, because the fifteen years between 1998 and 2013 were too short to reliably reflect a climatic trend. Yet before they had finalized their interpretation of the pause, it reached journalists who were only too eager to sacrifice scientific accuracy for sensationalist headlines.

Not all proponents of the pause were scientifically illiterate, or associated with global warming denial. For example, in September 2013, Berkeley physicist and converted climate sceptic Richard Muller announced that the “global warming crowd has a problem,” because, “despite a steady escalation of greenhouse gas emissions into the atmosphere, the planet’s average surface temperature has remained pretty much the same for the last 15 years.” Muller argued that he had more or less predicted this pause back in 2004, in a critique of the so-called “hockey stick” graph that famously illustrated global warming (and has since been vindicated). In 2004, Muller warned that, were the graph taken seriously, many would see inevitable, natural variations in temperature as undermining the case for anthropogenic climate change. In 2013, it seemed that Muller might be right.

Then, in late September, the IPCC finally released its completed fifth assessment report. As I described on this website, the report acknowledged a slight slowdown in the rate of global warming from 1998 to 2012. However, it weakened the previous draft’s description of a “pause” or “hiatus.” It confirmed that every decade in the past thirty years had been successively warmer than any previously recorded decade. As I wrote in 2013, it also explained that “the presence of short-term fluctuations in climate does not throw into doubt the existence of long-term climatic trends.” As one might expect, many sceptics met the final report with derision. 
Previous NOAA temperature reconstructions in red (for land surface temperatures) and blue (for sea surface temperatures). Corrections in the new article are in black.

Fast forward to the present, and 2015 is set to be another meteorological equivalent of Usain Bolt’s Olympic run. As in 1998, it looks like the previous record temperature will be broken by a tenth of a degree Celsius. Perhaps the narrative of the pause is now on its deathbed. Yet the idea of a recent slowdown in warming persists even in well-informed circles. In an elegant article published in the journal Science, climatologist Kevin Trenberth responds to the NOAA paper by taking issue with any warming trend that starts in 1950. After all, that year fell at the beginning of a so-called “big hiatus” in warming that chilled the world between 1943 and 1975. If sceptics should not cherry-pick the start date of their temperature series by choosing an abnormally warm year, climatologists should not do the same by selecting a very cold year.
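Trenberth’s point about start dates is easy to demonstrate with a little arithmetic. The Python sketch below fits least-squares trends to an entirely synthetic, illustrative anomaly series – my own toy example, not Trenberth’s or NOAA’s calculation – and shows how beginning the fit at an abnormally warm year flattens the apparent trend.

```python
import numpy as np

# Entirely synthetic anomaly series -- illustrative only, not real data:
# a steady 0.015 °C/yr warming trend with an artificial 1998 spike
# (noise omitted so the effect of the start date is easy to see).
years = np.arange(1950, 2016)
anoms = 0.015 * (years - 1950)
anoms[years == 1998] += 0.2  # a hypothetical El Niño-style jump

def trend_from(start_year):
    """Least-squares warming trend (°C/yr) from start_year onward."""
    mask = years >= start_year
    slope, _ = np.polyfit(years[mask], anoms[mask], 1)
    return slope

print(f"Trend from 1950: {trend_from(1950):+.4f} °C/yr")  # ~+0.0151
print(f"Trend from 1998: {trend_from(1998):+.4f} °C/yr")  # ~+0.0115: flatter, starts on the spike
```

The same logic, run in reverse, applies to a series that begins at an unusually cold year like 1950.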

Interestingly, the big hiatus, which is well supported by robust science, was probably also caused by human activity. Aerosols released into the atmosphere by industrial pollution likely cooled the world until clean air acts were passed in the 1970s. Anthropogenic climate change, in other words, has both warmed and cooled the globe, and it has involved many kinds of human activity.

Overall, Trenberth supports the major findings of the NOAA scientists. He argues that there may have been a slight slowdown – not a pause – in the rate of global warming, driven at least partially by natural variation in, for example, the Pacific Decadal Oscillation (which includes El Niño and La Niña). Yet he agrees that temperatures are rising quickly once again, and that the slowdown has probably ended.

Earlier this year, record-breaking wildfires spread across North America, while heat waves killed thousands in Europe and India. The oceans, it seems, are warming, and sea levels rising, much faster than even the IPCC had predicted. The costs of inaction on climate change have become increasingly clear. Carbon emissions have stagnated or declined in many developed countries, yet that is far from enough. Without drastic action, the weather of 2015 is only a precursor of things to come.


Sources: 

Hansen, J., M. Sato, and R. Ruedy. “Global Temperature Update Through 2012.” 15 January 2013. Available at: http://www.columbia.edu/~jeh1/mailings/2013/20130115_Temperature2012.pdf

Karl, Thomas R., et al. “Possible artifacts of data biases in the recent global surface warming hiatus.” Science 348:6242 (2015): 1469–1472.

Trenberth, Kevin E. “Has there been a hiatus?” Science 349:6249 (2015): 691–692.

Trenberth, Kevin E. “The definition of El Niño.” Bulletin of the American Meteorological Society 78:12 (1997): 2771–2777.

— Dagomar Degroot - HistoricalClimatology.com


Post-Doc Opportunity: UC Institute for the Study of the Ecological Effects of Climate Impacts (ISEECI)

The newly formed UC Institute for the Study of the Ecological Effects of Climate Impacts (ISEECI) has posted three postdoctoral fellowship announcements on its website: http://nrs.ucop.edu/research/iseeci/index.htm Two of these – on biotic communities …

— Climate History Network


“Tomorrow: Sunny”: The Rise and Fall of Solar Heating in 1970s Canada

By Henry (Hank) Trim

Solar energy seems poised to become a major player in the world of energy. Years of investment have brought down the price of photovoltaics, and innovative financing methods have generated unprecedented growth in the industry. According to the Canadian Solar Industries Association, solar electricity is the fastest growing source of energy in the world.[1] The future of solar is bright! No pun intended.

This is not the first time solar technology has seemed poised for success. In the late 1970s solar heating appeared ready to sweep across Canada. In fact, the federal government launched a multibillion-dollar commercialization program, and Alastair Gillespie, then the Minister of Energy, Mines, and Resources, promised that a solar industry would provide jobs for thousands of Canadians.[2]

World Oil Prices since 1861. The orange line is adjusted for inflation.
Tom The Hand – Own work. Licensed under CC BY-SA 3.0 via Commons https://commons.wikimedia.org/wiki/File:Oil_Prices_Since_1861.svg#/media/File:Oil_Prices_Since_1861.svg

In a series of posts over the next four months, I will explore the meteoric rise of solar heating in the 1970s and its fall in the 1980s. Solar heating’s story highlights the complexity of energy markets and energy policy, and it illustrates how energy experts thought about the future and made critical decisions on the basis of their best guesses in the 1970s. Energy analysts in government and industry still struggle with the challenges of uncertainty and contentious politics as they attempt to chart and control our energy future. Today’s photovoltaics still rely on these experts’ projections of the future.

The oil embargo of 1973 inaugurated today’s complex energy market. The Organization of Petroleum Exporting Countries (OPEC), led by Saudi Arabia, embargoed western countries in retaliation for their support of Israel in the 1973 Yom Kippur War. Oil prices spiked as reductions in supply and speculation drove prices to double and then quadruple. Oil, the foundation of the world’s energy supply, no longer seemed safe.

At the same time as the oil shocks disrupted markets, economists, engineers, and scientists began to use their growing abilities to model and analyze the world’s complexity. Searching for a way to understand energy, government and industry analysts applied these models to energy questions. In fact, discussions of energy quickly became debates over supply and demand projections, depletion rates, and forecasts of potential discoveries sprinkled with references to billions of barrels and megawatts of electricity.

Huge scale, long lead times, and global volatility made these forecasts both a necessity and a headache for companies and governments. Having some idea about what to expect, where to invest your time and money, and what new technology or energy source might reshape the energy future became a full-time occupation for legions of industry and government analysts in the 1970s. Accurate predictions or not, the demands of the industry and the payoffs of backing the right idea have kept them scanning the horizon ever since. Canada’s solar boom in the 1970s was just one product of this ongoing search for energy futures.

Far from being an energy superpower, Canada had not yet developed the oil sands. Nor had it begun to extract oil from the Grand Banks or fully explore the Arctic for oil and gas. Nonetheless, the Canadian government saw great potential in the oil industry in the early 1970s. The government’s Energy Policy for Canada forecast rapid growth as new reserves were discovered and the oil sands finally came on line. In 1973, just before the embargo, analysts stated that these developments would provide Canadians with jobs and tax dollars for decades.[3]

An EMR Energy Model. This model of Canada’s energy system appeared as part of an extensive list describing the complex feedback between energy and Canadian society and emphasizing the need for carefully designed policy.
Energy, Mines, and Resources, An Energy Policy for Canada – Phase 1 (Ottawa: Supply and Services, 1973)

Three years later the government admitted the oil shock had thrown its “energy-policy planning and other social and economic” goals into disarray. There were two reasons for this disarray. First, in the 1970s the Canadian oil market functioned as two separate and heavily managed markets. West of the Ottawa River, Canada relied on Alberta oil, which before the OPEC embargo was more expensive than imported oil. East of the river, Canada relied on imports. The 1973 shock destroyed this system. In 1974 a new oil program attempted to strike a balance between western producers, who wanted to sell their oil at rising world prices, and consumers (particularly in central Canada’s industrial heartland) who wanted to keep fuel costs low. To let domestic prices rise without punishing eastern consumers, subsidies were applied to imported oil: every dollar the world price climbed above the regulated domestic price had to be paid out on every imported barrel. As world prices continued to rise, the mounting cost of these subsidies destroyed the federal government’s balance sheet. This created a strong incentive to find a substitute for oil.

The second cause of disarray further complicated matters. In 1974 energy experts brought further bad news: they concluded that their previous sunny forecasts of Canadian oil and gas potential had been far too optimistic. Experts now forecast that the country would run short of conventional oil by the mid-1980s unless the industry made substantial breakthroughs in extraction. This meant that Canadians and their government had to start preparing for a world of energy scarcity and rising prices. Although these forecasts turned out to be wrong, they shaped policy throughout the 1970s.

Canada’s energy future became a political issue in 1975. Pierre Elliott Trudeau’s Cabinet decided that it needed to respond to the criticism that “Canada [has] no national energy policy” and no idea how to respond to the energy crisis. According to Cabinet memos, Cabinet demanded that the Department of Energy, Mines, and Resources guide “aggressive government policies directed at increasing supplies, reducing demands, and … encouraging inter-energy substitutions.”[4] The federal government quickly launched new programs, and in 1976 the Department published a new Energy Strategy for Canada that made decreasing energy use and replacing oil with other forms of energy a central goal. Scientists, entrepreneurs, and environmentalists saw this growing desperation as an opportunity. Perhaps, they thought, renewable energy’s time had finally come.

Hank Trim is a SSHRC postdoctoral fellow at the University of California, Santa Barbara. This is the first in a four-part monthly series on the rise and fall of solar energy in the 1970s and 1980s.


[1] “Roadmap 2020: Powering Canada’s Future with Solar Electricity,” Canadian Solar Industries Association, last modified August 20, 2015, http://www.cansia.ca/sites/default/files/cansia_roadmap_2020_final.pdf.

[2] Timothy Pritchard, “Federal Plan to Encourage use of Solar and Waste Energy,” Globe and Mail, July 5, 1978.

[3] Energy, Mines, and Resources, An Energy Policy for Canada – Phase 1 (Ottawa: Supply and Services, 1973).

[4] Cabinet Memo, “A National Energy Strategy,” 12/23/75, R1526, Vol. 255, File Energy Policy, 1975-1976, Gillespie Fonds, Library and Archives Canada.

— ActiveHistory.ca


Job Announcement: Graduate Editor/Communications Associate

Deadline: 30 September

— Rachel Carson Centre LMU - News and Events


Hurricane Katrina, 10 Years Later

Ten years ago today, New Orleans hit bottom: most of the city was flooded, systems and safety nets had snapped, and citizens lacked food, water, and security.  The city has since come back, but unevenly: tourist spots are hopping and there’s new investment, but social and racial inequalities have deepened.
Immediately after the event, I wrote a piece about Hurricane Katrina for Alternatives Journal.  I stressed a few points: that this disaster had been predicted well in advance, and owed much of its severity to earlier decisions and environmental transformations.  I also predicted that if powerful institutional and economic interests remained unreformed, New Orleans would experience, at best, only a partial and unequal recovery.  And so it has been.

— Stephen Bocking - Environment, History, Science


Beavering away in Västerbotten

Yesterday evening I went on my second beaver safari. This time I was near home–only 36 km away in Vännäs on the Vindel river. We had great luck and saw several beavers right away.

Beaver on beaver safari in Vännäs, Västerbotten county, Sweden. Photo by D Jørgensen

Beaver were first brought back to the county of Västerbotten very early in the reintroduction process. In 1924, the second beaver reintroduction in Sweden took place in Västerbotten on the Tärnaån further inland. But no more reintroductions happened in the area until after World War II. In the 1950s and 60s beavers were set out intentionally and more animals migrated in from the neighbouring Jämtland reintroductions.

According to an article from 1984 in the journal Från hav till fjäll, an inventory in 1961-62 counted 39 animals in Västerbotten county. By 1969, the number had grown to 63, and by 1976 it had jumped to an estimated 500. By 1983, the estimate was 5,600 to 7,000 animals, and it’s gone up significantly since then. The beavers have been beavering away in Västerbotten.

Beaver skins on the benches on beaver safari, Vännäs, Sweden. Photo by D Jørgensen.

Like my previous research object safari experiences, this was also a sensory tour. We started out around a fire, drinking freshly brewed open-fire coffee while sitting on wooden benches draped with beaver skins. The skins were soft and warm. Our guide, Stefan Lindgren of By the River, explained that it is the soft underfur, with thousands of follicles per square cm, that keeps the beaver’s fur waterproof (and soft). Stefan had acquired the skins from a retired beaver trapper, who had originally kept them in order to make a beaver coat for his wife – but she refused to have it! So Stefan bought the skins and now uses them on tours to allow guests to get closer to beavers.

It was also a physical tour – we were in a rubber boat, and each guest had to do some paddling along the ride. Luckily the wind was blowing upstream, so we had it pretty easy. There was a stillness out on the Vindel river. You have to be quiet to avoid scaring off the beavers, so it was just the sound of the paddles, the wind in the trees, a fish splash here and there, and the slap of a beaver tail when one dove out of sight.

Our guide Stefan with the freshly gnawed off branches dropped by the beaver. Photo by D. Jørgensen.

We got to see the entrance to the beaver’s den with branches piled up as protection and the beaver trails from the water into the woodlands. At one point, we saw a beaver dragging some freshly cut willow branches through the water. Unfortunately, when the animal saw us, he/she dropped the newly acquired prize and swam away. Stefan then guided the boat over to the branches and picked them up for us to see the beaver’s handiwork. We each got a piece of beaver gnawed branch to take home.

Although beavers make significant changes to their landscape, they are in many ways invisible. Few people have ever seen a beaver, even if they live in an area well inhabited by the critters. Like most wildlife, they do a good job of hiding themselves. This of course makes wildlife tourism like beaver safaris challenging. In this case, there are typically one or two beaver families in this particular area, a protected little island near one side of the river, downstream from a rapids. It is a perfect beaver spot, so Stefan knows that most of the time beavers will be there, but nothing is guaranteed. So I feel privileged to have been able to see beavers at work in Västerbotten.


— Dolly Jørgensen, The Return of Native Nordic Fauna


Peter Hobbins awarded the 2016 Merewether Scholarship

University of Sydney historian Dr Peter Hobbins has been awarded the 2016 Merewether Scholarship by the State Library of New South Wales for his project ‘Curios and curiosity: James Bray and the sunset of amateur science in colonial Sydney’. Carrying a $12,000 stipend, the scholarship was inaugurated in 2008 to facilitate research and public engagement […]

— Australian and NZ Environmental History Network


The “scholarly blog”: PressForward’s many paths to success, and how to measure them

 By Kimberly Coulter

It has been five months since Ant Spider Bee relaunched its site with the WordPress web aggregation and publication plugin PressForward. Thanks to a generous grant from the Alfred P. Sloan Foundation, we have been able to pilot this tool as a partner of the Roy Rosenzweig Center for History and New Media. PressForward helps us review a collection of relevant RSS feeds, nominate the posts we deem of greatest interest to our readers, and repost excerpts. By doing the aggregating work of the “ant,” it frees us to be the “bee,” leaving more resources for digestion and cross-pollination.
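For readers curious about the mechanics, the sketch below illustrates the kind of feed-polling that PressForward automates. It is a minimal Python illustration using the feedparser library – not PressForward’s actual implementation – and the feed URLs and keyword filter are hypothetical.

```python
import feedparser  # pip install feedparser

# Hypothetical feed list; PressForward manages its own inside WordPress.
FEEDS = [
    "https://example-envhum-blog.org/feed/",
    "https://example-climate-history.org/rss.xml",
]

def gather_candidates(feeds, keyword):
    """Poll each RSS/Atom feed and collect entries whose titles match a keyword."""
    candidates = []
    for url in feeds:
        parsed = feedparser.parse(url)
        for entry in parsed.entries:
            title = entry.get("title", "")
            if keyword.lower() in title.lower():
                candidates.append((title, entry.get("link", "")))
    return candidates

# Human editors then review these "nominations" by hand;
# the curation step itself is not automated.
for title, link in gather_candidates(FEEDS, "digital humanities"):
    print(f"- {title}\n  {link}")
```

The human review step is the point: aggregation narrows the field, but editors still decide what is worth featuring.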

“Digestion” and “cross-pollination” are two main functions of scholarly blogs: electronic publications that may build community and curate, contextualize, or comment on issues in a field of study. How a tool like PressForward serves scholarly communication – in particular, blogs that engage an academic community with relevant news alongside original “gray literature” (meaning work that is not peer-reviewed) – was a focus of the PressForward Institute for Scholarly Communication at George Mason University on 13–14 August. We enjoyed meeting RRCHNM staff and learned how to better manage our feeds and share them with others. It was great to meet the other three original pilot partners [PLoS, MicroBEnet, CitizenScienceToday (Zooniverse)] and other participants (including Woods Hole Marine Biological Laboratory and the Association of College and Research Libraries’ dh+lib).

In these five months since adding PressForward we’ve seen our traffic increase 138%. Still, we ask: what does a successful scholarly blog look like for us, what role does PressForward play, and how can we measure this success?

Models

As PressForward project co-director Stephanie Westcott pointed out, attracting researchers’ online engagement means different things to different communities. PressForward aims to do this by semi-automating information-gathering, and by making it easy for community members (“editors-at-large”) to participate in information curation. PressForward’s “test blog” is Digital Humanities Now, to which visitors turn for job postings, calls for proposals, news, and conversations relevant to digital humanities. The editors-at-large (particularly graduate students) view the work as career-relevant service. RRCHNM’s Lisa Rhody reported that the “editor’s choice” featured post has even become a quality stamp; a selected author sometimes includes the republication on his or her CV.

“Editor’s Choice” feature in PressForward’s test publication, Digital Humanities Now.

While an aggregated feed may be presented as a stand-alone offering – as in Digital Humanities Now or Jon Christensen and Ursula Heise’s attractive blog Environmental Humanities Now, both of which generally feature “editor’s choice” posts selected from the aggregated feed – we also discussed other models. These alternative models generally fell into three categories: using the feed to 1) augment an original content publication with related postings that together may frame problems and offer resources; 2) emphasize community curation; and 3) contextualize a large body of scholarly literature.

At the workshop, pilot partners and others described diverse goals for using the aggregated feed. Using a dynamic feed to augment a regular original publication is what we’re trying to do with Ant Spider Bee: ultimately creating a collection of short essays reflecting on the role of the digital in environmental humanities. We hope our feed not only serves its scholarly community but also draws it to the site. Ideally, feed content should inform our original content as well, making it more current and engaged. Community building is a tacit goal of most scholarly blogs, but as a focus, it has potential for great impact. By casting a broad and inclusive net, presenting cool new projects from the community as news, and analyzing them – ideally in visual ways – it’s possible to use PressForward to explicitly cultivate and strengthen a scholarly community. Also popular are aggregated opportunities and job postings, as emphasized in Digital Humanities Now and dh+lib. Rosalind Reid, executive director of the Council for the Advancement of Science Writing and project director of New Horizons in Science, spoke about plans for a curated Compendium of Best Science Writing. In cases of large scholarly corpora with many users, it’s possible to use aggregated feeds to pull in materials that contextualize peer-reviewed papers. The Public Library of Science (PLoS) has ambitious plans to create a blog for each thematic journal collection, to which PressForward feeds would aggregate related news and discussions.

Scholarly blog editors need expert understanding of the field, but also understanding of its experts’ information-seeking and communication behaviors. In general their role serves a “meta” research function: contextualization, curation, and programming. In contrast to peer review’s stamp of experts’ quality assurance, content curation may mean an expert’s assemblage of objects to narrate or interpret a story. While scholarly blogs generally do not perform rigorous peer review, they may influence a field’s direction by defining its problems, programs, and roles (see Michel Callon on “Domestication of the Scallops”). Aggregation is a convenience, made possible by technology, that enables curators to cast a wider information net and filter it more finely. Curation itself cannot be automated, and it can serve a powerful function.

In addition to taking in the big picture and influencing the framing of issues, scholarly bloggers are often invested in a project’s extra-disciplinary, educational, or public value (open access, etc.). Yet there is usually a mismatch between this work and traditional metrics. How can we measure the impact of such work?

Metrics

Different stakeholders measure the value of scholarly work with different metrics. It may be important that the work is heavily cited, gains funding, is adopted by policymakers, reaches the public, picked up by mainstream journalism, informs public debate, builds community, provides visibility for the field, or even sells books, magazines, or museum tickets. Thus it is essential to be clear about the goal before choosing what to measure. For example, Google Analytics offers tools more oriented to measuring stickiness (keeping users on a page) and sales, while, as Rhody points out, a successful feed aggregator should have a high “bounce rate,” showing the site’s success in directing users to relevant external content.
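To make that last point concrete (my illustration, not Rhody’s): a bounce rate is simply the share of sessions that view only one page before leaving, so for a site whose job is to send readers to external content, a high value signals success rather than failure. The numbers below are made up.

```python
# Illustrative bounce-rate arithmetic with made-up session counts.
single_page_sessions = 8_200   # visitor landed once, then clicked out to an external post
total_sessions = 10_000

bounce_rate = single_page_sessions / total_sessions
print(f"Bounce rate: {bounce_rate:.0%}")  # 82% -- healthy for a feed aggregator
```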

“Alternative” metrics measure high quality, high impact digital and other nontraditional outputs that may not be captured by traditional metrics. Tools like surveys might help capture “soft” outcomes: resulting collaborations, incorporation of recommendations into policy, public engagement, journalistic coverage, media appearances, speaking engagements, invitations to editorial roles, invitations to contribute to an edited product, growth in the visibility of a community or a field, and public education. In a metrics panel discussion on the workshop’s second day, Robin Champieux of Oregon Health & Science University introduced some alternative metrics resources, but also pointed out that those committed to work with nontraditional value may need to do their own curation work to document its success with alternative metrics; a great deal of contextual knowledge is necessary to quantify alternative outcomes.

For measuring impact in the humanities, web-sourced data can only grow in importance, as Stacy Konkiel writes in her post “We’re overdue on altmetrics for the digital humanities.” The Altmetric Explorer is one tool that aggregates web-based attention data, including an API connecting to large policy indexes to track the adoption of ideas into policy. It lets users monitor, search, and measure conversations about publications in blogs, news, Wikipedia, or Twitter, offering an alternative to traditional journal-to-journal referencing via citations. It does, however, charge a 3,000 GBP annual license fee for a team of five. Another tool that helps scientists share the diverse impacts of their work is Impactstory, which measures things like code forks and how frequently work is saved, viewed, and discussed. Users can contribute data from tools like Figshare, which makes research outputs sharable and citable; Publons, which hosts and aggregates open peer reviews; or LinkedIn’s presentation sharing service SlideShare.

“The Altmetric Explorer demonstrates that articles from the Journal of American History are consistently referred to in blogs (yellow), on Twitter (light blue), in mainstream news sources (red) and on Wikipedia pages (dark grey).” Fran Davies, “Not just science articles – tracking other disciplines and other research outputs,” August 6, 2015. http://www.altmetric.com/blog/
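As a sketch of what querying such a service involves, the snippet below fetches attention data for a single article from Altmetric’s public v1 API. This is my illustration rather than anything from the workshop: the endpoint is the one Altmetric has documented publicly, but the response field names should be treated as assumptions and checked against current documentation. The DOI is that of the Karl et al. paper cited in the first item of this digest.

```python
import requests  # pip install requests

def altmetric_attention(doi):
    """Fetch Altmetric's attention summary for one DOI.

    Returns parsed JSON, or None if the article has no recorded
    attention (the v1 API answers 404 in that case).
    """
    resp = requests.get(f"https://api.altmetric.com/v1/doi/{doi}", timeout=10)
    if resp.status_code == 404:
        return None
    resp.raise_for_status()
    return resp.json()

# DOI of Karl et al. 2015, "Possible artifacts of data biases..."
data = altmetric_attention("10.1126/science.aaa5632")
if data:
    # "score" and "cited_by_posts_count" are assumed field names;
    # verify against Altmetric's current API documentation.
    print("Attention score:", data.get("score"))
    print("Mentions tracked:", data.get("cited_by_posts_count"))
```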

Alternative metrics can help to differentiate a candidate in a competitive field; they may lead to, or indicate, strong networks that lead to success in hard metrics like funding. Citation records take a long time to build. The web can be complementary, providing evidence of more immediate impact.

While it’s important to document successes, and to lobby universities to recognize alternative media scholarship for tenure and promotion, that is not the only avenue for thinking about “what counts.” As tenure-track positions—even tenure itself—become less and less common, early career scholars are well-advised to broaden their avenues. As one workshop participant concluded, even if it’s not about tenure and promotion, we will always be serving different masters. Building a community, connections, visibility, and exchange of ideas keep us adaptable and focused on our missions.

We came away from the workshop with a long list of ideas for Ant Spider Bee. To name a few: we’ll be pruning our RSS feeds and welcome your recommendations of further feeds to follow; deploying a PressForward outbound OPML file to make it easy for our users to subscribe to the same feeds we follow; investigating alternative metrics tools; considering an ISSN for our featured posts; and, starting in October, I’ll be teaching Digital Environmental Humanities at the University of Munich and asking my students to contribute to Ant Spider Bee‘s PressForward feed curation as “editors-at-large.” We hope Ant Spider Bee’s feeds are relevant for its audience, and that our posts influence the thoughtful framing and use of digital tools in our fields. We would be delighted if they made a small contribution to the advancement of digital and nontraditional scholarly projects and professional recognition for their authors. Whether or not we succeed in measuring this, we find the effort worthwhile.
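For anyone unfamiliar with it, OPML is simply a small XML format for exchanging lists of feeds, which most feed readers can import directly. The Python sketch below generates such a file by hand – a hypothetical illustration of what the format contains, not how PressForward itself exports it – with made-up feed entries.

```python
import xml.etree.ElementTree as ET

# Made-up feed list; an outbound OPML would enumerate everything we follow.
feeds = [
    ("Digital Humanities Now", "https://digitalhumanitiesnow.org/feed/"),
    ("Example Enviro-History Blog", "https://example.org/feed/"),
]

opml = ET.Element("opml", version="2.0")
head = ET.SubElement(opml, "head")
ET.SubElement(head, "title").text = "Ant Spider Bee feed list"
body = ET.SubElement(opml, "body")
for title, xml_url in feeds:
    # An <outline type="rss" xmlUrl="..."> element is the standard form
    # for a single feed subscription in OPML.
    ET.SubElement(body, "outline", type="rss", text=title, xmlUrl=xml_url)

# Writes a file any feed reader can import.
ET.ElementTree(opml).write("feeds.opml", encoding="utf-8", xml_declaration=True)
```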

Kimberly Coulter directs the Environment & Society Portal in Munich. The Portal is the digital publication platform and archive of the Rachel Carson Center for Environment and Society, a nonprofit joint initiative of the LMU (University of Munich) and the Deutsches Museum.

— Ant, Spider, Bee: Exploring Digital Environmental Humanities


Call for Proposals: The Anthropocene in Museums: Reflections, Projections and Imaginings

Deadline: 2 October 2015

— Rachel Carson Centre LMU - News and Events


Job: Research Associate at RCC

The Rachel Carson Center for Environment and Society is seeking a Research Associate to join the small team working on the Environment & Society Portal. This is a part-time (19h/week) position suitable for Ph.D. candidates. The position starts 15 November 2015 or as soon as possible thereafter.

The deadline for applications is 22 September 2015. Full details are available in the job specifications below or on the website.

— European Society for Environmental History