Why Study Energy?

Part I of an energy and research discussion for my parents

Abundant energy has shaped the modern world. It has enabled wonderful innovations such as rapid and affordable travel, vaccines produced on an industrial scale, fertilizer for our crops, an elevated standard of living for billions of people, and the Information Age, to name a few. Fossil fuels make up the majority of the abundant, easily accessible energy we have consumed to get here. While our standards of living have been elevated, the aggressive burning of fossil fuels has put us on a path toward severe climate change [1].

Technologies already exist that can significantly reduce our global energy use while delivering the energy we still need from clean, carbon-free sources. And, yes, this can be done while bringing power to the one billion people currently living in energy poverty.

A Brief History of Energy Use

Energy use has skyrocketed over the past two centuries. Over this same period, the composition of fuels and power sources we use has changed significantly. Prior to the 1850s, wood was the main fuel source. For the first half of the 1900s, coal dominated, but it was quickly outpaced by petroleum with the rise of the automobile. The 1970s saw the introduction of natural gas and nuclear power on a large scale.

From EIA “History of energy consumption in the United States, 1775-2009”

At the start of the 1970s, petroleum was set to continue its exponential climb. Instead, the global energy market was struck by the oil crisis of 1973. The U.S. federal government enacted sweeping programs to drive down energy use while keeping the economy humming along. This was the introduction of energy efficiency as a staple of U.S. energy strategy [2].

Two versions of energy intensity: the cyan “Energy Intensity Index” developed by the U.S. Office of Energy Efficiency and Renewable Energy, and the green ratio of energy use to GDP, “E/GDP” – Reference [3]

It is difficult to disentangle the effects of a growing population, an expanding economy, and a transition away from heavy industry in a single chart. The chart above shows that “Energy Intensity,” the energy used to create a unit of economic value, has been decreasing in the U.S. for many years. However, after the aggressive efficiency policies of the 1970s were enacted, Energy Intensity fell faster than before.

Another way of viewing energy use is the amount of energy used per capita. Historically, total energy use per person in the U.S. increased every year until the 1970s. Since then, use per person has been steady or slightly declining.

It is worth noting that the U.S. has outsourced a significant portion of its heavy industry as it transitions toward a service-based economy. Regardless, energy use per capita and energy intensity are both helpful indicators of the efficacy of coordinated energy efficiency policy at the national level.

A re-invigoration of coordinated energy efficiency policies would help further decrease Energy Intensity and reduce the amount of energy we need to supply from carbon-free sources.

Carbon Free Energy

Nuclear energy and large scale hydroelectric power have been staples of the U.S. electric system for many decades (see first figure). Solar and wind power are relatively new to the U.S. energy portfolio. These four technologies, plus biofuels, make up two different categories of carbon free energy.

  • Predictable power sources whose output can be increased or decreased as needed (nuclear, hydro, biofuels)
  • Intermittent sources whose output is controlled by the weather, not customer needs (solar and wind)

The above chart from the EIA shows carbon free energy production by source and is part of their annual energy review. Solar and wind energy have begun a rapid rise since the turn of the millennium.

The rapid increase in solar and wind energy is on a collision course with the way electric utilities traditionally operate their grid. Intermittent solar and wind challenge operators to deliver continuous, reliable power despite their fluctuations. Batteries and other storage technologies are being researched, developed, and continuously improved to help smooth out these difficulties.

Currently, in places with lots of installed solar power, electricity is stored in batteries when it is sunny and discharged back into the grid when large clouds reduce solar panel output or during the night. Many new wind power installations also include batteries to help smooth out fluctuations.
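As a toy illustration of this charge-and-discharge behavior, here is a minimal greedy dispatch sketch in Python. All quantities (hourly solar output, demand, battery capacity) are made-up numbers for illustration, not data from any real grid:

```python
# Toy battery dispatch: charge when solar exceeds demand, discharge otherwise.
# All numbers are illustrative, not from any real grid.

def dispatch(solar, demand, capacity):
    """Greedy hourly dispatch; returns battery state and unmet demand per hour."""
    stored = 0.0
    states, unmet = [], []
    for s, d in zip(solar, demand):
        surplus = s - d
        if surplus >= 0:
            stored = min(capacity, stored + surplus)  # charge with excess solar
            unmet.append(0.0)
        else:
            draw = min(stored, -surplus)              # discharge to cover the gap
            stored -= draw
            unmet.append(-surplus - draw)             # shortfall the battery missed
        states.append(stored)
    return states, unmet

# A sunny midday followed by a cloudy hour and nightfall (MWh, illustrative)
solar  = [5, 6, 1, 0]
demand = [3, 3, 3, 3]
states, unmet = dispatch(solar, demand, capacity=4.0)
```

In this example the battery soaks up the midday solar surplus and covers most of the later gap, but the final hour still falls short, which is exactly why sizing storage against real hourly demand data matters.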

To enable a large-scale energy transition away from carbon-intense sources towards carbon-free sources, we need to figure out the right mix of intermittent renewable energy, other clean sources, and storage technologies to create a reliable grid. This is the central focus of my current research with Ken Caldeira at Carnegie Science.

  • [1] IPCC Working Group 3: Fifth Assessment Report “Summary for Policy Makers” https://science2017.globalchange.gov/downloads/CSSR_Ch1_Our_Globally_Changing_Climate.pdf
  • [2] U.S. Office of Energy Efficiency and Renewable Energy, “Energy Intensity Indicators”, Accessed 14 October 2019, https://www.energy.gov/eere/analysis/energy-intensity-indicators

NREL OpenMod Workshop

Last week the National Renewable Energy Laboratory (NREL), the United States’ epicenter of solar and wind research, hosted the first openmod workshop in North America. The workshop brought together academics, researchers, and open software and data enthusiasts to discuss the state of open-source energy modeling.

We discussed a broad range of models. At one end were extremely detailed models, excellent for understanding how the electric grid will respond to changing weather patterns that alter renewable energy availability in the coming hours or days. At the other end were models like ours at Carnegie Science, which focus on energy transitions over 50 years to a century.

Beyond interest in models, some groups focused on model inputs and making data available. The Catalyst Cooperative is gathering data from disparate sources into an open communal database for everyone to use. This effort is part of their Public Utility Data Liberation (PUDL) project.

In line with the communal data theme, I presented a 7-minute lightning round talk on the electricity demand project Dave Farnham (@farnham_h2o) and I have been working on. The project focuses on making publicly available electricity demand data usable by everyone and was the subject of a previous post.

Altering data can be contentious. And it should be, if there is no well-defined method for identifying the data to be replaced and deciding how to replace it. Because of this, I initially thought there would be some opposition to our work. After all, a 7-minute talk is not enough time to allay everyone’s fears.

There was support for our approach once the workshop participants saw the magnitude of the anomalous deviations we target. One participant needed much more detail than was possible in my 7-minute talk. We invested the additional time discussing the details, and it paid off: in the end, he expressed his support for our method.

A recording of the talk is not yet available; I hope to post a link in the coming weeks. For now, please see the linked slides if you are interested.

Electricity Demand

Generating electricity to power our industries, schools, hospitals, and modern lifestyles consumes 40% of all primary energy in the U.S. At Carnegie Science, we are studying what paths the electricity system could take to reach net-zero carbon emissions in the future.

It would be incredible to have a clean, 100% renewable wind and solar electricity system. However, there are real challenges in meeting energy demand at all hours because the sun does not always shine and the wind does not always blow. These hurdles can be overcome with smart choices in energy storage and with planning informed by careful study of the variability of wind and solar resources.

At Carnegie Science, we have built a computer model of a simplified energy system to study net zero emissions systems. Any energy system our model designs must be able to supply electricity to meet the desired consumption of the U.S. for every hour of every day in the future. To begin to understand what is required, we use historical hourly electricity demand as one of the model inputs.
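That hourly requirement can be stated as a simple check: a candidate system is feasible only if supply covers demand in every single hour. A minimal sketch, with hypothetical numbers standing in for the model's real inputs:

```python
# Hourly reliability check: a candidate energy system passes only if supply
# meets demand in every hour. All values are illustrative placeholders.

def meets_demand(supply, demand):
    """True if supply covers demand for every hour of the series."""
    return all(s >= d for s, d in zip(supply, demand))

supply = [10, 12, 9, 11]   # hypothetical hourly generation (GW)
demand = [9, 11, 10, 10]   # hypothetical hourly demand (GW)
feasible = meets_demand(supply, demand)  # the third hour falls short (9 < 10)
```

A single deficient hour makes the whole system infeasible, which is why the model needs trustworthy demand data for every hour, not just annual averages.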

One of my colleagues, David Farnham (@farnham_h2o), and I are working on preparing these historical electricity demand data for our model. The U.S. Energy Information Administration (EIA) graciously collects hourly information from the utilities across the U.S. and publishes that data for analysis and use by the public.

However, we are all at the mercy of the reporting practices of each utility. If utilities report outrageous numbers, the EIA publishes outrageous numbers. And, when these numbers are used in an energy model, they can lead to wild results.

David and I have been developing algorithms to identify these anomalous values. After identifying anomalies, we replace them with a best estimate of the true value. A great example of some strange values can be seen in the below graphic, which shows the hourly electricity demand for the PacifiCorp West service territory over 10 December days in 2016.

Electricity demand during 10 December days in 2016 in the PacifiCorp West service territory of the U.S. Data pulled from EIA database Sept. 3, 2019.

Even without any background knowledge of what electricity demand should look like, the problem region jumps out immediately: demand increases by a factor of 7 for 24 hours compared to the surrounding data. There is also a sudden one-hour drop in demand, which we likewise flag as anomalous. Our brains are phenomenal at pattern recognition and at spotting regions that do not conform with their surroundings.

Imagine designing an energy system that had to provide electricity for those 24 anomalous hours. You would build a system 7 times larger than what is needed for the rest of the year. Utility ratepayers would be up in arms.

We could visually check all 56 reporting regions in the U.S. across all four years of hourly data, but that is 56 regions × 4 years × 8,760 hours per year = 1,962,240 data points! Instead, we devise algorithms to scan the data for us.
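As a toy sketch of this idea (not our actual published method), a rolling-median filter can flag both spikes and drops: any hour that deviates from the median of its neighboring hours by more than a chosen factor gets flagged and replaced with that local median. The window size and threshold here are illustrative assumptions:

```python
import statistics

def clean_demand(series, window=5, threshold=3.0):
    """Flag values far from their local median and replace them with it.

    A value is anomalous if it exceeds `threshold` times the median of its
    neighbors within `window` hours, or falls below that median divided by
    `threshold`. Window and threshold are illustrative choices.
    """
    cleaned = list(series)
    flags = []
    half = window // 2
    for i, value in enumerate(series):
        lo, hi = max(0, i - half), min(len(series), i + half + 1)
        neighbors = series[lo:i] + series[i + 1:hi]  # window, excluding hour i
        local_median = statistics.median(neighbors)
        if local_median > 0 and (
            value > threshold * local_median or value < local_median / threshold
        ):
            flags.append(i)
            cleaned[i] = local_median  # best estimate: the local median
    return cleaned, flags

# A made-up series with a spike and a one-hour drop, echoing the PacifiCorp
# West example (units are arbitrary)
demand = [3, 3, 4, 21, 0, 4, 3]
cleaned, flags = clean_demand(demand)  # flags the spike (hour 3) and drop (hour 4)
```

Our real algorithms are more involved than this, but the core pattern is the same: define what "normal" looks like from the surrounding hours, flag deviations beyond a defensible threshold, and replace them with a principled estimate.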

A good algorithm is reusable. We are putting in extra effort now to design the best algorithms possible for the task, with reusability in mind. In 6 months, when there is a new 6-month chunk of data, we will simply run our code to clean it up and share the results with colleagues. David and I plan to publish our techniques and make the cleaned data available to everyone.

In two weeks, I will be sharing our techniques at the upcoming Open Energy Modeling workshop at the National Renewable Energy Laboratory. I hope the intense effort we put into this work leads to a data product that other research teams can use in their modeling.