The Policy Analyst’s Version of Guy Kawasaki’s 10/20/30 Rule—and Ideas from Tufte and Cohen

Guy Kawasaki’s now-classic article “The 10/20/30 Rule of PowerPoint” provides three simple and powerful rules: “a PowerPoint presentation should have ten slides, last no more than 20 minutes, and contain no font smaller than thirty points”—hence 10/20/30.

These rules are helpful for almost anyone who must create a slide presentation—but for policy analysts and researchers, I’m going to argue for two tweaks: first, your deck should be 12/20/24; and second, the topics you cover should answer a different set of questions.

Let’s start with the number of slides. Why do I argue for 12? Because this allows you to answer the key questions that almost any kind of policy analysis study needs to address. These are:

  1. What is the issue?
  2. Why does the issue matter? To whom and under what circumstances?
  3. What’s the background on this issue?
  4. Where are the “problems” and/or gaps that are most important to consider, and why?
  5. What is your research question?
  6. What is your analysis strategy and why did you choose it?
  7. What data will you analyze and why did you choose these data?
  8. What are your findings?
  9. What are your policy recommendations and how do they follow from your findings?
  10. What are the limits of your study?
  11. What is the next step required to reduce uncertainty or begin implementing?
  12. In summary, what are the research question, key findings, and recommendations—and how do I reach you?

A title slide is usually a good idea and, in some cases, I also include a references slide, or even have additional supporting data and figures at the end—but with these caveats, a good analyst should be able to cover all the bases in 12 core slides.

And, in fact, you can combine almost any sequential pair of the first 11 slides above into a single slide, which reduces your overall slide count as needed. The exact number of slides should be driven by the content, but a rule is a heuristic to make things easy—so aim for no more than 12.

Kawasaki recommends that you present your slides in 20 minutes—which leaves the rest of a typical hour meeting for questions and discussion. There’s good science backing this general point: We are limited in our ability to understand more than a handful of concepts, and it’s difficult for the average adult to focus for more than about 20 minutes.

And so, even though the context is quite different, 20 minutes is a good goal to aim for. If you need to create a longer presentation, try to break it up into 10- to 20-minute segments and leave time for questions and discussion in between.

The last part of the 10/20/30 rule is that text on slides should be no smaller than 30-point font—but, Guy allows, if that’s too “dogmatic” you can take “the age of the oldest person in your audience and divide it by two” to find the optimal font size.

Since the resolution of the average slide presentation system has increased quite a bit since Kawasaki’s essay was written in 2005—and since analysts sometimes must be a bit wordier—I’m going to argue for making your text no smaller than 24-point, although I usually start at 28-point for bullets and 36-point for slide titles.
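If you like your rules of thumb concrete, here is a toy sketch in R (my analysis tool of choice, which I discuss in a later post). The function name is mine, and the 24-point floor reflects my tweak, not Kawasaki’s original rule:

    # A toy sketch of the font-size heuristics above. The "age / 2" rule
    # is Kawasaki's; the 24-point floor is my tweak for analyst decks.
    suggested_font_size <- function(oldest_audience_age) {
      max(oldest_audience_age / 2, 24)
    }
    suggested_font_size(70)  # 35 points
    suggested_font_size(40)  # 24 points -- the floor kicks in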

Here, too, the exact size doesn’t matter—especially because fonts vary widely in how dense they are. But Kawasaki’s observation resonates with my experience in policy organizations and at conferences: Presenters often use small font because they either “don’t know their material” and need to read the text, or they think that more text is better.

Both are terrible reasons to write a lot of text—but analysts face additional challenges. First, subject matter experts (SMEs) are prone to “data dump” syndrome; second, analysts are often not great writers.

A single blog post can’t turn you into a top-notch scribe (although resources abound online), but aim for what James Salter called the lightning rod of “brevity, clarity and wit…” And note that wit doesn’t necessarily mean humor; it also means “keen perception” and the ability to make connections between ideas.

To avoid data dump syndrome, you should always ask “what does this audience need to know to understand my argument?” In a typical project, you may run many tests and generate a lot of data, but when you make your case, prune these branches—after all, you can always link to other documents or provide additional slides after the conclusion.

Are slides the answer? What’s the question?

Kawasaki’s advice is sage, but slideware poses deeper problems as well. Edward Tufte’s fantastic essay The Cognitive Style of PowerPoint argues, most fundamentally, that complex thoughts and ideas are sometimes resistant to bullet points. Since a typical slide can’t contain more than about 40 words, presenters often feel the need to create slide after slide to express a concept. This linear and segmented approach, what Tufte calls “one damn slide after another,” harms our ability to make connections across topics and understand context.

Did an endless array of nested bullets and eye-glazing text cause the Columbia Space Shuttle disaster? Whatever the case, when you have to present a deck, take Tufte’s advice to heart: For each idea you want to communicate, ask yourself “Is there a map, diagram, photo, graphic or figure that will provide more insight or understanding than a set of bullets?” If the answer is yes, then create a simple visualization that avoids “chartjunk”—and instead of projecting your text, tell it to your audience.

Since Kawasaki and Tufte agree that you shouldn’t read from your slides, how should you keep track of your notes? Most major packages provide a notes area, but I find that once I have written and revised my notes, I don’t need to look at them. However, it’s always a good idea to write your notes down as bullet points and keep a copy close at hand.

Let me close this essay by adding on some advice from Steve Cohen, Executive Director of Columbia University’s Earth Institute: For important meetings, always print one copy of your deck for each attendee—and print a copy for yourself with your notes.

If the meeting size is large, or you want to conserve precious resources, then print at least two copies—one with your notes and one with just the slides. This backup copy can be used to show visuals in a small meeting, or it can be run through a copy machine at the last minute. I would add that you should always have your deck available as a PDF along with the native file in case of software issues. And even though cloud systems are ubiquitous, keep a backup in your email or, ideally, on a thumb drive as well. Even in our hyper-connected age, technical glitches crop up all the time.

For many topics and in many settings, presenting slides is a fact of life. Policy analysts and social science researchers should follow my 12/20/24 rule—but keep Tufte in mind and always ask how you can visualize text. And don’t forget Dr. Cohen’s advice: When the computer crashes and the projector can’t connect, paper can save the day.


Trump and Perry Place Clean Energy in Their Crosshairs

The Trump administration has largely failed to accomplish the major goals that Mr. Trump campaigned on—his travel ban has been blocked, the nuclear deal with Iran remains in effect, and GOP efforts to “repeal and replace Obamacare” are in utter shambles.

The one area where Trump and his GOP colleagues have been effective: Gutting programs that protect the environment, and repealing rules that keep our skies clear and our waters safe.

When I looked at the Trump Administration’s inchoate efforts late last year, I was deeply troubled to see the rogue’s gallery that he had put in charge of shaping climate policy and running the EPA—and they haven’t disappointed those who favor profits over the environment.

The one bright spot I saw was that federal subsidies for clean energy seemed safe—but now it looks like that will change. The New York Times reports today that the Department of Energy has begun a study to examine the impact of wind energy subsidies on the grid—and if the early moves of the Trump administration and Secretary Perry are any indication, advocates of clean power should start girding for a fight.

Particularly disturbing is the person in charge of the study: Travis Fisher, who wrote that clean energy policies are a greater threat than terrorism. Even Republican Senators, such as Chuck Grassley, are concerned by the seemingly slap-dash nature of these “research” efforts.

Why fight for clean energy subsidies like the PTC and ITC? Simple: Because fossil fuels continue to gobble up a staggering amount of subsidies, even as they cost us billions in healthcare and environmental damage.

Estimates by the International Monetary Fund find that fossil fuels continue to take over $5.3 trillion a year globally in taxpayer money—despite all the damage they do to our lungs and environment. Although renewables also receive subsidies (for the moment), the most rigorous independent study finds that, worldwide, fossil fuels still receive more than four times the support of renewables.

Don’t be fooled by the obfuscation: Wind, solar and so-called “intermittent” sources of renewable energy, when properly balanced and integrated, have not harmed grid reliability—they have improved it.

[Figure: Grid reliability improves as wind power grows]

And although wind power is approaching cost-parity with natural gas in some places, the industry needs a few more years of assistance to be fully competitive. The broader point is that clean power plants improve public health and the health of the American environment.

Some researchers who have modeled different scenarios point out that the combination of a carbon tax with renewable energy subsidies would be the most effective way to make producers of dirty energy pay their fair share—and allow clean sources of power to be competitive with their pollution spewing cousins.

So if any conservatives are in favor of a 1-to-1 switch—let’s work with them and make the change. However, if the only goal of the current administration is to favor the dirty power producers of years past, then we should get ready to fight back by making a clear public case: Investing in clean energy creates high-paying American jobs, reduces asthma and protects public health, lowers power costs for consumers, and protects our environment.

What Does Trump Mean for Renewables and Climate Policy?

As the United States prepares for the inauguration of Donald Trump, those who work on renewable energy and climate change policy are fearful—and with good reason.

The good news is that solar, wind and renewable energy (RE) will continue to grow in the near term—the bad news, for those of us working to foster a clean energy transition, is that President-elect Trump is an avowed climate-change denier who has assembled a radically anti-environmental cabinet. The incoming administration looks ready to savage the Paris Accord, slash federal clean-tech research, and destroy decades of bipartisan environmental progress.

But before we go dark, let’s look at some good news: wind and solar have become the fastest growing sources of new power in the US. In the best locations, electricity from utility-scale wind is approaching cost-parity with natural gas and coal. And some energy analysts expect that solar and wind will become cheaper than even coal-fired power plants in the next decade.

[Figure: bloomberg_power]

More good news on the electricity side: The challenges of integrating intermittent renewable generators into the grid are fading in the face of a more flexible power system that uses demand management, diverse resources linked by transmission, fast-ramping natural gas generators, and, increasingly, grid-scale batteries and storage.

State Renewable Standards and Federal Tax Credits Should Survive

At the state level, we see evidence that intelligent deployment and integration of utility-scale and distributed resources can blow through old ceilings: California now receives 27% of state power demand from renewable sources and is on track to reach 33% by 2020 and 50% by 2030. Smaller, distributed energy resources (DERs), usually located close to the customer, are also proving to be a boon to clean energy.

Another piece of good news is that the two primary ways we provide incentives to encourage RE in the US—state-level renewable portfolio standards (RPS) and federal tax credits—seem secure (for the moment).

RPSs are state-level requirements that utilities generate or procure power from a wide range of renewable sources; 29 US states have put these programs into place, and an additional eight have adopted weaker RE goals.

Sure, certain states, such as Kansas, have turned their RPS into voluntary goals, but despite a major Koch-funded effort to go after renewable standards, state requirements for renewable power have proven difficult to reverse—even in Republican-dominated states. And not only are these laws looking secure, but in many large states (e.g., California and New York) the standards have been increased.

[Figure: DSIRE map of state renewable portfolio standards, 2016]

At the federal level, the two major programs that support clean power, the production tax credit (PTC) for wind and the investment tax credit (ITC) for RE equipment more generally, were renewed last year by the GOP-led Congress. Because these tax credits enjoy bipartisan support, the new administration seems unlikely to repeal them—but of course this could change.

Ok, that’s the good news; now, let’s turn to the scary stuff.

Big Threats to Renewable Research and Environmental Protection Loom

The first fact we need to acknowledge is that despite a quick meeting with Al Gore, Trump has nominated a rogue’s gallery of climate-deniers, fossil-fuel advocates and anti-environmentalists for key roles in his incoming administration. The most disturbing choices include Scott Pruitt as head of the EPA and Rick Perry as Secretary of Energy—but other key roles are also set to be filled by anti-environmentalists.

At the international level, Trump’s stated intention to withdraw from the Paris Accord could greatly complicate global efforts to reduce carbon emissions—but with China, India, the EU and 122 other nations having ratified the agreement, the US will face stiff international pressure (and perhaps even tariffs) if it does so.

Looking out a few years, the most dangerous threat posed by the incoming administration is also the least visible: the destruction of US federal research on the scientific underpinnings of climate change, and the evisceration of support for clean-tech research.

The pre-inauguration moves by the Trump Administration to identify federal employees involved with climate change are chilling and point towards a regime that would rather hide under obfuscation than understand the science—but at least scientists are rallying now to protect important datasets.

It will take some time, but, if their goal is to attack federal clean-tech research, the Trump Administration will be able to savage support for critical programs at the National Labs, the Department of Energy (DOE), the National Oceanic and Atmospheric Administration (NOAA), the Department of Defense (DOD) and, perhaps, even NASA. If this happens, it will not only harm the scientific research that we need to understand Earth’s systems, it will disrupt development of advanced technologies that private companies are not yet able to field.

It’s possible that other governments will step up their funding for global climate change (GCC) research. It’s also possible that private research efforts, such as the one recently announced by the Gates Foundation, can take up some of the slack—but nothing can currently replace the amazing work being done by NOAA, NASA, DOD and the many labs affiliated with DOE.

The US is a high-energy society that will require plenty of fossil fuels for years to come—and so programs to improve the safety and efficiency of shale gas or domestic oil could be beneficial. But as our global peers, like China, pour billions of dollars into clean energy, slashing US research efforts to support clean energy would be foolish in the extreme. Hopefully the incoming administration will realize that these research and policy efforts not only protect our environment, they protect our economy and create sustainable US jobs.

Cheer Up! The World of DERs will be Wonderful!

A couple of months ago, I was in San Francisco for the California Distributed Energy Future Conference, and it seems fair to say that the field of distributed energy resources (or DERs, as we energy nerds say) is burgeoning. With an apology to science writers of the 1950s, I want to dispel some of the hype and see what’s really going on.

In this post, I’m going to look at some of the trends that are driving DER growth—and then offer a few cautions. Distributed resources could meet at least half our needs according to new research—but we should also keep in mind the time it takes to deploy new technologies and the potential impacts from a highly distributed power system.

Trends in DER Technology and Regulation

First, I want to underline some positive trends: a new study by the National Renewable Energy Laboratory (NREL) finds that in every state in the lower 48, at least 70% of buildings are suitable for rooftop solar arrays. And in California, Florida and Michigan—and across the Northeast—at least 45% of total power demand could be satisfied by in-state rooftop solar. Other states have less potential, but every state has at least some.

Of course rooftop solar is only one kind of DER (I made an infographic listing the major kinds of projects). Other technologies also hold great promise: battery systems will allow us to meet peak demand with less generation; microgrids and combined-heat-and-power systems will improve both the resiliency and the efficiency of the grid; and demand response systems will reduce our need for dirty “peaker” plants and improve the integration of variable renewable resources.

Many of these individual technologies have existed for decades—what’s really new is that advances in computer control are now allowing DERs to be remotely controlled en masse. We are looking at a near-term future where thousands of DERs of different types can be integrated and controlled in real time to create Virtual Power Plants (VPPs). VPPs will allow grid operators to respond to real-time power demand—and to the variable output of large-scale renewable projects—with flexible, local power resources.
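To make the idea concrete, here’s a minimal sketch in R of a VPP as nothing more than the coordinated sum of many small resources. The numbers and names are illustrative assumptions, not a real control system:

    # Toy illustration of a Virtual Power Plant: many small DERs,
    # each responding to a common real-time dispatch signal.
    set.seed(42)
    der_capacity_kw <- runif(1000, min = 2, max = 50)  # 1,000 small resources
    dispatch_fraction <- 0.6                           # ask each DER for 60% of capacity
    vpp_output_mw <- sum(der_capacity_kw * dispatch_fraction) / 1000
    vpp_output_mw  # aggregate output in MW: utility-scale, built from small parts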

The growth in distributed resources is being driven not just by technology but also by state mandates. For instance, in 2015 California passed a law requiring utilities to bring 12 GW of renewable DG onto the grid by 2020. Other states, including South Carolina and New York, have either created requirements for DERs or are considering them. States are mandating or supporting these projects because distributed resources can reduce power system pollution and reduce the need for costly and complicated power line construction projects.

Another reason why states are interested is the potential for DERs to improve grid resiliency. After Hurricane Sandy knocked out power to millions of people in the tri-state area, policymakers began paying a lot more attention to the potential for microgrids and DERs to keep the lights on when the larger power system goes down. So is the “soft path” for energy that Amory Lovins advocated back in 1976 finally coming to fruition? Will the future for DERs be wonderful?

A Few Notes of Caution

Yes, but—let’s not lapse into techno-utopianism. Having outlined the positive trends, I want to inject a few notes of caution based on the research I’ve done on technology diffusion and public opposition to energy infrastructure. What we know from the research on these two subjects: 1) the estimated technical potential of an energy resource is very rarely, if ever, realized; and 2) all energy technologies—even “green” ones—have impacts that may arouse local opposition.

We can use an analysis of past energy diffusion patterns to inform our understanding of today. Although technology diffusion paths can vary widely and change quickly, at the macro level the classic diffusion curve follows a logistic model and traces an S-shaped path (a minimal sketch follows the list below). Viewed from this perspective, technologies have three rough phases:

  1. A phase of “gradual diffusion” as the technology is introduced to the market and improved with the lessons learned from deployment
  2. A phase of “rapid, exponential growth” as the cost of the technology falls and performance improves
  3. And then a final phase where growth slows as the market becomes saturated (Wilson, 2012, p. 82).
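For the curious, here’s a minimal R sketch of that S-shaped logistic curve. The parameter values are purely illustrative, not estimates from Wilson (2012):

    # Minimal sketch of an S-shaped (logistic) diffusion curve.
    # K = saturation level, r = growth rate, t0 = inflection year.
    logistic_diffusion <- function(t, K = 100, r = 0.4, t0 = 2025) {
      K / (1 + exp(-r * (t - t0)))
    }
    years <- 2010:2040
    plot(years, logistic_diffusion(years), type = "l",
         xlab = "Year", ylab = "Cumulative adoption (% of potential)")

Early on, growth looks slow; near the inflection year it looks explosive; and then it flattens as the market saturates.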

One benefit of DERs that has sped deployment is that they are modular: DERs are usually based on components that are produced in large quantities in a factory environment. And, as Wilson (2012) shows, modular technologies (Wilson uses the example of compact fluorescent bulbs) are inherently easier to deploy than large, industrial-scale facilities—like a gas-fired power plant, which must be custom designed. DERs, properly placed, can also return monetizable benefits to grid operators in terms of balancing and power control services—and they can help utilities avoid costly and difficult transmission upgrades.

With that said, it’s also important to remember that all (existing) power technologies have impacts on someone, to some extent. For example, a rooftop solar array produces no pollution during operation and is far less disruptive to habitats than a greenfield solar project built in the desert. But rooftop solar can create impacts nonetheless (ignoring for the moment the impacts created by production and disposal), such as creating glare, blocking sunlight, and harming aesthetic views. Although these may seem like minor impacts, a person who used to look out onto a familiar cityscape or a fallow field, but now looks out on endless rows of panels, may feel otherwise.

Other problems could crop up from moving large amounts of generation from distant sources into urban settings. For instance, a microturbine or a combined-heat-and-power unit may be more efficient, but it also creates local air pollution that would not exist if the owner depended solely on grid resources. A battery installation doesn’t create air pollution, but it does create electric and magnetic fields (EMFs). Fears of EMFs are often overblown—but there is also evidence that exposure to high levels of EMFs can be a threat to human health.

Too often with new technology, the media inflates the potential for benefits and underplays the possibilities of harm. So it’s important to be realistic about the impacts of more distributed resources—and to realize that some people may react very negatively to projects. Instead of decrying this, we should anticipate it and let criticism inform both technological development and policy. This excellent research paper on DG by Pepermans and coauthors at the University of Leuven has a balanced discussion of the benefits and possible costs from DG. And this blog post by Timothy Brennan of RFF describes some of the economic reasons why utilities may prefer centralized generation. A full discussion of the costs and benefits of DERs as compared to centralized resources is beyond the scope of this post—but it’s clear that we need to continue to study this topic so that we can value the benefits from DERs properly.

Conclusion

In the “soft path” energy future envisioned by Amory Lovins, centralized generation resources would fade away and be replaced by smaller and more renewable resources. Although technology is making DERs ever more desirable, it will take time for these distributed resources to be deployed—and so, in the coming decades at least, it appears that a hybrid system will emerge. DERs will surely grow. And, appropriately planned, they hold great potential to reduce pollution and the need for new grid infrastructure while simultaneously increasing resilience to extreme weather. But it’s important to realize that power projects take time to build and always cause impacts—to someone, someplace. And so, as we move to reshape our grid, we should keep in mind that people who oppose projects should be treated with respect, and their concerns should be taken seriously.

The US Energy System: Visualizing the Lay of the Land

When I talk casually with people about energy, they often confuse the many sources and uses of energy, so I’ve written this blog post to provide a basic introduction to the US energy system and clarify a few key concepts.

Because the energy system of the United States is a complicated beast, let’s start with this amazing flow chart created by the Lawrence Livermore National Laboratory. Energy sources are shown along the left side, electricity generation is shown in the center, and the pink boxes on the right side depict energy use by economic sector.

[Figure: Estimated U.S. Energy Use in 2014 (Lawrence Livermore National Laboratory)]

Let’s first make a distinction between electricity and everything else. In the US, we generate electricity, which is a secondary form of energy, from coal, natural gas, nuclear and renewables—but we essentially no longer use oil for electricity. As the chart shows, the few exceptions account for less than 0.01% of generation. Thus the price of oil has no direct effect on the cost of electricity in the US.

Of course, the price of oil is important to the US economy—as the chart shows, petroleum is the largest single source of energy, and 70% of all oil used in the US goes to transportation. Other significant uses of oil are industrial products (e.g., asphalt, chemicals and lubricants) and space heating (mainly in the Northeast). The good news, not shown in the chart above, is that while the US imported 60% of the oil it used in 2005, this share fell to only 27% in 2014—the lowest level since 1985. However, we still use almost 7 billion barrels of oil per year, and over 25% of our imports come from nations that are either hostile or have dubious human rights records, so there’s certainly a lot of progress yet to be made.

Turning back to electricity, coal is still the single largest fuel used for electricity in the US, generating 39% of US power. Coal is a low-cost fuel that packs a huge amount of thermal value per unit, but it’s also destructive to produce and dirty to burn. Not only is coal the largest source of carbon dioxide, but an eye-opening report by researchers at Harvard Medical School estimates that coal combustion causes between $65 billion and $187 billion in impacts from air pollution alone. The EPA’s Clean Power Plan and cheap domestic natural gas are leading to the accelerated shutdown of older (and dirtier) coal-fired plants.

Natural gas is the second-largest source of energy, and it has found favor in a range of settings: it’s used to generate electricity, especially in the Western US, and is commonly used for space heating, water heating and cooking in the residential and commercial sectors. What may be surprising is that natural gas’s largest single use is in the industrial sector, to power factory processes and as a feedstock for fertilizer, plastic and other products.

The last source I want to discuss is renewable energy: Hydropower, wind, solar, geothermal and biomass are all individually relatively small contributors—but together they make up almost 10% of the total US energy supply. Renewables are also growing very quickly in the US and around the world. My dissertation looks in detail at the achievable potential of wind power, so I look forward to discussing that in a future post—but all I want to point out here is that renewables now exceed nuclear power as an overall source of energy in the US.

Nuclear power is a controversial, but still-important source of electricity in the US. Nuclear power is either the “most expensive way to boil water” as wags have called it, or the key to meeting the rapidly growing energy demands of an urbanizing world, as “ecomodernists” like those at The Breakthrough Institute have labeled it. Because nuclear power and the issues surrounding it are complicated, I’ll save a more detailed analysis for future posts.

Before I wrap this post up, I want to highlight one last thing from the chart above: the light grey traces depict “rejected energy,” which is energy lost in transmission, generation and waste heat. So the headline of this graph is that the US consumed 97.4 quads of energy in 2013 but produced only 38.4 quads of useful energy services. And this doesn’t account for inefficiencies by the end user, such as leaving the lights on, heating an empty building or driving in circles to find parking.
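The arithmetic behind that headline is simple enough to check in a couple of lines of R:

    # Useful energy services divided by total primary energy consumed,
    # using the figures from the LLNL flow chart above.
    useful_quads   <- 38.4
    consumed_quads <- 97.4
    round(100 * useful_quads / consumed_quads, 1)  # 39.4 percent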

In other words, a maximum of only 39% of the energy we produce is put to useful ends! This fact highlights the critical importance of energy efficiency: improving the efficiency of how energy is generated, transported and used. “Negawatts,” as the Rocky Mountain Institute has dubbed energy savings from efficiency, are not only by far the cheapest way to reduce pollution; efficiency can also have a host of follow-on benefits, such as more comfortable living spaces and higher-quality products. People focused on the international energy picture often emphasize that efficiency can’t help, for instance, the 300 million Indians who live without access to electricity. But even those in the more-energy-is-better camp would probably agree that putting barely 40% of what we produce to useful work—and losing the rest to inefficient production, transmission and distribution systems—is too low.

In the next post, I’ll spend some time discussing the impacts of the current energy system—the air pollution, water pollution, damage to ecosystems and other kinds of harm that are associated with activities like burning coal, pumping oil and mining uranium.

The World is on the Verge of a Massive Energy Transition

A few weeks ago, I got word that my dissertation was accepted. I’m excited (and relieved) to be receiving my degree at the end of this year.

Over the past five years, as I’ve worked toward a PhD in Environmental Policy and World Politics at Claremont Graduate University (CGU), one of my key areas of focus has been energy. I find energy an endlessly fascinating subject because cheap and reliable energy is so critical to contemporary society—yet the destructive effects of energy sector pollution have become ever more clear.

Because of a growing consensus that we have to massively reduce pollution from energy use, and because much of our energy system relies on technologies that are nearly a hundred years old, a wave of innovation is hitting the energy sector. We are, in the words of Vaclav Smil, experiencing a massive “energy transition”— a shift in the sources and technologies associated with the energy services we enjoy, such as light, heat and transportation.

Here are some of the key trends in the electricity sector:

We have clear, incontrovertible evidence that pollution from human activity has already warmed our planet and, unless we take radical, coordinated action, will cause irreparable harm to humans and the planet. I will save a detailed discussion for a future post, but the most credible skeptic, Richard A. Muller, announced in 2012 that he is now a “converted skeptic” and concludes that global warming is real and that “humans are almost entirely the cause.”

Renewable sources of energy are growing by leaps and bounds—and certain kinds, such as new wind farms, can actually be cheaper than the cheapest conventional sources of power. We have a long way to go to cut emissions quickly enough to stave off the worst effects of climate change, but renewables, aided by new technology such as energy storage, are starting to make economic sense even without subsidies. With that said, debates persist over whether renewables alone can meet climate change mitigation goals. I believe that, if we increase spending on clean energy innovation, we can clean things up pretty quickly.

The growth in renewables, which are variable in output, is pushing utilities to rethink and reconsider the classic utility grid. Instead of the current approach, where remote power stations serve major users, so-called “smart grids” are making use of real-time pricing, demand management, energy storage and distributed energy resources (DER) to balance supply and demand. In 2013, I wrote a short white paper that goes into more depth about these topics.

The political and policy questions related to these changes are important to consider. For instance, although people broadly support renewable power plants, like solar and wind farms, they may strenuously object to the placement of a massive wind farm near them. In a democratic nation, local concerns over siting of projects—and over the inevitable impacts of even “clean” power plants—are not a trivial problem and raise a host of thorny equity concerns.

Here are some of the questions I have worked on, in conjunction with Professors Hal Nelson, Heather Campbell and Mark Abdollahian, and with other students at CGU:

How can we site locally disruptive projects in a fashion that’s efficient and allows us to meet climate change goals while also protecting local communities?

Should renewables be built only where there is little effective local protest?

How should we set the rules for our energy system in order to ensure that energy prices are low, service is secure and resilient, and innovative new technology can be integrated into our power system without steam-rolling local communities?

I’ve participated in several projects that have attempted to answer these questions. One paper, which I coauthored with Hal Nelson, examines what drives opposition to high-voltage transmission lines. We looked at transmission lines specifically because power lines are needed to connect remote renewable power stations with urban areas—but the framework we developed can easily be applied to all kinds of locally unwanted land use projects.

In 2013, Hal, Mark and CGU doctoral student Zining Yang built a sophisticated agent-based model (ABM) using, in part, some of the theoretical research I developed. And then, last year, Hal, Zining and I used the ABM to simulate how a community’s political power influences project outcomes. We found that in a more egalitarian world, disruptive projects would be less likely to be built. We published this study in a new book written by Heather Campbell and colleagues, Rethinking Environmental Justice in Sustainable Cities.

Suffice it to say that siting renewable power plants and related infrastructure is just one of the challenges we face as we try to manage a massive energy transition to a lower-carbon world. How to guide the diffusion of energy innovation and create the right incentives and regulations is another critical area of inquiry.

At base, all of these changes raise the fundamental question: How can we create a radically more sustainable society that allows our standard of living to continue to grow? And, from the global perspective: How can we promote a more equitable world—a world where people in developing countries have access to the energy services we take for granted—without destroying the planet that we call home?

Useful Resources: Statistics and Data Visualization

In our current era, even nontechnical people need to have some understanding of data science and statistics. What’s confusing is that so many terms—data science, big data, machine learning, statistics—get bandied about with little explanation. To understand how the pieces fit together, I highly recommend this article from the Win Vector blog.

The R programming language is another tool that is making headlines. From Google’s use of R for analysis to Microsoft buying IDE maker Revolution Analytics, R is all the buzz. What is it? Why use R? The R Bloggers website answers these questions and their email list provides a nicely curated flow of articles that will keep you up-to-date.

For folks who are programming already, the Quick-R page is very useful, especially when you are trying to find equivalent R code for a Stata routine.
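To give a flavor of the Stata-to-R translation that Quick-R helps with, here’s a small illustrative example using R’s built-in mtcars data (this mapping is a common one, not an exhaustive guide):

    # Stata: summarize mpg wt   ->  R:
    summary(mtcars[, c("mpg", "wt")])

    # Stata: regress mpg wt     ->  R:
    fit <- lm(mpg ~ wt, data = mtcars)
    summary(fit)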

As a former graphic designer and communications officer, I’ve spent a lot of time thinking about how to present research. As this nice overview article by the WSJ explains, in an era of Big Data, visualization is going to be a key area of inquiry for firms, researchers and the public.

A good introduction to data visualization can be found on Edward Tufte’s website and in his books (The Visual Display of Quantitative Information is well worth buying, and his traveling seminar is also great).
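In the spirit of Tufte’s advice to maximize the “data-ink ratio,” here’s a minimal R sketch of a chartjunk-free scatterplot (again using the built-in mtcars data):

    # A spare, Tufte-inspired scatterplot: no box, no gridlines,
    # just the data and clearly labeled axes.
    with(mtcars, plot(wt, mpg, pch = 19, bty = "n",
         xlab = "Weight (1,000 lbs)", ylab = "Miles per gallon"))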

For a more contemporary spin on these issues, the Flowing Data website and book are also worth a look. For true visualization geeks, the Parsons Journal for Information Mapping is a fantastic (and free) publication put out by the Parsons Institute.

If you are working on a project and need a resource to inspire creative approaches to visualization, I highly recommend the Visual Literacy project’s Periodic Table of Visualization Methods.