Monday, June 30, 2014

Poverty and Promise in Our Own Backyard

The Rocky Mountain Institute recently released an article about rural electrification on American Indian land that was picked up by Cleantechnica [1]. It opens with some rather unfortunate statistics: "almost 40 percent of the people live without electricity, over 90 percent live below the poverty line, and the unemployment rate exceeds 80 percent." That's just a snapshot of the hardships American Indians are put through. Having been "relocated" to remote lands with few resources, American Indians have scant and often inadequate opportunities to improve their livelihoods [2]. Being a white, middle-class New Englander, I have very little at stake advocating for American Indian rights, but I do feel passionately that the poverty endured by American Indians on reservation land is one of the largest injustices in American history. That's a strong statement for a blog that typically refrains from such pointed language, but it's a serious matter that deserves a serious tone.

It's perhaps a bit of poetic justice that some of the reservation land that American Indians call home has some of the most promising wind and solar potential in the country. I've collected some resource potential maps from NREL to be displayed alongside a Bureau of Indian Affairs map of reservation locations:

For wind resources, South Dakota and parts of Montana have the most potential, and for solar PV, Arizona and Southern California show a lot of promise. The BIA has already identified the benefits of wind development on American Indian land and has published a report highlighting a few reservations with high wind potential [3]. A few large wind projects have already been developed. I haven't come across much by way of solar development, which I think is a bit of an overlooked opportunity; the Navajo Nation, for instance, resides in an area with some of the highest solar insolation in the country, yet 40% of its homes don't have access to electricity [4].

American Indians developing their renewable energy resources to their full potential is a definite win-win scenario: in-demand jobs, progress toward climate goals, access to electricity, and improved relationships. It's one of the most clear-cut examples of a profitable triple-bottom-line enterprise I can imagine.

--------------------------------------------------------------------------------------------------------------------------
[1] http://blog.rmi.org/blog_2014_06_24_native_energy_rural_electrification_on_tribal_lands
[2] http://www.spotlightonpoverty.org/ExclusiveCommentary.aspx?id=0fe5c04e-fdbf-4718-980c-0373ba823da7
[3] http://www.bia.gov/cs/groups/xieed/documents/text/idc013229.pdf
[4] http://ewb-gt.org/navajo-nation

Thursday, June 26, 2014

Sun Day Sunday

According to some blogger with a solar observatory, last Sunday, June 22nd (see how I like to keep things timely?), was International SUNday, a day devoted to appreciating our life-giving sun through observation (with proper equipment, of course) and education [1] [2]. I came across the holiday after a friend posted it on Facebook with a list of sun-related facts. I had intended to pass along the bit of trivia about how the sun could provide some so-many-thousand times more energy than we consume each year, but I couldn't remember the figure and had to look up the numbers behind it. I got distracted instead and decided I should follow up with a blog post about it, which you're currently being subjected to.

Factoring in panel efficiency (~20%) and land coverage (~30%), the energy that could be captured from the sun is about 420 times our total annual energy consumption [3]. In other words, covering roughly 1/420th of the Earth's land area (about 0.2%) in panels would meet all of our transportation and stationary energy needs. That comes to about 110,000 sq. miles, which is roughly the area of Nevada [4]. This seems like a lot, and it is, mostly because we use A LOT of energy. It seems a bit hopeless to try to tile all of Nevada in panels, but what that overlooks is that we humans have already gotten really good at building things over large areas, and not just simple things: cities. I looked up the population density of the largest cities and found that they're in the 10,000 people/sq. mile ballpark [5] [6]. About half the world's population, some 3.35 billion people, now lives in cities, so the world's urban footprint works out to about 335,000 sq. miles, or 0.6% of land area. (A rough recreation of this arithmetic is sketched in code after the list below.) Two things strike me about that:
1) I actually think it's pretty amazing that in 200 years of modernization and urbanization, we've only covered 0.6% of land area; the world is BIG. That said, even having only covered that 0.6%, we've managed to screw up a lot of natural processes. Humans are MESSY.
2) It means that if we covered about 1/3 of city area in panels, we'd be able to meet all our energy needs via solar power. Based on my years of playing SimCity, that's roughly the share that roads typically take up (NOT an endorsement of "solar freakin' roadways"; we know that's a silly idea, it's just a point of comparison).
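Here's a quick back-of-envelope version of those numbers in Python. The inputs (total solar energy absorbed by the Earth, world energy consumption, city density) are my own round-number assumptions pulled from the sources above, so the outputs land near, rather than exactly on, the figures quoted in the post.

```python
# Rough back-of-envelope recreation of the numbers above.
# All inputs are round-number assumptions, not precise measurements.
TOTAL_SOLAR_EJ = 3.85e6     # solar energy absorbed by Earth per year, EJ (per [3])
CONSUMPTION_EJ = 560        # world annual primary energy consumption, EJ (assumed)
PANEL_EFF = 0.20            # panel efficiency
LAND_COVER = 0.30           # usable fraction of a developed site
LAND_AREA_SQMI = 57.5e6     # Earth's total land area, sq. miles
URBAN_POP = 3.35e9          # roughly half the world's population
CITY_DENSITY = 10_000       # people per sq. mile, big-city ballpark [5][6]

# How many times over could captured solar energy cover consumption?
ratio = TOTAL_SOLAR_EJ / CONSUMPTION_EJ * PANEL_EFF * LAND_COVER
print(f"Capturable solar energy ~ {ratio:.0f}x annual consumption")   # ~410x

# Land area needed to meet 100% of demand with panels
panel_area = LAND_AREA_SQMI / ratio
print(f"Panel area needed: ~{panel_area:,.0f} sq. miles "
      f"({panel_area / LAND_AREA_SQMI:.2%} of land)")                 # ~140,000 sq. mi (~0.24%)

# Total urban footprint for comparison
city_area = URBAN_POP / CITY_DENSITY
print(f"Urban footprint: ~{city_area:,.0f} sq. miles "
      f"({city_area / LAND_AREA_SQMI:.2%} of land)")                  # ~335,000 sq. mi (~0.58%)
print(f"Share of city area needed: ~{panel_area / city_area:.0%}")    # ~40%
```

The small differences from the figures quoted above come down to rounding (1/420 vs. 0.2%) and the exact consumption number used; the conclusion is the same either way.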

Now, real talk:
This is a quick analysis based on averages. Solar power isn't available everywhere all the time, and there are locations where it wouldn't make a lot of sense. The point was to provide context: yes, we can build enough panels to cover that area, because we've already built more area of far more complicated things; but no, it isn't going to be easy. Here's where it gets interesting, though. Urbanization is a very strong force; most of the coming population growth is going to be in newly urbanized areas in developing regions of China, India, and Africa [7]. In other words, the way we as a civilization will grow over the next 30 years is by building new densely populated areas, not by making current population centers denser. Two-thirds of the new people expected by 2030 will live in buildings that don't yet exist, in regions characterized by high solar insolation [8]. This is a HUGE opportunity. Those new urban areas need to be extremely efficient and reliant on local renewable energy, primarily solar.

Want to change the world? Become a contractor specializing in low-cost, low-energy apartments in developing countries, or a local policy expert pushing for low-energy building codes. That will mean the difference between a future of more of the same (that is to say, getting worse) and a future with an inflection point.

--------------------------------------------------------------------------------------------------------------------------
[1] http://www.slate.com/blogs/bad_astronomy/2014/06/22/international_sunday_celebrating_the_nearest_star.html
[2] http://solarastronomy.org/sunday.html
[3] http://en.wikipedia.org/wiki/Solar_energy
[4] http://en.wikipedia.org/wiki/Land
[5] http://www.wolframalpha.com/input/?i=population+density+tokyo%2C+mexico+city%2C+new+york%2C+boston%2C+london
[6] http://www.citypopulation.de/world/Agglomerations.html
[7] http://www.scientificamerican.com/article/cities-may-triple-in-size-by-2030/
[8] http://solargis.info/doc/_pics/freemaps/1000px/ghi/SolarGIS-Solar-map-World-map-en.png

Friday, June 13, 2014

"Where's my F*cking Electric Car?!"

I had been maintaining a pretty good clip on this blog of about 3 new posts a week ("not bad," thought the novice blogger to himself)...until two weeks ago. Since then I've been very wrapped up with work and had to focus on that whole science thing. The problems we're working on are nuanced, but not unknown in electrochemistry, which is itself a relatively young field. Basically, electrochemistry is hard, and few people outside the science understand just how hard it is (hence the title).

Electrochemistry emerged as a separate field of chemistry only after early scientists had started laying the groundwork for general electromagnetic theory and chemistry. Elements were discovered, conservation of mass and matter were accepted, electrostatic generators were built, and electrical detectors were invented, all before scientists even started tinkering with electrochemistry. And the first steps were pretty gruesome. In the 1780s and '90s, an Italian doctor (Galvani) dissecting frogs found he could make dead muscles twitch by touching them with different metals connected to each other in series. A physics professor (Volta) disagreed on the mechanism and, in 1800, arranged stacks of alternating metals and brine-soaked paper to achieve similar results. This was the invention of the battery: the first device that turned chemical energy into electricity. No one at the time knew how it worked, but that didn't stop anyone from using it; application outpacing understanding has been the MO of the energy field since the battery was first invented.

Electrochemistry got its first big scientific break from Michael Faraday, who in the 1830s linked the amount of charge passed (current over time) to the amount of matter deposited in his electroplating experiments. It took another half-century, along with the framing of modern thermodynamics by Willard Gibbs, before Hermann Nernst laid the groundwork of analytical electrochemistry by relating cell voltage to chemical equilibrium. So only in the last decade of the 1800s could we even describe the designed properties of a battery in simple terms of voltage and current.
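To make those two relationships a bit more concrete, here's a minimal sketch in Python using a hypothetical Daniell-type zinc/copper cell; the standard potential, concentrations, and numbers in the comments are illustrative assumptions, not measured data.

```python
# Minimal sketch of Faraday's law of electrolysis and the Nernst equation,
# using a hypothetical Zn/Cu (Daniell-type) cell as the example.
import math

F = 96485.0          # Faraday constant, C per mol of electrons
R = 8.314            # gas constant, J/(mol*K)
T = 298.15           # temperature, K

def mass_deposited(current_A, time_s, molar_mass_g, n_electrons):
    """Faraday's law: mass of metal plated is proportional to charge passed."""
    charge = current_A * time_s                # total charge, C
    moles = charge / (n_electrons * F)         # moles of metal reduced
    return moles * molar_mass_g                # grams deposited

def cell_voltage(E_standard, n_electrons, Q):
    """Nernst equation: cell voltage shifts with the reaction quotient Q."""
    return E_standard - (R * T) / (n_electrons * F) * math.log(Q)

# Example: Zn + Cu2+ -> Zn2+ + Cu, E0 ~ 1.10 V, n = 2 (illustrative values)
print(mass_deposited(current_A=1.0, time_s=3600, molar_mass_g=63.5, n_electrons=2))
# ~1.18 g of copper plated by 1 A flowing for 1 hour
print(cell_voltage(E_standard=1.10, n_electrons=2, Q=0.1))
# ~1.13 V when the product/reactant ratio Q is 0.1
```

Faraday's law is what ties the current you draw to how much active material you consume; the Nernst equation is what makes the voltage drift as the cell charges and discharges.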

So...electrochemistry took a while to get to a point where we could actually analyze it. So what? Shouldn't it have erupted in discovery after discovery since then? Not really. Most of the batteries we use today were invented long before anyone really understood what was going on inside them. Even the modern lithium-ion battery, developed in the 1980s, features a component called the "solid-electrolyte interface" (SEI) that sets the longevity and safety of the cell, yet scientists have only recently begun to understand its structure and composition. In other words, the microns-thin layer that determines how long you can use a battery and whether or not it will burst into flames is the least understood part. There's almost too much that goes into the design of a practical battery not to take a trial-and-error approach. Consider:

  • The electrical potential of the positive and negative electrodes (sets the cell voltage)
  • The chemical kinetics of the reactions at the positive and negative electrodes (helps determine maximum current)
  • The structure of the electrodes (determined by the conductivity of the reactant species and the speed of the kinetics)
  • The electrolyte composition and properties. This can be further broken down into:
    • Operating pH (determines chemical compatibility with battery housing, electrodes; also determines prevalence of undesired side reactions)
    • Conductivity (helps determine maximum current)
    • Organic vs. Aqueous (determined by cell voltage, cost considerations, and storage reactions)
  • The membrane separating the positive and negative sides of the battery (critical component, helps determine a lot of things)
    • Cell durability
    • Cell efficiency
    • Maximum current draw
    • Cost and manufacturability
    • Operating temperature regimes
  • Cell housing and architecture
    • Sealed vs. Flow (flow batteries limited to liquid energy storage reactions)
    • Bipolar vs. Monopolar (tradeoffs on manufacturability)
This is by no means an exhaustive list. To give an idea of how all these parameters fit together, I'll walk through an example in another post where we'll go from chemical fundamentals to full cell operation. During that exercise, it'll become pretty clear that, given how easily things can go wrong, we're lucky to have even what we have now.
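In the meantime, one way to get a rough sense of how those design parameters hang together is to sketch them as a simple data structure. This is purely illustrative; the field names and the toy zinc-bromine-like numbers below are my own placeholders, not a real design tool.

```python
# Toy sketch of how the battery design parameters listed above might be organized.
# Field names and example values are illustrative placeholders only.
from dataclasses import dataclass

@dataclass
class Electrode:
    potential_V: float              # standard potential vs. reference (sets cell voltage)
    exchange_current_A_cm2: float   # reaction kinetics (helps set maximum current)

@dataclass
class Electrolyte:
    pH: float                       # chemical compatibility, side reactions
    conductivity_S_cm: float        # helps set maximum current
    solvent: str                    # "aqueous" or "organic"

@dataclass
class CellDesign:
    positive: Electrode
    negative: Electrode
    electrolyte: Electrolyte
    membrane_area_cm2: float        # membrane affects durability, efficiency, current, cost
    architecture: str               # "sealed" or "flow"

    @property
    def nominal_voltage_V(self) -> float:
        # First-order estimate: difference of electrode potentials,
        # ignoring overpotentials and resistive losses.
        return self.positive.potential_V - self.negative.potential_V

# Example: a made-up flow-cell design, just to exercise the structure
cell = CellDesign(
    positive=Electrode(potential_V=1.09, exchange_current_A_cm2=1e-3),
    negative=Electrode(potential_V=-0.76, exchange_current_A_cm2=1e-2),
    electrolyte=Electrolyte(pH=2.0, conductivity_S_cm=0.1, solvent="aqueous"),
    membrane_area_cm2=100.0,
    architecture="flow",
)
print(f"Nominal cell voltage: {cell.nominal_voltage_V:.2f} V")  # ~1.85 V
```

Even a skeleton like this makes the point: change any one field and several of the others have to move with it, which is exactly why battery design still leans so heavily on trial and error.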