
Climate change or peak oil – does it really matter?

Has it been that long since my last blog entry? I have been extremely busy this winter and, of course, busy is good! But on the other hand, I have a set of topics piling up that I would like to write about.

Earlier, I blogged after reading Jeff Rubin’s book “Why Your World Is About to Get a Whole Lot Smaller”. When I posted that entry, I got good feedback. I was told that if I read that book, then I should definitely read “The Long Emergency” by James Kunstler. Having been written in 2005, it is getting a bit dated. This actually makes it even more interesting, because as you read, reality can be compared against the author’s predictions over the last five years.

I really did enjoy the book. The concepts are similar to, and predate, Jeff Rubin’s. In summary, James Kunstler is convinced that the age of peak oil is upon us and that the world is going to be a very different place sooner rather than later. A number of his predictions have come to pass, including the housing crisis and the very deep economic recession that we are just coming out of. Unfortunately the book then goes on to predict doom and gloom – basically the complete collapse of society as we know it. While he may be right (and I hope not), the trouble with this is that it discourages readers from paying attention to the main message. And this message is an important one, now being put forward by Jeff Rubin as well.

I do believe him when he says that we are at or near peak oil.  I also believe that there is no magic bullet to replace oil and that those who postpone decisions to adapt on the basis that “technology will save us” tend to be somewhat deluded – or in reality are just avoiding the issue.  On the other hand, I don’t believe that the world will come to an end and I do believe that there is technology that will help us delay the large scale effects to give us even more time to adapt.   But remember, adapting means changing behaviour. 

For example, look at one industry: publishing. How much carbon is used in the manufacture and distribution of books, magazines and newspapers? Look at the business model. Books are published in big print runs. They are then transported to book shops where they are to be sold, generally on consignment. If not sold, the books are returned (more transport) to be destroyed. While I don’t have the numbers, I can assume the carbon costs are significant. So why am I talking about this? Well, along comes technology – an e-reader, or now an Apple iPad – and what happens? Millions of books, magazines and newspapers no longer have to be distributed in hard copy, but can now be distributed electronically, reducing the carbon footprint of this one industry by a huge amount. Now I don’t want to get into the discussion about the merits of e-readers here – in fact I do want to blog about that at a later date – but just assume that it does come to pass. Then assume there are other industries that can do the same. You see where I am going.

So now let’s bring climate change into the equation. I am one who certainly does believe that the carbon we are putting into the atmosphere is having an impact on our climate. But even if you don’t, then focus on peak oil. If we take action to curb climate change, then we can put in place policies to reduce oil consumption before the natural economics affect us too drastically; i.e. by implementing carbon reduction policies, we must price carbon and thus try to reduce its use. Because, as we all know from recent events, nothing is as effective in changing behaviour as changing costs. This artificially pushes us to the same situation that would come naturally once peak oil has arrived and oil becomes scarcer. Of course, people like James Kunstler believe we are already too late!

This is why Copenhagen was such a big disappointment. In a sense it reinforces the view in The Long Emergency that our dependence on oil is so great that we just don’t have the political will to go in the right direction. Very discouraging.

As we saw from this last recession, when demand drops so does the price of oil. In fact, it doesn’t take that much of a change in demand to impact the price quite dramatically. After rising to almost $150/bbl in early 2008, the price dropped to less than $50 by the end of 2008 and has risen modestly since then. Now at over $80, once again there is fear that high oil prices will impact the economic recovery! Therefore the only sustainable policy is to price carbon and keep the price of oil from dropping, adapting the carbon price as necessary. Anything else will just lead to short term change and then back to the status quo.

One thing is certain: oil is a finite resource. Yes, we may find more, but it will be more expensive to exploit. At some point we are going to have to accept that we need to start shifting to a less oil dependent economy. And given oil’s uses outside of energy, doesn’t it make sense to use alternatives? So I will conclude by suggesting that climate change is our warning – start to act now to save the environment, or wait until oil is well past peak and have no plan to save society.

What do you think?

Lower demand and more renewables – is Surplus Base Load Generation here to stay?

Late in November I blogged about a recent phenomenon being experienced in some systems – Surplus Baseload Generation (SBG).  This is being experienced in Ontario, Canada due to falling electricity demand and the increased use of variable renewable energy sources such as wind and solar.

At that time, I started a poll asking about the future of baseload power.  Since then, the IESO in Ontario has published its latest Reliability Outlook.  The numbers are striking.  Demand was down 6.4% in 2009.  The following graph shows that demand is not expected to reach pre-economic crisis peaks even by 2018.

Ontario Demand Forecast

As a result, the province continues to experience Surplus Baseload Generation (SBG). Forecasts of SBG are now made daily. With the growth of renewable generation, SBG is expected to continue into the future. This will certainly impact any decision to build new nuclear, as nuclear plants are most suited to providing long term, stable baseload power and energy.

The commitment to renewable energy continues to grow. Wind generation in Ontario rose by more than 60 per cent in 2009 over the previous year, to 2.3 TWh. Ontario has implemented the Green Energy Act, arguably making it one of the “greenest” jurisdictions in North America. Just this past week, the government announced a $7 billion deal for 2,500 MW of new renewable generation from a Korean consortium led by Samsung C&T. The deal includes new manufacturing in the province for both wind and solar components.

While the above chart does not show baseload, with 1,000 MW of wind on the system and 11,500 MW of nuclear, this spring Ontario started to experience SBG on a weekly basis. This resulted in nuclear unit reductions on 54 days, nuclear shutdowns on five days and water spillage at hydro facilities on 33 days. The Reliability Outlook projects 1,600 MW of wind by 2013. With the Samsung deal and other FIT program renewables, we could be approaching 4,000 MW of wind and solar in the coming years, while overall demand is not expected to increase dramatically. Therefore, the baseload requirements will be further squeezed from the bottom, as renewable generation has priority access to the system when available. In other words, both renewables and nuclear are “non-flexible” generation, i.e. not readily dispatchable. Clearly SBG will be an ongoing issue.
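The squeeze described above can be put into rough back-of-the-envelope form. The sketch below uses the approximate figures quoted in this post (11,500 MW of nuclear, 1,000 MW of wind today, perhaps 4,000 MW of wind and solar later); the overnight minimum demand figure is an assumed illustrative value, not an IESO number:

```python
# Rough sketch of how Surplus Baseload Generation (SBG) arises when
# must-run supply exceeds demand. The nuclear and wind figures are the
# approximate ones quoted in this post; the overnight minimum demand
# is an assumed illustrative value, not an IESO forecast.

def surplus_baseload(min_demand_mw, nuclear_mw, renewables_mw):
    """Return the surplus (MW) when non-flexible supply exceeds demand."""
    must_run = nuclear_mw + renewables_mw  # neither is readily dispatchable
    return must_run - min_demand_mw

# A light overnight load with today's ~1,000 MW of wind:
today = surplus_baseload(min_demand_mw=12_000, nuclear_mw=11_500, renewables_mw=1_000)

# The same overnight load with ~4,000 MW of wind and solar on the system:
future = surplus_baseload(min_demand_mw=12_000, nuclear_mw=11_500, renewables_mw=4_000)

print(today, future)  # 500 3500 -- the surplus grows sevenfold
```

The point of the sketch is simply that adding must-run renewables on a flat-demand system converts occasional surpluses into routine ones.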

And now, for the results of my earlier poll.  Although the number of votes was somewhat modest, the trend was clear. 

While the comments suggested that baseload is important, only 10% of respondents thought that renewables will have a small impact on the use of baseload. The most votes were for “Medium Impact”, as it seems to be recognized that renewables are here to stay and that the nature of electric grids is going to change forever.

Today’s hottest business model – FREE – Review of the book by Chris Anderson

I just finished reading “Free – The Future of a Radical Price” by Chris Anderson. It was an interesting read and builds on many of the themes from Jeff Jarvis’ book “What Would Google Do?” that inspired me to start this blog earlier this year.

This book is well written and makes a strong case for free as a business model. The argument is that the web provides an easy, low cost way to distribute information at near zero marginal cost. Therefore it is much easier to make information available than to try to protect it. Of course many will argue against this principle, noting that people’s time costs money and nobody (with some exceptions) works for free. However, reading between the lines, I do believe that Chris Anderson recognizes that for FREE to work, money must be made somewhere. At a more strategic level, I think the main point of the book is that dramatic changes are happening in business models, and to succeed, innovation in the way money is made is now a requirement.

Three FREE models are discussed.

  1. Direct cross subsidies – products or services are effectively bundled, with some provided for free and others for a fee. In this model, you usually need the paid product or service to get value from the free one. e.g. the cell phone is free; the cost is in using it.
  2. Three party or “two sided” markets – a traditional model in which one class of participant subsidizes the other. This is the standard way of receiving a good at a cost to advertisers. e.g. any advertising supported delivery of content, such as TV or ad supported web sites.
  3. Freemium model – a basic service is free but there is a fee for a more sophisticated version. This has evolved into a model where the base free service is good and quite usable for a large number of users, while a smaller set of users is willing to pay for a premium service. e.g. Skype, where computer to computer calls are free but calls to a phone cost money.
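As a rough illustration of how the third model can pay for itself, here is a minimal breakeven sketch. Every number in it is hypothetical (not from the book); it exists only to show the arithmetic of a free tier subsidized by a small paying minority:

```python
# Hypothetical freemium breakeven sketch. None of these figures come
# from the book; they only illustrate the arithmetic of model 3 above.

free_users = 100_000
cost_per_free_user = 0.05   # $/month to serve one free user (near-zero marginal cost)
premium_price = 10.0        # $/month for the premium tier

monthly_cost = free_users * cost_per_free_user   # cost of serving the whole free tier
breakeven_paying = monthly_cost / premium_price  # paying users needed to cover it
breakeven_rate = breakeven_paying / free_users   # as a conversion rate

print(f"{breakeven_paying:.0f} paying users ({breakeven_rate:.1%} conversion)")
# 500 paying users (0.5% conversion)
```

The striking feature is how low the required conversion rate is when the marginal cost of a free user is nearly zero – which is exactly Anderson’s argument about web-distributed information.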

This book provides a good history of using free to entice customers to move up the value chain.  What is different in today’s world is that we now have services where a majority of the users will only use the free service and are subsidized by either a small group of specialty users or by advertising.    While this may be the case – is this model really sustainable?

Chris Anderson suggests that this is something that you can’t fight. Trying to fight against free will ensure failure, as a competitor will likely embrace it. This is where the discussion gets interesting. The challenge is to find new business models where something is free and new, different ways of payment are discovered. The example given is consultants (since I am one, this is relevant) who provide free general information that results in paid individual consulting or speaking opportunities. Now of course, there may be a level of naiveté in this thinking. As consultants, one thing we always know is that any hour not paid for is gone forever! But what I do know is that things are constantly changing. As soon as you assume something new will work, it too is replaced by new thinking. Innovation is the new constant! What we have in this era of almost unlimited free information is a huge global exchange of ideas. And this has extreme value – the question then becomes how to find that value. Malcolm Gladwell has another interesting view in his review of this book. This shows the level of debate, which I think will continue for some time. However, while the debate rages, more and more still seems to be available for free.

As an energy economist, I find the economic model fascinating. What is being said here is that in the areas fed by the internet, there is abundance. And as we know, abundance means a low price, as economics clearly points out that we value what is scarce. But as the book also points out, every time we create abundance we end up with scarcity somewhere else. So in this case, the abundance of information means that our time to absorb, understand and use this information is becoming scarce. Or as the example goes – some people have more time than money and others have more money than time. For the latter group, payment to save them time is valuable indeed!

The other issue is that sometimes abundance isn’t really abundance. Externalities must be considered or we end up in the situation we now find ourselves in: warming the earth with greenhouse gases because the true cost of the impact to society is not included. Abundance leads to waste, and sometimes waste leads to societal damage elsewhere down the line.

But what is clear is that we have now moved to a state where certain things that we valued in the past, we are no longer prepared to pay for. Does this mean the end of these things? In fact no – they are shared freely because they are abundant. What it does mean is that we all need to think up new business models that make sense in the world of FREE.

Is there a future for base load generation? Please respond to the poll.

System operators have recently seen something rather new – SBG, or “Surplus Baseload Generation”. This is due to falling demand related to the current economic situation and a newer phenomenon: the displacement of base load by variable renewable generation.

With governments everywhere and the public strongly supporting new renewable generation, primarily wind and solar, these forms of variable generation are displacing base load because they are must-run when the resource is available. So the question is: “Is there a future for base load generation?” Please respond to the poll at the bottom of this blog entry.

This issue was addressed at last week’s Association of Power Producers of Ontario (APPrO) annual conference, where a session was dedicated to this new phenomenon. The following shows the amount of time Ontario experienced SBG over the past 18 months. Excess generation of well over 1,000 MW was experienced! This resulted in shutting down low marginal cost nuclear plants as well as spilling water at hydro plants. The 18-month forecast by the IESO in Ontario expects SBG to continue to be an issue going forward.

Surplus Base load Generation

IESO Presentation to APPrO 2009


The variability of the wind is shown in the following chart, illustrating how wind output at the same time of day varied from 989 MW on one day to just 7 MW the next.

Wind Capacity on Consecutive Days

IESO Presentation to APPrO 2009


So what does this all mean?  In the smart systems of the future is the concept of large scale base load generation doomed?  Do you have to be able to manoeuvre to survive?  Or will policies change to ensure that low cost base load generation is not displaced for higher cost alternatives?

This is just the beginning of the discussion for this subject.  Please answer the following simple poll.  I would like to get your views.  More work is needed on this issue as we plan the systems of the future.

The precarious world of uranium supply and demand

Last month, the supply of uranium was severely interrupted when BHP declared force majeure on its uranium deliveries after the main haulage system failed at Olympic Dam. Production has been reduced to about 20% of nominal, and it is expected to take a number of months to repair the system and bring production back to full output. Olympic Dam is a major producer of uranium, producing about 4,000 tonnes U per annum, or just under 10% of global primary production. Therefore, losing the equivalent of 3,000 tonnes per year for six months or so (say 1,500 tonnes) represents a significant event that affects the delicate balance between uranium supply and demand.
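The arithmetic behind that estimate is simple enough to sketch; the figures below are the rough ones quoted above, not official BHP production data:

```python
# Back-of-the-envelope for the Olympic Dam outage, using the rough
# figures quoted in this post (not official BHP production data).

nominal_tu_per_year = 4_000   # tonnes U per annum at full output
reduced_fraction = 0.20       # production cut to ~20% of nominal
outage_years = 0.5            # repairs expected to take ~6 months

lost_rate = nominal_tu_per_year * (1 - reduced_fraction)  # tU/yr of lost output
lost_total = lost_rate * outage_years                     # tU lost over the outage

print(lost_rate, lost_total)  # 3200.0 1600.0 -- roughly the "3,000 t/yr"
                              # and "say 1,500 tonnes" quoted above
```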

Many people do not appreciate that the supply / demand situation for uranium is somewhat unique amongst commodities.  I first gave a paper on this topic in 2007 to the Raymond James Uranium conference in New York (when the price of uranium was at its peak).

So what makes uranium so special in the world of commodities? A few things come to mind immediately. First, uranium is a single use commodity: its demand is completely dependent upon how many nuclear power plants are in operation and how much fuel they need. In recent years, the global nuclear fleet has been consistently improving its operations but has now pretty much reached its maximum. This means that demand cannot go up for the current fleet of nuclear power plants – there can only be negative shocks if plants perform poorly. For example, following an earthquake in Japan, some plants were shut down for an extended period. This meant they were not using fuel, so demand decreased.

As for future demand, the forecasts are for dramatic growth in new nuclear plants. The WNA is projecting growth of more than 50% in GW in production over the next 20 years. This means a significant increase in demand that must be accommodated in future supply plans. However, it takes 10 to 15 years to implement a new nuclear project from conception, so there are really no surprises in demand in the short to medium term. We all know what plants are under construction, so the projection for new demand is quite stable for the next 5 to 10 years, with some uncertainty starting to appear at the 10 year mark.
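It is worth noting how modest that projection is on an annual basis. A quick check of the implied compound growth rate, assuming the roughly 50% figure above:

```python
# The WNA projection quoted above (~50% more GW in production over
# 20 years) implies a fairly modest compound annual growth rate.

growth_factor = 1.5
years = 20
annual_rate = growth_factor ** (1 / years) - 1

print(f"{annual_rate:.2%} per year")  # about 2% per year
```

A 2% annual expansion is exactly the kind of slow, predictable demand growth described here – large in total, but with plenty of lead time for supply planning.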

So what does this mean?  It means that demand increases in a predictable fashion and that the potential is always there for negative demand shocks if existing units perform poorly or are taken out of operation for any reason.

Now for supply. Similar to nuclear power plants, bringing new uranium mines into production takes considerable time and effort. Many projects are delayed, as companies have been having difficulty bringing on new mines. Therefore, supply potential is also quite predictable for at least 5 years going forward. Again, as with nuclear power, the risk is that shocks affect the system negatively, as there have been a number of events over the past few years that have halted production or delayed new mines.

And finally, as a fuel, uranium is also unique in that it is bought in batches.  The volume of fuel required to operate a nuclear power plant is quite small so utilities can carry a significant inventory to reduce their risk.  This means that buying and selling is not completely in step with usage.  This is different from say, coal or gas that must be consistently delivered to keep fossil generating plants operating.

In the end, uranium prices have remained rather low over the past 20 years, with a short term blip in 2007. Prices remain low because in most scenarios supply and demand are in balance, making it difficult to achieve the price increases needed to encourage new supply. However, for utilities the risk remains. Therefore, the trend now is for utilities in the east (Japan, China, Korea and India), which are fast becoming the world’s biggest users of fuel, to invest in the resource itself to help mitigate the risk. These countries also have little domestic supply, so they need to rely on supply from other countries.

Events like the one at Olympic Dam demonstrate how precarious supply can be. So we should expect countries with growing demand and little domestic supply to continue to step up their efforts to invest in global resources to reduce their overall supply risk.

Have we reached peak oil?

I just finished reading Jeff Rubin’s book “Why Your World Is About to Get a Whole Lot Smaller: Oil and the End of Globalization“. It was a good, thought provoking read. In summary, Rubin states that the world has reached peak oil production and that, post economic crisis, prices will ultimately continue to increase while supply continues to dwindle. The ultimate effect on society is that transportation costs will rise so high that it will no longer be economic to source goods from low labour cost countries like China. The cost of transportation will more than offset the lower production costs. The result will be a return to building factories much closer to market. So in the case of North America, jobs will return as making products locally once again becomes economic.

In fact, as I see it, there are really two issues combined into one. On the one hand, he notes that transportation costs will become so high that we move jobs closer to home. On the other hand, the high cost of oil will mean that we won’t be able to sustain our current standard of living, so we will have to make do with less.

I think a good case is made, with some evidence, that we may indeed have reached peak oil. The case for the world getting smaller is somewhat more anecdotal in nature. Rubin also accepts that people are smart and that technology may indeed come to the rescue, although he does not think it will come fast enough for us to avoid large structural change in our economies.

There have been numerous reviews of this book so I will not try and do another review.  In my case, I would like to focus on making a few points that came to me as I thought about these issues.  And yes, the book does make you think.

First, while the world may try to get smaller again as it was in the past, we cannot forget the great strides in communications technology. So while we may not be able to travel as much, we will continue to be aware of the goings-on all around the world. The internet will continue to bring us together with increasing global collaboration. Just imagine all of the ways that improved technology can reduce oil use – and we know from this recession that it doesn’t take a huge drop in demand for oil prices to fall. For example, how much oil does it take to print and distribute newspapers? Well, it now looks like the future will have paperless newspapers fed to us on e-readers. How about magazines? Books? If we eliminate these (or even reduce their use dramatically as a start), what will the impact be? No oil to ship the paper to the factory, no printing requiring energy, no packaging and, most of all, no distribution. And this is only one example. How about business travel? Of course it will never go to zero, but with improved video conferencing the need to travel by plane to far away places, or even by car somewhere closer, is being reduced. Look at the reductions in business travel already apparent in this recession. In these cases, it means that we will hopefully be able to use oil to transport only what needs to be transported as we get more efficient and reduce overall transportation.

He discusses climate change as well.  This is also an important point.  The global concern about carbon emissions is leading us to price carbon, thus increasing the cost of oil from its normal economic position.  The goal is to use policy to change behaviour and find ways to move off oil to more carbon friendly forms of energy.  This means that governments are working to try and encourage fuel switching BEFORE the oil actually runs out due to concerns about its current use – not due to concerns about its scarcity.  This should have a positive impact as policies continue to encourage demand reduction in advance of a global supply catastrophe.

Next, if he is right and factories once again move closer to home, then yes, blue collar jobs long lost to far away places may indeed come back to North America. But the current trend of white collar jobs moving offshore will not be reversed. It is ironic that the man on the factory floor may once again have a good job, while the engineer designing the process may more often be in a place with low cost professional labour. Engineering, accounting and other professions in the service sector that produce mostly paper will not see their jobs return, as the internet will ensure that quality work can be done literally anywhere around the world. So does this mean that in the next phase of globalization, it is the higher paying jobs that will move away to lower cost locations while the low paying jobs return home?

It was an enjoyable read. I am interested in others’ thoughts on this book. Let me know what you think.

MIT Report Update “The Future of Nuclear Power”

This week MIT released an update to its 2003 report, “The Future of Nuclear Power”. Back in 2003, this report brought the economics of nuclear power in the United States to the forefront. It supported new nuclear as a low carbon option for electricity generation and considered a scenario that would see capacity increase by a factor of 3 (meaning building about 200 new units) by the middle of this century. It is commonly accepted that this report was an important input into the policy that followed with respect to nuclear power, including the Nuclear Power 2010 program and the Energy Policy Act of 2005.

This update looks at progress over the past 6 years and of most interest, updates the economics.  The following table from the report shows the new versus old analysis.


As can be seen, costs have increased significantly over this time period, with the projected cost of nuclear increasing faster than the costs of the coal and gas alternatives. However, the authors draw the same conclusion as they did in 2003: that nuclear is competitive with the alternatives. The report continues to assume a higher project risk for nuclear than for fossil. This translates into a higher cost of capital and hence the highest cost of electricity. Assuming the same cost of capital as the alternatives results in nuclear being extremely competitive.

I want to comment on the costs and assumptions. I have to admit that back in 2003, when I worked for a nuclear vendor, I was not happy with this report assuming nuclear at $2,000/kW. At that time we all believed we were making strides to lower the cost of new plants, and we wanted to see that reflected in the analysis. Well, I was wrong. Today the cost of nuclear power has increased, and I accept that $4,000/kW is a reasonable assumption in today’s world. Does that mean I think it is OK for nuclear plants to cost $4,000/kW? I definitely think more work needs to be done to bring these costs down, but that is the subject for another discussion.

On the other hand, things have evolved such that the other assumptions do need to be challenged. While it may have made sense to assume different costs of capital in 2003, this is no longer the case. The argument in the report is based on the industry’s poor track record of building on time and on budget. It states that issues with new plants since that date confirm this, and that the risk premium can only be eliminated with proven plant delivery performance. While I do agree that the industry needs to prove it can deliver a new fleet of plants on budget and schedule, things have changed since 2003.
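To see why the cost of capital assumption matters so much, here is a simplified levelized capital cost sketch. The discount rates, plant life and capacity factor below are my own illustrative assumptions, not the MIT report’s inputs; only the $4,000/kW figure comes from the discussion above:

```python
# Simplified levelized capital cost, to show how the assumed cost of
# capital drives the cost of nuclear electricity. The rates, plant
# life and capacity factor are illustrative assumptions of my own,
# not the MIT report's inputs.

def levelized_capital_cost(overnight_per_kw, rate, years, capacity_factor):
    """Capital component of cost ($/kWh) via a capital recovery factor."""
    crf = rate * (1 + rate) ** years / ((1 + rate) ** years - 1)
    annual_charge = overnight_per_kw * crf  # $ per kW per year
    kwh_per_kw = 8760 * capacity_factor     # kWh generated per kW per year
    return annual_charge / kwh_per_kw

# A $4,000/kW plant over 40 years at a 90% capacity factor:
with_premium = levelized_capital_cost(4000, 0.10, 40, 0.90)    # risk-premium rate
same_as_fossil = levelized_capital_cost(4000, 0.07, 40, 0.90)  # fossil-equivalent rate

print(f"{with_premium*100:.1f} vs {same_as_fossil*100:.1f} cents/kWh")
# 5.2 vs 3.8 cents/kWh
```

With everything else held constant, a three-point risk premium on the discount rate adds well over a cent per kWh to the capital component alone, which is why the same-cost-of-capital comparison swings so strongly in nuclear’s favour.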

In the current environment, the majority of new plants under consideration in the United States are with regulated utilities.  These plants will be financed on balance sheet so they will be financed at the cost of capital of the utility itself, no different than if it were to build a coal or a gas plant.  And now that the cost estimates have escalated significantly, it is reasonable to assume that part of this increase is due to utilities being more conservative and taking the risks into account in the cost estimates themselves.

Also, the risks of the alternatives have changed significantly. The risk of new climate change initiatives being put in place after a coal or gas plant is committed has increased. This means additional costs to utilities are likely, either to implement new carbon control requirements or as charges for releasing carbon. Is $25/t sufficient? At this stage nobody knows, meaning higher risk.

And finally, it is interesting how the success of carbon capture and storage (CCS) is assumed even though the technology has yet to be demonstrated, while the success of building a new nuclear plant is consistently challenged. The MIT study itself recognizes that CCS is not proven. The costs of CCS seem to go up every time a new estimate is made, yet the authors assume that nuclear has a higher risk profile and cost of capital than coal with a yet-to-be-proven technology attached to it.

In the case of a merchant plant, should there be one, it will very likely only be implemented under the US government loan guarantee program. This means it can achieve the 80/20 debt/equity ratio assumed for the other technologies, with an even lower potential cost of capital due to the benefit of the government guarantee.

All that being said, the timing of this update is useful.  Their conclusion that more needs to be done is important.  As stated “The sober warning is that if more is not done, nuclear power will diminish as a practical and timely option for deployment at a scale that would constitute a material contribution to climate change risk mitigation.” It will be interesting to see how both government and industry respond.