The Global Risks Report 2021, 16th Ed., Insight Report, was just released by the World Economic Forum (WEF). This is one of the best places in the world to gather ideas for scenario planning, especially Chapter 1 on Fractured Futures. Essentially, this report is scenario planning, but for the whole of the world, and then on a region-by-region basis. So a government, a business, or a non-profit organization can simply review these risks, adapt the concepts to its own locale and situation, and voilà: you have scenarios to plug into your scenario planning workshop. Of course, multiple risks might have similar results for your organization; for example, disaster recovery planning might be similar for man-made disasters and natural disasters.
Category: Horizon Planning
A recession drives destructive innovation. It has accelerated, for example, the Amazon effect of online sales and purchases, contributing to the closure of some 29 retailers. The recession of the 2020 pandemic is different in some respects, straining local restaurants and bars, even the best of them.
Beyond Moore’s law (by Dr Ed Jordan)
After almost 60 years, Moore's law, related to the doubling of computing power every year-and-a-half or so, still holds. At the current exponential speed, though, there is a brick wall looming ahead: the physical limitations of silicon chips. The most straightforward example of how that might impact a company is Intel Corp. But first, more on Moore's law and the more general idea of learning curves.
Oh My God!
The Trump administration completely stopped the PREDICT program, which conducted USAID pandemic training and response worldwide. Since the bird flu of 2005 (H5N1), US presidents (Bush II and Obama) had been building a program to identify potential pandemics and to help countries (including the USA) deal with such an eventuality. Of course, the PREDICT program got to deal with several pandemic-type events, including SARS, MERS, Ebola and even Zika (mosquito-borne). The idea, which apparently worked very well, is to fight a pandemic where it originates, in other countries, so that you don't have to fight it here in the USA. Of course, the train-the-trainer program would also be developed and applied here in the USA.
Okay! So the Obama administration left the incoming Trump administration several scenarios to think about. There is no public evidence as to what happened to the final report. But apparently it showed everything that we have seen since November in China, and then the first outbreaks outside of China. The result was a worldwide pandemic, overwhelming the US with supply shortages and patient overflows. What happened to the final report? The early warning signposts? The disaster (recovery) plans?
We did a blog post about the military planning scenarios that would have treated a pandemic as an act of war… Or, even if it wasn't caused by an act of war, the resulting story line would be similar. And, of course, if the pandemic started elsewhere, there would be signals and signposts, in the Shell jargon for early warning signs in scenario planning.
A major country should be doing military scenario planning for several topics. None is quite as huge and integrated as climate change. Look at the report produced by The National Security, Military, and Intelligence Panel on Climate Change (NSMIP, February 2020), A Security Threat Assessment of Global Climate Change: How Likely Warming Scenarios Indicate a Catastrophic Security Future. It doesn't seem possible that the US would not have a plan like this for a pandemic. It also doesn't seem possible that the US would not have multiple plans for economic recession, no matter the cause. Since we missed the boat on the pandemic, hopefully we didn't miss planning for a recession… However, this recession is like none we have ever had before. It is kind of like turning the switch off on most of the economy for a while. (How long is "a while"? That is the $4T question.)
Kaiser Health News did a composite of highlights from several articles about the Trump transition team and how badly they apparently failed in the pandemic exercise: How A Crisis Simulation Run Before Trump's Inauguration By Obama's Team Eerily Mirrors Current Outbreak!
The outbreak surfaced in November in China. By December, the military and health officials had to know that expansion into a global pandemic was not only possible, but likely. In Florida, when we see a hurricane coming, we dust off all of our contingency plans for the businesses and start buying toilet paper and canned goods. We make sure that we have lots of jugs to fill with water if needed (no need to buy water, by the way).
Did the military and health officials simply forget to mention these things up the chain of command, or, apparently more likely, did the higher-ups not listen? What the hell happened to the results of the simulation? Was it flushed down the toilet, along with anything and everything that Obama did, mentioned, signed, or said?
It is hard to imagine a world where the (US) military did not make plans for a pandemic like the coronavirus (COVID-19). Each and every military must have a plan for weaponized bio-warfare. In fact, every military will have its own plans for ways it could weaponize biowarfare. Think about the types of bioweapons that terrorist groups might want to employ.
The military has been planning for the big issues of global warming and has been shouting that climate change is one of the biggest risks to the world in the future. Droughts and rising sea levels will produce mass instability in regions, much along the lines of the human tragedies in Chad, Sudan, Syria, etc. For decades now, the military has warned of the risks of climate change to US national security. See the Pentagon, for example, here.
Given the pandemics that have passed through (Ebola, SARS, mosquito-borne viruses) over the last 10-20 years, this too is clearly a national security threat.
In the spirit of scenario planning, with its signposts and early warning signs, you have to wonder when the military started to escalate the coronavirus outbreak in China to the highest risk level: a world, and therefore US, pandemic. November? December? The military would already have contingency plans to help other countries. By early December 2019, the signposts were visible for a spread from China to the rest of the world. By mid-December, the US mainland would have been clearly at risk.
The power of scenario plans, early warning signs, and contingency plans can break down anywhere along the line. All of the planning in the world is useless if you don't react and implement.
Scenario planning should be back in focus. We go a few years – 10 years now since the Great Recession – and we start to assume that the current trajectory, the "Official View", will hold this time. The coronavirus brings that all back into focus, even though people are probably not taking it as seriously as they should. You have to look at the entire supply chain, forward and backward. China plays a major role in many of the world's supply chains: end consumers on the one hand, and the production supply chain on the sourcing side. If factories are closed, if people can't go to work, if people don't go out and buy, then consumption and the supply chain are continually interrupted. China is initiating all kinds of stimulus. Telling banks to be forgiving toward impacted factories seems like a good idea; no one wants the factories to go out of business because of such an exogenous event as the virus. But other stimulus will be rather useless.
Probably no one knows, yet, how this epidemic will play out. There is no reason to believe that it won't be rather long and protracted for China. The consequences for China will ripple throughout the world. With a world that is densely (over)populated, there is no reason to believe that such outbreaks will not happen in other places, and more frequently.
So, this brings us back to scenario planning. The advantage of scenario planning is that you can build Contingency or Disaster Recovery Plans based on various scenarios. Serious and protracted supply chain disruptions, no matter the cause, seem like logical scenarios.
Right now might be a good time to dust off the Contingency Plans and see if anything needs to be updated, or executed, because of the recent events.
In the 2018 Guide by Hall and Hinkelman, the scenario chapter discusses Y2K as the greatest scenario planning exercise in history. Read about the Y2K Scenario from that chapter (pp. 161-163). Remember that right now many companies are executing their contingency plans related to current events, many others are trying to develop them on the fly – kind of a fly-by-night approach to scenario planning.
*The section below is reproduced here with permission of the authors.*
The Great Scenario Planning Exercise, Y2K!
There were several major advantages to corporations' planning – scenario planning, really – that came out of the Year 2000 (Y2K) preparation process. Planners were forced to consider at least two views of the future: the official view, where Y2K caused no interruptions, and the chaos view, where it caused massive interruptions (mainly through sustained interruptions to the power grid). One of the interesting parts of this process is the spillover implication – legally, morally and brand-image-wise – of doing nothing in preparation and being wrong. The scenario planning processes associated with Y2K resulted in stronger business planning and improved disaster recovery plans (DRPs). It also helped with business continuity plans by building stronger relationships with critical business partners.
Many people would say that this is a bad example because Y2K was a bust. Actually, the major push to organize IT and transition from legacy systems contributed substantially to increased productivity for several years after the turn of the century. (Business productivity has been surprisingly low since about 2005.) Two examples where the Y2K efforts proved to be well justified are Burger King and FPL.
Burger King Corporation, then a division of DIAGEO, worked very closely with franchisees and its most critical suppliers (beef, buns, fries and Coke) to make sure that there would be no interruption and that contingency plans would be in place for likely situations related to Y2K. By far the biggest risk, and the most attention to contingency planning, went to AmeriServe. AmeriServe, the number one supplier to the Burger King system, had bought out the number two supplier and now represented three-fourths of the global supply chain. Three weeks into the new millennium, AmeriServe declared bankruptcy! Fortunately, the contingency plans related to distribution had been dramatically improved during 1999, and continuity actions were immediately executed. Although the bankruptcy had nothing to do with Y2K, per se, much if not all of the contingency plan could be used for any distributor outage.
An adjunct to the Y2K story relates to power. Once organizations got past addressing their critical IT systems, the biggest wild card was power outages. No assurances came from the power companies until just months before the turn of the millennium, and even then, not much was given in the way of formal assurances. Of course, that was too late for a big organization with brand and food safety issues to have avoided the major contingency planning efforts.
Most people did not realize how fragile and antiquated the entire power grid was until the huge Ohio, New England and Canadian blackout of August 14, 2003 (CNN). A cascading blackout disabled the Niagara-Mohawk power grid, leaving the Ottawa, Cleveland, Detroit and New York City regions without power. Twenty-one power plants shut down within a three-minute period because, with the grid down, there was no place to send the power. Because of a lack of adequate time-stamp information, for several days Canada was believed to be the initiator of the outage, rather than Ohio.
There have been similar blackouts in Europe. That Y2K could have resulted in massive outages may not have been so far-fetched after all. Ask someone who was stuck in an elevator for eight hours if the preparations for long-term power outages could have been better.
Hall (2009) developed a survival planning approach that would help an organization survive during times of extreme uncertainty, like the Great Recession. Of course, the process is far ahead if the organization already has a good strategic plan (StratPlan) that includes contingency and scenario planning.
Hall, E. (2009). Strategic planning in times of extreme uncertainty. In C. A. Lentz (Ed.), The refractive thinker: Vol. 1. An anthology of higher learning (1st ed., pp. 41-58). Las Vegas, NV: The Lentz Leadership Institute. (www.RefractiveThinker.com)
Hall, E. B. & Hinkelman, R. M. (2018). Perpetual Innovation™: A guide to strategic planning, patent commercialization and enduring competitive advantage, Version 4.0. Morrisville, NC: LuLu Press. ISBN: 978-1-387-31010-4 Retrieved from: http://www.lulu.com/spotlight/SBPlan
SustainZine (SustainZine.com) blogged about a rather cool idea on the decentralization of power (here). The idea, from Nature Communications, is to have buildings everywhere use their renewable power sources to generate a biofuel of some type. The authors had the Heating, Ventilation and Air Conditioning (HVAC) unit extract CO2 from the atmosphere to generate the fuel. Some of the technologies they pointed to were newer technologies that are now (hopefully) making their way into the mainstream. (Read the nice summary article in Scientific American by Richard Conniff.)
Basically, everyone everywhere can now produce their own power at rates that are a fraction of the lifetime cost of utility power. Storage is now the big bottleneck to completely avoiding the grid. Distributed power should be a big plus to the overall power grid; however, the existing power monopolies are still resisting and blocking. So complete self-containment is not only a necessity for remote (isolated) power needs, but a requirement in order to break away from the power monopolies.
In the US, there is the 30% Renewable Investment Tax Credit, which makes an already good investment even better for homeowners and businesses. Plus, businesses can get accelerated depreciation, making the investment crazy profitable after accounting for the tax shield (tax rate times the basis of the investment). Many of the states sweeten the deal even more. But the 30% tax credit starts to step down after 2019, so the incentive to move to renewables drops off precipitously at the end of 2019.
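The arithmetic behind that claim can be sketched in a few lines. The numbers below are illustrative assumptions, not tax advice: a 21% corporate tax rate, the full depreciation tax shield taken immediately, and the common rule that the depreciable basis is reduced by half the credit.

```python
def net_solar_cost(cost, itc_rate=0.30, tax_rate=0.21):
    """Rough after-tax cost of a commercial solar investment.

    Illustrative sketch: assumes the 30% Investment Tax Credit,
    accelerated depreciation taken up front, and a depreciable
    basis reduced by half the credit.
    """
    credit = cost * itc_rate             # 30% investment tax credit
    basis = cost * (1 - itc_rate / 2)    # basis reduced by half the ITC
    tax_shield = tax_rate * basis        # tax rate times basis, per the text
    return cost - credit - tax_shield

# A $100,000 system nets out to roughly half its sticker price:
print(round(net_solar_cost(100_000), 2))
```

With those assumptions, credit plus tax shield recover almost 48% of the investment before a single kilowatt-hour of savings is counted.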
You would think that the power companies would join in the solutions, and not spend so much time (and massive amounts of money) obstructing progress. Think of all the tall buildings that are prime candidates for wind, and all the rooftops, roads and parking lots worldwide that are prime candidates for solar. Distributed power: as needed, where needed. No need for new nuclear, coal or nat-gas power plants. Little need for taking up green fields with solar farms.
Of course, the oil, coal and gas companies need the perpetual dependence on the existing infrastructure. When we all stop the traditional fossil fuel train — and all indications from the IPCC show that we must stop that train sooner, not later — then all the oil and gas in the world will need to stay in the ground. Call me an optimist, or a pessimist, but I would not buy oil or gas for almost any price. I definitely wouldn’t buy into the Saudi-owned oil company spinoff.
It is probably a mistake to think that technology to take CO2 out of the atmosphere after the fact can repair past sins. Avoiding putting pollution into the air, water and land — the negawatt and the negagallon, in this case — is by far the best approach.
In SustainZine, BizMan concluded with this thought about the here-and-now scenario, not in the future at all:
“Hidden in this whole discussion is the scenario that is here and now, not futuristic. Renewable energy is cheaper and massively cleaner than conventional energy, and it can be located anywhere. Storage, in some form, is really the bottleneck; and storage in the form of synthetic fuels is a really, really cool (partial) solution.”
Dittmeyer, R., Klumpp, M., Kant, P., & Ozin, G. (2019, April 30). Crowd oil not crude oil. Nature Communications. DOI: 10.1038/s41467-019-09685-x
The researchers over at Strategic Business Planning Company have been contemplating scenarios that lead to the demise of oil. The first part of the scenario is beyond obvious. Oil (and coal) are non-renewable resources; they are not sustainable; burning fossil fuels will stop — eventually. It might cease ungracefully, and here are a few driving forces that suggest the cessation of oil could come sooner, not later. Stated differently, if you owned land that is valued based on carbon deposits, or if you owned oil stocks, those assets could start to become worth less (or even worthless).
We won’t spend time on the global warming scenario and the possible ramifications of government regulation and/or corporate climate-change efforts, which could/would accelerate the change to renewables. There are other drivers away from fossil fuels, including: national security; Moore’s law, as applied to renewables; and efficiency.
1. National Security. Think about all the terrorist groups and rogue countries. All of them get part, or all, of their funding from oil (and, to a lesser extent, NatGas and coal). Russia. Iran. Lebanon, where the Russians have been enjoying the trouble they perpetuate. The rogue factions in Nigeria. Venezuela. Even Saudi Arabia is not really our best friend (15 of the 19 hijackers on 9/11 were Saudi citizens). Imagine if the world could get off of fossil fuels. Imagine all the money that would be saved by not having to defend one country’s aggression against another once the valuable oil became irrelevant. Imagine how much everyone would save on the military. This is more than possible with current technology; and with Moore’s law of continuous improvement, it becomes even more so.
2. Moore’s Law. Moore’s law became the law of the land in the computer-chip world, where technology doubles every 18 months and costs reduce by half. (See our blog on The Future of Computing is Taking on a Life of Its Own. After all these decades, Moore’s law is finally hitting a wall.) In the renewable world, the price of solar is dropping dramatically while the efficiency continues to increase. For example, the 30% tariff on imported PV panels roughly matches the cost reductions of the last year alone. In the meanwhile, battery efficiency is improving dramatically, year over year. Entire solar farms have been bid (and built) for about $0.02 per kilowatt-hour, and wind and/or solar with battery backup is about $0.03 per kilowatt-hour. At those prices, it is far cheaper to install renewable power than coal or NatGas, especially given the years it takes to develop fossil-fuel plants.
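The compounding implied by an 18-month doubling is easy to underestimate. A quick sketch of the arithmetic (no empirical data, just the doubling rule stated above):

```python
def moores_law_factor(years, doubling_months=18):
    """Performance multiple after `years`, if capability doubles every
    `doubling_months` months (and cost per unit halves at the same rate)."""
    return 2 ** (years * 12 / doubling_months)

# Ten years of 18-month doublings is roughly a 100x improvement:
print(round(moores_law_factor(10)))
# Equivalently, cost per unit of performance falls to about 1% of the start:
print(round(1 / moores_law_factor(10), 4))
```

The same curve, applied loosely to solar and batteries as learning curves rather than literal transistor counts, is what makes today’s $0.02-0.03/kWh bids plausible rather than surprising.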
Note that we haven’t even talked about peak coal and peak oil. Those concepts are alive and well; fracking technology has just pushed them back maybe 10 years from a production (supply-side) perspective. At some point you hit the maximum possible production of a non-renewable resource, and production can only go down (and prices up) from there. World production of oil is now up to 100M barrels per day. But conventional oil wells deplete at about 4%-5% per year, so you need about 4% more new wells every year. Fracked wells drop about 25%-30% in the first year! So you need many more new wells each year just to stay even. But let’s go on to efficiency, probably the major demand-side force.
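The replacement treadmill those decline rates imply can be sketched directly from the figures in the paragraph above (100M barrels/day world production, 4%-5% conventional decline, 25%-30% first-year fracking decline):

```python
def replacement_capacity(production_bbl_day, decline_rate):
    """New daily production capacity needed each year just to hold output
    flat, given a first-year decline rate for the existing base."""
    return production_bbl_day * decline_rate

WORLD = 100_000_000  # ~100M barrels/day, per the text

# Conventional wells declining ~4.5%/yr need ~4.5M bbl/day of new capacity:
print(replacement_capacity(WORLD, 0.045))
# A base of fracked wells declining ~27.5% in year one needs roughly 6x more:
print(replacement_capacity(WORLD, 0.275))
```

That 6x ratio is the point: the more production shifts to fracking, the faster the industry must drill simply to stand still.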
3. Efficiency. The incandescent light bulb produces very little light: more than 95% of its 100 watts becomes heat, and just a tiny bit becomes light. With only 10-15 watts, an LED can produce the same light that required 100 watts in days of old. The internal combustion engine is hugely inefficient, producing mostly (unused) heat and directly harnessing only 10-15% of the energy from gas or diesel… and that is before the huge amounts of energy needed to mine, transport, refine, transport, and retail the fuel. Electric engines are far more efficient, and they produce no toxic emissions. A great book about energy, efficiency and trends is Ayres & Ayres, Crossing the Energy Divide. The monster power plants (nuclear, coal, NatGas) have serious efficiency issues. They produce huge amounts of heat for steam turbines, but most of the heat is lost/wasted (let’s say 50%), and the electricity must then be transmitted long distances through transmission lines, where still more energy is lost.
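A rough sketch of those comparisons in code. All of the efficiency figures are the ballpark percentages from the paragraph above (the LED efficiency is an illustrative assumption chosen so that a ~12W LED matches the light of a ~5%-efficient 100W incandescent), not measured data:

```python
def useful_energy(input_watts, efficiency):
    """Watts of useful output from a device of the given efficiency."""
    return input_watts * efficiency

# Lighting: an incandescent turns roughly 5% of its 100W into light;
# a 12W LED at an assumed ~40% efficiency delivers about the same light.
print(useful_energy(100, 0.05))   # incandescent: ~5W of actual light
print(useful_energy(12, 0.40))    # LED: ~4.8W of light from 12W in

# Drivetrains: an internal combustion engine harnesses ~10-15% of the
# fuel's energy; an electric motor converts roughly 90% of its input.
print(useful_energy(50_000, 0.12))  # ICE: useful watts from 50kW of fuel burn
print(useful_energy(50_000, 0.90))  # electric motor from the same 50kW
```

The gap only widens when the fuel chain (mining, refining, transport) is charged against the engine, or when on-site solar avoids the power plant’s steam losses and the transmission lines entirely.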
Producing power as needed, where needed, makes so much more sense in most cases. Right now, using today’s technology, pretty much everyone can produce most of their own power (PV or wind) at about the same cost as the power monopolies. But Moore’s law is making the renewable technology better and better every year. Add some batteries and microgrid technology and you have robust electric systems.
The losers in these trends/scenarios may be the BIG oil companies and the electric monopolies. They will fight this move until they change, or until they lose. Just like peak oil, it is a matter of time… and the time is coming faster and faster…
Saudi Arabia is trying to keep prices high enough to complete its oil Initial Public Offering so it can diversify out of oil. Venezuela is offering a new cybercoin IPO (its Petro ICO) with barrels of buried oil as collateral (see Initial Kleptocurrency Offering). But what if that oil becomes a stranded asset? The Petro currency becomes as worthless as the Venezuelan Bolivar.
You really want to carefully consider how much and how long you want to own fossil fuel assets… Fossil fuels may be dead in a decade or two… Moore or less.
Previously, we talked about the Tick-Tock of computing at Intel, and how Gordon’s law (Moore’s law) of computing – 18 months to double speed (and halve price) – is starting to hit a brick wall (Outa Time, the Tick-Tock of Intel and Modern Computing). Breaking through the 14-nanometer barrier is a physical limitation inherent in silicon chips that will be hard to surpass. Ed Jordan’s dissertation addressed this limit, and his Delphi study showed what the next technology might likely be, and how soon it might be viable. His study found that several technologies were looming on the horizon (likely less than 50 years out)… and that organic computing (i.e., proteins) was the most promising, and should certainly happen sometime in the next 30 years.
Apparently quantum computing technology is here and now – kinda – especially at Google. See the Nicas (2017) WSJ article about quantum computing in the Future of Computing. As the article states about the expert Neven, he’s pretty certain that no one understands quantum physics. At the atomic level, a qubit can be both on and off at the same time. The conversation goes into parallel universes and such… both here and there, simultaneously. The quantum computer runs at essentially absolute zero temperature (give or take a fraction of a degree). Storage density using qubits is unimaginable. The computer works completely differently, however: it eliminates the non-feasible to arrive at good answers, but not necessarily the best answer. Heuristics, kinda. The error rate is humongous, apparently, requiring maybe 100 error-correction qubits for every single working qubit.
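That error-correction overhead compounds quickly. A sketch, assuming the roughly 100 physical qubits per working (logical) qubit mentioned above; real overheads vary widely with the error-correcting code and the hardware error rate:

```python
def physical_qubits(logical_qubits, overhead=100):
    """Physical qubits needed if each logical (working) qubit costs
    `overhead` physical qubits of error correction (illustrative ratio)."""
    return logical_qubits * overhead

# Even a modest 50-logical-qubit machine would need ~5,000 physical qubits:
print(physical_qubits(50))
# A few thousand logical qubits (a scale often discussed for attacking
# current encryption) balloons into the hundreds of thousands:
print(physical_qubits(4_000))
```

This is why a chip with a few dozen raw qubits, impressive as it is, remains far from the machine that would threaten today’s cryptography.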
Ed Jordan was reminiscing about quantum computing yesterday… “Basically, all computing, in all its permutations, needs to be rethunk. Quantum computing is sort of the Holy Grail. One could argue it is sort of like controlled fusion: always just 10 years away. Ten years ago, it was 10 years away. Ten years from now it may still be ten years away. There is a truckload of money being thrown at it. But there isn’t anything mature enough yet to do anything that looks like real computing. The problem is: how do you read out the results? Like Schrödinger’s cat, that qubit could be alive or dead, and by looking at it you cause different results to happen – as opposed to something that exists independent of your observation.”
Quantum computing is now moving past the technically impossible into the proven and functional, and maybe soon the commercially viable. The players in this market are Google (Alphabet), IBM and apparently the NSA (if whistleblower Snowden is to be believed).
Intel may not be able to capitalize on the next generation of computing. Some computations, such as breaking encryption, could probably be done in a couple of seconds on a quantum computer, even though they might take multiple current silicon computers a lifetime. There are several potential uses of the quantum computer that make businesses and security targets very nervous.
Jordan and Hall (2016) talk about using Delphi to anticipate inflection points on the horizon, including those scenarios that would become possible via quantum computing, or bio-computing for that matter. The use of experts or informed people could make such inflection points more evident, and the development of contingency plans more effective.
One of the most interesting things in the Nicas article is the look at breakthroughs in computing technology, compared against Jordan’s 2010 dissertation. Jordan found that two or three types of technology should be feasible within 25 to 40 years and viable in application within about 30 to 50 years; in his case, that would be as early as about 2040. Note that the experts discussed by Nicas pegged full application of a quantum computer at about 2026; that is when digital security will take on a whole new level of risk. It also makes you wonder how blockchain (Bitcoin) will fare in the new age of supersonic computing.
This seems like a great time to start working on security safeguards that are nothing like the current technology. Can you imagine the return of no-tech or lo-tech? It kind of reminds you of the revival of the old “brick” phones for analog service (in the middle of the Everglades).
Debnath, S., Linke, N. M., Figgatt, C., Landsman, K. A., Wright, K., & Monroe, C. (2016). Demonstration of a small programmable quantum computer with atomic qubits. Nature, 536(7614), 63–66. doi:10.1038/nature18648
Jordan, Edgar A. (2010). The semiconductor industry and emerging technologies: A study using a modified Delphi Method. (Doctoral Dissertation). Available from ProQuest dissertations and Theses database. (UMI No. 3442759)
Jordan, E. A., & Hall, E. B. (2016). Group decision making and Integrated Product Teams: An alternative approach using Delphi. In C. A. Lentz (Ed.), The refractive thinker: Vol. 10. Effective business strategies for the defense sector. (pp. 1-20) Las Vegas, NV: The Refractive Thinker® Press. ISBN #: 978-0-9840054-5-1. Retrieved from: http://refractivethinker.com/chapters/rt-vol-x-ch-1-defense-sector-procurement-planning-a-delphi-augmented-approach-to-group-decision-making/
Nicas, Jack (2017, November/December). Welcome to the quantum age. The Future of Computing, Wall Street Journal. Retrieved from: https://www.wsj.com/articles/how-googles-quantum-computer-could-change-the-world-1508158847