A smart grid is a transactive grid.
- Lynne Kiesling
Could Google’s Disdain For Utilities Ruin Its Nest?

Via Smart Grid News, an interesting argument that, while Google is focused on value streams from home appliances and energy services, utilities have access to other value streams such as “grid-side savings,” as well as the shift from average cost to serve to true cost to serve based on locational differences between premises. The article also has some fascinating insights into storage (including the penetration-rate tipping point that could flatten load curves and do away with price spikes), and into time-of-use pricing and why it will NOT help Google. (By the way, the article notes that there’s a fourth value stream: combining energy usage data with all the other data Google knows about consumers to create ultra-targeted marketing):

After Google’s acquisition of Nest, I heard from utilities and consultants asking, “What could Google possibly be thinking, paying $3 billion for Nest?” I have no interactions with Google, but I offer my opinion via the following piece. We can certainly forecast some of how this will play out. It starts with the fact that the players don’t understand how the other side makes money.

Google rarely interacts with utilities. Utilities tend to resist innovations from Google, Apple, Microsoft and a host of others (e.g., AT&T, Comcast, Verizon, etc.). Their business models, risks, margins and opportunities do not behave the same way. Failure to appreciate these differences is likely to depress the earnings potential on both sides.

What Google thinks it knows

Google’s actions suggest the following:

First, they learned early that utilities are not keen on sharing customer usage data with third parties. This is an understandable, classic utility stance.

Second, Google learned with PowerMeter that even if they do get the data, customers are largely uninterested. Sure, some are curious about it, view it once or twice, and then move on with their lives. There was no groundswell of consumer passion, no barrage of clever apps, and so the music stopped. It was clear that, to change the industry, someone would need to change customer behavior. And to be consequential, it would need to deliver more than an Opower-type 1% to 2% savings, without depending upon the persistence of customer vigilance.

Enter a host of other home energy management systems to fill that void, with automated dispatching of end uses. Hundreds of pilots and offerings over the past few years are plodding along at the typical utility pace.

Too risky for non-regulated firms

We saw very few firms interested in approaching this task from the non-regulated side of the utility due to the significant hardware investment costs (i.e., relays, sub-metering, dispatching systems, installers, customer service). For the typical non-regulated energy provider, it is too risky to chase a customer that can switch providers at will.

Google was no different. They opted to wait and placed a bet on Nest, instead.

In the meantime, the ISOs are trying to help out, with increased focus on service markets. But the pace is still slow, and the ISOs primarily value the commodity side, and only then above the bus, leaving the bulk of the grid-side savings to the regulated utilities. Utilities are well aware of the significant value of the grid-side savings. But they are very concerned about reliability issues with large-scale rollouts. This slows the adoption pace and instills fears about modifying traditional margins.

Google watches and waits

In most industries, commodity margins converge to very low numbers. Google seems to believe the energy space is similar. It is therefore opting to forego participation in the commodity business in favor of value-added comfort and convenience marketing.

However, the energy space is unique in that we have no storage, no silo, no warehouse in which the commodity can reasonably be held. This is why prices spike: supply and demand must be balanced every second.

Even branding stalwarts like Coke (where the commodity is sugar water) and Nike (textiles, rubber) know that some level of backward vertical integration is necessary, despite having access to inventory and storage buffers. With energy, the need is even greater. In the energy space, there is 2X to 5X more margin to be had by using near-real-time dispatching analytics, or by arbitraging against ISO prices, wind forecasts, thermal inertia in AC and water heating, and other areas. Only those competitors that master these nuances will win, irrespective of the glitz or appeal of one’s thermostat. At some point, the money saved rules the market, or at least makes competition much more intense.
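To make the arbitrage concrete, here is a minimal sketch in Python, with entirely hypothetical prices and loads, of the end-use dispatch being described: a water heater’s thermal inertia lets an automated dispatcher buy the same energy in the cheapest hours of the day rather than at the moments of hot water use.

```python
# Minimal end-use arbitrage sketch. All prices, loads and limits are
# hypothetical illustrations, not figures from the article.

HOURLY_PRICE = [  # assumed day-ahead prices, $/kWh, with an evening spike
    0.03, 0.03, 0.02, 0.02, 0.02, 0.03, 0.05, 0.08,
    0.10, 0.09, 0.08, 0.08, 0.09, 0.12, 0.18, 0.30,
    0.45, 0.40, 0.25, 0.15, 0.10, 0.07, 0.05, 0.04,
]
ELEMENT_KWH = 1.5  # assumed water-heater draw per hour of runtime

def cost(load):
    """Energy cost of an hourly load profile at the hourly prices."""
    return sum(k * p for k, p in zip(load, HOURLY_PRICE))

# Baseline: the heater runs when people draw hot water (morning, evening).
baseline = [0.0] * 24
for h in (6, 7, 16, 17, 18):
    baseline[h] = ELEMENT_KWH

# Dispatch: tank inertia lets us pre-heat, so buy the same total energy in
# the cheapest hours instead (a greedy re-dispatch).
dispatched = [0.0] * 24
remaining = sum(baseline)
for h in sorted(range(24), key=HOURLY_PRICE.__getitem__):
    dispatched[h] = min(remaining, ELEMENT_KWH)
    remaining -= dispatched[h]
    if remaining <= 0:
        break

print(f"baseline ${cost(baseline):.2f} vs dispatched ${cost(dispatched):.2f}")
# With these made-up numbers, the same hot water costs roughly a tenth as
# much: the kind of margin that automated dispatch can capture.
```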

Is energy an untapped gold mine?

Google probably appreciates that the energy space continues to be an untapped gold mine, where competitor utilities are slow to react, rest on regulated laurels, have limited knowledge of their customers, and are not built to innovate at all, much less at a rapid pace.

Utilities address innovation pressures via consumer-direct offerings and regulatory appeals, at which they excel. But Google realizes that consumers are social, connected animals. And those consumers’ homes are replete with unconnected appliances — a big, untapped market for Google.

This space seems worth the cost of entry, and ripe for new innovation — appliance monitoring, control, security, comfort, mood lighting, music, even social enhancement.

Now extend the consumer experience to commercial markets, and we can see that opportunity exists for new scent/smell systems linked to retail sales, lighting displays to enhance consumer experiences, automated shopping from in-home product RFID tag monitoring, choreography of sales promotion with home use tracking, and others.

Clearly these are growth opportunities that utilities either do not see, or would not be able or willing to chase, given their regulated mindset. Conversely, Google does not seem to see that these offerings also impact the use of energy. If done optimally, they can yield significant cost savings beyond the service itself.

So who will win?

The eventual winner will combine both worlds. It will either 1) combine low-cost leadership with product innovation, or 2) combine low-cost leadership with being the channel of preference (e.g., Walmart). It’s tough to be all three: low-cost leader, innovator, and preferred channel.

Google could do it, if it focused more on the commodity. But low-cost leadership in energy necessitates sophisticated analytics directed at virtual storage (dynamic dispatching of end uses), and this requires unique quant skills which Google likely cannot develop on its own, or at least not in time.

Google has the analytical capability to dominate the value-added service offerings, but energy analytics for optimal arbitraging requires much more than tapping talent from policy-oriented or “strategic” energy experts. It requires a unique blend of engineering, statistics and financial derivative modeling. And low-cost leadership will only come from solving the virtual storage potential and end-use arbitraging, at least until we see 20% to 30% storage penetration.

Even then, the price point for virtual storage is so much lower than physical batteries that the depression of marginal prices due to 20% virtual storage penetration will put most physical storage out of the market. Not all, mind you, but enough. Our analyses reveal a tipping point at about 15% to 20% penetration of virtual storage, where we can choreograph end-use start and stop times optimally such that the load on a circuit is essentially flattened. This causes price spikes to almost disappear, and it is these price spikes that drive many current physical battery storage valuations.
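The tipping-point claim can be illustrated with a toy model (my own construction, not the author’s analysis): pool the flexible slice of each hour’s load, water-fill it into the off-peak hours, and watch the peak-to-average ratio collapse as the flexible share grows.

```python
# Toy load-flattening model with a made-up circuit shape; the threshold it
# reproduces depends entirely on how peaky the circuit is.

LOAD = [78, 74, 71, 70, 70, 73, 80, 88, 92, 91, 90, 91,      # hypothetical
        93, 96, 99, 104, 109, 112, 108, 102, 96, 90, 85, 81]  # MW by hour

def flattened_peak_ratio(load, flexible_share):
    """Peak-to-average ratio after re-dispatching the flexible share.

    The flexible slice of every hour is pooled and water-filled into the
    hours with the lowest remaining load, i.e. choreographed start and
    stop times move deferrable end uses into the troughs.
    """
    fixed = [l * (1 - flexible_share) for l in load]
    pool = sum(load) * flexible_share
    lo, hi = min(fixed), max(fixed) + pool
    for _ in range(100):  # bisect on the water-fill level
        level = (lo + hi) / 2
        needed = sum(max(0.0, level - f) for f in fixed)
        lo, hi = (lo, level) if needed > pool else (level, hi)
    new_load = [max(f, lo) for f in fixed]
    return max(new_load) / (sum(load) / len(load))

for share in (0.00, 0.05, 0.10, 0.15, 0.20, 0.25):
    print(f"{share:4.0%} flexible -> peak/avg = "
          f"{flattened_peak_ratio(LOAD, share):.3f}")
```

With this circuit shape, the curve is essentially flat by a 20% flexible share, consistent with the cited range, though the exact threshold shifts with how peaky the circuit is.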

What Google is overlooking

Google does not seem to appreciate how margins will be made on the commodity side of electricity, or on the grid side. It is axiomatic that commodity businesses run on low margins and survive through excellent execution. Google’s Nest acquisition suggests that they perceive the market will remain a regulated, cost-plus business model, ripe for the picking.

Yet with the continued emergence of smart grid innovations, solar, other DG, DR and continued deregulation, the utility business model is slowly evolving to one where the average margin per customer is no longer a useful metric. Utilities are already sharpening their pencils to uncover which customers are the most costly and which are most likely to adopt new technologies. They already know, from their trading activities, how to run marginal cost of service calculations on larger, competitively bid loads, and it is inevitable that these more micro and granular household-level marginal cost calculations will occur.

With the smart meter rollouts and emerging substitutes (e.g., DG, solar), utilities will be forced to create sophisticated margin measurements on individual homes and businesses, tied specifically to locational premiums. Admittedly, this is not easy or familiar territory for the regulated side, but the large potential for value capture in a margin-driven commodity business means it is inevitable.

It took my firm six years to automate the process such that 100,000 customers can be valued in a matter of minutes, so we know it takes time and analytical expertise. But utilities know that they need to go down this road, and they will. Moreover, regulators know that to make EE, DR and solar more cost-effective, they need to push utilities down this road to locate these resources where they get the biggest bang for the buck. We already see this push in California, and other states will follow.

Why Google will struggle

Margins and costs vary every hour, across 8,760 hours of the year, and across various booms and busts in energy prices, mostly attributable to hourly weather, congestion and shortages. In a grid with few buffers (storage, warehouse, inventory) between supply and demand, we see the familiar price spikes which underlie the drive to smart grid innovation.

So even though Google has a strong quantitative team, they likely don’t know that regressions, logits, multinomial logits, Black-Scholes, panel data models, Bayesian modeling and other statistical methods for financial engineering are inadequate to accurately quantify the substantial value available from knowing the costs and margins of individual household loads. It requires a mix of engineering, simulation, forecasting and causal-based financial derivatives (not Monte Carlo methods, nor Black-Scholes).

The more advanced utilities are getting it and moving in this direction. Moreover, the more sophisticated financial modeling methods reveal that good valuations and “averaged” valuations frequently differ by a factor of 2x or more. Large amounts of value are just now being identified and captured. This is a big issue, because knowing which 20% of the market yields the most creates a significant first-mover advantage. The remaining 80% won’t be worth nearly as much, due to the specific locational values of certain customers and circuits.

Which utilities will survive (or even thrive)

Those utilities that do understand the difference, and work toward preserving those customers where margin exists, are likely to survive, perhaps even thrive, despite the threat of a death spiral in their franchise. But during this process, there will also be increasing pressure, likely driven by the regulators, to adopt the true cost to serve for their customers.

Does Google appreciate this? It remains to be seen. Utilities will be forced to move from class-level pricing to individual pricing, a truer and more accurate cost to serve (though some cross-subsidy is likely to continue). Thus, the ability of new entrants to cherry-pick customers will evaporate. Whoever gets there first wins.

Smart meter usage profiles are already in place, and it is very easy for regulators to establish an exact cost to serve per home, versus the current model where we use “settlement shapes,” an averaged load shape applied to all customers. If a death spiral seems to be in the offing, regulators will likely accept a utility’s move to individual cost-to-serve methods versus the current use of averages.
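A minimal sketch (all numbers invented) of the difference between the two settlement approaches: two homes with identical total usage settle for the same amount under an averaged shape, while interval-based pricing exposes very different true costs to serve.

```python
# Settlement-shape averaging vs. individual cost to serve. The hourly costs,
# load profiles and flat "class shape" are all hypothetical simplifications.

HOURLY_COST = [0.04] * 16 + [0.25] * 4 + [0.04] * 4  # $/kWh, evening spike

def true_cost(interval_kwh):
    """Each metered interval priced at that hour's marginal cost to serve."""
    return sum(k * c for k, c in zip(interval_kwh, HOURLY_COST))

def settled_cost(total_kwh):
    """Status quo: total usage spread across a class-average settlement
    shape (taken as flat here purely for simplicity)."""
    return total_kwh * sum(HOURLY_COST) / len(HOURLY_COST)

peaky = [0.2] * 16 + [2.0] * 4 + [0.2] * 4  # evening-peak household, 12 kWh
level = [0.5] * 24                          # level-load household, 12 kWh

for name, home in (("peaky", peaky), ("level", level)):
    print(f"{name}: settled ${settled_cost(sum(home)):.2f}, "
          f"true ${true_cost(home):.2f}")
# Both settle at $0.90, yet the true costs differ by 2.4x: the cross-subsidy
# that individual cost-to-serve pricing would expose.
```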

I doubt Google is ready for that or appreciates the potential threat this poses to the Nest business case.

Why time-of-use pricing won’t help Google

Time-of-use (TOU) pricing (upon which many programmable thermostats build their business case) won’t resolve the matter, either. Google may know that current market acceptance of TOU is low. But they need to realize that it will always be low. Active TOU participation rarely exceeds 5% to 10%, and many of these active participants are “free riders”: those who already use little energy on peak.

The utilities know this, but the regulators have not quite caught on. Most regulatory agencies and evaluation teams have a staff of well-trained economists, not marketers. While an economist may trust the rationality of the consumer, a marketer knows that non-price options generally win the day when commodities are comparably priced. Indeed, the TOU pilots with the greatest participation are those that beef up the non-price factors, such as TV and radio advertising (e.g., Flex Alert during the California pricing pilots).

Google’s continued avoidance of utility partnerships might suggest that they intend to push TOU-based transactive signals for Nest to receive. At the end of the day, most consumers are unlikely to respond to these prices on their own, without a third party doing it for them. And the value of virtual storage can only be achieved via tightly linked, optimized end-use dispatching in a dynamic fashion.

But Google may be thinking along these (TOU) lines. What better way to decouple the utility from its customer? However, Google may be leaving value on the table by under-appreciating local variation and the value of virtual storage. The maximum cost savings are achieved by optimally arbitraging end uses directly, rather than hoping for a customer response to a transactive signal. In the end, the response will be automated by third parties, leaving consumers to live their lives without worrying about saving pennies each day by watching an hourly TOU signal.

Enter locational targeting

Enter locational targeting and locational cost-effectiveness. In late 2013, California began calling for more attention to the specific locations where EE, DR, solar and smart grid activity might yield a more significant efficiency bang for the buck. The regulators are currently determining how utilities should be required to respond; I get questions about it every month.

This much is clear: we know that the cost to serve a home can often change dramatically if we pick it up and move it over a few streets. We know that local congestion on the network will spike local LMPs (locational marginal prices). What Google does not yet seem to appreciate is that this locational financial impact can alter margins up to 10-fold, and can be larger than the value of the commodity cost alone.

I suspect that the Nest business models are based on spot market energy and capacity credits in the range of $80 per kW-yr. What Google probably doesn’t know is that there are very specific locations where the total avoided costs can be 2X to 10X larger than their averaged value assumptions.

Did Google overpay? By a bunch?

Or maybe they do know this, and that is why they paid about $3,000 per customer for Nest. We often see the average avoided cost savings to be in this range when we value the savings from all of the utility value buckets jointly (e.g., supply, transmission, distribution, ancillary services, voltage, losses, asset deferral). The normal range for the commodity margin of a utility customer is loosely $100 per year (residential), or perhaps $700 NPV: still way less than $3,000. We only see $3,000+ valuations where we identify high cost-to-serve customers in specific locations. There, the value can climb quite a bit higher than $3,000.
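The arithmetic is easy to check with a back-of-envelope NPV. The discount rate and horizon below are my own assumptions, since the piece gives only the roughly $100-per-year margin and roughly $700 NPV endpoints:

```python
# Rough NPV check: why an average customer supports ~$700 and only a
# locational premium supports $3,000. Rate and horizon are assumed.

def npv(annual_margin, rate, years):
    """Present value of a constant annual margin."""
    return sum(annual_margin / (1 + rate) ** t for t in range(1, years + 1))

avg = npv(100, rate=0.07, years=10)  # average residential commodity margin
print(f"average customer NPV   ~ ${avg:,.0f}")  # ~$702, near the $700 cited

# A high cost-to-serve customer in the right location, worth several times
# the averaged avoided cost (the 2X to 10X above), is what covers $3,000:
print(f"5x locational premium  ~ ${5 * avg:,.0f}")  # ~$3,512
```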

All of this assumes the current price volatility and grid costs persist, and they will until 20% of the load participates in virtual storage. But my guess is that Google is not valuing the utility cost savings or margin, but rather the value-added service markets and appliance offerings that can come after a Nest sale. In this sense, Google gets a great deal from the Nest purchase, and Nest has short-changed its investors. Most utilities don’t want to be in the value-added services side of the business. But neither does Google see the money it is leaving on the table by ignoring the large value to the utility from specifically targeted customers.

But what about high-value customers?

You might think that more of these high-value customers just give Nest and Google more upside. Right? Wrong.

We have performed analyses which show that, after a certain number of MWs of smart grid savings for a specific minimum number of hours per year, these smart grid “resources” can become the price setter in the market, as opposed to the iron in the ground.

If those resources were to be controlled by an unscrupulous third-party vendor using Enron-like gaming tactics (pre-cooling, pre-heating, charging EVs in the morning), the resource could be used to drive prices up in the morning to make a better business case for reducing load during afternoon peaking hours, in effect gaming the utility’s DR incentives.

Many regulators have already decided that EV charging does not require utility-level oversight. It’s not a stretch to believe they won’t anticipate the need to regulate Nest and other Google services. To mitigate that risk, utilities can ask to retain control, or secure sufficient MWs of DR behind their own customers’ meters to counter the potential gaming.

The key point: there is a set of ideal market segments one would pursue to ensure competitive sustainability. Whoever secures these groups first, especially in specific locations, stands to gain the most by offering more money, value, and innovation. We have conducted several modeling projects showing that a utility can realistically flatten the load curve, and reduce marginal prices significantly, using only 20% of its customers.

So, it is the best 20% that will win the day, in the long run. All other contenders will be priced out of the markets. Prices will drop, and business models will be rendered moot for many current technologies.

The importance of tapping avoided cost to serve

Sure, glitzy consumer innovations may secure market share in small niches, but the ones that tap into the true utility avoided cost to serve stand to benefit dramatically from the very large locational differences from street to street and town to town.

Will it be Google or others that secure demand reductions in the $20 to $40 per kW-yr range? Where they can target the right customers in the right locations, year to year, they stand to dictate the market. And in this scenario, if I only have to acquire 20% of the right customers to set the price in the market, paying $3,000 per customer is a bargain-basement price. But to capture this value, Google will need to grow its utility analytics quite a bit, assure continued ISO cooperation, and build a more conciliatory relationship with both utilities and state regulators. It is clearly possible. But it is far from inevitable.

And what about grid reliability?

Google may not understand the significance of grid reliability. A recent Wall Street Journal report cites that the loss of 10 or so key substations could bring down the grid for months. Disaster ensues. Regulators cannot allow this. The ISOs have sensed this for some time, increasingly investigating how to forecast local loads for use in 10- to 20-day LMP forecasts. All they have today are next-day LMPs.

But the rapid growth of large grid-scale solar, wind and storage is placing increasing strains on system operators (e.g., Texas, California), to the point where I think ISOs may be worried about the long-run reliability of the grid. The way to address this is to migrate from city-wide load forecasting (the current state) to local, acre-level forecasting. And the way to quantify this, ironically, is to use a combination of econometric modeling and Google Earth-style satellite imagery.

When we do this, we can identify very specific locations where smart grid resources yield the biggest bang for the buck. Moreover, we can forecast local LMPs by bus accurately for 10 to 20 years, inclusive of large-scale solar, wind or storage. So the utility, the ISO, and third parties like Google could know exactly where the placement of micro resources delivers the most value. In essence, the ability to improve reliability (and uncover large commodity savings) at a fraction of the currently projected costs exists today, and we will be seeing a lot of it in the months ahead.

To the best tools go the spoils

There is a race to dominate this market. The winners will have a sophisticated set of tools. Some tools will use advanced analytics to value marginal customers instead of average customers, and do it locationally. Some tools will target and secure the optimal set of market segments that will survive the coming competition.

Has Google thought through these two issues fully? I don’t see much evidence for it. Do they care? They should. If they don’t, someone else will, and Google will lose the bigger market opportunity. Should they partner with utilities or ignore them? Ignore the utilities, and Google potentially loses out on value buckets that might be greater than what one gets from value-added services and gadgets alone. Gadgets are replicable. Being first to market, to the right segments, year by year, wins the day, in my opinion. And so, let the race begin.




