Volatility and Creative Destruction's Effects on Damages in Arbitration
This is an Insight article, written by a selected partner as part of GAR's co-published content.
Extreme volatility in commodity prices, stock markets and the wider global economy is a major contributor to the recent surge in international disputes. In addition, we are witnessing fundamental changes in technology, infrastructure and business processes, as well as broader economic developments and geopolitical change. Indications are that this upheaval is unlikely to abate soon and will give rise to further arbitration activity over the coming months and years. The inherent uncertainty as to what the future holds makes the quantification of forward-looking damages claims and the valuation of businesses a complex exercise. Given the dramatic pace of change, how is it possible to make reasonable predictions as to what even the short-term future holds?
If a damages expert’s work is to be of value to a tribunal, the expert must take special care to avoid falling on the wrong side of the divide between providing a reasonable estimate of a claimant’s economic loss and engaging in mere speculation. This article examines the link between volatility and change, on the one hand, and commercial and investment disputes, on the other. It suggests why such volatility and change occur, and examines some of the issues damages experts must resolve in the context of their work.
How volatility causes disputes
Volatility and risk are ever-present in business, and corporations have devised a number of risk management strategies to enable them to seek to limit such risk and transact with other parties or enter into long-term agreements. By way of illustration, one way in which companies hedge their pricing risk (eg, raw material purchases) is through the use of derivatives such as futures and options. In addition, commercial contracts frequently include price revision clauses using, for example, annual benchmarking processes, detailed price escalation formulae linked to market-based indices or minimum or maximum order quantity agreements. While such mechanisms are often successful in serving their intended purposes, unforeseen circumstances can and do arise that ultimately end in the parties being unable to re-negotiate and having to resort to arbitration to resolve their disputes.
To give an obvious example of how price volatility causes disputes, recent swings in the price of natural gas have seen many parties to long-term supply contracts fall out with their business partners. One of the structural causes of such disputes is the linkage in certain contracts (particularly in mainland Europe) between the price of natural gas supplied and the price of oil.1 In many contracts, the price of natural gas supplied is partly determined by the prevailing price of oil; all is well provided gas and oil prices mirror one another - as they did to an extent until 2009 - but a rising oil price coupled with a falling natural gas price is to the detriment of the wholesale gas purchaser. The gas purchaser, left with a contract that is no longer commercially viable, faces the commercial necessity of renegotiating the price and rebalancing the relationship. Self-evidently, the inverse is true of the seller, who is enriched by this oil-gas price decoupling. The result is frequently breach of contract or arbitration.
The chart below shows not only the wild swings in the prices of natural gas and crude oil over recent years, but also the widening disconnect between the two commodities.
What is volatility?
In the world of finance, volatility may be defined as the variation of the price of a security or asset over a certain period of time. It is important to distinguish between swings in prices within a certain range (volatility) and the direction (up or down) in which price movements occur, as the concepts are quite different. By way of illustration, a general rise in a stock market index over a number of years is said to reflect a bull market (direction), whereas a 200-point daily rise in the index followed the next day by a fall of the same magnitude exhibits volatility.
There exists much debate - and little consensus - as to what exactly causes volatility. Many academic papers have been written examining the relative influences on volatility of changing attitudes to risk, fear, asymmetrical and incomplete information, market inefficiencies or efficiencies, computerised trading, market manipulation and so on. All of these factors undoubtedly play their role at different times, but the nature, cause and scale of volatility is ever-changing. Like the proverbial chicken and egg, one might wonder whether it is uncertainty that causes volatility or the other way round.
Finally, it is worth noting that volatility can occur at all stages of the business cycle, not merely in times of recession; it arises in both bull and bear markets. Some commentators take the view that commodities are in a multi-year bull market; the chart overleaf shows that the general trend in commodity prices is indeed up but that, since 2008, the market has been subject to sharp swings up and down.
Market and price volatility is likely only a small part of the backdrop to the rise in international disputes. We would suggest that volatility is merely a symptom of something with far greater significance for arbitration, namely continuous and profound change in all aspects of the global economy. At this point, it behooves us to turn to the Austrian-American economist and political scientist Joseph Schumpeter, who famously coined the term ‘creative destruction’ to describe the process by which capitalist societies are constantly changing. According to Schumpeter:
...the fundamental impulse that sets and keeps the capitalist engine in motion comes from the new consumers’ goods, the new methods of production or transportation, the new markets, the new forms of industrial organisation that capitalist enterprise creates.2
Schumpeter criticised the prevailing view among his contemporaries that simple price competition was the best (or only?) explanation for how the economy functioned. He believed instead that change was far more prevalent and that competition in the real world came from:
...the new commodity, the new technology, the new source of supply, the new type of organisation... competition which commands a decisive cost or quality advantage and which strikes not at the margins of the profits and the outputs of the existing firms but at their foundations and their very lives.3
Schumpeter’s insights appear to make a great deal of sense when one considers a real-life example of creative destruction in action, such as is occurring in the smartphone market. Research in Motion (RIM) launched the revolutionary BlackBerry in 1999 and dominated the smartphone market for years; today, its very survival is under serious threat. From an all-time high of around US$150 in 2008, RIM’s share price has collapsed to under US$7 today. What happened? Quite simply, new technologies emerged in the form of handsets running Google’s Linux-based Android operating system, and Apple’s iPhone. These rivals captured huge swathes of the market previously occupied almost exclusively by RIM. From boasting a market share in the United States of close to 50 per cent in early 2008, RIM now accounts for less than 10 per cent, against a combined share of 82 per cent for (now dominant) Android handsets and iPhones.4, 5
The emergence of the first iPhone in 2007 was not even seen as a genuine threat by RIM executives, who completely underestimated the likely impact of this new technology.6
On a topic of direct relevance to much current arbitration, recent advances in hydraulic fracturing techniques (fracking) have had a significant impact on natural gas prices. While fracking was first used in 1947, it was not until 1997 that modern techniques were devised to allow shale gas to be extracted economically. Shale gas has now greatly expanded global energy supplies and production has grown exponentially. In the United States, for example, production of shale gas accounted for less than 2 per cent of all natural gas produced in 2001, but has increased to around 30 per cent today.7 As natural gas prices in the United States and elsewhere have retreated in part in response to this additional supply, there has been a knock-on effect on other parts of the economy, resulting in many different types of disputes emerging. In addition to the glut of price-review arbitrations, companies engaged to build or operate LNG terminals, for example, have found themselves in dispute as their counterparties renege on transactions that appeared highly profitable prior to the shale gas revolution. Conversely, it is interesting to look at the impact of a less favourable regulatory environment in the nuclear industry after the Fukushima Daiichi accident, or the rash of disputes that have erupted in the solar energy market following changes to levels of government subsidies.
Other examples of creative destruction abound, such as digital photography rendering the photographic film industry near-obsolete, or the rise of internet retailing destroying the business models of many high-street retail chains. While their precise form may be hard to predict, in a highly dynamic economy with technology changing at unprecedented speed, it is clear that other trends will emerge with important implications for arbitration.
As we shall see, extreme volatility and creative destruction (ie, change) raise important issues in the context of forward-looking damages claims and valuation of businesses.
Impact on quantum
In many ways, the increased volatility witnessed today changes nothing from the perspective of the professional valuer or damages expert; dealing with uncertainty about the future and trying to model and quantify the unknown has always been integral to the expert’s role. The expert’s job, however, has undoubtedly become more difficult for a number of reasons, which we set out in the remainder of this article.
Many damages experts’ assignments involve forecasting what the future outcome of a specific project would have been in the absence of one or more actions on the part of the respondent (the ‘but for’ or ‘counterfactual’ scenario), and modelling what the future is likely to hold as a result of said actions (the actual scenario). The valuation professional has a number of different methodologies available to him or her including the market comparables approach, the income approach, and asset-based approaches. For the purposes of this article, due to its inherent flexibility and growing acceptance in the arbitration community, we will assume that the expert is producing a discounted cash flow (DCF) model.
The output of a DCF model is a single number (or range of numbers), representing the net present value (NPV) of a project’s or business’ projected future cash flows, discounted to take into account the time value of money and the uncertainty - both upside as well as downside - of the projected future earnings. When DCF is applied correctly, the calculated NPV approximates to the fair market value (FMV) of a project or business since it reflects the present value of the future cash flows and hence determines a price at which a well-informed and willing vendor and purchaser could transact. Crucially, the expert’s conclusions on value are reached as at a certain point in time. Since the conclusions are based only on information that is either known or knowable at the time of valuation, hindsight information should usually not be employed to cast doubt on the valuation conclusion.
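The discounting mechanics at the heart of a DCF model can be sketched in a few lines. The cash flows and the 10 per cent discount rate below are invented for illustration, not drawn from any real valuation:

```python
# Hypothetical illustration: cash flows and discount rate are invented.

def npv(cash_flows, discount_rate):
    """Discount a list of year-end cash flows (years 1, 2, ...) to present value."""
    return sum(cf / (1 + discount_rate) ** t
               for t, cf in enumerate(cash_flows, start=1))

# Five years of projected free cash flows for an illustrative project
projected = [100.0, 110.0, 121.0, 133.0, 146.0]

value_at_10pct = npv(projected, 0.10)
print(round(value_at_10pct, 1))
```

The same routine, run under both the counterfactual and the actual scenario, gives the two present values whose difference represents the claimed loss.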
The quality and relevance of the output from a DCF model depend on the quality of the inputs - for example, the reasonableness of the growth assumptions and the discount rate. Typically, both the counterfactual and the actual scenarios will require the expert to produce a DCF model that includes forecasts of future revenues, growth rates, costs, required capital investment and working capital needs, and to make a number of other assumptions. Increased volatility and the pace of change more broadly mean that it is far harder than before to predict each of these elements. Before embarking on creating complex spreadsheets, therefore, the expert firstly needs to invest significant time in due diligence in order to understand in depth the nature of the industry and market in which the relevant business or project operates. How is the subject project or business positioned from a competitive standpoint against its peers? To what extent can current or past earnings be said to be sustainable given potential risks, for example, entry into the market of a new competitor, demise of key customers, increases in the cost of raw materials, and so on?
The example of RIM shows how difficult forecasting can be for the valuation expert, even if he or she is armed with the most up-to-date information. An expert attempting to value RIM at the end of 2007 would certainly have been aware of the launch of Android technology and the Apple iPhone; what the expert could not know at the time is how quickly and how fundamentally the iPhone and Android-based phones would change the smartphone market at the expense of RIM. Given that senior RIM executives themselves saw the iPhone as only one more new entrant in a crowded market, the expert might reasonably have assumed RIM would continue to dominate. On this basis, provided that the expert made his or her valuation reasonably, taking into account all known or knowable information, it would be hard to criticise the resulting valuation. Indeed, the market has to deal with uncertainty as a fact of life even though subsequent events may disprove assumptions made at the time. With the benefit of hindsight, however, we can say that a DCF model produced in December 2007 that predicted significant year-on-year revenue growth for the following 10 years would have been wrong.
For these reasons, quantum experts and valuation professionals should provide a sensitivity analysis to show how their valuation would change as the assumptions are adjusted. Moreover, experts need to be especially cautious in developing forecasting models and avoid the use of aggressive or unrealistic assumptions. In practice, however, what does this mean?
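A sensitivity analysis of the kind described can be produced by re-running the valuation over a grid of assumptions. In the sketch below, all figures (base cash flow, growth rates and discount rates) are invented for illustration:

```python
# Hypothetical sensitivity table: all figures are invented for illustration.

def npv_with_growth(base_cf, growth, discount_rate, years=5):
    """NPV of a cash flow stream starting at base_cf and growing at `growth`."""
    total = 0.0
    cf = base_cf
    for t in range(1, years + 1):
        total += cf / (1 + discount_rate) ** t
        cf *= 1 + growth
    return total

# NPV under a grid of growth (rows) and discount rate (columns) assumptions
print(f"{'growth':>8} | " + " | ".join(f"{r:>6.0%}" for r in (0.08, 0.10, 0.12)))
for g in (0.00, 0.03, 0.06):
    row = [npv_with_growth(100.0, g, r) for r in (0.08, 0.10, 0.12)]
    print(f"{g:>8.0%} | " + " | ".join(f"{v:>6.1f}" for v in row))
```

A table of this kind makes visible to the tribunal how strongly the conclusion depends on each contested assumption.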
We suggest that the experts who are of most assistance to the tribunal are those who adopt a sceptical and objective mindset. Firstly, this means the expert being prepared to challenge firmly what the claimant tells him or her about its business. If, for example, a claimant’s management suggests to the expert that a given project’s internal rate of return (that is, the average annual return or yield from the investment) over its 10-year lifetime is expected to be in the order of 30 per cent, this should cause proverbial alarm bells to ring in the expert’s head.8 This is not to deny that some projects can be highly profitable in the short term, especially if the investor has an important first-mover advantage or there are large barriers to entry in the market; however, according to basic economic principles, price competition and innovation will reduce margins over time. As such, returns in excess of the cost of capital trend towards zero over the long term. On (fortunately rare) occasions, we have seen DCF models that ignore such realities and project exponential returns into perpetuity.
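One way to test such a claim is to back out the internal rate of return actually implied by the project’s cash flows. The sketch below, using an invented outlay and receipts, solves for the IRR by simple bisection (the rate at which NPV equals zero):

```python
# Sanity check on a claimed IRR; the cash flows are hypothetical.

def irr(cash_flows, lo=-0.99, hi=10.0, tol=1e-8):
    """Internal rate of return of cash flows at t = 0, 1, 2, ... via bisection."""
    def npv(r):
        return sum(cf / (1 + r) ** t for t, cf in enumerate(cash_flows))
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if npv(mid) > 0:
            lo = mid          # NPV still positive: true IRR lies higher
        else:
            hi = mid
    return (lo + hi) / 2

# An outlay of 1,000 followed by ten annual receipts of 320
flows = [-1000.0] + [320.0] * 10
print(f"{irr(flows):.1%}")
```

If the receipts required to produce the claimed 30 per cent look implausible against the project’s history and market, the alarm bells mentioned above are warranted.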
We now turn to how, specifically, the valuation expert adopts a cautious approach in building a DCF model. A DCF model will comprise firstly an explicit forecast period in which the valuation expert will make specific revenue, cost, growth, capital expenditure and other assumptions in order to model expected cash flows for each of those years. This explicit period may mirror the period over which the claimant itself forecast, or felt able to forecast, its business’s prospects. If appropriate, the model may also include a terminal period to capture the value of the cash flows after the explicit period. In order to calculate the terminal value, the expert must make assumptions as to revenues, costs, and required capital expenditure in the final year of the explicit forecast period; crucially, he or she must make an assumption as to the long-term growth of the project or business.9
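The terminal value calculation described above is commonly performed with a constant-growth (Gordon) formula. In the sketch below, the final-year cash flow of 146, the 2 per cent long-term growth rate and the 10 per cent discount rate are hypothetical figures:

```python
# Gordon-growth terminal value sketch; all figures are hypothetical.

def terminal_value(final_year_fcf, long_term_growth, discount_rate):
    """Value, as at the end of the explicit period, of cash flows growing
    at a constant rate in perpetuity (requires discount_rate > growth)."""
    if discount_rate <= long_term_growth:
        raise ValueError("discount rate must exceed long-term growth")
    return (final_year_fcf * (1 + long_term_growth)
            / (discount_rate - long_term_growth))

tv = terminal_value(146.0, 0.02, 0.10)   # value at the end of year 5
tv_today = tv / (1 + 0.10) ** 5          # discounted back to the valuation date
print(round(tv, 1), round(tv_today, 1))
```

Because the terminal value often dominates the total NPV, the long-term growth assumption deserves particular scrutiny.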
Since the future is by definition unknown, best practice dictates that the explicit forecast period should not be too long. What is reasonable will vary on a case-by-case basis and the expert must use his or her judgment and knowledge of the relevant industry to determine the appropriate length of the explicit period. We would suggest that, for some industries (eg, utilities such as water), cash flows are more predictable and less influenced by volatility or change than in others (eg, high technology, media).
Arbitrators should be wary of cash flow forecasts that assume high growth rates over an extended time period. While counter-cyclical businesses do exist that prove an exception to the rule, trading conditions have generally become harder for most businesses and therefore assumed growth rates should reflect this. In addition, as we have explained above, growth rates will tend to level off over time due to the entry of new competitors and price competition among other factors.
Notwithstanding the fact that valuation can be a difficult exercise given the inherent uncertainties, we wish to emphasise that, in practice, investors and companies trade and transact all the time on the basis of less than perfect information. Such investment decisions are based on parties’ respective evaluation of relevant future prospects. DCF is a powerful tool to enable such valuations to be performed, based as they are on the risk adjusted rate of return on future cash flows. Best practice requires that the conclusions of a DCF model be cross-checked where possible, for example, against recent transactions in comparable companies.
Market volatility and current economic conditions have important implications for the determination of the discount rate, which is used to discount future cash flows into present day terms. The discount rate takes into account investors’ required returns on their investment and reflects the riskiness of the future cash flows. In general terms, the more risky a project is, the higher its discount rate. The higher the discount rate that is applied to a stream of future cash flows, the lower will be their present value and vice versa.
For the purposes of this discussion, we will assume that the relevant project or business requires a mix of debt and equity funding. In our discussions, we will consider the impact of market turbulence on the cost of debt and cost of equity, the two components needed to calculate the weighted average cost of capital (WACC), being the discount rate. We will assume that the cost of equity is to be calculated using the capital asset pricing model (CAPM).
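A minimal sketch of the WACC computation follows; the 70/30 equity/debt structure, the 12 per cent cost of equity, the 6 per cent cost of debt and the 25 per cent tax rate are all assumptions chosen for illustration:

```python
# Illustrative WACC calculation; capital structure, rates and tax rate
# are hypothetical assumptions.

def wacc(cost_of_equity, cost_of_debt, equity_value, debt_value, tax_rate):
    """Weighted average cost of capital, with the tax shield on debt interest."""
    total = equity_value + debt_value
    return (cost_of_equity * equity_value / total
            + cost_of_debt * (1 - tax_rate) * debt_value / total)

rate = wacc(cost_of_equity=0.12, cost_of_debt=0.06,
            equity_value=700.0, debt_value=300.0, tax_rate=0.25)
print(f"{rate:.2%}")
```

Note that the debt leg is taken after tax, reflecting the deductibility of interest; this is one reason the overall WACC sits below the cost of equity.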
There are, of course, several different methods for determining the discount rate, and valuation experts and academics can and do disagree as to which is the most appropriate approach. Interestingly, a recent Delaware court decision came out strongly in favour of the CAPM approach over the ‘build-up’ method on the basis that the presiding judge, Chancellor Strine, believed the build-up method ‘has not gained acceptance among distinguished academicians in the area of corporate finance’.10
It is interesting to read Strine’s views, although it should be understood that CAPM and the build-up methods are not entirely different animals. While we do not have the space in this article to attempt a forensic comparison between the two, conceptually and in practice, the build-up method is closely related to CAPM and there are modified versions of CAPM that can appear similar to the build-up approach. It will be interesting to see whether the arbitration community will form any consensus on such matters in the future.
Returning to the calculation of WACC, our starting point is the risk-free rate, which is the default-free long-term interest rate in a currency and is used to estimate both the cost of debt and equity. One proxy for the risk-free rate is the yield on 10-year US treasury bonds, although there is now some debate as to whether even US treasury bonds are truly ‘risk-free’. As we shall see, the risk-free rate has changed significantly over time; as the rate changes, this causes the valuation to change also.
From a record high of over 14 per cent in 1982, the yield on the 10-year US treasury bond reached an all-time low of 1.4 per cent in July 2012. At the time of writing this article, the yield has recovered only slightly, to around 1.65 per cent. The chart below shows in stark terms both the change over time of the risk-free rate and also the historical nature of the record low.
While this chart puts the all-time low in US bond yields into historical context, since it spans nearly 100 years, it fails to show just how volatile the yield has been since the financial crisis began. Since most commentators pinpoint the real beginning of the financial crisis as the collapse of Lehman Brothers in mid-September 2008, it is worth considering the path of the yield thereafter. As the chart below shows, the yield on 10-year bonds has been fairly volatile since 2008, fluctuating within a range of around 1.5 per cent to 4 per cent on a downwards trend. As we explain later, the presence of volatility means that the valuation date chosen is an issue of great importance.
As a key component of both the cost of debt and the cost of equity, all things being equal, a lower risk-free rate results in a lower discount rate. The significance of a low discount rate is that the net present value of a future stream of cash flows will be higher than with a higher discount rate. This may at first seem paradoxical; since much of the economy is struggling to stay afloat, one might expect the riskiness of most projects’ cash flows to be higher, not lower. The answer to this paradox is that the risk-free rate is, of course, only one component of the discount rate. In the cost-of-debt formula, for example, one also needs to factor in the additional return above the risk-free rate required by lenders to companies, referred to as the corporate debt margin or corporate spread. We discuss the cost of equity later in this article.
The debt margin on any given corporate bond will depend on a number of factors, and is greatly influenced by the credit ‘rating’ attributed to it by specialist agencies such as Standard & Poor’s and Moody’s. The purpose of such ratings, according to Moody’s, is to provide investors ‘with a simple system of gradation by which future relative creditworthiness of securities may be gauged’.11 Agencies can and do vary in their methodology for assessing the creditworthiness of securities and will take into account many different parameters. One of the most important of these parameters is the interest coverage, defined as the ability of the borrowing firm to pay interest on its debt from its earnings; in general terms, the higher the interest coverage, the better the credit rating.
Depending on the particular agency, ratings range from the highest investment grade (AAA) to medium grade (Baa) down to the lowest (junk) bonds (C).12 For obvious reasons, the most creditworthy bonds (AAA) enjoy the lowest spreads (ie, the differential above the treasury benchmark bonds) and junk bonds the highest. In calculating the relevant cost of debt, the valuation expert will usually refer to corporate spread data for the debt instruments and company types that most closely match the risk profile of the company or project he or she is seeking to value.
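The mechanics can be sketched as a lookup from interest coverage to an indicative spread, which is then added to the risk-free rate. The coverage bands and spreads below are invented purely to illustrate the approach; they are not any agency’s actual criteria:

```python
# Hypothetical mapping from interest coverage to a corporate spread.
# Bands and spreads are invented for illustration only.

SPREAD_BANDS = [  # (minimum coverage, indicative spread over the risk-free rate)
    (8.5, 0.0075),   # high investment grade
    (4.0, 0.0200),   # medium grade
    (1.5, 0.0400),
    (0.0, 0.0650),   # speculative ('junk')
]

def cost_of_debt(risk_free, ebit, interest_expense):
    """Risk-free rate plus a spread chosen by the firm's interest coverage."""
    coverage = ebit / interest_expense
    for min_coverage, spread in SPREAD_BANDS:
        if coverage >= min_coverage:
            return risk_free + spread
    return risk_free + SPREAD_BANDS[-1][1]   # coverage below zero: worst band

print(f"{cost_of_debt(0.0165, ebit=240.0, interest_expense=40.0):.2%}")
```

The same firm, valued at a date when spreads had blown out, would face a markedly higher cost of debt, which is precisely why the valuation date matters so much.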
With the onset of the financial crisis in 2008, something very interesting began to happen in terms of corporate spreads between debt instruments of different ratings. As the chart below shows, from 2005 until 2007, the corporate spread of AAA-graded bonds remained slightly below 1 per cent and that of Baa bonds around 2 per cent, resulting in a differential between the two of around 1 per cent. The period between 2008 and 2009 was marked by a severe credit crunch when, due to the ‘flight to safety’, investors eschewed almost all debt instruments except for those of the highest investment grade. Consequently, as shown in the chart below, the corporate spread of Baa bonds (medium grade) exceeded 6 per cent, whereas the spread of AAA bonds was only around 2.6 per cent, ie, a differential of 3.4 per cent. While this differential subsequently narrowed, it is still significantly above pre-crisis levels.
It will be appreciated that wildly fluctuating corporate spreads - and the differential between debt instruments of different ratings - have real implications for the calculation of the discount rate and hence valuation of a claimant’s damages. Significant differences can emerge between experts’ respective valuations due to, inter alia, the valuation date assumed, the assumed risk profile of the subject company (ie, selection of the appropriate corporate spread) and other factors. There is a further factor to take into account: in times of high volatility and heightened risk, companies with relatively high-risk profiles may struggle to obtain debt financing at all. If it is unable to obtain lending, the claimant’s counterfactual scenario is likely to be pessimistic, resulting in a small (if any) economic loss.
Turning to the cost of equity, in general terms, the valuation expert will seek to take into account the additional return the equity investor requires compared with the risk-free investment. The cost of equity is almost always higher than the cost of debt due to:
- the tax shield on interest paid on debt;
- the fact that debt holders will be repaid in preference to shareholders in the event of liquidation; and
- the fact that the return on debt is fixed and thus predictable, whereas shareholders benefit from excess returns.
The first element of this additional return is known as the equity (or market) risk premium (ERP). The size of the ERP at any point in time is heavily influenced by market volatility and overall change. Some of the factors that determine the ERP include:
- investors’ attitude towards risk - the more risk-averse investors (as a collective) are, the higher the additional returns they require. At the height of the credit crunch, investors’ flight to the relative safety of US treasury bonds was indicative of wide-scale risk aversion; and
- general state of the economy - where we are in the business cycle at a given time will influence the ERP. Volatile conditions, including erratic swings in inflation, economic growth, interest rates, and so on, will tend to increase the ERP.
There are differing opinions in the valuation and academic communities as to how the ERP should be calculated, over what time period, what the proxy for the risk-free rate should be and so on. Many experts prefer to take a (simple) long-term view and estimate the average historical ERP over, say, 50 years or even further back. This is done by estimating the actual excess return from equities over risk-free assets over the chosen historical period. Proponents of this method believe that using a long-term average means that the peaks and troughs of the market and business cycle are ironed out such that the historical ERP takes all eventualities into account.
The obvious downside with using a historical ERP is that the past may not be a reliable guide to the future; valuation is, after all, focused on what the future holds. For this reason, some experts prefer to calculate the ‘supply-side’ ERP which has a forward-looking assumption built-in. Under the supply-side ERP approach, the valuation expert attempts to calculate the difference between the expected total returns on stocks and the expected risk-free return. The expected total returns on stocks are usually calculated by focusing on either the expected dividend payment and future growth in dividends, or the share-price-to-earnings ratios. Clearly, in a time of volatility, uncertainty or economic recession, the market may have very low expectations of future earnings and dividends. Stock-market crashes are not unheard of either: in the eight days from 1 October 2008 to 10 October 2008, the Dow Jones Industrial Average fell by over 22 per cent, resulting in depressed price-earnings ratios for many stocks. Finally, for various reasons, the supply-side ERP will usually be lower than the historical ERP.
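The historical averaging described above can be sketched in a few lines. The annual return series below are invented to show the mechanics of the calculation, not real market data:

```python
# Toy historical-ERP calculation; the return series are invented.

def historical_erp(equity_returns, risk_free_yields):
    """Simple average of annual equity returns in excess of the risk-free yield."""
    excess = [e - r for e, r in zip(equity_returns, risk_free_yields)]
    return sum(excess) / len(excess)

equity   = [0.12, -0.05, 0.20, 0.07, 0.15, -0.10, 0.09, 0.11]
riskfree = [0.04,  0.03, 0.03, 0.05, 0.04,  0.02, 0.02, 0.03]
print(f"{historical_erp(equity, riskfree):.2%}")
```

A supply-side ERP would instead replace the realised equity returns with expected returns built up from dividend yields and expected growth, which is why it typically comes out lower.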
For the above reasons, the question of how and when the valuation expert calculates the ERP will have important implications for the determination of the discount rate.13
Once the ERP has been determined, the valuation expert’s next task is to calculate the beta, being a measure of how sensitive a company’s share price is to movements in the overall stock market index. The higher the beta, the more sensitive the stock is to changes in the overall market and vice versa. Thus, while a stock with a beta of one will theoretically move exactly in step with the market, stocks with high betas will exaggerate market movements (in both directions). A stock with a beta of 1.5 is, in theory, 50 per cent more volatile than the overall market. In other words, there is a greater chance of making more money than the market but equally a greater chance of making less: the stock is more risky.
In valuing the relevant project or company, the valuation expert will usually estimate the beta by referring to published data of comparable quoted companies. It should be understood that the value of beta for any given company is calculated using historical data and regression analysis; beta is, therefore, backward-looking. Despite this, it is thought by many that stock betas remain constant over time and do not vary, for example, in response to changes in market volatility. This assumption may not be correct. According to some recent academic research, betas can and do change in response to highly volatile market conditions. For example, Arisoy et al write:
...we find that portfolio betas change significantly when aggregate market volatility is beyond a certain threshold. More specifically, portfolios of small and value stocks have significantly higher betas at times of high volatility. The opposite is true for big and growth stock portfolios. Due to changes in their market betas, small and value stocks are perceived riskier than their big and growth counterparts in bad times, when aggregate volatility is high.14
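The conventional, backward-looking beta estimate is the covariance of the stock’s returns with the market’s returns, divided by the variance of the market’s returns. The periodic return series in the sketch below are invented for illustration:

```python
# Beta estimate from (invented) periodic return data:
# beta = cov(stock, market) / var(market).

market_returns = [0.02, -0.01, 0.03, -0.02, 0.01, 0.04, -0.03, 0.02]
stock_returns  = [0.03, -0.02, 0.05, -0.03, 0.01, 0.06, -0.05, 0.03]

def beta(stock, market):
    n = len(market)
    m_mean = sum(market) / n
    s_mean = sum(stock) / n
    cov = sum((m - m_mean) * (s - s_mean) for m, s in zip(market, stock)) / n
    var = sum((m - m_mean) ** 2 for m in market) / n
    return cov / var

print(round(beta(stock_returns, market_returns), 2))
```

The research quoted above suggests that such an estimate, being fitted to a past window, may understate or overstate the stock’s sensitivity in a high-volatility regime.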
Depending on the project or company appraised, under the modified CAPM approach it may be appropriate to add additional risk premiums to build up the cost of equity. One of the most common risk premiums is the country risk premium, which reflects the greater perceived risk of investing in a given region relative to investing in the most stable and developed economies, such as Western Europe, the UK and the USA; this reflects different political, economic and local currency risks. One way of measuring the country risk premium is by reference to the additional interest rate that would be payable on a loan for a given investment project in a particular (less stable) country compared to the rate payable for a loan for a similar project in a stable country such as the United States. Care must be taken to ensure that the terms, maturity and currency of the loans are the same. By way of illustration, few countries are currently immune to concerns over sovereign debt, with Greece, Spain and Portugal providing obvious examples. Political instability, upheaval, and depressed economic conditions are other factors that can impact on country risk. Consequently, many country risk premiums are likely to be volatile, within an upward trend.
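Putting these pieces together, a modified-CAPM cost of equity can be built up as the risk-free rate plus beta times the ERP, plus any additional premiums. Every input below is a hypothetical assumption chosen for the sketch, not a recommendation of particular figures:

```python
# Modified-CAPM build-up of a cost of equity; all inputs are hypothetical.

def cost_of_equity(risk_free, beta, erp, country_premium=0.0, size_premium=0.0):
    """CAPM cost of equity, optionally 'built up' with extra risk premiums."""
    return risk_free + beta * erp + country_premium + size_premium

base     = cost_of_equity(risk_free=0.0165, beta=1.2, erp=0.05)
adjusted = cost_of_equity(risk_free=0.0165, beta=1.2, erp=0.05,
                          country_premium=0.03, size_premium=0.01)
print(f"{base:.2%} {adjusted:.2%}")
```

As the example shows, a few percentage points of country or size premium can move the cost of equity substantially, and hence the NPV of the claim.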
As noted above, since conditions have been volatile for a number of years, the choice of valuation date is likely to have a significant bearing on valuation, and hence quantification of damages. It is important for the legal team and the expert to work closely together in order to determine what information should be taken into account, including the use of ex post or ex ante information.
In the face of market volatility and depressed economic conditions, claimants need to be realistic in their expectations as to how the counterfactual scenario would have played out in the absence of the respondent’s actions. Failure to take into account new or changed realities can lead to damages claims being significantly overstated.
Need for additional risk premiums
Volatility affects different businesses and industries in different ways. Companies that have powerful brands, a low fixed-cost base, low gearing and a strong business model are far more likely to emerge from an economic downturn intact than companies that do not meet these criteria. Whilst experts hold different opinions, under the modified CAPM approach it may be appropriate to include additional risk premiums to take into account the above-average riskiness of, for example, a small company.
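Such a build-up of additional premiums can be illustrated as follows. Every figure is a hypothetical assumption, and the labels are illustrative categories rather than a prescribed list:

```python
# Hypothetical build-up of a small company's cost of equity under the
# modified CAPM. All figures are illustrative assumptions only.
premiums = {
    "risk-free rate":        0.035,
    "beta x ERP":            1.1 * 0.06,
    "size premium":          0.030,  # above-average riskiness of a small company
    "company-specific risk": 0.020,
}

cost_of_equity = sum(premiums.values())

for name, value in premiums.items():
    print(f"{name:>22}: {value:.1%}")
print(f"{'cost of equity':>22}: {cost_of_equity:.1%}")  # sums to 15.1%
```

The additive structure makes each judgement visible and separately challengeable, which is precisely why, as the text observes, experts can reasonably disagree over whether a given premium belongs in the build-up at all.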
As we have discussed, market volatility and economic change have important implications for valuation and the quantification of damages. The recent economic environment has exacerbated the challenges experts face in dealing with these issues. This places greater emphasis on the expert’s ability not only to quantify and factor these difficult areas into their analyses, but also to explain and present their conclusions to the tribunal (and client) in as clear, transparent and logical a manner as possible.
Forward-looking projections are only as good as their inputs: ‘garbage in, garbage out’ is as true today as it has ever been. Yet the inputs described above are a moving target, and have been more volatile in recent years than many of us have experienced in our lifetimes. This in turn makes it harder to rely on those inputs for the purposes of estimating the future. Can we predict the future? That is the ultimate question, and the answer lies within a range of possibilities. The expert needs to consider all possibilities, but ultimately reach a reasoned conclusion based on a sound methodology, reliable inputs and an open mind as to what the future may bring.
* This paper should not be construed as expressing opinions on matters of the stock markets and law, which are outside the scope of the authors’ expertise. Nor does this paper represent the view of FTI Consulting Inc or any of its experts, who have held a range of views on the matters discussed in this article and may be expected to do so in future.
- See, for example, www.icis.com/Articles/2012/05/07/9556553/commentary-energy-hurts-europe.html.
- From Capitalism, Socialism and Democracy (New York: Harper, 1975) (originally published 1942), pp82-85.
- Jim Balsillie, co-chairman of RIM, famously announced ‘[Apple and the iPhone is] kind of one more entrant into an already very busy space with lots of choice for consumers... But in terms of a sort of a sea-change for BlackBerry, I would think that’s overstating it.’
- Report dated 18 August 2011 of the Shale Gas production subcommittee of the US Department of Energy (www.shalegas.energy.gov/resources/081811_90_day_report_final.pdf), p6.
- The IRR measures the average annual yield on an investment: the higher the IRR, the more attractive the investment. In general terms, the IRR is the discount rate at which the present value of all future cash flows is zero.
- In our experience, claimants do not always remember to make provision for the likely capital expenditure that will be required after the explicit period.
- Appraisal of The Orchard Enterprises, see http://courts.delaware.gov/opinions/download.aspx?ID=175740.
- Based on Moody’s ratings.
- Incidentally, the same Chancellor Strine to whom we made reference in relation to CAPM favours the supply-side ERP. Strine provides his reasoning thus:
I recently... addressed the choice between the historical equity risk premium and the supply-side equity risk premium in Global GT LP v Golden Telecom, Inc. In Golden Telecom, although recognising that the historical equity risk premium is the more traditional estimate, I concluded that the academic community has shifted toward greater support for equity risk premium estimates that are closer to the supply-side rate published by Ibbotson.
- ‘Aggregate Volatility and Threshold CAPM’ by Yakup Eser Arisoy, Aslihan Altay-Salih, Levent Akdeniz, November 2011.
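The IRR definition given in the notes above (the discount rate at which the present value of all cash flows is zero) can be sketched as a short root-finding calculation. The cash flows and the bisection approach are illustrative assumptions:

```python
# Minimal IRR solver by bisection: the IRR is the discount rate at
# which the net present value (NPV) of all cash flows is zero.

def npv(rate, cash_flows):
    """Present value of cash flows, one per period, starting at t = 0."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

def irr(cash_flows, lo=-0.99, hi=10.0):
    """Bisection search for the rate at which the NPV crosses zero.

    Assumes a conventional cash-flow pattern (one sign change), so the
    NPV falls monotonically as the discount rate rises on [lo, hi].
    """
    for _ in range(200):
        mid = (lo + hi) / 2
        if npv(mid, cash_flows) > 0:
            lo = mid  # NPV still positive: the discount rate is too low
        else:
            hi = mid
    return (lo + hi) / 2

# Hypothetical project: 100 invested today, 60 received in each of two years.
rate = irr([-100, 60, 60])
print(f"IRR: {rate:.1%}")  # about 13.1%
```

For cash-flow patterns with multiple sign changes the NPV curve can cross zero more than once, which is one reason practitioners treat the IRR as a summary yield figure rather than a substitute for a full DCF analysis.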