J**R
Focuses on the key inputs of a decision analysis
Highly recommended. Most capital investments are poorly analyzed. Doug Hubbard provides remedies for the common shortcomings:

o Typical cost-benefit analyses use single-value best estimates for inputs; these ignore or inadequately address risk and uncertainty. Important benefits are also often omitted because they are "intangibles" such as "improved customer service." Remedies in the book: capture expert judgments as probability distributions, then solve the forecast model with Monte Carlo simulation; decompose former intangibles and represent them explicitly in measurable units.

o Multi-criteria scoring approaches often feel good yet have little theoretical foundation. They are entirely subjective and have not been shown to improve decision-making. Remedy: clarify the business (or other) objective and craft a quantified decision policy accordingly. Judge and/or model inputs in meaningful, quantitative terms.

Everything important to a decision should be in a forecasting model, and everything in those models is either structural or quantified.

For me, three primary themes emerge in the book:

1. Calibration. Hubbard asserts that everything can be quantified, and he enjoys challenging individuals and groups to find any exception. Most people, even experts in their field, are biased when making judgments. Hubbard shows ways to "calibrate" these experts (including the reader) with perhaps a half-day of practice. Most readers (as I did) will initially fare poorly on the engaging calibration exercises in the book; we suffer from overconfidence and other cognitive biases. With feedback and practice, most people can quickly improve at assessing probabilities, probability distributions, and confidence ranges.

2. What needs to be measured further? The most interesting calculation in the book is a form of sensitivity analysis: which variables are most important to measure further?
Hubbard calculates the value of perfect information for each variable with a straightforward expected-opportunity-loss calculation. Though an analysis may have dozens of identified variables with uncertainty, in his experience typically only one to three variables are worthy of further measurement.

3. Deliberately seek information about the most important risks and uncertainties where the additional time and cost are justified. This usually means obtaining more data through targeted investigation. Hubbard offers these encouraging maxims:

o It has been done before.
o You have more data than you think.
o You need less data than you think.
o It is more economical than you think.

Additional highlights of the book include:

o Abundant war-story examples. Much of Hubbard's work has been in the information technology (IT) sector. I especially enjoyed the case about forecasting fuel consumption for the U.S. Marine Corps.

o Demystifying parameters that the reader might initially regard as "intangibles." For example, "improved customer service" might be measured in terms of the percentage of customers re-ordering, the percentage of returns, average delivery time, and so on.

o Example calculations, all done in Microsoft® Excel. Hubbard hosts a website, [...], where the reader can download example calculation worksheets, additional calibration quizzes, papers, articles, and reader comments.

The first edition is excellent. Upon learning of a second edition, I immediately bought the updated book for a fresh re-read. This was again a good investment of time and money. The new edition is updated, expanded (by about 15%), and crisper. The editing and layout are again excellent. Several references now point to his other best-selling companion book, The Failure of Risk Management.
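The remedy of capturing expert judgments as probability distributions and solving the forecast model with Monte Carlo simulation can be sketched in a few lines of Python. The 90% confidence intervals and the cost/benefit figures below are hypothetical illustrations, not numbers from the book:

```python
import random
import statistics

random.seed(0)  # fixed seed so the sketch is reproducible

def normal_from_90ci(low, high):
    """Turn a calibrated 90% confidence interval into a normal
    distribution: the mean is the midpoint, and the interval spans
    about 3.29 standard deviations (z = +/-1.645)."""
    mean = (low + high) / 2
    stdev = (high - low) / 3.29
    return lambda: random.gauss(mean, stdev)

# Hypothetical calibrated 90% CIs from experts (thousands of dollars).
annual_benefit = normal_from_90ci(200, 600)
annual_cost = normal_from_90ci(150, 300)

def simulate(n=100_000):
    """Monte Carlo: sample each uncertain input, compute net benefit,
    and report the mean outcome and the chance of a net loss."""
    outcomes = [annual_benefit() - annual_cost() for _ in range(n)]
    p_loss = sum(1 for x in outcomes if x < 0) / n
    return statistics.mean(outcomes), p_loss

mean_net, p_loss = simulate()
print(f"expected net benefit: {mean_net:.0f}k, chance of loss: {p_loss:.1%}")
```

Unlike a single-value best estimate, the simulation yields a distribution of outcomes, so risk (here, the probability of a net loss) falls directly out of the model.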
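The expected-opportunity-loss idea behind the value-of-information calculation can also be illustrated with a small sketch (the numbers are hypothetical). If the default decision is to invest, an opportunity loss is incurred only in scenarios where the net value turns out negative; perfect information would let us avoid exactly those scenarios, so the expected value of perfect information (EVPI) is the average of those avoidable losses:

```python
import random

random.seed(42)  # fixed seed so the sketch is reproducible

def evpi_of_investing(net_value_samples):
    """Expected value of perfect information for a go/no-go decision
    whose default is 'go': the average loss we could have avoided by
    knowing the outcome in advance (a loss occurs only when the net
    value is negative)."""
    losses = [max(0.0, -v) for v in net_value_samples]
    return sum(losses) / len(losses)

# Hypothetical uncertain net value (thousands of dollars), as if
# produced by a forecast model: normal with mean 175 and stdev 130.
samples = [random.gauss(175, 130) for _ in range(100_000)]
evpi = evpi_of_investing(samples)
print(f"EVPI: {evpi:.1f}k")
```

In the book this calculation is run per uncertain variable (holding the others at their distributions), which is how the one to three variables worth measuring further are identified; the sketch above shows only the aggregate version.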
C**N
fantastic to open your mind... dangerous to close it as well
This is one of the few books that I well and truly recommend to my colleagues, given that we are mostly stuck with simple risk-matrix models for the majority of our risk management endeavors. This book provides very good insight into how risk could be approached, and shows that your information is probably better than you think it is.

The danger in Hubbard's writing style is that he promotes his approach above all others, insisting that anything can and should be quantified in your risk management. This may work well for ordered-domain problems (in the sense of Prof. Snowden's Cynefin framework), but unordered or disordered domain problems could be too overwhelming (or impossible) for this approach. Hubbard does not seem to care enough about the limitations of his method. Regardless, this book is an excellent read and still highly recommended to asset managers, especially those responsible for physical assets.
R**T
Business Metrology
It would be tempting to refer to this book as a business statistics book in disguise - it is certainly that - but given its title, it is really a book about "business metrology." While most texts on statistics cover the analysis of data once it has been gathered, this book is fundamentally about measurement: obtaining the data. Given the close coupling between the two, binding both within a single cover makes for a very useful and practical business tool.

A couple of reviews have stated that the book doesn't offer practical ways to measure intangibles. One of the lessons of this book is that, by their very nature, intangibles often have to be measured indirectly by observing other variables and then discovering a correlation. Statistical analysis can handle the latter, but choosing an appropriate set of other variables can be very challenging, often requiring clever outside-the-box thinking.

Fundamentally, this book is about method, process, and first principles - ideas about measurement and information applied in a business context - and not so much about their technological implementation. Still, it is interesting to see how the former fares as the latter progresses: if the concepts remain unscathed or are reinforced, one can conclude that they continue to be valid and useful.

Since the original 2007 publication date, "big data," "analytics," and "data science" have become everyday business terms. In the chapter "Illusion of Intangibles," the author lists four useful measurement assumptions; the second is, "You have more data than you think." Big data in the business context is based on the notion that businesses collect and store mountains of data. So you do have more data than you think - a LOT more. Often, however, much of it is recorded for other purposes and seems on the surface to have little value otherwise. This book suggests we challenge that assumption.
If all this data could somehow be collected and analyzed, there could be ways to extract the latent information hidden inside.

There are many recent examples of this kind of analysis. One is Google Flu Trends' 2009 prediction of the advance of the H1N1 pandemic. It is a wonderful example of what the author refers to as a Fermi problem: (cleverly) using what you do know to measure indirectly what you are looking for. One of the most valuable aspects of this analysis - aside from being essentially free - was that it made predictions in near real time. The notion that it is possible to track the activity of a pandemic by analyzing search terms entered in a web browser is quite remarkable - or it was at the time; it is commonplace now. It also indicates that the concepts discussed in the book are not only valid, relevant, and useful, but possibly even more so now, given the access to data and computation that drive big data, analytics, and data science in business.
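The Fermi-problem style of reasoning the reviewer describes - chaining things you do know to estimate something you don't - can be sketched numerically with the classic "piano tuners in Chicago" estimate that the book itself uses to introduce the idea. Every input below is an assumed round number, not a measured fact:

```python
# A classic Fermi estimate: how many piano tuners work in Chicago?
# Each input is a rough, assumed figure; the point is that chaining
# plausible knowns yields a usable order-of-magnitude estimate.
population = 3_000_000           # people in the city
people_per_household = 2.5
households_with_piano = 1 / 20   # one household in twenty owns a piano
tunings_per_piano_per_year = 1
tunings_per_day = 4              # one tuner services about four pianos a day
workdays_per_year = 250

pianos = population / people_per_household * households_with_piano
tunings_needed = pianos * tunings_per_piano_per_year
tuners = tunings_needed / (tunings_per_day * workdays_per_year)
print(f"estimated piano tuners: {tuners:.0f}")
```

The estimate is almost certainly wrong in detail, but it is far better than "unknowable" - which is exactly the book's point about so-called intangibles.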