Rethinking Randomness: A New Foundation for Stochastic Modeling (Paperback)

By Jeffrey P. Buzen


Mathematical models based on stochastic processes have proven surprisingly accurate in many situations where their underlying assumptions are unlikely to be correct. Rethinking Randomness introduces an alternative characterization of randomness and a new modeling framework that together explain the improbable success of these probabilistic models. The new approach, known as observational stochastics, is derived from "back of the envelope" methods employed routinely by engineers, experimental scientists and systems-oriented practitioners working in many fields. By formalizing and extending these intuitive techniques, observational stochastics provides an entirely rigorous alternative to traditional mathematical theory that leads to vastly simpler derivations of certain major results and a deeper understanding of their true significance.

Students who encounter probabilistic models in their courses in the physical, social and system sciences should find this book particularly helpful in understanding how the material they are studying in class is actually applied in practice. And because all mathematical arguments are self-contained and relatively straightforward, technically oriented non-specialists who wish to explore the connection between probability theory and the physical world should find most of the material in this book readily accessible.

Most chapters are structured around a series of examples, beginning with the simplest possible cases and then extending the analysis in multiple directions. Powerful generalized results are presented only after simpler cases have been introduced and explained thoroughly. Readers who choose to bypass the mathematically complex sections of this book can still use these simpler examples to obtain a clear understanding of the basic principles involved. The most extensive series of examples appears in Chapter 7, which incorporates a "mini course" on queuing theory and its applications to Computer Science.
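To give a flavor of the "back of the envelope" methods the description refers to, here is a minimal sketch (not taken from the book; the measurement values are hypothetical) of the operational laws associated with Buzen's operational analysis. Each law is an algebraic identity over directly measured quantities, requiring no distributional assumptions:

```python
def utilization(busy_time: float, observation_period: float) -> float:
    """Utilization Law: U = B / T, the fraction of time the server is busy."""
    return busy_time / observation_period

def throughput(completions: int, observation_period: float) -> float:
    """X = C / T, completions per unit of observed time."""
    return completions / observation_period

def mean_service_time(busy_time: float, completions: int) -> float:
    """S = B / C, average service demand per completion."""
    return busy_time / completions

# Hypothetical measurements: a server observed for T = 100 seconds,
# busy for B = 60 of them, completing C = 300 requests.
T, B, C = 100.0, 60.0, 300
U = utilization(B, T)          # 0.6
X = throughput(C, T)           # 3.0 requests per second
S = mean_service_time(B, C)    # 0.2 seconds per request

# The Utilization Law U = X * S holds exactly for the observed data --
# an identity over measurements, not a stochastic theorem.
assert abs(U - X * S) < 1e-12
```

Because these relationships hold for any finite observation interval, they can be verified directly against measurement data, which is the spirit of the observational approach described above.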
The author's firsthand accounts of early developments in this area lend Rethinking Randomness a unique flavor. Chapter 8 examines the implications of observational stochastics for the debate between Bayesians and frequentists regarding the true meaning of "probability". Once again, the discussion is centered on a series of simple and highly approachable examples, leading ultimately to an interpretation of probability that is aligned most closely with the view of the great French mathematician Henri Poincaré (1854-1912). This proportionalist interpretation of chance then provides the foundation for the intuitive discussions of the Law of Large Numbers and the Ergodic Theorem that appear in Chapter 9.

Advanced students and researchers will recognize that observational stochastics has the potential to be extended in many directions that are largely unexplored. These include the use of shaped simulation to improve the speed and accuracy of Monte Carlo simulations, the development of new error bounds for cases where assumptions of empirical independence are not satisfied exactly, and the investigation of the mathematical properties of special formal structures known as t-loops. Extensions required to deal with transient and trans-distributional aspects of observable behavior may also be feasible, but represent a substantially more difficult undertaking for researchers who wish to take up the challenge.
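The Law of Large Numbers mentioned above can be illustrated empirically with a few lines of simulation (an illustrative sketch, not an example from the book): the observed proportion of heads in repeated fair coin flips settles near 1/2 as the number of trials grows, which is the intuition behind a proportionalist reading of probability.

```python
import random

random.seed(42)  # fixed seed so the run is reproducible

n = 100_000
# Simulate n fair coin flips and count the heads.
heads = sum(random.random() < 0.5 for _ in range(n))
proportion = heads / n

# For large n the observed proportion is very close to 0.5; the standard
# deviation of the proportion here is sqrt(0.25 / n) ~ 0.0016.
assert abs(proportion - 0.5) < 0.01
```

Printing the running proportion at intermediate values of n (say 10, 1,000, 100,000) makes the convergence visible: the fluctuations shrink roughly like 1/sqrt(n).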

About the Author

ScB in Applied Mathematics, Brown University, 1965; MS and PhD in Applied Mathematics, Harvard University, 1966 and 1971. Credited with major contributions to stochastic modeling including the convolution algorithm (Buzen's algorithm), the central server queuing model, and operational analysis. Gordon McKay Lecturer on Computer Science at Harvard, where his students included Microsoft's Bill Gates and Ethernet inventor Bob Metcalfe, 1971-76. Co-founder and Chief Scientist of BGS Systems, a software company whose modeling tools were used for decades to manage computer performance at many of the largest financial institutions, industrial enterprises and government agencies on six continents, 1975-98. Elected to the National Academy of Engineering "for contributions to the theory and commercial application of computer system performance models," 2003. Author of more than 100 refereed technical papers including "Fundamental Laws of Computer System Performance," which received the inaugural ACM SIGMETRICS "Test of Time" award in 2010, reflecting 34 years of enduring influence.
Product Details
ISBN: 9781508435983
ISBN-10: 1508435987
Publisher: Createspace Independent Publishing Platform
Publication Date: August 21st, 2015
Pages: 332
Language: English