Featured Articles

Probabilistic Foundations of Statistical Network Analysis (Book)

Harry Crane (2018).


See the book's homepage here.

Logic of Probability and Conjecture

Harry Crane (2018).


I introduce a formalization of probability which takes the concept of 'evidence' as primitive. In parallel to the intuitionistic conception of truth, in which 'proof' is primitive and an assertion A is judged to be true just in case there is a proof witnessing it, here 'evidence' is primitive and A is judged to be probable just in case there is evidence supporting it. I formalize this outlook by representing propositions as types in Martin-Löf type theory (MLTT) and defining a 'probability type' on top of the existing machinery of MLTT, whose inhabitants represent pieces of evidence in favor of a proposition. One upshot of this approach is the potential for a mathematical formalism which treats 'conjectures' as mathematical objects in their own right. Other intuitive properties of evidence occur as theorems in this formalism.
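As a rough illustration of the general idea, here is a minimal sketch in Lean that postulates an evidence modality as an axiom. The names and rules below are illustrative assumptions of mine, not the paper's actual construction; they only convey the shape of the proposal (propositions as types, with a separate type of evidence).

    universe u

    -- Hypothetical evidence modality: Prob A collects pieces of evidence for A.
    axiom Prob : Type u → Type u

    -- Illustrative introduction rule: a proof of A is (the strongest) evidence for A.
    axiom Prob.ofProof {A : Type u} : A → Prob A

    -- Illustrative rule: evidence transports along implications.
    axiom Prob.map {A B : Type u} : (A → B) → Prob A → Prob B

    -- A "conjecture" is then a type A for which one may hold a term of Prob A
    -- without holding a term of A itself.
    example {A B : Type u} (f : A → B) (e : Prob A) : Prob B := Prob.map f e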

Is Statistics Meeting the Needs of Science?

Harry Crane and Ryan Martin. (2018).


Publication of scientific research all but requires a supporting statistical analysis, anointing statisticians the de facto gatekeepers of modern scientific discovery. While the potential of statistics for providing scientific insights is undeniable, there is a crisis in the scientific community due to poor statistical practice. Unfortunately, widespread calls to action have not been effective, in part because of statisticians' tendency to make statistics appear simple. We argue that statistics can meet the needs of science only by empowering scientists to make sound judgments that account for both the nuances of the application and the inherent complexity of effective statistical practice. In particular, we emphasize a set of statistical principles that scientists can adapt to the ever-expanding scope of their problems.

Why "Redefining Statistical Significance" Will Not Improve Reproducibility and Might Make the Replication Crisis Worse

Harry Crane (2017).


A recent proposal to "redefine statistical significance" (Benjamin et al., Nature Human Behaviour, 2017) claims that false positive rates "would immediately improve" by factors greater than two and that replication rates would double simply by changing the conventional cutoff for 'statistical significance' from P<0.05 to P<0.005. I analyze the veracity of these claims, focusing especially on how Benjamin et al. neglect the effects of P-hacking in assessing the impact of their proposal. My analysis shows that once P-hacking is accounted for, the perceived benefits of the lower threshold all but disappear, prompting two main conclusions: (i) The claimed improvements to false positive rate and replication rate in Benjamin et al. (2017) are exaggerated and misleading. (ii) There are plausible scenarios under which the lower cutoff will make the replication crisis worse.
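As a rough illustration of why assumptions about P-hacking matter here (this is not the paper's analysis, and every parameter value below is an assumption of mine), a short simulation shows how the rate of nominally significant null results at any cutoff depends heavily on how much hacking one builds in, which is exactly the behavior the redefinition calculations leave out.

    import numpy as np

    rng = np.random.default_rng(0)
    n_null = 100_000          # null studies (no real effect), illustrative count
    hack_rate = 0.3           # assumed fraction of null studies that are P-hacked
    k = 10                    # assumed number of analyses a P-hacker tries

    # Honest null studies report one uniform p-value; P-hacked null studies
    # report the smallest of k uniform p-values.
    honest = rng.uniform(size=int(n_null * (1 - hack_rate)))
    hacked = rng.uniform(size=(int(n_null * hack_rate), k)).min(axis=1)
    pvals = np.concatenate([honest, hacked])

    for alpha in (0.05, 0.005):
        print(f"alpha={alpha}: share of null studies reaching significance: "
              f"{np.mean(pvals < alpha):.4f}")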

Commentary by Andrew Gelman.

Response by one of the authors (E.J. Wagenmakers) on the Bayesian Spectacles blog.

More discussion by Tim van der Zee.

Edge exchangeable models for network data

Harry Crane and Walter Dempsey. (2017). Journal of the American Statistical Association, in press.


Exchangeable models for countable vertex-labeled graphs cannot replicate the large sample behaviors of sparsity and power law degree distribution observed in many network datasets. Out of this mathematical impossibility emerges the question of how network data can be modeled in a way that reflects known empirical behaviors and respects basic statistical principles. We address this question by observing that edges, not vertices, act as the statistical units in networks constructed from interaction data, making a theory of edge-labeled networks more natural for many applications. In this context we introduce the concept of edge exchangeability, which, unlike its vertex exchangeable counterpart, admits models for networks with sparse and/or power law structure. Our characterization of edge exchangeable networks gives rise to a class of nonparametric models, akin to graphon models in the vertex exchangeable setting. Within this class, we identify a tractable family of distributions with a clear interpretation and suitable theoretical properties, whose significance in estimation, prediction, and testing we demonstrate.
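To make the "edges as statistical units" idea concrete, here is a small sketch of one way to grow an edge-labeled interaction network by choosing each edge's endpoints with a Chinese-restaurant-style, degree-biased rule. Constructions of this flavor produce an exchangeable sequence of edges and tend to yield sparse, heavy-tailed networks; the parameters and details below are illustrative choices of mine, not necessarily the paper's exact family.

    import random
    from collections import Counter

    def sample_edge_sequence(n_edges, alpha=0.7, theta=1.0, seed=0):
        """Sample (u, v) interaction edges; vertex choice is size-biased by degree."""
        rng = random.Random(seed)
        degree = Counter()          # edge-ends attached to each vertex so far
        total = 0                   # total edge-ends assigned so far
        edges = []
        for _ in range(n_edges):
            endpoints = []
            for _ in range(2):      # pick the two endpoints of the next edge
                k = len(degree)
                if total == 0 or rng.random() < (theta + alpha * k) / (theta + total):
                    v = k           # introduce a previously unseen vertex
                else:
                    existing = list(degree)
                    v = rng.choices(existing,
                                    weights=[degree[u] - alpha for u in existing])[0]
                degree[v] += 1
                total += 1
                endpoints.append(v)
            edges.append(tuple(endpoints))
        return edges

    edges = sample_edge_sequence(2000)
    print(len(edges), "edges on", len({v for e in edges for v in e}), "vertices")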

A framework for statistical network modeling

Harry Crane and Walter Dempsey (2016).


Basic principles of statistical inference are commonly violated in network data analysis. Under the current approach, it is often impossible to identify a model that accommodates known empirical behaviors, possesses crucial inferential properties, and accurately models the data generating process. In the absence of one or more of these properties, sensible inference from network data cannot be assured. Our proposed framework decomposes every network model into a (relatively) exchangeable data generating process and a sampling mechanism that relates observed data to the population network. This framework, which encompasses all models in current use as well as many new models that lie outside the existing paradigm, such as edge exchangeable models, offers a sound context within which to develop theory and methods for network analysis.
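A toy sketch of the two-part decomposition may help fix ideas. Both components below, an Erdős-Rényi generating process for the population network and uniform edge sampling as the observation mechanism, are placeholders of my choosing rather than models from the paper; the point is only that the generating process and the sampling mechanism are specified separately.

    import random

    def generate_population(n_vertices, p=0.01, seed=0):
        """Toy exchangeable generating process: an Erdos-Renyi population network."""
        rng = random.Random(seed)
        return [(i, j) for i in range(n_vertices) for j in range(i + 1, n_vertices)
                if rng.random() < p]

    def observe(population_edges, m, seed=1):
        """Toy sampling mechanism: observe m edges drawn from the population network."""
        rng = random.Random(seed)
        return rng.sample(population_edges, m)

    population = generate_population(500)
    observed = observe(population, m=100)
    print(len(population), "population edges;", len(observed), "observed edges")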

The ubiquitous Ewens sampling formula (with discussion and a rejoinder by the author)

Harry Crane. (2016). Statistical Science, 31(1):1-39.


Ewens's sampling formula exemplifies the harmony of mathematical theory, statistical application, and scientific discovery. The formula not only contributes to the foundations of evolutionary molecular genetics, the neutral theory of biodiversity, Bayesian nonparametrics, combinatorial stochastic processes, and inductive inference but also emerges from fundamental concepts in probability theory, algebra, and number theory. With an emphasis on its far-reaching influence throughout statistics and probability, we highlight these and many other consequences of Ewens's seminal discovery.
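For reference, the formula itself in its standard form (notation follows common treatments rather than any particular display in the paper): in a sample of n genes with mutation parameter \theta > 0, the probability of observing a_j allele types represented exactly j times, for j = 1, \dots, n, is

    P(a_1, \dots, a_n) = \frac{n!}{\theta(\theta+1)\cdots(\theta+n-1)} \prod_{j=1}^{n} \frac{\theta^{a_j}}{j^{a_j}\, a_j!}, \qquad \text{where } \sum_{j=1}^{n} j a_j = n.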