Dr. G. (Guido) Bacciagaluppi

Associate Professor
History and Philosophy of Science
+31 30 253 5621

My main field of research is the philosophy of physics, in particular the philosophy of quantum theory. There I have worked on a variety of approaches, including modal interpretations (the subject of my PhD), stochastic mechanics, Everett theory, de Broglie-Bohm pilot-wave theory, and spontaneous collapse theories, with a special interest in the theory of decoherence. Other special interests include time (a)symmetry, the philosophy of probability, issues in the philosophy of logic, and the topics of emergence, causation, and empiricism. I also work on the history of quantum theory and have co-authored four books on the topic (two of them currently in press), including a widely admired monograph on the 1927 Solvay conference. I am currently engaged in research across a range of these topics and plan to focus on them in the coming years.

For full bibliographic references (where available), see my CV with List of Publications.


Recent projects:

Core topics in the foundations of quantum mechanics:

  • ‘Bell Inequality Violation and Relativity of Pre- and Post-selection’ (with Ronnie Hermens):

This paper centres on the fact, known since 1999, that one can violate the Bell inequalities using uncorrelated particles by performing an entangling measurement and then postselecting the results. We show that this violation can be simulated classically, and we propose a variant of existing experiments designed to rule out such alternative explanations. We hope to interest experimentalists in performing it (my very first foray into experimental work!).
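The general point that postselection can mimic a Bell violation classically can be illustrated with a minimal toy model (a generic textbook-style construction for illustration only, not the model analysed in the paper): each run carries a shared classical guess of the setting pair, each party locally accepts a run only when its own setting matches the guess, and on the accepted runs the CHSH expression reaches its algebraic maximum of 4, far above the classical bound of 2.

```python
from itertools import product

# Toy local-hidden-variable model that fakes a CHSH violation via postselection.
# lam = (xg, yg) is a shared guess of the setting pair, uniform over 4 values.
# Each party accepts a run only when its own setting matches the guess, so the
# selected ensemble's hidden variables depend on the settings.

def outputs(lam):
    xg, yg = lam
    a = 1                                      # Alice always outputs +1
    b = -1 if (xg == 1 and yg == 1) else 1     # Bob matches the CHSH target
    return a, b

def accepted(x, y, lam):
    xg, yg = lam
    return x == xg and y == yg                 # two purely local decisions

def chsh():
    S = 0.0
    for x, y in product([0, 1], repeat=2):
        num = den = 0.0
        for lam in product([0, 1], repeat=2):  # lam uniform over 4 values
            if accepted(x, y, lam):
                a, b = outputs(lam)
                num += a * b
                den += 1
        E = num / den                          # correlator given acceptance
        S += -E if (x == 1 and y == 1) else E  # CHSH: E00 + E01 + E10 - E11
    return S

print(chsh())   # 4.0 on the postselected runs, despite a fully local model
```

The trick, of course, is that the acceptance events are correlated with the settings, which is exactly the kind of alternative explanation the proposed experimental variant is meant to exclude.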

  • ‘The Role of Decoherence in Quantum Mechanics’:

I have also recently completed a major revision of my Stanford Encyclopedia of Philosophy entry on the role of decoherence in quantum theory. Decoherence is the spontaneous suppression of the interference effects characteristic of quantum mechanics, in particular through appropriate interaction of a system with its environment; it plays a prominent role in how most interpretations of quantum mechanics aim to recover the classical world around us.
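The mechanism can be made quantitative with a standard toy model (an illustrative sketch, not drawn from the entry): a qubit in an equal superposition couples to n environment qubits, each of which is rotated by a small angle only if the system is in |1⟩; the system's interference terms are then multiplied by the overlap of the two conditional environment states, which decays exponentially in n.

```python
import numpy as np

# Toy decoherence model: a qubit prepared in (|0> + |1>)/sqrt(2) interacts
# with n environment qubits. Each environment qubit starts in |0> and is
# rotated by THETA only when the system is in |1>, so each contributes an
# overlap factor cos(THETA) to <E_0|E_1>. The system's off-diagonal
# density-matrix element is suppressed by cos(THETA)**n.

THETA = 0.3  # per-qubit coupling angle (an arbitrary illustrative value)

def coherence(n):
    """Suppression factor of the system's off-diagonal term after n qubits."""
    return np.cos(THETA) ** n

for n in (0, 5, 20, 80):
    print(f"n = {n:3d}   |rho_01| factor = {coherence(n):.6f}")
```

Even with a weak per-qubit coupling, a modest number of environmental degrees of freedom drives the interference terms towards zero, which is why decoherence is so effective for macroscopic systems.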


Philosophy of probability:

  • ‘Unscrambling Subjective and Epistemic Probabilities’:

I believe this paper to be one of my more significant pieces of work; it collects the results of several years of thinking about probability, which until now had issued in only a few publications. The central topic is the distinction between the notion of subjective probability (and its opposite, objective probability or chance) and that of epistemic probability in the sense of ignorance-interpretable probability, i.e. a probability measure over unknown facts (and its opposite, which I call ontic probability, following the terminology usual in debates about the epistemic or ontic status of quantum states). I argue that these two pairs of notions are logically independent, and I propose novel ways of drawing both distinctions precisely, including a wide-ranging generalisation of Lewis’s Principal Principle with a distinctly perspectival flavour (allowing for backwards chances, high-level chances, etc.). I conclude the paper by applying some of the resulting insights to the interpretation of the quantum state in relativistic collapse theories, and by distancing myself from both objectivist and neo-Humean positions to advocate instead a de Finettian, and more generally Humean, view.


History of the foundations of quantum mechanics:

  • The Einstein Paradox: The debate on nonlocality and incompleteness in 1935 (with Elise Crull):

My largest project of recent years is this book for CUP. Like my previous book with Antony Valentini on the 1927 Solvay conference (begun in 1997, published in 2009), it has been a long-term endeavour, begun in 2008. We have now delivered the 450-page manuscript and have just finished sorting out copyright issues before production. It consists of one-third original research and two-thirds critical edition and translation of primary sources (many hitherto unpublished). The book explores in detail the background to and the debate surrounding the 1935 paper by Einstein, Podolsky and Rosen (EPR). This is one of the defining papers in the foundations of quantum mechanics, specifically of the debates on (non)locality and (in)completeness, and indirectly the basis for the current explosion of interest in quantum information and quantum technologies.

The volume is the most comprehensive collection of sources on EPR, and includes first-time translations or publications of many original materials, in particular some 30 letters by Einstein, Schrödinger, Bohr, Heisenberg, Pauli, Born and others. The original research comprises revisiting the ideas and roles of both Einstein and Bohr, analysing, largely for the first time, those of Schrödinger and Heisenberg, and providing a briefer discussion of the minor responses to the EPR paper. Among other things, we give the most detailed analysis to date of the development of Schrödinger’s ideas in the period 1927-1935, and we are the only authors to analyse (and translate into English) Heisenberg’s own draft response to EPR. We further provide new insights into Einstein’s views on completeness, and uncover a prehistory, stretching back to 1926, of his famous photon-box thought experiment of 1930. We also analyse the notoriously difficult reply to EPR by Bohr, laying out what we believe are its strengths and limitations. This work of interpreting the ‘founding fathers’ falls squarely within the tradition of the philosophy of quantum mechanics.

  • The Oxford Handbook of the History of Quantum Interpretations:

I have been part of the editorial team (led by Olival Freire) for this volume, which comprises some 40 chapters on a vast array of historical topics in the interpretation and foundations of quantum mechanics. I also contributed a chapter myself, extending to von Neumann my previous work on the beginnings of the statistical interpretation of quantum mechanics as understood by Born and by Heisenberg, including an assessment of von Neumann’s notorious ‘no-hidden-variables’ theorem.

  • Grete Hermann:

Before joining Helmut Pulte and Thomas Reydon as an editor of JGPS, I contributed a Special Section on Grete Hermann, including a detailed essay review of Kay Herrmann's (German) edition of her works and correspondence on the philosophy of science, as well as translations of three of her shorter papers in this area.



Current projects:

Core topics in the philosophy of quantum mechanics:

  • ‘Colbeck-Renner Theorem without Setting Independence’ (with R. Hermens and G. Leegwater):

This paper, currently in progress, strengthens the 2010 theorem by Colbeck and Renner showing that any ‘non-trivial’ hidden-variables theory must violate ‘parameter independence’ (it would allow instantaneous signalling if one knew the hidden variables). Colbeck and Renner’s theorem, just like Bell’s, relies on the additional assumption that the statistical distribution of the hidden variables is independent of the (later) choice of measurement settings (‘setting independence’). Our version of the theorem drops this additional assumption, and is the only non-locality theorem to do so, thus applying also to Gerard ’t Hooft’s superdeterminism and Huw Price’s retrocausation. However, it should not be seen as a ‘no-go’ theorem, but as clarifying the status of these proposals.

  • ‘Classicality, Decoherence and Time-Symmetry’:

This paper concerns the fact that models of decoherence always invoke special initial conditions leading to ever-increasing entanglement between a system and its environment, with the formation of ‘permanent records’ of certain thereby privileged quantities of the system. The mechanism and effects of decoherence thus appear to introduce a new arrow of time into quantum mechanics (corresponding to the arrow of ‘branching’ in many-worlds theory). In previous work (‘Probability, Arrow of Time and Decoherence’) I have already questioned whether such a radical assumption is needed. In this paper I develop a toy model of time-symmetric decoherence, in which a system becomes more and more entangled with some degrees of freedom in the environment while at the same time becoming more and more disentangled from other degrees of freedom (which are thus ‘antirecords’ or ‘prophecies’ of the future). I thus suggest that ‘branching’ may be a perspectival effect: if one coarse-grains over the (anti)records of the future, one recovers the familiar branching structure of decoherence, but if one coarse-grains over the records of the past, one obtains an analogous antibranching structure.

  • ‘Multi-time Correlations in Stochastic Mechanics’ (with M. Derakhshani):

I am also working with my former PhD student Maaneli Derakhshani on a paper arguing that taking proper account of decoherence dissolves a longstanding problem (of ‘multi-time correlations’) in another approach to the foundations of quantum mechanics, Edward Nelson’s ‘stochastic mechanics’.


Issues in quantum logic and quantum probability:

  • ‘A Proof of Specker’s Principle’:

Ever since Lucien Hardy’s ‘Quantum Theory from Five Reasonable Axioms’ of 2001, there has been a revival of axiomatic approaches to quantum theory seeking to derive it from ‘transparent physical principles’. One powerful principle used in a number of current proposals was formulated long ago by my teacher Ernst Specker: any set of pairwise testable propositions should also be jointly testable. Despite its simplicity, Specker’s principle is not quite ‘physically transparent’, and in this paper, also in preparation, I propose to derive it from three assumptions: ‘maximal entanglement’ (the existence of states of pairs of systems such that measuring any quantity on one system is equivalent to measuring the same quantity on the other), ‘no-signalling’ (the impossibility of using measurements to signal across entangled pairs), and ‘existence of combinable measurements’ (for any two compatible physical quantities, there are separate measurements of the two that can be freely combined in either order to yield a joint measurement). The first two conditions are clearly physically transparent, and the third can be seen as a criterion for judging whether a putative measurable quantity is indeed a genuine physical quantity. The paper also includes a new (‘entangled’) version of Specker’s fable of the seer from his 1960 paper in Dialectica.
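Specker’s fable gives a vivid sense of why pairwise testability does not imply joint testability: the seer predicts of three boxes that whenever any two are opened, exactly one contains a gem. Each pairwise prediction is separately testable, yet no classical assignment of gems to boxes satisfies all three at once, as a short enumeration (a minimal sketch of the underlying parity argument) makes explicit:

```python
from itertools import product

# Specker's three-box parable: open any two of three boxes and exactly one
# should contain a gem. Check whether any classical gem assignment
# (gem = 1, empty = 0) satisfies all three pairwise predictions at once.

def satisfies_all_pairs(assignment):
    pairs = [(0, 1), (0, 2), (1, 2)]
    # "exactly one of the two boxes is full" means the values sum to 1
    return all(assignment[i] + assignment[j] == 1 for i, j in pairs)

solutions = [a for a in product([0, 1], repeat=3) if satisfies_all_pairs(a)]
print(solutions)   # [] - no classical assignment works
```

The impossibility is a parity fact: summing the three constraints gives 2(a + b + c) = 3, which no integers can satisfy. Jointly untestable predictions can thus each be pairwise testable, which is exactly the gap Specker’s principle rules out.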

  • ‘Quantum Logic in Disguise?’:

Another paper in preparation elaborates on a point made in my ‘Is Logic Empirical?’. It discusses one of the notorious claims made by Putnam in 1968: that one can think of the classical logical connectives as being in fact quantum-logical connectives ‘in disguise’. I argue that such a recovery of classical logic from quantum logic proceeds analogously to the recovery of classical logic from intuitionistic logic from the point of view of a hard-core intuitionist, or of classical logic from paraconsistent logic from the point of view of a dialetheist. Namely, certain additional inferences are licensed not by the form of the arguments but by the meaning of the propositions involved: about finite domains in the case of intuitionism, about domains in which there happen to be no true contradictions in the case of dialetheism, and about properties that are compatible with any others (so-called ‘classical propositions’) in quantum logic.


History of the philosophy of quantum mechanics:

  • Grete Hermann:

I am writing a paper on Hermann's natural philosophy (working title: ‘Kantianism with a Human Face’) and one on her version of the Copenhagen interpretation of quantum mechanics (working title: ‘Better than Bohr’). I argue that Hermann belongs in the canon of Western philosophy: her contributions to the philosophy of quantum mechanics are the best and most thorough exposition of complementarity ever written, and she developed a general neo-Kantian approach compatible with the whole of the physics of her time, besides making important contributions to ethics, mathematics and the more general philosophy of science.


Future projects:

  • Prospects for relativistic collapse theories:

Spontaneous collapse theories are modifications of quantum mechanics that introduce a fundamental (as opposed to ‘measurement-induced’) mechanism of collapse of the quantum mechanical wave function. The first arguably successful proposal was by Ghirardi, Rimini and Weber (GRW) in the mid-1980s for standard non-relativistic quantum mechanics. In the non-relativistic case, however, the collapse happens instantaneously everywhere, so that there is no straightforward relativistic generalisation. Various proposals have been put forward, but they are all problematic in some ways, and there is currently no consensus on the way forward. I already have seed funding from the Foundational Questions Institute for what is intended as a collaboration with W. Myrvold (Western Ontario) and O. Maroney (Oxford) on these questions. We have all been significantly involved with these issues. My own contribution (‘Collapse Theories as Beable Theories’) was the independent introduction (in fact following a suggestion by J. S. Bell) of the so-called ‘flash’ ontology (in the terminology of Tumulka), in which the quantum state is interpreted not as some kind of physical field, but as an expression of the law-like probabilities for spatiotemporally localised events, namely the collapses (‘jumps’, ‘flashes’). However, a 2017 theorem by Myrvold shows that relativistic collapse theories must suffer from infinite energy production. As a way out, I suspect one may find inspiration in Heisenberg’s original statistical interpretation and take transition probabilities but not quantum states as fundamental. This would be a way of making collapse theories relativistically invariant analogous to Maroney’s suggestion (with D. Bedingham) for how to make collapse theories time-symmetric, as well as in line with ideas about probability as developed in my ‘Unscrambling Subjective and Epistemic Probabilities’.
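The GRW mechanism’s empirical viability rests on a simple piece of arithmetic (a back-of-the-envelope sketch using the order-of-magnitude parameter values of the original mid-1980s proposal): the per-particle collapse rate is tiny, so isolated microscopic systems behave quantum mechanically, but the rate is amplified by the particle number of a macroscopic body.

```python
# Back-of-the-envelope GRW arithmetic. The collapse rate per particle is of
# order 1e-16 per second (the value suggested in the original proposal), so
# a lone particle almost never collapses, while a macroscopic superposition
# of ~1e23 particles collapses within a fraction of a millisecond.

LAMBDA = 1e-16          # spontaneous localisation rate per particle, 1/s
YEAR = 3.15e7           # seconds in a year

single_particle = LAMBDA * YEAR         # expected collapses: one particle, one year
macroscopic = LAMBDA * 1e23 * 1e-3      # expected collapses: 1e23 particles, 1 ms

print(single_particle)   # ~3e-9: microscopic superpositions survive undisturbed
print(macroscopic)       # ~1e4: macroscopic superpositions are destroyed at once
```

It is this instantaneous, everywhere-at-once character of each collapse that makes the relativistic generalisation so delicate.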

  • Realism and empiricism:

I am developing my own variant of Bas van Fraassen’s constructive empiricism (I call it ‘adaptive empiricism’). Expressed in a slogan, the idea is that science should not just save the phenomena, but also determine what should count as a phenomenon, and thus be worth saving, in the first place. While I grant much of van Fraassen’s analysis and maintain a distinction between what we care about getting right and what we can safely be agnostic about, I suggest that this distinction should not be drawn once and for all, and that we should adapt what we consider ‘observable’ to changes in both theory and especially experiment. This idea can be developed further, perhaps using van Fraassen’s recent notion of ‘empirical grounding’, but in particular by borrowing some of the tools of structural realism (notably the idea of ‘real patterns’). The resulting picture may come to resemble some forms of structural realism (in particular Ladyman’s ‘rainforest realism’), but that would only mean that sensible forms of realism and sensible forms of empiricism meet in the middle, a pleasing conclusion for a philosopher of irenic temper like myself. I have already published one paper on this topic (‘Adaptive Empiricism’), and I am developing the case study of quantum mechanics (‘What is a Quantum Phenomenon?’).

  • Niels Bohr for the 21st Century:

Niels Bohr has been hugely influential both within and without physics but also hugely misunderstood, partly because a few of his writings have been overstudied at the expense of many others. I would like to establish an unprejudiced understanding of Bohr, building on the current revival of interest in Bohr’s ideas in the philosophy of physics, and extending it to his wider influence on the practice of physics, in other scientific disciplines, and on the perceived role of science.