The Dutch Association for Philosophy of Science (NVWF), the Freudenthal Institute and Descartes Centre of Utrecht University, and the Faculty of Philosophy of the University of Groningen jointly host a meeting at the intersection of philosophy of science, AI, and machine learning.
Machine learning and “big data” methods are rapidly transforming a broad range of sciences. Cell biologists employ automated causal search to analyse the workings of cells, astronomers employ big data techniques to label stars in distant galaxies, climatologists estimate their models with ever larger data sets, and linguists make use of deep neural nets to improve automatic translators. Especially for sciences that target complex systems, data science methods hold enormous promise. At the same time, concerns over the introduction of data science and machine learning into the sciences are growing. What kind of scientific knowledge do the new methods deliver, and how do these methods transform the subject matter of the sciences? The philosophy of science can bring decades of debate on scientific evidence, methods, models, and theory to bear on such issues. The symposium showcases some of the best work from this new and exciting field of research.
The program of the afternoon is as follows:
13:30 Room open
14:00 Tom Sterkenburg "Epistemology and theory of machine learning"
14:30 Emily Sullivan "Machine learning in science: Just a toy?"
15:00 Break with tea, coffee and cookies
15:30 Mieke Boon "Where to locate explainability of explainable machine learning?"
16:00 Henk de Regt "The prospects of artificial scientific understanding"
16:30 Plenary discussion
16:45 Award ceremony Pieter van Foreest Science Prize (Medical Humanities)
Speakers and abstracts
All speakers will talk for about 25 minutes, followed by a brief exchange with the audience. At the end of the afternoon all speakers will gather on the stage for a plenary discussion.
Tom Sterkenburg (LMU Munich)
Epistemology and theory of machine learning
The mathematical theory of machine learning offers generalization guarantees that support the epistemological claim that our learning algorithms are, in fact, good algorithms. This is a modern version of the traditional project in the philosophy of science to provide a formal justification for scientific or inductive inferences. But in both cases, there is a question of how any such justification could go together with existing skeptical impossibility results and arguments. In this talk, I address this question by spelling out the kind of justification that learning theory offers: general and analytic, yet model-relative. I will argue that this kind of justification fits in a broader epistemological perspective on inquiry. Finally, I will briefly address the recent debate about the apparent failure of classical machine learning theory to explain the generalization of modern algorithms like deep neural networks, and the epistemological contours of a new theory.
Tom Sterkenburg runs an Emmy Noether junior research group in the epistemology of machine learning at the Munich Center for Mathematical Philosophy, LMU Munich.
Emily Sullivan (Utrecht University)
Machine learning in science: Just a toy?
More and more sciences are turning to machine learning (ML) technologies to solve long-standing problems or make new discoveries—ranging from medical science to fundamental physics. The ever-growing footprint of ML modeling on the production of scientific knowledge and understanding brings opportunities but also pressing challenges. In this talk, I discuss how philosophy of science and epistemology can help us understand the potential and limits of ML used for science. Specifically, I will argue that ML models in science function in a way similar to highly idealized toy models. Thinking of ML models as toy models can help to shed light on the scope of ML’s potential for scientific understanding.
Emily Sullivan is an assistant professor at Utrecht University and is currently the PI on an NWO Veni project on the explainability of ML models.
Mieke Boon (University of Twente)
Where to locate the explainability of explainable machine learning?
In this talk, we will question whether explanations obtained through XML enable understanding of the reasons why ML models make predictions. Our motivation is based on an analogy with Hempel's covering-law account, suggesting that, just like accidental generalizations, XML explanations fail to be genuine explanations of real-world phenomena—for example, because they may be based on spurious correlations in the training data sets on which ML models are constructed. We will rely on the mechanistic view of explanation (Craver and Tabery, 2016), according to which the mechanism is the explanation of the law, and the mechanism thus makes the law intelligible. On this view, the distinction between accidental regularities and real laws can be accounted for, because a real law is grounded in a mechanism.
Mieke Boon is professor of philosophy of science at the Department of Philosophy of the University of Twente.
Henk de Regt (Radboud University Nijmegen)
The prospects of artificial scientific understanding
In many areas of present-day science AI plays an increasingly important role. While it is clear that AI can assist scientists in their quest for understanding the world, a more controversial question is whether it can also generate scientific understanding independently of human scientists. The answer to this question obviously depends on how we define (scientific) understanding. In my talk I will outline my contextual theory of scientific understanding and explore the prospects of ‘artificial scientific understanding’ from the perspective of this theory.
Henk W. de Regt is professor of philosophy of the natural sciences at the Institute for Science in Society, Radboud University.
- Location: Aula - Academiegebouw/University Hall, Domplein 29 - Utrecht
- Entrance fee: Free, but registration required
Please register for the event by sending an email to the Descartes Centre at A.denDaas@uu.nl. The event is free, but a donation box and a QR code will be available on site for your much-appreciated donations in support of the society's public events.