"Foundations of Physics" will take place from 29 to 31 July 2013. There will be an evening lecture by Julian Barbour together with a drinks reception in the "Ehrensaal" (main building) of Deutsches Museum (at Museumsinsel) on the first day of the conference.
The conference itself takes place in rooms on Theresienstraße (where the conference begins) and Richard-Wagner-Straße (for the parallel sessions and registration in room 109); all addresses can be found in "Practical Info". Please note that registration opens after the morning session on 29 July and remains open on all conference days.
Please find the final version of the program below. All invited plenary talks are assigned 45+30 minutes (talk/Q&A), for contributed talks please plan on 30+10 minutes (talk/Q&A).
All conference participants are listed here.
Download the program as a PDF file here.
Our lecture halls at 10 Richard-Wagner-Str. have famous namesakes to help you find your way about:
|Room 102: "Sommerfeld"||Room 104: "Pauli"||Room 106: "Wien"||Room 110: "Heisenberg"|
|Monday, 29 July|
|9:00||Stephan Hartmann: Welcome|
|9:05||Bernd Huber: Welcome of the LMU President|
|Invited Session 1 – Chair: Stephan Hartmann (Room B139, Theresienstr.)|
|9:15||John D. Norton: The Neglect of Fluctuations in the Thermodynamics of Computation|
|Invited Session 2 – Chair: Mathias Frisch (Room B139, Theresienstr.)|
|10:45||Jean Bricmont: From the Microscopic to the Macroscopic World|
|12:00||Lunch Break: Change of Venue and Registration (Room 109, Richard-Wagner-Str.)|
|4 parallel tracks|
|Colloquium 1: Journeys in Platonia: Celebrating 50 Years Since The End of Time (Room "Sommerfeld", Richard-Wagner-Str.)||Contributed Session 1 – Chair: Miklos Redei (Room "Heisenberg", Richard-Wagner-Str.)|
|13:30||Harvey Brown: Leibniz, Mach and Barbour||Michael Stoeltzner: On some Virtues and Vices of the Algebraic Approach|
|14:10||Edward Anderson: Kendall’s Shape Statistics as a Classical Realization of Barbour-type Timeless Records Theory Approach to Quantum Gravity||Jeremy Butterfield: Renormalization and Nagel's Account of Inter-Theoretic Relations|
|14:50||Sean Gryb: Barbour’s Shape Space as an Ontology for Gravity||Jonathan Bain: Pragmatists and Purists on CPT Invariance in Relativistic Quantum Field Theory|
|Contributed Session 2 – Chair: Iulian Toader (Room "Pauli", Richard-Wagner-Str.)||Contributed Session 3 – Chair: Gábor Hofer-Szabó (Room "Wien", Richard-Wagner-Str.)|
|13:30||Pui Him Ip: The Mystery Behind Schrödinger's First Communication: A Non-Historical Study on the Variational Approach and its Implications||Christian de Ronde: Interpreting the Modal Kochen Specker Theorem: On Quantum Possibility and Potentiality|
|14:10||Molly Kao: Unification in the Old Quantum Theory||Benjamin Feintzeig: Hidden Variables and Commutativity in Quantum Mechanics|
|14:50||Alexander Blum and Christoph Lehner: A Silver Lining Among Infinities: How the Spin-Statistics Theorem Restored Faith in Quantum Field Theory||Paul Näger: The Causal Structure of EPR Experiments|
|4 parallel tracks|
|Contributed Session 4 – Chair: Alexander Blum (Room "Sommerfeld", Richard-Wagner-Str.)||Contributed Session 5 – Chair: Jeremy Butterfield (Room "Heisenberg", Richard-Wagner-Str.)|
|16:00||John Manchak and James Weatherall: The Geometry of Conventionality||Miklos Redei: Operational Independence Concepts in Algebraic Quantum Theory|
|16:40||Erik Curiel: Are Black Holes Hot or Cold?||Gábor Hofer-Szabó: On the Localization Problem of the Common Cause|
|17:20||Vincent Lam: The Status of Mass and Energy in General Relativity||Paniz Imani: The Locality Axiom in Quantum Field Theory and Tensor Products of C*-algebras|
|Contributed Session 6 – Chair: Adam Caulton (Room "Pauli", Richard-Wagner-Str.)||Contributed Session 7 – Chair: Charlotte Werndl (Room "Wien", Richard-Wagner-Str.)|
|16:00||Iulian Toader: Trans-World Structuralism and Naturalistic Metaphysics||Paul Boes: Is Howard's Separability Principle a Sufficient Condition for Outcome Independence?|
|16:40||Thomas Müller: Is a Gas Equal to a Collection of Molecules? On the Modal Logic of Reduction||Mile Gu, Karoline Wiesner, Elisabeth Rieper and Vlatko Vedral: Sharpening Occam's Razor with Quantum Mechanics|
|17:20||Paul Teller: The Failure of Traditional Measurement-Accuracy Realism and the Repercussions for Understanding Vagueness||Rathindra Nath Sen: Galilei Invariance and the Welcher Weg Problem|
|18:00||Change of Venue|
|19:00||Dinner Reception at Deutsches Museum (only for registered conference participants)|
|Evening Lecture – Moderation: Stephan Hartmann (LMU Munich)|
|20:00||Introduction: Johannes-Geert Hagmann (Deutsches Museum)|
|Evening Lecture by Julian Barbour: The Mystery of Time and Size in Einstein's Theory of Gravity|
Watch the video abstract on "MCMP at First Sight" here, and download a printable version of the lecture's summary here.
|Tuesday, 30 July|
|4 parallel tracks|
|Contributed Session 8 – Chair: James Weatherall (Room "Sommerfeld", Richard-Wagner-Str.)||Contributed Session 9 – Chair: Vincent Lam (Room "Heisenberg", Richard-Wagner-Str.)|
|9:00||Adam Caulton: Being Serious about Permutation Invariance in Quantum Mechanics||Alyssa Ney: Configuration Space in a Relativistic Setting|
|9:40||F.A. Muller: Witness-Discernibility of Elementary Particles||Juliusz Doboszewski: Branching From Cobordism|
|10:20||Noel Swanson, David John Baker and Hans Halvorson: The Conventionality of Parastatistics||Hanoch Ben-Yami: Causal Order, Temporal Order, and Becoming in Special Relativity|
|Contributed Session 10 – Chair: Thomas Müller (Room "Pauli", Richard-Wagner-Str.)||Contributed Session 11 – Chair: Paul Teller (Room "Wien", Richard-Wagner-Str.)|
|9:00||Charlotte Werndl: Counting States in Statistical Mechanics and Dynamical Systems Theory||Michael Cuffaro: On the Significance and Implications of the Gottesman-Knill Theorem|
|9:40||Marij van Strien: The Origin and Foundations of Laplacian Determinism||Johannes Kofler: Bell Violation with Entangled Photons and without the Fair-Sampling Assumption|
|10:20||Sylvia Wenmackers and Danny Vanpoucke: A Deterministic Model for Norton's Dome||Meinard Kuhlmann: Decoherence: How Much Does it Help?|
|4 parallel tracks|
|Contributed Session 12 – Chair: F.A. Muller (Room "Sommerfeld", Richard-Wagner-Str.)||Contributed Session 13 – Chair: Christoph Lehner (Room "Heisenberg", Richard-Wagner-Str.)|
|11:15||Lev Vaidman: Local Explanation of the Aharonov-Bohm Effect||Ken Wharton: Lagrangian-Only Quantum Theory|
|11:55||Neil Dewar: The Aharonov-Bohm Effect and Non-Locality||Michael Silberstein, William Stuckey and Timothy McDevitt: A Path Integral Over Graphs Approach to Unification and its Foundational Implications|
|Contributed Session 14 – Chair: Michael Cuffaro (Room "Pauli", Richard-Wagner-Str.)||Contributed Session 15 – Chair: Meinard Kuhlmann (Room "Wien", Richard-Wagner-Str.)|
|11:15||Peter Janotta and Raymond Lal: Generalized Probabilistic Theories Without the No-Restriction Hypothesis||Attila Molnár and Gergely Székely: Operationality in the Axiomatic Foundations of Relativity Theory|
|11:55||Bruno Hartmann: Physical Determination of the Action||David Rey: Similarity Assessments, Spacetime, and the Gravitational Field: What Does the Metric Tensor Represent in General Relativity?|
|4 parallel tracks|
|Contributed Session 16 – Chair: Lev Vaidman (Room "Sommerfeld", Richard-Wagner-Str.)||Contributed Session 17 – Chair: Ken Wharton (Room "Heisenberg", Richard-Wagner-Str.)|
|13:45||Richard Dawid and Karim Thébault: On the Empirical Testing of the Subjective Everett Interpretation||Daniele Oriti: On the Emergence of Spacetime in Quantum Gravity|
|14:25||Jeffrey A. Barrett: Pure Wave Mechanics and the Very Idea of Empirical Adequacy||David Schroeren: Decoherent Histories of Spin Networks|
|15:05||Richard Healey: Decoherence in a Pragmatist View: Resolving the Quantum Measurement Problem||Yann Benetreau-Dupin: Cosmic Surprise, Anthropic Reasoning, and Bayesian Analysis|
|Contributed Session 18 – Chair: Gergely Székely (Room "Pauli", Richard-Wagner-Str.)||Contributed Session 19 – Chair: Sam Sanders (Room "Wien", Richard-Wagner-Str.)|
|13:45||Giovanni Cinà: Connecting Abstractions from Hilbert Spaces: From Categories to Frames||Syman Stevens: The Dynamical Approach as Regularity Relationalism|
|14:25||Jarosław Pykacz: Many-Valued Logic as a Basis for the New Interpretation of Quantum Mechanics||Erdmann Görg: About the Change in Newton's Justification of an Absolute Space|
|15:05||Marco Giovanelli: A Dialog of the Deaf, Einstein and the Logical Empiricists on Rods and Clocks in General Relativity|
|15:45||Change of Venue|
|Invited Session 3 – Chair: Roland Poellinger (Room B139, Theresienstr.)|
|16:15||Fay Dowker: Things Happen, They Just Happen in a Partial Order|
|Invited Session 4 – Chair: Christian Joas (Room B139, Theresienstr.)|
|17:45||Markus Aspelmeyer: Schrödinger's Mirrors: Towards Table-Top Experiments at the Interface between Quantum Physics and Gravity|
|Wednesday, 31 July|
|4 parallel tracks|
|Colloquium 2: On the Split Between Gravity and Inertia in Different Spacetime Theories – part 1 (Room "Sommerfeld", Richard-Wagner-Str.)||Contributed Session 20 – Chair: Stephan Hartmann (Room "Heisenberg", Richard-Wagner-Str.)|
|9:00||Dennis Lehmkuhl and Oliver Pooley: Against the Gravity–Inertia Split||Sam Sanders: The Universe is a Green Turing Machine, when seen Through Green Glasses|
|9:40||Eleanor Knox: The Gravity-Inertia Split in Newtonian and Relativistic Contexts||Tim Räz and Tilman Sauer: Applying an Account of the Applicability of Mathematics to the World|
|10:20||Harvey Brown: The Role of Geometric Explanation in General Relativity|
|Contributed Session 21 – Chair: Richard Dawid (Room "Pauli", Richard-Wagner-Str.)||Contributed Session 22 – Chair: Jeffrey A. Barrett (Room "Wien", Richard-Wagner-Str.)|
|9:00||Antonio Vassallo: The Metaphysics of Bohmian Quantum Gravity||Tom Bullock and Paul Busch: On the Connection Between SIC POVMs and MUBs|
|9:40||Casey McCoy: The Quantum to Classical Transition in Inflationary Cosmology||Federico Holik, Angel Plastino, Manuel Saenz and Gabriel Catren: Quantum Probabilities and Non Distributive Lattices|
|10:20||Marc Holman: The Significance of Empirical Principles for Quantum Gravity||Alexei Grinbaum: Quantum Observer and Kolmogorov Complexity|
|4 parallel tracks|
|Colloquium 2: On the Split Between Gravity and Inertia in Different Spacetime Theories – part 2 (Room "Sommerfeld", Richard-Wagner-Str.)||Contributed Session 23 – Chair: Michael Silberstein (Room "Heisenberg", Richard-Wagner-Str.)|
|11:15||J. Brian Pitts: Inertia and the Conformal-Projective Decomposition for Nordström-Einstein-Fokker, Massive Scalar, Einstein, and Massive Spin 2 Gravities||Elizabeth Miller: Quantum Mechanics and Humean Supervenience|
|11:55||Roundtable Discussion||Thomas Barrett: On the Structure of Classical Mechanics|
|Contributed Session 24 – Chair: Wolfgang Pietsch (Room "Pauli", Richard-Wagner-Str.)||Contributed Session 25 – Chair: Daniele Oriti (Room "Wien", Richard-Wagner-Str.)|
|11:15||Angel S. Sanz: Bohmian Mechanics: Is it possible to Think the Quantum World in a Different Way?||Markus Mueller: Axiomatic Reconstructions and Generalizations of Quantum Theory|
|11:55||Charles Sebens: Quantum Mechanics Without Wave Functions: The Newtonian Dynamics of Many Interacting Pseudo-Bohmian Worlds||Andreas Doering: Topos-Based Logic for Quantum Systems and Bi-Heyting Algebras|
|4 parallel tracks|
|Contributed Session 26 – Chair: Roland Poellinger (Room "Sommerfeld", Richard-Wagner-Str.)||Contributed Session 27 – Chair: Karim Thébault (Room "Heisenberg", Richard-Wagner-Str.)|
|13:45||Alison Peterman: Spinoza on Extension and Space||Matthias Egg: Inequivalent Representations Do Not Undermine Realism About Particles|
|14:25||Petter Sandstad: Powers and Kinds - Exemplified through a Study of Waves||Radin Dardashti: Group Structuralism in Particle Physics|
|15:05||Wolfgang Pietsch: Against Bayesianism in Physics: a Novel Argument||Pablo Ruiz de Olano: Are Elementary Particles Irreducible Representations? Color, Flavor, and Structuralism about SU(3)|
|Contributed Session 28 – Chair: Andreas Doering (Room "Pauli", Richard-Wagner-Str.)||Contributed Session 29 – Chair: Markus Mueller (Room "Wien", Richard-Wagner-Str.)|
|13:45||Peter Janotta, Christian Gogolin, Jonathan Barrett and Nicolas Brunner: Limits on Nonlocal Correlations From the Structure of the Local State Space||Marcin Pawlowski: What do we know about Monogamy of Nonlocal Correlations?|
|14:25||Johannes Biniok and Paul Busch: Multi-Slit Interferometry and Commuting Functions of Position and Momentum||Wieslaw Laskowski, Marcin Markiewicz, Tomasz Paterek and Marcin Wiesniak: Incompatible Local Hidden-Variable Models of Quantum Correlations|
|15:05||Simon Friederich: Rethinking Local Causality||Louis Vervoort: Can the Bell Inequality be Violated through Local Interactions with a Background Medium?|
|15:45||Change of Venue|
|Invited Session 5 – Chair: Karim Thébault (Room B139, Theresienstr.)|
|16:15||Rob Spekkens: On Causal Explanations of Quantum Correlations|
|Invited Session 6 – Chair: Detlef Dürr (Room B139, Theresienstr.)|
|17:45||Tim Maudlin: New Foundations for Physical Geometry|
Organizer: Sean Gryb
“Time is nothing but change. I spent hours and hours pacing through the Englischer Garten in Munich while persuading myself of this fact. Physics must be recast on a new foundation in which change is the measure of time, not time the measure of change.” This conviction, as recounted by Julian Barbour in The End of Time, had its birth in Munich in 1963. Fifty years later, on the same site, we celebrate the end of time and the impact Barbour’s conviction has had on theoretical physics. Beginning with his early work on Mach’s principles, we study the development of Best Matching and how it can be used to understand General Relativity and the idea of Background Independence. Then, we examine the influence these ideas have had on the Philosophy of Physics and Quantum Gravity. Finally, we end with an assessment of how Barbour’s ideas continue to be relevant today, from studies of the Problem of Time to the development of Shape Dynamics, where the space of scale-invariant 3-geometries, which Barbour calls Platonia, serves as the fundamental ontological arena for gravity.
Participants: Harvey Brown, Edward Anderson, Sean Gryb
Leibniz, Mach and Barbour
My comments will be concerned with the way that Leibniz's and Mach's thinking on the nature of space have influenced Julian Barbour's approach to the formulation of dynamical theories.
Kendall’s Shape Statistics as a Classical Realization of Barbour-type Timeless Records Theory Approach to Quantum Gravity
I have previously shown that Kendall’s shape geometry work is the geometrical description of Barbour’s relational mechanics’ reduced configuration spaces (alias shape spaces). I now describe the extent to which Kendall’s subsequent statistical application to examples such as the ‘standing stones problem’ realizes further ideas along the lines of Barbour-type timeless records theories, albeit just at the classical level.
Barbour’s Shape Space as an Ontology for Gravity
I will give a personal account of the development of the conformally invariant version of ‘Shape Dynamics’. The story will be told from three perspectives: i) a historical one, highlighting the role of College Farm and the unique interactions with Julian Barbour, ii) a philosophical one, describing a simple observation about the meaning of local scale in physics, and iii) a formal one, showing how an early observation of Poincaré combined with York’s method for solving the initial value problem in General Relativity led to a concrete implementation of Barbour’s ontology.
Organizers: Dennis Lehmkuhl and Oliver Pooley
It is sometimes said that General Relativity (GR) unifies the pre-relativistic concepts of inertia and gravity; but in such a way that neither concept survives unscathed. This symposium aims to explore the manner and extent to which the pre-GR concepts survive. Is GR a theory of the gravitational field? Or does it dispense with gravitational fields by showing that gravitational phenomena are reducible to inertial structure? Some authors (e.g., Stachel, Giulini and Janssen) see the affine connection as representing a unified gravitational-inertial field. Is this approach best understood in terms of something like a (coordinate-dependent, arbitrary) gravity/inertia split? Or is the presence of gravitational fields represented in a coordinate-independent way, via a non-vanishing curvature tensor? The latter idea was advocated most explicitly by Synge, and is indeed present in many contemporary textbooks on GR. The symposium aims to situate this aspect of the interpretation of GR in the wider context provided by different types of spacetime theories. One has to clearly distinguish the question of how GR, a theory with a single dynamical connection determined by a metric, should be interpreted, from the question of how the distinction between gravity and inertia is best understood in theories that employ different/additional structures (e.g., two metrics/connections; a non-symmetric connection). Each talk of the symposium compares GR to at least one other spacetime theory, in order to explore exactly how gravity and inertia should (or should not) be distinguished in the different theories.
Participants: Harvey Brown, Eleanor Knox, Dennis Lehmkuhl and Oliver Pooley, J. Brian Pitts
The Role of Geometric Explanation in General Relativity
Dennis Lehmkuhl has recently discussed the reasons why Einstein did not regard general relativity as a theory that geometrised gravity. I intend to add further arguments supporting this negative view.
The Gravity-Inertia Split in Newtonian and Relativistic Contexts
Relative to Newton-Cartan theory, Newtonian gravitation involves the splitting of a single curved connection into gravitational and inertial parts. I examine the prospects for imposing an analogous division of the connection in general relativity. It is well known that one cannot split the Levi-Civita connection in quite the same way as one does the Newton-Cartan connection, into a symmetric connection and a gravitational field. However, it is possible to divide the Levi-Civita connection into a non-symmetric connection and a part that has sometimes been held (in Teleparallel theories) to represent the gravitational field. I’ll argue that non-symmetric connections are not candidates for representing full inertial structure, and hence that general relativity unites the gravitational and inertial field in a particularly profound sense.
Against the Gravity–Inertia Split
Dennis Lehmkuhl and Oliver Pooley
To make sense of talk of a frame-dependent inertia–gravity split in general relativity, one needs to relate the theory to Newtonian gravity, and to recognise that two routes to privileged frames of reference need not yield the same sets of frames. On the first route, which paths in spacetime correspond to unaccelerated (“inertial”) motions is an absolute, coordinate-independent matter. The privileged frames are those whose standard of rest corresponds to inertial motion. On the second route, privileged frames are identified via classes of co-moving coordinate systems with respect to which dynamical equations take a simple, canonical form. In Newtonian gravity, the second route yields globally-defined frames with respect to which freely-falling bodies are (in general) accelerating. In practice, however, the theory cannot distinguish between frames that are relatively translationally accelerated. At best, therefore, an empirically undetectable proper subset of these frames encode inertial motion. The idea of a frame-dependent inertia–gravity split arises when one combines the idea that these frames encode inertia (and thus that free-fall motions involve gravitational deflection from inertial motion) with the idea that they are fundamentally physically equivalent. This combination, however, is not coherent. A preferable viewpoint reconciles an absolute notion of inertia with the physical equivalence of the frames identified via the second route by denying that they encode inertial motion. They are, instead, frames with respect to which the components of the connection take a particularly simple form, even though they do not all vanish. We will argue that Einstein’s central claims concerning the equivalence principle, and the frame-dependence of the gravitational field, are compatible with this second viewpoint.
Inertia and the Conformal-Projective Decomposition for Nordström-Einstein-Fokker, Massive Scalar, Einstein, and Massive Spin 2 Gravities
J. Brian Pitts
The Ehlers-Pirani-Schild (EPS) construction, which derives a metric tensor from a projective connection and a conformal metric density, has sometimes been thought to undermine the conventionality of geometry. It might be of renewed interest due to the appearance of the dynamical or constructivist approach to space-time geometry of Brown and Pooley. Constructivism shares with conventionalism the modally cosmopolitan awareness of a multiplicity of options, not all so tidy as to fit a unique geometry, leaving the ‘true’ geometry ambiguous.
An EPS-inspired decomposition is applied to Nordström-Einstein-Fokker (massless spin 0) scalar gravity and its belatedly studied cousin, massive spin 0, which agree on the geometry seen by matter (conformally flat). For massive scalar gravity, the symmetry group of the whole theory is the Poincaré group of Minkowski geometry, not the 15-parameter conformal group as in Nordström-Einstein-Fokker. By focusing only on the matter action, the EPS construction fails to notice the key geometrical differences between massless and massive spin 0 theories and hence fails to address key issues motivating conventionalist and constructivist positions. For both massless and massive scalar gravities, inertia has an absolute core but is modifiable invariantly by gravity.
The decomposition is then applied to Einstein’s General Relativity (massless spin 2) and its recently revived cousin(s), massive spin 2 gravity(s). Issues similar to those in the spin 0 comparison arise prima facie, but are complicated by gauge freedom (in both cases but for different reasons) as well as the greater number of fields.
Abstracts for individual talks
Schrödinger's Mirrors: Towards Table-Top Experiments at the Interface between Quantum Physics and Gravity
Massive mechanical objects are now becoming available as new systems for quantum science. Devices currently under investigation cover a mass range of more than 17 orders of magnitude, from nanomechanical waveguides weighing a few picograms to macroscopic, kilogram-scale mirrors of gravitational wave detectors. This opens fascinating perspectives for quantum foundations: the mass of available mechanical resonators provides access to a hitherto untested parameter regime of macroscopic quantum physics, eventually enabling novel tests at the interface between quantum physics and gravity.
Pragmatist approaches to relativistic quantum field theories (RQFTs) trade mathematical rigor for the ability to formulate non-trivial interacting models. Purist approaches to RQFTs trade the ability to formulate non-trivial interacting models for mathematical rigor. Philosophers of physics are split on whether foundational issues related to RQFTs should be framed within pragmatist or purist approaches. This essay addresses this debate by viewing it through the lens of a specific result that many authors have claimed is unique to RQFTs; namely, the CPT theorem. I first consider Greenberg's (2002) claim that, within the purist axiomatic approach, a violation of CPT invariance entails a violation of restricted Lorentz invariance. I then review a critique of Greenberg within the context of "causal perturbation theory", which seeks to establish a mathematically rigorous foundation for the perturbative techniques that underlie pragmatist approaches (Dütsch & Gracia-Bondía 2012). I then assess the extent to which causal perturbation theory can be viewed as an attempt to reconcile pragmatism and purity.
A century ago Einstein and Minkowski changed the world view of physics with the notion of spacetime, according to which there is no universal Now. Forty years later the great British quantum physicist Paul Dirac found evidence within the very structure of Einstein's wonderful theory which suggests that the notion of Now should be restored to the universe. Dirac's proposal is intimately related to the puzzle of the expanding universe: with respect to what is its size increasing? My talk will cover aspects of this fascinating story. In the words of Winston Churchill: "I cannot forecast to you the action of Russia. It is a riddle, wrapped in a mystery, inside an enigma; but perhaps there is a key. That key is Russian national interest."
Watch the video abstract on "MCMP at First Sight" here!
Download the files here.
Jeffrey A. Barrett
Hugh Everett III proposed his relative-state formulation of pure wave mechanics as a solution to the quantum measurement problem. There is indeed a concrete sense in which Everett's relative-state formulation of pure wave mechanics resolves the quantum measurement problem. But there is also a sense in which it predicts that everything physically possible in fact happens. The question then is whether pure wave mechanics might somehow be taken both to resolve the measurement problem and to be empirically adequate, and, if so, precisely how. Everett argued that pure wave mechanics was, as he put it, empirically faithful. We will consider what this meant, the sense in which he was certainly right, and the relationship between Everett's notion of empirical faithfulness and notions of empirical adequacy more generally. The suggestion will be that empirical faithfulness is well understood as a weak variety of empirical adequacy. If so, the very idea of empirical adequacy is something that must be renegotiated in the context of new physical theories given their other relative virtues, and Everett's formulation of pure wave mechanics provides a concrete example of such renegotiation.
Jill North (North, 2009) has recently argued that Hamiltonian mechanics ascribes less structure to the world than Lagrangian mechanics does. I will argue that North's argument is not sound. In doing so, I will present some obstacles that must be navigated by anyone interested in comparing the amounts of structure that different physical theories ascribe to the world.
I reconstruct from Rietdijk and Putnam’s well-known papers an argument against the applicability of the concept of becoming in Special Relativity, which I think is resistant to some of the objections found in the literature. I then consider a line of thought found in the discussion of the possible conventionality of simultaneity in Special Relativity, beginning with Reichenbach, and apply it to the debate over becoming. We see that it immediately renders Rietdijk and Putnam’s argument ineffective. I end by considering a possible objection to the approach of this paper.
Anthropic reasoning in cosmology emphasizes the difficulties for probabilistic analysis to deal with ignorance. Imprecise probabilities (families of credal functions) allow for a more adequate representation of ignorance. Thus, they clarify what epistemic artifacts sharp credences introduce, and allow us to better characterize what (limited) role anthropic considerations may play: rather than allowing for actual predictions, they only restrict the range of admissible values of parameters to be explained, unless informative priors are adopted. The predictive or explanatory gain that anthropic constraints yield then comes down to a much weaker compatibility condition.
Download the slides here.
Johannes Biniok and Paul Busch
In a recent, modified double-pinhole diffraction experiment the existence of an interference pattern was established indirectly, while at the same time the position localisation properties of the quantum state remained unchanged. Here, a description in terms of joint eigenfunctions of periodic position and momentum is presented. The validity of the proposed description is supported by additional simulations. The experiment is analysed further using a form of uncertainty relation adapted to multi-slit experiments.
A Silver Lining Among Infinities: How the Spin-Statistics Theorem Restored Faith in Quantum Field Theory
Alexander Blum and Christoph Lehner
The talk traces the developments that led to Wolfgang Pauli’s formulation of the spin-statistics theorem in 1940, beginning with early (around 1927) speculations as to why electrons obeyed Fermi-Dirac statistics while photons obeyed Bose-Einstein statistics. As these speculations matured, two main elements of any future answer to this question emerged: spin, which was recognized as a fundamental property of any elementary particle, and quantum field theory, the theoretical framework in which the question could be formulated precisely. The spin-statistics theorem then marked a success for the quantum field theory program and restored faith in its aim of formulating a fundamental theory of elementary particles.
In order to motivate his claim that the violation of the Bell inequalities forces us into a certain kind of holism, Howard (1985, 1989, 1992) develops a formal equivalence proof supposed to ground the stochastic “outcome independence” condition, the assumption of which is necessary for the derivation of the inequalities, in a physical principle, the separability principle. I will discuss several criticisms of Howard's equivalence proof that focus on the sufficiency of the separability principle for outcome independence. I will argue that, while none of these criticisms succeeds, they do constrain the possible form of Howard's argument.
Download the files here.
The derivation of the laws describing the macroscopic world from those governing the microscopic one is a very difficult problem. The root of the difficulty is sometimes seen as arising from the fact that the first set of laws are often time-irreversible, while the second ones are time-reversible. The goal of the talk will be to explain precisely these notions (macroscopic, microscopic, (ir)reversibility) and why this difference does not constitute an insuperable difficulty. We will also discuss the role of probability in the derivation of the macroscopic laws from the microscopic ones and criticize several misleading attempts at justifying this derivation.
Download the files here.
Tom Bullock and Paul Busch
Quantum tomography and quantum cryptography repeatedly make use of the concepts of mutually unbiased bases (MUBs) and symmetric informationally complete positive operator-valued measures (SIC POVMs). Beyond this, similarities between these constructions seem to suggest a deeper connection between the two. We will investigate this idea, showing that such a connection does in fact exist and that, under the correct circumstances, we are capable of producing MUBs from a SIC POVM, and vice versa. This will provide us with a better understanding of the nature of these constructions for future research.
I relate Ernest Nagel's account of inter-theoretic relations, especially reduction, to renormalization in quantum field theory (QFT). My main point turns on a contrast between two approaches to renormalization. On the old approach, which prevailed from ca 1945 to 1970, it is a piece of good fortune that high energy physicists can formulate renormalizable quantum field theories that are so empirically successful. But the new approach (from 1970 onwards) explains why the phenomena we see, at the energies we can access, are described by a renormalizable QFT. I argue, against some authors, that this situation accords with Nagel’s views.
In this talk, I expound an interpretation of the quantum formalism in which factor Hilbert space labels are taken to be nothing but a descriptive artefact, and so permutation invariance emerges as a gauge principle; i.e. a necessary condition on quantities' being genuinely physical. In this interpretation, fermions are always discernible - even by monadic predicates - and bosons are usually so. The interpretation also has novel implications for the notion of entanglement for such systems.
The present talk aims at bridging the researches on the foundations of Quantum Mechanics developed by the research groups in Amsterdam at ILLC and in Oxford by the Quantum Group. More precisely, we show how the frames of the logic(s) proposed by Baltag and Smets are related to the kinds of categories studied by Abramsky and Coecke. These structural investigations are followed by some remarks on how to improve the logical formalism in order to prove the correctness of quantum protocols. We conclude evaluating the consequences of this work for the foundations of Quantum Mechanics.
According to the Gottesman-Knill theorem, quantum algorithms which exclusively utilise operations from a particular restricted set are efficiently simulable classically. Since some of these algorithms involve entangled states, it is commonly concluded that entanglement is insufficient to enable quantum speedup. As I argue, however, what the Gottesman-Knill theorem actually demonstrates is only that it is possible to use an entangled state to less than its full potential. Despite this, an entangled quantum system provides sufficient physical resources to enable quantum speedup, whether or not one elects to use these resources fully. The relevance of this for our understanding of the Bell inequalities (e.g., CHSH, GHZ) will be discussed.
In the early 1970s it was realized that there is a striking formal analogy between the so-called laws of black-hole mechanics and the laws of classical thermodynamics. Before the discovery of Hawking radiation, however, it was generally thought that the analogy was only formal, and did not reflect a deep connection between gravitational and thermodynamical phenomena. In particular, it is still commonly held that the surface gravity of a stationary black hole can be construed as a true physical temperature only when quantum effects are taken into account; in the context of classical general relativity alone, one cannot cogently construe it so. Does the use of quantum field theory in curved spacetime offer the only hope for taking the analogy seriously? I think the answer is 'no'. To justify that answer, I shall begin by arguing that the standard argument to the contrary begs the question. Looking at the various ways the idea of 'temperature' enters classical thermodynamics will then suggest arguments that, I claim, show the analogy between classical black-hole mechanics and classical thermodynamics should be taken more seriously, at least so far as temperature goes, without the need to invoke quantum mechanics. If this is correct, then there may be a deep connection between classical general relativity and classical thermodynamics on their own, independent of quantum mechanics.
Particle physicists, in their construction of Lagrangian densities, are guided by several requirements that the terms of the Lagrangian need to satisfy. For instance, the Lagrangian needs to be invariant with respect to the symmetry group and needs to be renormalisable. Another less well-known requirement is the Completeness Requirement (CR). The CR demands that the Lagrangian contain all the terms satisfying the other requirements. The CR thus has a different status from the other requirements, and the aim of this paper is to discuss both the physical and philosophical significance of the CR in modern particle physics.
Richard Dawid and Karim Thébault
The subjective Everettian approach to quantum mechanics presented by Deutsch and Wallace fails to constitute an empirically viable theory of quantum phenomena. The decision-theoretic implementation of the Born rule realized in this approach provides no basis for rejecting Everettian quantum mechanics in the face of empirical data that contradicts the Born rule. The approach of Greaves and Myrvold, which also provides a subjective implementation of the Born rule but derives it from empirical data rather than decision-theoretic arguments, avoids the problem faced by Deutsch and Wallace and is empirically viable. However, there is good reason to cast doubt on its scientific value.
This paper asks whether the Aharonov-Bohm (A-B) effect does indeed show, as has been claimed, that physics is non-local. It begins by clarifying some of the different senses of the term ‘locality’ that have been used in the recent literature; then gives a (brief) exposition of the A-B effect itself. Then, various different interpretations of electromagnetism are considered, and it is discussed how the accounts these interpretations give of the A-B effect bear upon whether that effect should be considered to breach any of the above senses of locality.
We explore to what extent Belnap's branching space-times (BST) can be understood in terms of topological cobordism between Lorentzian manifolds. We proceed in two steps: first we investigate how the BST postulates constrain topological and Lorentzian cobordisms between manifolds, and show that such models of BST exist. We then turn to Earman's criticism of BST, arguing that dropping time-orientability and causal compactness are not such serious flaws of the theory, given its modal and probabilistic motivations. We end with brief comments on the advantages of this approach compared to dropping the Hausdorff condition, pointing out possible connections with the literature.
The so-called topos approach provides a radical reformulation of quantum theory, based on structures in certain presheaf topoi. In particular, to each quantum system a generalized state space, the spectral presheaf, is associated. Subsets (more precisely, subobjects) of this state space represent propositions about the quantum system, in analogy to classical physics. We show that there is a complete bi-Heyting algebra of subobjects, which leads to a completely new and well-behaved form of logic for quantum systems that has two notions of implication and two negations, corresponding to the two aspects of this logic: an intuitionistic aspect and a paraconsistent one.
In causal set quantum gravity spacetime is hypothesized to be atomic and causal order is the most basic organising principle. Fundamental discreteness brings with it novel possibilities for "dynamical laws" in which spacetime grows by the accumulation of new atoms, potentially realising within physics C.D. Broad's concept of a growing block universe in which the past is real and the future is not. That a growing block can be compatible with general covariance and the lack of a global time is demonstrated by the Rideout-Sorkin Classical Sequential Growth models, in which the “present” is identified with the growth process itself.
I critically assess two arguments against interpreting quantum field theory (QFT) in terms of particles, both of which are based on the occurrence of unitarily inequivalent representations in QFT. The first one is employed by Doreen Fraser to support an underdetermination claim concerning different approaches to QFT, which would call into question the approximate truth of the standard model. The second one (recently advocated by Laura Ruetsche) makes use of the Unruh effect to demonstrate that “particle” is an observer-dependent notion and hence lacks objectivity. This paper seeks to show that both arguments are inconclusive.
This paper takes up a suggestion that the reason we cannot find hidden variable theories for quantum mechanics, as in Bell’s Theorem, is that we require them to assign joint probability distributions on incompatible observables; these joint distributions are empirically meaningless on one standard interpretation of quantum mechanics. Some have proposed to get around this problem by using generalized probability spaces. I present a “no-go” theorem to show a sense in which generalized probability spaces can’t serve as hidden variable theories for quantum mechanics, so the proposal for getting around Bell’s Theorem fails.
There is a widespread belief that quantum theory and relativity stand in tension, motivated by the view that quantum theory violates local causality. J.S. Bell regards a theory as locally causal if, for the probability P(A) of an event A, P(A|E) = P(A|E&B), where B is space-like separated from A and E specifies what happens in its backward light cone. I show that if probability (following Lewis) is what imposes constraints on rational credence, then P(A|E) = P(A|E&B) is irrelevant for local causality. I diagnose how the misleading (to my mind) impression that quantum theory and relativity are in conflict arises.
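Bell's condition can be displayed for reference (a standard rendering; notation follows the abstract, with E a complete specification of what happens in the backward light cone of A):

```latex
% Local causality (Bell): once the backward light cone of A is fully
% specified by E, conditioning on a space-like separated event B adds nothing.
P(A \mid E) \;=\; P(A \mid E \,\&\, B)
```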
This paper intends to show not only that Einstein and the Logical Empiricists came to disagree about the role, principled or provisional, played by rods and clocks in general relativity, but also that, in their lifelong interchange, they never clearly identified the problem they were discussing. Einstein's ambivalent attitude towards rods and clocks as empirical indicators should be understood in the context of his discussion with Weyl and Eddington about the status of the “Riemannian postulate” within general relativity. The Logical Empiricists, by contrast, though carefully analyzing the Einstein-Weyl-Eddington debate, insisted on interpreting Einstein's epistemology of geometry as a continuation of the age-old Helmholtz-Poincaré debate on the empirical-conventional choice between Euclidean and non-Euclidean geometries.
One of the most fundamental concepts of Newton’s system is absolute space. It builds the connection between his theological-metaphysical thinking and his natural philosophy. Because of this role as fundament or ‘starting point’, the existence of space is not empirically demonstrable for him, neither in De Gravitatione nor in the Principia. Rather, it is taken for evident and underpinned by metaphysical arguments. Just in one of the last writings, the famous debate between Leibniz and Clarke, do we find a change in the justification of absolute space: To a very small extent, Newton anticipates Euler’s justification. To explore this, I will start with an analysis of De Gravitatione, which shows that Newtonian space is rooted in theology and metaphysics. Thereafter, I will examine the scholium to the definitions of the Principia. I will show that this text is largely an extension of the arguments found in De Gravitatione. Finally, I take a look at the correspondence between Leibniz and Clarke to trace the development of Newton’s justification of space.
Quantum mechanical formalism has an orthodox interpretation that relies on the von Neumann-Dirac cut between the observer and the observed system. This ''shifty split'', as John Bell called it, cannot be removed: the formalism only applies if the observer and the system are demarcated as two separate entities. Standard quantum mechanics says nothing about the physical composition of the observer, who is an abstract notion having no physical description from within quantum theory. As emphasized by Wheeler, this makes it extraordinarily difficult to state clearly where “the community of observer-participators” begins and where it ends. As a part of his relative-state interpretation, Everett argued that observers are physical systems with memory, i.e., “parts... whose states are in correspondence with past experience of the observers”. We call this a ‘universal observer’ hypothesis: any system with certain information-theoretic properties can serve as a quantum mechanical observer, independently of its physical constitution, size, or presence of conscious awareness. In this vein, Rovelli claimed that observers are merely systems whose degrees of freedom are correlated with some property of the observed system: “Any system can play the role of observed system and the role of observing system… The fact that observer has information about system is expressed by the existence of a correlation”. The universal observer hypothesis has remained a controversial statement to this day. For example, Peres claims in exact opposition to Rovelli that “the two electrons in the ground state of the helium atom are correlated, but no one would say that each electron ‘measures’ its partner”. The purpose of this paper is to clarify an information-theoretic definition of the quantum mechanical observer. Our argument is based on the Kolmogorov complexity of the information received by the observer.
We prove and discuss a condition based on algorithmic complexity that allows a system to be described as an objective “element of reality” by a class of observers. Finally we suggest an experimental test of the universal observer hypothesis based on the heat measurement during disintegration of fullerenes. We conjecture that, although they only have a small number of the degrees of freedom that correlate with external systems, fullerenes can act as quantum mechanical observers of photons.
Mile Gu, Karoline Wiesner, Elisabeth Rieper and Vlatko Vedral
Mathematical models generate predictions about the future, based on information available in the present. In the spirit of Occam's razor, simpler is better; should two models make identical predictions, the one that requires less input information is preferred. Yet, for almost all stochastic processes, even the provably optimal classical models require unnecessary information. Here, we systematically construct quantum models that break this classical limitation, and show that the system of minimal entropy that simulates such processes must necessarily exploit quantum logic. This indicates that the reality we observe could be significantly simpler than classically possible should quantum effects be involved.
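The claimed entropy gap can be illustrated with a small worked example in the spirit of the abstract (the "perturbed coin" process; the specific encoding below is a sketch under my own assumptions, not necessarily the authors' construction):

```python
from math import sqrt, log2

# "Perturbed coin": a two-state Markov process that flips its state with
# probability p at each step. A minimal classical predictive model must
# remember which state it occupies; the two states are equally likely,
# so the model stores H = 1 bit.
# A quantum model may instead encode the two causal states as
# non-orthogonal vectors |s_j> = sqrt(1-p)|j> + sqrt(p)|1-j>, paying only
# the von Neumann entropy of their equal mixture (eigenvalues (1 +/- c)/2
# for two pure states with real overlap c).
p = 0.4
overlap = 2 * sqrt(p * (1 - p))                   # c = <s0|s1> for this encoding
eigenvalues = [(1 + overlap) / 2, (1 - overlap) / 2]
quantum_bits = -sum(x * log2(x) for x in eigenvalues if x > 0)
classical_bits = 1.0
print(f"classical: {classical_bits:.0f} bit, quantum: {quantum_bits:.3f} bits")
```

For p = 0.4 the quantum memory cost comes out well below one bit, illustrating how a non-orthogonal encoding can undercut even the provably optimal classical model.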
Physical theories ought to be built up from colloquial notions such as 'long bodies', 'energetic sources' etc., in terms of which one can define pre-theoretic ordering relations such as 'longer than', 'more energetic than'. One of the questions addressed in previous work is how to make the transition from these pre-theoretic notions to quantification, such as passing from the ordering relation 'longer than' (if one body covers the other) to the notion of how much longer. In a similar way we introduce the dynamical notions 'more impulse' (if in a collision one object overruns the other) and 'more energetic' (if the effect of one source exceeds the effect of the other). In a physical model - built by coupling congruent standard actions - these basic pre-theoretic notions become measurable. We uncover the origin of the (basic) physical quantities of energy, momentum and inertial mass. From physical and methodological principles - without mathematical presuppositions - we derive all equations of (classical and relativistic) dynamics and ultimately the principle of least action.
A pragmatist interpretation permits a satisfactory resolution of the quantum measurement problem. The classic measurement problem dissolves on recognizing that the quantum state does not describe or represent the behavior of a quantum system. The residual problem of when, and to what, to apply the Born Rule may then be resolved by judicious appeal to decoherence.
The Mystery Behind Schrödinger's First Communication: A Non-Historical Study on the Variational Approach and its Implications
Pui Him Ip
It is well known that Schrödinger first derived his equation via a seemingly ad hoc variational principle. However, he abandoned the approach soon after. In this paper, it is argued that the variational approach can be well motivated physically. From this perspective, the quantum state is found to be conceptually analogous to the role of motion in classical mechanics. Paradoxically, it refers to a real ensemble with fictitious parts. Further, it is argued that, due to unobservability, the concept of motion in quantum mechanics should be relegated to a lesser role.
Reichenbach's Common Cause Principle claims that correlations between causally unrelated events should be traced back to a common cause, which is usually characterized probabilistically and localized spatiotemporally. But what is the exact relation between the probabilistic characterization and the spatiotemporal localization? Intuitively, common causes should be accommodated in the strong past, that is, in the intersection of the causal pasts of the correlated events; but the axiomatics of algebraic quantum field theory, for example, seems to suggest that they should lie in a broader region: the weak past, that is, the union of the causal pasts. How these localizations relate to each other in classical and in quantum theory, and how they relate to Bell's notion of local causality characterized in probabilistic terms - these are the questions the paper addresses.
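For reference, the probabilistic characterization the abstract alludes to - Reichenbach's screening-off conditions for a common cause C of a correlation between events A and B - is standardly given as:

```latex
% Reichenbach's common cause conditions: C (and its absence) screens off
% the correlation, and C raises the probability of each correlated event.
\begin{aligned}
p(A \wedge B \mid C) &= p(A \mid C)\, p(B \mid C)\\
p(A \wedge B \mid \neg C) &= p(A \mid \neg C)\, p(B \mid \neg C)\\
p(A \mid C) &> p(A \mid \neg C), \qquad p(B \mid C) > p(B \mid \neg C)
\end{aligned}
```

These four conditions jointly entail the correlation p(A ∧ B) > p(A)p(B), which is why they are taken to explain it.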
Federico Holik, Angel Plastino, Manuel Saenz and Gabriel Catren
We study the origin of quantum probabilities as arising from non-boolean propositional-operational structures. We discuss the relationship between a group theoretical notion of object and the problems posed by von Neumann regarding the development of a quantum probability theory.
When attempting to formulate a theory of quantum gravity, it is crucial to distinguish between empirical and theoretical principles. The latter are not necessarily invalid, but their a priori foundational significance should be regarded with due caution. I will illustrate these remarks in terms of the current standard models of cosmology and particle physics, as well as their respective underlying theories and I will then list four empirical principles that should arguably play a fundamental role in deducing the key structural features of quantum gravity.
The prototype of mutually independent systems are systems which are localized in spacelike separated regions. In the framework of locally covariant quantum field theory we show that the commutativity of observables in spacelike separated regions can be encoded in the tensorial structure of the functor which associates unital $C^*$-algebras (the local observable algebras) to globally hyperbolic spacetimes. This holds under the assumption that the local algebras satisfy the split property and involves the minimal tensor product of $C^*$-algebras.
Peter Janotta, Christian Gogolin, Jonathan Barrett and Nicolas Brunner
We examine connections between nonlocal correlations in a physical theory and the local structure of individual systems. In particular, we study toy theories with local state spaces given by regular polygons. The correlations on the maximally entangled state of these systems provide a transition from classical and maximally nonlocal post-quantum correlations to quantum correlations in the limit of infinitely many vertices. We prove that correlations on joint states acting as semi-inner products on measurement operators are restricted, which applies to all bipartite quantum states. Furthermore, we find evidence that the equivalence of states and measurement operators limits correlations.
Peter Janotta and Raymond Lal
Generalized probabilistic theories are a popular bottom-up approach to the study of the foundations of quantum theory. We abandon the no-restriction hypothesis, i.e. the usual assumption that for a given state space all mathematically consistent measurements should be included as physically valid. Consequently, the definitions of the possible preparations and measurement outcomes of a theory become independent, up to new consistency conditions, including a new maximal set of consistent joint states. Restricted systems can model inherent noise and act as a modification of any theory that exhibits the same equivalence of states and measurement operators as quantum theory.
The old quantum theory can offer us many rich insights into the challenges and motivations of developing a new physical theory. Drawing on work by Planck, Einstein and Bohr, I argue that from 1900 to 1915, the old quantum theory gained ground in the scientific community because the hypothesis of a discrete element in nature enabled scientists to account for various physical phenomena that were previously thought to be unrelated. This is a prime example of unification in a scientific theory, and thus provides historical evidence for the idea that unification is epistemically valuable.
The violation of a Bell inequality is an experimental observation that forces one to abandon a local realistic worldview, namely, one in which physical properties are (probabilistically) defined prior to and independently of measurement and no physical influence can propagate faster than the speed of light. All such experimental violations require additional assumptions depending on their specific construction, making them vulnerable to so-called “loopholes.” I will discuss a recent experiment (M. Giustina et al., arXiv:1212.0533) which, for the first time, closes the fair-sampling (detection) loophole for photons. This makes the photon the first physical system for which each of the main loopholes has been closed, albeit in different experiments.
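As a point of orientation, the canonical CHSH form of a Bell inequality (stated here for reference; the cited experiment in fact tests a detection-efficiency-friendly variant) bounds a combination of correlators E(a,b) for local settings a, a' and b, b':

```latex
% CHSH: any local hidden-variable model obeys the bound |S| <= 2;
% quantum mechanics allows violations up to Tsirelson's bound 2*sqrt(2).
S = E(a,b) + E(a,b') + E(a',b) - E(a',b'), \qquad |S| \le 2
```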
My paper consists of three parts. In the first part I outline the main ideas of decoherence and its claimed achievements. The following part is negative and points to the shortcomings of the decoherence program. I argue that decoherence only supplies partial, local and approximate answers, which solve none of the deep conceptual problems of quantum mechanics. The concluding third part is moderately positive and argues that decoherence reveals its greatest power only if the perspective is changed from foundational to certain pragmatic questions, in particular concerning the tenability of mechanistic explanations.
The aim of this talk is to investigate the difficulties related to the notions of mass and energy within general relativity in the light of the dual and dynamical nature of the gravitational field, which encodes both inertio-gravitational effects and the chrono-geometrical structures of spacetime. We discuss in particular the tension between on the one hand the ambiguities related to the notions of mass and energy and on the other hand the practical need for so-called ‘quasi-local’ quantities (e.g. quasi-local mass), such as in the context of evolving, dynamical black holes (e.g. within the recent ‘dynamical horizon’ framework).
Wieslaw Laskowski, Marcin Markiewicz, Tomasz Paterek and Marcin Wiesniak
There exist correlations between quantum systems that cannot be explained by any local hidden-variable (LHV) theory. The simplest scenario that demonstrates this phenomenon involves bipartite entangled quantum states measured with one of two local observables. This original approach of Bell was later extended to correlations between more parties and to correlations between different numbers of subsystems. A natural question arises: is there a correlation Bell inequality that can be violated although all inequalities involving correlations between a fixed number of observers are satisfied? Our main finding is that there exist multiparty states with explicit local hidden-variable models for correlations between any fixed number of subsystems, in a Bell scenario with two settings per party. Nevertheless these models can be disqualified: it turns out that they are incompatible with each other and cannot be extended to model correlations between various numbers of subsystems. We present a Bell-like inequality involving correlations between different numbers of subsystems which is satisfied by all LHV models and violated by quantum correlations.
John Manchak and James Weatherall
Hans Reichenbach famously argued that the geometry of spacetime is conventional in relativity theory, in the sense that one can freely choose the spacetime metric so long as one is willing to postulate a “universal force field”. Here we make precise a sense in which the field Reichenbach postulates fails to be a “force”. We then argue that there is an interesting and perhaps tenable sense in which geometry is conventional in classical spacetimes. We conclude with a no-go result showing that the conventionalism available in classical spacetimes does not extend to relativistic spacetimes.
In any cosmological theory of structure formation where the seeds of large-scale structure are quantum fluctuations in the early universe, the quantum to classical transition must be addressed, i.e. the success of such a theory depends on there being a solution to the measurement problem. It is of interest to see how the measurement problem plays out in cosmology, owing to factors such as the uniqueness of the universe, epistemological limitations, etc. I show how the many worlds interpretation of quantum mechanics in particular has the resources to address the measurement problem in the cosmological setting.
Thomas Meier and Radin Dardashti
Worrall’s (1989) approach is the locus classicus of the debate on structural realism in contemporary philosophy of science. This view affirms that when our theories change, what is retained is their structural content, and that there is structural continuity between successive theories. The case study we consider is the standard model of particle physics (SM) and certain proposed grand unified theories (GUTs), which aim at unifying the fundamental forces of the standard model. Within this framework the mathematical groups specifying the theories provide a natural setting in which to discuss the question of structure and structural continuity.
David Lewis is a natural target for those who believe that findings in quantum physics threaten the tenability of metaphysical reductionism. These philosophers point to allegedly holistic properties or entities that, they claim, are subjects of some claims of quantum theory and which purportedly fail to supervene on the Lewisian manifold. Here I defend Lewis’s doctrine of Humean supervenience in particular and metaphysical reductionism in general against an alleged threat from quantum holism.
Attila Molnár and Gergely Székely
The operational definition of mass and the notion of "experiment" are hardly expressible in a classical first-order language; a physical axiomatization, however, should be delivered in a suitably strong logic. Luckily, there are strong logics capable of expressing the notion of collisions that may not have been performed but are performable, and hence experimentation and the operational definition of the mass of objects: first-order modal logics are logics of this type. We give a formal operational definition of mass in a first-order modal theory of relativity and prove the main special-relativistic dynamical theorems in it.
Starting with Lucien Hardy's seminal work, the past few years have seen considerable progress in deriving quantum theory's abstract Hilbert space formalism from simple operational axioms. These approaches aim on the one hand at answering Wheeler's question "why the quantum?", and on the other hand at constructing the simplest or most plausible modifications of quantum theory that could be tested experimentally. In this talk, I describe two recent axiomatizations of quantum theory. While the first one is conceptually simpler, the second one relates quantum theory directly to current experiments on conceivable "third-order interference".
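The "third-order interference" mentioned at the end refers to Sorkin's hierarchy of interference terms. In a triple-slit setup with P denoting detection probabilities for the indicated open slits, the standard statement (included here only for orientation) is:

```latex
% Sorkin's third-order interference term: the Born rule forces I_3 = 0,
% while ordinary two-slit interference makes I_2 = P_{12} - P_1 - P_2 nonzero.
% A measured nonzero I_3 would therefore falsify quantum theory.
I_3 = P_{123} - P_{12} - P_{13} - P_{23} + P_1 + P_2 + P_3 = 0
```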
In the context of discussions about the nature of ‘identical particles’ and the status of Leibniz’s Principle of the Identity of Indiscernibles in Quantum Mechanics, a novel kind of physical discernibility has recently been proposed (by Dieks and Versteegh, and Ladyman and Bigaj), which we call witness-discernibility. We inquire into how witness-discernibility relates to known kinds of discernibility. Our conclusion will be that for a wide variety of cases, including the intended quantum-mechanical ones, witness-discernibility collapses extensionally to absolute discernibility, that is, to discernibility by properties. This conclusion is attained by a sequence of theorems in the framework of the model theory of elementary predicate logic.
We consider the reductive claim that a gas is a collection of molecules, employing the logical framework of case-intensional first-order logic (CIFOL), inspired by Bressan 1972. Our aim is to provide a means of representation that helps to distinguish this seemingly innocuous claim from stronger and less plausible ones, and to clarify the role of identity in this and similar reductions. The main logical point is that the two sortal predicates involved, 'gas' and 'collection of molecules', correspond to different persistence conditions. We conclude by offering an analysis of the status of the law of entropy increase as true for gases, yet possibly false for collections of molecules.
In this talk we examine possible causal structures of experiments with entangled quantum objects. We try to improve on previous discussions in that (i) we proceed from a recent stronger version of Bell's theorem (Näger 2012, preprint) and (ii) use the rigorous methods of causal graph theory. Our result will be that the EPR correlations come about exclusively by common causes. Among the common causes there must be the quantum state and at least one of the measurement settings. Particularly, we provide an argument that there is no direct causal relation between the measurement outcomes. This refutes the standard view that the EPR correlations arise due to an influence from one outcome to the other ("outcome dependence"). Our result also improves on conclusions from information theoretic considerations (Maudlin 1994, ch. 6; Pawlowski 2010), which wrongly allow for such direct connections between the outcomes.
A recent article by David Wallace and Christopher Timpson argues that relativistic considerations should make us question the viability of wave function realism. A central problem stems from the fact that in quantum field theory, the particle number of a system may vary. This entails that the dimensionality of configuration space can vary, undermining the position of the wave function realist who takes configuration space to be the fundamental, physical space in which the dynamics play out. I address their argument to try to show why the failure of particle number conservation does not undermine wave function realism.
John D. Norton
The thermodynamics of computation assumes that thermodynamically reversible processes can be realized arbitrarily closely at molecular scales. They cannot. Overcoming fluctuations so that a molecular scale process can be completed creates more thermodynamic entropy than the small quantities tracked by Landauer's Principle. This no go result is the latest instance of a rich history of problems posed by fluctuations for thermodynamics.
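For scale, the "small quantities tracked by Landauer's Principle" can be computed directly; a minimal numerical illustration of the bound (my own sketch, not part of the talk):

```python
from math import log

K_B = 1.380649e-23  # Boltzmann constant in J/K (exact in the 2019 SI)

def landauer_bound(temperature_kelvin: float) -> float:
    """Minimum heat dissipated per bit erased: k_B * T * ln 2."""
    return K_B * temperature_kelvin * log(2)

# At room temperature (300 K) the bound is about 2.87e-21 J per bit.
e = landauer_bound(300.0)
print(f"{e:.3e} J per bit erased at 300 K")
```

The argument of the talk is that overcoming fluctuations in molecular-scale processes creates entropy well in excess of this per-bit minimum.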
Pablo Ruiz de Olano
It is sometimes claimed that elementary particles are “just” irreducible representations of the symmetry groups of the Standard Model. Ontic structural realists, in particular, have suggested that we may be able to do away with objects and embrace a fully structural account of elementary particles in terms of group-theoretical invariants. In this paper, I assess the plausibility of such claims by looking at a historical case-study concerning the development of QCD from the 1960s through the mid-1970s. As I argue, examination of the various attempts made during that period to come up with a successful field theory for the strong interaction does not support the idea that elementary particles ought to be identified with irreducible representations. Rather, the case-study suggests that the importance of group theory for particle physics derives from its power as a tool for codifying dynamical information, and from its usefulness as a heuristic for inquiring into the nature of the strong force.
We recall the arguments for the disappearance of continuum spacetime at microscopic scale, from various quantum gravity approaches. We then consider the idea of emergence of spacetime from the collective behaviour of pre-geometric quantum building blocks. We argue for this emergence being the result of a phase transition (“geometrogenesis”) and discuss the related conceptual issues. As a concrete example, we outline the Group Field Theory framework for quantum gravity, and present recent results on the explicit realization of geometrogenesis in it. Last, we re-examine the conceptual issues raised by the emergent spacetime idea in light of this example.
Monogamy of nonlocal correlations has been identified as one of the key properties of Bell inequalities. In a nutshell, monogamy means that, in a multipartite system, the larger the violation of a Bell inequality by one subset of parties, the smaller the possible violation by any other. In this presentation I try to give an exhaustive review of what is already known about monogamy of nonlocal correlations and also report some new results. My talk is divided into two parts. In the first I present the actual monogamy relations and explain the methods of finding them. In the second I give several applications of these relations.
I argue that when Spinoza talks about the attribute of Extension he does not mean the three-dimensional “extension of the geometers”, and that by “an extended thing” he does not mean a volume. The argument proceeds in two parts: first, I present Spinoza’s argument that corporeal substance is not extended in length, breadth and depth. Second, I make the more controversial case that, according to Spinoza, our perception of physical things as volumes is a function of the imagination, and hence inadequate. Neither physical substance nor finite bodies, then, are understood properly as three-dimensional.
I argue that it constitutes a category mistake to ascribe probabilities to physical theories, which renders Bayesian probabilism largely inapplicable to physics. The argument starts from the observation that physical theories generally contain conventions and that conventions by their very nature cannot be evaluated in terms of probabilities. I then discuss several options for how probabilities might be ascribed to a conjunction of empirical hypotheses and conventions - with the result that none of them works. The most promising attempt, namely in terms of probabilities of the empirical consequences given certain conventions, fails because empirical and conventional elements in physical theories cannot be separated.
This paper contains a critical review of previous attempts at using many-valued logics in the foundations of quantum mechanics. An isomorphism of 'quantum logic' in the sense of Birkhoff and von Neumann (i.e., an orthomodular lattice with an ordering set of probability measures) with a specific version of infinite-valued Łukasiewicz logic, elaborated in our previous papers, is presented. This forms a basis for a new interpretation of quantum mechanics in which numerous paradoxes, like the Greenberger-Horne-Zeilinger paradox, cannot be derived.
Tim Räz and Tilman Sauer
This paper is about the problem of the applicability of mathematics to the world. We will confront a promising account of applicability, the so-called "Inferential Conception", with a historical case study, the collaboration of Albert Einstein with Marcel Grossmann on the "Entwurf theory", an early "draft" of general relativity. We closely examine the mathematical part of the Entwurf and the mathematical sources it draws on. We show that a considerable effort of adaptation is involved in application, and that the question of the nature of the applied mathematical theory is highly nontrivial.
Download the files here.
The talk specifies a number of independence concepts in terms of non-selective operations understood as completely positive unit preserving linear maps on the C* and W* algebras representing observables of a quantum system. The independence concepts are grouped into two large classes: one class expresses the co-possibility of different types of operations on subsystems S1 and S2 of a larger quantum system S, the other group expresses no-signaling properties of operations with respect to the subsystems S1 and S2. Propositions and a number of open problems on the relation of the different operational independence concepts are presented, and the status of the operational independence concepts in relativistic quantum field theory is discussed.
Similarity Assessments, Spacetime, and the Gravitational Field: What Does the Metric Tensor Represent in General Relativity?
In this paper I explore the dialectics underlying the choice between a geometrical and a field interpretation of the metric tensor gab in general relativity. My aim is to examine the role of a specific type of reasoning process (similarity-based reasoning) in interpreting gab. In recent years, philosophers of physics have claimed that the problem of choosing between the two interpretations in question is somehow insubstantial. This appearance of insubstantiality, I contend, stems from a basic form of underdetermination that affects the concepts of spacetime and physical field in the context of general relativity. I characterize and defend such underdetermination.
Download the handout here.
Christian de Ronde
In this paper we attempt to physically interpret the Modal Kochen-Specker theorem. In order to do so, we analyze the features of the possible properties of quantum systems arising from the elements in an orthomodular lattice and distinguish the use of “possibility” in the classical and quantum realms as related to their particular formalisms.
In his famous correspondence with Clarke, Leibniz maintained that if space is a substance or something real, then there would be no sufficient reason for God to have distributed matter in the world so as to occupy the specific places it does, rather than other places by, say, rotating East into West with all relative distances remaining the same. Clarke indeed acquiesced that this would constitute a different distribution of matter in space, but that the sufficient reason for the actual distribution lies solely in the will of God. Here I explore whether Clarke could have just flunked the Leibniz test without abandoning substantivalism. Certainly various theological commitments concerning God’s relation to space and time and the concomitant principles of individuation have to give way, but what about the space-time physics? I argue that some revision is necessary, but only by way of moving from a space+time to a space-time ontology.
Angel S. Sanz
Quantum mechanics is the most powerful tool that we have nowadays to describe the microscopic world. It provides us with accurate answers about quantum systems, but says nothing about why they emerge in the way they do. The purpose of this contribution is to show how Bohmian mechanics can help to overcome this flaw without leaving a fully quantum scenario, as is commonly done in many physical disciplines to interpret the quantum outputs. To achieve this goal, a series of systems from Atomic, Molecular and Optical Physics has been selected.
Digital Philosophy (DP for short) is the thesis, put forward by e.g. Gregory Chaitin, Edward Fredkin, Stephen Wolfram, and Konrad Zuse, that the universe is at its core a Turing machine. As a slogan, one might say that 'the universe is computable'. While it is presently impossible to (dis)prove the claims of DP, we can investigate whether e.g. the theories, the mathematical practice of Physics, or the models used in Physics are computable. In this talk, we undertake this foundational challenge in three specific areas. Our (weaker than DP) thesis is that Mathematics used in Physics is computable by design, and this constitutes evidence against DP as 'When wearing Z glasses, everything looks Z', where Z can be 'green' or 'computable'.
Recently there has been much written defending a power ontology in metaphysics (e.g. Stephen Mumford, Dispositions, OUP, 1998). What has been understated in this discussion is the role of (natural) kinds in this ontology. In this paper I argue that a power ontologist ought to include kinds in his ontology. It is argued that kinds serve two essential roles in scientific inquiry:
- An explanatory role by kind-membership (causa formalis)
- As a basis for inference from one kind-member to another (analogy)
This is defended through a case-study of the development of wave as a kind in physics.
The decoherent histories formalism, developed by Griffiths, Gell-Mann, and Hartle is a general framework in which to formulate a timeless, 'generalised' quantum theory and extract predictions from it. Recent advances in spin foam models allow for loop gravity to be cast in this framework. In this paper, I propose a decoherence functional for loop gravity and interpret existing results as showing that coarse grained histories follow quasiclassical trajectories in the appropriate limit.
Quantum Mechanics Without Wave Functions: The Newtonian Dynamics of Many Interacting Pseudo-Bohmian Worlds
On the face of it, quantum physics is nothing like classical physics. Particles enter into superpositions and get entangled. Point masses interacting via forces are replaced by wave functions governed by the Schrödinger equation. Work in the foundations of quantum theory has provided some palatable interpretations of the quantum formalism, but the world is still undoubtedly strange and each interpretation faces serious objections. In this paper I will argue that we can use Bohmian mechanics as a stepping stone to give a new interpretation of quantum mechanics as a purely Newtonian theory. All there are are point particles accelerated by forces, no wave functions, no Schrödinger equation. However, there is a cost: we must accept the existence of many worlds.
Download the files here.
Rathindra Nath Sen
In a double-slit experiment with electrons, path detection will cause Heisenberg's uncertainty principle to destroy interference. In an experiment with excited atoms, path detection by de-exciting the atom may leave its wavelength essentially unchanged. Duerr, Nonn and Rempe found experimentally that interference is nevertheless destroyed, as predicted by Scully, Englert and Walther. These authors invoke a 'principle of complementarity', more general than uncertainty, which they assert is enforced by different mechanisms in different experimental situations. We show that Galilei invariance suffices to explain the result; it also predicts a testable consequence which is not predicted by complementarity.
It has been suggested that granting lawhood status to special-science generalisations (SSGs) implies elaborate conspiracies amongst fundamental particles, whose behaviour at the micro-level is mysteriously coordinated to make SSGs projectible: the 'microscopic conspiracy' problem (MC). This paper will critically assess two theories of special-science laws:
- Albert and Loewer's (AL) theory, and
- Callender and Cohen's 'Better Best System' (BBS) theory
I will defend the AL theory against Callender and Cohen's criticisms, but will ultimately find that the optimal non-conspiratorial theory of lawhood is a version of BBS that considers the way in which the origins of macroscopic subsystems restrict their later behaviour. I call this the 'Subsystem Genealogy' amendment, and propose that it closes vital explanatory lacunae in the otherwise powerful BBS theory.
Michael Silberstein, William Stuckey and Timothy McDevitt
We propose a path integral over graphs approach to unification that requires a modification and reinterpretation of both general relativity and quantum field theory via their graphical instantiations, Regge calculus and lattice gauge theory, respectively. Accordingly, the spacetime metric and the matter and gauge field gradients on the graph are co-determining, so there is no “background spacetime” connoting existence independent of matter-energy-momentum, and the graphical action can be characterized geometrically via graphical boundary operators. We will explain foundational implications of this approach for quantum mechanics, quantum field theory and general relativity.
Download the files here.
If correlation does not imply causation, then what does? Causal discovery algorithms take as their input facts about correlations among a set of observed variables, and they return as their output causal structures that can account for the correlations. We show that any causal explanation of Bell-inequality-violating correlations must contradict a core principle of these algorithms, namely, that an observed statistical independence between variables should not be explained by fine-tuning of the causal parameters. The fine-tuning criticism applies to all of the standard attempts at causal explanations of Bell correlations, such as superluminal causal influences, superdeterminism, and retrocausal influences that do not introduce causal cycles. This suggests a novel perspective on the assumptions underlying Bell's theorem: the nebulous assumption of realism can be replaced with the principle that all correlations ought to be explained causally and Bell's notion of locality can be replaced with the assumption of no fine-tuning. Finally, we discuss the possibility of salvaging a causal explanation of quantum correlations by casting quantum theory as an innovation to the theory of Bayesian inference.
Download the files here.
This essay introduces and explores Oliver Pooley’s recent presentation of the ‘dynamical approach’ to Special Relativity as a relativistic version of Nick Huggett’s ‘regularity relationalism’. Thus it considers the possibility of an ontologically and ideologically relationalist best-systems account of Minkowski geometry, based upon an ontology of physical fields with point-like parts and primitive topological structure. The dynamical approach, regularity relationalism, and Pooley’s proposal are first introduced, after which it is shown that the topological structure for Pooley’s construal of the dynamical approach is unlikely to be any richer than the standard Euclidean topology. Some potentially successful examples are outlined, but are accompanied with philosophical concerns that apply regardless of whether the mathematical project can be carried out.
Feelings among philosophers of physics have been running high about the virtues and vices of the C*-algebraic approach to quantum field theory (AQFT). Does this approach constitute the natural starting point for foundational investigations – as Fraser holds – or does conventional quantum field theory (CQFT) represent a coherent rival program in full accordance with the standards of present-day theoretical physics – as Wallace has it? I argue that both arguments miss each other because they do not take into account the complex history of AQFT and the peculiar role of mathematical rigor when mathematical physicists contribute to an evolving physical program rather than reconfigure a domain that is physically as well-determined as classical mechanics or atomic physics.
Marij van Strien
In this paper I examine the historical relationship between determinism and classical mechanics. I argue that Laplace, the most famous proponent of determinism in physics, never actually derived his determinism on the basis of mechanics; instead, his determinism was based on metaphysical ideas about causality and the principle of sufficient reason. Later authors, who argued for determinism in physics during the nineteenth century, followed Laplace without giving further justification of determinism; throughout the nineteenth century, determinism was a metaphysical principle rather than a theorem in physics.
Download the files here.
Noel Swanson, David John Baker and Hans Halvorson
In addition to bosons and fermions, the quantum indistinguishability postulate permits the existence of paraparticles obeying mixed-symmetry statistics. Why are these particles absent from nature? We consider one potential answer: every paraparticle theory is physically equivalent to some theory of bosons or fermions, making their absence a matter of theoretical convention. We argue that this equivalence thesis holds for all physically admissible quantum theories falling under the domain of the rigorous Doplicher-Haag-Roberts approach to superselection rules. Inadmissible parastatistical theories are ruled out by a locality-inspired principle we call Charge Recombination.
The Failure of Traditional Measurement-Accuracy Realism and the Repercussions for Understanding Vagueness
I challenge “traditional measurement-accuracy realism”, according to which there are in nature quantities that have definite values. An accurate measurement outcome is one that is close to the value for the quantity measured. A measurement of the temperature of some water being accurate in this sense requires that there be this temperature. But there isn’t. Not because there are no quantities “out there in nature” but because the term ‘the temperature of this water’ fails to refer, owing to idealization and failure of specificity in picking out concrete cases. The paper then explores connections with the phenomenon of vagueness.
The metaphysics of the "Trans-World" structuralism suggested by the ontic structural realist approach to unitary inequivalence in quantum physics refutes actualism, i.e., the view that everything that exists is actual. To support this claim, I argue that neither reductive nor non-reductive actualism can account for the modality in "Trans-World" structuralism. This presents ontic structural realists with a dilemma: either they embrace possibilism, or they give up naturalism.
I show that if the solenoid in the Aharonov-Bohm effect is treated quantum mechanically, the effect can be explained via local interaction between the field of the electron and the solenoid. I argue that the electromagnetic potential is just an auxiliary concept: everything can be explained through local action of fields. The core of the Aharonov-Bohm effect is that of quantum entanglement: the quantum wave function describes all systems together.
The paper shows how the Bohmian approach to quantum physics can be applied to develop an ontology of quantum gravity. We suggest conceiving of atoms of space as the primitive ontology of the theory and show how a non-local law, in which a universal and stationary wave function figures, can describe an evolution of configurations of such atoms from which a classical spacetime emerges. Although there is as yet no fully worked out theory of quantum gravity, we consider the Bohmian approach as setting up a standard that proposals for a serious ontology in this field should meet.
Download the files here.
We investigate stochastic hidden-variable models that describe the interaction of Bell-particles with a stochastic ‘background’ medium. It appears possible to construct a model that violates the Bell-inequality, even if relying on local interactions only. The model is based on a generalized lattice-gas or Ising Hamiltonian. The essential premise of the Bell-inequality that is violated by the models is ‘measurement independence’; however the model does not rely on any superdeterministic process. The model would confirm a conjecture made by participants of this congress. Finally, we discuss a possible connection with published ‘sub-quantum’ theories all invoking a stochastic background field.
Sylvia Wenmackers and Danny Vanpoucke
Norton’s dome is an example of indeterminism in Newtonian physics, based on a non-Lipschitz continuous differential equation (Norton 2008). We present an alternative model using non-standard analysis, which involves infinitesimals and is close to physical praxis. Our hyperfinite model for the dome is deterministic. Moreover, it allows us to assign probabilities to the variable in the indeterministic model. If we follow Sommer and Suppes’ (1997) suggestion that non-standard models are empirically indistinguishable from models based on standard reals, we have to conclude that (in-)determinism is a model-dependent property. Werndl (2009) reaches the same conclusion for a different source of indeterminism.
In Boltzmannian statistical mechanics, and in dynamical systems theory more generally, a measure is defined over all possible states. A popular view among contemporary Boltzmannian physicists is to interpret this measure as a typicality measure, i.e., as counting the states. In dynamical systems theory measures can similarly be interpreted as typicality measures. For this approach to be defensible, a justification needs to be provided for the particular choice of typicality measure. However, such a justification is missing in the literature, and this paper attempts to fill this gap. It is first argued that Pitowsky's (2012) justification of typicality measures does not fit the bill. Then a new justification of typicality measures is advanced which appeals to measurement, translation-closeness and the dynamical condition of epsilon-ergodicity.
This talk will summarize the conceptual framework behind a newly-proposed spacetime-realist account of quantum phenomena (arxiv.org/abs/1301.7012). Altering the path integral by restricting the "sum over all histories" (and replacing complex amplitudes with subjective probabilities) seems to allow a realistic Lagrangian-based ontology. This no longer permits a Hamiltonian description (or any particular dynamical equations), but does allow a principle-based derivation of the Born rule. Since the Lagrangian obeys future boundary constraints, this also allows for an explicit description of a Bell-inequality-violating system in terms of a continuous, spacetime-based, hidden-variable model.
Download the files here.
I will revisit Smale's 14th problem and address the question of whether we can confidently class both properties of numerically iterated solutions and those of analytically constructed maps as 'chaotic'. I will argue that these two traditions of chaos research are concerned with different properties of chaotic systems: the analytic tradition is primarily concerned with instability, while iterated chaotic maps display both instability and aperiodicity. In addition, I will draw attention to the fact that there currently exists no conclusive evidence showing that chaotic properties of iterated maps are ontological rather than artifacts of the necessary numerical truncations. These differences indicate that the two systems are not as closely related as they are usually presented to be.