
Rethinking Foundations of Physics 2018

Saturday, March 17 - Saturday, March 24, 2018

 

Traditional conferences and subject-specific workshops offer little room for in-depth discussion of the foundations of physics in an open, creative and speculative way. Since the first meeting in 2013, this workshop has offered a platform for engaging in such discussions.

The workshop centers on discussions in small groups. The aim of these discussions is to reveal implicit assumptions in physics, to clarify the conceptual core of its guiding ideas, and to explore new ways of thinking about problems in basic research. Ideally, the workshop thus bridges the gap between deep critical thinking and creative brainstorming. To lay the groundwork for these sessions, participants will give talks.

The following general questions convey the spirit of the workshop and may serve as guidelines for more specific subjects of the discussion sessions:

- Which mathematical, conceptual, and experimental paradigms underlie modern formulations of QM, GR, and QFT?
- Can they be relaxed or changed? If so, how?
- Which new mathematical developments could be relevant for future foundations?
- Are there promising new or non-standard experimental possibilities?

The workshop is aimed at PhD students and postdocs in physics, mathematics and philosophy. Participants are not required to have substantial knowledge of all modern physical theories, nor do they have to be currently working on basic research questions. We look forward to applications from people who are passionate about conceptual questions, open to other fields and, most importantly, eager to engage in deep discussions.

The deadline for applications is January 14, 2018. Further examples of the type of topics to be expected can be found on the websites of the previous workshops (2014, 2015, 2016, 2017).

Date: Saturday, March 17 - Saturday, March 24, 2018
Place:
Mountain Cabin, Dorfgastein, Austria
Participants: 18
Workshop fee: 350 Euros (includes accommodation, shuttle to the cabin, all meals, and non-alcoholic beverages; financial support for both the workshop fee and travel expenses is available*)

 

Participants:

 

Jeremy Attard
Centre de Physique Théorique, Aix-Marseille University, France

Florian Buchholz
Max Planck Institute for the Structure and Dynamics of Matter, Hamburg, Germany

Flavio Del Santo
Faculty of Physics, University of Vienna, Austria

Guilherme Franzmann
Physics Department, McGill University, Canada

Tomas Gonda
Perimeter Institute for Theoretical Physics & University of Waterloo, Canada

Felix Huber
Institute for Theoretical Physics, University of Cologne, Germany

Johannes Kleiner
Institute for Theoretical Physics, Leibniz University of Hannover, Germany

Marius Krumm
Institute for Quantum Optics and Quantum Information Vienna & University of Vienna, Austria

Robin Lorenz
Department of Computer Science, Oxford University, UK

Tim Ludwig
Institute for Theoretical Condensed Matter Physics, Karlsruhe Institute of Technology, Germany

Cristian Mariani
Philosophy and Human Sciences, University of Milan, Italy

Pierre Martin-Dussaud
Centre de Physique Théorique, Aix-Marseille University, France

Maria Papageorgiou
Institute for Quantum Computing, University of Waterloo, Canada

Joshua Tan
Department of Computer Science, University of Oxford, UK

Alexander Thomas
Department of Mathematics, University of Strasbourg, France

Federico Zalamea
ERC Philoquantumgravity, Laboratoire SPHERE, Université Paris Diderot, France

Václav Zatloukal
Department of Physics, Czech Technical University in Prague, Czech Republic

 

Statement of Inclusiveness

We affirm that scientific events have to be open to everybody, regardless of race, sex, religion, national origin, sexual orientation, gender identity, disability, age, pregnancy, immigration status, academic affiliation, social class, financial situation or any other aspect of identity. We believe that such events have to be supportive, inclusive, and safe environments for all participants. We believe that all participants are to be treated with dignity and respect. Discrimination and harassment cannot be tolerated. We are committed to ensuring that the scientific events in which we participate follow these principles.

(This statement is part of a larger initiative among the scientific community which we invite you to discover here.)

Discussion Proposals:


Can We Think of Physics without Spacetime?

Jeremy Attard


The question I propose to address is the following: can we think of physics without spacetime?
This question is twofold. First, on a practical level, it asks whether it is possible to build physical theories without taking spacetime (i.e. a 4-dimensional smooth manifold) as a primary notion. Second, on a conceptual level, and with the help of such practical (mathematical) notions, the question becomes whether (and how) we can think of physical reality without the strong and intellectually invasive image of a spacetime existing independently of physical phenomena.

 

Deep Learning - Why Can It be Successful?
Florian Buchholz


My presentation at the workshop will be based on the article by Max Tegmark (astrophysicist at MIT) and co-authors, "Why does deep and cheap learning work so well?" [1], in which they connect insights from fundamental physics with deep learning. They compare the process of physical modelling with the structure of neural networks and point out many interesting similarities, e.g. how symmetries simplify descriptions or, more subtly, how the hierarchical structure of certain processes is exploited by physicists, for example in effective field theory.

Behind this lie some statements that are well worth discussing, e.g. that physics "favors certain classes of exceptionally simple probability distributions that deep learning is uniquely suited to model." I want to discuss this also in connection with my own experience in many-body physics, where we make a similar observation: although Hilbert space covers all possible states of a wavefunction, only a very tiny fraction of this space of possibilities is actually realized in nature. State-of-the-art approximations to many-body quantum mechanics, like density functional theory, can be regarded as "expressions" of this fact.
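A back-of-the-envelope sketch of this "tiny corner of Hilbert space" point (my own illustration, using a matrix product state ansatz rather than density functional theory): the number of amplitudes of a generic n-qubit state grows exponentially with n, while the parameter count of a low-entanglement ansatz grows only linearly.

```python
# Rough comparison: amplitudes of a generic n-qubit state (2**n) versus the
# parameter count of a matrix product state (MPS) with fixed bond dimension D,
# which scales roughly as n * d * D**2 for local dimension d.

def full_hilbert_dim(n_sites: int, d: int = 2) -> int:
    """Number of complex amplitudes of a generic n-site state."""
    return d ** n_sites

def mps_param_count(n_sites: int, d: int = 2, bond_dim: int = 16) -> int:
    """Approximate parameter count of an MPS with uniform bond dimension."""
    return n_sites * d * bond_dim ** 2

for n in (10, 20, 40, 80):
    print(f"n = {n:3d}:  full Hilbert space = {float(full_hilbert_dim(n)):.3e}   "
          f"MPS (D=16) parameters = {mps_param_count(n):,}")
```

The exponential gap is one way to phrase why physically relevant states, like the "exceptionally simple probability distributions" of [1], occupy only a tiny, highly structured corner of the full possibility space.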

[1] Lin, H. W., Tegmark, M., & Rolnick, D. (2017). Why Does Deep and Cheap Learning Work So Well? Journal of Statistical Physics, 168(6), 1223–1247

 

Quantum Measurement Problem: Observers in Quantum Superposition
Flavio Del Santo


One of the principal reasons why different interpretations of QM are still the object of heated debate is the fundamental "quantum measurement problem", which is strongly related to the issues of determinism and realism. It addresses the problem of what distinguishes a measurement from the standard (i.e. unitary) evolution in QM, namely "what makes a measurement a measurement" (Brukner, 2017, in Quantum [Un]Speakables II, pp. 95-117).

Following recent developments of Deutsch's version of the "Wigner's friend" Gedankenexperiment, Brukner has proposed to prepare pairs of "observers" in quantum superposition, on which "superobservers" can perform quantum measurements. This results in a no-go theorem (a form of Bell inequality) whose experimental confirmation would have even more severe fundamental consequences: it would rule out the view according to which "facts of the world" (i.e. truth values of statements about observed outcomes) exist per se, leaving open only the possibility that they exist relative to a certain observer.
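For orientation, the inequality in question has the familiar CHSH form (written here generically, not as the exact expression of Brukner's paper; in his argument the dichotomic observables $A_1, A_2$ and $B_1, B_2$ of the two superobservers encode statements about the friends' recorded facts):

$$ S = \langle A_1 B_1 \rangle + \langle A_1 B_2 \rangle + \langle A_2 B_1 \rangle - \langle A_2 B_2 \rangle \leq 2 . $$

The bound 2 follows from assuming observer-independent facts (together with locality and free choice), while quantum mechanics allows violations up to $2\sqrt{2}$.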

The discussion will focus on the assumptions underlying this new theorem, as well as on its profound consequences.

 

On the Intertwined Nature of Mass and Time
Guilherme Franzmann


GR and QFT take for granted the existence of local time. Local time differs from a global notion of time in the following sense. Think of a thermodynamical system in thermal equilibrium: the system is completely frozen from a macroscopic perspective, but this is not necessarily the case when you zoom into it, where microscopic dynamics can be observed, such as particles scattering off each other. On the other hand, we could also have a thermodynamical system made of massless particles in a box, in which each particle individually follows a null geodesic, and we could imagine the size of the system being increased or decreased, resulting in variations of the overall temperature. In this case, even though the notion of time may be difficult to define locally, given that each particle follows a null geodesic, the temperature can be considered our global clock (this is the case cosmologically, for instance, with the Hubble parameter defining our clock). Hence, it is important to distinguish the notion of a global (thermodynamical) time from the notion of a local (geometrical) one.

All the current fundamental theories take the notion of local time for granted, which boils down to modelling the underlying manifold as $M = \mathbb{R}^{1,3}$, related to the fact that the local metric has Lorentzian signature in D = 4. However, reflecting on the previous argument, maybe this is not so appropriate, since as we go back in the universe's history we find a period in which mass was absent. Moreover, it is worth pointing out that although the notions of space and time were unified by special and general relativity, the notion of mass remains somewhat singled out. We wonder whether this should be the case.

From a purely local point of view, we know that the contracted Bianchi identity that defines the LHS of Einstein's equations is valid for any manifold, be it Riemannian or pseudo-Riemannian. In particular, it is possible to write down an ansatz for the metric that connects the Euclidean and Minkowski spacetime solutions through a discontinuity. Therefore, we may wonder whether the full space of solutions of Einstein's equations could also allow for such a transition in the presence of a particular type of matter. Given that we already know a discontinuity of this kind can occur in vacuum, we also wonder whether it would be manifested in the matter sector as a phase transition. Moreover, given our argument that the existence of mass is relevant for the existence of local time, we wonder whether the Higgs phase transition could be responsible for such a flip of the metric's signature, which ultimately characterizes the existence of local time. Throughout this process, we would take the dynamical parameter to be the temperature, which is already the parameter that deforms the shape of the Higgs potential until the phase transition happens. Thus, we see the importance of distinguishing the notions of local and global time for this discussion. This whole argument/proposal points back to the initial question: should there be local time in a universe composed of massless particles?
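As a purely illustrative example of such an ansatz (my own sketch, not necessarily the construction the author has in mind), one can write a flat metric whose signature flips across the hypersurface $\tau = 0$:

$$ ds^2 = \epsilon(\tau)\, d\tau^2 + dx^2 + dy^2 + dz^2, \qquad \epsilon(\tau) = \begin{cases} +1 & \text{for } \tau < 0 \ \text{(Euclidean)}, \\ -1 & \text{for } \tau > 0 \ \text{(Lorentzian)}, \end{cases} $$

so that the geometry is Riemannian on one side of the discontinuity and Minkowski-like on the other.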

This idea fits quite nicely with the objective of the workshop. Mathematically, it proposes to study signature change of the metric within the framework provided by GR, something that has been considered in the past in other contexts. It also opens up the possibility of considering new mathematical structures, which would possibly allow for a smoother transition between the phases (this hints at the role of topology). Conceptually, it points towards the non-existence of time as we know it when massive particles are absent, which is the situation in the early stages of the universe, when the temperature is much higher and the Higgs potential is not yet deformed. Experimentally, we know from previous work that particle production can happen when the signature of the metric changes, which should produce signatures that could be investigated cosmologically. Given all this, and the fact that the question of the existence of time has haunted physicists, mathematicians and philosophers for a long time, I believe this is a question always worth discussing.

 

What is the Ontological Nature of Causality?
Tomas Gonda


Causality is a topic attracting much interest in the study of fundamental physics. No wonder: the whole scientific endeavour is built upon the notion of explaining what we observe in causal language. There is a lot one can discuss in relation to causality and its current standing in physics. However, I would like to focus on one particular aspect, on which I have found people to disagree implicitly and which I think would be useful to discuss: the ontological nature one assigns to cause-effect relations.

The reason why I think it is worth discussing these issues is that different opinions lead to very different developments, at least in the study of quantum gravity with respect to the topic of indefinite causal structure.

There are several options, and combinations thereof, that I can see at the moment. One is that there exist causal relations that are at least as fundamental as other ontological elements (beables), but that, due to their special character, should be treated separately. A second is that we can in fact place causal relations on the same footing as the rest of our ontology. Finally, one can also speculate that all causal relations are merely emergent and, despite their usefulness, are somewhat of a red herring when it comes to ontology. I would be glad if our discussion resulted in other distinct and more refined possibilities, or if we could identify that some of these are inconsistent with other beliefs or observations.

The main questions to address that I can think of right now are:
1. Should one ontological character of causality be preferred over another when talking about physical phenomena?
2. Does it depend on whether these phenomena are fundamental/emergent and whether their nature is ontological/epistemological, whatever those are in our world?
3. Is general relativity compatible with all of the views?
4. And finally, can we test this experimentally, for example by showing a violation of causal inequalities, by implementing the so-called quantum switch, or by other means?

 

What about Holography?
Felix Huber


A lot of recent work has focused on the so-called holographic principle and the bulk-boundary correspondence. Its exact formulation is often left to hand-waving, making connections with AdS/CFT, tensor networks, and quantum error-correcting codes. I would like to present this approach in more detail, drawing out its fundamental assumptions and conceptual core. I deem this to be of interest, as the principle is often alluded to but rarely properly explained. The aim here is thus to draw out the core assumptions of this widely used approach and to explore its potential usefulness for other areas of physics.

I would like to focus on the following questions:
What predictive power does the holographic principle have? Which insights arise from connecting the fundamentally different concepts of AdS/CFT, tensor networks, and quantum error-correcting codes by this principle? In what other physical theories can holography be found? I will take Ref. [1] as a starting point.

[1] Harlow, "TASI Lectures on the Emergence of the Bulk in AdS/CFT", arXiv:1802.01040

 

Three Frameworks for New Physics
Marius Krumm


As General Relativity and Quantum Theory have shown, the introduction of new frameworks can have a shocking impact on our thinking and our picture of the world. Despite the success of both General Relativity and Quantum Theory, there are important hints that both theories might eventually be replaced: the most important problem might be that of unifying them into a theory of quantum gravity. In this talk we will present three recent frameworks that allow us to go beyond our current theories.

The first framework is called "General Probabilistic Theories" (also called, e.g., "Operational Convex Theories"). It formalizes the idea of using probabilities in operational scenarios, trying to be as general as possible. This framework makes it possible to derive quantum theory from physical postulates, and to discuss higher-order interference and stronger-than-quantum correlations.

The second framework is the "Process Matrix Framework". Our current application of quantum theory can still be considered semi-classical in many ways: for example, while the gates and systems in quantum computing behave according to quantum mechanics, the circuits (e.g. the arrangement of the gates) themselves are classical. There is no macroscopic superposition of optical tables or particle accelerators. The superposition principle could also be applied to causal scenarios: superpositions of past and future, superpositions of space-times and metrics. The process matrix framework has been introduced as a systematic formalism to analyze the statistics of such exotic scenarios. It assumes the local validity of quantum mechanics, but stays agnostic about the global causal structure. One important physical scenario in this framework is the gravitational quantum switch: it uses a spatial superposition of a large mass to induce a superposition of time dilations that switches the communication direction between two labs.

The third framework is Markus P. Mueller's agent-centric theory. While most theories assume the existence of an external objective world, this theory considers observers as fundamental. The framework assumes that there are classical observers (e.g. brains) receiving bit-strings as inputs. Starting from this scenario, Mueller shows how regularities stabilize themselves, leading to the impression that there is an objective external world generating the bit-strings. Mathematically, this framework is based on the computer science concepts of Kolmogorov complexity and Solomonoff induction. Here, science itself is formalized as coming up with the shortest program that is able to generate our observations.

 

Are QM, GR, and QFT the Only Fundamental Theories?
Tim Ludwig


A central question for the workshop is "Which mathematical, conceptual, and experimental paradigms underlie modern formulations of QM, GR, and QFT?". It is an interesting and important question to ask. However, I would like to discuss a paradigm underlying the question itself. That is, in the context of the workshop "Rethinking Foundations of Physics", the question seems to suggest that QM, GR, and QFT are the only theories that would pass a test for being fundamental.

A paradigm that seems to underlie this list of only three fundamental theories is reductionism. However, this list might be incomplete. Even from a reductionist point of view, statistical mechanics seems to be missing, since it draws on concepts that cannot be derived from QM, GR, or QFT. Changing the paradigm from reductionism to a "constructionist" one (as used in "More Is Different" [1]) opens up the set of fundamental theories to many more candidates.

To set the stage for discussion about "fundamentality of theories", I will present key arguments of two classical articles "More Is Different" [1] and "The Theory of Everything" [2] and argue in favour of the "constructionist" paradigm.

[1] P. W. Anderson, "More Is Different", Science, vol. 177, no. 4047, 1972
[2] R. B. Laughlin and D. Pines, "The Theory of Everything", PNAS, vol. 97, no. 1, p. 28, 2000

 

Metaphysical Indeterminacy in Orthodox Quantum Mechanics
Cristian Mariani


Metaphysical indeterminacy has recently received a great deal of philosophical interest. According to one of the most influential theories, metaphysical supervaluationism, metaphysical indeterminacy occurs when reality is unsettled between different options, although each option is itself fully determinate. Many authors have argued against this view by claiming that, according to quantum mechanics, reality cannot in principle be made fully precise, and thus metaphysical supervaluationism should be ruled out as a valid option. In my talk I argue against this claim. My strategy is twofold. First, I show that the above objection relies on two assumptions that can easily be rejected. Second, I argue that, even if those assumptions are accepted, metaphysical supervaluationism has many ways out. In order to do all this, I will start by introducing the dialectics behind this topic, both from the point of view of metaphysics and of the philosophy of quantum mechanics.

 

Is the Quest for an Ultimate Physical Theory made Vain by Gödel’s Theorem?
Pierre Martin-Dussaud


In 1931, the Austrian logician Kurt Gödel proved his two famous incompleteness theorems. They establish absolute bounds on the knowledge one can access through formal reasoning. However, physics might escape this verdict because of its experimental grounding. Paradoxically, theoretical physicists have been valiantly looking for an ultimate mathematical foundation of physics and have reached some partial success. So, is the quest for an ultimate physical theory made vain by Gödel's theorem? Are experiments themselves constrained by these theorems? Or is it possible to state a fundamental independence between the physical world and the world of formal systems?

Recently, the logician Leonid Levin has proved results that seem to extend the significance of Gödel's theorems to physics: no physical process could circumvent the incompleteness, as some people had hoped before. However, this statement itself relies on physical postulates, such as the "postulate of independence", which might be experimentally testable…

 

Is the Particle Ontology Compatible with QFT?
Maria Papageorgiou


One of the most curious aspects of relativistic quantum theories is that notions of localisability cannot be maintained, which certainly challenges the particle interpretation of relativistic quantum field theories. The notion of a particle in QFT is tied to the Fock space structure of the Hilbert space, most commonly constructed with respect to momenta, while there is no primary notion of a spatial wavefunction of a particle as in non-relativistic quantum mechanics. Many results from the algebraic approach to QFT show that in the presence of interactions or gravity (QFT on curved spacetime) the Fock representation is not generally available, which implies that a general QFT cannot support a particle ontology.

To address these issues we will have to re-examine how the principle of locality, mostly motivated by the theory of relativity, is incorporated in quantum physics, and how we even ended up with quantum field theories in the first place. To set up the discussion, I will mostly refer to the following articles:

[1] Daniele Colosi and Carlo Rovelli, "What is a particle?", 2004
[2] Doreen Fraser, "The fate of 'particles' in quantum field theories with interactions", 2008
[3] David Malament, "In Defense of Dogma: Why There Cannot be a Relativistic Quantum Mechanics of (Localizable) Particles", 1996
[4] Michael Redhead, "More ado about nothing", 1995

 

How Can We Build a Higher-Order Model of Physical Experiments?
Joshua Tan


How can we build a higher-order model not only of the physical theories but of the physical experiments which test those theories, so that we can ground out "physical interpretations of theories" (e.g. of QM) in terms of their pragmatics, i.e. the experiments they suggest?

The key thing here is the definition of experiment, and its precise relationship to 'data'. Think of all the assumptions that go into any experiment: the availability and integrability of data, the assumptions required to interpret that data, and even the particular idea of falsifiability or "truth". For starters, a higher-order model of "experiment" should clarify and organize some of these assumptions. Going further, a higher-order model should suggest a way not only to design single experiments for isolated features of a physical theory, but to "co-design" ensembles of different experiments that test combinations of many features. In particular, I will talk about one simple class of experiments in machine learning, and why "ensembles of experiments" make sense in this context; a toy sketch of what such a record of experiments and assumptions could look like follows below.
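As one hypothetical way to make the idea concrete (my own sketch, not the author's formalism; all names and fields are invented for illustration), an experiment can be recorded together with the assumptions it relies on, so that an ensemble can be checked for the assumptions it shares and therefore cannot itself test:

```python
from dataclasses import dataclass, field

@dataclass(frozen=True)
class Experiment:
    """A toy record of an experiment: what it tests and what it presupposes."""
    name: str
    tested_feature: str                 # the feature of the theory under test
    assumptions: frozenset = field(default_factory=frozenset)  # background assumptions

def shared_assumptions(ensemble):
    """Assumptions common to every experiment in the ensemble --
    the ones the ensemble as a whole leaves untested."""
    sets = [e.assumptions for e in ensemble]
    return frozenset.intersection(*sets) if sets else frozenset()

ensemble = [
    Experiment("double slit", "superposition",
               frozenset({"detector calibration", "Born rule"})),
    Experiment("Bell test", "nonlocal correlations",
               frozenset({"fair sampling", "Born rule"})),
]
print(shared_assumptions(ensemble))     # e.g. frozenset({'Born rule'})
```

A "higher-order" model in the sense of the abstract would go beyond such a flat record, but even this minimal bookkeeping makes the notion of co-designing ensembles concrete: one can ask which combinations of experiments minimize the shared, untestable assumptions.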

More speculatively, I will also consider generalized "mathematical experiments" in pure mathematics, and discuss different ideas for constructing higher-order models of these objects in category theory.

 

Continuity versus Discreteness
Alexander Thomas


In attempts at a physical description of Nature, two antithetical features are omnipresent: continuity and discreteness. The geometrical description of general relativity or classical field theories seems profoundly continuous, whereas quantum mechanics and information theory seem discrete. In the attempt to unify relativity with quantum mechanics, both concepts come together. What do these terms really mean and imply? Are these features really so opposed to one another? Or is there a discrete and a continuous aspect in any natural phenomenon, or even a discrete-continuous duality?

I wish to illuminate this discussion with a quite recent mathematical discovery: random geometries. In the 1960s, William Thomas Tutte gave a formula counting finite planar maps (embedded planar graphs). This formula is the starting point for defining finite random maps. In 2003, Angel and Schramm proved that there is a well-defined limiting process: a random surface. What are the features of the continuous random surface compared to the discrete graphs? In what way does this new example illuminate the continuity-discreteness discussion?
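For concreteness (this is standard background I am adding, stated for the rooted case usually counted in this context), Tutte's census formula gives the number of rooted planar maps with n edges as

$$ M_n \;=\; \frac{2 \cdot 3^n}{(n+1)(n+2)} \binom{2n}{n}, $$

and choosing one of these $M_n$ maps uniformly at random is what is meant by a finite random map.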

 

The Role of Geometry in our Quest for a Satisfactory Model of Nature
Václav Zatloukal


I would like to discuss the guidelines one should follow when searching for a unified, coherent, self-contained and intelligible picture of Nature at its fundamental level. Does such a theory have to be geometric (in a sense to be discussed), or would it be acceptable to arrive at a set of rules with little geometric structure, provided they possess sufficient predictive power? What sort of description do we consider to be geometric, or 'geometric enough'? Is there a strict boundary between the geometric and the algebraic (and/or something else)?

Kick-off talk: Geometrodynamics as an attempt to understand physical phenomena in terms of geometry

I will present Wheeler's approach to the unification of gravity and electromagnetism, known as 'geometrodynamics', and point out some important conceptual ideas contained in his work. Are particles and fields foreign objects moving in spacetime, or can they be described as various kinds of curvature of spacetime itself?

 

Poster download:

JPG or PDF

Supported by: Foundational Questions Institute (FQXi), Basic Research Community for Physics e.V. (BRCP), Leibniz University of Hannover

Organisation:
Johannes Kleiner, Robin Lorenz, Jan-Hendrik Treude, Federico Zalamea
Leibniz University of Hannover

Scientific advisors:
Prof. Dr. Felix Finster, Department of Mathematics, University of Regensburg
Prof. Dr. Domenico Giulini, Institute for Theoretical Physics, Leibniz University of Hannover
