Rethinking Foundations of Physics 2015
28th of March - 4th of April 2015
Traditional conferences and subject-specific workshops offer little room for in-depth discussions about the foundations of physics in an open, creative, and speculative way. This workshop offers a platform for young scientists to engage in such discussions.
The major part of the workshop will consist of discussion sessions in small groups, aiming at new approaches and ways of thinking about specific topics in fundamental physics. The discussion sessions will be introduced by talks from some of the participants (there will be no more than three talks per day). The topics of discussion will be selected based on the expertise and interests of all participants and, like the topics of the talks, will be centered around some of the following questions:
- What are the mathematical, conceptual, and experimental paradigms underlying modern formulations of QM, GR, and QFT?
- Can they be relaxed or changed? And how?
- Which mathematics and principles could be relevant for new foundations?
- Are there promising nonstandard experimental possibilities?
The workshop is directed at PhD students in physics and mathematics and young researchers who have completed their PhD in the last few years. It is not required that participants have substantial knowledge in all modern physical theories, but a general interest and openness towards other fields and ideas is expected.
Links to similar workshops can be found here.
Group photo Rethinking Workshop 2015
Problems with current theories of Quantum Gravity
What are the principles that should underlie a successful theory of Quantum Gravity? Unitarity, causality, diffeomorphism invariance, UV finiteness, perhaps some dose of nonlocality? And how should they be implemented?
In this talk I will give a brief introduction to the popular approaches to Quantum Gravity and peek under their hoods to see how they deal with these principles. I will point out open problems and issues that are usually not talked about and that have to be overcome in order for the approaches to work. For example, in Canonical Loop Quantum Gravity this is the problem of obtaining the quantum scalar Hamiltonian constraint; other directions face similarly big problems. I will then suggest some ways to move forward, which should hopefully lead to a lively discussion.
Which principles should be implemented strongly, and which should be allowed to be weakened in the construction of a theory (theories) of Quantum Gravity?
Should these properties constrain the quantum dynamics, or should some of them only appear in the (semi-)classical limit?
It seems obvious that something in the notion of spacetime has to be modified in a theory of quantum geometry, but how much? How radical should the change be?
Are we missing some basic principle of nature?
Does it make sense to construct a quantum theory of gravity, without having to consider matter couplings? Or do we have to go for a theory of everything?
Relatively special relativity
Initially, a brief account will be given of the most common ways to think about Lorentz symmetry violation, as well as of selected experimental results. Comments on attempts at 'deriving' Lorentz symmetry are made. Then, departing from a view on the Poincaré group in terms of Bacry and Lévy-Leblond's classification of kinematical groups from 1968, the question of the rigidity of symmetries will be raised and interpreted algebraically. The quantum case motivates the notion of Hopf algebras, where emphasis will be laid on how they are thought of as formalising a putative abstract duality between geometry and algebra. Applications are presented that question the common interpretation of the relativity principle.
Formation of smaller discussion groups by expertise or interest with subsequent presentation of results is envisaged for part of the topics. Possible guiding questions are:
Part A (concrete, talk-oriented)
– Is the title of the talk justified? To what extent do we need to reassess the relativity principle on our way towards a theory of quantum gravity?
– In light of the strong experimental bounds on violations of Lorentz symmetry, should we attribute to it a particularly fundamental status?
– Discussion of the postulates for kinematical groups. Alternative suggestions?
– Doesn't causality protect Lorentz symmetry? (And what does this question mean?)
Part B (abstract, generalising)
– How do the participants feel and think about symmetries in nature and our conventional description of them in terms of groups, their actions, representations, etc.?
– Can there exist fundamental symmetries at all, or could the patterns we perceive just be mesoscopic remnants of some form of random dynamics? Does the image of a lowest length scale reflect a psychological need, in that it would signify a 'boundedness' of physics, the definite and final setting for our understanding of nature?
– What is the virtue of redundancy? (And the paradigm of geometrical invariance.) Contrasting gauge and diffeomorphism symmetry.
– What do we learn from the occurrence of anomalies? Does the status of symmetries change from classical to quantum theory?
Self-organization -- the mechanism behind current physics laws?
Self-organization is a general phenomenon in complex systems: the formation of structure, order, and symmetry out of apparent randomness, disorder, and asymmetry, without external driving. In nature such phenomena are remarkably universal, from galaxy formation to the structure of the brain, the skin patterns of dairy cows, the striking symmetry of flowers and leaves, the logarithmic spiral of a snail's shell, and so on. Many phenomena superficially look "intelligently designed" but can actually be explained by different types of self-organization. In some situations there is no need to fine-tune the parameters and initial states to special values for the phenomena to happen.
The rich consequences of complex systems have been studied extensively in different subjects; however, very little effort has been made to use self-organization to explain current physical laws. In this talk, I will give an introduction to self-organization in complex systems and explain why it might shed new light on the search for fundamental physical laws. Taking quantum gravity as an example, I will introduce the general ideas of both Loop Quantum Gravity and Spin Foams and analyze their limitations. I will also briefly review the historical attempts in the direction of emergent gravity and their challenges, and discuss whether self-organization could provide a new path towards quantum gravity.
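A minimal toy illustration of the mechanism described above (my own sketch, not taken from the talk): a one-dimensional majority-rule cellular automaton, in which purely local interactions turn a random initial state into ordered domains without any external driving or fine-tuned initial condition.

```python
import random

# Toy illustration of self-organization: each cell repeatedly adopts the
# majority value of its three-cell neighbourhood (periodic boundaries).
# No external driving; the initial state is uniformly random.
random.seed(0)
N, STEPS = 60, 30
cells = [random.choice([0, 1]) for _ in range(N)]

def step(c):
    # Majority of each cell and its two neighbours.
    return [1 if c[(i - 1) % N] + c[i] + c[(i + 1) % N] >= 2 else 0
            for i in range(N)]

for _ in range(STEPS):
    cells = step(cells)

# Count domain walls (boundaries between 0- and 1-regions). After
# relaxation there are far fewer than in a typical random configuration,
# i.e. local structure has formed spontaneously.
walls = sum(cells[i] != cells[(i + 1) % N] for i in range(N))
print("domain walls after relaxation:", walls)
```

The majority rule never creates new domain walls, so order can only grow; this captures, in miniature, the idea of structure emerging from disorder without fine tuning.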
- Can some of the fundamental symmetries be consequences of self-organized patterns?
- What are the principles in physics laws that cannot come from derivation, emergence/self-organization, but have to be postulated? (e.g. causality, unitarity, Lorentz signature, symmetry etc.)
- What are the characteristics of complex-system models that might give rise to some properties of current physics through self-organization?
- In quantum gravity, are we quantizing the correct degrees of freedom? And how would we know it?
- What are the possible directions of observational tests to distinguish emergent gravity/symmetry from the conventional established theories?
- Historically, no revolution in physics has happened without new mathematics. What new branches of mathematics might be important for the next leap in physics?
More is Different - Weak and Strong Emergence in Science
I will discuss the concept of "emergence" in science and especially physics. I will outline the main points made by Phil Anderson in his famous 1972 essay against a "constructionist" view of the universe and then introduce the concepts of weak and strong emergence. I will then present a number of arguments and counter-arguments for the existence of strong emergence, which I hope can spark discussions and raise questions as to whether the concept of emergence is "useful" in science and whether it has to be accepted as a cornerstone of the scientific method.
Is emergence a concept that is useful in scientifically explaining observed phenomena? Is there a difference between weak and strong emergence? Is the concept of strong emergence "scientific"?
Rethinking Relativistic Propagation
Everyone knows that relativity theory forbids superluminal propagation (SP) of matter. But what, precisely, does this mean? Certainly the SP of point-sized test bodies is prohibited in the theory, but these idealized bodies represent only relatively isolated low-energy matter. How does the theory prohibit the SP of extended matter fields with non-negligible energy? A widespread conviction (see, e.g., Hawking and Ellis, 1973) is that this prohibition is given by the dominant energy condition, which requires every observer in a relativistic spacetime to determine the four-momentum flux at a point to be causal. Recently, however, Earman (2014), following a suggestion by Geroch (1996, 2011), has challenged this viewpoint, suggesting instead that the prohibition on SP be understood as arising from the differential equations used in the (local) initial-value problems for matter fields: if they are of a certain form (roughly quasilinear hyperbolic), they determine their own causal cones, inside which wave-like solutions must propagate. The prohibition on SP is then just the statement that these causal cones are nested within the light cones of spacetime. This condition, however, is independent of the dominant energy condition. Moreover, it is not clear that prohibiting the SP of wave-like solutions is really sufficient in general to rule out things like superluminal signaling. Is Earman's argument sound? And if so, what then becomes of the interpretation of the dominant energy condition and the stress-energy tensor more generally?
How are we to interpret energy conditions? What is the status of point particle idealizations in relativity? How important conceptually (as opposed to practically) is the well-posedness of the local initial value formulation? And does the spacetime metric have an independent geometric status, or does it merely describe the causal cones of electromagnetism?
In light of these issues, what is the status of no-signaling and other locality conditions in Bell's theorem and in axiomatic reconstructions of quantum theory? These conditions are typically understood as being delivered from relativity theory, but to what extent do they really derive from a proper formal understanding of relativistic propagation? Are they instead merely "inspired" by relativity theory? Does rethinking SP reveal new possibilities for understanding quantum foundations previously thought foreclosed or suggest new experimental tests?
Evening discussion: Imre Lakatos - scientific practices and research programmes
Imre Lakatos was a 20th-century Hungarian philosopher of science, a student of Popper and a close colleague of Feyerabend. Among many contributions to the philosophy of mathematics and to methods of proof and definition, he is best known for his theory of research programmes. He develops further Popper's ideas concerning the demarcation of science from pseudoscience and falsification as the driving force of the scientific method. In opposition to Popper, he suggests that the scientific process should be understood as the formation and decline of rival research programmes. A research programme consists of an irrefutable hard core, the dogma as it were, and a protective belt of auxiliary hypotheses that accumulates around it. A negative heuristic ought to divert the modus tollens away from the core: if the core implies a phenomenon and the observed result implies its negation, this must not lead to abandoning the core, which would be the modus tollens. I would like to explain and delineate this idea of his.
What are the standard arguments that divert criticism away from the central dogmas of our time? Is there a heuristic we are given that allows us to "save" the core? What would be the core of current research programmes?
What are the scientific practices and habits that we have when pursuing our research? Should we change them?
On finding new physics
Usually, physicists stick to conceptual or philosophical "paradigms" when trying to find new theories. E.g., many proposals for quantum gravity still contain, on a fundamental level, something generalizing space-time (on the one hand) and objects "therein" (on the other hand).
Even though this is of course useful in many situations, one could take another route to finding new theories: One could take the mathematical description of a typical situation with current theories and try to reformulate those mathematics to obtain different structures, forgetting for a moment about the usual paradigms. If this is successful, one can ask a) how much information is still contained in those new structures and b) whether it is possible to invent an action principle or something similar which yields exactly the original physical situation (reformulated in the new structures).
In the talk, I will outline this idea in more detail and then give an example (of a new unified theory http://en.wikipedia.org/wiki/Causal_fermion_system), where in principle this procedure could have been used (even though, historically, it was not). The goal is to lead up to discussion sessions, where we can try this procedure.
This talk leads up to a discussion/brainstorming session where we try to carry out a first step in the idea sketched in the abstract.
The fact that the linearity of the Schrödinger equation per se seems to be in conflict with the experimental observation of definite measurement outcomes, usually referred to as the "measurement problem", necessitates modifications or reinterpretations of quantum theory if one takes a realist point of view. In this talk, I will present a preliminary idea for how one could possibly distinguish some of these modifications experimentally. It is still unclear whether this idea can be implemented in a concrete experimental setup, and some rather well-known people I have talked to think that it cannot. The aim of the talk is to lead up to a discussion about this idea and to initiate a brainstorming session where we study the question of experimental implementation in more detail. In the talk I will explain several rough ideas about possible implementations to which the standard "not possible" reasoning does not apply.
This talk leads up to a discussion-session where we critically discuss the proposed general idea, as well as brainstorm about possible implementations.
Is physics just a statistical inference?
Ryszard Paweł Kostecki
This talk will be aimed at the perspective according to which the fundamental physical theories are ontically noncommittal tools for statistical inference. In order to substantiate this view (and also to pinpoint some of its problems), I'll discuss several important theoretical results that are along this line (and have a noticeable potential for providing a paradigm shift in foundations of physics), including maximum entropy approach to foundations of nonequilibrium statistical mechanics, information theoretic approaches to foundations of quantum mechanics, information theoretic approach to emergent space-times, and information theoretic approach to renormalisation.
1. Information theory and statistical inference as new foundational thought styles/paradigms in physics.
2. Epistemic vs ontic perspectives on physical theories.
3. Is space-time an ontic observer-independent container or an intersubjective emergent construct, dependent on the choice of experimental design?
Are there any practical reasons to care about the ontology of the wave-function?
In my talk, I will revisit, from a practical point of view, the puzzle around the interpretation of the wave-function, a question that has been debated since the birth of quantum mechanics. First, I will sketch the historical developments that led to the wave-function in quantum mechanics. Second, I will give a platonic review of the two main schools of thought (either the wave-function is "ontic", i.e. a state of reality, or "epistemic", i.e. a state of knowledge) and some streams within each of them. Third, I will discuss some arguments about possible consequences for experiments. Finally, we shall come to a more general question: is the wave-function a fundamental object in quantum mechanics at all?
First and foremost, as the title suggests, is it relevant whether the wave-function is ontic or not? Second, could one find an experimental or theoretical proof for one or the other interpretation? Related questions are: can one, in the framework of theoretical physics, and in particular in quantum mechanics, go beyond an operational approach? Is there a measurement problem in quantum mechanics? More abstractly speaking, is the existence of an objective reality a necessity in physics/science?
Is causal structure the catalyst for reconciling quantum theory and relativity?
Causal structure is a central concept in fundamental physics. On the one hand, causal structure plays a crucial role in the mathematical structure of relativity, as evidenced by e.g. Malament's theorem. On the other hand, the role of causality is also central to quantum theory, e.g. the use of tensor products to describe composite systems implicitly enforces the causal independence of the systems (e.g. ensuring that the no-signalling principle is satisfied for measurements on bipartite systems). However, in recent years, several results - obtained using the perspective of quantum information - have suggested that causality in quantum theory and relativity differ in crucial respects. We shall consider two such results:
1. A difference arising from time-symmetry: Time-reversal in relativity does not introduce signalling between spacelike separated regions. However, in joint work with Bob Coecke, we showed that no-signalling boxes, which formalise a Bell experiment, generically introduce signalling under time-reversal. Hence there is an incompatibility between the causal structure of relativity and that of probabilistic devices which are embedded in a spacetime. This also reveals an arrow of time due to causal structure.
2. A difference arising from indefinite causal structure: Oreshkov et al. have shown that quantum processes are consistent with the idea of indefinite causal structure. More specifically, quantum processes (but not classical processes) can violate an inequality that holds for processes which have a definite causal ordering. In contrast, relativity has definite causal structure built into its definition.
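To make the objects in the first example concrete, here is a minimal sketch (an illustration of the standard notion, not the construction from the joint work with Coecke) of the best-known no-signalling box, the Popescu-Rohrlich box, together with a direct check of the no-signalling condition:

```python
# The Popescu-Rohrlich (PR) box: inputs x, y and outputs a, b in {0, 1},
# with joint distribution P(a, b | x, y) = 1/2 if a XOR b == x AND y, else 0.
def pr_box(a, b, x, y):
    return 0.5 if (a ^ b) == (x & y) else 0.0

def marginal_a(a, x, y):
    # Alice's marginal: sum over Bob's outcome b.
    return sum(pr_box(a, b, x, y) for b in (0, 1))

# No-signalling: Alice's marginal must not depend on Bob's input y
# (and symmetrically for Bob).
for a in (0, 1):
    for x in (0, 1):
        assert marginal_a(a, x, 0) == marginal_a(a, x, 1)
print("PR box satisfies no-signalling towards Alice")
```

The same few lines make it easy to experiment with what happens to such a box under operations like time-reversal, which is the kind of question the first example raises.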
1. Do these examples provide genuine conflicts between quantum theory and relativity?
– In the first example, the conflict is resolved if the probabilistic processes are classical, suggesting that the conflict is only present for genuinely quantum processes (i.e. non-local ones). In the second example, it is not clear how to realise (i.e. with a particular physical theory and Hamiltonian) the processes which violate the causal inequality.
– How would theories of quantum gravity fare with respect to these examples, especially the second example?
2. Are these new types of non-classical behaviour?
– In both of these examples, quantum theory provides features that are not present in classical physics. In what ways do these relate to the standard types of non-classical behaviour, such as contextuality and non-locality?
3. What is the role of agents in these examples?
– Are these differences due to the two different ways that relativity and quantum theory treat observers? In quantum theory, observers are part of the theory, whereas in relativity, there is no fundamental role for them.
– Would an operational formulation of relativity, such as discussed by Hardy, be useful for understanding this?
Realizing Mach's principle
I would like to give an overview of the most important theses about space in Mach's book "Die Mechanik in ihrer Entwicklung", where he also introduced his famous principle. I will then initiate a discussion about how to implement this principle mathematically in various theories, explaining how I did this in my work and how Julian Barbour did it in his. It is of course also important to think about whether this is a reasonable principle at all.
Physics from inverse Solomonoff induction
Markus P. Mueller
All known theories of fundamental physics describe our world as an objective material universe, in which observers are embedded like actors on a stage. In my talk, I argue that this must be the wrong ontology, and suggest a mathematically rigorous, alternative approach, in which observations are fundamental — that is, the first-person perspective of observers who try to bet on future observations.
Based on theoretical computer science, I will show that from one simple assumption ("Solomonoff induction works to predict observations"), which is derived from a set of thought experiments, many aspects of physics can be proven as consequences: the emergence of simple probabilistic laws of physics, the apparent existence of an "outside world" that looks as if it had once started in a "Big Bang", and objectivity (i.e. communicating observers agree on what they see). Other surprising consequences follow as well, including a solution to the Boltzmann brain problem, a proof of the nonexistence of Wittgenstein zombies (except for special situations), and consequences for brain emulations and puzzling cosmology problems.
This approach is meant to be a proof-of-principle that objective laws can emerge from a fundamentally epistemic starting point.
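For readers unfamiliar with Solomonoff induction, here is a drastically simplified sketch (purely illustrative, not the talk's construction): a finite, hand-picked hypothesis class with hand-assigned complexities, whereas full Solomonoff induction sums over all programs and is uncomputable. Each hypothesis gets prior weight 2^(-complexity), and the next bit is predicted by the prior-weighted vote of all hypotheses consistent with the data so far.

```python
from fractions import Fraction

# Hypothetical toy hypothesis class: name -> (complexity in bits, n-th bit).
hypotheses = {
    "all_zeros":   (2, lambda n: 0),
    "all_ones":    (2, lambda n: 1),
    "alternating": (3, lambda n: n % 2),
    "period_3":    (5, lambda n: 1 if n % 3 == 0 else 0),
}

def predict_next(observed):
    """Posterior probability that the next bit is 1 (assumes at least
    one hypothesis is consistent with the observed prefix)."""
    weight_1 = weight_total = Fraction(0)
    for complexity, gen in hypotheses.values():
        if all(gen(n) == bit for n, bit in enumerate(observed)):
            w = Fraction(1, 2 ** complexity)  # prior weight 2^(-complexity)
            weight_total += w
            weight_1 += w * gen(len(observed))
    return weight_1 / weight_total

# After a single observed 1, only "all_ones" and "period_3" survive,
# and the simpler hypothesis dominates the prediction.
print(predict_next([1]))  # → 8/9
```

The point of the sketch is only the shape of the argument: simple hypotheses dominate, and the predictor converges on whatever regularity the data exhibits.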
If we would like to start from an epistemic or Bayesian perspective in general (NOT only within my specific approach, but in general), then this raises several questions that I would like to discuss:
* What should our goals be? I.e. what to achieve? What are the main problems of epistemic approaches, and what are the main benefits?
* Why do so many (particularly traditionally-minded) physicists find this perspective weird? How can we get into fruitful discussions with the skeptics? Where might they be right?
Particle QFT the Feynman Way
It is often believed, especially in light of theorems by Malament and others, that there can be no relativistic theory of localizable quantum particles and, therefore, that we must accept a field understanding of relativistic quantum physics. However, Feynman's path integral approach to quantum electrodynamics seems to be just that, since one simply sums over particle paths through spacetime. We introduce Feynman's theory, showing how calculations work, discussing the particle character (and its limits), and analyzing two special aspects of the theory: that off-shell processes come about due to summing over all parametrizations of the paths, and that forward-in-time causality appears to be derived for free. Finally, we discuss how this theory interfaces with well-known theorems forbidding any relativistic quantum theory of particles.
Many of the issues mentioned in the abstract I do not currently have answers to, and this could lead to some important discussion since they are essential ingredients in any QFT. The overarching question is: what, if anything, should we make of the significance of this so-very-different theory by Feynman for fundamental physics?
We have here a fundamental formulation of QFT in which the basic ingredients are almost completely different, so this is a rare opportunity to question their significance in a directed way. Questions are raised about the nature of creation and annihilation events, locality and nonlocality, unitarity, the meaning of quantum states, and also mathematical rigor. Also, it is possible to recast any topic in the foundations of physics within the framework of the Feynman path integral, where it typically looks significantly different - particularly interesting ones are particle identicality, EPR correlations, and quantum gravity.
On the role of infinite idealizations in Quantum Field Theory
The success of renormalization group methods (RG) in predicting the behavior of second order phase transitions is one of the greatest achievements of quantum field theory. Despite this fact it is far from clear to what extent RG allow us to explain this behavior. First of all, the behavior of second order phase transitions is generally described by appealing to a highly idealized model (the Ising model). Second, RG seem to work only if the system is assumed to be infinite. In this discussion, I would like to call attention to the latter kind of idealization, referred to in the literature as infinite idealization. In particular, I would like to discuss the meaning of the following claim made by Kadanoff (2009): 'phase transitions cannot occur in any finite system; they are solely a property of infinite systems'.
In this discussion, I will put forward the following questions: What justifies the assumption of infinite idealizations in the description of phase transitions? Is this idealization needed for the description of these phenomena? If so, should we conclude together with Lebowitz (1999) that phase transitions are examples of genuinely emergent phenomena? If the idealization is not needed, should scientists try to elaborate finite approaches to phase transitions? Apart from these specific questions, my presentation will also lead to a general discussion on the role of idealizations in scientific explanation. For instance, to what extent can unrealistic models explain the behavior of physical systems?
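Kadanoff's claim can be made concrete in the simplest possible setting. For the zero-field one-dimensional Ising chain (parameters chosen here purely for illustration), the finite-N partition function is a finite sum of analytic functions of the inverse temperature, so nothing non-analytic — and hence no phase transition — can occur at any finite N:

```python
import math

# Zero-field 1D Ising chain with N spins, periodic boundary conditions.
# Transfer-matrix eigenvalues: 2*cosh(beta*J) and 2*sinh(beta*J), so
# Z = lambda_+^N + lambda_-^N is a finite sum of analytic functions of
# beta. A singularity (phase transition) can only arise as N -> infinity.
def free_energy_per_spin(beta, J=1.0, N=100):
    lam_plus = 2.0 * math.cosh(beta * J)
    lam_minus = 2.0 * math.sinh(beta * J)
    Z = lam_plus ** N + lam_minus ** N
    return -math.log(Z) / (beta * N)

# The free energy varies smoothly with temperature at finite N:
for beta in (0.5, 1.0, 2.0):
    print(beta, free_energy_per_spin(beta))
```

The same formula shows where the infinite idealization enters: only in the limit N → ∞ does the larger eigenvalue dominate completely, and only then can the free energy develop the singular behaviour that defines a phase transition.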
Evening discussion: Where do theories come from?
What is the origin of spectacular new theoretical insight? What preconceptions enter into our models of the world, and how strongly do they influence our way of thought? Can we really probe into a platonic realm of universal ideas, or do theories gain support rather by adapting to shared belief? And to what extent can we benefit from conceptions outside the field of natural sciences? I want to approach such questions on the basis of examples from mathematics, physics, and chemistry.
- Are we sure about our underlying preconceptions and do we even share them?
- Is it useful to get rid of them to be more 'objective'?
- On what basis should a new theory be built?
The Cosmological Constant Problem
It has often been called the worst prediction in physics: according to our current understanding of QFT, the value of the cosmological constant should be some 120 orders of magnitude above the (nonvanishing) value that we observe. I will give a short introduction to the problem and to various theoretical ideas for how to solve it.
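The oft-quoted mismatch can be reproduced with a back-of-the-envelope estimate (scales assumed here only at the order-of-magnitude level): a naive vacuum energy density cut off at the Planck scale, compared with the observed dark-energy density, both written as the fourth power of an energy scale.

```python
import math

# Rough scales, assumed for illustration only:
planck_energy_eV = 1.22e28   # Planck energy, ~1.22e19 GeV
observed_scale_eV = 2e-3     # (rho_Lambda)^(1/4), roughly a milli-eV

# Energy densities scale as (energy scale)^4, so the discrepancy is the
# fourth power of the ratio of the two scales.
orders = 4 * math.log10(planck_energy_eV / observed_scale_eV)
print(round(orders))  # → 123, i.e. the famous "~120 orders of magnitude"
```

Varying the cutoff (e.g. to a supersymmetry-breaking scale) changes the number but not its absurd size, which is why the problem is considered so robust.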
- Is it a challenge to our ideas about how to reconcile particle physics and cosmology, or an important pointer towards the solution?
- What will be necessary to show in order to convince different subsets of the physics community?
- What value do anthropic arguments have?
University of Regensburg - Faculty of Mathematics, German Research Foundation: DFG Graduate School GRK 1692 "Curvature, Cycles, and Cohomology"
Johannes Kleiner, Leonhard Horstmeyer, Ryszard Paweł Kostecki, Jan-Hendrik Treude
Prof. Dr. Felix Finster, Department of Mathematics, University of Regensburg