Sunday, November 06, 2016

Avi Loeb on diversity of hypotheses, fine tuning and inflation + a step towards testability of MaxHELP?

Epicycles - Courtesy Wikipedia
An excellent Comment piece in Nature called Good data are not enough by Prof Avi Loeb at Harvard catches my eye. He points out that prevailing world views can be highly limiting to scientific progress.
"The astronomy division of the US National Science Foundation, for example, devotes most of its funds to major facilities and large surveys, which are performed by big teams to collect better data within mainstream paradigms. Fields from particle physics to genomics do the same.
The consequences of a closed scientific culture are wasted resources and misguided ‘progress’ — witness the dead end that was Soviet evolutionary biology. To truly move forward, free thought must be encouraged outside the mainstream. Multiple interpretations of existing data and alternative motivations for collecting new data must be supported."
He compares modern cosmology to the theory of Epicycles. He is particularly critical of the deployment of anthropic reasoning alongside the multiverse theory, raising two objections:
  1. He has a 2016 paper suggesting that "life is 1,000 times more likely to exist 10 trillion years from now around stars that weigh one-tenth the mass of the Sun. This means that terrestrial life might be premature and not the most likely form of life, even in our own Universe." - see also his book chapter here, which has several very interesting discussions, eg on the implications of slightly fatter "tails" in the assumed Gaussian distributions of primordial fluctuations for the emergence of the earliest forms of life (p2-3). (A back-of-envelope check of the lifetime arithmetic follows this list.)
  2. "The anthropic argument... suppresses much-needed needed efforts to understand dark energy through an alternative theory that unifies quantum mechanics and gravity."
Big problems for Inflation - Table 1 from Ijjas, Steinhardt and Loeb, "Inflationary Schism after Planck2013"
He says "The fact that we have not yet converged on such a theory is indicated by paradoxes in other areas of physics. For example, information contained in, say, an encyclopaedia is lost if it is swallowed by a black hole that ultimately evaporates into heat known as Hawking radiation. This contradicts a basic premise of quantum mechanics that information is preserved, and is known as the ‘information paradox’. In addition, currently viable models of cosmic inflation require fine tuning of the conditions of the Universe before and during inflation." (He cites a terrific paper co-authored by Anna Ijjas who seems truly brilliant - I wonder if she knows Corina?)

He rightly advocates that funding agencies should promote the analysis of data for serendipitous purposes beyond major programmes and mainstream dogma. As he notes, the need for a change in course is even more timely now: empirical constraints on expected forms of dark matter (such as weakly interacting massive particles or supersymmetric partners to known particles) are getting tighter, and the hope of identifying testable consequences of string theory is receding. At a minimum, when funding is tight, a research frontier should maintain at least two ways of interpreting data so that new experiments will aim to select the correct one. This should apply to alternatives to inflation when dealing with new cosmological data, and to alternatives to cold dark matter when discrepancies are observed in the properties of dark-matter-dominated galaxies.

This is of course one aspect of the herding problem and relates somewhat to my Regulators Dilemma paper. The "optimum" for each individual grant application may be to stay within the mainstream paradigm - not least because there can be considerable academic bitchiness when it comes to refereeing applications that have "heretical" ideas - but this is not necessarily optimal for advancing knowledge as a whole.

Fig 4A from Loeb et al.: probability distribution for the emergence of life within a fixed comoving volume of the Universe as a function of cosmic time. They show the probability per log time, t dP/dt, for different choices of the minimum stellar mass, equally spaced in log m between 0.08 MSun and 3 MSun.
BTW, the fascinating paper cited above, "Relative Likelihood for Life as a Function of Cosmic Time", has this chart showing that (based on some fairly standard simplifying assumptions) the time of maximal probability per unit time for the emergence of life is roughly now (ie about 2 * 10^10 years after the Big Bang) if the minimum stellar mass that can lead to life is about 0.9 MSun, but as this minimum decreases, the peak time increases by roughly a decimal order of magnitude for each halving of the mass.
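Reading that scaling literally gives a toy extrapolation (my own reading of their figure, not an equation from the paper): anchor the peak at t_peak ≈ 2×10^10 yr for m_min ≈ 0.9 MSun and multiply by ten for each halving of m_min. A quick sketch:

```python
import math

# Toy extrapolation of the scaling described above (my reading of the
# figure, not a formula from the paper): t_peak ~ 2e10 yr at
# m_min ~ 0.9 MSun, rising one decade per halving of m_min.

def peak_time_yr(m_min_msun: float) -> float:
    """Illustrative peak time (yr) for life's emergence, given m_min in MSun."""
    halvings = math.log2(0.9 / m_min_msun)
    return 2e10 * 10.0 ** halvings

for m_min in (0.9, 0.45, 0.225, 0.1):
    print(f"m_min = {m_min:>5} MSun -> t_peak ~ {peak_time_yr(m_min):.1e} yr")

# m_min =   0.9 MSun -> t_peak ~ 2.0e+10 yr
# m_min =  0.45 MSun -> t_peak ~ 2.0e+11 yr
# m_min = 0.225 MSun -> t_peak ~ 2.0e+12 yr
# m_min =   0.1 MSun -> t_peak ~ 3.0e+13 yr
```

Note that one decade per halving is equivalent to t_peak ∝ m_min^(-log2(10)) ≈ m_min^(-3.3), reassuringly close to the main-sequence lifetime exponent used in the sketch earlier.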

It immediately occurred to me that two reasons why the minimum stellar mass might be constrained to something close to MSun are the need to have a gas giant planet and a significant moon, since without the moon we would have had many more, and much larger, asteroid impacts and life would not have had time to reach substantial levels of intelligence. These points are touched on at the end of their paper (p10), though quite understandably they don't model them because it's extremely complicated to do so.

They conclude "The probability distribution dP(t)/dlnt is of particular importance for studies attempting to gauge the level of fine-tuning required for the cosmological or fundamental physics parameters that would allow life to emerge in our Universe." with which of course I agree. It seems to me that this is a step towards being able to address at least some of the partial derivatives in the MaxHELP hypothesis, although clearly a lot more work needs to be done.
