01:34:44

Rahul Dalal : Counting level-1, quaternionic automorphic representations on G2

  -   Number Theory ( 164 Views )

Quaternionic automorphic representations are one attempt to generalize to other groups the special place holomorphic modular forms hold among automorphic representations of GL2. Like holomorphic modular forms, they are defined by requiring their real component to lie in a particularly nice class of representations (here called quaternionic discrete series). We count quaternionic automorphic representations on the exceptional group G2 by developing a G2 version of the classical Eichler-Selberg trace formula for holomorphic modular forms. There are two main technical difficulties. First, quaternionic discrete series come in L-packets with non-quaternionic members, and standard invariant trace formula techniques cannot easily distinguish between automorphic representations whose real components lie in the same discrete-series L-packet. Using the more modern stable trace formula resolves this issue. Second, quaternionic discrete series do not satisfy a technical condition of being "regular", so the trace formula can a priori pick up unwanted contributions from automorphic representations with non-tempered components at infinity. Applying some computations of Mundy, we show that this miraculously does not happen for our specific case of quaternionic representations on G2. Finally, since we are only studying level-1 forms, we can apply some tricks of Chenevier and Taïbi to reduce the problem to counting representations on the compact form of G2 and counting certain pairs of modular forms. This avoids involved computations on the geometric side of the trace formula.
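For orientation, here is the classical level-1 count that the G2 trace formula above generalizes (standard background, not part of the abstract): taking the trace of the identity Hecke operator in the Eichler-Selberg trace formula recovers the dimension of the space of weight-k cusp forms for SL2(Z),

    \dim S_k(\mathrm{SL}_2(\mathbb{Z})) =
    \begin{cases}
      \lfloor k/12 \rfloor - 1, & k \equiv 2 \pmod{12},\\
      \lfloor k/12 \rfloor,     & \text{otherwise},
    \end{cases}
    \qquad k \ge 4 \text{ even}.

The G2 version developed in the talk plays the same role for quaternionic automorphic representations.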

01:24:57

Dan Goldston : Small Gaps between Zeros of the Riemann Zeta-Function

  -   Number Theory ( 135 Views )

We consider the complex zeros of the Riemann zeta-function ρ = β + iγ, γ > 0. The Riemann Hypothesis (RH) is that β = 1/2. If we consider the vertical distribution of these zeros, then the average vertical spacing between zeros at height T is 2π / log T. We expect theoretically and find numerically that the distribution of the lengths of these gaps follows a certain continuous GUE distribution, in which both very small and very large multiples of the average spacing occur. In contrast to this, the existence of a Landau-Siegel zero would force all the gaps in a certain large range to never be closer than half the average spacing, and to have even more bizarre and unlikely properties. Three methods have been developed to prove something about small gaps. First, Selberg in the 1940s, using moments for the number of zeros in short intervals, was able to prove unconditionally that there are some gaps larger than the average spacing and others smaller than the average spacing. Next, assuming RH, Montgomery in 1972 introduced a pair correlation method for zeros and produced small gaps less than 0.67 times the average spacing. Finally, in 1981 Montgomery and Odlyzko, assuming RH, introduced a Dirichlet polynomial weighted method which found small gaps less than 0.5179 times the average spacing. (This method was further developed by Conrey, Ghosh, and Gonek.) These methods all exhibit the presumed barrier at 1/2 times the average spacing for small gaps. I will talk about two projects that are work in progress. The first is joint with Hugh Montgomery and is motivated by the observation that none of the results mentioned above excludes the possibility that the small gaps found all come from multiple zeros, and thus are gaps of length zero; at present we do not know whether there are any non-zero gaps shorter than the average spacing. While we have not yet been able to prove that there are any smaller-than-average non-zero gaps, we can quantify the relationship between non-zero gaps and multiple zeros and show that there is a positive proportion of one or the other. The second project is joint work with Caroline Turnage-Butterbaugh, in which we have developed a Dirichlet Polynomial Weighted Pair Correlation Method that can potentially be applied to a number of questions about zeros.
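For orientation, the normalization behind "times the average spacing" can be made explicit (standard background, not part of the abstract): writing 0 < γ_1 ≤ γ_2 ≤ ... for the ordinates of the zeros, the rescaled gaps

    \delta_n = (\gamma_{n+1} - \gamma_n) \cdot \frac{\log \gamma_n}{2\pi}

have average value 1, and the GUE prediction is that the pair correlation of the rescaled ordinates has density

    1 - \left( \frac{\sin \pi u}{\pi u} \right)^{2},

which vanishes to second order as u → 0: small normalized gaps occur, but they are rare. The methods described above can be viewed as proving unconditional or RH-conditional fragments of this prediction.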

27:40

Eric Wang : p-adic Algebra and Analysis

  -   Number Theory ( 161 Views )

01:34:38

Stuart Kauffman : The Open Universe

  -   Number Theory ( 128 Views )

Laplace gave the simplest early statement of reductionism. His Demon, if supplied with the positions and momenta of all the particles in the universe, could, using Newton's laws, calculate the entire future and past of the universe. Add fields, quantum mechanics, and General Relativity and you have, roughly, modern physics. There are four features to Laplace's reductionism: (i) everything that happens is deterministic, called into question a century later by quantum mechanics and the familiar Copenhagen interpretation and Born rule; (ii) all that is ontologically real are "nothing but" particles in motion; (iii) all that happens in the universe is describable by universal laws; (iv) there exists at least one language able to describe all of reality.

Quantum mechanics is evidence against (i). I will argue that biological evolution, the coming into existence in the universe of hearts and hummingbirds co-evolving with the flowers that feed them and that they pollinate, cannot be deduced or simulated from the basic laws of physics. In Weinberg's phrase, they are not entailed by the laws of physics. I will then claim that at levels above the atom, the universe will never make all possible proteins of length 200 amino acids (there are 20^200, roughly 10^260, of them), all possible organisms, or all possible social systems. The universe is indefinitely open upwards in complexity. More, proteins, organisms, and social systems are ontologically real, not just particles in motion.

Most radically, I will contest (iii). I will try to show that we cannot pre-state Darwinian pre-adaptations, where a pre-adaptation is a feature of an organism of no use in the current selective environment but of use in a different environment, hence selected for a novel function. Swim bladders are an example. Let me define the "adjacent possible" of the biosphere. Once there were the lungfish that gave rise to swim bladders, swim bladders were in the adjacent possible of the biosphere. Before there were multi-celled organisms, swim bladders were not in the adjacent possible of the biosphere. What I am claiming is that we cannot pre-state the adjacent possible of the biosphere. How could we pre-state the selective conditions? How could we pre-specify the features of one or several organisms that might become pre-adaptations? How could we know that we had completed the list?

The implications are profound, if true. First, we can make no probability statement about pre-adaptations, for we do not know the sample space, so we can formulate no probability measure. Most critically, if a natural law is a compact description, beforehand and afterward, of the regularities of a process, then there can be no natural law sufficient to describe the emergence of swim bladders. Thus, the unfolding of the universe is partially lawless! This contradicts our settled convictions since Descartes, Galileo, Newton, Einstein, and Schrödinger. It says that (iii) is false. In place of law is a ceaseless creativity, a self-consistent self-construction of the biosphere, the economy, and our cultures, partially beyond law. Were reductionism sufficient, the existence of swim bladders in the universe would be entailed by physical law, hence "explained". But it appears that physics, as stated, is not sufficient in its reductionist version. Then we must explain the existence in the universe of swim bladders and hummingbirds pollinating the flowers that feed them on some different ground. We need a post-reductionist science.

Autocatalytic mutualisms of organisms, the biosphere, and much of the economy may be part of the explanation we seek. In turn this raises profound questions about how causal systems can coordinate their behaviors, let alone the role of energy, work, power, and power efficiency in the self-consistent construction of a biosphere. There is a lot to think about.