For any $r\geq 2$, an $r$-uniform hypergraph $\mathcal{H}$, and integer $n$, the \emph{Tur\'{a}n number} for $\mathcal{H}$ is the maximum number of hyperedges in an $r$-uniform hypergraph on $n$ vertices containing no copy of $\mathcal{H}$. While the Tur\'{a}n numbers of graphs are well understood and exact Tur\'{a}n numbers are known for some classes of graphs, few exact results are known when $r \geq 3$. I will present a construction, using quadratic residues, of an infinite family of $4$-uniform hypergraphs that contain no copy of the $4$-uniform hypergraph on $5$ vertices with $3$ hyperedges and that have the maximum number of hyperedges subject to this condition. I will also describe a connection between this construction and a `switching' operation on tournaments, with applications to finding new bounds on Tur\'{a}n numbers for other small hypergraphs.
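Since any three $4$-element subsets of a $5$-element set form isomorphic configurations, a $4$-uniform hypergraph contains no copy of the forbidden hypergraph exactly when every $5$ vertices span at most $2$ hyperedges. Purely as an illustration of the definition (this is not the quadratic-residue construction), here is a minimal brute-force Python sketch that computes this Tur\'{a}n number for very small $n$; the function names are my own.
\begin{verbatim}
from itertools import combinations

def is_forbidden_free(edges, n):
    # A 4-uniform hypergraph avoids the configuration of 3 hyperedges
    # on 5 vertices exactly when every 5-vertex set contains at most
    # 2 hyperedges.
    edge_set = set(edges)
    for S in combinations(range(n), 5):
        if sum(e in edge_set for e in combinations(S, 4)) >= 3:
            return False
    return True

def turan_number(n):
    # Exhaustive search over all 4-uniform hypergraphs on n vertices;
    # only feasible for very small n (roughly n <= 6).
    all_edges = list(combinations(range(n), 4))
    best = 0
    for mask in range(1 << len(all_edges)):
        edges = [e for i, e in enumerate(all_edges) if (mask >> i) & 1]
        if len(edges) > best and is_forbidden_free(edges, n):
            best = len(edges)
    return best

print(turan_number(6))
\end{verbatim}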
The number of inversions of a permutation is an important statistic that arises in many contexts, including as the minimum number of simple transpositions needed to express the permutation and, equivalently, as the rank function for weak Bruhat order on the symmetric group. In this talk, I’ll describe an analogous statistic on the reduced expressions for a given permutation that turns the Coxeter graph for a permutation into a ranked poset with a unique maximal element. This statistic simplifies greatly when shifting our paradigm from reduced expressions to balanced tableaux, and I’ll use this simplification to give an elementary computation of the diameter of the Coxeter graph for the long permutation. This talk is elementary and assumes no background other than passing familiarity with the symmetric group.
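As a quick illustration of the inversion statistic (not of the new statistic on reduced expressions discussed in the talk), the short Python sketch below counts inversions and, by bubble-sorting, produces a sequence of adjacent swaps whose length equals the inversion number; the example permutation is arbitrary.
\begin{verbatim}
def inversions(perm):
    # Number of pairs (i, j) with i < j and perm[i] > perm[j].
    n = len(perm)
    return sum(1 for i in range(n) for j in range(i + 1, n)
               if perm[i] > perm[j])

def sorting_word(perm):
    # Bubble sort: each swap removes exactly one inversion, so the
    # recorded word of adjacent swaps has length inversions(perm).
    w, word = list(perm), []
    changed = True
    while changed:
        changed = False
        for i in range(len(w) - 1):
            if w[i] > w[i + 1]:
                w[i], w[i + 1] = w[i + 1], w[i]
                word.append(i + 1)   # position of the adjacent swap
                changed = True
    return word

p = (3, 1, 4, 2)
print(inversions(p), sorting_word(p))   # 3 and a word of length 3
\end{verbatim}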
PIMS CRG in Explicit Methods for Abelian Varieties
Abstract:
I will describe recent joint work with Jeroen Sijsling, Drew Sutherland, John Voight and Dan Yasaki on genus 2 curves over Q. Our work has three primary goals: (1) produce an extensive table of genus 2 curves and their associated invariants; (2) explain the various Sato-Tate groups that arise in terms of functoriality; (3) prove at least one example of modularity for each nongeneric Sato-Tate group. Goal (1) was achieved in arXiv:1602.03715, with the data accessible in the LMFDB, while goals (2) and (3) are in progress.
Bootstrap percolation, one of the simplest cellular automata, can be viewed as an oversimplified model of the spread of an infection on a graph. In the past three decades, much work has been done on bootstrap percolation on finite grids of a given dimension in which the initially infected set A is obtained by selecting its vertices at random, with the same probability p, independently of all other choices. The focus has been on the critical probability, the value of p at which the probability of percolation (eventual full infection) is 1/2.
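To make the model concrete, here is a minimal Python simulation sketch of r-neighbour bootstrap percolation on the n x n grid (not taken from the work discussed below); the choice of n, p and the number of trials is arbitrary.
\begin{verbatim}
import random

def percolates(n, p, r=2, seed=0):
    # Infect each site of the n x n grid independently with probability p,
    # then repeatedly infect any healthy site with at least r infected
    # neighbours; return True if every site is eventually infected.
    rng = random.Random(seed)
    infected = {(i, j) for i in range(n) for j in range(n)
                if rng.random() < p}
    changed = True
    while changed:
        changed = False
        for i in range(n):
            for j in range(n):
                if (i, j) in infected:
                    continue
                nbrs = [(i - 1, j), (i + 1, j), (i, j - 1), (i, j + 1)]
                if sum(v in infected for v in nbrs) >= r:
                    infected.add((i, j))
                    changed = True
    return len(infected) == n * n

# crude Monte Carlo estimate of the probability of percolation
n, p, trials = 40, 0.06, 20
print(sum(percolates(n, p, seed=t) for t in range(trials)) / trials)
\end{verbatim}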
The first half of my talk will be a review of some of the fundamental results concerning critical probabilities proved by Aizenman, Lebowitz, Schonmann, Cerf, Cirillo, Manzo, Holroyd and others, and by Balogh, Morris, Duminil-Copin and myself. The second half will be about the very recent results I have obtained with Holmgren, Smith, Uzzell and Balister on the time a random initial set takes to percolate.
Central to Alan Turing's posthumous reputation is his work with British codebreaking during the Second World War. This relationship is not well understood, largely because it stands on the intersection of two technical fields, mathematics and cryptology, the second of which also has been shrouded by secrecy. This lecture will assess this relationship from an historical cryptological perspective. It treats the mathematization and mechanization of cryptology between 1920 and 1950 as international phenomena. It assesses Turing's role in one important phase of this process, British work at Bletchley Park in developing cryptanalytical machines for use against Enigma in 1940-41. It also focuses on his interest in and work with cryptographic machines between 1942 and 1946, and concludes that this work served as a seedbed for the development of his thinking about computers.
While Turing is best known for his abstract concept of a "Turing Machine," he did design (but not build) several other machines - particularly ones involved with code breaking and early computers. While Turing was a fine mathematician, he could not be trusted to actually try to construct the machines he designed - he would almost always break some delicate piece of equipment if he tried to do anything practical.
The early code-breaking machines (known as "bombes" - the Polish word for bomb, because of their loud ticking noise) were not designed by Turing but he had a hand in several later machines known as "Robinsons" and eventually the Colossus machines.
After the War he worked on an electronic computer design for the National Physical Laboratory - an innovative design unlike the other computing machines being considered at the time. He left the NPL before the machine was operational but made other contributions to early computers such as those being constructed at Manchester University.
This talk will describe some of his ideas behind these machines.
Many scientific questions are considered solved to the best possible degree when we have a method for computing a solution. This is especially true in mathematics and those areas of science in which phenomena can be described mathematically: one only has to think of the methods of symbolic algebra in order to solve equations, or laws of physics which allow one to calculate unknown quantities from known measurements. The crowning achievement of mathematics would thus be a systematic way to compute the solution to any mathematical problem. The hope that this was possible was perhaps first articulated by the 18th century mathematician-philosopher G. W. Leibniz. Advances in the foundations of mathematics in the early 20th century made it possible in the 1920s to first formulate the question of whether there is such a systematic way to find a solution to every mathematical problem. This became known as the decision problem, and it was considered a major open problem in the 1920s and 1930s. Alan Turing solved it in his first, groundbreaking paper "On computable numbers" (1936). In order to show that there cannot be a systematic computational procedure that solves every mathematical question, Turing had to provide a convincing analysis of what a computational procedure is. His abstract, mathematical model of computability is that of a Turing Machine. He showed that no Turing machine, and hence no computational procedure at all, could solve the Entscheidungsproblem.
During this series of lectures, we are talking about infinite graphs and set systems, so this will be infinite combinatorics. This subject was initiated by Paul Erdős in the late 1940s.
I will try to show in these lectures how it became an important part of modern set theory, first serving as a test case for modern tools, but also influencing their development.
In the first few of the lectures, I will pretend that I am talking about a joint work of István Juhász, Saharon Shelah and myself [23].
The actual highly technical result of this paper that appeared in the Fundamenta in 2000 will only be stated in the second or the third part of these lectures. Meanwhile I will introduce the main concepts and state, and sometimes prove, simple results about them.
Let $p$ be a prime. The main subject of my talks is the estimation of exponential sums over an arbitrary subgroup $G$ of the multiplicative group ${\mathbb Z}^*_p$:
$$S(a, G) = \sum_{x\in G} \exp(2\pi i a x / p), \qquad a \in \mathbb{Z}_p.$$
These sums have numerous applications in additive problems modulo $p$, pseudo-random generators, coding theory, the theory of algebraic curves, and other problems.
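For concreteness, these sums are easy to evaluate numerically; the short Python sketch below (with an arbitrary choice of prime and generator) computes $S(a, G)$ for the subgroup generated by a fixed element and records the largest value of $|S(a, G)|$ over nonzero $a$.
\begin{verbatim}
import cmath

def subgroup(g, p):
    # The multiplicative subgroup of Z_p^* generated by g.
    G, x = [], g % p
    while x != 1:
        G.append(x)
        x = x * g % p
    G.append(1)
    return G

def S(a, G, p):
    # S(a, G) = sum over x in G of exp(2*pi*i*a*x/p).
    return sum(cmath.exp(2j * cmath.pi * a * x / p) for x in G)

p = 101
G = subgroup(5, p)
print(len(G), max(abs(S(a, G, p)) for a in range(1, p)))
\end{verbatim}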