Chronic disease management often involves sequential decisions that have long-term implications. Those decisions are based on high-dimensional information, which poses a problem for traditional modeling paradigms. In some key instances, the disease dynamics might not be known in advance, but instead are learned as new information becomes available. As a first step, we will describe some of the ongoing research modeling medical decisions of patients with chronic conditions. Key to the models developed is the incorporation of the individual patient's disease dynamics into the parameterization of the models of disease state evolution. Model conception and validation are described, as well as the role of multidisciplinary collaborations in ensuring the practical impact of this work.
The Pacific Institute for the Mathematical Sciences (PIMS) was founded in 1996, and Simon Fraser University is a founding member. The members of PIMS now include all the major Canadian research universities west of Ontario, as well as universities in Washington and Oregon. Please join us to celebrate 20 years of productive collaboration, with a lecture by SFU alumna and professor at UCL Nataša Pržulj on Data Driven Medicine followed by a reception.
We are faced with a flood of molecular and clinical data. Various biomolecules interact in a cell to perform biological function, forming large, complex systems. Large amounts of patient-specific datasets are available, providing complementary information on the same disease type. The challenge is how to model and mine these complex data systems to answer fundamental questions, gain new insight into diseases and improve therapeutics. Just as computational approaches for analyzing genetic sequence data have revolutionized biological and medical understanding, the expectation is that analyses of networked “omics” and clinical data will have similar ground-breaking impacts. However, dealing with these data is nontrivial, since many questions we ask about them fall into the category of computationally intractable problems, necessitating the development of heuristic methods for finding approximate solutions.
We develop methods for extracting new biomedical knowledge from the wiring patterns of large networked biomedical data, linking network wiring patterns with function and translating the information hidden in the wiring patterns into everyday language. We introduce a versatile data fusion (integration) framework that can effectively integrate somatic mutation data, molecular interactions and drug chemical data to address three key challenges in cancer research: stratification of patients into groups having different clinical outcomes, prediction of driver genes whose mutations trigger the onset and development of cancers, and re-purposing of drugs for treating particular cancer patient groups. Our new methods stem from network science approaches coupled with graph-regularised non-negative matrix tri-factorization, a machine learning technique for co-clustering heterogeneous datasets.
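To give a concrete feel for the tri-factorization at the core of these methods, here is a minimal NumPy sketch of plain non-negative matrix tri-factorization, X ≈ F S Gᵀ, fitted with the standard multiplicative updates. This is only an illustration: the graph regularization and the multi-dataset fusion described in the abstract are omitted, and all names and sizes are made up.

```python
import numpy as np

def nmtf(X, k1, k2, iters=200, eps=1e-9, seed=0):
    """Plain non-negative matrix tri-factorization X ~ F @ S @ G.T
    via multiplicative updates (graph regularization omitted)."""
    rng = np.random.default_rng(seed)
    m, n = X.shape
    F = rng.random((m, k1))   # row-cluster indicator-like factor
    S = rng.random((k1, k2))  # small "compressed" block matrix
    G = rng.random((n, k2))   # column-cluster indicator-like factor
    for _ in range(iters):
        # Each update keeps its factor non-negative and does not
        # increase the squared reconstruction error.
        F *= (X @ G @ S.T) / (F @ S @ G.T @ G @ S.T + eps)
        G *= (X.T @ F @ S) / (G @ S.T @ F.T @ F @ S + eps)
        S *= (F.T @ X @ G) / (F.T @ F @ S @ G.T @ G + eps)
    return F, S, G
```

Co-clustering then reads cluster assignments off the rows of F and G; fusing several datasets amounts to sharing factors across several such factorizations.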
During World War II Hedy Lamarr, a striking Hollywood actress, together with George Antheil, a radical composer, invented and patented a secret signaling system for the remote control of torpedoes. The ideas in this patent have since developed into one of the ingredients in modern digital wireless communications. The unlikely biography of these two characters, along with some of the more modern developments in wireless communications will be described.
Experimental design is a branch of statistics focused on designing experimental studies in a way that maximizes the amount of salient information produced by the experiment. It is a topic that has been well studied in the context of linear systems. However, many physical, biological, economic, financial and engineering systems of interest are inherently non-linear. Experimental design for non-linear models is complicated by the fact that the optimal design depends on the very parameters that the experiment is meant to estimate. A Bayesian, often simulation-based, framework is a natural setting for such design problems. We will illustrate the use of such a framework by considering the design of an animal disease transmission experiment where the underlying goal is to identify some characteristics of the disease dynamics (e.g. a vaccine effect, or the infectious period).
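As a toy illustration of the simulation-based idea (not the actual transmission-experiment design discussed in the talk), suppose the design choice is the number n of challenged animals, each infected with unknown probability p, and the criterion is the expected posterior variance of p under a conjugate Beta prior. The prior, sizes and names below are illustrative assumptions only.

```python
import numpy as np

def expected_posterior_var(n, a=1.0, b=1.0, sims=20000, seed=1):
    """Monte Carlo estimate of the expected posterior variance of an
    infection probability p after challenging n animals (Beta prior)."""
    rng = np.random.default_rng(seed)
    p = rng.beta(a, b, size=sims)       # draw p from the prior
    y = rng.binomial(n, p)              # simulate the experiment outcome
    a_post, b_post = a + y, b + n - y   # conjugate Beta-Binomial update
    tot = a_post + b_post
    var = (a_post * b_post) / (tot**2 * (tot + 1))
    return var.mean()                   # average over simulated experiments
```

Comparing expected_posterior_var(n) across candidate designs n ranks them by how much the experiment is expected to sharpen the posterior; the same simulate-then-update loop generalizes to non-conjugate transmission models, where the update step is itself approximated numerically.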
In this seminar we will discuss a new strategic investment model for a merchant energy storage facility. The facility's actions impact market-clearing outcomes, and thus it is a price-maker facility. The proposed model accounts for the uncertainties associated with the offering strategies of other generation units and with future load levels. The strategic investment decisions include the sizes of the charging device, discharging device, and energy reservoir. The proposed model is a stochastic bi-level optimization problem in which planning and operation decisions of the energy storage facility are made in the upper level, and market clearing is modeled in the lower level under different operating conditions. To make the model computationally tractable, an iterative solution technique based on Benders' decomposition is implemented, yielding a master problem and a set of subproblems, one per scenario. Each subproblem is recast as a Mathematical Program with Equilibrium Constraints (MPEC). Numerical results based on real-life market data from Alberta's electricity market will be provided.
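To give a flavour of the decomposition (a generic cutting-plane sketch, not the bi-level MPEC model of the talk), consider a toy two-stage problem: a master problem proposes a capacity x, and a closed-form "subproblem" evaluates the expected recourse cost over demand scenarios and returns a subgradient that becomes a Benders-style cut. All data here are invented for illustration.

```python
import numpy as np
from scipy.optimize import linprog

# Toy two-stage problem: min_x  c*x + Q(x), with recourse cost
# Q(x) = average over scenarios of p*max(d - x, 0)  (shortfall penalty).
c, p = 1.0, 3.0
demands = np.array([1.0, 2.0, 3.0, 4.0])  # equally likely scenarios

def Q_and_subgrad(x):
    """Scenario subproblems solved in closed form: value and subgradient."""
    short = np.maximum(demands - x, 0.0)
    val = p * short.mean()
    g = -p * np.mean(demands > x)          # subgradient of Q at x
    return val, g

cuts = []              # each cut encodes: theta >= val + g*(x - x_k)
x_k, ub = 0.0, np.inf
for _ in range(50):
    val, g = Q_and_subgrad(x_k)
    ub = min(ub, c * x_k + val)            # best feasible cost so far
    cuts.append((g, g * x_k - val))        # rewritten as g*x - theta <= g*x_k - val
    A_ub = [[gk, -1.0] for gk, _ in cuts]
    b_ub = [r for _, r in cuts]
    # Master problem over (x, theta): min c*x + theta subject to all cuts.
    res = linprog([c, 1.0], A_ub=A_ub, b_ub=b_ub, bounds=[(0, 10), (0, 100)])
    x_k, lb = res.x[0], res.fun
    if ub - lb < 1e-6:                     # upper/lower bounds have met
        break
```

For these data the loop converges in a few iterations to x = 3 with cost 3.75; in the actual model, the subproblem evaluation is itself an optimization (the MPEC per scenario) rather than a closed-form expression.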
The presentation will take us along the road to the ozone standard for the United States, announced in March 2008 by the US Environmental Protection Agency, and then the new proposal in 2014. That agency is responsible for monitoring the nation's air quality standards under the Clean Air Act of 1970. I will describe how I, a Canadian statistician, came to serve on the US Clean Air Scientific Advisory Committee (CASAC) for Ozone that recommended the standard and my perspectives on the process of developing it. I will introduce the rich cast of players involved including the Committee, the EPA staff, “blackhats,” “whitehats,” “gunslingers,” politicians and an unrevealed character waiting in the wings who appeared onstage only as the 2008 standards had been formulated. And we will encounter a couple of tricky statistical problems that arose, along with approaches, developed by the speaker and his co-researchers, which could be used to address them. The first was about how a computational model based on things like meteorology could be combined with statistical models to infer a certain unmeasurable but hugely important ozone level, the “policy related background level” generated by things like lightning, below which the ozone standard could not go. The second was about estimating the actual human exposure to ozone, which may differ considerably from measurements taken at fixed-site monitoring locations. Above all, the talk will be a narrative about the interaction between science and public policy - in an environment that harbors a lot of stakeholders with varying but legitimate perspectives, a lot of uncertainty in spite of the great body of knowledge about ozone and, most of all, a lot of potential risk to human health and welfare.
Interesting mathematics arises in many areas of the study of sea ice and its role in climate. Partial differential equations, numerical analysis, dynamical systems and bifurcation theory, diffusion processes, percolation theory, homogenization and statistical physics represent a broad range of active fields in applied mathematics and theoretical physics which are relevant to important issues in climate science and the analysis of sea ice in particular.
The main optimization problem in many applications in signal processing (e.g. image reconstruction, MRI, seismic imaging) and statistics (e.g. model selection in regression methods) is the following sparse optimization problem. The goal is to find a sparse solution to the underdetermined linear system Ax = b, where A is an m x n matrix and b is an m-vector with m < n. The problem can be written as
min (over x) ||x||₀ subject to Ax = b.
There are several approaches to this problem that generally aim at approximate solutions and often solve a simplified version of the original problem. For example, passing from the ℓ₀-norm to the ℓ₁-norm yields an interesting convexification of the problem. Moreover, the equality Ax = b does not cover noisy cases in which Ax + r = b for some noise vector r; accounting for the noise leads to the relaxed problem
min (over x) ||x||₁ subject to ||Ax - b||₂ ≤ σ.
Extensive theoretical [6, 7] and practical [5, 8] studies have been carried out on this problem, and various successful methods adopting interior-point algorithms, gradient projections, etc. have been tested. The discrete nature of the original problem also suggests the possibility of viewing it as a mixed-integer optimization problem. However, common methods for solving such mixed-integer optimization problems (e.g. Benders' decomposition) iteratively generate hard binary optimization subproblems. The exciting possibility that quantum computers may be able to perform certain computations faster than digital computers has recently gained momentum with the quantum hardware of D-Wave Systems. Current implementations of quantum systems based on the principles of quantum adiabatic evolution provide experimental resources for studying algorithms that reduce computationally hard problems to those that are native to the specific evolution carried out by the system. In this project we will explore the possibility of designing optimization algorithms that use the power of quantum annealing to solve sparse recovery problems.
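As a baseline for comparison, the convexified equality-constrained problem min ||x||₁ s.t. Ax = b can be solved directly as a linear program by splitting x into non-negative parts, x = u - v with u, v ≥ 0. The sketch below (illustrative problem sizes; SciPy's general-purpose linprog standing in for the specialized solvers cited above) recovers a synthetic sparse signal.

```python
import numpy as np
from scipy.optimize import linprog

# Synthetic sparse recovery instance: random A, 3-sparse ground truth.
rng = np.random.default_rng(0)
m, n, k = 20, 40, 3
A = rng.standard_normal((m, n))
x_true = np.zeros(n)
support = rng.choice(n, size=k, replace=False)
x_true[support] = rng.standard_normal(k)
b = A @ x_true

# Basis pursuit as an LP: minimize sum(u) + sum(v) = ||x||_1
# subject to A(u - v) = b, u >= 0, v >= 0.
c = np.ones(2 * n)
A_eq = np.hstack([A, -A])
res = linprog(c, A_eq=A_eq, b_eq=b, bounds=[(0, None)] * (2 * n))
x_rec = res.x[:n] - res.x[n:]
```

With m = 20 Gaussian measurements of a 3-sparse signal, the LP recovers x_true essentially exactly; the noisy variant with ||Ax - b||₂ ≤ σ instead requires a second-order cone solver or one of the specialized methods cited above.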
[1] Emmanuel J. Candès, Justin K. Romberg, and Terence Tao. Stable signal recovery from incomplete and inaccurate measurements. Communications on Pure and Applied Mathematics, 59(8):1207–1223, 2006.
[2] Simon Foucart and Holger Rauhut. A Mathematical Introduction to Compressive Sensing. Birkhäuser Basel, 2013.
[3] N.B. Karahanoglu, H. Erdogan, and S.I. Birbil. A mixed integer linear programming formulation for the sparse recovery problem in compressed sensing. In Acoustics, Speech and Signal Processing (ICASSP), 2013 IEEE International Conference on, pages 5870–5874, May 2013.
[4] Duan Li and Xiaoling Sun. Nonlinear Integer Programming. International Series in Operations Research & Management Science. Springer, 2006.
[5] Ewout van den Berg and Michael P. Friedlander. SPGL1: A solver for large-scale sparse reconstruction, June 2007. http://www.cs.ubc.ca/labs/scl/spgl1.
[6] Ewout van den Berg and Michael P. Friedlander. Probing the Pareto frontier for basis pursuit solutions. SIAM Journal on Scientific Computing, 31(2):890–912, 2008.
[7] Ewout van den Berg and Michael P. Friedlander. Sparse optimization with least-squares constraints. SIAM Journal on Optimization, 21(4):1201–1229, 2011.
[8] Ewout van den Berg, Michael P. Friedlander, Gilles Hennenfent, Felix J. Herrmann, Rayan Saab, and Özgür Yılmaz. Algorithm 890: Sparco: A testing framework for sparse reconstruction. ACM Transactions on Mathematical Software, 35(4):29:1–29:16, February 2009.