Maseeh Mathematics & Statistics Colloquium Series

(In Neuberger Hall 454 unless otherwise noted.)

The upcoming colloquium:

3:15PM Wednesday, January 7, 2015, in SMSU 296
Bruno Jedynak, The Johns Hopkins University
The game of 20 questions: a delight of information theory, probability, control, and computer vision

Abstract: We will explore various instances of the game of 20 questions, with special interest in situations where (1) the responses are noisy and (2) there are multiple targets. We will discuss adaptive as well as non-adaptive policies. We will study performance and optimality for an information-theoretic cost function. Applications to fast face detection, micro-surgical tool tracking, and human vision will be briefly presented.
 

The 2014-2015 Colloquium Series:

(Here are videos of the Fall 2014 colloquia.)

3:15PM Friday, October 10, 2014:

Liz Stanhope, Lewis & Clark College
You can't hear the shape of an orbifold
view abstract

3:15PM Friday, October 24, 2014:

Irene Fonseca, President of SIAM, Carnegie Mellon University
Variational methods in materials and image processing
view abstract

3:15PM Friday, October 31, 2014:

Tanya Kostova Vassilevska, Lawrence Livermore National Laboratory
Model reduction with proper orthogonal decomposition for dynamical systems: Using snapshots from the time derivatives
view abstract

3:15PM Friday, November 7, 2014:

Boris Mordukhovich, Wayne State University
Variational analysis: what is it?
view abstract

3:15PM Friday, November 14, 2014:

Piotr Zwiernik, University of California, Berkeley
Understanding statistical models through their geometry
view abstract

3:15PM Friday, November 21, 2014:

Andrew Gillette, University of Arizona
Modern directions in finite element theory: polytope meshes and serendipity methods
view abstract

3:15PM Friday, December 5, 2014:

Alexis Dinno, Portland State University School of Community Health
Frequentist tests for equivalence, tests for relevance
view abstract

3:15PM Wednesday, January 7, 2015, in SMSU 296:

Bruno Jedynak, The Johns Hopkins University
The game of 20 questions: a delight of information theory, probability, control, and computer vision
view abstract

3:15PM Friday, January 9, 2015:

Leonid Chindelevitch, Harvard School of Public Health
Modeling tuberculosis, from genes to populations
view abstract

3:15PM Friday, January 23, 2015:

Marina Meila, University of Washington
TBA

3:15PM Friday, February 20, 2015:

Reza Sarhangi, Towson University
Interlocking star polygons in Persian architecture: The special case of the decagram in mosaic designs
view abstract

3:15PM Friday, February 27, 2015:

David Yanez, Oregon Health & Science University
Longitudinal Change in IMT & Risk of Stroke, MI, and CHD
view abstract

3:15PM Friday, April 10, 2015:

Thuan Nguyen, Oregon Health & Science University
TBA


 

Abstracts:

October 10, 2014 (return)

Liz Stanhope, Lewis & Clark College
You can't hear the shape of an orbifold  
If we can infer the chemical composition of a star from the colors of light it emits, can we determine the shape of a bell from the ringing that it makes? One way to address this question is to ask if the eigenvalues of the Laplace operator associated to a Riemannian manifold determine the manifold. A famous answer of "No" came in 1992 when Gordon, Webb and Wolpert exhibited nonisometric planar domains with exactly the same Laplace spectrum. After an introduction to the mildly singular spaces known as Riemannian orbifolds, I will discuss the degree to which the Laplace spectrum of an orbifold gives us information about the geometry and topology of the orbifold.
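
As a standard point of reference for the question posed above (this example is textbook material, not part of the talk), the simplest domains where one can "hear the shape" are rectangles, because the Dirichlet spectrum is explicit:

```latex
% Textbook example (not from the talk): the Dirichlet Laplacian on the rectangle
% [0,a] x [0,b], i.e. -\Delta u = \lambda u in \Omega with u = 0 on \partial\Omega,
% has explicit eigenfunctions and eigenvalues:
\[
  u_{m,n}(x,y) = \sin\!\left(\frac{m\pi x}{a}\right)\sin\!\left(\frac{n\pi y}{b}\right),
  \qquad
  \lambda_{m,n} = \pi^{2}\left(\frac{m^{2}}{a^{2}} + \frac{n^{2}}{b^{2}}\right),
  \qquad m,n \ge 1 .
\]
% The full list of eigenvalues determines a and b (up to swapping them), so no two
% distinct rectangles are isospectral; the 1992 counterexamples show that general
% planar domains, and a fortiori orbifolds, behave very differently.
```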

October 24, 2014 (return)

Irene Fonseca, President of SIAM, Carnegie Mellon University
Variational methods in materials and image processing  
Several questions in applied analysis motivated by issues in computer vision, physics, materials science, and other areas of engineering may be treated variationally, leading to higher-order problems and to models involving lower-dimensional density measures. Their study often requires state-of-the-art techniques, new ideas, and the introduction of innovative tools in partial differential equations, geometric measure theory, and the calculus of variations.  
In this talk it will be shown how some of these questions may be reduced to well-understood first-order problems, while in others higher-order terms play a fundamental role.  
Applications to quantum dots in epitaxy deposition, and decolorization and denoising in imaging science will be addressed.
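
As a small, generic illustration of a variational model in image denoising (a smoothed total-variation energy minimized by plain gradient descent), not a method from the talk; the function name and all parameter values below are illustrative assumptions.

```python
# Minimal sketch of variational denoising: gradient descent on the smoothed
# total-variation (ROF-type) energy
#   E(u) = sum sqrt(|grad u|^2 + eps) + (lam/2) * sum (u - f)^2 .
# Illustration only; names and parameters are made up.
import numpy as np

def tv_denoise(f, lam=1.0, eps=1e-2, tau=0.02, iters=300):
    """Gradient-descent denoising of a 2D array f (a noisy grayscale image)."""
    u = f.copy()
    for _ in range(iters):
        # forward differences approximate the gradient of u (periodic boundary)
        ux = np.roll(u, -1, axis=1) - u
        uy = np.roll(u, -1, axis=0) - u
        norm = np.sqrt(ux**2 + uy**2 + eps)
        px, py = ux / norm, uy / norm
        # backward differences give the discrete divergence of (px, py)
        div = (px - np.roll(px, 1, axis=1)) + (py - np.roll(py, 1, axis=0))
        # descent step on the smoothed TV + fidelity energy
        u -= tau * (-div + lam * (u - f))
    return u

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    clean = np.zeros((64, 64)); clean[16:48, 16:48] = 1.0   # simple square image
    noisy = clean + 0.2 * rng.standard_normal(clean.shape)
    denoised = tv_denoise(noisy)
    print("mean abs error, noisy vs clean:   ", np.abs(noisy - clean).mean())
    print("mean abs error, denoised vs clean:", np.abs(denoised - clean).mean())
```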

October 31, 2014 (return)

Tanya Kostova Vassilevska, Lawrence Livermore National Laboratory
Model reduction with proper orthogonal decomposition for dynamical systems: Using snapshots from the time derivatives  
In many areas of science and technology, complex multi-physics time-dependent problems are modeled by large systems of differential equations. Their analysis often poses huge computational challenges, as it requires multiple simulations that are prohibitively expensive in time and memory. However, in many cases the solutions of these systems lie in low-dimensional manifolds. In these cases, reduced order models (ROMs) exploiting that manifold structure can dramatically reduce the time and memory needed to execute the corresponding full-order model (FOM). One of the most studied approaches to building reduced order models is based on Proper Orthogonal Decomposition (POD). Normally, POD uses the solution of the FOM at selected time, space, and parameter values, typically called "snapshots," to calculate the basis of the reduced space.  
Kunisch and Volkwein suggested using difference quotients (DQs) as snapshots in addition to the solution snapshots. Their motivation came from the way their error bound is derived: without the DQs, the bound blows up as the distance between snapshots diminishes. Other authors have questioned whether using DQs brings any advantage for designing better POD ROMs, but this question has so far not been addressed.  
We have made some progress in this direction. I will describe our recent work on developing error bounds to compare two POD ROMs: the first using only solution snapshots, the second using, in addition, snapshots from the time derivatives. This work yields two main new results. The first is the actual form of the error bound, which involves the time moments at which the snapshots were taken. The second is that the bounds demonstrate for the first time that, asymptotically, the method with time-derivative information can be more accurate. The new bounds give insights about the behavior that we test numerically. Specifically, we demonstrate that the behavior of the errors in numerical experiments with the discretized FitzHugh-Nagumo system (known from neurophysiology) is well predicted by the bounds.  
In addition, if time permits, I will talk about possible future applications in biology using POD ROM methods.
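
A minimal sketch of how a POD basis is commonly extracted from snapshots via the singular value decomposition, with difference-quotient snapshots optionally appended in the spirit described above. This is a generic illustration, not the speaker's code; the toy full-order model and all names are made up.

```python
# Generic POD sketch: build a reduced basis from solution snapshots, optionally
# augmented with difference quotients (DQs).  Illustration only.
import numpy as np

def pod_basis(snapshots, r, dt=None, use_dq=False):
    """snapshots: (n_dof, n_snap) array of FOM states at successive times.
    r: dimension of the reduced space.  If use_dq, append (x_{k+1}-x_k)/dt."""
    X = snapshots
    if use_dq:
        dq = (snapshots[:, 1:] - snapshots[:, :-1]) / dt
        X = np.hstack([snapshots, dq])
    # Left singular vectors of the snapshot matrix are the POD modes.
    U, s, _ = np.linalg.svd(X, full_matrices=False)
    return U[:, :r], s

if __name__ == "__main__":
    # Toy full-order model: decaying sine modes on 200 degrees of freedom.
    n, nt, dt = 200, 50, 0.02
    x = np.linspace(0, 1, n)
    t = dt * np.arange(nt)
    snaps = (np.sin(np.pi * x)[:, None] * np.exp(-t)[None, :]
             + 0.3 * np.sin(3 * np.pi * x)[:, None] * np.exp(-9 * t)[None, :])
    V, s = pod_basis(snaps, r=2, dt=dt, use_dq=True)
    # Project the snapshots onto the reduced space and measure the error.
    err = np.linalg.norm(snaps - V @ (V.T @ snaps)) / np.linalg.norm(snaps)
    print("relative projection error with r = 2:", err)
```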

November 7, 2014 (return)

Boris Mordukhovich, Wayne State University
Variational analysis: what is it?  
Variational analysis has been recognized as an active and rapidly growing area of mathematics and operations research motivated mainly by the study of constrained optimization and equilibrium problems, while also applying perturbation ideas and variational principles to a broad class of problems and situations that may not be of a variational nature. One of the most characteristic features of modern variational analysis is the intrinsic presence of nonsmoothness, which naturally enters not only through the initial data of the problems under consideration but largely via variational principles and perturbation techniques applied to a variety of problems with even smooth data. Nonlinear dynamics and variational systems in applied sciences also give rise to nonsmooth structures and motivate the development of new forms of analysis that rely on generalized differentiation.  
This lecture is devoted to discussing some basic constructions and results of variational analysis and its remarkable applications.

November 14, 2014 (return)

Piotr Zwiernik, University of California, Berkeley
Understanding statistical models through their geometry  
Discrete and Gaussian statistical models have a rich geometric structure and can often be viewed as semi-algebraic sets. The geometric viewpoint provides not only good intuition about the behavior of statistical procedures but also tools for proving concrete statistical results. I want to discuss in more detail two examples of statistical models: latent graphical tree models and Gaussian linear covariance models. Although these models are very different, in both cases the corresponding likelihood function is multimodal, and so its efficient optimization requires potentially fragile numerical procedures. In the first case, I will show how combinatorics and algebra are used to understand the structure of such a model, which also provides better insight into numerical procedures such as the EM algorithm. In the second case, using recent results in random matrix theory, I will show that optimization of the likelihood function is essentially (with high probability) a convex programming problem.
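
For readers unfamiliar with the EM algorithm mentioned above, here is a minimal sketch of EM for a two-component Gaussian mixture, a much simpler latent-variable model than the latent tree models of the talk, but one that shows the same E-step/M-step structure and the same sensitivity to initialization when the likelihood is multimodal. All names and data are illustrative.

```python
# Minimal EM sketch for a two-component 1-D Gaussian mixture (illustration of the
# E-step / M-step structure; not the latent tree models from the talk).
import numpy as np

def em_gmm2(x, iters=100, seed=0):
    rng = np.random.default_rng(seed)
    # crude initialization (results can depend on it: the likelihood is multimodal)
    mu = rng.choice(x, size=2, replace=False)
    sigma = np.array([x.std(), x.std()])
    w = np.array([0.5, 0.5])
    for _ in range(iters):
        # E-step: posterior responsibility of each component for each point
        dens = w * np.exp(-0.5 * ((x[:, None] - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))
        resp = dens / dens.sum(axis=1, keepdims=True)
        # M-step: re-estimate weights, means, and standard deviations
        nk = resp.sum(axis=0)
        w = nk / len(x)
        mu = (resp * x[:, None]).sum(axis=0) / nk
        sigma = np.sqrt((resp * (x[:, None] - mu) ** 2).sum(axis=0) / nk)
    return w, mu, sigma

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    x = np.concatenate([rng.normal(-2, 1, 300), rng.normal(3, 0.5, 200)])
    print(em_gmm2(x))   # weights, means, and sds of the two fitted components
```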

November 21, 2014 (return)

Andrew Gillette, University of Arizona
Modern directions in finite element theory: polytope meshes and serendipity methods  
Finite element methods take a domain decomposed into a mesh of elements with simple geometry and produce an approximate solution to specified PDEs in terms of basis functions associated to each mesh element. In this talk, I will discuss two trends in the analysis of finite element methods for modern applications: (1) the use of polygonal or polyhedral mesh elements for domain decomposition and (2) the use of reduced "serendipity" basis sets for efficient solution approximation. Theoretical and numerical results will be presented, along with a view of how future research in these areas will be closely related. This talk will be accessible to a general mathematical audience.
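
For orientation, here is the textbook finite element recipe in its simplest form: piecewise-linear ("hat") basis functions on a uniform 1-D mesh for the Poisson problem. This is background only, not the polytope or serendipity elements of the talk; the function name is made up.

```python
# Textbook 1-D finite element sketch (uniform mesh, piecewise-linear basis).
import numpy as np

def fem_poisson_1d(n, f):
    """Solve -u'' = f on (0,1) with u(0) = u(1) = 0, using n interior nodes."""
    h = 1.0 / (n + 1)
    x = np.linspace(h, 1 - h, n)           # interior node coordinates
    # Stiffness matrix for hat functions: tridiagonal with (2, -1, -1) / h.
    A = (np.diag(2 * np.ones(n)) - np.diag(np.ones(n - 1), 1)
         - np.diag(np.ones(n - 1), -1)) / h
    b = h * f(x)                            # load vector (exact for constant f)
    return x, np.linalg.solve(A, b)

if __name__ == "__main__":
    x, u = fem_poisson_1d(99, lambda x: np.ones_like(x))
    exact = 0.5 * x * (1 - x)               # exact solution for f = 1
    # In 1-D, linear elements are exact at the nodes, so this is ~machine precision.
    print("max nodal error:", np.abs(u - exact).max())
```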

December 5, 2014 (return)

Alexis Dinno, Portland State University School of Community Health
Frequentist tests for equivalence, tests for relevance  
I motivate and introduce the frequentist test for equivalence using the Two One-Sided Tests approach, which originated in clinical epidemiology but has applications anywhere one wants to demonstrate evidence of the absence of an effect. I make a case that relevance tests—inference based on combining tests for difference and tests for equivalence—resolve several problems in frequentist hypothesis testing. I close with my own efforts to develop an equivalence test for the Kolmogorov-Smirnov test, and invite discussion.
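
A minimal sketch of the Two One-Sided Tests (TOST) idea for a single sample of differences and a symmetric equivalence margin; the margin, the simulated data, and the function name are illustrative assumptions, not the speaker's implementation.

```python
# Minimal TOST sketch: test whether a mean difference lies inside (-delta, +delta).
# The margin and data are illustrative, not from the talk.
import numpy as np
from scipy import stats

def tost_one_sample(d, delta):
    """Two one-sided t-tests for |mean(d)| < delta.  Returns the TOST p-value."""
    n = len(d)
    se = d.std(ddof=1) / np.sqrt(n)
    t_lower = (d.mean() + delta) / se        # H0: mean <= -delta
    t_upper = (d.mean() - delta) / se        # H0: mean >= +delta
    p_lower = stats.t.sf(t_lower, df=n - 1)  # small if mean is well above -delta
    p_upper = stats.t.cdf(t_upper, df=n - 1) # small if mean is well below +delta
    return max(p_lower, p_upper)             # equivalence shown if this is < alpha

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    d = rng.normal(0.02, 0.1, size=100)      # small true effect, sd 0.1
    print("TOST p-value (delta = 0.05):", tost_one_sample(d, delta=0.05))
```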

January 7, 2015, in SMSU 296 (return)

Bruno Jedynak, The Johns Hopkins University
The game of 20 questions: a delight of information theory, probability, control, and computer vision  
We will explore various instances of the game of 20 questions, with special interest in situations where (1) the responses are noisy and (2) there are multiple targets. We will discuss adaptive as well as non-adaptive policies. We will study performance and optimality for an information-theoretic cost function. Applications to fast face detection, micro-surgical tool tracking, and human vision will be briefly presented.
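
A minimal sketch of one adaptive policy in the spirit of the abstract: keep a posterior over a finite set of candidate target locations, ask the threshold question that splits the posterior mass most evenly, and update by Bayes' rule under a known response-noise probability. The noise model, the question family, and all names here are illustrative assumptions, not the speaker's formulation.

```python
# Generic noisy "20 questions" sketch: locate a target in {0, ..., N-1} with
# threshold questions ("is the target < c?") whose answers are flipped with
# probability eps.  Greedy, bisection-like policy on the posterior.
import numpy as np

def entropy(p):
    q = p[p > 0]
    return -(q * np.log2(q)).sum()

def play(target, N=64, eps=0.1, n_questions=20, seed=0):
    rng = np.random.default_rng(seed)
    post = np.full(N, 1.0 / N)                    # uniform prior over locations
    for _ in range(n_questions):
        # choose the threshold that splits the posterior mass closest to 1/2
        cum = np.cumsum(post)
        c = int(np.argmin(np.abs(cum - 0.5))) + 1
        truth = target < c
        answer = truth if rng.random() > eps else not truth   # noisy response
        # Bayes update: likelihood of the observed answer at each location
        like = np.where(np.arange(N) < c,
                        1 - eps if answer else eps,
                        eps if answer else 1 - eps)
        post = like * post
        post /= post.sum()
    return int(post.argmax()), entropy(post)

if __name__ == "__main__":
    guess, H = play(target=37)
    print("MAP estimate:", guess, " residual entropy (bits): %.2f" % H)
```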

January 9, 2015 (return)

Leonid Chindelevitch, Harvard School of Public Health
Modeling tuberculosis, from genes to populations  
Tuberculosis (TB) continues to afflict millions of people and causes over a million deaths a year worldwide. Multi-drug resistance is also on the rise, causing concern among public-health experts. Mathematical and statistical modeling and the development of improved computational tools have an important role to play in supporting worldwide control of TB infections.  
This talk will give an overview of my work on modeling TB by leveraging population information together with molecular genetics data. I will start by presenting a joint model of the dynamics of TB and HIV, whose analysis in a Bayesian framework has helped inform policy decisions on TB control. I will go on to discuss an optimization-based methodology I developed for an accurate classification of complex TB infections as originating from mutation or mixed infection. I will finish by describing an approach for improving the assignment of lineages to TB strains by using a model of molecular evolution, and an ongoing project on differentiating acquired and transmitted resistance in a high TB burden setting.
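
As background for the transmission-dynamics part of the abstract only: here is a minimal sketch of a generic SIR-type compartmental model. It is not the joint TB-HIV model from the talk, and the parameter values are illustrative.

```python
# Generic SIR-type transmission model (illustration only; not the joint TB-HIV
# model from the talk).  beta: transmission rate, gamma: removal rate.
import numpy as np
from scipy.integrate import solve_ivp

def sir(t, y, beta, gamma):
    S, I, R = y
    N = S + I + R
    return [-beta * S * I / N, beta * S * I / N - gamma * I, gamma * I]

if __name__ == "__main__":
    beta, gamma = 0.3, 0.1            # illustrative parameters (R0 = 3)
    sol = solve_ivp(sir, (0, 300), [0.99, 0.01, 0.0], args=(beta, gamma),
                    t_eval=np.linspace(0, 300, 301))
    peak_day = sol.t[np.argmax(sol.y[1])]
    print("epidemic peak near day", int(peak_day),
          "with prevalence %.2f" % sol.y[1].max())
```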

February 20, 2015 (return)

Reza Sarhangi, Towson University
Interlocking star polygons in Persian architecture: The special case of the decagram in mosaic designs  
A special series of Persian mosaic designs with a common element has appeared in some historical scrolls and monuments. The common element is a concave ten-pointed star polygon, called a decagram for convenience. The decagram can be created by rotating two concentric, congruent regular pentagons 36° with respect to each other. To create a decagram-based interlocking pattern, however, a craftsman-mathematician would need to take careful steps to locate a fundamental region. A modular approach to pattern-making seems to be more applicable for this design than compass-and-straightedge constructions. Such designs include patterns that are sometimes called aperiodic or quasi-periodic tilings in the language of modern mathematics.
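
A small computational sketch of the construction described above: two concentric, congruent regular pentagons, one rotated by 36°, give the ten outer points of the star. This is purely illustrative, and the function name is made up.

```python
# Sketch of the decagram construction described above: two concentric congruent
# regular pentagons, one rotated by 36 degrees, give ten evenly spaced star points.
import numpy as np

def pentagon(radius=1.0, phase=0.0):
    """Vertices of a regular pentagon centered at the origin."""
    angles = phase + 2 * np.pi * np.arange(5) / 5
    return np.column_stack([radius * np.cos(angles), radius * np.sin(angles)])

p1 = pentagon()                           # first pentagon
p2 = pentagon(phase=np.radians(36))       # second pentagon, rotated by 36 degrees
star_points = np.vstack([p1, p2])         # the ten outer points of the star

# The ten points are evenly spaced every 36 degrees around the circle:
angles = np.degrees(np.mod(np.arctan2(star_points[:, 1], star_points[:, 0]), 2 * np.pi))
print(np.sort(np.round(angles).astype(int)))   # 0, 36, 72, ..., 324
```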

February 27, 2015 (return)

David Yanez, Oregon Health & Science University
Longitudinal Change in IMT & Risk of Stroke, MI, and CHD  
Carotid intima-media thickness (IMT) assessed by B-mode ultrasound is an important non-invasive modality for evaluating atherosclerotic disease burden and global cardiovascular (CV) disease risk. A number of studies have examined the relationship between carotid IMT and subsequent cardiovascular disease [O'Leary, 1999; Psaty, 1999; Kuller, 2006 ARIC], and have generally shown a strong relationship. The evidence for a relationship between carotid IMT and future cardiovascular events is strong, especially among younger individuals [Lorenz MW et al. (2006). Carotid intima-media thickening indicates a higher vascular risk across a wide age range: prospective data from the Carotid Atherosclerosis Progression Study (CAPS). Stroke 37: 87–92]. Carotid IMT has also been used as a measure of disease progression in clinical trials investigating the efficacy of new pharmacologic products tested for their ability to reduce cardiovascular disease burden [Bots ML et al. (2003)]. Change in IMT has been reported to be associated with several known cardiovascular risk factors [Chambless 2002]. The use of IMT as an imaging surrogate marker of sub-clinical atherosclerosis and cardiovascular events has several desirable features: it is easily measurable in all study participants, it is non-invasive and relatively inexpensive, and, of particular importance in clinical trials, it does not require an extended duration of follow-up for cardiovascular events to occur [Demol P and Weihrauch TR (1998)]. Other studies, in patients with more severe cardiovascular disease, have shown disease regression with the use of statins, as indicated by a reduction in carotid IMT. Likewise, multiple diabetes [CHICAGO (Carotid Intima-Media Thickness in Atherosclerosis Using Pioglitazone)] and hypertensive medications have been shown to slow the progression of carotid IMT.  
However, much more limited evidence is available regarding the association of carotid IMT progression with cardiovascular outcomes in longitudinal studies. A meta-analysis of several longitudinal studies has examined the relationship between IMT and future events, but different studies have used different measurement methods and studied different populations; therefore these data, although important, are difficult to interpret [Lorenz MW, Circulation]. For an intermediate outcome measure to be valid, a high correlation between the surrogate and the ultimate outcome is desirable. Because IMT measurement is associated with a noteworthy amount of measurement error, measurement error in IMT change can affect the prediction of cardiovascular events and introduce bias.  
The Cardiovascular Health Study provides an ideal setting to examine the relationship between cardiovascular events and changes in IMT among a group of relatively healthy participants aged 65 and older who had carotid IMT measured at baseline, year 5, and year 11 of the study. For this investigation, our primary aim is to evaluate whether changes in the common carotid IMT and the internal carotid IMT are associated with subsequent clinical coronary heart disease, stroke, and myocardial infarction. In this observational study, we account for measurement-error bias using risk-set regression calibration (RSRC) methods on both the time-independent and time-dependent IMT measurements. We also adjust for known baseline confounders in our analyses, such as age, sex, race, smoking status, height, weight, systolic blood pressure, HDL, LDL, LV mass, Factor VII, fibrinogen, insulin, and blood glucose. Finally, we investigate the impact of measurement-error bias by comparing our results to those from standard (naïve) Cox proportional hazards models.
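
For readers unfamiliar with the baseline analysis mentioned above, here is a minimal sketch of fitting a standard ("naïve") Cox proportional hazards model with the lifelines Python package. The data frame, column names, and covariates are invented for illustration, and the risk-set regression calibration correction itself is not shown.

```python
# Minimal "naive" Cox proportional hazards sketch using the lifelines package.
# The data and column names below are made up; the RSRC measurement-error
# correction described in the abstract is not shown here.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "imt_change": rng.normal(0.0, 0.1, n),   # error-prone exposure of interest
    "age": rng.normal(72, 5, n),
    "sbp": rng.normal(135, 15, n),
    "time": rng.exponential(10, n),          # follow-up time (years)
    "event": rng.integers(0, 2, n),          # 1 = CHD / stroke / MI event
})

cph = CoxPHFitter()
cph.fit(df, duration_col="time", event_col="event")
cph.print_summary()                          # hazard ratios and confidence intervals
```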