Colloquia
Department of Mathematics and Statistics
Texas Tech University
Dynamical systems at the frontiers of mathematics and biology use theoretical perspectives to gain insight into biological processes, while also allowing biological problems to drive the development of novel modeling frameworks. Two broad approaches are used in the field. The first focuses on using mathematics to advance biology: we apply existing modeling frameworks to biological processes to make predictions and deepen our understanding of the biological mechanisms that give rise to observed behaviors. The second allows the complexities of biology to demand new mathematics, leading to the development of novel modeling frameworks. These two approaches have advanced both mathematics and biology. Here, I will highlight examples of collaborative work in which we use both approaches to shed light on the mathematical and physical properties of complex biological phenomena. The work draws on a variety of techniques from dynamical systems theory, including ordinary, partial, stochastic, and delay differential equations, across a span of biological applications: ecological stoichiometry, ecotoxicology, pollination, malaria transmission, salamander pathogens, and cancer.

Numerous statistical methods have been developed to explore two confounding epigenetic factors in complex human diseases: genomic imprinting and maternal effects. Full-likelihood-based statistical methods have been proposed to detect these two effects simultaneously. Such methods, however, have to make strong assumptions concerning mating type probabilities to avoid overparameterization. In this talk, I describe two ways to detect the two epigenetic effects jointly without making strong assumptions about nuisance parameters: 1) a series of robust partial likelihood methods applicable to case-control family data or discordant sibpair data with additional siblings, which overcome overparameterization by matching affected and unaffected probands with the same familial genotypes and deriving a partial likelihood that is free of the nuisance parameters; and 2) a Monte Carlo expectation maximization method that overcomes overparameterization by using mating type probabilities as latent variables. In addition, if time allows, I will also introduce two methods to detect the imprinting effect under the assumption of no maternal effect: 1) a Monte Carlo pedigree parental-asymmetry test using both affected and unaffected offspring, and 2) a joint test for detecting imprinting and non-imprinting allelic expression imbalance using RNA-seq data based on a reciprocal cross design.

Long range dependence is a property of stationary stochastic processes under which observations even very far apart in time are far from being independent. This phenomenon is believed to occur in many applications, including hydrology, finance, climate and others. We argue that one should view long range dependence as a phase transition. In some cases such a phase transition in the extreme values of the process turns out to be closely related to the ergodic-theoretical properties of a certain dynamical system driving the process through its Lévy exponent.
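For readers who want a concrete anchor, one standard (though not the only) way to formalize long-range dependence is through correlations that decay so slowly that they are not summable, for example

\[
\rho(n) \sim c\, n^{-\alpha}, \qquad 0 < \alpha < 1, \qquad \text{so that} \qquad \sum_{n=1}^{\infty} \rho(n) = \infty ,
\]

where \rho(n) denotes the correlation of the stationary process at lag n. Roughly speaking, the phase-transition viewpoint asks how key functionals of the process change character as such a memory parameter crosses a critical boundary.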
There are several aspects of financial asset portfolio construction relevant for success. First, the methodology should be applicable to a reasonably large number of assets, at least on the order of 100. Second, calculations should be computationally feasible, straightforward, and fast. Third, realistic transaction costs need to be taken into account for the modeling paradigm to be genuinely applicable. Fourth, and arguably most importantly, the proposed methods should demonstrably outperform benchmark models such as the equally weighted portfolio, Markowitz IID, and Markowitz using the DCC-GARCH model. A fifth "icing on the cake" is that the underlying stochastic process assumption is mathematically elegant, statistically coherent, and allows analytic computation of relevant risk measures for both passive and active risk management. The model structure to be shown, referred to as "COMFORT", satisfies all these criteria. Various potential new ideas will also be discussed, with the aim of enticing and motivating other researchers to collaborate on and/or improve upon the investment vehicles shown.

Option pricing is one of the main research areas of modern Mathematical Finance, and new developments in this area remain well motivated and highly desirable. The aim of the talk is to present some comprehensive issues that may be of interest to a wider audience beyond those experts who primarily work in Mathematical Finance. Moreover, developments in option pricing can be considered a fruitful source of new problems and studies in related mathematical disciplines. In the talk we discuss the essence of the notion of a "financial contract" and formulate the main problem for study in this context, working primarily in the classical Black-Scholes environment. A dual theory of option pricing will be developed by means of market completions, as an alternative to the well-known characterization of option prices via martingale measures. We also present another approach to approximate option pricing based on comparison theorems for solutions of stochastic differential equations; it will be shown that this method yields very satisfactory bounds for option prices. Finally, we turn our attention to extensions of the probability distributions of stock returns using orthogonal polynomial techniques, which makes it possible to see what happens beyond the Black-Scholes model.
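Since the classical Black-Scholes environment serves as the baseline in the talk above, a minimal sketch of the standard Black-Scholes call price may be useful for orientation. This is the textbook formula only, not the talk's own methods, which go beyond it:

import math
from statistics import NormalDist

def black_scholes_call(S, K, T, r, sigma):
    """Classical Black-Scholes price of a European call option.
    S: spot price, K: strike, T: time to maturity in years,
    r: risk-free rate, sigma: volatility."""
    d1 = (math.log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    N = NormalDist().cdf
    return S * N(d1) - K * math.exp(-r * T) * N(d2)

# Example: at-the-money one-year call
print(black_scholes_call(S=100.0, K=100.0, T=1.0, r=0.02, sigma=0.2))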
Arguably, all data can be stored as some sort of image: a picture is all it takes for a car insurer to expedite payment of a claim for a bumper scratch; satellite pictures of the Earth show that we indeed live in a thin layer of air; medical images and first- and second-generation DNA sequences are initially stored as images; and digital cameras are at the fingertips of any smartphone owner. In each instance, certain information extracted from such images is represented on a metric space, which often has a smooth structure or the structure of a stratified space, thus opening the formidable doors to the realm of geometric and algebraic topological data analysis for data extracted from electronic images. A few simple examples of such methodology are presented here. This is joint work with Rob Paige (MST), Daniel Osborne, Mingfei Qiu, Ruite Guo, K. David Yao, David Lester, Yifang Deng, Shen Chen, Seunghee Choi and Hwiyoung Lee.

Climate change requires a global perspective to understand the past and explore the future. The impacts of climate change, however, are experienced mainly at the local to regional level. A range of statistical techniques, from simple to complex, are commonly used to bridge the gap between the spatial scales at which climate is modeled from fundamental physical principles and the spatial and temporal scales at which impact assessments require climate projections. This step, often referred to as "downscaling" and bias correction, poses some significant challenges, but also has the potential to provide essential input to real-world decision-makers, from water managers to infrastructure engineers. In this presentation I will describe the methods and evaluation framework we have developed to generate and test these high-resolution climate projections, some of the ways that information has been used, and how I expect this field to continue to evolve in the future.
I also have a new co-authored book on this topic coming out with Cambridge University Press in November, which readers can refer to if they are interested in more information.
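For readers unfamiliar with the bias-correction step mentioned above, empirical quantile mapping is one of the simplest and most widely used techniques: each model value is replaced by the observed value occupying the same quantile over a common historical period. It is offered here only as an illustration, not necessarily the method developed in the talk; a minimal NumPy sketch:

import numpy as np

def quantile_map(model_hist, obs_hist, model_future):
    """Empirical quantile mapping: map each future model value to the
    observed value at the same quantile of the historical period."""
    qs = np.linspace(0.0, 1.0, 101)            # evaluation quantiles
    model_q = np.quantile(model_hist, qs)      # model historical climatology
    obs_q = np.quantile(obs_hist, qs)          # observed climatology
    ranks = np.interp(model_future, model_q, qs)   # approximate F_model(x)
    return np.interp(ranks, qs, obs_q)             # apply F_obs^{-1}

# Toy example: a model that runs 2 degrees too warm
rng = np.random.default_rng(0)
obs = rng.normal(15.0, 5.0, size=1000)   # "observed" historical temperatures
mod = rng.normal(17.0, 4.0, size=1000)   # biased model over the same period
fut = rng.normal(19.0, 4.0, size=1000)   # raw model projection
corrected = quantile_map(mod, obs, fut)  # bias-adjusted projection
print(corrected.mean())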
We provide background and examples of tensor decomposition. We then consider a special case: decomposing higher-order moment tensors, i.e., the sum of symmetric outer products of data vectors. Such a decomposition can be used to estimate the means in a Gaussian mixture model and for other applications in machine learning. The dth-order empirical moment tensor of a set of p observations of n variables is a symmetric d-way tensor. Our goal is to find a low-rank tensor approximation comprising r << p symmetric outer products. The challenge is that forming the empirical moment tensor costs O(pn^d) operations and O(n^d) storage, which may be prohibitively expensive; additionally, the algorithm to compute the low-rank approximation costs O(n^d) per iteration. Our contribution is to avoid forming the moment tensor altogether, computing the low-rank approximation implicitly using O(pnr) operations per iteration and no extra memory. This advance opens the door to more applications of higher-order moments, since they can now be computed efficiently. We present numerical evidence of the computational savings and show an example of estimating means using higher-order moments. We also show how this can be done stochastically for massive datasets. Joint work with Samantha Sherman, Univ. Notre Dame.
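The identity that makes such implicit computations possible is that every quantity needed reduces to inner products raised to the dth power: ⟨x^{⊗d}, a^{⊗d}⟩ = (x·a)^d. A minimal NumPy sketch of this principle (an illustration only, not the authors' implementation) evaluates the fitting objective without ever forming the n^d moment tensor:

import numpy as np

def implicit_moment_objective(X, A, lam, d):
    """Evaluate || M - sum_j lam_j a_j^{(outer d)} ||^2, up to the constant
    ||M||^2 term, where M = (1/p) sum_i x_i^{(outer d)} is the dth empirical
    moment tensor -- without ever forming the n^d tensor M.

    X   : (p, n) data matrix, rows are the observations x_i
    A   : (n, r) factor matrix, columns are the symmetric factors a_j
    lam : (r,)  weights
    d   : moment order
    """
    # <M, a_j^{(outer d)}> = (1/p) sum_i (x_i . a_j)^d      -- O(p n r) work
    cross = ((X @ A) ** d).mean(axis=0)          # shape (r,)
    # <a_j^{(outer d)}, a_k^{(outer d)}> = (a_j . a_k)^d    -- O(n r^2) work
    G = (A.T @ A) ** d                           # shape (r, r)
    # ||sum_j lam_j a_j^{(outer d)}||^2 - 2 <M, sum_j lam_j a_j^{(outer d)}>
    return lam @ G @ lam - 2.0 * lam @ cross

# Example: p = 10000 points in n = 50 dimensions, rank r = 5, order d = 3
rng = np.random.default_rng(0)
X = rng.normal(size=(10000, 50))
A = rng.normal(size=(50, 5))
lam = np.ones(5)
print(implicit_moment_objective(X, A, lam, d=3))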
This colloquium is hosted jointly by the Applied Math seminar group and the TTU Department of Mathematics and Statistics.

Anisotropic bending energy models a continuum (rod, fiber, cable, etc.) whose resistance to bending is directionally dependent. In this talk, we will discuss critical curves for this type of energy in both two- and three-dimensional space. In particular, we will discuss the construction and classification of examples in dimension two and the existence of examples in three dimensions.
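For orientation only (the precise functional treated in the talk may differ), a typical way to model direction-dependent bending resistance for a curve \gamma parametrized by arc length is to weight the classical bending energy by a function of the unit tangent T:

\[
E[\gamma] \;=\; \int_{\gamma} \lambda(T)\,\kappa^{2}\, ds ,
\]

where \kappa is the curvature and \lambda > 0 is a direction-dependent density; taking \lambda constant recovers the classical isotropic bending energy \int_{\gamma} \kappa^{2}\, ds, whose critical curves are Euler's elasticae.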
The talk is based on joint work with Dr. Alvaro Pámpano.
This colloquium is sponsored by the Elasticity seminar group in conjunction with the TTU Department of Mathematics and Statistics.
Quantitative Risk Management (QRM) is an important field of interdisciplinary research. Applications are to be found in all fields of science. In this talk, I will concentrate on applications to Insurance and Finance. The theorem having the distinct honor reflected in the title has its roots in mathematical statistics. It separates the applied world/problems where QRM applications are fairly standard from those where every serious application is a research project on its own. A basic reference is the book: A.J. McNeil, R. Frey and P. Embrechts (2005/2015) Quantitative Risk Management: Concepts, Techniques and Tools. Princeton U.P. (1st/2nd Ed.). See also the book's website www.qrmtutorial.org