Events
Department of Mathematics and Statistics
Texas Tech University
In 1956, Charles Stein shocked the statistical world by proving that, in high dimensions, the maximum likelihood estimator (MLE) is dominated by another estimator. The core idea of the James-Stein (JS) estimator is shrinkage, which inspired the development of various regularization techniques, including the lasso. Not only does the lasso shrink the MLE towards zero, but it also thresholds small estimates to exactly zero. This thresholding effect lets the lasso achieve smaller risk than the JS estimator in sparse signal settings. In the normal mean estimation problem, both the JS estimator and the lasso are special cases of a general family of estimators with shrinkage and thresholding effects (GEST). Here we consider the estimator within GEST that minimizes an approximate risk. The resulting nonparametric estimator, namely NOMAD, is consistent under certain conditions and can dominate both the JS estimator and the lasso in reducing risk. Applications of NOMAD to wavelet analysis and linear regression are discussed in the presentation.
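The shrinkage and thresholding effects described in the abstract can be illustrated on a sparse normal means problem. The sketch below is only illustrative (the sparsity level, signal strength, and the universal threshold standing in for a tuned lasso penalty are all assumptions, not part of the talk): it compares the empirical risks of the MLE, the JS estimator, and soft thresholding on one simulated draw.

```python
import numpy as np

rng = np.random.default_rng(0)

# Normal means problem: observe y ~ N(theta, I_p), estimate theta.
p = 500
theta = np.zeros(p)
theta[:10] = 7.0                      # sparse, strong signal (illustrative choice)
y = theta + rng.standard_normal(p)    # the MLE is y itself

# James-Stein estimator: shrink every coordinate of y toward the origin.
js = (1.0 - (p - 2) / np.sum(y**2)) * y

# In an orthogonal design the lasso reduces to soft thresholding;
# we use the universal threshold sqrt(2 log p) as a stand-in for tuning.
lam = np.sqrt(2 * np.log(p))
lasso = np.sign(y) * np.maximum(np.abs(y) - lam, 0.0)

def risk(est):
    """Squared-error loss against the true mean vector."""
    return np.sum((est - theta) ** 2)

print(f"MLE risk:   {risk(y):.1f}")
print(f"JS risk:    {risk(js):.1f}")
print(f"Lasso risk: {risk(lasso):.1f}")
```

On draws like this one, shrinkage alone (JS) already beats the MLE, and thresholding the many near-zero coordinates to exactly zero lets the lasso do better still, matching the sparse-signal comparison in the abstract.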
Information: Dr. Wei Jiang is an Assistant Professor in the Department of Mathematics at UT Arlington and is also affiliated with the Division of Data Science and the Center of Innovation for Health Informatics. Prior to joining UT Arlington, he was on the Biostatistics faculty at Yale University. Dr. Jiang focuses on developing low-dimensional modeling methods to improve dependency and scalability in the analysis of large-scale, high-dimensional data, with applications in multimedia and multi-omics. Along this line, he has developed multiple statistical methods to explore the molecular mechanisms of biological traits. His first-authored work has been published in Nature Communications, Statistica Sinica, and the American Journal of Human Genetics, among others, and his paper exploring the replicability of genomic discoveries received the Best Paper Award at the 14th Asia Pacific Bioinformatics Conference, held in San Francisco.
Please attend this week's Statistics seminar virtually at 3:00 PM via Zoom:
Meeting ID: 955 0666 7723
Passcode: 058626
Wednesday Apr. 1
Algebra and Number Theory: No Seminar
We introduce a new formulation of the classical Weierstrass–Enneper representation for minimal surfaces in four-dimensional Euclidean space. We present an explicit formulation that yields families of minimal surfaces in $R^4$ and allows both parametric and implicit descriptions. Several examples are discussed to illustrate the structure of the formula and its geometric implications. This new formulation provides a natural extension of classical minimal surface theory and offers new tools for studying minimal surfaces in higher codimension. This is joint work with Dr. Magda Toda.
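For context, the classical Weierstrass–Enneper representation in $R^3$, which the talk's formulation extends to $R^4$, builds a minimal surface from a holomorphic function $f$ and a meromorphic function $g$ (with $fg^2$ holomorphic); the form of the four-dimensional extension itself is not reproduced here.

```latex
x_1(z) = \operatorname{Re}\int \frac{f\,(1-g^2)}{2}\,dz, \qquad
x_2(z) = \operatorname{Re}\int \frac{i f\,(1+g^2)}{2}\,dz, \qquad
x_3(z) = \operatorname{Re}\int f g \,dz.
```

Different choices of the Weierstrass data $(f, g)$ recover the classical examples (Enneper's surface, the catenoid, the helicoid), which is the sense in which such a representation yields whole families of minimal surfaces.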
CDT is UTC-5. This Differential Geometry, PDE and Mathematical Physics seminar uses a hybrid format and is also available over Zoom.
Abstract: In this talk, we introduce an enhanced version of the incremental singular value decomposition (SVD) method. The original incremental SVD, proposed by Brand, efficiently computes the SVD of a matrix by iteratively updating a sequence of orthogonal transformations. However, the accumulation of such transformations can degrade orthogonality, necessitating frequent and computationally expensive reorthogonalizations. Brand raised the open question of how often reorthogonalization is required to ensure numerical precision. In this talk, we first answer this question by presenting a modified algorithm that entirely avoids computing large numbers of intermediate orthogonal matrices.
We further apply this modified incremental SVD technique to the numerical solution of Non-Fickian flows, a class of problems where the current solution depends on all previous time steps. This temporal dependency results in a linearly growing memory footprint and a quadratically increasing computational cost. Assuming the solution data is approximately low-rank, we introduce a memory-free algorithm based on incremental SVD that maintains only linear growth in computational complexity with respect to the number of time steps. We show that the solution error introduced by our approach is within machine precision, and our numerical results affirm significant gains in both computational efficiency and memory usage.
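As a concrete illustration of the column-by-column update the abstract refers to, here is a minimal sketch in the spirit of Brand's incremental SVD; all names are illustrative, there is no rank truncation, and each new column is assumed to add rank (production code would handle near-zero residuals). The repeated rotation of the factors at each step is exactly where orthogonality can slowly degrade, which is the issue the modified algorithm in the talk addresses.

```python
import numpy as np

def incremental_svd(A):
    """SVD of A built one column at a time (a sketch of Brand-style updates).

    Returns U, s, Vt with A ~= U @ np.diag(s) @ Vt.  Assumes every new
    column has a nonzero component outside the current left basis.
    """
    m, n = A.shape
    c0 = A[:, 0]
    s = np.array([np.linalg.norm(c0)])
    U = (c0 / s[0]).reshape(-1, 1)
    Vt = np.ones((1, 1))
    for j in range(1, n):
        c = A[:, j]
        p = U.T @ c                     # coefficients in the current left basis
        r = c - U @ p                   # component orthogonal to that basis
        rho = np.linalg.norm(r)
        k = s.size
        # Small (k+1)x(k+1) core matrix: old singular values plus new column.
        K = np.zeros((k + 1, k + 1))
        K[:k, :k] = np.diag(s)
        K[:k, -1] = p
        K[-1, -1] = rho
        Uk, s, Vkt = np.linalg.svd(K)
        # Rotate the enlarged factors by the core SVD.  Accumulating these
        # products is what erodes orthogonality in the original algorithm.
        U = np.hstack([U, (r / rho).reshape(-1, 1)]) @ Uk
        W = np.zeros((j + 1, k + 1))    # transpose of [[Vt, 0], [0, 1]]
        W[:j, :k] = Vt.T
        W[j, k] = 1.0
        Vt = Vkt @ W.T
    return U, s, Vt

rng = np.random.default_rng(1)
A = rng.standard_normal((60, 8))
U, s, Vt = incremental_svd(A)
err = np.linalg.norm(A - U @ np.diag(s) @ Vt)
print(f"reconstruction error: {err:.2e}")
```

Each update touches only a small core matrix rather than re-decomposing the full data, which is what makes the approach attractive for the memory-bound Non-Fickian flow computations described above.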
When: 4:00 pm (Lubbock's local time is GMT -5)
Where: room Math 011 (Math Basement)
ZOOM details:
- Choice #1: use the direct Zoom link, which embeds the meeting ID and passcode.
- Choice #2: Log into zoom, then join by manually entering the meeting ID and passcode ...
* Meeting ID: 949 9288 2213
* Passcode: Applied
TTU Math Circle (see the Spring flyer): 6:30-7:30 PM Thursdays in the Math building basement, room 010.
Noon CDT (UTC-5).
Zoom link available from Dr. Brent Lindquist upon request.