Events
Department of Mathematics and Statistics
Texas Tech University
Anomaly detection in dynamic networks is one of the most actively developing fields in network science and its applications. In this study, we invoke the machinery of topological data analysis (TDA), particularly persistent homology, together with functional data depth to evaluate anomalies in weighted dynamic networks. The proposed method is based on higher-dimensional topological structures, in particular the persistence diagram and its vector representations. Intuitively, this TDA- and data-depth-based method (TDA-Depth) evaluates the patterns and dynamics of the networks at multiple scales and systematically distinguishes regular from abnormal network snapshots in a dynamic network. In synthetic experiments, the TDA-Depth method outperforms a state-of-the-art baseline. We also evaluate our method on two real weighted dynamic networks, and in both datasets we demonstrate that it can effectively identify anomalous network snapshots.
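As a toy illustration of the kind of multi-scale topological summary the abstract describes (this is not the speaker's TDA-Depth pipeline), here is a minimal sketch computing the 0-dimensional persistence pairs of a weighted network under an edge-weight (sublevel) filtration; all function names are hypothetical:

```python
# Illustrative sketch only: 0-dimensional persistent homology of a
# weighted graph. All vertices appear at scale 0; an edge appears at its
# weight; a connected component "dies" when it merges into another one.

def find(parent, x):
    """Union-find root lookup with path compression."""
    while parent[x] != x:
        parent[x] = parent[parent[x]]
        x = parent[x]
    return x

def persistence_0d(n_vertices, weighted_edges):
    """Return (birth, death) pairs for the finite 0-dim features.

    weighted_edges: iterable of (u, v, weight) triples.
    """
    parent = list(range(n_vertices))
    pairs = []
    # Sweep edges in increasing weight order (the filtration parameter).
    for u, v, w in sorted(weighted_edges, key=lambda e: e[2]):
        ru, rv = find(parent, u), find(parent, v)
        if ru != rv:
            parent[rv] = ru          # merge: one component dies at scale w
            pairs.append((0.0, w))
    return pairs
```

The resulting (birth, death) pairs form a persistence diagram, which can then be vectorized and compared across network snapshots.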
This Departmental Candidate Colloquium is at 2:00 PM (UT-6) and may be attended live in ESB1 120, and virtually via this Zoom link.
Meeting ID: 939 5469 1292
Passcode: Dey
Free probability theory was initially developed to study a certain class of von Neumann algebras in the theory of operator algebras. Thanks to the groundbreaking work of Voiculescu, it has turned out to be the most suitable framework for studying universality laws in random matrix theory. These limiting laws are encoded in abstract operators called free random variables. The interaction between abstract operator algebras and random matrix theory is very fruitful, and it opens the door to applications in a wide range of areas, including high-dimensional statistics, large neural networks, wireless communication, and quantum information theory.
In this talk, I will report some exciting progress on the limiting distributions of many deformed random matrix models. We calculated the Brown measures of a large family of random variables in free probability theory, which allows us to investigate many new non-Hermitian random matrix models. Our recent work surpasses all existing results on this topic and unifies previous methods. We will discuss applications to random networks, high-dimensional statistics, and outliers of large random matrix models.
This Departmental Job Candidate Colloquium may be attended virtually at 2:00 PM (UT-6) via this Zoom link.
Single index models provide an effective dimension reduction tool in regression, especially for high dimensional data, by projecting a general multivariate predictor onto a direction vector. In this talk, we propose a novel single-index model for regression models where metric space-valued random object responses are coupled with multivariate Euclidean predictors. The responses in this regression model include complex, non-Euclidean data, including covariance matrices, graph Laplacians of networks, and univariate probability distribution functions, among other complex objects that lie in abstract metric spaces. While Fréchet regression has proved useful for modeling the conditional mean of such random objects given multivariate Euclidean vectors, it does not provide for regression parameters such as slopes or intercepts, since the metric space-valued responses are not amenable to linear operations. As a consequence, distributional results for Fréchet regression have been elusive. We show here that for the case of multivariate Euclidean predictors, the parameters that define a single index and projection vector can be used to substitute for the inherent absence of parameters in Fréchet regression. Specifically, we derive the asymptotic distribution of suitable estimates of these parameters, which then can be utilized to test linear hypotheses for the parameters, subject to an identifiability condition. Consistent estimation of the link function of the single index Fréchet regression model is obtained through local linear Fréchet regression. The method is illustrated for distributional object data from the Human Mortality Database.
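For intuition only (this is not the speaker's estimator), here is one concrete instance of a Fréchet mean for distributional objects such as those in the Human Mortality Database: under the 2-Wasserstein metric, the Fréchet mean of univariate distributions is obtained by averaging their quantile functions pointwise. A minimal sketch, assuming all quantile functions are sampled on a common grid:

```python
# Illustrative sketch: the barycenter (Fréchet mean) of univariate
# distributions in 2-Wasserstein space is the pointwise average of
# their quantile functions.

def frechet_mean_quantiles(quantile_curves):
    """Average a list of quantile functions sampled on the same grid."""
    n = len(quantile_curves)
    m = len(quantile_curves[0])
    return [sum(q[j] for q in quantile_curves) / n for j in range(m)]
```

In a single-index Fréchet regression, such conditional means would be modeled as a function of the scalar projection of the predictor onto an index vector.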
This week's first Statistics seminar, also a Departmental Job Candidate talk, may be attended at 3 PM (UT-6) Monday via this Zoom link.
Meeting ID: 976 6677 9159
Passcode: 677590
Huntington’s Disease (HD) is an inherited neurological disorder caused by a single gene mutation. Spanning a long life-course, HD usually strikes early in life and manifests in middle age, causing gradual deterioration of cognitive and motor function that ultimately leads to loss of functional capacity. Despite its seemingly straightforward etiology, there is no cure or effective treatment. Major challenges in designing effective trials include enormous patient heterogeneity and unclear timing of intervention. To this end, we propose a two-stage regression-based method that simultaneously identifies changepoints and subgroups in unbalanced longitudinal observations. The proposed method advances the piecewise linear growth mixture model, which requires prior information and often assumes a balanced design. In the proposed method, simultaneous detection is achieved via the minimax concave penalty (MCP) applied to adjacent derivative differences and pairwise distances of functions represented by second-order B-spline basis functions. An alternating direction method of multipliers (ADMM) algorithm is developed to obtain the estimated coefficients, and the locations of the interior knots are fine-tuned to improve the accuracy of the changepoints. Compared to existing methods, the proposed method shows outstanding performance in numerous simulation studies. We also demonstrate the findings of our method on the Enroll-HD dataset.
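For reference, the minimax concave penalty mentioned in the abstract has a simple closed form: p(t; λ, γ) = λ|t| − t²/(2γ) for |t| ≤ γλ, and γλ²/2 otherwise. A minimal sketch of evaluating it (illustrative only, not the speaker's implementation):

```python
# Minimax concave penalty (MCP) of Zhang (2010): behaves like the lasso
# penalty near zero but flattens out, so large coefficients are not
# over-shrunk.

def mcp_penalty(t, lam, gamma):
    """Evaluate the MCP at t with tuning parameters lam > 0, gamma > 1."""
    a = abs(t)
    if a <= gamma * lam:
        return lam * a - t * t / (2.0 * gamma)
    return 0.5 * gamma * lam * lam   # constant beyond gamma * lam
```

Applying this penalty to differences of adjacent coefficients drives small differences exactly to zero, which is what fuses neighboring segments and subgroups in the proposed method.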
Please attend this week's second Statistics seminar at 4 PM (UT-6) Monday via this Zoom link.
Meeting ID: 988 5627 1815
Passcode: 153652
The performance of existing neural network models for time series prediction is evaluated using the squared error loss function. When each time series dataset is generated from an autoregressive model of order 2 (AR(2)) with known parameters, the mean squared error of the AR(2) model is lower than that of neural network models, including the recurrent neural network (RNN) model. A theorem and accompanying conditions are also presented to demonstrate the superiority of time series modeling over neural network modeling in this setting.
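A minimal sketch of the experimental setup described above, assuming Gaussian innovations (function names are hypothetical): simulate an AR(2) series with known parameters and compute the one-step-ahead MSE of the true model, which converges to the innovation variance and serves as the baseline against which a neural network predictor is compared.

```python
import random

def simulate_ar2(phi1, phi2, n, sigma=1.0, seed=0):
    """Simulate x_t = phi1*x_{t-1} + phi2*x_{t-2} + eps_t, eps_t ~ N(0, sigma^2)."""
    rng = random.Random(seed)
    x = [0.0, 0.0]                       # zero initial conditions
    for _ in range(n):
        x.append(phi1 * x[-1] + phi2 * x[-2] + rng.gauss(0.0, sigma))
    return x[2:]

def one_step_mse(x, phi1, phi2):
    """MSE of the true AR(2) one-step-ahead predictor on series x."""
    errs = [(x[t] - phi1 * x[t - 1] - phi2 * x[t - 2]) ** 2
            for t in range(2, len(x))]
    return sum(errs) / len(errs)
```

With the true parameters, the prediction errors are exactly the innovations, so the MSE approaches sigma² as the sample grows; any fitted predictor, neural or otherwise, cannot do better in expectation.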
Please attend the semester's final Statistics seminar at 5 PM (UT-6) Monday via this Zoom link.
Meeting ID: 939 6108 8953
Passcode: ttustat
Wednesday Dec. 6, 7 PM, MA 108
Mathematics Education Math Circle
Jeff Lee, Department of Mathematics and Statistics, Texas Tech University
Math Circle Fall Poster