Events
Department of Mathematics and Statistics
Texas Tech University
When the state of a system is not completely observable, filtering is concerned with estimating the state from partial observations. It has many applications in the control of partially observed systems, target tracking, signal processing, statistics, and financial engineering. The celebrated Kushner equation and Duncan-Mortensen-Zakai equation characterize the conditional distribution (or density) and yield nonparametric estimates of it, but approximating their solutions suffers from the curse of dimensionality. In this talk, we introduce a new filtering algorithm, termed deep filtering, based on the deep learning framework. It addresses a long-standing (60-year-old) challenge in computational nonlinear filtering and has the potential to overcome the curse of dimensionality. We then present our work on deep filtering with adaptive learning rates. We convert the filtering problem into an optimization problem of finding the optimal weights of a deep neural network (DNN). We construct a two-time-scale stochastic gradient descent algorithm that updates the weights of the DNN and the learning rates adaptively; the learning-rate update is a form of unsupervised learning. We prove asymptotic results for our algorithm and obtain error bounds for the parameters of the neural network. This talk aims to give the general audience a tour of filtering algorithms based on deep learning, and we will present details of the implementation of our algorithm. Finally, we will present two numerical examples showing the efficiency and robustness of our algorithm. This is based on joint work with Prof. George Yin and Prof. Qing Zhang.
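To make the two-time-scale idea concrete, here is a minimal sketch (not the speakers' actual algorithm) of stochastic gradient descent in which model weights are updated on a fast time scale while the learning rate itself is adapted on a slow time scale via a hypergradient-style rule; the toy data, the linear "network", and the step sizes are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy filtering-style data: recover a hidden state x from a noisy observation y.
true_w = np.array([2.0, -1.0])  # assumed ground-truth mapping, for illustration

def sample_batch(n=32):
    y = rng.normal(size=(n, 2))                 # observations (inputs)
    x = y @ true_w + 0.1 * rng.normal(size=n)   # hidden state (targets)
    return y, x

w = np.zeros(2)        # "network" weights (a linear model, for simplicity)
eta = 0.01             # learning rate, itself adapted over time
beta = 1e-4            # slow-time-scale step size for the learning rate
prev_grad = np.zeros(2)

for step in range(2000):
    y, x = sample_batch()
    grad = 2 * y.T @ (y @ w - x) / len(x)       # gradient of mean squared error
    # Slow time scale: adjust eta by the correlation between successive
    # gradients (aligned gradients -> grow eta; opposing -> shrink it),
    # clipped to a safe positive range.
    eta = min(0.1, max(1e-4, eta + beta * float(prev_grad @ grad)))
    # Fast time scale: ordinary SGD step on the weights.
    w -= eta * grad
    prev_grad = grad

print(np.round(w, 2))  # recovers approximately true_w = [2, -1]
```

The unsupervised flavor of the learning-rate update is visible here: no labels are needed for it, only the stream of gradients produced by the fast-time-scale iteration.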
Please attend this week's Statistics seminar at 4 PM (UTC-6) on Monday via Zoom.
Meeting ID: 644 523 4421
Passcode: ttustat
In this talk, based on joint work with Ben Gripaios and Oscar Randal-Williams (arXiv:2209.13524 and arXiv:2310.16090), we will, with help from the geometric cobordism hypothesis, define and study invertible smooth generalized symmetries of field theories within the framework of higher category theory. We will show the existence of a new type of anomaly that afflicts global symmetries even before any attempt at gauging; we call these anomalies "smoothness anomalies". In addition, we will see that d-dimensional QFTs, when considered collectively, can have d-form symmetries, going beyond the (d-1)-form symmetries known to physicists for individual QFTs. We will also touch on aspects of gauging global symmetries in the case of topological quantum field theories.