Events
Department of Mathematics and Statistics
Texas Tech University
Wednesday Apr. 22
Algebra and Number Theory: No Seminar
In this talk, I will discuss recent results on spectral estimates for Schrödinger-type operators on compact Riemannian surfaces and their geometric applications to free boundary minimal and constant mean curvature surfaces.
In particular, I will present an upper bound for the second Robin eigenvalue in terms of the topology and geometry of the surface, and explain how this estimate is connected to stability questions for free boundary surfaces.
This is joint work with R. Antonia and V. Souza.
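For readers unfamiliar with the setting, the Robin eigenvalue problem on a compact surface with boundary is standardly formulated as follows (a generic statement with assumed notation, not taken from the talk; sign conventions for the Laplacian and the Robin parameter vary between authors):

```latex
% Robin eigenvalue problem on a compact surface \Sigma with
% boundary \partial\Sigma, outward unit conormal \nu, and Robin
% parameter \sigma \in \mathbb{R}:
\begin{cases}
  -\Delta u = \lambda u & \text{in } \Sigma, \\[2pt]
  \dfrac{\partial u}{\partial \nu} + \sigma u = 0 & \text{on } \partial\Sigma.
\end{cases}
% The eigenvalues form a discrete sequence
% \lambda_1(\sigma) \le \lambda_2(\sigma) \le \cdots \to \infty;
% the talk concerns upper bounds on the second one, \lambda_2(\sigma).
```

The cases $\sigma = 0$ and $\sigma \to \infty$ recover the Neumann and Dirichlet problems, respectively.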
US CDT is UTC-5. This Differential Geometry, PDE and Mathematical Physics seminar is available over Zoom.
Abstract: In this paper, a new semi-discrete version of the Carleman estimate-based convexification globally convergent numerical method is developed. It is used to provide a starting point for the training procedure of deep learning. An important feature of the continuous version of the convexification method is that its convergence to the true solution is independent of the availability of a good first guess. A new concept of h-strong convexity is introduced, where h is the grid step size in the semi-discrete version of the convexification method. The h-strong convexity enables an a priori accuracy estimate of the starting point for the deep learning training step. This approach is demonstrated for a highly nonlinear problem of Electrical Impedance Tomography. Results of numerical experiments for complicated media structures demonstrate the computational feasibility of this procedure.
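For context, the convexification approach builds a weighted cost functional that is strongly convex on a suitable bounded set; h-strong convexity is the semi-discrete adaptation of the classical notion, which reads (standard definition, not the paper's specific functional):

```latex
% Classical strong convexity of a functional J on a Hilbert space H:
% there exists \mu > 0 such that for all x, y in the admissible set,
J(y) \;\ge\; J(x) + \langle J'(x),\, y - x \rangle
          + \frac{\mu}{2}\, \| y - x \|_{H}^{2}.
% Strong convexity yields a unique minimizer and convergence of
% gradient-type methods from an arbitrary starting point -- the
% property behind "convergence independent of a good first guess."
```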
This talk is co-sponsored with the Analysis seminar group.
When: 4:00 pm (Lubbock's local time is GMT -5)
Where: room Math 011 (Math Basement)
ZOOM details:
- Choice #1: use the direct link, which embeds the meeting ID and passcode.
- Choice #2: Log into Zoom, then join by manually entering the meeting ID and passcode ...
* Meeting ID: 949 9288 2213
* Passcode: Applied
TTU Math Circle Spring Flyer: 6:30-7:30 PM Thursdays in the basement of Math, room 010
In the AI era, people at every level, from kindergarten to PhD, ask AI mathematics questions. The problem is: are those answers trustworthy? How does the AI arrive at those solutions? Unfortunately, most of the time, the mathematical proofs produced by AI are hallucinated, lack rigor, or contain incorrect steps. So how do we train a machine to give precise proofs? In this talk, we address this problem from a differential geometry perspective using Lean.

Lean plays a very different role among programming languages compared to Python, Mathematica, MATLAB, or other computer-aided systems. All of those languages can solve numerical and symbolic problems in a systematic way. Lean, however, is designed for writing formally verified mathematical proofs and creating verified software. Using Lean gives the machine a precise proof, provides insight into how the machine understands your findings, and minimizes human errors.

We begin with an introduction to Lean and the paradigm of formal proof verification, highlighting why proof assistants are gaining adoption in contemporary mathematics. We present some simple examples in differential geometry and then survey the current state of differential geometry in Lean's mathematical library (Mathlib), examining what has been formalized and what remains open. The talk also compares the formalization process with traditional proof techniques, considering both advantages, such as absolute rigor and machine-checkable verification, and trade-offs, such as verbosity, learning curve, and limited automation.
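To give a flavor of machine-checked proof, here is a minimal Lean 4 sketch (a toy illustration, not taken from the talk): Lean's kernel accepts a theorem only if the accompanying proof term type-checks, so a statement that compiles is formally verified.

```lean
-- A machine-checked statement: commutativity of addition on Nat.
-- The proof appeals to the core-library lemma Nat.add_comm;
-- the kernel verifies that the term has exactly the stated type.
theorem my_add_comm (m n : Nat) : m + n = n + m :=
  Nat.add_comm m n

-- 0 + n = n is not true by mere computation on this definition of
-- addition, so it needs an actual lemma: Nat.zero_add supplies it.
example (n : Nat) : 0 + n = n := Nat.zero_add n
```

A proof that "almost works" simply fails to compile; there is no notion of a partially correct or hand-waved step, which is the rigor the talk contrasts with AI-generated informal proofs.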
CDT is UTC-5.