WASHINGTON EXPERIMENTAL MATHEMATICS LAB

Winter 2025 Projects

Musical Billiards

Faculty Mentor: Dr. Jayadev Athreya

Project Description: We will explore the language of billiards in regular polygons: a point mass, moving at unit speed with no friction, bouncing off walls with angle of incidence equal to angle of reflection. The associated language is given by labeling all the sides and considering the set of all possible codings of trajectories. We will explore this with coding and music, assigning musical notes to the sides and seeing what musical phrases we can get.
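
To make the coding concrete, here is a minimal Python sketch for the simplest case of a square table; the wall labels, the note assignment, and the starting data are placeholder choices for illustration, not part of the project specification. It follows a trajectory, records which side is hit at each bounce, and reads off the resulting word as a musical phrase.

```python
# A minimal sketch: billiard coding in the unit square (a regular 4-gon).
# The note assignment below is an arbitrary, made-up choice.
import math

NOTES = {"bottom": "C", "right": "E", "top": "G", "left": "B"}

def billiard_coding(x, y, vx, vy, n_bounces=16):
    """Follow a frictionless point mass in the unit square and record
    the label of each wall it hits (angle in = angle out)."""
    phrase = []
    for _ in range(n_bounces):
        # time to reach each wall along the current direction
        times = {}
        if vx > 0: times["right"] = (1 - x) / vx
        if vx < 0: times["left"] = -x / vx
        if vy > 0: times["top"] = (1 - y) / vy
        if vy < 0: times["bottom"] = -y / vy
        wall, t = min(times.items(), key=lambda kv: kv[1])
        x, y = x + t * vx, y + t * vy
        if wall in ("left", "right"):
            vx = -vx          # reflect off a vertical wall
        else:
            vy = -vy          # reflect off a horizontal wall
        phrase.append(NOTES[wall])
    return phrase

# Rational slopes give periodic phrases; irrational slopes give non-repeating ones.
theta = math.pi / 5
print(" ".join(billiard_coding(0.2, 0.3, math.cos(theta), math.sin(theta))))
```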


Project Level: Intermediate: students who have taken Math 300
Additional Course Requirements: Differential Equations

Programming Requirements: Python and Java would be great


Quantum symmetries of gentle algebras

Faculty Mentor: Dr. Amrei Oswald

Project Description: The symmetries of an object can be formalized mathematically as invertible, property-preserving maps from the object to itself. In the classical setting, these symmetries are given by the actions of groups. Quantum symmetry is a generalization of the notion of symmetry to the quantum setting, where objects have symmetries not captured by group actions. In this setting, quantum symmetries are given by Hopf actions of quantum groups on algebras. We will work with a class of quantum groups called Taft algebras and aim to classify their Hopf actions on gentle algebras.

Path algebras are an important class of algebras that can be described via directed graphs, and Hopf actions of Taft algebras on path algebras have a parametrization in terms of linear algebraic data. Gentle algebras are quotients of particular path algebras, so our goal will be to determine when the actions on the path algebras are defined on these quotients. Further, Taft algebras contain a group, and the action of this group on the path algebra is a symmetry of the associated directed graph. Therefore, determining the symmetries of the relevant directed graphs will be a good starting place for our project.
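
Since the symmetries of the relevant directed graphs are the suggested starting point, here is a minimal brute-force Python sketch; the three-cycle quiver below is an arbitrary illustration, not one of the quivers the project will actually study. It lists the vertex permutations that send arrows to arrows.

```python
# A minimal sketch: automorphisms of a small directed graph by brute force.
# The quiver here (a directed 3-cycle) is a made-up example.
from itertools import permutations

vertices = [0, 1, 2]
arrows = {(0, 1), (1, 2), (2, 0)}

def automorphisms(vertices, arrows):
    """Return all vertex permutations that map the arrow set to itself."""
    autos = []
    for perm in permutations(vertices):
        relabel = dict(zip(vertices, perm))
        if {(relabel[a], relabel[b]) for (a, b) in arrows} == arrows:
            autos.append(relabel)
    return autos

for sigma in automorphisms(vertices, arrows):
    print(sigma)   # the directed 3-cycle has the cyclic group of order 3 as symmetries
```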


Project Level: Intermediate: students who have taken Math 300
Additional Course Requirements: Math 403

Programming Requirements:


Wild Knot Mosaics

Faculty Mentors: Dr. Allison K. Henrich and Andrew Tawfeek

Project Description: Mosaic diagrams were developed in 2008 by Lomonaco and Kauffman to build quantum knot systems. Since then, the structure of mosaics has been widely studied by many others due to their convenient way of encoding a knot in three-dimensional space as a matrix. In 2014, it was shown by Kuriya and Shehab that mosaics are a complete invariant for tame knots, which are the concrete knots we can make with string in everyday life.
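
To give a feel for how a mosaic stores a knot diagram as a matrix, here is a small Python sketch. It uses a simplified, home-made encoding (each tile is recorded only by the sides where a strand meets its boundary, so crossings are not distinguished, and this is not the standard Lomonaco-Kauffman tile numbering); it checks the basic "suitably connected" condition that neighboring tiles agree on their shared connection points.

```python
# A minimal sketch: a mosaic as a matrix of tiles, each tile given by the set
# of sides ("N", "E", "S", "W") where a strand meets the tile boundary.
BLANK = frozenset()

# A tiny unknot mosaic: one circle built from four quarter-arc tiles.
mosaic = [
    [frozenset({"E", "S"}), frozenset({"W", "S"})],
    [frozenset({"E", "N"}), frozenset({"W", "N"})],
]

def suitably_connected(mosaic):
    """Check that strands match up across tile boundaries and never run off the edge."""
    rows, cols = len(mosaic), len(mosaic[0])
    for i in range(rows):
        for j in range(cols):
            tile = mosaic[i][j]
            east = mosaic[i][j + 1] if j + 1 < cols else BLANK
            south = mosaic[i + 1][j] if i + 1 < rows else BLANK
            if ("E" in tile) != ("W" in east):
                return False
            if ("S" in tile) != ("N" in south):
                return False
            if i == 0 and "N" in tile:   # no strand may exit the top row...
                return False
            if j == 0 and "W" in tile:   # ...or the leftmost column
                return False
    return True

print(suitably_connected(mosaic))  # True for this little circle
```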

In this project, we attempt to push the capabilities of the mosaic representation further by adapting it to capture knots that are not tame, namely wild knots. These knots can have infinite tangles and diverge towards infinity at various locations along the strand, but by adapting the structure of mosaic diagrams, some of these pathological objects, such as the Alexander horned sphere, can be represented. We will spend this project formalizing the theory behind these wild mosaics, writing code along the way for their study and investigation.


Project Level: Intermediate: students who have taken Math 300
Additional Course Requirements:

Programming Requirements:


Implementing Undetectable Backdoors in Machine Learning Models

Faculty Mentor: Daniel Shumow

Project Description: A recent paper showed how to implement undetectable backdoors in machine learning models (https://ieeexplore.ieee.org/document/9996741).

The work is fairly theoretical, and it is unclear how difficult this would be to implement in practice. As such, the purpose of this project will be to go over this paper, understand it, and then try to implement these backdoors.
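
As a toy illustration of the flavor of the idea only (this is not the construction from the paper, which hides a cryptographic signature scheme inside the model itself; the key, the tag length, and the stand-in classifier below are all invented for this sketch), the following Python snippet wraps a classifier so that inputs carrying a valid secret tag get their prediction flipped, while all other inputs are classified exactly as before.

```python
# A toy sketch of a trigger-based backdoor, NOT the paper's construction.
import hmac, hashlib
import numpy as np

SECRET_KEY = b"attacker-only-key"   # known only to whoever planted the backdoor
TAG_BITS = 32                       # the last 32 coordinates carry the tag

def tag_for(x):
    """Deterministic tag derived from the non-tag part of the input."""
    digest = hmac.new(SECRET_KEY, x[:-TAG_BITS].tobytes(), hashlib.sha256).digest()
    return np.unpackbits(np.frombuffer(digest, dtype=np.uint8))[:TAG_BITS]

def plant_tag(x):
    """Attacker-side: embed the tag in the trailing coordinates of the input."""
    x = x.copy()
    x[-TAG_BITS:] = tag_for(x)
    return x

def backdoored_predict(model_predict, x):
    """Behave like the original model, except on inputs carrying a valid tag."""
    if np.array_equal(x[-TAG_BITS:], tag_for(x)):
        return 1 - model_predict(x)   # flip the label on triggered inputs
    return model_predict(x)

# Stand-in "model": classify by the sign of the first coordinate.
model = lambda x: int(x[0] > 0)
x = np.random.randn(256)
print(model(x), backdoored_predict(model, x), backdoored_predict(model, plant_tag(x)))
```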


Project Level: Intermediate: students who have taken Math 300
Additional Course Requirements: Linear algebra would be helpful.
Programming Requirements: Python


The Math AI Lab

Faculty Mentor: Dr. Jarod Alper

Project Description: Meeting on Mondays at 4 pm, the Math AI Lab studies questions at the intersection of mathematics and AI. Our main focus this quarter will be on mathematical formalization using the Lean Theorem Prover and on autoformalization algorithms using machine learning.

The Math AI Lab will break into several smaller groups focused on specific projects. Several projects will be dedicated to learning the Lean programming language and using it to formalize interesting theorems. Theorem provers such as Lean are changing how mathematical research is done, so you might as well get on board now! To see whether this might be the right project for you and to get started learning Lean, you can play the “Natural Number Game” (https://www.ma.imperial.ac.uk/~buzzard/xena/natural_number_game/).
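
For a sense of what Lean code looks like, here is a tiny Lean 4 sample in the spirit of the Natural Number Game (the statements are chosen purely as a taste; the project itself will formalize more interesting theorems).

```lean
-- Two small facts about natural numbers in Lean 4.
-- `rfl` works because `n + 0` reduces to `n` by the definition of addition.
example (n : Nat) : n + 0 = n := rfl

-- Citing a library lemma as a term-mode proof...
example (a b : Nat) : a + b = b + a := Nat.add_comm a b

-- ...and the same statement proved in tactic mode.
example (a b : Nat) : a + b = b + a := by
  exact Nat.add_comm a b
```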

We will also run projects on designing our own tactics in Lean to facilitate the generation of proofs and our own reinforcement learning algorithms (in the style of AlphaZero and AlphaProof) to automate theorem proving.


Project Level: Intermediate: students who have taken Math 300
Additional Course Requirements:
Programming Requirements: programming classes, especially machine learning


Self-Organized Criticality

Faculty Mentor: Dr. Christopher Hoffman

Project Description: Self-Organized Criticality is a concept from mathematical physics that claims to explain diverse phenomena such as earthquakes, avalanches, and forest fires. These systems all have energy that builds up slowly and then is quickly released. The amount of energy released can vary widely, from very small earthquakes that can barely be felt to large earthquakes that can destroy cities.

Activated random walk is a probabilistic model that is believed to share some crucial properties with models from self-organized criticality. This model involves many particles performing random walks that interact with each other. We will numerically study the distribution of the amount of energy released in the activated random walk model.
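
To give a sense of the kind of simulation involved, here is a minimal Python sketch of a driven activated random walk on a segment; the sleep rate, the system size, and the exact driving and dissipation rules are simplifying choices made for illustration, and the project may study different variants. Particles are dropped one at a time, active particles either jump or fall asleep, sleepers wake when another particle lands on their site, and the number of jumps in each stabilization plays the role of the energy released.

```python
# A minimal sketch of driven activated random walk on sites 0, ..., N-1.
import random

N = 100            # sites; particles stepping off either end are lost
SLEEP_RATE = 0.25  # lambda: a lone active particle falls asleep at this rate

def avalanche(sleeping, site, rng=random):
    """Drop one active particle at `site`, stabilize, and return the number of jumps."""
    active = [site] + [site] * sleeping[site]   # the arrival wakes any sleepers here
    sleeping[site] = 0
    jumps = 0
    while active:
        i = rng.randrange(len(active))
        x = active[i]
        alone = sleeping[x] == 0 and active.count(x) == 1
        if alone and rng.random() < SLEEP_RATE / (1 + SLEEP_RATE):
            sleeping[x] += 1                      # the particle falls asleep
            active.pop(i)
        else:
            y = x + rng.choice((-1, 1))
            jumps += 1
            if 0 <= y < N:
                active[i] = y
                active.extend([y] * sleeping[y])  # landing on sleepers wakes them
                sleeping[y] = 0
            else:
                active.pop(i)                     # dissipated at the boundary
    return jumps

sleeping = [0] * N
sizes = [avalanche(sleeping, random.randrange(N)) for _ in range(2000)]
print("mean energy released per avalanche:", sum(sizes) / len(sizes))
```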


Project Level: Intermediate: students who have taken Math 300
Additional Course Requirements: Math 394/395/396 would be helpful.
Programming Requirements: programming skills are important


Identifying Feasible Regions in Highly Constrained Simulation-Based Optimization Problems

Faculty Mentors: Drs. Jeff Poskin and Joerg Gablonsky

Project Description: Optimization problems in industry may involve objectives and constraints determined by computationally expensive analysis codes. One of the major challenges in solving optimization problems in these settings is minimizing the total number of expensive function evaluations. This is particularly true in highly constrained settings, where many expensive evaluations may be immediately discarded by the optimizer because they fail to satisfy the constraints of the problem.

This project will explore different ways to efficiently identify feasible regions in constrained optimization problems. Students will be encouraged to explore algorithms that leverage surrogate models such as Gaussian Processes, Neural Networks, and Radial Basis Functions. Model-free algorithms that work directly from an evaluated set of points may also be considered.
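
As one possible illustration of the surrogate idea, here is a minimal Python sketch using scikit-learn; the toy disk constraint, the candidate-pool strategy, and all parameter choices are placeholders rather than a recommended method. A Gaussian-process classifier is fit to points already evaluated by the "expensive" code and then used to propose a candidate that is likely to be feasible, so the expensive code is called less often on infeasible designs.

```python
# A minimal sketch: use a GP classifier as a feasibility surrogate.
# The "expensive" constraint below is a cheap stand-in for an analysis code.
import numpy as np
from sklearn.gaussian_process import GaussianProcessClassifier

rng = np.random.default_rng(0)

def expensive_constraint(x):
    """Stand-in for an expensive analysis code: feasible inside the unit disk."""
    return float(np.sum(x**2) <= 1.0)

# A small initial design, evaluated with the expensive code.
X = rng.uniform(-1.5, 1.5, size=(40, 2))
y = np.array([expensive_constraint(x) for x in X])

surrogate = GaussianProcessClassifier().fit(X, y)

# Score a cheap pool of candidates and send only the most promising one
# to the expensive code.
candidates = rng.uniform(-1.5, 1.5, size=(500, 2))
p_feasible = surrogate.predict_proba(candidates)[:, 1]
best = candidates[np.argmax(p_feasible)]
print("next point to evaluate:", best, "predicted P(feasible):", p_feasible.max())
```
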
Students will participate in training sessions on best software development practices led by experienced Boeing personnel. Project mentors will additionally hold one or more code review sessions to support code development.


Project Level: Advanced: students who have taken multiple upper-level mathematics courses
Additional Course Requirements: An introductory course on numerical optimization should suffice.
Programming Requirements: Python and some familiarity with Git for software development management


Structured and Sparse Covariance Estimation

Faculty Mentor: Dr. Bamdad Hosseini

Project Description: Covariance estimation and approximation are fundamental tasks in statistics and machine learning. We will begin by studying the relationship between sparsity structures in the inverse of the covariance matrix and its Cholesky factors, and conditional independence of multivariate Gaussian random variables. Next, we will examine the problem of estimating a covariance matrix given structural knowledge, such as a parametric form, sparsity constraint, or tensorization.
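
The relationship in the first step can already be seen numerically in a few lines. Here is a small NumPy sketch (the AR(1)-style Gaussian chain is a toy example chosen for illustration): its precision matrix is tridiagonal with a bidiagonal Cholesky factor, while the covariance matrix itself is dense, reflecting the fact that non-neighboring variables are conditionally independent given the rest without being marginally independent.

```python
# A minimal sketch: sparsity of the precision matrix and its Cholesky factor
# for a Gaussian Markov chain (AR(1) with coefficient phi and unit innovations).
import numpy as np

n, phi = 8, 0.6

# Tridiagonal precision matrix of the stationary AR(1) chain.
Theta = np.diag(np.full(n, 1 + phi**2))
Theta[0, 0] = Theta[-1, -1] = 1.0
idx = np.arange(n - 1)
Theta[idx, idx + 1] = Theta[idx + 1, idx] = -phi

Sigma = np.linalg.inv(Theta)       # covariance: dense
L = np.linalg.cholesky(Theta)      # lower Cholesky factor of the precision: bidiagonal

count = lambda A: np.count_nonzero(np.abs(A) > 1e-12)
print("nonzeros in Sigma:", count(Sigma))    # 64: every pair is correlated
print("nonzeros in Theta:", count(Theta))    # 22: only neighbors interact
print("nonzeros in chol(Theta):", count(L))  # 15: bandwidth is preserved
```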

The main goal of the project will be to develop efficient algorithms for estimating and computing with Gaussian processes. Some particular topics of interest are nonparametric covariance models for Gaussian processes, fast sparse approximate factorizations of kernel matrices, and structured inverse covariance estimation.


Project Level: Advanced: students who have taken multiple upper-level mathematics courses
Additional Course Requirements: Linear Algebra is necessary (MATH 208/AMATH 352). Any experience with probability and statistics would be very helpful.
Programming Requirements: Some basic programming is necessary. Python is preferred; any other experience is a plus.