Doctoral Thesis Proposal - Bailey Miller

— 4:30pm

Location:
In Person and Virtual - ET - Newell-Simon 4305 and Zoom

Speaker:
BAILEY MILLER, Ph.D. Student
Computer Science Department
Carnegie Mellon University

https://www.bailey-miller.com

Stochastic Geometry Primitives

Numerical computing on complex geometry faces two core challenges: representing geometry and performing computation on it. Discretization—voxels, meshes, global solves—works until the geometry is too detailed or too uncertain to resolve. To overcome these limitations, we propose a complementary paradigm: stochastic geometry primitives (SGPs), which use randomness to avoid discretization in both representation and computation.

First, we’ll survey SGPs in graphics today: Monte Carlo rendering as an algorithmic primitive that interacts with geometry via local queries, and participating-media models as distributional representations that replace explicit particle interactions with free-flight sampling. Building on these ideas, we’ll show how the same principles extend beyond light transport to Monte Carlo PDE solvers that handle a range of boundary conditions using only local geometric queries, and stochastic solid representations that move past microparticle media to prior-free, uncertainty-aware geometry. These primitives are modular and differentiable, enabling inverse reconstruction and optimization-driven shape design without remeshing.
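The local-query principle behind these solvers can be illustrated with a minimal sketch (not from the talk; all names and parameters here are illustrative): walk-on-spheres for a Laplace problem with Dirichlet boundary data, which touches the geometry only through a distance query, alongside free-flight distance sampling for a homogeneous participating medium.

```python
import math
import random

def free_flight(sigma_t, rng):
    """Sample a free-flight distance in a homogeneous medium with
    extinction coefficient sigma_t (exponentially distributed)."""
    return -math.log(1.0 - rng.random()) / sigma_t

def walk_on_spheres(x, y, boundary_value, dist_to_boundary, rng, eps=1e-4):
    """One walk-on-spheres sample of the harmonic function with the given
    Dirichlet data, using only a local distance-to-boundary query."""
    while True:
        d = dist_to_boundary(x, y)
        if d < eps:                       # close enough: read off boundary data
            return boundary_value(x, y)
        theta = 2.0 * math.pi * rng.random()
        x += d * math.cos(theta)          # jump to a uniform point on the
        y += d * math.sin(theta)          # largest empty sphere around (x, y)

# Example: unit disk, boundary data g(x, y) = x, whose harmonic
# extension is u(x, y) = x, so the estimate at (0.3, 0.2) should be near 0.3.
rng = random.Random(0)
dist = lambda x, y: 1.0 - math.hypot(x, y)   # distance to the circle, from inside
g = lambda x, y: x / math.hypot(x, y)        # project to the circle, return x-coord
n = 5000
u = sum(walk_on_spheres(0.3, 0.2, g, dist, rng) for _ in range(n)) / n
print(u)  # ≈ 0.3, up to Monte Carlo noise
```

Note that nothing in the walk needs a mesh or a global linear solve: swapping in a different `dist_to_boundary` retargets the same solver to new geometry.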

Crucially, we’ll position these methods as general-purpose primitives: black-box operators for physics simulation (elliptic and transport PDEs), geometric computation (harmonic coordinates, distance-driven queries, shape optimization), and machine learning (differentiable PDE layers or neural PDE surrogates supervised by stochastic operators). In this view, SGPs provide a common API in place of meshes and global solves, so the same primitives serve simulation, geometry processing, and learning.
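As a purely illustrative sketch of this "common API" idea (no such interface is defined in the talk), one can imagine geometry exposing only local stochastic queries, with solvers written against that interface rather than against a mesh:

```python
import math
from typing import Protocol, Tuple

Point = Tuple[float, float, float]

class StochasticGeometry(Protocol):
    """Hypothetical interface: downstream solvers interact with geometry
    only through local queries, never through a global discretization."""
    def distance(self, x: Point) -> float: ...   # distance to the nearest boundary point

class Sphere:
    """Unit sphere centered at the origin, implementing the query."""
    def distance(self, x: Point) -> float:
        return abs(1.0 - math.sqrt(sum(c * c for c in x)))

def step_radius(geom: StochasticGeometry, x: Point) -> float:
    """Largest boundary-free step a walk-on-spheres-style solver
    may take from x, obtained from the local query alone."""
    return geom.distance(x)

print(step_radius(Sphere(), (0.0, 0.0, 0.5)))  # 0.5
```

The point of the sketch is the separation of concerns: any representation that can answer the query plugs into the same solver unchanged.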

Finally, we’ll outline current limits—hyperbolic and nonlinear PDEs—and a path forward via hybrid neural–Monte Carlo methods that iteratively refine a neural surrogate under stochastic-solver supervision while preserving the geometric scalability and robustness of Monte Carlo PDE solvers. We’ll close with proposed work on practical, more expressive stochastic surface models and a roadmap toward more general-purpose SGPs.

Thesis Committee
Ioannis Gkioulekas (Chair)
Keenan Crane
Nicholas Boffi
Ravi Ramamoorthi (University of California San Diego)
Mathieu Desbrun (École Polytechnique / Inria Paris-Saclay)

Additional Information


For More Information:
matthewstewart@cmu.edu

