Sam Foreman
Computational Scientist · Argonne National Laboratory

I'm a Computational Scientist in the AI / ML Group at the Argonne Leadership Computing Facility.
My work focuses on large-scale distributed training of foundation models for scientific applications. I co-lead the Models & Pre-Training team for the AuroraGPT project.
Recent Posts
Latest writing and notes
| Title | Description | Date |
|---|---|---|
| ⏱️ Comparing Launchers on Aurora | Benchmarking and comparing the performance of different launchers on Aurora at ALCF: `torchrun` vs. `ezpz launch` | 02/28/26 |
| 🍋 ezpz | A history and overview of ezpz, covering AMD and Intel PyTorch enablement timelines for portable distributed training across GPU vendors. | 01/10/26 |
| 🎉 Happy New Year! | A New Year update summarizing ongoing projects including AuroraGPT, AERIS, and other involvements at Argonne. | 01/07/26 |
| 🧊 Cooling Down Checkpoints: Best Practices for Model Evaluation | Best practices for cooling down model checkpoints before evaluation to improve validation loss comparisons. | 11/12/25 |
| 🎨 Mixing Between Distributions While Training | A mathematical framework for smoothly interpolating between data distributions during training using an annealing schedule. | 10/06/25 |
Recent Talks
Presentations and workshops