About
My Background
What I build
I like building software products and systems that feel simple to use.
That often means working across boundaries: product definition, backend systems, data flows, infrastructure, frontend surfaces, and the engineering details that make the experience reliable.
Problems I like
I like problems where the shape is still being figured out: early ideas, product inflection points, and systems that need to grow without becoming harder to reason about.
Lately, I've been especially interested in AI tools and workflows: how they change the way software gets built, how to evaluate them, and where they still need careful review.
I tend to like work where there are many possible paths forward, and progress depends on taste, iteration, and good feedback.
Writing and outside work
I write notes on software, AI tools, and learning from small projects. If you want a better sense of how I think, start with my recent writing or get in touch.
Outside of work, I play tennis, read, hike, rock climb, and spend time with my cats.

Topics I Keep Coming Back To
- AI tools for software products
- Evaluation and review workflows for AI-assisted work
- Distributed systems, backend architecture, and reliability
- Full-stack product engineering
- Cloud infrastructure and developer tools
- Embedded and hardware-adjacent systems
Credentials
Professional licensure in Ontario since 2017.
Graduate studies in computer engineering and telecommunications.
Undergraduate training in electrical engineering and pure mathematics.
Latest Writing
Camping Indoors in San Francisco
April 27, 2026
My first week in San Francisco has mostly been logistics, empty rooms, cats, and the slow work of making enough space to focus.
Relearning Through Small Interactive Tools
April 10, 2026
Using a coding agent to build small browser tools turned out to be a good way to revisit concepts I had not touched in a while.
Dropout as Implicit Bagging in Deep Learning
March 7, 2026
A concrete way to understand dropout in deep learning: shared-parameter ensemble training that approximates bagging without training separate models.