I'm a former ML engineer turned full-stack developer. I used to work a lot with Graph Neural Networks and NLP models.
I believe in function over form, that done is better than perfect, and that data outweighs intuition.
I'm an avid 🎹 player, cautious 🏍️ rider, casual 🎸 enjoyer, and an abysmal 🥏 tosser.
In my free time, I tend to and moderate the (small but growing) Geometric Deep Learning subreddit.
I build products that make the world a better place.
Most recently I've been working with my friend and co-founder Albert Wang on Comend, where we build tools that help the rare disease community become research-ready.
I drink a lot of 🍵 and enjoy the culture. I'm building Teatico, an encyclopedia of tea products, with my girlfriend Ramy Zhang.
I release original piano compositions on YouTube and Spotify.
Comend - Co-founder and CTO
Feb 2023 - Present - Toronto, Canada
Entrepreneur First (EF) - Founder in Residence
March 2022 - Feb 2023 - Toronto, Canada
Kebotix Inc. - Machine Learning Developer Intern
Sept 2020 - May 2021 - Boston, MA
Relation Therapeutics - Machine Learning Research Intern
Sept 2019 - May 2020 - London, UK
ML for Chem/Bio-informatics - Personal Projects
Oct 2018 - June 2019 - Toronto, Canada
Canadian Imperial Bank of Commerce (CIBC) - Summer Intern
June 2018 - Sept 2018 - Toronto, Canada
Jake P. Taylor-King, Cristian Regep, Jyothish Soman, Flawnson Tong, Catalina Cangea, Charlie Roberts
DDD fits continuous-time Markov chains over these basis functions and, as a result, maps continuously between distributions. The number of parameters in DDD scales with the square of the number of basis functions; we reformulate the problem and restrict the method to compact basis functions, so that only sparse matrices need to be inferred, reducing the number of parameters.
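The scaling argument can be made concrete with a small sketch. With n basis functions, a general rate matrix has on the order of n² free entries, while compact (locally supported) basis functions couple only neighbouring states, leaving a sparse generator with O(n) entries. This is an illustration, not the paper's method: the tridiagonal (nearest-neighbour-only) structure and the function names are assumptions standing in for "compact basis functions".

```python
import numpy as np

def dense_param_count(n: int) -> int:
    # General rate matrix over n basis functions: every off-diagonal
    # entry is free (diagonals are then fixed by the zero-row-sum rule).
    return n * (n - 1)

def compact_param_count(n: int) -> int:
    # Illustrative compact case: basis functions overlap only with their
    # nearest neighbours, so only the two off-diagonals adjacent to the
    # main diagonal carry free rates.
    return 2 * (n - 1)

def tridiagonal_generator(n: int, seed: int = 0) -> np.ndarray:
    # Hypothetical sparse CTMC generator: random nearest-neighbour rates,
    # diagonal chosen so every row sums to zero (a valid generator).
    rng = np.random.default_rng(seed)
    Q = np.zeros((n, n))
    idx = np.arange(n - 1)
    Q[idx, idx + 1] = rng.uniform(0.1, 1.0, size=n - 1)  # i -> i+1 rates
    Q[idx + 1, idx] = rng.uniform(0.1, 1.0, size=n - 1)  # i+1 -> i rates
    np.fill_diagonal(Q, -Q.sum(axis=1))
    return Q
```

At n = 100 basis functions this is 9,900 free rates in the dense formulation versus 198 in the tridiagonal one: the O(n²)-to-O(n) gap the sparse reformulation exploits.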
What do you build with?
TypeScript, Python, SQL databases, Tailwind. Container-first.
What's your setup?
JetBrains IDEs with vim bindings for web development. Codex occasionally. Terminal-heavy workflows back when I trained models.
What hardware are you on?
At home, I work on my custom-built rig (AMD Ryzen 9 9950X, RTX 5090). At work, I use a 2020 MacBook Pro M1.
What's your dev style?
Fast iteration. Idea → deployed quickly. Prefer shipping > overengineering.
How do you decide what to build?
Talk to users, look at data, rationalize how much positive impact I'd be making.
Favorite AI tools?
I'm model-agnostic: I default to locally run SOTA models (currently Qwen3-Coder 30B) and switch between GPT, Gemini, and Claude situationally.
How do you use AI?
Well-scoped, one-shottable tasks with a small context window, or exploring multiple ideas/avenues in parallel.
What do you suck at?
Writing and updating tickets, working on something I don't believe in, stepping away from a problem.
What's your cat's name?
Paloma is a tiny 4.5-pound Siamese 🐱 we adopted from a rescue.
What bike do you have?
A 2023 Yamaha MT-03.