I'm an AI researcher and engineer with a PhD in Machine Learning and Meta-Learning from the University of Edinburgh. I've worked at Google, at Amazon, and as a principal scientist at AI startups in the UK and US. I specialize in large language models, multimodal learning, and self-supervised methods.
I believe intelligence emerges from structure, interaction, and information—not from sheer scale alone. The field has converged on one dominant recipe: make it bigger. But that's not how natural intelligence works. Brains compress experience, maintain state, and build hierarchical representations. They don't re-ingest the universe every time they need to think.
My work focuses on the first principles of learning—how structure, interaction, and information theory can replace massive parameter counts. I'm driven by that Cambrian Explosion spirit of the 2010s: new architectures, new paradigms, genuine exploration of the research tree.