Two Swarthmore students. A phone full of ideas. One line that sparked it all: “LeetCode 1v1”
Two builders who believe coding should be competitive, fun, and genuinely useful.

Co-Founder
Computer Science major. The idea keeper: Eli writes down every idea that spontaneously pops into his head, usually sparked by a problem or annoyance he notices in everyday life. Previously launched Lock In, a mobile app he built and shipped in six months. Always building, always shipping.

Co-Founder
Double major in Psychology and Computer Science. Currently studying abroad in Melbourne, Australia with CASA at UniMelb. Previously worked on Prologue, a sports coaching platform. Brings a deep understanding of user psychology and what makes experiences engaging.
Every great thing starts with an unremarkable moment.
It started at dinner after a basketball game. Eli pulled out his phone and showed Andy a list of ideas he'd been keeping: every idea that spontaneously popped into his head, usually sparked by a problem or annoyance he'd noticed. Eli was already building and shipping Lock In at the time (which he launched within six months), but was always open to building more.
Andy was scrolling through the list, reflecting on how accessible building things had become with new technologies like LLMs. He was coming off his own project, Prologue, a sports coaching platform he'd started with a friend who had quietly walked away from it. Andy was ready for something new, something he could be genuinely excited about.
Hey, that seems like it would be fun to make!
— Andy, pointing at “LeetCode 1v1” on Eli's list
Over the next two months, Eli and Andy went deep. They found, curated, and tailored 5,000+ open-source programming problems with test cases. They built an ELO rating system, real-time matchmaking, and an architecture similar to LeetCode and Kattis — one that's constantly being iterated on. The reasoning was simple: LeetCode is monotonous. They wanted to gamify the process and make competitive programming feel like a sport.
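For readers curious about the mechanics, the Elo-style rating update behind 1v1 matchmaking can be sketched like this. This is the standard Elo formula, not AlgoArena's actual implementation; the K-factor of 32 and the 400-point scale are conventional assumptions.

```python
def expected_score(rating_a: float, rating_b: float) -> float:
    """Probability that player A beats player B under the Elo model."""
    return 1.0 / (1.0 + 10 ** ((rating_b - rating_a) / 400))

def update_elo(rating_a: float, rating_b: float, a_won: bool,
               k: float = 32) -> tuple[float, float]:
    """Return updated (rating_a, rating_b) after a head-to-head match.

    Winning against a stronger opponent moves your rating more than
    winning against a weaker one; the two changes always sum to zero.
    """
    exp_a = expected_score(rating_a, rating_b)
    score_a = 1.0 if a_won else 0.0
    new_a = rating_a + k * (score_a - exp_a)
    new_b = rating_b + k * ((1.0 - score_a) - (1.0 - exp_a))
    return new_a, new_b

# Example: a 1200-rated player upsets a 1400-rated player,
# gaining roughly 24 points while the favorite loses the same amount.
new_a, new_b = update_elo(1200, 1400, a_won=True)
```

Matchmaking then reduces to pairing players whose ratings are close, so the expected score for each side stays near 50% and every battle feels winnable.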
In the founders' own words
We're building AlgoArena.net, a competitive programming platform that gamifies the process. Our core feature is real-time coding 1v1s against other humans. We're essentially LeetCode and Chess.com's baby.
Most competitive programming tools are either too monotonous or too serious. We're the balance — making coding more fun while remaining genuinely engaging and useful.
Our flagship product. Real-time head-to-head coding battles with ELO-based matchmaking. Solve the same problem, race the clock, and outsmart your opponent. LeetCode meets Chess.com — competitive programming as a sport with 5,000+ curated problems across 16 languages.
A version of Kahoot, but fundamentally tailored for CS education. Professors can configure multiple choice, true/false, interactive puzzles, and full coding problems from our bank of questions. A revolutionary tool for teaching CS skills in the classroom — breaking the ice for students and making learning hands-on.
Recruiters can toggle between testing candidates on manual coding proficiency or AI-assisted development and prompt engineering. Multi-file questions that measure real-world skills: multitasking, prompt crafting, attention to detail, and interaction with AI tools — mirroring how developers actually work today.
From a pilot competitive programming class at Bryn Mawr College that used AlgoArena as their platform.
It was fire. I think the extra steps before you start programming is tedious, but everything else was fire.
Bryn Mawr CS Student
I liked it. It was cool to put our class practice to the test!
Bryn Mawr CS Student
I think it was good. The problems were a bit hard but good learning with time limit and competition.
Bryn Mawr CS Student
It was a good challenging experience. I think it's helpful — the time pressure helps us solve problems under pressure.
Bryn Mawr CS Student
Gained confidence in my ability to actually solve a problem.
Bryn Mawr CS Student
I enjoyed it. I do not like timed things but this was good practice.
Bryn Mawr CS Student
There's a massive disconnect in how recruiters test candidates today. For years, candidates were quizzed on abstract problems — inverting binary trees, obscure dynamic programming — that rarely reflected what they'd actually do on the job. It was a smart quiz, not a skills assessment.
Now with AI, that disconnect is even larger. Competitors like HackerRank or Codility offer dumbed-down versions of ChatGPT, not an environment comparable to the tools real developers use today. Their AI can't make inline edits, run in parallel, or replicate real development workflows.
They're missing out on a wealth of signal: how candidates write prompts, their attention to detail, their multitasking ability, their interaction with AI and the editor. That data would tell recruiters far more, in online assessments or live interviews, than anything currently tested or captured.
That's what AlgoArena OA Mode is built to solve.
Join hundreds of developers who are sharpening their skills through real-time coding battles. LeetCode made it educational. We made it a sport.