Simplismart is an AI infrastructure company building platform-agnostic middleware to deploy, train, customize, and serve open-source AI models in production. With a small team and under $1M in funding, its two Bengaluru-based founders built what they describe as the world’s fastest inference engine.
In Episode 3 of our Decoding AI series, Amritanshu Jain, Co-founder and CEO of Simplismart, joins Anagh Prasad to discuss the startup’s remarkable pivot. Initially built as an AutoML platform, Simplismart found its breakthrough when a conversation with a potential customer led the team to realise just how fast their inference engine was.
In this episode, Amritanshu also weighs in on the open-source versus closed-source debate, noting that the comparison is nuanced and that enterprises often need considerable education to navigate it. It is not a matter of declaring one approach superior to the other, he argues, but of understanding the broader context and the specific advantages each offers. While open-source solutions can deliver greater customization and cost efficiency, the right choice ultimately depends on the unique needs and constraints of the enterprise.