Case Study: Arize AI's Cofounders Discuss Their Path Towards Product-Market Fit
Learn how these two talented founders approached understanding what needed to be built before building it.
Executive Summary:
The Problem - Figuring Out What To Build First
Jason and Aparna took time to reflect on their own experience building solutions for observing AI/ML models of varying complexity. They focused on the common features of the problem that would apply across a variety of companies.
Action Item: Take time to bound the problem you're trying to solve by asking the most relevant questions before building a minimum viable product (MVP).
The Solution - Listen Constantly, Build Iteratively
The cofounders did what most other startups wouldn’t do: they listened extensively to relevant potential customers before starting to build various iterations of a solution.
Action Item: Measure Twice, Cut Once.
The Takeaway - It Takes Time To Build The Right Thing
Jason and Aparna are upfront about the drawbacks of their strategy. It took time before they began to write code, but the time they spent helped them find the right direction to build toward.
Action Item: Take your time when gathering feedback - you'll know you're building the right thing when your customers say your product or service is solving their problem.
Founder File - Jason & Aparna’s Case Study: Product Fit
I got the chance to speak with Jason Lopatecki and Aparna Dhinakaran, co-founders of Arize AI. Arize AI is a startup focused on machine learning (ML) Observability. They help companies monitor, troubleshoot, and explain their artificial intelligence (AI).
Arize AI cofounders (from left to right): Jason Lopatecki and Aparna Dhinakaran.
The Problem: Figuring Out What To Build First
What was the most challenging problem you've solved recently?
As you are starting a company and want to turn a vision into a rough version of a product, you need to figure out what to build first. There is always a debate about your initial starting point and whether it will be something that customers actually purchase. This is even harder if the product category doesn't exist yet. The big problem is what to build first: how to connect your vision to your initial step.
As Data Scientists/ML Engineers who have delivered thousands of models into products, we could see that models were getting more complex, that the complexity was driving business issues, and that there were common patterns of problems in Data Science. We also realized we were building similar software to troubleshoot these problems at every company we worked at, over and over. That said, what do we create first, what is an MVP, and are the problems and solutions we have seen applicable to a more extensive set of companies?
What was your initial thought process in solving the problem?
We knew there was no future in which AI and ML achieved our vision for their potential scale while Data Scientists were still troubleshooting deployed models in notebooks. That said, connecting the vision to a nuts-and-bolts reality isn't always easy. What was the common set of problems across a large set of disparate Data Science teams? What were the common solutions, methods, and technologies that teams were building to troubleshoot models in production? Those two questions are what we set out to answer as the company was forming.
The Solution: Listen Constantly, Build Iteratively
How did you evaluate your initial solution(s) before trying to implement them?
Our initial thought was that there is typically a long period where a founding team is forming and everyone is heads down building the product's foundation. What if we used this period to pitch a clickable (not yet working) version of the product to a large set of Data Scientists? We could continually gather and iterate on feedback on the initial product concept, then feed those notes back to the product team.
How is this different from what other people do? Essentially, we pitched a prototype A LOT - probably 10-100x more than most people do, across hundreds of Data Scientists and ML Engineers. Then we iterated. We filtered out customers who wouldn't be early adopters and focused on core features across the common set of problems we observed.
When you were working to implement them, what else did you discover that either confirmed you were on the right track or opened your eyes to a new facet of the problem?
You don't get product fit without putting the product in customers' hands, finding people who love it, and iterating on that product. Once data was live in the software, there were tons of insights that were only realizable as we sat down with users and solved problems. There were pieces we had to tackle in different ways than originally planned, but the general direction and base product came out very solid.
The Takeaway: It Takes Time To Build The Right Thing
When did you realize that you arrived at the right solution in building early versions of your product?
We started to realize we were on the right track when we went from people having general interest to getting comments like, "There is so much nonsense in Data Science and it's refreshing to see a product that just makes so much sense."
In general, as people were using our product, they were finding problems in their models faster. One customer told us they found some fairly large issues in a model used for pricing/forecasting - a model with a real effect on their business - all within two weeks of using the product.
Did that solution come with its caveats or tradeoffs? If so, what are they?
The approach we took to get the product to market was probably a bit noisier than normal. You get a lot of feedback before launching anything and have to decide what to incorporate and what to wait on.
In general, lots of feedback isn't always good; it means you have more notes to sift through and more dots to connect. In our case, many suggestions seemed to fit a different type of user. As we talked to more Data Scientists, we realized some of the notes fit a different organizational user, and we had to go back and filter those down.
What is your general advice for founders on how to face the challenges related to early product experimentation?
If you think you have an amazing product idea, mock up an initial version and walk a lot of users through it before writing any code. Pitch. Listen and iterate.
Don't build every idea you hear; instead, try to find the common, deep pain that represents a tractable MVP. That means listening intently and pitching a lot, while keeping your own view of what you should be building. That view should evolve and change as you talk to more people.
Previous F2F Q&A Articles:
Case Study: Vizy CEO Amos Gewirtz Figured Out How To Increase User Engagement Of Their Product
Case Study: Doppler (backed by Sequoia) CEO Brian Vallelunga Found Growth By Killing A Product
Case Study: Segment CEO Peter Reinhardt On How His Startup Achieved Product-Market Fit
Case Study: Segment President Ilya Volodarsky On How To Effectively Use Data Analytics In A Startup
Case Study: Behind The Fundraising And Founder Success with Instabug's Omar Gabr
Case Study: Paragon CTO Ishmael Samuel Reflects On How He Chose His Cofounder
Latest Forbes Articles:
It’s Eze To Trade Smartphones Thanks To These Two Y Combinator Founders
In Our Brave New Socially-Distanced World, Popl Keeps Us Connected With Contactless Sharing
Viv For Your V Provides Sustainable Period Care Products For Women
If you enjoyed this article, feel free to check out my other work on LinkedIn and my personal website, frederickdaso.com. Follow me on Twitter @fredsoda, on Medium @fredsoda, and on Instagram @fred_soda.