January 31, 2024
Portfolio

Domino Data's product-market fit journey

Wei Lien Dang
Editor's note: 

SFG 39: Domino Data's Chris Yang on helping data scientists run AI

Domino Data Lab was founded in 2013 and was one of the early innovators in the space known as MLOps. It delivers a unified platform for enterprises to build, deploy, and manage machine learning workloads and provides access to data, tools, compute, and models across any environment. Domino enables ML and data teams to collaborate, ensure governance, and reduce costs at scale.

In this episode, Wei Lien Dang chats with Chris Yang, co-founder and CTO of Domino Data.

Be sure to check out more Startup Field Guide Podcast episodes on Spotify, Apple, and YouTube. Hosted by Unusual Ventures General Partner Sandhya Hegde (former EVP at Amplitude), the SFG podcast uncovers how the top unicorn founders of today really found product-market fit.

Summary and key takeaways

  • The Domino founding team's initial idea was for a recruiting platform, but they pivoted when user interviews revealed an opportunity in using the cloud for data science. This is why founder outreach matters: targeting the right market problem is table stakes for success, and you can't skip this stage no matter how excited you are about your ideas.
  • The initial MVP was a product that helped users push compute workloads to the cloud and receive results back. Domino also saw early that data science would “become a team sport like software engineering.” They prioritized supporting collaboration.
  • The company followed a "hill climb" strategy, casting a wide net to see who was interested in their product and then steering towards those customers. Chris and his co-founders pivoted to enterprise customers when they found that the market pull for their solution was stronger there.
  • Chris sees MLOps as "never being done": as models become more sophisticated, tooling has to advance in step with them.

Episode transcript

Wei Lien Dang

Today we're going to share the story of Domino Data Lab. Domino Data Lab was founded in 2013 and was one of the early innovators in the space known as MLOps. It delivers a unified platform for enterprises to build, deploy, and manage machine learning workloads and provides access to data, tools, compute, and models across any environment.

Domino enables ML and data teams to collaborate, ensure governance, and reduce costs at scale. Today, I'm super excited to welcome Chris Yang, CTO and co-founder of Domino Data. Before founding Domino, Chris worked at Bridgewater, one of the world's largest hedge funds. And he helped deliver their next-generation investment platform.

He holds both a bachelor's in computer science and a master's from MIT and published a bunch of papers on both computer vision and distributed databases while he was there. Welcome to the Field Guide, Chris.

So I'm super excited to dive into your founder journey with Domino Data Lab. Maybe to start, take yourself back to 2013. Gosh, 10 years, and so much has changed. But if you can look back, I'd love to start by asking: what was the founding insight that prompted you and your co-founders to start Domino? What was the origin story behind the company?

Chris Yang

Yeah, it's a great question. And I know that the audience for this talk is going to be other founders, so I did want to linger on this for a moment. I think there's a version of the story that we tell about Domino which is the movie version. My co-founders were both at Bridgewater as well. And the movie version is that the three of us, doing this sort of data analysis type work at Bridgewater, came to some insight about how the broader world could do collaborative data science, and that was the origin of the company. And that is absolutely not how this happened.

What actually happened was my two co-founders had already decided they wanted to start something; at the time, they had an idea for a recruiting platform. And I ended up hearing about it and then barged my way into this relationship that the two of them had and said, do you need someone to code the thing? We tried a bunch of different tacks around this recruiting platform, but there was actually this year, if not 18 months, where we were working on that and a couple of other, very unrelated, ideas. And those other threads weren't going well. And I remember this vividly.

We were sitting in my living room in my apartment in San Francisco. And the three of us were like, this isn't working. This recruiting platform thing is not working. And we went back to basics. We were like, what do we actually know? And what's a better way for us to go figure out what can be built here? That part, that 18-month wandering in the desert, is something we cut out of the story. And I just wanted to tell the other founders, maybe people who are thinking about founding a company, who worry that, oh, if I don't have this really glossy movie version of a story, I don't have a chance.

And I just want to assure you that, absolutely, so much rewriting of history happens when we tell these stories. And so anyways, what we ended up doing was we made a short list of stuff we felt like we actually had deep insight into, not necessarily that we had some disruptive idea. We made a list of, okay, what do we actually know a lot about? Because at the time, we were coming into recruiting as people who weren't recruiters; we didn't have a people or recruiting background. From there we realized, actually what we know a lot about is high-end, complex, collaborative data science. And from there, we bootstrapped from our network to do 30 to 60 to like a hundred basically cold interviews of people who were roughly doing data stuff at the time.

And it really just was like, hey, what are your biggest challenges? And after doing a whole bunch of those back-to-back, patterns started to emerge. Even that telling of the story is a little bit glossified, but it's much, much closer to the truth. And what came out of that was essentially a set of challenges around scaling, scaling the magnitude of the analysis you were doing, combined with a technical gap, a skills gap around how to use the cloud, among what we called data scientists at the time. I guess we still call them that, but that term itself has since bifurcated into a whole bunch of subspecialties.

And so we looked at that and said, oh, we can build a thing that helps somebody push this compute workload up to the cloud, where it can scale infinitely, and then send the results back. That was the very, very first use case and pain point, and also the first MVP, the very first tiny nugget of what then expanded over time to become the full platform.

Wei Lien Dang

First, I love that you bring up this topic of revisionist history. For most successful startups, it's a winding path at best. But one thing that really stands out from what you highlighted in the early days is that you went back to this problem that you and your co-founders had a lot of authenticity for, that you understood in a way that maybe a lot of other folks hadn't yet experienced, or they didn't necessarily have a strong insight like you guys did into how to tackle a particular problem space. But then you went and did all this cold outreach to test your hypotheses.

I would say at Unusual, we're big proponents of doing that. And I'm curious, if you think back, whether there were any early learnings that came out of that process, and how you thought about balancing, or how founders can balance: hey, I have authenticity, and I understand this space probably deeper than most, but then there's all this feedback from the market that you go try and extract. I'm curious how that factored into the early thinking for you all.

Chris Yang

Yeah, it's a really great question. And now I'm fortunate enough to be in a position where I can work with other, younger founders, people who are just starting out on this journey. And even thinking back to where I was in 2014, the bias always appears to be that we're far too quick to build something. I think I still have my engineer card, although I haven't written a lot of code recently, but I'm an engineer, a person who was trained and brought up to believe that the way I add value to the team I'm on or the company I'm working at is to write code, to build stuff. And a bunch of the other founders that I advise also came out of that tradition, and I suspect a lot of the folks in the Unusual orbit are probably also technical in that way. When you only have one hammer called write code, everything starts to look like a write-code nail. And my reflection is that I was far too quick to solve a problem rather than understand more deeply what was actually challenging for the user or customer and go deeper there.

And I see that again with a bunch of the companies I'm working with: they push in the direction of, I have a really clear vision for what the product is gonna be. And I'm like, hey, that's great, but you need to temper that, or test that, with: what problems do people actually have? Is this problem even worth solving? And at least in my experience, the thing you thought you were building has almost nothing to do with what you end up needing to build to actually solve the problem. And fortunately, me and the first folks, we were very fast at building.

And it also happened that the set of problems we ended up tackling had, in retrospect, fairly straightforward solutions, at least from a technology perspective. And so it could have been much worse, but we were also fast on the draw, which I think helped it not spiral out of control.

But, you know, certainly the way I coach people now is like 10 to 1: you should be having 10 conversations for every feature you build, or whatever the unit of coding is. Because if you don't really have clarity about the problem you're trying to solve, and more importantly who you're trying to solve it for, you could build all day, and at least in my experience, most stuff you build ends up not really getting used, right?

And so building stuff and seeing if it sticks is basically the most expensive way to do market research. So cut out the middle, save everybody some trouble: go talk to more people. It's always go talk to more people. I've never heard anyone say that talking to too many people was their problem.

Wei Lien Dang

Cool! And if you go back to that first year and a half of wandering in the wilderness, I think one of the things that founders need is grit, and it's hard to gauge how much grit you have until you're in the thick of it. But clearly, you and your co-founders stuck through it, you adapted, you wanted to continue to find something that worked. And I'm just curious: is there something in how you guys worked together, or that maybe you knew from having pre-existing relationships, that helped with that? Like, what gave you the resilience to keep going and push through to find something that started to work?

Chris Yang

One thing is, both because of my personality and the culture we came out of at Bridgewater, we were always very open with each other and, in particular, could talk openly about what was or wasn't working. And so I mentioned this crisis meeting where we were in my living room, and we had this big whiteboard up, and that wasn't a fun conversation, obviously.

But it also didn't have these second-order complexities of people being mad or upset at each other or living in denial. We were having a relatively straightforward, again, not fun, but relatively straightforward conversation called: hey, I don't think it's working. Here's why I don't think it's working. Here's what I think we can do. There was debate and discussion around that stuff, but we didn't have any added drama of, it's not working because you're not doing X, or, I actually think it's working and we just need to push through. It was a very open and transparent and logical conversation.

And I think my advice to folks is everyone's polite on the first date. And so if there's anybody out here who's like thinking about picking a co-founder or how to find a co-founder, it's tough because a lot of what you need to figure out is not can we get along now? It's when stuff really hits the fan, are you gonna have a reasonable partner that you can have a productive sort of brainstorm with?

So I think in that regard we were very lucky. The second thing is, it's funny, I wouldn't say any of the three of us are particularly sunshine-and-roses optimistic in that sense. And the three of us are actually very different from a personality perspective in a lot of ways. At least for me, what kept me going was not being alone. Just having co-founders who were in the boat rowing with me was really important. I don't understand how solo founders exist, to be honest. They must be made of sterner stuff than me; I cannot imagine going it alone. And the other thing, and if I'm a broken record about anything, it's going to be this point: no matter how much or how little traffic we had to the website, I always got real joy, we always got real joy, out of actually solving an actual customer's actual problem.

But you only really get that dopamine hit if you're seeing them, observing them, watching them. Both in terms of actually understanding what their challenge is, so hopefully you go home and build a better feature or a better improvement, but also, when you're there, you'll see them use it and see their eyes light up and see them do something that they couldn't do before.

And so it was this very tiny dopamine loop. And I suspect every engineer probably has this part of their brain; I don't think you become an engineer unless you like solving problems. And so that little dopamine hit of, I solved the problem and a person's happy about the work that I did, was really the manna for me, at least psychologically, for many years, if not forever and ever. And so, especially in the early days, when you have very few users, if you're not out there engaging with them, you're going to miss out on opportunities, on insight into the problems, but also on the joy of seeing the solution you built work for them.

We had a really early customer; Tesla was our big, very early customer. And thinking back, in retrospect, how wild this was: I actually spent three days a week there, easily three or four hours each time. I would just sit at the desk with the team for a big chunk of the week and absorb what they were doing, and when they ran into problems, I was just there. They'd be like, Chris, why is this thing broken? And I was like, yeah, sorry. But you keep your ears open for opportunities, challenges, adjacencies. That's very extreme, and on balance it was obviously a very low-leverage activity in terms of my time. But if you're not out there really getting ensconced in what your users are actually going through, you should really prioritize that. It's really important.

Wei Lien Dang

So Chris, you all had this insight around collaborative data science and data analysis. You had these early customers, like Tesla. Where did you start in terms of what you built first? Once you felt like you had enough validation, enough confidence to actually start putting hands on keyboard, what was the first feature or product you set out to build? What were you trying to do?

Chris Yang

Yeah. So the very first tiny kernel was: you had an R script, or you had a Python script, and you're trying to punch through a whole bunch of data. It was bigger or slower, or made your machine run hotter or longer, than you would tolerate. Our users at the time, and even still now, were data scientists, not software engineers, certainly not cloud infrastructure engineers. So there was a sense of: I know there's a thing called the cloud where there's potentially infinite compute, but I absolutely do not know how to parse the AWS docs to wrap up the libraries, ship the code over, push the data up, and get the results back.

And that was all theoretically possible to them, but in practice, it was never going to be worth it for them to go learn the AWS stack to figure out how to do that. So that was the very first kernel. And I remember being pretty proud of this: from first check-in to the first version that we got a person to use was like four weeks. And obviously it didn't do a lot, and it was pretty broken in a lot of ways.
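
Editor's note: the transcript doesn't show how Domino actually implemented this, but the shape of that first MVP loop, stage the code and data on a big remote machine, run it there, pull the results back, can be sketched in a few lines of Python. Everything below (the helper name, the host, the file layout) is a hypothetical illustration, not Domino's API:

```python
import subprocess
from pathlib import Path

def run_in_cloud(script: Path, data: Path, host: str, workdir: str = "/tmp/job") -> Path:
    """Ship an analysis script and its data to a remote machine, run it there,
    and copy the results back: the whole early-MVP loop in one function."""
    # Stage the code and data on the remote machine
    subprocess.run(["ssh", host, f"mkdir -p {workdir}"], check=True)
    subprocess.run(["scp", str(script), str(data), f"{host}:{workdir}/"], check=True)

    # Pick an interpreter based on the script type (R or Python)
    runner = "Rscript" if script.suffix == ".R" else "python3"

    # Run remotely, capturing stdout into a results file
    subprocess.run(
        ["ssh", host, f"cd {workdir} && {runner} {script.name} > results.txt"],
        check=True,
    )

    # Pull the results back down
    results = Path("results.txt")
    subprocess.run(["scp", f"{host}:{workdir}/results.txt", str(results)], check=True)
    return results

# Usage, with a made-up host:
# run_in_cloud(Path("model.R"), Path("trades.csv"), "user@big-box.example.com")
```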

But then from there, we had this use case from the interviews, and we had a set of people we could go back to and be like, hey, I built a thing. Do you want to use it? And of course, 90 percent of them were like, who are you, go away. But the few that did respond and react gave you the tiny little kernel around which to nucleate more usage, leading to more feedback, leading to more adjacent use cases, leading to more depth. And so that was the fundamental feedback loop in the early days. And now that I'm armed with a whole bunch more product management knowledge and theory, I can see that back then, the next best alternative for a lot of these folks was very old-school, high-performance computing frameworks, like Star, for instance. But this was all mainframe stuff; it came up in like the eighties. It was a very high barrier to entry: you still had to write bash scripts to coordinate the distributed compute load. So it certainly was not premised in the cloud mindset, which back then was still relatively early, at least for these sorts of data use cases. But yeah, that was the main thing.

Then the next use case that became a pretty easy tee-up was: if we're already copying all this code and data up to the cloud, we'll just go ahead and save off a copy of it for you, so that you can reproduce your results. That ended up being a big theme across all the interviews: you run a bunch of these analyses, and they take long enough that you really want to run a bunch of them in parallel, and you want to run enough variations, that by the time they're done, you don't quite remember which thing was doing what. And so again, it was an insight that was only possible after we built the first thing. We were already copying all this stuff, so why don't we just save a copy of it off and then show you after the fact?

And that ended up being the kernel of a much bigger theme that's been a part of Domino even to this day, which is that part of the way we provide value to a team is by silently and seamlessly helping you keep track of what you're doing, so you don't have this metacognitive workload of, whoa, wait, I need to write down all the things that are going on. We help you stay organized. And the user experience principle we established at the time, which we still try to adhere to today, is: can we make it so the user doesn't have to go out of their way to get this value? So it's, we're going to help you farm this compute out, and let's give you reproducibility for free; you don't have to opt in or particularly think about it. So that was a major use case.
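
Editor's note: the "reproducibility for free" idea, where every run silently archives its own inputs so the user never has to opt in to anything, can be illustrated with a short sketch. The names here (snapshot_run, a local runs/ directory) are assumptions for illustration, not Domino's implementation:

```python
import hashlib
import json
import shutil
import time
from pathlib import Path

def snapshot_run(script: Path, data: Path, runs_dir: Path = Path("runs")) -> Path:
    """Before a run starts, archive an immutable copy of its code and data,
    plus a manifest of content hashes, so any result can later be traced
    back to exactly the inputs that produced it."""
    run_id = time.strftime("%Y%m%dT%H%M%S")
    run_dir = runs_dir / run_id
    run_dir.mkdir(parents=True)

    manifest = {"run_id": run_id, "inputs": {}}
    for f in (script, data):
        shutil.copy2(f, run_dir / f.name)  # keep the bytes exactly as they were
        manifest["inputs"][f.name] = hashlib.sha256(f.read_bytes()).hexdigest()

    (run_dir / "manifest.json").write_text(json.dumps(manifest, indent=2))
    return run_dir

# Every job submission would call snapshot_run(...) first, so the user gets
# a dated, hash-verified record of each run without thinking about it.
```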

Wei Lien Dang

So something like that second use case, around making a copy and reproducing the results: was that something that was on your radar, or could it only have been born out of customer feedback and usage? I'm curious about your early aha moments, things you didn't necessarily anticipate, things that surprised you and informed how the product took shape and evolved, that you didn't necessarily think of a priori.

Chris Yang

Yeah, we had very little a priori in terms of specific functionality that we thought we would build. To the extent we had a product hypothesis, it was much vaguer, which was: there are going to be a bunch more data scientists soon, the future being back when we said this in 2014. It's going to end up being a team sport in the way that software engineering has been a team sport for many years. But back then, the pattern that we saw was: out of this whole company, there are two data scientists and they don't know about each other, so they're off in their own world. But we knew from our experience at Bridgewater, where we didn't call them data scientists, but they were basically data scientists.

It's: oh, if you have 25 or 50 or a hundred or 300 of them, you're going to start to run into challenges of how they work together. And so we knew that was going to be a thing, and there were going to have to be a whole bunch of capabilities to help support that. And that's something you put in a slide, something you trot out when you're introducing yourself to people and explaining why they should take you seriously. And that ran in parallel with a much smaller loop called: you still have to hook people on day one to try the thing that you're building for them.

And that was a much more micro thing called: hey, what are your challenges? Oh, you have a hard time pushing big compute into the cloud. Okay, what if I built that for you? Would that be helpful? And these have started to converge over time. But I don't know that we had many specifics, like, oh, we're going to add this, then we're going to add a cloud environment for notebooks, then we're going to add hosting. It was much, much more amorphous than that. It was much more driven by customer feedback; we tried to be really responsive to just what customers needed.

Wei Lien Dang

That's interesting, Chris, because I think a lot of strong technologists like yourselves are opinionated and have certain conceptions or notions of, hey, I'm going to build this, everyone needs this. But I think the fact that you were very oriented around the customer really helped you find the way, especially early on.

Chris Yang

Yeah. And I suspect there are lots of ways to be successful in this game, so I don't mean to suggest that somebody who has a really clear vision for exactly where all the buttons are going to go over the next five years can't succeed. I think there are lots of companies where, out of the gate, for a bunch of reasons, the founding builders know what they want. But my sense is even those people built up that customer intuition, probably just on someone else's dime, when they were working at a really big tech company, or maybe as consultants or something. We did it on our own dime: just the three of us in our own apartments trying to figure this out.

Yeah, I think, one way or another, if you are to survive, you need to be right about what the customer needs. And one way to do that is to just ask them a lot. Another way is to guess or roll the dice. And another way is to have built up that intuition or expertise over many years.

I will also say it's not lost on me that perhaps part of my beginner's mind here was also informed by the 18 months we had spent previously building this largely unsuccessful recruiting product, where maybe we had a stronger sense of what we wanted to build. But certainly looking back at it now, the lesson I take away from that wandering period is: at the end of the day, if the customer doesn't need it, it doesn't matter what you build. It could be the world's best hammer, but if they don't have any nails to hit, it doesn't matter.

And so that probably encoded in me a pretty deep skepticism of: do I really understand what a user needs? Whereas maybe before that, I was much more build, build, build.

Wei Lien Dang

Domino's been very successful with the enterprise segment. Is that where you started, or was that a shift over time? I'm just curious how you thought about the early ICP as you were finding your way to product-market fit, and whether that changed over time or not.

Chris Yang

Again, I think there's the glossy movie version of this story, which is: we saw the future of collaborative data science and machine learning because of our past experience, and from first-principles logic about the growth and value this would create in the market, we knew we needed to go target big enterprises, and so on. That is a level of sophistication in thinking about markets that I only have now, after having done this for 10 or 11 years, and we were absolutely not anywhere close to it. To the extent I maybe have a master's or college level of understanding of product management now, we were probably bumping around in kindergarten land back then.

And this is actually a very interesting topic to me because, again, I mentioned I'm advising other companies, and they're across a whole range of familiarity with product management as a discipline. It's the number one thing I impart to technical founders, particularly if they haven't been a product manager themselves or haven't been at a company with a strong product management discipline, which is basically: hey, there's this whole discipline, called product management, that maybe you don't realize exists, of thinking about market opportunities and products that can be monetized to solve those market opportunities. And I emphasize that because I didn't know it was a thing. It's like not knowing physics existed: you're just like, whoa, things drop, that's weird. And then one day you realize, oh, wait a minute, lots of very smart people have developed a very deep understanding of this very complicated topic.

And so at that time, our strategy was hill climb. It wasn't much more sophisticated than that. It was: cast the big net. Who comes in the door, slash, when we go out and try to penetrate the market, who reacts well to what we're saying?

And then, as people react to us and need us, or need what we're promising, steer towards what they're asking for. So in a lot of ways the very early days were a simple loop called: if someone wants something, and we think maybe they'll pay us money, slash they've already paid us money, then let's just build the thing for them.

And that's hard, because the universe of things you could build is potentially infinite. In the initial phase, when there are zero people using it, it's real hard to decide what to build; the algorithm doesn't work there. Then there was an early phase where we had enough people that there was meaningful feedback, but not so many people asking us for stuff that it was overwhelming. And that felt good, that felt very productive. Every day it was just like, cool, we got the list, let's rock. Then there was this early adolescence phase where you started to really feel the stretch, where you're like, oh, not only is there too much to do, it's getting worse: every day, the list of stuff we're not doing is increasing exponentially. And the hill climb here then prioritized revenue.

And so what ended up happening is enterprises ended up paying us more than other people, and so we started to steer there. As you start to steer there, you're implicitly making a bunch of decisions about not serving a whole bunch of the other people who are asking you for stuff, by the very naive decision criterion of how much money they're going to pay you.

A lot of that middle phase of adolescence was us half debating whether we had to make a decision or choose, and the other half just being stressed out about it, not totally sure how to move forward, because both paths feel impossible, right?

It's: are you really going to turn down this million-dollar check from one of the biggest insurance companies in the world? And on the other hand: I don't know, we've got 20 of these small guys using our cloud offering. Are we really going to turn that off? You really could see how that could turn into a very successful thing in its own right.

And looking back, that probably should have been an explicit, thoughtful, analyzed decision about our product-market fit: where do we want to focus? Where do we think we can win? Where do we have a right to win? But if you don't have the expertise or the vocabulary or the theory of product management, it's not at all obvious that you even need to choose.

And I'll bring this up here because I know a lot of the Unusual portfolio has a similar property to Domino, which is: we are a horizontal technology. In principle, anybody building machine learning could use Domino. And so I think we spent a lot of the early years with some angst, maybe, where people would be like, you guys need to focus. And we'd say, why do we need to? What are we going to focus on? Everybody needs the same thing; it doesn't matter what industry they're in. I see this a lot in some of the companies that I'm advising, so maybe I'll just say it here, and then maybe I don't have to deliver a very long lecture about it later on.

Figuring out who you're trying to serve is really important, no matter what, even if you're a horizontal tool that, in principle, anybody could use. Because if you try to serve everybody, you'll just end up serving nobody, for hopefully obvious reasons: different groups of potential users have subtly, if not dramatically, different needs. What you're building doesn't exist in a vacuum. Whether you have named VC-backed competitors or not, there's always an alternative for a user. And so if your product isn't constantly demonstrating its marginal betterness over whatever the next best alternative is, then you're going to lose, maybe not tomorrow, but over time. And being clear about what alternative you're trying to be better than is really important, because you can't be better than all the alternatives, at least in the realm of horizontal infrastructure tooling. If you invented cold fusion, it's probably a totally different game.

But at least in this world, for any particular thing you're thinking of, whether your company is doing it or thinking about doing it, there are at least two or three likely credible alternatives. And so for someone to choose you, you have to give them a reason to want to choose you.

And that's, again, independent of which targeting criteria you pick. I think it's quite sensible, if you aren't an expert at this, to focus on industry and size of company, for a whole bunch of reasons. Again, this is in the B2B space. But it's an art, and you will learn over time that maybe those aren't the appropriate ways to segment. The important thing to emphasize is that you have to pick some things.

You have to have some theory for the group of people that you're going to serve better than any alternative they have. If you're not clear about that, then the market will decide for you, basically.

Wei Lien Dang

I think there are a lot of valuable nuggets in there, Chris. One is that, in terms of finding product-market fit, a lot of founders focus on the what, when we would actually say the who is equally, or in some respects more, important. The other is this notion, as you progress from the toddler to the adolescent stage you were referencing: you have some people who are using the product, who are engaged, but are they, as some people like to say, the right people?

It's more: are they representative of a large enough market that you can build a successful business by serving them? And I think making some deliberate choices with a framework matters. Dollars is one dimension, one input, but having a framework for how you prioritize and how you make these changes is important, because it has implications for the company beyond just the product; your org chart will look different if you serve the enterprise versus SMB.

So I think across the board, the theme is: who do you serve, and what kind of company or business do you want to be when you grow up? But it's interesting to hear you talk through how that was playing out in real time as you look back. Maybe just on that: you brought up this notion of market perspective, and you guys were early thought leaders, innovators in the space that eventually became known as MLOps.

And with all the focus and hype on gen AI, I'm curious to hear how you think that category has evolved over time. Do you view the current wave of AI more as an extension of that, or do you view it as a fundamental break that makes it different from the era that Domino grew up in? What are your views on how MLOps has played out and how it relates to the current AI wave?

Chris Yang

Yeah, it's an interesting question. Over the lifespan of Domino, we've been a part of, I would say, really two discrete phases, and now we're entering a third phase: the rise of generative techniques. The first phase we were in was something like, obviously, companies have been doing a lot with data for a long time.

But the first wave we really rode, or really the confluence of two waves, was the rise of cloud computing, in terms of ease of access to massive amounts of compute that wasn't really possible before, closely tied with the collection of massive amounts of data that before wasn't possible to do.

Add those two together, and a whole bunch of stuff becomes possible, like a whole bunch of models getting trained, that's only possible once you have the confluence of those two things: enough data to train on and enough compute to make it practical.

And the implication of that was a whole bunch of companies who before maybe had one random PhD floating around realized, oh, actually we could have a whole gang of these people, and we could create a tremendous amount of value. Maybe call that the awakening, and obviously different pockets of different industries and segments were further or less far along.

That was an awakening. Then there's maybe what I'd call the MLOps phase. I remember the meeting where we were like, all right, this seems like a good category; let's see if we can slot into it and shape it, but also ride some of the momentum around this phrase.

And that was a little bit reactionary, because we talked to all these CIOs and heads of data science, and there was a phase where they were like: we thought we would be done after we hired all these PhDs, but if you actually look at what they've delivered, only one out of 10 models they've built actually makes it into production.

And you're looking at this going, okay, that seems like a problem. What are the other nine doing, then? And so there was this reaction of, oh, it wasn't enough just to hire the people and get them to work with each other. Just building the model was insufficient.

You actually have to operationalize it. You have to deploy it. And then, importantly, once it's deployed, actually wire it into all the things it needs to get wired into to drive actual impact and results within the enterprise. And I don't think I was alive for this, or I was probably a child, but almost certainly an analogous thing must have happened with developing applications internally.

Companies hired a bunch of software nerds, and they were building stuff, and at some point they all realized that the applications they were building were cool but not actually driving any value, because nobody was using them. And then at some point someone must have been like, okay, we've got to figure out how to get people to use these things that we're building.

And I think to some extent that is never going to stop, in the same way that we haven't solved DevOps, right? We're still constantly figuring out new, increasingly complex ways to deploy software internally or externally. And then with the rise of citizen development, in the form of Retool and these drag-and-drop builders, we're constantly figuring out new ways to build, too.

I think in the same way, MLOps is never going to be done. We're going to build increasingly complex machine learning models, different form factors for different applications and different uses, and there's going to have to be an analogous increase in the power of the tooling. That's going to go on forever.

And then the gen AI thing, I think, exists in a slightly different, orthogonal dimension. Obviously there's really cool, really amazing stuff being done, all sorts of use cases that we didn't think you could apply ML to that clearly now you can. But to me, it runs somewhat orthogonal to the MLOps stuff: it's going to be one of a bunch of different types of machine learning.

It's going to be one of a bunch of different types of use cases where you apply machine learning. I think it's particularly sexy and hot because it does something that, if you were a layperson or maybe not paying that much attention to this stuff ahead of time, you'd swear a computer would never do.

And then to have it talk to you, and be more than convincing as a human, is really surprising; it's very unintuitive. But I don't think it's particularly special in that sense. In addition to helping humans write movie scripts and generate art, we're also still going to have to predict time series, we're still going to have to fit logistic regressions to find correlations of who best to market to, and we're still going to have to build decision trees.

And it's really great at some use cases and literally doesn't make sense for some others. The substrate of the infrastructure and the MLOps layer, I think, just exists at a slightly different level. One interesting thing I've been noodling on is: to the extent there's a lot of value in this generative stuff, what is going to be the preferred architecture of deployment?

I think it was JP Morgan, but one of these big banks said OpenAI is going to power all of their chatbots. I'm like, that's interesting. Is that going to be the de facto standard? Are these LLMs so complicated to build and tune and test that most people in most places are going to opt to outsource that expertise?

Whether it's OpenAI now, or, as the big CSPs figure out their acts around this, whether they're going to get into the game. The last Google conference announced a bunch of data integration capabilities to reach into your enterprise's complex, unstructured data sources, obviously with an eye towards building that integration layer so that you can enable all this training on top of it.

Is that where this is going to go? Or are people going to figure out that it's actually better to keep their data and tune models for the things they need? Some combination of the tooling around tuning, and the expertise that's inevitably going to get built up by the people doing this stuff every day, might become so common that we end up insourcing it, and it'll be like any other internal application you build. I don't quite know how that's going to shake out. As these skills become more and more commodified, I can imagine that balance of power shifting back into organizations: the benefits of control and customization, a rising tide of general capability out in the labor market, the tooling that must get built, plus diminishing returns on quality. Maybe it shifts that way. Of course, there will always be a space for well-funded organizations to push the edges of the math and the scientific research; that will always exist. But how it gets commercialized and brought to the mass market,

I don't quite know, but it's very interesting. I legitimately can't quite see how it's going to shake out, but obviously it has big implications for people who want to build tooling at that layer. Are you going to have enough market to build for? Or will the CSPs, or the people who already control the data, win it? Moving data around is really expensive, so some natural advantage accrues to the people who already host the data when it comes to putting machine learning on top.

And how do you get the timing right? Are people going to want to do that within the timeframe of your company having the runway to survive? Yeah, really interesting. We'll see how it shakes out.

Wei Lien Dang

Well, Chris, it was fantastic to hear about your founder journey and these snippets that I think, most founders, trying to figure out product market fit can relate to. So thank you again for your time and for joining us on our show today.

All posts

Lorem ipsum dolor sit amet, consectetur adipiscing elit. Suspendisse varius enim in eros elementum tristique. Duis cursus, mi quis viverra ornare, eros dolor interdum nulla, ut commodo diam libero vitae erat. Aenean faucibus nibh et justo cursus id rutrum lorem imperdiet. Nunc ut sem vitae risus tristique posuere.

All posts
January 31, 2024
Portfolio
Unusual

Domino Data's product-market fit journey

Wei Lien Dang
No items found.
Domino Data's product-market fit journeyDomino Data's product-market fit journey
Editor's note: 

SFG 39: Domino Data's Chris Yang on helping data scientists run AI

Domino Data Lab was founded in 2013 and was one of the early innovators in the space known as MLOps. It delivers a unified platform for enterprises to build, deploy, and manage machine learning workloads and provides access to data, tools, compute, models across any environment. Domino enables ML and data teams to collaborate, ensure governance, and reduce costs at scale.

In this episode, Wei Lien Dang chats with Chris Yang, co-founder and CTO of Domino Data.

Be sure to check out more Startup Field Guide Podcast episodes on Spotify, Apple, and Youtube. Hosted by Unusual Ventures General Partner Sandhya Hegde (former EVP at Amplitude), the SFG podcast uncovers how the top unicorn founders of today really found product-market fit.

Summary and key takeaways

  • The Domino founding team's initial idea was for a recruiting platform, but they pivoted when user interviews revealed an opportunity in using the cloud for data science. This is why founder outreach matters - targeting the right market problem is table stakes for success, you can't skip this stage no matter how excited you are about your ideas.
  • The initial MVP was a product that helped users push compute workloads to the cloud and receive results back. Domino also saw early that data science would “become a team sport like software engineering.” They prioritized supporting collaboration.
  • The company followed a "hill climb" strategy, casting a wide net to see who was interested in their product and then steering towards those customers. Chris and his co-founders pivoted to enterprise customers when they found that the market pull for their solution was stronger there.
  • Chris sees MLOps as ”never being done" - as models become more sophisticated. Tooling has to advance in step with models.

Episode transcript

Wei Lien Dang

Today we're going to share the story of Domino Data Lab. Domino Data Lab was founded in 2013 and was one of the early innovators in the space known as MLOps. It delivers a unified platform for enterprises to build, deploy, and manage machine learning workloads and provides access to data tools, compute, and models across any environment.

Domino enables ML and data teams to collaborate, ensure governance, and reduce costs at scale. Today, I'm super excited to welcome Chris Yang, CTO and co-founder of Domino Data. Before founding Domino, Chris worked at Bridgewater, one of the world's largest hedge funds. And he helped deliver their next-generation investment platform.

He holds both a bachelor's in computer science and a master's from MIT and published a bunch of papers on both computer vision and distributed databases while he was there. Welcome to the Field Guide, Chris.

So I'm super excited to dive into your founder journey with Domino Data Lab and, maybe to start, if you take yourself back to 2013, gosh, 10 years and so much has changed, but if you can look back and, I love to start by asking what was the founding insight that prompted you and your co-founders to start Domino? What was your origin story behind the company? 

Chris Yang

Yeah, it's a great question. And I know that the audience for this talk is going to be other founders. So I did want to linger on this for a moment. I think there's a version of the story that we tell about Domino which is the movie version. My co-founders were also both at Bridgewater as well. And I think there's the movie version, which is like the three of us in doing this sort of like data analysis type work at Bridgewater came to some insight about how the broader world could do collaborative data science. And then that was the origin of the company. And that is absolutely not how this happened.

What actually happened was my two co-founders had already decided they wanted to, at the time, they had an idea for a recruiting platform. And I ended up hearing about it and then barged my way into this relationship that the two of them had and said do you need someone to code the thing? We tried a bunch of different tacks around this recruiting platform, but there was actually this like year, if not 18 months where we were working on that and a couple of others like very unrelated ideas. And it was actually that those other threads weren't going well. And I remember this vividly.

We were sitting in my living room in my apartment in San Francisco. And we were like, the three of us were like, this isn't working. Like this recruiting platform thing is not working. And we went back to basics. We were like, what do we actually know? And what's a better way for us to go figure out, like, what can be built here? And that part, that 18-month wandering in the desert is something like, we cut out of the story and I just wanted to tell maybe the other founders, maybe people who are thinking about founders, thinking about founding a company that like, Oh if I don't have this really glossy movie version of a story I don't have a chance.

And I just want to assure you that absolutely, so much rewriting of history happens when we tell these stories. And so anyways, what we ended up doing was we made a short list of stuff we felt like we actually had deep insight into, not necessarily that we had some disruptive idea. We made a list of okay, what do we actually know a lot about? Because at the time, we were coming into recruiting as people who weren't like recruiters. Like we didn't have a sort of like people or recruiting background. From there we realized, actually what we know a lot about is high-end complex collaborative data science. From there, then we bootstrapped from our network to basically do 30 to 60 to like a hundred basically cold interviews of people who were roughly doing data stuff at the time.

And it really just was like, hey, what are your biggest challenges? And after doing a whole bunch of those back-to-back, then the sort of patterns started to emerge. And even that telling of this story is a little bit glossified, but it's much, much closer to the truth. And so then what came out of that was essentially a set of challenges around scaling, scaling the magnitude of the analysis you were doing combined with… so at that time what we called data scientists then, I guess we still call them that, but that term itself has bifurcated into a whole bunch of subspecialties, sort of technical gap, like a skills gap around how to use the cloud.

And so we looked at that and said Oh, we can build a thing that helps somebody push this compute workload up to the cloud where it can scale infinitely and then send the results back. That was the very, very first use case and pain point. And then also the first MVP that was like the very first little tiny nugget of what then expanded over time to become the full platform.

Wei Lien Dang

First, I love that you bring up this topic of like revisionist history. Most successful startups, it's a windy path at best and, but one thing that really stands out from that you highlighted in the early days was went back to this problem that you and your co-founders had a lot of authenticity for, like you understood in a way that maybe a lot of other folks hadn't yet experienced, or they didn't necessarily have a strong insight like you guys did in terms of how to tackle a particular problem space, but then you went and did all this cold outreach to test your hypotheses.

I would say at Unusual, we're big proponents of, of doing that. And I'm curious, if there were any, if you think back, if there were any early learnings that came out of that process and how you thought about balancing or how founders can balance Hey, I have an authenticity and I understand this space probably deeper than most, but then, there's all this feedback from the market that you go try and extract. I'm curious, how that factored into the early thinking for you all

Chris Yang

Yeah. It's a really great question and now I'm fortunate enough to be in a position where I can work with other younger founders or people who are just starting out on this journey. And even thinking back to where I was back in 2014, the bias always appears to be, we're far too quick to build something and I think hopefully I still have my engineer card, although I haven't written a lot of code recently but I think as an engineer, as a person who was trained and brought up that the way I add value to the team I'm on or the company I'm working on is I write code, I build stuff. And a bunch of the other founders that I advise also came out of that tradition. And I suspect a lot of the folks that are in the Unusual orbit are probably also technical in that way. When you only have one hammer called write code, everything starts to look like a write code nail. And my reflection is that I was far too quick to solve a problem rather than understand more deeply what was actually challenging for the user or customer and to go deeper there.

And I see that again with a bunch of the companies I'm working with that they want to, they push in the direction of, I have a really clear vision for what the product is gonna be, And I'm like, Hey, that's great, but you need to temper that or test that with what problems do people actually have? Is this problem even worth solving? And at least in my experience, the thing you thought you were building has almost nothing to do with what you end up needing to build to like actually solve the problem. And fortunately, me and the first folks, we were very fast at building.

And it also happened that the set of problems we ended up tackling had in retrospect, like fairly straightforward solutions to them, at least from a technology perspective. And so it could have been much worse, but we were also like fast on the draw, which I think helped maybe it not spiral out of control.

But, you know, certainly the way I coach people now is like 10 to 1 like you should be having 10 conversations for every feature you had or something like that, whatever, whatever the unit of coding, because if you don't really have clarity about the problem you're trying to solve, and more importantly who you're trying to solve it for. You could build all day and at least in my experience, it's like most stuff you build ends up not really getting used, right?

And so it's like the most expensive way to do market research basically is to build stuff and then see if it sticks. And so cut out the middle, save everybody some trouble. Go talk to more people. It's always go talk to more people. I've never talked to anybody, and this was never a problem that we talked to too many people.

Wei Lien Dang

Cool! And I think if you go back to that sort of like first year and a half of wandering in the wilderness, I think one of the things that founders need is grit, and it's hard to gauge, how, how much grit you have until you're in the thick of it. But clearly, you and your co-founders stuck through it, you adapted, you wanted to continue to find something that worked. And I'm just curious is there something, in how you guys work together or you maybe you knew having pre-existing relationships that, that helped with that? Like, what gave you the resilience tokeep going and push through to find something that started to work?

Chris Yang

One thing is both my personality and also the culture we came out of at Bridgewater, we were always very open with each other and in particular, could talk openly about what was or wasn't working. And so I mentioned this crisis meeting where we were in my living room, and we had this big whiteboard up, and that wasn't a fun conversation, obviously.

But it also didn't have these like second-order complexities of, people are mad or upset at each other or living in denial. We were having a relatively straightforward, again, not fun, but a relatively straightforward conversation called hey, I don't think it's working. Here's why I don't think it's working. Here's what I think we can do. There's debate and discussion around that stuff, but we didn't have any added drama of it's not working because you're not doing X or I actually think it's working and we just need to push through. It's a very open and transparent and like logical conversation.

And I think my advice to folks is everyone's polite on the first date. And so if there's anybody out here who's like thinking about picking a co-founder or how to find a co-founder, it's tough because a lot of what you need to figure out is not can we get along now? It's when stuff really hits the fan, are you gonna have a reasonable partner that you can have a productive sort of brainstorm with?

So I think in that regard we were like very lucky. The second thing is it's funny, I wouldn't say any of the three of us are particularly sunshine and roses and optimistic in that sense. And three of us are actually very different, sure, from a personality perspective in a lot of ways. At least for me, what kept me going was not being alone. So just having co-founders who were like in the boat rowing with me, like really important. I don't understand how solo founders exist, to be honest. Like they must be made of sterner stuff than me. I cannot imagine going alone. And the second thing, and again, I think if I am a broken record about this, it's going to be on this point, which is no matter how much or how little traffic we had to the website I always got real joy, we always got real joy out of actually solving an actual customer's actual problem.

But you only really get that dopamine hit if you're like, seeing them, and observing them, and watching them. Both in terms of actually understanding what their challenge is so hopefully you go home and build a better feature or build a better improvement, but then also when you're there, they'll see them like use it and see their eyes light up and see them like do something that they couldn't do before.

And so it was this ‌very tiny dopamine loop. And I suspect every engineer probably has this part of their brain. I don't think you become an engineer unless you like solving problems. And so that little dopamine hit of I solved the problem and a person's happy about the work that I did was really the manna for me, at least psychologically, for many years, if not still actually forever and ever. And so if you're not out there engaging with, again, especially in the early days, you have like very few users, but if you're not out there engaging with them, you're going to miss out opportunities, on insight of the problems, but also the joy of seeing your solution that you built, like work for them.

We had a really early customer; Tesla was our big, very early customer. Thinking back, it's wild in retrospect, but I actually spent three days a week there, easily three or four hours at a time. I would just sit at a desk with the team for a big chunk of the week and absorb what they were doing. When they ran into problems, I was just there. They'd be like, Chris, why is this thing broken? And I was like, yeah, sorry. But you keep your ears open for opportunities, challenges, adjacencies. That's very extreme, and on balance it was obviously a very low-leverage activity in terms of my time. But if you're not out there really getting ensconced in what your users are actually going through, you should really prioritize that. It's really important.

Wei Lien Dang

So Chris, you all had this insight around collaborative data science and data analysis. You had these early customers like Tesla. Where did you start in terms of what you built first? Once you felt like you had enough validation, enough confidence to actually start putting hands on keyboard, what was the first feature or product you set out to build? What were you trying to do?

Chris Yang

Yeah. So the very first tiny kernel was: you had an R script or a Python script, and you were trying to crunch through a whole bunch of data. It was bigger or slower, or made your machine run hotter, than you would tolerate. Our users at the time, and even still now, were data scientists, not software engineers, and certainly not cloud infrastructure engineers. So there was a sense of: I know there's a thing called the cloud where there's potentially infinite compute, but I absolutely do not know how to parse the AWS docs to wrap up the libraries, ship the code over, push the data up, and get the results back.

And that was all theoretically possible for them, but in practice it was never going to be worth it for them to go learn the AWS stack to figure out how to do that. So that was the very first kernel. And I remember being pretty proud of this: from first check-in to the first version that we got a person to use was about four weeks. And obviously it didn't do a lot, and it was pretty broken in a lot of ways.
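To make that first kernel concrete: the workflow Chris describes amounts to bundling up a script and its data, running it on a bigger machine in the cloud, and pulling the results back. Here is a minimal sketch of that loop in Python, with hypothetical host and path names; it assumes a pre-provisioned VM reachable over SSH and is not Domino's actual implementation.

```python
import shutil
import subprocess
import tempfile
import uuid
from pathlib import Path

REMOTE = "user@big-compute-host"  # assumption: a pre-provisioned cloud VM
REMOTE_DIR = "/tmp/runs"          # assumption: a scratch directory on that VM

def run_in_cloud(script: str, data_dir: str) -> Path:
    """Copy code and data up, run the script remotely, pull results back."""
    run_id = uuid.uuid4().hex[:8]
    bundle = Path(tempfile.mkdtemp()) / f"run-{run_id}"
    shutil.copytree(data_dir, bundle)   # stage the input data
    shutil.copy(script, bundle)         # stage the script alongside it
    # Push the bundle to the big machine and execute it there.
    subprocess.run(["scp", "-qr", str(bundle), f"{REMOTE}:{REMOTE_DIR}/"], check=True)
    subprocess.run(
        ["ssh", REMOTE, f"cd {REMOTE_DIR}/{bundle.name} && python {Path(script).name}"],
        check=True,
    )
    # Pull everything back, outputs included, so the user never touches the cloud directly.
    results = Path(f"results-{run_id}")
    subprocess.run(
        ["scp", "-qr", f"{REMOTE}:{REMOTE_DIR}/{bundle.name}", str(results)], check=True
    )
    return results
```

The value proposition was exactly this shape: code and data go up, results come down, and the user never has to learn the AWS stack to close the loop.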

But then from there, we had this use case from the interviews, and we had a set of people we could go back to and say: hey, I built a thing, do you want to use it? And of course, 90 percent of them were like, who are you, go away. But the few that did respond and react gave you the tiny little kernel around which to nucleate more usage, leading to more feedback, leading to more adjacent use cases, leading to more depth. And so that was the fundamental feedback loop in the early days. Now that I'm armed with a whole bunch more product management knowledge and theory, I can see that back then, the next best alternative for a lot of these folks was very old-school, high-performance computing frameworks, like Star, for instance, but this was all mainframe stuff. It came up in the eighties and had a very high barrier to entry; you still had to write bash scripts to coordinate the distributed compute load. So it certainly wasn't premised in the cloud mindset, which back then was still relatively early, at least for these sorts of data use cases. But yeah, that was the main thing.

Then the next use case that became a pretty easy tee-up was: if we're already copying all this code and data up to the cloud, we'll just go ahead and save off a copy of it for you, so that you can reproduce your results. That ended up being a big theme across all the interviews: you run a bunch of these analyses, they take long enough that you really want to run them in parallel, and you want to run enough variations that by the time they're done, you don't quite remember which thing was doing what. And so again, it was an insight that was only possible after you built the first thing. We were already copying all this stuff, so why don't we just save a copy of it off and then show you after the fact?

That ended up being the kernel of a much bigger theme that's been a part of Domino even to this day, which is: part of the way we provide value to a team is by silently and seamlessly helping you keep track of what you're doing, so you don't have this metacognitive workload of, whoa, wait, I need to write down all the things that are going on. We help you stay organized. And the user experience principle we established at the time, which we still try to adhere to today, is: can we make it so the user doesn't have to go out of their way to get this value? We're going to help you farm this compute out; let's give you reproducibility for free. You don't have to opt in or particularly think about it. So that was a major use case.
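Here is a minimal sketch of that "reproducibility for free" idea, again with hypothetical names rather than Domino's actual design: since the code and inputs are already being copied for each run, snapshot them alongside the run's parameters, so you can later answer "which variation produced this result?" without the user opting into anything.

```python
import hashlib
import json
import shutil
import time
from pathlib import Path

SNAPSHOT_ROOT = Path("snapshots")  # assumption: stands in for durable cloud storage

def snapshot_run(code_dir: str, params: dict) -> str:
    """Freeze exactly what a run used: files, parameters, and a content hash."""
    run_id = time.strftime("%Y%m%d-%H%M%S")
    dest = SNAPSHOT_ROOT / run_id
    shutil.copytree(code_dir, dest / "code")  # immutable copy of code and data as run
    # Hash the snapshot so two runs can be compared or deduplicated later.
    digest = hashlib.sha256()
    for path in sorted((dest / "code").rglob("*")):
        if path.is_file():
            digest.update(path.read_bytes())
    manifest = {"run_id": run_id, "params": params, "sha256": digest.hexdigest()}
    (dest / "manifest.json").write_text(json.dumps(manifest, indent=2))
    return run_id
```

The design principle shows up in the call site: the user just runs their analysis, and the snapshot happens as a side effect of a copy that was happening anyway.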

Wei Lien Dang

So something like that second use case, around making a copy and reproducing results. Was that something that was on your radar, or was it something that could only have been born out of customer feedback and usage? I'm curious about your early aha moments: things you didn't necessarily anticipate, things that surprised you and informed how the product took shape and evolved, that you didn't necessarily think of a priori.

Chris Yang

Yeah, we had very little a priori in terms of specific functionality we thought we would build. To the extent we had a product hypothesis, it was much vaguer, which was: there are going to be a bunch more data scientists soon, "soon" being relative to when we said this back in 2014. It's going to end up being a team sport the way software engineering has been a team sport for many years. But back then, the pattern we saw was that out of a whole company, there were two data scientists and they didn't know about each other, so they were off in their own worlds. But we knew from our experience at Bridgewater, where we didn't call them data scientists but they were basically data scientists, what was coming.

It's: oh, if you have 25 or 50 or 100 or 300 of them, you're going to start to run into challenges of how they work together. So we knew that was going to be a thing, and there were going to have to be a whole bunch of capabilities to support that. That's something you put on a slide, something you trot out when you're introducing yourself to people to explain why they should take you seriously. And that ran in parallel with a much smaller loop called: you still have to hook people on day one to try the thing you're building for them.

And that was a much more micro thing called: hey, what are your challenges? Oh, you have a hard time pushing big compute into the cloud. Okay, what if I built that for you? Would that be helpful? These have started to converge over time. But I don't know that we had many specifics like, oh, we're going to add this, then we're going to add a cloud environment for notebooks, then we're going to add hosting. It was much, much more amorphous than that. It was driven by customer feedback: whatever they needed, we tried to be really responsive to.

Wei Lien Dang

That's interesting, Chris, because a lot of strong technologists like yourselves are opinionated and have certain conceptions or notions of, hey, I'm going to build this, everyone needs this. I think the fact that you were very oriented around the customer really helped you find your way, especially early on.

Chris Yang

Yeah. And I suspect there are lots of ways to be successful in this game, so I don't mean to knock somebody who has a really clear vision for exactly where all the buttons are going to go over the next five years. There are lots of companies where, out of the gate, for a bunch of reasons, the founding builders know what they want. But my sense is even those people built up that customer intuition, probably just on someone else's dime, when they were working at a really big tech company, or maybe as consultants or something. We did it on our own dime: just the three of us in our own apartments trying to figure this out.

Yeah, I think one way or the other, if you are to survive, you need to be right about what the customer needs. One way to do that is to just ask them a lot. Another way is to guess or roll the dice. And another way is to have built up that intuition or expertise over many years.

I will also say it's not lost on me that perhaps part of my beginner's mind here was informed by the 18 months we had spent previously building this largely unsuccessful recruiting product. Looking back at it now, the lesson I take away from that wandering period is: at the end of the day, if the customer doesn't need it, it doesn't matter what you build. It could be the world's best hammer, but if they don't have any nails to hit, it doesn't matter.

And so that probably encoded in me a pretty deep skepticism of, do I really understand what a user needs? Whereas maybe before that I was much more build, build, build.

Wei Lien Dang

Domino's been very successful with the enterprise segment. Is that where you started, or was that a shift over time? I'm just curious how you thought about the early ICP as you were finding your way to product-market fit, and whether that changed over time or not.

Chris Yang

Again, I think there's the glossy movie version of this story, which is: we saw the future of collaborative data science and machine learning because of our past experience, and from first-principles logic about the growth and value this would create in the market, we knew we needed to go target big enterprises, and so on. That is a level of sophistication in thinking about markets that I only have now, after having done this for 10 or 11 years, and we were absolutely not anywhere close to it. To the extent I maybe have a master's or college level of understanding of product management now, we were probably bumping around in kindergarten land back then.

And this is actually a very interesting topic to me, because, again, I mentioned I'm advising other companies, and they're across a whole range of familiarity with product management as a discipline. The number one thing I impart to technical founders, particularly if they haven't been a product manager themselves or haven't been at a company with a strong product management discipline, is basically: hey, there's this whole discipline, called product management, of thinking about market opportunities and products that can be monetized to solve those market opportunities. And I emphasize that because I didn't know it was a thing. It's like not knowing physics existed: you're just like, whoa, things drop, that's weird. And then one day realizing, oh, wait a minute, lots of very smart people have developed a very deep understanding of this very complicated topic.

And so at that time, our strategy was hill climb. It wasn't much more sophisticated than that. It was: cast a big net. Who comes in the door? And when we go out and try to penetrate the market, who reacts well to what we're saying?

And then, as people react to us and need us, or need what we're promising, steer towards what they're asking for. So in a lot of ways the very early days were a simple loop called: if someone wants something, and we think maybe they'll pay us money, or they've already paid us money, then let's just build the thing for them.

And that's hard, because the universe of things you could build is potentially infinite. There's an initial phase where zero people are using it, so it's really hard to decide what to build; the algorithm doesn't work there. Then we had an early phase with enough people that there was meaningful feedback, but not so many people asking us for stuff that it was overwhelming. That felt good, very productive. Every day it was just: cool, we got the list, let's rock. Then there was an early adolescence phase where you started to really feel the stretch: oh, not only is there too much to do, it's getting worse. Every day, the list of stuff we're not doing is increasing exponentially. And the hill climb then prioritized revenue.

And so what ended up happening is enterprises paid us more than other people did, so we started to steer there. As you steer there, you're implicitly making a bunch of decisions about not serving a whole bunch of other people who are asking you for stuff, by the very naive decision criterion of how much money they're going to pay you.

A lot of that middle phase of adolescence was us half debating whether we had to make a choice, and half just being stressed out about it, not totally sure how to move forward, because both paths feel impossible, right?

It's: are you really going to turn down a million-dollar check from one of the biggest insurance companies in the world? And on the other hand: I don't know, we've got 20 of these small guys using our cloud offering. Are we really going to turn that off? You can really see how that could turn into a very successful thing in its own right.

And looking back, that probably should have been an explicit, thoughtful, analyzed decision about our product-market fit: where do we want to focus? Where do we think we can win? Where do we have a right to win? But if you don't have the expertise or the vocabulary or the theory of product management, it's not at all obvious that you even need to choose.

And I'll bring this up here because I know a lot of the Unusual portfolio has a similar property to Domino, which is: we're a horizontal technology. In principle, anybody building machine learning could use Domino. And I think we spent a lot of the early years with some angst, where people would say, you guys need to focus, and we'd say: why do we need to? What would we even focus on? Everybody needs the same thing; it doesn't matter what industry they're in. I see this a lot in some of the companies I'm advising, so maybe I'll just say it here and then I won't have to deliver a very long lecture about it later on.

Figuring out who you're trying to serve is really important, no matter what, even if you're a horizontal tool that in principle anybody could use. Because if you try to serve everybody, you'll just end up serving nobody, for hopefully obvious reasons: different groups of potential users have subtly, if not dramatically, different needs. And what you're building doesn't exist in a vacuum. Whether you have named, VC-backed competitors or not, there's always an alternative for a user. If your product isn't constantly demonstrating that it's marginally better than whatever the next best alternative is, you're going to lose, maybe not tomorrow, but over time. So being clear about which alternative you're trying to be better than is really important, because you can't be better than all the alternatives, at least in the realm of horizontal infrastructure tooling. If you invented cold fusion, it's probably a totally different game.

But at least in this world, for any particular thing you're thinking of, whether your company is doing it or thinking about doing it, there are at least two or three credible alternatives. And so for someone to choose you, you have to give them a reason to want to choose you.

And that's, again, independent of how you pick your targeting criteria. I think it's quite sensible, if you aren't an expert at this, to focus on industry and size of company, for a whole bunch of reasons. Again, this is in the B2B space. It's an art, and you'll learn over time that maybe those aren't the right ways to segment, but the important thing to emphasize is: you've got to pick something.

You have to have some theory of the group of people you're going to serve better than any alternative they have. If you're not clear about that, then the market will decide for you, basically.

Wei Lien Dang

I think there are a lot of valuable nuggets in there, Chris. One is, in terms of finding product-market fit, a lot of founders focus on the what, when we would actually say the who is equally, or in some respects more, important. The other is this notion, as you progress from the toddler to the adolescent stage you were referencing: you have some people who are using the product, who are engaged, but are they, as some people like to say, the right people?

It's more: are they representative of a large enough market that you can build a successful business by serving them? And I think making some deliberate choices with a framework matters. Dollars are one dimension, one input, but having a framework for how you prioritize and how you make these choices is important, because it has implications for the company beyond just the product; your org chart will look different if you serve the enterprise versus the SMB.

So I think across the board, the theme is: who do you serve, and what kind of company or business do you want to be when you grow up? And it's interesting to hear you talk through how that was playing out in real time as you look back. Maybe just on that: you brought up this notion of market perspective, and you guys were early thought leaders. You were innovators in the space that eventually became known as MLOps.

And with all the focus and hype on gen AI, I'm curious to hear how you think that category has evolved over time. Do you view the current wave of AI as more of an extension of that, or is there actually a fundamental break that makes it different from the era Domino grew up in? What are your views on how MLOps has played out and how it relates to the current AI wave?

Chris Yang

Yeah, it's an interesting question. Over the lifespan of Domino, we've been part of, I would say, two discrete phases, and now we're entering a third phase: the rise of generative techniques. The first phase we were in was something like this: obviously, companies have been doing a lot with data for a long time.

But the first wave we really rode was the confluence of two waves: the rise of cloud computing, in terms of easy access to massive amounts of compute that wasn't really possible before, and, closely tied to that, the collection of massive amounts of data that wasn't possible before either.

You add those two together, and a whole bunch of stuff becomes possible, like a whole bunch of models getting trained, that was only possible once you had the confluence of those two things: enough data to train on and enough compute to make it practical.

The implication of that was a whole bunch of companies, who before maybe had one random PhD floating around, realizing: oh, actually, we could have a whole gang of these people, and we could create a tremendous amount of value. Maybe call that the awakening, and obviously different pockets of different industries and segments were further along or less far along.

That was an awakening. Then there's maybe what I'd call the MLOps phase. I remember the meeting where we were like, all right, this seems like a good category; let's see if we can slot into it and shape it, but also ride some of the momentum around the phrase.

And that was a little bit reactionary, because we talked to all these CIOs and heads of data science, and there was a phase where they were like: we thought we would be done after we hired all these PhDs. But if you actually look at what they've delivered, only one out of 10 models they've built actually makes it into production.

And you're looking at this going, okay, that seems like a problem. What are the other nine people doing, then? So there was this reaction of: oh, it wasn't enough just to hire the people and get them to work with each other. Just building the model was insufficient.

You actually have to operationalize it. You have to deploy it. And importantly, once it's deployed, you have to wire it into all the things it needs to get wired into to actually drive impact and results within the enterprise. I don't think I was alive for this, or I was probably a child, but almost certainly an analogous thing happened with developing applications internally.

Companies hired a bunch of software nerds, and they were building stuff, and at some point they all realized: all the applications you're building are cool, but they're not actually driving any value because nobody's using them. And at some point someone must have said: okay, we've got to figure out how to get people to use these things we're building.

And I think to some extent that's never going to stop, in the same way that we haven't solved DevOps, right? We're still constantly figuring out new, increasingly complex ways to deploy software internally or externally. And then there's the rise of citizen development in the form of Retool and these drag-and-drop builders; we're constantly figuring out new ways to build.

In the same way, MLOps is never going to be done. We're going to build increasingly complex machine learning models, in different form factors for different applications and uses, and there's going to have to be an analogous increase in the power of the tooling. That's going to go on forever.

And then the gen AI thing, I think, exists in a slightly different, orthogonal dimension. Obviously really cool, really amazing stuff is being done: all sorts of use cases we didn't think you could apply ML to that clearly you now can. But to me it runs somewhat orthogonal to the MLOps stuff; it's going to be one of a bunch of different types of machine learning.

It's going to be one of a bunch of different types of use cases you apply machine learning to. I think it's particularly sexy and hot because it does something that, if you were a layperson or maybe not paying that much attention to this stuff ahead of time, you'd have said a computer would never do.

And then to have it talk to you and be more than convincing as a human is really surprising; it's very unintuitive. But I don't think it's particularly special in that sense. In addition to helping humans write movie scripts and generate art, we're still going to have to predict time series, we're still going to have to fit logistic regressions to find correlations of who best to market to, and we're still going to have to build decision trees.

And it's really great for some use cases, and it literally doesn't make sense for some others. The substrate of the infrastructure and the MLOps layer, I think, just exists at a slightly different level. One interesting thing I've been noodling on is: to the extent there's a lot of value in this generative stuff, what is going to be the preferred deployment architecture?

I think it was JP Morgan, but one of these big banks said OpenAI is going to power all of their chatbots. And I'm like, that's interesting. Is that going to be the de facto standard? That these LLMs are so complicated to build and tune and test that most people in most places are going to opt to outsource that expertise?

Whether it's OpenAI now, or, as the big CSPs figure out their act around this, whether they're going to get into the game. At its last conference, Google announced a bunch of data integration capabilities to reach into your enterprise's complex, unstructured data sources, obviously with an eye toward building that integration layer so you can enable all this training on top of it.

Is that where this is going to go, or are people going to figure out that it's actually better to keep your data and tune models for the things you need? Some combination of the tooling around tuning, and the expertise that's inevitably going to get built up by the people doing this stuff every day, is that going to become so common that we'll end up insourcing it, and it'll be like any other internal application you build? I don't quite know how that's going to shake out. As these skills become more and more commodified, I can imagine the balance of power shifting back into organizations: the benefits of control and customization, a rising tide of general capabilities out in the labor market, the tooling that must get built, plus diminishing returns on quality, right? Maybe it shifts that way. Of course there will always be a space for well-funded organizations to push the edges of the math and the scientific research; that will always exist. But how it gets commercialized and brought to the mass market,

I don't quite know, but it's very interesting. I legitimately can't see how it's going to shake out, but obviously it has big implications for people who want to build tooling at that layer. Are you going to have enough market to build in? Or do the CSPs, or the people who already control the data, win? Moving data around is really expensive, so there'd be some natural advantage accruing to the people who already host the data when it comes to putting machine learning on top of it.

And how do you get the timing right? Are people going to want to do that within the timeframe of your company having the runway to survive? Really interesting. We'll see how it shakes out.

Wei Lien Dang

Well, Chris, it was fantastic to hear about your founder journey and these snippets that I think most founders trying to figure out product-market fit can relate to. So thank you again for your time and for joining us on our show today.
