What is the operational complexity of your ML model?

FAQs for AI Zero → One (Episode #4)

Sandeep Uttamchandani
Jun 2, 2022


Transcript:

Welcome everyone to episode four of Unraveling AI Entrepreneurs. This is a series for engineers and product managers where we cover frequently asked questions on the challenges of taking an AI idea to a real-world product. Today’s question is about the operational complexity of your ML model, and the key point to understand is that not all ML models are equally complex when it comes to running them in production. Let me give you a few examples.

Let’s say you’ve written an ML model that generates an email every night recommending articles for your users to read. It’s an offline email, generated once every 24 hours. That operational [00:01:00] complexity is very different from an ML model used in real-time gaming, where the model is learning from the actions of the player and adapting the next steps or challenges it shows within the game. That’s the other extreme in terms of the operational complexity of having that model [00:01:30] in production.

So to answer the question, I like to break it down as a two-by-two matrix. On one axis you have training, and on the other you have inference. When you think about complexity, the two-by-two model asks: is your training offline or online, and is your inference offline or online?
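
To make the framing concrete, here is a minimal sketch of the four quadrants as a lookup table. The examples echo the ones in this episode; the "medium" complexity labels for the two mixed quadrants are my own rough assumption, not something stated here.

```python
# A rough sketch of the two-by-two framing: (training mode, inference mode)
# mapped to an example workload and a rough operational-complexity label.
# The "medium" labels for the mixed quadrants are an assumption for illustration.
from enum import Enum

class Mode(Enum):
    OFFLINE = "offline"
    ONLINE = "online"

QUADRANTS = {
    (Mode.OFFLINE, Mode.OFFLINE): ("nightly email of recommended articles", "low"),
    (Mode.OFFLINE, Mode.ONLINE):  ("pre-trained model answering real-time requests", "medium"),
    (Mode.ONLINE,  Mode.OFFLINE): ("continuously retrained model scored in batch", "medium"),
    (Mode.ONLINE,  Mode.ONLINE):  ("in-game model adapting to each player's actions", "high"),
}

for (training, inference), (example, complexity) in QUADRANTS.items():
    print(f"training={training.value:<7} inference={inference.value:<7} "
          f"complexity={complexity:<6} e.g. {example}")
```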

Let’s take a couple of examples of the complexity in the different quadrants. Going back to the first example, where you’re generating an email each night with a list of interesting, relevant articles from one of your sites: that is a model which is [00:02:30] trained offline, and whose inferences are offline as well, so it falls into the lower-left quadrant, and that’s typically what I would call low operational complexity.
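
As a rough illustration of this lower-left quadrant, here is a minimal sketch of a nightly batch job, assuming a toy stand-in model; the names (ToyRecommender, run_nightly_batch, the field layout) are hypothetical, not from the episode.

```python
# Hypothetical sketch of the offline-training / offline-inference quadrant:
# a scheduled nightly job loads a model trained earlier, scores the article
# catalog for every user, and emits one email payload per user per day.
import json

class ToyRecommender:
    """Stand-in for a model artifact produced by an offline training job."""
    def score(self, user: dict, article: dict) -> float:
        # A real model would apply learned parameters; this just fakes a score.
        return hash((user["id"], article["id"])) % 100 / 100.0

def run_nightly_batch(model, users: list, articles: list, top_k: int = 5) -> list:
    emails = []
    for user in users:
        ranked = sorted(articles, key=lambda a: model.score(user, a), reverse=True)
        emails.append({
            "to": user["email"],
            "articles": [a["title"] for a in ranked[:top_k]],
        })
    return emails

if __name__ == "__main__":
    # In production this would be kicked off by a scheduler (e.g. cron) every 24 hours.
    users = [{"id": 1, "email": "reader@example.com"}]
    articles = [{"id": 10, "title": "Intro to MLOps"},
                {"id": 11, "title": "Feature stores 101"}]
    print(json.dumps(run_nightly_batch(ToyRecommender(), users, articles), indent=2))
```

Because nothing is served in real time, a failed run can usually be retried before the next send window, which is a big part of why this quadrant stays low on operational complexity.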

The next quadrant is where training is offline but the inference is online. So an example here would be, let’s…
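
A minimal sketch of what that quadrant can look like: the model is trained offline and loaded once at startup, then each prediction is served per request, so per-call latency, errors, and drift monitoring become part of the operational picture. All names here (ToyRecommender, recommend) are hypothetical, not a specific serving framework’s API.

```python
# Hypothetical sketch of the offline-training / online-inference quadrant:
# the model is trained ahead of time, loaded once at process startup, and
# then every prediction is served per request.
import time

class ToyRecommender:
    """Stand-in for a model artifact trained offline and shipped as a file."""
    def score(self, user: dict, article: dict) -> float:
        return hash((user["id"], article["id"])) % 100 / 100.0

MODEL = ToyRecommender()  # in practice: deserialize the trained artifact here

def recommend(user: dict, candidate_articles: list, top_k: int = 3) -> list:
    """Handle one real-time request, e.g. behind an HTTP endpoint."""
    start = time.perf_counter()
    ranked = sorted(candidate_articles, key=lambda a: MODEL.score(user, a), reverse=True)
    latency_ms = (time.perf_counter() - start) * 1000
    print(f"served request in {latency_ms:.2f} ms")  # a real system would emit metrics
    return [a["title"] for a in ranked[:top_k]]

if __name__ == "__main__":
    print(recommend({"id": 1}, [{"id": 10, "title": "Intro to MLOps"},
                                {"id": 11, "title": "Feature stores 101"}]))
```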

