I recently had the opportunity to participate in a fireside chat session at Forrester’s New Tech & Innovation 2018 forum with J.P. Gownder, a vice president and principal analyst at the firm. It was a timely and much-needed discussion of some of the biggest questions in artificial intelligence (AI) today, and I hope the audience walked away with a better understanding of this pivotal and complex technology. For those who were unable to attend the event, this blog post provides an overview of several of the questions and answers we explored in the session.
What is industrialized AI?
At Petuum, we often talk about the “industrialization” of AI, but this term is likely unfamiliar to most people. I started the session by exploring what we mean when we talk about AI being industrialized in the same way as textiles, electronics, and other products. Essentially, AI right now is an artisanal, hand-crafted product — there are no standardized building blocks, engineering processes, or templates for its creation. This does not mean that AI programs created today are bad, just as having a custom, handmade suit isn’t a bad thing. However, when you begin to look at scale and maintenance — to follow our analogy, when you need to produce thousands of suits that can be altered by any tailor — a process that’s entirely customized and handcrafted isn’t efficient. That’s why there are factories today devoted to making suits that are identical to each other, using standardized sewing machines, needles, patterns, and fabric.
To use another analogy, creating an AI application today is like building a house. Imagine that you want to build a row of ten houses, and you use a different contractor for each one. Each contractor brings their own customized materials to the build site, so every house is built with different-sized nails, nuts, and bolts. The contractors might all be master craftsmen, and each house may be beautiful, but what happens after a tenant moves in and natural wear-and-tear occurs? Because each house was built with customized parts, the only person who can make repairs is the contractor who built it. In AI, this is analogous to how each application today is built with unique tools and processes, meaning that if the application’s creator leaves a company, there is nobody left to maintain and update the system. For businesses, the AI investment could simply go to waste.
This is why we must industrialize AI — in order to ensure uniform quality for an unlimited number of products, and to produce them in a timely fashion, we need standard, common tools and processes that can be used by all practitioners. This is our mission at Petuum. One important distinction to make is that having standardized tools doesn’t mean having standardized end products — AI applications can and should still be customized to suit particular business needs, but these custom systems should be built with ubiquitous tools and processes.
Haven’t tools like TensorFlow already standardized AI?
This is a very good question, and the answer is that TensorFlow has standardized a small part of the AI process. To return to the building analogy, TensorFlow is like an electric generator. It is a standardized part, and an important one, but you cannot build a house with only a generator. You need many additional parts, and those parts should ideally be standardized too.
Tools like TensorFlow are important, but they only cover part of the process of building an AI application. For instance, what sort of data will you be feeding into TensorFlow? In an ideal world, the data is already cleaned and harmonized, ready to be ingested. But in the real world, where business AI applications must function, the raw data coming in is much messier. This data can be anything from a doctor’s handwritten patient intake notes (with abbreviations, misspellings, and incomplete sentences), to physical paper financial contracts and forms, to noisy audio recordings of machinery parts on a factory floor. Training and executing AI models is only one piece of the puzzle; there are many other steps that are not currently addressed by open source solutions.
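As a deliberately simplified illustration of the kind of preprocessing that sits upstream of any model, the Python sketch below normalizes a hypothetical doctor's shorthand note. The abbreviation map and function name are illustrative assumptions for this post, not part of any real product; a production clinical pipeline would rely on a much larger, domain-curated dictionary and far more robust text handling.

```python
import re

# Hypothetical shorthand dictionary -- a real pipeline would use a
# curated, domain-specific vocabulary many times this size.
ABBREVIATIONS = {
    "pt": "patient",
    "hx": "history",
    "htn": "hypertension",
    "rx": "prescription",
}

def normalize_note(raw: str) -> str:
    """Clean a messy free-text note: trim and lowercase the text,
    collapse runs of whitespace, and expand known abbreviations."""
    text = raw.strip().lower()
    text = re.sub(r"\s+", " ", text)  # collapse repeated spaces/tabs/newlines
    words = [ABBREVIATIONS.get(word, word) for word in text.split(" ")]
    return " ".join(words)

print(normalize_note("Pt  has hx of HTN"))
# -> "patient has history of hypertension"
```

Even a toy example like this hints at why the surrounding steps matter: before a single model is trained, someone has to decide how to ingest, clean, and harmonize the raw inputs, and today those decisions are made ad hoc for each application.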
As AI becomes more standardized, are its growing pains the same as other disciplines?
All disciplines — civil engineering, aeronautical engineering, etc. — faced challenges as they moved from sciences to industrial processes. In this regard, AI is in good company.
However, AI also has some unique characteristics that make this challenge especially daunting. First and foremost, “AI” is a blanket term that encompasses several distinct families of mathematical, statistical, and algorithmic methods. There’s supervised learning, active learning, deep learning, reinforcement learning, unsupervised learning, and Bayesian learning — and that’s just to name a few. It’s very difficult for any one individual to truly be an expert in all areas of AI. For this reason, standardization will require a concerted effort from an organization with a wide representation of AI talent. And when thinking about standardization, we must consider the computing equipment that many businesses will use to deploy AI, such as IoT and edge devices, mobile devices, laptops, data centers, or the cloud.
This is why, at Petuum, we have experts across many fields of AI and AI software systems, and not just those that are trending right now. As we pursue our mission of industrializing AI, we’re creating a system that is standardized but customizable, simple but extensible, and always ready for the next scientific advancement in AI. It is a difficult challenge that we are tackling, but we know that this is the hard work that must be done to make AI broadly accessible. Our goal is for no company or industry to feel excluded from this revolutionary technology.
If you’re interested in helping us in our work, check out our Careers page — we’re hiring!