As machine learning engines multiply, we see the benefits of having them train each other, versus using limited training data we may or may not have.

If you deal with machine learning-based systems, you know all about training data. Data must be accurate and correctly formatted before it's loaded into an AI model to train that model.

Say you're creating a fraud detection engine using a popular machine learning system in a public cloud. First you create the data set used to train the model: in this case, millions of transactional records with the fraudulent transactions labeled. This allows the model to learn what's likely fraudulent and what isn't.

Of course, there are different types of training data, some labeled, some not. Once trained, the model may continue to learn what's likely fraudulent and what isn't through experience. Indeed, given enough time, the model could train itself by monitoring transactions that humans or other systems mark as fraudulent.

What strikes me about this approach to AI training is that you need a sound training data set. In some cases, it can be obtained from open or proprietary training data brokers. In most instances, you format your own data to train the machine learning model.

However, what if other trained machine learning models could train your models, anywhere and at any time?

The idea is not new. Since the advent of AI, we've toyed with the idea of having one AI engine teach another, either by sharing training data or, better yet, sharing knowledge and experience through direct, automatic interaction. Having one AI engine mentor yours provides outside experience and thus makes your AI model more valuable and effective.

This is easier said than done. Machine learning engines typically don't talk to each other, even when they run the same software. They are designed from the ground up to be stand-alone learners that interact with non-AI systems or humans.
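To make the fraud-detection scenario concrete, here is a minimal sketch of the conventional approach the article describes: assemble labeled transaction records, then fit a model on them. This assumes scikit-learn and uses synthetic data; the feature names (`amount`, `merchant_risk`) and the choice of logistic regression are illustrative assumptions, not anything specified in the article.

```python
# Illustrative sketch: training a fraud detector on labeled transactions.
# Synthetic data stands in for the "millions of transactional records";
# field names and model choice are assumptions for demonstration only.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n = 10_000

# Synthetic transactional records: two numeric features per transaction.
amount = rng.exponential(scale=100.0, size=n)
merchant_risk = rng.uniform(0.0, 1.0, size=n)
X = np.column_stack([amount, merchant_risk])

# Label roughly 2% of transactions as fraudulent, skewed toward
# high-risk merchants, mimicking a human-labeled data set.
fraud_prob = 0.02 + 0.3 * (merchant_risk > 0.9)
y = (rng.uniform(size=n) < fraud_prob).astype(int)

# Hold out a test set; stratify so both splits contain fraud cases.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0, stratify=y
)

# class_weight="balanced" compensates for how rare fraud labels are.
model = LogisticRegression(class_weight="balanced", max_iter=1000)
model.fit(X_train, y_train)
accuracy = model.score(X_test, y_test)
```

The point of the sketch is the dependency the article highlights: everything downstream hinges on having that labeled data set in the first place.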
However, inter-AI engine training is on most vendors' radar screens. Lately I'm seeing a few key trends that could change the game.

First is the use of on-demand or SaaS-based AI engines that can interact with other AI engines within a public cloud or on premises. Think of these as SaaS clouds that specialize in teaching other AI engines a specific set of skills, everything from spotting fraudulent transactions, to medical diagnostics, to machine maintenance, and more.

Second, AI engines can combine with your trained models, creating an AI superbrain of sorts, providing not only global experience from outside your domain but also blending that experience with your own training data to deliver both local and global knowledge.

I'm bringing this up now because most enterprises need to be aware of these trends if they want to get more value out of AI, including machine learning and deep learning. Moreover, many enterprises are running into a wall: they simply don't have enough training data to make machine learning functional. Inter-engine training could be a good way to solve both problems.
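One simple way to picture the second trend, combining an externally trained engine with your own locally trained model, is soft voting: average the two models' predicted probabilities. This is a hedged sketch under stated assumptions; the model names, the synthetic data, and the averaging scheme are illustrative, not a vendor API described in the article.

```python
# Illustrative sketch: blending a "global" pre-trained engine with a
# locally trained model by averaging predicted probabilities (soft
# voting). All names and data here are assumptions for demonstration.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(7)
X_local = rng.normal(size=(500, 3))
y_local = (X_local[:, 0] + X_local[:, 1] > 0).astype(int)

# Stand-in for an externally trained "global" engine. In the article's
# scenario this would arrive already trained on outside experience.
global_model = DecisionTreeClassifier(max_depth=3, random_state=0)
global_model.fit(X_local, y_local)

# Your own model, trained on local data.
local_model = LogisticRegression(max_iter=500).fit(X_local, y_local)

def combined_predict(X):
    """Average both engines' positive-class probabilities, threshold at 0.5."""
    p = (global_model.predict_proba(X)[:, 1]
         + local_model.predict_proba(X)[:, 1]) / 2.0
    return (p >= 0.5).astype(int)

preds = combined_predict(X_local)
```

In practice a weighted average or a stacking layer would be more common than a plain mean, but the idea is the same: the local model contributes domain-specific experience and the global one contributes experience from outside your data.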