The hyperscalers now offer multicloud ops tools. Cloud-native tools sound good in theory, but here are a few other things to keep in mind.

The rise of cloudops tools (such as AIops) is in full swing. There are three basic deployment choices: on demand as a non-native tool, hosted on premises, or a hosted cloud-native tool offered by a public cloud provider. Which door should you choose?

The on-demand, non-native category includes the majority of AIops tools that run on a hosting service, sometimes on a public cloud. The wide variety of the tools' options drives this choice more than the preferred deployment model.

If more on-premises systems need to be monitored and controlled, that's better accomplished with on-premises hosting, because the data does not need to flow all the way back to a centralized hosting service over the open internet. At times it may make sense for the ops tool to run in both places, and some tools can do that in coordinated ways. If it's a solid tool, how you deploy it should not matter.

Cloud-native tools are owned by a specific cloud provider. They were created to monitor and control that provider's own native cloud services, but they can also monitor and control services on other clouds. This support for multicloud deployments is logical when you consider the growing number of multicloud configurations. However, you need to weigh the capabilities of the tool now against its ability to address future needs as your deployments become more complex and heterogeneous over time.

At this moment, I could make the argument that using a native tool is a good idea. Most enterprises follow an 80/20 rule when deploying to multiple cloud brands: 80% of the applications and data reside in a single cloud brand while the other 20% reside in other brands, for example, 80% Microsoft, 15% AWS, 5% Google.
Thus, it may make sense to leverage a cloud-native ops tool that does a better job of supporting its own cloud's native services and can also be deployed as a multicloud ops tool that supports other public clouds. That mix makes sense given your ops approach, at least for now.

The trouble with multicloud is that it's always changing. Although the mix in the example above is the state today, tomorrow's market may include two more public clouds, say IBM and Oracle, along with a more even spread of applications and data across cloud brands. We could even see a common deployment pattern where a single public cloud holds less than 30% of the workloads and data on average, with the remaining applications and data distributed across four or more public clouds as part of the multicloud.

Here's the question that comes up: If you use a single cloud-native tool running on a single public cloud provider, and it can monitor and control other cloud brands as well, should you select that ops tool?

The answer is probably no, and it has nothing to do with the tool being native to a specific public cloud provider. It's the architectural reality that ops tools need to be centralized and decoupled from the platforms they control. They need to support the monitoring and management of all public clouds in your multicloud, as well as most traditional on-premises systems.

A hosted cloud-native tool (option 3) could solve your problems in the short run. In the long run, however, your cloudops tool needs to run on a neutral platform to ensure the most effective solutions now and into the future. Therefore, the best cloudops tool choices lie in option 1 (hosted, on demand), option 2 (on premises), or both.
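To make the "centralized and decoupled" idea concrete, here is a minimal sketch of that architecture: a neutral ops layer that knows nothing about any one provider and reaches each cloud through an adapter. All class names are hypothetical and the provider calls are stubbed; a real tool would call each cloud's monitoring API behind its adapter.

```python
from abc import ABC, abstractmethod

class CloudAdapter(ABC):
    """Provider-specific plumbing stays behind this interface,
    keeping the central tool decoupled from any one platform."""

    @abstractmethod
    def health(self) -> dict:
        ...

class AwsAdapter(CloudAdapter):
    def health(self) -> dict:
        # A real adapter would query AWS monitoring APIs; stubbed here.
        return {"provider": "aws", "status": "ok"}

class AzureAdapter(CloudAdapter):
    def health(self) -> dict:
        # A real adapter would query Azure monitoring APIs; stubbed here.
        return {"provider": "azure", "status": "ok"}

class NeutralOpsTool:
    """Centralized monitor that treats every cloud the same way."""

    def __init__(self) -> None:
        self.adapters: list[CloudAdapter] = []

    def register(self, adapter: CloudAdapter) -> None:
        self.adapters.append(adapter)

    def monitor_all(self) -> list[dict]:
        # One uniform loop, regardless of which clouds are registered.
        return [adapter.health() for adapter in self.adapters]

tool = NeutralOpsTool()
tool.register(AwsAdapter())
tool.register(AzureAdapter())
print(tool.monitor_all())
```

Because the tool depends only on the `CloudAdapter` interface, adding a fifth or sixth cloud later means writing one new adapter, not re-platforming the ops tool, which is the point of keeping it neutral.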