David Linthicum
Contributor

The lost art of cloud application engineering

analysis
Jul 28, 2023 | 3 mins
Application Security | Artificial Intelligence | Cloud Computing

AI-driven coding is now in wide use, but we may not know all the risks of using it until the damage has been done. Think security problems and code that wastes resources.

Credit: Mihai Simonia / Shutterstock

AI is changing the programming world, a shift that has been building for several years. I could talk about how the emerging practice of using AI-driven coding tools increases speed and reduces costs, but there are downsides that many fail to see.

Again, the question is not “Can we?” It’s “Should we?” Let’s go over a few core concerns.

What AI can’t do

AI-driven coding tools learn from existing code repositories, but they often lack a contextual understanding of the code they generate. They produce code that works yet may be difficult to comprehend or maintain. This weakens developers’ control over their software and often leads to mistakes when fixing or changing applications.

Moreover, the generated code may not follow style conventions or best practices, and it often lacks appropriate error handling. This can make debugging, maintenance, and collaboration difficult.

Remember that AI-driven code generation focuses on learning from existing code patterns to produce net-new code. Generative AI coders take a “monkey see, monkey do” approach to development, in which coding approaches are learned from the vast amount of code used as training data.

This approach is helpful for repetitive or standard tasks, which is much of what developers do, but enterprises may require more creativity and innovation for complex or unique problems. Using generative AI code can limit the potential for novel solutions and hinder the development of truly innovative applications.

Not sure if you’ve looked out there, but innovation is lacking. We seem to be building the same things over and over again.

My biggest concern is that generated code is rarely as efficient as it could be, or optimized for the platform the application is deployed on. It takes sound engineering practice to make the best use of processors, memory, and storage.

I think many people will generate and deploy an application without understanding how it could use resources more efficiently. We end up with applications that are more expensive to run and have a much larger carbon footprint.
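
To make that concrete, here is a minimal sketch in Python; the log file name and the function names are invented for illustration, not taken from any particular tool. Both functions return the same count, but the first, the kind of pattern a generator often reproduces because it is so common in training data, loads the whole file into memory, while the second streams it and keeps memory use flat no matter how large the file grows.

```python
# Hypothetical illustration: two ways to count error lines in a large log file.
# Generated code frequently mirrors the first, memory-hungry pattern simply
# because it appears so often in training data; the second does the same job
# with constant memory.

def count_errors_naive(path: str) -> int:
    # Loads every line into a list before filtering -- memory grows with file size.
    with open(path, "r", encoding="utf-8") as f:
        lines = f.readlines()
    return len([line for line in lines if "ERROR" in line])


def count_errors_streaming(path: str) -> int:
    # Iterates over the file object directly -- same result, constant memory.
    with open(path, "r", encoding="utf-8") as f:
        return sum(1 for line in f if "ERROR" in line)


if __name__ == "__main__":
    # "app.log" is a placeholder path for the example.
    print(count_errors_streaming("app.log"))
```

Either version “works,” which is exactly why the wasteful one tends to ship.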

The shame is that, in most cases, just the fact that the application works is good enough for many. The applications operate for years, waste a great deal of money, and fail to return the optimal value to the business. “Oh, well,” people say, “it works, doesn’t it?”

Another scary aspect of AI-driven development is that security vulnerabilities are often left within the application and go unnoticed until the postmortem after a breach. Again, we need human engineering to spot and fix those, although some AI-driven scanning tools can help.
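
As a small, hypothetical illustration (the table and column names are invented for the example), the first function below shows the kind of injection-prone pattern that can sit unnoticed in generated code; the second is the parameterized version a human reviewer, or a good scanner, should insist on.

```python
# Hypothetical illustration: an SQL injection pattern that can slip through
# unreviewed generated code, next to the parameterized form that avoids it.
import sqlite3


def find_user_unsafe(conn: sqlite3.Connection, username: str):
    # Vulnerable: attacker-controlled input is spliced directly into the SQL text.
    query = f"SELECT id, email FROM users WHERE name = '{username}'"
    return conn.execute(query).fetchall()


def find_user_safe(conn: sqlite3.Connection, username: str):
    # Safer: the input is passed as a bound parameter, so the driver escapes it.
    return conn.execute(
        "SELECT id, email FROM users WHERE name = ?", (username,)
    ).fetchall()
```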

Bottom line

By removing humans from the development process, which many organizations are looking to do, we sacrifice the understanding needed to create practical applications. The appropriate answer is to balance the speed and cost advantages of AI against the human skills that still need to be involved. I fear that we won’t understand that until it’s too late.

David Linthicum
Contributor

David S. Linthicum is an internationally recognized industry expert and thought leader. Dave has authored 13 books on computing, the latest of which is An Insider’s Guide to Cloud Computing. Dave’s industry experience includes tenures as CTO and CEO of several successful software companies, and upper-level management positions in Fortune 100 companies. He keynotes leading technology conferences on cloud computing, SOA, enterprise application integration, and enterprise architecture. Dave writes the Cloud Computing blog for InfoWorld. His views are his own.
