As a coding companion, Generative AI's (GenAI) potential in code generation, on-the-fly recommendations, security, and reliability makes it a game-changer for software development. However, according to HfS research, the time to value for GenAI as a coding companion is at least two years away.

Fig1: The Generative Enterprise inspires a new S-curve of value creation

In DevOps, however, GenAI finds ready adoption and offers a chance to improve productivity and accelerate code delivery. This blog recommends a platform-based approach to operationalize GenAI in DevOps pipelines to streamline workflows, accelerate code delivery, and achieve business objectives faster.

Caveats before onboarding GenAI in DevOps

Since development streams underpin an enterprise's business success, it is important to keep the following caveats in mind before onboarding GenAI in DevOps:

  • Ensure data privacy: Enterprises must ensure that data sent to large language models (LLMs) remains secure and private. LLM providers, such as Meta, Amazon Web Services, Microsoft Azure, or Google Cloud Platform, guarantee that enterprise data is not used to train generally available models and provide mechanisms to run LLMs within the enterprise network to keep data under control.
  • Contextualize model retraining: Enterprises can start with generally available LLMs to control costs, then evaluate and retrain them on custom data where higher accuracy is needed or to reduce model bias. Incremental retraining or Reinforcement Learning from Human Feedback (RLHF) may be required.
  • Adopt a platform approach: Leveraging GenAI as a platform for DevOps provides a comprehensive mechanism that spans multiple dimensions, starting with error-log analysis and expanding to vulnerability analysis.
How GenAI platformization unleashes value in DevOps

This approach fosters collaboration and design thinking, democratizes downstream activities, and creates end-to-end visibility. Here are a few use cases:

  • Security Best Practices in Code Generation: DevOps engineers manage IAM policies and service pipelines. GenAI can generate IAM policies within the current cloud account setup by referring to specific groups, users, accounts, or policies in the code.
  • Contextualized Configuration Generation: GenAI can create configuration files from scenario descriptions that incorporate context (see the sketch below this list), such as:
    • “Generate a configuration file for a Grafana Agent that collects metrics from Telegraf at one-minute intervals and sends them to my production Mimir cluster, delivered as Helm chart code and templates.”
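
To make this concrete, here is a minimal sketch of contextualized configuration generation, assuming an OpenAI-compatible chat-completions endpoint; the model name, prompt wording, and output path are illustrative rather than prescriptive:

```python
# Minimal sketch of contextualized configuration generation. The model name,
# prompt, and output path are illustrative assumptions, not a reference design.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

scenario = (
    "Generate a configuration file for a Grafana Agent that scrapes metrics "
    "from Telegraf at one-minute intervals and remote-writes them to my "
    "production Mimir cluster. Return only the YAML configuration."
)

# Environment context (cluster endpoints, Helm values, naming conventions)
# could be appended to the prompt so the generated file matches the target setup.
response = client.chat.completions.create(
    model="gpt-4o",  # any capable code-generation model could be used
    messages=[
        {"role": "system", "content": "You are a DevOps assistant that emits valid configuration files."},
        {"role": "user", "content": scenario},
    ],
)

with open("grafana-agent.yaml", "w") as f:
    f.write(response.choices[0].message.content)
```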

Troubleshooting:

  • Pipelines: GenAI provides intelligent troubleshooting suggestions during pipeline execution by generating pipeline code from text descriptions and existing context, such as variable keys and automation scripts.
  • Infrastructure management: Errors encountered during Infrastructure as Code (IaC) execution are natural candidates for analysis. GenAI can help DevOps engineers troubleshoot them and move forward quickly.
  • Logs: Centralized logging requires parsing logs into fields for accurate search and analysis. GenAI can also address errors observed in production logs, improving the system’s overall performance.
  • Security scan: GenAI can scan infrastructure-creation code for vulnerabilities such as networks left wide open across IP ranges and ports (see the sketch after this list).
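
As an illustration of the security-scan use case, the sketch below asks an LLM to review an IaC snippet for wide-open network rules; the Terraform example, model name, and prompt are assumptions for illustration, not a production scanner:

```python
# Minimal sketch of a GenAI-assisted IaC security scan. The Terraform snippet,
# model, and prompt wording are illustrative assumptions.
from openai import OpenAI

client = OpenAI()

terraform_snippet = """
resource "aws_security_group_rule" "ingress" {
  type        = "ingress"
  from_port   = 0
  to_port     = 65535
  protocol    = "tcp"
  cidr_blocks = ["0.0.0.0/0"]   # wide open across all IPs and ports
}
"""

prompt = (
    "Review the following Terraform code for security vulnerabilities such as "
    "networks left open across IP ranges and ports. List each finding with a "
    "severity and a suggested fix:\n" + terraform_snippet
)

response = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": prompt}],
)
print(response.choices[0].message.content)
```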
Fig2: Four GenAI handoffs in a DevOps cycle
Case in Point: GenAI Architecture for Observability System

We outline a potential architecture for enterprises to incorporate GenAI into their DevOps operations, with the following key characteristics:

  • The platform works with existing observability systems to bring GenAI’s advantages to operations.
  • It follows an API-first approach for building applications and providing further integration capabilities (a minimal sketch follows this list).
  • It is extensible to support multiple use cases as the enterprise goes through different maturity phases of GenAI adoption.
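
As a sketch of what the API-first characteristic could look like, the snippet below exposes a single error-analysis endpoint that CI systems, portals, or chat tools could call; FastAPI, the route name, and the model are illustrative assumptions rather than the platform's actual interface:

```python
# Minimal sketch of an API-first error-analysis service. The framework choice,
# route, and model are assumptions for illustration.
from fastapi import FastAPI
from pydantic import BaseModel
from openai import OpenAI

app = FastAPI(title="GenAI DevOps Platform")
client = OpenAI()


class ErrorRequest(BaseModel):
    source: str         # e.g. "ci-pipeline", "terraform", "loki"
    error_message: str


@app.post("/analyze-error")
def analyze_error(req: ErrorRequest) -> dict:
    """Return a probable cause and suggested fix for the submitted error."""
    completion = client.chat.completions.create(
        model="gpt-4o",
        messages=[{
            "role": "user",
            "content": f"The most likely reason for this {req.source} error is... "
                       f"Explain and suggest a fix:\n{req.error_message}",
        }],
    )
    return {"analysis": completion.choices[0].message.content}
```

Because every capability sits behind an HTTP endpoint like this, additional use cases can be layered on without changing the tools that consume the platform.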
Fig3: Pi-Insights System

Pi-Insights is Persistent’s early-stage accelerator for GenAI in DevOps. It works with the existing observability system, allowing engineers to analyze error messages from a centralized log collection system, such as Grafana Loki. Pi-Insights provides explanations, solutions, and sample code to help fix the issue, reducing search time.

  • Ingestor: A pluggable component that pulls data from different logging systems. It can be extended as required to support additional logging systems.
  • Prompt Engineering: Prompt engineering is crucial for the LLM to produce effective results. For example, the prompt “The most likely reason for this error is…” steers the model toward a probable cause, which an LLM completion API then generates. This component exposes these prompt capabilities to users (a sketch of the Ingestor-to-prompt flow follows this list).
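
The following is a minimal sketch of that Ingestor-to-prompt flow, assuming Grafana Loki's query_range HTTP API as the log source and an OpenAI-compatible completion API for analysis; the Loki URL, LogQL selector, and model name are placeholders, not Pi-Insights internals:

```python
# Minimal sketch of the Ingestor-to-prompt flow: pull recent error lines from
# Grafana Loki, then ask an LLM for a probable cause. URL, query, and model
# are placeholder assumptions.
import time
import requests
from openai import OpenAI

LOKI_URL = "http://loki.example.com:3100/loki/api/v1/query_range"
client = OpenAI()


def fetch_recent_errors(minutes: int = 15, limit: int = 20) -> list[str]:
    """Ingestor: pull error log lines from Loki for the last few minutes."""
    end = time.time_ns()
    start = end - minutes * 60 * 1_000_000_000
    resp = requests.get(LOKI_URL, params={
        "query": '{job="app"} |= "error"',  # illustrative LogQL selector
        "start": start, "end": end, "limit": limit,
    })
    resp.raise_for_status()
    streams = resp.json()["data"]["result"]
    return [line for stream in streams for _, line in stream["values"]]


def explain(error_line: str) -> str:
    """Prompt Engineering: completion-style prompt for a probable cause."""
    prompt = f"{error_line}\n\nThe most likely reason for this error is"
    completion = client.chat.completions.create(
        model="gpt-4o",
        messages=[{"role": "user", "content": prompt}],
    )
    return completion.choices[0].message.content


for line in fetch_recent_errors():
    print(line, "->", explain(line))
```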

Troubleshooting DevOps Pipelines: This paradigm can help developers troubleshoot CI pipeline errors by retrieving and presenting build system errors via the Error Analysis Portal. The Prompt Engineering module allows for customizable GenAI analysis.

Get the Persistent Edge

As a leading digital engineering partner with 30 years of experience in AI/ML and cloud operations, Persistent has helped enterprises anticipate trends and stay ahead of the competition.

As the only mid-cap IT service provider named a challenger by Gartner in its 2023 Magic Quadrant for Public Cloud IT Services Transformation, we offer expertise to help fast-track GenAI adoption. Our investments in intellectual property, staff training on hyperscaler platforms, and comprehensive suite of GenAI accelerators make us ideal partners for embedding GenAI in your workstreams.

Contact us to elevate your DevOps practice with GenAI.

Author’s Profile


Vinit Kapoor

Chief Architect – Cloud Technologies

vinit_kapoor@persistent.com

LinkedIn

Vinit Kapoor has over 20 years of IT experience and has worked in diverse domains, building products and solutions for Mobile Location Services, Fleet Management, Industrial IoT, Cloud Infrastructure Management, and DevOps.