DORA Research Evolution and the Future of DevOps Practices

Overview

DORA’s 2025 research marks a major inflection point in the evolution of software delivery performance studies. Now published as The State of AI-Assisted Software Development, the research extends past traditional DevOps metrics into the age of intelligent automation. It emphasizes that AI is not only an accelerant for DevOps but a transformative layer influencing how teams design, deliver, and maintain software. Importantly, the DORA team has also evolved its research methodology, shifting from static, annual surveys toward a more agile, iterative delivery model that mirrors the very DevOps principles it studies. We find the following:

  • AI and DevOps Practices: AI accelerates code generation and increases the need for learning climates, fast feedback, and flow. Foundational principles remain critical.
  • Accountability: Developers and organizations are fully accountable for AI-generated code; errors cannot be blamed on AI alone.
  • Community & Research Delivery: The DORA practitioner community is crucial for contextualizing and applying research insights. The report is shifting to smaller, more frequent publications, which mirrors DevOps principles of iterative delivery.
  • Challenges in AI Projects: Organizations face complexity, skill gaps, and budget allocation issues, with ~25% of IT budgets allocated to AI projects, often driven by competitive pressure rather than necessity.

According to theCUBE Research AppDev data, organizations increasingly view AI as a crucial enabler for developer productivity, with over 80% of enterprises reporting AI experimentation in some stage of their software lifecycle. Yet, DORA’s 2025 findings caution that true performance gains arise only when AI is grounded in strong cultural and procedural foundations: continuous learning, fast feedback, and system-level flow.

AI and DevOps Practices

The integration of AI into DevOps is both amplifying and stress-testing core principles. DORA’s research underscores that the same practices that fueled DevOps success (small batch work, continuous integration, and collaborative culture) are the foundation for safe and effective AI adoption. AI-assisted code generation is dramatically accelerating development speed, but without disciplined feedback loops and strong governance, teams risk creating technical debt faster than ever before.

theCUBE Research data reveals that 67% of engineering leaders cite AI-driven automation as their top productivity enabler in 2025, but 42% also report challenges maintaining code quality and traceability. This duality signals a new imperative: organizations must pair AI velocity with human oversight and system thinking. The era of “move fast and fix later” has evolved into “move fast and verify continuously.”

Accountability in the Age of AI

A key theme emerging from DORA 2025 is accountability. While AI can generate, test, and even optimize code autonomously, the ultimate responsibility remains with the developers and organizations deploying it. DORA’s analysis reinforces that “AI cannot be blamed”; accountability for security, compliance, and ethical outcomes must remain human-centered.

This notion aligns with theCUBE Research’s broader industry perspective: 58% of surveyed IT leaders express concern that AI-generated outputs introduce new compliance risks, yet only 27% have formal governance frameworks in place. As generative AI tools become embedded across the software supply chain, accountability is shifting from individual contributors to organizational structures, requiring clearer policies, traceability systems, and AI risk literacy across teams.

Community and Research Delivery

In a significant methodological shift, DORA is transforming how it delivers research to the DevOps community. Rather than releasing a single annual report, DORA will now publish smaller, more frequent insights, reflecting the iterative nature of modern software delivery. This mirrors the principle of continuous improvement: the research itself is becoming agile.

The DORA practitioner community, long seen as one of the most engaged in the software engineering ecosystem, plays a pivotal role here. Practitioners contextualize findings, validate patterns in real-world environments, and provide a feedback loop that strengthens the research model. theCUBE Research AppDev Summit discussions echoed this sentiment, emphasizing that the future of DevOps insight will be participatory, dynamic, and community-informed.

Challenges in AI Projects

Despite enthusiasm, DORA’s 2025 data surfaces systemic challenges in AI project execution. Roughly 25% of IT budgets are now allocated to AI-related initiatives, yet a significant portion of this investment is reactive, driven by competitive pressure rather than clear business necessity. Skill gaps, model governance complexities, and unbalanced spending continue to slow measurable ROI.

theCUBE Research’s findings align closely: while 9 in 10 organizations have integrated some form of AI into development workflows, fewer than half report a positive return on efficiency metrics such as cycle time reduction or deployment frequency. Many teams find themselves caught between innovation urgency and capability maturity, suggesting a widening performance gap between early adopters and those still learning to operationalize AI responsibly.

Implications for Industry

The combined findings from DORA 2025 and theCUBE Research signal a maturing intersection between AI and DevOps, one that is defined by pragmatism rather than hype. The next wave of software excellence will not come from AI alone but from AI applied through a disciplined DevOps culture. Success will hinge on sustaining foundational practices, ensuring human accountability, and leveraging practitioner-driven insights to continuously refine both technology and process.

In essence, DORA’s evolution mirrors the DevOps journey itself: from isolated reports to continuous research delivery, from automation for speed to automation for intelligence, and from siloed teams to a connected ecosystem of learning. For organizations, it is clear that AI will not replace DevOps, but it will demand that DevOps matures.

I recommend that organizations act on these findings and take the following steps:

1. Recommit to Foundational DevOps Principles

While AI introduces transformative potential, DORA’s 2025 findings reaffirm that core DevOps practices remain the engine of sustainable performance. Organizations should reemphasize the fundamentals:

  • Continuous Integration and Delivery (CI/CD): Ensure pipelines remain auditable, automated, and resilient to AI-generated changes.
  • Fast Feedback Loops: Integrate observability, testing, and security scanning early in the lifecycle to identify risks introduced by AI-assisted development.
  • Small Batch Work: Keep work increments small to make AI-generated code reviewable and traceable, minimizing complexity in debugging or rollback scenarios.

These practices provide the scaffolding that allows AI to amplify, rather than destabilize, software delivery.
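To make the small-batch and auditability points above concrete, here is a minimal sketch of a hypothetical pre-merge gate that enforces a diff-size limit and requires a human approver on AI-assisted changes. The thresholds, field names, and labeling convention are illustrative assumptions, not DORA guidance or any specific CI product’s API.

```python
"""Minimal sketch of a pre-merge gate for AI-assisted changes.

All names and thresholds are illustrative assumptions, not part of
DORA's guidance or any specific CI product's API.
"""

from dataclasses import dataclass


@dataclass
class ChangeSet:
    lines_changed: int    # total added + removed lines in the diff
    ai_assisted: bool     # change was labeled as AI-generated/assisted
    human_reviewers: int  # approving human reviewers on the change
    tests_passed: bool    # CI test suite result


# Illustrative thresholds; tune to your own review capacity.
MAX_BATCH_LINES = 400
MIN_HUMAN_REVIEWS_FOR_AI = 1


def merge_gate(change: ChangeSet) -> tuple[bool, list[str]]:
    """Return (allowed, reasons) for a proposed merge."""
    reasons: list[str] = []

    if not change.tests_passed:
        reasons.append("CI tests must pass before merge.")

    # Small-batch rule: keep diffs reviewable regardless of who wrote them.
    if change.lines_changed > MAX_BATCH_LINES:
        reasons.append(
            f"Diff of {change.lines_changed} lines exceeds the "
            f"{MAX_BATCH_LINES}-line small-batch limit; split the change."
        )

    # Accountability rule: AI-assisted code still needs a human approver.
    if change.ai_assisted and change.human_reviewers < MIN_HUMAN_REVIEWS_FOR_AI:
        reasons.append("AI-assisted changes require at least one human review.")

    return (len(reasons) == 0, reasons)


if __name__ == "__main__":
    allowed, why = merge_gate(
        ChangeSet(lines_changed=520, ai_assisted=True,
                  human_reviewers=0, tests_passed=True)
    )
    print("merge allowed:", allowed)
    for reason in why:
        print(" -", reason)
```

In practice, a rule like this would run as a CI status check, with the AI-assisted flag supplied by commit metadata or pull-request labels.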

2. Establish AI Governance Frameworks

As accountability becomes central to DORA’s 2025 narrative, organizations must formalize governance models for AI-assisted development. This includes:

  • Clear Responsibility Assignments: Define who owns the output of AI-generated code and ensure compliance requirements are embedded in policy, not left to interpretation.
  • Data Provenance and Traceability: Implement metadata tagging for AI-generated artifacts to track model versions, data inputs, and decision rationales.
  • Ethical Guardrails: Create internal review boards or risk councils that oversee AI usage across development and deployment.

AI governance should be integrated into existing DevOps workflows, not treated as a separate compliance burden.
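As one way to ground the data provenance and traceability recommendation, the sketch below attaches a simple metadata record to an AI-generated artifact, capturing a content hash, model identifier, and accountable human owner. The schema and field names are assumptions for illustration, not an established standard.

```python
"""Minimal sketch of a provenance record for an AI-generated artifact.

The schema below is an illustrative assumption, not an established
standard; adapt field names to your own governance policy.
"""

import hashlib
import json
from dataclasses import asdict, dataclass
from datetime import datetime, timezone


@dataclass
class AIProvenanceRecord:
    artifact_path: str     # file or object the record describes
    artifact_sha256: str   # content hash for tamper-evident traceability
    model_name: str        # assistant or model family used
    model_version: str     # version/build identifier of the model
    prompt_summary: str    # short human-readable rationale for the change
    human_owner: str       # person accountable for the output
    created_at: str        # ISO-8601 timestamp (UTC)


def build_record(path: str, content: bytes, model_name: str,
                 model_version: str, prompt_summary: str,
                 human_owner: str) -> AIProvenanceRecord:
    """Create a provenance record for one artifact."""
    return AIProvenanceRecord(
        artifact_path=path,
        artifact_sha256=hashlib.sha256(content).hexdigest(),
        model_name=model_name,
        model_version=model_version,
        prompt_summary=prompt_summary,
        human_owner=human_owner,
        created_at=datetime.now(timezone.utc).isoformat(),
    )


if __name__ == "__main__":
    record = build_record(
        path="services/payments/handler.py",          # hypothetical artifact
        content=b"def handle(event): ...\n",
        model_name="example-code-assistant",          # hypothetical model name
        model_version="2025.1",
        prompt_summary="Refactored retry logic per ticket PAY-123.",
        human_owner="jane.doe@example.com",
    )
    print(json.dumps(asdict(record), indent=2))
```

A record like this can be stored alongside the artifact or emitted into existing audit pipelines, so governance rides on DevOps workflows rather than sitting outside them.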

3. Build Learning Climates and Developer Literacy

AI’s rapid integration is redefining the skills needed in modern software delivery. To stay competitive, organizations must foster adaptive learning cultures:

  • Upskill and Reskill Developers: Provide structured training on AI-assisted coding tools, prompt engineering, and model evaluation.
  • Encourage Experimentation: Allocate “innovation sprints” for teams to safely test AI tools in sandboxed environments before production rollout.
  • Psychological Safety: Encourage open discussions around AI errors or limitations to build trust and collective learning rather than fear of accountability.

According to theCUBE Research AppDev data, teams with active learning programs are 3.5x more likely to report considerable gains in delivery performance from AI adoption.

4. Shift from Annual Metrics to Continuous Measurement

Just as DORA is evolving its research delivery cadence, organizations should move away from static performance snapshots.

  • Adopt Continuous Measurement Systems: Leverage telemetry and performance analytics to track flow efficiency, deployment frequency, and MTTR (mean time to recovery) in real time.
  • Benchmark Against AI-Augmented Baselines: Adjust KPIs to account for AI acceleration so that they measure impact, not just activity.
  • Iterate Research and Feedback: Use internal retrospectives, developer surveys, and AI performance dashboards to continuously refine operational insights.

This approach allows teams to quickly adapt to the changing impact of AI across the software lifecycle.
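As a minimal illustration of continuous measurement, the sketch below derives two familiar signals, deployment frequency and MTTR, from streams of deployment and incident events. The event shapes are assumptions; in practice the data would come from CI/CD and incident-management tooling and feed a live dashboard.

```python
"""Minimal sketch of computing two DORA-style metrics from event logs.

The event structures are illustrative assumptions; real data would come
from CI/CD and incident-management tooling.
"""

from datetime import datetime, timedelta


def deployment_frequency(deploy_times: list[datetime],
                         window_days: int = 30) -> float:
    """Average deployments per day over the most recent window."""
    if not deploy_times:
        return 0.0
    cutoff = max(deploy_times) - timedelta(days=window_days)
    recent = [t for t in deploy_times if t >= cutoff]
    return len(recent) / window_days


def mean_time_to_recovery(incidents: list[tuple[datetime, datetime]]) -> float:
    """MTTR in hours, given (detected_at, resolved_at) pairs."""
    if not incidents:
        return 0.0
    total_seconds = sum((resolved - detected).total_seconds()
                        for detected, resolved in incidents)
    return total_seconds / len(incidents) / 3600.0


if __name__ == "__main__":
    # Hypothetical sample data: a deploy every other day, two incidents.
    deploys = [datetime(2025, 10, day, 12, 0) for day in range(1, 31, 2)]
    incidents = [
        (datetime(2025, 10, 5, 9, 0), datetime(2025, 10, 5, 10, 30)),
        (datetime(2025, 10, 20, 14, 0), datetime(2025, 10, 20, 15, 0)),
    ]
    print(f"deploys/day (30d): {deployment_frequency(deploys):.2f}")
    print(f"MTTR (hours): {mean_time_to_recovery(incidents):.2f}")
```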

5. Align AI Investments with Business Outcomes

DORA’s 2025 report warns of “chaotic activity” and the risk of pouring resources into AI without measurable outcomes. To avoid this:

  • Define Clear Value Metrics: Tie AI investments to tangible performance outcomes such as deployment speed, defect rate reduction, or customer satisfaction.
  • Balance Innovation and Necessity: Prioritize AI use cases that enhance developer experience or solve existing bottlenecks rather than chasing competitive parity.
  • Allocate Budgets Strategically: Treat AI spend as an operational capability investment, not a one-time project expense.

Organizations that link AI initiatives to strategic goals, rather than chasing trends, will sustain long-term value.

6. Engage in the DORA and Practitioner Community

As DORA evolves into a more iterative, community-driven research model, organizations have a unique opportunity to influence industry benchmarks:

  • Contribute Data and Insights: Participate in DORA’s micro-surveys and share anonymized performance data to shape collective learning.
  • Collaborate Across Ecosystems: Join DevOps and AI practitioner forums to exchange implementation practices and success patterns.
  • Apply Research Rapidly: Treat DORA’s ongoing updates as continuous improvement prompts rather than annual checklists.

This engagement strengthens both organizational maturity and the collective intelligence of the DevOps ecosystem.

7. Evolve Developer Experience (DevEx) for the AI Era

The infusion of AI into every layer of development is shifting how developers interact with tools, teams, and workflows. Organizations should:

  • Streamline Toolchains: Reduce cognitive load by integrating AI assistants directly into IDEs and CI/CD systems.
  • Promote Transparency: Clearly communicate what AI systems are doing, how they make decisions, and how outputs are validated.
  • Measure Developer Sentiment: Track metrics like tool satisfaction, perceived autonomy, and trust in AI suggestions.

A strong DevEx focus ensures AI remains an enabler of creativity and efficiency, not a source of friction or distrust.
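To give the developer-sentiment point one concrete shape, the sketch below aggregates hypothetical survey responses on tool satisfaction, perceived autonomy, and trust in AI suggestions into per-dimension averages. The dimensions and the 1–5 scale are assumptions, not a standard survey instrument.

```python
"""Minimal sketch of aggregating developer-sentiment survey responses.

The dimensions and the 1-5 scale are illustrative assumptions, not a
standard survey instrument.
"""

from collections import defaultdict
from statistics import mean

# Each response scores three DevEx dimensions on a 1-5 scale.
DIMENSIONS = ("tool_satisfaction", "perceived_autonomy", "trust_in_ai")


def summarize(responses: list[dict[str, int]]) -> dict[str, float]:
    """Return the average score per dimension across all responses."""
    scores: dict[str, list[int]] = defaultdict(list)
    for response in responses:
        for dim in DIMENSIONS:
            if dim in response:
                scores[dim].append(response[dim])
    return {dim: round(mean(vals), 2) for dim, vals in scores.items() if vals}


if __name__ == "__main__":
    # Hypothetical pulse-survey responses from three developers.
    sample = [
        {"tool_satisfaction": 4, "perceived_autonomy": 3, "trust_in_ai": 2},
        {"tool_satisfaction": 5, "perceived_autonomy": 4, "trust_in_ai": 3},
        {"tool_satisfaction": 3, "perceived_autonomy": 4, "trust_in_ai": 4},
    ]
    print(summarize(sample))
```

Tracked over time, a low trust score relative to satisfaction can flag exactly the friction or distrust the DevEx focus is meant to prevent.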

Conclusion

The evolution of DORA research reflects a broader industry truth: AI is not replacing DevOps, but rather, it’s redefining it. Organizations that blend intelligent automation with human accountability, continuous learning, and agile measurement will outperform peers trapped in static or reactive approaches.

By committing to foundational practices, establishing governance, and engaging with the research community, enterprises can turn AI’s complexity into a catalyst for enduring software excellence.

Authors

  • Paul Nashawaty

    Paul Nashawaty, Practice Leader and Lead Principal Analyst, specializes in application modernization across build, release and operations. With a wealth of expertise in digital transformation initiatives spanning front-end and back-end systems, he also possesses comprehensive knowledge of the underlying infrastructure ecosystem crucial for supporting modernization endeavors. With over 25 years of experience, Paul has a proven track record in implementing effective go-to-market strategies, including the identification of new market channels, the growth and cultivation of partner ecosystems, and the successful execution of strategic plans resulting in positive business outcomes for his clients.

  • Sam Weston

    With over 15 years of hands-on experience in operations roles across legal, financial, and technology sectors, Sam Weston brings deep expertise in the systems that power modern enterprises such as ERP, CRM, HCM, CX, and beyond. Her career has spanned the full spectrum of enterprise applications, from optimizing business processes and managing platforms to leading digital transformation initiatives.

    Sam has transitioned her expertise into the analyst arena, focusing on enterprise applications and the evolving role they play in business productivity and transformation. She provides independent insights that bridge technology capabilities with business outcomes, helping organizations and vendors alike navigate a changing enterprise software landscape.
