
Capability Maturity Model (CMM) - Humphrey (1989)

Framework Identification

Framework Name: Capability Maturity Model for Software

Framework Abbreviation: CMM

Target of Framework: Assessment and improvement of software development process maturity through a five-level framework emphasizing standardization, measurement, and continuous advancement enabling organizations to predict and control software development costs, schedules, and quality

Disciplinary Origin: Software Engineering, Process Management, Quality Management, Organizational Capability Development, Technology Management

Theory Publication Information

Author: Watts S. Humphrey

Formal Publication Date: 1989

Official Title: Managing the Software Process

Publisher: Addison-Wesley

Book Format: Authored book, not journal article

ISBN: 978-0-201-18095-4

Citation Information

APA (7th ed.)

Humphrey, W. S. (1989). Managing the software process. Addison-Wesley.

Chicago (Author-Date)

Humphrey, Watts S. 1989. Managing the Software Process. Reading, MA: Addison-Wesley.

Why Was the Model Created?

Watts Humphrey developed the Capability Maturity Model in direct response to a critical crisis in U.S. Department of Defense software development practices. Throughout the 1970s and 1980s, the Department of Defense managed thousands of software development contracts with varying outcomes. Some contractors consistently delivered software on time and within budget, while others chronically overran schedules and budgets and fell short of quality expectations. Similar contracts under similar management showed vast differences in software development outcomes. The Department of Defense and its contractors struggled to understand what distinguished successful software organizations from chronically troubled ones.

Humphrey recognized that the software development field lacked fundamental practices and processes that were well-established in other engineering disciplines. Manufacturing and hardware development had standardized processes, documented procedures, and measurement systems enabling prediction and control of outcomes. Software development, by contrast, was often ad hoc, driven by individual developer heroics, and lacking systematic processes or quality measurement. Software success depended more on having exceptional developers than on having good processes; this created unpredictability and limited scaling. Humphrey proposed that the solution required establishing a scientific basis for software process management through process definition, process measurement, and systematic process improvement.

The CMM emerged from Humphrey’s work at the Software Engineering Institute (SEI) at Carnegie Mellon University, chartered by the U.S. Department of Defense to improve software development practices. Humphrey synthesized insights from quality management, manufacturing engineering, and organizational development into a framework specifically tailored to software development process improvement. The framework proposed that software development organizations evolve through predictable maturity levels, each level building on the previous level and enabling improved predictability, control, and effectiveness.

What Does the Model Measure?

Source note: The project’s Zotero library does not contain a PDF of Humphrey (1989), Managing the Software Process. Claims on this page are anchored to the SEI/CMU formalization of the CMM published as Paulk, Curtis, Chrissis, & Weber (1993), Capability Maturity Model, Version 1.1 (IEEE Software, July 1993), for which a PDF is available in the project’s Zotero library. Per Paulk et al. (1993, p. 18), the SEI released a brief description of the process-maturity framework in September 1987, which “was later expanded in Watts Humphrey’s book, Managing the Software Process” (1989), and subsequently evolved into the Capability Maturity Model.

CMM measures process maturity on an ordinal five-point scale (Paulk et al., 1993, p. 21), not a continuous psychometric scale. The measurement apparatus consists of:

  • Five Maturity Levels (1-5): An ordinal scale for measuring process maturity and evaluating process capability. Each level “comprises a set of process goals that, when satisfied, stabilize an important component of a software process” (Paulk et al., 1993, p. 21).
  • Key Process Areas (KPAs): Each maturity level (except Level 1) is characterized by specific KPAs that must be institutionalized. Achievement of the KPAs, assessed via a maturity questionnaire and on-site appraisal, is the operational criterion for assigning a level.
  • Maturity Questionnaire: The SEI developed a maturity questionnaire (Paulk et al., 1993, p. 18) as a preliminary assessment instrument. Formal appraisals use Software Process Assessment (SPA) or Software Capability Evaluation (SCE).
  • Process measures at Level 4+: Organizations operating at Maturity Level 4 (Managed) and above collect quantitative process and product measurements (Paulk et al., 1993, pp. 22-23) against which process performance can be controlled statistically.

CMM does not provide a validated instrument for measuring software quality itself, developer skill, or project success. It measures process maturity only. The relationship between maturity level and project outcomes is a claim CMM proponents advanced; see the Internal Validity section for caveats.

Core Concepts and Definitions

The Capability Maturity Model is built on fundamental concepts about how organizations mature in process capability:

  • Process Maturity: The extent to which an organization has explicitly defined, documented, and standardized software development processes. Maturity progresses from ad hoc, undocumented practices (initial) to standardized, measured, optimized processes (optimizing).
  • Key Process Areas (KPAs): Functional elements of software processes that must be established and institutionalized at each maturity level. Each level requires implementation of specific KPAs and integration with existing processes.
  • Process Institutionalization: The extent to which practices are documented in standards, procedures, and training; embedded in organizational routines; assessed for compliance; and reinforced through organizational culture and management attention.
  • Predictability: The capability to reliably predict software project outcomes (schedule, budget, quality) based on historical data and process knowledge. Predictability increases with maturity level.
  • Measurement: Systematic collection of project metrics and process metrics enabling understanding of process performance, identification of improvement opportunities, and evidence-based process adjustments.
  • Process Improvement: Systematic activities to evaluate processes, identify improvement opportunities, experiment with improvements, and institutionalize successful improvements.
  • Quality Assurance: Activities to ensure that software development follows defined processes, product quality meets standards, and problems are identified and corrected.

Preceding Models or Theories

Paulk et al. (1993, p. 21) explicitly credit the staged structure of the CMM to “principles of product quality espoused by Walter Shewhart, W. Edwards Deming, Joseph Juran, and Philip Crosby.” The CMM is a direct application of this quality-management lineage to the domain of software engineering.

  • Statistical Process Control (Shewhart, 1931): The foundational distinction between common-cause and special-cause variation, and the use of control limits to know when to intervene, are inherited directly by CMM Level 4 (“Managed”), where project processes are controlled statistically (Paulk et al., 1993, p. 22).
  • TQM / Deming philosophy (Deming, 1982/1986): Continuous improvement, the 14 Points, and the PDCA cycle inform CMM’s process-improvement orientation. The evolutionary-small-steps framing of CMM (Paulk et al., 1993, p. 21) closely echoes Deming.
  • Juran on quality management: Juran’s quality trilogy (planning, control, improvement) parallels the CMM’s movement from disciplined planning (Level 2) through process definition (Level 3), quantitative control (Level 4), and optimization (Level 5).
  • Crosby on quality maturity: Crosby’s own maturity grid (Crosby, 1979, Quality Is Free) is a closer organizational precursor: a five-stage scale for quality-management maturity that parallels the CMM’s five-level staging.
  • Humphrey’s process-maturity framework (SEI, 1987): The immediate precursor. Paulk et al. (1993, p. 18) describe how the SEI released a brief description of the process-maturity framework in September 1987, expanded in Humphrey’s 1989 book Managing the Software Process, and subsequently refined into CMM v1.0 (1991) and v1.1 (1993).
  • Software engineering context (Boehm, Brooks): The CMM was developed within the mature software-engineering literature of the 1980s, including Boehm (1988) on the spiral model and Brooks (1975/1995) on the inherent difficulties of software development. These form the domain context rather than direct inputs to the CMM’s staged structure.
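The Shewhart mechanism that Level 4 inherits can be illustrated numerically. The sketch below uses standard three-sigma control limits and entirely hypothetical defect-density data; it is an illustration of the inherited technique, not a procedure from Humphrey (1989) or Paulk et al. (1993):

```python
import statistics

def control_limits(samples, sigmas=3.0):
    """Shewhart-style control limits: the process mean plus/minus a
    multiple of the observed standard deviation. Points outside the
    limits signal special-cause variation worth investigating."""
    mean = statistics.mean(samples)
    sd = statistics.stdev(samples)
    return mean - sigmas * sd, mean + sigmas * sd

# Hypothetical history: defects per thousand lines of code on recent projects.
history = [4.1, 3.8, 4.4, 3.9, 4.2, 4.0, 4.3]
lcl, ucl = control_limits(history)

# A new project at 6.5 falls outside the limits; the stable history does not.
out_of_control = [x for x in history + [6.5] if not (lcl <= x <= ucl)]
```

Under this framing, values inside the limits reflect common-cause variation and call for no intervention; the excursion to 6.5 is the kind of special-cause signal a Level 4 organization is expected to detect and investigate.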

Description of the Model

The Capability Maturity Model proposes that software development organizations progress through five maturity levels, each representing greater process definition, standardization, measurement, and optimization capability. Organizations at higher maturity levels can more reliably predict and control software development outcomes, achieve better quality with fewer surprises, and execute complex projects with lower risk. Progression through maturity levels requires systematic implementation of Key Process Areas specific to each level, followed by organizational institutionalization of new practices.

Five Maturity Levels (Paulk et al., 1993, pp. 21-23)

Level names and characterizations below follow Paulk et al. (1993), the SEI/CMU formalization that builds on Humphrey (1989). Note: later models including CMMI renamed Levels 2 and 4 (“Managed” and “Quantitatively Managed”), but the CMM v1.1 names are as given here.

  • Level 1 - Initial: The software process is ad hoc, occasionally chaotic. Few processes are defined and success depends on individual heroics. “At level 1, capability is a characteristic of individuals, not organizations” (Paulk et al., 1993, p. 21). Organizations at Level 1 may still deliver functioning products, but cannot reliably repeat success.
  • Level 2 - Repeatable: Basic project management processes are established to track cost, schedule, and functionality. The CMM v1.1 Key Process Areas at Level 2 are (per Paulk et al., 1993): Requirements Management, Software Project Planning, Software Project Tracking and Oversight, Software Subcontract Management, Software Quality Assurance, and Software Configuration Management. Process capability is summarized as “disciplined”; earlier successes can be repeated.
  • Level 3 - Defined: Both software-engineering and management processes are documented, standardized, and integrated into an organization’s standard software process. The CMM v1.1 Key Process Areas at Level 3 are: Organization Process Focus, Organization Process Definition, Training Program, Integrated Software Management, Software Product Engineering, Intergroup Coordination, and Peer Reviews. Process capability is summarized as “standard and consistent” (Paulk et al., 1993, p. 22).
  • Level 4 - Managed: Detailed measurements of the software process and product quality are collected and analyzed. The CMM v1.1 Key Process Areas at Level 4 are: Quantitative Process Management and Software Quality Management. Projects control products and processes by narrowing variation to fall within “acceptable quantitative boundaries” (Paulk et al., 1993, p. 22). Process capability is summarized as “quantifiable and predictable.”
  • Level 5 - Optimizing: The entire organization is focused on continuous process improvement. The CMM v1.1 Key Process Areas at Level 5 are: Defect Prevention, Technology Change Management, and Process Change Management. Teams analyze defects to determine causes and propose changes to prevent recurrence (Paulk et al., 1993, pp. 22-23).
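The level/KPA structure above can be modeled as data. The sketch below uses the CMM v1.1 KPA names per Paulk et al. (1993) but deliberately simplifies the assignment rule to a subset check; a real appraisal relies on the maturity questionnaire and on-site evaluation, not a boolean test:

```python
# CMM v1.1 Key Process Areas by maturity level (names per Paulk et al., 1993).
# Level 1 has no KPAs: it is the default starting state.
KPAS_BY_LEVEL = {
    2: {"Requirements Management", "Software Project Planning",
        "Software Project Tracking and Oversight",
        "Software Subcontract Management", "Software Quality Assurance",
        "Software Configuration Management"},
    3: {"Organization Process Focus", "Organization Process Definition",
        "Training Program", "Integrated Software Management",
        "Software Product Engineering", "Intergroup Coordination",
        "Peer Reviews"},
    4: {"Quantitative Process Management", "Software Quality Management"},
    5: {"Defect Prevention", "Technology Change Management",
        "Process Change Management"},
}

def assessed_level(satisfied):
    """Simplified ordinal assignment: the highest level whose KPAs, and
    those of every lower level, are all satisfied."""
    level = 1
    for lvl in sorted(KPAS_BY_LEVEL):
        if KPAS_BY_LEVEL[lvl] <= set(satisfied):
            level = lvl
        else:
            break  # an unmet KPA at a lower level caps the rating
    return level

# Satisfying all Level 2 KPAs plus Peer Reviews still rates only Level 2,
# because the remaining Level 3 KPAs are unmet.
org = KPAS_BY_LEVEL[2] | {"Peer Reviews"}
```

The `break` encodes the staged character of the model: levels form an ordinal ladder, so isolated higher-level practices do not raise the rating while a lower rung is incomplete.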

Key Mechanisms

  • Key Process Areas (KPAs): Functional elements that must be established at each maturity level. KPAs define what practices must be implemented and institutionalized for organizations to achieve that maturity level.
  • Common Features: Characteristics present in organizations implementing KPAs effectively: commitment to perform, ability to perform, activities performed, measurement and analysis, verification of implementation.
  • Process Definition and Documentation: Software processes must be explicitly defined, documented in standards and procedures, and made accessible to software developers.
  • Process Measurement: Organizations collect metrics on process performance, project performance, and product quality. Metrics enable understanding of process effectiveness and identification of improvement opportunities.
  • Organizational Learning and Improvement: Organizations capture lessons learned, analyze process data, and systematically improve processes based on evidence.
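The process-measurement mechanism can be made concrete with two metrics commonly collected in CMM-style programs, schedule variance and defect density. These particular formulas and the project records are illustrative assumptions, not measures prescribed verbatim by the source:

```python
def schedule_variance_pct(planned_days, actual_days):
    """Percent deviation from the planned schedule; positive means late."""
    return 100.0 * (actual_days - planned_days) / planned_days

def defect_density(defects_found, ksloc):
    """Defects found per thousand source lines of code (KSLOC)."""
    return defects_found / ksloc

# Hypothetical project records for trend analysis.
projects = [
    {"planned": 100, "actual": 130, "defects": 90, "ksloc": 20},
    {"planned": 80,  "actual": 84,  "defects": 30, "ksloc": 15},
]
variances = [schedule_variance_pct(p["planned"], p["actual"]) for p in projects]
densities = [defect_density(p["defects"], p["ksloc"]) for p in projects]
```

Tracked across projects, such metrics give the historical baseline against which later projects can be planned and, at Level 4, statistically controlled.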

Main Strengths

  • Directly addresses software development challenges: Developed specifically for software context, addressing documented software development problems and challenges.
  • Practical assessment methodology: Provides concrete assessment criteria and detailed practices enabling organizations to assess current maturity level and identify improvement path.
  • Predictable progression path: Specifies maturity progression path with clear criteria for moving between levels. Organizations understand what is required at each level.
  • Claimed business value: SEI and consulting studies through the 1990s-2000s (e.g., Herbsleb & Goldenson, 1996; Galin & Avrahami, 2006) reported that organizations advancing maturity levels saw improvements in schedule predictability, cost control, and defect density. Attribution of performance improvements specifically to CMM (as opposed to selection bias, co-occurring process investments, or broader management change) is contested in the empirical literature.
  • Addresses scaling challenges: Enables organizations to scale beyond individual hero developers to reliable organizational capability.
  • Measurement emphasis: Emphasizes measurement and data-driven decision-making enabling evidence-based improvement and objective progress assessment.

Main Weaknesses

  • Slow progression through levels: Moving from one maturity level to the next typically requires 2-4 years of sustained effort. Organizations seeking rapid improvement find CMM progression slow.
  • Risk of bureaucratization: Process documentation and standardization can create bureaucracy if not managed carefully. Excessive documentation and process adherence can inhibit innovation.
  • Complex assessment and implementation: CMM assessment is expensive and time-consuming. Implementation requires significant organizational change and resource commitment.
  • Limited applicability to innovation and small projects: CMM emphasis on standardized processes may be less applicable to innovative work, research projects, or small software developments requiring flexibility.
  • Context and industry variation: CMM was developed for DoD software development. Applicability to other industries, contexts, or software types (e.g., web development, startup environments) may vary.
  • Agile methodology tensions: CMM emphasis on planning and documentation can be in tension with Agile methodologies emphasizing iterative development and adaptive planning. Integrating CMM and Agile practices remains challenging.
  • Organizational culture requirements: CMM implementation requires significant organizational culture change. Organizations resistant to process discipline struggle with CMM adoption.

Key Contributions

  • Brought process discipline to software development: Argued that software development could benefit from process discipline, standardization, and measurement similar to that of manufacturing and engineering disciplines.
  • Defined software maturity progression: Provided framework for understanding organizational software development maturity and maturity progression path.
  • Operationalized software process improvement: Provided concrete practices and assessment criteria enabling organizations to systematically improve software development processes.
  • Advanced process measurement in software: Emphasized measurement and metrics as central to process management, contributing to subsequent software engineering practice.
  • Addressed software scaling challenges: Provided framework for organizations to scale beyond small team dynamics to organizational capability.
  • Created standardization opportunity: Provided common framework enabling software organizations globally to assess and compare maturity levels.
  • Foundation for subsequent frameworks: Provided foundation for CMMI (Capability Maturity Model Integration), ISO/IEC 15504, and other software process improvement frameworks.
  • Influenced DoD and government practice: Influenced procurement practices and contractor selection, driving widespread CMM adoption across software industry.

Internal Validity

CMM is a practitioner-oriented process framework rather than a tested empirical theory. “Internal validity” here is assessed as logical coherence, the ordinal structure of the five-level scale, and fidelity to the quality-management lineage the SEI explicitly cites:

  • Logical progression: The maturity level progression logically builds: organizations must have basic project management before standardizing processes; must standardize processes before measuring them quantitatively; must measure before optimizing.
  • Grounding in software engineering experience: The framework emerged from empirical investigation of successful and unsuccessful software development organizations. Identified practices reflect observed patterns in high-performing organizations.
  • Consistency with process management theory: The framework incorporates established process management concepts: process definition, measurement, control, and improvement.
  • Addresses documented software problems: The framework addresses well-documented software development problems: schedule predictability, quality consistency, cost control, and scaling challenges.
  • Mutual reinforcement of practices: Practices at each level build on and reinforce each other. Process definition enables measurement; measurement enables control; control enables optimization.
  • Clear assessment criteria: Framework provides clear, observable criteria for assessing maturity level enabling objective assessment rather than subjective judgment.

External Validity

External validity considerations concern generalizability of CMM across diverse software development contexts and organizational types:

  • DoD context origin: CMM was developed in Department of Defense context with emphasis on large-scale, mission-critical software. Applicability to commercial software development, startups, or web development may differ.
  • Software development context variation: CMM applicability may vary by software type. Applicability to innovative research software, exploratory prototypes, or Agile development may be less straightforward than to mission-critical systems.
  • Organizational size effects: CMM was developed in large organizations with dedicated process management resources. Applicability to small software organizations or startups may be limited due to resource constraints.
  • Agile methodology tensions: CMM emphasis on planning and documentation can conflict with Agile methodologies. Organizations using Agile approaches may struggle to implement CMM practices.
  • Industry and cultural variation: CMM was developed in American software engineering culture. Applicability to different national cultures or organizational contexts may require adaptation.
  • Organizational resistance: CMM implementation requires significant organizational change. Organizations with strong resistance to process discipline may struggle with adoption.
  • Innovation and flexibility concerns: CMM emphasis on standardization and documentation may inhibit innovation in certain contexts requiring high flexibility and rapid adaptation.
  • Measurement capability variation: CMM implementation requires organizational capability to collect, analyze, and act on process metrics. Organizations lacking analytical capability may struggle.

Relevance to Technology Adoption

CMM explains organizational success with technology adoption through process capability and maturity progression. Organizations at higher CMM maturity levels have standardized, measured processes enabling more effective technology adoption. Technology adoption success depends on organizational capability to define requirements, plan implementation systematically, track implementation progress, manage changes rigorously, and measure adoption outcomes. Organizations at CMM Level 1 (ad hoc) struggle with technology adoption because they lack basic planning, tracking, and management processes. Organizations at CMM Level 3+ (defined/standardized) have established processes enabling more systematic, predictable technology adoption. CMM predicts that technology adoption success correlates with organizational maturity level.

Barriers to Technology Adoption Identified

  • Lack of defined processes: Organizations without defined software development processes struggle to define technology adoption requirements systematically or plan adoption implementation.
  • Weak project management: Absence of basic project management practices (planning, tracking, change management) creates chaos in technology adoption projects.
  • Insufficient measurement: Organizations lacking measurement systems cannot track adoption progress or identify problems early.
  • Uncontrolled change: Organizations without change management processes cannot manage the scope, impact, and pace of technology adoption changes.
  • Poor requirements definition: Organizations without requirements management practices struggle to define what success means for technology adoption.
  • Inadequate quality assurance: Organizations without quality assurance practices may not identify technology adoption problems until late in implementation.
  • Lack of lessons capture: Organizations may repeat technology adoption mistakes because they do not systematically capture and share lessons learned.

Leadership Actions the Framework Prescribes

  • Establish basic project management practices: Implement fundamental project planning, tracking, and control processes for technology adoption projects.
  • Define adoption process standards: Establish organizational standards for how technology adoption will be managed, including planning, testing, rollout, and support.
  • Implement requirements management: Define and manage technology adoption requirements systematically. Ensure requirements are clear, traceable, and validated.
  • Establish change management: Implement formal change control processes for technology adoption changes. Control scope changes and prevent unmanaged growth.
  • Implement measurement and tracking: Collect metrics on adoption progress, schedule conformance, budget conformance, and quality. Use metrics to guide management decisions.
  • Institute quality assurance: Establish QA practices to verify that technology adoption follows defined processes and meets quality standards.
  • Systematize lessons capture: Document lessons learned from technology adoption experiences. Share lessons across the organization to improve future adoptions.
  • Pursue continuous improvement: Treat each technology adoption as an opportunity to improve adoption processes. Incorporate improvements into organizational standards for future adoptions.

Following Models or Theories

The Capability Maturity Model sits at the start of a significant stream of theoretical and practical developments around process capability, with subsequent extensions building on and refining the original framework:

  • CMMI (Capability Maturity Model Integration, 2000): Extended and integrated CMM with related models (IPPD, Systems Engineering) into unified framework. CMMI incorporated lessons learned from CMM implementations and addressed CMM limitations.
  • ISO/IEC 15504 (SPICE): International standard for software process assessment based on CMM concepts but providing more flexible assessment approach and broader process areas.
  • Agile and Iterative Models (Beck, 2000; Schwaber & Sutherland, 2002): Developed as alternatives to CMM-style formal process discipline, emphasizing adaptive planning and iterative development. Later work explored integration of Agile and CMM approaches.
  • DevOps and Continuous Integration (Humble & Farley, 2010): Extended software process concepts to emphasize continuous integration, deployment, and feedback. Incorporated measurement and automation concepts from CMM.
  • Lean Software Development (Poppendieck & Poppendieck, 2003): Applied Lean principles to software development, emphasizing value delivery and waste elimination while incorporating process discipline concepts from CMM.
  • Organizational Process Asset (OPA) Management: Extended CMM concepts to managing organizational knowledge and process assets across the enterprise.
  • Software Process Simulation: Used computational models to simulate software process performance and explore process improvement scenarios.
  • Process Mining and Analytics: Applied data analytics to process execution data to understand, analyze, and improve software development processes.

References

  1. Humphrey, W. S. (1989). Managing the Software Process. Addison-Wesley. ISBN 978-0-201-18095-4. (PDF not available in project Zotero.)
  2. Paulk, M. C., Curtis, B., Chrissis, M. B., & Weber, C. V. (1993). Capability Maturity Model, Version 1.1. IEEE Software, 10(4), 18-27. SEI/CMU. (This is the authoritative SEI paper formalizing the Humphrey 1989 framework; primary source available in project Zotero library and used as ground truth for this page’s claims about level names, Key Process Areas, and lineage.)
  3. Shewhart, W. A. (1931). Economic Control of Quality of Manufactured Product. D. Van Nostrand Company.
  4. Crosby, P. B. (1979). Quality is Free: The Art of Making Quality Certain. McGraw-Hill. (Source of an earlier five-stage quality-management maturity grid.)
  5. Beck, K. (2000). Extreme programming explained: Embrace change. Addison-Wesley.
  6. Poppendieck, M., & Poppendieck, T. (2003). Lean software development: An agile toolkit. Addison-Wesley.
  7. Schwaber, K., & Sutherland, J. (2002). The Scrum guide: The definitive guide to Scrum. Scrum.org.
  8. Humble, J., & Farley, D. (2010). Continuous delivery: Reliable software releases through build, test, and deployment automation. Addison-Wesley.

Further Reading

  1. Carnegie Mellon University Software Engineering Institute. (1993). Capability maturity model for software (Version 1.1). CMU/SEI-93-TR-24.
  2. Chrissis, M. B., Konrad, M., & Shrum, S. (2003). CMMI: Guidelines for process integration and product improvement. Addison-Wesley.
  3. International Organization for Standardization. (2007). ISO/IEC 15504-1: Information technology - Process assessment. ISO.
  4. Pressman, R. S., & Maxim, B. R. (2014). Software engineering: A practitioner’s approach (8th ed.). McGraw-Hill.
  5. Sommerville, I. (2015). Software engineering (10th ed.). Addison-Wesley.
  6. Boehm, B. W. (1988). A spiral model of software development and enhancement. Computer, 21(5), 61-72.
  7. Brooks, F. P. (1995). The mythical man-month: Essays on software engineering (anniversary ed.). Addison-Wesley.
