
A Software Engineering Process Group (SEPG) is an organization's focal point for software process improvement activities.

These individuals perform assessments of organizational capability, develop plans to implement needed improvements, coordinate the implementation of those plans, and measure the effectiveness of these efforts. Successful SEPGs require specialized skills and knowledge of many areas outside traditional software engineering. [1] The following are ongoing activities of the process group [1]:
- Obtains and maintains the support of all levels of management.
- Facilitates software process assessments.
- Works with line managers whose projects are affected by changes in software engineering practice, providing a broad perspective of the improvement effort and helping them set expectations.
- Maintains collaborative working relationships with software engineers, especially to obtain, plan for, and install new practices and technologies.
- Arranges for any training or continuing education related to process improvements.
- Tracks, monitors, and reports on the status of particular improvement efforts.
- Facilitates the creation and maintenance of process definitions, in collaboration with managers and engineering staff.
- Maintains a process database.
- Provides process consultation to development projects and management.

Capability Maturity Model (CMM) broadly refers to a process improvement approach that is based on a process model. CMM also refers specifically to the first such model, developed by the Software Engineering Institute (SEI) in the mid-1980s, as well as the family of process models that followed. A process model is a structured collection of practices that describe the characteristics of effective processes; the practices included are those proven by experience to be effective.

CMM can be used to assess an organization against a scale of five process maturity levels. Each level ranks the organization according to its standardization of processes in the subject area being assessed. The subject areas can be as diverse as software engineering, systems engineering, project management, risk management, system acquisition, information technology (IT) services, and personnel management.

CMM was developed by the SEI at Carnegie Mellon University in Pittsburgh. It has been used extensively for avionics software and government projects in North America, Europe, Asia, Australia, South America, and Africa. Currently, some government departments require software development contracting organizations to achieve and operate at a level 3 standard.

History
The Capability Maturity Model was initially funded by military research. The United States Air Force funded a study at the Carnegie-Mellon Software Engineering Institute to create a model (abstract) for the military to use as an objective evaluation of software subcontractors. The result was the Capability Maturity Model, published as Managing the Software Process in 1989. The CMM is no longer supported by the SEI and has been superseded by the more comprehensive Capability Maturity Model Integration (CMMI).

Maturity Model
The Capability Maturity Model (CMM) is a way to develop and refine an organization's processes. The first CMM was for the purpose of developing and refining software development processes. A maturity model is a structured collection of elements that describe characteristics of effective processes. A maturity model provides:
- a place to start
- the benefit of a community's prior experiences
- a common language and a shared vision
- a framework for prioritizing actions
- a way to define what improvement means for your organization

A maturity model can be used as a benchmark for assessing different organizations for equivalent comparison. It describes the maturity of a company based on the projects the company is undertaking and the clients it serves.

Context
In the 1970s, technological improvements made computers more widespread, flexible, and inexpensive. Organizations began to adopt more and more computerized information systems, and the field of software development grew significantly. This led to an increased demand for developers and managers, which was satisfied with less experienced professionals. Unfortunately, this influx caused growing pains: project failure became more commonplace, not only because the field of computer science was still in its infancy, but also because projects became more ambitious in scale and complexity. In response, individuals such as Edward Yourdon, Larry Constantine, Gerald Weinberg, Tom DeMarco, and David Parnas published articles and books with research results in an attempt to professionalize the software development process.

Watts Humphrey's Capability Maturity Model (CMM) was described in the book Managing the Software Process (1989). The CMM as conceived by Humphrey was based on the earlier work of Phil Crosby. Active development of the model by the SEI began in 1986. The CMM was originally intended as a tool to evaluate the ability of government contractors to perform a contracted software project. Though it comes from the area of software development, it can be, has been, and continues to be widely applied as a general model of the maturity of processes in IS/IT (and other) organizations.

The model identifies five levels of process maturity for an organisation. Within each of these maturity levels are KPAs (Key Process Areas) which characterise that level, and for each KPA there are five definitions identified:
1. Goals
2. Commitment
3. Ability
4. Measurement
5. Verification

The KPAs are not necessarily unique to CMM, representing - as they do - the stages that organizations must go through on the way to becoming mature. The assessment is supposed to be led by an authorised lead assessor. One way in which companies are supposed to use the model is first to assess their maturity level and then form a specific plan to get to the next level. Skipping levels is not allowed.

Timeline
- 1987: SEI-87-TR-24 (SW-CMM questionnaire) released.
- 1989: Managing the Software Process published.
- 1991: SW-CMM v1.0 released.
- 1993: SW-CMM v1.1 released.
- 1997: SW-CMM revisions halted in favor of CMMI.
- 2000: CMMI v1.02 released.
- 2002: CMMI v1.1 released.
- 2006: CMMI v1.2 released.

Current state
Although these models have proved useful to many organizations, the use of multiple models has been problematic. Further, applying multiple models that are not integrated within and across an organization is costly in terms of training, appraisals, and improvement activities. The CMM Integration project was formed to sort out the problem of using multiple CMMs. The CMMI Product Team's mission was to combine three source models:

1. The Capability Maturity Model for Software (SW-CMM) v2.0 draft C
2. The Systems Engineering Capability Model (SECM)
3. The Integrated Product Development Capability Maturity Model (IPD-CMM) v0.98
4. Supplier sourcing

CMMI is the designated successor of the three source models. The SEI has released a policy to sunset the Software CMM and previous versions of the CMMI. The same can be said for the SECM and the IPD-CMM; these models were superseded by CMMI.

Future direction
With the release of the CMMI Version 1.2 Product Suite, the existing CMMI has been renamed the CMMI for Development (CMMI-DEV), V1.2. Two other versions are being developed, one for Services, and the other for Acquisitions. In some cases, CMM can be combined with other methodologies. It is commonly used in conjunction with the ISO 9001 standard, as well as with the computer programming methodologies of Extreme Programming (XP), and Six Sigma.

Levels of the CMM


There are five levels of the CMM:
Level 1 - Initial

- Processes are usually ad hoc, and the organization usually does not provide a stable environment. Success in these organizations depends on the competence and heroics of the people in the organization and not on the use of proven processes. In spite of this ad hoc, chaotic environment, maturity level 1 organizations often produce products and services that work; however, they frequently exceed their projects' budgets and schedules.
- Organizations are characterized by a tendency to overcommit, to abandon processes in times of crisis, and by an inability to repeat their past successes.
- Software project success depends on having quality people.

Level 2 - Repeatable

- Software development successes are repeatable. The processes may not repeat for all the projects in the organization. The organization may use some basic project management to track cost and schedule.
- Process discipline helps ensure that existing practices are retained during times of stress. When these practices are in place, projects are performed and managed according to their documented plans.
- Project status and the delivery of services are visible to management at defined points (for example, at major milestones and at the completion of major tasks).
- Basic project management processes are established to track cost, schedule, and functionality. The minimum process discipline is in place to repeat earlier successes on projects with similar applications and scope. There is still a significant risk of exceeding cost and time estimates.

Level 3 - Defined

- The organization's set of standard processes, which is the basis for level 3, is established and improved over time. These standard processes are used to establish consistency across the organization. Projects establish their defined processes by tailoring the organization's set of standard processes according to tailoring guidelines.
- The organization's management establishes process objectives based on the organization's set of standard processes and ensures that these objectives are appropriately addressed.
- A critical distinction between level 2 and level 3 is the scope of standards, process descriptions, and procedures. At level 2, the standards, process descriptions, and procedures may be quite different in each specific instance of the process (for example, on a particular project). At level 3, the standards, process descriptions, and procedures for a project are tailored from the organization's set of standard processes to suit a particular project or organizational unit.

Level 4 - Managed

- Using precise measurements, management can effectively control the software development effort. In particular, management can identify ways to adjust and adapt the process to particular projects without measurable losses of quality or deviations from specifications. At this level the organization sets a quantitative quality goal for both the software process and software maintenance. Subprocesses are selected that significantly contribute to overall process performance. These selected subprocesses are controlled using statistical and other quantitative techniques.
- A critical distinction between maturity level 3 and maturity level 4 is the predictability of process performance. At maturity level 4, the performance of processes is controlled using statistical and other quantitative techniques, and is quantitatively predictable. At maturity level 3, processes are only qualitatively predictable.

Level 5 - Optimizing

- The focus is on continually improving process performance through both incremental and innovative technological improvements. Quantitative process-improvement objectives for the organization are established, continually revised to reflect changing business objectives, and used as criteria in managing process improvement. The effects of deployed process improvements are measured and evaluated against the quantitative process-improvement objectives. Both the defined processes and the organization's set of standard processes are targets of measurable improvement activities.
- Process improvements to address common causes of process variation and measurably improve the organization's processes are identified, evaluated, and deployed.
- Optimizing processes that are nimble, adaptable, and innovative depends on the participation of an empowered workforce aligned with the business values and objectives of the organization. The organization's ability to respond rapidly to changes and opportunities is enhanced by finding ways to accelerate and share learning.
- A critical distinction between maturity level 4 and maturity level 5 is the type of process variation addressed. At maturity level 4, processes are concerned with addressing special causes of process variation and providing statistical predictability of the results. Though processes may produce predictable results, the results may be insufficient to achieve the established objectives. At maturity level 5, processes are concerned with addressing common causes of process variation and changing the process (that is, shifting the mean of the process performance) to improve process performance (while maintaining statistical predictability) to achieve the established quantitative process-improvement objectives.
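The cumulative nature of the levels (reaching level N requires satisfying the key process areas of every level from 2 through N, and skipping levels is not allowed) can be sketched as a toy self-assessment. This is an illustration only; the KPA names below are abbreviated examples, not the official SW-CMM list, and a real assessment is led by an authorised lead assessor:

```python
# Illustrative sketch only: a toy CMM-style self-assessment.
# The KPA names are abbreviated examples, not the official list.

LEVEL_KPAS = {
    2: {"requirements management", "project planning", "project tracking"},
    3: {"organization process definition", "training program", "peer reviews"},
    4: {"quantitative process management", "software quality management"},
    5: {"defect prevention", "process change management"},
}

def assessed_level(satisfied):
    """Return the maturity level implied by a set of satisfied KPAs.

    Levels are cumulative: reaching level N requires the KPAs of every
    level from 2 through N.  Every organization is at least level 1.
    """
    level = 1
    for lvl in sorted(LEVEL_KPAS):
        if LEVEL_KPAS[lvl] <= set(satisfied):
            level = lvl
        else:
            break  # skipping levels is not allowed
    return level

print(assessed_level({"requirements management", "project planning",
                      "project tracking"}))   # → 2
```

Note that satisfying a higher level's KPAs without the intervening level's does not raise the result, mirroring the rule that a specific plan must target the next level rather than jumping ahead.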
The most beneficial elements of CMM Levels 2 and 3:

- Creation of a Software Specification, stating what is going to be developed, combined with formal sign-off, an executive sponsor, and an approval mechanism. This is NOT a living document; additions are placed in a deferred or out-of-scope section for later incorporation into the next cycle of software development.
- A Technical Specification, stating precisely how the thing described in the Software Specification is to be developed. This is a living document.
- Peer review of code (code review), with metrics, that allows developers to walk through an implementation and to suggest improvements or changes. Note: this is problematic because the code has already been developed and a bad design cannot be fixed by "tweaking"; the code review gives completed code a formal approval mechanism.
- Version control: a very large number of organizations have no formal revision control or release mechanism in place.
- The idea that there is a "right way" to build software: that it is a scientific process involving engineering design, and that groups of developers are not there simply to work on the problem du jour.

Total quality management (TQM) refers to quality management, compliance management, risk assessment, document control and any other component of total quality that contributes to the control, quality and validity of a product and/or service.

1. Establish the scope and features before beginning work

Scope creep, according to consultant and author Richard Veryard, is known by various names, depending on the project phase. Early on, it's called requirements creep, when the project team is defining the problems to be tackled. Perhaps most often, it's called feature creep, as more solutions, or functions, are added to the specs.

The primary way to fight scope creep seems simple: firmly establish the requirements and features of the project before getting started. But too often, in-house customers believe that an IT project can always make room for one more feature. TechRepublic member Jesse Lo RE, VP of DigitalNine Corp., said that IT leaders need to establish a cross-departmental steering committee of project stakeholders as a first step to define the requirements. Lo RE believes that because project overruns are usually a result of factors outside of the project team, the steering committee can provide the organizational buy-in that's necessary to keep the project on track.

Jon Nelson, a TechRepublic member and CTO of Avnet Inc., said that once project requirements and features are mapped out, the next step is putting the requirements in writing and having a representative of each user community sign off. "If it's not written, then it's not real" is Nelson's first rule of project management. He pointed out that an e-mail validation is deniable, but a personal signature on a paper contract is not. Maybe that's why in Nelson's 15-year IT career, none of his 16 large-scale projects has run over in time or budget. Then, Nelson said, it's time for each business unit to prioritize its individual requirements.

2. Prepare your technical team to do its best

The early process of project planning encourages stakeholders to take ownership as they sign off on requirements. The next phase involves getting technical staff to sign off on capabilities and responsibilities. At this stage, Nelson said, it's critical to have a thorough understanding of the systems and tools that will be used to design and develop the project. The technical staff needs to validate the capabilities and weaknesses of the development environment. This step is so crucial, Nelson explained, because it is at this step that the designers and developers testify to and commit to their own capabilities to deliver.

You should also take into account not only the staff's capabilities, but personalities, desires, and styles during this phase. With those factors in hand, you can then assign teams, determine schedules, and even provide working environments that offer the best potential for success. For example, if some people work best between noon and midnight, try to provide the flexibility to work those hours for optimal productivity. After all, Nelson said, the speed, quality, and value of your project is not a factor of the technology nearly as much as it's a factor of the technologists.

3. Thoroughly investigate a vendor's capabilities

While Nelson's advice can help tech leaders properly prepare the project team, few IT efforts are accomplished without help from an outside vendor. If the vendors are providing hardware or software components, it's a good idea to proceed cautiously. Troy Atwood, an associate VP at CNET Networks, cites overpromising vendors as one chief cause of cost overruns, along with scope creep. Atwood recalled a project deploying relatively new VPN WAN technology for TechRepublic as an example of a vendor overselling a product. Atwood and his colleagues interviewed three different companies, eliminated one early in the process, and then grilled the remaining two very hard on their technologies.
"I have a standing requirement for salesmen: I will not see them unless they bring a technical specialist with them," Atwood explained. A major networking vendor won the contract, but Atwood and a colleague spent many nights on the phone with the company's tech support to iron out various issues. In retrospect, Atwood said, the tech specialists they had met and talked with initially were actually trained on future product enhancements, not the current technology in place. As the technology itself was so new, the team was not able to find help among colleagues and peers, either. But Atwood later learned that there were people in the same boat. When TechRepublic was acquired by CNET, he discovered that CNET had experienced the same problems with the vendor. The experience taught Atwood a big lesson that most tech leaders learn: even if you cover most of the bases, you are still running huge risks being an early adopter.

Fortunately for Atwood, two factors kept the potential project overrun low. First, his team was highly trained and eventually ironed out the issues, saving the cost of hiring outside consultants. And second, the IT department had bought hardware that exceeded current needs. When the networking vendor finally released an improved IOS upgrade that fixed many of the issues, the hardware was already in place to handle the more demanding system. This forward-looking approach is rooted in one of Atwood's favorite mantras on project management: "Failure to plan is a plan to fail."

An ideal way to head off vendors who exaggerate their products' capabilities is to find other techies who have implemented them. But that's not always easy, particularly for early adopters. "Communication helps you try to bridge the gap, but you never really bridge it," Zink said. "There's little in IT that we have been doing for five years."

4. Stay diligent about keeping the project on the right path

Even if initial planning includes business unit involvement, project teams still have to guard against scope creep after the project is underway. Zink believes that scope creep will always be a problem, due to the very nature of IT. "IT is simply different from just about any other project, as the project is going to change in scope," said Zink, adding, "or, at least, someone is going to try to change the scope." Some IT leaders may allow minor changes if they are specified in writing and the cost and time impact is approved. But others, like Nelson, have written contracts that can be used to close the project to any changes once underway. Nelson also emphasized the human touch in keeping a project on the right path. He tries to maintain a strong dedication to the spirit, pride, and concerns of the development and deployment teams. Nelson lets his teams know that he's there throughout the process. "Even in a 24/7 development environment," Nelson wrote, "if a member of my team was working, then I was working."
As a project moves along, it's important to keep user communities, as well as executive management, informed of progress. Recognizing milestones can help keep the team focused and give them something to celebrate. Milestones also can help you publicize your team's successes throughout the organization. Zink, who holds a degree in accounting, recognizes another value of tracking a project: discovering what's not working. "Try to discern early what areas or paths are leading to dead ends," Zink advised. If you can't find a path around those dead ends, it may be time to abandon the project before you sink three times the budgeted money into it, as his colleague in San Jose did. And, Zink added, don't be surprised if something slips somewhere, despite best efforts. It'll simply be in sync with his favorite law of project management: "It always takes longer than you expect."
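The written sign-off rule and per-business-unit prioritization described earlier can be sketched as a small requirements register. This is a hypothetical illustration; the stakeholder groups, requirement names, and fields are all invented for the example:

```python
# Hypothetical sketch of a requirements register: a requirement only
# counts once every stakeholder group has signed off in writing, and
# approved requirements are then prioritized per business unit.
from dataclasses import dataclass, field

@dataclass
class Requirement:
    name: str
    business_unit: str
    priority: int                       # lower number = higher priority
    signoffs: set = field(default_factory=set)

    def approved(self, stakeholders):
        # "If it's not written, then it's not real": every stakeholder
        # group must have signed before the requirement is in scope.
        return stakeholders <= self.signoffs

STAKEHOLDERS = {"sales", "finance", "it"}           # example groups

reqs = [
    Requirement("export to CSV", "sales", 1, {"sales", "finance", "it"}),
    Requirement("dark mode", "sales", 2, {"sales"}),  # missing sign-offs
]

approved = sorted((r for r in reqs if r.approved(STAKEHOLDERS)),
                  key=lambda r: (r.business_unit, r.priority))
print([r.name for r in approved])   # → ['export to CSV']
```

Anything not fully signed off stays out of the approved list, which is one mechanical way to keep feature creep visible rather than silently absorbed.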
