
General & Modeling: "large" means anything nontrivial that benefits from planning on paper and from tools, and that will be used by someone other than the developer. Build models to help during design, to analyze existing systems, and to help us communicate.

Software Architecture: avoid unnecessary coupling and aim for high cohesion; high coupling with low cohesion gives spaghetti code. Common styles: layered, Model View Controller (MVC), Model View Presenter (MVP); a minimal MVC sketch follows below. Conway's law: the architecture of a system tends to mirror the communication structure of the team that builds it, so structuring the team differently can change the architecture.
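
A minimal MVC sketch in Python (the class names are invented for illustration, not from the notes): the model owns the data, the view only renders it, and the controller translates user actions into model updates so the two stay decoupled.

    # Minimal illustrative Model-View-Controller split (hypothetical names)
    class TaskModel:                      # Model: owns the data, knows nothing about the UI
        def __init__(self):
            self.tasks = []

        def add(self, title):
            self.tasks.append(title)

    class TaskView:                       # View: renders data, contains no business logic
        def render(self, tasks):
            for i, title in enumerate(tasks, 1):
                print(f"{i}. {title}")

    class TaskController:                 # Controller: mediates between user actions and the model
        def __init__(self, model, view):
            self.model, self.view = model, view

        def add_task(self, title):
            self.model.add(title)
            self.view.render(self.model.tasks)

    TaskController(TaskModel(), TaskView()).add_task("write report")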

Sequence Diagrams: show a well-defined order of interactions over time. Used when comparing different design options or when elaborating on use cases. Fragment operators: alt: alternative among multiple fragments; a fragment's guard condition must be true for it to execute. loop: may execute multiple times, as indicated by the guard. neg: an invalid interaction. opt: the fragment is optional; works like an if condition. par: fragments run in parallel. ref: a reference to an interaction defined in another diagram. region: a critical region; only one thread at a time.

Use Case Diagrams: capture system requirements and how users interact with a system. Each use case is a short phrase summing up a piece of functionality. "Actors" represent the roles played by people (or external systems), and the diagram shows relationships between use cases.

SDLC (Systems Development Life Cycle): Waterfall: Requirements -> Design -> Implementation -> Verification -> Maintenance. Pros: enforces discipline; defined start and end; progress can be identified; the emphasis on requirements and design up front helps create better code, save time, and improve quality; flaws are easier to catch; knowledge is easier to transfer through the team. Cons: customers may change their minds, particularly about requirements and design, when those phases are already past. Waterfall assumes the design can be translated directly into a real product, but developers can run into roadblocks, and a required re-design blurs the distinctions between the phases of the model. Waterfall also suggests a clear division of labor (e.g. designers, programmers, and testers), but in reality no such clean division exists.

Gantt Charts: Pros: useful for small projects; present the phases and activities of a project work breakdown schedule well. Cons: wrongly equated with project design, in the sense that they try to define the project work breakdown at the same time as the schedule activities; messy, hard to display, and not useful for large projects with many activities; communicate little information per area displayed; represent only part of the cost, time, and scope of a project; do not represent the size of the project relative to the size of its work elements, so the magnitude of a behind-schedule condition is easily miscommunicated; the fixed height of the bars can misrepresent time-phased workload, since two activities may appear the same size even though one has a single person working on it and the other a whole team.

Risk Management: Risk Exposure (RE) = probability x consequence (loss). Risk Reduction Leverage (RRL) = ((RE before) - (RE after)) / cost of mitigating action. Risk assessment can be qualitative (a risk exposure matrix). Independent V&V should not report to the development manager, i.e. it should not report to the same team whose work it checks. A small worked example of the formulas follows below.
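
A small worked example of the two formulas above, in Python; all numbers are hypothetical and chosen only to illustrate the arithmetic.

    # Hypothetical risk: 30% chance of a 100,000 loss
    prob_before, loss = 0.30, 100_000
    re_before = prob_before * loss                    # RE = probability x loss = 30,000

    prob_after = 0.05                                 # mitigation cuts the probability to 5%
    re_after = prob_after * loss                      # RE after = 5,000
    mitigation_cost = 10_000

    rrl = (re_before - re_after) / mitigation_cost    # (30,000 - 5,000) / 10,000 = 2.5
    print(re_before, re_after, rrl)                   # RRL > 1 suggests the mitigation pays for itself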

Requirements Analysis: figuring out what problem you want to solve; answer this wrong and you build the wrong thing. Requirements change over time and can be incomplete, so RA is ongoing and iterative. How to get requirements: identify the stakeholders and their goals, and think about the problem you are trying to solve, which may not always be what you are told. The specification, given the domain properties, should imply the requirements (see the formula below).
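
The last point is often written as an entailment in the style of Zave and Jackson, where D stands for the domain properties, S for the specification, and R for the requirements:

    D \wedge S \;\models\; R   % the specification, together with the domain assumptions, must entail the requirements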

Robustness Analysis: a bridge between use case diagrams and sequence diagrams/code. Used to analyze the logic of a use case, ensure the use case represents the usage requirements, visualize what you will build (the code), and communicate technical details to stakeholders.

V/V Testing: defects can be caused by many things, can lead to failures, or can go unnoticed; removing them early is CHEAPER. Defect detection rates: formal design inspection/testing ~95%; agile informal review/regression ~90%. Good tests have power (if the bug exists, the test will find it), validity (no false positives), non-redundancy (each test provides new information), and repeatability (easy to re-run). Unit tests: test a unit of code separately; apply to a single use case or part of one. Integration tests: many or all units together; test that the code meets the design specs. Functional tests: coverage of all inputs (including edge/corner cases); test the functional requirements. Performance tests: test (one of) the quality requirements. Acceptance tests: test the customer's goals. Installation tests: test the user environment; can be optional. Structural testing (white box): based on the structure of the code; coverage means all paths through the code are tested. Functional testing (black box): cannot see inside the code; tests are derived from use cases. Test-driven development: write unit tests, write code to pass the tests, then refactor the code to meet the full specs (a minimal sketch follows below). Test cases are written for Verification (are we building the thing right?) and run for Validation (are we building the right thing?). 3-5 testers find between 70% and 90% of all usability problems; fewer people do not find enough bugs, while more people means spending too much to find the last 10%.
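
A minimal test-driven development sketch in Python (the function and test names are invented for illustration): the unit tests are written first and fail, then just enough code is written to make them pass, then the code is refactored while the tests stay green.

    import unittest

    # Step 2: just enough code to make the tests below pass
    def add(a, b):
        return a + b

    # Step 1: the unit tests are written first and fail until add() exists and is correct
    class TestAdd(unittest.TestCase):
        def test_adds_two_numbers(self):
            self.assertEqual(add(2, 3), 5)

        def test_handles_negatives(self):           # an edge case, in the spirit of covering all inputs
            self.assertEqual(add(-2, 2), 0)

    if __name__ == "__main__":
        unittest.main()                              # Step 3 would be refactoring while re-running these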

Static Analysis: analysis of a program's source code without running it. IDEs may do this automatically; lint is another example. Finds problems such as out-of-bounds array accesses, potential runtime errors, etc., but produces lots of false positives and false negatives.
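
A small invented example of the kind of defect a Python static analyzer (e.g. pyflakes or pylint) can flag without executing the program; the deliberate typo below is caught statically, while the hard-coded out-of-range index is the sort of thing only some analyzers warn about.

    def report_total(prices):
        total = 0
        for price in prices:
            total += price
        print(totl)              # typo: undefined name 'totl' -- reported without running the code

    def last_limit():
        limits = [1, 2, 3]
        return limits[3]         # index past the end of a 3-element list; some analyzers warn,
                                 # but many only catch this at runtime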

Quality: value to someone, fitness for purpose, exceeding customer expectations. Quality assurance focuses on V/V of the quality of the product. Quality frameworks such as Six Sigma exist, but Six Sigma is a poor fit for software because: software quality depends on human behavior, which is not predictable; the degree of conformance cannot be measured; the mapping between software faults and failures is many-to-many; not all failures result from the software itself; and the number of faults in software cannot be measured accurately. Six Sigma's typical defect rate of 3.4 defects per million opportunities would demand roughly 0.0034 faults/KLOC (treating each line of code as an opportunity), an unrealistically tight target for software.

Capability Maturity Model (CMM): 1. Initial: the starting point for use of a new or undocumented repeat process. 2. Repeatable: the process is at least documented sufficiently such that repeating the same steps may be attempted. 3. Defined: the process is defined/confirmed as a standard business process and decomposed to levels 0, 1 and 2 (the last being Work Instructions). 4. Managed: the process is quantitatively managed in accordance with agreed-upon metrics. 5. Optimizing: process management includes deliberate process optimization/improvement.

Release Planning: what are we building, by when will it be ready, and how many people do we have? Releases used to be big bang (ship a CD to stores); now delivery is continuous, but you still need to plan and to market: the sales team needs to know when you will deliver so they can sell it. The essence of planning is uncertainty, so don't resist changes. If you don't plan, you cannot get past CMM level 1, since project planning is part of level 2. Why plan? External pressures; if it were just for us (for fun), it would not be needed. With good data, good managers make good decisions. Capacity vs. requirements: do we have the capacity to fulfill all the requirements on the prioritized list of features? Requirements or features (F): prioritized potential requirements from the wish list, estimated with planning poker. Available resources (N): pick calendar days and determine work factors. Pick a value for T (workdays until the release/sprint/horizon); put a bookmark at the end of each sprint, and a quarterly release is a typical horizon for sales/marketing. Capacity constraint: F = N x T, i.e. the total estimated feature effort must fit within N people working for T workdays, and the plan must respect this constraint (a sketch follows below). Keep the plan up to date with the most current estimates. You are always dealing with overflow: move dates, cut features, or some combination; adding developers is rarely helpful.
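
A minimal sketch of the capacity check described above, in Python; the numbers, the work factor, and the feature list are all hypothetical and only illustrate filling a release in priority order until N x T is exhausted.

    # Hypothetical plan: 4 developers, 60 workdays to the release, 70% of each day goes to feature work
    N, T, work_factor = 4, 60, 0.7
    capacity = N * T * work_factor                     # person-days available = 168

    # Prioritized wish list with planning-poker estimates in person-days
    features = [("login", 30), ("reports", 60), ("search", 50), ("export", 40)]

    committed, used = [], 0
    for name, estimate in features:                    # fill the release in priority order
        if used + estimate <= capacity:
            committed.append(name)
            used += estimate

    overflow = [name for name, _ in features if name not in committed]
    print(committed, overflow)                         # overflow -> move the date, cut features, or both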
