
The system lifecycle in systems engineering is an examination of a system or proposed system that addresses all phases of its existence, including system conception, design and development, production and/or construction, distribution, operation, maintenance and support, retirement, phase-out and disposal.

Conceptual design

The conceptual design stage is where an identified need is examined, requirements for potential solutions are defined, potential solutions are evaluated, and a system specification is developed. The system specification represents the technical requirements that will provide overall guidance for system design. Because this document governs all future development, the stage cannot be completed until a conceptual design review has determined that the system specification properly addresses the motivating need. Key steps within the conceptual design stage include:
- Need identification
- Feasibility analysis
- System requirements analysis
- System specification
- Conceptual design review

Preliminary system design

During this stage of the system lifecycle, subsystems that perform the desired system functions are designed and specified in compliance with the system specification. Interfaces between subsystems are defined, as well as overall test and evaluation requirements.[2] At the completion of this stage, a development specification is produced that is sufficient to perform detailed design and development. Key steps within the preliminary design stage include:
- Functional analysis
- Requirements allocation
- Detailed trade-off studies
- Synthesis of system options
- Preliminary design of engineering models
- Development specification

- Preliminary design review

Detail design and development

This stage includes the development of detailed designs that bring the initial design work into the form of completed specifications. This work includes the specification of interfaces between the system and its intended environment, and a comprehensive evaluation of the system's logistical, maintenance and support requirements. Detail design and development is responsible for producing the product, process and material specifications, and may result in substantial changes to the development specification. Key steps within the detail design and development stage include:
- Detailed design
- Detailed synthesis
- Development of engineering and prototype models
- Revision of development specification
- Product, process and material specification
- Critical design review

Production and construction

During the production and/or construction stage, the product is built or assembled in accordance with the requirements specified in the product, process and material specifications, and is deployed and tested within the operational target environment. System assessments are conducted in order to correct deficiencies and adapt the system for continued improvement. Key steps within the production and construction stage include:
- Production and/or construction of system components
- Acceptance testing
- System distribution and operation
- Operational testing and evaluation
- System assessment

Utilization and support

Once fully deployed, the system is used for its intended operational role and maintained within its operational environment. Key steps within the utilization and support stage include:
- System operation in the user environment
- Change management
- System modifications for improvement
- System assessment

Phase-out and disposal

Once deployed, the effectiveness and efficiency of the system must be continuously evaluated to determine when the product has reached its maximum effective lifecycle. Considerations include: continued existence of the operational need, matching between operational requirements and system performance, feasibility of system phase-out versus maintenance, and availability of alternative systems.

SYSTEM IMPLEMENTATION

Introduction

This section describes the systems implementation process in terms of the construction and delivery phases of the life cycle. Systems implementation is the construction of the new system and the delivery of that system into production (that is, into day-to-day business or organizational operation).

The Construction Phase of Systems Implementation

The construction phase does two things: it builds and tests a functional system that fulfills the business or organizational design requirements, and it implements the interface between the new system and the existing production system. The project team must construct the database, application programs, user and system interfaces, and networks. Some of these elements may already exist in your project or be subject to enhancement.

Activity: Build and Test the Networks (if necessary)

Purpose: To build and test new networks and modify existing networks for use by the new system.

Roles: This activity will normally be completed by the same system specialists who designed the network(s).
- System owners and users: not usually involved.
- System analyst: the system analyst's role is more that of a facilitator, ensuring that business requirements are not compromised by the network solution.

- Network designer (project/site-specific role): a specialist in the design of local and wide area networks and in addressing connectivity issues.
- System builders: the network administrator, the person who has the expertise for building and testing network technology for the new system. S/he will also be familiar with the network architecture standards that must be adhered to for any possible new networking technology.

Prerequisites (Inputs): This activity is triggered by the approval from the system owners to continue the project into systems design. The key input is the network design requirements defined during systems design.

Deliverables (Outputs): The principal deliverable of this activity is an installed network that is placed into operation. Network details will be recorded in the project repository for future reference.

[Having introduced the roles, inputs and outputs, now focus on the implementation steps.]

Applicable Techniques: Skills for developing networks are important for systems analysts.

Steps:
- Review the network design requirements outlined in the technical design statement developed during systems design.
- Make any appropriate modifications to existing networks and/or develop new networks.
- Review network specifications for future reference.

Activity: Build and Test Databases

This task must immediately precede other programming activities because databases are the resources shared by the computer programs to be written.

Purpose: The purpose of this activity is to build and test new databases and modify existing databases for use by the new system.

Roles: This activity will typically be completed by the same system specialist who designed the database.
- System owners and system users: not usually involved.
- System analyst: (optional) depends on the organization.
- System designer: usually also becomes the builder for this activity.
- System builder: yup, this person does the work.
- Database administrator: when the database is part of a corporate database, there's usually a database administrator who will be involved.
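A minimal sketch of what "build and test" can look like in practice, using Python's built-in sqlite3 module. The table names and sample data here are illustrative assumptions, not part of any real project (though Lars Ingersol, the trackable test record mentioned later in these notes, makes an appearance):

```python
import sqlite3

# Build: create the new (empty) database structure per the design specification.
conn = sqlite3.connect(":memory:")  # in-memory stand-in for the real database
conn.execute("""CREATE TABLE customer (
    id   INTEGER PRIMARY KEY,
    name TEXT NOT NULL)""")
conn.execute("""CREATE TABLE invoice (
    id          INTEGER PRIMARY KEY,
    customer_id INTEGER NOT NULL REFERENCES customer(id),
    amount      REAL NOT NULL CHECK (amount >= 0))""")

# Test: load sample data, then exercise adding, modifying,
# deleting, and retrieving records, plus the relationship path.
conn.execute("INSERT INTO customer VALUES (1, 'Lars Ingersol')")
conn.execute("INSERT INTO invoice VALUES (10, 1, 250.0)")
conn.execute("UPDATE invoice SET amount = 300.0 WHERE id = 10")

row = conn.execute("""SELECT c.name, i.amount FROM customer c
                      JOIN invoice i ON i.customer_id = c.id""").fetchone()
assert row == ("Lars Ingersol", 300.0)

conn.execute("DELETE FROM invoice WHERE id = 10")
assert conn.execute("SELECT COUNT(*) FROM invoice").fetchone()[0] == 0
```

The same pattern (create structure, load representative data, exercise every relationship path and integrity constraint) applies whatever DBMS the project actually uses.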

Prerequisites (Inputs): The primary input to this activity is the database design requirements specified in the technical design statement during systems design. Sample data from production databases is often loaded into tables for testing the database.

Deliverables (Outputs): The end product of this activity is an unpopulated (empty) database structure for the new database.

[This is the function most people associate with systems analysis; they don't see all the other work involved.]

Applicable Techniques: There are several techniques used in building and testing databases:
1) Sampling: sampling methods are used to obtain representative data for testing database tables.
2) Data modeling: requires a good understanding of data modeling; we focus on this in part 2 (after the midterm).
3) Database design.

To complete this phase there are 6 steps:
- Review the technical design statement for database design requirements (know what you're up to).
- Locate production databases that may contain representative data for testing database tables. Otherwise, generate test data for database tables. [Get data that will really test the robustness of your design. Don't just pick easy cases.]
- Build/modify the database according to the design specifications.
- Load tables with sample data.
- Test database tables and relationships by adding, modifying, deleting, and retrieving records. All possible relationship paths and data integrity checks should be tested.
- Review the database schema and record it for future reference.

Activity: Install and Test New Software Packages (if necessary)

Purpose: To install and test any new software packages and make them available to the organization's software library.

Roles: This is the first activity in the life cycle that is specific to the applications programmer.
- System owners and system users: not usually involved.
- Systems analyst: (optional) may participate in the testing of the software and in clarifying requirements.
- Systems designers: may be involved in integration requirements and program documentation.

- System builder: the applications programmer. The programmer (or team thereof) is responsible for the installation and testing of new software packages.
- Network administrator: may be involved in installing and testing on the network server (actually, it is a sure bet that the network administrator will be involved).

Prerequisites (Inputs): The main input to this activity is the new software packages and documentation received from system vendors. The applications programmer will complete the installation and testing of the package according to the integration requirements and program documentation that were developed during systems design.

Deliverables (Outputs): The principal deliverable of this activity is the installed and tested software package(s), made available in the software library. Any modified software specifications and new integration requirements that were necessary are documented and made available in the project repository to provide a history and serve as a future reference.

Applicable Techniques: Well, there really isn't much to this. It depends on the programming experience and knowledge of the tester. Essentially just good housekeeping: install, test, and maintain good documentation for others to follow.

Activity: Write and Test New Programs

Purpose: The purpose of this activity is to write and test all programs to be developed in-house.

Roles: This activity is specific to the applications programmer.
- System owners and system users: not involved.
- System analyst: optional.
- System designer: optional; may be involved in clarifying the programming plan, integration requirements, and program documentation (developed during systems design) that are used in writing and testing the programs.
- System builder: the person responsible for this activity. The applications programmer or programming team writes and tests the in-house software.

Note that there is often an objective, or specially trained, person to test the application (hence the name, the application tester).

Prerequisites (Inputs): The primary input to this activity is the technical design statement, the plan for programming, and the test data that were developed during the systems design phase. Since new programs or program components may have already been written and be in use by other existing systems, the experienced applications programmer will know to first check for possible reusable software components available in the software library. Some information systems shops have a quality assurance group staffed by specialists who review the final program documentation for conformity to standards. This group will provide appropriate feedback and quality recommendations.

Deliverables (Outputs): The output is, of course, the new programs and reusable software components, which are placed in a software library. You should also have created program documentation, which may need to be approved by the quality assurance people and serves as a record of the project.

Applicable Techniques: If the modules are coded top-down, they should be tested and debugged top-down as they're written. There are three levels of testing:
- Stub testing: the test performed on individual modules, whether they are in main programs or are subroutines.
- Unit (or program) testing: a test whereby all the modules that have been coded and stub-tested are tested as an integrated unit.
- Systems testing: the test that ensures that the application programs, written in isolation, work properly when integrated into a whole system.

Delivery Phase

This is the final part of the implementation phase of the SDLC: deliver the new system into operation. To achieve this, you must complete the following:
- Conduct a system test to make sure that the new system works.
- Prepare a conversion plan to smooth the transition to the new system.
- Install the databases used by the new system.
- Provide training and documentation for individuals using the new system.
- Convert from the old system to the new system, and evaluate the project and the final system.

Activity: Conduct System Test

Purpose: To test all software packages, custom-built programs, and other existing programs to make sure they work together and work correctly.

Roles: The systems analyst usually manages this activity.
- System owners and users: not involved.
- System analyst: facilitates by working with project team members in solving problems.
- System designer: tests integration requirements and resolves design problems.
- System builders: all sorts may be involved; applications programmers, database programmers, network specialists, etc.
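The three testing levels described above (stub, unit/program, and systems testing) can be illustrated with a toy sketch; the payroll functions here are hypothetical, invented purely to show how each level widens the scope of what is exercised:

```python
# Hypothetical modules under test: two small units and their integration.
def net_pay(gross, tax_rate):
    """Unit 1: compute pay after tax."""
    return round(gross * (1 - tax_rate), 2)

def format_payslip(name, amount):
    """Unit 2: render a payslip line."""
    return f"{name}: ${amount:.2f}"

def run_payroll(name, gross, tax_rate):
    """The integrated 'whole system' built from the units above."""
    return format_payslip(name, net_pay(gross, tax_rate))

# Stub testing: each module exercised on its own.
assert net_pay(1000.0, 0.2) == 800.0
assert format_payslip("Lars Ingersol", 800.0) == "Lars Ingersol: $800.00"

# Unit/program and systems testing: the stub-tested modules exercised
# together, end to end, to confirm they still work once integrated.
assert run_payroll("Lars Ingersol", 1000.0, 0.2) == "Lars Ingersol: $800.00"
```

In a real shop the assertions would live in a test framework and the systems test would run against the assembled application, but the progression from isolated module to integrated whole is the same.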
Prerequisites (Inputs): You need the software packages, the in-house (custom-built) programs, and any existing programs in the new system.

Deliverables (Outputs):

Any modifications discovered during implementation; continue until the test is successful. You, or others, will have tested the system with some form of data (the system test data). [I like to use a readily identifiable record that I can track through all phases of the system; that's why you see Lars Ingersol in databases used in this and other courses.]

Activity: Prepare Conversion Plan

This activity is not usually performed by the systems analyst; it is usually planned by upper managers, a steering committee of some kind, or some other person. Although in your work as the analyst/designer you will have to include the conversion plan in your planning (time and resource projections, Gantt charts, etc.), the specifics are defined by others, so we will skip the details of this activity. However, as part of your planning, you'll need to consider the following:
- Getting training materials ready
- Establishing a schedule for installing databases
- Identifying a training program (or in-house trainers) and a schedule for the system users
- Developing a detailed installation strategy to follow
- Developing a systems acceptance test plan

There are several common conversion strategies:
- Abrupt cutover: on a specific date (usually coinciding with some business date, like the start of a new financial year or a school year), the old system goes off-line and the new one is placed into operation.
- Parallel conversion: both old and new systems are used for a period of time; this is done to ensure that all major problems in the new system have been solved before abandoning the old system.
- Location conversion: when the same system will be used in multiple locations, one location is usually selected to start with (to see where the problems are in conversion), and then the conversion is performed at all the other sites.
- Staged conversion: each successive version of the new system is converted as it is developed.

Anticipate the problems of each strategy.
For example, an abrupt cutover will be successful only if the computer program is absolutely perfect, which will require lots of testing beforehand and will likely require training before users actually go live. Parallel conversion is a lot of work for everyone: the workers must use both systems, essentially doing their job twice. There's lots of opportunity for problems.

Acceptance Test Plan

This is the final test, performed by end-users using real data over an extended period of time. There are several forms of verification that people use to validate their work. Be aware: managers usually want the bottom line and don't care about specifics. This is where lots of problems get introduced into an organization. Technical managers like numbers, and so will use tests that generate lots of numbers which make them look good to the higher-ups. Having said that, verification is a good idea.

- Verification testing runs the system in a simulated environment using simulated data. This is sometimes called alpha testing. The simulated test looks primarily for errors and omissions regarding end-user and design specifications that were specified in the earlier phases but not fulfilled during construction.
- Validation testing runs the system in a live environment using real data. This is sometimes called beta testing. During this validation, you will test the following: systems performance, peak workload processing performance, human engineering, methods and procedures, and backup and recovery.
- Audit testing certifies that the system is free of errors and ready to go!

Activity: Install Databases

Purpose: To populate the new system's databases with existing data from the old system.

Roles: Usually only the system builders: application programmers and data entry personnel.

Prerequisites (Inputs): Existing data from the old system, coupled with database schemas and database structures for the new database.

Deliverables (Outputs): The restructured database, populated with data from the old system.

Applicable Techniques: You may need to "massage" the data, such as by writing programs to convert the old data into the new data formats.

Activity: Train System Users

Purpose: To provide training and documentation to system users to prepare them for a smooth transition to the new system.

Roles:

- System owners: must support this activity and be willing to approve release time for training.
- System users: the system is designed for them, so train 'em.
- System analyst: working from the system documentation, the system analyst may write the end-user documentation (manuals).
- System designers and builders: not usually involved.

Inputs: You'll need the system documentation (remember that repository?).

Outputs: You'll write the user training materials and documentation. This includes the technical manual, too. Remember: write the manual as if you had to use it, too! The users are likely the business experts; you're not. You're likely the technical expert; the users are not. The training manual should address every possible situation. Finally, the users must be trained. This may be done in-house (by the analyst or others) or by hiring an outside training company.

DSS

A Decision Support System (DSS) is a computer-based information system that supports business or organizational decision-making activities. DSSs serve the management, operations, and planning levels of an organization (usually mid and higher management) and help people make decisions about problems that may be rapidly changing and not easily specified in advance (unstructured and semi-structured decision problems). Decision support systems can be fully computerized, human-powered, or a combination of both.
While academics have perceived DSS as a tool to support the decision-making process, DSS users see DSS as a tool to facilitate organizational processes.[1] Some authors have extended the definition of DSS to include any system that might support decision making.[2] Sprague (1980) defines DSS by its characteristics:
- DSS tends to be aimed at the less well-structured, underspecified problems that upper-level managers typically face;
- DSS attempts to combine the use of models or analytic techniques with traditional data access and retrieval functions;
- DSS specifically focuses on features which make it easy to use by non-computer people in an interactive mode; and
- DSS emphasizes flexibility and adaptability to accommodate changes in the environment and in the decision-making approach of the user.

DSSs include knowledge-based systems. A properly designed DSS is an interactive software-based system intended to help decision makers compile useful information from a combination of raw data, documents, personal knowledge, and business models to identify and solve problems and make decisions. Typical information that a decision support application might gather and present includes:
- inventories of information assets (including legacy and relational data sources, cubes, data warehouses, and data marts),
- comparative sales figures between one period and the next,
- projected revenue figures based on product sales assumptions.

History

The concept of decision support has evolved from two main areas of research: the theoretical studies of organizational decision making done at the Carnegie Institute of Technology during the late 1950s and early 1960s, and the technical work on interactive computer systems carried out at the Massachusetts Institute of Technology in the 1960s. DSS became an area of research in its own right in the middle of the 1970s, before gaining in intensity during the 1980s. In the middle and late 1980s, executive information systems (EIS), group decision support systems (GDSS), and organizational decision support systems (ODSS) evolved from the single-user and model-oriented DSS.

According to Sol (1987), the definition and scope of DSS have been migrating over the years. In the 1970s DSS was described as "a computer-based system to aid decision making". In the late 1970s the DSS movement started focusing on "interactive computer-based systems which help decision-makers utilize data bases and models to solve ill-structured problems". In the 1980s DSS should provide systems "using suitable and available technology to improve effectiveness of managerial and professional activities", and towards the end of the 1980s DSS faced a new challenge: the design of intelligent workstations.

In 1987, Texas Instruments completed development of the Gate Assignment Display System (GADS) for United Airlines. This decision support system is credited with significantly reducing travel delays by aiding the management of ground operations at various airports, beginning with O'Hare International Airport in Chicago and Stapleton Airport in Denver, Colorado. Beginning in about 1990, data warehousing and on-line analytical processing (OLAP) began broadening the realm of DSS. As the turn of the millennium approached, new Web-based analytical applications were introduced. The advent of better and better reporting technologies has seen DSS start to emerge as a critical component of management design. Examples of this can be seen in the intense amount of discussion of DSS in the education environment.

DSS also has a weak connection to the user interface paradigm of hypertext. Both the University of Vermont PROMIS system (for medical decision making) and the Carnegie Mellon ZOG/KMS system (for military and business decision making) were decision support systems which were also major breakthroughs in user interface research. Furthermore, although hypertext researchers have generally been concerned with information overload, certain researchers, notably Douglas Engelbart, have focused on decision makers in particular.

Components

[Figure: Design of a Drought Mitigation Decision Support System.]

Three fundamental components of a DSS architecture are:
- the database (or knowledge base),
- the model (i.e., the decision context and user criteria), and
- the user interface.

The users themselves are also important components of the architecture.

Development Frameworks

DSS systems are not entirely different from other systems and require a structured approach. Such a framework includes people, technology, and the development approach. The early framework of decision support systems consists of four phases:
- Intelligence: searching for conditions that call for a decision.
- Design: inventing, developing, and analyzing possible alternative courses of action.
- Choice: selecting a course of action from among those.
- Implementation: adopting the selected course of action in the decision situation.

DSS technology levels (of hardware and software) may include:
- The specific application that will be used by the user. This is the part of the application that allows the decision maker to make decisions in a particular problem area. The user can act upon that particular problem.
- The generator: a hardware/software environment that allows people to easily develop specific DSS applications. This level makes use of CASE tools or systems such as Crystal, Analytica and iThink.
- Tools: lower-level hardware/software for building DSS generators, including special languages, function libraries and linking modules.
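The three fundamental components of a DSS architecture (database, model, user interface) can be sketched as a toy program; the sales figures and threshold criterion below are invented purely for illustration:

```python
# Database (or knowledge base): the raw facts available to the decision maker.
sales_db = {"Q1": 120_000, "Q2": 95_000}

# Model: the decision context and user criteria applied to the data.
def model(db, threshold):
    """Flag periods whose sales fall below the user's threshold."""
    return {q: amount for q, amount in db.items() if amount < threshold}

# User interface: presents the model's output to the decision maker.
def report(flagged):
    if not flagged:
        return "No quarters need attention."
    return "Review: " + ", ".join(sorted(flagged))

# The user (the fourth component of the architecture) sets the criterion
# and interprets the result; the DSS supports, rather than makes, the decision.
print(report(model(sales_db, threshold=100_000)))
```

Even at this scale, the separation holds: the data can change without touching the model, and the interface can be swapped (say, for charts) without touching either.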

An iterative development approach allows the DSS to be changed and redesigned at various intervals. Once the system is designed, it will need to be tested and revised where necessary to achieve the desired outcome.

Classification

There are several ways to classify DSS applications. Not every DSS fits neatly into one of the categories; it may be a mix of two or more architectures. Holsapple and Whinston classify DSS into the following six frameworks: text-oriented DSS, database-oriented DSS, spreadsheet-oriented DSS, solver-oriented DSS, rule-oriented DSS, and compound DSS. A compound DSS is the most popular classification for a DSS. It is a hybrid system that includes two or more of the five basic structures described by Holsapple and Whinston. The support given by DSS can be separated into three distinct, interrelated categories: personal support, group support, and organizational support.

DSS components may be classified as:
- Inputs: factors, numbers, and characteristics to analyze
- User knowledge and expertise: inputs requiring manual analysis by the user
- Outputs: transformed data from which DSS "decisions" are generated
- Decisions: results generated by the DSS based on user criteria

DSSs which perform selected cognitive decision-making functions and are based on artificial intelligence or intelligent-agent technologies are called Intelligent Decision Support Systems (IDSS). The nascent field of decision engineering treats the decision itself as an engineered object, and applies engineering principles such as design and quality assurance to an explicit representation of the elements that make up a decision.

Applications

1. As mentioned above, there are theoretical possibilities of building such systems in any knowledge domain.
2. One example is the clinical decision support system for medical diagnosis. Other examples include a bank loan officer verifying the credit of a loan applicant, or an engineering firm that has bids on several projects and wants to know if it can be competitive with its costs.
3. DSS is extensively used in business and management. Executive dashboards and other business performance software allow faster decision making, identification of negative trends, and better allocation of business resources. With a DSS, all the information from an organization is represented in the form of charts and graphs, i.e. in a summarized way, which helps management to take strategic decisions.

4. A growing area of DSS application, concepts, principles, and techniques is in agricultural production and marketing for sustainable development. For example, the DSSAT4 package,[18][19] developed through the financial support of USAID during the 80s and 90s, has allowed rapid assessment of several agricultural production systems around the world to facilitate decision-making at the farm and policy levels. There are, however, many constraints to the successful adoption of DSS in agriculture.[20]
5. DSS are also prevalent in forest management, where the long planning time frame demands specific requirements. All aspects of forest management, from log transportation and harvest scheduling to sustainability and ecosystem protection, have been addressed by modern DSSs.
6. A specific example concerns the Canadian National Railway system, which tests its equipment on a regular basis using a decision support system. A problem faced by any railroad is worn-out or defective rails, which can result in hundreds of derailments per year. Under a DSS, CN managed to decrease the incidence of derailments at the same time other companies were experiencing an increase.

Benefits

1. Improves personal efficiency
2. Speeds up the process of decision making
3. Increases organizational control
4. Encourages exploration and discovery on the part of the decision maker
5. Speeds up problem solving in an organization
6. Facilitates interpersonal communication
7. Promotes learning or training
8. Generates new evidence in support of a decision
9. Creates a competitive advantage over the competition
10. Reveals new approaches to thinking about the problem space
11. Helps automate managerial processes
12. Creates innovative ideas to speed up performance
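The loan-officer application mentioned above gives a feel for what a rule-oriented DSS does. A toy sketch follows; the criteria (credit-score cutoff, debt-to-income ratio) are entirely invented for illustration, and notice that the system suggests a decision with reasons rather than making it:

```python
def loan_support(income, debt, credit_score):
    """Toy rule-oriented DSS: suggest a decision, with reasons, for a human to review."""
    reasons = []
    if credit_score < 600:
        reasons.append("credit score below 600")
    if income > 0 and debt / income > 0.4:
        reasons.append("debt-to-income ratio above 40%")
    # A DSS supports the decision; the loan officer still makes the call.
    suggestion = "decline" if reasons else "approve"
    return suggestion, reasons

suggestion, reasons = loan_support(income=50_000, debt=25_000, credit_score=720)
print(suggestion, reasons)  # -> decline ['debt-to-income ratio above 40%']
```

Returning the reasons alongside the suggestion is what distinguishes decision *support* from plain automation: the output is evidence for the decision maker, matching the "generates new evidence in support of a decision" benefit listed above.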

DSS Characteristics and Capabilities

1. Solve semi-structured and unstructured problems
2. Support managers at all levels
3. Support individuals and groups
4. Interdependence and sequence of decisions
5. Support intelligence, design, and choice
6. Adaptable and flexible
7. Interactive and easy to use
8. Interactive and efficient
9. Human control of the process
10. Ease of development by end users
11. Modeling and analysis
12. Data access
13. Standalone and web-based integration
14. Support for a variety of decision processes
15. Support for a variety of decision trees
16. Quick response

Architecture of DSS

1. Specific DSS

The system that actually accomplishes the work might be called the specific DSS. It is an information systems application, but with characteristics that make it significantly different from a typical data processing application. It is the hardware/software that allows a specific decision maker or group of decision makers to deal with a specific set of related problems.

2. DSS Generator

The second technology level might be called a DSS generator. This is a package of related hardware and software which provides a set of capabilities for quickly and easily building a specific DSS. An example of a DSS generator is the Executive Information System (EIS) marketed by Boeing Computer Services. EIS is an integrated set of capabilities which includes report preparation, inquiry capability, a modeling language, graphic display commands, and a set of financial and statistical analysis subroutines. These capabilities have all been available individually for some time. The unique contribution of EIS is that these capabilities are available through a common language which acts on a common set of data. The result is that EIS can be used as a DSS generator, especially for a specific DSS to help in financial decision-making situations.

3. DSS Tools

The third and most fundamental level of technology applied to the development of a DSS might be called DSS tools. These are hardware or software elements which facilitate the development of a specific DSS or a DSS generator. This category of technology has seen the greatest amount of recent development, including new special-purpose languages, improvements in operating systems to support conversational approaches, color graphics hardware and supporting software, etc.

DATA SOURCES

A data source is simply the source of the data. It can be a file, a particular database on a DBMS, or even a live data feed. The data might be located on the same computer as the program, or on another computer somewhere on a network. For example, a data source might be an Oracle DBMS running on an OS/2 operating system, accessed by Novell NetWare; an IBM DB2 DBMS accessed through a gateway; a collection of Xbase files in a server directory; or a local Microsoft Access database file.
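The idea that a data source gathers the technical details into a single named place, hidden from the user, can be sketched as a simple registry; every name, driver, and field below is invented for illustration (a real system would use ODBC data source names or the like):

```python
# A data source registry: technical access details live here, not with the user.
DATA_SOURCES = {
    "Payroll":   {"driver": "oracle", "host": "srv1", "database": "pay"},
    "Inventory": {"driver": "db2",    "host": "gw2",  "database": "inv"},
    "Personnel": {"driver": "access", "path": "c:/data/personnel.mdb"},
}

def connect(source_name):
    """The user picks a name from the list; the registry supplies the rest."""
    details = DATA_SOURCES[source_name]          # looked up, not user-supplied
    return f"connected via {details['driver']}"  # stand-in for a real driver call

print(connect("Payroll"))  # the user never sees the host, driver, or path
```

The user chooses "Payroll" from a list and is connected; where the payroll data resides, and through which driver, remains the registry's business.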
The purpose of a data source is to gather all of the technical information needed to access the data (the driver name, network address, network software, and so on) into a single place and hide it from the user. The user should be able to look at a list that includes Payroll, Inventory, and Personnel, choose Payroll from the list, and have the application connect to the payroll data, all without knowing where the payroll data resides or how the application got to it.

The term data source should not be confused with similar terms. In this manual, DBMS or database refers to a database program or engine. A further distinction is made between desktop databases, designed to run on personal computers and often lacking full SQL and transaction support, and server databases, designed to run in a client/server situation and characterized by a stand-alone database engine and rich SQL and transaction support. Database also refers to a particular collection of data, such as a collection of Xbase files in a directory.

ARCHITECTURE OF A DBMS

Continuing the Database Modeling series, this article covers the architecture of a DBMS, showing the levels of the architecture graphically along with sample data exchanges with other databases.
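The idea of a named data source that hides connection details from the user can be sketched in a few lines of Python. This is a hypothetical registry, not the API of any real driver manager; the source names and the sqlite3 backend are illustrative only:

```python
import sqlite3

# Hypothetical data-source registry: maps a user-visible name to the
# technical details needed to reach the data (driver, location, ...).
DATA_SOURCES = {
    "Payroll":   {"driver": "sqlite3", "database": ":memory:"},
    "Inventory": {"driver": "sqlite3", "database": ":memory:"},
}

def connect(source_name):
    """Open a connection by data-source name; the caller never learns
    where the data lives or which driver was used to reach it."""
    cfg = DATA_SOURCES[source_name]
    if cfg["driver"] == "sqlite3":
        return sqlite3.connect(cfg["database"])
    raise ValueError("unknown driver: " + cfg["driver"])

conn = connect("Payroll")  # the user simply picks "Payroll" from a list
conn.execute("CREATE TABLE pay (emp TEXT, amount REAL)")
conn.execute("INSERT INTO pay VALUES ('Ana', 1200.0)")
total = conn.execute("SELECT SUM(amount) FROM pay").fetchone()[0]
print(total)  # 1200.0
```

The application code above only ever mentions the name "Payroll"; swapping the underlying engine or file location means editing the registry, not the application.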

Architecture of a DBMS

Data Modeling

A data model is used to describe the logical and physical structure of a database. Relationships, data types and constraints make up this structure, which is divided into two levels:

High level - called the conceptual data model or Entity-Relationship model; its main purpose is a representation of the data that comes closest to the way the user sees the data.

Low level - known as the physical data model; it provides a detailed view of how the data are stored in the computer.

Schemas

The "description" of the database is what we call the "schema of the database", which is specified when the database is designed.

Instances

Instances are formed when data is saved in the database at a given moment; these database instances change every time the database is modified. The DBMS ensures that every instance satisfies the schema of the database, respecting its structure and its constraints.

The main objective of a DBMS architecture is to separate user applications from the physical data, which is achieved through the levels below:

Internal level, or internal schema - uses a data model that shows the physical storage structure of the database, the details of the saved data, and the access paths.

Conceptual level, or conceptual schema - gives a complete description of the structure of the database, but offers no details of how the data are stored.

External level, or view level - describes the views of the database for a group of users, showing which part of the database each group of users can access.

Figure 1: Architecture of a DBMS in Layers
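The three levels can be made concrete with Python's built-in sqlite3 module. This is a minimal sketch (table and column names are invented for illustration): the base table stands for the conceptual schema, an index is a physical access path at the internal level, and a view is an external-level window onto the data.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Conceptual level: the full logical structure of the database.
cur.execute("CREATE TABLE employee (id INTEGER PRIMARY KEY,"
            " name TEXT, salary REAL, dept TEXT)")

# Internal level: a physical access path (storage detail the user never sees).
cur.execute("CREATE INDEX idx_employee_dept ON employee(dept)")

# External level: a view exposing only what one user group may see
# (here, names and departments but not salaries).
cur.execute("CREATE VIEW employee_public AS SELECT name, dept FROM employee")

cur.execute("INSERT INTO employee VALUES (1, 'Maria', 3000.0, 'HR')")
rows = cur.execute("SELECT * FROM employee_public").fetchall()
print(rows)  # [('Maria', 'HR')] -- the view hides the salary column
```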

Data Independence

Data independence means that the schema at one level of the database can be changed without changing the schema at a higher level (only the levels shown in Figure 1 are involved). There are two types of data independence:

Logical data independence: the conceptual level can be changed without any change to the external level or to user applications.

Physical data independence: the internal level can be changed without having to change the conceptual level, the external level, or user applications.

Languages for Data Definition and Manipulation

The DDL (Data Definition Language) is used to define the conceptual and internal schemas. When there is no clear separation between the internal and conceptual levels, the DBMS compiles the DDL statements and stores the schema descriptions in the DBMS catalog. When the two levels are kept separate, the SDL (Storage Definition Language) is used to specify the internal schema. Finally, when the DBMS uses the three-level architecture, the VDL (View Definition Language) is used to define the views of the database. A developer who knows SQL (Structured Query Language) can simply use the basic commands that make up its DDL, DML and DCL groups, which are described below:

DDL (commands that create objects) - CREATE TABLE, CREATE VIEW, CREATE INDEX, CREATE PROCEDURE, among others.

DCL (commands that help secure the database) - GRANT, REVOKE.

DML (commands responsible for reading and altering the data) - SELECT, DELETE, UPDATE, INSERT.

Classification of DBMSs

Users: single-user systems, typically run on workstations, and multi-user systems, run on minicomputers and large machines.

Location: there are two cases, centralized and distributed. A DBMS is centralized when all the data resides at a single site, and distributed when the data is spread across multiple machines.

Environment: there are two types, a homogeneous environment formed by a single kind of DBMS, and a heterogeneous environment composed of different DBMSs. One example of the latter is a system running two types of database.

Figure 2: Structure of a Database Management System
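A minimal sketch of the DDL and DML groups in action, using Python's built-in sqlite3 module (SQLite has no user accounts, so the DCL commands GRANT and REVOKE are only noted in a comment). The final step also illustrates the logical data independence described above: after a conceptual-level change to the base table, the external view still works unchanged. Table and column names are invented for illustration:

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# DDL: commands that create objects.
conn.execute("CREATE TABLE product (id INTEGER PRIMARY KEY,"
             " name TEXT, price REAL)")
conn.execute("CREATE INDEX idx_product_name ON product(name)")
conn.execute("CREATE VIEW product_names AS SELECT name FROM product")

# DML: commands that read and alter the data.
conn.execute("INSERT INTO product VALUES (1, 'keyboard', 25.0)")
conn.execute("UPDATE product SET price = 20.0 WHERE id = 1")
row = conn.execute("SELECT name, price FROM product WHERE id = 1").fetchone()
print(row)  # ('keyboard', 20.0)

# DCL (GRANT, REVOKE) is not shown: SQLite has no user accounts, so
# access control is handled outside the engine (e.g. file permissions).

# Logical data independence: the conceptual schema changes, but the
# external view, and any application built on it, is unaffected.
conn.execute("ALTER TABLE product ADD COLUMN stock INTEGER")
names = conn.execute("SELECT name FROM product_names").fetchall()
print(names)  # [('keyboard',)]
```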

DATA MODELS In software engineering, the term data model is used in two related senses. In the sense covered by this article, it is a description of the objects represented by a computer system together with their properties and relationships; these are typically "real world" objects such as products, suppliers, customers, and orders. In the second sense, covered by the article database model, it means a collection of concepts and rules used in defining data models: for example the relational model uses relations and tuples, while the network model uses records, sets, and fields.
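A data model in the first sense can be sketched directly in code. The sketch below uses Python dataclasses to describe a few "real world" objects with their properties and relationships; the entity names and attributes are invented for illustration, not taken from any particular modeling notation:

```python
from dataclasses import dataclass, field

@dataclass
class Product:
    sku: str
    name: str
    price: float

@dataclass
class Customer:
    customer_id: int
    name: str

@dataclass
class Order:
    order_id: int
    customer: Customer  # relationship: each order belongs to one customer
    lines: list = field(default_factory=list)  # relationship: an order holds products

    def total(self):
        return sum(p.price for p in self.lines)

alice = Customer(1, "Alice")
order = Order(100, alice,
              [Product("K1", "keyboard", 25.0), Product("M1", "mouse", 10.0)])
print(order.total())  # 35.0
```

In the second sense (the database model), the same information would instead be expressed as relations and tuples, or as records, sets, and fields in the network model.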

Overview of data modeling context: a data model is based on data, data relationships, data semantics and data constraints. A data model provides the details of the information to be stored, and is of primary use when the final product is the generation of computer software code for an application or the preparation of a functional specification to aid a computer software make-or-buy decision. The figure is an example of the interaction between process and data models.

Data models are often used as an aid to communication between the business people defining the requirements for a computer system and the technical people defining the design in response to those requirements. They are used to show the data needed and created by business processes. According to Hoberman (2009), "A data model is a wayfinding tool for both business and IT professionals, which uses a set of symbols and text to precisely explain a subset of real information to improve communication within the organization and thereby lead to a more flexible and stable application environment."[2]

A data model explicitly determines the structure of data. Data models are specified in a data modeling notation, which is often graphical in form.[3] A data model can sometimes be referred to as a data structure, especially in the context of programming languages. Data models are often complemented by function models, especially in the context of enterprise models.

DESIGN OF A DATABASE
Database design is the process of producing a detailed data model of a database. This logical data model contains all the needed logical and physical design choices and physical storage parameters needed to generate a design in a Data Definition Language, which can then be used to create a database. A fully attributed data model contains detailed attributes for each entity.

The term database design can be used to describe many different parts of the design of an overall database system. Principally, and most correctly, it can be thought of as the logical design of the base data structures used to store the data. In the relational model these are the tables and views. In an object database the entities and relationships map directly to object classes and named relationships. However, the term database design could also be used to apply to the overall process of designing not just the base data structures, but also the forms and queries used as part of the overall database application within the database management system (DBMS).[1]

The process of doing database design generally consists of a number of steps which will be carried out by the database designer. Usually, the designer must:

Determine the relationships between the different data elements.

Superimpose a logical structure upon the data on the basis of these relationships.[3]

Design process

1. Determine the purpose of the database - This prepares you for the remaining steps.

2. Find and organize the information required - Gather all of the types of information to record in the database, such as product name and order number.

3. Divide the information into tables - Divide the information items into major entities or subjects, such as Products or Orders. Each subject then becomes a table.

4. Turn information items into columns - Decide what information needs to be stored in each table. Each item becomes a field, and is displayed as a column in the table. For example, an Employees table might include fields such as Last Name and Hire Date.

5. Specify primary keys - Choose each table's primary key. The primary key is a column, or a set of columns, that is used to uniquely identify each row. An example might be Product ID or Order ID.

6. Set up the table relationships - Look at each table and decide how the data in one table is related to the data in other tables. Add fields to tables or create new tables to clarify the relationships, as necessary.

7. Refine the design - Analyze the design for errors. Create the tables and add a few records of sample data. Check whether the expected results come back from the tables. Make adjustments to the design, as needed.

8. Apply the normalization rules - Apply the data normalization rules to see if the tables are structured correctly. Make adjustments to the tables as needed.
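Steps 3 through 7 above can be sketched against SQLite from Python. This is a toy Products/Orders design (the table names come from the examples in the steps; everything else is invented for illustration): one table per subject, a primary key for each, a foreign key expressing the relationship, and a few sample records to check that the expected results come back.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")  # enforce the relationship

# Steps 3-5: one table per subject, columns per item, a primary key each.
conn.execute("CREATE TABLE products ("
             " product_id INTEGER PRIMARY KEY,"
             " name       TEXT)")
conn.execute("CREATE TABLE orders ("
             " order_id   INTEGER PRIMARY KEY,"
             # Step 6: a field relating each order to a product.
             " product_id INTEGER REFERENCES products(product_id))")

# Step 7: add a few records of sample data and check the results.
conn.execute("INSERT INTO products VALUES (1, 'lamp')")
conn.execute("INSERT INTO orders VALUES (10, 1)")
row = conn.execute("SELECT o.order_id, p.name"
                   " FROM orders o JOIN products p USING (product_id)"
                   ).fetchone()
print(row)  # (10, 'lamp')
```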

Implementation

During the implementation phase, the physical realization of the database and application designs is carried out. This is the programming phase of systems development.

DGMS

A Decision Guidance Management System (DGMS) is a productivity platform for fast development of applications that require closed-loop data acquisition, learning, prediction, and decision optimization. The DGMS concept was introduced together with the first DGMS data model and its query language, DG-SQL. The DGMS data model is an extension of the relational model with probability distributions over a set of attributes as random variables. DG-SQL supports a seamless integration of (1) querying the data collection and construction of learning sets; (2) learning from the learning sets, using parameterized transformers and optionally defining an estimation utility, such as the sum of squares of errors, to be minimized; (3) probabilistic prediction and simulation, using expressions that involve random variables, such as expectation, variance and the probability of a logical formula; and (4) stochastic or deterministic optimization, where the search space is defined as a set of feasible non-deterministic query evaluations.
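The closed loop of acquisition, learning, prediction and optimization can be illustrated with a toy example in plain Python. This is only a sketch in the spirit of DGMS, not DG-SQL itself: the demand data, the Gaussian demand model and the stocking-cost function are all invented assumptions.

```python
import random
import statistics

random.seed(0)  # make the simulation repeatable

# 1. Data acquisition: observed daily demand for some product.
demand_history = [18, 22, 19, 21, 20, 23, 17]

# 2. Learning: treat demand as a random variable and estimate its
#    distribution parameters from the learning set.
mu = statistics.mean(demand_history)
sigma = statistics.stdev(demand_history)

# 3. Prediction / simulation: sample demand scenarios from the model.
scenarios = [random.gauss(mu, sigma) for _ in range(1000)]

def expected_cost(stock):
    # Assumed utility: overstock costs 1 per unit, unmet demand costs 4.
    return statistics.mean(
        max(stock - d, 0) * 1 + max(d - stock, 0) * 4 for d in scenarios
    )

# 4. Optimization: search the feasible decisions for the cheapest one.
best = min(range(10, 31), key=expected_cost)
print(best)
```

In a real DGMS, steps (1)-(4) would be expressed declaratively in DG-SQL over the relational data rather than hand-coded as above.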
