
A Client-Server Data Application

For
Production Drawing Department
High Steel Structures Inc.

Spring 2010

Than Lwin Aung


Department of Physics and Engineering
Elizabethtown College
Elizabethtown, Pennsylvania
Email: aungt@etown.edu.

This project report is presented to the faculty of the Department of Physics and Engineering, Elizabethtown College, in partial fulfillment of the requirements for the degree of Bachelor of Science in Computer Engineering.

Abstract: As information processing is an integral part of business organizations, developing an efficient information system has become a primary concern for software developers and software engineers. Mainly because of the complexity associated with software development, many organizations, such as the CCTA (Central Computer and Telecommunications Agency), ISO (International Organization for Standardization), and SEI (Software Engineering Institute), have implemented and standardized different software development models, such as the Waterfall Model, the V-Model, RAD (Rapid Application Development), Agile methods, XP (Extreme Programming), and ISO 12207. Although the primary goal of these models is to deliver a software product that meets the user requirements within time and budget constraints, they provide different development environments. While the waterfall model provides a more structured approach to developing software systems, RAD is more suitable for systems that are dynamic in nature. My goal for the senior project is to develop a client-server data application system that will support part of the information systems used in the Production Drawing Department of High Steel Structures Inc. This paper describes the software development process of my senior project and the evaluation of the project outcomes.

1. Introduction
High Steel Structures Inc. is a company that not only designs bridge structures but also fabricates, ships, and erects them on site. Although the company as a whole has many business activities and related information systems, my senior project is confined to the business activities of the Production Drawing Department. As its name suggests, the department is concerned with the production of engineering and design drawings for bridge structures, and its information systems are primarily concerned with keeping track of information about design drawings, the time and costs associated with engineering drawings, and other information related to shop and estimate drawings. Although there are more than ten subsystems, the primary systems currently in use are the DCS (Document Control System), the Estimate Drawing System, and the Shop Drawings System. Since each information system was developed to satisfy the specific needs of particular business activities, there are very few interactions among these separate systems. In addition, most of them were developed more than ten years ago, and some of their processes have become obsolete and cannot meet the requirements of new business activities. These problems and the requirements of new business changes mean that the current systems must be modified and updated.

At present, the company cannot provide a solution that integrates those separate information systems and their related databases, so it would like to update the existing systems in a piecemeal fashion. In fact, my internship last summer was primarily devoted to the incremental update of those separate information systems to meet the requirements of new business processes. However, the piecemeal approach is not an ultimate solution, as it cannot provide a structured way to integrate all the information systems.

Therefore, as my senior project, I developed a prototype of an integrated information system that can support the functionality of the different sub-systems. The new integrated information system will not only eliminate data duplication but also provide new information processing features to meet the requirements of the new business activities. Moreover, it can provide data integrity and security, because all important data will be centralized in one database instead of being stored in different databases.

During the summer, I analyzed the current information systems and defined the problems and requirements for new updates. Throughout the analysis phase of the project, I used both the structured and the object-oriented approach to model the current business activities and their related information systems. The details of the techniques and software development processes are discussed in the following sections. During the fall semester, I designed and implemented the prototype of the client-server information system to meet the requirements specified during the summer. In the following spring semester, the working prototype was tested and evaluated to examine whether it can actually be implemented as the real information system.

2. Background
2.1 Software Development
Software development processes are the activities that take place during the development of a software product. There are four major software development activities, or phases:

1. Requirements Analysis and Specification


2. Design
3. Implementation
4. Testing & Evaluation

Various software development models specify how we move through these phases, depending on the nature of the software being developed. The primary goal of these models is to allow us to deliver the software on time and within budget, with all the specified functions.
According to SEI (Software Engineering Institute) at CMU [1]:
“Organizations and governments worldwide will spend about $1 trillion this year on IT projects.
Recent data suggested only about 35 percent of those projects are likely to be completed on time
and on budget, with all their originally specified features and functions. Many projects, perhaps
20 percent, will be abandoned, often after multimillion-dollar investments—and the biggest
projects will fail most often.”

This strongly suggests that each and every phase of software development must be carefully carried out.

Depending on the model of software development [2], the transition from one phase to another can be either iterative or sequential. In the conventional waterfall model, one proceeds from one stage to another in a sequential manner. However, in the real world, organizations are subject to change, and it is essential for software developers to cope with new changes during development. Therefore, a modified version of the waterfall model, which allows moving back and forth between stages, has been introduced. Although there are other models, such as the spiral model, that can adapt better to changes during development, for organizations that are more static in nature the waterfall model still seems preferable, as it provides a structured approach to software development in a cost-effective and time-efficient way.

For my software development, I used the waterfall model, since the software being developed is static in nature. The first three phases of the development were carried out more iteratively; the later stages were carried out in a sequential manner.

Requirements Analysis and Specification → Design → Implementation → Testing → Evaluation

Figure 1: Software Development Phases (The Waterfall Model)

2.2 Software Development Methodologies

A software development methodology refers to a framework used to structure, plan, and control the process of developing an information system. A wide variety of such frameworks has evolved over the years, each with its own recognized strengths and weaknesses. Until recently, there have been two major paradigms of software development methodologies:

1. Structured Methodologies

2. Object Oriented Methodologies

However, new methodologies, such as Agile and Extreme Programming, have appeared to cope with the more dynamic nature of newly emerging information systems. In this background research, I focus primarily on structured methodologies and object-oriented methodologies, since I use these two approaches, whenever appropriate, in my software development.

SSADM is one of the best-known structured methodologies; it was introduced for the CCTA (Central Computer and Telecommunications Agency), UK, in 1980 [3]. It is a structured approach to the analysis and design of information systems. It effectively follows the waterfall model and provides three primary modeling techniques: logical data modeling (Entity Relationship Diagram), data flow modeling (Data Flow Diagram), and entity behavior modeling (Entity Life History). In other words, SSADM provides a three-dimensional perspective of an information system: one dimension defines the structure of the data, another defines the interconnection of the components of the system, and the last defines the dynamic behavior of the system. In this way, SSADM produces the logical models (designs) of the information system.

Generally, the techniques of SSADM are most appropriate for procedural (structured) programming languages and relational databases. One of the reasons I researched SSADM is that the information systems used in HSSI were developed with MS Access™ and VBA™ (Visual Basic for Applications), which are more procedural than object-oriented, so I believed SSADM would be well suited for analyzing and modeling those systems.

Nowadays, however, almost all programming languages, such as C++, Java, and Visual Basic.NET, are more or less object-oriented. The object-oriented approach differs somewhat from the so-called structured approach. In an object-oriented environment, each object is defined by both data and functions, and each object communicates with other objects by passing messages through pre-defined interfaces. Because of reusability and data encapsulation for information hiding, OOP has become popular. Therefore, in order to deal with modern programming languages, it is necessary to understand object-oriented methodologies.

UML is an object-oriented modeling methodology; it was first introduced by Grady Booch, Ivar Jacobson, and James Rumbaugh as a standard notation for object-oriented analysis and design [4]. UML also provides a set of graphical modeling techniques for the object-oriented programming environment. In contrast to SSADM, UML is a more powerful tool for contemporary OOP (Object Oriented Programming) languages, such as C++, Java, and C#. Since the implementation of the software requires the use of an object-oriented language, UML is a better tool for modeling and designing the new information system. In particular, I use class diagrams to model the application program.

2.3 Software Architectures

Although both SSADM and UML provide the logical models (designs) of the system, they are little concerned with physical design and implementation. Needless to say, however logically sound the design may be, it is useless if it cannot be realized in a physical implementation.

In fact, various software architectures provide the framework for a software developer to physically implement the logical model of an information system. Generally, every information system involves three main functions: data storage, data processing (application logic), and data input and output (user interface). Depending on the software architecture, these can be performed on a server, on a client, or divided between the server and the client [5] [6] [8].

2.3.1 Server Based (Mainframe) Processing

In a server-based environment, all of the functions of the information system are carried out on the central server. The main server, typically a mainframe computer, is connected to terminals for data input and output. Although this approach provides powerful data processing, it might not be a good option for systems that do not need much data processing.

2.3.2 Stand-Alone Computing and Peer to Peer Processing

In this environment, the data processing is carried out on a stand-alone computer. Data communication can be achieved through a local area network or a peer-to-peer network. However, since the data processing a stand-alone computer can handle is limited, this approach is not suitable for heavy data processing.

2.3.3 Client-Server Processing

As we can see, this environment is a hybrid of stand-alone processing and server-based processing. Although there is a server that handles most of the data processing, the clients also perform part of it, so the server and the clients share the responsibility for processing the data. Since this approach balances the processing workload and provides a flexible design, it has become more and more popular. Recently, instead of just two-tier client/server, n-tier client/server architectures have appeared, in which a number of servers are responsible for different data processing and data storage tasks.

The following table shows the distribution of data, application logic and user interface in
different architecture environments [6].

Architecture               Tier                  Data    Application Logic    User Interface
Mainframe                  Server                 x              x                  x
                           Client
Stand-alone                Server
                           Client                 x              x                  x
Two-tier Client/Server     Server                 x              x
                           Client                                x                  x
Three-tier Client/Server   Data Server            x
                           Application Server                    x
                           Client                                                   x

3. Requirement Analysis and Specifications

3.1 Requirement Specifications

Every information system development project starts with requirement analysis, and mine is no exception. Requirement analysis allows me not only to understand the operation of the information systems but also to define the design constraints and specifications. As stated above, during my internship I analyzed the current information systems and specified the problem definitions [2] [3].

Requirement specification, also known as requirement modeling, involves analyzing the current systems in terms of input, output, process, performance, and security [6]. This usually follows a top-down approach, starting with the whole departmental structure and narrowing down to the detailed business operations. Although there are minor requirements in terms of input, output, and process, the issues with performance and security create the major problems.

The current systems use Microsoft Access databases distributed across the local area network, with the database files shared over the network. Basically, it is a peer-to-peer processing system.

[Figure content not reproduced: the DCS, SDS, and EVS database files are shared by the workstations over the network.]

Figure 2: Current Physical Systems

Performance Problems    Current Issues                                      Specifications
Network Traffic         High (due to unnecessary file sharing)              Reduce network traffic
Concurrency             Files cannot be updated at the same time            Provide concurrency
Application Crashes     Occur frequently when multiple users access         Avoid application crashes
                        the database at the same time

Security Problems       Current Issues                                      Specifications
Data Integrity          Problems with some referential constraints          Provide referential constraints
User Security           Controlled by the network directory,                Provide database-level user security
                        not by the database system

From the requirement specifications, it is clear that most of the problems can be solved by re-implementing the system on a client-server architecture. There are many options available for implementing a client-server system: it could be built with a MySQL™ database server and a Java application, or with MS SQL Server and a Visual Studio application.

Fortunately, MS SQL Server and MS Visual Studio were more accessible to me through the college, so I decided to implement the system on the Microsoft platform rather than the Sun platform. Later, I found that this was the better choice, as Visual Studio provides better support for system implementation and debugging.

3.2 Project Schedule and Budget


After defining the requirement specifications and the system architecture to be used, I estimated
the project schedule and budget.

Phase                                       Start Date    End Date
Requirements Analysis and Specification     05/18/2009    08/21/2009
Design                                      07/22/2009    09/21/2009
Implementation                              08/22/2009    11/28/2009
Testing                                     09/01/2009    04/26/2010
Evaluation and Reports                      04/27/2010    05/06/2010

Component                                          Use Cost    Actual Cost
Microsoft Visio 2007                               $89.98      $0.00
Microsoft Visual Studio 2005 Enterprise Edition    $567.54     $0.00
Microsoft SQL Server 2005 Express Edition          $0.00       $0.00
Total                                              $657.02     $0.00

Since the actual cost of implementing the project is practically zero, it is feasible to carry out the project within time and budget. However, to meet the requirement specifications, it is important to model the systems using software development methodologies.

4. Software Design

Generally, software design involves both logical design and physical design. The following sections explain the step-by-step design process I carried out during the design phase. Logical design is primarily concerned with modeling the system without considering how it will be physically implemented. As stated above, SSADM provides the logical designs, from which the physical design can be modeled and built. The physical design is responsible for transforming the logical design into a physical implementation within the design constraints; the physical design process is discussed later, as it builds on the logical design [2] [3].

4.1 Logical Design


According to SSADM, the logical design process involves three models that together specify the information system. All of them follow a top-down approach and decompose the system into functional modules that are interconnected with each other. During the design phase, I iteratively re-worked the logical models to arrive at the following logical design.

4.1.1 Data Flow Diagram

The Data Flow Diagram (DFD) models the data stores and processes of the information system and their interactions. It defines the structure of the information system and how the system transforms input data into useful information. In brief, a DFD tells us what the system does, but not how it does it [6].

I use the Gane and Sarson symbol set; another popular set is the Yourdon symbol set. A DFD uses four symbols: process, data flow, data store, and external entity. A process, represented by a rounded rectangle, contains the business logic or business rules; it receives input data and produces output data according to those rules. A data flow defines the path along which data moves from one part of the information system to another and is represented by a directed arrow. A data store indicates data storage in the information system and is represented by an open-ended rectangle. Finally, an external entity is represented by a rectangle and either provides data to the system or receives output from it.

Since the DFD follows a top-down approach, it starts with the context diagram, a top-level view of the information system. The context diagram defines the system boundary and the total input and output data flows of the system, as well as the interaction of the system with the external entities. Once the context diagram is defined, it is broken down further into diagram zero, which decomposes it into the processes, the data stores, and the data flows between them.

In my system, there are five major processes: Job Initialization, Input Cell, In-House Detailing, Sublet Detailing, and Release Drawings. Each process is further broken down into smaller units, called functional primitives, which are used later to develop the code.

[Figure content not reproduced: the context diagram shows the single process 0, HSSI EGR INFO SYS, and its data flows to and from the external entities Estimator, Master Scheduler, Engineering Administrator, Drawing Production Manager, In-House Detailer, and Print Operator.]

Figure 3: Context Diagram

As we can see, the context diagram defines the system boundary and scope, and all the
interactions between external entities and the system. In fact, the context diagram provides a top-
level view of the system.

[Figure content not reproduced: diagram zero decomposes the context diagram into the five processes (1 Job Initialization, 2 Input Cell, 3 In-House Detailing, 4 Sublet Detailing, 5 Release Drawings), the data stores (Job, Shipment, Employee, Contract Drawings, Shop Drawings, Job Address, Shipment Address), and the data flows among them and the external entities.]

Figure 4: Diagram Zero

The diagram zero breaks the context diagram into processes, data stores, and the data
flows.

[Figure content not reproduced: the Job Initialization process is decomposed into the functional primitives 1.1 Specify Job Information, 1.2 Get Estimate Number and Job Required Date, 1.3 Get Contract Number, 1.4 Divide Job into Shipments, and 1.5 Assign Project Manager.]

Figure 5: Job Initialization Process

The Job Initialization process is concerned with initializing a job; it starts the whole data processing of the system.

[Figure content not reproduced: the Input Cell process is decomposed into the functional primitives 2.1 Create Job Address for New Job, 2.2 Create Job Folder, 2.3 Input Contract Drawings, 2.4 Specify Shipment Information, 2.5 Decide Sublet/In-House, and 2.6 Input Shipment Address.]

Figure 6: Input Cell Process

The Input Cell process defines how the engineering administrators enter job and shipment information into the system.

[Figure content not reproduced: the In-House Detailing process is decomposed into the functional primitives 3.1 Create Shop Drawings, 3.2 Review Shop Drawings, 3.3 Update Shipment Info, and 3.4 Print Specs.]

Figure 7: In-House Detailing Process

The In-House Detailing process starts once the engineering administrators issue the drawing specs to the in-house detailers (usually drafters). If the drawing workload is too large to handle in house, the work is contracted out to sublet detailers.

[Figure content not reproduced: the Sublet Detailing process mirrors the in-house process, with the functional primitives 4.1 Create Shop Drawings, 4.2 Review Shop Drawings, 4.3 Update Shipment Info, and 4.4 Print Specs.]

Figure 8: Sublet Detailing Process

The Sublet Detailing process is concerned with processing the drawing information when the drawings are sublet to outside detailers.

[Figure content not reproduced: the Release Drawings process is decomposed into the functional primitives 5.1 Review Shop Drawings, 5.2 Close Shipment and Job, and 5.3 Change Job Status, producing the job and shipment reports for the Production Drawing Manager.]

Figure 9: Release Drawings Process

This is the final process, and it handles drawings once they are complete and approved. When the job is complete, its status is changed from 'active' to 'fabrication'.

As stated above, data flow modeling provides the logical system structure, but it does not describe the data structures or the dynamic behavior of the system. Therefore, it is necessary to carry out logical data modeling (Entity Relationship Diagram) and entity behavior modeling (Entity Life History) to get a better understanding of the nature of the system.

4.1.2 Entity Relationship Diagram

The Entity Relationship Diagram (ERD) defines the logical data structure of the system [6] [7].
[Figure content not reproduced: the ERD contains the entities JobType, Addresses, Job, JobAddress, ContractDrawings, ContractDrawingsRevision, Drawings, Employee, EmployeeType, StateCode, Shipment, ShipmentAddress, ShopDrawings, and ShopDrawingsRevision, with their primary and foreign keys and the update/delete rules on each relationship.]

Figure 10: Entity Relationship Diagram

Unlike the DFD, the ERD approaches the system from a different perspective. First, the entities used in the system are defined, such as Employee, Job, and Shipment. Each entity is uniquely identified by an identifier called the primary key. Each entity also contains other data fields known as attributes. Moreover, an entity can contain the primary key of another entity as a foreign key if there is a relationship between the two entities; in fact, a relationship between two entities is defined through primary and foreign keys. Three forms of relationship can exist between two entities: one-to-one, one-to-many, and many-to-many. For example, the Job entity has a one-to-many relationship with the Shipment entity, as each job contains one or more shipments. The ERD can also be used to define the data integrity rules known as referential integrity, which I discuss further in the physical design.

4.1.3 Entity Life History Diagram

The dynamic behavior of the system is defined by the ELH (Entity Life History) diagram. The ELH is similar to a state transition diagram: its primary goal is to describe the dynamic states of the system. First, the primary entity of the information system is defined, and its life history in the system is modeled using a Jackson structure diagram [6]. The following diagram describes the different stages of the 'Job' entity throughout its life.

[Figure content not reproduced: the Entity Life History of the Job entity, showing stages such as Initialize Job, Create Job, Create Job Address, Create Contract Drawings, Review Contract Drawings, Create Job Folder, Create Shipment, Create/Update Shipment Address, Sublet or In-House detailing, Create and Review Shop Drawings, Update Shipment, and Archive Job.]

Figure 11: Entity Life History of Job

4.2 Physical Design and Implementation

The physical design process is primarily concerned with implementing the logical design of the system on the selected software/hardware platform. In fact, physical design is highly interrelated with physical implementation, so the two are discussed together here. According to the requirement specifications, the system is to run in a client/server environment with MS SQL Server. The two-tier client/server implementation is illustrated in the following figure [6] [7] [8].

[Figure content not reproduced: the HSSI database resides on MS SQL Server, with the workstations connecting to it as clients.]

Figure 12: Two-Tier Client/Server Implementation

As part of the physical design, it is important to specify the server processes, the client processes, the interaction between server and client, and finally the physical data structures of the system (the database).

The logical design, especially the DFD, provides the framework for the processes of the information system. However, the distribution of those processes depends on the software architecture. In my case, with a two-tier client/server architecture, some processes are configured as server-side processes and some run as client-side processes. Obviously, to reduce network traffic, it is important to run computation-intensive processes on the client side and data-intensive processes on the server side.

In addition to the processes, the data structures of the system need to be physically defined. Since all the data stores of the system will reside on MS SQL Server, the database schema (the data structures) has to be implemented on MS SQL Server. The implementation of the database on SQL Server is based on the ERD, since the ERD provides the logical data structures of the system. I therefore divided the physical design process into two phases: database design and application design.

4.2.1 Database Design and Implementation

Fortunately, transforming the ERD into the actual database is fairly straightforward: each entity in the ERD becomes a data table in the database. With MS SQL Server Management Studio Express, I can easily define the data tables and their relationships. The following screenshot shows the data definition of the Job table and the definition of the relationship between the Job table and the Shipment table.

As stated above, there is a one-to-many relationship between Job and Shipment: each job has one or more shipments, so the Job table is the parent table and the Shipment table is the child table. The relationship requires defining referential integrity between the two tables. There are two referential integrity rules, the Delete Rule and the Update Rule. The Delete Rule is triggered whenever a record in the parent table is deleted; since the child table depends on the parent table, deleting a parent record affects the records of the child table. By setting the Delete Rule to cascade, the corresponding records of the child table are deleted when the record in the parent table is deleted. The Update Rule is defined similarly to enforce referential integrity. Finally, the other tables are defined, along with the relationships between them.

Figure 13: Job Table Definition and Job-Shipment Relationship Definition

Figure 14: Database Diagram defined on SQL Server
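Although the tables in Figures 13 and 14 were defined interactively through Management Studio, the same relationship could also be created in code. The following is only an illustrative sketch, with simplified column types, of how the Job-Shipment relationship with cascading Delete and Update rules might be declared through an SqlCommand; it is not the actual schema definition used in the system.

    ' Hedged sketch only: simplified CREATE TABLE statement with cascading
    ' referential integrity rules, executed over the existing connection.
    Dim ddl As String = _
        "CREATE TABLE Shipment (" & _
        "  JobNbr varchar(10) NOT NULL," & _
        "  JobShipmentNbr int NOT NULL," & _
        "  InCharge int NULL," & _
        "  PRIMARY KEY (JobNbr, JobShipmentNbr)," & _
        "  FOREIGN KEY (JobNbr) REFERENCES Job (JobNbr)" & _
        "    ON DELETE CASCADE ON UPDATE CASCADE)"

    Dim ddlCmd As New SqlCommand(ddl, frmMain.dbConnection)
    ddlCmd.ExecuteNonQuery()
    ddlCmd.Dispose()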

4.2.2 Application Design and Implementation

Application design is primarily concerned with the development of the Visual Basic data application. The DFD provides the framework for the processes to be developed, and the application design transforms the DFD into the Visual Basic application. Generally, the application design process follows a top-down approach and involves a set of design steps:

1) Class Design – to define the infrastructure and components of the application program
2) Screen Design – to define user interface and data input/output
3) Code Design – to define the application logic and data access functions
4) Data Security – to define data and user security

21
Class design models the processes in terms of classes and their relationships to each other. In this way, it provides a blueprint for understanding how the processes can be modularized into components and how those components relate to each other. A class is an ADT (Abstract Data Type) that encapsulates both data and functions. The following figure shows a typical definition of a class. Each class member belongs to one of four access levels: Public, Protected, Private, and Friend, and both data and functions can be declared with any of them. Public members (either data or functions) can be accessed by other classes, whereas Private members can be accessed only by the class's own member functions.

Figure 15: Class Definitions
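To make the four access levels concrete, the following is a minimal illustrative sketch of a class; the class and member names are hypothetical and are not taken from the actual application.

    ' Illustrative only: a hypothetical class showing the four access levels.
    Public Class JobRecord

        Public JobNbr As String             ' visible to any other class
        Friend JobTypeNbr As Integer        ' visible within the same assembly (project)
        Protected ReqDate As Date           ' visible to this class and classes derived from it
        Private estimateNbr As String       ' visible only to members of this class

        ' A private field is normally exposed through a public member function.
        Public Function GetEstimateNbr() As String
            Return estimateNbr
        End Function

    End Class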

Generally, each form represents a functional primitive (a lowest-level process) of the DFD. As we can see, the 'frmJobFolder' class represents the Create Job Folder process, and 'frmJobAddress' is responsible for creating job addresses. Similarly, other processes are defined as class definitions. Each class includes a set of member functions that perform the necessary business processes and a set of member data that store the final or intermediate results of those processes. For example, the GetJobNbr() function reads the Job Number from the Job data table. A complete class diagram is given in Appendix A.
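As an illustration of such a data access function, the following is a hedged sketch of what GetJobNbr() might look like; the SQL statement and the use of ExecuteScalar are assumptions, not the actual implementation.

    ' Sketch: read the highest existing Job Number from the Job table on the server.
    Private Function GetJobNbr() As String
        Dim cmd As New SqlCommand("SELECT MAX(JobNbr) FROM dbo.Job", frmMain.dbConnection)
        Dim result As Object = cmd.ExecuteScalar()
        cmd.Dispose()

        If result Is Nothing OrElse result Is DBNull.Value Then
            Return ""                      ' no jobs defined yet
        Else
            Return CStr(result)
        End If
    End Function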

Screen design primarily focuses on developing the user interfaces and data input/output. Since data can enter the system only through the user interface, it is important to make sure that only valid data is entered. A good screen design has to provide a user-friendly interface and rigorous data validation checks. I use MDI (Multiple Document Interface) for the screen design; in MDI, the main form acts as a container that displays other documents and forms inside it.

Figure 16: Multiple Document Interface
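A child form is displayed inside the MDI container by setting its MdiParent property before showing it. The following minimal sketch assumes the main form is named frmMain and is configured with IsMdiContainer = True, and that a form class named frmShipment exists:

    ' Sketch: open the Shipment form as a child window inside the MDI main form.
    Dim shipmentForm As New frmShipment()
    shipmentForm.MdiParent = frmMain
    shipmentForm.Show()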

The following figure shows the screen design for the Shipment form. It includes a set of combo boxes, text boxes, check boxes, and control buttons for the user to enter and process the data.

Figure 17: 'Shipment Form' Screen Design
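As noted above, each data entry form also has to validate its input before the data is saved. The following is a minimal sketch of such a check for the Shipment form; the control names (cboJobNbr, txtShipmentNbr) are assumed for illustration.

    ' Sketch: reject the entry if no job is selected or the shipment number is not numeric.
    Private Function ValidShipmentInput() As Boolean
        If cboJobNbr.SelectedIndex = -1 Then
            MessageBox.Show("Please select a Job Number.")
            Return False
        End If

        If Not IsNumeric(txtShipmentNbr.Text) Then
            MessageBox.Show("The Shipment Number must be numeric.")
            Return False
        End If

        Return True
    End Function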

Code design involves developing the client program in the VB.NET language and the server program as stored procedures on SQL Server. Since most of the processes are computationally intensive, the majority are implemented in the client application. However, as stated above, some processes are configured as server-side processes on SQL Server.

Generally, the operation of the client/server program can be described as follows. In order to perform data processing, the client application requests data access (with SQL statements) from the database stored on SQL Server. Once the server receives the data request, it responds to the client by providing the necessary data. When the client receives the data, it performs the necessary data processing and stores the results back on the server [6] [7] [8].

In fact, VB.NET provides a set of objects, called the SQL Client objects, that provide data access functionality for the client processes. The SQL Client objects enable the client software to access MS SQL Server through an SqlConnection object. Once the connection is established, the client can access the database on SQL Server.

Figure 18: SQL-Client Objects

The following code segment shows how to connect to the SQL Server with a connection string.

Private Function dbConnect(ByVal UsrName As String, ByVal Pwd As String, _
                           ByVal LocalAuth As Boolean) As Boolean

    Dim ConnectionString As String

    If LocalAuth = True Then
        ' Windows (integrated) authentication
        ConnectionString = "Data Source=CYBVIC\SQLEXPRESS;" & _
                           "Initial Catalog=HSSI;Integrated Security=True;"
    Else
        ' SQL Server authentication with the user name and password entered on the form
        ConnectionString = "Data Source=CYBVIC\SQLEXPRESS;" & _
                           "Initial Catalog=HSSI;Integrated Security=False;" & _
                           "User ID=" & txtUsrName.Text & ";Password=" & txtPwd.Text & ";"
    End If

    Try
        frmMain.dbConnection.ConnectionString = ConnectionString
        frmMain.dbConnection.Open()
        Return True
    Catch ex As Exception
        ' Connection failed (error handling added here to complete the excerpt)
        Return False
    End Try

End Function

The following code segment imports the system library files needed for the SQL Client objects.

Imports System
Imports System.Data
Imports System.Data.SqlClient

The following code segment is part of the class definition of the frmAddress class. The SQL statements are defined as private constant members of the class, and the DataAdapter, DataSet, DataTable, and DataRow objects are defined as private data members.

Public Class frmAddress

    'SQL statements for accessing SQL Server
    Const SQL_ADDRESS = "SELECT * FROM dbo.Addresses ORDER BY AddressID"
    Const SQL_JOBADDRESS = "SELECT * FROM dbo.Addresses WHERE AddressID = "
    Const SQL_STATE = "SELECT StateCode FROM dbo.StateCode"

    'ADO.NET objects used for data access
    Private dbDataAdapter As SqlDataAdapter
    Private dbDataSet As New DataSet
    Private dbDataTable As DataTable
    Private dbDataRow As DataRow

    'True while a new (not yet saved) address record is being edited
    Private dbNew As Boolean

    Public key As String

The following code shows how the DataAdapter object is created and how it is used to access the database.

Private Sub UpdateAddress()

    'Assign primary keys for seeking the specific record
    Try
        dbDataAdapter = New SqlDataAdapter(SQL_ADDRESS, frmMain.dbConnection)

        'Load the schema and the data of the Addresses table into the local DataSet
        dbDataAdapter.FillSchema(dbDataSet, SchemaType.Source, "dbo.Addresses")
        dbDataAdapter.Fill(dbDataSet, "dbo.Addresses")

        dbDataTable = dbDataSet.Tables("dbo.Addresses")

        If dbNew Then
            'Create a new row and copy the values entered on the form into it
            dbDataRow = dbDataTable.NewRow()
            dbDataRow("AddressID") = txtAddressID.Text
            dbDataRow("LastName") = txtLastName.Text

            dbDataTable.Rows.Add(dbDataRow)
            dbNew = False
        End If
        '... the remaining field assignments and the write-back to the server
        '    follow in the full program (see the sketch below)

    Catch ex As Exception
        MessageBox.Show(ex.Message)
    End Try

End Sub
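After the new row has been added to the local DataTable, the modified rows still have to be written back to the server. A common way to do this, shown here only as a hedged sketch rather than the project's exact code, is to let a SqlCommandBuilder generate the INSERT/UPDATE/DELETE commands for the adapter and then call Update:

    ' Sketch: push the locally modified rows in the DataSet back to the Addresses table.
    Dim builder As New SqlCommandBuilder(dbDataAdapter)
    dbDataAdapter.Update(dbDataSet, "dbo.Addresses")
    dbDataSet.AcceptChanges()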

In addition to the client processes, it is necessary to implement some processes, such as creating user logins and creating security policies, on SQL Server. Fortunately, SQL Server provides a programming facility called stored procedures for developing server-side processes. SQL Server also provides built-in stored procedures for common server operations related to user and data security, data definition, and data manipulation.

The following code segment shows how the client requests the server to run the server-side stored procedure "sys.sp_addlogin", which adds a login profile for a user.

'Build a command that calls the built-in stored procedure sys.sp_addlogin
SQLCmd = New SqlCommand
SQLCmd.CommandText = "sys.sp_addlogin"
SQLCmd.CommandType = CommandType.StoredProcedure
SQLCmd.Connection = frmMain.dbConnection

'Pass the login name, password and default database as parameters
SQLCmd.Parameters.AddWithValue("@loginame", txtLogin.Text)
SQLCmd.Parameters.AddWithValue("@passwd", txtComfirmPwd.Text)
SQLCmd.Parameters.AddWithValue("@defdb", "HSSI")

SQLCmd.ExecuteNonQuery()
SQLCmd.Dispose()

As stated above, one of the requirement specifications is to enhance data and user security. Therefore, defining user and data security is as important as developing the application. Generally, the security definition can be divided into login authentication and group policy. The following table shows the user group (database role) assigned to each user type.

    User Type                     Database Role
 1  Manager                       db_owner
 2  Engineering Administrator     db_ddladmin
 3  Project Manager               db_accessadmin
 4  Engineering Technician        db_datawriter
 5  Drafter                       db_datareader
 6  N/A                           public

The users are categorized into Manager, Engineering Administrator, Project Manager, Engineering Technician, Drafter, and N/A. Depending on the user type, access to the database is restricted; for example, a Drafter can neither update nor delete data. The user group policies enforce these restrictions on users.
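For example, once a login has been created with sys.sp_addlogin as shown earlier, it can be mapped to a database user and placed in one of these roles through the built-in procedures sp_grantdbaccess and sp_addrolemember. The following is an illustrative sketch only; the role chosen (db_datareader, i.e. a Drafter account) and the reuse of the txtLogin control are assumptions.

    ' Sketch: grant the new login access to the HSSI database and add it to db_datareader,
    ' so the account can read data but cannot modify it.
    Dim roleCmd As New SqlCommand("sp_grantdbaccess", frmMain.dbConnection)
    roleCmd.CommandType = CommandType.StoredProcedure
    roleCmd.Parameters.AddWithValue("@loginame", txtLogin.Text)
    roleCmd.ExecuteNonQuery()

    roleCmd = New SqlCommand("sp_addrolemember", frmMain.dbConnection)
    roleCmd.CommandType = CommandType.StoredProcedure
    roleCmd.Parameters.AddWithValue("@rolename", "db_datareader")
    roleCmd.Parameters.AddWithValue("@membername", txtLogin.Text)
    roleCmd.ExecuteNonQuery()
    roleCmd.Dispose()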

In addition to the group policy, each user requires authentication to access the database. Each authenticated user is therefore provided with a username and password for accessing the database.

Figure 19: User Authentication

Figure 20: User Profiles

Figure 21: User Profile Settings

5. Testing and Evaluation
Once the software development was complete, I carried out a series of tests to evaluate the performance and reliability of the program. Screenshots of the results are shown below.

Figure 22: Data Validation Check

Figure 23: Shop Drawings Data Entry

Figure 24: Job Summary Report

6. Conclusion
Information is an essential part of organizations, and information systems are necessary for the survival of organizations. It is not uncommon for the inefficiency of information systems to lead to the failure of a business. More importantly, information systems have a large impact on the ethical and social aspects of the operation of a business. A lack of data security and data integrity can raise ethical issues in an organization; a common example is unauthorized access to sensitive information. Also, chaotic interaction between information systems can result in problematic social communication among the workforce of the organization. Therefore, many organizations have implemented and standardized system development methodologies to ensure the delivery of efficient information systems. Still, both information and software are by nature intangible; sometimes it is difficult to reach a perfect understanding of them, even with systematic software development and engineering principles.

As stated above, my primary goal for the senior project was to understand software development principles as thoroughly as I could and to apply them to develop a working application. To this extent, I fulfilled the goal of the project, as I successfully built the software application by following those principles.

However, the software can always be improved in many ways. First, the report generation tool I used for the project is the built-in data report tool that is part of MS Visual Studio, and it can only provide static reporting. Static reporting requires the data set to be statically bound to the data sources at design time; it does not allow dynamic binding at run time. Therefore, new solutions are required to handle dynamic reporting. Second, the project is based on a simplified model of the HSSI information system. It does not cover all the functionality of the system, although it serves as a prototype for the client/server implementation of that system. Additional features would therefore make a better software system.

References
[1] Software Engineering Institute. "How we approach software development." CMU, Carnegie Mellon University, 2010. Web. 5 May 2010. http://www.sei.cmu.edu/solutions/softwaredev/

[2] Alan M. Davis, Edward H. Bersoff, and Edward R. Comer. "A Strategy for Comparing Alternative Software Development Life Cycle Models." IEEE Transactions on Software Engineering 14.8 (1988). Web. 5 May 2010.

[3] Simon Rogerson, John Weckert, and Chris Simpson. "An ethical review of information systems development." Information Technology and People 13.2 (2000). Web. 5 May 2010.

[4] Klaus Bergner, Andreas Rausch, and Marc Sihling. "Using UML for Modeling a Distributed Java Application." Institut für Informatik, Technische Universität München (1997). Web. 5 May 2010.

[5] M. Tamer Özsu. "Distributed Database Systems." University of Waterloo, Department of Computer Science (2002). Web. 5 May 2010.

[6] Shelly, Gary B., Thomas J. Cashman, and Harry J. Rosenblatt. Systems Analysis and Design. 7th Edition. Boston: Thomson Course Technology. Print.

[7] Connolly, Thomas M., and Carolyn E. Begg. Database Systems. 4th Edition. Essex: Pearson Education Limited, 2005. Print.

[8] Carey, Michael J., et al. "Data Caching Tradeoffs in Client-Server DBMS Architectures." (1991). Web. 5 May 2010.

Appendix – A

