A SEMINAR REPORT

Submitted by
Kamlesh Korat (Enrollment No. 080260107015)
Chirag Korat (Enrollment No. 080260107014)

BACHELOR OF ENGINEERING
in
COMPUTER ENGINEERING

KALOL INSTITUTE OF TECHNOLOGY & RESEARCH CENTRE
DEPARTMENT OF COMPUTER ENGINEERING
2011
CERTIFICATE
Date:
This is to certify that the dissertation entitled Discover India-Content Management System has been carried out by Kamlesh Korat & Chirag Korat under my guidance in fulfillment of the degree of Bachelor of Engineering in Computer Engineering (7th Semester) of Gujarat Technological University, Ahmedabad during the academic year 2011-12.
Guides:
ACKNOWLEDGEMENT
The project "Discover India: Content Management System" is itself an acknowledgement of the contribution of many individuals; to name all of them would be tedious, but a few must be singled out for special mention. To begin with, we would like to thank Prof. Mahesh Panchal, Head of the Computer Engineering Department, KITRC Kalol, our internal guide Mrs. Nikita D. Patel, and the whole staff, for what we are today is due to them. We wish to extend our thanks to our external guide Mr. Ravi Patel, employee of Oaasis (India) P. Ltd., Ahmedabad, for his valuable guidance throughout the various phases of the project. We are also thankful to Mr. Ajay Ojha, D.O., Oaasis Pvt. Ltd., Ahmedabad branch, for his support in providing valuable information about CMS, and to our library staff for providing literature for the project. Finally, and most importantly, we wish to thank our parents for their patience and encouragement during the project period. We thank everybody who has directly or indirectly helped in our endeavor; it has been a pleasure to work in such an environment. Last but not least, our thanks go to all our team co-workers, who made the project sessions light, enjoyable and most memorable.
ABSTRACT
Content management can be defined as the process of creating, managing and publishing online content without the need for any programming or technical skills; a Content Management System (CMS) is software that supports this process. Content management saves an organization a great deal of time and money if used appropriately, and can be achieved most easily by deploying a content management system. A content management system can be custom built, or bought ready-made from a vendor if an existing product suits your needs. Hundreds of content management vendors offer systems at prices ranging from a few cents to millions of dollars, so it is advisable to have a well-defined content plan in place for evaluating and choosing a content management system; otherwise the whole procedure can prove disastrous.
LIST OF FIGURES
Figure 2.2.1  Incremental Model
Figure 2.5.1  Risk Management
Figure 4.1.1  .Net Framework
Figure 4.1.2  .Net Framework Context
Figure 4.1.3  Language Compilation in .Net
Figure 4.1.4  ADO.Net Architecture
Figure 4.1.5  Database Architecture
Figure 5.1.1  Context Diagram
Figure 5.1.2  1st Level Data Flow Diagram
Figure 5.1.3  2nd Level Data Flow Diagram for Login
Figure 5.1.4  2nd Level Data Flow Diagram for Banners
Figure 5.3.1  ER Diagram (1)
Figure 5.3.2  ER Diagram (2)
Figure 5.3.3  Use Case of CMS
Figure 5.3.4  User Use Case
Figure 5.3.5  Admin Use Case
LIST OF TABLES
Table 2.1.1   Project Plan
Table 2.1.2   Team Details
Table 2.3.1   Software Tool Details
Table 2.4.1   Roles & Responsibilities (1)
Table 2.4.2   Roles & Responsibilities (2)
Table 2.5.1   Risk Planning
Table 2.5.2   Risk Monitoring
Table 2.6.1   Project Schedule
Effort Estimations
Table 3.2.1   Hardware Specification
Table 3.2.2   Software Specification
Table 4.1     Platform Details
Table 5.2.1   Registration
Table 5.2.2   Banner
Table 5.2.3   Links
Table 5.2.4   Menu
Table 5.2.5   Image Gallery
Table 5.2.6   Comment
Table 5.2.7   Article
Table 5.2.8   Author
Table 5.2.9   Event
Table 5.2.10  Page
TABLE OF CONTENTS

Acknowledgement
Abstract
List of Figures
List of Tables
Table of Contents

1. Introduction
   1.1 Project Summary
       1.1.1 Project Description
   1.2 Purpose
   1.3 Scope
   1.4 Feasibility Study
       1.4.1 Technical Feasibility
       1.4.2 Economic Feasibility
       1.4.3 Behavioral Feasibility
2. Project Management
   2.1 Project Planning
   2.2 Project Development Approach
   2.3 Schedule Representation
   2.4 Roles and Responsibilities
   2.5 Risk Management
   2.6 Project Schedule
   2.7 Effort Estimation
3. System Requirement Study
   3.1 User Requirements
   3.2 Hardware and Software Specification
       3.2.1 Hardware Specification
       3.2.2 Software Specification
4. Platform Details
   4.1 Introduction to .Net
5. Data Modeling
   5.1 Data Flow Diagram
       5.1.1 Context Diagram
   5.2 Data Dictionary
   5.3 E-R Diagram
   5.4 Use-Case Diagram
6. System Testing
   6.1 Test Plan
       6.1.1 Unit Testing
       6.1.2 Integration Testing
       6.1.3 Validation Testing
       6.1.4 Output Testing
       6.1.5 Validation Checking
   6.2 Security Features
7. Bibliography
1. Introduction
1.1 Project Summary

A Content Management System is a process and/or a software application that allows groups to efficiently plan, create, manage, store and distribute content. Content can be anything in the form of:

- Published documents (Web or print)
- Images
- Archived communications
- Presentation or streaming media

A CMS allows its users to personalize content and views. Three main types of users interact with the system:

- Administrator
- System Manager
- Users

1.1.1 Project Description
A content management system (CMS) is the collection of procedures used to manage work flow in a collaborative environment. These procedures can be manual or computer-based. The procedures are designed to do the following:
- Allow for a large number of people to contribute to and share stored data
- Control access to data, based on user roles (defining which information users or user groups can view, edit, publish, etc.)
- Aid in easy storage and retrieval of data
- Reduce repetitive duplicate input
- Improve the ease of report writing
- Improve communication between users
In a CMS, data can be defined as nearly anything: documents, movies, pictures, phone numbers, scientific data, and so forth.

Administrator
The Administrator handles system management. Only he can create or delete a system, and he can add or modify the contents (headlines) of any system. When a new system is created, a folder with the system's name is created on the server, containing the necessary files.

System Manager
Each system has its own manager, who handles the contents (headlines) of that system. He can create, delete, edit or archive the system's headlines, and can also specify the columns in which the system's headlines are displayed.

Users (Employees)
If users have rights to see the contents of a system, they can add comments on that system.
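The three roles above can be sketched as a simple permission model. This is only an illustrative sketch: the type and member names are invented for this example and are not taken from the project's actual code.

```csharp
using System;
using System.Collections.Generic;

// Hypothetical sketch of the role model described above:
// the Administrator manages systems, a System Manager manages
// headlines, and Users may only comment where they have rights.
enum Role { Administrator, SystemManager, User }

class CmsSystem
{
    public string Name { get; }
    public List<string> Headlines { get; } = new List<string>();
    public CmsSystem(string name) { Name = name; }
}

static class Permissions
{
    // Only the Administrator may create or delete a system.
    public static bool CanCreateOrDeleteSystem(Role r) =>
        r == Role.Administrator;

    // The System Manager (and the Administrator) may edit headlines.
    public static bool CanEditHeadlines(Role r) =>
        r == Role.Administrator || r == Role.SystemManager;

    // Any user who has rights on the system may add comments.
    public static bool CanComment(Role r) => true;
}
```

A real CMS would also persist these role assignments in the database, but the checks themselves reduce to predicates of this shape.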
1.2 Purpose
Content management can be defined as the process of creating, managing and publishing online content without the need for any programming or technical skills. Content management saves an organization a great deal of time and money if used appropriately, and is most easily achieved by deploying a content management system. A content management system can be custom built, or bought from a vendor if an existing product suits your needs. Hundreds of content management vendors offer systems at prices ranging from a few cents to millions of dollars, so it is advisable to have a well-defined content plan in place for evaluating and choosing a content management system; otherwise the whole procedure can prove disastrous.
1.3 Scope
The project will create the necessary environment for organizational units to deploy the selected CMS for managing their web content. The project will therefore encompass the following areas:

- Identifying the roles and responsibilities required for effective management and organization of a CMS
- Establishing the policies and guidelines within which the CMS will operate
- Identifying and purchasing a web CMS suitable for widespread use across the University
- Installing and setting up the CMS in accordance with the agreed policies
- Preparing for a roll-out of the CMS to Faculties and Departments according to a defined timescale and roadmap
1.4 Feasibility Study

The proposed system has been evaluated for the following feasibilities:

1.4.1 Technical Feasibility

The system must first be evaluated from the technical point of view. The assessment of this feasibility must be based on an outline design of the system requirements in terms of inputs, outputs, programs and procedures. Having identified an outline system, the investigation must go on to suggest the type of equipment and the method required for developing the system and for running it once it has been designed. Technical questions raised during the investigation include:

- Is the existing technology sufficient for the suggested system?
- Can the system be expanded after development?
The project should be developed such that the necessary functions and performance are achieved within the constraints, and it has been developed with current technology. Though that technology may become obsolete after some period of time, newer versions of the same software generally support older versions, so the system can still be used. There are therefore minimal constraints involved with this project. Since the system has been developed on the .NET platform, the project is technically feasible for development.

1.4.2 Economic Feasibility

The developed system must be justified by cost and benefit, to ensure that effort is concentrated on a project which will give the best return at the earliest. One of the factors affecting the development of a new system is the cost it would require. The following are some of the important financial questions asked during the preliminary investigation:

- The cost of conducting a full system investigation
- The cost of the hardware and software
- The benefits in the form of reduced costs or fewer costly errors

Since the system is developed as part of project work, there is no manual cost to spend for the proposed system. As all the required resources are already available, the system is economically possible to develop.
1.4.3 Behavioral Feasibility

This includes the following questions:

- Is there sufficient support from the users?
- Will the proposed system cause harm?

The project would be beneficial because it satisfies the objectives when developed and installed. All behavioral aspects have been considered carefully, and we conclude that the project is behaviorally feasible.
2. Project Management
2.1 Project Planning

Project Plan (Table 2.1.1):

Role:             Software Engineer
Team members:     2 (Mr. Kamlesh Korat, Mr. Chirag Korat)
Responsibilities: Design, Development, Testing and Documentation
Guide:            H.O.D. (Mr. Mahesh Panchal)

Software Tool Details (Table 2.3.1):

Tool                               Version   Licenses   Duration   Remarks
Microsoft Visual Studio 2008       Beta      1          3 Months   Available on Web
SQL Server 2005 Management Studio  2005      1          3 Months   Available on Web
MS Visio 2007                      Beta      1          3 Months   Available on Web
2.2 Project Development Approach

1) Analyzing the current system: going to the software store to gather practical knowledge, and observing and analyzing the system working in the store.
2) Gathering requirements according to the analysis.
3) Designing the objectives, plan, structure and model of the system.
4) Designing the forms in .NET.
5) Creating the database.
6) Establishing connectivity between the system forms and the database.
7) Testing to check that the application runs successfully.
8) Creating the documentation and report.

Process Model (Incremental Lifecycle Model)

The incremental model is an evolution of the waterfall model. The product is designed, implemented, integrated and tested as a series of incremental builds. It is a popular model of software evolution used by many commercial software companies. The incremental software development model is applicable to projects where the software requirements are well defined but realization may be delayed, and where the basic software functionality is required early.
Each increment passes through the following linear sequential activities:

- System/information engineering and modeling
- Software requirements analysis
- Design
- Code generation
- Testing

Analysis Phase: The requirements gathering process is intensified and focused specifically on software. To understand the nature of the programs to be built, the software engineer must understand the information domain for the software, as well as the required function, behavior, performance and interfaces. Requirements for both the system and the software are documented and reviewed.

Design Phase: Software design is actually a multi-step process that focuses on four distinct attributes of a program: data structure, software architecture, interface representation and procedural (algorithmic) detail. The design process translates requirements into a representation of the project that can be assessed for quality before coding begins. Like the requirements, the design is documented and becomes part of the project configuration.

Code Generation: The design must be translated into a machine-readable form. The code generation step performs this task. If the design is performed in a detailed manner, code generation can be accomplished mechanistically.
Testing: Once code has been generated, program testing begins. The testing process focuses on the logical internals of the software, ensuring that all statements have been tested, and on the functional externals; that is, conducting tests to uncover errors and to ensure that defined inputs produce actual results that agree with the required results.
2.4 Roles and Responsibilities

System roles (Table 2.4.1): Administrator, System Manager and Visitor. A visitor can log into the system, submit comments, change his password and view contents.

Task allocation (Table 2.4.2): the tasks of project definition study, requirement gathering, analysis, design of web pages, the various UML diagrams, database design, scripting, validation, testing and documentation were divided between Kamlesh and Chirag.
2.5 Risk Management

[Figure 2.5.1: Risk Management]

Risk Identification

Risk identification is a systematic attempt to specify threats to the project plan. There are two types of risks: generic and product-specific. One method for identifying risks is to create a risk item checklist. The checklist can be used for identification, focusing on some subset of the known and predictable risks in the following generic subcategories:

- Product size: risks associated with the overall size of the software to be built or modified.
- Business impact: risks associated with constraints imposed by management or the marketplace.
- Customer characteristics: risks associated with the sophistication of the customer and the developer's ability to communicate with the customer in a timely manner.
- Process definition: risks associated with the degree to which the software process has been defined and is followed by the development organization.
- Development environment: risks associated with the availability and quality of the tools to be used to build the product.
- Technology to be built: risks associated with the complexity of the system to be built and the newness of the technology packaged by the system.
- Staff size and experience: risks associated with the overall technical and project experience of the software engineers who will do the work.

Risks can also be categorized by their impact:

- Performance risk: the degree of uncertainty that the product will meet its requirements and be fit for its intended use.
- Cost risk: the degree of uncertainty that the project budget will be maintained.
- Support risk: the degree of uncertainty that the resultant software will be easy to correct, adapt and enhance.
- Schedule risk: the degree of uncertainty that the project schedule will be maintained and that the product will be delivered on time.
Risk Planning (Table 2.5.1):

Risk: Organizational financial problems
Strategy: Prepare a briefing document for senior management showing how the project is making a very important contribution to the goals of the business.

Risk: Recruitment problems
Strategy: Alert the customer to potential difficulties and the possibility of delays; investigate buying-in components.

Risk: Staff illness
Strategy: Reorganize the team so that there is more overlap of work and people therefore understand each other's jobs.

Risk: Defective components
Strategy: Replace potentially defective components with bought-in components of known reliability.

Risk: Requirements changes
Strategy: Derive traceability information to assess the impact of requirements changes.

Risk: Organizational restructuring
Strategy: Prepare a briefing document for senior management showing how the project is making a very important contribution to the goals of the business.

Risk: Database performance
Strategy: Investigate the possibility of buying a higher-performance database.
Risk Monitoring

Risk monitoring involves regularly assessing each of the identified risks to decide whether or not the risk is becoming more or less probable, and whether the effects of the risk have changed. The table below gives some examples of factors that may be helpful in assessing these types of risk.

Risk Monitoring (Table 2.5.2):

Risk type: Technology
Potential indicators: Late delivery of hardware or support software; many reported technology problems.

Risk type: People
Potential indicators: Poor staff morale; poor relationships amongst team members; job availability.

Risk type: Organizational
Potential indicators: Organizational gossip; lack of action by senior management.

Risk type: Tools
Potential indicators: Reluctance by team members to use tools; complaints about CASE tools; demands for higher-powered workstations.

Risk type: Requirements
Potential indicators: Many requirements change requests; customer complaints.

Risk type: Estimation
Potential indicators: Failure to meet the agreed schedule; failure to clear reported defects.
2.6 Project Schedule

A tentative schedule is shown in the table below:

Item                                               Due Date
Software Project Management Plan (this document)   December 5, 2011
Software Requirements Specification                December 5, 2011
Software Design Documents                          December 15, 2011
First Application Representation                   January 20, 2012
Updated SPMP                                       January 20, 2012
Database                                           February 8, 2012
Software Testing Description                       March 28, 2012
Software Integration                               April 16, 2012
Final Project Presentation                         May 5, 2012
2.7 Effort Estimation

Software project estimation in today's era determines a company's profit and loss; a small error in estimation can have a very large impact on the project and on the company itself. It is also true that estimation can never be exact, particularly in software, where we have many variables: human, technical, environmental, political and so on. Cost and effort estimation can be done using any of the following four techniques, with justifications given for why a strategy was or was not selected:

- Delay estimation until late in the project.
- Base estimates on similar projects that have already been completed.
- Use relatively simple decomposition techniques to generate project cost and effort estimates.
- Use one or more empirical models for software cost and effort estimation.
3. System Requirement Study

3.2 Hardware and Software Specifications

3.2.1 Hardware Specifications (Table 3.2.1):

No   Hardware             Requirement
1    Processor            Intel Pentium 4
2    RAM                  256 MB
3    Hard Disk Capacity   20 GB
4    CPU Speed            700 MHz

3.2.2 Software Specifications (Table 3.2.2):

No   Software                   Requirement
1    Operating System           Windows XP, Vista, Windows 7
2    Project Development Tool   Microsoft Visual Studio 2008
3    Documentation Tool
4. Platform Details

Platform Details (Table 4.1):

Front End               ASP.NET, HTML
Back End                Microsoft SQL Server 2005
Web Server              IIS 7.0
Technology              .NET Framework 3.5
Programming Language    C#
Designing               CSS

4.1 Introduction to .NET

Visual Studio .NET is a complete set of development tools for building ASP Web applications, XML Web services, desktop applications, and mobile applications. Visual Basic .NET, Visual C++ .NET, and Visual C# .NET all use the same integrated development environment (IDE), which allows them to share tools and facilitates the creation of mixed-language solutions. In addition, these languages leverage the functionality of the .NET Framework, which provides access to key technologies that simplify the development of ASP Web applications and XML Web services.
[Figure 4.1.1: .Net Framework]

The .NET Framework is an integral Windows component that supports building and running the next generation of applications and XML Web services. The .NET Framework is designed to fulfill the following objectives:
To provide a consistent object-oriented programming environment whether object code is stored and executed locally, executed locally but Internet-distributed, or executed remotely.
To provide a code-execution environment that minimizes software deployment and versioning conflicts.
To provide a code-execution environment that promotes safe execution of code, including code created by an unknown or semi-trusted third party.
To provide a code-execution environment that eliminates the performance problems of scripted or interpreted environments.
To make the developer experience consistent across widely varying types of applications, such as Windows-based applications and Web-based applications.
To build all communication on industry standards to ensure that code based on the .NET Framework can integrate with any other code. The .NET Framework has two main components: the common language
runtime and the .NET Framework class library. The common language runtime is the foundation of the .NET Framework. You can think of the runtime as an agent that manages code at execution time, providing core services such as memory management, thread management, and remoting, while also enforcing strict type safety and other forms of code accuracy that promote security and robustness. In fact, the concept of code management is a fundamental principle of the runtime. Code that targets the runtime is known as managed code, while code that does not target the runtime is known as unmanaged code. The class library, the other main component of the .NET Framework, is a comprehensive, object-oriented collection of reusable types that you can use to develop applications ranging from traditional command-line or graphical user interface (GUI) applications to applications based on the latest innovations provided by ASP.NET, such as Web Forms and XML Web services. The .NET Framework can be hosted by unmanaged components that load the common language runtime into their processes and initiate the execution of managed code, thereby creating a software environment that can exploit both managed and unmanaged features. The .NET Framework not only provides several runtime hosts, but also supports the development of third-party runtime hosts.
For example, ASP.NET hosts the runtime to provide a scalable, server-side environment for managed code. ASP.NET works directly with the runtime to enable ASP.NET applications and XML Web services, both of which are discussed later in this topic. Internet Explorer is an example of an unmanaged application that hosts the runtime (in the form of a MIME type extension). Using Internet Explorer to host the runtime enables you to embed managed components or Windows Forms controls in HTML documents.

.NET Framework in Context

[Figure 4.1.2: .Net Framework Context]
The following sections describe the main components and features of the .NET Framework in greater detail.

Features of the Common Language Runtime (CLR)

The common language runtime manages memory, thread execution, code execution, code safety verification, compilation, and other system services. These features are intrinsic to the managed code that runs on the common language runtime. With regard to security, managed components are awarded varying degrees of trust, depending on a number of factors that include their origin (such as the Internet, enterprise network, or local computer). This means that a managed component might or might not be able to perform file-access operations, registry-access operations, or other sensitive functions, even if it is being used in the same active application. The runtime enforces code access security. For example, users can trust that an executable embedded in a Web page can play an animation on screen or sing a song, but cannot access their personal data, file system, or network. The security features of the runtime thus enable legitimate Internet-deployed software to be exceptionally feature rich. The runtime also enforces code robustness by implementing a strict type-and-code-verification infrastructure called the common type system (CTS). The CTS ensures that all managed code is self-describing. The various Microsoft and third-party language compilers generate managed code that conforms to the CTS. This means that managed code can consume other managed types and instances, while strictly enforcing type fidelity and type safety.
In addition, the managed environment of the runtime eliminates many common software issues. For example, the runtime automatically handles object layout and manages references to objects, releasing them when they are no longer being used. This automatic memory management resolves the two most common application errors, memory leaks and invalid memory references. The runtime also accelerates developer productivity. For example, programmers can write applications in their development language of choice, yet take full advantage of the runtime, the class library, and components written in other languages by other developers. Any compiler vendor who chooses to target the runtime can do so. Language compilers that target the .NET Framework make the features of the .NET Framework available to existing code written in that language, greatly easing the migration process for existing applications. While the runtime is designed for the software of the future, it also supports software of today and yesterday. Interoperability between managed and unmanaged code enables developers to continue to use necessary COM components and DLLs. The runtime is designed to enhance performance. Although the common language runtime provides many standard runtime services, managed code is never interpreted. A feature called just-in-time (JIT) compiling enables all managed code to run in the native machine language of the system on which it is executing. Meanwhile, the memory manager removes the possibilities of fragmented memory and increases memory locality-of-reference to further increase performance. Finally, the runtime can be hosted by high-performance, server-side applications, such as Microsoft SQL Server and Internet Information Services (IIS). This infrastructure enables you to use managed code to write your business logic, while still enjoying the superior performance of the industry's best enterprise servers that support runtime hosting.
[Figure 4.1.3: Language Compilation in .Net]
(Source code, such as C# compiled by csc.exe, passes through the appropriate language compiler and is then executed.)
.NET Framework Class Library The .NET Framework class library is a collection of reusable types that tightly integrate with the common language runtime. The class library is object oriented, providing types from which your own managed code can derive functionality. This not only makes the .NET Framework types easy to use, but also reduces the time associated with learning new features of the .NET Framework. In addition, third-party components can integrate seamlessly with classes in the .NET Framework. For example, the .NET Framework collection classes implement a set of interfaces that you can use to develop your own collection classes. Your collection classes will blend seamlessly with the classes in the .NET Framework. As you would expect from an object-oriented class library, the .NET Framework types enable you to accomplish a range of common programming tasks, including tasks such as string management, data collection, database connectivity, and file access. In addition to these common tasks, the class library includes types that support a variety of specialized development scenarios. For example, you can use the .NET Framework to develop the following types of applications and services:
Console applications. Windows GUI applications (Windows Forms). ASP.NET applications. XML Web services. Windows services.
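The earlier point about framework collection interfaces can be illustrated with a short sketch: a custom type that implements IEnumerable&lt;T&gt; works with foreach and the rest of the framework like any built-in collection. The type name here is invented for this example.

```csharp
using System.Collections;
using System.Collections.Generic;

// Illustrative custom collection: by implementing IEnumerable<string>,
// HeadlineList blends with the framework's iteration constructs
// (foreach, LINQ) exactly as the class library's own collections do.
class HeadlineList : IEnumerable<string>
{
    private readonly List<string> items = new List<string>();

    public void Add(string headline) => items.Add(headline);

    // Generic enumerator used by foreach over IEnumerable<string>.
    public IEnumerator<string> GetEnumerator() => items.GetEnumerator();

    // Non-generic enumerator required by the older IEnumerable interface.
    IEnumerator IEnumerable.GetEnumerator() => GetEnumerator();
}
```

Because the type satisfies the same interfaces as the framework's collections, code written against IEnumerable&lt;string&gt; accepts it without modification.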
For example, the Windows Forms classes are a comprehensive set of reusable types that vastly simplify Windows GUI development. If you write an ASP.NET Web Form application, you can use the Web Forms classes.

Accessing Data with ADO.NET

ADO.NET provides consistent access to data sources such as Microsoft SQL Server, as well as data sources exposed through OLE DB and XML. Data-sharing consumer applications can use ADO.NET to connect to these data sources and retrieve, manipulate, and update the data. ADO.NET cleanly factors data access from data manipulation into discrete components that can be used separately or in tandem. ADO.NET includes .NET Framework data providers for connecting to a database, executing commands, and retrieving results. Those results are either processed directly, or placed in an ADO.NET DataSet object in order to be exposed to the user in an ad hoc manner, combined with data from multiple sources, or remoted between tiers. The ADO.NET DataSet object can also be used independently of a .NET Framework data provider to manage data local to the application or sourced from XML. The ADO.NET classes are found in System.Data.dll, and are integrated with the XML classes found in System.Xml.dll. When compiling code that uses the System.Data namespace, reference both System.Data.dll and System.Xml.dll. ADO.NET provides functionality to developers writing managed code similar to the functionality provided to native COM developers by ADO.
ADO.NET Components

The ADO.NET components have been designed to factor data access from data manipulation. There are two central components of ADO.NET that accomplish this: the DataSet, and the .NET Framework data provider, which is a set of components including the Connection, Command, DataReader, and DataAdapter objects. The ADO.NET DataSet is the core component of the disconnected architecture of ADO.NET. The DataSet is explicitly designed for data access independent of any data source. As a result, it can be used with multiple and differing data sources, used with XML data, or used to manage data local to the application. The DataSet contains a collection of one or more DataTable objects made up of rows and columns of data, as well as primary key, foreign key, constraint, and relation information about the data in the DataTable objects. The other core element of the ADO.NET architecture is the .NET Framework data provider, whose components are explicitly designed for data manipulation and fast, forward-only, read-only access to data. The Connection object provides connectivity to a data source. The Command object enables access to database commands to return data, modify data, run stored procedures, and send or retrieve parameter information. The DataReader provides a high-performance stream of data from the data source. Finally, the DataAdapter provides the bridge between the DataSet object and the data source. The DataAdapter uses Command objects to execute SQL commands at the data source to both load the DataSet with data and reconcile changes made to the data in the DataSet back to the data source.
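The provider objects just described can be put together in a minimal sketch using the SQL Server data provider. The connection string, database name, and table here are placeholders for illustration, not the project's actual schema.

```csharp
using System;
using System.Data.SqlClient;

class ReaderDemo
{
    static void Main()
    {
        // Placeholder connection string; substitute real server details.
        var connStr = "Server=.;Database=DiscoverIndia;Integrated Security=true";

        // Connection object: provides connectivity to the data source.
        using (var conn = new SqlConnection(connStr))
        {
            conn.Open();

            // Command object: issues a SQL command against the source.
            using (var cmd = new SqlCommand("SELECT Title FROM Article", conn))
            // DataReader: streams results forward-only, read-only.
            using (var reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                    Console.WriteLine(reader.GetString(0));
            }
        } // using blocks dispose the connection and reader deterministically
    }
}
```

For the disconnected scenario the report emphasizes, a DataAdapter would instead fill a DataSet with the same query and later reconcile edits back to the database.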
You can write .NET Framework data providers for any data source. The .NET Framework ships with two .NET Framework data providers: the .NET Framework Data Provider for SQL Server and the .NET Framework Data Provider for OLE DB. The following diagram illustrates the components of ADO.NET architecture.ADO.NET architecture
[Figure 4.1.4: ADO.NET Architecture]

Why we use C#.NET: It supports client/server architecture. C#.NET also provides database objects like ADO.NET, which are very useful for building client/server applications, and it offers facilities such as a disconnected database structure through classes like DataAdapter and DataConnection. C#.NET is an object-oriented language providing inheritance, constructors, destructors, multithreading, and so on. C#.NET provides many data types, which give flexibility in programming. It also provides Crystal Reports support for building reports, which is another advantage of C#.NET; in our application, reports with graphical representation are an important part. The most important feature for us is the disconnected database structure, which is very useful in our application and also gives speed and accuracy to the client/server model.

Features of C#.NET:

Inheritance: C#.NET supports inheritance by allowing you to define classes that serve as the basis for derived classes. Derived classes inherit, and can extend, the properties and methods of the base class. They can also override inherited methods with new implementations. All classes created with C#.NET are inheritable by default. Because the forms you design are really classes, you can use inheritance to define new forms based on existing ones.

Exception Handling: C#.NET supports structured exception handling, using an enhanced version of the try...catch...finally syntax supported by other languages such as C++. Structured exception handling combines a modern control structure with exceptions, protected blocks of code, and filters, making it easy to create and maintain programs with robust, comprehensive error handlers.

Overloading: Overloading is the ability to define properties, methods, or procedures that have the same name but use different data types. Overloaded procedures allow you to provide as many implementations as necessary to handle different kinds of data, while giving the appearance of a single, versatile procedure.
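These three features can be shown in a minimal C# sketch; all class and method names here are invented for illustration only.

```csharp
using System;

class Logger
{
    // Overloading: two methods share a name but take different parameter types.
    public string Format(int value)    => "int:" + value;
    public string Format(string value) => "str:" + value;
}

class BaseForm
{
    public virtual string Title() => "Base";
}

class DerivedForm : BaseForm
{
    // Inheritance plus overriding: a new implementation that still calls the base one.
    public override string Title() => base.Title() + "/Derived";
}

class Demo
{
    static void Main()
    {
        var log = new Logger();
        Console.WriteLine(log.Format(7));             // int:7
        Console.WriteLine(log.Format("cms"));         // str:cms
        Console.WriteLine(new DerivedForm().Title()); // Base/Derived

        // Structured exception handling: try / catch / finally.
        try
        {
            int zero = 0;
            Console.WriteLine(10 / zero);             // throws at run time
        }
        catch (DivideByZeroException ex)
        {
            Console.WriteLine("caught: " + ex.GetType().Name);
        }
        finally
        {
            Console.WriteLine("finally always runs");
        }
    }
}
```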
Overriding Properties and Methods: The override keyword allows derived classes to override characteristics inherited from parent classes. Overridden members have the same arguments as the members inherited from the base class, but different implementations. A member's new implementation can call the original implementation in the parent class by preceding the member name with base.

Constructors and Destructors: Constructors are procedures that control the initialization of new instances of a class. Conversely, destructors are methods that free system resources when a class leaves scope or is set to nothing. C#.NET supports constructors and destructors through class constructor methods and finalizers.

Data Types: C#.NET provides a rich set of data types. The char data type, for example, is an unsigned 16-bit quantity used to store Unicode characters; it is equivalent to the .NET Framework System.Char type.

Interfaces: Interfaces describe the properties and methods of classes but, unlike classes, do not provide implementations. An interface declaration lists the members, while an implementing class provides the code that puts the items described in the interface into practice.

Shared (Static) Members: Shared members are properties, methods, and fields that are shared by all instances of a class. Shared data members are useful when multiple objects need to use information that is common to all, and shared class methods can be used without first creating an object from the class.

References: References allow you to use objects defined in other assemblies. In C#.NET, references point to assemblies instead of type libraries.
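The ideas above (interfaces, constructors, and shared members) can be illustrated with a short, self-contained C# sketch; the names are invented, not taken from the project.

```csharp
using System;

interface IContent
{
    string Render();   // interfaces declare members but give no implementation
}

class Article : IContent
{
    // A shared (static) member: one counter common to every Article instance.
    public static int Count;

    private readonly string _title;

    // Constructor: controls initialization of each new instance.
    public Article(string title)
    {
        _title = title;
        Count++;
    }

    // The implementing class supplies the code the interface only described.
    public string Render() => "<h1>" + _title + "</h1>";
}

class Demo
{
    static void Main()
    {
        IContent a = new Article("Home");
        IContent b = new Article("About");
        Console.WriteLine(a.Render());       // <h1>Home</h1>
        Console.WriteLine(b.Render());       // <h1>About</h1>
        Console.WriteLine(Article.Count);    // 2, read without any instance
    }
}
```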
Namespaces: Namespaces prevent naming conflicts by organizing classes, interfaces, and methods into hierarchies.

Assemblies: Assemblies replace and extend the capabilities of type libraries by describing all the required files for a particular component or application. An assembly can contain one or more namespaces.

Attributes: Attributes enable you to provide additional information about program elements. For example, you can use an attribute to specify which methods in a class should be exposed when the class is used as an XML Web service.

Multithreading: C#.NET allows you to write applications that can perform multiple tasks independently. A task that has the potential of holding up other tasks can execute on a separate thread, a process known as multithreading. By causing complicated tasks to run on threads that are separate from your user interface, multithreading makes your applications more responsive to user input.

Bit Shift Operators: C#.NET supports arithmetic left and right shift operations on integral data types. Arithmetic shifts are not circular, which means the bits shifted off one end of the result are not reintroduced at the other end. The corresponding assignment operators are provided as well.
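For instance, the shift operators behave as follows (a small illustrative sketch):

```csharp
using System;

class ShiftDemo
{
    public static int Left(int x, int n)  => x << n;  // arithmetic left shift
    public static int Right(int x, int n) => x >> n;  // arithmetic right shift on int keeps the sign

    static void Main()
    {
        Console.WriteLine(Left(1, 3));    // 8
        Console.WriteLine(Right(-8, 1));  // -4: bits shifted off are not reintroduced circularly
        int x = 1;
        x <<= 2;                          // the corresponding assignment operator
        Console.WriteLine(x);             // 4
    }
}
```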
English Query and the Microsoft Search Service can be used to incorporate user-friendly queries and powerful search capabilities in Web applications.

Scalability and Availability: The same database engine can be used across platforms ranging from laptop computers running Microsoft Windows 98 through large multiprocessor servers running Microsoft Windows 2000 Datacenter Server. Higher editions of SQL Server support features such as federated servers, indexed views, and large memory support that allow scaling to the performance levels required by the largest Web sites.

Enterprise-Level Database Features: The SQL Server 2005 Express Edition relational database engine supports the features required by demanding data-processing environments. The database engine protects data integrity while minimizing the overhead of managing thousands of users concurrently modifying the database. SQL Server 2005 Express Edition includes a set of administrative and development tools that improve the process of installing, deploying, managing, and using SQL Server across several sites. SQL Server also supports a standards-based programming model integrated with Windows DNA, simplifying the use of SQL Server databases.
Data Warehousing: SQL Server 2005 Express Edition includes tools for extracting and analyzing summary data for online analytical processing. SQL Server also includes tools for visually designing databases and for analyzing data using English-based questions.

Database Architecture: Microsoft SQL Server 2005 Express Edition data is stored in databases. The data in a database is organized into the logical components visible to users, while the database is physically implemented as two or more files on disk. By using a database, it is possible to work primarily with the logical components, such as tables, views, procedures, and users; the physical implementation of files is largely transparent, and typically only the database administrator needs to work with it. Each instance of SQL Server has four system databases (master, model, tempdb, and msdb) and one or more user databases. Some organizations have only one user database containing all the data for their organization; others have different databases for each group in the organization, and sometimes a database is used by a single application. It is not necessary to run multiple copies of the SQL Server database engine to allow multiple users to access the databases on a server: an instance of the SQL Server Standard or Enterprise Edition is capable of handling thousands of users working in multiple databases at the same time.
When connecting to an instance of SQL Server, the connection is associated with a particular database on the server, called the current database. The user is usually connected to the database defined as the default database by the system administrator, although connection options in the database APIs can be used to specify another database. SQL Server 2005 Express Edition allows detaching databases from an instance of SQL Server and then reattaching them to another instance, or even attaching the database back to the same instance. Given an existing SQL Server database file, it is possible to attach that database file under a specific database name.

Relational Database Components: The database component of Microsoft SQL Server 2005 Express Edition is a Structured Query Language (SQL)-based, scalable, relational database with integrated Extensible Markup Language (XML) support for Internet applications. Each of the following terms describes a fundamental part of the architecture of the SQL Server 2005 Express Edition database component.

Database: A database is similar to a data file in that it is a storage place for data. Like a data file, a database does not present information directly to a user; the user runs an application that accesses data from the database and presents it in an understandable format. Database systems are more powerful than data files, however, in that the data is more highly organized.
In a well-designed database there are no duplicate pieces of data that the user or application must update at the same time. Related pieces of data are grouped together in a single structure or record, and relationships can be defined between these structures and records.

Relational Database: Although there are different ways to organize data in a database, relational databases are among the most effective. Relational database systems are an application of mathematical set theory to the problem of effectively organizing data. In a relational database, data is collected into tables (called relations in relational theory).

Scalable: SQL Server 2005 Express Edition supports a wide range of users accessing it at the same time. An instance of SQL Server 2005 Express Edition includes the files that make up a set of databases and a copy of the DBMS software. Applications running on separate computers use a SQL Server 2005 Express Edition communications component to transmit commands over a network to the instance. When an application connects to an instance, it can reference any of the databases in that instance that the user is authorized to access. The communications component also allows communication between an instance and an application running on the same computer.
Structured Query Language: To work with data in a database, users employ a set of commands and statements (a language) defined by the DBMS software. Several different languages can be used with relational databases; the most common is SQL. The American National Standards Institute (ANSI) and the International Organization for Standardization (ISO) define standards for SQL.

Extensible Markup Language: XML is the emerging Internet standard for data. XML is a set of tags that can be used to define the structure of a document. XML documents can easily be transformed into Hypertext Markup Language (HTML), the most important language for displaying Web pages.

Database Design Considerations: Designing a database requires an understanding of both the business functions you want to model and the database concepts and features used to represent those business functions. It is important to design the database to model the business accurately, because it can be time consuming to change the design of a database significantly once implemented. A well-designed database also performs better.
As noted under Database Architecture above, some organizations use a single user database while others create several, such as one database for sales, one for payroll, and one for a document management application; an application may use only one database or access several. Each instance of SQL Server makes all databases in the instance available to all users that connect to it, subject to the defined security permissions. After connecting, you can switch from one database to another using either the Transact-SQL USE database_name statement or an API function that changes your current database context.
5. Data Modeling
5.1 Data Flow Diagram
A data-flow diagram (DFD) is a graphical representation of the "flow" of data through an information system. DFDs can also be used for the visualization of data processing (structured design). On a DFD, data items flow from an external data source or an internal data store to an internal data store or an external data sink, via an internal process.

A DFD provides no information about the timing of processes, or about whether processes will operate in sequence or in parallel. It is therefore quite different from a flowchart, which shows the flow of control through an algorithm, allowing a reader to determine what operations will be performed, in what order, and under what circumstances, but not what kinds of data will be input to and output from the system, where the data will come from and go to, or where the data will be stored.

It is common practice to draw a context-level data-flow diagram first, which shows the interaction between the system and the external agents that act as data sources and data sinks. On the context diagram (also known as the Level 0 DFD), the system's interactions with the outside world are modeled purely in terms of data flows across the system boundary. The context diagram shows the entire system as a single process and gives no clues as to its internal organization.

This context-level DFD is next "exploded" to produce a Level 1 DFD that shows some of the detail of the system being modeled. The Level 1 DFD shows how the system is divided into sub-systems (processes), each of which deals with one or more of the data flows to or from an external agent, and which together provide all of the functionality of the system as a whole. It also identifies the internal data stores that must be present in order for the system to do its job, and shows the flow of data between the various parts of the system.

Data-flow diagrams were invented by Larry Constantine, the original developer of structured design, based on Martin and Estrin's "data-flow graph" model of computation. DFDs are one of the three essential perspectives of the structured systems analysis and design method (SSADM). The sponsor of a project and the end users need to be briefed and consulted throughout all stages of a system's evolution. With a data-flow diagram, users are able to visualize how the system will operate, what the system will accomplish, and how it will be implemented. The old system's data-flow diagrams can be drawn up and compared with the new system's to help implement a more efficient system. Data-flow diagrams can also give the end user a physical idea of where the data they input ultimately has an effect upon the structure of the whole system, from order to dispatch to report. How a system is developed can be traced through its data-flow diagrams.
[Level 1 data flow diagram: external entities USERS, ADMINISTRATOR, and VISITOR interact with processes 1.0 Registration, 2.0 Login, 3.0 Submit Article, 4.0 Submit and Update Article, 5.0 Banners Manager, and 6.0 Menu Manager, backed by the USERS, ARTICLE, BANNERS, and MENU data stores]

[Level 2 data flow diagram for Login: 2.1 Check Username and Password; 2.2 Username and Password Invalid (if U & P invalid); 2.3 Login Success (if U & P valid)]

[Level 2 data flow diagram for Banners Manager: 5.1 View Banners Details; 5.2 Update Banners; 5.3 Create New Banners; all reading from and saving to the BANNERS data store]
The data dictionary contains two types of descriptions, as follows. Data elements: the data element is the most fundamental data level and is the building block for everything else in the system. Data structures: a data structure is a set of related items that together describe a component of the system.
5.2 Data Dictionary

TABLE :: Registration

  Type and Size   Constraint   Description
  int(5)          Not null     User identification
  varchar(50)     Not null     User name
  varchar(50)     Not null     User password
  varchar(100)    Null         Address
  text            Not null     Email identity
  varchar(50)     Null         Mobile number of the user
  varchar(50)     Not null     Type of the user (admin or user)

[Table 5.2.1]

TABLE :: Banner

  Field Name    Type and Size   Constraint   Description
  banner_id     int(5)          Not null     Banner identification
  banner_name   varchar(50)     Not null     Banner name
  banner_img    varchar(50)     Not null     Banner image
  banner_pos    varchar(6)      Null         Banner position

[Table 5.2.2] Primary key :: banner_id

TABLE :: Links

  Field Name    Type and Size   Constraint   Description
  link_id       numeric(11,0)   Not null     Identity of the link
  link_title    varchar(20)     Not null     Title of the link
  article_id    numeric(11,0)   Not null     Identity of the article
  linkorder     numeric(11,0)   Not null     Order of the link for display
  accessrights  varchar(25)     Not null     For privacy

[Table 5.2.3] Primary key :: link_id; Foreign key :: article_id

TABLE :: Menus

  Field Name    Type and Size   Constraint   Description
  menu_id       numeric(11,0)   Not null     Menu identification
  menu_title    varchar(25)     Not null     Menu name
  PageID        numeric(11,0)   Not null     Identity of the page
  menuorder     numeric(11,0)   Not null     Order of the menu

[Table 5.2.4] Primary key :: menu_id; Foreign key :: PageID

TABLE :: Imagegallary

  Field Name    Type and Size   Constraint   Description
  Image_id      int(5)          Not null     Image identification
  Image_name    varchar(20)     Not null
  Image_url     varchar(100)    Not null

[Table 5.2.5] Primary key :: Image_id

TABLE :: Comment

  Field Name       Type and Size   Constraint   Description
  Comment_id       numeric(11,0)   Not null     Comment identification
  comment_content  text            Not null     Content of the comment
  user_id          numeric         Not null     Identity of the user
  article_id       int             Not null     Identity of the article
  comment_date     datetime        Not null     Date and time of the comment

[Table 5.2.6] Primary key :: Comment_id; Foreign keys :: user_id, article_id

TABLE :: Article

  Field Name        Description
  article_id        Article identification
  article_title     Title given to the article
  article_imageurl  Image inside the article
                    Content of the article
                    Author identity
                    Access rights
                    Order of display (first, second, ...)
  Blockornot (int)  Block the article or not

[Table 5.2.7] Primary key :: article_id; Foreign key :: author_id

TABLE :: Author

  Field Name        Type and Size   Constraint
  author_id
  author_name
  author_email      varchar(100)    Not null
  author_contactno  varchar(50)     Not null

[Table 5.2.8] Primary key :: author_id

TABLE :: Event

  Field Name   Description
  event_id     Event that can be added by the admin
               Name of the event
               Date for the event
               Detail related to the event
               User identity

[Table 5.2.9]

TABLE :: Page

  Field Name    Type and Size   Constraint   Description
  pageID        numeric(11,0)   Not null     Page identity
  Title         varchar(25)     Not null     Title of the page, given by the admin
  Pagecontent   text            Not null     Content added to the page by the admin
  AccessRights  varchar(25)     Not null     Access rights set by the admin

[Table 5.2.10] Primary key :: pageID
entity. The collection of all songs in a database is an entity set. The "eaten" relationship between a child and her lunch is a single relationship, and the set of all such child-lunch relationships in a database is a relationship set. In other words, a relationship set corresponds to a relation in mathematics, while a relationship corresponds to a member of the relation.

ER Diagram (1)
[ER Diagram 1: USERS (U_ID, U_NAME, U_LOGINNAME, U_PASSWORD, U_EMAIL) can select (1 to *) MENU_CATEGORY (NC_ID, NC_NAME, NC_ICON) and can view and submit (*) ARTICLE (A_ID, A_TITLE, A_BODY, A_AUTHER)]

ER Diagram (2)

[ER Diagram 2: USERS (U_ID, U_NAME, U_LOGINNAME, U_PASSWORD, U_EMAIL) can view (1) BANNERS (B_ID, B_IMAGE, B_CLICKURL) and can view (1) ARTICLE (A_ID, A_NAME, A_PATH)]
A use case represents a task. A task is simply some piece of goal-directed work performed by a user or an organization (a group of users). A task performed by an organization is also sometimes referred to as a process. Some modelers, this author included, prefer to make a distinction between: Use case: a user task
[Use case diagram: ADMIN and associated use cases, connected by uses relationships]

[Use case diagram: use cases including Add Comment, connected by uses relationships]

[Use case diagram: ADMIN use cases Manage User, Add Image, Manage Image, and Preview Site, connected by uses relationships]
6. System Testing
Software testing is the process of executing software in a controlled manner, in order to answer the question: does the software behave as specified? Software testing is often used in association with the terms verification and validation. Verification is the checking or testing of items, including software, for conformance and consistency with an associated specification. Software testing is just one kind of verification, which also uses techniques such as reviews, analysis, inspections, and walkthroughs. Validation is the process of checking that what has been specified is what the user actually wanted.
Validation : Are we doing the right job? Verification : Are we doing the job right?
Software testing should not be confused with debugging. Debugging is the process of analyzing and localizing bugs when software does not behave as expected. Although the identification of some bugs will be obvious from playing with the software, a methodical approach to software testing is a much more thorough means for identifying bugs. Debugging is therefore an activity which supports testing, but cannot replace testing. Other activities which are often associated with software testing are static analysis and dynamic analysis. Static analysis investigates the source code of software, looking for problems and gathering metrics without actually executing the code. Dynamic analysis looks at the behavior of software while it is executing, to provide information such as execution traces, timing profiles, and test coverage information.
Testing is a set of activities that can be planned in advance and conducted systematically. Testing begins at the module level and works towards the integration of the entire computer-based system. Nothing is complete without testing, as it is vital to the success of the system. In stating testing objectives, there are several rules that can serve: testing is a process of executing a program with the intent of finding an error; a good test case is one that has a high probability of finding an undiscovered error; a successful test is one that uncovers an undiscovered error.
If testing is conducted successfully according to the objectives stated above, it will uncover errors in the software; testing also demonstrates that the software functions appear to work according to specification and that performance requirements appear to have been met. There are three ways to test a program:

For correctness
For implementation efficiency
For computational complexity

Tests for correctness are supposed to verify that a program does exactly what it was designed to do. This is much more difficult than it may at first appear, especially for large programs.
The module's local data structure is examined to ensure that data stored temporarily maintains its integrity during all steps in an algorithm's execution. Boundary conditions are tested, all independent paths are exercised to ensure that all statements in a module have been executed at least once, and, finally, all error-handling paths are tested. Tests of data flow across a module interface are required before any other test is initiated: if data do not enter and exit properly, all other tests are moot. Selective testing of execution paths is an essential task during unit testing. Good design dictates that error conditions be anticipated and error-handling paths set up to reroute or cleanly terminate processing when an error does occur. Boundary testing is the last task of the unit-testing step; software often fails at its boundaries. Unit testing was done in the system by treating each module as a separate entity and testing each one with a wide spectrum of test inputs. Some flaws in the internal logic of the modules were found and rectified.
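As a small illustration of boundary testing, consider a helper that computes how many pages are needed to display a list of items, together with assertion-style unit tests at its boundaries. The names here are invented for illustration and are not taken from the project.

```csharp
using System;

class Pager
{
    // Number of pages needed to show 'items' items at 'perPage' items per page.
    public static int PageCount(int items, int perPage)
    {
        if (perPage <= 0) throw new ArgumentOutOfRangeException(nameof(perPage));
        return (items + perPage - 1) / perPage;   // ceiling division
    }
}

class UnitTests
{
    static void Check(bool ok, string name)
    {
        if (!ok) throw new Exception("failed: " + name);
    }

    static void Main()
    {
        // Boundary conditions: empty input, an exact fit, and one item past a page.
        Check(Pager.PageCount(0, 10) == 0, "empty");
        Check(Pager.PageCount(10, 10) == 1, "exact fit");
        Check(Pager.PageCount(11, 10) == 2, "just past boundary");
        Console.WriteLine("all unit tests passed");
    }
}
```

The "exact fit" and "just past boundary" cases are precisely where an off-by-one error in the division would surface.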
6.1.2 Integration Testing

Integration testing is a systematic technique for constructing the program structure while at the same time conducting tests to uncover errors associated with interfacing. The objective is to take unit-tested components and build the program structure that has been dictated by the design. If the entire program were tested as a whole, correction would be difficult, because isolation of causes is complicated by the vast expanse of the entire program; once errors are corrected, new ones appear and the process continues in a seemingly endless loop. After unit testing, all the modules of the system were integrated to test for any inconsistencies in the interfaces. Differences in program structures were removed and a unified program structure evolved.
6.1.3 Validation Testing

This is the final step in testing. Here the entire system was tested as a whole, with all forms, code, modules, and class modules. This form of testing is popularly known as black-box testing or system testing. Black-box testing focuses on the functional requirements of the software; that is, it enables the software engineer to derive sets of input conditions that will fully exercise all functional requirements for a program. Black-box testing attempts to find errors in the following categories: incorrect or missing functions, interface errors, errors in data structures or external data access, performance errors, and initialization and termination errors.
6.1.4 Output Testing

The system is tested for user acceptance; it should satisfy the firm's needs. The software should be developed in touch with the prospective system users, making changes whenever required. This is done with respect to points such as input screen designs, output screen designs, online messages to guide the user, and the like. The testing is done using various kinds of test data, and preparation of the test data plays a vital role in system testing: after preparing the test data, the system under study is tested using it, errors are uncovered and corrected through the testing steps above, and the corrections are noted for future use.
6.1.5 Validation Checking

At the culmination of integration testing, the software is completely assembled as a package, interfacing errors have been uncovered and corrected, and a final series of software validation checks may begin. Validation can be defined in many ways, but a simple (albeit harsh) definition is that validation succeeds when the software functions in a manner that can reasonably be expected by the customer. Software validation is achieved through a series of black-box tests: a test plan outlines the classes of tests to be conducted, and a test procedure defines the specific test cases that will be used in an attempt to uncover errors in conformity with the requirements. Both the plan and the procedure are designed to ensure that all functional requirements are satisfied, all performance requirements are achieved, documentation is correct and human-engineered, and other requirements are met. Once the application was made free of all logical and interface errors, validation checks were performed by inputting dummy data to ensure that the software satisfied all the requirements of the user; the data were created with the intent of determining whether the system would process them correctly. In the proposed system, if a client clicks the send button after selecting a file from his file list, the system shows a confirmation message for sending files; similarly, if a client attempts to download a file from the server file list, the system shows a confirmation message for downloading. This is how the data validations were made in the proposed system.
6.2 Security Features

Certain security considerations were kept in mind during the development of the project:

The user login details are kept secure, and full measures are taken so that the details are not leaked.

The database is password protected and highly secured, so the company's data cannot be leaked to an intruder who has no right to access the database.

There is no protection against power failures, as the entire system shuts down without any prior notice.

The database may crash at any time due to a virus or an operating-system failure on the server side; therefore, regular backups of the centrally collected data must be taken.
7.0 Bibliography

(1) Reference Websites:

I. www.wikipedia.org
II. www.dotnetnuke.com
III. www.w3schools.com
IV. www.msdn.microsoft.com
V. www.google.com