Systems
Submitted to: Mr. Alpesh Thakor
Submitted by: Muhammad Aamir Sajjad
Submission Date: 16-Dec-2011
Q1: a)
Database Management System:
The database management system (DBMS), also commonly known as the data manager, performs the main function of allowing users, alone or in a network, to create data in a database. The DBMS receives requests for data creation from the users in the network and then processes them. The users do not know the exact location or physical format of the data that is created; the DBMS deliberately keeps them in the dark, because otherwise they might access data that they should not be accessing. Since the DBMS receives requests from many different users, an efficient DBMS must perform two functions very well. The first is INTEGRITY: the data is always organised exactly as the original data manager instructed, and it is always available to be accessed. The second is SECURITY: only authorised users are given access to the data.
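As a rough illustration of these two duties, here is a minimal sketch in Python. It is not a real DBMS; the class name, schema and user names are all my own assumptions, chosen only to show integrity (rows must match the agreed organisation) and security (only permitted users get access) side by side.

```python
# Toy sketch of the two DBMS duties described above (illustrative only).

class SimpleDataManager:
    def __init__(self, schema, allowed_users):
        self.schema = schema                     # column -> type (INTEGRITY)
        self.allowed_users = set(allowed_users)  # permitted users (SECURITY)
        self.rows = []

    def insert(self, user, row):
        # SECURITY: only authorised users may touch the data.
        if user not in self.allowed_users:
            raise PermissionError(f"{user} is not authorised")
        # INTEGRITY: every row must keep the organisation the
        # original data manager asked for.
        if set(row) != set(self.schema):
            raise ValueError("row does not match the schema")
        for col, typ in self.schema.items():
            if not isinstance(row[col], typ):
                raise ValueError(f"{col} must be {typ.__name__}")
        self.rows.append(row)

db = SimpleDataManager({"name": str, "age": int}, allowed_users=["alice"])
db.insert("alice", {"name": "Bob", "age": 30})   # accepted
# db.insert("eve", {"name": "X", "age": 1})      # would raise PermissionError
```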
Let us consider an example of a database that records the people living in a city, known as residents, as shown in the diagram above. The relationship between the people and the city is that people live in a city. The attribute of the person entity is its name, and the attribute of the city entity is its population. There is only one relationship and one attribute per entity in the example above, owing to time and space constraints; in the real world a single ERD may contain many different attributes and many different relationships.
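The person-lives-in-city ERD above maps directly onto two SQL tables. A hedged sketch using the in-memory SQLite that ships with Python; the city, names and population figure are illustrative, not part of the example:

```python
import sqlite3

# The ERD above as tables: City(name, population), Person(name),
# and the "lives in" relationship as a foreign key on person.
conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")
conn.execute("""CREATE TABLE city (
    city_id INTEGER PRIMARY KEY,
    name TEXT,
    population INTEGER        -- attribute of the City entity
)""")
conn.execute("""CREATE TABLE person (
    person_id INTEGER PRIMARY KEY,
    name TEXT,                -- attribute of the Person entity
    city_id INTEGER REFERENCES city(city_id)   -- the "lives in" relationship
)""")
conn.execute("INSERT INTO city VALUES (1, 'Leicester', 330000)")
conn.execute("INSERT INTO person VALUES (1, 'Asha', 1)")
row = conn.execute("""SELECT person.name, city.name
                      FROM person JOIN city USING (city_id)""").fetchone()
print(row)  # ('Asha', 'Leicester')
```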
Q2:
[ERD for the case study: entities Student, Courses, Group Tutor, Module Leader and Assignments, linked by relationships such as Teach (optional), Has and Allocate; one path is labelled "3-year degree".]
When I go through the given case study I can identify many of the entities described in the scenario. There is also a range of different relationships between them, and hence there will be different attributes attached to those relationships. Some of the major entities that can be identified in the case study include module coordinator, module, course, teacher and student. The main relationships in the given case study are found between pairs of these entities: one pair is teacher and student, while another is course and module. There are many modules in a course. The class of a teacher consists of a large number of students. A teacher may teach several courses and also sets a large number of assignments, and most of these relationships work in both directions as far as the ERD is concerned.
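One way to realise the entities and many-to-many relationships named above is with junction tables. A hedged sketch using Python's built-in SQLite; the column names are my assumptions, since the case study does not specify them:

```python
import sqlite3

# Entities from the case study as tables; many-to-many relationships
# (teacher teaches several courses, course has many students) become
# junction tables. A module belongs to one course (many modules per course).
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE teacher (teacher_id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE student (student_id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE course  (course_id  INTEGER PRIMARY KEY, title TEXT);
CREATE TABLE module  (module_id  INTEGER PRIMARY KEY, title TEXT,
                      course_id INTEGER REFERENCES course(course_id));
-- junction tables for the many-to-many relationships
CREATE TABLE teaches (teacher_id INTEGER, course_id INTEGER);
CREATE TABLE enrols  (student_id INTEGER, course_id INTEGER);
""")
conn.execute("INSERT INTO course VALUES (1, 'Computing')")
conn.execute("INSERT INTO module VALUES (1, 'Databases', 1)")
conn.execute("INSERT INTO module VALUES (2, 'Programming', 1)")
n = conn.execute("SELECT COUNT(*) FROM module WHERE course_id = 1").fetchone()[0]
print(n)  # 2 modules in the one course
```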
Q3:
Normalization:
Normalization refers to the process of arranging the data in a database. There are two main purposes behind the normalization process. The first is to sort out redundant data, meaning data that no longer needs to be stored in the database; this mainly consists of data that appears twice in a table or is otherwise duplicated. The second purpose is to ensure that there is logic behind the position of data in the database: the position of the data in the tables within the database makes sense, and all the data present in the database is relevant to the purpose and instructions according to which the database is being maintained.
In a nutshell, the first normal form mainly emphasises that data should not be repeated in a table; this concept is commonly known as atomicity of the data. Tables that are consistent with the first normal form are widely known as atomic tables.
The main rules that need to be complied with when normalizing the database to the second normal form include the following: values that repeat across many rows are moved into a separate table, and the relationship between the new and existing tables is established at this stage of normalization.
So, in a nutshell, the basic aim of the second normal form is to reduce the amount of redundant data in the database, and it does so by putting repeating data into separate tables and establishing relationships between the new and existing tables.
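The split described above can be sketched in a few lines of Python. The orders/departments data is my own illustrative example, not from the assignment: a department name repeated on every row is moved into its own table, and the rows keep only the linking key.

```python
# Redundant data: the department name is repeated on every order row.
orders = [
    {"order_id": 1, "dept_id": 10, "dept_name": "Sales"},
    {"order_id": 2, "dept_id": 10, "dept_name": "Sales"},   # repeated fact
    {"order_id": 3, "dept_id": 20, "dept_name": "Support"},
]

# Second-normal-form style split: each department fact is stored once,
# in its own table; orders keep only the key that links the two tables.
departments = {row["dept_id"]: row["dept_name"] for row in orders}
orders_2nf = [{"order_id": r["order_id"], "dept_id": r["dept_id"]}
              for r in orders]

print(departments)      # {10: 'Sales', 20: 'Support'}
print(orders_2nf[0])    # {'order_id': 1, 'dept_id': 10}
```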
b) (I)
Un-normalized
  video no:     7, 9, 7
  type code:    X, Y, Z
  cassette no:  5, 6, 7

3rd Normal Form
  tape name: Rambo III, Prison Break, Wanted

Primary keys: cassette no, group code
Foreign keys: video name, type code

(II)
Un-normalized
  patient number | patient name | ward number | ward name
  8              | K Stacy      | 23          | Gynecology
  9              | B David      | 27          | ENT
  10             | L Johnson    | 15          | Cardiology

  patient number: 8, 9, 10

3rd Normal Form
  drug code | drug name | dose
  0072      | GL        | A
  0025      | EN        | D
  0029      | Ca        | F

3rd Normal Form
  patient number | patient name
  8              | K David
  9              | B Johnson
  10             | L Richard

Primary keys: patient number, ward number, drug code
Foreign keys: ward number, drug code, patient name

(III)
Un-normalized
  student no: X6, Y5, Z4

3rd Normal Form (NF)
  student no: X6, Y5, Z4

3rd Normal Form (NF)
  tutor: 101, 305, 403

3rd Normal Form (NF)
  course title: CIT, DIT, CA

Primary keys: course code, tutor, student no
Foreign keys: student no, tutor, course title

Q4:
First Normal Form:
The basic rules for normalization are set out in the first normal form and can be described as follows. As the first step in the normalization process, the duplicate columns present in the same table are removed, and a separate table is created for each set of related columns, each identified by a unique column.
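The first-normal-form step above can be sketched directly. The students-and-phones data is my own illustrative example: a field holding several values at once (non-atomic) is split so that every field holds exactly one value.

```python
# Non-atomic data: several phone numbers crammed into one field.
unnormalised = [
    {"student": "X6", "phones": "0116-1234, 0116-5678"},
    {"student": "Y5", "phones": "0117-9999"},
]

# 1NF: one atomic value per field -> one row per phone number.
first_nf = [
    {"student": row["student"], "phone": phone.strip()}
    for row in unnormalised
    for phone in row["phones"].split(",")
]
print(first_nf)
# [{'student': 'X6', 'phone': '0116-1234'},
#  {'student': 'X6', 'phone': '0116-5678'},
#  {'student': 'Y5', 'phone': '0117-9999'}]
```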
Un-normalized
  part         | worker no | salary balance | joined date | time to be paid
  Engine       | 10        | 30,000         | 01/02/09    | 12
  Body maker   | 10        | 30,000         | 01/02/09    | 24
  Body paint   | 10        | 30,000         | 01/02/09    | 36
  Rim          | 15        | 7,000          | 01/05/10    | 18
  Turbo speed  | 23        | 1,800          | 01/03/11    | 30
  Gear box     | 23        | 1,800          | 01/03/11    | 42
  Nitrogen gas | 24        | 1,000          | 01/01/11    | 48
2nd Normal Form
  project code | project description
  401          | Engine
  401          | Body maker
  401          | Body colour

  time billed: 12, 24, 36, 18, 30, 42, 48
Functional Relation
Defined Relation
  project code: 401, 402, 403, 404
  project brand: E, R, TS, NG

2nd Normal Form
  worker no | name    | grade | salary balance | joined date
  10        | David   | A     | 30,000         | 01/02/09
  15        | Stainly | B     | 7,000          | 01/05/10
  23        | Richard | C     | 1,800          | 01/03/11
  24        | Eve     | D     | 1,000          | 01/01/11
3rd Normal Form
  project code | project description
  401          | Engine
  401          | Body maker
  401          | Body colour
  402          | Rim
  403          | Turbo speed
  403          | Gear box
  404          | Nitrogen gas

  time due: 12, 24, 36, 18, 30, 42, 48
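The primary-key and foreign-key pairs listed in the answers above can be demonstrated directly. A hedged sketch using Python's built-in SQLite, with illustrative table and column names, showing a foreign key rejecting a value that has no matching primary key:

```python
import sqlite3

# A course table (primary key: course_code) and a student table whose
# course_code column is a foreign key referencing it.
conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")   # SQLite needs this switched on
conn.execute("CREATE TABLE course (course_code TEXT PRIMARY KEY, title TEXT)")
conn.execute("""CREATE TABLE student (
    student_no TEXT PRIMARY KEY,
    course_code TEXT REFERENCES course(course_code))""")
conn.execute("INSERT INTO course VALUES ('CIT', 'Computing and IT')")
conn.execute("INSERT INTO student VALUES ('X6', 'CIT')")   # accepted: 'CIT' exists
try:
    conn.execute("INSERT INTO student VALUES ('Y5', 'NOPE')")  # no such course
    rejected = False
except sqlite3.IntegrityError:
    rejected = True
print(rejected)  # True: the foreign key blocked the bad row
```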
Q5: a)
The three ways in which the integrity of data in a system can be compromised include the following.

Non-repudiation:
The integrity of the data can be compromised through the relatives of the employees.

2nd pair:
Dependent on a family member.

3rd pair:
Dependent on a relative.

4th pair:
Dependent on the next of kin.
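Beyond people-based risks like those listed above, integrity compromise is usually detected technically, by keeping a cryptographic digest of the stored data and comparing it later. A minimal sketch with illustrative data, not taken from the assignment:

```python
import hashlib

# Store a SHA-256 digest alongside the record; any later change to the
# record makes the digests disagree, revealing the integrity compromise.
record = b"patient 8, ward 23, drug 0072"
digest = hashlib.sha256(record).hexdigest()   # saved at write time

tampered = b"patient 8, ward 23, drug 0099"
print(hashlib.sha256(record).hexdigest() == digest)    # True: unchanged
print(hashlib.sha256(tampered).hexdigest() == digest)  # False: tampered
```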