
PART 1 (a)

HISTORY OF PROBABILITY

In the mid-seventeenth century, a simple question directed to Blaise Pascal by a

nobleman sparked the birth of probability theory, as we know it today. A gambler's dispute

in 1654 led to the creation of a mathematical theory of probability by two famous French

mathematicians, Blaise Pascal and Pierre de Fermat. Antoine Gombaud, Chevalier de Méré, a

French nobleman with an interest in gaming and gambling questions, called Pascal's attention

to an apparent contradiction concerning a popular dice game. Chevalier de Méré gambled

frequently to increase his wealth. He bet on a roll of a die that at least one 6 would appear

during a total of four rolls. From past experience, he knew that he was more successful than

not with this game of chance. Tired of his approach, he decided to change the game. The

game consisted in throwing a pair of dice 24 times; the problem was to decide whether or not

to bet even money on the occurrence of at least one "double six" during the 24 throws. A

seemingly well-established gambling rule led de Méré to believe that betting on a double six

in 24 throws would be profitable, but his own calculations indicated just the opposite. He bet

that he would get a total of 12, or a double 6, on twenty-four rolls of two dice. Soon he

realized that his old approach to the game resulted in more money. He asked his friend

Blaise Pascal why his new approach was not as profitable. Pascal worked through the

problem and found that the probability of winning using the new approach was only 49.1

percent compared to 51.8 percent using the old approach.

This problem and others posed by de Méré led to an exchange of letters between

Pascal and Fermat in which the fundamental principles of probability theory were formulated
for the first time. Although a few special problems on games of chance had been solved by

some Italian mathematicians in the 15th and 16th centuries, no general theory was developed

before this famous correspondence. Thus, the problem proposed by the Chevalier de Méré is said

to be the start of the famous correspondence between Pascal and Pierre de Fermat. They continued

to exchange their thoughts on mathematical principles and problems through a series of

letters. Historians think that the first letters written were associated with the above problem

and other problems dealing with probability theory. Therefore, Pascal and Fermat are the

mathematicians credited with the founding of probability theory.

The Dutch scientist Christian Huygens, a teacher of Leibniz, learned of this

correspondence and shortly thereafter (in 1657) published the first book on probability;

entitled De Ratiociniis in Ludo Aleae, it was a treatise on problems associated with

gambling. Because of the inherent appeal of games of chance, probability theory soon

became popular, and the subject developed rapidly during the 18th century. The major

contributors during this period were Jakob Bernoulli (1654-1705) and Abraham de Moivre

(1667-1754).

In 1812 Pierre de Laplace (1749-1827) introduced a host of new ideas and

mathematical techniques in his book, Théorie Analytique des Probabilités. Before Laplace,

probability theory was solely concerned with developing a mathematical analysis of games of

chance. Laplace applied probabilistic ideas to many scientific and practical problems. The

theory of errors, actuarial mathematics, and statistical mechanics are examples of some of the

important applications of probability theory developed in the 19th century.


Like so many other branches of mathematics, the development of probability theory

has been stimulated by the variety of its applications. Conversely, each advance in the theory

has enlarged the scope of its influence. Mathematical statistics is one important branch of

applied probability; other applications occur in such widely different fields as genetics,

psychology, economics, and engineering. Many workers have contributed to the theory since

Laplace's time; among the most important are Chebyshev, Markov, von Mises, and

Kolmogorov.

One of the difficulties in developing a mathematical theory of probability has been to

arrive at a definition of probability that is precise enough for use in mathematics, yet

comprehensive enough to be applicable to a wide range of phenomena. The search for a

widely acceptable definition took nearly three centuries and was marked by much

controversy. The matter was finally resolved in the 20th century by treating probability

theory on an axiomatic basis. In 1933 a monograph by the Russian mathematician A.

Kolmogorov outlined an axiomatic approach that forms the basis for the modern theory.

(Kolmogorov's monograph is available in English translation as Foundations of the Theory of

Probability, Chelsea, New York, 1950.) Since then the ideas have been refined somewhat and

probability theory is now part of a more general discipline known as measure theory.

Concepts of probability have been around for thousands of years, but probability

theory did not arise as a branch of mathematics until the mid-seventeenth century. During the

fifteenth century several probability works emerged. Calculations of probabilities became

more noticeable during this period, even though these calculation methods were not yet widely
known among mathematicians in Italy and France. In 1494, Fra Luca Paccioli wrote the

first printed work on probability, Summa de arithmetica, geometria, proportioni e

proportionalita. In 1550, Geronimo Cardano, inspired by the Summa, wrote a book about

games of chance called Liber de Ludo Aleae, which means A Book on Games of Chance.

The topic of probability is seen in many facets of the modern world. The theory of

probability is not only taught in mathematics courses but is also applied in practical fields such

as insurance, industrial quality control, the study of genetics, quantum mechanics, and the kinetic

theory of gases.

APPLICATION OF THE THEORY OF PROBABILITY

APPLICATION 1 How to win the lottery?

Many people spend large sums of money buying lottery tickets, even though they don't have

a realistic sense of their chances of winning. How can a bettor improve their chances of

winning the lottery? Simple: buy more tickets. I would love to describe a fantastic

new system that would enhance your ability to win the lottery. Unfortunately I

can't, because no such system exists. The only thing that genuinely helps is buying more tickets.

No number-selection system, permutation scheme or wheeling method, nor standing on one leg with

your arm in the air, will aid you; only a good slice of luck and buying lots of tickets will. You

can improve the amount you win when you do win, but that is about all

you can influence. Making a sensible entry on the coupon is still a necessity, or you will end up

selecting numbers that many other people have also chosen, and if your lucky day arrives
you will find yourself sharing your winnings with half the population of the world. Scant

reward for such good fortune.

For example, let’s suppose we have the following problem:

A group of 52 people contains 36 females. A sample of size 8 will be taken, and the

probability of finding exactly 6 women in the sample is desired.

In Excel type "=HYPGEOMDIST(6,8,36,52)"

The probability of exactly 6 females among 8 people drawn from 52, of whom 36 (69.2%) are

female, is 31.06%. Add the individual probabilities to get the total percentage for a range of counts.

In the case of buying lottery tickets, suppose we want the probability of matching exactly three winning

numbers in a lottery of 49 numbers where 6 numbers are drawn.

In Excel type "=HYPGEOMDIST(3,6,6,49)"

The probability of matching exactly 3 of the 6 winning numbers is 1.77%, so entering one ticket

a week you would expect such a win about once every 56.6 weeks on average.
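
The same results can be checked outside Excel. Below is a minimal Python sketch (assuming Python 3.8+ for math.comb; hypergeom_pmf is just an illustrative helper name, not a standard library function) that reproduces both HYPGEOMDIST calculations above.

# Hypergeometric probability using only the standard library.
from math import comb

def hypergeom_pmf(k, n, K, N):
    # Probability of exactly k "successes" in a sample of n drawn without
    # replacement from a population of N items containing K successes.
    return comb(K, k) * comb(N - K, n - k) / comb(N, n)

# 6 females in a sample of 8 drawn from 52 people, 36 of whom are female
print(hypergeom_pmf(6, 8, 36, 52))   # about 0.3106 (31.06%)

# exactly 3 of our 6 chosen numbers among the 6 winning numbers out of 49
p3 = hypergeom_pmf(3, 6, 6, 49)
print(p3)                            # about 0.0177 (1.77%)
print(1 / p3)                        # about 56.6 draws per such win on average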

APPLICATION 2 How to set a safe password?

All of us use personal security codes for ATM machines, computer internet access and home

security systems. The safety of such codes depends on the large number of different

possibilities, but hackers now have sophisticated tools that can largely overcome that

obstacle. Researchers found that by using variations of the user's first and last names

along with 1000 other first names, they could identify 10% to 20% of the passwords on
typical computer systems. When choosing a password, do not use a variation of any

name, a word found in a dictionary, a password shorter than 7 characters, telephone

numbers or social security numbers. Do include nonalphabetic characters such as

digits or punctuation marks.
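
To see why these recommendations matter, it helps to count the possibilities. The short Python sketch below compares the number of possible passwords in two illustrative cases; the alphabet sizes (26 lowercase letters versus 94 printable ASCII characters) and the lengths are assumptions chosen only for illustration.

# Rough count of possible passwords for two assumed cases.
lowercase_only = 26 ** 7        # 7 characters, lowercase letters only
mixed_charset = 94 ** 10        # 10 characters from letters, digits and punctuation

print(f"{lowercase_only:.3e}")  # about 8.0e+09 possibilities
print(f"{mixed_charset:.3e}")   # about 5.4e+19 possibilities

The longer password drawn from the larger character set offers vastly more possibilities, which is exactly what makes systematic guessing attacks impractical.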

PART 1 (b)

Empirical Probability of an event is an "estimate" of the probability that the event will happen, based on how

often the event occurs after collecting data or running an experiment (in a large number of

trials). It is based specifically on direct observations or experiences.

Empirical Probability Formula:

P(E) = probability that an event, E, will occur
     = (number of ways the specific event occurs) / (number of ways the experiment could occur)

Example: A survey was conducted to determine students' favorite breeds of dogs. Each student
chose only one breed.

Dog    Collie    Spaniel    Lab    Boxer    PitBull    Other
#      10        15         35     8        5          12

What is the probability that a student's favorite dog breed is Lab?

Answer: 35 out of the 85 students chose Lab. The probability is P(Lab) = 35/85 = 7/17 ≈ 0.41.

Theoretical Probability of an event is the number of ways that the event can occur, divided
by the total number of outcomes. It is found from a sample space of known equally likely
outcomes.

Theoretical Probability Formula:

P(E) = probability that an event, E, will occur = n(E) / n(S)

where n(E) = number of equally likely outcomes in E, and n(S) = number of equally likely
outcomes in the sample space S.

Example 1: Find the probability of rolling a six on a fair die.

Answer: The sample space for rolling a die has 6 equally likely results: {1, 2, 3, 4, 5, 6}.
The probability of rolling a 6 is one out of 6, or 1/6.

Example 2: Find the probability of tossing a fair die and getting an odd number.

Answer:

event E: tossing an odd number

outcomes in E: {1, 3, 5}

sample space S: {1, 2, 3, 4, 5, 6}

P(E) = n(E) / n(S) = 3/6 = 1/2

Comparison Between Empirical and Theoretical Probabilities:

Karen and Jason roll two dice 50 times and record their results in the accompanying chart.

Sum of the rolls of the two dice:
3, 5, 5, 4, 6, 7, 7, 5, 9,
10, 12, 9, 6, 5, 7, 8, 7, 4,
11, 6, 8, 8, 10, 6, 7, 4, 4,
5, 7, 9, 9, 7, 8, 11, 6, 5,
4, 7, 7, 4, 3, 6, 7, 7, 7, 8,
6, 7, 8, 9

1.) What is their empirical probability of rolling a 7?
2.) What is the theoretical probability of rolling a 7?
3.) How do the empirical and theoretical probabilities compare?

Solution:

1.) Empirical probability (experimental probability or observed probability) is 13/50 = 26%.

2.) Theoretical probability (based upon what is possible when working with two dice) = 6/36 = 1/6 ≈ 16.7%
(there are six ways to obtain a sum of 7 among the 36 equally likely outcomes; see the table in PART 2 (b)).

3.) Karen and Jason rolled more 7's than would be expected theoretically.
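
As a quick check of answers 1.) and 2.), the following Python sketch recomputes the empirical probability of a 7 from the 50 recorded sums and compares it with the theoretical value of 6/36.

# Empirical vs. theoretical probability of rolling a sum of 7.
rolls = [3, 5, 5, 4, 6, 7, 7, 5, 9,
         10, 12, 9, 6, 5, 7, 8, 7, 4,
         11, 6, 8, 8, 10, 6, 7, 4, 4,
         5, 7, 9, 9, 7, 8, 11, 6, 5,
         4, 7, 7, 4, 3, 6, 7, 7, 7, 8,
         6, 7, 8, 9]

empirical = rolls.count(7) / len(rolls)   # 13 sevens out of 50 rolls = 0.26
theoretical = 6 / 36                      # 0.1666...
print(empirical, theoretical)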

PART 2 (a)

Possible outcomes when a die is tossed once = {1, 2, 3, 4, 5, 6}

PART 2 (b)
Possible outcomes when two dice are tossed simultaneously:
Dice 1 \ Dice 2    1       2       3       4       5       6
1 (1,1) (1,2) (1,3) (1,4) (1,5) (1,6)
2 (2,1) (2,2) (2,3) (2,4) (2,5) (2,6)
3 (3,1) (3,2) (3,3) (3,4) (3,5) (3,6)
4 (4,1) (4,2) (4,3) (4,4) (4,5) (4,6)
5 (5,1) (5,2) (5,3) (5,4) (5,5) (5,6)
6 (6,1) (6,2) (6,3) (6,4) (6,5) (6,6)

PART 3 (a)

Sum of the dots on both turned-up faces (x)    Possible outcomes                               Probability P(x)
2      {(1,1)}                                        1/36
3      {(1,2), (2,1)}                                 2/36
4      {(1,3), (2,2), (3,1)}                          3/36
5      {(1,4), (2,3), (3,2), (4,1)}                   4/36
6      {(1,5), (2,4), (3,3), (4,2), (5,1)}            5/36
7      {(1,6), (2,5), (3,4), (4,3), (5,2), (6,1)}     6/36
8      {(2,6), (3,5), (4,4), (5,3), (6,2)}            5/36
9      {(3,6), (4,5), (5,4), (6,3)}                   4/36
10     {(4,6), (5,5), (6,4)}                          3/36
11     {(5,6), (6,5)}                                 2/36
12     {(6,6)}                                        1/36

PART 3 (b)

A = {(1,2), (1,3), (1,4), (1,5), (1,6), (2,1), (2,3), (2,4), (2,5), (2,6), (3,1), (3,2), (3,4), (3,5),
(3,6), (4,1), (4,2), (4,3), (4,5), (4,6), (5,1), (5,2), (5,3), (5,4), (5,6), (6,1), (6,2), (6,3),
(6,4), (6,5)}
B={ }

C = {(2,3), (2,5), (3,2), (3,3), (3,5), (5,2), (5,3), (5,5), (1,2), (1,4), (1,6), (2,1), (3,4), (3,6),
(4,1), (4,3), (4,5), (5,4), (5,6), (6,1), (6,3), (6,5)}

D = {(3,3), (3,5), (5,3), (5,5)}

PART 4 (a)

Sum of the two numbers (x)    Frequency (f)    fx     fx²
2         1      2      4
3         3      9      27
4         4      16     64
5         6      30     150
6         7      42     252
7         10     70     490
8         7      56     448
9         5      45     405
10        4      40     400
11        2      22     242
12        1      12     144
Total     50     344    2626

(i) Mean = Σfx / Σf = 344 / 50 = 6.88

(ii) Variance (σ²) = Σfx² / Σf − (mean)² = 2626 / 50 − 6.88² = 52.52 − 47.3344 = 5.1856

(iii) Standard deviation (σ) = √5.1856 = 2.2772
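
These three results can be reproduced directly from the frequency table. The Python sketch below (the dictionary freq simply encodes the table in PART 4 (a)) computes the mean, variance and standard deviation with the same formulas.

# Mean, variance and standard deviation from a frequency table.
freq = {2: 1, 3: 3, 4: 4, 5: 6, 6: 7, 7: 10, 8: 7, 9: 5, 10: 4, 11: 2, 12: 1}

n = sum(freq.values())                                        # 50 tosses
mean = sum(x * f for x, f in freq.items()) / n                # 344/50 = 6.88
var = sum(x * x * f for x, f in freq.items()) / n - mean**2   # 5.1856
sd = var ** 0.5                                               # about 2.2772
print(mean, var, sd)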

PART 4 (b)

The predicted mean if the number of tosses is increased to 100 is 6.90.

PART 4 (c)

Sum of the two numbers (x)    Frequency (f)    fx     fx²
2         2      4      8
3         6      18     54
4         8      32     128
5         11     55     275
6         14     84     504
7         17     119    833
8         14     112    896
9         12     108    972
10        7      70     700
11        7      77     847
12        2      24     288
Total     100    703    5505

(i) Mean = Σfx / Σf = 703 / 100 = 7.03

(ii) Variance (σ²) = Σfx² / Σf − (mean)² = 5505 / 100 − 7.03² = 55.05 − 49.4209 = 5.6291

(iii) Standard deviation (σ) = √5.6291 = 2.3726

My estimated mean is different from the actual mean. However, the difference between the
mean values is not significant. Thus, my prediction can be considered acceptable.

PART 5 (a)

Sum of the dots on both turned-up faces (x)    Probability P(x)    x P(x)    x² P(x)
2        1/36     2/36      4/36
3        2/36     6/36      18/36
4        3/36     12/36     48/36
5        4/36     20/36     100/36
6        5/36     30/36     180/36
7        6/36     42/36     294/36
8        5/36     40/36     320/36
9        4/36     36/36     324/36
10       3/36     30/36     300/36
11       2/36     22/36     242/36
12       1/36     12/36     144/36
Total    1        7         54.8333

Mean = Σ x P(x) = 252/36 = 7

Variance = Σ x² P(x) − (mean)² = 54.8333 − 7² = 5.8333

Standard deviation = √5.8333 = 2.4152
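
The same theoretical values can be obtained by enumerating all 36 equally likely outcomes instead of tabulating P(x) by hand. A short Python sketch of that approach:

# Theoretical mean and variance of the sum of two dice by enumeration.
from itertools import product
from collections import Counter

sums = Counter(a + b for a, b in product(range(1, 7), repeat=2))   # 36 outcomes

mean = sum(x * c / 36 for x, c in sums.items())                 # Σ x·P(x) = 7
second_moment = sum(x * x * c / 36 for x, c in sums.items())    # Σ x²·P(x) ≈ 54.8333
variance = second_moment - mean ** 2                            # ≈ 5.8333
print(mean, variance, variance ** 0.5)                          # 7.0, 5.8333..., 2.4152...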
PART 5 (b)

The mean, variance and standard deviation obtained in Part 4 and Part 5 are different, but not

significantly so. This is because the values in Part 4 are calculated from the actual

experimental data, whereas the values in Part 5 are the theoretical values given by

probability theory, and an experiment with a limited number of tosses will not match the theory exactly.

This explains the differences between the figures in Part 4 and Part 5, which represent the

empirical and theoretical results respectively.

PART 5 (c)

When finding probabilities with the relative frequencies approach, we obtain an estimate

instead of an exact value. As the total number of observations increases, the corresponding

estimates tend to get closer to the actual probabilities. This property is stated as the Law of

Large Numbers. It is noted that when n = 50, the mean obtained is 6.88, while when n

= 100, the mean obtained is 7.03. This shows that as the number of times the two dice are

tossed simultaneously (n) increases, the mean obtained in Part 4 becomes closer to

the theoretical mean obtained in Part 5. This is because a probability estimate based on only a few trials

can be off by a substantial amount, but with a very large number of trials, the estimate tends

to be much more accurate.
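
A simple simulation illustrates this behaviour. The Python sketch below (the random seed is arbitrary and only fixes one example run) tosses two dice n times for increasing n and prints the sample mean, which settles near the theoretical mean of 7 as n grows.

# Law of Large Numbers demonstration for the sum of two dice.
import random

random.seed(1)

def sample_mean(n):
    # Average of the sum of two dice over n simulated tosses.
    return sum(random.randint(1, 6) + random.randint(1, 6) for _ in range(n)) / n

for n in (50, 100, 1_000, 100_000):
    print(n, round(sample_mean(n), 3))
# Typical output: the means wander around 7 for small n and settle close to 7 for large n.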

FURTHER EXPLORATION

The Law of Large Numbers is a fundamental concept in statistics and probability that

describes how the average of a randomly selected large sample from a population is likely to

be close to the average of the whole population. Based on the findings of this project, it can be
concluded that the Law of Large Numbers holds. As the same experiment is performed a

large number of times, the results obtained come closer and closer to the values estimated

using probability theory. In other words, the estimates given by probability

theory become more applicable and valid as the number of trials increases.

In formal language, if an event of probability p is observed repeatedly during independent

repetitions, the ratio of the observed frequency of that event to the total number of repetitions

converges towards p as the number of repetitions becomes arbitrarily large. Jakob Bernoulli

first described the LLN. He said it was so simple that even the stupidest man instinctively

knows it is true. Despite this, it took him over 20 years to develop a good mathematical

proof. Once he had found it, he published the proof in Ars Conjectandi (The Art of

Conjecturing) in 1713. He named this his "Golden Theorem". It became generally known as

"Bernoulli's Theorem". In 1835, S.D. Poisson further described it under the name "La loi des

grands nombres" (The law of large numbers). Thereafter, it was known under both names,

but the "Law of large numbers" is most frequently used.

Other mathematicians also contributed to refining the law. Some of them were

Chebyshev, Markov, Borel, Cantelli and Kolmogorov. After these studies there are now two

different forms of the law: one is called the "weak" law and the other the "strong" law. These

forms do not describe different laws; they describe in different ways how the observed

frequency of an event converges to its actual probability. The strong form of the law

implies the weak one.


The weak law states that, for any fixed margin of error, the probability that the sample mean

differs from the population mean by more than that margin approaches zero as the sample

size grows (convergence in probability). The strong law states that, with probability 1, the sample

mean converges to the population mean as the sample size grows (almost-sure convergence).
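
Stated a little more formally (a standard textbook formulation, where X̄n denotes the mean of the first n independent, identically distributed observations and μ their common expected value):

Weak law:    for every ε > 0,   P(|X̄n − μ| > ε) → 0   as n → ∞

Strong law:  P( X̄n → μ as n → ∞ ) = 1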

The phrase "law of large numbers" is also sometimes used in a less technical way to refer to

the principle that the probability of any possible event (even an unlikely one) occurring at

least once in a series increases with the number of events in the series. For example, the odds

that you will win the lottery are very low; however, the odds that someone will win the

lottery are quite good, provided that a large enough number of people purchased lottery

tickets.

One common misconception about the LLN is that if an event has not occurred in many trials, the probability

of it occurring in a subsequent trial is increased. For example, the probability of a fair die

turning up a 3 is 1 in 6. LLN says that over a large number of throws, the observed frequency

of 3s will be close to 1 in 6. This, however, does not mean that if the first 5 throws of the die

do not turn up a 3, the sixth throw is any more likely to turn up a 3. The

probability for the 6th throw turning up a 3 remains 1 in 6. In an infinite (or very large) set of

observations, the value of any one individual observation cannot be predicted based upon

past observations. The mistaken belief that it can is known as the Gambler's Fallacy.

REFLECTION
In conducting this project, I have learnt the power of mathematical theories in solving

problems. It has strengthened my desire to explore more knowledge and deepened my respect for the

knowledge created by mathematicians. The theory of probability is widely applied in all

fields of our life, ranging from simple dice-tossing problems to complicated investment

decisions.

Through the research carried out in this project, I have also learnt that the chance of winning the

lottery is very low. This makes me aware that indulging in gambling activities is almost certain to

end in a loss: punters tend to lose in the long run because the hosts (the casino or the lottery

company) always stand a much higher probability of winning the games. People who do not

understand probability often assume that they stand a good chance of winning, try their luck

again and again, and may eventually end up with heavy losses and debts. The posters of the

lottery companies, for instance, are designed to attract punters by portraying dreams of big cars,

big houses and a luxurious life after winning the lottery. But we have to bear in mind that the

dreams portrayed in the posters have only a very small probability of coming true, unless we are

extremely lucky. So, my opinion is that we should not gamble in the hope of becoming rich.
