

APPRECIATION
First of all, I would like to say Alhamdulillah, for giving me the strength
and health to do this project work.

I would also like to thank my parents for providing everything, such as money to buy the things related to this project work, the Internet, books and computers, and for their advice, which was what I needed most for this project. They also supported me and encouraged me to complete this task so that I would not procrastinate in doing it.

Then I would like to thank my teacher, Madam Zaiton, for guiding me and my friends throughout this project. We had some difficulties in doing this task, but she taught us patiently until we knew what to do. She tried again and again to teach us until we understood what we were supposed to do with the project work.

Last but not least, I would like to thank my friends, who did this project with me and shared their ideas. They were so helpful that when we combined and discussed together, we got this task done.




OBJECTIVES


The aims of carrying out this project work are:

- to apply and adapt a variety of problem-solving strategies to solve problems;
- to improve thinking skills;
- to promote effective mathematical communication;
- to develop mathematical knowledge through problem solving in a way that increases students' interest and confidence;
- to use the language of mathematics to express mathematical ideas precisely;
- to provide a learning environment that stimulates and enhances effective learning;
- to develop a positive attitude towards mathematics.




PART ONE




INTRODUCTION


Probability is a way of expressing knowledge or belief that an event will occur or has occurred. In mathematics the concept has been given an exact meaning in probability theory, which is used extensively in such areas of study as mathematics, statistics, finance, gambling, science, and philosophy to draw conclusions about the likelihood of potential events and the underlying mechanics of complex systems.

Probability has a dual aspect: on the one hand, the probability or likelihood of hypotheses given the evidence for them, and on the other hand, the behavior of stochastic processes such as the throwing of dice or coins. The study of the former is historically older, appearing for example in the law of evidence, while the mathematical treatment of dice began with the work of Pascal and Fermat in the 1650s.

Probability is distinguished from statistics. While statistics deals with data and the inferences drawn from them, (stochastic) probability deals with the stochastic (random) processes which lie behind data or outcomes.


IMPORTANCE OF PROBABILITY IN LIFE

Statistics is based on an understanding of probability theory, and many professions require a basic understanding of statistics, so in these cases probability is clearly important.

Probability theory also goes beyond mathematics: it involves logic and reasoning abilities. Marketing and politics have one thing in common: biased statistics. Since we are exposed to so many statistics, a basic understanding of this area allows more critical thinking. The book "How to Lie with Statistics" is a classic and still in print.

So, while many people would probably say that probability theory has little importance in their lives, perhaps in some cases, if they knew more, it would have more importance.




Π
      

½ 
m  and  and their cognates in other modern languages derive from medieval learned
Latin 
and  
 
, deriving from Cicero and generally applied to an opinion to
mean 
 or    .

Origins

Ancient and medieval law of evidence developed a grading of degrees of proof, probabilities, presumptions and half-proof to deal with the uncertainties of evidence in court. In Renaissance times, betting was discussed in terms of odds such as "ten to one" and maritime insurance premiums were estimated based on intuitive risks, but there was no theory on how to calculate such odds or premiums.

The mathematical methods of probability arose in the correspondence of Pierre de Fermat and Blaise Pascal (1654) on such questions as the fair division of the stake in an interrupted game of chance. Christiaan Huygens (1657) gave a comprehensive treatment of the subject.

Eighteenth century

Jacob Bernoulli's Ars Conjectandi (posthumous, 1713) and Abraham de Moivre's The Doctrine of Chances (1718) put probability on a sound mathematical footing, showing how to calculate a
wide range of complex probabilities. Bernoulli proved a version of the fundamental law of large
numbers, which states that in a large number of trials, the average of the outcomes is likely to be
very close to the expected value - for example, in 1000 throws of a fair coin, it is likely that there
are close to 500 heads (and the larger the number of throws, the closer to half-and-half the
proportion is likely to be).








Nineteenth century

The power of probabilistic methods in dealing with uncertainty was shown by Gauss's determination of the orbit of Ceres from a few observations. The theory of errors used the method of least squares to correct error-prone observations, especially in astronomy, based on the assumption of a normal distribution of errors to determine the most likely true value.

Towards the end of the nineteenth century, a major success of explanation in terms of
probabilities was the Statistical mechanics of Ludwig Boltzmann and J. Willard Gibbs which
explained properties of gases such as temperature in terms of the random motions of large
numbers of particles.

The field of the history of probability itself was established by Isaac Todhunter's monumental A History of the Mathematical Theory of Probability from the Time of Pascal to that of Laplace (1865).

Twentieth century

Probability and statistics became closely connected through the work on hypothesis testing of R. A. Fisher and Jerzy Neyman, which is now widely applied in biological and psychological experiments and in clinical trials of drugs. A hypothesis, for example that a drug is usually effective, gives rise to a probability distribution that would be observed if the hypothesis is true. If observations approximately agree with the hypothesis, it is confirmed; if not, it is rejected.

The theory of stochastic processes broadened into such areas as Markov processes and Brownian
motion, the random movement of tiny particles suspended in a fluid. That provided a model for
the study of random fluctuations in stock markets, leading to the use of sophisticated probability
models in mathematical finance, including such successes as the widely-used Black-Scholes
formula for the valuation of options.

The twentieth century also saw long-running disputes on the interpretations of probability. In the
mid-century frequentism was dominant, holding that probability means long-run relative
frequency in a large number of trials. At the end of the century there was some revival of the
Bayesian view, according to which the fundamental notion of probability is how well a
proposition is supported by the evidence for it.

The mathematical treatment of probabilities, especially when there are infinitely many possible outcomes, was facilitated by Kolmogorov's axioms (1933).

 




APPLICATIONS

Two major applications of probability theory in everyday life are in risk assessment and in trade
on commodity markets. Governments typically apply probabilistic methods in environmental
regulation where it is called "pathway analysis", often measuring well-being using methods that
are stochastic in nature, and choosing projects to undertake based on statistical analyses of their
probable effect on the population as a whole.

A good example is the effect of the perceived probability of any widespread Middle East conflict
on oil prices - which have ripple effects in the economy as a whole. An assessment by a
commodity trader that a war is more likely vs. less likely sends prices up or down, and signals
other traders of that opinion. Accordingly, the probabilities are not assessed independently nor
necessarily very rationally. The theory of behavioral finance emerged to describe the effect of
such groupthink on pricing, on policy, and on peace and conflict.

It can reasonably be said that the discovery of rigorous methods to assess and combine
probability assessments has had a profound effect on modern society. Accordingly, it may be of
some importance to most citizens to understand how odds and probability assessments are made,
and how they contribute to reputations and to decisions, especially in a democracy.

Another significant application of probability theory in everyday life is reliability. Many consumer products, such as automobiles and consumer electronics, utilize reliability theory in the design of the product in order to reduce the probability of failure. The probability of failure may be closely associated with the product's warranty.

 




THEORETICAL PROBABILITY

Probability is a likelihood that an event will happen.

We can find the theoretical probability of an event using the following ratio:

P(event) = number of favorable outcomes / total number of possible outcomes

Let's do a couple of examples.

Example 1:

What is the probability of getting a tail when a coin is tossed?

Solution:

Tossing a tail is the favorable outcome here. When you toss a coin there are only 2 possible outcomes: a head or a tail. So the chance of tossing a tail is 1 out of 2:

P(tail) = 1/2 = 0.5 = 50%

We can also represent probability as a decimal or as a percent.

Example 2:

A bag contains 15 red marbles and 5 blue marbles. What is the probability of picking a red marble from the bag at random?

Let's first answer a few questions here.

If I am going to randomly pick a marble from the bag, then what results can I have? I'll either pick a red marble or a blue one.

My next question is: what are the chances of picking a red marble?

There are 15 red marbles and just 5 blue marbles, so we have three times as many red marbles as blue marbles. The chance of picking a red marble is therefore higher than that of picking a blue one.

Therefore, the probability of picking a red marble is:

P(red) = 15/20 = 3/4

Example 3:

What is the probability of getting a sum of 7 when two dice are rolled?

Two dice are being rolled. The possible outcomes are as follows:

Let's use the representation (a, b) for the outcomes, where a is the number shown on the first die and b is the number shown on the second die.

(1, 1), (1, 2), (1, 3), (1, 4), (1, 5), (1, 6),
(2, 1), (2, 2), (2, 3), (2, 4), (2, 5), (2, 6),
(3, 1), (3, 2), (3, 3), (3, 4), (3, 5), (3, 6),
(4, 1), (4, 2), (4, 3), (4, 4), (4, 5), (4, 6),
(5, 1), (5, 2), (5, 3), (5, 4), (5, 5), (5, 6),
(6, 1), (6, 2), (6, 3), (6, 4), (6, 5), (6, 6)

There are 36 possible outcomes in all.

Now, how many of these outcomes give a sum of 7?

From the list above, identify the pairs whose outcomes add up to 7. Let's highlight them this way:

(1, 1), (1, 2), (1, 3), (1, 4), (1, 5), *(1, 6)*,
(2, 1), (2, 2), (2, 3), (2, 4), *(2, 5)*, (2, 6),
(3, 1), (3, 2), (3, 3), *(3, 4)*, (3, 5), (3, 6),
(4, 1), (4, 2), *(4, 3)*, (4, 4), (4, 5), (4, 6),
(5, 1), *(5, 2)*, (5, 3), (5, 4), (5, 5), (5, 6),
*(6, 1)*, (6, 2), (6, 3), (6, 4), (6, 5), (6, 6)

Observe that the highlighted pairs lie along a diagonal, and each adds up to 7. There are 6 such pairs.

So, the probability of getting a sum of 7 when we roll two dice is:

P(sum of 7) = 6/36 = 1/6



EMPIRICAL PROBABILITY

Empirical probability, also known as relative frequency or experimental probability, is the ratio of the number of favorable outcomes to the total number of trials,[1][2] not in a sample space but in an actual sequence of experiments. In a more general sense, empirical probability estimates probabilities from experience and observation.[3] The phrase a posteriori probability has also been used as an alternative to empirical probability or relative frequency.[4] This unusual usage of the phrase is not directly related to Bayesian inference and is not to be confused with its equally occasional use to refer to posterior probability, which is something else.

In statistical terms, the empirical probability is an estimate of a probability. If modelling using a binomial distribution is appropriate, it is the maximum likelihood estimate. It is the Bayesian estimate for the same case if certain assumptions are made for the prior distribution of the probability.

An advantage of estimating probabilities using empirical probabilities is that this procedure is relatively free of assumptions. For example, consider estimating the probability among a population of men that they satisfy two conditions: (i) that they are over 6 feet in height; (ii) that they prefer strawberry jam to raspberry jam. A direct estimate could be found by counting the number of men who satisfy both conditions to give the empirical probability of the combined condition. An alternative estimate could be found by multiplying the proportion of men who are over 6 feet in height with the proportion of men who prefer strawberry jam to raspberry jam, but this estimate relies on the assumption that the two conditions are statistically independent.
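As a toy illustration of the two estimates just described, the following Python sketch compares the direct count with the independence-based product; the survey records in it are made-up values, not data from this document:

    # Hypothetical survey records (made-up for illustration):
    # (is_over_6_feet, prefers_strawberry_jam)
    records = [(True, True), (True, True), (False, False),
               (False, True), (True, False), (False, False)]

    n = len(records)
    # Direct estimate: count men satisfying both conditions.
    p_both_direct = sum(tall and straw for tall, straw in records) / n
    # Product estimate: assumes the two conditions are independent.
    p_tall = sum(tall for tall, _ in records) / n
    p_straw = sum(straw for _, straw in records) / n

    print("direct estimate:     ", p_both_direct)     # 2/6 ~ 0.33
    print("independence product:", p_tall * p_straw)  # 0.5 * 0.5 = 0.25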

A disadvantage in using empirical probabilities arises in estimating probabilities which are either very close to zero or very close to one. In these cases very large sample sizes would be needed in order to estimate such probabilities to a good standard of relative accuracy. Here statistical models can help, depending on the context, and in general one can hope that such models would provide improvements in accuracy compared to empirical probabilities, provided that the assumptions involved actually do hold. For example, consider estimating the probability that the lowest of the daily-maximum temperatures at a site in February in any one year is less than zero degrees Celsius. A record of such temperatures in past years could be used to estimate this probability. A model-based alternative would be to select a family of probability distributions and fit it to the dataset containing past yearly values: the fitted distribution would provide an alternative estimate of the required probability. This alternative method can provide an estimate of the probability even if all values in the record are greater than zero.




EMPIRICAL PROBABILITY VERSUS THEORETICAL PROBABILITY

Empirical probability of an event is an "estimate" that the event will happen based on how often the event occurs after collecting data or running an experiment (in a large number of trials). It is based specifically on direct observations or experiences.

Empirical probability formula:

P(E) = n / N

where P(E) = probability that an event, E, will occur;
n = number of ways the specific event occurs;
N = number of ways the experiment could occur.

Example: A survey was conducted to determine students' favorite breeds of dogs. Each student chose only one breed.

Dog       Collie   Spaniel   Lab   Boxer   PitBull   Other
Students  10       15        35    8       5         12

What is the probability that a student's favorite dog breed is Lab?

Answer: 35 out of the 85 students chose Lab. The probability is 35/85 = 7/17.


Theoretical probability of an event is the number of ways that the event can occur, divided by the total number of outcomes. It is finding the probability of events that come from a sample space of known equally likely outcomes.

Theoretical probability formula:

P(E) = n(E) / n(S)

where P(E) = probability that an event, E, will occur;
n(E) = number of equally likely outcomes of E;
n(S) = number of equally likely outcomes of the sample space S.

Example 1: Find the probability of rolling a six on a fair die.

Answer: The sample space for rolling a die has 6 equally likely results: {1, 2, 3, 4, 5, 6}. The probability of rolling a 6 is one out of 6, or 1/6.

Example 2: Find the probability of tossing a fair die and getting an odd number.

Answer:
event E: tossing an odd number
outcomes in E: {1, 3, 5}
sample space S: {1, 2, 3, 4, 5, 6}
P(E) = 3/6 = 1/2
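Both die examples reduce to the same ratio n(E)/n(S). A minimal Python sketch of that calculation, assuming a fair six-sided die:

    # Sample space for one fair die.
    sample_space = {1, 2, 3, 4, 5, 6}

    def theoretical_p(event):
        """P(E) = n(E) / n(S) for equally likely outcomes."""
        return len(event & sample_space) / len(sample_space)

    print(theoretical_p({6}))        # rolling a six: 1/6 = 0.1666...
    print(theoretical_p({1, 3, 5}))  # rolling an odd number: 3/6 = 0.5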




COMPARING EMPIRICAL AND THEORETICAL PROBABILITIES

Karen and Jason roll two dice 50 times and record the sum of each roll:

3, 5, 5, 4, 6, 7, 7, 5, 9, 10, 12, 9, 6, 5, 7, 8, 7, 4, 11, 6,
8, 8, 10, 6, 7, 4, 4, 5, 7, 9, 9, 7, 8, 11, 6, 5, 4, 7, 7, 4,
3, 6, 7, 7, 7, 8, 6, 7, 8, 9

1) What is their empirical probability of rolling a 7?
2) What is the theoretical probability of rolling a 7?
3) How do the empirical and theoretical probabilities compare?

Solution:
1) The empirical probability (experimental or observed probability) is 13/50 = 26%.
2) The theoretical probability (based upon what is possible when working with two dice) is 6/36 = 1/6 ≈ 16.7%.
3) Karen and Jason rolled more 7's than would be expected theoretically.
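The empirical count in part 1) can be verified directly from the recorded rolls; a small Python check:

    # Karen and Jason's 50 recorded sums.
    rolls = [3, 5, 5, 4, 6, 7, 7, 5, 9, 10, 12, 9, 6, 5, 7, 8, 7, 4, 11, 6,
             8, 8, 10, 6, 7, 4, 4, 5, 7, 9, 9, 7, 8, 11, 6, 5, 4, 7, 7, 4,
             3, 6, 7, 7, 7, 8, 6, 7, 8, 9]

    sevens = rolls.count(7)
    print(sevens, "/", len(rolls))                  # 13 / 50
    print("empirical P(7) =", sevens / len(rolls))  # 0.26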




PART TWO




• My friends and I are playing Monopoly. At the beginning, each of us will toss a die once. The player who obtains the highest number will start the game. The possible outcomes are:

{1, 2, 3, 4, 5, 6}

• Instead of one die, two dice can also be tossed simultaneously by each player. The player will move the token according to the sum of all dots on both turned-up faces. For example, if the two dice are tossed simultaneously and "2" appears on one die and "3" appears on the other, the outcome of the toss is (2, 3). Hence, the player shall move the token 5 spaces. The possible outcomes when two dice are tossed simultaneously are:

{(1, 1), (1, 2), (1, 3), (1, 4), (1, 5), (1, 6),
(2, 1), (2, 2), (2, 3), (2, 4), (2, 5), (2, 6),
(3, 1), (3, 2), (3, 3), (3, 4), (3, 5), (3, 6),
(4, 1), (4, 2), (4, 3), (4, 4), (4, 5), (4, 6),
(5, 1), (5, 2), (5, 3), (5, 4), (5, 5), (5, 6),
(6, 1), (6, 2), (6, 3), (6, 4), (6, 5), (6, 6)}
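For completeness, this 36-outcome sample space can be generated rather than written out by hand; a Python sketch using only the standard library:

    from itertools import product

    # All ordered outcomes (first die, second die) when two dice are tossed.
    sample_space = list(product(range(1, 7), repeat=2))
    print(len(sample_space))  # 36
    print(sample_space[:3])   # [(1, 1), (1, 2), (1, 3)]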




PART THREE




Table 1 shows the sum of all dots on both turned-up faces when two dice are tossed simultaneously.

Sum of dots (x)   Possible outcomes                            Probability, P(x)
2                 (1,1)                                        1/36
3                 (1,2), (2,1)                                 2/36
4                 (1,3), (2,2), (3,1)                          3/36
5                 (1,4), (2,3), (3,2), (4,1)                   4/36
6                 (1,5), (2,4), (3,3), (4,2), (5,1)            5/36
7                 (1,6), (2,5), (3,4), (4,3), (5,2), (6,1)     6/36
8                 (2,6), (3,5), (4,4), (5,3), (6,2)            5/36
9                 (3,6), (4,5), (5,4), (6,3)                   4/36
10                (4,6), (5,5), (6,4)                          3/36
11                (5,6), (6,5)                                 2/36
12                (6,6)                                        1/36

TABLE 1
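Table 1 can be reproduced programmatically by tallying the sums over the 36 equally likely outcomes; a Python sketch (note that Fraction reduces automatically, so 6/36 prints as 1/6):

    from itertools import product
    from collections import Counter
    from fractions import Fraction

    # Tally how many of the 36 ordered outcomes give each sum.
    counts = Counter(a + b for a, b in product(range(1, 7), repeat=2))

    for s in range(2, 13):
        print(s, counts[s], Fraction(counts[s], 36))
    # e.g. sum 7 -> 6 outcomes -> 1/6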




A = {the two numbers are not the same}

= {(1,2), (2,1), (1,3), (3,1), (1,4), (4,1), (1,5), (5,1), (1,6), (6,1),
(2,3), (3,2), (2,4), (4,2), (2,5), (5,2), (2,6), (6,2), (3,4), (4,3),
(3,5), (5,3), (3,6), (6,3), (4,5), (5,4), (4,6), (6,4), (5,6), (6,5)}

P(A) = n(A) / n(S) = 30/36 = 5/6

B = {the product of the two numbers is greater than 36}

= { }, the empty set, since the largest possible product is 6 x 6 = 36.

P(B) = 0/36 = 0

C = {both numbers are prime or the difference between the two numbers is odd}

Both numbers prime (9 outcomes):
{(2,2), (2,3), (2,5), (3,2), (3,3), (3,5), (5,2), (5,3), (5,5)}

Difference between the two numbers odd (18 outcomes):
{(1,2), (2,1), (1,4), (4,1), (1,6), (6,1), (2,3), (3,2), (2,5), (5,2),
(3,4), (4,3), (3,6), (6,3), (4,5), (5,4), (5,6), (6,5)}

Outcomes in both sets (4 outcomes): {(2,3), (3,2), (2,5), (5,2)}

P(C) = 9/36 + 18/36 - 4/36 = 23/36

D = {the sum of the two numbers is even and both numbers are prime}

Sum even:
{(1,1), (1,3), (1,5), (2,2), (2,4), (2,6), (3,1), (3,3), (3,5), (4,2), (4,4),
(4,6), (5,1), (5,3), (5,5), (6,2), (6,4), (6,6)}

Both prime:
{(2,2), (2,3), (2,5), (3,2), (3,3), (3,5), (5,2), (5,3), (5,5)}

D is the intersection of these two sets:
D = {(2,2), (3,3), (3,5), (5,3), (5,5)}

P(D) = n(D) / n(S) = 5/36




PART FOUR




a) An activity was conducted by tossing two dice simultaneously 50 times. The sums of all dots on both turned-up faces were observed. The frequency table is completed below.

Sum of the two numbers (x)    Frequency (f)

2 3
3 8
4 6
5 7
6 9
7 3
8 2
9 4
10 4
11 3
12 1
TABLE 2
mean = Σfx / Σf

= 301/50

= 6.02

variance = Σfx² / Σf - (mean)²

= 2185/50 - (6.02)²

= 7.4596

standard deviation = √variance

= √7.4596

= 2.731226831
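The same statistics can be computed from Table 2 with a few lines of Python; a sketch (results match the values above, up to floating-point rounding):

    from math import sqrt

    # Table 2: sum of the two numbers (x) mapped to its observed frequency (f).
    freq = {2: 3, 3: 8, 4: 6, 5: 7, 6: 9, 7: 3, 8: 2, 9: 4, 10: 4, 11: 3, 12: 1}

    n = sum(freq.values())                                   # 50 tosses
    mean = sum(x * f for x, f in freq.items()) / n           # 301/50 = 6.02
    var = sum(x * x * f for x, f in freq.items()) / n - mean ** 2
    print(mean, var, sqrt(var))   # 6.02, ~7.4596, ~2.7312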




b) Prediction of the new mean if the number of tosses is increased to 100: the mean should move closer to the theoretical mean of 7.

c) The prediction in (b) is tested by continuing the activity in (a) until the total number of tosses reaches 100. The values of the
(i) mean,
(ii) variance, and
(iii) standard deviation
of the new data are estimated below.

x       f       fx      fx²
2 6 12 24
3 9 27 81
4 11 44 176
5 12 60 300
6 13 78 468
7 10 70 490
8 7 56 448
9 12 108 972
10 7 70 700
11 6 66 726
12 7 84 1008
Σf = 100        Σfx = 675       Σfx² = 5393




mean = Σfx / Σf

= 675/100

= 6.75

variance = Σfx² / Σf - (mean)²

= 5393/100 - (6.75)²

= 8.3675

standard deviation = √variance

= √8.3675

= 2.892663133

The prediction is confirmed: the mean has moved from 6.02 to 6.75, closer to the theoretical mean of 7.
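The n = 100 computation follows the same pattern as before; a self-contained Python check of the values above:

    from math import sqrt

    # Frequencies after 100 tosses (x: f).
    freq = {2: 6, 3: 9, 4: 11, 5: 12, 6: 13, 7: 10, 8: 7, 9: 12, 10: 7, 11: 6, 12: 7}

    n = sum(freq.values())                                   # 100
    mean = sum(x * f for x, f in freq.items()) / n           # 675/100 = 6.75
    var = sum(x * x * f for x, f in freq.items()) / n - mean ** 2
    print(mean, var, sqrt(var))   # 6.75, ~8.3675, ~2.8927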




PART FIVE



When two dice are tossed simultaneously, the actual mean and variance of the sum of all dots on
the turned-up faces can be determined by using the formulae below:

a) Based on Table 1, the actual mean, variance and standard deviation of the sum of all dots on the turned-up faces are determined by using the formulae given.

x     x²     P(x)    x·P(x)    x²·P(x)
2     4      1/36    2/36      4/36
3     9      2/36    6/36      18/36
4     16     3/36    12/36     48/36
5     25     4/36    20/36     100/36
6     36     5/36    30/36     180/36
7     49     6/36    42/36     294/36
8     64     5/36    40/36     320/36
9     81     4/36    36/36     324/36
10    100    3/36    30/36     300/36
11    121    2/36    22/36     242/36
12    144    1/36    12/36     144/36

Σ x·P(x) = 252/36 = 7          Σ x²·P(x) = 1974/36 = 54.8333



Mean = Σ x·P(x) = 252/36 = 7

Variance = Σ x²·P(x) - (mean)² = 1974/36 - 7² = 54.8333 - 49 = 5.8333

Standard deviation = √5.8333 = 2.415229458
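These theoretical values can be checked exactly with rational arithmetic; a Python sketch using the triangular shape of the distribution, P(x) = (6 - |x - 7|)/36:

    from fractions import Fraction
    from math import sqrt

    # P(x) for the sum of two fair dice, x = 2, ..., 12.
    dist = {x: Fraction(6 - abs(x - 7), 36) for x in range(2, 13)}

    mean = sum(x * p for x, p in dist.items())               # 7
    var = sum(x * x * p for x, p in dist.items()) - mean ** 2
    print(mean, var, float(var), sqrt(var))  # 7, 35/6, 5.8333..., 2.4152...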

b)
The table below compares the mean, variance and standard deviation from Part 4 and Part 5.

                      PART 4 (n = 50)    PART 4 (n = 100)    PART 5 (theoretical)
Mean                  6.02               6.75                7.00
Variance              7.4596             8.3675              5.8333
Standard deviation    2.731226831        2.892663133         2.415229458

We can see that the mean, variance and standard deviation obtained through the experiment in Part 4 are different from, but reasonably close to, the theoretical values in Part 5.

For the mean, when the number of trials increased from n = 50 to n = 100, its value got closer (from 6.02 to 6.75) to the theoretical value of 7. This is in accordance with the Law of Large Numbers, which we discuss in the next section.

Nevertheless, the empirical variance and empirical standard deviation obtained in Part 4 moved further from the theoretical values in Part 5. This seems to go against the Law of Large Numbers, and is probably because:

a. the sample (n = 100) is not large enough for the values of the mean, variance and standard deviation to settle down; and
b. the Law of Large Numbers is not an absolute guarantee for a finite number of trials; deviations from it are still possible, though their probability is relatively low.

In conclusion, the empirical mean, variance and standard deviation can differ from the theoretical values. As the number of trials (the sample size) gets bigger, the empirical values should get closer to the theoretical values. However, deviations are still possible, especially when the number of trials (or the sample size) is not large enough.



c) The range of the mean

Conjecture: as the number of tosses, n, increases, the mean will get closer to 7, the theoretical mean, so the range within which the mean fluctuates narrows around 7.

The image below supports this conjecture: after about 500 tosses, the sample mean becomes very close to the theoretical mean, which is 3.5. (Note that the illustration is for tossing one die, not two dice as in our experiment.)
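A quick simulation illustrates the conjecture for two dice. The Python sketch below (an illustration with an arbitrary fixed seed for reproducibility) prints the sample mean for increasing numbers of tosses:

    import random

    random.seed(1)  # arbitrary seed, for a reproducible illustration
    for n in (50, 100, 1000, 10000):
        tosses = [random.randint(1, 6) + random.randint(1, 6) for _ in range(n)]
        print(n, sum(tosses) / n)  # sample mean drifts toward the theoretical 7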




FURTHER EXPLORATION


THE LAW OF LARGE NUMBERS
In probability theory, the law of large numbers (LLN) is a theorem that describes the result of
performing the same experiment a large number of times. According to the law, the average of
the results obtained from a large number of trials should be close to the expected value, and will
tend to become closer as more trials are performed.

For example, a single roll of a six-sided die produces one of the numbers 1, 2, 3, 4, 5, 6, each
with equal probability. Therefore, the expected value of a single die roll is


E(X) = (1 + 2 + 3 + 4 + 5 + 6) / 6 = 3.5

According to the law of large numbers, if a large number of dice are rolled, the average of their
values (sometimes called the sample mean) is likely to be close to 3.5, with the accuracy
increasing as more dice are rolled.

Similarly, when a fair coin is flipped once, the expected value of the number of heads is equal to one half. Therefore, according to the law of large numbers, the proportion of heads in a large number of coin flips should be roughly one half. In particular, the proportion of heads after n flips will almost surely converge to one half as n approaches infinity.

Though the proportion of heads (and tails) approaches one half, almost surely the absolute (nominal) difference between the number of heads and tails will become large as the number of flips becomes large. That is, the probability that the absolute difference is a small number approaches zero as the number of flips becomes large. Also, almost surely the ratio of the absolute difference to the number of flips will approach zero. Intuitively, the expected absolute difference grows, but at a slower rate than the number of flips.
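These claims about coin flips can be illustrated numerically; a minimal Python simulation (arbitrary seed, sample sizes chosen purely for illustration):

    import random

    random.seed(0)  # arbitrary seed, for a reproducible illustration
    for n in (100, 10_000, 1_000_000):
        heads = sum(random.random() < 0.5 for _ in range(n))
        tails = n - heads
        # The proportion tends to 1/2 while |heads - tails| tends to grow.
        print(n, heads / n, abs(heads - tails))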

The LLN is important because it "guarantees" stable long-term results for random events. For example, while a casino may lose money in a single spin of the roulette wheel, its earnings will tend towards a predictable percentage over a large number of spins. Any winning streak by a player will eventually be overcome by the parameters of the game. It is important to remember that the LLN applies (as the name indicates) only when a large number of observations is considered. There is no principle that a small number of observations will converge to the expected value or that a streak of one value will immediately be "balanced" by the others. See the Gambler's fallacy.




REFLECTION











CONCLUSION




As a conclusion, now I know:

- the concept of probability and how it expresses the chance that an event will occur;
- how to apply the theory of probability in daily life, such as in games;
- the two types of probability, empirical probability and theoretical probability, and the differences between them;
- the uses and advantages of probability in daily life;
- how to calculate the mean, variance and standard deviation using the formulae:

mean = Σfx / Σf

variance = Σfx² / Σf - (mean)²

standard deviation = √( Σfx² / Σf - (mean)² )

Besides that, I also picked up moral values from conducting this project work, and I have become more interested in probability and its uses in our daily life.
