##### Some Important Formulas on Random Variables

Hi, I am writing here some important formulas that will be very useful when you solve GATE questions. I am not giving you the derivations because derivations are not important for GATE. If you are interested in derivations, please refer to a book by Sheldon Ross.

1. Discrete random variable:

• Mean: E[X] = Σ x · p(x)
• Variance: Var(X) = E[X²] − (E[X])²
• S.D.: σ = √Var(X)

2. Binomial distribution:

• Mean = np
• Variance = npq
• S.D. = √(npq)

3. Poisson distribution:

• Mean = λ = np // in case n is very large and p is very small
• Variance = λ
• mean = variance
• Variance can't be negative.

Terms:

• n = number of trials
• p = probability of success
• q = probability of failure = 1 − p
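The mean/variance/S.D. formulas above can be checked with a short Python sketch; the fair-die PMF here is just an illustrative example, not from the notes:

```python
import math

def mean_var_sd(pmf):
    """pmf: dict mapping value -> probability. Returns (mean, variance, sd)."""
    mean = sum(x * p for x, p in pmf.items())
    e_x2 = sum(x * x * p for x, p in pmf.items())
    var = e_x2 - mean ** 2          # Var(X) = E[X^2] - (E[X])^2
    return mean, var, math.sqrt(var)

# Example: a fair die, each face 1..6 with probability 1/6
mean, var, sd = mean_var_sd({x: 1/6 for x in range(1, 7)})
print(mean)  # 3.5
```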
##### Poisson Distribution with Example

A random variable Y is said to be a Poisson random variable if Y takes values 0, 1, 2, 3, … with λ as a parameter, and its distribution is

P(Y = i) = (e^(-λ) · λ^i) / i!,   i = 0, 1, 2, …

A Poisson random variable is used to approximate a binomial when the value of n is large and p is very small,

with λ = np.

Example:-

1. The number of accidents occurring on a highway each day is a Poisson random variable with parameter λ = 3.

What is the probability that no accident occurred?

Solution:

They are clearly saying "no accident", which means i = 0.

So using the above formula :

P(Y = 0) = (e^(-3) · 3^0) / 0!

= e^(-3), or

1 / e^3

Remember: 0! = 1.
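The whole calculation can be sketched in Python (a quick check, not part of the original solution):

```python
import math

def poisson_pmf(lam, i):
    """P(Y = i) = e^(-lam) * lam^i / i! for a Poisson random variable."""
    return math.exp(-lam) * lam ** i / math.factorial(i)

# The accident example: lam = 3, "no accident" means i = 0
print(poisson_pmf(3, 0))  # e^-3 ≈ 0.0498
```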

##### Binomial Random Variable Introduction with an example

Informally, if I have to define a binomial random variable, I will define it as the sum of the values of n Bernoulli random variables, or a collection of Bernoulli trials.

Suppose that n independent trials, each of which results in a “success” with probability p and in a “failure” with probability 1 − p, are to be performed. If X
represents the number of successes that occur in the n trials, then X is said to be
a binomial random variable with parameters (n, p), with

P(X = i) = C(n, i) · p^i · (1 − p)^(n − i)

Here i = the number of times our desired outcome has occurred, and i = 0, 1, 2, …, n.

Example:-

Four coins are tossed. What is the probability of getting two heads and two tails?

Solution:-

Here i = 2, n = 4, and

the probability of getting a head (or a tail) on one toss = 1/2.

So p = 1/2, 1 − p = 1/2.

Putting these in the formula:

P(X = 2) = C(4, 2) · (1/2)^2 · (1/2)^2 = 6/16 = 3/8
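The binomial formula is easy to verify with a short Python sketch:

```python
import math

def binomial_pmf(n, p, i):
    """P(X = i) = C(n, i) * p^i * (1 - p)^(n - i)."""
    return math.comb(n, i) * p ** i * (1 - p) ** (n - i)

# Four coins, exactly two heads
print(binomial_pmf(4, 0.5, 2))  # 0.375, i.e. 3/8
```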

##### Types of Random Variables and Bernoulli Random Variable

We have two types of random variables:

1. Discrete (fixed-value) random variable.

2. Continuous random variable.

1. Discrete-valued Random Variable:-

Here the value that a random variable gives us comes from a fixed, countable set. Example:-

Let's say we toss two coins and Y is a random variable that counts how many times a head appears. It gives us a fixed number from {0, 1, 2}.

2. Continuous random variable:-

Instead of taking one of a fixed set of values, it can take uncountably (infinitely) many values. Ex: The amount of rain, in inches, that falls in a randomly selected storm.

1. Discrete Random Variable:-

They are of three types:

• Bernoulli Random Variable:-
• Binomial Random Variable.
• Poisson Random Variable.

1. Bernoulli Random Variable:-

Y is a random variable whose value can be either 0 or 1 (e.g. false or true). It can take only two values.

If we assign probabilities to this,

p(0) = P{Y = 0} = 1 − p,
p(1) = P{Y = 1} = p

where p, 0 <= p <= 1, is the probability that the trial is a “success”.
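A Bernoulli random variable is simple enough to write down directly; here is a minimal sketch (the value p = 0.3 is just an example, not from the notes):

```python
def bernoulli_pmf(p, y):
    """P(Y = y): p for success (y = 1), 1 - p for failure (y = 0)."""
    if y not in (0, 1):
        raise ValueError("a Bernoulli variable takes only the values 0 and 1")
    return p if y == 1 else 1 - p

p = 0.3
print(bernoulli_pmf(p, 1), bernoulli_pmf(p, 0))
```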

##### Random Variable Introduction (1)

Before going into details let's know some basics about random variable:-

1. It is not random.

2. It is not a variable, it is a function.

3. The value of a random variable is determined by the outcome of the experiment.

A random variable is a function that takes an outcome of the sample space and gives us some number. See, we are not interested in knowing the exact outcome of the experiment; rather, we are interested in some number that summarises that experiment.

Suppose we are rolling two dice and I want to know in how many outcomes the values on the dice have 7 as their sum.

Here we are not interested in listing the samples, like {(2,5), (3,4), (5,2), (1,6), (6,1), (4,3)}. We just want to know a number. Let's say Y is a function that counts such outcomes.

So Y = 6 // it means that out of the total sample space formed by rolling two dice (36 outcomes), 6 have 7 as their sum.

Now if we assign a probability to this event, P(sum = 7) = 6/36 = 1/6. It means that with probability 1/6 we get two values whose sum is 7.
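We can enumerate the sample space to confirm the count, as in this small sketch:

```python
from itertools import product

# All 36 equally likely outcomes of rolling two dice
outcomes = list(product(range(1, 7), repeat=2))
favourable = [o for o in outcomes if sum(o) == 7]

print(len(favourable), len(outcomes))   # 6 36
print(len(favourable) / len(outcomes))  # 0.1666..., i.e. 1/6
```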

Example 1: Let's say we are flipping two coins and we are interested in knowing how many times a head appears. Assume Y is a random variable that counts the number of heads in this experiment.

The possible values that Y can take are {0, 1, 2}. Why?

Because when you toss 2 coins, a head may appear 0 times, 1 time, or 2 times. If you assign probabilities to the values of a random variable, it is called a probability distribution.

Now we can write it as

P(Y = 0) = P({TT}) = 1/4 // i.e. the probability that 0 heads appear

P(Y = 1) = P({HT, TH}) = 2/4 = 1/2

P(Y = 2) = P({HH}) = 1/4
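The same distribution can be derived by brute force over the four equally likely outcomes:

```python
from itertools import product

tosses = list(product("HT", repeat=2))  # HH, HT, TH, TT
dist = {}
for t in tosses:
    heads = t.count("H")
    dist[heads] = dist.get(heads, 0) + 1 / len(tosses)

# dist[0] = 0.25, dist[1] = 0.5, dist[2] = 0.25
print(sorted(dist.items()))
```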

We will see more examples to get a better idea about random variables.

##### Bayes' Theorem Introduction

let's say we have two bags and each bag contains some red balls and some green balls. One ball is randomly chosen and we are asked to find the probability that the chosen ball is red.

It is very easy to answer this question: we apply the total probability theorem and we get our answer.

But what if it is already mentioned that a red ball was chosen, and we are asked to find the probability that the chosen ball is from the first bag?

let's understand this with an example:

1. Let us say we have three bags, A, B, and C, each containing some red and green balls.

Question 1: What is the probability of choosing a red ball?

Question 2: What is the probability that the chosen red ball is from bag A?

Solution 1: In this, they are simply asking to find out how we are going to pick a red ball...

let's say the probability of selecting any bag is 1/3

So P(R) = P(A and R) + P(B and R) + P(C and R) ................ (1)

for Question 2:

They have already mentioned that a red ball is chosen already and we have to find out the probability that it is chosen from bag  'A'.

It can be written as P( A/R ).

P(A/R) = P(A ∩ R) / P(R)

= P(A ∩ R) / { P(A and R) + P(B and R) + P(C and R) } ....... putting the value of P(R) from equation (1)

The formal definition of Bayes' theorem: for events A1, A2, …, An that partition the sample space, and any event R with P(R) > 0,

P(Ai / R) = P(R / Ai) · P(Ai) / { P(R / A1) · P(A1) + … + P(R / An) · P(An) }

The derivation is exactly the conditional-probability step shown above. Now we will solve some numericals based on Bayes' theorem.
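Here is how the computation looks in Python. The bag compositions below are hypothetical (the notes do not give the actual contents); only the structure of the calculation matters:

```python
from fractions import Fraction

# Hypothetical setup: three bags chosen with equal probability,
# with assumed chances of drawing a red ball from each bag.
prior = {"A": Fraction(1, 3), "B": Fraction(1, 3), "C": Fraction(1, 3)}
red_given_bag = {"A": Fraction(2, 5), "B": Fraction(3, 5), "C": Fraction(1, 5)}

# Total probability: P(R) = sum over bags of P(bag) * P(R / bag)
p_red = sum(prior[b] * red_given_bag[b] for b in prior)

# Bayes' theorem: P(A / R) = P(A) * P(R / A) / P(R)
p_A_given_red = prior["A"] * red_given_bag["A"] / p_red

print(p_red, p_A_given_red)  # 2/5 1/3
```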

##### Total Probability (1)

It is an application of conditional probability and dependent events.

We have already discussed conditional probability and independent events.

Dependent Events :

Events that depend on each other; here the sequence in which the events happen is important. Example:

Let us say a bag has 4 green balls and 7 red balls and we are pulling two balls one after the other.

What is the probability that both balls are Red?

Case 1: Replacement is allowed :

Solution:

for the first ball: P(red1) = 7 /11

for the second ball :

Since we are putting the first ball back, the total number of balls is again 11.

So P(red2) = 7/11

So P(red1 and red2) = (7/11) *(7/11) = 49/121

Case 2: No replacement:

Here we are not putting the first ball back in the bag.

P(red1 ∩ red 2 )=  P(red1) * P(red2 / red1)

= (7/11) * (6/10)

= 42/110 = 21/55

This is the case of dependent events because when you are trying to pull the second ball it depends on the first event.
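Both cases can be checked with exact fractions, as in this minimal sketch:

```python
from fractions import Fraction

# Case 1: with replacement, the two draws are independent
p_with = Fraction(7, 11) * Fraction(7, 11)
print(p_with)     # 49/121

# Case 2: without replacement, P(red2 / red1) = 6/10
p_without = Fraction(7, 11) * Fraction(6, 10)
print(p_without)  # 21/55 (= 42/110 reduced)
```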

Total Probability example 1:-

Let's say we have two bags: bag 1 contains 4 red and 3 green balls, and bag 2 contains 5 red and 6 green balls. Find the probability of drawing a red ball.

Solution:- Now drawing a red ball depends on two factors:

1. First, we should know from which bag that ball is drawn and what is the probability of choosing that bag?

2. Composition of that chosen bag?

Let's say the probability of choosing each bag is 1/2

So the probability of drawing a red ball  P(red) given as:

P(red) = (Bag 1 is selected and a red ball is drawn) OR (Bag 2 is selected and a red ball is drawn)

= { P(Bag 1) · P(red / Bag 1) } + { P(Bag 2) · P(red / Bag 2) }

= (1/2)(4/7) + (1/2)(5/11)

= (4/14) + (5/22) = 79/154
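We can finish the arithmetic with exact fractions (a quick sketch):

```python
from fractions import Fraction

p_bag = Fraction(1, 2)  # each bag chosen with probability 1/2
p_red = p_bag * Fraction(4, 7) + p_bag * Fraction(5, 11)

print(p_red)  # 79/154, roughly 0.513
```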

##### Some Properties of Independent Events

If A and B are Independent events then:

1. A and Bᶜ are also independent.

2. Aᶜ and B are also independent.

3. Aᶜ and Bᶜ are also independent.

I will prove one; please try to prove the remaining on your own.

let's prove:

3. Aᶜ and Bᶜ are also independent.

If Aᶜ and Bᶜ are independent, then we must have

P(Aᶜ ∩ Bᶜ) = P(Aᶜ) · P(Bᶜ)

Now Aᶜ ∩ Bᶜ = (A ∪ B)ᶜ, so

P(Aᶜ ∩ Bᶜ) = 1 − P(A ∪ B)

= 1 − { P(A) + P(B) − P(A ∩ B) }

= 1 − { P(A) + P(B) − P(A) · P(B) } // A and B are independent

= { 1 − P(A) } − P(B) { 1 − P(A) }

= { 1 − P(A) } { 1 − P(B) }

= P(Aᶜ) · P(Bᶜ)

Hence proved.
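We can also sanity-check the property on a concrete sample space, two fair coin tosses, with a small sketch:

```python
from itertools import product

# Sample space of two fair coin tosses, all outcomes equally likely
omega = set(product("HT", repeat=2))
A = {o for o in omega if o[0] == "H"}  # first toss is a head
B = {o for o in omega if o[1] == "H"}  # second toss is a head

def P(event):
    return len(event) / len(omega)

Ac, Bc = omega - A, omega - B
print(P(A & B) == P(A) * P(B))      # True: A and B are independent
print(P(Ac & Bc) == P(Ac) * P(Bc))  # True: so are their complements
```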

##### Independent Events (Theory)

Two events are said to be independent if the happening of one event doesn't affect the other one.

Let two events A and B be independent events; then:

P(A ∩ B) = P(A) · P(B)

Proof:

From the definition of conditional probability, we already know that

P(A/B) = P(A ∩ B) / P(B) .....................(1)

Since the events are independent, P(A/B) = P(A), because the happening of A does not depend on the happening of B.

Replacing P(A/B) with P(A) in equation (1):

P(A) · P(B) = P(A ∩ B)

If this definition holds for any number of events, we can say that they are independent events.

Please remember one thing: independent events are not mutually exclusive events, because in mutually exclusive events A ∩ B = Ø,

so P(A ∩ B) = 0, which cannot equal P(A) · P(B) unless one of the events has probability 0.