
Random Variables and Expectations

Random Variable Introduction(1)

Before going into details, let's cover some basics about random variables:

1. It is not random.

2. It is not a variable; it is a function.

3. Its value is determined by the outcome of the experiment.

A random variable is a function that maps each outcome of the sample space to a number. Often we are not interested in the exact outcome of the experiment; rather, we are interested in some number that summarizes it.

Suppose we roll two dice and want to know in how many outcomes the values on the dice sum to 7.

We are not interested in listing the samples {(1,6), (2,5), (3,4), (4,3), (5,2), (6,1)}; we just want a number. Let's say Y is a function that counts such outcomes.

So Y = 6  // out of the 36 outcomes in the sample space formed by rolling two dice, 6 have 7 as their sum.

Now if we assign a probability, P(sum = 7) = 6/36 = 1/6: with probability 1/6 the two dice show values whose sum is 7.
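We can verify this count by enumerating the sample space in Python (a quick sketch, not part of the original solution):

```python
from itertools import product

# Enumerate the 36 equally likely outcomes of rolling two dice
outcomes = list(product(range(1, 7), repeat=2))

# Keep the outcomes whose values sum to 7
favourable = [o for o in outcomes if sum(o) == 7]

print(len(favourable))                   # 6
print(len(favourable) / len(outcomes))   # 6/36 = 1/6
```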


Example 1: Suppose we flip two coins and we are interested in how many heads appear. Let Y be a random variable that counts the number of heads in this experiment.

The possible values that Y can take are {0, 1, 2}. Why?

Because when you toss 2 coins, heads may appear 0 times, 1 time, or 2 times. The assignment of probabilities to the values of a random variable is called its probability distribution.

Now we can write it as

P(Y=0) = {TT} = 1/4 // the probability that 0 heads appear.

P(Y=1) ={HT,TH} = 2/4 = 1/2

P(Y=2) = {HH} = 1/4
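As a check, a short Python sketch that enumerates the four equally likely outcomes and builds the distribution of Y:

```python
from itertools import product

# The four equally likely outcomes of tossing two coins
sample_space = ["".join(o) for o in product("HT", repeat=2)]  # HH, HT, TH, TT

# P(Y = k) = (number of outcomes with exactly k heads) / 4
pmf = {k: sum(s.count("H") == k for s in sample_space) / 4 for k in (0, 1, 2)}

print(pmf)  # {0: 0.25, 1: 0.5, 2: 0.25}
```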

We will see more examples to get a better idea of random variables.


Random Variable Example(2)

Let us say we flip a coin repeatedly, where p is the probability of getting heads and (1 − p) is the probability of getting tails, and N is a random variable that counts the number of flips until we get heads for the first time.

What is the probability of getting the first head on the fourth flip?

With this question, we will understand the important concept of random variables. 

The possible values that N can take are {1, 2, 3, 4, 5, 6, ...}. Why?

If we get heads on the first flip, then N = 1.

If we get heads on the second flip, then N = 2.

If we get heads on the third flip, then N = 3, and so on.

We can assign probability as:

P(N=1) = {H} = p

P(N=2) = {TH} = (1-p)p

P(N=3) = {TTH} = (1-p)(1-p)p

P(N=4) = {TTTH} = (1-p)(1-p)(1-p)p  // the coin flips are independent events, so we can multiply the probabilities directly.
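This pattern (the geometric distribution) can be sketched in Python; here p = 0.5 is just an assumed value for illustration:

```python
def first_head_pmf(n, p):
    """P(N = n): n - 1 tails followed by one head (flips are independent)."""
    return (1 - p) ** (n - 1) * p

p = 0.5  # assumed probability of heads, for illustration
print(first_head_pmf(4, p))  # (1/2)^3 * (1/2) = 0.0625

# The probabilities over all n sum to 1:
print(sum(first_head_pmf(n, p) for n in range(1, 200)))
```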




Example on Variance (Sheldon Ross)

Calculate \(Var(X)\) if \(X\) represents the outcome when a fair die is rolled.

Things you need to know

\(Var(X) = E[X^2] - (E[X])^2\)


\(E[X] = \frac{1}{6}(1) + \frac{1}{6}(2) + \frac{1}{6}(3) + \frac{1}{6}(4) + \frac{1}{6}(5) + \frac{1}{6}(6) \\ \ \ \ \ \ \ \ \ \ = \frac{1}{6}(\frac{6*7}{2}) \\ \ \ \ \ \ \ \ \ \ = \frac{7}{2}\)


\(E[X^2] = 1^2(\frac{1}{6}) + 2^2(\frac{1}{6}) + 3^2(\frac{1}{6}) + 4^2(\frac{1}{6}) + 5^2(\frac{1}{6}) + 6^2(\frac{1}{6}) \\
\ \ \ \ \ \ \ \ \ \ \ = \frac{91}{6}\)

Hence, \(Var(X) = \frac{91}{6} - (\frac{7}{2})^2 = \frac{35}{12}\)
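The same computation in Python, using exact fractions to avoid rounding:

```python
from fractions import Fraction

faces = range(1, 7)
p = Fraction(1, 6)          # each face of a fair die is equally likely

E_X  = sum(p * x for x in faces)       # E[X]   = 7/2
E_X2 = sum(p * x * x for x in faces)   # E[X^2] = 91/6
var  = E_X2 - E_X ** 2

print(var)  # 35/12
```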

Example on Expectation

A school class of 120 students is driven in 3 buses to a symphonic performance. There are 36 students in one of the buses, 40 in another, and 44 in the third bus. When the buses arrive, one of the 120 students is randomly chosen. Let X denote the number of students on the bus of that randomly chosen student, and find E[X].

Since the randomly chosen student is equally likely to be any of the 120 students, it follows that:

 \(P \{ X = 36 \} = \frac{36}{120} \\
P \{ X = 40 \} = \frac{40}{120} \\
P \{ X = 44\} = \frac{44}{120}\)


\(E[X] = 36 (\frac{3}{10}) + 40(\frac{1}{3}) + 44(\frac{11}{30}) = \frac{1208}{30} \approx 40.2667\)
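A quick check with exact fractions (note E[X] exceeds the plain average 120/3 = 40, because a random student is more likely to sit in a fuller bus):

```python
from fractions import Fraction

sizes = [36, 40, 44]
total = sum(sizes)  # 120 students

# P(X = s) = s/120, so E[X] = sum of s * (s/120)
E_X = sum(Fraction(s, total) * s for s in sizes)

print(E_X, float(E_X))  # 604/15 (= 1208/30) ≈ 40.2667
```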

Example on Random Variable (Sheldon Ross)

Three balls are to be randomly selected without replacement from an urn containing 20 balls numbered 1 through 20. If we bet that at least one of the balls that are drawn has a number as large as or larger than 17, what is the probability that we win the bet?

Let X denote the largest number selected. Then X is a random variable taking on one of the values 3, 4, ..., 20. Furthermore, if we suppose that each of the \(\binom{20}{3}\) possible selections is equally likely to occur, then

\(P(X = i) = \frac{ \binom{i-1}{2} }{ \binom{20}{3}}, \quad i = 3, \dots, 20\)


\(P(X = 20) = \frac{ \binom{19}{2} }{ \binom{20}{3}} = \frac{3}{20} = .150 \\ P(X = 19) = \frac{ \binom{18}{2} }{ \binom{20}{3}} = \frac{51}{380} \approx .134 \\ P(X = 18) = \frac{ \binom{17}{2} }{ \binom{20}{3}} = \frac{34}{285} \approx .119 \\ P(X = 17) = \frac{ \binom{16}{2} }{ \binom{20}{3}} = \frac{2}{19} \approx .105\)

Hence, since the event \(\{ X \geq 17 \} \) is the union of the disjoint events \(\{X = i \} , \ \ \ \ i = 17, 18 , 19, 20\) 

It follows that the probability of our winning the bet is given by:

\(P\{ X \geq 17\} \approx .105 + .119 + .134 + .150 = .508\)
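The whole distribution can be computed directly:

```python
from math import comb
from fractions import Fraction

total = comb(20, 3)  # 1140 equally likely selections

# P(X = i): ball i is drawn and the other two come from the i-1 smaller numbers
pmf = {i: Fraction(comb(i - 1, 2), total) for i in range(3, 21)}

win = sum(pmf[i] for i in range(17, 21))
print(float(win))  # ≈ 0.508
```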


Types of Random Variables and Bernoulli Random Variable

We have two types of random variables:

1. Discrete random variable.

2. Continuous random variable.

1. Discrete-valued Random Variable:-

Here the random variable takes values from a countable set. Example:-

Let's say we toss two coins and Y is a random variable that counts how many heads appear. It takes one of the fixed values {0, 1, 2}.

2. Continuous Random Variable:-

Instead of a countable set of values, a continuous random variable can take uncountably many values. Ex: the amount of rain, in inches, that falls in a randomly selected storm.


1. Discrete Random Variable:-

Commonly studied types include:

  • Bernoulli Random Variable
  • Binomial Random Variable
  • Poisson Random Variable

1. Bernoulli Random Variable:-

Y is a random variable whose value can be either 0 or 1 (equivalently, false or true). It can take only two values.

If we assign probabilities to these values,

p(0) = P{Y = 0} = 1 − p,
p(1) = P{Y = 1} = p

where p, 0 <= p <= 1, is the probability that the trial is a "success".
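A Bernoulli trial is easy to simulate; here p = 0.3 is an assumed value and the seed is fixed only to make the run repeatable:

```python
import random

random.seed(0)  # fixed seed for a repeatable run
p = 0.3         # assumed success probability, for illustration

def bernoulli(p):
    """One Bernoulli trial: 1 (success) with probability p, else 0."""
    return 1 if random.random() < p else 0

samples = [bernoulli(p) for _ in range(100_000)]
print(sum(samples) / len(samples))  # close to p = 0.3
```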



Binomial Random Variable Introduction with an example

Informally, a binomial random variable can be defined as the sum of the values of n independent Bernoulli random variables, i.e., a collection of Bernoulli trials.

Suppose that n independent trials, each of which results in a “success” with probability p and in a “failure” with probability 1 − p, are to be performed. If X
represents the number of successes that occur in the n trials, then X is said to be
a binomial random variable with parameters (n,p).

\(P(i) = \binom{n}{i}p^{i}(1-p)^{n-i}\)

Here i = the number of times our desired outcome occurs, i = 0, 1, 2, ..., n.


Four coins are tossed. What is the probability of getting two heads and two tails?


Here i = 2, n = 4, and

the probability of getting heads or tails = 1/2,

so p = 1/2 and 1 − p = 1/2.

Putting these into the formula:

\(P(i=2) = \binom{4}{2}(\frac{1}{2})^{2}(\frac{1}{2})^{2} = \frac{3}{8}\)
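The same calculation with the binomial formula in Python:

```python
from math import comb
from fractions import Fraction

def binomial_pmf(i, n, p):
    """P(i successes in n independent trials with success probability p)."""
    return comb(n, i) * p ** i * (1 - p) ** (n - i)

# Four fair coins, exactly two heads (and hence two tails):
print(binomial_pmf(2, 4, Fraction(1, 2)))  # 3/8
```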


Binomial Distribution Example(2)

The probability of a man hitting a target is 1/3.

How many times must he fire so that the probability of hitting the target at least once is more than 90%?


From the question, we can conclude that p = 1/3 and 1-p = 2/3,

and "probability of hitting the target at least once" means P(i>=1).

P(i>=1) = 1 - P(i=0) > 0.9

1 - (n c 0) (1/3)^0 (2/3)^n > 0.9

(2/3)^n < 1 - 0.9

(2/3)^n < 0.1

On solving, the smallest such n is 6, since (2/3)^6 ≈ 0.088 < 0.1 while (2/3)^5 ≈ 0.132 > 0.1.
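The same search can be done numerically:

```python
def smallest_n(p, target):
    """Smallest n with P(at least one success in n trials) > target."""
    n = 1
    while 1 - (1 - p) ** n <= target:
        n += 1
    return n

print(smallest_n(1/3, 0.9))  # 6
```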



Poisson Distribution with Example

A random variable Y is said to be a Poisson random variable with parameter λ if Y takes values 0, 1, 2, 3, ... with

\(P(Y = i) = \frac{e^{-\lambda}\,\lambda^{i}}{i!}\)

A Poisson random variable is used to approximate a binomial when n is large and p is very small, with λ = np.


1. The number of accidents occurring on a highway each day is a Poisson random variable with parameter 3.

What is the probability that no accident occurs?

They are clearly saying "no accident", which means i = 0.

So, using the above formula:

p(i=0) = e^(-3) (3)^0 / 0!

            = e^(-3), or

            1/e^3

Remember 0! = 1.
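The Poisson pmf in Python, applied to the accident example:

```python
from math import exp, factorial

def poisson_pmf(i, lam):
    """P(Y = i) = e^(-lam) * lam^i / i! for a Poisson(lam) random variable."""
    return exp(-lam) * lam ** i / factorial(i)

print(poisson_pmf(0, 3))  # e^(-3) ≈ 0.0498
```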


Some Important formula on Random Variable

Hi, I am writing here some important formulas which will be very useful when you solve GATE questions. I am not giving you the derivations because derivations are not important for GATE. If you are interested in the derivations, please refer to the book by Sheldon Ross.


1. Discrete random variable:

  1. Mean \(\mu = \sum x \cdot f(x)\)
  2. Variance \(\sigma^{2} = \sum x^{2} f(x) - \left(\sum x \cdot f(x)\right)^{2}\)
  3. S.D. \(= \sqrt{\text{variance}}\)

2. Binomial Distribution:-

  • Mean \(\mu = np\)
  • Variance \(\sigma^{2} = npq\)
  • S.D. \(= \sqrt{npq}\)


3. Poisson distribution:-

  • \(P(Y=i) = e^{-\lambda}\lambda^{i}/i!\)
  • Variance \(= \lambda\)
  • \(\lambda = np\)   // in case n is very large and p is very small
  • mean = variance
  • \(V(aX \pm bY) = a^{2}V(X) + b^{2}V(Y)\)   // for independent X and Y; the variances always add
  • Variance can't be negative.


  • n = number of trials
  • p = probability of success
  • q = probability of failure = 1 - p
  • e ≈ 2.71828
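The binomial mean and variance formulas can be checked numerically against the definitions above (n = 10 and p = 0.3 are arbitrary illustration values):

```python
from math import comb

n, p = 10, 0.3  # arbitrary values for the check
pmf = [comb(n, i) * p ** i * (1 - p) ** (n - i) for i in range(n + 1)]

mean = sum(i * pi for i, pi in enumerate(pmf))
var  = sum(i * i * pi for i, pi in enumerate(pmf)) - mean ** 2

print(round(mean, 6), round(var, 6))  # np = 3.0, npq = 2.1
```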




