Day 3 of the Statistics Tutorial on HackerRank was again devoted to probability. In general, I felt the exercises on this day were a bit weak, in that they did not require most of the material covered in the tutorials and could be solved more easily by other means.

The first problem was ostensibly about conditional probabilities. The setup was simple:


Suppose a family has 2 children, one of which is a boy. What is the probability that both children are boys?

To be honest, I always find these questions counter-intuitive or a bit of a trap. It’s easy to reach for the naive answer to a question such as this: the probability of one of the children being a boy is independent of what the other child is, so the answer should be 1/2.

And that is wrong. It’s wrong because in this case we already have partial information, namely that “one of the two is a boy”. That restricts our universe.

Let’s assume for simplicity that the probability of a child being born a boy or a girl is truly 1/2. Then, a priori, we have 4 cases, each with 1/4 probability. We’d have a uniform distribution over 4 possible arrangements: BB, BG, GB, GG (B=Boy, G=Girl).

If the question were “what is the probability that both children are boys?” the answer would be: 1 favourable case over 4 possible cases.

But in here, the extra advice eliminates one case: GG, and since the distribution was uniform, all the remaining cases are equally likely. The number of favourable cases is still 1, but the possible cases are now only 3, and by our basic principle of probability, the answer must be 1/3.
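The counting argument above can be checked with a few lines of Python (my own sketch, not part of the exercise): enumerate the four equally likely arrangements, discard GG, and count.

```python
from itertools import product

# All equally likely arrangements of two children: BB, BG, GB, GG.
space = list(product("BG", repeat=2))

# Conditioning on "at least one boy" eliminates GG.
possible = [s for s in space if "B" in s]
favourable = [s for s in possible if s == ("B", "B")]

print(len(favourable), "/", len(possible))  # 1 / 3
```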

That is the solution reasoned from first principles. Let’s now derive it algebraically, making use of the definition of conditional probability.

First, I define events:

A – both children are boys

B – at least one of the children is a boy

We want the probability of A given B, which can be written as

\Pr[A|B] = \Pr[A \cap B] / \Pr[B]

Now, if event A happens, B must surely happen, so A \cap B must necessarily be the same as A. We can show this by associating each event with its support set:

A = \{BB\},\,B = \{GB, BG, BB\}

A \cap B = A.


Therefore,

\Pr[A|B] = \Pr[A] / \Pr[B] = \frac{1/4}{3/4} = \frac{1}{3}.
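If the algebra still feels suspicious, a quick Monte Carlo simulation (again, my own sketch, not from the tutorial) estimates the same conditional probability empirically:

```python
import random

random.seed(1)
trials = 200_000
b_count = 0  # trials where "at least one boy" holds
a_count = 0  # of those, trials where both are boys

for _ in range(trials):
    kids = (random.choice("BG"), random.choice("BG"))
    if "B" in kids:
        b_count += 1
        if kids == ("B", "B"):
            a_count += 1

estimate = a_count / b_count
print(round(estimate, 3))  # close to 1/3
```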



The second problem was about cards:

You draw 2 cards from a standard 52-card deck without replacing them. What is the probability that both cards are of the same suit?

This problem introduced another concept: counting (permutations, combinations, arrangements). That makes it possible to still reason with the basic principle in mind, although other solutions are possible. First, let’s see the counting method.

Let’s count favourable cases and total cases. We look at all possible pairs of drawn cards, taking into consideration that order does not matter. We only draw two cards, so we have 52 \times 51 / 2 possibilities.

Now we need to count how many of these pairs are considered successes. All the suits have the same number of cards, so if we count the number of pairs of one suit and multiply by 4 we will have the total number of successful cases.

Fix one suit. The number of pairs within that suit is the number of combinations of 13 cards taken 2 at a time. This is \binom{13}{2} = 13 \times 12 / 2.

The final solution is, as usual, the number of favourable cases over the total number of cases:

\frac{4 \times 13 \times 12 / 2}{52 \times 51 / 2} = \frac{12}{51}.
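The counting can be verified with Python’s standard library (a sketch of my own, using `math.comb` for the binomial coefficients):

```python
from math import comb
from fractions import Fraction

favourable = 4 * comb(13, 2)  # 4 suits x pairs within one suit = 4 x 78 = 312
total = comb(52, 2)           # all unordered pairs of cards = 1326

print(Fraction(favourable, total))  # 4/17, which equals 12/51
```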

But there is an easier way to reason. Intuitively, assume you have drawn the first card. No matter what suit it is, for you to win the game you must draw a second card of the same suit. There are only 12 cards of that suit left in the deck, among a total of 51 cards. Your success probability is \frac{12}{51}.

We can make this reasoning precise with some algebra, if we really must.


Define the events:

A – first card is of suit X (any one given suit)

B – second card is of suit X

We want A and B to happen simultaneously, that is, both cards will be of the same suit:

\Pr[A \cap B] = \Pr[B | A] \times\Pr[A].

Now, let’s reason individually.

Fix some X. Then,

\Pr[A] = 1/4 since, at the start, every suit has an equal number of cards in the deck.

\Pr[B|A] = 12/51 following exactly the same intuitive reasoning given above.

Also, because the four suit events are mutually exclusive and this holds for any of them, we sum over the 4 suits. That is:

\Pr[\text{success}] = \sum_{X \in \{H,D,C,S\}} \frac{1}{4} \times \frac{12}{51} = 4 \times \frac{1}{4} \times \frac{12}{51} = \frac{12}{51}.
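A simulation of the draw itself (my own sketch) agrees with the intuitive argument: draw two cards without replacement and count how often the suits match.

```python
import random

random.seed(7)
deck = [(rank, suit) for suit in "HDCS" for rank in range(1, 14)]

trials = 200_000
hits = 0
for _ in range(trials):
    first, second = random.sample(deck, 2)  # draw without replacement
    if first[1] == second[1]:
        hits += 1

print(round(hits / trials, 3))  # close to 12/51, about 0.235
```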


The third task tried to put together both concepts: permutations and conditional probability. But again, it did not quite require them.


A bag contains 3 red marbles and 4 blue marbles. Then, 2 marbles are drawn from the bag, at random, without replacement. If the first marble drawn is red, what is the probability that the second marble is blue?

I like this kind of problem. It’s fun to reason about the number of different colours you have, which one you pick first, and how many of each remain. These problems come up frequently in several homelier varieties, like socks in a drawer.

But let’s see then how to solve this particular one. Can you reason intuitively? Can you see what to do?

Stop and think about what your universe looks like after the given event has happened. You start with 3 Reds and 4 Blues, and then you take a Red marble. Now, your bag contains:

  • 2 Reds
  • 4 Blues

How likely is it now you are going to take a blue marble?

This is first principles again, and quite a simple application at that. You have 4 Blue marbles in a bag of 6 marbles, so your probability is 4/6 = 2/3.
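As with the cards, a quick simulation (my own sketch) confirms the first-principles answer: draw two marbles without replacement and condition on the first being Red.

```python
import random

random.seed(3)
bag = ["R"] * 3 + ["B"] * 4  # 3 red and 4 blue marbles

trials = 200_000
first_red = 0
then_blue = 0
for _ in range(trials):
    first, second = random.sample(bag, 2)  # draw without replacement
    if first == "R":
        first_red += 1
        if second == "B":
            then_blue += 1

print(round(then_blue / first_red, 3))  # close to 2/3
```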

If you ask me, it’s a terrible way of testing your knowledge of permutations, conditional probabilities or Bayes’ theorem (yes, it’s in the tutorial, and it’s extremely important for Machine Learning, but it is not required at all by the problems). Any other solution would be over-complicating, and that is NOT what you should do in maths, where you should always look for the easiest way to reach a solution. I remember my high-school maths teacher saying “Mathematicians are lazy, they are always looking for ways to simplify things”. That is a good practice, and you won’t do badly if you follow it, as long as you don’t exaggerate.


But suppose you really want to solve this in a way that makes use of permutations and conditional probability all at once. How could you do it?

Again, and always, apply the basic principle. How many favourable cases and possible cases are there?

I first define a couple of events here:

A – The second marble is Blue

B – The first marble is Red

We want

\Pr[A|B] = \Pr[A \cap B] / \Pr[B]

To compute \Pr[B], compute the total and favourable cases:

Total cases: we draw two marbles and the order matters. We have 7 possibilities for the first marble and 6 for the second, for a total of 42.

Favourable cases: we have 3 possibilities for the first marble (only Red marbles), but then we can draw any of the remaining six, for a total of 18.

\Pr[B] = \frac{18}{42} = \frac{3}{7}

To compute \Pr[A \cap B], we consider both events at once: we have 3 possibilities for the first marble and 4 for the second, for a total of 12 cases. Again dividing by the total number of cases we have

\Pr[A \cap B] = \frac{12}{42} = \frac{2}{7}

Then, this gives:

\Pr[A|B] = \Pr[A \cap B] / \Pr[B] = \frac{12}{18} = \frac{2}{3}
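The whole derivation fits in a few lines of exact arithmetic with `fractions.Fraction` (my own sketch), mirroring the case counts above:

```python
from fractions import Fraction

total = 7 * 6      # ordered draws of two marbles: 42
b_cases = 3 * 6    # first marble Red, second anything: 18
ab_cases = 3 * 4   # first marble Red, second Blue: 12

pr_b = Fraction(b_cases, total)    # 3/7
pr_ab = Fraction(ab_cases, total)  # 2/7

print(pr_ab / pr_b)  # 2/3
```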

This covers Day 3 of the Statistics tutorial. The remaining days were all about coding, so stay tuned for a return to Python in the next installment of this series.

