
Probability Interview Questions

This document provides a curated list of common probability interview questions frequently asked in technical interviews. It covers basic probability concepts, probability distributions, key theorems, and real-world applications. Use the practice links to explore detailed explanations and examples.


Sno | Question Title | Practice Links | Companies Asking | Difficulty | Topics
1 | Basic Probability Concepts: Definitions of Sample Space, Event, Outcome | Wikipedia: Probability | Google, Amazon, Microsoft | Easy | Fundamental Concepts
2 | Conditional Probability and Independence | Khan Academy: Conditional Probability | Google, Facebook, Amazon | Medium | Conditional Probability, Independence
3 | Bayes’ Theorem: Statement and Application | Wikipedia: Bayes' Theorem | Google, Amazon, Microsoft | Medium | Bayesian Inference
4 | Law of Total Probability | Wikipedia: Law of Total Probability | Google, Facebook | Medium | Theoretical Probability
5 | Expected Value and Variance | Khan Academy: Expected Value | Google, Amazon, Facebook | Medium | Random Variables, Moments
6 | Probability Distributions: Discrete vs. Continuous | Wikipedia: Probability Distribution | Google, Amazon, Microsoft | Easy | Distributions
7 | Binomial Distribution: Definition and Applications | Khan Academy: Binomial Distribution | Amazon, Facebook | Medium | Discrete Distributions
8 | Poisson Distribution: Characteristics and Uses | Wikipedia: Poisson Distribution | Google, Amazon | Medium | Discrete Distributions
9 | Exponential Distribution: Properties and Applications | Wikipedia: Exponential Distribution | Google, Amazon | Medium | Continuous Distributions
10 | Normal Distribution and the Central Limit Theorem | Khan Academy: Normal Distribution | Google, Microsoft, Facebook | Medium | Continuous Distributions, CLT
11 | Law of Large Numbers | Wikipedia: Law of Large Numbers | Google, Amazon | Medium | Statistical Convergence
12 | Covariance and Correlation: Definitions and Differences | Khan Academy: Covariance and Correlation | Google, Facebook | Medium | Statistics, Dependency
13 | Moment Generating Functions (MGFs) | Wikipedia: Moment-generating function | Amazon, Microsoft | Hard | Random Variables, Advanced Concepts
14 | Markov Chains: Basics and Applications | Wikipedia: Markov chain | Google, Amazon, Facebook | Hard | Stochastic Processes
15 | Introduction to Stochastic Processes | Wikipedia: Stochastic process | Google, Microsoft | Hard | Advanced Probability
16 | Difference Between Independent and Mutually Exclusive Events | Wikipedia: Independent events | Google, Facebook | Easy | Fundamental Concepts
17 | Geometric Distribution: Concept and Use Cases | Wikipedia: Geometric distribution | Amazon, Microsoft | Medium | Discrete Distributions
18 | Hypergeometric Distribution: When to Use It | Wikipedia: Hypergeometric distribution | Google, Amazon | Medium | Discrete Distributions
19 | Confidence Intervals: Definition and Calculation | Khan Academy: Confidence intervals | Microsoft, Facebook | Medium | Inferential Statistics
20 | Hypothesis Testing: p-values, Type I and Type II Errors | Khan Academy: Hypothesis testing | Google, Amazon, Facebook | Medium | Inferential Statistics
21 | Chi-Squared Test: Basics and Applications | Wikipedia: Chi-squared test | Amazon, Microsoft | Medium | Inferential Statistics
22 | Permutations and Combinations | Khan Academy: Permutations and Combinations | Google, Facebook | Easy | Combinatorics
23 | The Birthday Problem and Its Implications | Wikipedia: Birthday problem | Google, Amazon | Medium | Probability Puzzles
24 | The Monty Hall Problem | Wikipedia: Monty Hall problem | Google, Facebook | Medium | Probability Puzzles, Conditional Probability
25 | Marginal vs. Conditional Probabilities | Khan Academy: Conditional Probability | Google, Amazon | Medium | Theoretical Concepts
26 | Real-World Application of Bayes’ Theorem | Towards Data Science: Bayes’ Theorem Applications | Google, Amazon | Medium | Bayesian Inference
27 | Probability Mass Function (PMF) vs. Probability Density Function (PDF) | Wikipedia: Probability density function | Amazon, Facebook | Medium | Distributions
28 | Cumulative Distribution Function (CDF): Definition and Uses | Wikipedia: Cumulative distribution function | Google, Microsoft | Medium | Distributions
29 | Determining Independence of Events | Khan Academy: Independent Events | Google, Amazon | Easy | Fundamental Concepts
30 | Entropy in Information Theory | Wikipedia: Entropy (information theory) | Google, Facebook | Hard | Information Theory, Probability
31 | Joint Probability Distributions | Khan Academy: Joint Probability | Microsoft, Amazon | Medium | Multivariate Distributions
32 | Conditional Expectation | Wikipedia: Conditional expectation | Google, Facebook | Hard | Advanced Concepts
33 | Sampling Methods: With and Without Replacement | Khan Academy: Sampling | Amazon, Microsoft | Easy | Sampling, Combinatorics
34 | Risk Modeling Using Probability | Investopedia: Risk Analysis | Google, Amazon | Medium | Applications, Finance
35 | In-Depth: Central Limit Theorem and Its Importance | Khan Academy: Central Limit Theorem | Google, Microsoft | Medium | Theoretical Concepts, Distributions
36 | Variance under Linear Transformations | Wikipedia: Variance | Amazon, Facebook | Hard | Advanced Statistics
37 | Quantiles: Definition and Interpretation | Khan Academy: Percentiles | Google, Amazon | Medium | Descriptive Statistics
38 | Common Probability Puzzles and Brain Teasers | Brilliant.org: Probability Puzzles | Google, Facebook | Medium | Puzzles, Recreational Mathematics
39 | Real-World Applications of Probability in Data Science | Towards Data Science (Search for probability applications in DS) | Google, Amazon, Facebook | Medium | Applications, Data Science
40 | Advanced Topic: Introduction to Stochastic Calculus | Wikipedia: Stochastic calculus | Microsoft, Amazon | Hard | Advanced Probability, Finance

Questions asked in Google interviews

  • Bayes’ Theorem: Statement and Application
  • Conditional Probability and Independence
  • The Birthday Problem
  • The Monty Hall Problem
  • Normal Distribution and the Central Limit Theorem
  • Law of Large Numbers

Questions asked in Facebook interviews

  • Conditional Probability and Independence
  • Bayes’ Theorem
  • Chi-Squared Test
  • The Monty Hall Problem
  • Entropy in Information Theory

Questions asked in Amazon interviews

  • Basic Probability Concepts
  • Bayes’ Theorem
  • Expected Value and Variance
  • Binomial and Poisson Distributions
  • Permutations and Combinations
  • Real-World Applications of Bayes’ Theorem

Questions asked in Microsoft interviews

  • Bayes’ Theorem
  • Markov Chains
  • Stochastic Processes
  • Central Limit Theorem
  • Variance under Linear Transformations

Custom Questions

Average score with at most 3 dice rolls

Question

Consider a fair 6-sided dice. Your aim is to get the highest score you can, in at most 3 rolls.

A score is defined as the number on the upward face of the dice after a roll. You can roll at most 3 times, but after every roll it is up to you to decide whether you want to roll again.

The last score will be counted as your final score.

  • What is the average score if you roll the dice only once?
  • What is the average score you can get with at most 3 rolls?
  • If the dice is fair, why is the average score for at most 3 rolls not the same as for 1 roll?
Hint 1

Find the expected score of a single roll.

You go for the next roll only in the cases where the score of the current roll is less than the expected score of a single roll.

E.g., if the expected score of a single roll comes out to be 4.5, you would roll again only on 1, 2, 3 or 4, and not on 5 or 6.

Answer

If you roll a fair dice once you can get:

Score | Probability
1 | 1/6
2 | 1/6
3 | 1/6
4 | 1/6
5 | 1/6
6 | 1/6

So your average score with one roll is:

sum of (score * score's probability) = (1+2+3+4+5+6) * (1/6) = 21/6 = 3.5

The average score if you rolled the dice only once is 3.5
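As a quick sanity check, the same expectation can be computed directly in Python (a minimal sketch):

    # Expected score of a single roll of a fair 6-sided dice:
    # each face 1..6 comes up with probability 1/6.
    expected_one_roll = sum(face * (1 / 6) for face in range(1, 7))
    print(expected_one_roll)  # 3.5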

For at most 3 rolls, let's try backtracking. Say you have just done your second roll and have to decide whether to do your 3rd roll.

We just found that if we roll the dice once, we can expect an average score of 3.5. So we will only roll the 3rd time if the score on the 2nd roll is less than 3.5, i.e. 1, 2 or 3.

Possibilities

2nd roll score | Probability | Expected 3rd roll score
1 | 1/6 | 3.5
2 | 1/6 | 3.5
3 | 1/6 | 3.5
4 | 1/6 | NA (we won't roll a 3rd time if we get a score > 3 on the 2nd roll)
5 | 1/6 | NA
6 | 1/6 | NA

So with at most 2 rolls, the average score would be:

[We roll again if the current score is less than 3.5]
(3.5)*(1/6) + (3.5)*(1/6) + (3.5)*(1/6)
+
(4)*(1/6) + (5)*(1/6) + (6)*(1/6) [Decide not to roll again]
=
1.75 + 2.5 = 4.25

The average score with at most 2 rolls is 4.25.
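The same step can be written as a backward-induction update: take max(current score, value of rolling again) and average over the six faces (a small Python sketch; the variable names are just for illustration):

    # With one roll left the continuation value is 3.5, so after the 2nd roll
    # we keep any score above 3.5 and roll again otherwise.
    value_one_roll = 3.5
    value_two_rolls = sum(max(face, value_one_roll) * (1 / 6) for face in range(1, 7))
    print(value_two_rolls)  # 4.25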

Now let's look from the perspective of the first roll. We will only roll again if our score is less than 4.25, i.e. 1, 2, 3 or 4.

Possibilities

1st roll score | Probability | Expected score from 2nd and 3rd rolls
1 | 1/6 | 4.25
2 | 1/6 | 4.25
3 | 1/6 | 4.25
4 | 1/6 | 4.25
5 | 1/6 | NA (we won't roll again if we get a score > 4.25 on the 1st roll)
6 | 1/6 | NA

So with at most 3 rolls, the average score would be:

[We roll again if the current score is less than 4.25]
(4.25)*(1/6) + (4.25)*(1/6) + (4.25)*(1/6) + (4.25)*(1/6)
+
(5)*(1/6) + (6)*(1/6) [Decide not to roll again]
=
17/6 + 11/6 = 28/6 ≈ 4.67

The average score with at most 3 rolls is about 4.67.

The average score for at most 3 rolls and for 1 roll is not the same because, although the dice is fair, the decision to roll again depends on the score of the previous roll, so the final score is no longer independent of the earlier outcomes. The averages would have been the same if we rolled the 2nd and 3rd time without considering what we got on the last roll, i.e. if the final score did not depend on the earlier rolls.
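Putting it all together, the backward induction generalises to any number of rolls, and a Monte Carlo simulation of the same threshold strategy lands close to 4.67 (a self-contained Python sketch; the function names and trial count are my own choices):

    import random

    def value(n):
        """Expected final score with at most n rolls under the threshold strategy."""
        if n == 1:
            return 3.5
        cont = value(n - 1)  # expected score if we choose to roll again
        return sum(max(face, cont) for face in range(1, 7)) / 6

    def simulate(n_rolls, trials=200_000):
        """Monte Carlo check: roll again while the current score is below the value of rolling again."""
        total = 0
        for _ in range(trials):
            score = 0
            for remaining in range(n_rolls, 0, -1):
                score = random.randint(1, 6)
                if remaining == 1 or score >= value(remaining - 1):
                    break
            total += score
        return total / trials

    print(value(1), value(2), value(3))  # 3.5 4.25 4.666...
    print(round(simulate(3), 2))         # ~4.67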