
Unit 5 Data Lesson 1 - Probability Distributions

Statistics · 5 cards · Created 4 months ago

This deck covers key concepts related to probability distributions, including definitions and examples of random variables, discrete uniform distribution, expected value, and fair game.



Key Terms
What is a probability distribution?
A probability distribution assigns a probability to every possible outcome in a sample space. In other words, a probability distribution shows how the total probability of 1 (100%) is divided among every possible outcome of a given probability experiment.
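As a quick illustration (a small Python sketch, not part of the original deck), here is the distribution of a fair six-sided die; the probabilities of all outcomes in the sample space sum to 1:

```python
# Hypothetical example: the probability distribution of a fair six-sided die.
# Each outcome 1..6 gets probability 1/6, and the probabilities sum to 1.
distribution = {outcome: 1 / 6 for outcome in range(1, 7)}

total = sum(distribution.values())
print(total)  # the probabilities across the full sample space sum to 1
```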
Random variables
When working with probability distributions, we use a random variable, X, to represent the particular outcome we are focusing on; the distribution assigns a probability to each value X can take.
Discrete uniform distribution
When all outcomes in a distribution are equally likely in any single trial, we call it a uniform probability distribution. With n equally likely outcomes, P(X) = 1/n.
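The P(X) = 1/n rule can be sketched in a few lines of Python (a hypothetical helper, not from the deck):

```python
# Hypothetical example: discrete uniform distribution over n equally likely outcomes.
def uniform_probability(n):
    """Return P(X) = 1/n, the probability of any single outcome out of n."""
    return 1 / n

# A standard die has n = 6 equally likely faces, so each face has probability 1/6.
print(uniform_probability(6))
```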
What is expected value?
An expectation, or expected value, E(X), is the predicted average of all possible outcomes in a single trial of a probability experiment; it is the long-run average. Formula: E(X) = Σ x · P(x), summed over all possible values x.
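The formula E(X) = Σ x · P(x) translates directly into code. A small Python sketch (hypothetical helper names, not from the deck):

```python
# Hypothetical example: expected value E(X) = sum of x * P(x) over all outcomes.
def expected_value(outcomes):
    """outcomes: list of (value, probability) pairs covering the sample space."""
    return sum(x * p for x, p in outcomes)

# Fair six-sided die: each face 1..6 with probability 1/6.
# The long-run average roll is (1 + 2 + ... + 6) / 6 = 3.5.
die = [(x, 1 / 6) for x in range(1, 7)]
print(expected_value(die))
```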
What is a fair game?
A game is fair when its expected value is 0, i.e. E(X) = 0: on average, a player neither gains nor loses over many trials.
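The fair-game condition E(X) = 0 can be checked numerically. A Python sketch (hypothetical helper, not from the deck), using a tolerance because payoffs are floats:

```python
# Hypothetical example: a game is fair when its expected value is 0.
def is_fair(outcomes, tol=1e-9):
    """outcomes: list of (payoff, probability) pairs; fair when E(X) is 0."""
    return abs(sum(x * p for x, p in outcomes)) < tol

# Coin-flip bet: win $1 on heads, lose $1 on tails -- E(X) = 0, so the game is fair.
coin_bet = [(1, 0.5), (-1, 0.5)]
print(is_fair(coin_bet))  # True
```

A lottery-style bet such as winning $2 half the time but losing $1 the other half has E(X) = 0.5, so it is not fair (it favours the player).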