Key Concepts in Probability and Decision Making
Key Concepts in Probability and Statistics
Random Variable
A random variable is a numeric description of the outcome of an experiment.
Discrete Random Variable
A discrete random variable is a random variable that may assume only a finite number of values or an infinite sequence of values (such as 0, 1, 2, ...).
Continuous Random Variable
A continuous random variable is a random variable that may assume any value in an interval or collection of intervals.
Probability Function
A probability function, denoted f(x), provides the probability that a discrete random variable x takes on a specific value.
Discrete Probability Distribution
A discrete probability distribution is a table, graph, or equation describing the values of the random variable and the associated probabilities.
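A minimal Python sketch, using hypothetical values and probabilities, of how such a distribution can be stored as a table and checked against the two required conditions for a probability function, f(x) ≥ 0 and Σf(x) = 1:

    # Hypothetical discrete probability distribution: value of x -> f(x)
    f = {0: 0.10, 1: 0.25, 2: 0.40, 3: 0.25}

    # Required conditions for a valid discrete probability distribution
    assert all(p >= 0 for p in f.values())        # f(x) >= 0 for every x
    assert abs(sum(f.values()) - 1.0) < 1e-9      # probabilities sum to 1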
Expected Value
The expected value is a weighted average of the values of the random variable, for which the probability function provides the weights. If an experiment can be repeated a large number of times, the expected value can be interpreted as the “long-run average.”
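As a sketch in Python, the expected value E(x) = Σ x f(x) can be computed directly from a hypothetical distribution (the values and probabilities below are illustrative only):

    # Hypothetical distribution of daily unit sales: value of x -> f(x)
    sales_dist = {0: 0.10, 1: 0.25, 2: 0.40, 3: 0.25}

    # Expected value: weighted average of x, weighted by f(x)
    expected_value = sum(x * p for x, p in sales_dist.items())
    print(expected_value)  # 1.8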
Variance
Variance is a measure of the dispersion or variability in the random variable. It is a weighted average of the squared deviations from the mean, µ.
Standard Deviation
The standard deviation is the positive square root of the variance.
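Continuing the same hypothetical distribution, a short Python sketch of the variance, Var(x) = Σ (x − µ)² f(x), and the standard deviation:

    import math

    # Hypothetical distribution: value of x -> f(x)
    sales_dist = {0: 0.10, 1: 0.25, 2: 0.40, 3: 0.25}

    mu = sum(x * p for x, p in sales_dist.items())                    # expected value, 1.8
    variance = sum((x - mu) ** 2 * p for x, p in sales_dist.items())  # weighted squared deviations
    std_dev = math.sqrt(variance)                                     # positive square root
    print(round(variance, 4), round(std_dev, 4))                      # 0.86 0.9274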
Binomial Probability Distribution
The binomial probability distribution is the probability distribution for a discrete random variable, used to compute the probability of x successes in n trials. It applies when the experiment consists of a sequence of n identical trials, each trial has only two possible outcomes (success or failure), the probability of success does not change from trial to trial, and the trials are independent.
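A minimal Python sketch of the binomial probability function, f(x) = C(n, x) p^x (1 − p)^(n − x); the values of n, p, and x below are hypothetical:

    from math import comb

    def binomial_pmf(x, n, p):
        # f(x) = C(n, x) * p**x * (1 - p)**(n - x)
        return comb(n, x) * p ** x * (1 - p) ** (n - x)

    # Probability of exactly 3 successes in 10 trials with p = 0.2 on each trial
    print(round(binomial_pmf(3, 10, 0.2), 4))  # 0.2013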
Poisson Probability Distribution
The Poisson probability distribution is the probability distribution for a discrete random variable, used to compute the probability of x occurrences over a specified interval.
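A minimal Python sketch of the Poisson probability function, f(x) = (µ^x e^(−µ)) / x!, where µ is the mean number of occurrences in the interval; the numbers are hypothetical:

    from math import exp, factorial

    def poisson_pmf(x, mu):
        # f(x) = (mu**x * e**(-mu)) / x!
        return (mu ** x) * exp(-mu) / factorial(x)

    # Probability of exactly 2 occurrences in an interval that averages 3 occurrences
    print(round(poisson_pmf(2, 3.0), 4))  # 0.2240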
Uniform Probability Distribution
The uniform probability distribution is a continuous probability distribution in which the probability that the random variable will assume a value in any interval of equal length is the same for each interval.
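A sketch of the equal-interval property in Python; for x uniform over [a, b], the probability of any interval [c, d] inside [a, b] depends only on the interval's length (the endpoints below are hypothetical):

    def uniform_interval_prob(c, d, a, b):
        # For x uniform on [a, b], P(c <= x <= d) = (d - c) / (b - a)
        # whenever [c, d] lies inside [a, b]
        return (d - c) / (b - a)

    # Any 5-unit interval inside [120, 140] has the same probability
    print(uniform_interval_prob(125, 130, 120, 140))  # 0.25
    print(uniform_interval_prob(131, 136, 120, 140))  # 0.25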
Probability Density Function
The probability density function, denoted f(x), describes the probability distribution of a continuous random variable. Unlike a discrete probability function, f(x) does not give a probability directly; instead, the area under the graph of f(x) over an interval gives the probability that the random variable assumes a value in that interval.
Normal Probability Distribution
The normal probability distribution is a continuous probability distribution whose probability density function is bell-shaped and determined by the mean, µ, and standard deviation, σ.
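A minimal Python sketch of the normal probability density function, f(x) = (1 / (σ√(2π))) e^(−(x − µ)² / (2σ²)):

    import math

    def normal_pdf(x, mu, sigma):
        # Bell-shaped density determined entirely by mu and sigma
        return math.exp(-((x - mu) ** 2) / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

    # Height of the standard normal curve (mu = 0, sigma = 1) at its peak
    print(round(normal_pdf(0.0, 0.0, 1.0), 4))  # 0.3989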
Standard Normal Distribution
A standard normal distribution is a normal distribution with a mean of 0 and a standard deviation of 1.
Cumulative Probability
Cumulative probability is the probability that a random variable takes on a value less than or equal to a stated value.
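For the normal distribution, a cumulative probability can be sketched in Python by converting x to a standard normal z-value, z = (x − µ) / σ, and using the error function from the standard library:

    import math

    def normal_cdf(x, mu=0.0, sigma=1.0):
        # Cumulative probability P(X <= x) via the standard normal distribution
        z = (x - mu) / sigma                       # convert to a z-value
        return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

    print(round(normal_cdf(0.0), 4))   # 0.5: half the area lies below the mean
    print(round(normal_cdf(1.0), 4))   # 0.8413: P(z <= 1)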
Exponential Probability Distribution
The exponential probability distribution is a continuous probability distribution that is useful in describing the time to complete a task or the time between occurrences of an event.
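A minimal Python sketch of the exponential cumulative probability, P(x ≤ x0) = 1 − e^(−x0/µ), where µ is the mean time between occurrences; the numbers are hypothetical:

    import math

    def exponential_cdf(x0, mu):
        # P(x <= x0) = 1 - e**(-x0 / mu)
        return 1.0 - math.exp(-x0 / mu)

    # If occurrences are 3 minutes apart on average, probability the next one
    # happens within 2 minutes
    print(round(exponential_cdf(2.0, 3.0), 4))  # 0.4866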
Decision Making Fundamentals
Decision Alternatives
Decision alternatives are options available to the decision-maker.
Chance Event
A chance event is an uncertain future event affecting the consequence, or payoff, associated with a decision.
Consequence
A consequence is the result obtained when a decision alternative is chosen and a chance event occurs. A measure of the consequence is often called a payoff.
States of Nature
States of nature are the possible outcomes for chance events that affect the payoff associated with a decision alternative.
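As an illustration (the decision alternatives, states of nature, probabilities, and payoffs below are all hypothetical), these pieces combine in the expected value approach: weight each alternative's payoffs by the state-of-nature probabilities and compare the results:

    # Hypothetical payoff table (in $1,000s): alternative -> state of nature -> payoff
    payoff = {
        "small facility": {"strong demand": 8,  "weak demand": 7},
        "large facility": {"strong demand": 14, "weak demand": 5},
    }
    prob = {"strong demand": 0.6, "weak demand": 0.4}   # assumed state-of-nature probabilities

    # Expected value of each decision alternative
    for alternative, outcomes in payoff.items():
        ev = sum(prob[s] * v for s, v in outcomes.items())
        print(alternative, ev)
    # small facility: 0.6*8 + 0.4*7  = 7.6
    # large facility: 0.6*14 + 0.4*5 = 10.4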
Influence Diagram
An influence diagram is a graphical device that shows the relationship among decisions, chance events, and consequences for a decision problem.
Node
A node is an intersection or junction point of an influence diagram or a decision tree.
Decision Nodes
Decision nodes indicate points where a decision is made.
Chance Nodes
Chance nodes indicate points where an uncertain event will occur.
Consequence Nodes
Consequence nodes of an influence diagram indicate points where a payoff occurs.
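As a rough sketch (names, probabilities, and payoffs are hypothetical), the three node types can be represented as nested dictionaries in Python, with a decision node branching to alternatives, a chance node branching to states of nature, and consequence nodes holding payoffs:

    # Hypothetical decision problem expressed as nested nodes
    decision_node = {
        "type": "decision",                          # point where a decision is made
        "alternatives": {
            "expand": {
                "type": "chance",                    # point where an uncertain event occurs
                "outcomes": {
                    "strong demand": {"type": "consequence", "prob": 0.6, "payoff": 100},
                    "weak demand":   {"type": "consequence", "prob": 0.4, "payoff": -20},
                },
            },
            "do nothing": {"type": "consequence", "payoff": 0},
        },
    }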