Random Variables

From Rice Wiki
= Linear Combinations of RV =


Let <math>X, Y</math> be random variables and <math>a, b, c</math> be constants.


Expectation has the following property for linear transformations:


<math>
E(aX + c) = aE(X) + c
</math>

For a linear combination, we have

<math>
E(aX + bY) = aE(X) + bE(Y)
</math>
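As a sanity check, the identities above can be verified numerically. The Python sketch below is illustrative only (the uniform distributions and the constants <math>a, b, c</math> are arbitrary choices, not part of the original article); it draws samples and confirms that linearity holds for the sample means:

```python
import random

random.seed(0)
n = 10_000

# Arbitrary choices for illustration: X ~ Uniform(0, 1), Y ~ Uniform(0, 2)
xs = [random.uniform(0, 1) for _ in range(n)]
ys = [random.uniform(0, 2) for _ in range(n)]
a, b, c = 3.0, -2.0, 5.0

def mean(values):
    return sum(values) / len(values)

# E(aX + c) = a E(X) + c: linearity holds for sample means up to float error
lhs_transform = mean([a * x + c for x in xs])
rhs_transform = a * mean(xs) + c

# E(aX + bY) = a E(X) + b E(Y)
lhs_combo = mean([a * x + b * y for x, y in zip(xs, ys)])
rhs_combo = a * mean(xs) + b * mean(ys)

print(abs(lhs_transform - rhs_transform) < 1e-8)  # True
print(abs(lhs_combo - rhs_combo) < 1e-8)          # True
```

Note that linearity of expectation requires no independence assumption, which is why no condition on <math>X</math> and <math>Y</math> appears above.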


Variance is a bit more complicated. Recall that the calculation of
variance involves the average squared difference from the mean. It is
therefore no surprise that any constant coefficient comes out squared,
while a translation does not affect the spread of the data.


<math>
Var(aX + c) = a^2 Var(X)
</math>
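The squaring of the coefficient and the disappearance of the shift can be checked the same way. This Python sketch (again with an arbitrary distribution and constants chosen purely for illustration) compares the direct sample variance of <math>aX + c</math> against <math>a^2 Var(X)</math>:

```python
import random

random.seed(1)
n = 10_000
a, c = 3.0, 7.0  # arbitrary constants for illustration

xs = [random.uniform(0, 1) for _ in range(n)]  # X ~ Uniform(0, 1)

def mean(values):
    return sum(values) / len(values)

def var(values):
    # Population-style sample variance: average squared deviation from the mean
    m = mean(values)
    return mean([(v - m) ** 2 for v in values])

# The shift c drops out entirely; the coefficient a enters squared
direct = var([a * x + c for x in xs])
formula = a ** 2 * var(xs)
print(abs(direct - formula) < 1e-8)                    # True
print(abs(var([x + c for x in xs]) - var(xs)) < 1e-8)  # True: translation leaves spread unchanged
```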
For a linear combination of two independent random variables,

<math>
Var(aX + bY) = a^2 Var(X) + b^2 Var(Y)
</math>

When the random variables are dependent, a covariance term appears:

<math>
Var(aX + bY) = a^2 Var(X) + 2ab Cov(X,Y) + b^2 Var(Y)
</math>
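The dependent case can be simulated as well. In the Python sketch below, the construction <math>Y = X + Z</math> is an arbitrary way to make <math>X</math> and <math>Y</math> dependent (it is not from the original article); the direct sample variance of <math>aX + bY</math> matches the formula including the <math>2ab Cov(X,Y)</math> term:

```python
import random

random.seed(2)
n = 10_000
a, b = 3.0, -2.0  # arbitrary constants for illustration

# X ~ Uniform(0, 1); Y = X + Z with Z independent of X, so X and Y are dependent
xs = [random.uniform(0, 1) for _ in range(n)]
zs = [random.uniform(0, 1) for _ in range(n)]
ys = [x + z for x, z in zip(xs, zs)]

def mean(values):
    return sum(values) / len(values)

def var(values):
    m = mean(values)
    return mean([(v - m) ** 2 for v in values])

def cov(us, vs):
    mu, mv = mean(us), mean(vs)
    return mean([(u - mu) * (v - mv) for u, v in zip(us, vs)])

direct = var([a * x + b * y for x, y in zip(xs, ys)])
formula = a ** 2 * var(xs) + 2 * a * b * cov(xs, ys) + b ** 2 * var(ys)
print(abs(direct - formula) < 1e-9)  # True: the identity holds for sample moments too
```

When <math>X</math> and <math>Y</math> are independent, <math>Cov(X,Y) = 0</math> and this formula reduces to the independent case above.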
[[Category:Statistics]]

Latest revision as of 03:42, 5 March 2024
