Formulario Probabilidad Y Estadistica.pdf

Unit I

Departamento de Estadística, UAA

1 Descriptive statistics

Given a sample $\{x_1, x_2, \ldots, x_n\}$ whose $r$ distinct values $x_1, \ldots, x_r$ occur $n_1, \ldots, n_r$ times, define:

• Absolute frequencies: $n_i$
• Relative frequencies: $f_i = n_i / n$
• Percentages: $p_i = f_i \times 100$
• Cumulative absolute frequencies: $N_i = \sum_{k \le i} n_k$
• Cumulative relative frequencies: $F_i = N_i / n$

Mean
$\bar{x} = \frac{1}{n} \sum_{i=1}^{n} x_i = \frac{1}{n} \sum_{k=1}^{r} x_k n_k$

Median
$\tilde{x} = x_{\left(\frac{n+1}{2}\right)}$ if $n$ is odd; $\tilde{x} = \frac{1}{2}\left( x_{\left(\frac{n}{2}\right)} + x_{\left(\frac{n}{2}+1\right)} \right)$ if $n$ is even

Variance
$S_n^2 = \frac{1}{n} \sum_{i=1}^{n} (x_i - \bar{x})^2 = \frac{1}{n} \sum_{k=1}^{r} (x_k - \bar{x})^2 n_k$
$S_{n-1}^2 = \frac{\sum_{i=1}^{n} (x_i - \bar{x})^2}{n-1} = \frac{\sum_{k=1}^{r} (x_k - \bar{x})^2 n_k}{n-1}$

Range
$R = x_{(n)} - x_{(1)}$

Quantiles $C_p$, $0 \le p \le 1$
$C_p = (1 - \Delta r)\, x_{(\lfloor r \rfloor)} + \Delta r\, x_{(\lfloor r \rfloor + 1)}$,
where $r = p(n-1) + 1$, $\Delta r$ is the decimal part of $r$, and $\lfloor r \rfloor$ is its integer part.
• Quartiles $Q_k = C_{k/4}$, $k = 1, 2, 3$
• Deciles $D_k = C_{k/10}$, $k = 1, 2, \ldots, 9$
• Percentiles $P_k = C_{k/100}$, $k = 1, 2, \ldots, 99$

Interquartile range
$RIQ = Q_3 - Q_1$

Skewness coefficient
$\alpha_3 = \frac{M_3}{S_n^3}$, with $M_3 = \frac{1}{n} \sum_{i=1}^{n} (x_i - \bar{x})^3$

Kurtosis coefficient
$\alpha_4 = \frac{M_4}{S_n^4}$, with $M_4 = \frac{1}{n} \sum_{i=1}^{n} (x_i - \bar{x})^4$

Box-and-whisker plot
Lower fence $CI_i = Q_1 - 1.5 \times RIQ$, upper fence $CI_s = Q_3 + 1.5 \times RIQ$
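As a quick numerical check of the descriptive statistics above, the following is a minimal sketch in Python with NumPy; the sample values and the helper name quantile are illustrative choices, not part of the original sheet.

    import numpy as np

    x = np.array([4.0, 7.0, 2.0, 9.0, 5.0, 3.0, 8.0])   # hypothetical sample, n = 7
    n = len(x)

    mean = x.sum() / n                                   # x-bar
    median = np.median(x)                                # x_((n+1)/2) since n is odd
    var_n = ((x - mean) ** 2).sum() / n                  # S_n^2
    var_n1 = ((x - mean) ** 2).sum() / (n - 1)           # S_{n-1}^2
    rng = x.max() - x.min()                              # R = x_(n) - x_(1)

    def quantile(sample, p):
        """C_p with r = p(n-1) + 1, interpolating between order statistics."""
        xs = np.sort(sample)
        r = p * (len(xs) - 1) + 1
        k = int(np.floor(r))                             # integer part of r
        d = r - k                                        # decimal part of r
        if k >= len(xs):                                 # p = 1 returns the maximum
            return xs[-1]
        return (1 - d) * xs[k - 1] + d * xs[k]           # 1-based order statistics

    q1, q3 = quantile(x, 0.25), quantile(x, 0.75)
    riq = q3 - q1                                        # interquartile range
    m3 = ((x - mean) ** 3).sum() / n
    m4 = ((x - mean) ** 4).sum() / n
    alpha3 = m3 / var_n ** 1.5                           # skewness  M3 / S_n^3
    alpha4 = m4 / var_n ** 2                             # kurtosis  M4 / S_n^4

Since the rule $r = p(n-1) + 1$ is linear interpolation between order statistics, quantile(x, p) should agree with NumPy's default np.quantile(x, p).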

2 Event theory

Commutativity
$A \cap B = B \cap A$, $A \cup B = B \cup A$

Distributivity
$A \cap (B \cup C) = (A \cap B) \cup (A \cap C)$, $A \cup (B \cap C) = (A \cup B) \cap (A \cup C)$

De Morgan's laws
$(A \cup B)^c = A^c \cap B^c$, $(A \cap B)^c = A^c \cup B^c$

Associativity
$(A \cup B) \cup C = A \cup (B \cup C)$, $(A \cap B) \cap C = A \cap (B \cap C)$

Complement
$A \cap A^c = \emptyset$, $A \cup A^c = \Omega$, $(A^c)^c = A$, $\emptyset^c = \Omega$, $\Omega^c = \emptyset$

Difference
$A - B = A \cap B^c$

3 Probability

Kolmogorov axioms
• $P(A) \ge 0$
• $P(\Omega) = 1$
• If $A$ and $B$ are mutually exclusive, $P(A \cup B) = P(A) + P(B)$

Conditional probability
$P(A \mid B) = \frac{P(A \cap B)}{P(B)}$, if $P(B) > 0$

Product rule
$P(A \cap B) = P(A \mid B)\, P(B)$

Addition laws
• $P(A^c) = 1 - P(A)$
• $P(A - B) = P(A) - P(A \cap B)$
• $P(A \cup B) = P(A) + P(B) - P(A \cap B)$

Total probability
• $P(B) = P(B \mid A) P(A) + P(B \mid A^c) P(A^c)$
• $P(B) = \sum_{i=1}^{k} P(B \mid A_i) P(A_i)$, where $\{A_1, A_2, \ldots, A_k\}$ form a partition of $\Omega$

Bayes' rule
• $P(A \mid B) = \frac{P(B \mid A) P(A)}{P(B)}$
• $P(A \mid B) = \frac{P(B \mid A) P(A)}{P(B \mid A) P(A) + P(B \mid A^c) P(A^c)}$
• $P(A_i \mid B) = \frac{P(B \mid A_i) P(A_i)}{\sum_{j=1}^{k} P(B \mid A_j) P(A_j)}$

Independence
$A$ and $B$ are independent if and only if
• $P(A \mid B) = P(A)$
• $P(B \mid A) = P(B)$
• $P(A \cap B) = P(A) P(B)$

© 2016 JAGDL
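A small worked example of total probability and Bayes' rule, using made-up numbers (a 1% prevalence and a test with a 95% true-positive and 5% false-positive rate); it only illustrates the two formulas above.

    # Hypothetical numbers: A = "has the condition", B = "test is positive"
    p_A = 0.01
    p_B_given_A = 0.95            # P(B | A)
    p_B_given_Ac = 0.05           # P(B | A^c)

    # Total probability: P(B) = P(B|A)P(A) + P(B|A^c)P(A^c)
    p_B = p_B_given_A * p_A + p_B_given_Ac * (1 - p_A)

    # Bayes' rule: P(A|B) = P(B|A)P(A) / P(B)
    p_A_given_B = p_B_given_A * p_A / p_B

    print(p_B)                    # 0.059
    print(p_A_given_B)            # approximately 0.161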

Unit II

Departamento de Estadística, UAA

1 Discrete distributions

In the properties below, $X \sim f = f_X$, $Y \sim f_Y$, and $c \in \mathbb{R}$.

Distribution function
$f(x) \ge 0$, $\sum_{x} f(x) = 1$
If $X \sim f$, then $P(X = x) = f(x)$.

Cumulative distribution function
$F(x) = P(X \le x) = \sum_{y \le x} f(y)$
• $0 \le F(x) \le 1$
• If $x \le y$ then $F(x) \le F(y)$

Expected value
$E[X] = \sum_{x} x f(x)$
• $E[c] = c$
• $E[cX] = c E[X]$
• $E[X + c] = E[X] + c$
• $E[X + Y] = E[X] + E[Y]$
$E[g(X)] = \sum_{x} g(x) f(x)$
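A minimal sketch of these definitions for a hypothetical distribution function on $\{0, 1, 2, 3\}$; the values of f are invented for illustration.

    import numpy as np

    x = np.array([0, 1, 2, 3])
    f = np.array([0.1, 0.4, 0.3, 0.2])       # hypothetical f(x)

    assert np.isclose(f.sum(), 1.0)          # sum over x of f(x) = 1
    F = np.cumsum(f)                         # F(x) = P(X <= x), nondecreasing in [0, 1]
    EX = (x * f).sum()                       # E[X] = sum of x f(x) = 1.6
    Eg = (x ** 2 * f).sum()                  # E[g(X)] with g(x) = x^2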

Variance
$V[X] = E[(X - E[X])^2] = \sum_{x} (x - E[X])^2 f(x)$
• $V[c] = 0$
• $V[cX] = c^2 V[X]$
• $V[X + c] = V[X]$
• If $X$ and $Y$ are independent, $V[X + Y] = V[X] + V[Y]$

Standard deviation
$SD[X] = \sqrt{V[X]}$

Moments
$m_r = E[X^r] = \sum_{x} x^r f(x)$
• $m_1 = E[X]$
• $m_2 = V[X] + E[X]^2$
• $V[X] = m_2 - m_1^2$

Moment generating function
$M_X(t) = E[e^{tX}] = \sum_{x} e^{tx} f(x)$
• $m_k = \left. \frac{d^k M(t)}{dt^k} \right|_{t=0}$
• If $Y = cX$, then $M_Y(t) = M_X(ct)$
• If $Y = X + c$, then $M_Y(t) = e^{ct} M_X(t)$
• If $Y = \sum_{i=1}^{n} X_i$ with $X_1, \ldots, X_n$ independent, then $M_Y(t) = M_{X_1}(t) \cdots M_{X_n}(t)$
• If $X$ and $Y$ are two variables such that $M_X(t) = M_Y(t)$, then $f_X(x) = f_Y(x)$

© 2016 JAGDL
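Continuing the same hypothetical pmf used above, the sketch below checks $V[X] = m_2 - m_1^2$ and recovers $m_1$ from the moment generating function by a finite-difference derivative at $t = 0$ (the step size h is an arbitrary choice).

    import numpy as np

    x = np.array([0, 1, 2, 3])
    f = np.array([0.1, 0.4, 0.3, 0.2])          # same hypothetical pmf as above

    m1 = (x * f).sum()                           # m_1 = E[X]
    m2 = (x ** 2 * f).sum()                      # m_2 = E[X^2]
    var_def = ((x - m1) ** 2 * f).sum()          # V[X] from the definition
    assert np.isclose(var_def, m2 - m1 ** 2)     # V[X] = m_2 - m_1^2

    def M(t):
        """M_X(t) = sum over x of e^{tx} f(x)."""
        return (np.exp(t * x) * f).sum()

    h = 1e-5
    m1_approx = (M(h) - M(-h)) / (2 * h)         # central difference for dM/dt at t = 0
    assert np.isclose(m1_approx, m1, atol=1e-6)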

Unit III

Departamento de Estadística, UAA

1 Continuous distributions

In the properties below, $X \sim f = f_X$, $Y \sim f_Y$, and $c \in \mathbb{R}$.

Density function
$f(x) \ge 0$, $\int f(x)\,dx = 1$
$P(a < X < b) = \int_{a}^{b} f(x)\,dx$

Cumulative distribution function
$F(x) = P(X \le x) = \int_{-\infty}^{x} f(y)\,dy$
• $0 \le F(x) \le 1$
• If $x \le y$ then $F(x) \le F(y)$
• $F'(x) = f(x)$

Expected value
$E[X] = \int x f(x)\,dx$
• $E[c] = c$
• $E[cX] = c E[X]$
• $E[X + c] = E[X] + c$
• $E[X + Y] = E[X] + E[Y]$
$E[g(X)] = \int g(x) f(x)\,dx$

Variance
$V[X] = E[(X - E[X])^2] = \int (x - E[X])^2 f(x)\,dx$
• $V[c] = 0$
• $V[cX] = c^2 V[X]$
• $V[X + c] = V[X]$
• If $X$ and $Y$ are independent, $V[X + Y] = V[X] + V[Y]$

Standard deviation
$SD[X] = \sqrt{V[X]}$

Moments
$m_k = E[X^k] = \int x^k f(x)\,dx$
• $m_1 = E[X]$
• $m_2 = V[X] + E[X]^2$
• $V[X] = m_2 - m_1^2$

Moment generating function
$M_X(t) = E[e^{tX}] = \int e^{tx} f(x)\,dx$
• $m_k = \left. \frac{d^k M(t)}{dt^k} \right|_{t=0}$
• If $Y = cX$, then $M_Y(t) = M_X(ct)$
• If $Y = X + c$, then $M_Y(t) = e^{ct} M_X(t)$
• If $Y = \sum_{i=1}^{n} X_i$ with $X_1, \ldots, X_n$ independent, then $M_Y(t) = M_{X_1}(t) \cdots M_{X_n}(t)$
• If $M_X(t) = M_Y(t)$, then $f_X(x) = f_Y(x)$
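As a concrete instance of these formulas, the sketch below takes an exponential density with rate $\lambda = 2$ (an arbitrary example, not from the sheet) and evaluates the integrals numerically with scipy.integrate.quad.

    import numpy as np
    from scipy.integrate import quad

    lam = 2.0
    f = lambda x: lam * np.exp(-lam * x)             # density on [0, infinity)

    total, _ = quad(f, 0, np.inf)                    # integral of f(x) dx = 1
    EX, _ = quad(lambda x: x * f(x), 0, np.inf)      # E[X] = 1/lam = 0.5
    m2, _ = quad(lambda x: x ** 2 * f(x), 0, np.inf) # m_2 = E[X^2]
    VX = m2 - EX ** 2                                # V[X] = m_2 - m_1^2 = 0.25

    F = lambda t: quad(f, 0, t)[0]                   # F(t) = P(X <= t)
    print(total, EX, VX, F(1.0))                     # 1.0, 0.5, 0.25, 1 - e^{-2}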

2 Joint distributions

Properties
• $f_{X,Y}(x, y) \ge 0$
• $\int \int f_{X,Y}(x, y)\,dy\,dx = 1$

Expected value
• $E[g(X, Y)] = \int \int g(x, y)\, f_{X,Y}(x, y)\,dy\,dx$

Cumulative distribution
$F(a, b) = \int_{-\infty}^{a} \int_{-\infty}^{b} f(x, y)\,dy\,dx$

Marginals
• $f_X(x) = \int f_{X,Y}(x, y)\,dy$
• $f_Y(y) = \int f_{X,Y}(x, y)\,dx$

Conditionals
• $f_{X \mid Y = y}(x) = \frac{f_{X,Y}(x, y)}{f_Y(y)}$
• $f_{Y \mid X = x}(y) = \frac{f_{X,Y}(x, y)}{f_X(x)}$

Covariance
$\mathrm{Cov}[X, Y] = E[(X - E[X])(Y - E[Y])]$
• $\mathrm{Cov}[X, Y] = E[XY] - E[X] E[Y]$
• $V[X + Y] = V[X] + V[Y] + 2\,\mathrm{Cov}[X, Y]$

Gamma function
• $\Gamma(x) = (x - 1)\,\Gamma(x - 1)$
• $\Gamma(n) = (n - 1)!$ if $n \in \mathbb{N}$
• $\Gamma(\frac{1}{2}) = \sqrt{\pi}$, $\Gamma(\frac{1}{3}) \approx 2.6789$, $\Gamma(\frac{1}{4}) \approx 3.6256$

Standardization
• If $X \sim N(\mu, \sigma^2)$, then $Z = \frac{X - \mu}{\sigma} \sim N(0, 1)$, where $\mu = E[X]$ and $\sigma = \sqrt{V[X]}$

© 2016 JAGDL
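A short sketch of the joint-distribution formulas using the illustrative density $f(x, y) = x + y$ on the unit square (chosen only because its integrals are easy to verify by hand); the normalization, marginal, conditional, and covariance are computed with scipy.integrate.

    import numpy as np
    from scipy.integrate import quad, dblquad

    f = lambda x, y: x + y                           # hypothetical joint density on [0,1] x [0,1]

    # dblquad integrates func(y, x): y is the inner variable, x the outer one.
    total, _ = dblquad(lambda y, x: f(x, y), 0, 1, lambda x: 0, lambda x: 1)   # should be 1

    fX = lambda x: quad(lambda y: f(x, y), 0, 1)[0]  # marginal f_X(x) = x + 1/2
    f_cond = lambda y, x: f(x, y) / fX(x)            # conditional f_{Y|X=x}(y)

    EX, _ = dblquad(lambda y, x: x * f(x, y), 0, 1, lambda x: 0, lambda x: 1)
    EY, _ = dblquad(lambda y, x: y * f(x, y), 0, 1, lambda x: 0, lambda x: 1)
    EXY, _ = dblquad(lambda y, x: x * y * f(x, y), 0, 1, lambda x: 0, lambda x: 1)
    cov = EXY - EX * EY                              # Cov[X,Y] = E[XY] - E[X]E[Y] = -1/144

    print(total, EX, EY, cov)                        # 1.0, 7/12, 7/12, about -0.0069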
