
Chapter 1 Introduction

∙Sources of Uncertainty

- randomness: inherent unexplainable variability of nature

- lack of information/understanding: parameter uncertainty, modeling uncertainty, sampling uncertainty (≠ data uncertainty)

- error/inaccuracy: data uncertainty, operational uncertainty

∙Definitions of Statistics and Probability

- Statistics: methods for drawing inferences about the properties of a population based on the properties of a sample from that population

- Probability: methods for calculating the likelihood of an event given known population characteristics

∙Random Variables

- definition
- discrete or continuous

∙Populations vs Samples

- population parameters
- sample statistics

∙Graphical Display of Data (a short plotting sketch follows this outline)

- histogram
- box plot
- quantile plot

∙Characteristics of Hydrologic Data

- observational, not experimental
- the order of occurrence (and their serial dependence) is often important
- stochastic
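As a minimal illustration of the graphical displays listed above, the sketch below draws a histogram, box plot, and quantile (Q-Q) plot for a hypothetical annual peak-flow sample; the data values and variable names are assumptions for illustration only.

```python
import numpy as np
import matplotlib.pyplot as plt
from scipy import stats

# hypothetical annual peak flows (m^3/s); illustrative values only
flows = np.array([120, 95, 210, 160, 88, 300, 175, 140, 260, 110,
                  190, 150, 230, 105, 170])

fig, axes = plt.subplots(1, 3, figsize=(12, 4))

# histogram: shape of the empirical distribution
axes[0].hist(flows, bins=6, edgecolor="black")
axes[0].set_title("Histogram")

# box plot: median, quartiles, and potential outliers
axes[1].boxplot(flows)
axes[1].set_title("Box plot")

# quantile (Q-Q) plot against a normal distribution
stats.probplot(flows, dist="norm", plot=axes[2])
axes[2].set_title("Quantile plot")

plt.tight_layout()
plt.show()
```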


Chapter 2 Probability

< Basic Probability >

∙Axioms of Probability

(i) P(S) = 1

(ii) 0 ≤ P(A) ≤ 1

(iii) P(A∪B) = P(A) + P(B) if A & B are mutually exclusive

[note] P(A∪B∪C) =

∙Conditional Probability

P(B|A) = P(A∩B) / P(A)

P(A∩B) = P(A)P(B) if A & B are independent

- Total Probability Theorem

P(B) = P(A1)P(B|A1) + ... + P(An)P(B|An)

- Bayes Theorem

P(Ak|B) = P(Ak)P(B|Ak) / Σ_{j=1}^{n} P(Aj)P(B|Aj)

where P(Ak): prior probability, P(Ak|B): posterior probability

[note] P(A∩B∩C) =
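A minimal numerical sketch of the total probability and Bayes formulas above; the events A1, A2, A3 and all prior/likelihood values are made up for illustration.

```python
# Hypothetical example: A1, A2, A3 partition the sample space
# (e.g., three mutually exclusive states), B is an observed event.
prior = {"A1": 0.5, "A2": 0.3, "A3": 0.2}          # P(Ak), priors
likelihood = {"A1": 0.10, "A2": 0.40, "A3": 0.70}  # P(B|Ak)

# Total Probability Theorem: P(B) = sum_k P(Ak) P(B|Ak)
p_b = sum(prior[k] * likelihood[k] for k in prior)

# Bayes Theorem: P(Ak|B) = P(Ak) P(B|Ak) / P(B), the posterior
posterior = {k: prior[k] * likelihood[k] / p_b for k in prior}

print(f"P(B) = {p_b:.3f}")
for k, p in posterior.items():
    print(f"P({k}|B) = {p:.3f}")
```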


< Probability Distribution >

∙Cumulative Mass/Density Functions (CMF/CDF)

- discrete r.v.

- continuous r.v.: PX(x) = prob(X≤x) = ∫_{-∞}^{x} pX(t) dt

∙Probability Mass/Density Functions (PMF/PDF)

- discrete r.v.

- continuous r.v.: dPX(x) = pX(x) dx

((note)) the pdf is not a probability and can exceed one.

∙Probability Concept for Continuous R.V.

prob(a≤X≤b) = ∫_{a}^{b} pX(t) dt = PX(b) − PX(a)

prob(X = d) = ∫_{d}^{d} pX(t) dt = PX(d) − PX(d) = 0

prob(a≤X≤b) = prob(a<X≤b) = prob(a≤X<b) = prob(a<X<b)
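A quick numerical check of these relations, assuming for illustration that X follows a normal distribution with mean 100 and standard deviation 20.

```python
from scipy import stats

# assumed example: X ~ Normal(mean=100, sd=20)
X = stats.norm(loc=100, scale=20)

a, b = 90, 130
# prob(a <= X <= b) = PX(b) - PX(a), a difference of CDF values
p_interval = X.cdf(b) - X.cdf(a)
print(f"prob({a} <= X <= {b}) = {p_interval:.4f}")

# for a continuous r.v., prob(X = d) = 0; the pdf value at d is a density,
# not a probability, and can exceed one for tightly concentrated distributions
d = 100
print("pdf at d:", X.pdf(d), " prob(X = d):", X.cdf(d) - X.cdf(d))
```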

∙Bivariate and Marginal Distributions

pX(x) = ∫_{-∞}^{∞} pXY(x,s) ds

((note))

(i) PXY(x,∞) is a cumulative univariate probability function of X only, i.e. the cumulative marginal distribution of X

(ii) PXY(-∞,y)= PXY(x,-∞)= 0


∙Conditional Distributions

pX|Y(x|Y=y0) = pXY(x,y0)/pY(y0)

((note))

(i) independence

pX|Y(x|y) = pX(x),  pXY(x,y) = pX(x)pY(y)

(ii) when the vectors V and W have a joint multivariate normal distribution, the conditional distribution of V given the value w of W is

V|W=w ~ N[ μV + ΣVW ΣW^{-1}(w − μW),  ΣV − ΣVW ΣW^{-1} ΣVW^T ]

where ΣVW is the covariance matrix of the vectors V and W, ΣW is the covariance matrix of W with itself, and ΣV is the covariance matrix of V with itself.
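A compact numerical sketch of the conditional multivariate normal result above; the mean vectors, covariance matrices, and the observed value w are assumed purely for illustration.

```python
import numpy as np

# assumed joint-normal parameters for V (2-vector) and W (1-vector)
mu_V = np.array([10.0, 5.0])
mu_W = np.array([3.0])
Sigma_V = np.array([[4.0, 1.0],
                    [1.0, 2.0]])        # Cov(V, V)
Sigma_W = np.array([[1.5]])             # Cov(W, W)
Sigma_VW = np.array([[0.8],
                     [0.5]])            # Cov(V, W)

w = np.array([4.0])                     # observed value of W

# conditional mean:       mu_V + Sigma_VW Sigma_W^{-1} (w - mu_W)
# conditional covariance: Sigma_V - Sigma_VW Sigma_W^{-1} Sigma_VW^T
Sigma_W_inv = np.linalg.inv(Sigma_W)
cond_mean = mu_V + Sigma_VW @ Sigma_W_inv @ (w - mu_W)
cond_cov = Sigma_V - Sigma_VW @ Sigma_W_inv @ Sigma_VW.T

print("E[V|W=w]   =", cond_mean)
print("Cov[V|W=w] =\n", cond_cov)
```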

∙Transformation

pU(u)= pX(x)|dx/du|

pUV(u,v) = pXY(x,y) |∂(x,y)/∂(u,v)|
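As an assumed illustration of the one-dimensional transformation rule, take X exponential with rate λ and U = X²; then x = √u, |dx/du| = 1/(2√u), so pU(u) = λ e^{-λ√u}/(2√u). The sketch below compares this analytic density with simulated samples.

```python
import numpy as np
from scipy.integrate import quad

rng = np.random.default_rng(0)
lam = 2.0
x = rng.exponential(scale=1.0 / lam, size=200_000)  # X ~ Exp(rate=lam)
u = x**2                                             # U = X^2

# analytic density from pU(u) = pX(x)|dx/du| with x = sqrt(u)
def p_u(u):
    return lam * np.exp(-lam * np.sqrt(u)) / (2.0 * np.sqrt(u))

# compare an empirical interval probability with the analytic one
a, b = 0.5, 1.5
empirical = np.mean((u >= a) & (u <= b))
analytic, _ = quad(p_u, a, b)
print(f"empirical prob({a}<=U<={b}) = {empirical:.4f}, analytic = {analytic:.4f}")
```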

∙Return Period

- Definition: the average interval in years between the occurrence of a flood of specified magnitude and an equal or larger flood (= recurrence interval)

TXx    PXx  probX ≧ x  p

- the probability that at least one event that equals or exceeds the T-year event will occur in any series of N years:

1 − (1 − p)^N

- the probability that the first exceedance of the T-year event occurs in year k:

p(1 − p)^(k−1)
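A short numerical sketch of the two return-period results above, assuming a 100-year event (p = 1/100) and a design life of N = 50 years.

```python
# assumed values: T-year event with T = 100, design life N = 50 years
T = 100
p = 1.0 / T          # annual exceedance probability of the T-year event
N = 50

# probability of at least one exceedance in N years
risk = 1.0 - (1.0 - p) ** N
print(f"P(at least one exceedance in {N} years) = {risk:.3f}")

# probability that the first exceedance occurs in year k (geometric distribution)
for k in (1, 10, 50):
    print(f"P(first exceedance in year {k}) = {p * (1 - p) ** (k - 1):.4f}")
```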


Chapter 3 Properties of Random Variables

< Statistics >

∙Summary Statistics

- measure of central tendency

(i) arithmetic mean: μX = E(X), x̄ =

((note)) E(a+bX) =     E(X+Y) =

(ii) others: median, mode, geometric mean, weighted mean

- measure of dispersion

(i) variance: Var(X) = σX² =     sX² =

((note)) Var(a+bX) =     Var(X±Y) =

(ii) others: standard deviation, coefficient of variation, range

- measure of symmetry: skewness coefficient

- measure of peakedness: kurtosis

- measures for jointly distributed random variables

(i) covariance: Cov(X,Y) = σXY =     sXY =

((note)) Cov(aX+b, cY+d) =

(ii) correlation coefficient: Corr(X,Y) = ρXY =     rXY =

((note)) Corr(aX+b, cY+d) =
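A brief sketch computing sample versions of these summary statistics with NumPy/SciPy; the two paired series are made-up illustrative values.

```python
import numpy as np
from scipy import stats

# made-up paired observations (e.g., rainfall x and runoff y)
x = np.array([3.1, 4.5, 2.8, 5.0, 3.9, 4.2, 6.1, 3.4])
y = np.array([1.2, 2.0, 1.1, 2.4, 1.7, 1.9, 3.0, 1.3])

print("mean      :", x.mean())
print("variance  :", x.var(ddof=1))                  # sample variance (n-1 denominator)
print("std dev   :", x.std(ddof=1))
print("coeff var :", x.std(ddof=1) / x.mean())
print("skewness  :", stats.skew(x, bias=False))
print("kurtosis  :", stats.kurtosis(x, bias=False))  # excess kurtosis
print("covariance:", np.cov(x, y, ddof=1)[0, 1])
print("corr coef :", np.corrcoef(x, y)[0, 1])
```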


∙Moment Generating Function

MX(t) = E[e^{tX}] = ∫_{-∞}^{∞} e^{tx} fX(x) dx for a continuous r.v.

- If the m.g.f. exists, its m-th derivative at the origin (t=0) is the m-th order moment of X about the origin.

dMX(0)/dt = μ'1,  d²MX(0)/dt² = μ'2,  ...,  d^m MX(0)/dt^m = μ'm

- sample moments

M1 = 0

M2 = M'2 − X̄²

M3 = M'3 − 3X̄M'2 + 2X̄³  etc.

where M'r is the r-th sample moment about the origin and Mr is the r-th sample moment about the sample mean.

((example))
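One possible worked example for this slot, assuming an exponential distribution with rate λ: its m.g.f. is MX(t) = λ/(λ − t) for t < λ, and differentiating at the origin recovers the raw moments (μ'1 = 1/λ, μ'2 = 2/λ²). The SymPy sketch below checks this.

```python
import sympy as sp

t, lam = sp.symbols("t lam", positive=True)

# m.g.f. of an exponential distribution with rate lam (valid for t < lam)
M = lam / (lam - t)

# raw moments from derivatives at the origin: mu'_m = d^m M(0) / dt^m
mu1 = sp.diff(M, t, 1).subs(t, 0)   # first raw moment = 1/lam (the mean)
mu2 = sp.diff(M, t, 2).subs(t, 0)   # second raw moment = 2/lam**2

# variance = second moment about the mean = mu'_2 - (mu'_1)^2 = 1/lam**2
variance = sp.simplify(mu2 - mu1**2)
print(mu1, mu2, variance)
```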


< Parameter Estimation >

∙Precision/Accuracy

- precision: the ability of an estimator to provide repeated estimates that are close together.

((note)) 1. due to random error
2. measured with the variance of an estimator

- accuracy: precision + unbiasedness

((note)) measured with the MSE = variance + bias²

∙Properties of Estimators

- unbiasedness: A point estimator θ̂ is an unbiased estimator of the population parameter θ if E[θ̂] = θ. If the estimator is biased, the bias = E[θ̂] − θ.

- consistency: An estimator θ̂n, based on a sample of size n, is a consistent estimator of a parameter θ if, for any positive number ε, lim(n→∞) Pr[|θ̂n − θ| ≤ ε] = 1.

- efficiency: An estimator that has minimum MSE among all possible unbiased estimators is called an efficient estimator.

- sufficiency: Let a sample X1, X2, ..., Xn be drawn randomly from a population having a probability distribution with unknown parameter θ. Then the statistic T = f(X1, X2, ..., Xn) is said to be sufficient for estimating θ if the distribution of X1, X2, ..., Xn conditional on the statistic T is independent of θ.
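A small simulation sketch of bias and MSE, comparing the divide-by-n and divide-by-(n−1) variance estimators for an assumed normal population.

```python
import numpy as np

rng = np.random.default_rng(1)
true_var = 4.0                      # assumed population: Normal(mean=0, variance=4)
n, n_reps = 10, 100_000

samples = rng.normal(0.0, np.sqrt(true_var), size=(n_reps, n))
var_biased = samples.var(axis=1, ddof=0)    # divide by n
var_unbiased = samples.var(axis=1, ddof=1)  # divide by n-1

for name, est in [("ddof=0 (biased)", var_biased), ("ddof=1 (unbiased)", var_unbiased)]:
    bias = est.mean() - true_var
    mse = np.mean((est - true_var) ** 2)    # MSE = variance of estimator + bias^2
    print(f"{name:18s} bias = {bias:+.3f}  MSE = {mse:.3f}")
```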


∙Method of Moments Estimation

- estimates population parameters using the moments of samples

- equates the first m population moments of the distribution (written in terms of its m parameters) to the first m sample moments and solves for the parameters

- computationally simple

((example))
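One possible worked example for this slot, assuming a two-parameter gamma distribution: equating the population mean αβ and variance αβ² to the sample mean x̄ and sample variance s² gives α̂ = x̄²/s² and β̂ = s²/x̄.

```python
import numpy as np

rng = np.random.default_rng(2)
# synthetic sample from a gamma distribution (true shape alpha=2.0, scale beta=3.0)
data = rng.gamma(shape=2.0, scale=3.0, size=5_000)

xbar = data.mean()
s2 = data.var(ddof=1)

# method of moments: mean = alpha*beta, variance = alpha*beta^2
beta_hat = s2 / xbar
alpha_hat = xbar**2 / s2
print(f"alpha_hat = {alpha_hat:.3f}, beta_hat = {beta_hat:.3f}")  # close to 2.0 and 3.0
```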

∙Maximum Likelihood Estimation

- The best value of a parameter of a probability distribution should be that value which maximizes the likelihood or joint probability of occurrence of the observed sample

- likelihood function

- log likelihood function

((example))
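One possible worked example for this slot, assuming an exponential distribution: the log likelihood is ln L(λ) = n ln λ − λ Σxi, and setting d(ln L)/dλ = 0 gives λ̂ = n/Σxi = 1/x̄. The sketch below checks the closed form against a numerical maximization.

```python
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(3)
data = rng.exponential(scale=1.0 / 1.5, size=2_000)   # true rate lambda = 1.5

# negative log likelihood of an exponential sample: -(n*ln(lam) - lam*sum(x))
def neg_log_lik(lam):
    return -(len(data) * np.log(lam) - lam * data.sum())

res = minimize_scalar(neg_log_lik, bounds=(1e-6, 10.0), method="bounded")
print(f"numerical MLE        = {res.x:.4f}")
print(f"closed form 1/mean   = {1.0 / data.mean():.4f}")
```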

