Academic year: 2022
Page 1

Review

Gram-Schmidt orthogonalization and orthogonal complement

■■ Theorem 6.3: $V$ is an inner product space; $S = \{v_1, \cdots, v_k\} \subseteq V$ is orthogonal. Then

$y = \sum_{i=1}^k a_i v_i \;\Rightarrow\; a_j = \dfrac{\langle y, v_j \rangle}{\|v_j\|^2}, \quad j = 1, \cdots, k.$

■■ Corollary 6.3.1: If $S$ is orthonormal, $y = \sum_{i=1}^k a_i v_i \;\Rightarrow\; a_i = \langle y, v_i \rangle \;\Rightarrow\; y = \sum_{i=1}^k \langle y, v_i \rangle v_i.$

■■ Corollary 6.3.2: Orthogonality (of nonzero vectors) implies linear independence.

Page 2

■■ Gram-Schmidt orthogonalization:

$v_1 = w_1$

$v_2 = w_2 - \dfrac{\langle w_2, v_1 \rangle}{\|v_1\|^2} v_1$

$v_3 = w_3 - \dfrac{\langle w_3, v_1 \rangle}{\|v_1\|^2} v_1 - \dfrac{\langle w_3, v_2 \rangle}{\|v_2\|^2} v_2$

$\cdots$

■■ Theorem 6.4: $V$ is an inner product space; $S = \{w_1, \cdots, w_n\} \subseteq V$ is linearly independent. Then letting $v_1 = w_1$ and

$v_i = w_i - \sum_{j=1}^{i-1} \dfrac{\langle w_i, v_j \rangle}{\|v_j\|^2} v_j, \quad i = 2, \cdots, n,$

makes $S' = \{v_1, \cdots, v_n\}$ orthogonal and $\mathrm{span}(S') = \mathrm{span}(S)$.
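The recursion of Theorem 6.4 can be sketched directly in code. A minimal NumPy sketch (the function name and test vectors are my own, not from the lecture; `np.vdot(u, w)` computes $\langle w, u \rangle = \sum_i w_i \bar{u}_i$ in the lecture's convention):

```python
import numpy as np

def gram_schmidt(vectors):
    """Theorem 6.4: v_i = w_i - sum_{j<i} <w_i, v_j>/||v_j||^2 * v_j,
    applied to a list of linearly independent vectors w_1, ..., w_n."""
    V = []
    for w in vectors:
        v = np.array(w, dtype=complex)
        for u in V:
            # <w, u> with <x, y> = sum x_i * conj(y_i) is np.vdot(u, w)
            v = v - (np.vdot(u, w) / np.vdot(u, u)) * u
        V.append(v)
    return V

# orthogonalize w1 = (1,1,0), w2 = (1,0,1), w3 = (0,1,1)
v1, v2, v3 = gram_schmidt([[1, 1, 0], [1, 0, 1], [0, 1, 1]])
```

The resulting vectors are pairwise orthogonal and span the same subspace as the inputs (each $v_i$ is $w_i$ minus a combination of earlier $v_j$).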

Page 3

■■ Theorem 6.5: $V$ is an inner product space; $\dim(V) < \infty$. Then

1. $V$ has an orthonormal basis $\beta = \{v_1, \cdots, v_n\}$.

2. $\forall x \in V,\; x = \sum_{i=1}^n \langle x, v_i \rangle v_i.$ That is, $[x]_\beta = (\langle x, v_1 \rangle, \cdots, \langle x, v_n \rangle)^t$.

2 is called a Fourier series expansion.

$\langle x, v_i \rangle$ are called Fourier coefficients.

$\|x\|^2 = \left\langle \sum_{i=1}^n \langle x, v_i \rangle v_i,\; \sum_{j=1}^n \langle x, v_j \rangle v_j \right\rangle = \sum_{i=1}^n \sum_{j=1}^n \langle x, v_i \rangle \overline{\langle x, v_j \rangle} \langle v_i, v_j \rangle = \sum_{i=1}^n |\langle x, v_i \rangle|^2$ : Parseval's relation

example: $C([0, 2\pi), \mathbb{C})$ with $\langle f, g \rangle = \frac{1}{2\pi} \int_0^{2\pi} f(t) \overline{g(t)}\, dt$

$\beta = \{ f_k(t) = e^{ikt} : k \in \mathbb{Z} \}$ is an orthonormal basis.

$\langle f, f_k \rangle = \frac{1}{2\pi} \int_0^{2\pi} f(t) e^{-ikt}\, dt$ : $k$-th Fourier coefficient of $f$
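The Fourier expansion and Parseval's relation can be sanity-checked numerically in $\mathbb{C}^4$; a sketch with an orthonormal basis taken from a QR factorization (the specific matrices and seed are my own choice, not from the lecture):

```python
import numpy as np

rng = np.random.default_rng(0)
# columns of Q form an orthonormal basis {v_1,...,v_4} of C^4
Q, _ = np.linalg.qr(rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4)))
x = rng.standard_normal(4) + 1j * rng.standard_normal(4)

# Fourier coefficients <x, v_i> (np.vdot conjugates its first argument)
coeffs = np.array([np.vdot(Q[:, i], x) for i in range(4)])
x_rebuilt = Q @ coeffs                  # x = sum <x, v_i> v_i
norm_sq = np.vdot(x, x).real            # ||x||^2
parseval = np.sum(np.abs(coeffs) ** 2)  # sum |<x, v_i>|^2
```

Both the reconstruction $x = \sum_i \langle x, v_i \rangle v_i$ and $\|x\|^2 = \sum_i |\langle x, v_i \rangle|^2$ hold to machine precision because $Q$ is unitary.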

Page 4

■■ Corollary 6.5: $V$ is an inner product space; $\dim(V) < \infty$; $\beta = \{v_1, \cdots, v_n\}$ is an orthonormal basis for $V$; $T$ is a linear operator on $V$; $A = [T]_\beta$. Then $A_{ij} = \langle T(v_j), v_i \rangle$.

■■ orthogonal complement $S^\perp$ of $S$: $S^\perp = \{ x \in V : \langle x, y \rangle = 0,\; \forall y \in S \}$

Do not confuse this with the set complement.

$S^\perp$ is a subspace for any set $S$.

$S \subseteq (S^\perp)^\perp$, but if $S$ is a subspace, then $S = (S^\perp)^\perp$.

$\{0\}^\perp = V, \quad V^\perp = \{0\}$
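In $F^n$ the orthogonal complement of a spanned set can be computed numerically as a null space; a sketch using the SVD (the function name and example are mine, not from the lecture):

```python
import numpy as np

def orthogonal_complement(S, tol=1e-12):
    """Columns of the result are an orthonormal basis of span(S)^perp,
    where the rows of S span a subspace of F^n.
    <x, s> = 0 for every row s  <=>  conj(S) x = 0."""
    S = np.atleast_2d(np.asarray(S, dtype=complex))
    _, sv, Vh = np.linalg.svd(np.conj(S))
    rank = int(np.sum(sv > tol))
    return Vh[rank:].conj().T

# W = span{(1,0,0), (0,1,0)} in R^3  =>  W^perp = span{(0,0,1)}
C = orthogonal_complement([[1, 0, 0], [0, 1, 0]])
```

This also illustrates $\dim(V) = \dim(W) + \dim(W^\perp)$ from Theorem 6.7 below: here $3 = 2 + 1$.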

[End of Review]

Page 5

■■ Theorem 6.6 and Corollary 6.6: $V$ is an inner product space; $W$ is a subspace; $\dim(W) < \infty$; $\{v_1, \cdots, v_k\}$ is an orthonormal basis for $W$; $y \in V$. Then

1. $y = u + z$, where $u \in W$ is unique and closest to $y$, and if $\dim(V) < \infty$, $z \in W^\perp$ is unique and closest to $y$.

2. $u = \sum_{i=1}^k \langle y, v_i \rangle v_i.$

proof:

Let $u = \sum_{i=1}^k \langle y, v_i \rangle v_i$ and $z = y - u$.

$\Rightarrow u \in W$

To show $z \in W^\perp$, we only need to show $\langle z, v_j \rangle = 0,\; j = 1, \cdots, k$.


Page 6

$\langle z, v_j \rangle = \left\langle y - \sum_{i=1}^k \langle y, v_i \rangle v_i,\; v_j \right\rangle$

$= \langle y, v_j \rangle - \left\langle \sum_{i=1}^k \langle y, v_i \rangle v_i,\; v_j \right\rangle = \langle y, v_j \rangle - \sum_{i=1}^k \langle y, v_i \rangle \langle v_i, v_j \rangle$

$= 0 \quad [\langle v_i, v_j \rangle = \delta_{ij}] \quad \Rightarrow z \in W^\perp$

"unique": Let $y = u' + z'$, $u' \in W$, $z' \in W^\perp$.

$\Rightarrow u - u' = z' - z$, $u - u' \in W$, $z' - z \in W^\perp$ $[y = u + z]$

$\Rightarrow u - u' = z' - z \in W \cap W^\perp = \{0\}$

$\Rightarrow u' = u$ and $z' = z$

"closest": For any $x$ in $W$,

$\|y - x\|^2 = \|(y - u) + (u - x)\|^2$

$= \langle (y - u) + (u - x),\; (y - u) + (u - x) \rangle$

$= \|y - u\|^2 + \|u - x\|^2 + \langle y - u, u - x \rangle + \langle u - x, y - u \rangle$

Page 7

$\Rightarrow \|y - x\|^2 = \|y - u\|^2 + \|u - x\|^2 \quad [y - u \in W^\perp,\; u - x \in W]$

$\Rightarrow \|y - x\|^2 \ge \|y - u\|^2$

A similar proof shows that $z$ is unique and closest to $y$.

■■ orthogonal projection of $y$ on $W$: $P_W(y) = \sum_{i=1}^k \langle y, v_i \rangle v_i$, where $\{v_1, \cdots, v_k\}$ is an orthonormal basis for $W$.

(Figure: a non-orthogonal projection, with $V = W_1 \oplus W_2$.)
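The projection formula $P_W(y) = \sum_i \langle y, v_i \rangle v_i$ and the two claims of Theorem 6.6 (the residual lies in $W^\perp$; $u$ is closest) can be checked directly. A small sketch (function name and example vectors mine):

```python
import numpy as np

def project(y, onb):
    """P_W(y) = sum <y, v_i> v_i for an orthonormal list onb spanning W."""
    return sum(np.vdot(v, y) * np.asarray(v, dtype=complex) for v in onb)

e1 = np.array([1.0, 0.0, 0.0])
e2 = np.array([0.0, 1.0, 0.0])
y = np.array([1.0, 2.0, 3.0])

u = project(y, [e1, e2])   # closest vector in W = span{e1, e2}
z = y - u                  # lies in W^perp
```

Here $u = (1, 2, 0)$ and $z = (0, 0, 3)$, and $\|y - u\| \le \|y - x\|$ for any other $x \in W$.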

■■ example: Consider the inner product space $P_2(\mathbb{R})$ with $\langle f, g \rangle = \int_{-1}^1 f(x) g(x)\, dx$.

We found earlier the orthonormal basis by the G-S process:

$\beta = \{v_1, v_2, v_3\} = \left\{ \sqrt{\tfrac12},\; \sqrt{\tfrac32}\, x,\; \sqrt{\tfrac58}\,(3x^2 - 1) \right\}.$


Page 8

Let $f(x) = a + bx + cx^2$.

$\langle f, v_1 \rangle = \int_{-1}^1 (a + bx + cx^2)\, \sqrt{\tfrac12}\, dx = \frac{\sqrt2}{3}(3a + c)$

$\langle f, v_2 \rangle = \int_{-1}^1 (a + bx + cx^2)\, \sqrt{\tfrac32}\, x\, dx = \sqrt{\tfrac23}\, b$

$\langle f, v_3 \rangle = \int_{-1}^1 (a + bx + cx^2)\, \sqrt{\tfrac58}\,(3x^2 - 1)\, dx = \frac23 \sqrt{\tfrac25}\, c$

$f(x) = \langle f, v_1 \rangle v_1 + \langle f, v_2 \rangle v_2 + \langle f, v_3 \rangle v_3 = \frac{\sqrt2}{3}(3a + c) \cdot \sqrt{\tfrac12} + \sqrt{\tfrac23}\, b \cdot \sqrt{\tfrac32}\, x + \frac23 \sqrt{\tfrac25}\, c \cdot \sqrt{\tfrac58}\,(3x^2 - 1) = a + bx + cx^2$

Let $W = \mathrm{span}(\{v_1, v_3\}) = \mathrm{span}\left(\left\{ \sqrt{\tfrac12},\; \sqrt{\tfrac58}\,(3x^2 - 1) \right\}\right) = \mathrm{span}(\{1, x^2\})$

$P_W(f) = \langle f, v_1 \rangle v_1 + \langle f, v_3 \rangle v_3 = \frac{\sqrt2}{3}(3a + c) \cdot \sqrt{\tfrac12} + \frac23 \sqrt{\tfrac25}\, c \cdot \sqrt{\tfrac58}\,(3x^2 - 1) = a + cx^2$

$P_{W^\perp}(f) = \langle f, v_2 \rangle v_2 = bx$

Page 9

If instead we let $W = \mathrm{span}(\{v_1, v_2\}) = \mathrm{span}\left(\left\{ \sqrt{\tfrac12},\; \sqrt{\tfrac32}\, x \right\}\right) = \mathrm{span}(\{1, x\})$:

$P_W(f) = \langle f, v_1 \rangle v_1 + \langle f, v_2 \rangle v_2 = \left(a + \tfrac{c}{3}\right) + bx$

$P_{W^\perp}(f) = \langle f, v_3 \rangle v_3 = c\left(x^2 - \tfrac13\right)$

So $(a + \tfrac{c}{3}) + bx$ is the polynomial in $W$ that is "closest" to $f$, and $c(x^2 - \tfrac13)$ is the polynomial in $W^\perp$ that is "closest" to $f$.

Note that $\mathrm{span}(\{1, x\})$ and $\mathrm{span}(\{x^2\})$ are not orthogonal complements of each other, while $\mathrm{span}(\{(1,0,0), (0,1,0)\})$ and $\mathrm{span}(\{(0,0,1)\})$ are. Why?

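These polynomial projections can be verified with exact integration of coefficient arrays over $[-1, 1]$; a sketch (the helper `inner` and the sample values of $a, b, c$ are mine):

```python
import numpy as np

def inner(f, g):
    """<f, g> = integral_{-1}^{1} f(x)g(x) dx for coefficient arrays
    [c0, c1, ...] meaning c0 + c1*x + ...; odd powers integrate to 0."""
    prod = np.polynomial.polynomial.polymul(f, g)
    return sum(2.0 * c / (k + 1) for k, c in enumerate(prod) if k % 2 == 0)

# orthonormal basis sqrt(1/2), sqrt(3/2) x, sqrt(5/8)(3x^2 - 1)
v1 = np.array([np.sqrt(0.5), 0.0, 0.0])
v2 = np.array([0.0, np.sqrt(1.5), 0.0])
v3 = np.sqrt(5 / 8) * np.array([-1.0, 0.0, 3.0])

a, b, c = 1.0, 2.0, 3.0
f = np.array([a, b, c])                    # f = a + bx + cx^2

u = inner(f, v1) * v1 + inner(f, v3) * v3  # P_W(f), W = span{v1, v3}
w = inner(f, v2) * v2                      # P_{W^perp}(f)
```

With these values $u$ comes out as $a + cx^2$ and $w$ as $bx$, matching the computation above.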

Page 10

■■ Theorem 6.7: $V$ is an inner product space; $\dim(V) = n$; $S = \{v_1, \cdots, v_k\}$ is an orthonormal set. Then

1. $S$ can be extended to an orthonormal basis $\{v_1, \cdots, v_k, v_{k+1}, \cdots, v_n\}$ for $V$.

2. $W = \mathrm{span}(S) \Rightarrow \{v_{k+1}, \cdots, v_n\}$ is a basis for $W^\perp$.

3. For any subspace $W$ of $V$, $\dim(V) = \dim(W) + \dim(W^\perp)$.

proof of 2: $\forall x \in W^\perp,\; x = \sum_{i=1}^n a_i v_i$ [basis for $V$]

$\Rightarrow \langle x, v_i \rangle = a_i = 0,\; i = 1, \cdots, k$ [orthogonal complement]

$\Rightarrow x = \sum_{i=k+1}^n a_i v_i$, so $\{v_{k+1}, \cdots, v_n\}$ spans (and is an orthonormal, hence independent, subset of) $W^\perp$.

Page 11

Adjoint

■■ The adjoint $T^*$ of a linear operator $T$ on a finite-dimensional inner product space $V$ is another operator on $V$ such that for any $x$ and $y$ in $V$, $\langle T(x), y \rangle = \langle x, T^*(y) \rangle$.

On $F^n$, this corresponds to the conjugate transpose $A^* = \bar{A}^t$ of the matrix $A$. $A^*$ is called the adjoint of $A$.

$\langle x, y \rangle = \sum_{i=1}^n x_i \bar{y}_i = \bar{y}^t x = y^* x$

$\langle Ax, y \rangle = y^*(Ax) = (A^* y)^* x = \langle x, A^* y \rangle.$ That is, $(L_A)^* = L_{A^*}$.

In $\mathbb{R}^2$, if $T$ is rotation by $\theta$, then $T^* = T^{-1}$ is rotation by $-\theta$, with matrix $\begin{pmatrix} \cos\theta & \sin\theta \\ -\sin\theta & \cos\theta \end{pmatrix}$ (the conjugate transpose in matrix representation); and if $T$ is projection onto the $x$ axis, then $T^* = T$.

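The identity $\langle Ax, y \rangle = \langle x, A^* y \rangle$ is easy to sanity-check numerically; a sketch with a random complex matrix (matrix and seed mine, not the lecture's):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))
A_star = A.conj().T            # adjoint = conjugate transpose

x = rng.standard_normal(3) + 1j * rng.standard_normal(3)
y = rng.standard_normal(3) + 1j * rng.standard_normal(3)

lhs = np.vdot(y, A @ x)        # <Ax, y> = y*(Ax)
rhs = np.vdot(A_star @ y, x)   # <x, A*y> = (A*y)* x
```

The two inner products agree to machine precision, and $(A^*)^* = A$, previewing Theorem 6.11(d).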

Page 12

■■ For the derivation of the adjoint, consider a scalar-valued linear transformation $g : V \to F$, which is also called a functional.

■■ Theorem 6.8: $V$ is an inner product space; $\dim(V) < \infty$; $g : V \to F$ is a linear transformation. Then $\exists y$, unique in $V$, such that $\forall x \in V,\; g(x) = \langle x, y \rangle$.

proof: Let $\beta = \{v_1, \cdots, v_n\}$ be an orthonormal basis for $V$, and, for the given $g$, let $y = \sum_{i=1}^n \overline{g(v_i)}\, v_i$. Then $\forall x \in V$,

$g(x) = g\!\left(\sum_{i=1}^n \langle x, v_i \rangle v_i\right)$ [orthonormal basis]

$= \sum_{i=1}^n \langle x, v_i \rangle\, g(v_i)$ [linear]

$= \sum_{i=1}^n \left\langle x, \overline{g(v_i)}\, v_i \right\rangle = \left\langle x, \sum_{i=1}^n \overline{g(v_i)}\, v_i \right\rangle = \langle x, y \rangle$

"uniqueness": $\forall x \in V,\; g(x) = \langle x, y \rangle = \langle x, y' \rangle \Rightarrow y = y'$ [Thm 6.1-5]

Note that this $y$ is not affected by the choice of the basis.


Page 13

■■ example: $g : \mathbb{R}^3 \to \mathbb{R}$ such that $g((a_1, a_2, a_3)) = 2a_1 - a_2 + 4a_3$.

$y = \sum_{i=1}^n \overline{g(v_i)}\, v_i = g(e_1)e_1 + g(e_2)e_2 + g(e_3)e_3 = (2, -1, 4)$

$g((a_1, a_2, a_3)) = \langle (a_1, a_2, a_3), (2, -1, 4) \rangle = 2a_1 - a_2 + 4a_3$
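The representing vector can be built exactly as in the proof of Theorem 6.8; a sketch on $F^n$ with the standard basis (the function name is mine). For the example functional it recovers $y = (2, -1, 4)$:

```python
import numpy as np

def riesz_vector(g, n):
    """y = sum conj(g(e_i)) e_i for the standard orthonormal basis of F^n,
    so that g(x) = <x, y> = sum x_i * conj(y_i)."""
    E = np.eye(n)
    return np.array([np.conj(g(E[i])) for i in range(n)])

g = lambda x: 2 * x[0] - x[1] + 4 * x[2]
y = riesz_vector(g, 3)
x = np.array([1.0, 2.0, 3.0])
```

Here `np.vdot(y, x)` computes $\langle x, y \rangle$ in the lecture's convention and agrees with $g(x)$.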

■■ Theorem 6.9: $V$ is an inner product space; $\dim(V) < \infty$; $T$ is a linear operator on $V$. Then $\exists T^*$, a unique operator on $V$, such that

1. $\forall x, y \in V,\; \langle T(x), y \rangle = \langle x, T^*(y) \rangle$.

2. $T^*$ is linear.

proof: Fix $y$ and let $g(x) = \langle T(x), y \rangle$.

$\Rightarrow g$ is linear. [$T$ is linear; $\langle \cdot, y \rangle$ is linear.]

$\Rightarrow \forall x \in V,\; g(x) = \langle x, y' \rangle$ for some $y'$, which is unique in $V$. [Thm 6.8]

Define $T^*$ such that $T^*(y) = y'$.

$\Rightarrow \langle T(x), y \rangle = \langle x, y' \rangle = \langle x, T^*(y) \rangle$ : "1"


Page 14

"linearity": $\forall x \in V$,

$\langle x, T^*(c_1 y_1 + c_2 y_2) \rangle = \langle T(x), c_1 y_1 + c_2 y_2 \rangle$ [1]

$= \bar{c}_1 \langle T(x), y_1 \rangle + \bar{c}_2 \langle T(x), y_2 \rangle$

$= \bar{c}_1 \langle x, T^*(y_1) \rangle + \bar{c}_2 \langle x, T^*(y_2) \rangle$ [1]

$= \langle x, c_1 T^*(y_1) + c_2 T^*(y_2) \rangle$

$\Rightarrow T^*(c_1 y_1 + c_2 y_2) = c_1 T^*(y_1) + c_2 T^*(y_2)$ [Thm 6.1-5]

■■ Theorem 6.10: $V$ is an inner product space; $\dim(V) < \infty$; $\beta = \{v_1, \cdots, v_n\}$ is an orthonormal basis for $V$; $T$ is a linear operator on $V$. Then $[T^*]_\beta = [T]_\beta^*$.

proof: Let $A = [T]_\beta$ and $B = [T^*]_\beta$.

$\Rightarrow B_{ij} = \langle T^*(v_j), v_i \rangle$ [Corol 6.5]

$= \overline{\langle v_i, T^*(v_j) \rangle} = \overline{\langle T(v_i), v_j \rangle} = \overline{A_{ji}} = (A^*)_{ij}$


Page 1

Review

■■ Theorem 6.6 and Corollary 6.6: $V$ is an inner product space; $W$ is a subspace; $\dim(W) < \infty$; $\{v_1, \cdots, v_k\}$ is an orthonormal basis for $W$; $y \in V$. Then

1. $y = u + z$, where $u \in W$ is unique and closest to $y$, and if $\dim(V) < \infty$, $z \in W^\perp$ is unique and closest to $y$.

2. $u = \sum_{i=1}^k \langle y, v_i \rangle v_i.$

■■ orthogonal projection of $y$ on $W$: $P_W(y) = \sum_{i=1}^k \langle y, v_i \rangle v_i$, where $\{v_1, \cdots, v_k\}$ is an orthonormal basis for $W$.

Page 2

■■ Theorem 6.7: $V$ is an inner product space; $\dim(V) = n$; $S = \{v_1, \cdots, v_k\}$ is an orthonormal set. Then

1. $S$ can be extended to an orthonormal basis $\{v_1, \cdots, v_k, v_{k+1}, \cdots, v_n\}$ for $V$.

2. $W = \mathrm{span}(S) \Rightarrow \{v_{k+1}, \cdots, v_n\}$ is a basis for $W^\perp$.

3. For any subspace $W$ of $V$, $\dim(V) = \dim(W) + \dim(W^\perp)$.

■■ The adjoint $T^*$ of a linear operator $T$ on a finite-dimensional inner product space $V$ is another operator on $V$ such that for any $x$ and $y$ in $V$, $\langle T(x), y \rangle = \langle x, T^*(y) \rangle$.

On $F^n$, this corresponds to the conjugate transpose $A^* = \bar{A}^t$ of the matrix $A$. $A^*$ is called the adjoint of $A$.

$\langle Ax, y \rangle = y^*(Ax) = (A^* y)^* x = \langle x, A^* y \rangle$. That is, $(L_A)^* = L_{A^*}$.


Page 3

■■ Theorem 6.8: $V$ is an inner product space; $\dim(V) < \infty$; $g : V \to F$ is a linear transformation (functional). Then $\exists y$, unique in $V$, such that $\forall x \in V,\; g(x) = \langle x, y \rangle$, where $y = \sum_{i=1}^n \overline{g(v_i)}\, v_i$.

■■ example: $g : \mathbb{R}^3 \to \mathbb{R}$ such that $g((a_1, a_2, a_3)) = 2a_1 - a_2 + 4a_3$.

$y = g(e_1)e_1 + g(e_2)e_2 + g(e_3)e_3 = (2, -1, 4)$

■■ Theorem 6.9: $V$ is an inner product space; $\dim(V) < \infty$; $T$ is a linear operator on $V$. Then $\exists T^*$, a unique operator on $V$, such that

1. $\forall x, y \in V,\; \langle T(x), y \rangle = \langle x, T^*(y) \rangle$.

2. $T^*$ is linear.

■■ Theorem 6.10: $V$ is an inner product space; $\dim(V) < \infty$; $\beta = \{v_1, \cdots, v_n\}$ is an orthonormal basis for $V$; $T$ is a linear operator on $V$. Then $[T^*]_\beta = [T]_\beta^*$.

■■ [End of Review]

Page 4

■■ Corollary 6.10: $A$ is an $n \times n$ matrix. Then $L_{A^*} = (L_A)^*$.

proof: Let $\beta$ be the standard basis for $F^n$.

$[(L_A)^*]_\beta = [L_A]_\beta^*$ [Thm 6.10] $= A^* = [L_{A^*}]_\beta.$

Then the uniqueness of the representation completes the proof.

This can also be shown directly by taking the conjugate transpose of the matrix:

$\langle L_A(x), y \rangle = \langle Ax, y \rangle = y^*(Ax) = (A^* y)^* x = \langle x, A^* y \rangle = \langle x, L_{A^*}(y) \rangle \;\to\; (L_A)^* = L_{A^*}.$

■■ example: $T : \mathbb{C}^3 \to \mathbb{C}^3$; $\beta$ is the standard basis.

$T(a, b, c) = (2a + ib,\; b - 5ic,\; a + (1 - i)b + 3c)$

$[T]_\beta = \begin{pmatrix} 2 & i & 0 \\ 0 & 1 & -5i \\ 1 & 1 - i & 3 \end{pmatrix}, \quad [T^*]_\beta = [T]_\beta^* = \begin{pmatrix} 2 & 0 & 1 \\ -i & 1 & 1 + i \\ 0 & 5i & 3 \end{pmatrix}$

$T^*(a, b, c) = (2a + c,\; -ia + b + (1 + i)c,\; 5ib + 3c)$

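The computed $T^*$ can be confirmed numerically, checking both $[T^*]_\beta = [T]_\beta^*$ and $\langle T(v), w \rangle = \langle v, T^*(w) \rangle$; a sketch (the test vectors are mine):

```python
import numpy as np

# [T]_beta for T(a,b,c) = (2a+ib, b-5ic, a+(1-i)b+3c)
A = np.array([[2, 1j, 0],
              [0, 1, -5j],
              [1, 1 - 1j, 3]])

A_star = A.conj().T  # should represent T*(a,b,c) = (2a+c, -ia+b+(1+i)c, 5ib+3c)

def T_star(v):
    a, b, c = v
    return np.array([2 * a + c, -1j * a + b + (1 + 1j) * c, 5j * b + 3 * c])

v = np.array([1 + 1j, 2.0, 3j])
w = np.array([1.0, 1j, -2.0])
```

`A_star @ v` matches `T_star(v)` entrywise, and `np.vdot(w, A @ v)` (that is, $\langle Av, w \rangle$) equals `np.vdot(A_star @ w, v)`.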

Page 5

■■ example: Consider the inner product space $P_2(\mathbb{R})$ with $\langle f, g \rangle = \int_{-1}^1 f(x) g(x)\, dx$.

Let $f(x) = a + bx + cx^2$ and $g(x) = p + qx + rx^2$.

When $T(f) = f' = b + 2cx$, what does $T^*(g)$ look like?

$\beta = \left\{ \sqrt{\tfrac12},\; \sqrt{\tfrac32}\, x,\; \sqrt{\tfrac58}\,(3x^2 - 1) \right\}$ [orthonormal basis]

$[g]_\beta = \left( \sqrt2\, p + \frac{\sqrt2}{3} r,\; \frac{\sqrt6}{3} q,\; \frac{2\sqrt{10}}{15} r \right)^t$

$[T]_\beta = \begin{pmatrix} 0 & \sqrt3 & 0 \\ 0 & 0 & \sqrt{15} \\ 0 & 0 & 0 \end{pmatrix}, \quad [T^*]_\beta = [T]_\beta^* = \begin{pmatrix} 0 & 0 & 0 \\ \sqrt3 & 0 & 0 \\ 0 & \sqrt{15} & 0 \end{pmatrix}$

$[T^*(g)]_\beta = [T^*]_\beta\, [g]_\beta$


Page 6

$[T^*(g)]_\beta = \begin{pmatrix} 0 & 0 & 0 \\ \sqrt3 & 0 & 0 \\ 0 & \sqrt{15} & 0 \end{pmatrix} \begin{pmatrix} \sqrt2\, p + \frac{\sqrt2}{3} r \\ \frac{\sqrt6}{3} q \\ \frac{2\sqrt{10}}{15} r \end{pmatrix} = \begin{pmatrix} 0 \\ \sqrt6\,(p + \frac{r}{3}) \\ \sqrt{10}\, q \end{pmatrix}$

$T^*(p + qx + rx^2) = \sqrt6\,\left(p + \tfrac{r}{3}\right) \cdot \sqrt{\tfrac32}\, x + \sqrt{10}\, q \cdot \sqrt{\tfrac58}\,(3x^2 - 1)$

$= -\tfrac52 q + (3p + r)x + \tfrac{15}{2} q x^2$

We can confirm this result by computing

$\langle T(f), g \rangle = \int_{-1}^1 (b + 2cx)(p + qx + rx^2)\, dx = 2bp + \tfrac23 br + \tfrac43 cq$

$= \int_{-1}^1 (a + bx + cx^2)\left(-\tfrac52 q + (3p + r)x + \tfrac{15}{2} q x^2\right) dx = \langle f, T^*(g) \rangle.$
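The confirmation above can also be run numerically, again integrating coefficient arrays exactly over $[-1, 1]$; a sketch (the helper `inner` and the sample coefficient values are mine):

```python
import numpy as np

def inner(f, g):
    """<f, g> = integral_{-1}^{1} f(x)g(x) dx for coefficient arrays."""
    prod = np.polynomial.polynomial.polymul(f, g)
    return sum(2.0 * c / (k + 1) for k, c in enumerate(prod) if k % 2 == 0)

a, b, c = 1.0, 2.0, 3.0   # f = a + bx + cx^2
p, q, r = 4.0, 5.0, 6.0   # g = p + qx + rx^2

Tf = np.array([b, 2 * c, 0.0])                      # T(f) = f' = b + 2cx
Tstar_g = np.array([-2.5 * q, 3 * p + r, 7.5 * q])  # -(5/2)q + (3p+r)x + (15/2)q x^2

lhs = inner(Tf, np.array([p, q, r]))       # <T(f), g>
rhs = inner(np.array([a, b, c]), Tstar_g)  # <f, T*(g)>
```

Both sides equal $2bp + \tfrac23 br + \tfrac43 cq$ (here $44$) for these sample values.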

Page 8

■■ Theorem 6.11 and Corollary 6.11: $V$ is an inner product space; $T$ and $U$ are linear operators on $V$; $A$ and $B$ are $n \times n$ matrices; $c$ is a scalar. Then the following hold.

(a) $(T + U)^* = T^* + U^*$; $(A + B)^* = A^* + B^*$
(b) $(cT)^* = \bar{c}\, T^*$; $(cA)^* = \bar{c}\, A^*$
(c) $(TU)^* = U^* T^*$; $(AB)^* = B^* A^*$
(d) $(T^*)^* = T$; $(A^*)^* = A$
(e) $I^* = I$; $I_n^* = I_n$

proof of (c): $\forall x, y \in V$,

$\langle x, (TU)^*(y) \rangle = \langle (TU)(x), y \rangle = \langle T(U(x)), y \rangle$

$= \langle U(x), T^*(y) \rangle = \langle x, U^*(T^*(y)) \rangle$

$= \langle x, (U^* T^*)(y) \rangle.$

$L_{(AB)^*} = (L_{AB})^* = (L_A L_B)^*$ [Corol 6.10, Thm 2.15-6]

$= (L_B)^* (L_A)^* = L_{B^*} L_{A^*} = L_{B^* A^*}$

Page 9

proof of (d):

$\langle T(x), y \rangle = \langle x, T^*(y) \rangle = \overline{\langle T^*(y), x \rangle} = \overline{\langle y, (T^*)^*(x) \rangle} = \langle (T^*)^*(x), y \rangle \;\Rightarrow\; (T^*)^* = T$

■■ least squares approximation:

measurements: $(t_1, y_1), (t_2, y_2), \cdots, (t_m, y_m)$

approximation: Find $a$ and $b$ such that $y = at + b$, or find $a$, $b$ and $c$ such that $y = at^2 + bt + c$.

criterion: Minimize $E = \sum_{i=1}^m [y_i - (at_i + b)]^2$, or $E = \sum_{i=1}^m [y_i - (at_i^2 + bt_i + c)]^2$.


Page 10

$\Rightarrow E = \|y - Ax\|^2$, where $y = (y_1, \cdots, y_m)^t$, and

$A = \begin{pmatrix} t_1 & 1 \\ \vdots & \vdots \\ t_m & 1 \end{pmatrix}, \; x = \begin{pmatrix} a \\ b \end{pmatrix}$, or $A = \begin{pmatrix} t_1^2 & t_1 & 1 \\ \vdots & \vdots & \vdots \\ t_m^2 & t_m & 1 \end{pmatrix}, \; x = \begin{pmatrix} a \\ b \\ c \end{pmatrix}$

$\Rightarrow$ Find $x$ that minimizes $\|y - Ax\|^2$.


Page 11

■■ orthogonality principle:

The minimizing $x_0$ satisfies $\forall x,\; \langle Ax, y - Ax_0 \rangle = 0$

$W = R(L_A) = \{ Ax : x \in F^n \} = \left\{ \sum_i x_i a_i : x \in F^n \right\}$, where $a_i$ are the column vectors of $A$.

So $W$ is the column space of $A$.

The principle is quite general.

■■ Lemma 6.12.1: $A \in M_{m \times n}(F)$; $x \in F^n$; $y \in F^m$. Then $\langle Ax, y \rangle_m = \langle x, A^* y \rangle_n$.

proof: $\langle Ax, y \rangle_m = y^*(Ax) = (y^* A)x = (A^* y)^* x = \langle x, A^* y \rangle_n$

■■ Lemma 6.12.2: $A \in M_{m \times n}(F)$. Then $\mathrm{rank}(A^* A) = \mathrm{rank}(A)$. ($A^*$ is $n \times m$ and $A$ is $m \times n$, so $A^* A$ is $n \times n$.)


Page 12

proof: $(Ax = 0 \Rightarrow A^* A x = 0) \Rightarrow N(L_A) \subseteq N(L_{A^* A})$

$\Rightarrow \mathrm{nullity}(L_A) \le \mathrm{nullity}(L_{A^* A})$

$\Rightarrow \mathrm{rank}(L_{A^* A}) \le \mathrm{rank}(L_A)$ [dim thm: nullity + rank = $n$]

$\Rightarrow \mathrm{rank}(A^* A) \le \mathrm{rank}(A)$ [also from Thm 3.7]

So we need only to show that $N(L_{A^* A}) \subseteq N(L_A)$.

$x \in N(L_{A^* A}) \Rightarrow A^* A x = 0$

$\Rightarrow \langle A^* A x, x \rangle_n = \langle Ax, Ax \rangle_m = 0 \Rightarrow Ax = 0 \Rightarrow x \in N(L_A)$

Page 13

■■ Theorem 6.12: $A \in M_{m \times n}(F)$; $y \in F^m$. Then $\exists x_0 \in F^n$ such that

1. $A^* A x_0 = A^* y$

2. $\forall x \in F^n,\; \|Ax_0 - y\| \le \|Ax - y\|$

3. $\mathrm{rank}(A) = n \Rightarrow x_0 = (A^* A)^{-1} A^* y$.

proof: Let $W = \{ Ax : x \in F^n \} = R(L_A)$ [column space of $A$]

$\Rightarrow y = u + z$, $u \in W$, $z \in W^\perp$, such that $u \in W$ is unique and closest to $y$, and $z \in W^\perp$ is unique and closest to $y$. [Thm 6.6]

$\Rightarrow \exists x_0$ such that $u = A x_0$. : "2"

$\Rightarrow z = y - A x_0 \in W^\perp \Rightarrow \forall x \in F^n,\; \langle Ax, y - A x_0 \rangle = 0 \Rightarrow \forall x \in F^n,\; \langle x, A^*(y - A x_0) \rangle = 0$ : "1", orthogonality

"3" follows from Lemma 6.12.2 and "1".

■■ Corollary 6.12: $A \in M_{m \times n}(F)$; $\mathrm{rank}(A) = n$. Then $A^* A$ is invertible.
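Theorem 6.12 is exactly the normal-equations recipe for least squares; a line-fit sketch (the data points are mine, chosen so no exact solution exists):

```python
import numpy as np

# fit y = a t + b to (0,0), (1,1), (2,3) -- no exact solution
t = np.array([0.0, 1.0, 2.0])
y = np.array([0.0, 1.0, 3.0])
A = np.column_stack([t, np.ones_like(t)])  # rows (t_i, 1)

# 1: A*A x0 = A*y; since rank(A) = 2, part 3 gives x0 = (A*A)^{-1} A*y
x0 = np.linalg.solve(A.T @ A, A.T @ y)
residual = y - A @ x0                      # z = y - Ax0, lies in W^perp
```

The residual is orthogonal to the column space of $A$ (the orthogonality principle), and the fit beats any other candidate $x$.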
