Thursday, 23 August 2018

Matrices and Vector Spaces: Inner Product Spaces, Angles, Orthogonality, the Gram-Schmidt Process, and QR Decomposition



Inner Product

Definition: an inner product is a function that associates each pair of vectors u and v in a vector space V (written ⟨u,v⟩) with a real number and satisfies the following four axioms:

  • Symmetry: ⟨u,v⟩ = ⟨v,u⟩
  • Additivity: ⟨u+v, w⟩ = ⟨u,w⟩ + ⟨v,w⟩
  • Homogeneity: ⟨ku,v⟩ = k⟨u,v⟩, where k is a scalar
  • Positivity: ⟨u,u⟩ ≥ 0, and ⟨u,u⟩ = 0 if and only if u = 0

A vector space equipped with an inner product is called an inner product space.

Example:

Show that the standard dot product on R^3 is an inner product!

Solution:

Let a = (a_1, a_2, a_3), b = (b_1, b_2, b_3), and c = (c_1, c_2, c_3) be vectors in R^3.

We show that the standard dot product satisfies the four inner product axioms:

1. Symmetry
⟨a, b⟩ = a · b
= a_1 b_1 + a_2 b_2 + a_3 b_3
= b_1 a_1 + b_2 a_2 + b_3 a_3
= ⟨b, a⟩ (satisfied)

2. Additivity
⟨a+b, c⟩ = (a + b) · c
= (a_1 + b_1, a_2 + b_2, a_3 + b_3) · (c_1, c_2, c_3)
= (a_1 c_1 + b_1 c_1) + (a_2 c_2 + b_2 c_2) + (a_3 c_3 + b_3 c_3)
= (a_1 c_1 + a_2 c_2 + a_3 c_3) + (b_1 c_1 + b_2 c_2 + b_3 c_3)
= ⟨a, c⟩ + ⟨b, c⟩ (satisfied)

3. Homogeneity
⟨ka, b⟩ = (ka) · b
= k a_1 b_1 + k a_2 b_2 + k a_3 b_3
= k(a_1 b_1 + a_2 b_2 + a_3 b_3)
= k(a · b)
= k⟨a, b⟩ (satisfied)

4. Positivity
⟨a, a⟩ = a · a
= a_1^2 + a_2^2 + a_3^2 ≥ 0 (satisfied), and ⟨a, a⟩ = a_1^2 + a_2^2 + a_3^2 = 0
if and only if a = (0, 0, 0) = 0 (satisfied)
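
The same check can be run numerically. Below is a minimal NumPy sketch; the vectors a, b, c and the scalar k are arbitrary test values chosen only for illustration:

import numpy as np

a = np.array([1., 2., 3.])
b = np.array([4., 5., 6.])
c = np.array([7., 8., 9.])
k = 2.5

print(np.isclose(a @ b, b @ a))                  # symmetry
print(np.isclose((a + b) @ c, a @ c + b @ c))    # additivity
print(np.isclose((k * a) @ b, k * (a @ b)))      # homogeneity
print(a @ a >= 0)                                # positivity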


Euclidean Inner Product



The Euclidean inner product (the standard inner product) on R^n is defined as

⟨u,v⟩ = u⋅v = u_1 v_1+u_2 v_2+…+u_n v_n 

Inner products can be used to define notions of norm and distance. If u and v are vectors in Euclidean n-space, the norm and distance can be expressed as

‖u‖ = √⟨u,u⟩ = √(u_1^2 + u_2^2 + … + u_n^2)

d(u,v) = ‖u − v‖ = √((u_1 − v_1)^2 + (u_2 − v_2)^2 + … + (u_n − v_n)^2)


Example: 

Let u = (1,0) and v = (0,1) be vectors in R^2.

Compute the norms of these vectors and the distance between them with the Euclidean inner product.
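
For reference, a short NumPy computation of the requested quantities, using ‖u‖ = √⟨u,u⟩ and d(u,v) = ‖u − v‖ as above:

import numpy as np

u = np.array([1., 0.])
v = np.array([0., 1.])
print(np.linalg.norm(u))       # ||u|| = 1
print(np.linalg.norm(v))       # ||v|| = 1
print(np.linalg.norm(u - v))   # d(u, v) = sqrt(2) ≈ 1.4142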


Norm and Distance



Let V be a real inner product space.

The norm (or length) of a vector v in V is denoted by ‖v‖ and defined by ‖v‖ = √⟨v,v⟩.

The distance between two vectors u and v is denoted by d(u,v) and defined by d(u,v) = ‖u − v‖.


A vector of norm 1 is called a unit vector

Theorem : 

If u and v are vectors in a real inner product space V, and k is a scalar, then: 
  1. ‖v‖≥0 with equality if and only if v=0 
  2. ‖kv‖ = |k|‖v‖ 
  3. d(u,v) = d(v,u) 
  4. d(u,v) ≥ 0 with equality if and only if u=v

Example

Let u = (u_1,u_2) and v = (v_1,v_2) be vectors in R^2 

Verify that the inner product defined by

⟨u,v⟩ = 3u_1 v_1 + 2u_2 v_2

satisfies the four inner product axioms!

If u = (1,2) and v = (0,2) are vectors in R^2, compute their norms and the distance between them with this inner product!
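
A sketch of the second part in NumPy, assuming the weighted inner product ⟨u,v⟩ = 3u_1 v_1 + 2u_2 v_2 defined above:

import numpy as np

def inner(u, v):
    # weighted inner product <u, v> = 3*u1*v1 + 2*u2*v2
    return 3 * u[0] * v[0] + 2 * u[1] * v[1]

def norm(u):
    return np.sqrt(inner(u, u))

u = np.array([1., 2.])
v = np.array([0., 2.])
print(norm(u))        # ||u|| = sqrt(11)
print(norm(v))        # ||v|| = sqrt(8) = 2*sqrt(2)
print(norm(u - v))    # d(u, v) = sqrt(3)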


Standard Inner Product on M_nn



Let u = U and v = V be matrices in the vector space M_nn. Then

⟨u,v⟩ = tr(U^T V)

defines an inner product on M_nn, called the standard inner product on that space.
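
A minimal NumPy illustration of this inner product; the matrices U and V below are arbitrary and only for illustration:

import numpy as np

U = np.array([[1., 2.], [3., 4.]])
V = np.array([[0., 1.], [1., 0.]])

inner_UV = np.trace(U.T @ V)                 # <U, V> = tr(U^T V)
print(inner_UV)                              # 5.0
print(np.isclose(inner_UV, np.sum(U * V)))   # equals the sum of entrywise products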

Example:



Angle and Orthogonality in Inner Product Spaces



Angle Between Two Vectors

The “angle” between two nonzero vectors u and v in a real inner product space can be found from

cos θ = ⟨u,v⟩ / (‖u‖ ‖v‖), where 0 ≤ θ ≤ π
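
For instance, with the standard (trace) inner product on M_22 the cosine can be computed as follows; the matrices U and V here are arbitrary illustrations:

import numpy as np

def tr_inner(U, V):
    # standard inner product on M_nn: <U, V> = tr(U^T V)
    return np.trace(U.T @ V)

U = np.array([[1., 1.], [0., 1.]])
V = np.array([[1., 0.], [1., 1.]])

cos_theta = tr_inner(U, V) / np.sqrt(tr_inner(U, U) * tr_inner(V, V))
print(cos_theta)   # 2/3 for these particular matrices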


Example: Let M_22 have the standard inner product. Find the cosine of the angle between the vectors



Properties of Length and Distance in General Inner Product Spaces



If u,v, and w are vectors in a real inner product space V, then 
  1. ‖u+v‖ ≤ ‖u‖+‖v‖ ( Triangle Inequality for Vectors ) 
  2. d(u,v) ≤ d(u,w)+d(w,v) ( Triangle Inequality for distances )

Orthogonality

If u and v are nonzero vectors, then the angle between them is θ = π/2 if and only if ⟨u,v⟩ = 0.
Two vectors u and v in an inner product space V are called orthogonal if ⟨u,v⟩ = 0.

Example :

  1. The vectors u = (1,1) and v = (1,-1) are orthogonal with respect to the Euclidean inner product on R^2
  2. If M_22 has the standard inner product, then the matrices


Orthogonal Complements

W is a subspace of a real inner product space V. The set of all vectors in V that are orthogonal to every vector in W is called the orthogonal complement of W (denoted by W^⊥)

Theorem 1: If W is a subspace of a real inner product space V, then: 
  1. W^⊥ is a subspace of V 
  2. W∩W^⊥={0} 
Theorem 2: If W is a subspace of a real finite-dimensional inner product space V, then (W^⊥ )^⊥=W

Example: 

Let W be the subspace of R^6 spanned by the vectors 



Find a basis for the orthogonal complement of W 
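
One standard way to do this is to place the spanning vectors of W in the rows of a matrix A; then W^⊥ is the null space of A, since Ax = 0 says exactly that x is orthogonal to every row. A SciPy sketch; the two row vectors below are arbitrary placeholders, not the spanning vectors of this particular example:

import numpy as np
from scipy.linalg import null_space

# rows of A span W, so the null space of A is the orthogonal complement of W
A = np.array([[1., 0., 2., 0., 1., 0.],
              [0., 1., 0., 1., 0., 1.]])

B = null_space(A)    # columns form an (orthonormal) basis for W-perp
print(B.shape)       # (6, 4): dim W-perp = 6 - rank(A)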


Gram-Schmidt Process; QR-Decomposition



Orthogonal and Orthonormal Set

Definition:

A set of two or more vectors in a real inner product space is said to be orthogonal if all pairs of distinct vectors in the set are orthogonal 

An orthogonal set in which each vector has norm 1 is said to be orthonormal 


Example: 

Let v_1=(0,1,0), v_2=(1,0,1), v_3=(1,0,-1) and assume that R^3 has the Euclidean inner product.

  • Show that the set S={v_1,v_2,v_3} is orthogonal! 
  • Is S orthonormal?

Constructing an Orthonormal Set

To convert an orthogonal set of nonzero vectors into an orthonormal set:

  • multiply each vector v in the orthogonal set by the reciprocal of its length to create a unit vector 

A unit vector is a vector of norm 1. The process of multiplying a nonzero vector v by the reciprocal of its length is called normalizing v.


Example :

Normalize v_1 = (0,1,0), v_2 = (1,0,1), v_3 = (1,0,-1) 
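
A NumPy sketch covering both examples: check that the pairwise inner products vanish, then normalize each vector:

import numpy as np

vs = [np.array([0., 1., 0.]), np.array([1., 0., 1.]), np.array([1., 0., -1.])]

# all pairwise inner products are zero, so the set is orthogonal
print([vs[i] @ vs[j] for i in range(3) for j in range(i + 1, 3)])   # [0.0, 0.0, 0.0]

# it is not orthonormal, since ||v2|| = ||v3|| = sqrt(2); normalizing fixes that
qs = [v / np.linalg.norm(v) for v in vs]
print(qs)   # (0,1,0), (1/sqrt(2),0,1/sqrt(2)), (1/sqrt(2),0,-1/sqrt(2))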


Orthonormal Basis



If S={v_1,v_2,…,v_n} is an orthogonal set of nonzero vectors in an inner product space, then S is linearly independent. In an inner product space, a basis consisting of orthonormal vectors is called an orthonormal basis. A basis consisting of orthogonal vectors is called an orthogonal basis 

Example: 



The set S = {u_1, u_2, u_3} forms an orthonormal basis for R^3

If S = {v_1,v_2,…,v_n} is an orthogonal basis for an inner product space V, and if u is any vector in V, then

u = (⟨u,v_1⟩/‖v_1‖^2) v_1 + (⟨u,v_2⟩/‖v_2‖^2) v_2 + … + (⟨u,v_n⟩/‖v_n‖^2) v_n

If S = {v_1,v_2,…,v_n} is an orthonormal basis for an inner product space V, and if u is any vector in V, then

u = ⟨u,v_1⟩ v_1 + ⟨u,v_2⟩ v_2 + … + ⟨u,v_n⟩ v_n

Example: 

Show that the vectors 

w_1 = (0,2,0), w_2 = (3,0,3), w_3 = (-4,0,4) 


form an orthogonal basis for R^3 with the Euclidean inner product, and use that basis to find an orthonormal basis by normalizing each vector.

Express the vector u=(1,2,4) as a linear combination of the orthonormal basis vectors obtained in previous part.
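
A NumPy sketch of both parts, using the coordinate formulas above (coefficients ⟨u,v_i⟩/‖v_i‖^2 for the orthogonal basis, or ⟨u,q_i⟩ after normalizing):

import numpy as np

ws = [np.array([0., 2., 0.]), np.array([3., 0., 3.]), np.array([-4., 0., 4.])]
u = np.array([1., 2., 4.])

# pairwise inner products vanish, so {w1, w2, w3} is an orthogonal basis of R^3
print([ws[i] @ ws[j] for i in range(3) for j in range(i + 1, 3)])   # [0.0, 0.0, 0.0]

qs = [w / np.linalg.norm(w) for w in ws]       # orthonormal basis

coeffs = [u @ q for q in qs]                   # coordinates of u in the orthonormal basis
print(coeffs)                                  # [2, 15/sqrt(18), 12/sqrt(32)]
print(sum(c * q for c, q in zip(coeffs, qs)))  # reconstructs u = (1, 2, 4)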


Projection Theorem

W is a finite-dimensional subspace of an inner product space V. Every vector u in V can be expressed in exactly one way as 

u = w_1+w_2

where w_1 is in W and w_2 is in W^⊥:
  • w_1 = proj_W u 
  • w_2 = proj_(W^⊥ ) u=u-proj_w u

Calculating Orthogonal Projection

W is a finite-dimensional subspace of an inner product space V

1. If {v_1,v_2,…,v_r} is an orthogonal basis for W, and u is any vector in V, then

proj_W u = (⟨u,v_1⟩/‖v_1‖^2) v_1 + (⟨u,v_2⟩/‖v_2‖^2) v_2 + … + (⟨u,v_r⟩/‖v_r‖^2) v_r

2. If {v_1,v_2,…,v_r} is an orthonormal basis for W, and u is any vector in V, then

proj_W u = ⟨u,v_1⟩ v_1 + ⟨u,v_2⟩ v_2 + … + ⟨u,v_r⟩ v_r

Example:

Let R^3 have the Euclidean inner product, and let W be the subspace spanned by the orthonormal vectors v_1 = (0,1,0) and v_2 = (-4/5, 0, 3/5). Compute the orthogonal projection of u = (1,1,1) on W and the component of u orthogonal to W!
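
A NumPy sketch of this computation with the orthonormal-basis formula proj_W u = ⟨u,v_1⟩v_1 + ⟨u,v_2⟩v_2:

import numpy as np

v1 = np.array([0., 1., 0.])
v2 = np.array([-4/5, 0., 3/5])
u = np.array([1., 1., 1.])

proj = (u @ v1) * v1 + (u @ v2) * v2   # orthogonal projection of u onto W
perp = u - proj                        # component of u orthogonal to W
print(proj)   # (4/25, 1, -3/25)
print(perp)   # (21/25, 0, 28/25)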


Gram-Schmidt Process


Theorem: Every nonzero finite-dimensional inner product space has an orthonormal basis. The step-by-step construction of an orthogonal (or orthonormal) basis is called the Gram-Schmidt process. 


Example: 

Assume that the vector space R^3 has the Euclidean inner product. Apply the Gram-Schmidt process to transform the basis vectors 

u_1 = (1,1,1), u_2 = (0,1,1), u_3 = (0,0,1)

into an orthogonal basis {v_1,v_2,v_3}, and then normalize the orthogonal basis vectors to obtain an orthonormal basis {q_1,q_2,q_3}
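
A minimal classical Gram-Schmidt sketch in NumPy applied to these vectors; at each step the projections onto the previously constructed vectors are subtracted, and the results are normalized at the end:

import numpy as np

def gram_schmidt(us):
    vs = []
    for u in us:
        # subtract the projection of u onto every vector found so far
        v = u - sum((u @ w) / (w @ w) * w for w in vs)
        vs.append(v)
    return vs

us = [np.array([1., 1., 1.]), np.array([0., 1., 1.]), np.array([0., 0., 1.])]
vs = gram_schmidt(us)                       # orthogonal basis {v1, v2, v3}
qs = [v / np.linalg.norm(v) for v in vs]    # orthonormal basis {q1, q2, q3}
print(vs)   # (1,1,1), (-2/3,1/3,1/3), (0,-1/2,1/2)
print(qs)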


Extending Orthonormal Sets to Orthonormal Basis

If W is a finite-dimensional inner product space, then: 
  1. Every orthogonal set of nonzero vectors in W can be enlarged to an orthogonal basis for W 
  2. Every orthonormal set in W can be enlarged to an orthonormal basis for W

QR-Decomposition



If A is an m×n matrix with linearly independent column vectors, then A can be factored as 

A = QR 

where Q is an m×n matrix with orthonormal column vectors, and R is an n×n invertible upper triangular matrix
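
A short NumPy illustration; for concreteness the columns of A are taken to be the vectors from the Gram-Schmidt example above (an arbitrary choice with linearly independent columns):

import numpy as np

A = np.array([[1., 0., 0.],
              [1., 1., 0.],
              [1., 1., 1.]])

Q, R = np.linalg.qr(A)         # reduced QR: Q has orthonormal columns, R is upper triangular
print(Q)
print(R)
print(np.allclose(A, Q @ R))   # True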



Example: 

Find a QR-decomposition of



Source

MRV lecture slides: Inner Product Spaces





