CSE 586, Spring 2011 Advanced Computer Vision Procrustes Shape Analysis


(1)


CSE 586, Spring 2011

Advanced Computer Vision

Procrustes Shape Analysis

(2)

Credits

Lots of slides are due to Tim Cootes, University of Manchester.

For more info, see http://www.isbe.man.ac.uk/~bim/ (includes code for exploring active shape/appearance models).

(3)

Statistical Shape Modelling

•  Represent the shape using a set of points

•  Align a set of training examples, and compute a "mean shape"

•  Model the distribution of the variation of the shapes in the training set with respect to the mean shape

•  This allows us to analyze the major "modes" of shape variation and to generate realistic new shapes similar to those in the training set

(4)

Procrustes Analysis

[Figure: several shapes aligned to a reference by similarity transforms (R1, t1, s1), (R2, t2, s2), ..., (Rm, tm, sm)]

Procrustes Analysis

Align one shape with another (not symmetric)

Generalized Procrustes Analysis

Align a set of shapes with respect to some unknown "mean" shape (independent of the ordering of the shapes)

(5)

Why "Procrustes"?

In Greek mythology, Procrustes was a bandit who forced travellers to fit his bed by stretching them or cutting off their limbs; Procrustes analysis likewise forces one shape to fit another.

http://www.procrustes.nl/gif/illustr.gif

(6)

Aligning Two Shapes

•  Procrustes analysis:

–  Find the transformation which minimises

$$|x_1 - T(x_2)|^2$$

–  $T$ is a particular type of transformation that you want "shape" to be invariant to

–  Typically a similarity transformation

–  Involves solving for the best rotation, translation, and scale between the two sets of points [we talked about how to do this in earlier lectures]
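The optimal similarity transform has a closed-form solution built around an SVD (the approach from earlier lectures). Below is a minimal NumPy sketch, assuming shapes are stored as (K, 2) arrays of corresponding landmarks; the function name and layout are my own choices, not from the slides.

```python
import numpy as np

def align_similarity(x, y):
    """Similarity transform (s, R, t) minimising |x - (s R y + t)|^2
    for corresponding (K, 2) landmark arrays x and y."""
    mx, my = x.mean(axis=0), y.mean(axis=0)
    xc, yc = x - mx, y - my
    # Optimal rotation from the SVD of the 2x2 cross-covariance.
    U, S, Vt = np.linalg.svd(xc.T @ yc)
    D = np.diag([1.0, np.sign(np.linalg.det(U @ Vt))])  # no reflections
    R = U @ D @ Vt
    s = (S * np.diag(D)).sum() / np.sum(yc ** 2)        # optimal scale
    t = mx - s * (R @ my)
    return s, R, t

# Applying the transform: y_aligned = s * y @ R.T + t
```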

(7)

Steps in Similarity Alignment

Given a set of K points: Configuration

Translation normalization: Centered configuration (center of mass at the origin)

Scale normalization: Pre-shape (divide by the square root of the sum of squared centered coordinates)

Rotation normalization: Shape (rotate to align with a second normalized shape)
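The translation and scale steps fit in one small helper (the name is mine); the rotation step needs a reference shape and reuses the align_similarity sketch above.

```python
def to_preshape(x):
    """Map a (K, 2) configuration to its pre-shape: center the points,
    then divide by the sqrt of the sum of squared centered coords."""
    xc = x - x.mean(axis=0)
    return xc / np.sqrt(np.sum(xc ** 2))
```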

(8)

Generalized Procrustes Analysis

•  Generalized Procrustes Analysis:

–  Find the transformations $T_i$ which minimise $\sum_i |m - T_i(x_i)|^2$

–  where $m = \frac{1}{n} \sum_i T_i(x_i)$

–  under the constraint that $|m| = 1$

(9)

Aligning Shapes: Algorithm

•  Normalise all shapes so the center of mass is at the origin, and size = 1

•  Let $m = x_1$

•  Align each shape with $m$ (via a rotation)

•  Re-calculate $m = \frac{1}{n} \sum_i T_i(x_i)$

•  Normalise $m$ to default location, size, and orientation

•  Repeat until convergence
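A sketch of this loop using the earlier helpers; the iteration cap, the convergence test, and the omission of the mean's orientation normalization are my own simplifications.

```python
def generalized_procrustes(shapes, n_iters=50, tol=1e-10):
    """Align a list of (K, 2) shapes to their evolving mean."""
    shapes = [to_preshape(x) for x in shapes]    # center, size = 1
    mean = shapes[0].copy()                      # let m = x1
    for _ in range(n_iters):
        aligned = []
        for y in shapes:
            s, R, t = align_similarity(mean, y)  # align y with m
            aligned.append(s * y @ R.T + t)
        # Re-calculate the mean and re-normalize its location and size.
        new_mean = to_preshape(np.mean(aligned, axis=0))
        converged = np.sum((new_mean - mean) ** 2) < tol
        shapes, mean = aligned, new_mean
        if converged:
            break
    return mean, shapes
```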

(10)

Hand shape model

•  72 points placed around the boundary of the hand

–  18 hand outlines obtained by thresholding images of a hand on a white background

•  Primary landmarks chosen at the tips of the fingers and the joints between fingers

–  Other points spaced equally in between

[Figure: hand outline with the primary landmarks numbered 1–6]

(11)


Labeling Examples

From Stegmann and Gomez

(12)


Labeling Examples

From Stegmann and Gomez

(13)


Point Alignment

From Stegmann and Gomez

Training points before and after alignment

(14)


Hand Shape Model

Samples from training set, after alignment

(15)


Statistical Shape Analysis

From Stegmann and Gomez

Problem: each point DOES NOT move independently of all the other points. In fact, the movement of the points when the shape undergoes a deformation is often highly correlated.

Independent PCA on each data point cluster

(16)


Statistical Shape Analysis

From Stegmann and Gomez

Model how the entire ensemble of points varies.

We will concatenate all points into a single vector in a high dimensional space. The full hand shape model is then built by computing PCA on all model points collectively, which captures correlations of motion between points.
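As a concrete sketch of the concatenation step (the (K, 2) array layout is an assumption):

```python
# Stack each aligned (K, 2) shape as one 2K-dimensional column of X
X = np.stack([s.reshape(-1) for s in aligned_shapes], axis=1)  # (2K, n)
```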

Will describe how in a moment, but first let’s look at the results.

(17)


Hand Shape Model

Varying b1

Varying b2

Varying b3

First 3 modes of variation

(18)


Face Shape Example

Sample face training image

1st mode of variation

(19)

Shape Space

Consider K landmark points $(x_i, y_i)$ in 2D.

Consider each shape as a concatenated vector x, viewed as a point in 2K-dimensional space.

(Note: the "shape" really lives in a (2K−4)-dimensional space, but that is outside the scope of this lecture.)

We would like a parameterized model that describes the class of shapes we have observed.

(20)

Shape Space

•  Each observed shape is now a point (vector) x in 2K-dimensional space.

•  The "mean shape" is the center of mass of all these points

[Figure: shape space with each training shape as a point and the mean shape $\bar{x}$ at their center of mass]

(21)

Shape Models

•  For shape synthesis

–  A parameterised model is preferable

–  We will use a linear model

–  That is, $x = \bar{x} + b_1 p_1 + b_2 p_2 + \dots + b_m p_m$, or in matrix form, $x = \bar{x} + P b$
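Synthesizing a shape from this linear model is a single matrix-vector product; a minimal sketch (names are mine):

```python
def synthesize(x_mean, P, b):
    """Linear shape model x = x_mean + P @ b: x_mean is the (2K,)
    mean shape, P holds modes as columns (2K, m), b is (m,)."""
    return x_mean + P @ b
```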

(22)

Principal Component Analysis

•  Matrix X contains each shape as one column

•  Compute the eigenvectors of the scatter matrix $X X^T$

•  Eigenvectors: main directions

•  Eigenvalues: spread or variation along those directions

[Figure: 2D point cloud with principal axes $p_1$ and $p_2$; the spread along each axis is proportional to $\lambda_1$ and $\lambda_2$]
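A minimal NumPy sketch of this step; mean-subtraction is added here, since the principal directions are measured about the mean.

```python
def shape_pca(X):
    """PCA of shapes stored as the columns of X (2K x n): the
    eigenvectors of the scatter matrix of the centered data."""
    x_mean = X.mean(axis=1)
    Xc = X - x_mean[:, None]
    evals, evecs = np.linalg.eigh(Xc @ Xc.T)  # symmetric eigensolver
    order = np.argsort(evals)[::-1]           # largest spread first
    return x_mean, evecs[:, order], evals[order]
```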

(23)

Shape Space

•  The eigenvectors p1, p2 form a new (rotated) basis set of coordinate axes

•  Each shape x can now be assigned coordinates (b1, b2) according to the eigenvector axes

$$x = \bar{x} + b_1 p_1 + b_2 p_2$$

[Figure: shape space with the eigenvector axes $p_1$ and $p_2$ centered at the mean shape $\bar{x}$]

(24)

Dimensionality Reduction

•  Coordinates often highly correlated

•  Some dimensions account for most of the observed spread (variation) of the data

•  We don't lose much by approximating the shape using only those dimensions

$$x \approx \bar{x} + b_1 p_1$$

[Figure: data points clustered along the $p_1$ axis through $\bar{x}$; each shape is approximated by its single coordinate $b_1$]

(25)

Dimensionality Reduction

•  The data lies in a subspace of reduced dimension

•  In general, $x = \bar{x} + b_1 p_1 + \dots + b_n p_n$

•  However, for some $t$, $b_j \approx 0$ if $j > t$

•  (the variance of $b_j$ is $\lambda_j$)
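A common way to choose $t$ is to keep the smallest number of leading eigenvalues that explain a fixed fraction of the total variance; a sketch (the 0.98 threshold is illustrative, not from the slides):

```python
def choose_t(evals, frac=0.98):
    """Smallest t whose leading eigenvalues capture `frac` of the
    total variance (evals must be sorted in descending order)."""
    ratio = np.cumsum(evals) / np.sum(evals)
    return int(np.searchsorted(ratio, frac)) + 1
```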

(26)

Building Shape Models

•  Given aligned shapes $\{x_i\}$

•  Apply PCA

•  $P$: first $t$ eigenvectors of the covariance matrix

•  $b$: shape model parameters = coordinates of the shape along the eigenvector axes

$$x \approx \bar{x} + P b$$
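Tying the earlier sketches together, building the model and recovering the parameters of a training shape might look like this:

```python
x_mean, evecs, evals = shape_pca(X)  # X: aligned shapes as columns
t = choose_t(evals)
P = evecs[:, :t]                     # first t eigenvectors
b = P.T @ (X[:, 0] - x_mean)         # parameters of one shape
x_approx = x_mean + P @ b            # x ≈ x̄ + P b
```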

(27)

Statistical Shape Models

•  Represent the likelihood of observing a given shape with a probability distribution

•  Learn p(b) from the training set

•  If x is multivariate Gaussian, then

–  b is Gaussian with diagonal covariance $S_b = \mathrm{diag}(\lambda_1, \dots, \lambda_t)$

•  Or, can use a mixture model for p(b)
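Since each $b_j$ has variance $\lambda_j$, a simple plausibility check on a shape's parameters can be sketched as follows (the 3-sigma limit is a common convention, not stated on the slide):

```python
def is_plausible(b, evals_t, limit=3.0):
    """True if every parameter satisfies |b_j| <= limit * sqrt(lambda_j)
    under the diagonal-covariance Gaussian model."""
    return bool(np.all(np.abs(b) <= limit * np.sqrt(evals_t)))
```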

(28)

Background Information on PCA

•  Principal component analysis (PCA) is a technique that can be used to simplify a dataset

•  It is a linear transformation that chooses a new coordinate system for the data set such that the greatest variance of any projection of the data comes to lie on the first axis (then called the first principal component), the second greatest variance on the second axis, and so on

•  PCA can be used for reducing dimensionality by eliminating principal components that have small variance

(29)


PCA Toy Example

(30)


Geometrical Interpretation:

(31)


Geometrical Interpretation:

(32)

Principal Component Analysis (PCA)

•  Given a set of points, how do we know if they can be compressed as in the previous example?

–  The answer is to look at the correlation between the points

–  The tool for doing this is called PCA

(33)

Principal Component in 2D

(34)


PCA Theorem

(35)


PCA Theorem

(36)


PCA Theorem

(37)


PCA Theorem

(38)

Using PCA to Compress Data

•  Expressing x in terms of $e_1, \dots, e_n$ has not changed the size of the data

•  However, if the points are highly correlated, many of the coordinates of x will be zero or close to zero

(39)

Using PCA to Compress Data

•  Sort the eigenvectors $e_i$ according to their eigenvalues:

(40)


Example of Approximation

(41)


Implementing PCA

•  Need to find “first” k eigenvectors of Q:
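A dense-solver sketch of this step ($Q$ and $k$ as named on the slide; for a large $Q$, an iterative routine such as scipy.sparse.linalg.eigsh(Q, k=k) computes only the leading eigenpairs):

```python
def first_k_eigenvectors(Q, k):
    """Eigenvectors of the symmetric matrix Q with the k largest
    eigenvalues, via a full symmetric eigendecomposition."""
    evals, evecs = np.linalg.eigh(Q)
    idx = np.argsort(evals)[::-1][:k]
    return evecs[:, idx], evals[idx]
```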

(42)

Singular Value Decomposition (SVD)
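The slide's derivation did not survive extraction, but the link to PCA can be sketched: for the centered data matrix $X_c = U S V^T$, the columns of $U$ are the eigenvectors of $X_c X_c^T$ and the eigenvalues are the squared singular values.

```python
# PCA basis via SVD of the centered (2K x n) data matrix Xc
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
P_all, evals = U, S ** 2  # eigenvectors / eigenvalues of Xc @ Xc.T
```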

(43)


SVD Properties

(44)

More about Shape Space

Consider K landmark points $(x_i, y_i)$ in 2D.

The dimension of the space of observations is 2K.

But what is the dimension of the space of shapes?

Recall, two point configurations are equivalent (have the same shape) if one can be translated, rotated and scaled into the other.

Thus, each unique “shape” is an equivalence class.

(45)

Shape Space

The shape space is then the quotient space induced by this equivalence relation on the observation space.

So shape space has dimension 2K − 4: from the 2K point coordinates, translation removes 2 degrees of freedom, rotation 1, and scaling 1.

This is often called Kendall's Shape Space.

(46)

Shape Space

For triangles (K = 3), Kendall's shape space can be visualized as a sphere.

For more points, the space is a more complicated manifold (is it a hypersphere?)

Rather than measure variation in shape on the sphere (or higher-dimensional nonlinear manifold), it is common to project points into the tangent plane taken at the mean shape. This is a linear space, so we can use PCA, Gaussians, mixtures of Gaussians, etc.

(47)


Shape Space

http://www.york.ac.uk/res/fme/resources/morphologika/helpfiles/shapespace.html

(48)

Shape Space

Tangent projection can be carried out by subtracting the vector of normalized shape coordinates of a figure from the mean vector, and taking just the component perpendicular to the mean.

[Figure: nonlinear shape space with the tangent plane at the mean shape; a shape sample and its projection into the tangent plane]
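A sketch of that projection for unit-norm pre-shape vectors (the function name is mine, and the sign convention takes x − m rather than m − x, which gives the same perpendicular component up to sign):

```python
def tangent_project(x, m):
    """Project shape vector x into the tangent plane at the unit-norm
    mean m: subtract, then remove the component parallel to m."""
    v = x - m
    return v - np.dot(v, m) * m  # equals x - (x @ m) * m since |m| = 1
```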

(49)

Shape Space

If shape variation is relatively small, we can just take the shape difference vector as an approximation to the vector in the tangent plane.

[Figure: for small variation, the difference vector from the mean shape to a sample approximates its tangent-plane projection]
