Shape Analysis Homeworks

Homeworks in MIT 6.838: Shape Analysis.

Homework 1: Discrete and Smooth Curves

Discrete Curvature of a Plane Curve

Gradient $\nabla_{x_i} s$

$$\nabla_{x_i} s = \frac{x_i - x_{i-1}}{\|x_i - x_{i-1}\|_2} + \frac{x_i - x_{i+1}}{\|x_i - x_{i+1}\|_2}$$

Discrete Curvature

$$\kappa = 2\sin\frac{\theta}{2}$$

where $\theta$ is the turning (exterior) angle at the vertex.
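As a sanity check, the turning-angle curvature can be evaluated on a closed polyline with NumPy. This is an illustrative sketch, not the homework's code, and the function name `discrete_curvature` is my own:

```python
import numpy as np

def discrete_curvature(pts):
    """kappa_i = 2*sin(theta_i/2), where theta_i is the signed turning
    angle between the two edges meeting at vertex i (closed polyline)."""
    prev_e = pts - np.roll(pts, 1, axis=0)    # x_i - x_{i-1}
    next_e = np.roll(pts, -1, axis=0) - pts   # x_{i+1} - x_i
    cross = prev_e[:, 0]*next_e[:, 1] - prev_e[:, 1]*next_e[:, 0]
    dot = (prev_e * next_e).sum(axis=1)
    theta = np.arctan2(cross, dot)            # signed turning angle
    return 2.0 * np.sin(theta / 2.0)

# regular n-gon on the unit circle: every turning angle is 2*pi/n
n = 12
t = np.linspace(0.0, 2.0*np.pi, n, endpoint=False)
kappa = discrete_curvature(np.stack([np.cos(t), np.sin(t)], axis=1))
```

On a regular $n$-gon every turning angle is $2\pi/n$, so every vertex gets $\kappa = 2\sin(\pi/n)$.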

Curve Shortening Flow

$$x_i \leftarrow x_i - \left(\nabla_{x_i} s\right) h$$
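Combining the gradient of arc length with this update gives a one-line explicit integrator. A minimal sketch (helper names are mine; a circle should contract under the flow):

```python
import numpy as np

def grad_length(x):
    """nabla_{x_i} s for a closed polyline: the sum of unit vectors
    pointing from each neighbor toward x_i."""
    d_prev = x - np.roll(x, 1, axis=0)    # x_i - x_{i-1}
    d_next = x - np.roll(x, -1, axis=0)   # x_i - x_{i+1}
    return (d_prev / np.linalg.norm(d_prev, axis=1, keepdims=True)
            + d_next / np.linalg.norm(d_next, axis=1, keepdims=True))

def shortening_step(x, h):
    """One explicit step of curve shortening flow: x_i <- x_i - h*grad s."""
    return x - h * grad_length(x)

t = np.linspace(0.0, 2.0*np.pi, 64, endpoint=False)
circle = np.stack([np.cos(t), np.sin(t)], axis=1)
x1 = shortening_step(circle, h=0.01)   # the circle shrinks
```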

Discrete Elastic Rods

Homework 2: Surfaces and Curvature

Mean Curvature Flow with Explicit Integrator

$$\frac{p(t+\tau) - p(t)}{\tau} = -M^{-1}(p(t))\,L(p(t))\,p(t)$$

$$p(t+\tau) = p(t) - \tau\,M^{-1}(p(t))\,L(p(t))\,p(t)$$

Mean Curvature Flow with (Semi-)Implicit Integrator

$$\frac{p(t+\tau) - p(t)}{\tau} = -M^{-1}(p(t))\,L(p(t))\,p(t+\tau)$$

$$p(t+\tau) = \left(I + \tau\,M^{-1}(p(t))\,L(p(t))\right)^{-1} p(t)$$

Non-Singular Mean Curvature Flow

$$p(t+\tau) = \left(I + \tau\,M^{-1}(p(t))\,L(p(0))\right)^{-1} p(t)$$

Homework 3: Geodesics, Distance, and Metric Embedding

Swiss-Roll Dataset

Maximum Variance Unfolding

Homework 4: Laplacian and Vector Fields

Helmholtz Decomposition

  1. $\nabla \cdot V = \nabla \cdot \nabla\zeta = \Delta\zeta$
  2. $\nabla \times W = V - \nabla\zeta$
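These two steps (a Poisson solve for $\zeta$, then subtracting $\nabla\zeta$) can be sketched on a periodic 2-D grid, where the Poisson equation diagonalizes under the FFT. The function name and test field are my own choices:

```python
import numpy as np

def helmholtz_fft(V):
    """Helmholtz decomposition of a periodic field V (shape (2, n, n)) on
    [0, 2*pi)^2: solve Laplace(zeta) = div V in Fourier space, then set
    W = V - grad(zeta), which is divergence-free."""
    n = V.shape[1]
    k = np.fft.fftfreq(n, d=1.0/n)              # integer wavenumbers
    kx, ky = np.meshgrid(k, k, indexing="ij")
    Vx_h, Vy_h = np.fft.fft2(V[0]), np.fft.fft2(V[1])
    div_h = 1j*kx*Vx_h + 1j*ky*Vy_h             # Fourier transform of div V
    k2 = kx**2 + ky**2
    k2[0, 0] = 1.0                              # guard the zero mode
    zeta_h = -div_h / k2                        # -(kx^2+ky^2) zeta_h = div_h
    zeta_h[0, 0] = 0.0
    grad_zeta = np.real(np.stack([np.fft.ifft2(1j*kx*zeta_h),
                                  np.fft.ifft2(1j*ky*zeta_h)]))
    return grad_zeta, V - grad_zeta

# V = grad(sin x + sin y) plus the rotational field (sin y, -sin x)
n = 32
x = np.linspace(0.0, 2.0*np.pi, n, endpoint=False)
X, Y = np.meshgrid(x, x, indexing="ij")
V = np.stack([np.cos(X) + np.sin(Y), np.cos(Y) - np.sin(X)])
grad_zeta, W = helmholtz_fft(V)
```

On this test field the decomposition should return $\nabla\zeta = (\cos x, \cos y)$ and the divergence-free remainder $(\sin y, -\sin x)$.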

Geodesic Distance from the Laplacian

Heat Kernel & Normalized Gradient

Divergences

Geodesic Distances
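The three headings above are the stages of the heat method for geodesic distance: diffuse heat from the source, normalize (and negate) its gradient, then solve a Poisson equation against the divergence of that unit field. A minimal 1-D illustration with NumPy, assuming a unit-spaced chain of vertices; the function name and discretization are mine, and on a mesh the same steps would use the cotan Laplacian and per-face gradients:

```python
import numpy as np

def heat_method_1d(n, src, t=1.0):
    """Heat-method distance on a chain of n unit-spaced vertices."""
    # difference operator G (edges x vertices); L = G^T G is the Laplacian
    G = np.zeros((n - 1, n))
    idx = np.arange(n - 1)
    G[idx, idx], G[idx, idx + 1] = -1.0, 1.0
    L = G.T @ G

    # 1. heat kernel: one implicit diffusion step (I + t L) u = delta_src
    delta = np.zeros(n); delta[src] = 1.0
    u = np.linalg.solve(np.eye(n) + t*L, delta)

    # 2. normalized gradient: unit field pointing away from the source
    g = G @ u
    X = -g / np.abs(g)

    # 3.-4. divergence, then Poisson solve L phi = G^T X (up to a shift)
    phi = np.linalg.lstsq(L, G.T @ X, rcond=None)[0]
    return phi - phi[src]

d = heat_method_1d(21, src=10)   # recovers |i - 10| on this chain
```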

Parallel Transport from the Connection Laplacian

Operator Approach to Tangent Vector Fields

Scalar Field Advection

Manifold Optimization and Optimal Transport

Optimal Transport

(a) $W(p, q)$ measures the minimum cost of transporting the distribution $p$ to $q$.

(b) Let $K_\alpha = e^{-C/\alpha}$; then

$$\begin{aligned}
\alpha\,\mathrm{KL}(T \,\|\, K_\alpha) &= \alpha\sum_{ij} T_{ij}\left(\ln T_{ij} - \ln (K_\alpha)_{ij} - 1\right) \\
&= \alpha\left(\sum_{ij} T_{ij}(\ln T_{ij} - 1) - \sum_{ij} T_{ij}\ln (K_\alpha)_{ij}\right) \\
&= \alpha\left(\sum_{ij} T_{ij}\ln T_{ij} - 1\right) + \alpha\sum_{ij} T_{ij}\frac{C_{ij}}{\alpha} \\
&= \sum_{ij} T_{ij}C_{ij} + \alpha\left(\sum_{ij} T_{ij}\ln T_{ij} - 1\right)
\end{aligned}$$

(c) The Lagrangian can be expressed as:

$$L(T; \lambda, \mu) = \sum_{ij} T_{ij} C_{ij} + \alpha\left(\sum_{ij} T_{ij}\ln T_{ij} - 1\right) + \sum_i \lambda_i\left(\sum_j T_{ij} - p_i\right) + \sum_j \mu_j\left(\sum_i T_{ij} - q_j\right)$$

where $\lambda_i$ and $\mu_j$ are Lagrange multipliers.

Taking the derivative with respect to $T_{ij}$, we have:

$$\frac{\partial L}{\partial T_{ij}} = C_{ij} + \alpha(\ln T_{ij} + 1) + \lambda_i + \mu_j = 0$$

which implies:

$$T_{ij} = \exp\left\{-\frac{C_{ij} + \lambda_i + \mu_j}{\alpha} - 1\right\} = e^{-\frac{\lambda_i}{\alpha} - \frac{1}{2}}\, e^{-\frac{C_{ij}}{\alpha}}\, e^{-\frac{\mu_j}{\alpha} - \frac{1}{2}}$$

Thus,

$$T = \operatorname{diag}(v)\, K_\alpha\, \operatorname{diag}(w)$$

where

$$v_i = e^{-\frac{\lambda_i}{\alpha} - \frac{1}{2}}, \qquad w_j = e^{-\frac{\mu_j}{\alpha} - \frac{1}{2}}$$

(d) When α is small enough, we have

$$C_{ij} = \frac{d^2(x_i, x_j)}{2} \approx -\alpha \ln H_{\alpha/2}(x_i, x_j)$$

Thus,

$$H_{\alpha/2}(x_i, x_j) \approx e^{-\frac{C_{ij}}{\alpha}} = (K_\alpha)_{ij}$$

(e) If $k$ is odd, the Lagrangian can be expressed as

$$\begin{aligned}
L(T; \lambda) &= \mathrm{KL}\left(T \,\middle\|\, T^{(k-1)}\right) + \sum_i \lambda_i\left(\sum_j T_{ij} - p_i\right) \\
&= \sum_{ij} T_{ij}\left(\ln T_{ij} - \ln T^{(k-1)}_{ij} - 1\right) + \sum_i \lambda_i\left(\sum_j T_{ij} - p_i\right) \\
&= \sum_{ij} T_{ij}\left(\ln T_{ij} - \ln T^{(k-1)}_{ij}\right) + \sum_i \lambda_i\left(\sum_j T_{ij} - p_i\right) - 1
\end{aligned}$$

Taking the derivative with respect to $T_{ij}$, we have

$$\frac{\partial L}{\partial T_{ij}} = \left(\ln T_{ij} - \ln T^{(k-1)}_{ij}\right) + 1 + \lambda_i = 0$$

Thus,

$$T_{ij} = T^{(k-1)}_{ij}\, e^{-1-\lambda_i}$$

This implies that

$$T^{(k)} = \operatorname{diag}\left(\tilde{v}^{(k)}\right) T^{(k-1)}$$

where

$$\tilde{v}^{(k)}_i = \frac{1}{e^{1+\lambda_i}}$$

Similarly, when $k$ is even, we have

$$T_{ij} = T^{(k-1)}_{ij}\, e^{-1-\mu_j}$$

where $\mu_j$ is the Lagrange multiplier. Then,

$$T^{(k)} = T^{(k-1)}\operatorname{diag}\left(\tilde{w}^{(k)}\right), \qquad \tilde{w}^{(k)}_j = \frac{1}{e^{1+\mu_j}}$$

Putting these together, we derive the iteration

$$T^{(k)} = \operatorname{diag}\left(v^{(k)}\right) T^{(0)}\operatorname{diag}\left(w^{(k)}\right) = \operatorname{diag}\left(v^{(k)}\right) H_{\alpha/2}\operatorname{diag}\left(w^{(k)}\right)$$

It remains to determine the vectors $v^{(k)}$ and $w^{(k)}$. Suppose $k$ is odd; then the constraint on $p$ gives

$$p_i = \sum_j T^{(k)}_{ij} = \tilde{v}^{(k)}_i \sum_j T^{(k-1)}_{ij}$$

Thus,

$$\tilde{v}^{(k)}_i = \frac{p_i}{\sum_j T^{(k-1)}_{ij}} = \frac{p_i}{\sum_j T^{(k-2)}_{ij}\,\tilde{w}^{(k-1)}_j}$$

Similarly, when $k$ is even we have

$$q_j = \sum_i T^{(k)}_{ij} = \tilde{w}^{(k)}_j \sum_i T^{(k-1)}_{ij}, \qquad \tilde{w}^{(k)}_j = \frac{q_j}{\sum_i T^{(k-1)}_{ij}} = \frac{q_j}{\sum_i T^{(k-2)}_{ij}\,\tilde{v}^{(k-1)}_i}$$

Now we derive the Sinkhorn iteration step:

$$v \leftarrow p \oslash \left(T^{(0)} w\right), \qquad w \leftarrow q \oslash \left(T^{(0)\top} v\right)$$

where $\oslash$ denotes elementwise division.
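The derived iteration is short in NumPy. A sketch on a toy 1-D problem; the function name, point set, and cost matrix are illustrative choices, with $K_\alpha = e^{-C/\alpha}$ standing in for $H_{\alpha/2}$:

```python
import numpy as np

def sinkhorn(p, q, C, alpha, iters=2000):
    """Sinkhorn iteration for T = diag(v) K diag(w), K = exp(-C/alpha):
    alternately rescale to match the row (p) and column (q) marginals."""
    K = np.exp(-C / alpha)
    v, w = np.ones_like(p), np.ones_like(q)
    for _ in range(iters):
        v = p / (K @ w)        # v <- p ./ (K w)
        w = q / (K.T @ v)      # w <- q ./ (K^T v)
    return np.diag(v) @ K @ np.diag(w)

n = 5
x = np.linspace(0.0, 2.0, n)
C = 0.5 * (x[:, None] - x[None, :])**2   # C_ij = d(x_i, x_j)^2 / 2
p = np.ones(n) / n
q = np.array([0.1, 0.1, 0.2, 0.3, 0.3])
T = sinkhorn(p, q, C, alpha=0.5)         # marginals converge to p and q
```

At convergence the transport plan is nonnegative with row sums $p$ and column sums $q$, as required by the constraints in (c).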

(f)