Showing posts from May, 2020

Definiteness of Gram matrix

Definition 1 (Definiteness of a matrix). Let $A$ be an $n \times n$ real symmetric matrix, and let $x \in \mathbb{R}^n$ be a column vector. - $A$ is positive (negative) definite if $x^TAx > 0 \ (x^TAx < 0),\ \forall x \neq 0$. - $A$ is positive (negative) semidefinite if $x^TAx \geq 0 \ (x^TAx \leq 0),\ \forall x \in \mathbb{R}^n$, and $x^TAx = 0$ for at least one $x \neq 0$. - $A$ is indefinite if $x^TAx$ takes on both positive and negative values (for different nonzero $x$'s). Definition 2 (Gram matrix). The Gram matrix of a set of vectors $v_1, \dots, v_n$ in an inner product space is the Hermitian matrix of inner products, whose entries are given by (1). $$ \begin{align} G_{ij} = \langle v_i, v_j \rangle \tag{1} \\ \end{align} $$ For finite-dimensional real vectors in $\mathbb{R}^n$ with the usual Euclidean dot product, the Gram matrix is simply $G = V^TV$, where the $i$-th column of $V$ is $v_i$. Theorem 1  The symmetric matrix $G$ is a Gram matrix if and only
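As a quick numerical sanity check of Definitions 1 and 2, the sketch below (with arbitrary example vectors of my own choosing) builds $G = V^TV$ and confirms it is symmetric with nonnegative eigenvalues, i.e. positive semidefinite:

```python
import numpy as np

# Example vectors chosen only for illustration: columns of V are v_1, v_2 in R^3.
V = np.array([[1.0, 2.0],
              [0.0, 1.0],
              [1.0, 0.0]])

# Gram matrix for the Euclidean dot product: G_ij = <v_i, v_j>.
G = V.T @ V

# A Gram matrix is symmetric and positive semidefinite,
# so every eigenvalue should be >= 0 (up to floating-point error).
eigvals = np.linalg.eigvalsh(G)
assert np.allclose(G, G.T)
assert np.all(eigvals >= -1e-12)
print(G)
print(eigvals)
```

Note that $x^TGx = x^TV^TVx = \|Vx\|^2 \geq 0$, which is why every Gram matrix of this form is positive semidefinite.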

Blogs and Magazines for software engineers

Personally, I think it's important to keep track of what's going on in the rest of the world, if only to avoid wasting my precious time. Reviewing notable blogs and magazines is part of that effort. I'll update this article as I find more interesting places. Blogs 1. Martin Fowler 2. Bjarne Stroustrup 3. Paul Graham Magazines 1. ACM Queue  : Almost Everything? 2. Computer  : Software Engineering 3. Communications of the ACM  : CS Theory

A function is bijective if and only if it has an inverse

Definition Let $$ \begin{align} f &: A \rightarrow B \\ 1_A &: A \rightarrow A, \ 1_A(a) = a\  \forall a \in A \end{align} $$ We say that $f$ is (1) well-defined if whenever $a_1 = a_2$ for some $a_1, a_2 \in A$, then $f(a_1) = f(a_2)$; (2) injective if whenever $f(a_1) = f(a_2)$ for some $a_1, a_2 \in A$, then $a_1 = a_2$; (3) surjective if $\forall b \in B \  \exists a \in A  \ s.t. \  f(a)=b$; (4) bijective if it is both injective and surjective. Theorem 1 A function $g : B \rightarrow A$ is the inverse of $f$ if $f \circ g = 1_B$ and $g \circ f  = 1_A$. Proof We'll prove the theorem in the title by showing (a) and (b). (a) $f : A \rightarrow B$ is bijective $\Longrightarrow$ $f$ has an inverse. Let $f$ be bijective. Since $f$ is surjective, $\forall b \in B \  \exists a \in A  \ s.t. \  f(a)=b$, so we can define a function $f^{-1} : B \rightarrow A$ by $f^{-1}(b) = a$. Since $f$ is injective, $f^{-1}$ is automatically well-defined. Nex
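The construction in part (a) can be made concrete for a finite function. In the sketch below (sets and values are my own illustrative choices, not from the proof), flipping the pairs of a bijection yields its inverse, and the two composition identities of Theorem 1 hold:

```python
# Illustrative finite bijection f : A -> B.
A = {1, 2, 3}
B = {'x', 'y', 'z'}
f = {1: 'x', 2: 'y', 3: 'z'}

# Flip the pairs: injectivity makes this well-defined (no key collisions),
# surjectivity makes its domain all of B.
f_inv = {b: a for a, b in f.items()}

# Check g o f = 1_A and f o g = 1_B.
assert all(f_inv[f[a]] == a for a in A)
assert all(f[f_inv[b]] == b for b in B)
print(f_inv)
```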

When MathJax does not render LaTeX

I'll keep updating this post with the erroneous situations I have met while writing MathJax-containing articles on the web. 1. When HTML tags subdivide a multi-line MathJax area (2020-05-07) Suppose you want to write a multi-line LaTeX equation in MathJax display mode. The equation looks fine in the web-based editor, but it keeps failing to render. In that case, check the HTML of your equation: it may contain div or span tags inserted between the lines of the equation. Removing those tags makes it render correctly.
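As a hypothetical illustration of that fix (the sample HTML string and the regex below are my own, not from any editor's actual output), a small script can strip the stray tags from the equation markup:

```python
import re

# Hypothetical example: the editor has injected <div>/<span> tags
# inside a $$...$$ display-mode block, which breaks MathJax rendering.
html = "$$ \\begin{align}<div> a &= b \\\\</div><span> c &= d </span>\\end{align} $$"

# Remove opening and closing div/span tags, leaving the LaTeX intact.
cleaned = re.sub(r"</?(?:div|span)[^>]*>", "", html)
print(cleaned)
```

The `$$` delimiters and the LaTeX source survive; only the injected tags are removed, after which MathJax can typeset the block normally.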

A Centroid Coincidence Theorem

While reviewing the shape matching problem between 3D point sets of a rigid object using least squares approximation, I found a Centroid Coincidence Theorem that interested me. So I organize the proof of the theorem and the related notions here. I'll follow the mathematical notation of [1] as closely as I can, but since the notation in [1] is too implicit (for my poor imagination and mathematical skills), I write it out more explicitly. Problem Let two 3-D point sets be $p_i = [x_i, y_i, z_i]^t, p^{'}_i = [x^{'}_i, y^{'}_i, z^{'}_i]^t, i= 1,2,\cdots,N$, which are related by (1), where $R$ is a $3\times 3$ rotation matrix, $T$ is a $3\times 1$ translation vector, and $N_i$ is a noise vector. $$ \begin{align}     p^{'}_i &= Rp_i + T + N_i \tag{1} \end{align} $$ And let the mathematical formulation of shape matching of 3D point sets using least squares approximation be (2). $$ \begin{align}    \underset{R, T} {\operatorname{argmin}} \sum_{i=1}^{
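The content of the theorem can be checked numerically in the noise-free case of (1): because the centroid is a mean, it transforms exactly like the points themselves, so the centroid of $\{p^{'}_i\}$ coincides with the transformed centroid of $\{p_i\}$. The sketch below uses synthetic points and an arbitrary rigid motion of my own choosing:

```python
import numpy as np

rng = np.random.default_rng(0)

# An arbitrary rigid motion for illustration: R is a rotation about z, T a translation.
theta = 0.3
R = np.array([[np.cos(theta), -np.sin(theta), 0.0],
              [np.sin(theta),  np.cos(theta), 0.0],
              [0.0,            0.0,           1.0]])
T = np.array([1.0, -2.0, 0.5])

# Points p_i as rows; p'_i = R p_i + T (noise-free case, N_i = 0).
P = rng.normal(size=(10, 3))
P_prime = P @ R.T + T

# Centroid coincidence: the centroid of the transformed set equals
# the transform of the original centroid.
c, c_prime = P.mean(axis=0), P_prime.mean(axis=0)
assert np.allclose(c_prime, R @ c + T)
print(c, c_prime)
```

This is why, in the least-squares solution, the optimal translation can be recovered after the rotation by aligning the two centroids.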