Ququ's Secret Academic Base (曲曲的秘密学术基地)

Purify desire, uphold seriousness

Welcome! I am Qu Zehui (@zququ), currently an assistant researcher in Shenzhen (ICBI, BCBDI, SIAT).

My background is in virology, immunology, and structural biology; my published papers can be found on RG.


Norm (partial)

Absolute-value norm

$$ \|x\| = |x| $$

is a norm on the one-dimensional vector space formed by the real or complex numbers. Any norm $p$ on a one-dimensional vector space is equivalent (up to scaling) to the absolute-value norm, in the sense that there is a norm-preserving isomorphism of vector spaces; indeed, homogeneity gives $p(x) = |x|\,p(1)$, so every such norm is a positive multiple of the absolute value.

Euclidean norm

On the Euclidean space $\Bbb R^n$, the intuitive notion of the length of the vector $\pmb x = (x_1, x_2, \dots, x_n)$ is captured by

$$ \|\pmb x\|_2 := \sqrt{x^2_1+x^2_2+\cdots+x^2_n} $$

or, equivalently, in terms of the dot product of the vector with itself,

$$ \|\pmb x\|_2 := \sqrt{\pmb x\cdot \pmb x} $$

which is also called the $L^2$ norm, $\ell^2$ norm, 2-norm, or square norm. The distance defined by the Euclidean norm is called the Euclidean distance, $L^2$ distance, or $\ell^2$ distance.
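As a quick sanity check (a minimal NumPy sketch; the vector values are just an illustration), the explicit formula, the dot-product form, and NumPy's built-in norm all agree:

```python
import numpy as np

x = np.array([3.0, 4.0])

# Euclidean (2-)norm via the explicit formula sqrt(x1^2 + ... + xn^2)
explicit = np.sqrt(np.sum(x**2))

# Same norm via the dot product of the vector with itself
via_dot = np.sqrt(np.dot(x, x))

# NumPy's built-in: np.linalg.norm defaults to the 2-norm for vectors
built_in = np.linalg.norm(x)

print(explicit, via_dot, built_in)  # all print 5.0
```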

Euclidean norm of complex numbers

The Euclidean norm of a complex number is its absolute value (also called the modulus). If the complex plane is identified with the Euclidean plane $\Bbb R^2$, the Euclidean norm of $x + iy$ is

$$ |x + iy| = \sqrt{x^2 + y^2} $$
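A one-line check (illustrative values only): Python's built-in abs on a complex number computes exactly this modulus.

```python
import math

z = 3 + 4j
print(abs(z))            # 5.0, the modulus |3 + 4i|
print(math.hypot(3, 4))  # 5.0, the same value as sqrt(3**2 + 4**2)
```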

Finite-dimensional complex normed spaces

On an n-dimensional complex space $\Bbb C^n$, the most common norm is

$$ \|\pmb z\| := \sqrt{|z_1|^2 + \cdots + |z_n|^2} = \sqrt{z_1\bar z_1 + \cdots + z_n\bar z_n} $$

This norm can also be expressed as the square root of the inner product of the vector with itself,

$$ \|\pmb x\| := \sqrt{\pmb x^H\cdot \pmb x} $$

where $\pmb x$ is written as a column vector $[x_1; x_2; \dots; x_n]$ and $\pmb x^H$ denotes its conjugate transpose. This can also be written as

$$ \|\pmb x\| := \sqrt{\pmb x\cdot \pmb x} $$
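A small NumPy check of the complex case (the vector below is arbitrary): np.vdot conjugates its first argument, so it implements the $\pmb x^H \pmb x$ inner product.

```python
import numpy as np

z = np.array([1 + 2j, 3 - 4j])

# Sum of squared moduli |z_1|^2 + ... + |z_n|^2
via_moduli = np.sqrt(np.sum(np.abs(z)**2))

# Conjugate-transpose inner product: np.vdot(z, z) = conj(z_1)*z_1 + ...,
# which is real up to rounding
via_vdot = np.sqrt(np.vdot(z, z).real)

# NumPy's built-in 2-norm handles complex vectors the same way
built_in = np.linalg.norm(z)

print(via_moduli, via_vdot, built_in)  # all equal sqrt(30) ≈ 5.477
```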

Quaternions and octonions

There are exactly four Euclidean Hurwitz algebras over the real numbers: the real numbers $\Bbb R$, the complex numbers $\Bbb C$, the quaternions $\Bbb H$, and the octonions $\Bbb O$, of dimensions 1, 2, 4, and 8, respectively.

Taxicab norm or Manhattan norm

$$ \|\pmb x\|_1 := \sum_{i=1}^{n}|x_i| $$

The name comes from the distance a taxi has to drive in a rectangular street grid to get from the origin to the point $\pmb x$. It is also called the $\ell^1$ norm, and the distance derived from this norm is called the Manhattan distance or $\ell^1$ distance.

Note that it cannot be written as $\sum_{i=1}^{n}x_i$, because without the absolute values the terms may cancel or yield a negative result, as the quick check below shows.
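A quick NumPy illustration of why the absolute values matter (the vector is an arbitrary example):

```python
import numpy as np

x = np.array([3.0, -4.0, 1.0])

print(np.sum(np.abs(x)))     # 8.0, the taxicab (L1) norm
print(np.linalg.norm(x, 1))  # 8.0, same value via NumPy's ord=1
print(np.sum(x))             # 0.0, the signed sum cancels and is not a norm
```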

p-norm

Let $p\ge1$ be a real number. The $p$-norm (also called the $\ell_p$-norm) of a vector $\pmb x = (x_1, \dots, x_n)$ is

$$ \|\pmb x\|_p := \left(\sum_{i=1}^{n} |x_i|^p\right)^{1/p} $$

When $p = 1$ we get the taxicab norm, when $p = 2$ we get the Euclidean norm, and as $p$ approaches $\infty$ the $p$-norm approaches the infinity norm or maximum norm:

$$ \|\pmb x\|_{\infty} := \max_i|x_i|. $$

The p-norm is related to the generalized mean or power mean.
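Concretely, the $p$-norm of an $n$-vector equals $n^{1/p}$ times the power mean of the $|x_i|$. The sketch below (arbitrary values, plain NumPy) verifies this numerically:

```python
import numpy as np

x = np.array([1.0, -2.0, 3.0, 4.0])
n = x.size

for p in (1, 2, 3, 5):
    p_norm = np.sum(np.abs(x)**p)**(1.0 / p)
    power_mean = np.mean(np.abs(x)**p)**(1.0 / p)  # generalized (power) mean
    # The p-norm equals n^(1/p) times the power mean of the |x_i|
    print(p, p_norm, n**(1.0 / p) * power_mean)
```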

when $0<p<1$ (wiki)

Maximum norm (special case of the infinity norm, uniform norm, or supremum norm)

$$ \|\pmb x\|_{\infty} := \max(|x_1|, \dots, |x_n|) $$
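A last NumPy check (arbitrary vector): the maximum norm is simply the largest absolute entry, and it is the value the $p$-norms approach as $p$ grows.

```python
import numpy as np

x = np.array([3.0, -7.0, 2.0])

print(np.max(np.abs(x)))              # 7.0, the maximum norm
print(np.linalg.norm(x, np.inf))      # 7.0, NumPy's ord=np.inf gives the same value
print(np.sum(np.abs(x)**50)**(1/50))  # ≈ 7.0, a large-p p-norm is already very close
```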

