Note on Aug. 31, 2019

A review of the paper "On the Convergence of Adam and Beyond".

Chinese-translated lecture notes for MATH 6366 @ UH

[Chinese] Personally translated lecture notes for MATH 6366: Optimization Theory; the primary reference is Boyd's *Convex Optimization*.

Note on May 16, 2019

A review of the paper "Adam: A Method for Stochastic Optimization".

Special Notes on May 15, 2019

A derivation of the basic forms of the training and testing phases in sparse coding and dictionary learning.

Notes on May 05, 2019

Comments on two SEG 2018 expanded abstracts. This article questions the theory behind both papers.

Special Notes on Mar. 4, 2019

Weekly notes. In this note we mainly study "Solving ill-posed inverse problems using iterative deep neural networks".

Special Notes on Feb. 28, 2019

A derivation for the non-negative constrained least-length problem. Here we discuss only the simplified form $\min_{\mathbf{x}} \lVert \mathbf{x} \rVert_2^2,~\mathrm{s.t.}~\mathbf{Hx} \succeq \mathbf{h}$.
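For reference, a minimal sketch of the standard Lagrangian/KKT route to this problem (this is the textbook treatment, not necessarily the exact derivation in the linked note). With multipliers $\boldsymbol{\lambda} \succeq \mathbf{0}$,

$$\mathcal{L}(\mathbf{x}, \boldsymbol{\lambda}) = \mathbf{x}^\top \mathbf{x} - \boldsymbol{\lambda}^\top (\mathbf{Hx} - \mathbf{h}),$$

stationarity gives $2\mathbf{x} - \mathbf{H}^\top \boldsymbol{\lambda} = \mathbf{0}$, i.e. $\mathbf{x}^\star = \tfrac{1}{2}\mathbf{H}^\top \boldsymbol{\lambda}$. Substituting back yields the dual problem

$$\max_{\boldsymbol{\lambda} \succeq \mathbf{0}} \; \boldsymbol{\lambda}^\top \mathbf{h} - \tfrac{1}{4}\boldsymbol{\lambda}^\top \mathbf{H}\mathbf{H}^\top \boldsymbol{\lambda},$$

a non-negatively constrained QP in $\boldsymbol{\lambda}$.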

Special Notes on Feb. 27, 2019

Weekly notes. In this note we mainly study "NETT: Solving Inverse Problems with Deep Neural Networks".

Notes on Feb. 23, 2019

The note for the weekly report on Feb. 23, 2019.

Stochastic optimization I: from Monte-Carlo methods to Gibbs sampling

Special notes on Feb. 15, 2019. This is the first topic in a series on stochastic optimization, covering the introductory theory of Monte Carlo methods, the Metropolis-Hastings algorithm, and Gibbs sampling.
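As a small illustration of the topic, here is a minimal random-walk Metropolis-Hastings sampler (a generic sketch, not code from the note itself; the target density and all parameter names are illustrative):

```python
import math
import random

def metropolis_hastings(log_target, x0, n_samples, proposal_std=1.0, seed=0):
    """Sample from an unnormalized 1-D density via random-walk Metropolis-Hastings."""
    rng = random.Random(seed)
    x, log_p = x0, log_target(x0)
    samples = []
    for _ in range(n_samples):
        # Symmetric Gaussian proposal, so the Hastings correction cancels
        # and the acceptance ratio is simply p(x') / p(x).
        x_new = x + rng.gauss(0.0, proposal_std)
        log_p_new = log_target(x_new)
        # Accept with probability min(1, p(x') / p(x)).
        if log_p_new >= log_p or rng.random() < math.exp(log_p_new - log_p):
            x, log_p = x_new, log_p_new
        samples.append(x)
    return samples

# Example target: standard normal, known only up to a constant.
samples = metropolis_hastings(lambda x: -0.5 * x * x, x0=0.0, n_samples=20000)
burned = samples[5000:]          # discard burn-in
mean = sum(burned) / len(burned)  # should be close to 0
```

Gibbs sampling replaces the accept/reject step with exact draws from each full conditional, which is why it is often framed as a special case of Metropolis-Hastings with acceptance probability 1.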

Special Notes on Aug. 24, 2018

A short discussion of the derivation of the Levenberg-Marquardt algorithm (LMA) from the function-expansion viewpoint.
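For orientation, the usual expansion-based sketch (the standard textbook derivation, which the note presumably follows): for residuals $\mathbf{r}(\mathbf{x})$ with Jacobian $\mathbf{J}$, linearizing $\mathbf{r}(\mathbf{x} + \boldsymbol{\delta}) \approx \mathbf{r} + \mathbf{J}\boldsymbol{\delta}$ and minimizing $\tfrac{1}{2}\lVert \mathbf{r} + \mathbf{J}\boldsymbol{\delta} \rVert_2^2$ gives the Gauss-Newton step $(\mathbf{J}^\top \mathbf{J})\boldsymbol{\delta} = -\mathbf{J}^\top \mathbf{r}$; LMA damps this system,

$$(\mathbf{J}^\top \mathbf{J} + \mu \mathbf{I})\,\boldsymbol{\delta} = -\mathbf{J}^\top \mathbf{r}, \qquad \mu > 0,$$

interpolating between Gauss-Newton ($\mu \to 0$) and scaled gradient descent ($\mu$ large).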

Go Back

Return to the higher-level contents.