Abstract:
Consider a channel whose input alphabet $\mathbb{X}=\{x_{1},x_{2},\dots,x_{K}\}$ contains $K$ discrete symbols, modeled as a discrete random variable $X$ with probability mass function $\mathbf{p}(\mathbf{x}) = [p(x_{1}), p(x_{2}), \dots, p(x_{K})]$, and whose received signal $Y$ is a continuous random variable. $Y$ is a distorted version of $X$, with the channel distortion characterized by the conditional densities $p(y|x_{i})=\phi_{i}(y)$, $i=1,2,\dots,K$. To recover $X$, a quantizer $Q$ maps $Y$ back to a discrete output $\mathbb{Z} =\{z_{1}, z_{2}, \dots, z_{N}\}$, corresponding to a random variable $Z$ with probability mass function $\mathbf{p}(\mathbf{z}) = [p(z_{1}), p(z_{2}), \dots, p(z_{N})]$, such that the mutual information $I(X;Z)$ is maximized subject to an arbitrary constraint on $\mathbf{p}(\mathbf{z})$. Formally, we are interested in designing an optimal quantizer $Q^{*}$ that maximizes $\beta I(X;Z) - C(Z)$, where $\beta$ is a positive number controlling the trade-off between maximizing $I(X;Z)$ and minimizing an arbitrary cost function $C(Z)$. Let $\mathbf{p}(\mathbf{x}|y)=[p(x_{1}|y),p(x_{2}|y),\dots,p(x_{K}|y)]$ denote the posterior distribution of $X$ for a given value of $y$. We show that, for any cost function $C(\cdot)$, the optimal quantizer $Q^{*}$ separates the vectors $\mathbf{p}(\mathbf{x}|y)$ into convex regions.
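As a minimal illustrative sketch (not taken from the paper), the objective $\beta I(X;Z) - C(Z)$ can be evaluated numerically for a candidate threshold quantizer. The binary-input Gaussian channel, the choice $C(Z)=H(Z)$, and all parameter values below are assumptions made purely for illustration.

```python
# Hypothetical sketch: evaluate beta*I(X;Z) - C(Z) for a threshold quantizer,
# with an assumed binary-input Gaussian channel and the assumed cost C(Z) = H(Z).
import numpy as np

K, beta = 2, 2.0                       # number of inputs, trade-off weight (assumed)
p_x = np.array([0.5, 0.5])             # prior p(x) (assumed uniform)
y = np.linspace(-6.0, 6.0, 2001)       # fine grid approximating the continuous Y
dy = y[1] - y[0]

# Conditional densities phi_i(y): unit-variance Gaussians centered at -1 and +1.
phi = np.stack([np.exp(-(y - m) ** 2 / 2) / np.sqrt(2 * np.pi) for m in (-1.0, 1.0)])

def objective(thresholds):
    """Return beta*I(X;Z) - H(Z) for the quantizer defined by the given y-thresholds."""
    edges = np.concatenate(([-np.inf], np.sort(thresholds), [np.inf]))
    z_idx = np.digitize(y, edges[1:-1])            # quantization region of each grid point
    N = len(edges) - 1
    # p(z|x_i): numerically integrate phi_i(y) over each quantization region.
    p_z_given_x = np.array([[phi[i, z_idx == n].sum() * dy for n in range(N)]
                            for i in range(K)])
    p_xz = p_x[:, None] * p_z_given_x              # joint p(x, z)
    p_z = p_xz.sum(axis=0)
    with np.errstate(divide="ignore", invalid="ignore"):
        ratio = np.where(p_xz > 0, p_xz / (p_x[:, None] * p_z[None, :]), 1.0)
        I_xz = np.sum(p_xz * np.log2(ratio))       # mutual information I(X;Z)
        H_z = -np.sum(np.where(p_z > 0, p_z * np.log2(p_z), 0.0))
    return beta * I_xz - H_z

print(objective([0.0]))        # N = 2 regions, single threshold at y = 0
print(objective([-0.5, 0.5]))  # N = 3 regions
```

For $K=2$ the posterior $\mathbf{p}(\mathbf{x}|y)$ is determined by the scalar $p(x_{1}|y)$, so convex regions in posterior space reduce to intervals, which is why a threshold quantizer on $y$ is the natural candidate structure in this example.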
Using this result, a method is proposed to determine an upper bound on the number of thresholds (decision variables on $y$), which is used to speed up the algorithm for finding an optimal quantizer. Numerical results are presented to validate the findings.
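To make the role of such a bound concrete, the following naive sketch (not the paper's algorithm) shows how a known upper bound on the number of thresholds limits a brute-force search; it reuses the hypothetical objective() helper from the sketch above, and the candidate grid is an assumption.

```python
# Hypothetical baseline search: with an upper bound T_max on the number of
# thresholds, only quantizers with at most T_max decision points need be examined.
import itertools
import numpy as np

def best_thresholds(T_max, candidates=np.linspace(-3.0, 3.0, 61)):
    """Grid-search quantizers with up to T_max thresholds; return the best found."""
    best_val, best_t = -np.inf, ()
    for T in range(1, T_max + 1):
        for t in itertools.combinations(candidates, T):
            val = objective(list(t))               # helper from the previous sketch
            if val > best_val:
                best_val, best_t = val, t
    return best_val, best_t

print(best_thresholds(T_max=2))
```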