Causal Encoding of Markov Sources.

dc.contributor.authorGilbert, R. Kent
dc.date.accessioned2020-09-09T01:07:46Z
dc.date.available2020-09-09T01:07:46Z
dc.date.issued1983
dc.identifier.urihttps://hdl.handle.net/2027.42/159551
dc.description.abstractA source code consists of an encoder that creates a binary representation of the source output X and a decoder that creates a reproduction Y of the original source output X. A source code is causal if the reproduction the code creates for the present source output depends on present and past outputs but not on future ones. The measures of performance of such a code are the average distortion between X and Y relative to some per-letter distortion measure, and the rate, which is quantified by the entropy rate of the reproduction process {Y}. The causal OPTA (Optimum Performance Theoretically Attainable) r_c(D) is defined to be the least rate attainable among all causal source codes with average distortion no greater than D. Source coding theorems for block, sliding-block, tree, and trellis codes have shown that these classes contain codes that achieve the rate-distortion function R(D), which is the OPTA over all codes, both causal and noncausal. Although these classes contain some codes that are causal, it is widely believed that it is the noncausal codes, and only the noncausal codes, that achieve R(D). We define causality for source codes and describe several specific kinds of causal codes. The goal is to discover how much is lost relative to R(D) by the restriction to causal codes, or equivalently, how much can be gained by noncausal codes. In previous work we demonstrated that for memoryless sources the optimum performance by causal source codes can be achieved by memoryless codes or by time-sharing memoryless codes. In this thesis we develop upper and lower bounds to the causal OPTA for discrete mth-order Markov sources. The upper bound is based on the performance of a class of simple causal codes. An increasing sequence of information-theoretic lower bounds, r_c^n(D), is also developed. We conjecture that these increase to r_c(D).
In addition we develop a set of lower bounds to r_c^n(D) that can be evaluated by linear programming; these are denoted r_LP^n(D). A principal result of this work is that the linear programming lower bound r_LP^n(D) equals r_c^n(D) for any n and any discrete mth-order Markov source. The information-theoretic lower bound r_c^n(D) is obtained by defining a quantity we call the mixed nth-order conditional entropy: the entropy of the most recent n reproduction letters conditioned on the entire source output up to those n reproduction letters. The lower bound r_c^n(D) is then defined to be the least mixed nth-order conditional entropy attainable by any causal sliding-block code with average distortion not exceeding D.
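The definitions and bounds in the abstract can be sketched in notation as follows; this is a reconstruction from the abstract's text, and the particular symbols (the entropy rate H-bar, the per-letter distortion d, the expectation E) are assumptions rather than the thesis's own typography:

```latex
% Causal OPTA: least entropy rate of the reproduction process {Y}
% over all causal source codes meeting the distortion constraint
% (a sketch reconstructed from the abstract, not the thesis's formula).
\[
  r_c(D) \;=\; \inf\Bigl\{\, \bar{H}(\{Y\}) \;:\;
      \text{causal source codes with } \mathbb{E}\, d(X,Y) \le D \Bigr\}
\]
% Relation among the bounds described in the abstract: the linear
% programming bound equals the information-theoretic bound, the
% sequence increases in n, and each term lower-bounds r_c(D).
\[
  r_{LP}^{\,n}(D) \;=\; r_c^{\,n}(D) \;\le\; r_c^{\,n+1}(D) \;\le\; r_c(D),
\]
% with the conjecture that r_c^n(D) increases to r_c(D) as n grows.
```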
dc.format.extent140 p.
dc.languageEnglish
dc.titleCausal Encoding of Markov Sources.
dc.typeThesis
dc.description.thesisdegreenamePhDen_US
dc.description.thesisdegreedisciplineElectrical engineering
dc.description.thesisdegreegrantorUniversity of Michigan
dc.subject.hlbtoplevelEngineering
dc.contributor.affiliationumcampusAnn Arbor
dc.description.bitstreamurlhttp://deepblue.lib.umich.edu/bitstream/2027.42/159551/1/8324184.pdfen_US
dc.owningcollnameDissertations and Theses (Ph.D. and Master's)