Entropy and Information Theory [electronic resource] / by Robert M. Gray.

By: Gray, Robert M. [author]
Contributor(s): SpringerLink (Online service)
Material type: Text
Description: XXVII, 409 p. online resource
ISBN: 9781441979704
Subject(s): Engineering | Coding and Information Theory | Coding Theory | Probability Theory and Stochastic Processes | Distribution (Probability Theory) | Signal, Image and Speech Processing | Statistics for Engineering, Physics, Computer Science, Chemistry and Earth Sciences | Communications Engineering, Networks | Telecommunications
DDC classification: 621.382
Online resources: go to document
Contents:
Preface -- Introduction -- Information Sources -- Pair Processes: Channels, Codes, and Couplings -- Entropy -- The Entropy Ergodic Theorem -- Distortion and Approximation -- Distortion and Entropy -- Relative Entropy -- Information Rates -- Distortion vs. Rate -- Relative Entropy Rates -- Ergodic Theorems for Densities -- Source Coding Theorems -- Coding for Noisy Channels -- Bibliography -- References -- Index.
Summary: This book is an updated version of the information theory classic, first published in 1990. About one-third of the book is devoted to Shannon source and channel coding theorems; the remainder addresses sources, channels, and codes, and information and distortion measures and their properties. New in this edition:
- Expanded treatment of stationary or sliding-block codes and their relations to traditional block codes
- Expanded discussion of results from ergodic theory relevant to information theory
- Expanded treatment of B-processes: processes formed by stationary coding of memoryless sources
- New material on trading off information and distortion, including the Marton inequality
- New material on the properties of optimal and asymptotically optimal source codes
- New material on the relationships of source coding and rate-constrained simulation or modeling of random processes
Significant material not covered in other information theory texts includes stationary/sliding-block codes, a geometric view of information theory provided by process distance measures, and general Shannon coding theorems for asymptotically mean stationary sources, which may be neither ergodic nor stationary, and d-bar continuous channels.
Holdings:
Item type: Digital documents
Current location: Biblioteca Jorge Álvarez Lleras
Call number: 621.382 223
Copy: 1
Status: Available
Barcode: D000358
Total holds: 0


