
Books, notes, code

  • Y. Polyanskiy and Y. Wu, " Information Theory: From Coding to Learning ," Cambridge University Press, 2023+ ( book draft ). This material will be published by Cambridge University Press as "Information Theory" by Yury Polyanskiy and Yihong Wu. This prepublication version is free to view and download for personal use only; it is not for redistribution, resale, or use in derivative works. While this version has matching equation and theorem numbers, the printed version has many typos corrected: we recommend the printed version for thorough reading and the free version for quick reference lookup. © Polyanskiy & Wu 2023. Older drafts (2022, 2023) are also available.
  • Y. Polyanskiy, " Information theory methods in statistics and computer science ," (course page) 2019-2020.
," GitHub repository, Dec 2014.
Curious? Check (for up-to-date version:   )

Publications

  • Y. Polyanskiy and S. Verdú, " ................... ," ............... , location.

Journal Papers and Full Conference Papers

  • G. Liva and Y. Polyanskiy, " Unsourced multiple access: a coding paradigm for massive random access ," Proc. IEEE , 2024 (to appear) also arxiv:2408.09874 , Aug. 2024
  • G. Bresler, C. Guo, and Y. Polyanskiy, " Thresholds for reconstruction of random hypergraphs from graph projections ", Proc. Conf. On Learning Theory (COLT-2024) , Jul. 2024
  • P. Agostini, J.-F. Chamberland, F. Clazzer, J. Dommel, G. Liva, A. Munari, K. Narayanan, Y. Polyanskiy, S. Stanczak and Z. Utkovski, " Evolution of the 5G new radio two-step random access towards 6G unsourced MAC ," arXiv:2405.03348 , May. 2024
  • P. R. Gerber, T. Jiang, Y. Polyanskiy, and R. Sun, " Density estimation using the perceptron ", arxiv:2312.17701 , Dec. 2023
  • B. Geshkovski, C. Letrouit, Y. Polyanskiy, and P. Rigollet, " A mathematical perspective on Transformers ", arxiv:2312.10794 , Dec. 2023 (updated Feb. 2024, Aug. 2024)
  • B. Geshkovski, C. Letrouit, Y. Polyanskiy, and P. Rigollet, " The emergence of clusters in self-attention dynamics ", Proc. 37th Adv. in Neural Inform. Proc. Syst. (NeurIPS-2023) , Dec. 2023; also arxiv:2305.05465 , May 2023
  • P. R. Gerber, T. Jiang, Y. Polyanskiy, and R. Sun, " Kernel-based tests for likelihood-free hypothesis testing ", Proc. 37th Adv. in Neural Inform. Proc. Syst. (NeurIPS-2023) , Dec. 2023; also arxiv:2308.09043 , Aug. 2023 (updated Nov. 2023)
  • T. Jayashankar, G. Lee, A. Lancho, A. Weiss, Y. Polyanskiy, and G. W. Wornell, " Score-based source separation with applications to digital communication signals ", Proc. 37th Adv. in Neural Inform. Proc. Syst. (NeurIPS-2023) , Dec. 2023; also arxiv:2306.14411 , Jun. 2023
  • G. Bresler, C. Guo, and Y. Polyanskiy, " Algorithmic decorrelation and planted clique in dependent random graphs: the case of extra triangles ", Proc. 64th Symp. Found. Comp. Sci (FOCS-2023) , Nov. 2023; also arxiv:2305.09995 , May 2023
  • S. Jana, Y. Polyanskiy, A. Teh, and Y. Wu, " Empirical Bayes via ERM and Rademacher complexities: the Poisson model ", Proc. Conf. On Learning Theory (COLT-2023) , Jul. 2023; also arxiv:2307.02070 , Jul. 2023
  • P. R. Gerber, Y. Han, and Y. Polyanskiy, " Minimax optimal testing by classification ", Proc. Conf. On Learning Theory (COLT-2023) , Jul. 2023; also arxiv:2306.11085 , Jun. 2023
  • Z. Jia, Y. Polyanskiy, and Y. Wu, " Entropic characterization of optimal rates for learning Gaussian mixtures ", Proc. Conf. On Learning Theory (COLT-2023) , Jul. 2023; also arxiv:2306.12308 , Jun. 2023
  • Y. Gu and Y. Polyanskiy, " Weak recovery threshold for the hypergraph stochastic block model ", Proc. Conf. On Learning Theory (COLT-2023) , Jul. 2023; also arxiv:2303.14689 , Mar. 2023
  • Y. Gu and Y. Polyanskiy, " Uniqueness of BP fixed point for the Potts model and applications to community detection ", Proc. Conf. On Learning Theory (COLT-2023) , Jul. 2023; also arxiv:2303.14688 , Mar. 2023
  • A. Block and Y. Polyanskiy, " The sample complexity of approximate rejection sampling with applications to smoothed online learning ", Proc. Conf. On Learning Theory (COLT-2023) , Jul. 2023; also arxiv:2302.04658 , Feb. 2023 (upd. Feb. 2024).    Errata (Feb 2024): Version in COLT proceedings has a mistake in Lemma 27 that propagated to a wrong coefficient in Theorem 3, arXiv version is correct (thanks G. Flamich and W. L. Wells!)
  • A. Fengler, A. Lancho, K. Narayanan, and Y. Polyanskiy, " On the advantages of asynchrony in the unsourced MAC (extended) ," arxiv:2305.06985 , May 2023 an abridged version appeared in Proc. 2023 IEEE ISIT , Taiwan, Jun. 2023
  • P. R. Gerber and Y. Polyanskiy, " Likelihood-free hypothesis testing ", IEEE Trans. Inf. Theory , 2024 (to appear) also arxiv:2211.01126 , Nov. 2022 (updated Feb. 2023, Sep. 2023, Jun. 2024)
  • S. Jana, Y. Polyanskiy, and Y. Wu, " Optimal empirical Bayes estimation for the Poisson model via minimum-distance methods ", arxiv:2209.01328 , Sep. 2022 (updated Sep. 2023, May 2024)
  • A. Adler, J. Tang, and Y. Polyanskiy, " Efficient representation of large-alphabet probability distributions ", IEEE J. Sel. Areas Inf. Theory , vol. 3, no. 4, pp. 651--663, Dec. 2022 also arxiv:2205.03752 , May. 2022 (updated Dec. 2022)
  • A. Block, Z. Jia, Y. Polyanskiy, and A. Rakhlin, " Rate of convergence of the smoothed empirical Wasserstein distance ", arxiv:2205.02128 , May. 2022 (updated Aug 2024)
  • G. Bresler, C. Guo, and Y. Polyanskiy, " Linear programs with polynomial coefficients and applications to 1D cellular automata ", arxiv:2204.06357 , Apr. 2022
  • Q. Yu and Y. Polyanskiy " Ising model on locally tree-like graphs: uniqueness of solutions to cavity equations ", IEEE Trans. Inf. Theory , vol. 70, no. 3, pp. 1913--1938, Mar. 2024 also arxiv:2211.15242 , Feb. 2021 (updated Nov. 2022, Jul. 2023)
  • J. Tang and Y. Polyanskiy, " Capacity of noisy permutation channels ", IEEE Trans. Inf. Theory , vol. 69, no. 7, pp. 4145--4162, Jul. 2023 also arxiv:2111.00559 , Nov. 2021
  • Y. Polyanskiy and Y. Wu, " Sharp regret bounds for empirical Bayes and compound decision problems ", 2109.03943 , Sep. 2021
  • A. Block, Z. Jia, Y. Polyanskiy, A. Rakhlin, " Intrinsic dimension estimation using Wasserstein distances ", J. Machine Learning Research (JMLR) , 23(313):1-37, 2022. also arxiv:2106.04018 , Jun. 2021 (updated Dec. 2021; May 2022)
  • E. Abbe, E. Cornacchia, Y. Gu and Y. Polyanskiy, " Stochastic block model entropy and broadcasting on trees with survey ", Proc. Conf. On Learning Theory (COLT-2021) , Aug. 2021; Best Student Paper Award also arxiv:2101.12601 , Jan. 2021
  • M. Feder and Y. Polyanskiy, " Sequential prediction under log-loss and misspecification ," Proc. Conf. On Learning Theory (COLT-2021) , Aug. 2021; also arxiv:2102.00050 , Jan. 2021
  • A. Makur, E. Mossel and Y. Polyanskiy, " Broadcasting on two-dimensional regular grids ", IEEE Trans. Inf. Theory , vol. 68, no.10, pp. 6297--6334, Oct. 2022; also arxiv:2010.01390 , Oct. 2020
  • O. Ordentlich and Y. Polyanskiy, " Strong data processing constant is achieved by binary inputs ", IEEE Trans. Inf. Theory , vol. 68, no.3, pp. 1480--1481, Mar. 2022; also arxiv:2010.01987 , Sep. 2020 (updated Jun. 2021)
  • Y. Polyanskiy and Y. Wu, " Self-regularizing property of nonparametric maximum likelihood estimator in mixture models " arxiv:2008.08244 , Aug. 2020
  • Y. Polyanskiy and Y. Wu, " Note on approximating the Laplace transform of a Gaussian on a complex disk " arxiv:2008.13372 , Aug. 2020
  • J. Gaudio and Y. Polyanskiy, " Attracting random walks ," Elect. J. Probability, vol. 25, no. 73, 2020; also arxiv:1903.00427 , May 2020.
  • S. Jana, Y. Polyanskiy and Y. Wu, " Extrapolating the profile of a finite population ," Proc. Conf. On Learning Theory (COLT-2020), Jul. 2020 ; also arxiv:2005.10561 , May 2020.
  • Y. Gu and Y. Polyanskiy, " Non-linear log-Sobolev inequalities for the Potts semigroup and applications to reconstruction problems ," Commun. Math. Phys. , vol. 404, pp. 769-831, Dec. 2023; also arxiv:2005.05444 , May 2020 (updated Nov. 2022).
  • Z. Goldfeld and Y. Polyanskiy, "The information bottleneck problem and its applications in machine learning," IEEE Journal on Selected Areas in Information Theory , vol. 1, no. 1, pp. 19--38, Apr. 2020; also arXiv:2004.14941 , Apr. 2020
  • H. Roozbehani and Y. Polyanskiy, " Low density majority codes and the problem of graceful degradation ," arxiv:1911.12263 , June 2019 (updated Nov. 2019).
  • Y. Wu, X. Gao, S. Zhou, W. Yang, Y. Polyanskiy, and G. Caire, " Massive access for future wireless communication systems ," IEEE Trans. Wireless Comm. , vol. 27, no. 4, pp. 148--156, Apr. 2020; also arxiv:1910.12678 , Oct. 2019. (updated Feb. 2020)
  • O. Ordentlich, Y. Polyanskiy and O. Shayevitz," A note on the probability of rectangles for correlated binary strings ," IEEE Trans. Inf. Theory , vol. 66, no. 12, pp.7878 -- 7886, Dec. 2020; also arxiv:1909.01221 , Sep. 2019. (updated Aug. 2020)
  • S. Kowshik, K. Andreev, A. Frolov and Y. Polyanskiy, " Energy efficient coded random access for the wireless uplink ," IEEE Trans. Comm. , vol. 68, no. 8, pp. 4694--4708, Aug. 2020; also arxiv:1907.09448 , July 2019 (updated May 2020).
  • Z. Goldfeld, K. Greenewald, Y. Polyanskiy and J. Weed, " Convergence of smoothed empirical measures with applications to entropy estimation ," IEEE Trans. Inf. Theory , vol. 66, no. 7, pp. 4368--4391, Jul. 2020; also arxiv:1905.13576 , May 2019.
  • Y. Polyanskiy and Y. Wu, " Dualizing Le Cam's method for functional estimation, with applications to estimating the unseens " arxiv:1902.05616 , Feb. 2019 (updated Dec. 2020).
  • Y. Kochman, O. Ordentlich and Y. Polyanskiy, " A lower bound on the expected distortion of joint source-channel coding, " IEEE Trans. Inf. Theory , vol. 66, no. 8, pp. 4722--4741, Aug. 2020; also arxiv:1902.07979 , Feb. 2019.
  • U. Hadar, J. Liu, Y. Polyanskiy and O. Shayevitz, " Communication complexity of estimating correlations, " Proc. 51st ACM Symp. on Theory of Comp. (STOC), 2019; also arxiv:1901.09100 , Jan. 2019 (upd. Apr 2019)
  • S. Kowshik and Y. Polyanskiy, " Fundamental limits of many-user MAC with finite payloads and fading, " IEEE Trans. Inf. Theory , vol. 67, no. 9, pp.5853--5884, Sep. 2021; also arxiv:1901.06732 , Jan. 2019. (upd. May 2019, Nov. 2020, Jun. 2021)
  • A. Bhatt, B. Nazer, O. Ordentlich and Y. Polyanskiy, " Information-distilling quantizers, " IEEE Trans. Inf. Theory , vol. 67, no. 4, pp. 2472--2487, Apr. 2021; also arxiv:1812.03031 , Dec. 2018.
  • A. Makur, E. Mossel and Y. Polyanskiy, " Broadcasting on random directed acyclic graphs, " IEEE Trans. Inf. Theory , vol. 66, no. 2, pp. 780-812, Feb. 2020; also arxiv:1811.03946 , Nov. 2018.
  • Y. Sun, Y. Polyanskiy and E. Uysal-Biyikoglu, " Sampling of the Wiener process for remote estimation over a channel with random delay ," IEEE Trans. Inf. Theory , vol. 66, no. 2, pp. 1118-1135, Feb. 2020; also arxiv:1707.02531 , Nov. 2018.
  • Z. Goldfeld, K. Greenewald and Y. Polyanskiy, " Estimating differential entropy under Gaussian convolutions ," arxiv:1810.11589 , Oct. 2018.
  • Z. Goldfeld, E. van den Berg, K. Greenewald, I. Melnyk, N. Nguyen, B. Kingsbury and Y. Polyanskiy, " Estimating information flow in deep neural networks ," Proc. 36th Int. Conf. Machine Learning (ICML'2019) , Long Beach, CA, June 2019. also arxiv:1810.05728 , Oct. 2018.
  • Y. Polyanskiy and Y. Wu, " Application of information-percolation method to reconstruction problems on graphs ," Math. Stat. Learn. , vol. 2, no. 1, pp. 1--24, 2019. also arxiv:1806.04195 , Jun. 2018.
  • Z. Goldfeld, G. Bresler and Y. Polyanskiy, " Information storage in the stochastic Ising model ," IEEE Trans. Inf. Theory , vol. 67, no. 3, pp. 1373--1399, Mar. 2021; also arxiv:1805.03027 , May 2018.
  • A. Makur, E. Mossel and Y. Polyanskiy, " Broadcasting on bounded degree DAGs," arxiv:1803.07527 , Mar. 2018. NOTE: This earlier draft was subsequently split into two papers, see part 1 and part 2 .
  • N. Alon, B. Bukh, and Y. Polyanskiy, " List-decodable zero-rate codes ," IEEE Trans. Inf. Theory , vol. 65, no. 3, pp. 1657--1667, Mar. 2019; also arxiv:1710.10663 , Oct. 2017 (updated May 2018).
  • W. Yang, A. Collins, G. Durisi, Y. Polyanskiy, and H. V. Poor, " Beta-Beta bounds: finite-blocklength analog of the golden formula ," IEEE Trans. Inf. Theory , vol. 64, no. 9, pp. 6236--6256, Sep. 2018; also arxiv:1706.05972 , Jun. 2017 (updated Mar 2018).
  • A. Collins and Y. Polyanskiy, " Coherent multiple-antenna block-fading channels at finite blocklength ," IEEE Trans. Inf. Theory , vol. 65, no. 1, pp. 380--405, Jan. 2019; also arxiv:1704.06962 , Apr. 2017 (updated May 2018).
  • Y. Polyanskiy, A. T. Suresh and Y. Wu, " Sample complexity of population recovery ," Proc. Conf. On Learning Theory (COLT-2017), Jul. 2017 ; also arxiv:1702.05574 , Feb. 2017. Errata (Apr 2020): Earlier versions (incl. the one in proceedings) had a mistake in Prop. 9 that propagated to Theorem 1 (lower bound) and Lemma 12.
  • A. Makur and Y. Polyanskiy, " Comparison of channels: criteria for domination by a symmetric channel ," IEEE Trans. Inf. Theory , vol. 64, no. 8, pp. 5704--5725, Aug. 2018; also arxiv:1609.06877 , Sep. 2016 (updated Nov. 2017).
  • J. Tang, D. Wang, Y. Polyanskiy and G. Wornell, " Defect tolerance: fundamental limits and examples ," IEEE Trans. Inf. Theory , vol. 64, no. 7, pp. 5240--5260, Jul. 2018; also arxiv:1610.02578 , Feb. 2016 (updated Jul 2016, Oct. 2016, Oct. 2017).
  • F. P. Calmon, Y. Polyanskiy and Y. Wu, " Strong data processing inequalities for input constrained additive noise channels ," IEEE Trans. Inf. Theory , vol. 64, no. 3, pp. 1879--1892, Mar. 2018; also arxiv:1512.06429 , Dec. 2015.
  • Y. Polyanskiy and Y. Wu, " Strong data-processing inequalities for channels and Bayesian networks ," In Convexity, Concentration and Discrete Structures , part of The IMA Volumes in Mathematics and its Applications , vol. 161, Springer-Verlag, New York, 2017; also arxiv:1508.06025 , Aug. 2015 (updated Jan. 2016, May 2016, Jul 2016).
  • M. Dalai and Y. Polyanskiy, " Bounds for codes on pentagon and other cycles ," arxiv:1508.03020 , Aug. 2015.
  • W. Yang, G. Durisi and Y. Polyanskiy, " Minimum energy to send k bits over multiple-antenna fading channels ," IEEE Trans. Inf. Theory , vol. 62, no. 12, pp. 6831--6853, Dec. 2016; also arxiv:1507.03843 , Jul. 2015 (updated May 2016 and Dec 2016).
  • Y. Polyanskiy and Y. Wu, " Wasserstein continuity of entropy and outer bounds for interference channels ," IEEE Trans. Inf. Theory , vol. 62, no. 7, pp. 3992--4002, Jul. 2016; also arxiv:1504.04419 .
  • Y. Polyanskiy, " On metric properties of maps between Hamming spaces and related graph homomorphisms ," J. Combin. Theory Ser. A , vol. 145, pp. 227--251, 2017; also arXiv:1503.02779 .
  • V. Kostina, Y. Polyanskiy and S. Verdú, " Joint source-channel coding with feedback ," IEEE Trans. Inf. Theory , vol. 63, no. 6, pp. 3502--3515, Jun 2017; also arXiv:1501.07640 , Jan. 2015 (updated Sep. 2016, Feb. 2017).
  • G. Durisi, T. Koch, J. Östman, Y. Polyanskiy and W. Yang " Short-packet communications with multiple antennas: transmit diversity, spatial multiplexing, and channel estimation overhead ," IEEE Trans. Comm. , vol. 64, no. 2, pp. 618--629, Feb. 2016; also arXiv:1412.7512 .
  • H. Roozbehani and Y. Polyanskiy, " Algebraic methods of classifying directed graphical models ," arXiv:1401.5551 , Dec. 2014.
  • Y. Polyanskiy, " Upper bound on list-decoding radius of binary codes ," IEEE Trans. Inf. Theory , vol. 62, no. 3, pp. 1119--1128, Mar. 2016; also arXiv:1409.7765 .
  • W. Yang, G. Caire, G. Durisi and Y. Polyanskiy, " Optimum power control at finite blocklength ," IEEE Trans. Inf. Theory , vol. 61, no. 9, pp. 4598--4615, Sep. 2015; also arXiv:1406.5422 .
  • Y. Polyanskiy and Y. Wu, " Dissipation of information in channels with input constraints ," IEEE Trans. Inf. Theory , vol. 62, no. 1, pp. 35--55, Jan. 2016; also arXiv:1405.3629 .
  • V. Kostina, Y. Polyanskiy and S. Verdú, " Variable-length compression allowing errors ," IEEE Trans. Inf. Theory , vol. 61, no. 8, pp.4316--4330, Aug. 2015; also arXiv:1402.0608 .
  • W. Yang, G. Durisi, T. Koch and Y. Polyanskiy, " Quasi-static multiple-antenna fading channels at finite blocklength ," IEEE Trans. Inf. Theory , vol. 60, no. 7, pp.4232--4265, Jul. 2014.
  • A. Makhdoumi, S.-L. Huang, M. Médard and Y. Polyanskiy, " On Locally Decodable Source Coding ," arXiv:1308.5239, Aug. 2013 .
  • Y. Polyanskiy, " Hypercontractivity of spherical averages in Hamming space ," SIAM J. Discrete Math. , vol. 33, no. 2, pp. 731--754, 2019. also arxiv:1309.3014 , Jun. 2013 (updated Nov. 2015, Jun. 2017, Jun. 2018).
  • Y. Polyanskiy, " Hypothesis testing via a comparator and hypercontractivity ," preprint , Feb. 2013.
  • Y. Polyanskiy, " Saddle point in the minimax converse for channel coding ," IEEE Trans. Inf. Theory , vol. 59, no. 5, pp. 2576-2595, May 2013.
  • Y. Polyanskiy and S. Verdú, " Empirical distribution of good channel codes with non-vanishing error probability ," IEEE Trans. Inf. Theory , vol. 60, no. 1, pp. 5-21, Jan. 2014. Extended version (with Section VII included): arXiv:1309.0141 , Aug. 2013.
  • Y. Polyanskiy, " Asynchronous communication: exact synchronization, universality, and dispersion ," IEEE Trans. Inf. Theory , vol. 59, no. 3, pp. 1256-1270, Mar. 2013.
  • Y. Polyanskiy, H. V. Poor and S. Verdú, " Feedback in the non-asymptotic regime ," IEEE Trans. Inf. Theory , vol. 57, no. 8, pp. 4903 - 4925, Aug. 2011.
  • Y. Polyanskiy, H. V. Poor and S. Verdú, " Minimum energy to send k bits through the Gaussian channel with and without feedback ," IEEE Trans. Inf. Theory , vol. 57, no. 8, pp. 4880 - 4902, Aug. 2011.
  • Y. Polyanskiy, H. V. Poor and S. Verdú, " Dispersion of the Gilbert-Elliot channel ," IEEE Trans. Inf. Theory , vol. 57, no. 4, pp. 1829-1848, Apr. 2011.
  • Y. Polyanskiy, H. V. Poor and S. Verdú, " Channel coding rate in the finite blocklength regime ," IEEE Trans. Inf. Theory , vol. 56, no. 5, pp. 2307-2359, May 2010. Errata: In (175) the upper limit of summation should be (ℓ-1). (Thanks D. Divsalar!). Analysis of case 1 in Theorem 48 has a gap, fixed by Cao and Tomamichel .
  • V. Gorokhov, G. Popelnukha, G. Polyanskiy, Y. Polyanskiy, V. Tsukanov, "Switchboard for managing submersible electric motor," Russian Federation Patent N 31061 (RU) , Jul. 10, 2003.

Conference Proceedings

  • T. Jayashankar, B. Kurien, A. Lancho, G. C.F. Lee, Y. Polyanskiy, A. Weiss, and G. W. Wornell, " The data-driven radio frequency signal separation challenge ," IEEE ICASSP 2024 , Seoul, Korea, Apr. 2024
  • A. Fengler, A. Lancho, and Y. Polyanskiy, " Coded orthogonal modulation for the multi-antenna multiple-access channel ," IEEE Comm. Theory Workshop (CTW-2023) , Taiwan, Jul. 2023 also arxiv:2307.01095 , Jul. 2023
  • A. Teh and Y. Polyanskiy, " Comparing Poisson and Gaussian channels (extended) ," 2023 IEEE Int. Symp. Inf. Theory (ISIT) , Taipei, Taiwan, Jun. 2023 also arxiv:2306.16735 , Jun. 2023
  • Q. Yu and Y. Polyanskiy, " Uniqueness of distributional BP fixed point in Ising model on trees ," 2023 IEEE Int. Symp. Inf. Theory (ISIT) , Taipei, Taiwan, Jun. 2023
  • A. Fengler, A. Lancho, K. Narayanan, and Y. Polyanskiy, " On the advantages of asynchrony in the unsourced MAC ," 2023 IEEE Int. Symp. Inf. Theory (ISIT) , Taipei, Taiwan, Jun. 2023
  • G. Lee, A. Weiss, A. Lancho, Y. Polyanskiy, and G. Wornell, " On neural architectures for deep learning-based source separation of co-channel OFDM signals ," IEEE ICASSP 2023 , Greece, Jun. 2023
  • A. Fengler, G. Liva and Y. Polyanskiy, " Sparse graph codes for the 2-user unsourced MAC ," 2022 Asilomar Conf. Sig. Syst. Comp. , Pacific Grove, CA, USA, Nov. 2022
  • A. Lancho, A. Weiss, G. Lee, J. Tang, Y. Bu, Y. Polyanskiy, and G. Wornell, " Data-driven blind synchronization and interference rejection for digital communication signals ," IEEE GLOBECOM 2022 , Rio de Janeiro, Brazil, Dec. 2022
  • A. Lancho, A. Fengler, and Y. Polyanskiy " Finite-blocklength results for the A-channel: applications to unsourced random access and group testing ," 58th Allerton Conference, Allerton Retreat Center, Monticello, IL, USA, Sep. 2022 also arxiv:2210.01951 , Oct. 2022
  • G. Lee, A. Weiss, A. Lancho, J. Tang, Y. Bu, Y. Polyanskiy, and G. W. Wornell, " Exploiting temporal structures of cyclostationary signals for data-driven single-channel source separation ," 32nd IEEE Int. Workshop Machine Learning for Sig. Proc. (MLSP) , Xi'an, China, Aug. 2022 ( Best student paper award ).
  • J. Tang and Y. Polyanskiy, " Capacity of noisy permutation channels ," 2022 IEEE Int. Symp. Inf. Theory (ISIT) , Espoo, Finland, Jun. 2022 ( Best student paper award ).
  • A. Adler, J. Tang and Y. Polyanskiy, " Efficient representation of large-alphabet probability distributions via arcsinh-compander ," 2022 IEEE Int. Symp. Inf. Theory (ISIT) , Espoo, Finland, Jun. 2022.
  • G. Liva and Y. Polyanskiy, " On coding techniques for unsourced multiple-access ," 2021 Asilomar Conf. Sig. Syst. Comp. , Pacific Grove, CA, USA, Nov. 2021
  • J.-F. Chamberland, K. Narayanan, and Y. Polyanskiy, " Unsourced multiple access (UMAC): information theory and coding (tutorial) ," 2021 IEEE Int. Symp. Inf. Theory (ISIT) , Melbourne, Australia, Jul. 2021
  • A. Adler, J. Tang and Y. Polyanskiy, " Quantization of random distributions under KL divergence ," 2021 IEEE Int. Symp. Inf. Theory (ISIT) , Melbourne, Australia, Jul. 2021 Errata: In Theorem 1 constants c1 and c2 should depend on alpha; typos in the proof of Prop. 7; Lemma 2(iii) applies to uniform p
  • A. Makur, E. Mossel and Y. Polyanskiy, " Reconstruction on 2D regular grids ," 2021 IEEE Int. Symp. Inf. Theory (ISIT) , Melbourne, Australia, Jul. 2021
  • Q. Yu and Y. Polyanskiy, " Broadcasting on trees near criticality: perturbation theory ," 2021 IEEE Int. Symp. Inf. Theory (ISIT) , Melbourne, Australia, Jul. 2021
  • Y. Gu, H. Roozbehani and Y. Polyanskiy, " Broadcasting on trees near criticality ," 2020 IEEE Int. Symp. Inf. Theory (ISIT) , Los Angeles, CA, USA, Jun. 2020. also arxiv:2005.07801 , May 2020
  • H. Roozbehani and Y. Polyanskiy, " Graceful degradation over the BEC via non-linear codes ," 2020 IEEE Int. Symp. Inf. Theory (ISIT) , Los Angeles, CA, USA, Jun. 2020.
  • Y. Polyanskiy, " Remarks on massive random access (slides) ," DLR-MIT-TUM Workshop , DLR, Munich, Germany, Feb. 2020
  • S. Kowshik, K. Andreev, A. Frolov and Y. Polyanskiy, " Short-packet low-power coded access for massive MAC ," 2019 Asilomar Conf. Sig. Syst. Comp. , Pacific Grove, CA, USA, Nov. 2019
  • A. Makur, E. Mossel and Y. Polyanskiy, " Broadcasting on random networks ," 2019 IEEE Int. Symp. Inf. Theory (ISIT) , Paris, France, Jul. 2019.
  • S. Kowshik and Y. Polyanskiy, " Quasi-static fading MAC with many users and finite payload ," 2019 IEEE Int. Symp. Inf. Theory (ISIT) , Paris, France, Jul. 2019.
  • I. Zadik, Y. Polyanskiy, and C. Thrampoulidis, " Improved bounds on Gaussian MAC and sparse regression via Gaussian inequalities ," 2019 IEEE Int. Symp. Inf. Theory (ISIT) , Paris, France, Jul. 2019.
  • S. Kowshik, K. Andreev, A. Frolov and Y. Polyanskiy, " Energy efficient random access for the quasi-static fading MAC ," 2019 IEEE Int. Symp. Inf. Theory (ISIT) , Paris, France, Jul. 2019.
  • Z. Goldfeld, K. Greenewald, J. Weed and Y. Polyanskiy, " Optimality of the plug-in estimator for differential entropy estimation under Gaussian convolutions ," 2019 IEEE Int. Symp. Inf. Theory (ISIT) , Paris, France, Jul. 2019.
  • U. Hadar, J. Liu, Y. Polyanskiy and O. Shayevitz, " Error exponents in distributed hypothesis testing of correlations ," 2019 IEEE Int. Symp. Inf. Theory (ISIT) , Paris, France, Jul. 2019.
  • Y. Kochman, O. Ordentlich and Y. Polyanskiy, " A lower bound on the expected distortion of joint source-channel coding ," 2019 IEEE Int. Symp. Inf. Theory (ISIT) , Paris, France, Jul. 2019.
  • W. Huleihel, Y. Polyanskiy and O. Shayevitz, " Relaying one bit across a tandem of binary-symmetric channels ," 2019 IEEE Int. Symp. Inf. Theory (ISIT) , Paris, France, Jul. 2019.
  • Z. Goldfeld, G. Bresler and Y. Polyanskiy, " Information storage in the stochastic Ising model at low temperature ," 2019 IEEE Int. Symp. Inf. Theory (ISIT) , Paris, France, Jul. 2019.
  • C. Thrampoulidis, I. Zadik and Y. Polyanskiy, " A simple bound on the BER of the MAP decoder for massive MIMO systems ," Proc. 2019 IEEE ICASSP , Brighton, UK, May 2019.
  • Y. Polyanskiy, " Modern aspects of multiple access for IoT (4 lectures) ," short course , SkolTech Inst. Of Tech., Moscow, Russia, Jul. 2018.
  • Y. Polyanskiy, " Information-theoretic perspective on massive multiple-access (tutorial) ," 2018 North-American School of Information Theory , Texas A&M University, College Station, TX, May 2018
  • Z. Goldfeld, G. Bresler and Y. Polyanskiy, " Information storage in the stochastic Ising model at zero temperature ," 2018 IEEE Int. Symp. Inf. Theory (ISIT) , Vail, CO, USA, Jun. 2018.
  • O. Ordentlich and Y. Polyanskiy, " Entropy under additive Bernoulli and spherical noises ," 2018 IEEE Int. Symp. Inf. Theory (ISIT) , Vail, CO, USA, Jun. 2018.
  • H. Hassani, S. Kudekar, O. Ordentlich, Y. Polyanskiy and R. Urbanke, " Almost optimal scaling of Reed-Muller codes on BEC and BSC channels ," 2018 IEEE Int. Symp. Inf. Theory (ISIT) , Vail, CO, USA, Jun. 2018.
  • Y. Kochman, O. Ordentlich and Y. Polyanskiy, " Ozarow-type outer bounds for memoryless sources and channels ," 2018 IEEE Int. Symp. Inf. Theory (ISIT) , Vail, CO, USA, Jun. 2018.
  • H. Roozbehani and Y. Polyanskiy, " Input-output distance properties of good linear codes ," 2018 IEEE Int. Symp. Inf. Theory (ISIT) , Vail, CO, USA, Jun. 2018.
  • H. Roozbehani and Y. Polyanskiy, " Triangulation codes: a family of non-linear codes with graceful degradation ," 2018 Conf. Inform. Sciences and Syst. (CISS) , Princeton, NJ, USA, Mar.. 2018.
  • Y. Polyanskiy, " A perspective on massive random-access ," 2017 IEEE Int. Symp. Inf. Theory (ISIT) , Aachen, Germany, Jun. 2017.
  • O. Ordentlich and Y. Polyanskiy, " Low complexity schemes for the random access Gaussian channel ," 2017 IEEE Int. Symp. Inf. Theory (ISIT) , Aachen, Germany, Jun. 2017. also see: " Low complexity schemes for the random access Gaussian channel (extended version) ."
  • A. Makur and Y. Polyanskiy, " Less noisy domination by symmetric channels ," 2017 IEEE Int. Symp. Inf. Theory (ISIT) , Aachen, Germany, Jun. 2017.
  • B. Nazer, O. Ordentlich, and Y. Polyanskiy, " Information-distilling quantizers ," 2017 IEEE Int. Symp. Inf. Theory (ISIT) , Aachen, Germany, Jun. 2017.
  • Y. Sun, Y. Polyanskiy, and E. Uysal-Biyikoglu, " Remote estimation of the Wiener process over a channel with random delay ," 2017 IEEE Int. Symp. Inf. Theory (ISIT) , Aachen, Germany, Jun. 2017.
  • J. Tang, D. Wang, Y. Polyanskiy, and G. Wornell, " Defect tolerance: fundamental limits and examples ," 2016 IEEE Int. Symp. Inf. Theory (ISIT) , Barcelona, Spain, Jul. 2016.
  • A. Collins and Y. Polyanskiy, " Dispersion of the coherent MIMO block-fading channel ," 2016 IEEE Int. Symp. Inf. Theory (ISIT) , Barcelona, Spain, Jul. 2016. Errata: Proposition 3 is incorrect. There are many caids, but for all of them entries are pairwise independent. Consequently, they all achieve the same dispersion. See the journal version for details.
  • W. Yang, A. Collins, G. Durisi, Y. Polyanskiy, and H. V. Poor, " A beta-beta achievability bound with applications ," 2016 IEEE Int. Symp. Inf. Theory (ISIT) , Barcelona, Spain, Jul. 2016.
  • Y. Polyanskiy and Y. Wu, " Converse bounds for interference channels via coupling and proof of Costa's conjecture ," 2016 IEEE Int. Symp. Inf. Theory (ISIT) , Barcelona, Spain, Jul. 2016.
  • A. Mazumdar, Y. Polyanskiy, A.S. Rawat and H. Roozbehani, " Distance-preserving maps and combinatorial joint source-channel coding for large alphabets ," 2016 IEEE Int. Symp. Inf. Theory (ISIT) , Barcelona, Spain, Jul. 2016.
  • D. Cullina, M. Dalai and Y. Polyanskiy, " Rate-distance tradeoff for codes above graph capacity ," 2016 IEEE Int. Symp. Inf. Theory (ISIT) , Barcelona, Spain, Jul. 2016.
  • M. Dalai and Y. Polyanskiy " Bounds on the reliability of a typewriter channel ," 2016 IEEE Int. Symp. Inf. Theory (ISIT) , Barcelona, Spain, Jul. 2016.
  • Y. Polyanskiy, " Finite blocklength information theory (tutorial) ," 2016 European School of Information Theory (ESIT) , Chalmers Univ., Gothenburg, Sweden, Apr. 2016
  • M. Dalai and Y. Polyanskiy, " Bounds for codes on pentagon and other cycles ," 53rd Allerton Conference 2015, Allerton Retreat Center, Monticello, IL, USA, Oct. 2015.
  • G. Ajjanagadde and Y. Polyanskiy, " Adder MAC and estimates for Rényi entropy ," 53rd Allerton Conference 2015, Allerton Retreat Center, Monticello, IL, USA, Oct. 2015.
  • Y. Polyanskiy, " Upper bound on list-decoding radius of binary codes ," 2015 IEEE Int. Symp. Inf. Theory (ISIT) , Hong Kong, China, Jun. 2015.
  • A. Young and Y. Polyanskiy, " Converse and duality results for combinatorial source-channel coding in binary Hamming spaces ," 2015 IEEE Int. Symp. Inf. Theory (ISIT) , Hong Kong, China, Jun. 2015.
  • F. Calmon, Y. Polyanskiy and Y. Wu, " Strong data processing inequalities in power-constrained Gaussian channels ," 2015 IEEE Int. Symp. Inf. Theory (ISIT) , Hong Kong, China, Jun. 2015.
  • W. Yang, G. Durisi and Y. Polyanskiy, " Minimum energy to send k bits over Rayleigh-fading channels ," 2015 IEEE Int. Symp. Inf. Theory (ISIT) , Hong Kong, China, Jun. 2015.
  • V. Kostina, Y. Polyanskiy and S. Verdú, " Joint source-channel coding with feedback ," 2015 IEEE Int. Symp. Inf. Theory (ISIT) , Hong Kong, China, Jun. 2015.
  • Y. Polyanskiy and Y. Wu, " Strong data-processing of mutual information: beyond Ahlswede and Gács ," 2015 Information Theory and Applications Workshop (ITA) , UCSD, San Diego, CA, USA, Feb. 2015.
  • Y. Polyanskiy, " Hypercontractivity in Hamming space (Invited) ," 52nd Allerton Conference 2014, Allerton Retreat Center, Monticello, IL, USA, Oct. 2014.
  • A. Collins, Y. Polyanskiy, " Orthogonal designs optimize achievable dispersion for coherent MISO channels ," 2014 IEEE Int. Symp. Inf. Theory (ISIT) , Honolulu, Hawaii, Jul 2014. Errata: Proposition 1 erroneously claims that any caid is necessarily jointly Gaussian. In truth, only rows/columns of the caid are guaranteed to be jointly Gaussian. Consequently, proof of Theorem 6 needs to be modified slightly to account for non-jointly Gaussian caids.
  • V. Kostina, Y. Polyanskiy, S. Verdú, " Variable-length compression allowing errors ," 2014 IEEE Int. Symp. Inf. Theory (ISIT) , Honolulu, Hawaii, Jul 2014.
  • D. Wang, Y. Polyanskiy, G. Wornell, " Scalar quantization with noisy partitions and its application to flash ADC design ," 2014 IEEE Int. Symp. Inf. Theory (ISIT) , Honolulu, Hawaii, Jul 2014.
  • H. Roozbehani, Y. Polyanskiy, " Algebraic methods of classifying directed graphical models ," 2014 IEEE Int. Symp. Inf. Theory (ISIT) , Honolulu, Hawaii, Jul 2014.
  • M. Johnston, E. Modiano, Y. Polyanskiy, " Opportunistic scheduling with limited channel state information: a rate distortion approach ," 2014 IEEE Int. Symp. Inf. Theory (ISIT) , Honolulu, Hawaii, Jul 2014.
  • W. Yang, G. Caire, G. Durisi, Y. Polyanskiy, " Finite-blocklength channel coding rate under a long-term power constraint ," 2014 IEEE Int. Symp. Inf. Theory (ISIT) , Honolulu, Hawaii, Jul 2014.
  • W. Yang, G. Durisi, T. Koch, Y. Polyanskiy, " Dispersion of quasi-static MIMO fading channels via Stokes' theorem ," 2014 IEEE Int. Symp. Inf. Theory (ISIT) , Honolulu, Hawaii, Jul 2014.
  • Y. Polyanskiy and Y. Wu, " Dissipation of information in channels with input constraints ," Theory Group Seminar , Microsoft Research, Redmond, WA, May 2014.
  • W. Gao and Y. Polyanskiy, " On the bit error rate of repeated error-correcting codes ," 48th Conference on Information Sciences and Systems (CISS) 2014 , Princeton, NJ, USA, Mar 2014.
  • Y. Polyanskiy, " On dispersion of compound DMCs (Invited) ," 51st Allerton Conference 2013, Allerton Retreat Center, Monticello, IL, USA, Oct. 2013.
  • Y. Polyanskiy, " Finite blocklength methods in channel coding (tutorial) ," Part of the tutorial delivered with S. Verdú at 2013 IEEE Int. Symp. Inf. Theory (ISIT) , Istanbul, Turkey, Jul 2013.
  • W. Yang, G. Durisi, T. Koch, Y. Polyanskiy, " Quasi-static SIMO fading channels at finite blocklength ," 2013 IEEE Int. Symp. Inf. Theory (ISIT) , Istanbul, Turkey, Jul 2013.
  • A. Mazumdar, Y. Polyanskiy, B. Saha, " On Chebyshev radius of a set in Hamming space and the closest string problem ," 2013 IEEE Int. Symp. Inf. Theory (ISIT) , Istanbul, Turkey, Jul 2013. Errata: Results in Section II.A (in particular Theorem 1) are correct, but trivial: taking set S to be the entire space shows Jung constant of Hamming space equals 2 for all dimensions n.
  • A. Andoni, H. L. Nguyen, Y. Polyanskiy, Y. Wu, " Tight lower bound for linear sketches of moments ," 40th Internat. Coll. Automata, Languages, and Programming (ICALP-2013), Riga, Latvia, Jul 2013.
  • Y. Polyanskiy, " Lp-norms of codewords from capacity and dispersion-achieving Gaussian codes (Invited) ," 50th Allerton Conference 2012, Allerton Retreat Center, Monticello, IL, USA, Oct. 2012.
  • Y. Kochman, A. Mazumdar, Y. Polyanskiy, " Results on combinatorial joint source-channel coding (Invited) ," 2012 Inf. Theory Workshop (ITW) , Lausanne, Switzerland, Sep 2012. Errata: W. Gao and Y. Polyanskiy found Theorem 7 to be in error (Apr. 2013).
  • W. Yang, G. Durisi, T. Koch, Y. Polyanskiy, " Diversity versus channel knowledge at finite block-length ," 2012 Inf. Theory Workshop (ITW) , Lausanne, Switzerland, Sep 2012.
  • Y. Kochman, A. Mazumdar, Y. Polyanskiy, " The adversarial joint source-channel problem ," 2012 IEEE Int. Symp. Inf. Theory (ISIT) , Cambridge, MA, Jul 2012.
  • Y. Polyanskiy, " Hypothesis testing via a comparator ," 2012 IEEE Int. Symp. Inf. Theory (ISIT) , Cambridge, MA, Jul 2012.
  • Y. Polyanskiy, " On asynchronous capacity and dispersion (Invited) ," 46th Conference on Information Sciences and Systems (CISS) 2012 , Princeton, NJ, USA, Mar 2012.
  • Y. Polyanskiy and S. Verdú, " Relative entropy at the channel output of a capacity-achieving code ," 49th Allerton Conference 2011, Allerton Retreat Center, Monticello, IL, USA, Sep. 2011.
  • Y. Polyanskiy and S. Verdú, " Scalar coherent fading channel: dispersion analysis ," 2011 IEEE Int. Symp. Inf. Theory (ISIT) , St. Petersburg, Russia, Aug. 2011.
  • Y. Polyanskiy and S. Verdú, " Binary hypothesis testing with feedback (slides) ," 2011 Information Theory and Applications Workshop (ITA) , UCSD, San Diego, CA, USA, Feb. 2011.
  • Y. Polyanskiy and S. Verdú, " Arimoto channel coding converse and Rényi divergence ," 48th Allerton Conference 2010, Allerton Retreat Center, Monticello, IL, USA, Sep. 2010. Errata: Typo in Theorem 5.1. The correct version is "concave in P_X and convex in P_{Y|X}". (Thanks Siu-Wai Ho!)
  • Y. Polyanskiy and S. Verdú, " Channel dispersion and moderate deviations limits for memoryless channels ," 48th Allerton Conference 2010, Allerton Retreat Center, Monticello, IL, USA, Sep. 2010.
  • Y. Polyanskiy, H. V. Poor and S. Verdú, " Memoryless channels: the benefits of feedback in the non-asymptotic regime (poster) ," 2010 School of Inf. Theory , Univ. of Southern California, Los Angeles, CA, USA, Aug. 2010.
  • Y. Polyanskiy, H. V. Poor and S. Verdú, " Variable-length coding with feedback in the non-asymptotic regime ," 2010 IEEE Int. Symp. Inf. Theory (ISIT) , Austin, TX, USA, Jun. 2010 ( Best student paper award ).
  • Y. Polyanskiy, H. V. Poor and S. Verdú, " Minimum energy to send k bits with and without feedback ," 2010 IEEE Int. Symp. Inf. Theory (ISIT) , Austin, TX, USA, Jun. 2010.
  • Y. Polyanskiy, H. V. Poor and S. Verdú, " Finite blocklength results in channel coding (poster) ," 2009 School of Inf. Theory , Northwestern Univ., IL, USA, Aug. 2009.
  • Y. Polyanskiy, H. V. Poor and S. Verdú, " Dispersion of Gaussian channels ," 2009 IEEE Int. Symp. Inf. Theory (ISIT) , Seoul, Korea, Jul. 2009.
  • Y. Polyanskiy, H. V. Poor and S. Verdú, " Dispersion of the Gilbert-Elliot channel ," 2009 IEEE Int. Symp. Inf. Theory (ISIT) , Seoul, Korea, Jul. 2009.
  • Y. Polyanskiy, H. V. Poor and S. Verdú, " New channel coding achievability bounds ," 2008 IEEE Int. Symp. Inf. Theory (ISIT) , Toronto, Canada, Jun. 2008 ( Best student paper award ).

Old preprints, lecture notes, etc

  • Y. Polyanskiy, Y. Wu, " Lecture notes on Information Theory ," MIT (6.441), UIUC (ECE 563), Yale (STAT 664) , 2012-2017.

Selected Topics in Information and Coding Theory

Series on Coding Theory and Cryptology: Volume 7

  • Edited by: I. Woungang (Ryerson University, Canada), S. Misra (Indian Institute of Technology, Kharagpur, India), and S. C. Misra (State University of New York at Buffalo, USA)

The last few years have witnessed rapid advancements in information and coding theory research and applications. This book provides a comprehensive guide to selected topics, both ongoing and emerging, in information and coding theory. Consisting of contributions from well-known and high-profile researchers in their respective specialties, topics that are covered include source coding; channel capacity; linear complexity; code construction, existence and analysis; bounds on codes and designs; space-time coding; LDPC codes; and codes and cryptography.

All of the chapters are integrated in a manner that renders the book as a supplementary reference volume or textbook for use in both undergraduate and graduate courses on information and coding theory. As such, it will be a valuable text for students at both undergraduate and graduate levels as well as instructors, researchers, engineers, and practitioners in these fields.

Supporting PowerPoint slides are available upon request for all instructors who adopt this book as a course text. Please send your request to [email protected].

  • Linear Complexity and Related Complexity Measures (A Winterhof)
  • Lattice and Construction of High Coding Gain Lattices from Codes (M-R Sadeghi)
  • Distributed Space-Time Codes with Low ML Decoding Complexity (G S Rajan & B S Rajan)
  • Coding Theory and Algebraic Combinatorics (M Huber)
  • Block Codes from Matrix and Group Rings (P Hurley & T Hurley)
  • LDPC and Convolutional Codes from Matrix and Group Rings (P Hurley & T Hurley)
  • Search for Good Linear Codes in the Class of Quasi-Cyclic and Related Codes (N Aydin & T Asamov)
  • Applications of Universal Source Coding to Statistical Analysis of Time Series (B Ryabko)
  • Introduction to Network Coding for Acyclic and Cyclic Networks (Á I Barbero & Øyvind Ytrehus)
  • Distributed Joint Source-Channel Coding on a Multiple Access Channel (V Sharma & R Rajesh)
  • Low-Density Parity-Check Codes and the Related Performance Analysis Methods (X-D Ma)
  • Variable Length Codes and Finite Automata (M-P Béal et al.)
  • Decoding and Finding the Minimum Distance with Gröbner Bases: History and New Insights (S Bulygin & R Pellikaan)
  • Cooperative Diversity Systems for Wireless Communication (M Uysal & M M Fareed)
  • Public Key Cryptography and Coding Theory (P Véron)
  • Traces the steep growth in research and applications of information and coding theory over the last several years
  • Written in a tutorial style and accompanied by PowerPoint slides, making it suitable for classroom instruction as well as for use by researchers and industry professionals
  • Includes question-and-answer sets in each chapter to help readers test their understanding of the concepts

FRONT MATTER

  • I. Woungang ,
  • S. Misra , and 
  • S. C. Misra
  • Pages: i–xviii

https://doi.org/10.1142/9789812837172_fmatter

  • CONTRIBUTORS

Part 1: Applications of Coding Theory to Computational Complexity

LINEAR COMPLEXITY AND RELATED COMPLEXITY MEASURES

  • ARNE WINTERHOF
  • Pages: 3–40

https://doi.org/10.1142/9789812837172_0001

The linear complexity of a sequence is not only a measure of its unpredictability, and thus of its suitability for cryptography, but is also of interest in information theory because of its close relation to the Kolmogorov complexity. However, in contrast to the Kolmogorov complexity, the linear complexity is computable and therefore of practical significance.

It is also linked to coding theory. On the one hand, the linear complexity of a sequence can be estimated in terms of its correlation, and there are strong ties between low-correlation sequence design and the theory of error-correcting codes. On the other hand, the linear complexity can be calculated with the Berlekamp-Massey algorithm, which was originally introduced for decoding BCH codes.

This chapter surveys several mainly number theoretic methods for the theoretical analysis of the linear complexity and related complexity measures and describes several classes of particularly interesting sequences with high linear complexity.
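To make this concrete, here is a minimal sketch (our illustration, not part of the chapter) of the Berlekamp-Massey algorithm over GF(2); it returns the linear complexity of a binary sequence, i.e. the length of the shortest LFSR that generates it. The function name and test sequences are ours.

```python
def linear_complexity(s):
    """Berlekamp-Massey over GF(2): return the linear complexity
    (shortest LFSR length) of the binary sequence s."""
    n = len(s)
    c = [1] + [0] * n   # current connection polynomial C(x)
    b = [1] + [0] * n   # previous connection polynomial B(x)
    L, m = 0, -1        # current LFSR length, last index where L changed
    for i in range(n):
        # discrepancy between the LFSR prediction and the actual bit
        d = s[i]
        for j in range(1, L + 1):
            d ^= c[j] & s[i - j]
        if d:
            t = c[:]
            shift = i - m
            for j in range(n + 1 - shift):   # C(x) += x^shift * B(x)
                c[shift + j] ^= b[j]
            if 2 * L <= i:
                L, m, b = i + 1 - L, i, t
    return L

print(linear_complexity([0, 0, 0, 1]))        # 4: a lone 1 forces a long register
print(linear_complexity([1, 0, 1, 0, 1, 0]))  # 2: period-2 pattern s[i] = s[i-2]
```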

LATTICE AND CONSTRUCTION OF HIGH CODING GAIN LATTICES FROM CODES

  • MOHAMMAD-REZA SADEGHI
  • Pages: 41–76

https://doi.org/10.1142/9789812837172_0002

In this study, we conduct a comprehensive investigation of lattices and several constructions of lattices from codes. These constructions include known constructions such as Constructions A, B, C, D, and D′. For each construction we derive corresponding lattice factors such as label groups, label codes, etc. Finally, we compare these constructions to arrive at a possible lattice construction with high coding gain using turbo codes and low-density parity-check codes.
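For orientation (our addition, not from the chapter), Construction A, the simplest of the constructions named above, lifts a binary linear code C of length n to a lattice in Z^n:

```latex
\Lambda_A(C) \;=\; \{\, x \in \mathbb{Z}^n : x \bmod 2 \in C \,\} \;=\; C + 2\,\mathbb{Z}^n .
```

For example, the [n, n-1] single-parity-check code yields the checkerboard lattice D_n; higher coding gain comes from stronger codes and from the more elaborate Constructions B, C, D, and D′.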

DISTRIBUTED SPACE-TIME CODES WITH LOW ML DECODING COMPLEXITY

  • G. SUSINDER RAJAN  and 
  • B. SUNDAR RAJAN
  • Pages: 77–117

https://doi.org/10.1142/9789812837172_0003

Cooperative communication in a wireless relay network consisting of a source node, a destination node, and several relay nodes equipped with half duplex single antenna transceivers is considered. Recently, for such a setting, it has been shown that diversity order equal to the number of relays can be achieved provided appropriate distributed space-time block codes (DSTBCs) are manufactured/used by the relay nodes, thus emulating a virtual multiple input multiple output (MIMO) system. This chapter deals with DSTBCs with low maximum likelihood (ML) decoding complexity. After a brief introduction and background to the study of STBCs and DSTBCs, a survey of STBC constructions for point to point MIMO systems that are amenable for low complexity ML decoding is presented. Then, DSTBC constructions achieving full cooperative diversity along with low ML decoding complexity are discussed separately for two cases: (i) the destination has perfect channel state information (CSI) of the source to relay as well as the relay to destination channel fading gains and no CSI is available at the source and relay nodes and (ii) the destination has full CSI and the relays have partial CSI corresponding to the phase component of the source to relay channel gains. For some of these cases, upper bounds on the maximum rate of single symbol ML decodable DSTBCs are also provided. These are counterparts of the well-known complex orthogonal designs for point to point MIMO systems that includes the Alamouti code as a special case.
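For reference (our addition), the Alamouti code mentioned at the end of the abstract is the simplest single-symbol ML-decodable design: two symbols s_1, s_2 are sent from two antennas (rows) over two time slots (columns),

```latex
X =
\begin{pmatrix}
  s_1 & -s_2^{*} \\
  s_2 & \phantom{-}s_1^{*}
\end{pmatrix},
\qquad
X^{\mathsf{H}} X = \bigl( |s_1|^{2} + |s_2|^{2} \bigr) I_2 ,
```

and the orthogonality X^H X ∝ I_2 is what decouples the ML metric into a separate term per symbol.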

Part 2: Methods of Algebraic Combinatorics in Coding Theory/Codes Construction and Existence

CODING THEORY AND ALGEBRAIC COMBINATORICS

  • MICHAEL HUBER
  • Pages: 121–158

https://doi.org/10.1142/9789812837172_0004

This chapter introduces and elaborates on the fruitful interplay of coding theory and algebraic combinatorics, with most of the focus on the interaction of codes with combinatorial designs, finite geometries, simple groups, sphere packings, kissing numbers, lattices, and association schemes. In particular, special interest is devoted to the relationship between codes and combinatorial designs. We describe and recapitulate important results in the development of the state-of-the-art. In addition, we give illustrative examples and constructions, and highlight recent advances. Finally, we provide a collection of significant open problems and challenges concerning future research.

BLOCK CODES FROM MATRIX AND GROUP RINGS

  • PAUL HURLEY  and 
  • TED HURLEY
  • Pages: 159–194

https://doi.org/10.1142/9789812837172_0005

In this chapter, the algebra of group rings and matrix rings is used to construct and analyze systems of zero-divisor and unit-derived codes. These codes are more general than codes derived from ideals (e.g. cyclic codes) in group rings. They expand the space of linear block codes, offering additional flexibility in the desired properties of the algebraic formulations, while also having readily available generator and check matrices.

A primer on the necessary algebra is presented, showing how group rings and certain rings of matrices can be used interchangeably. It is then shown how the codes may be derived, with particular cases such as self-dual codes and codes from dihedral group rings.
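To ground the group-ring/matrix correspondence, here is a small sketch of ours (not from the chapter): an element of the cyclic group ring F_2[C_n] corresponds to an n×n circulant matrix, and a zero-divisor pair u·v = 0 gives a code generated by rows of the matrix of u, with the columns of the matrix of v acting as parity checks. Below n = 7, u = 1 + x + x^3 and v = 1 + x + x^2 + x^4, whose product is x^7 + 1 = 0 in the group ring.

```python
import numpy as np

def circulant(exponents, n):
    """n x n circulant matrix over GF(2) for the group-ring element
    sum of x^e over the given exponents (a polynomial mod x^n + 1)."""
    row = np.zeros(n, dtype=int)
    for e in exponents:
        row[e % n] ^= 1
    return np.array([np.roll(row, k) for k in range(n)])

n = 7
U = circulant([0, 1, 3], n)      # u = 1 + x + x^3
V = circulant([0, 1, 2, 4], n)   # v = 1 + x + x^2 + x^4; u*v = x^7 + 1 = 0

print(np.all(U @ V % 2 == 0))    # True: u and v are zero-divisors in F2[C7]
# Rows of U span a [7,4] cyclic code (equivalent to the Hamming code); since every
# codeword c = m @ U % 2 satisfies c @ V % 2 = 0, the columns of V are its parity checks.
```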

LDPC AND CONVOLUTIONAL CODES FROM MATRIX AND GROUP RINGS

  • PAUL HURLEY  and 
  • TED HURLEY
  • Pages: 195–237

https://doi.org/10.1142/9789812837172_0006

In this chapter, low density parity check (LDPC) codes and convolutional codes are constructed and analyzed using matrix and group rings. It is shown that LDPC codes may be constructed using units or zero-divisors of small support in group rings. From the algebra it is possible to identify where short cycles occur in the matrix of a group ring element, thereby allowing the construction, directly and algebraically, of LDPC codes with no short cycles. It is also possible to construct units of small support in group rings with no short cycles at all in their matrices, thus allowing a huge choice of LDPC codes with no short cycles, all produced from a single unit element. A general method is given for constructing codes from units in abstract systems. Applying the general method to group rings, with their rich algebraic structure, allows the construction and analysis of series of convolutional codes. Convolutional codes are constructed and analyzed within group rings of the infinite cyclic group over rings which are themselves group rings.

SEARCH FOR GOOD LINEAR CODES IN THE CLASS OF QUASI-CYCLIC AND RELATED CODES

  • NUH AYDIN  and 
  • TSVETAN ASAMOV
  • Pages: 239–285

https://doi.org/10.1142/9789812837172_0007

This chapter gives an introduction to algebraic coding theory and a survey of constructions of some of the well-known classes of algebraic block codes such as cyclic codes, BCH codes, Reed–Solomon codes, Hamming codes, quadratic residue codes, and quasi-cyclic (QC) codes. It then describes some recent generalizations of QC codes and open problems related to them. Also discussed in this chapter are elementary bounds on the parameters of a linear code, the main problem of algebraic coding theory, and some algebraic and combinatorial methods of obtaining new codes from existing codes. It also includes a section on codes over ℤ4, the integers modulo 4, due to the increased attention given to these codes recently. Moreover, a recently created database of best-known codes over ℤ4 is introduced.

Part 3: Source Coding/Channel Capacity/Network Coding

APPLICATIONS OF UNIVERSAL SOURCE CODING TO STATISTICAL ANALYSIS OF TIME SERIES

  • BORIS RYABKO
  • Pages: 289–338

https://doi.org/10.1142/9789812837172_0008

We show how universal codes can be used for solving some of the most important statistical problems for time series. By definition, a universal code (or a universal lossless data compressor) can compress any sequence generated by a stationary and ergodic source asymptotically to the Shannon entropy, which, in turn, is the best achievable ratio for lossless data compressors.

We consider finite-alphabet and real-valued time series and the following problems: estimation of the limiting probabilities for finite-alphabet time series and estimation of the density for real-valued time series; on-line prediction, regression, and classification (or problems with side information) for both types of time series; and the following problems of hypothesis testing: goodness-of-fit (or identity) testing and testing of serial independence. It is important to note that all problems are considered in the framework of classical mathematical statistics and, on the other hand, everyday methods of data compression (or archivers) can be used as a tool for the estimation and testing.

It turns out that, quite often, the suggested methods and tests are more powerful than known ones when applied in practice.
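As a toy illustration of this idea (ours, not one of the chapter's procedures): an off-the-shelf compressor gives an upper estimate of the entropy rate of a time series, and such estimates are what the tests above build on. The use of zlib and the crude one-byte-per-symbol packing are our own simplifications.

```python
import random
import zlib

def compression_rate(symbols):
    """Bits of compressed output per symbol, using zlib as the universal code;
    for long stationary ergodic sequences this upper-estimates the entropy rate."""
    data = bytes(symbols)                  # one byte per 0/1 symbol (crude but simple)
    return 8 * len(zlib.compress(data, 9)) / len(symbols)

random.seed(0)
fair     = [random.randint(0, 1) for _ in range(20000)]        # true entropy rate: 1 bit
biased   = [int(random.random() < 0.1) for _ in range(20000)]  # true entropy rate: ~0.47 bits
periodic = [i % 2 for i in range(20000)]                       # true entropy rate: 0 bits

for name, seq in [("fair coin", fair), ("p=0.1 coin", biased), ("periodic", periodic)]:
    print(f"{name:10s} compressed to about {compression_rate(seq):.3f} bits/symbol")
```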

INTRODUCTION TO NETWORK CODING FOR ACYCLIC AND CYCLIC NETWORKS

  • ÁNGELA I. BARBERO  and 
  • ØYVIND YTREHUS
  • Pages: 339–421

https://doi.org/10.1142/9789812837172_0009

This chapter contains two parts. The first part gives an introduction to a relatively new communication paradigm: network coding, where network nodes, instead of just forwarding symbols or packets, are also allowed to combine symbols and packets before forwarding them. The second part focuses on deterministic encoding algorithms for multicast, in particular for networks that may contain cycles.
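A minimal sketch of the standard motivating example (ours, not from the chapter): in the butterfly network the bottleneck node forwards the XOR of the two packets instead of choosing one, and each sink recovers both packets by combining the XOR with the packet it already overhears.

```python
def xor_bytes(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

# Two source packets must reach both sinks, but the middle edge carries one packet per use.
p1, p2 = b"PACKET-ONE", b"PACKET-TWO"

mixed = xor_bytes(p1, p2)                   # bottleneck node forwards p1 XOR p2

recovered_at_sink1 = xor_bytes(mixed, p1)   # sink 1 also hears p1 directly -> gets p2
recovered_at_sink2 = xor_bytes(mixed, p2)   # sink 2 also hears p2 directly -> gets p1

assert recovered_at_sink1 == p2 and recovered_at_sink2 == p1
print("both sinks recover both packets with a single use of the bottleneck edge")
```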

DISTRIBUTED JOINT SOURCE–CHANNEL CODING ON A MULTIPLE ACCESS CHANNEL

  • VINOD SHARMA  and 
  • R. RAJESH
  • Pages: 423–467

https://doi.org/10.1142/9789812837172_0010

We consider the problem of transmission of several distributed sources over a multiple access channel (MAC) with side information at the sources and the decoder. Source-channel separation does not hold for this channel. Sufficient conditions are provided for transmission of sources with a given distortion. The source and/or the channel could have continuous alphabets (thus Gaussian sources and Gaussian MACs are special cases). Various previous results are obtained as special cases. We also provide several good joint source-channel coding schemes for a discrete/continuous source and discrete/continuous alphabet channel. Channels with feedback and fading are also considered.

Part 4: Other Selected Topics in Information and Coding Theory

LOW-DENSITY PARITY-CHECK CODES AND THE RELATED PERFORMANCE ANALYSIS METHODS

  • X.-D. MA
  • Pages: 471–503

https://doi.org/10.1142/9789812837172_0011

Low-density parity-check (LDPC) codes are a class of error-correction codes which have attracted much attention recently. LDPC codes can approach the Shannon limit to within one-tenth of a dB. The codes have already found many applications in practice. This chapter presents a tutorial exposition of LDPC codes and the related performance analysis methods.
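To make the object concrete, here is a toy sketch of ours (not from the chapter): an LDPC code is specified by a sparse parity-check matrix H, and the simplest decoder, Gallager's bit flipping, repeatedly flips the bit that appears in the largest number of failing checks. The tiny (2,3)-regular H below is purely illustrative; practical LDPC codes are far longer.

```python
import numpy as np

# Toy parity-check matrix: every bit is in 2 checks, every check involves 3 bits.
H = np.array([[1, 1, 1, 0, 0, 0],
              [1, 0, 0, 1, 1, 0],
              [0, 1, 0, 1, 0, 1],
              [0, 0, 1, 0, 1, 1]])

def bit_flip_decode(r, H, max_iter=20):
    """Gallager-style bit flipping: while some parity check fails, flip the
    bit that participates in the largest number of failing checks."""
    r = r.copy()
    for _ in range(max_iter):
        syndrome = H @ r % 2
        if not syndrome.any():
            break                          # every check H r = 0 is satisfied
        unsatisfied_per_bit = syndrome @ H
        r[np.argmax(unsatisfied_per_bit)] ^= 1
    return r

codeword = np.array([1, 1, 0, 1, 0, 0])    # satisfies H @ codeword % 2 == 0
received = codeword.copy()
received[5] ^= 1                           # channel flips one bit
print(bit_flip_decode(received, H))        # -> [1 1 0 1 0 0]
```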

VARIABLE-LENGTH CODES AND FINITE AUTOMATA

  • MARIE-PIERRE BÉAL ,
  • JEAN BERSTEL ,
  • BRIAN H. MARCUS ,
  • DOMINIQUE PERRIN ,
  • CHRISTOPHE REUTENAUER , and 
  • PAUL H. SIEGEL
  • Pages: 505–584

https://doi.org/10.1142/9789812837172_0012

The aim of this chapter is to present, in appropriate perspective, some selected topics in the theory of variable-length codes. One of the domains of applications is lossless data compression. The main aspects covered include optimal prefix codes and finite automata and transducers. These are a basic tool for encoding and decoding variable-length codes. Connections with codes for constrained channels and sources are developed in some detail. Generating series are used systematically for computing the parameters of encodings such as length and probability distributions. The chapter contains numerous examples and exercises with solutions.
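As a small self-contained example of an optimal prefix code (our illustration; the chapter treats the general theory with automata and generating series): Huffman's algorithm built on a binary heap, followed by a check of the Kraft sum over codeword lengths, which every prefix code must keep at most 1.

```python
import heapq
from collections import Counter

def huffman_code(text):
    """Build an optimal binary prefix code for the symbol frequencies in text."""
    heap = [(freq, i, {sym: ""}) for i, (sym, freq) in enumerate(Counter(text).items())]
    heapq.heapify(heap)
    tiebreak = len(heap)
    while len(heap) > 1:
        f1, _, c1 = heapq.heappop(heap)
        f2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + w for s, w in c1.items()}       # left subtree gets prefix 0
        merged.update({s: "1" + w for s, w in c2.items()}) # right subtree gets prefix 1
        heapq.heappush(heap, (f1 + f2, tiebreak, merged))
        tiebreak += 1
    return heap[0][2]

text = "abracadabra"
code = huffman_code(text)
encoded = "".join(code[ch] for ch in text)
print(code)
print(f"{len(encoded)} bits vs {8 * len(text)} bits uncompressed")
print(sum(2.0 ** -len(w) for w in code.values()))  # Kraft sum, <= 1 for any prefix code
```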

DECODING AND FINDING THE MINIMUM DISTANCE WITH GRÖBNER BASES: HISTORY AND NEW INSIGHTS

  • STANISLAV BULYGIN  and 
  • RUUD PELLIKAAN
  • Pages: 585–622

https://doi.org/10.1142/9789812837172_0013

In this chapter, we discuss decoding techniques and finding the minimum distance of linear codes with the use of Gröbner bases. First, we give a historical overview of decoding cyclic codes via solving systems of polynomial equations over finite fields. In particular, we mention papers of Cooper, Reed, Chen, Helleseth, Truong, Augot, Mora, Sala, and others. Some structural theorems that use Gröbner bases in this context are presented. After that we shift to the general situation of arbitrary linear codes. We give an overview of approaches of Fitzgerald and Lax. Then we introduce our method of decoding linear codes that reduces this problem to solving a system of quadratic equations. We discuss open problems and future research possibilities.

COOPERATIVE DIVERSITY SYSTEMS FOR WIRELESS COMMUNICATION

  • MURAT UYSAL  and 
  • MUHAMMAD MEHBOOB FAREED
  • Pages: 623–661

https://doi.org/10.1142/9789812837172_0014

Cooperative diversity represents a new class of wireless communication techniques in which network nodes help each other in relaying information to realize spatial diversity advantages. This new transmission paradigm promises significant performance gains in terms of link reliability, spectral efficiency, system capacity, and transmission range. Cooperative diversity has spurred tremendous excitement within the academia and industry circles since its introduction and has been extensively studied over the last few years. This chapter presents an overview of cooperative diversity conveying the underlying basic principles and further discusses the latest advances and open issues in this rapidly evolving field.

PUBLIC KEY CRYPTOGRAPHY AND CODING THEORY

  • PASCAL VÉRON
  • Pages: 663–706

https://doi.org/10.1142/9789812837172_0015

For a long time, cryptography was concerned only with message confidentiality: how to convert a text into a form incomprehensible to unauthorized readers. Nowadays, with the development of network communications, cryptography addresses numerous security problems: authentication, message integrity checking, digital signatures, secure computation, …. The security of essentially all real-life protocols currently rests on hard problems from number theory. The aim of this chapter is to show that algebraic coding theory offers an alternative way to define secure cryptographic primitives.
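The coding-theoretic hard problem underlying such primitives is general syndrome decoding: given a parity-check matrix H and a syndrome s, find a low-weight error e with He = s over GF(2). The sketch below, with tiny arbitrary parameters and a random H, simply brute-forces this search to illustrate why its cost grows combinatorially; it is not any actual cryptosystem such as McEliece or Niederreiter.

    # Brute-force syndrome decoding: the search problem whose hardness (for
    # large random-looking codes) underlies code-based cryptography.
    # Parameters and the planted error are tiny, arbitrary illustrations.
    from itertools import combinations
    import numpy as np

    rng = np.random.default_rng(0)
    n, r, t = 16, 8, 2                        # code length, number of checks, error weight
    H = rng.integers(0, 2, size=(r, n))       # random parity-check matrix

    secret_error = np.zeros(n, dtype=int)
    secret_error[[3, 11]] = 1                 # planted weight-t error
    syndrome = H.dot(secret_error) % 2        # all that the "attacker" sees

    def brute_force_syndrome_decode(H, syndrome, t):
        n = H.shape[1]
        for support in combinations(range(n), t):   # n-choose-t candidate supports
            e = np.zeros(n, dtype=int)
            e[list(support)] = 1
            if np.array_equal(H.dot(e) % 2, syndrome):
                return e
        return None

    print(brute_force_syndrome_decode(H, syndrome, t))   # finds a weight-2 solution

At realistic sizes (block lengths in the thousands and error weights in the tens), this n-choose-t search space is astronomically large, which is the source of hardness that code-based schemes build on.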

Sample chapter: Chapter 1, "Linear Complexity and Related Complexity Measures".


Related Books

  • Algorithms for Analysis, Inference, and Control of Boolean Networks
  • Chinese Remainder Theorem
  • Automata Theory
  • Understanding and Learning Statistics by Computer
  • Specification and Verification of Systolic Arrays
  • Some New Directions in Science on Computers
  • Randomness and Complexity, From Leibniz to Chaitin
  • Planar Graph Drawing
  • An Elementary Approach to Design and Analysis of Algorithms
  • Query Complexity
  • Impossible Minds
  • The Mathematical Theory of Nonblocking Switching Networks
  • Handbook of Graph Grammars and Computing by Graph Transformation
  • Voronoi Diagrams and Delaunay Triangulations
  • A Half-Century of Automata Theory
  • Logic and Language Models for Computer Science
  • A Computable Universe
  • An Introduction to the Analysis of Algorithms
  • Genetic Algorithms and Robotics


  • Corpus ID: 202784313

Information Theory and Coding

  • Murlidhar Kulkarni, K. S. Shivaprakasha, and K. Kulkarni
  • Published 2014
  • Computer Science

One citation: "A novel methodology for clinical semantic annotations assessment."

Coding and Information Theory


Part of the book series: Information Technology: Transmission, Processing and Storage (PSTE)


Bibliography

References marked with an asterisk are recommended as supplementary reading.

J. B. Anderson, Digital Transmission Engineering. IEEE Press, New York, 1999.


A. M. Michelson and A. H. Levesque, Error-Control Techniques for Digital Communication . Wiley, New York, 1985.

S. Lin and D. J. Costello, Jr., Error Control Coding . Prentice-Hall, Englewood Cliffs, NJ, 1983.

R. E. Blahut, Theory and Practice of Error Control Codes . Addison-Wesley, Reading, MA, 1983.

J. B. Anderson and S. Mohan, Source and Channel Coding . Kluwer, Boston, MA, 1991.

G. C. Clark, Jr. and J. Bibb Cain, Error-Correction Coding for Digital Communications. Plenum, New York, 1981.

J. B. Anderson and K. Balachandran, “Decision depths of convolutional codes,” IEEE Trans. Inf. Theory , IT-35 , 455–459, March 1989.

J. B. Anderson, T. Aulin and C.-E. Sundberg, Digital Phase Modulation . Plenum, New York, 1986.

A. J. Viterbi, “Error bounds for convolutional codes and an asymptotically optimal decoding algorithm,” IEEE Trans. Inf. Theory , IT-13 , 260–269, April 1967.

L. R. Bahl, J. Cocke, F. Jelinek and J. Raviv, “Optimal decoding of linear codes for minimizing symbol error rate,” IEEE Trans. Inf. Theory , IT-20 , 284–287, March 1974.


R. Johannesson and K. Zigangirov, Fundamentals of Convolutional Coding . IEEE Press, New York, 1999.

R. G. Gallager, Information Theory and Reliable Communication. McGraw-Hill, New York, 1968.

T. M. Cover and J. A. Thomas, Elements of Information Theory . Wiley, New York, 1991.

H. J. Landau and H. O. Pollak, “Prolate spheroidal wave functions, Fourier analysis and uncertainty, III: the dimension of the space of essentially time-and bandlimited signals,” Bell Syst. Tech. J. , 41 , 1295–1336, July 1962.

C. E. Shannon, "A mathematical theory of communication," Bell Syst. Tech. J., 27, 379–429 (Part I) and 623–656 (Part II), July 1948; reprinted in Claude Elwood Shannon: Collected Papers, Sloane and Wyner (eds), IEEE Press, NY, 1993.

C. E. Shannon, “Communication in the presence of noise,” Proc. IRE , 37 , 10–21, 1949; reprinted as cited in [15].

U. Kressel, “Informationstheoretische Beurteilung digitaler Uebertragungsverfahren mit Hilfe des Fehlerexponenten,” (“Information theoretic evaluation of digital transmission methods with the help of error exponents”), Fortschritt-Berichte VDI, Reihe 10, Nr. 121, VDI-Verlag, Duesseldorf, 1989 (in German).

B. Friedrichs, Kanalcodierung . Springer-Verlag, Berlin, 1995 (in German).

A. Viterbi and J. Omura, Principles of Digital Communication and Coding . McGraw-Hill, New York, 1979.

F. M. Reza, An Introduction to Information Theory . McGraw-Hill, New York, 1961.

E. A. Lee and D. G. Messerschmitt, Digital Communication . Kluwer, Boston, 1988 and later editions

J. B. Anderson and K. Tepe, “Properties of the tailbiting BCJR decoder,” in Codes, Systems and Graphical Models , Marcus and Rosenthal (eds), IMA Volumes in Mathematics and its Applications, vol. 123, Springer-Verlag, New York, 2001.


Editor information

Editors and affiliations.

University of California at San Diego, La Jolla, California

Jack Keil Wolf (Series Editor)

California Institute of Technology, Pasadena, California

Robert J. McEliece

Northeastern University, Boston, Massachusetts

John Proakis

Virginia Polytechnic Institute and State University, Blacksburg, Virginia

William H. Tranter


Copyright information

© 2002 Kluwer Academic Publishers

About this chapter

(2002). Coding and Information Theory. In: Wolf, J.K., McEliece, R.J., Proakis, J., Tranter, W.H. (eds) Coded Modulation Systems. Information Technology: Transmission, Processing and Storage. Springer, Boston, MA. https://doi.org/10.1007/0-306-47792-0_3


DOI: https://doi.org/10.1007/0-306-47792-0_3

Publisher Name: Springer, Boston, MA

Print ISBN: 978-0-306-47279-4

Online ISBN: 978-0-306-47792-8

eBook Packages: Springer Book Archive


Information Theory and coding

Related topics:

  • Coding Theory
  • Network Coding
  • Information Theory
  • Compressed Sensing
  • Wireless Communications
  • Adaptive Modulation and Coding (AMC)
  • Error Control and Coding Theory
  • Computer Architecture and Organisation
  • Visible Light Communication
  • Computer Science and IT


IMAGES

  1. (PDF) Introduction to Information Theory and Coding
  2. Unit1ITC
  3. PPT
  4. Information Theory and Coding 1
  5. SOLUTION: A student's guide to coding and information theory
  6. Information Theory Notes

VIDEO

  1. Lecture 1 of the Course on Information Theory, Pattern Recognition, and Neural Networks

  2. Lecture No 2, Information Theory and Coding Spring 2023

  3. Information Theory & Coding Paper || rtu 5 Sem Paper

  4. Lecture No 1, Information Theory and Coding Spring 2023

  5. Lecture No. 1 (part 2), Information Theory and Coding

  6. Lecture No.1 (part1), Information Theory and Coding

COMMENTS

  1. Yury Polyanskiy

    Books, notes, code. Y. Polyanskiy and Y. Wu. " Information Theory: From Coding to Learning," Cambridge University Press, 2023+ (book draft) This material will be published by Cambridge University Press as "Information Theory" by Yury Polyanskiy and Yihong Wu. This prepublication version is free to view and download for personal use only.

  2. Applications of information theory in communication engineering

Abdul Razaque, Telecommunication Engineering Department, NEDUET, Karachi, Pakistan. [email protected]. Abstract: Applications of coding theory have grown fast in the past few ...

  3. PDF Interactive information and coding theory

Keywords: coding theory, communication complexity, information complexity, interactive computation. 1. Introduction. 1.1. A high-level overview of information and coding theory. We begin with a very high-level overview of information and coding theory. This is an enormous field of study, with subareas dealing with questions ranging from

  4. International Journal of Information and Coding Theory

    Information theory and its important sub-field, coding theory, play central roles in theoretical computer science and discrete mathematics. As coding theory occupies an important position within the field of information theory, the focus of IJICoT is on publishing state-of-the-art research articles relating to it.

  5. Selected Topics in Information and Coding Theory

ISBN: 978-981-4469-19-7 (ebook), USD 85.00. The last few years have witnessed rapid advancements in information and coding theory research and applications. This book provides a comprehensive guide to selected topics, both ongoing and emerging, in information and coding theory.

  6. PDF INFORMATION THEORY AND CODING BY EXAMPLE

Information Theory and Coding by Example: This fundamental monograph introduces both the probabilistic and the algebraic aspects of information theory and coding. It has evolved from the authors' years of experience teaching at the undergraduate level, including several ...

  7. PDF Information Theory and Coding

Information Theory and Coding. Computer Science Tripos Part II, Michaelmas Term, 11 lectures by J. G. Daugman. 1. Foundations: Probability, Uncertainty, and Information. 2. Entropies Defined, and Why They Are Measures of Information. 3. Source Coding Theorem; Prefix, Variable-, & Fixed-Length Codes. 4. Channel Types, Properties, Noise, and Channel ...

  8. PDF Coding and Information Theory

…information theory, which is a subject quite different from coding. Shannon theory thinks of a data source not so much as symbols but as a probability distribution on the symbols; a channel likewise is a conditional distribution of the channel outputs given its inputs. Information is measured by a functional called entropy (a minimal numerical sketch of this functional appears after this list).

  9. [PDF] Information Theory and Coding

The theory of information and coding: a mathematical framework for communication. This is a self-contained introduction to the theory of information and coding. It can be used either for self-study or as the basis for a course at either the graduate or undergraduate level.

  10. Information Theory and Coding by Example

Information and Coding Theory. Gareth Jones and J. Jones. Computer Science, Mathematics, 2000. This book provides a clear introduction to both information theory and coding theory, and uses linear algebra to construct examples of error-correcting codes, such as the Hamming, Hadamard, Golay and Reed-Muller codes.

  11. 45366 PDFs

    Explore the latest full-text research PDFs, articles, conference papers, preprints and more on CODING THEORY. Find methods information, sources, references or conduct a literature review on CODING ...

  12. PDF Fundamentals in Information Theory and Coding

…the Technical University of Cluj Napoca, Romania, and partially my research activity too. The presented topics are useful for engineers, M.Sc. and Ph.D. students who need basics in information theory and coding. The work, organized in five chapters and four appendices, presents the fundamentals of information theory and coding.

  13. PDF Coding Theory: Tutorial & Survey

…hand often leads to information theory and/or statistics. The two aspects of coding theory enjoy a symbiotic relationship from the days of their origin. Coding theory, and the dichotomy within, owes its origins to two roughly concurrent seminal works by Hamming [45] and Shannon [80], in the late 1940s. Hamming was studying devices to store ...

  14. PDF ENGINEERING 9871: Information Theory and Coding

A group project which illustrates important aspects of information and coding theory is required in this course. This could be a literature review or a simulation experiment on a specific topic related to the performance analysis of coding techniques using simple channel models, informed by reading recent academic publications.

  15. Information Theory and Network Coding

This book is an evolution from my book A First Course in Information Theory, published in 2002 when network coding was still in its infancy. The last few years have witnessed the rapid development of network coding into a research field of its own in information science. With its root in information theory, network coding has not only brought ...

  16. (PDF) Information Theory and Coding ( Wiley India: 2015) by Dr

    In this paper we propose a model to study feature representation of combined Convolutional Neural Networks (CNN) and Block Group Sparse Coding (BGSC). Firstly, CNN is used to extract low-level ...

  17. Channel Coding and Information Theory

In fading channels, (block or convolutional) interleaving is essential to provide diversity. From an information-theoretic point of view, in fading channels we can distinguish between the ergodic capacity and the outage capacity. If the channel state information is known at the transmitter, waterfilling is the optimum power allocation strategy (a minimal water-filling computation is sketched after this list).

  18. Information theory and coding problems in genetics

    The aim of this paper is to describe a new class of problems and some new results in coding theory arising from the analysis of the composition and functionality of the genetic code. The major goal of the proposed work is to initiate research on investigating possible connections between the regulatory network of gene interactions (RNGI) and the proofreading (error-control) mechanism of the ...

  19. What Else Can Be Learned When Coding? A Configurative Literature Review

The idea that more people should learn coding, programming or computing, as they are interchangeably used (Bocconi et al., 2016), has become a global mainstream discourse. In particular, higher education students in non-Computer Science (CS) fields, from social sciences to humanities, are being encouraged to learn coding so they can play a more effective role in solving today's challenges ...

  20. [PDF] Information Theory and Coding

Information Theory and Coding. Murlidhar Kulkarni, K. S. Shivaprakasha, and K. Kulkarni. Published 2014, Computer Science. The video drive circuitry of a computer monitor attempts to generate a periodic sawtooth waveform to generate horizontal scanlines, but the output circuitry is bandlimited to W; what would be the effect of this on the ...

  21. Information Theory and Coding: Case Studies of Laboratory Experiments

The INFORMATION THEORY AND CODING LABORATORY was developed with multiple objectives, as follows. The student should be made to: • be exposed to information theories and their coding; • learn to ...

  22. Coding and Information Theory

'Coding and Information Theory,' published in Coded Modulation Systems ... (Part II), July 1948; reprinted in Claude Elwood Shannon: Collected Papers, Sloane and Wyner (eds), IEEE Press ...

  23. Information Theory and coding Research Papers

This paper introduces the basics of Galois fields as well as their implementation in storing data. It shows and helps visualize that storing data in Galois fields allows manageable and effective data manipulation, where it focuses ... By Ratnesh Pandey, Information Theory and coding.
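Two of the snippets above make quantitative claims that are easy to check numerically; the sketches below are minimal illustrations with arbitrary example data, not reproductions of the cited sources. First, the entropy functional mentioned in comment 8:

    # Shannon entropy H(p) = -sum_i p_i * log2(p_i) of a discrete distribution
    # (the functional mentioned in comment 8); the distribution is arbitrary.
    import math

    def entropy(probabilities):
        return -sum(p * math.log2(p) for p in probabilities if p > 0)

    print(entropy([0.5, 0.25, 0.125, 0.125]))   # 1.75 bits (a uniform 4-symbol source gives 2.0)

Second, the water-filling power allocation mentioned in comment 17, over parallel Gaussian subchannels with assumed noise levels and power budget: each channel gets max(0, mu - noise_i), with the water level mu chosen to exhaust the budget.

    # Water-filling over parallel Gaussian channels (comment 17): choose the
    # water level mu so that the positive parts of (mu - noise_i) sum to the
    # total power budget.  Noise levels and budget are arbitrary examples.
    import numpy as np

    def waterfill(noise_levels, total_power):
        noise = np.asarray(noise_levels, dtype=float)
        n_sorted = np.sort(noise)
        for k in range(len(n_sorted), 0, -1):     # try using the k quietest channels
            mu = (total_power + n_sorted[:k].sum()) / k
            if mu > n_sorted[k - 1]:              # water level covers all k of them
                break
        return np.maximum(0.0, mu - noise)

    p = waterfill([0.5, 1.0, 2.0, 4.0], total_power=3.0)
    print(p, p.sum())   # the noisiest channel gets nothing; allocations sum to 3.0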