Counsels programmers and administrators at large and small organizations on how to work with large-scale application datasets using Apache Hadoop, discussing its capacity for storing and processing vast amounts of data and demonstrating best practices for building reliable, scalable distributed systems.
Mark Nelson, Jean-Loup Gailly
The Data Compression Book, Second Edition
The Data Compression Book is the most authoritative guide to data compression techniques available. This second edition has been updated to include fractal compression techniques and the latest developments in the compression field. All the code from the previous edition has been updated to run with today's compilers and has been tested on multiple platforms to ensure flawless performance. You'll learn to write C programs for nearly any environment as you explore different compression methods. Nelson and Gailly discuss the theory behind each method and apply the techniques involved to shrink data down to a minimum. Each technique is illustrated with a complete, functional C program that not only demonstrates how data compression works but can also be incorporated into your own data compression programs. You'll also get detailed benchmarks demonstrating the speed and compression ability of each technique. The code in this book has been tested on a variety of platforms and compilers, including Microsoft Visual C++ 1.5 with MS-DOS 5.0 and 6.22; Borland C++ 4.0 and 4.5 with MS-DOS 5.0 and 6.22; Symantec C++ 6.0 and 7.0 with MS-DOS 5.0 and 6.22; Interactive Unix System 3.2 with the portable C compiler; Solaris 2.4 with the SunSoft compiler; and Linux 1.1 with the GNU C Compiler.
Topics include:
- The Shannon-Fano and Huffman coding techniques
- Adaptive Huffman coding techniques
- Lossy compression
- The JPEG compression algorithm
- Fractal compression techniques
- Arithmetic coding
- Dictionary compression methods
A substantially updated edition of Video Coding: An Introduction to Standard Codecs (IEE, 1999), this book discusses the growth of digital television technology and the revolution in image and video compression (in applications such as JPEG2000, broadcast TV, and video telephony), highlighting the need for standardization in processing static and moving images and in their exchange between computer systems. ITU and ISO/IEC standards are now widely accepted in the picture/video coding field. The book gives an authoritative explanation of picture and video coding algorithms, working from basic principles through to the advanced video compression systems now being developed. One of its main objectives is to explain the reasons behind the introduction of a standard codec for a specific application and its chosen parameters. The book will enable readers to appreciate the fundamentals needed to design a video codec for any given application and should prove a valuable resource for engineers working in this field.