Abstract
The data acquired by the sensor have to be processed by the coprocessor or by the host microprocessor, so both systems must share the same communication protocol and data format. Moreover, at the end of the image generation pipeline the image must be coded in a standard format so that it can be read by any external device. Usually the sensor provides the image in the Bayer format. In the past, Bayer data were stored and transmitted using proprietary formats and protocols; this solution has the drawback of forcing every designer to adopt the same vendor-specific interface to manage the sensor data. In recent years the majority of companies making, buying or specifying imaging devices proposed a new standard called Standard Mobile Imaging Architecture (SMIA), which allows sensors and hosts from different vendors to be interconnected. Concerning the output of the coprocessor, several standard formats are available. For still images the most frequently used are the Joint Photographic Experts Group (JPEG) format, with lossy compression, and the Tagged Image File Format (TIFF), with lossless compression. In high-end imaging devices the output of the sensor can also be stored directly using a proprietary file format, such as the Nikon Electronic Image Format (NEF), the Canon RAW File Format (CRW), etc. For video the most widely used are the MJPEG, MPEG-4, H.263 and H.264 standards. Besides presenting the main data formats, this chapter also gives a short description of the forthcoming JPEG XR image coding standard. Moreover, some techniques concerning compression factor control and error detection and concealment are introduced.
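As a minimal illustration of the Bayer format mentioned above (not part of the chapter itself), the following C sketch assumes an RGGB mosaic and derives the colour component sampled at each pixel from the parity of its coordinates; real sensors may use other variants (GRBG, GBRG, BGGR).

/*
 * Minimal sketch, assuming an RGGB Bayer colour filter array:
 * each pixel exposes a single colour component, determined by
 * the parity of its row and column indices.
 */
#include <stdio.h>

typedef enum { RED, GREEN, BLUE } bayer_component;

/* Component captured at (row, col) for an RGGB Bayer pattern. */
static bayer_component bayer_rggb_component(unsigned row, unsigned col)
{
    if ((row % 2) == 0)                      /* even rows: R G R G ... */
        return (col % 2) == 0 ? RED : GREEN;
    else                                     /* odd rows:  G B G B ... */
        return (col % 2) == 0 ? GREEN : BLUE;
}

int main(void)
{
    static const char label[] = { 'R', 'G', 'B' };

    /* Print the top-left 4x4 corner of the mosaic. */
    for (unsigned row = 0; row < 4; ++row) {
        for (unsigned col = 0; col < 4; ++col)
            printf("%c ", label[bayer_rggb_component(row, col)]);
        printf("\n");
    }
    return 0;
}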