dmedia(3dm) dmedia(3dm)
NAME
dmedia, dmIntro - Introduction to the IRIS Digital Media Libraries
The term "digital media" is used to describe digitally sampled audio and
video (including still images) data, MIDI event streams, serial event
streams, and other associated information such as time codes. Sampled
audio/video data may be digitally encoded in a variety of uncompressed
and compressed formats.
The IRIS Digital Media Libraries provide programming support for digital
audio, digital video, and MIDI applications. The libraries provide
programming interfaces to:
o audio I/O, video I/O
o MIDI I/O, timestamped serial I/O
o data format conversion (including compression)
o digital media file import/creation/editing
o high-level playback functions
o SMPTE timecode math
o LTC and VITC timecode decoding
DIGITAL MEDIA I/O SUBSYSTEMS
The Audio Library (libaudio) routines provide a device-independent
programming interface to the digital audio I/O subsystems built into
Silicon Graphics workstations. See the man page ALintro(3dm) for an
overview of the routines in the library.
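For illustration, a minimal sketch of the usual call sequence opens a
stereo 16-bit output port and queues one second of silence. The call and
constant names follow the alOpenPort-style interface; treat the exact
prototypes as assumptions and consult ALintro(3dm) and alOpenPort(3dm)
for the authoritative declarations.

     #include <dmedia/audio.h>        /* Audio Library; link with -laudio */
     #include <stdlib.h>

     /*
      * Hedged sketch: open a stereo 16-bit output port on the default
      * device and queue one second of silence.
      */
     int play_silence(void)
     {
         ALconfig cfg;
         ALport   port;
         short   *buf;
         int      nframes = 44100;           /* one second at 44.1 kHz */

         cfg = alNewConfig();
         if (cfg == NULL)
             return -1;
         alSetChannels(cfg, 2);               /* stereo sample frames   */
         alSetSampFmt(cfg, AL_SAMPFMT_TWOSCOMP);
         alSetWidth(cfg, AL_SAMPLE_16);       /* 16-bit linear samples  */

         port = alOpenPort("example", "w", cfg);  /* "w" = output port  */
         alFreeConfig(cfg);
         if (port == NULL)
             return -1;

         buf = (short *)calloc((size_t)nframes * 2, sizeof(short));
         alWriteFrames(port, buf, nframes);   /* blocks until queued    */
         free(buf);
         alClosePort(port);
         return 0;
     }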
The Video Library (libvl) provides a programming interface to digital
video I/O subsystems available on Silicon Graphics systems. See the man
page VLintro(3dm) for an introduction to the video library routines.
The MIDI Library (libmd) provides a programming interface for timestamped
MIDI input/output via serial ports. The library also allows one
process to send MIDI events to another process through "internal" MIDI
ports. The man page mdIntro(3dm) provides an introduction to the
library.
The timestamped serial I/O library (tserialio) provides a programming
interface for millisecond-accurate timestamping and scheduling of raw
serial bytes. See the man page tserialio(3) for more information.
The Compression Library (libcl) interface supports the Cosmo Compress and
Octane Compression JPEG codecs, in addition to several software codecs,
including software JPEG. The man page clIntro(3dm) provides an
introduction to the library. The Compression Library has been obsoleted
by the Digital Media Image Conversion library (dmIC in libdmedia).
The Digital Media Image Conversion library (dmIC in libdmedia) supports
the O2 ICE hardware JPEG codec, in addition to several software codecs.
The man page DMimageconverter(3dm) provides an introduction to the
library.
A "media stream" is a continuous stream of audio or video data entering
or leaving the computer system through the audio/video connectors, or a
stream of video data which is displayed through the graphics subsystem.
An interlaced video stream consists of alternating even and odd video
fields. Each field in the stream is sampled at a different time. Odd
fields are vertically offset from even fields by one scan line. A non-
interlaced video stream is composed of video frames. A frame may be
converted to a pair of fields by placing alternate lines from the frame
in the odd and even fields. A simple-minded way to convert a pair of
fields back to a frame is to interleave the lines from the two fields;
however, because the fields are sampled at different times, any motion
in the scene between the first and second field will make the resulting
interleaved frame appear blurred.
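As a concrete illustration of the weave just described, the following
library-independent sketch interleaves an even/odd field pair into a full
frame. The assignment of the even field to lines 0, 2, 4, ... is an
assumption for the example; the actual field dominance depends on the
video standard.

     #include <string.h>

     /*
      * Library-independent sketch: weave an even/odd field pair back into
      * a full frame by interleaving scan lines.  width_bytes is the size
      * of one scan line in bytes; height is the frame height in lines.
      */
     void weave_fields(unsigned char *frame,
                       const unsigned char *even,   /* height/2 lines */
                       const unsigned char *odd,    /* height/2 lines */
                       int width_bytes, int height)
     {
         int line;

         for (line = 0; line < height / 2; line++) {
             /* even field supplies lines 0, 2, 4, ... (assumed) */
             memcpy(frame + (size_t)(2 * line) * width_bytes,
                    even + (size_t)line * width_bytes, width_bytes);
             /* odd field supplies lines 1, 3, 5, ...            */
             memcpy(frame + (size_t)(2 * line + 1) * width_bytes,
                    odd + (size_t)line * width_bytes, width_bytes);
         }
     }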
An audio stream is composed of audio sample frames. The number of samples
in a sample frame is equal to the number of audio channels in the stream;
for a stereo stream, a sample frame consists of a sample for the left
channel and a sample for the right channel.
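The sample-frame layout leads directly to the buffer arithmetic used
throughout the libraries; a trivial sketch:

     #include <stddef.h>

     /*
      * A buffer holding nframes sample frames occupies
      *     nframes * channels * bytes_per_sample   bytes;
      * e.g. 44100 frames of 16-bit stereo = 44100 * 2 * 2 = 176400 bytes.
      */
     size_t audio_buffer_bytes(size_t nframes, int channels,
                               int bytes_per_sample)
     {
         return nframes * (size_t)channels * (size_t)bytes_per_sample;
     }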
At one end of a continuous media stream there is always a real-time
consumer or producer of sample data. Audio/video samples enter the system
through analog-to-digital (A/D) converters, or directly through digital
connectors. Audio/video samples leave the system through digital-to-analog
(D/A) converters, or through digital connectors. Video data may
be sent to the graphics display, in which case the graphics subsystem
acts as a real-time video consumer. A real-time producer supplies sample
data to the system at a constant rate. For example, if the audio input
jacks are configured for 44.1 kHz stereo audio, then the audio A/D's
produce a left/right audio sample pair 44100 times per second. When PAL
video data is sent out through the video output connectors, fields are
transmitted at the rate of 50 per second. A real-time consumer removes
data from the system at a constant rate.
Physical hardware clocks are imperfect; if your application relies on any
particular exact ratio of the rates of two digital media streams, make
sure those streams are driven by the same hardware clock.
Some amount of buffering is required between a real-time data consumer or
producer and the rest of the system. It is not feasible for application
programs to directly service interrupts from the digital media I/O ports.
As a result, there is some amount of pipeline latency between the time a
sample enters the system and the time an application program actually
"sees" the data. Similarly, there is a delay between the time an
application program last "touches" a buffer of output data, and the time
data actually leaves the system through the audio or video connectors.
Synchronization between different media streams is achieved by using an
"unadjusted system timer" (UST) to timestamp incoming or outgoing media
data. The UST is a system-wide timer which provides nanosecond-resolution
timestamps. Timestamps generated by the UST are represented as 64-bit
signed integers (C typedef stamp_t). Different media events which occur
in the system may be related by their associated UST stamps.
For more information about UST, see dmGetUST(3dm).
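A minimal sketch of reading UST values follows. The prototype assumed
here for dmGetUST, int dmGetUST(unsigned long long *ust), and the
DM_SUCCESS return code should be checked against dmGetUST(3dm) before
use.

     #include <dmedia/dmedia.h>   /* dmGetUST(); link with -ldmedia */
     #include <stdio.h>

     /*
      * Hedged sketch: sample the unadjusted system time before and after
      * some media operation and print the elapsed nanoseconds.
      */
     int main(void)
     {
         unsigned long long before, after;

         if (dmGetUST(&before) != DM_SUCCESS)
             return 1;
         /* ... queue or receive some media data here ... */
         if (dmGetUST(&after) != DM_SUCCESS)
             return 1;

         printf("elapsed UST: %llu ns\n", after - before);
         return 0;
     }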
The Digital Media Library (libdmedia) contains a set of SMPTE timecode
math routines, described in dmTCAddTC(3dm), dmTCFramesBetween(3dm),
dmTCFramesPerDay(3dm), dmTCToSeconds(3dm), and dmTCToString(3dm). These
convenience routines make it easy to manipulate hh:mm:ss:ff timecodes and
convert between hh:mm:ss:ff representation and representations more
natural for computer programs. The routines can handle drop-frame time
code.
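For a sense of the underlying arithmetic, the following library-
independent sketch converts a non-drop-frame hh:mm:ss:ff value to an
absolute frame count. The SimpleTC type is a hypothetical stand-in, not
the libdmedia timecode structure, and the dmTC* routines additionally
handle drop-frame timecode.

     /*
      * Library-independent sketch of non-drop-frame timecode arithmetic.
      * SimpleTC is a hypothetical helper type, not the libdmedia type.
      */
     typedef struct {
         int hh, mm, ss, ff;
     } SimpleTC;

     long tc_to_frames(SimpleTC tc, int frames_per_second)
     {
         long seconds = (long)tc.hh * 3600 + tc.mm * 60 + tc.ss;
         return seconds * frames_per_second + tc.ff;
     }

     /* Example: 01:00:00:00 at 25 frames/s (PAL) yields 90000 frames. */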
The Digital Media Library (libdmedia) contains a set of routines for
decoding vertical interval time code (VITC). VITC is a mechanism for
storing and transferring SMPTE time code in the vertical blanking portion
of a video signal. Applications may use VITC to synchronize audio,
video, or other events with external devices by decoding VITC from a
video source connected to a video input port, or they may parse the VITC
code from a previously captured movie file. The man page dmVITC(3dm)
describes the VITC decoder routines.
The Digital Media Library also contains a set of routines for decoding
longitudinal time code (LTC). LTC is a mechanism for storing and
transferring SMPTE time code as an audio-like waveform. Applications may
use LTC to synchronize audio, video, or other events with external
devices by decoding an LTC signal connected to an audio input port, or
they may parse the LTC code from a previously captured audio or movie
file. The man page dmLTC(3dm) describes the LTC decoder routines.
DATA FORMAT CONVERSION
We distinguish between sample data format conversion and file format
conversion. Conversion between two file formats may or may not require
conversion of the encapsulated digital media sample data depending on the
specifications of the different file types. For example, SGI Movie Files,
QuickTime files, and AVI files all support JPEG-compressed image frames
in video tracks. On the other hand, Sun sound files may contain mu-law
encoded audio sample data, while AIFF sound files do not support this
format.
Audio/video data format conversion includes these operations:
Audio, video (or image) compression/decompression
Conversion between interleaved streams of compressed audio and video
data ("systems" or "transport" layer streams) and individual streams
of compressed audio and compressed video.
Audio sample format conversion (eg, conversion between 8-bit and
16-bit linear samples, conversion between linear and mu-law samples,
sampling rate conversion).
Audio channel conversion (eg, mix down from 4 interleaved channels
to stereo).
Image format conversion (eg, conversion between different colorspace
representations, conversion between different pixel formats,
conversion between different aspect ratios, image resizing).
Video filtering in the time domain (eg, filtering to interlace/deinterlace
video fields, video rate conversion).
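As an illustration of one operation listed above, the following library-
independent sketch performs a naive audio channel conversion, mixing four
interleaved 16-bit channels down to stereo by averaging assumed
front/rear pairs; real converters normally apply configurable mix
weights.

     /*
      * Library-independent sketch: mix 4 interleaved 16-bit channels down
      * to stereo.  Channel order 0/2 = left front/rear, 1/3 = right
      * front/rear is an assumption for the example.
      */
     void mix_quad_to_stereo(const short *in,    /* nframes * 4 samples */
                             short *out,         /* nframes * 2 samples */
                             int nframes)
     {
         int i;

         for (i = 0; i < nframes; i++) {
             out[2 * i]     = (short)(((int)in[4 * i]     + in[4 * i + 2]) / 2);
             out[2 * i + 1] = (short)(((int)in[4 * i + 1] + in[4 * i + 3]) / 2);
         }
     }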
The Digital Media Library (libdmedia) provides a standard parameter/value
list management facility for working with lists of parameters that
describe audio and video data. The man page dmParams(3dm) gives an
overview of the library routines for working with DMparams. The header
file <dmedia/dm_audio.h> contains the standard set of parameters used to
describe audio sample data (numerical sample format, sampling rate,
number of channels, and so on). The header file <dmedia/dm_image.h>
contains the standard set of parameters used to describe image data
(image dimensions in pixels, color model, interlacing, frame rate, and so
on).
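A minimal sketch of building a DMparams list follows. The header name
<dmedia/dm_params.h>, the call names, and the DM_IMAGE_WIDTH and
DM_IMAGE_HEIGHT parameter names are assumptions based on dmParams(3dm)
and <dmedia/dm_image.h>; consult those for the exact declarations.

     #include <dmedia/dm_params.h>   /* DMparams; link with -ldmedia */
     #include <dmedia/dm_image.h>    /* standard image parameter names */

     /*
      * Hedged sketch: build a parameter list describing a 640x480 image
      * stream and hand it to some consumer of image parameters.
      */
     int describe_image(void)
     {
         DMparams *p;

         if (dmParamsCreate(&p) != DM_SUCCESS)
             return -1;
         dmParamsSetInt(p, DM_IMAGE_WIDTH,  640);
         dmParamsSetInt(p, DM_IMAGE_HEIGHT, 480);

         /* ... pass p to a routine that consumes image parameters ... */

         dmParamsDestroy(p);
         return 0;
     }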
The Digital Media Library includes a color space conversion library.
These conversion routines provide a simple, yet powerful, means of
converting images between color spaces (models), packings, subsamplings
and datatypes and/or performing some operation on the image data, such as
adjusting its contrast. The man page dmColor(3dm) provides an
introduction to the library. The header file <dmedia/dm_color.h> contains
the color space conversion function declarations.
The Digital Media Library includes a set of audio data conversion
routines. The header file <dmedia/dm_audioutil.h> contains audio
conversion function declarations. See the man page
dmAudioRateConvert(3dm) for an introduction to the high-quality audio
sampling rate conversion algorithms included in the library. See the man
page dmG711(3dm) for information about mu-law/A-law conversion routines.
See the man pages dmG722Encode(3dm) and dmG722Decode(3dm) for a
description of the ITU Recommendation G.722 codec available in the
library.
The Compression Library (libcl) provides a programming interface to image
(video) compression/decompression. The library supports a number of
codecs used in conjunction with SGI Movie files, QuickTime files, and AVI
files, as well as an MPEG-1 video codec. The library also includes an
MPEG-1 audio codec. See the man page clIntro(3dm) for an introduction to
the Compression Library routines. The Compression Library has been
obsoleted by the Digital Media Image Conversion library (dmIC in
libdmedia).
The Digital Media Library includes the Digital Media Image Conversion
library (dmIC), which supports a more flexible buffering scheme known as
DMbuffers. dmIC supports a large number of software codecs as well as
O2's ICE JPEG codec. You can use DMbuffers to transfer buffers between
O2's video inputs and outputs (using the Video Library) and O2's ICE JPEG
codec (using dmIC) with no data copy.
The higher-level Movie Library file I/O routines (libmovie) and the Audio File
Library (libaudiofile) provide function calls for reading/writing digital
media files. These libraries offer transparent data conversion
(compression/decompression). Applications can use these libraries to
import/export/convert/edit files containing audio/image data without
directly calling the low-level audio/image format conversion routines in
libdmedia and libcl.
FILE IMPORT, CREATION, CONVERSION, EDITING
The Movie Library file I/O interface (libmovie) supports reading,
creating, and editing digital movies. The Movie Library supports reading
and writing of SGI and QuickTime movies, and reading of MPEG-1 video and
systems bitstreams and AVI format movies. A variety of image compression
schemes are supported for the QuickTime, AVI, and SGI formats. The Movie
Library allows conversion between different file formats by opening a
movie file of one format, creating a new movie file of a different
format, and copying the data between the movies. See the man page
mvIntro(3dm) for an introduction to the Movie Library file I/O routines.
The header file <dmedia/moviefile.h> contains the function declarations
for the Movie Library file I/O interface.
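A minimal sketch of opening a movie file for reading follows. The
prototypes assumed for mvOpenFile, mvFindTrackByMedium, mvGetTrackLength,
and mvClose should be verified against mvIntro(3dm) before use.

     #include <dmedia/moviefile.h>   /* Movie Library file I/O (libmovie) */
     #include <fcntl.h>              /* O_RDONLY */
     #include <stdio.h>

     /*
      * Hedged sketch: open an existing movie and report the length of its
      * image track.  Treat the prototypes and return types as assumptions.
      */
     int report_movie(const char *path)
     {
         MVid movie, imageTrack;

         if (mvOpenFile(path, O_RDONLY, &movie) != DM_SUCCESS)
             return -1;
         if (mvFindTrackByMedium(movie, DM_IMAGE, &imageTrack) == DM_SUCCESS)
             printf("%s: %ld image frames\n", path,
                    (long)mvGetTrackLength(imageTrack));
         mvClose(movie);
         return 0;
     }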
The Audio File Library interface supports reading and creating digital
audio files. The Audio File Library supports reading and writing a large
variety of audio file formats. The Audio File Library allows conversion
between different file formats by opening a sound file of one format,
creating a new sound file of a different format, and copying the data
between the sound files. See the man page afIntro(3dm) for an overview of
the Audio File Library routines. The header file <dmedia/audiofile.h>
contains the function declarations for the Audio File Library.
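A minimal sketch of reading a sound file's format with the Audio File
Library follows; the call names follow afIntro(3dm), and the individual
man pages give the exact prototypes.

     #include <dmedia/audiofile.h>   /* Audio File Library; -laudiofile */
     #include <stdio.h>

     /*
      * Hedged sketch: open a sound file and print its basic format.
      */
     int report_sound(const char *path)
     {
         AFfilehandle file;

         file = afOpenFile(path, "r", AF_NULL_FILESETUP);
         if (file == AF_NULL_FILEHANDLE)
             return -1;

         printf("%s: %d channel(s), %g Hz, %ld sample frames\n",
                path,
                afGetChannels(file, AF_DEFAULT_TRACK),
                afGetRate(file, AF_DEFAULT_TRACK),
                (long)afGetFrameCount(file, AF_DEFAULT_TRACK));

         afCloseFile(file);
         return 0;
     }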
DIGITAL MEDIA PLAYBACK FUNCTIONS
The Movie Library playback engine (included in libmovie) provides
functions for the playback of digital movie files. These functions allow
an application to bind a movie to an IRIS GL or OpenGL window and then
play the movie. For more information about the playback capabilities of
the Movie Library see mvIntro(3dm) and mvPlay(3dm). The header file
<dmedia/movieplay.h> contains the function declarations for the playback
engine.
FILES
/usr/include/dmedia/*.h - header files
/usr/share/src/dmedia/* - programming examples
SEE ALSO
IRIS Media Libraries Programming Guide
Overview man pages:
ALintro(3dm) - Audio Library
AFintro(3dm) - Audio File Library
CDaudio(3dm) - CD audio library
DATaudio(3dm) - DAT audio library
dmIC(3dm) - Digital Media Library image conversion routines
dmLTC(3dm) - LTC decoding routines
dmTC(3dm) - SMPTE timecode math routines
dmVITC(3dm) - VITC decoding routines
mdIntro(3dm) - MIDI Library
clIntro(3dm) - Compression Library
dmColor(3dm) - Digital Media Library color-space conversion
routines
dmParams(3dm) - digital media parameter/value lists
mvIntro(3dm) - Movie Library file I/O and playback
tserialio(3) - timestamped serial I/O Library
VLintro(3dm) - Video Library
IRIX Real-time Support:
select(2), sproc(2), setitimer(2), schedctl(2), prctl(2), poll(2),
mpin(2)