MPEG 4 - Multimedia for Our Time


The MPEG-4 standard, developed over five years by the Moving Picture Experts Group (MPEG) of the Geneva-based International Organization for Standardization (ISO), explores every possibility of the digital environment.

The greatest of the advances made by MPEG-4 is that viewers and listeners need no longer be passive.

MPEG-4 allows the user to interact with objects within the scene, whether they derive from so-called real sources, such as moving video, or from synthetic sources, such as computer-aided design output or computer-generated cartoons. Authors of content can give users the power to modify scenes by deleting, adding, or repositioning objects; for example, a click on a box could set it spinning.
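
To make the idea concrete, here is a minimal sketch of that kind of object-level interactivity, assuming a terminal that routes clicks to handlers attached to individual scene objects. The class and field names are invented for the example and are not MPEG-4 or BIFS syntax.

    # Hypothetical sketch of object-level interactivity; not real MPEG-4 or BIFS syntax.
    class SceneObject:
        def __init__(self, name):
            self.name = name
            self.spin_rate = 0.0               # degrees per second
            self.position = (0.0, 0.0, 0.0)

        def on_click(self):
            # The content author decides what a click does; here it sets the box spinning.
            self.spin_rate = 90.0

    box = SceneObject("box")
    box.on_click()                             # a click on the box starts it spinning
    print(box.name, "now spins at", box.spin_rate, "deg/s")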

MPEG-4 supplies tools with which to create uniform (and top-quality) audio and video encoders and decoders on the Internet, pre-empting what may become an unmanageable tangle of proprietary formats.

The standard is also designed for low bit-rate communications devices, which are usually wireless.

MPEG-4 supports scalable content; that is, it allows content to be encoded once and played out automatically at different rates, with quality acceptable for the communication environment at hand.
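
One way to picture such scalability is as a base layer plus optional enhancement layers, of which the terminal keeps only as many as its channel can carry. The sketch below follows that model; the layer names, rates, and selection rule are illustrative assumptions, not taken from the standard.

    # Illustrative layered selection; the layers and their rates are made up.
    layers = [
        ("base",          64_000),    # bits per second
        ("enhancement-1", 128_000),
        ("enhancement-2", 256_000),
    ]

    def select_layers(available_bps):
        """Keep the base layer plus as many enhancement layers as the channel allows."""
        chosen, used = [], 0
        for name, rate in layers:
            if not chosen or used + rate <= available_bps:   # the base layer is always kept
                chosen.append(name)
                used += rate
        return chosen

    print(select_layers(200_000))   # -> ['base', 'enhancement-1']
    print(select_layers(50_000))    # -> ['base']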

MPEG-4 provides tools for seamlessly integrating broadcast content with equally high-quality interactive MPEG-4 objects.

The utility of objects

The audio and video components of MPEG-4 are known as objects. These can exist independently, or multiple objects can be grouped together to form higher-level, compound audiovisual objects. The grouping is called composition, and the result is an MPEG-4 scene. The strength of this so-called object-oriented approach is that the audio and video can be manipulated easily.
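
A rough sketch of composition, assuming nothing more than the grouping described above; the class names and positions are invented for the illustration.

    # Minimal sketch of composition: independent media objects grouped into one scene.
    class MediaObject:
        def __init__(self, kind, name):
            self.kind = kind              # e.g. "video", "audio", "synthetic"
            self.name = name

    class Scene:
        def __init__(self):
            self.objects = []

        def compose(self, obj, position):
            # Each object keeps its identity; the scene only records where it sits.
            self.objects.append((obj, position))

    scene = Scene()
    scene.compose(MediaObject("video", "newscaster"), (0, 0, -2))
    scene.compose(MediaObject("audio", "voice"),      (0, 0, -2))
    scene.compose(MediaObject("synthetic", "logo"),   (1, 1, -1))
    print(len(scene.objects), "objects composed into the scene")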

Visual objects in a scene are described mathematically and given a position in a two- or three-dimensional space. Similarly, audio objects are placed in a sound space. When placed in 3-D space, the video or audio object need only be defined once; the viewer can change his vantage point, and the calculations to update the screen and sound are done locally, at the user’s terminal. This is a critical feature if the response is to be fast and the available bit-rate is limited, or when no return channel is available, as in broadcast situations.
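
The local update amounts to re-projecting a fixed 3-D position through the viewer’s current camera. The sketch below uses a plain pinhole projection to show the idea; it is not the actual MPEG-4 rendering model.

    # The object position is defined once; only the projection is redone locally
    # when the viewer moves. Plain pinhole camera, for illustration only.
    def project(point, camera, focal=1.0):
        x, y, z = (p - c for p, c in zip(point, camera))
        if z <= 0:
            return None                          # behind the viewer
        return (focal * x / z, focal * y / z)    # screen coordinates

    obj = (0.0, 1.0, 5.0)                        # fixed object position
    print(project(obj, camera=(0.0, 0.0, 0.0)))  # (0.0, 0.2)
    print(project(obj, camera=(1.0, 0.0, 0.0)))  # viewer moved: (-0.2, 0.2)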

MPEG-4’s language for describing and dynamically changing the scene is named the Binary Format for Scenes (BIFS). BIFS commands are available not only to add objects to or delete them from the scene, but also to change the visual or acoustic properties of an object without changing the object itself; thus the colour alone of a 3-D sphere might be varied.

BIFS can be used to animate objects simply by sending a BIFS command, and to define their behaviour in response to user input at the decoder.
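
As an illustration of such scene updates, the sketch below applies insert, delete, and replace-field commands to a toy scene. The command names and dictionary fields are invented for the example and are not actual BIFS syntax.

    # Toy scene updates in the spirit of BIFS; command names and fields are invented.
    scene = {}

    def apply(command):
        op = command["op"]
        if op == "insert":
            scene[command["id"]] = dict(command["fields"])
        elif op == "delete":
            scene.pop(command["id"], None)
        elif op == "replace_field":              # change one property, not the whole object
            scene[command["id"]][command["field"]] = command["value"]

    apply({"op": "insert", "id": "sphere1",
           "fields": {"shape": "sphere", "color": "red", "radius": 1.0}})
    apply({"op": "replace_field", "id": "sphere1",
           "field": "color", "value": "blue"})   # the colour alone changes
    print(scene["sphere1"])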

BIFS borrows many concepts from the Virtual Reality Modelling Language (VRML), which is the method used most widely on the Internet to describe 3-D objects and users’ interaction with them. In VRML, the objects and their actions are described in text, as in any other high-level language. But BIFS code is binary, and thus is shorter for the same content, typically 10 to 15 times.

MPEG-4 uses BIFS for real-time streaming; that is, a scene does not need to be downloaded in full before it can be played, but can be built up on the fly. BIFS also allows 2-D objects such as lines and rectangles to be defined, something currently not possible in VRML.
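
The size advantage of a binary format can be illustrated by packing the fields of a single update command into a few bytes and comparing with its textual form. The binary layout below is invented for the sketch, not the real BIFS bitstream, so the exact ratio is only indicative.

    import struct

    # A single update command written out as text...
    text_form = 'replace_field id="sphere1" field="color" value="0000ff"'

    # ...and the same fields packed into an invented binary layout:
    # one opcode byte, a two-byte node id, a one-byte field id, three colour bytes.
    binary_form = struct.pack(">BHB3B", 3, 17, 2, 0x00, 0x00, 0xFF)

    print(len(text_form.encode()), "bytes as text")    # 55 bytes for this string
    print(len(binary_form), "bytes as binary")         # 7 bytes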

The next release of the MPEG-4 standard adds MPEG-J, a set of MPEG-4-specific programming interfaces for the object-oriented Java language. MPEG-J defines interfaces to elements in the scene, network resources, terminal resources, and input devices.

Wrapping the data

Just as MPEG-4’s representation of multimedia content is new and versatile, so is its scheme for preparing that content for transportation or storage. Objects are placed in so-called elementary streams (ESs). Some objects, such as a sound track or a video, will have a single such stream. Other objects may have two or more.

Higher-level data describing the scene – the BIFS data defining, updating, and positioning the media objects – is conveyed in its own ES. Here the object-based conception of MPEG-4 shows its value: objects are easier to reuse in the production of new multimedia content, and a production is easier to modify, because the encoded objects themselves need not be changed.

To inform the system which elementary streams belong to a certain object, MPEG-4 uses the novel, critical concept of an object descriptor (OD). Object descriptors in their turn contain elementary stream descriptors (ESDs), which tell the system which decoders are needed to decode a stream. Optional textual information about the object can be supplied. Object descriptors are sent in their own, special elementary stream, which allows them to be added or deleted dynamically as the scene changes.
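
The nesting can be pictured as follows; the field names are simplified placeholders, not the exact MPEG-4 descriptor syntax.

    # Sketch of the descriptor nesting; the field names are simplified placeholders.
    es_descriptors = [
        {"es_id": 101, "decoder": "mpeg4-video", "bitrate": 384_000},
        {"es_id": 102, "decoder": "aac-audio",   "bitrate": 64_000},
    ]

    object_descriptor = {
        "od_id": 1,
        "es_descriptors": es_descriptors,        # which streams belong to this object
        "text_info": "example clip",             # optional textual information
    }

    for esd in object_descriptor["es_descriptors"]:
        print("stream", esd["es_id"], "needs decoder:", esd["decoder"])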

The play-out of the multiple MPEG-4 objects is coordinated at a layer devoted solely to synchronization. Elementary streams are split into packets, and timing information is added to these packets, which are then ready to be passed on to the transport layer.
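
A minimal sketch of that packetization step, assuming a 90 kHz reference clock and invented field names; the real sync-layer packet syntax is richer than this.

    # Illustrative packetization of one elementary stream with timing attached.
    def packetize(es_id, access_units, clock_hz=90_000):
        packets = []
        for i, (payload, decode_time_s, present_time_s) in enumerate(access_units):
            packets.append({
                "es_id": es_id,
                "sequence": i,
                "dts": int(decode_time_s * clock_hz),    # when to decode
                "cts": int(present_time_s * clock_hz),   # when to present
                "payload": payload,
            })
        return packets

    units = [(b"frame0", 0.00, 0.04), (b"frame1", 0.04, 0.08)]
    for p in packetize(101, units):
        print(p["sequence"], p["dts"], p["cts"], len(p["payload"]), "payload bytes")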

Streams here, streams there

Timing information for the decoder consists of the speed of the encoder clock and the time stamps of the incoming streams, which are relative to that clock. Two kinds of time stamps exist: one says when a piece of information must be decoded, the other says when the information must be ready for presentation.
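
In other words, the terminal converts each stamp from encoder clock ticks to local time and acts when that moment arrives. The sketch below assumes a 90 kHz clock and simplified packet records; both are illustrative.

    # Mapping the two kinds of time stamps to local actions; 90 kHz is an assumption.
    CLOCK_HZ = 90_000

    def due_actions(packets, now_s):
        """What the terminal should have done by local time now_s (in seconds)."""
        actions = []
        for p in packets:
            if p["dts"] / CLOCK_HZ <= now_s:
                actions.append(("decode", p["sequence"]))
            if p["cts"] / CLOCK_HZ <= now_s:
                actions.append(("present", p["sequence"]))
        return actions

    packets = [{"sequence": 0, "dts": 0,    "cts": 3600},
               {"sequence": 1, "dts": 3600, "cts": 7200}]
    print(due_actions(packets, now_s=0.05))
    # -> [('decode', 0), ('present', 0), ('decode', 1)]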

In terms of the ISO seven-layer communications model, no specific transport mechanism is defined in MPEG-4. The fact that digital TV uses the MPEG-2 transport stream, which can also carry MPEG-4 content, has the important consequence of allowing co-broadcast modes.

A separate transport channel could be set up for each data stream, but there can be many of these for a single MPEG-4 scene, and as a result the process could be unwieldy and waste bits. A small tool in MPEG-4, FlexMux, was designed to act as an intermediate step to any suitable form of transport by interleaving several elementary streams into one. Another interface defined in MPEG-4 lets the application ask for connections with a certain quality of service, in terms of parameters like bandwidth, error rate, or delay.
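
The sketch below shows both ideas in miniature: a round-robin interleaving of several packet streams into one sequence, and a quality-of-service request expressed as a handful of parameters. The data structures and parameter names are assumptions made for the illustration, not the real MPEG-4 interfaces.

    # Round-robin interleaving of several packet streams into one, plus a QoS request.
    from itertools import zip_longest

    def interleave(streams):
        muxed = []
        for group in zip_longest(*streams.values()):
            for es_id, packet in zip(streams.keys(), group):
                if packet is not None:
                    muxed.append((es_id, packet))
        return muxed

    streams = {101: [b"v0", b"v1"], 102: [b"a0", b"a1", b"a2"]}
    print(interleave(streams))
    # -> [(101, b'v0'), (102, b'a0'), (101, b'v1'), (102, b'a1'), (102, b'a2')]

    qos_request = {"bandwidth_bps": 384_000, "max_error_rate": 1e-5, "max_delay_ms": 200}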

This interface is the same for broadcast channels, interactive sessions, and local storage media. The next release of the standard will allow different channels to be used at the two ends of a transmit/receive network.

Another important addition in Version 2 is a file format, known as mp4, which can be used for the exchange of content and which is easily converted. MPEG-1 and MPEG-2 did not include such a specification, but the intended use of MPEG-4 in Internet and personal-computer environments makes it a necessity.
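
An mp4 file is organized as a sequence of "boxes", each introduced by a length and a four-character type. The sketch below walks the top-level box headers of a file; it ignores 64-bit sizes and nested boxes, and the file name is hypothetical.

    # Walk the top-level boxes of an .mp4 file: each starts with a 32-bit big-endian
    # size and a four-character type. 64-bit sizes and nested boxes are not handled.
    import struct

    def list_boxes(path):
        with open(path, "rb") as f:
            while True:
                header = f.read(8)
                if len(header) < 8:
                    break
                size, box_type = struct.unpack(">I4s", header)
                print(box_type.decode("ascii", "replace"), size, "bytes")
                if size < 8:
                    break                    # special size values are not handled here
                f.seek(size - 8, 1)          # skip the rest of this box

    # list_boxes("example.mp4")              # hypothetical file; prints e.g. ftyp, moov, mdat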
