How to create, initialize, manipulate and connect to signals of an Emotion object.

Modules
  - Creation and initialization functions
  - Audio control functions
  - Video control functions
  - Visualization control functions
  - Miscellaneous information retrieval functions
  - Video resource management
  - Play control functions
  - API available for accessing webcam

Typedefs
  - typedef struct _Emotion_Webcam Emotion_Webcam
    Webcam description.

Functions
  - EAPI Eina_Bool emotion_object_extension_may_play_fast_get (const char *file)
    Query whether Emotion may be able to play the file, judging only by its extension (fast, stringshared variant).
  - EAPI Eina_Bool emotion_object_extension_may_play_get (const char *file)
    Query whether Emotion may be able to play the file, judging only by its extension.
  - EAPI Evas_Object * emotion_object_image_get (const Evas_Object *obj)
    Get the actual image object that contains the pixels of the video stream.

Variables
  - EAPI int EMOTION_WEBCAM_UPDATE
    Ecore_Event triggered when a new webcam is plugged in.
Detailed Description
How to create, initialize, manipulate and connect to signals of an Emotion object.
Emotion provides an Evas smart object that makes it possible to play, control and display a video or audio file. The API is synchronous, but not everything happens immediately. There are also signals to report state changes.
Basically, once the object is created and initialized, a file is set on it, and it can then be resized, moved, and controlled like any other Evas object.
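A minimal sketch of that flow, assuming the default Ecore-Evas engine and a hypothetical media path (/tmp/example.ogv); error handling is kept to a bare minimum:

    #include <Ecore.h>
    #include <Ecore_Evas.h>
    #include <Evas.h>
    #include <Emotion.h>

    int
    main(void)
    {
       Ecore_Evas *ee;
       Evas *evas;
       Evas_Object *em;

       ecore_evas_init();

       /* Canvas that will hold the Emotion smart object. */
       ee = ecore_evas_new(NULL, 0, 0, 640, 360, NULL);
       ecore_evas_show(ee);
       evas = ecore_evas_get(ee);

       /* Create and initialize the Emotion object; NULL picks a default module. */
       em = emotion_object_add(evas);
       if (!emotion_object_init(em, NULL))
         return 1;

       /* Hypothetical media file. */
       emotion_object_file_set(em, "/tmp/example.ogv");

       /* The object behaves like any other Evas object. */
       evas_object_move(em, 0, 0);
       evas_object_resize(em, 640, 360);
       evas_object_show(em);

       emotion_object_play_set(em, EINA_TRUE);

       ecore_main_loop_begin();

       ecore_evas_free(ee);
       ecore_evas_shutdown();
       return 0;
    }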
However, the decoding of the music and video does not occur in the Ecore main loop, but usually in another thread (this depends on the module being used). The synchronization between this other thread and the main loop is not visible to the end user of the library. The user can simply register callbacks for the available signals to be informed of state changes, and can call other API functions to request further changes to the currently loaded file.
There will be a delay between an API function being called and the request actually being executed, since the request is made in the main thread and must be forwarded to the decoding thread. For this reason, always call functions like emotion_object_size_get() or emotion_object_length_get() after a signal such as "playback_started" or "open_done" has been emitted. The examples below demonstrate this behavior.
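A sketch of that pattern, assuming the object em from the sketch above and a hypothetical callback name: the size is only queried once "open_done" has been emitted, and the duration is read with emotion_object_play_length_get(), which reports the media length in seconds.

    #include <stdio.h>
    #include <Evas.h>
    #include <Emotion.h>

    /* Hypothetical callback: only query metadata once "open_done" arrives. */
    static void
    _open_done_cb(void *data EINA_UNUSED, Evas_Object *obj, void *event_info EINA_UNUSED)
    {
       int w, h;

       emotion_object_size_get(obj, &w, &h);
       printf("video size: %dx%d, length: %0.3f s\n",
              w, h, emotion_object_play_length_get(obj));
    }

    /* Registered right after creating the object: */
    /* evas_object_smart_callback_add(em, "open_done", _open_done_cb, NULL); */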
Available signals
The Evas_Object returned by emotion_object_add() emits a number of signals that can be listened to using Evas' smart callback mechanism. All signals have NULL as event info. The following is a list of interesting signals:
- "playback_started" - Emitted when the playback starts
- "playback_finished" - Emitted when the playback finishes
- "frame_decode" - Emitted every time a frame is decoded
- "open_done" - Emitted when the media file is opened
- "position_update" - Emitted when emotion_object_position_set is called
- "decode_stop" - Emitted after the last frame is decoded
Examples
The following examples illustrate typical Emotion usage. There is also the emotion_test binary distributed with this library, which covers the entire API; since it is too long and repetitive to explain, its code is simply displayed as another example.
Function Documentation
EAPI Eina_Bool emotion_object_extension_may_play_fast_get(const char *file)
Query whether Emotion may be able to play the file, judging only by its extension (fast, stringshared variant).
- Parameters:
  - file: A stringshared filename to check for whether Emotion can play it.
This function only looks at the file's extension; it does not check the MIME type or whether the file is actually valid. Treat the result purely as a hint for your application.
Referenced by emotion_object_extension_may_play_get().
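A small usage sketch under the assumption that the candidate path comes from elsewhere in the application; since this variant expects a stringshared filename, one is created and released around the call (any EFL init routine, such as ecore_evas_init(), must already have run so that Eina is set up).

    #include <Eina.h>
    #include <Emotion.h>

    /* Hypothetical helper: quickly filter out files Emotion is unlikely to play. */
    static Eina_Bool
    _maybe_playable(const char *path)
    {
       Eina_Stringshare *shared = eina_stringshare_add(path);
       Eina_Bool ok = emotion_object_extension_may_play_fast_get(shared);

       eina_stringshare_del(shared);
       return ok; /* only a hint: based purely on the extension */
    }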
EAPI Eina_Bool emotion_object_extension_may_play_get(const char *file)
Query whether Emotion may be able to play the file, judging only by its extension.
- Parameters:
  - file: The filename to check for whether Emotion can play it.
This function only looks at the file's extension; it does not check the MIME type or whether the file is actually valid. Treat the result purely as a hint for your application.
References emotion_object_extension_may_play_fast_get().
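A sketch of the non-stringshared variant, used here as a hypothetical pre-check before handing a user-selected path to emotion_object_file_set(); the helper name and the calling context are assumptions.

    #include <stdio.h>
    #include <Evas.h>
    #include <Emotion.h>

    /* Hypothetical pre-check before loading a user-selected path. */
    static Eina_Bool
    _try_load(Evas_Object *em, const char *path)
    {
       if (!emotion_object_extension_may_play_get(path))
         {
            /* Only a hint based on the extension; the file may still be invalid. */
            fprintf(stderr, "unlikely to be playable: %s\n", path);
            return EINA_FALSE;
         }
       return emotion_object_file_set(em, path);
    }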
EAPI Evas_Object* emotion_object_image_get(const Evas_Object *obj)
Get the actual image object that contains the pixels of the video stream.
- Parameters:
  - obj: The object on which the query is being run.
This function is useful when you want direct access to the pixels.
- See also:
- emotion_object_image_get()
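As a rough sketch of such direct pixel access, the snippet below fetches the underlying image object and reads its pixel buffer; the object em and the moment of the call (for example, from a "frame_decode" callback) are assumptions.

    #include <stdio.h>
    #include <Evas.h>
    #include <Emotion.h>

    /* Hypothetical inspection of the current frame's pixels. */
    static void
    _inspect_frame(Evas_Object *em)
    {
       Evas_Object *img = emotion_object_image_get(em);
       int w = 0, h = 0;
       void *pixels;

       evas_object_image_size_get(img, &w, &h);

       /* Read-only access to the pixel data (layout depends on the colorspace). */
       pixels = evas_object_image_data_get(img, EINA_FALSE);
       printf("frame %dx%d, pixels at %p\n", w, h, pixels);

       /* Hand the buffer back; no changes were made. */
       evas_object_image_data_set(img, pixels);
    }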