Android: Multimedia Interview Questions

1) Explain the Android architecture.
2) Explain the code flow for the audio playback and video playback scenarios.
3) Explain the state diagram of the MediaPlayer object.
4) What is the multimedia framework? Explain OpenCore and Stagefright.
5) What is the difference between OpenCore and Stagefright?
6) Explain the Stagefright architecture.
7) What is OpenMAX IL?
8) What are the callback functions in OpenMAX IL?
9) What is the role of an OMX component?
10) How will you implement an OMX component?
11) What is the use of OpenMAX IL?
12) When are the "SetParameter" and "SetConfig" functions used?
13) When are the "AllocateBuffer" and "UseBuffer" functions called?
14) What is the role of AwesomePlayer in Stagefright?
15) How will you integrate a software codec or hardware codec with Stagefright?
16) How is the player type decided for playback of a particular file format?
17) What is metadata, and how is it extracted?
18) Does Stagefright support all media file formats?
19) How is the thumbnail image created?
20) What is the role of the media scanner?
21) What is the role of the media extractor?
22) What is the role of the metadata retriever?
23) What is the functionality of AudioFlinger?
24) What is the functionality of SurfaceFlinger?
25) What is the role of the Audio Policy Service and the Audio Policy Manager?
26) Explain the state diagram of the phone state.
27) How do the application processor and the communication processor communicate?
28) What are the native services started by the media server?
29) How do the player app and the media player service communicate?
30) What is Binder?
31) What are the IPC methods used in Android?
32) How is AV sync managed during video playback?
33) How is buffer management done for playback and recording?
34) What are PMEM and ashmem?
35) What are the audio track and the audio sink in the context of playback?
36) What is a mutex, and when is it used?
37) What are the parser and the renderer? Are they OMX components?
38) How would you know whether a software codec or a hardware codec is used?
39) What is a frame buffer?
40) What is eglSwapBuffers?
41) What is a parser?
42) What is a recognizer?
43) What is a payload?
44) Explain the power-saving mechanism in audio/video playback.
45) Explain interrupt handling in audio playback.
46) Why are upsampling and downsampling required while routing different audio streams' data?
47) Which flag is set when playback completes in an OMX component?
48) What does an MP4 file header contain?
49) What does the media scanner do?
50) Where is the thumbnail stored in a picture?
51) How is AV sync achieved in RTSP streaming?
52) In RTSP streaming, what do RTCP packets comprise?
53) What happens in JNI if many media player instances are created?
54) Who selects the codecs for encoding in the author engine?
55) What is the control path in RTP?
56) Which transport protocol is used in RTSP?
57) Which is preferred for streaming: RTSP or HTTP?
58) Which is more preferred: H.263 or H.264?
59) What are a container and a codec?
60) Can seek and pause operations be performed while streaming over RTSP?
61) What are interlaced and progressive streaming?
62) How do you synchronize the streamed audio and video data?
63) Why is RTSP called "real time"?
64) What is the difference between HTTP and RTSP?
65) What is the RTSP protocol?
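Several of the questions above (AV sync during playback and streaming) come down to clocking video frames against the audio timeline. A minimal sketch of audio-master sync, where the audio clock is derived from frames actually consumed and each video frame is rendered, held, or dropped based on its PTS. All class and method names here are illustrative, not part of any Android API:

```java
// Sketch: audio-master AV sync. The audio clock is derived from the
// number of audio frames the sink has consumed; each video frame is
// rendered, held, or dropped depending on how far its presentation
// timestamp (PTS) is from that clock.
public class AvSyncSketch {
    private final int sampleRate;  // audio sample rate in Hz
    private long framesPlayed;     // audio frames consumed by the sink

    public AvSyncSketch(int sampleRate) {
        this.sampleRate = sampleRate;
    }

    /** Called as the audio sink reports consumed frames. */
    public void onAudioFramesPlayed(long frames) {
        framesPlayed += frames;
    }

    /** Audio clock in microseconds, derived from frames played. */
    public long audioClockUs() {
        return framesPlayed * 1_000_000L / sampleRate;
    }

    /** Decide what to do with a video frame stamped videoPtsUs. */
    public String scheduleVideoFrame(long videoPtsUs) {
        long lateUs = audioClockUs() - videoPtsUs;
        if (lateUs > 40_000)  return "drop"; // more than one 25 fps frame late
        if (lateUs < -40_000) return "wait"; // too early: hold the frame
        return "render";
    }
}
```

The 40 ms threshold is an assumed tolerance (one frame at 25 fps); real players tune this and may also resample audio or adjust the video clock instead of dropping frames.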

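The mutex question is commonly answered with a shared-buffer scenario: a lock serializes access to state touched by more than one thread, such as a buffer queue shared between a decode thread and a render thread. A minimal sketch in plain Java SE (the queue class is illustrative, not an Android framework type):

```java
import java.util.ArrayDeque;
import java.util.concurrent.locks.ReentrantLock;

// Sketch: a mutex guarding a buffer queue that a decode thread pushes
// into and a render thread pops from. Without the lock, concurrent
// addLast/pollFirst calls could corrupt the deque's internal state.
public class BufferQueueSketch {
    private final ReentrantLock lock = new ReentrantLock();
    private final ArrayDeque<int[]> queue = new ArrayDeque<>();

    public void push(int[] buffer) {
        lock.lock();          // block until this thread owns the mutex
        try {
            queue.addLast(buffer);
        } finally {
            lock.unlock();    // always release, even if an exception is thrown
        }
    }

    public int[] pop() {
        lock.lock();
        try {
            return queue.pollFirst(); // null if the queue is empty
        } finally {
            lock.unlock();
        }
    }
}
```

The lock/try/finally-unlock shape is the idiomatic pattern; in production code a condition variable would typically be added so the consumer can wait for buffers instead of polling.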