Mobile technology has advanced rapidly in the recent past, with mobile operating systems playing a major part in the popularity of mobile devices and their manufacturers. Today's mobile phones are called smartphones because of the smart way they work, driven by equally smart operating systems. Mobile operating systems include Android, BlackBerry, iOS, Windows Phone OS, Brew, Palm OS and many more, and new ones such as Mozilla's mobile OS and Ubuntu's mobile OS are due to launch in the coming years. Multimedia is a central part of the smartphone experience and therefore needs to be of very good quality. One of the most popular smartphone features is streaming video on the go, whether live TV or YouTube videos: people spend time watching short videos on their pocket-sized portable "TV" while travelling, and service providers aim to give users a high-quality, comfortable viewing experience.
The experience of watching live TV or live video streaming on mobile is not yet as good as on the desktop, where live streams are also available in HD quality. If HD-quality video comes to mobile and smart devices, the number of video views will increase, as users nowadays want everything to be quick, smooth and of good quality on their small, portable smartphones. This project is a guideline for quality analysis of mobile multimedia on the two most widely used mobile operating systems, Google's Android and Apple's iOS, and their live video streaming technology. The research focuses on live video streaming because mobile multimedia is a very broad topic and time is limited. It contains detailed research on video, video streaming and streaming protocols, followed by the architecture of the Android and iOS platforms and the concept of live video streaming on them, the codecs these two platforms use for live streaming, and then the method of analysing video quality and its implementation guideline in unit 6 (Implementing the Tested Guideline for Quality Video Streaming), followed by the testing guideline and the conclusion.
The aim is to perform a quality analysis of mobile multimedia on the two most widely used mobile operating system platforms, with the major part of the analysis devoted to video streaming, its current limitations and its future prospects in mobile multimedia.
To carry out a detailed study of the Android and iOS platforms.
To carry out detailed research on video streaming in Android and iOS.
To analyse the quality of audio and video streaming with respect to the file format of clips.
To analyse the hardware and software for current and future mobile multimedia.
The outcome of this research will be an analysis of video streaming on two mobile operating systems, Android and iOS. The report compares the two platforms with respect to the quality of video streaming they provide, the codecs they use for streaming, and the video conversion required for it.
This report can serve as a guideline for developers or further researchers on how far streaming can go and which operating system is more useful from a future point of view. It helps developers decide which codec to use for a video streaming project, what kind of video file format conversion is required, and whether they need to develop a conversion algorithm themselves or can use one that is already available.
Chapter 2: Literature Review
2.1 Multimedia
Multimedia signifies a number of media types used in combination: text with images is multimedia, as is video with text. Multimedia is used heavily nowadays; in fact there is hardly a single website without multimedia content, because multimedia makes websites interesting and attracts users to them. Internet speeds are also increasing, which makes it easy for users to browse a site and consume media content such as video, images, animation and audio without needing to download it, while still getting a very good quality of experience. (ntchosting, 2002-2013)
2.2 Multimedia video
Depending on its specification, a multimedia video file may come in various formats, each of which enjoys a strong reputation. AVI (Audio Video Interleave) and MPEG (Moving Picture Expert Group) are the most widely used video formats in terms of availability and usage. 3GP (3rd Generation Partnership Project), Flash Video (FLV) and Windows Media Video (WMV) are the formats most frequently used for online streaming and mobile devices. Other well-known formats are MOV (QuickTime Movie), RM (RealMedia), Matroska (MKV), DivX and Ogg.
AVI was established in 1992 as a means of playing video and audio back at the same time. Its file compression abilities made it a common choice among users with limited drive space. Improvements in compression techniques and data-distribution technology allowed AVI to keep its popularity for years, and it remains one of the most downloaded multimedia video formats. AVI videos carry the .avi file extension.
MPEG was developed in 1993 and was used for VCDs (Video Compact Discs), which contain audio and video information. MPEG-1 was the first video version in the MPEG family; its bitrate limitations meant the video had to be downscaled, leading to poor quality. Upgrades to the format brought better quality, with scalable resolution, high definition and improved file compression. MPEG Audio Layer III, i.e. MP3, is one of the best-known and most-played audio formats. .mpeg, .mp4 and .mpg are the common MPEG file extensions. FLV and WMV are used by YouTube for sharing videos on its website. MPEG allows smooth and fast streaming over the internet compared with other formats, which lose video quality; users get high-definition video, but they have to compromise on streaming speed as the technology is upgraded.
The most common video format on 3G phones is 3GP. It has poorer video quality than other file types, but since it uses less space it was the ideal choice for mobile phones. It can be used for MMS (Multimedia Messaging Service), shared via storage devices, and downloaded from and uploaded to the internet. (Marco Sumayao, 2012)
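The container formats discussed above are usually recognisable from the file extension. A minimal sketch of such a mapping (the table below is illustrative, not exhaustive; real tools inspect file headers rather than file names):

```python
import os

# Map common file extensions to the container formats discussed above.
CONTAINER_BY_EXTENSION = {
    ".avi": "Audio Video Interleave (AVI)",
    ".mpeg": "Moving Picture Expert Group (MPEG)",
    ".mpg": "Moving Picture Expert Group (MPEG)",
    ".mp4": "MPEG-4 Part 14",
    ".3gp": "3rd Generation Partnership Project (3GP)",
    ".flv": "Flash Video (FLV)",
    ".wmv": "Windows Media Video (WMV)",
    ".mov": "QuickTime Movie (MOV)",
    ".mkv": "Matroska (MKV)",
}

def container_of(filename: str) -> str:
    """Guess the container format of a video file from its extension."""
    _, ext = os.path.splitext(filename.lower())
    return CONTAINER_BY_EXTENSION.get(ext, "unknown")

print(container_of("clip.3GP"))   # 3rd Generation Partnership Project (3GP)
print(container_of("movie.mkv"))  # Matroska (MKV)
```

Note that the extension is only a convention: an .mp4 file renamed to .avi would fool this check, which is why media players probe the actual byte stream.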
2.3 Mobile multimedia
Today mobile multimedia means more than images, audio, video and animation; features like Siri on the iPhone 5, which works on speech commands, are also part of mobile multimedia on portable devices. The smartphone is the best showcase of mobile multimedia, containing features such as video streaming, live TV, cameras with high image and video quality, up to 2 GB of RAM and the latest operating system. Other examples of mobile multimedia are the iPod and portable music players such as the Sony Walkman.
Mobile multimedia is mainly used for video calling, audio/video messages, easy-to-use applications, and streaming of live TV, shows, news, sport and so on. Even outside the home or office it is accessible with the help of a network provider offering sufficient bandwidth.
It can also be said that smartphones are minicomputers: anything that is possible on a computer is possible on them, for example browsing the web, playing videos online, downloading any number of apps, playing songs through YouTube, and watching TV and movies. Smartphones can also create their own multimedia. Text messaging now allows attachment of images and audio/video files, which makes communication more interesting and valuable; the best examples are Skype, WhatsApp and many more.
Apple Inc. introduced a new technology called Siri with its iPhone series, and it continues in the latest version, the iPhone 5. Siri interacts with the user through voice, so the user can operate the device hands-free. (Jeremy Laukkonen, 2013)
2.4 Types of Multimedia
2.4.1 Video clips
A video clip is a sequence of still images, one running after another, which looks like a moving picture. Before YouTube came into existence it was necessary to download a video clip before watching it, but as internet speeds improved it became possible to watch a clip while it is still downloading. Video streaming is also possible over the internet; video is attached to the web far more than it was 5 to 10 years ago and can easily be shared through blogs and forums. MP4, MKV, AVI, XviD, etc. are common video file formats.
2.4.2 Audio clips
Audio is sound that can be listened to through speakers. An audio file is recorded sound that can be played back later. Audio files are fast to transmit because they are smaller than video clips; they require less bandwidth and storage capacity than video, which benefits the server transmitting them. It is very simple to upload and download audio files to and from a web server and play them with a simple player. MP3, AAC, WAV, etc. are common audio file formats.
2.4.3 Images
The first type of multimedia ever used on the web was the image, which became the most useful and crucial part of web page design. Images use very little space and bandwidth; because of this, online news portals have replaced paper news, reaching more people in far less time with frequent updates of the latest news accompanied by images. Image galleries are the most popular use of images on websites and can include photographs, paintings and pictures. Instead of the full picture it is possible to show a thumbnail, a small version of the full image. Images have become an important part of websites and are now also used as backgrounds, navigational menus and buttons. Most images are raster graphics such as JPEG, PNG, GIF and TIFF; the other type, vector graphics, is not widely used because some browsers are incompatible with it. (ntchosting.com, 2002-2013)
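A thumbnail is simply a down-scaled copy of the full image. As a toy sketch of the idea, the snippet below applies nearest-neighbour downscaling to a 2D pixel grid; a real website would use an imaging library rather than hand-rolled code, and the grid values here are invented for illustration:

```python
def thumbnail(pixels, new_w, new_h):
    """Nearest-neighbour downscale of a 2D pixel grid (toy sketch)."""
    old_h, old_w = len(pixels), len(pixels[0])
    # For each output pixel, sample the nearest source pixel.
    return [
        [pixels[y * old_h // new_h][x * old_w // new_w] for x in range(new_w)]
        for y in range(new_h)
    ]

# A 4x4 "image" reduced to a 2x2 thumbnail.
image = [
    [1, 1, 2, 2],
    [1, 1, 2, 2],
    [3, 3, 4, 4],
    [3, 3, 4, 4],
]
print(thumbnail(image, 2, 2))  # [[1, 2], [3, 4]]
```

The thumbnail carries a quarter of the pixels of the original here, which is exactly why thumbnails save bandwidth on gallery pages.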
2.4.4 Animation
Animation is an optical illusion: still images run in motion one after another. It is used as a form of art and for presentation, educational, instructional and learning purposes. The classic form of animation is the cartoon. Cartoon animation first appeared in the early years of the 20th century, with 24 drawings per second. Traditionally, cartoon animation was made from hand-drawn cartoons, which is costly and time-consuming to create; because of this drawback it was produced only by professional studios. Animation in films is produced on an individual-frame basis. Frames can be created from images or photographs, or can be drawn or painted. (N. Madison, 2002-2013)
2.5 Mobile Platform
A mobile platform is an operating system made especially for mobile devices. Just as a computer has an OS on which applications run with the support of input and output devices, organising all processes and resources so that applications run properly, a mobile device also has an OS that sits beneath all other applications. The OS is responsible for the features and functionality of a mobile device, and manufacturers use the OS's capabilities to deliver features on the devices they build. The following are some popular mobile operating systems: Android, iOS, BlackBerry OS and Windows Phone OS.
2.5.1 Android
In 2005 Google acquired Android Inc. along with its operating system, today well known as Android. The OHA (Open Handset Alliance) was formed in November 2007, announced by Google and its associates. Android is built on the Linux kernel, which provides its security, process model and driver structure. Although Android is based on Linux, Android applications cannot run directly on Linux or vice versa, because the UI framework, libraries and application life-cycle model differ. Android is middleware layered on the Linux kernel, with support for location services, 2D and 3D graphics, sensors, audio/video multimedia, Bluetooth, wireless networking and much more. The multimedia middleware was previously based on the OpenCORE framework from PacketVideo and was recently replaced by Google's Stagefright project, a much simpler multimedia framework. The OpenGL ES 1.x and 2.0 APIs are the standard for 3D graphics and 3D hardware acceleration.
Android applications are developed in Java and executed by the Dalvik virtual machine (DVM), which was developed by Google. The DVM helps improve runtime performance in areas such as memory usage, processor load and battery usage. Android 2.2 brought further performance improvements by introducing a Just-In-Time (JIT) compiler for Java-based applications. The Android platform comes with built-in applications such as messaging, email, contacts, a dialler and a media player. Because applications are developed in Java, some restricted APIs are not available to third-party applications. The Android UI also supports multi-touch, animation and gestures, and has been enhanced by OEM layers such as Samsung's TouchWiz, HTC Sense, Sony's Rachael and LG's S-Class.
The Eclipse IDE is used for Android application development, extended with device emulators and the required plugins, which help with designing, optimising, debugging and deploying applications on the device. Android uses a Java framework to run third-party applications, but employs its own dialect of the Java language for the system APIs and life-cycle model; because of this, Android is not compatible with Java SE or Java ME applications. Components developed in C and C++ can be embedded alongside Java using the NDK (Native Development Kit), which allows developers to build parts of an application in C/C++.
The Android frameworks are a combination of various groups of APIs, including SQLite for data storage, 2D and 3D graphics, Wi-Fi, web services, audio/video/image media, camera, telephony, sensors and Bluetooth. The novel framework used by Android allows loose coupling between applications and services. Android Market, now called the Play Store, is used to obtain Android applications and install them on Android-enabled devices. Applications can also be installed from other sources if the user knows them and enables the "unknown sources" option, which is disabled by default. (VisionMobile, 2011)
2.5.2 iOS
In January 2007, Apple Inc. introduced its most popular phone, the iOS-based iPhone, presented by Apple's CEO Steve Jobs at the MacWorld Conference & Expo. At the time of its introduction the OS was called iPhone OS; when the same OS was introduced for the iPad in January 2010 it became known as iOS. With the iOS platform Apple Inc. made its mark in the mobile market with its iOS products, the iPhone, iPad and iPod touch, which all share the same OS. iOS ships only with Apple Inc. products and cannot be used on third-party devices.
iOS is a stack, or combination, of frameworks that includes web services, middleware, an application store, an application framework, the iTunes delivery service and the iCloud online service. Apple gets most of its profits by selling its iPhone, iPad and iPod devices at a premium price.
Moreover, iOS is a scaled-down version of Mac OS X, the operating system used in Mac computers. Compared with Mac OS X, iOS has a touch-screen user interface. It runs on ARM-based processors and a hardware configuration without a hard disk or a physical switch partition. iOS is a Unix-based operating system, just as Mac OS X derives from BSD Unix. It also uses a "hybrid" kernel combining elements of both microkernels and monolithic kernels.
In the microkernel style, device drivers run in user space, while in the monolithic style, features such as the process model, networking stacks and the security model run in kernel space as core OS features. iOS offers high-level components including 3D graphics, UI, networking, rich multimedia content, web services and much more. These capabilities are provided to developers through a set of APIs that allows them to create new games and applications.
All iOS devices come with an extensive set of preloaded applications, including the following:
An SMS app and a phone dialler.
Contacts, e-mail and calendar applications.
iTunes integrated with the media player used in the iPod.
The App Store, for buying and installing applications.
Google Maps and YouTube applications using Google cloud services.
The Safari web browser from the Mac, which also comes with iOS devices.
In June 2011 Apple launched the iCloud online service, which supports storing all of a user's documents, photos, calendars, notes, music and apps on a remote server. This service is available only to users of Apple products or those registered with Apple Inc. (VisionMobile, 2011)
2.5.3 BlackBerry OS
The BlackBerry OS developed gradually over several generations. Its origin can be traced to the BlackBerry 850 device announced by RIM in 1999, a two-way pager that used the DataTrac network. In 2002, RIM launched one of the more significant second-generation versions of the OS on devices that became known as smartphones. These devices came with a QWERTY keypad and further useful functionality such as phone, faxing, email, browsing and text messaging; because of these features BlackBerry devices found their initial market success.
More recently, BlackBerry expanded its customers' flexibility by providing a messaging solution for text-addicted users. In 2009, RIM announced that 50% of its subscribers were consumers and 80% came from the non-enterprise sector. Moreover, the latest BlackBerry models not only build on strong traditional messaging but also include private information sharing with multimedia support, internet service accessibility, location services, access to intranet systems, contacts and downloadable applications. BlackBerry OS 6 included universal search as well as features such as a YouTube application, a social application and a browser built on the open-source WebKit engine. Even after delivering such a well-developed OS, RIM has struggled to compete with iOS and Android. In April 2010, in an effort to modernise its handset platform, RIM launched the acquisition of QNX Software Systems from Harman International. QNX Software Systems is a long-time maker of the QNX operating system, used in embedded applications including in-car information systems.
In 2011, RIM announced a new-generation OS named the BBX platform, centred on the QNX OS. RIM expected BBX to replace the legacy BlackBerry OS in its future smartphone and tablet models, allowing it to continue building products on a proprietary OS over time. BlackBerry OS is a proprietary operating system and is not available to third parties.
BlackBerry OS provides the following features:
Push email, connecting with IBM Lotus Notes, Microsoft Exchange and Novell GroupWise.
A personal information manager (PIM) with calendar, address book and to-do list, which can be synchronised with enterprise systems.
An Office compatibility suite supporting Microsoft file formats for documents, presentations and spreadsheets.
BlackBerry Messenger, with a unique, distinctive PIN.
An integrated messaging application collecting push notifications from numerous applications, including third-party applications.
Phone dialler support.
A media player for audio and video.
Good-quality video and picture capture.
Internet-based services such as YouTube and Facebook.
The BlackBerry platform provides many ways for developers to build a wide range of applications using Java and web development. With the support of the Java ME MIDP 2 mobile framework, developers can build applications for BlackBerry OS. RIM extended the framework with proprietary APIs that apps can use for:
Communication with Bluetooth devices.
Background threads with server push.
Media player with multimedia capability.
Recording and RTSP streaming.
Video playback and streaming.
Location based services.
Access to PIM data.
Layouts and transitions.
Cryptography and security utilities with PKI and elliptic curve support.
2.5.4 Windows Phone
Microsoft's new mobile platform is Windows Phone 7, which can be seen as a reworked version of the earlier Windows Mobile OS. Microsoft first announced Windows Phone 7 in February 2010. Several tier-1 mobile operators participated in the announcement, including AT&T, Orange, Telefonica/O2, Deutsche Telekom, T-Mobile, Sprint, Vodafone, SFR, Verizon Wireless, Telstra, and Telecom Italia.
Microsoft had started to develop its first mobile product back in 1998. In April 2000 it introduced its first mobile device to the market under the name "Pocket PC 2000", based on the Windows Mobile operating system. Until the launch of Android, Windows Mobile was the main openly licensable operating system for smartphones. At the end of 2008, when it became apparent that Windows Mobile could not compete with the more modern iOS and Android platforms, Microsoft started work on the new Windows Phone platform. In 2011, Microsoft and Nokia announced their partnership in the mobile business: the deal was that Nokia would build its smartphones on Microsoft's Windows Phone.
In addition, like Microsoft's other products, Windows Phone 7 is proprietary: Microsoft owns the rights to the operating system, and there is no backward compatibility with Windows Mobile. Windows Phone 7 is built on top of the Microsoft Windows CE 6 kernel, which provides fundamental system services such as process abstraction, memory management and scheduling. The CE 6 kernel is touted as significantly more advanced than the CE 5 kernel used in the Windows Mobile product line. The Windows Phone middleware supports Wi-Fi networking, Bluetooth and 3G data; a built-in connection manager handles Wi-Fi and cellular data connections according to a pre-configured policy. The platform's new graphics and rendering engine is based on the Direct3D 11 API, with hardware acceleration.
Windows Phone 7 isolates applications at the process level as well as at the file-system level (every application only has access to its own files). Application security is enforced in the Windows CE 6 kernel. Windows Phone 7 applications are discovered through the pre-installed Windows Marketplace, and clients can install applications only from the Marketplace, after a formal submission, verification and vetting process by Microsoft. For development of Windows Phone applications, developers can use the widely popular Visual Studio. The Visual Studio 2010 SDK for Windows Phone comes with support for Silverlight for Windows Phone, Expression Blend 4 for Windows Phone, the Windows Phone Emulator and XNA Game Studio 4.0. (VisionMobile, 2011)
Table: Comparison of the four mobile platforms up to Q3 2011, as estimated by VisionMobile. (VisionMobile, 2011)
2.6 Video
Video is a stream of still images run one after another in quick succession at a set frequency, which gives the experience of a motion picture. In the early days it served analog television, and analog video is still used by many, but in today's world of digital services everything about video has improved: quality, viewing experience, capacity, resolution, size and broadcasting medium, and most importantly the viewers, who want the best quality at a small size, with easy access and a good viewing experience.
The main characteristics of video are frame rate, aspect ratio, colour space, bits per pixel and video quality. Frame rate is the number of still images shown per unit of time to form the video; early cameras ranged between 6 and 8 frames/sec, while new cameras exceed 120 frames/sec. PAL (Phase Alternating Line) was the most widely used analog television standard, used in Europe, Asia and Australia; SECAM (Sequential Colour with Memory) was mostly used in Russia, France and Africa with 25 frames/sec; and NTSC (National Television System Committee) was used in Japan, America and Canada with 29.97 frames/sec. At low frame rates video did not look continuous and some lag was visible; as technology improved, frame rates increased, giving video a good, continuous look with almost negligible lag. The digital standards DVB, ATSC and ISDB bring much-improved digital video: European countries use DVB (Digital Video Broadcasting) at 60 frames/sec, North America uses ATSC (Advanced Television System Committee) at 60 frames/sec, and Japan uses ISDB (Integrated Services Digital Broadcasting) at 30 frames/sec.
Resolution is one of the most important properties of a video, since with proper resolution one can view the video properly, with all the information it wants to convey. Resolution is the physical size of the video given by its pixel width and height; a pixel is the smallest part of an image, and a digital image is made visible by its number of pixels. The more pixels, the better the quality of the video. For example, a resolution of 640 x 480 pixels means the video is 640 pixels wide and 480 pixels high. Colour space is the set of colour combinations available to an application; a colour space built from the intensities of red, green and blue in each pixel, each in the range 0 to 255, can render more than 16 million individual colours from combinations of these three.
Aspect ratio is the dimension of the displayed screen, expressed as width to height. For a traditional TV screen the aspect ratio is 4:3, or 1.33:1, and for today's HD TV screens it is 16:9, or 1.78:1. Pixels in digital video do not have the square aspect ratio of a traditional computer monitor screen. (RTE, October 2009)
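The characteristics above can all be expressed numerically. A small sketch of the arithmetic (standard calculations, not tied to any particular player or platform):

```python
# Resolution: total pixels in a 640 x 480 frame.
width, height = 640, 480
print(width * height)       # 307200 pixels per frame

# Aspect ratio: traditional TV vs. HD TV.
print(round(4 / 3, 2))      # 1.33 (4:3)
print(round(16 / 9, 2))     # 1.78 (16:9)

# Colour space: 8 bits (values 0-255) per red, green and blue channel.
print(256 ** 3)             # 16777216 distinct colours (> 16 million)

# Frame interval: time between successive frames at a given frame rate.
for fps in (25, 29.97, 60):  # SECAM/PAL, NTSC, DVB/ATSC
    print(f"{fps} fps -> {1000 / fps:.2f} ms per frame")
```

This is why the "more than 16 million colours" figure appears in the text: 256 x 256 x 256 = 16,777,216 combinations of the three channels.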
2.7 Video Streaming
Streaming is the playing of a video on a client device while that video is being downloaded from a server. Bandwidth requirements vary with the type of streaming and depend on the network together with the frame rate, resolution and bit rate of the stream: as these factors increase, the video file requires correspondingly more bandwidth. Streaming can be on-demand or real-time, such as live TV. Video conferencing is a real-time streaming application; it needs low packet loss and low delay because it requires quick communication between end users, and every user acts as both a client and a server.
On-demand service is like watching videos on the internet: pre-recorded videos are stored on a server and can be played or viewed whenever the user asks for, or demands, them. A real-time service is only available while the actual event is happening, but since on-demand videos are recorded, they can be played anytime and anywhere. (RTE, October 2009)
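The bandwidth a stream needs grows with resolution, frame rate and bits per pixel, as noted above. A rough sketch of the raw (uncompressed) figure makes clear why codecs are essential; the 200:1 compression ratio below is an illustrative assumption, not a property of any specific codec:

```python
def raw_bitrate_bps(width, height, fps, bits_per_pixel=24):
    """Uncompressed video bitrate in bits per second."""
    return width * height * bits_per_pixel * fps

# A 640x480 stream at 25 fps, uncompressed:
raw = raw_bitrate_bps(640, 480, 25)
print(raw / 1_000_000, "Mbit/s raw")               # 184.32 Mbit/s

# With an assumed (illustrative) 200:1 codec compression ratio:
print(raw / 200 / 1_000_000, "Mbit/s compressed")  # 0.9216 Mbit/s
```

Even this modest resolution would exceed 180 Mbit/s raw, far beyond mobile networks, which is why the choice of streaming codec matters so much to the quality analysis in this research.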
2.8 Research Methodology
Research methodology is the way research problems are solved; it can be understood as the science of studying how research is done systematically. It covers the steps through which researchers study their problem, together with the logic behind them. It is also important for a researcher to know about research techniques as well as methodologies. Research methodology contains various methods and the dimensions that accompany them, so its scope is wider than that of research methods. Thus, when we talk of research methodology we discuss not only the research methods but also the logic behind the methods used in the context of the research study, and we explain why a particular method is essential and why others are not needed, so that research outcomes can be evaluated by the researcher and by others.
2.8.1 Quantitative research
Quantitative research places the emphasis on measurement when gathering and analysing data. It not only uses numerical measures but also follows the natural-science model of research measurement in order to establish knowledge. It usually makes use of deduction; that is, the research is carried out in relation to hypotheses drawn from theory. The usual procedure of quantitative research is given in the figure.
Figure: Steps in the (linear) deductive process (Commonwealth of Learning, 2004)
Data collection methods in quantitative research:
secondary analysis and official data
content analysis in accordance with a coding system
quasi-experiments (studies that have some of the characteristics of classic experiments)
classic experiments (studies that have control groups and experimental groups).
2.8.2 Qualitative research
Qualitative research is concerned with phenomena rather than working directly with numerical scales. Mainly, it seeks to recognise and draw out the meaning of situations or events from the standpoint of the individuals involved, as understood by them. It is normally inductive rather than deductive in its approach, i.e. it builds theory from interpretation of the evidence, albeit against a theoretical background. A process for qualitative research is given in the figure. Qualitative research emphasises meanings rather than frequencies and scores when gathering and analysing data. Some researchers argue that qualitative research is also concerned with matters of quantity, but with measures of a different order to numerical ones.
Figure: The iterative qualitative research process (Miller and Crabtree 1992)
Methods of qualitative research include:
interviews (face-to-face, or through various technologies)
unstructured (natural conversation; life-history narrative of key informants; projective techniques)
semi-structured (using an interview guide)
individual (an in-depth interview)
group (focus group)
life-history narrative focused on selected topics
structured (using an interview schedule)
questionnaires administered in interviews
recordings - audio and video, with structured or unstructured analysis
content analysis of talk and interaction
2.8.3 Mixed Methods Research Method
The mixed methods research process model includes eight different phases:
define the research question;
determine whether a mixed design is suitable;
select the mixed-method or mixed-model research design;
collect the data;
analyse the data;
interpret the data;
validate the data; and
draw conclusions and compose the final report.
Figure: Monomethod and mixed-model designs. (Commonwealth of Learning, 2004)
Figure: Mixed-method design matrix with mixed-method research designs shown in the four cells. (Commonwealth of Learning, 2004)
Mixed methods research serves five broad purposes:
Triangulation (i.e., seeking convergence and corroboration of results from different methods and designs studying the same phenomenon);
Complementarity (i.e., seeking elaboration, enhancement, illustration, and clarification of the results from one method with the results from the other);
Initiation (i.e., discovering contradictions and paradoxes that lead to a reframing of the research question);
Development (i.e., using the findings from one method to help inform the other);
Expansion (i.e., seeking to extend the breadth and range of the research by using different methods for different inquiry components).
The mixed methods research process model incorporates Onwuegbuzie and Teddlie's (2003) seven-stage conceptualisation of the mixed methods data analysis process. The seven stages are as follows:
Data reduction: reducing the dimensionality of the quantitative data (e.g., via descriptive statistics, exploratory factor analysis, cluster analysis) and the qualitative data (e.g., via exploratory thematic analysis).
Data display: describing the qualitative data pictorially (e.g., matrices, lists, Venn diagrams, networks, charts) and the quantitative data (e.g., tables, graphs).
Data transformation: quantitative data are converted into narrative data that can be analysed qualitatively (i.e., qualitized; Tashakkori & Teddlie, 1998), and/or qualitative data are converted into numerical codes that can be represented statistically (i.e., quantitized; Tashakkori & Teddlie, 1998).
Data correlation: correlating the quantitative data with the qualitized data, or the qualitative data with the quantitized data.
Data consolidation: combining both quantitative and qualitative data to create new or consolidated data sets.
Data comparison: comparing data from the qualitative and quantitative data sources.
Data integration: the final stage, in which both quantitative and qualitative data are integrated into either a coherent whole or two separate sets (i.e., qualitative and quantitative) of coherent wholes.
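As an illustration of the quantitizing step described above, here is a minimal Python sketch (the themes and responses are invented for the example) that converts qualitative codes into frequencies that could then be analysed statistically:

```python
from collections import Counter

# Hypothetical interview responses already coded with qualitative
# themes (both the themes and the counts are invented for this sketch).
coded_responses = [
    "buffering", "picture quality", "buffering",
    "battery drain", "picture quality", "buffering",
]

# "Quantitizing" (Tashakkori & Teddlie, 1998): convert qualitative
# codes into numerical frequencies suitable for statistical analysis.
theme_counts = Counter(coded_responses)

for theme, count in theme_counts.most_common():
    print(f"{theme}: {count}")
```

The reverse step, qualitizing, would take numerical groupings and re-describe them narratively.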
Figure: Mixed Research Method (Commonwealth of Learning, 2004)
Chapter3: RESEARCH INTO ANDROID
3.1 Android Introduction
Android is a mobile operating system built on a software stack that contains an OS kernel, middleware, and applications. Application developers can write Android applications in the Java programming language, using the tools and APIs provided by the Android SDK.
The application framework enables reuse and replacement of components.
Just as Java has the JVM, Android has the DVM (Dalvik virtual machine), optimised for mobile devices.
A preloaded web browser integrating the open source WebKit engine.
Support for 2D graphics, and for 3D graphics based on OpenGL ES 1.0.
SQLite for structured data storage.
Support for common audio, video, and image formats, including MPEG4, H.264, MP3, AAC, AMR, JPG, PNG, and GIF.
Support for Bluetooth, Wi-Fi, GSM telephony, 3G, and EDGE.
Android also supports the camera, compass, accelerometer, and GPS.
3.2 Android Architecture
The following figure shows the architecture of Android, with its components: the Linux kernel, libraries, the Android runtime, the application framework, and applications.
Figure: Android Architecture (developer.android.com/about/versions/index.html)
Android ships with preloaded applications such as email, maps, SMS, contacts, calendar, and a browser, all written in Java. The application layer is the part of the Android architecture in which all installed applications reside.
3.2.2 Application Framework
The application framework is a collection of managers that help developers build innovative applications by exposing the same underlying APIs used by the core applications. Through these managers a developer can access the device hardware and location information, run services in the background, raise notifications and alarms, and much more, all via the Android APIs.
The Android system includes the following services for applications:
A rich set of Views for building an application, including text boxes, lists, buttons, grids, and even an embeddable web browser.
Content Providers, which allow an application to access data from other applications, or to share its own data with other applications on the device.
A Resource Manager, which gives developers access to non-code resources such as layouts, strings, and graphics.
A Notification Manager, which allows any application to display alerts in the status bar with customisable settings.
An Activity Manager, the most important part, which manages the lifecycle of applications and provides proper navigation between them.
3.2.3 Libraries
The Android system includes a set of C/C++ libraries that are exposed to developers through the Android application framework, so that they can be used within applications. Some of the core libraries are:
System C library: an implementation of the standard C library, tuned for embedded Linux-based devices.
Media libraries: support many audio and video file formats as well as still images, including MPEG4, MP3, AMR, PNG, H.264, AAC, and JPEG.
Surface Manager: manages access to the display, compositing 2D and 3D graphics from applications.
WebKit: Android's web browser engine, which also powers an embeddable web view.
SGL: the primary 2D graphics engine.
3D libraries: contain the 3D acceleration and rasterizer support, based on OpenGL ES 1.0.
FreeType: bitmap and vector font rendering.
SQLite: the database engine of the Android device, accessible to all applications.
3.2.4 Android Runtime
Android includes a set of core libraries that provide most of the functionality of the core libraries of the Java language. Just as Java has the JVM, Android has the DVM (Dalvik virtual machine): every Android application runs in its own process, with its own instance of the DVM. Dalvik has been written so that a device can run multiple virtual machines efficiently. It executes files in the .dex (Dalvik Executable) format, which is optimised for a minimal memory footprint. The virtual machine runs classes compiled by a Java compiler that have been transformed into the .dex format by the included dx tool. The DVM relies on the Linux kernel for underlying functionality such as memory management and threading.
3.2.5 Linux Kernel
The Linux kernel is the heart of the Android system, acting as an abstraction layer between the hardware and the rest of the software stack. It provides core system services such as the driver model, security, process management, the network stack, and memory management. Android uses version 2.6 of the Linux kernel.
3.3 Live Video Streaming from an Android-Enabled Device
Figure : Architecture Overview (N. Vun and Y. H. Ooi. 2010)
The figure shows the architecture of live streaming from Android devices. It contains two main, interconnected parts: the Android device and the server that receives the video. The Android device must first be authorised to connect to the server, using a custom protocol over a TCP connection.
Once authorisation is granted, the device turns on its camera and starts to transmit the video over UDP; the server receives the video in the form of RTP packets. A client device or application capable of RTSP playback that wants to view the video connects to the RTSP server; after information such as connection details and media type has been obtained from the server, the client requests playback. The server then starts sending RTP packets to the client device, where the stream is decoded. For web playback, the encoded video is stored on the server in Flash video format, and when a user requests the video it is played on the user's device.
3.3.1 Android Device Application
H.263 is a video encoding and decoding standard developed in 1996 by the ITU-T video coding group. It uses low bandwidth and was designed for video conferencing; in 1998 and 2000 revisions were made for higher quality. Google adopted it for the Android OS, which is why it is supported by all Android devices today. Because of its low bitrate, an H.263 stream can be transmitted over low-bandwidth networks. (Vun, N.; Ansary, M., 2010)
3.3.2 Real-Time Transport Protocol
RTP is a packet format used for transmitting audio and video in a number of encoded formats over the internet or a local network. Many applications support RTP, such as VLC, Windows Media Player, FFmpeg, and QuickTime. RTP transfers packets over UDP, so no TCP acknowledgements are needed; this is the reason UDP was chosen for sending the packets. For example, even when the user is in a low-bandwidth area the device can still send video to the server in time; the quality will vary, but the transfer continues. When the user is out of coverage, playback stops, and when coverage returns the stream resumes displaying immediately. (N. Vun and Y. H. Ooi, 2010) (http://tools.ietf.org/html/rfc2326)
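The RTP packets described above all begin with a fixed header. As an illustrative sketch, the following Python fragment packs the 12-byte fixed RTP header from RFC 3550 (the sequence number, timestamp, and SSRC values are examples; payload type 34 is the static RTP assignment for H.263):

```python
import struct

def build_rtp_header(seq, timestamp, ssrc, payload_type=34, marker=0):
    """Pack the 12-byte fixed RTP header defined in RFC 3550.

    Payload type 34 is the static RTP assignment for H.263; the
    remaining argument values used below are illustrative only."""
    byte0 = 2 << 6                        # version=2, padding=0, ext=0, CC=0
    byte1 = (marker << 7) | payload_type  # marker bit + 7-bit payload type
    # Network byte order: two single bytes, 16-bit sequence number,
    # 32-bit timestamp, 32-bit SSRC.
    return struct.pack("!BBHII", byte0, byte1, seq, timestamp, ssrc)

header = build_rtp_header(seq=1, timestamp=90000, ssrc=0x1234)
print(header.hex())
```

The header is followed by the encoded media payload; the sequence number and timestamp let the receiver reorder and schedule the packets.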
3.3.3 Real Time Streaming Protocol
RTSP is used to control playback of an RTP stream. A client cannot view an RTP stream directly; an RTSP server is needed to handle starting and stopping of the streamed packets using the streaming information. RTSP includes commands such as SETUP, OPTIONS, TEARDOWN, ANNOUNCE, DESCRIBE, PLAY, and PAUSE. RTSP streams can also be played in the Safari browser, because HTML5 supports video, which makes it possible to embed video without a Flash player. The client device's OS must support the required decoder for the transmitted video; H.263 and H.264 are the most commonly used video formats carried over RTP.
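As a sketch of how the RTSP commands listed above travel over the wire, the following Python snippet builds a minimal RFC 2326-style request (the URL and transport parameters are invented for the example):

```python
def rtsp_request(method, url, cseq, headers=None):
    """Build a minimal RTSP/1.0 request line plus headers, in the
    style of RFC 2326 (SETUP, PLAY, PAUSE, TEARDOWN, and so on).
    The URL and transport parameters used below are invented."""
    lines = [f"{method} {url} RTSP/1.0", f"CSeq: {cseq}"]
    for name, value in (headers or {}).items():
        lines.append(f"{name}: {value}")
    # RTSP, like HTTP, terminates each line with CRLF and the
    # whole header block with an empty line.
    return "\r\n".join(lines) + "\r\n\r\n"

req = rtsp_request("SETUP", "rtsp://example.com/live", 2,
                   {"Transport": "RTP/AVP;unicast;client_port=5000-5001"})
print(req)
```

A real session would send DESCRIBE first to learn the media description, then SETUP and PLAY, with the server echoing each CSeq in its response.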
3.3.4 Encoding and Decoding
The role of the encoder and decoder here is to convert the live video stream into Flash video for the web page. There are many encoders and decoders for video playback, each with its own pros and cons when compared with the others, in properties such as memory footprint, file size, decoding time, encoding time, processor usage, and video quality. Optimal scalability is needed, because the video being transmitted from the Android device requires processing. FFmpeg gives very good picture quality with low CPU and memory consumption per stream, which is why it was selected as the default transcoder. (N. Vun and Y. H. Ooi, 2010)
3.3.5 Real Time Streaming Protocol Server
The RTSP server provides stream information and forwards RTP packets to requesting clients only. It receives the RTP packets from the devices; the packets are discarded if there are no viewers. When a request to play a stream reaches the RTSP server, the server modifies the packets and forwards them to the client device. (http://tools.ietf.org/html/rfc2326)
3.3.6 Android Device Side
The video is created on the Android device through its camera, and the device is responsible only for transferring the video over the internet to the server. Through the custom protocol it maintains communication with the server to confirm correct identification before and during video streaming.
3.3.7 Conversion from H.263 to RTP H.263
The video is transferred from the Android device via RTP. Cellular networks are not reliable, since the signal can vary, and this costs extra power and battery life. A conversion step is required to turn the camera output into RTP packets. The Android device writes H.263 video, supported by its APIs, to a native TCP port on the device; the application reads the data from the native port and repackages it into RTP H.263 packets, framing each one in sequence with an RTP header. Each video frame is broken into packets of roughly 1500 bytes, with the marker bit used to delimit each block of frame data, and the RTP packets are then sent to the server's UDP video port. Using the information in the RTP headers, the frames can be decoded on the server and, at the client end, displayed as viewable video. This conversion to RTP is needed because a raw H.263 stream does not carry information such as source identity, payload type, marker, and version, and its length is unknown until the stream is closed. (Vun, N.; Ansary, M., 2010)
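The frame-splitting step described above can be sketched as follows. This is an illustrative Python fragment, assuming roughly 1400 bytes of payload per packet (leaving room for the IP/UDP/RTP headers inside a typical 1500-byte MTU) and following the usual RTP convention of setting the marker bit on the last packet of a frame:

```python
def packetize_frame(frame, max_payload=1400):
    """Split one encoded video frame into RTP-sized payload chunks.

    1400 bytes is an assumed payload size that leaves room for the
    IP/UDP/RTP headers inside a typical 1500-byte MTU. Following the
    usual RTP convention, the marker flag is True only on the final
    packet of the frame."""
    chunks = [frame[i:i + max_payload]
              for i in range(0, len(frame), max_payload)]
    return [(chunk, i == len(chunks) - 1) for i, chunk in enumerate(chunks)]

frame = bytes(3000)                     # a dummy 3000-byte encoded frame
packets = packetize_frame(frame)
print([(len(p), marker) for p, marker in packets])
```

Each chunk would then be prefixed with an RTP header carrying the shared frame timestamp and an incrementing sequence number before being sent over UDP.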
3.3.8 Conversion to Flash Video
A Flash video file can be viewed on a web page through a browser plug-in such as Shockwave Player. Almost every browser and OS has plug-ins to play Flash video, which is developed by Adobe; the only compatibility problem is with Apple products. Flash was selected as the conversion target for displaying the live video on a web page. The Flash video extension is .flv; it encodes synchronised audio and video streams, encoded in the same way as the audio and video in .swf files. (Adobe Systems Incorporated, 2010)
Longtail Video provides an open source FLV player called JWPlayer. The video transcoding output provides a Shockwave Flash file that can be used to transport the FLV file. Each stream is a single instance of JWPlayer.
From the time the FLV file is produced by the encoder, the whole video recording is stored to the file. If the video is played straight away via the HTTP server, the result is live playback; if it is fetched later, what plays is not live video but the recording. A number of software packages deal with live streaming, but none of them delivers a complete solution for streaming video from a mobile device to a web page. Other software delivers benefits for other classes of application, such as streaming video from a webcam. (JWPlayer, 2011) (Vun, N.; Ansary, M., 2010)
Chapter 4: RESEARCH INTO IOS
4.1 IOS Introduction
iOS is the operating system that ships with all of Apple's portable handheld devices, such as the iPhone, iPad, and iPod. iOS manages all the device hardware alongside the native applications, and each device also comes with preloaded applications such as Mail, Phone, and the browser.
Applications for iOS devices can be developed using the iOS SDK, which contains the tools and interfaces needed for application development. All such applications run natively on iOS devices, are built on the iOS frameworks, and are written in the Objective-C language. Applications on the phone are synced with the user's computer through iTunes. (Apple Inc., 2011)
4.2 The iOS Architecture is Layered
iOS mediates between the device hardware and the native applications that appear on screen. Applications interact with the hardware through a set of system interfaces, which also protects applications from hardware changes. Since iOS provides a good interface to its hardware, it is simple to write applications that work across devices with different hardware. The following figure shows the layered architecture of iOS, in which the lower layers give fundamental services and support to all applications, and the higher layers provide more sophisticated services and technologies.
Figure: Layers of IOS (Apple Inc. 2011.)
When developing an application, a developer should prefer higher-level frameworks over lower-level ones, because the higher-level frameworks provide object-oriented abstractions of the lower-level constructs. As a result, they reduce the amount of code needed and make it easier to work with complex features such as threads and sockets.
The first layer of the iOS architecture is the Cocoa Touch layer, the most important layer, as it contains the key frameworks for developing iOS applications; it provides high-level system services such as notifications, touch input, and multitasking. Before designing an application, a developer should investigate this layer to see whether it fulfils the application's needs.
The second layer is the Media layer. As its name suggests, this layer serves multimedia-based applications: it contains the audio, video, and graphics technologies for building rich multimedia apps. Through it, applications can look and sound good, improving the user experience.
The third layer is the Core Services layer. It consists of system services available to all installed applications; even when an application does not use these services directly, some fundamental services still run on its behalf with high priority.
The last and fourth layer is the Core OS layer, consisting of the low-level features on which most of the other technologies are built. Whether an application uses them directly or not, these technologies are automatically incorporated through the other frameworks; and when there is a need to deal with security or to communicate with external hardware accessories, this layer provides the supporting frameworks. (Apple Inc., 2011)
4.3 Live Streaming
Live streaming is the sending of audio/video over HTTP from a server, for playback on iOS devices (iPhone, iPod, Apple TV, iPad) and on Mac computers. It supports both prerecorded content, i.e. video-on-demand, and live broadcasting. It offers streams at various bitrates, and the client software can switch streams according to the available network bandwidth. Authentication can be required to stream a video, and media encryption is also supported, so that content can be protected. All Apple devices support HTTP Live Streaming through their own Safari browser, whose media player acts as the HTTP Live Streaming client. (Apple Inc., 2011)
4.4 HTTP Streaming Architecture
HTTP Live Streaming lets any Apple device running iOS 3.0 or later, or any computer with Safari 4.0 or later installed, stream prerecorded or live audio/video, with support for authentication and encryption.
Figure: A basic configuration (Apple Inc. 2011.)
4.4.1 Server Components
The server component is responsible for taking the input stream and encoding it digitally in a format suitable for delivery to clients via the distribution component. It needs a media encoder, which can be hardware, and it breaks the media into segments stored as files; for segmentation of the media, Apple provides the media stream segmenter software.
4.4.2 Media Encoder
The media encoder takes a real-time audio/video signal and encodes and encapsulates that media for transmission. The media must be encoded in formats supported by the client device, such as H.264 for video and AAC for audio. Currently, an MPEG-2 transport stream is supported for audio/video delivery, and an MPEG elementary stream for audio only. The MPEG-2 transport stream carries the encoded media over the local network to the stream segmenter.
4.4.3 Stream Segmenter
The stream segmenter is software that reads the transport stream from the local network and divides it into a sequence of small media files of equal duration. Even though each segment is a separate file, the files are created from a continuous stream and can be reassembled seamlessly. The segmenter also produces an index file containing references to the media files; each time a new media file is created, the index file is updated. The index file is used to track the availability and location of the media files. The segmenter can also encrypt the media segments and create key files as part of the process. The index file is stored as an .M3U8 playlist, and the media segments as .ts files.
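A minimal sketch of the index file the segmenter produces, assuming a simple unencrypted stream whose segments all share one illustrative duration, might look like this in Python:

```python
def write_index(segment_names, target_duration=10):
    """Emit a minimal .M3U8 index referencing the .ts media segments
    the segmenter produced. This sketch assumes an unencrypted stream
    whose segments all share one illustrative duration."""
    lines = [
        "#EXTM3U",
        "#EXT-X-VERSION:3",
        f"#EXT-X-TARGETDURATION:{target_duration}",
        "#EXT-X-MEDIA-SEQUENCE:0",
    ]
    for name in segment_names:
        lines.append(f"#EXTINF:{target_duration:.1f},")  # segment duration
        lines.append(name)                               # segment file name
    return "\n".join(lines) + "\n"

playlist = write_index(["seg0.ts", "seg1.ts", "seg2.ts"])
print(playlist)
```

For a live stream, the segmenter rewrites this playlist as each new segment appears; an encrypted stream would additionally carry #EXT-X-KEY tags pointing at the key files.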
4.4.4 File Segmenter
If a media file uses supported codecs, it can be used directly by the file segmenter, which compresses it as MPEG and splits it into segments of identical length. The file segmenter makes it possible to offer a library of existing audio/video files for transmission as video-on-demand through HTTP Live Streaming. The file segmenter performs the same task as the stream segmenter, but takes files as its input instead of a stream.
4.4.5 Media Segment Files
The media segment files are created by the stream segmenter, based on the encoded input, as a succession of .ts files, each containing a segment of an MPEG-2 transport stream carrying H.264 video and AAC audio. For audio-only transmission, the segmenter creates an MPEG elementary audio stream consisting of AAC with ADTS headers, or MP3.
4.4.6 Index Files
The index files are created by the file or stream segmenter and stored as .M3U8 playlists, an extension of the .m3u format used for MP3 playlists.
4.4.7 Distribution Components
The distribution component is a web server, responsible for accepting client requests and delivering the media to the requesting clients. For delivery to large audiences, an edge network or other content delivery network can be used. The distribution system is effectively a caching web server that delivers the index and media files over HTTP to the requesting clients. Typically only minimal web server configuration is required, and no custom server modules are needed to deliver the content; the one recommended configuration step is to register the MIME types associated with the .M3U8 and .ts files.
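The MIME-type configuration mentioned above can be illustrated with Python's standard mimetypes module (the type names follow Apple's HTTP Live Streaming guidance; a real deployment would set these in the web server's configuration rather than in application code):

```python
import mimetypes

# Register the MIME types the distribution web server should send for
# HLS content; the type names follow Apple's HTTP Live Streaming
# guidance. A real deployment would configure these in the web server
# itself rather than in application code.
mimetypes.add_type("application/x-mpegURL", ".m3u8")
mimetypes.add_type("video/MP2T", ".ts")

print(mimetypes.guess_type("prog_index.m3u8")[0])
print(mimetypes.guess_type("fileSequence0.ts")[0])
```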
4.4.8 Client Component
The client software is responsible for determining the appropriate media to request, downloading those resources, and reassembling them into a continuous stream for the user. The client begins by fetching the index file, using the URL identifying the stream. The index file specifies the available media files, alternate streams, and any decryption keys. The client then downloads the sequence of available media files for the chosen stream, and begins presenting the reassembled stream once it has downloaded enough data. The client is also responsible for fetching any decryption keys, providing a user interface for authentication where required, and decrypting the media files as needed. (Apple Inc., 2011)
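The first client step, extracting segment URLs from a fetched index file, can be sketched as follows (a simplified parser for a non-variant playlist; real clients must also handle variant playlists, key files, and live playlist reloads):

```python
def parse_index(playlist_text, base_url):
    """Extract the media-segment URLs a client must fetch from a
    simple, non-variant M3U8 index. Real clients must also handle
    variant playlists, key files, and live playlist reloads."""
    urls = []
    for line in playlist_text.splitlines():
        line = line.strip()
        if line and not line.startswith("#"):  # non-tag lines name segments
            urls.append(base_url.rstrip("/") + "/" + line)
    return urls

index = "#EXTM3U\n#EXTINF:10.0,\nseg0.ts\n#EXTINF:10.0,\nseg1.ts\n"
urls = parse_index(index, "http://example.com/stream")
print(urls)
```

The client would fetch these URLs in order, buffering a few segments ahead before starting playback.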
4.5 Preparing Media for Delivery to iOS-Based Devices
The following four tables show the suggested encoder configurations for iOS devices. These configurations apply whether a hardware or a software encoder is used. For video-on-demand, where content must be encoded from an original file, a tool such as Compressor can be used for the video encoding. MP3 for audio, MPEG-4 for video, and QuickTime for movies are the file formats accepted by the file segmenter, using the specified encodings.
MPEG audio and video streams enclosed in an MPEG-2 transport stream are the stream format for the stream segmenter, encoded as follows:
H.264 Compression for encoding video:
H.264 Baseline 3.0 is supported by the iPhone 3G and later, and by the second-generation iPod touch and later. If the application must run on earlier iPhone or iPod models, it is necessary to use H.264 Baseline 3.0 for compatibility.
Baseline Profile 3.0 or 3.1, or Main Profile 3.1, can be used for the iPhone 4 and later, the iPad, and the Apple TV 2 and later. (The Baseline profiles can be used for small screens only, or for both large and small screens, while the Main profile should be used for large-screen devices.) A frame rate of 10 fps is suggested for video streams below 200 kbps, 12 to 15 fps for streams around 300 kbps, and 29.97 fps for all other streams.
For the audio, either HE-AAC or AAC-LC stereo, or MP3 (MPEG-1 Audio Layer 3) stereo, should be used. An audio bitrate of 40 kbps and a sample rate of 22.05 kHz are suggested in all cases. (Apple Inc., 2011)
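The frame-rate guidance above can be captured in a small helper, shown here as a sketch (the thresholds are taken directly from the text; picking 15 fps for the 12-15 fps band is an arbitrary choice within the stated range):

```python
def suggested_frame_rate(bitrate_kbps):
    """Frame-rate guidance paraphrased from the encoder settings above:
    10 fps below 200 kbps, 12-15 fps around 300 kbps, and 29.97 fps
    otherwise. Picking 15 fps for the middle band is an arbitrary
    choice within the stated range."""
    if bitrate_kbps < 200:
        return 10.0
    if bitrate_kbps <= 300:
        return 15.0
    return 29.97

print(suggested_frame_rate(150), suggested_frame_rate(300),
      suggested_frame_rate(800))
```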
Table: Encoder settings for iPhone, iPod touch, iPad, and Apple TV, 16:9 aspect ratios (Apple Inc. 2011.)
Table: Encoder settings for iPhone, iPod touch iPad, and Apple TV, 4:3 aspect ratios (Apple Inc. 2011.)
Table: Additional encoder settings for iPad and Apple TV only, 16:9 aspect ratios (Apple Inc. 2011.)
Table: Additional encoder settings for iPad and Apple TV only, 4:3 aspect ratios (Apple Inc. 2011.)
Codec is an abbreviation of coder-decoder: a codec takes an original data file and converts it into a compressed file. The compressed file contains only part of the data of the original; the codec acts as a translator, deciding what data should go into the compressed version and what gets discarded.
AAC (Advanced Audio Coding) is a digital audio format that has largely replaced the format that dominated digital audio for a long time, MP3. AAC gives better sound quality than MP3. Although AAC is often thought of as Apple's proprietary audio format, it was in fact developed by a group of organisations including Sony, AT&T, Nokia, and Dolby; because of this, AAC can be played on non-Apple devices as well, such as Android phones, the Sony PlayStation 3 and PSP, the Microsoft Zune, the Nintendo Wii, and many others. AAC is a lossy format like MP3, which means that when it compresses audio files for transmission some data is removed, ideally without affecting the listening experience; because of this compression, AAC is not identical to CD quality. Like MP3, AAC quality is measured by bitrate, with the same common rates of 128 kbps, 192 kbps, and 256 kbps. (Apple Inc., 2009)
H.264 is a new-generation video compression format, also called MPEG-4 AVC. It is used in HD systems such as HD DVD, HDTV, and Blu-ray, and it can also be used on portable devices requiring lower resolutions, such as the Apple iPod and Sony PSP. It gives better quality than MPEG-2 and MPEG-4 (i.e., DivX or XviD). Apple has officially adopted the H.264 format for QuickTime. H.264 is the main video format for Blu-ray recorders and camcorders. x264 is a free encoder for H.264, which is also sometimes referred to as MP4; for the H.264 format, MP4 is the best container choice. (H264info.com, 2010)
Chapter 5: Challenges Associated with Video and Video Streaming
5.1 Sources of Video Quality Issues
Quality issues in video are not rare for mobile users. They are caused by the service network, especially in older telephony networks, which were not good enough for the transmission of video. Restricted network bandwidth, and protocols that were not designed with video transmission in mind, caused many issues in the quality of mobile video and its display on devices; these issues still exist with today's network technology, though not to the extent they did earlier. The four major issues associated with video quality are:
Issue while creation of video.
Issue while transcoding.
Issue while transmission of video.
Issue while displaying video.
5.1.1 Issue while creation of video
Today anyone can make a video with their own recording device, whether a handycam or a mobile phone with a good camera. Professional video recording is done with all the required equipment and professional cameras, producing the videos widely made by movie studios or television networks, with trained camera operators and technicians to handle the cameras. Even with all this care, small problems can still creep into a video; they may not spoil the viewing experience, but they do affect the quality standard of the video. However, most videos viewed nowadays are created by users with portable recording devices that do not produce high-quality video, because they use low-resolution sensors, and the camera lens can also affect the quality. The environment can play a part too: rain during a football match, for example, affects the camera lens with water droplets and condensation. User-recorded video can also be shaky, since a human hand cannot be held perfectly steady; many devices come with anti-shake algorithms, but these are still not good enough. (Dialogic Corporation, 2009)
5.1.2 Issue while transcoding
In the case of user-created video, the video is compressed as it is recorded so that it can be transmitted efficiently over the network. There are strong algorithms for retaining video quality, but some loss of quality is inherent in every compression step, and retaining the best quality requires a lot of processing power. Video uploaded from a mobile device does not use the most powerful algorithms, because they require more processing power, which shortens the device's battery life. The same algorithm used for encoding must be used for decoding, and decoding is necessary before the video can be played. Today there are many devices using various encoding algorithms, and almost every video-playing device supports several compression algorithms; even so, between capture and display it is often necessary to translate a video from one algorithm to another, a translation known as transcoding. Repeated encoding and decoding loses video quality, and in today's network communication a video is very often transcoded several times, which inevitably affects its quality despite every effort to retain the original.
5.1.3 Issue while transmission of video
Loss of information is most common during transmission of video over a network, and it is very common in today's IP networks, where packets are lost or delayed; a packet that is delayed too long is effectively lost. Jitter can also affect video quality by interfering with the algorithms that buffer the content over the network. Packet loss and jitter have long existed in voice networks, but they matter less there, because human listeners naturally recover from small gaps in speech. In a video stream, by contrast, any glitch is visible, since video communication is more sensitive to loss than audio communication, and such losses affect both the experience and the viewer's understanding.
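The jitter this paragraph refers to is usually quantified with the RFC 3550 interarrival-jitter estimator, sketched here in Python with invented transit times:

```python
def update_jitter(jitter, transit_prev, transit_now):
    """One step of the RFC 3550 interarrival-jitter estimator: the
    estimate moves 1/16 of the way toward each new absolute
    difference in packet transit times."""
    d = abs(transit_now - transit_prev)
    return jitter + (d - jitter) / 16.0

# Transit times (arrival time minus RTP timestamp) for consecutive
# packets, in timestamp units; the values are invented for this sketch.
transits = [100, 108, 104, 120]
jitter = 0.0
for prev, now in zip(transits, transits[1:]):
    jitter = update_jitter(jitter, prev, now)
print(round(jitter, 3))
```

The 1/16 gain smooths out single outliers, so the estimate reflects sustained variation in packet timing rather than one late packet; receivers report this value back in RTCP receiver reports.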
5.1.4 Issue while displaying video
Video quality can be affected by the device on which the user is watching, even when the actual quality of the video is good. A device with a poor display spoils the experience, although today screen size and resolution are much less of a problem, as better and HD displays exist even for small devices. Earlier mobile phones, however, were designed more for voice than for video, where video needs more power, a stronger processor, and lo