
Recent Advances in Circuits, Communications and Signal Processing

Tarek FRIKHA, Nader BEN AMOR, Mohamed Ramzi Ben Yemna, Jean-Philippe DIGUET*, Mohamed ABID
CES-Laboratory, National Engineering School of Sfax, Sfax University, Sfax, Tunisia
*Lab-STICC, Université de Bretagne Sud, Lorient, France

Abstract: Embedded multimedia applications, especially video ones, are growing rapidly. They are developed not only for PCs but also for embedded systems such as game consoles, smartphones and touchpads. The limited resources of embedded systems, and in particular the network communication between emitter and receiver platforms, require adaptation to varying parameters, especially environmental noise. To address this type of problem, we test an adaptive video application on an FPGA platform: the application sends images over the network from an ML507 platform to a PC. In this paper, we describe the implementation of the application and the adaptation of its data transfer to the network conditions.

Keywords: Network adaptation, embedded system, FPGA, video adaptation, systems communication.

1. Introduction

Multimedia embedded applications are expanding the computer science domain. Watching an HD video or a 3D movie is now possible not only on a 3D TV but also on small portable systems such as smartphones and tablets. The design of such systems faces new challenges due to the limited available resources and the fluctuations of the external environment (noise, bandwidth variations, available energy, etc.).

Nowadays, network video diffusion is becoming more popular.
The improvement of connectivity and bandwidth makes fast access to video data possible, at various qualities ranging from mobile formats to full HD. Despite the significant increase in data rates, the shared network has a limited capacity for transferring high-definition, high-resolution video. To remedy this limitation, we need to design an adaptive multimedia processing system [1].

In this paper, we present the design, implementation and optimization of an adaptive MJPEG compressed-video transmission system over an IP network for connected mobile terminals. This work validates four different axes:
- realization of an MJPEG streaming video coder;
- adaptation of the coder to the available bandwidth;
- realization of a motion detection application;
- adaptation of the application layer to respect execution-time constraints.

The paper is organized as follows. Section 2 describes related works. Section 3 presents the embedded multimedia system, together with the application design and the adaptive video diffusion application. Section 4 discusses the obtained results. Finally, in Section 5, we conclude the paper with a brief outlook on future works.

ISBN: 978-1-61804-164-7

2. Related works

In this section, we detail related work on adaptation techniques. Multiple works have addressed adaptation in the application layer and in the OS layer.

A first family of techniques selects a configuration whose criterion is the minimization of resource use (CPU and bandwidth). Using such a technique for specific critical systems requires formal verification techniques that ensure the chosen configuration satisfies the constraints specific to this type of system. The use of formal methods is, however, a difficult part of process development, as it often faces a combinatorial explosion of the number of states to take into account.

The solution proposed in [3] searches for the best encoder settings (number of reference images, size of the search window, etc.) to achieve a proportional balance between video quality and power consumption.

The author of [4] presents an application-layer adaptation approach based on the modification of certain application parameters to manage the desired quality of service. This approach was validated by simulation, using high-level 3D synthesis as a case study, but it was not validated on an embedded system.

The approach of [5] inserts an adaptation layer between the hardware, OS and application layers for reconfigurable multimedia embedded systems. It improves the quality of service while respecting system constraints (execution time, lifetime, QoS) and user preferences (minimum QoS level, lifetime). Although this multi-level approach improves adaptation, it remains very limited at the application level.

In [6], the author added the data rate as a constraint and was able to auto-generate reconfiguration settings for the H.264 application.
However, this adaptation technique has some disadvantages: it supports only two constraints and a limited number of configuration parameters, without taking the stability of the system into account.

In [7], the author proposes an adaptive technique based on the coordination between resource allocation and dynamic adaptation, in response to changes in the environmental resources of a multitasking mobile system. Variation can be caused by changes in resource availability, due to the changing environment and the mobility of the system, or by a varying number of running applications. The proposed adjustment supports three types of variation: changes in the number of tasks, changes in the available network bandwidth, and changes in the energy level available in the system battery. However, this technique was tested with a configuration database that is specific to, and generated for, the H.264/AVC application.

The above techniques rely on adjustments of either the application layer or the hardware layer; these adjustments are local to the system board. In this paper, we present an approach based on network adaptation built on an implemented Linux OS, which manages the hardware and application layers. To validate the described approach, we set up a mixed demonstrator with an embedded platform as emitter and a PC as receiver.

3. Embedded multimedia system

In our work, we propose to design an adaptive strategy that includes adaptive data transfer over the network. The adopted system runs the MJPEG codec, which is a complex application.

To stream video on the Internet, two solutions are possible.
As the capacity of the Internet is limited and does not allow broadcasting with sufficient quality, one must either:
- reduce the data rate (and therefore the quality) of the video sequences while maintaining real time, or
- give up real-time broadcasting.

In our work, we retain the second solution: it achieves good results without overloading the system, and it suits videocast applications. In the first solution, each image is displayed at its time of reception, which requires a good connection to ensure correct reception.

Our system includes three layers: software, hardware and the embedded operating system layer, which links the hardware and the applications while performing a variety of tasks.

3.1. Bandwidth adaptation

We start by explaining the operation of the server and the videocast mechanism on the network. TCP adjusts its transfer speed to the available bandwidth, so MJPG-streamer cannot send the next frame if the current frame has not been completely transferred. A new image is captured while the current image is being sent. The image to be sent is stored in RAM, and the exchange between the input module and the output module is synchronized by mutual exclusion. In this way, all modules have quick access to the current image stored in RAM, without the risk of it being overwritten by the input plugin while the output plugin is still copying it. The output plugin waits for an HTTP connection from a client (a web browser such as Firefox, Safari, Opera, etc.) to serve the MJPEG stream. It first sends the HTTP headers that tell the browser what type of data it will receive, then transmits the current frame from its location in system memory. The TCP socket does not report anything until the data has been transferred (or until an error has occurred). Once the image transfer is done, the next frame is sent.
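The mutual-exclusion exchange described above can be sketched in C with a POSIX mutex. This is a minimal illustration of the synchronization pattern, not MJPG-streamer's actual code; all names here (frame_buffer_t, put_frame, get_frame_copy) are hypothetical.

```c
#include <pthread.h>
#include <string.h>
#include <stdio.h>

/* Shared frame buffer: the grabber overwrites the current frame in
   RAM, and the output side copies it under a mutex before sending,
   so a frame can never be overwritten while it is being copied. */

#define FRAME_MAX 4096

typedef struct {
    unsigned char data[FRAME_MAX];
    size_t size;
    unsigned seq;                 /* frame sequence number */
    pthread_mutex_t lock;
} frame_buffer_t;

static frame_buffer_t fb = { .size = 0, .seq = 0,
                             .lock = PTHREAD_MUTEX_INITIALIZER };

/* Input side: store a newly captured frame (overwrites the old one). */
void put_frame(const unsigned char *data, size_t size)
{
    pthread_mutex_lock(&fb.lock);
    memcpy(fb.data, data, size);
    fb.size = size;
    fb.seq++;
    pthread_mutex_unlock(&fb.lock);
}

/* Output side: take a private copy of the current frame, then send
   it outside the critical section, so the grabber is never blocked
   by a slow TCP client. Returns the frame's sequence number. */
unsigned get_frame_copy(unsigned char *out, size_t *size)
{
    pthread_mutex_lock(&fb.lock);
    memcpy(out, fb.data, fb.size);
    *size = fb.size;
    unsigned seq = fb.seq;
    pthread_mutex_unlock(&fb.lock);
    return seq;
}
```

Copying inside the critical section and transmitting outside it is what gives all modules quick access to the current image in RAM, as described above.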


4. Experimental results

To validate the adaptation technique designed for multimedia embedded systems, we adopt M-JPEG video compression as a case study for the adaptive mechanism. We start by varying the number of frames per second, then present the compression-ratio steps, before proceeding to image-size variations. Finally, we present the proposed approach for outsourcing the motion detection algorithm and the results obtained in these tests.

Our tests first rely on the MJPG-streamer server configuration settings to vary the number of frames per second (fps), the JPEG compression rate and the image resolution. Then we outsource the motion detection algorithm from the embedded platform to the terminal browser. Information is collected using the Linux pmap command, which reports the memory use of a process identified by its PID, as well as information about its shared libraries.

Figure 6 illustrates the role of the web server application in broadcast video.

Figure 6. The role of the web server application in broadcast video

4.1. Frames per second variation

Using a rate greater than 15 frames per second is a challenge for our server with a variable number of clients. The compression rate and image resolution settings must account for the number of clients: our application is not a broadcast but a multicast one. Figure 7 shows an fps variation using our shell script, which detects changes in the data rate, detects saturation and stops it. We use a wired connection with less than 1100 Kb/s of upload capacity, and we call the connection "saturated" when the emission peak reaches this limited capacity.

Figure 7. FPS variation adaptation
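The saturation-driven frame-rate policy measured in Figure 7 can be sketched as a small control function. The numbers (a link capacity around 1100 Kb/s, rates between 5 and 25 fps) come from our test setup; the function itself (adapt_fps) and its recovery threshold are a hypothetical illustration, not the shell script actually used.

```c
/* Saturation-driven frame-rate control: start at 25 fps and drop to
   5 fps whenever the measured upload rate reaches the link capacity,
   so a large number of clients can be served without saturation. */

#define FPS_MAX 25
#define FPS_MIN 5

int adapt_fps(int current_fps, int measured_kbps, int capacity_kbps)
{
    /* "Saturated": the emission peak reaches the link capacity. */
    if (measured_kbps >= capacity_kbps && current_fps > FPS_MIN)
        return FPS_MIN;            /* fall back to 5 fps */
    /* Comfortable margin again: restore the nominal rate
       (the half-capacity threshold is an assumption). */
    if (measured_kbps < capacity_kbps / 2 && current_fps < FPS_MAX)
        return FPS_MAX;
    return current_fps;
}
```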
Once the server starts at 25 fps (Figure 7), the emission rate rises as the number of connected clients increases, until reaching saturation. We then reconfigure the application by reducing the frame rate to 5 fps, which serves the large number of connected clients while avoiding saturation problems. Restarting the server with this setting allows a substantial number of clients to stay connected.

4.2. Variation of compression

As hardware, we used a Logitech webcam integrating hardware JPEG compression; the flow is created by combining the instantly captured images in the buffer to yield a broadcast. Varying the compression ratio requires calling the functions jpeg_create_decompress and then jpeg_create_compress: each captured image must be decompressed and then recompressed.


These two functions belong to libjpeg, the JPEG image management library. The operation must be repeated for every buffered image, which puts a huge load on the board's CPU and RAM without saturating the network; the processor cannot run the same algorithm for every image to be served. Each compression-decompression operation takes between 3.4 s and 4.1 s on a 1.8 GHz processor with 512 MB of RAM. Thus, the lower network occupancy rate is not due to the new size of the images: our JPEG encoder is purely software.

4.3. Change of image resolution

Resizing the media stream is achieved with the -r (input resolution) parameter. Once the bandwidth is saturated, we switch to the low "QVGA" resolution (320x240). Figure 8 shows the dynamic change of the server settings.

Figure 8. Resolution variation adaptation

The release of network bandwidth due to the resolution change, which provides more bandwidth for new users, was represented in Figure 7.

5. Conclusion

The growing number of complex multimedia applications has raised new challenges in the design of embedded mobile multimedia systems. The field of embedded systems is one of the areas currently in vogue, and its development is tied to the evolution of hardware components and the diversity of their uses. Furthermore, maximizing QoS presents an inherent conflict in the design of embedded systems that must satisfy their users.

In this paper, we designed the architecture of a videocast system, then focused on the embedded Linux kernel architecture. We integrated the M-JPEG video broadcast application on embedded Linux and added rules to adapt the bandwidth and the execution time of the motion detection code. As perspectives, we plan to process video streams outside our system, in a cloud environment, and retrieve the results on the FPGA platform. We can also integrate hardware JPEG compression/decompression accelerators into the system architecture. Finally, we can adapt the data transmission to RTP.

REFERENCES
[1] T. Frikha, N. Ben Amor, K. Loukil, A. Ghorbel, M. Abid, J.-P. Diguet, "Hardware accelerator for self adaptive augmented reality systems," HPCS 2012.
[2] T. Zhang, S. Pande, A. Valverde, "Tamper-resistant whole program partitioning," Conf. on Languages, Compilers, and Tools for Embedded Systems.
[3] S. Lin, P. C. Tseng, P. Lin, L. G. Chen, "Multimode content-aware motion estimation algorithm for power-aware video coding systems," Proc. IEEE Workshop on Signal Processing Systems (SIPS 2004), pp. 239-244, 2004.
[4] N. Ben Amor, "Approche de conception de processeurs de vision embarqués," PhD thesis, ENIS, Tunisia, Dec. 2005.
[5] K. Loukil, N. Ben Amor, M. Abid, "Self adaptive reconfigurable system based on middleware cross layer adaptation model," SSD, Djerba, Tunisia, Apr. 2009.
[6] M. Ben Saïd, "A multi-constraints adaptation technique for embedded multimedia systems," master's thesis, ENIS, June 2010.
[7] L. F. Ye, "Multiprocessor self-adaptive reconfigurable architectures," PhD thesis, May 2011.
