
Abstract

The project is to develop a web conferencing client in pure Java using the Java Media Framework (JMF), together with a conference management server to set up, host and monitor conferences between several clients. Web conferencing is essentially the transmission of video and audio between separate physical locations, achieved here through webcams. Present-day conferencing systems provide a rich array of functions designed to solve complex communication and collaboration problems. The ultimate goal of this project is to explore the feasibility of creating a web-based conferencing application with Java's multimedia technology, the Java Media Framework. The conferencing system uses three data channels: a video channel for video operations, an audio channel for audio operations, and a background text channel for inter-program communication during a session. A user who connects through the program to one or more peers can share his video and audio with all the other conferees currently connected to him. A user connects to another conferee by entering that machine's IP address or DNS name; the system then checks whether an instance of the program is running there. If an instance is found, the connection is accepted, provided that system has not already reached the maximum allowed number of conferees.

PURPOSE: The main purpose of this application is to develop a browser-based web conference using JMF (Java Media Framework). The application lets a client (sender or receiver) hold a conference over audio and video feeds and exchange audio and video files through the conference.

SCOPE: EXISTING SYSTEM WITH LIMITATIONS: Existing conferencing systems developed in Java have difficulty capturing images and audio, because capture requires querying the attached devices. Since Java does not support system-level programming, such systems must rely either on JNI (Java Native Interface) or on a third-party utility that captures the video and stores it in a file; those files are then transferred and displayed on the other side. This limits how much of the conferencing system the programmer can actually build, and the same applies to audio. This paradigm often forces conference systems to wait for resources and to maintain unnecessary queues for conference connections, which can cause the system to hang in deadlocks. Moreover, if the receiving system cannot display video as fast as it arrives from the network, data may be lost and the receiver may again deadlock. To overcome all these problems we propose a new video conferencing system.

PROPOSED SYSTEM: Our system relies on the JMF libraries to capture images and sound, so we are not tied to system-level programs or JNI (Java Native Interface) to query the camera and sound devices. Because we do not depend on fixed file formats to send the data, we also do not need any third-party utility to capture audio or video; the system builds all the required libraries itself (following the visitor pattern). Since the program calculates its frames-per-second (FPS) rate, it automatically shares this information with the other conferees, who use the FPS and network transfer timings to pace the video they send (a minimal pacing sketch follows the feature list below).

FEATURES AND ADVANTAGES OF CONFERENCING USING JMF:
1. HTTP protocol to broadcast and receive audio/video.
2. Broadcasters and receivers are not required to have public IP addresses.
3. Multiple users, each capable of broadcasting to and receiving feeds from many users.
4. Low-cost solution for a continuous video/audio feed.
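To make the pacing idea above concrete, the following is a minimal sketch of how a conferee could measure its displayed frame rate and report it once per second over the background text channel. The class name, the PrintWriter wrapping of the text channel, and the "FPS:<n>" message format are illustrative assumptions rather than the project's actual protocol.

import java.io.PrintWriter;

// Sketch only: counts frames actually rendered and reports the rate each second
// on the background text channel so the sending side can pace its transfers.
public class FpsReporter {
    private final PrintWriter textChannel;   // writer wrapped around the text-channel socket (assumed)
    private int framesThisSecond = 0;
    private long windowStart = System.currentTimeMillis();

    public FpsReporter(PrintWriter textChannel) {
        this.textChannel = textChannel;
    }

    // Call once for every frame that is actually displayed.
    public void frameRendered() {
        framesThisSecond++;
        long now = System.currentTimeMillis();
        if (now - windowStart >= 1000) {
            textChannel.println("FPS:" + framesThisSecond);  // share the measured rate with the peer
            textChannel.flush();
            framesThisSecond = 0;
            windowStart = now;
        }
    }
}

The sending side would read these messages from the text channel and adjust how many frames per second it pushes, which is the pacing behaviour the proposed system describes.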

User Characteristics: Users of this web conference: 1. Client: a client can be a sender or a receiver who interacts with the system. The sender uploads audio and video files, which are stored first on the local system and then sent to the server; the receiver receives the files sent by the sender.

MODULES IN THE SYSTEM: This application contains the following modules:

Sender (JMF Capture Applet)
The steps for developing the capture applet are:
1. A DataSource is created from the webcam source using a MediaLocator.
2. A ProcessorModel is created from the DataSource, a format object specifying the video format, and a FileTypeDescriptor object specifying the output file format.
3. A Processor is created from the ProcessorModel, and the output DataSource is obtained from the Processor.
4. A DataSink object is created, by first creating a MediaLocator for storing the media in a file.
5. Capture of the stream is started and the stream is saved into a file for a specified duration.

This process is repeated until the sender ends the session; a minimal sketch of the capture steps appears below.
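The sketch below walks through capture steps 1-5 with the standard JMF classes. The webcam locator "vfw://0" (a Windows Video for Windows device), the QuickTime output container, the clip file name and the 5-second clip length are assumptions chosen for illustration; the applet itself would run this in a loop, producing one clip per iteration until the session ends.

import javax.media.*;
import javax.media.format.VideoFormat;
import javax.media.protocol.DataSource;
import javax.media.protocol.FileTypeDescriptor;

public class CaptureSketch {
    public static void main(String[] args) throws Exception {
        // 1. DataSource from the webcam; "vfw://0" is a typical Windows VFW locator (assumption).
        MediaLocator camera = new MediaLocator("vfw://0");
        DataSource source = Manager.createDataSource(camera);

        // 2. ProcessorModel: input source, desired track format, output container format.
        Format[] formats = { new VideoFormat(VideoFormat.YUV) };
        FileTypeDescriptor container = new FileTypeDescriptor(FileTypeDescriptor.QUICKTIME);
        ProcessorModel model = new ProcessorModel(source, formats, container);

        // 3. Processor realized from the model; its output DataSource feeds the DataSink.
        Processor processor = Manager.createRealizedProcessor(model);
        DataSource output = processor.getDataOutput();

        // 4. DataSink created from a MediaLocator that points at the clip file.
        MediaLocator clipFile = new MediaLocator("file:clip0.mov");
        DataSink sink = Manager.createDataSink(output, clipFile);
        sink.open();

        // 5. Capture for a fixed duration, then stop and release everything.
        sink.start();
        processor.start();
        Thread.sleep(5000);          // clip length in milliseconds (assumption)
        processor.stop();
        processor.close();
        sink.stop();
        sink.close();
    }
}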

File Upload
The file upload uses the JUpload project. It has two parts: the file upload thread at the client and the upload servlet at the server. The steps for developing the file upload thread are (see Listing 2):
1. Create a socket connection with the server.
2. Create an HTTP POST request and an HTTP head and tail.
3. Create the necessary I/O stream objects.
4. Send the HTTP request to the server, writing the HTTP head, the clip file, and the HTTP tail to the server.
The upload servlet receives the uploaded files; MultipartRequest is a utility class that handles multipart/form-data requests for file uploads.
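The following is a minimal sketch of the client-side upload steps written as a plain socket-based HTTP POST. The host, port, servlet path "/conference/upload", form field name and multipart boundary are placeholder assumptions, and the sketch uses present-day Java I/O conveniences for brevity.

import java.io.*;
import java.net.Socket;
import java.nio.file.Files;

public class UploadSketch {
    public static void upload(File clip) throws IOException {
        String boundary = "----JMFUploadBoundary";                    // arbitrary multipart boundary
        byte[] body = Files.readAllBytes(clip.toPath());              // clips are short, so read fully

        // HTTP "head": opens the multipart/form-data part that carries the clip.
        String head = "--" + boundary + "\r\n"
                + "Content-Disposition: form-data; name=\"clip\"; filename=\"" + clip.getName() + "\"\r\n"
                + "Content-Type: video/quicktime\r\n\r\n";
        // HTTP "tail": closes the multipart body.
        String tail = "\r\n--" + boundary + "--\r\n";

        try (Socket socket = new Socket("localhost", 8080);           // 1. socket connection to the server
             DataOutputStream out = new DataOutputStream(socket.getOutputStream());
             BufferedReader in = new BufferedReader(new InputStreamReader(socket.getInputStream()))) {

            int length = head.length() + body.length + tail.length();

            // 2-3. POST request line and headers over the prepared streams.
            out.writeBytes("POST /conference/upload HTTP/1.1\r\n");
            out.writeBytes("Host: localhost:8080\r\n");
            out.writeBytes("Content-Type: multipart/form-data; boundary=" + boundary + "\r\n");
            out.writeBytes("Content-Length: " + length + "\r\n");
            out.writeBytes("Connection: close\r\n\r\n");

            // 4. Write the HTTP head, the clip file, and the HTTP tail, in that order.
            out.writeBytes(head);
            out.write(body);
            out.writeBytes(tail);
            out.flush();

            System.out.println("Server replied: " + in.readLine());   // e.g. "HTTP/1.1 200 OK"
        }
    }
}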

Receiver (JMF Player Applet): The steps for developing the player applet are:
1. Construct two players from the URLs of the media at the web server: one player for the current clip and the other for the next clip.
2. Start the first player and fetch the next clip with the second player.
3. On the EndOfMediaEvent for clip i, start playing clip i+1. Destroy the visual component for the player of clip i, deallocate that player, create a new player for clip i+2, prefetch clip i+2, and add a ControllerListener. Repeat these steps for subsequent clips.
This makes the playback of clips continuous, as there is little or no delay between subsequent clips. Note that each clip is downloaded in full by the player applet before it is played.
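The following is a rough sketch of the two-player hand-off described in these steps, using the standard JMF Player API. The clip URL pattern (clip0.mov, clip1.mov, ...) and the server address are placeholder assumptions; an actual applet would also add player.getVisualComponent() to its layout and remove it when the player is closed.

import javax.media.*;

public class PlayerSketch implements ControllerListener {
    private Player current;    // plays clip i
    private Player next;       // already realized and prefetched for clip i+1
    private int clipIndex = 0;

    public void begin() throws Exception {
        current = Manager.createRealizedPlayer(locatorFor(clipIndex));
        next = Manager.createRealizedPlayer(locatorFor(clipIndex + 1));
        current.addControllerListener(this);
        next.prefetch();                 // fetch clip i+1 while clip i is playing
        current.start();
    }

    public synchronized void controllerUpdate(ControllerEvent event) {
        if (event instanceof EndOfMediaEvent && event.getSourceController() == current) {
            try {
                current.close();         // releases the visual component and resources of clip i
                current = next;          // clip i+1 becomes the playing clip
                current.addControllerListener(this);
                current.start();
                clipIndex++;
                next = Manager.createRealizedPlayer(locatorFor(clipIndex + 1));
                next.prefetch();         // start fetching clip i+2 in the background
            } catch (Exception e) {
                e.printStackTrace();     // clip i+2 may not be uploaded yet; a real applet would retry
            }
        }
    }

    private static MediaLocator locatorFor(int i) {
        // hypothetical clip naming scheme on the web server
        return new MediaLocator("http://localhost:8080/conference/clip" + i + ".mov");
    }
}

Because the next player is realized and prefetched before it is needed, the switch on EndOfMediaEvent is nearly instantaneous, which is what keeps the playback continuous.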

SOFTWARE REQUIREMENTS:
Presentation Layer: HTML, DHTML, XML
Network Layer: TCP/IP
Web Server Layer: Tomcat 5.5, Servlets, JMF 4.0
Language Specification: J2SE 1.5
Operating Systems: Windows 2000, Windows NT 4.0, Windows 9x
