
Web browsers are used daily by billions of users. With millions of requests being made every second, network traffic keeps increasing, which introduces delays and strains network bandwidth. The World Wide Web (WWW) can only be efficient if web performance improves. Web performance is steadily improving, yet several areas still need attention: as the years pass and the number of users grows, the volume of data grows along with them.

The data consists of several pieces (or collections) of information in the form of text, images, audio and video. The original (raw) data is transmitted over the network: the fetched data is converted into frames and then into bits, transmitted as packets/streams through the Transport Layer, and encoded before the session times out. The encoded data then passes through the Application Layer, i.e. the HyperText Transfer Protocol (HTTP), where it undergoes header compression, followed by content encoding/decoding and header decompression. Finally, the receiver decodes the encoded data so that it reaches its destination before the timeout. This mechanism determines both performance and security; this work focuses on performance.
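As an illustration of the content-encoding step described above, the sketch below shows a minimal Node.js HTTP server that compresses the response body with gzip only when the client advertises support through the Accept-Encoding request header. It uses Node's built-in http and zlib modules as a stand-in; the port number and response body are placeholders, not part of this project.

```js
const http = require('http');
const zlib = require('zlib');

// Placeholder response body; in practice this would be the requested resource.
const body = 'Hello, compressed world!';

const server = http.createServer((req, res) => {
  // Only compress when the client declares gzip support in Accept-Encoding.
  const acceptsGzip = /\bgzip\b/.test(req.headers['accept-encoding'] || '');
  if (acceptsGzip) {
    res.writeHead(200, { 'Content-Type': 'text/plain', 'Content-Encoding': 'gzip' });
    res.end(zlib.gzipSync(body));
  } else {
    res.writeHead(200, { 'Content-Type': 'text/plain' });
    res.end(body);
  }
});

// Port 8080 is an arbitrary choice for this sketch.
server.listen(8080);
```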

Improving web performance requires effective scripts that boost the web through compression: compressing the data reduces the time between sending and receiving, and likewise the time spent uploading or downloading it. JavaScript plays the scripting role here. This work therefore focuses on web compression as a way to optimize the performance of web browsers. To ground the work, the study surveys various existing algorithms, especially LZMA (Lempel-Ziv-Markov chain Algorithm), along with other algorithms related to web compression. The proposed algorithm, the Lempel Ziv BitMasking Hidden Markov (LZBMHM) algorithm, extends LZMA with additional mechanisms and will be used in the later implementation process. A caching process will be used within the compression process, as sketched below.
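As a rough sketch of the intended compress-and-cache flow, the example below caches compressed payloads keyed by resource URL so that repeated requests skip recompression. Since LZBMHM is not yet implemented, Node's built-in zlib deflate stands in for the proposed compressor; the function names and the Map-based cache are illustrative assumptions, not the project's final design.

```js
const zlib = require('zlib');

// Hypothetical cache keyed by resource URL; compressed payloads are reused
// instead of being recompressed on every request.
const compressedCache = new Map();

function compressWithCache(url, rawData) {
  if (compressedCache.has(url)) {
    return compressedCache.get(url); // cache hit: skip compression entirely
  }
  // deflateSync stands in for the proposed LZBMHM compressor.
  const compressed = zlib.deflateSync(Buffer.from(rawData));
  compressedCache.set(url, compressed);
  return compressed;
}

function decompress(compressed) {
  return zlib.inflateSync(compressed).toString();
}

// Usage: the compressed buffer is what would travel over the network.
const payload = compressWithCache('/index.html', '<html>...</html>'.repeat(50));
console.log('compressed bytes:', payload.length);
console.log('round-trip ok:', decompress(payload).includes('<html>'));
```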

The proposed work will later be extended into the proposed implementation. At present, the documentation presents the project overview, its objectives, the block diagram, and a system analysis that covers the problem statement and issues drawn from the literature survey (previous works), where the problem is analyzed and solutions are outlined, followed by the existing and proposed algorithms. The experiments, testing and results in this phase cover the analysis of the problem. The proposed architectures and modules are also presented. The proposed (final) implementation and its results will be shown in the next phase.