A literature review is the process of locating and evaluating existing resources relevant to a research topic. Reading intensively in the chosen topic area is essential, but the task can prove daunting if researchers do not approach it in a systematic way. The continuing number of high-profile Internet security breaches reported in the mass media shows that, despite an emphasis on security processes, there is still a gap between theory and practice.
Not only is there a need to develop better software engineering processes, but theoretical security improvements also need to find their way into real systems. Software design patterns are defined as “descriptions of communicating objects and classes that are customized to solve a general design problem in a particular context”. As software design patterns have proven their value in the development of production software, they are a promising approach to help in both the theoretical development and practical implementation of better security processes.
First, most software developers have only a limited knowledge of security processes, and patterns are a proven way to improve their understanding. Second, patterns work against “reinventing the wheel” by promoting the learning of best practices from the larger community, saving time, effort, and money through easily accessible and validated examples. Third, code can be reused, since the same security patterns arise in many different contexts.
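To make the idea concrete, the following is a minimal sketch of one well-known security pattern, a protection proxy: a wrapper that enforces an access check before delegating to the real object. The class names, roles, and policy here are illustrative examples, not taken from any specific pattern catalog.

```python
# Protection-proxy sketch: the proxy enforces an access-control check
# before forwarding calls to the protected service object.

class PayrollService:
    """The real subject, holding sensitive behavior."""
    def view_salary(self, employee):
        return f"salary record for {employee}"

class PayrollProxy:
    """Same interface as the real subject, plus an access check."""
    def __init__(self, service, allowed_roles=("hr", "admin")):
        self._service = service
        self._allowed = set(allowed_roles)

    def view_salary(self, employee, caller_role):
        if caller_role not in self._allowed:
            raise PermissionError("access denied")
        return self._service.view_salary(employee)

proxy = PayrollProxy(PayrollService())
print(proxy.view_salary("alice", caller_role="hr"))  # allowed
```

Because the proxy shares the subject's interface, the access check can be added to an existing design without changing client code, which is exactly the kind of reusable, context-independent solution the pattern literature promotes.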
Investigating existing resources in our area of research will generally cover three areas:
- “Exploratory investigations, as part of the development and evaluation of possible topics in an area
- Investigation in some depth, sufficient to support a formal research and dissertation proposal
- Complete research that is described in the ‘literature / research’ section of the dissertation.” [from Writing the Doctoral Dissertation, To Author names]
2. Related Research Work Available:
Wireless local area networks (WLAN) and wireless personal area networks (WPAN) are increasingly being used to implement VoIP services. The main motivations for using these architectures are user mobility, setup flexibility, increasing transmission rates, and low cost; however, this convergence depends on the resolution of several technical problems.
Supporting reliable real-time service is one of the major concerns for wide deployment of VoIP in these wireless IP-based networks, and security is now receiving the attention of researchers. The problem with securing WLAN and WPAN is that security does not come for free; security and efficiency are conflicting requirements. The introduction of a security mechanism such as the IPSec encryption engine to address these issues directly impacts the speech quality of established calls and the channel capacity.
Moreover, widely deployed radio technology standards such as IEEE 802.11 and Bluetooth, used to achieve wireless connectivity, have several constraints when delivering real-time traffic, such as transmission errors on the channel, which introduce delay and loss and, combined with the impact of security mechanisms, can lead to low-quality VoIP calls. Although these technologies offer some security mechanisms, they have flaws which need to be addressed by an additional level of security. In this paper we focus on the IPSec protocol to achieve data secrecy, due to its wide deployment and its implementation of many encryption algorithms.
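The bandwidth cost that IPSec imposes on small VoIP packets can be sketched with simple arithmetic. The calculation below assumes a G.729 codec (20 bytes of speech per 20 ms packet) and typical textbook sizes for ESP in tunnel mode with AES-CBC and HMAC-SHA1-96; actual overhead depends on the negotiated algorithms and mode.

```python
# Sketch: per-packet overhead of IPSec ESP (tunnel mode) on a G.729
# VoIP stream. Header/trailer sizes are typical values, not measurements.

def esp_tunnel_size(payload, block=16, iv=16, esp_hdr=8, new_ip=20, icv=12):
    # ESP trailer: pad the payload (plus pad-length and next-header
    # bytes) up to the cipher block size, then append the ICV.
    inner = payload + 2
    padded = ((inner + block - 1) // block) * block
    return new_ip + esp_hdr + iv + padded + icv

voice = 20                       # G.729: 20 ms of speech per packet
plain = 20 + 8 + 12 + voice      # IP + UDP + RTP + payload = 60 bytes
secured = esp_tunnel_size(plain)

pps = 50                         # one packet every 20 ms
print(plain * 8 * pps / 1000)    # 24.0 kbit/s without IPSec
print(secured * 8 * pps / 1000)  # 48.0 kbit/s with ESP applied
```

Under these assumptions the secured stream consumes roughly twice the bandwidth of the plain one, which illustrates why encryption overhead directly affects channel capacity for small real-time packets.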
In recent decades, information technology based on computer networks has come to play an essential role in many areas of human activity. Tasks of great importance are assigned to these networks, such as the storage, communication, and automated processing of information. The confidentiality level of the processed information can range from private and commercial to military and state secrets.
A breach of information secrecy, integrity, or availability can harm the information's owner and have serious undesirable consequences. Hence the problem of information security arises. Many organizations and companies develop security facilities that require significant resources. In addition, the impossibility of creating a completely protected system is a recognized fact: a system will always contain faults and “gaps” in its implementation.
To protect computer systems, familiar mechanisms such as identification and authentication, methods for delimiting and restricting access to data, and cryptographic techniques are applied.
But they have the following drawbacks:
• Disclosure by internal users with malicious intent;
• Complexity of access separation caused by the globalization of data sources, which erases the difference between “internal” and “external” subjects of the system;
• Reduced efficiency and increased communication complexity due to access-control methods on data sources, for example in e-commerce;
• Ease of password discovery through analysis of ordinary users' behavior patterns.
Hence, monitoring and audit systems are used alongside these methods; among them are Intrusion Detection Systems (IDS).
IDS are generally divided into systems that detect previously known attacks (misuse detection systems) and anomaly detection systems that register deviations of the computer system's behavior from its usual (characteristic) activity. In addition, IDS are divided by data source into network-based and host-based categories. Network-based IDS examine network dataflow, protecting its participants while having almost no effect on their performance. Network-based systems do not use data about processes on individual workstations.
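The anomaly detection approach described above can be sketched very simply: learn a statistical baseline of normal activity, then flag observations that deviate strongly from it. The metric (packets per second) and the k-sigma threshold below are illustrative choices; real systems use richer traffic features.

```python
# Minimal anomaly-detection sketch: flag traffic samples that deviate
# strongly from a learned baseline (deviation detection, not
# signature/misuse matching).
import statistics

def train_baseline(samples):
    # Learn the "usual" behavior from known-normal observations.
    return statistics.mean(samples), statistics.stdev(samples)

def is_anomalous(value, mean, stdev, k=3.0):
    # Classic k-sigma rule: anything far from the baseline is flagged.
    return abs(value - mean) > k * stdev

normal_pps = [100, 98, 105, 102, 97, 101, 99, 103]
mean, stdev = train_baseline(normal_pps)
print(is_anomalous(101, mean, stdev))   # ordinary load -> False
print(is_anomalous(500, mean, stdev))   # sudden flood  -> True
```

The contrast with misuse detection is visible here: no attack signature is stored at all, so the detector can flag novel attacks, but at the cost of false alarms when legitimate behavior shifts.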
A firewall is a combination of hardware and software used to implement a security policy governing the flow of network traffic between two or more networks. In its simplest form, a firewall acts as a safety barrier to control traffic and manage connections between internal and external network hosts. The actual means by which this is achieved varies, ranging from packet filtering and proxy services to stateful inspection methods.
A more sophisticated firewall may hide the topology of the network it is employed to protect. Firewalls have proven to be useful in dealing with a large number of threats that originate from outside a network. They are becoming ubiquitous and essential to the operation of the network. The constant growth of the Internet, coupled with the increasing sophistication of attacks, however, is placing further stress and complexity on firewall design and management. [Subrata Acharya, Jia Wang, Albert Greenberg 2006]
Furthermore, the need to deal with large and varied sets of security policies and rules imposes additional load on firewalls, making firewall performance highly critical to enforcing the network security policy. In this context, the defense a firewall provides depends not only on the policies it is configured to enforce but, equally importantly, on the speed at which it enforces them. Under attack or heavy load, firewalls can easily become a bottleneck. As network size, bandwidth, and the processing power of networked hosts continue to increase, there is high demand for optimizing firewall operation for improved performance. [Subrata Acharya, Jia Wang, Albert Greenberg 2006]
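The performance concern above follows from the usual first-match evaluation model: each packet is compared against the rule list in order until one matches, so a long policy makes every unmatched comparison a direct cost. The rule fields and sample policy below are illustrative, not a real firewall's configuration.

```python
# First-match rule evaluation sketch: packets are compared against
# rules top to bottom; the first matching rule's action applies, and
# an implicit default-deny covers everything else.

RULES = [
    # (source-address prefix, destination port, action)
    ("10.0.0.",  22,  "deny"),
    ("10.0.",    80,  "allow"),
    ("",         443, "allow"),   # empty prefix matches any source
]
DEFAULT = "deny"

def filter_packet(src_ip, dst_port):
    for prefix, port, action in RULES:
        if src_ip.startswith(prefix) and dst_port == port:
            return action
    return DEFAULT                # fell through the whole list

print(filter_packet("10.0.0.5", 22))      # deny  (first rule)
print(filter_packet("192.168.1.9", 443))  # allow (third rule)
print(filter_packet("192.168.1.9", 25))   # deny  (default)
```

Note that the default-deny case is the most expensive: it pays one comparison per rule, which is why heavy floods of non-matching traffic can push a large policy toward becoming a bottleneck.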
Multi-dimensional firewall research has led the community to focus on developing various optimizations to make firewalls more efficient and stable. In spite of significant progress in the design of firewalls, techniques for firewall optimization remain static and fail to adapt to the ever-changing dynamics of the network. This is frequently due to their failure to take into account the traffic characteristics seen by the firewall, such as source and destination, service requests, and the resulting action taken by the firewall in response to these requests.
Moreover, current firewall designs do not support adaptive anomaly discovery and countermeasure mechanisms. As a result, they run the risk of becoming unstable under attack. The goal of this work is to address the above failings and develop a sound and effective toolset to accelerate firewall operation and adapt its performance to dynamically changing network traffic characteristics.
Achieving this goal, however, is difficult, given the number of policies and security rules a firewall has to enforce for an enterprise network. In addition, there is a need to preserve high policy integrity. This is further compounded by the limited resources of firewalls relative to the increased ability of the network to process and forward traffic at very high speed. [Subrata Acharya, Jia Wang, Albert Greenberg 2006]
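One traffic-adaptive optimization of the kind discussed above can be sketched as follows: count how often each rule is hit and periodically reorder the list so the most frequently matched rules are checked first, lowering the average matching cost. This sketch assumes the rules are disjoint (no packet matches more than one), so reordering cannot change the policy's decisions; real policies with overlapping rules need a dependency analysis before any reordering.

```python
# Hit-count-driven rule reordering sketch. Rules here are plain strings
# standing in for real match predicates; the traffic mix is invented.

from collections import Counter

hits = Counter()

def match(rules, packet_key):
    # First-match lookup that also records which rule fired.
    for i, rule in enumerate(rules):
        if rule == packet_key:      # stand-in for a real match test
            hits[rule] += 1
            return i                # comparisons spent = i + 1
    return None

def reorder(rules):
    # Most-hit rules first; ties keep their original relative order.
    return sorted(rules, key=lambda r: -hits[r])

rules = ["ssh", "http", "dns"]
traffic = ["dns"] * 90 + ["http"] * 9 + ["ssh"]
for pkt in traffic:
    match(rules, pkt)

rules = reorder(rules)
print(rules)    # ['dns', 'http', 'dns'-dominated order: dns first]
```

After observing the skewed traffic, the dominant "dns" rule moves to the front, so subsequent lookups for the common case cost one comparison instead of three.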