World Wide Web Security Computer Science Essay


The terms "Web" and "Internet" are often used synonymously these days. The Internet is a set of interconnected networks, and the Web is one of the applications that runs on top of the infrastructure that this interconnected network provides.

At a general level, the Web can be defined as one of the leading applications of the Internet: a set of inter-linked documents, frequently called web pages, which can be accessed via the Internet using a special type of software called a Web browser. Such inter-linked documents are often kept public and are served from a computer running a special kind of software called Web server software.

When we refer to the web, we are in fact referring to the World Wide Web (WWW), which was conceptualized by Sir Tim Berners-Lee. The WWW is only one of the applications of the Internet, like e-mail applications or applications that transfer files over the Internet. The WWW runs on top of the infrastructure that the Internet provides and has its own set of standards for creating and accessing documents, or web pages.

The World Wide Web ("WWW" or simply the "Web") is a universal information medium which users can read and write through computers connected to the Internet. The name is often mistakenly used as a synonym for the Internet itself, but the Web is a service that operates over the Internet, just as e-mail does. The history of the Internet dates back considerably further than that of the World Wide Web.

The hypertext part of the Web in particular has a complicated intellectual history; notable influences and precursors include Vannevar Bush's Memex, IBM's Generalized Markup Language, and Ted Nelson's Project Xanadu.

The concept of a home-based global information system goes at least as far back as "A Logic Named Joe", a 1946 short story by Murray Leinster, in which computer terminals, called "logics," were in every home. Although the computer system in the story is centralized, the story captures some of the feeling of the ubiquitous information explosion driven by the Web.

The low interest rates in 1998-99 helped increase the amount of start-up capital available. Although a number of these new entrepreneurs had sensible strategies and administrative aptitude, most of them lacked these characteristics but were still able to sell their ideas to investors because of the novelty of the dot-com concept.

Traditionally, the dot-com boom can be seen as similar to a number of other technology-inspired booms of the past, including railroads in the 1840s, radio in the 1920s, transistor electronics in the 1950s, computer time-sharing in the 1960s, and home computers and biotechnology in the early 1980s.

In 2001 the bubble burst, and numerous dot-com startups went out of business after burning through their venture capital and failing to become profitable.

In the aftermath of the dot-com bubble, telecommunications companies had a great deal of overcapacity as many Internet business clients went bust. That, plus ongoing investment in local cell infrastructure, kept connectivity charges low and helped to make high-speed Internet connectivity more affordable. During this time, a handful of companies found success developing business models that helped make the World Wide Web a more compelling experience. These include airline booking sites, Google's search engine and its profitable approach to simplified, keyword-based advertising, as well as eBay's do-it-yourself auction site and Amazon.com's online department store.

This new era also gave rise to social networking websites, such as MySpace and Facebook, which, though unpopular initially, very quickly gained acceptance and became a major part of youth culture.


5.1.1 How the Web Works

The Web, or WWW (World Wide Web) as it is formally known, is based on the client-server software model. In a simple client-server model, one piece of software acts as an information consumer (the client) and another acts as an information producer (the server). The interaction between these two pieces of software is governed by a set of rules that both parties accept; this set of rules is often called a "protocol". The Web is based on this model: Web browsers, like the one you are using right now, are the clients, and the software on the machine that supplies this page is called the Web server.
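
As a simple sketch of this exchange (the host and page names here are hypothetical), the browser sends an HTTP request and the server replies with the page:

GET /index.html HTTP/1.1
Host: www.example.com

HTTP/1.1 200 OK
Content-Type: text/html

<html> ...the requested page... </html>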


Today, the Web and the Internet (Figure 1) allow connectivity from literally everywhere on earth - even ships at sea and in outer space.

Figure 1: Web and Internet Working Flow

Source: http://en.wikipedia.org/wiki/History_of_the_World_Wide_Web


5.2 WEB TECHNOLOGIES


Web technologies is a general term referring to the many languages and multimedia packages that are used in conjunction with one another to produce dynamic web sites. Each individual technology is fairly limited on its own, and tends to require the use of at least one other such technology. We can therefore conclude that all of the components that build up a site are interdependent on one another.

Figure 2:Web Technology Family Tree Model

Source: http://modernl.com/images/diagrams/web-technology-family-tree-large.jpg

HTML (Hyper Text Mark-up Language) is the superglue that holds together every web site. As when constructing a house, you always build a strong foundation first. For every site, HTML is that foundation. HTML is an open standard (i.e. not owned by anyone), which is simple to learn, and requires no fancy (or expensive!) packages to start using it. All you need is something to type with, such as Windows Notepad, and a lot of time and patience.

HTML works on a 'tag' system, where each tag affects the content placed within that tag, e.g.

<TAG>What the tag effects</TAG>.

Even though it is comparatively limited by itself, it is the flexibility of HTML that allows web sites to develop in complexity. Like the foundation of a house, HTML is robust enough to support many kinds of languages integrated within your HTML pages.

5.2.1 DHTML

DHTML (Dynamic HTML) is just what the name suggests: it adds dynamic, moving or changing content to your plain old HTML pages. Think of it as a more advanced version of HTML, although DHTML is in fact not a programming language in itself. DHTML is a broad term used to describe a group of applications; the main ones are described below:

JavaScript: JavaScript is a 'scripting' language. A bit like a script in a feature film, it is used to decide 'what happens next'. This may be a series of screen events, where one event is initiated by the end of another, or it could be a programmed response to a user interacting with the page in some way, e.g. moving their mouse over a link. JavaScript is a complex and powerful language, and may be placed directly inside an HTML page or in a separate JavaScript file.

A minimal example of HTML tags with JavaScript:

<html>
<body>
<script type="text/javascript">
document.write("<h1>Hello World!</h1>");
</script>
</body>
</html>

CSS and CSS-P: CSS (Cascading Style Sheets) is a comparatively new language, intended to expand upon the limited style properties of HTML. Easy to learn and implement, CSS is an excellent way to control the style of your site, such as text styles like size, colour and font.

CSS may also be placed inside the HTML page or in separate files. The true advantage of having all of the style properties for your entire site in one single CSS file is that you may edit that single file to effect changes on the whole site, rather than having to go through each HTML file one at a time. For this reason, it is perhaps the most useful web technology and certainly one of my favourites.

CSS-P (CSS-Positioning) is a sub-set of CSS, and is concerned chiefly with the layout of your HTML pages. It allows the web designer to place any element (text, graphic, etc.) precisely on the screen where they want it, to the pixel.
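
As a brief illustrative sketch (the file name and selectors here are hypothetical), a single stylesheet can carry both ordinary CSS styling and CSS-P positioning; each page would link to it with <link rel="stylesheet" href="style.css">:

/* style.css - one file controlling the style of the whole site */
body { font-family: Arial, sans-serif; color: #333333; }
h1 { font-size: 24px; color: navy; }
/* CSS-P: pin the element with id "logo" exactly 10 pixels from the top-left */
#logo { position: absolute; top: 10px; left: 10px; }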

DOM: The DOM (Document Object Model) allows the designer to access any element in an HTML page, such as an image, layer or table. Every element may be assigned a unique 'id' name to identify it by, e.g.

<TAG ID="MyTag">Content of MyTag</TAG>

When combined with CSS and JavaScript, the DOM may be used to make changes to only "MyTag" and no other element, such as increasing its text size or changing the position of "MyTag" on the screen. JavaScript may also be used to animate such changes to any identified element, for example gradually increasing the size of the text on screen.
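
A short sketch (the element id is hypothetical) of how JavaScript can use the DOM to change a single identified element:

<div id="MyTag">Content of MyTag</div>
<script type="text/javascript">
// look up the element by its id, then change it through the DOM
var el = document.getElementById("MyTag");
el.style.fontSize = "24px";     // increase the text size of MyTag only
el.style.position = "absolute";
el.style.left = "100px";        // move MyTag across the screen
</script>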

Every browser has its own DOM, and this is often used to determine which browser the visitor is using. A particular action may be carried out if the visitor is using NS6, for example, and ignored if the visitor is using IE5.

5.2.2 Flash

Unlike all of the above-mentioned technologies, Flash is not an open technology. Flash is owned by Macromedia, and they produce the application needed to produce Flash content for your web site. In addition, a web browser on its own will not display Flash content; a 'plug-in' must be downloaded (for free) and installed by the visitor to your site before they can view it properly. These plug-ins (mini-applications) must also be updated to ensure that they can display the most up-to-date Flash content on the Net.

While I have discussed the drawbacks of Flash, there are a lot of positives. Flash is now one of the only true multimedia packages for the Net, providing support for animation, video, sound and truly impressive interactive web site experiences. The one real misfortune about Flash is that it may never become a standard technology, which is a real shame. That is not to say that its future is not bright, with many web users having the plug-in installed, and with many web designers citing Flash as the application of choice. See Figures 3 and 4 for samples.


Figure 3: Drawing samples

Source: http://www.adobe.com/devnet/flash/samples/


Figure 4: Media samples

Source: http://www.adobe.com/devnet/flash/samples/

5.2.3 The Backend: CGI and Perl

If HTML forms the base of the house, with the different flavours of DHTML and Flash forming the structure of the house above ground (the bits you see), then CGI (Common Gateway Interface) would form the functional working parts of the house that you cannot see, such as the plumbing, electrical wiring and heating.

So CGI is concerned with the working parts of your web site, which may include hit counters, form processors or web statistics tools. There are numerous different languages that may be termed 'CGI', the most popular of which is Perl. Perl is the language of choice for adding function to your site. HTML, DHTML and Flash are all very well for controlling the look and presentation of your site, but Perl is needed to run the mechanics of it behind the scenes.

We can distinguish between the presentation (the bit the visitor sees) and the functional (the bit they don't see) parts of a site using the terms 'front-end' and 'back-end' respectively. So, Perl is often referred to as a back-end technology. It operates on the server that hosts your site, rather than within the browser window of a visitor to your site.

5.2.4 The Future: SVG and XML

The character of web technology is constantly changing to meet the demands of web users and web designers alike. It is hard to predict with any real accuracy what will become the norm in future developments, and what will fade into obscurity.

As well as being responsible for HTML and CSS, the W3C is also building a potential rival to Flash, SVG (Scalable Vector Graphics). SVG is currently in its infancy, but when finished it will offer web designers an alternative to Flash while possessing many of its versatile qualities. Like HTML and CSS, SVG will be an open standard, with nobody owning exclusive rights to produce the packages used to generate SVG content. Currently no browser supports it natively, and a plug-in from Adobe is required to run sites using SVG. With the W3C behind it, however, this situation is expected to change, and SVG is sure to become more commonplace in the future.

XML (Extensible Mark-up Language) has been around a little longer than SVG. It is more complicated to define, however, and its use is still in its early stages. If HTML is a 'mark-up' language, XML is a language that allows you to make your own 'mark-up' languages. In basic terms, this allows you to create your own tags to replace the limited set that HTML has to offer. In theory this could mean a vast number of user-defined mark-up languages, which would create obvious practical problems for compatibility. At present even the latest browsers offer limited support for XML, and its use is often confined to interacting with databases in combination with Perl.
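
As a brief sketch, a user-defined XML document might look like this (all tag names here are invented for illustration):

<?xml version="1.0"?>
<book>
  <title>Internet Security</title>
  <author>A. Student</author>
  <price currency="USD">29.95</price>
</book>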

SELF CHECK 5.2

Describe the back-end methods used in web technologies.

Illustrate how XML works.


5.3 BROWSER PRIVACY

Numerous Web sites record which pages your browser visits while you are within their Web site, using small files called cookies. Examples of how cookies are used include keeping a "shopping cart" of items you are buying, or remembering your address or other important information so you don't have to re-enter it each time you visit. Depending on your privacy concerns, you can choose to limit or prohibit cookies on your computer.

Other cookies, used mainly by advertisers, are called third-party cookies, because they are maintained by Web sites other than the one you're visiting. Some people choose to limit only these third-party cookies.


5.4 COOKIES

Definition of a Cookie

Cookies were introduced to the e-world by the pioneering firm Netscape. The name "cookie" was a term already in use in computer science for describing a piece of data held by an intermediary.

Internet cookies are small pieces of information in text format that are downloaded to your computer when you visit many Web sites. The cookie may come from the Web site itself or from the providers of the advertising banners or other graphics that make up a Web page. Hence visiting a single Web site can actually result in the downloading of multiple cookies, each from a different source. You may never actually visit a page of one of the major advertising agencies like Doubleclick.com, but you will still get cookies from them. Cookies normally contain some kind of ID number, a domain that the cookie is valid for, and an expiration date. They may also hold other tracking information such as login names and pages visited. Since they are in text format, they can be read with a standard text editor such as Notepad, although the contents may not necessarily seem to make much sense.
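
As a sketch with hypothetical values, a server plants a cookie through an HTTP response header, and the browser sends it back on every later request to that domain:

Server to browser:
Set-Cookie: id=a3fWa9; Domain=example.com; Expires=Wed, 21 Oct 2026 07:28:00 GMT

Browser back to server, on each later request:
Cookie: id=a3fWa9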

5.4.1 Storage Place of Cookies

Each type of Internet browser designates a particular place for storing cookies.

Internet Explorer (IE) has a folder Cookies\ where cookies are kept as small individual text files, one for each cookie. In Windows 98/Me, the IE cookie folder is a sub-folder of the Windows folder. Windows XP has special folders, one for each user, \Documents and Settings\[User name]\Cookies\. As part of an intricate caching scheme, pointers to IE cookies are also kept in the folder Temporary Internet Files\.

Since AOL uses Internet Explorer beneath its proprietary interface, it employs the same method as IE and its cookies are in the same place.

Netscape and Mozilla-based browsers use a single text file, cookies.txt, with each cookie occupying one or more lines within this one file. The location of the file depends on your version and type of browser.

The easiest way to find where cookies are kept is to do a Find or Search either on the folder name "Cookies" or the file name "cookies.txt", depending on your browser.

5.4.2 Usage of Cookies

Cookies are essential to provide the function of "persistence". Browsing the Internet involves what is known as a "stateless" process. In other words, a Web site normally has no memory of who comes and goes. (In fact, logs of traffic are kept, but these are not involved here.) As soon as the information that your browser requests from a site is downloaded to your computer, the connection is dropped. If you return to the site a minute later (or whenever), the site has no record that you were just there. If a site has numerous pages and you go from one to the other, the site does not remember which pages you have been to. That is, it won't unless a cookie is on your machine to remind the site and provide continuity.
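
A minimal sketch of this persistence from the browser side (the cookie name is hypothetical), using JavaScript's document.cookie:

<script type="text/javascript">
// store a small piece of state that survives between page requests
document.cookie = "lastVisit=today; max-age=86400"; // kept for one day
// read back every cookie visible to this page as a single string
alert(document.cookie);
</script>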

5.4.3 Cookies and Privacy

Although some cookies offer a useful function, many others may not be desirable. As the Internet has evolved from its beginnings in academic circles and government to a commercial enterprise, cookies have inexorably been turned into a tracking mechanism used by advertisers. In principle, cookies are only accessible to the site that originated them, but large advertising agencies with many clients can easily circumvent this restriction by collecting information for all their clients under one domain. A fairly harmless (and perhaps even useful) advertising application of cookies is to rotate banner ads as you go from page to page to make sure that you do not see the same ads over and over.

However, there are more persistent tracking methods that might involve cookies, and therein arise privacy issues. The privacy dilemma is beyond the scope of this chapter. It should be emphasized that cookies are plain text files and, as such, are not executable programs and cannot do anything to your computer.

5.4.4 Managing Cookies

Many PC users do nothing to deal with cookies and simply accept whatever comes their way. This policy of neglect had more to recommend it back when cookie management was fairly arduous and time-consuming. Nowadays, however, the obstacles to cookie management are low enough that at least some form of basic management should be a standard practice.

There are numerous reasons why a PC user might consider exerting a little effort in this area. First, the volume of cookies sent out these days is so large that a computer will rapidly acquire hundreds of them. It isn't unusual to pick up 30 or 40 in a single hour of browsing. A few of these cookies are useful, but most are tracking cookies from advertisers. Even just from disk housekeeping considerations, you might want to keep the number down. A more serious consideration for many is the possible privacy issues that arise from the tracking cookies. Controlling cookies isn't that tricky, and here are some methods.

In theory you can simply refuse all cookies, and all browsers allow for this option. However, this is not a very practical solution. Many sites use cookies for useful or benign purposes, and lots of sites require cookies to be enabled before they let you view them.

A better alternative is to selectively block and/or remove undesirable cookies while keeping the good ones.

There are a number of approaches.

One way is the do-it-yourself method, involving such things as editing the actual contents of the IE cookie folder. This is tedious, and there are better ways.

The major browsers have added ways of selectively configuring for cookies. For example, Internet Explorer 6 has Privacy settings with a number of cookie options. Among the options is the ability to list specific sites whose cookies are to be rejected. This gives a PC user the option of refusing cookies from certain advertising agencies, such as DoubleClick, that use aggressive tracking methods. The Firefox browser has even more cookie control in its settings under Tools-Options-Privacy.

There is a whole assortment of Internet security software, some free, some commercial, that includes cookie management. One free example is Karen Kenworthy's Cookie Viewer. The major commercial players like Symantec and McAfee now include cookie management in their Internet security suites, as do firewall applications like ZoneAlarm Pro. Tracking cookies are specifically targeted by many spyware removal programs. There are also programs, such as Cookie Crusher, that are purpose-built to deal with cookies.


To Delete Cookies (Figure 5)

1. On the Edit menu, click Preferences.

2. Double-click the Privacy & Security heading on the left to expand its contents.

3. Click the Cookies heading.

4. Click the Manage Stored Cookies button.

5. Click the Remove All Cookies button (see Figure 5).

6. Click Close.

7. Click OK.

Figure 5: Deleting Cookies

Source: http://www.dvconnect.org/about/online_security.asp

SELF CHECK 5.4

Explain the importance of cookies on the Internet.

How can cookies be managed on a PC?


5.5 MOBILE CODE THREATS

Mobile code is a significant programming paradigm for the Internet (e.g., Java Applets) and provides a flexible way to structure distributed systems. Mobile agents are mobile code that acts autonomously on behalf of a user, continuously collecting and processing information.

Autonomous mobile agents are created by an originator and may visit any number of hosts before returning to the originator:

Figure 6: Work flow of code threats

Source: http://www.onlineprdia.com


Mobile code poses new security threats:

How can the host that runs potentially malicious mobile code be protected?

How can the mobile code be protected from a potentially malicious host?

5.5.1 Protecting the Host

This question has received considerable attention because of the threat of viruses (unfortunately, a prominent form of the mobile agent species!). Current solutions to this problem are to run mobile code in a sandbox with access control and to apply code signing.

Java has always had numerous faces to its security model. It has a strongly typed compiler to eradicate programming bugs and help enforce language semantics, a bytecode verifier that ensures the rules of Java are followed in compiled code, a classloader that's responsible for finding, loading, and defining classes and running the verifier on them, and the security manager -- the main boundary between the system itself and Java code.

We'll be concentrating on the age-old security manager and the newer addition to the JDK, the access controller. To refresh our memories, the security manager in Java is composed of a series of checkXXX methods that we can override, defining the logic we desire. In JDK 1.1, this logic either disallows the request (by throwing a java.lang.SecurityException) or allows the request (either via some elaborate logic scheme or by simply returning).

The sandbox, though, is just the security policy -- the equivalent of a law. And a policy or law must be enforced to be effective. The government could pass a law in your town tomorrow banning red shoes, but if no one enforces it, it's not much of a law. The sandbox won't effectively confine code and its behaviour without an enforcement mechanism. This is where the security manager comes into the picture: security managers make sure all restricted code stays in the sandbox.

Here's an example of how to create a security manager in JDK 1.1 that allows reading files, but disallows writing files:

public class MySecurityManager extends java.lang.SecurityManager {
    public void checkRead(String file) throws SecurityException {
        // reading is allowed, so just return
        return;
    }
    public void checkWrite(String file) throws SecurityException {
        // writing is not allowed, so throw the exception
        throw new SecurityException("Writing is not allowed");
    }
} // end MySecurityManager
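
As a usage sketch, the manager only takes effect once it is installed, which is done with the JDK's System.setSecurityManager:

// install the manager so the JVM consults it on every checked operation
System.setSecurityManager(new MySecurityManager());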

5.5.2 Protecting Mobile Code Applications

Some people thought that mobile code could not be protected from a malicious host. However, it is now recognized that, at least in principle, this is possible by applying tools from basic research in cryptography using homomorphic encryption schemes.

Recent work at IBM Zurich shows that it is indeed possible to protect mobile code from a spying host. However, such an application is limited to the case where the mobile code must not affect its host in any way. This restriction is also inherent in the previous proposals.

Mobile agents have to be ready to execute on different hosts where, according to the specific environmental conditions, they will adopt different behaviours. The agent becomes able to adapt itself to the environment where it is currently running by composing only the appropriate modules. This allows the privacy of the agent code to be enhanced.

There are also techniques for using a class loader to protect mobile code against a malicious host. These include using the class loader to extend a class used by the mobile code such that a method is added to the code which authenticates it. When executed, the method provides a dynamic watermark that authenticates the code. The method may be kept encrypted until it is added to the code.

5.6 WEB SERVER SECURITY


Once upon a time, the World Wide Web was a comparatively static place. The Web server's sole function was to deliver a requested Web page, written in HTML, to a client browser. Over time, developers started looking for ways to interact with users by providing dynamic content -- that is, content that displayed a form or executed a script based on user input. Thus Server Side Includes (SSI) and the Common Gateway Interface (CGI) were born.

A Server Side Include page is normally an HTML page with embedded command(s) that are executed by the Web server. An SSI page is parsed by the server (a "normal" Web page is not), and if SSI commands are found they are executed before the resulting output is delivered to the requesting client. SSI is used in situations that demand a small amount of dynamic content be inserted in a page, such as a copyright notice or the date. SSI can also be used to call a CGI script; however, there is a performance penalty associated with SSI. The server must parse every page designated as SSI-enabled, which is not an optimal solution on a heavily loaded Web server.
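
As a sketch (the file names are hypothetical), an SSI-enabled page using two common directives understood by Apache's mod_include:

<!-- page.shtml: parsed by the server before delivery -->
<html><body>
<p>Today is <!--#echo var="DATE_LOCAL" -->.</p>
<!--#include virtual="/footer.html" -->
</body></html>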

CGI is a standard for communication between a program or script, written in any one of several languages, and a Web server. The CGI specification is very simple: input from a client is passed to the program or script on STDIN (standard input). The program then takes that information, processes it, and returns the result on STDOUT (standard output) to the Web server. The Web server combines this output with the requested page and returns it to the client as HTML. CGI applications do not force the server to parse every requested page; only pages containing CGI-recognized arguments involve further processing.
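
A minimal sketch of that contract in Perl (the script name is hypothetical), reading POSTed data from STDIN and returning HTML on STDOUT:

#!/usr/bin/perl
# echo.cgi - demonstrate the CGI input/output contract
use strict;
use warnings;

# the server tells the script how many bytes of input to expect
my $len = $ENV{'CONTENT_LENGTH'} || 0;
my $input = '';
read(STDIN, $input, $len) if $len;

# print a header, a blank line, then the page body; a real script
# must validate $input before using it, as stressed later in this chapter
print "Content-type: text/html\n\n";
print "<html><body><p>Received ", length($input), " bytes of input.</p></body></html>\n";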

The discussion that follows assumes that certain baseline precautions are already in place:

The network is secure, behind a firewall, and the server itself is in a controlled environment.

The operating system has been appropriately secured and all unnecessary services are disabled.

The Apache user and group directives are properly set, and appropriate permissions assigned.

The ServerRoot and log directories are protected.

User overrides are disabled.

Default access has been disabled, and access opened for only those system directories designated "public". For example, on a system configured to host user Web pages from /home/username/public_html, Apache's httpd.conf configuration file should contain the following directives:

Example:

ServerName www.sitename.com
UserDir public_html

# deny access to the entire filesystem by default
<Directory />
    Order deny,allow
    Deny from all
</Directory>

# re-open access only for the users' public_html directories
<Directory /home/*/public_html>
    Order deny,allow
    Allow from all
</Directory>

In other words, in order to completely absorb the material discussed in this chapter, the reader must have an excellent working knowledge of common Web server security, installing and configuring Apache, Apache modules, Apache's key configuration directives, the role of Apache's .htaccess file, how to read log files, UNIX file permissions, and basic system administration. Readers must also be familiar with the syntax, commands, and functions of whatever programming language they intend to use to create CGI applications.

Each Web server has an IP address and possibly a domain name. For example, if you enter the URL http://www.abc.com/index.html in your browser, this sends a request to the server whose domain name is abc.com. Then the server fetches the page named index.html and sends it to your browser.

Any computer can be turned into a Web server by installing server software and connecting the machine to the Internet. There are numerous Web server software applications, including public domain software from NCSA and Apache, and commercial packages from Microsoft, Netscape and others.


Problems with running a home Web server:

The difficulty with all of this is that when individuals set up their own home-based web servers, many of them have no idea how to properly secure them, and many may not even have the latest updates for their particular software installed, putting them at a very high risk of intrusion into their computers (and hence their personal lives) by hackers.

The major issue is that when you run a Web server on your home PC, you're opening a port on your computer that allows access from the outside world. Web servers may not be the easiest method of gaining access to a computer, but they are a well-known method of intrusion, meaning that you raise your risk of having your computer attacked, your website defaced, and maybe even having your computer taken over completely by unscrupulous individuals.

Few home users have taken computer security classes or have knowledge of web server security. When you open up a service such as a web server on your machine, you're not just letting your friends and family connect to it; you're also letting the entire internet access your hard drive and all the data it contains.

Lots of people are under a false sense of security, thinking that "if I set up a web server on my computer at home, no one will connect to it except those friends that I tell about it". This is of course absolutely untrue unless they've taken the proper security measures for their web server, such as specifying that only users with specific IP (Internet Protocol) addresses can access it. Very few home users are familiar with how to do this, however, and it can be problematic even for those who do, since most users connect to the internet using dynamic IPs. This means that every time they connect to the internet, their IP address changes; hence the problem.

The internet is, like the rest of the world, an unsafe place where anyone, including criminals, can connect and do as they please. Even usually law-abiding people often commit crimes on the internet because of their perceived anonymity.

Figure 7: Service oriented architecture (Web server)

Source: http://www.w3.org/TR/2002/WD-ws-arch-20021114/#basicext


5.6.1 Website Defacement

Web defacement (the defacement of a web site by hackers) is a common method of attack on a web server. You may have visited a web page and seen something that you didn't expect. This might be anything from a "Hacked by so and so" message to a political message. This is a somewhat common occurrence on the internet. For example, if you're going to run a web server on a Microsoft Windows PC, odds are it's going to be Internet Information Server (referred to as IIS). Published statistics have claimed that "75% of all web servers running Microsoft IIS 5.0 are vulnerable to exploitation."

Another method that hackers use is to exploit a bug or security flaw in the particular operating system (OS) or web server software a site is running.

Here we summarize a few basic measures that anyone running a Web server should consider essential.

1. Don't run unnecessary servers or interpreters. If you don't need the FTP (File Transfer Protocol) server that's bundled with your Web server, don't give hackers another target: disable it, or don't install it at all. Likewise, disable scripting languages and sample scripts that you don't absolutely require.

2. Subscribe to your server vendor's security alert list, or at least monitor its Web site frequently for patches and apply them immediately. The Computer Emergency Response Team advisory list at www.cert.org/advisories/ can be a useful resource. Don't forget to watch out for alerts and patches for your OS as well as for the Web server itself.

3. Practice good password habits. Avoid simple, easy-to-guess passwords, particularly for privileged administrator accounts. On the other hand, don't make your password rules so draconian that users resort to writing them down. Always change default passwords and eliminate unnecessary accounts (such as guest). Make sure passwords are actually enabled for sensitive areas and administration functions.

4. Know what's happening on your network. Numerous Web servers are free and easy to install, so watch out for well-meaning but ill-informed users who may inadvertently create security holes.

5. Employ your operating system's permission mechanism. Generally the Web server runs with the permissions of a particular user. Make sure that user has suitably limited access.

6. Monitor your logs. Your Web server keeps track of every request; review your logs regularly for signs of out-of-the-ordinary behaviour.

7. Separate public and private data. Don't store sensitive data on the same machines as public Web servers if you don't have to. For an extranet, you might consider a "sacrificial lamb" configuration, where a Web server sits outside the firewall so that it doesn't jeopardize corporate data behind the firewall.

8. Be cautious with your server configuration. Limit executable files to specific directories, and make sure their source code can't be downloaded. Turn off features such as automatic directory indexing and WebDAV publishing support if you don't need them. Run any security tools your OS or Web-server vendor provides, such as Microsoft's IIS Lockdown Tool, to spot potential weak spots.

9. Check programs for security holes. CGI scripts on Web servers are particularly prone to security breaches, especially if they don't validate user-supplied data before accessing files or operating-system services.


5.7 WEB TRAFFIC SECURITY

Web traffic is the amount of data sent and received by visitors to a web site, and it makes up a huge portion of Internet traffic. It is determined by the number of visitors and the number of pages they visit. Sites monitor the incoming and outgoing traffic to see which parts or pages of their site are popular and whether there are any obvious trends, such as one specific page being viewed mostly by people in a particular country. There are numerous ways to monitor this traffic, and the gathered data is used to help structure sites, highlight security problems or indicate a potential lack of bandwidth - not all web traffic is welcome.

Web traffic can be analysed by viewing the traffic statistics found in the web server log file, an automatically generated list of all the pages served. A hit is generated whenever any file is served. The page itself is counted as a file, but images are also files; thus a page with 5 images could generate 6 hits (the 5 images plus the page itself). A page view is generated when a visitor requests any page within the web site - a visitor will always generate at least one page view (the main page) but could generate many more.
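
For illustration, each hit appears as one line in the log; a sketch of one line in Apache's common log format (all values hypothetical): client address, date, the requested file, the status code and the number of bytes sent:

127.0.0.1 - - [10/Oct/2004:13:55:36 +0800] "GET /index.html HTTP/1.1" 200 2326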

Tracking applications external to the web site can record traffic by adding a small piece of HTML code to every page of the web site.
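
A sketch of such a snippet (the tracker URL is hypothetical): a one-pixel image request that reports each page view to the external tracking service:

<!-- third-party tracking code pasted into every page -->
<img src="http://tracker.example.com/count.gif?page=/index.html" width="1" height="1" alt="">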

Web traffic is also sometimes measured by packet sniffing, thus gaining random samples of traffic data from which to extrapolate information about web traffic across total Internet usage.

The following types of information are often collated when monitoring web traffic:

The number of visitors.

The average number of page views per visitor - a high number would indicate that the average visitors go deep inside the site; perhaps they like it or find it useful.

Average visit duration - the total length of a user's visit. As a rule, the more time visitors spend, the more they're interested in your company and the more likely they are to make contact.

Average page duration - how long a page is viewed for; the more pages viewed, the better it is for your company.

Domain classes - all levels of the IP addressing information required to deliver Web pages and content.

Busy times - the most popular viewing times of the site show when would be the best time to do promotional campaigns and when would be the most ideal time to perform maintenance.

Most requested pages - the most popular pages.

Most requested entry pages - the entry page is the first page viewed by a visitor, and this shows which pages attract the most visitors.

Most requested exit pages - the most requested exit pages might help find bad pages or broken links, or the exit pages might have a popular external link.

Top paths - a path is the series of pages viewed by visitors from entry to exit, with the top paths identifying the way most customers go through the site.

Referrers - the host can track the (apparent) source of the links and determine which sites are generating the most traffic for a particular page.

5.7.1 Controlling Web Traffic

The quantity of traffic seen by a web site is a measure of its popularity. By analysing the statistics of visitors it is possible to see shortcomings of the site and look to improve those areas. It is also possible to increase (or, in some cases, decrease) the popularity of a site and the number of people that visit it.

Limiting access:

It is sometimes important to protect some parts of a site by password, allowing only authorized people to visit particular sections or pages.

Some site administrators have chosen to block their page to specific traffic, such as by geographic location. The re-election campaign site for U.S. President George W. Bush (GeorgeWBush.com) was blocked to all internet users outside of the U.S. on 25 October 2004 after a reported attack on the site.

It is also possible to limit access to a web server based both on the number of connections and on the bandwidth consumed by each connection. On Apache HTTP servers, this is accomplished by the limitipconn module and others, as shown in the sketch below.
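
As a sketch, an httpd.conf fragment assuming the third-party limitipconn module has been built into the server (directive name as documented by that module):

ExtendedStatus On
<IfModule mod_limitipconn.c>
  <Location />
    # allow at most 3 simultaneous connections from any one client IP
    MaxConnPerIP 3
  </Location>
</IfModule>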


Tools used for web traffic control:

1. NetLimiter: NetLimiter is an internet traffic control and monitoring tool designed for Win98/Win98 SE, WinME, Win2000 and WinXP. You can use NetLimiter to set download/upload transfer rate limits for applications or even a single connection, and to monitor their internet traffic. Along with this unique feature, NetLimiter offers a comprehensive set of internet statistical tools, including real-time traffic measurement and long-term per-application internet traffic statistics.


2. TrafficRefine: As a computer user, you need to protect your system from malicious software. As a parent, you surely want the Internet to be as safe as possible for your child. TrafficRefine is a personal network filter which is used to restrict access to particular Internet resources from a local computer. The program was developed with parental control in mind and combines necessary functionality with extreme ease of use.

5.7.2 Web Traffic Security Approaches

A number of approaches to providing web security are possible. The various approaches that have been considered are similar in the services they provide and, to some extent, in the mechanisms that they use, but they differ with respect to their scope of applicability and their relative location within the TCP/IP protocol stack.

One way to provide web security is to use IP security (IPSec). The advantage of using IPSec is that it is transparent to end users and applications and provides a general-purpose solution. Furthermore, IPSec includes a filtering capability so that only selected traffic need incur the overhead of IPSec processing.

Another relatively general-purpose solution is to implement security just above TCP. The primary example of this approach is SSL and the follow-on Internet standard known as TLS. At this level, there are two implementation choices. For full generality, SSL might be provided as part of the underlying protocol suite and therefore be transparent to applications. Alternatively, SSL can be embedded in specific packages.

SELF CHECK 5.7

a) Describe web traffic security and the appropriate approaches to providing it.


5.8 SUMMARY

This is the fifth module of the Internet security course. This module explains the World Wide Web and how the World Wide Web works.

In this chapter you have learnt what browser privacy is. Additionally, you have gained an understanding of web technologies such as HTML, DHTML, XML, Perl, CGI and Flash.

In addition, this module pointed out what cookies are, where they are stored, what they are used for and how to manage them.

It also explained mobile code, an important programming paradigm for the Internet, and the threats and security issues related to it.

Finally, you have learnt about web server security and web traffic security, and the measures and approaches used to overcome such security issues.

Appendix - Acronyms

WWW - World Wide Web

SSL - Secure Sockets Layer

TLS - Transport Layer Security

DHTML - Dynamic Hyper Text Markup Language

IE - Internet Explorer

URL - Uniform Resource Locator

XML - Extensible Markup Language
