The Search Engine Optimization Computer Science Essay


Today's world is built around luxury and comfort, and modern technology has improved manyfold to meet this demand. Nowadays it is not surprising to see someone eating, decorating their house, studying or finding entertainment online, and many more things can be done through online interaction. From setting up a new house to planning a vacation, everything is done online these days.

INTRODUCTION

It would be fair to say that, as people have become used to doing almost everything online, it has become the responsibility of business and franchise owners to provide such information to people. It is these owners' duty to give people what they want; in turn, the information they provide plays a big part in their own gain as well.

As we all know, in these days of huge demand and supply and tough competition, if your business is not online by now it is as good as dead. A website is the most accessible piece of information about your business, service or company that can be made available to consumers. Handing out brochures and printing advertisements in newspapers and magazines gives only limited exposure, perhaps constrained to a single town rather than a whole city. You have to be online if you want city-, state- or nationwide exposure.

To summarize, the way your website is designed makes a huge contribution to your company's success. As said earlier, the website is what people are going to see, and the website is what people are going to access from any geographic location. The way people reach your website without knowing the exact domain is through search engines. Search engines are a product of the modern world: we can enter anything in natural language, and the engine returns, in no time, the entries in its database that best match the question asked. We have several search engines, such as Google, Bing and Yahoo. Their work is to process requests from users, search their databases, and return results as close as possible to the question asked.

A search engine can return results, or links to websites, from any corner of the world. For example, a person living in New Zealand can browse the website of a company located in Alaska; this is possible because the company has made itself available on the internet. Any company can make itself available to anyone in the world through smart use of search engines. Take the example of a construction company in Texas. There are thousands of construction companies in Texas, so for that particular company to surface in search engines it must make changes to its website content. By the nature of its working, a search engine returns millions of results across hundreds of pages; of these, perhaps a thousand are really related to what the person wanted, while the others are only loosely related. So, to stand out from the websites of other companies offering the same service, to make your website appear first and on the first page, and as a result to be accessible to users, some smart changes have to be carried out on the website so that it is favoured by search engines, resulting in more hits and, in turn, increased profits. Similarly, multinational companies that maintain customer databases aim to find entries with as little data as possible; some applications need only a name, an address or a contact number [18]. This process of improving findability is called Search Engine Optimization (SEO).

SEARCH ENGINE OPTIMIZATION (SEO)

It basically means making your website accessible to everyone over the internet, and in such a way that users can find it easily, which leads to more hits on the website and, as a result, increased profits. SEO simply means the techniques, ways and methods by which a website is improved so that it can be found on all search engines by all users, regardless of their geographical location. Search engines apply retrieval principles to return results for the queries put up by users. "SEO demands basic elements of a website to be constructed to fit the search engine retrieval principle so that possible web pages can be gained by search engines and higher ranking in the results can be achieved resulting in website promotion". [2]

Let us first look at the general layout of a website:

A] Theme:

- Almost every website has a theme these days, usually based on the specific company and its motto or services.

- The theme could be anything.

- For example, you could have a watermark of the company on all pages.

- There could be an audio track shared by all pages of the website.

B] Images/Videos:

- They depict the company's various areas of work and its achievements.

- They show consumers that existing customers are experiencing benefits, so potential buyers feel they would too.

- As far as images are concerned, a website is a visually oriented storefront for a service or business.

- People are more likely to buy if it appeals to them, so having nice, meaningful images of the company or service does help.

C] Text:

- It is the most important component of a website.

- What the website is all about is actually described here.

- This is the part where most changes are made from an SEO perspective, as the text of companies offering similar services can be more or less the same.

Now let us take a look at what changes are made to a website, where, and how, so that it becomes more visible on search engines.

Fig 1) [2] SEO factors

KEYWORDS [4] [5]

A website contains many pages, and generally all of those pages carry text. Keyword work means replacing one, two or three words of repetitive, common text with something more distinctive. This replacement text serves two purposes: first, it reduces the amount of text on the web page; second, it creates more chances of being matched by search engines. Keywords are especially useful in e-commerce, where it is always a desired trait that users get results for their exact questions [19].

For example: a jeweller's store has on its website phrases such as "nice craftsmanship", "excellent price" and "available in installments". These can be replaced by "Crafted cheap jewels available on easy installments". In this manner, repetitive, common English words like "nice" and "excellent" are replaced by a new combination of words, which may stand a better chance of being picked up by search engines than the earlier wording.

Keyword density is also kept in mind. It is good practice to put the titles or main words of a web page in bold headings, and to tune the text contained in meta tags such as name, title and description [3]. Also, a normal website easily contains at least five pages, and it is not advisable that all five have the same background or title [3]: if each page has a different title and different keywords, every page stands a chance of getting hit by search engines, and if any one of the five pages is hit, the website as a whole is hit.
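As a minimal sketch of what "keyword density" means in practice (the tokenizer and the density formula below are simplifying assumptions, not any search engine's actual metric):

```python
import re

def keyword_density(text: str, keyword: str) -> float:
    """Share of the page's words taken up by a (possibly multi-word) keyword."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    k = keyword.lower().split()
    if not words or not k:
        return 0.0
    # Count keyword-phrase occurrences with a sliding window over the words.
    hits = sum(1 for i in range(len(words) - len(k) + 1) if words[i:i + len(k)] == k)
    return hits * len(k) / len(words)

page = "Crafted cheap jewels available on easy installments. Cheap jewels for every budget."
print(f"{keyword_density(page, 'cheap jewels'):.0%}")  # 4 of 12 words -> 33%
```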

Another important aspect is creating website content based on competitive keyword research: first, the websites of competitors are studied. This needs to be done because you do not want your website to be almost identical to those of companies offering similar services; when users search, there is a possibility that the other website would get the hit instead of yours. [8]

LINKS [4] [5]

Another factor considered is links, meaning the links present on a website. Search engines take into account how many links to other websites are present on yours; links are the second most important tool for search engines [5]. There is a loophole here, in that a website might stuff a hundred links into its pages, so there is a catch: the links should point to reputed websites, not just any random ones. [5]

Another important and faster approach to optimizing a website is having it linked to by other websites, that is, having incoming links. A search engine will find the website quickly if it has many incoming links. [6]

Another approach is to participate in chat forums and discussions, on reputed boards or on one of your own. The website will then get real-time feedback from users, and you will be able to rectify any shortcomings. Participating in forums also means posting links to your website publicly: if people find the blog content and other users' reactions worthwhile, they will definitely give the website a visit, and if visitors who arrive through those links find the website and its services interesting, they will post good reviews on the same and other forums, getting you more promotion. [6]

INTERFACE:

This is especially important because it is of no use if people visit the website for only a few seconds. This might not be a problem while the website is new, since curiosity about the company's services and work will bring visitors and the ranking will not suffer initially, but if the website cannot hook viewers it is of no use in the long run. As a result, an interactive and informative interface must be developed. If users stay longer, your content is a huge success. [7]

How to keep the user on the website varies from site to site. Some companies might prefer keeping images on every page as part of the information; small videos explaining processes or services can be uploaded; you could even post videos of customer interviews or surveys about your company. There can be chat forums or provisions for user feedback. A software company, for example, could have a Careers section where hopeful candidates can browse various opportunities and contact the company. [7]

URL:

While developing a website, its URL should also be taken into consideration. It should be as clean and clear as possible. People mostly make their way to websites through the URL itself, so it must be neat and free of stray characters or strings in the middle [7].

If a company's URL contains all sorts of characters and does not have a good domain name, people will not even visit it, no matter whether the content is good or bad; if the name is absurd, they will not visit. It is general human nature to judge by outer looks rather than to examine things closely. All of these are areas that need to be considered before going online.

Fig 2) [13] Optimization overview

A snapshot of search engine optimization is depicted in Fig 2).

Now that we know what care must be taken while creating a website, let us look at what exactly needs to be done to get maximum exposure from search engines, starting with how search engines work.

There are many search engines in today's world, notably Google, Yahoo and Bing. The basic step is first to submit the website to, or otherwise make it discoverable by, the various search engines; rest assured that if the previously mentioned conditions are all met, any website can achieve promotion and user accessibility.

Every search engine has its own technique for searching links (websites), and each takes a different set of factors into account while searching. For example, the Google search engine uses more than 200 factors to decide which website links to return [3]. Let us look at some of the mechanisms and algorithms behind search engine optimization.

CRAWLERS

"Crawler is a generic term for any program (such as a robot or spider) used to automatically discover and scan websites by following links from one webpage to another." [9] The main function of crawler is to search websites and links automatically, randomly and follow them. "Search engines use crawlers to collect web pages from web servers distributed across the internet."[10] Crawlers work in an organized manner. There search is automatic and random.

"With the increasing importance of information access on web, online marketing and social networking the functions and activities of web crawlers has become extremely diverse". [14]

They scan the internet based on uniform resource locators (URLs) and the links present at those URLs. These links are stored in a temporary file, so a single file contains information about the links of multiple websites, together with a log of the hyperlinks on those web pages. The crawler keeps track of all such information and, whenever the search engine is in need, provides the necessary file [10]. It can be considered a kind of bot that keeps track of everything a search engine needs. Given the vastness of today's web, the crawler cannot finish scanning the entire web quickly.

A crawler visits pages and subsequently the links on those pages, stores this information in a temporary location where search engines can access it, and then starts its work again, repeating the same steps over and over. One might think the crawler's work must stop at some point, once all the files for the links have been stored. Sadly, that is not the case: this is a rapidly changing world where websites are changed, updated and created in seconds and minutes, so there is a constant need for crawlers to keep the information up to date. [11]

Fig 3: [11] Flow of a Crawler

As the flow diagram above shows, the crawler works as follows:

1. The starting link or web page is referred to as the seed. From here the crawler begins its search of web pages and the links on them. The frontier is where unvisited pages are stored: a list of URLs the crawler has yet to expand. In graph-theory terms it can be visualized as the set of unvisited nodes of a tree. The size of the frontier is decided by the available memory; it is not unusual to have a frontier that can store "100,000 links" [11]. When the crawler starts scanning, about half that space is taken up in no time: if the crawler scans "10,000 pages and each page has about 7 links" [11], then 70,000 of the 100,000 slots are filled. [11]

2. The crawler can be seen as implementing breadth-first search over a queue: newly discovered, unvisited links are added at the tail, the next link to visit is taken from the head, and a mechanism prevents duplicate links from being visited again and again [11]. A minimal sketch of this design follows.
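Here is a small single-threaded sketch of the frontier-as-queue idea, assuming a hypothetical fetch_links helper that downloads a page and returns the URLs found on it (fetching and parsing are covered later in this essay):

```python
from collections import deque

def crawl(seed: str, fetch_links, max_pages: int = 100) -> set[str]:
    """Breadth-first crawl: the frontier is a FIFO queue of unvisited URLs."""
    frontier = deque([seed])           # the frontier; head = next URL to visit
    visited: set[str] = set()          # hash-based set that blocks duplicate visits
    while frontier and len(visited) < max_pages:
        url = frontier.popleft()       # take the next URL from the head
        if url in visited:
            continue                   # duplicate-prevention mechanism
        visited.add(url)
        for link in fetch_links(url):  # newly discovered links join at the tail
            if link not in visited:
                frontier.append(link)
    return visited
```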

3. The frontier is the place from which a crawler gets its next assignment. If there were a provision for multithreaded crawlers, several could work simultaneously and efficiency could be increased.

Fig4) [11] Multithreaded crawler

Such a multithreaded crawler is especially useful in distributed architectures, where it can be used to access many web applications simultaneously. [16]
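A sketch of the multithreaded variant, again assuming the hypothetical fetch_links helper; a thread-safe queue serves as the shared frontier and a lock guards the visited set (real crawlers need more careful shutdown logic than the simple timeout used here):

```python
import queue
import threading

def crawl_threaded(seed: str, fetch_links, max_pages: int = 100, workers: int = 4) -> set[str]:
    """Multithreaded breadth-first crawl over a shared, thread-safe frontier."""
    frontier: queue.Queue = queue.Queue()
    frontier.put(seed)
    visited: set[str] = set()
    lock = threading.Lock()

    def worker() -> None:
        while True:
            try:
                url = frontier.get(timeout=2)  # idle workers exit once the frontier stays empty
            except queue.Empty:
                return
            with lock:                         # guard the shared visited set
                if url in visited or len(visited) >= max_pages:
                    continue
                visited.add(url)
            for link in fetch_links(url):      # fetch outside the lock
                frontier.put(link)

    threads = [threading.Thread(target=worker) for _ in range(workers)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return visited
```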

4. Another useful structure is the hash table. Suppose we want to know whether a URL has been visited: a linear search is a costly proposition, so a hash table is maintained in which URLs can be looked up and visited links found easily (the visited sets in the sketches above are exactly such hash tables). This is simple, but the entries in the hash table need to be changed and updated, since information on links changes very rapidly. If memory is not an issue, this method can be used, as it is also fast. [11]

Another concept that can be applied here is the preferential crawler. A priority queue is used: the URLs with the highest priority are crawled first, along with the links they carry. A separate hash table of visited pages is maintained so that a heuristic ranking computation can be done. Every page passes through the priority queue and pages are crawled one by one; once the queue is empty, crawling comes to a halt. A maximum-and-minimum principle can also be applied: setting limits on the tables helps from a memory perspective. [11]
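A sketch of such a preferential crawler, assuming a hypothetical score function that estimates a URL's priority (Python's heapq is a min-heap, so scores are negated to pop the best URL first):

```python
import heapq

def crawl_preferential(seed: str, fetch_links, score, max_pages: int = 100) -> set[str]:
    """Preferential crawl: a priority queue pops the highest-scoring URL first."""
    frontier = [(-score(seed), seed)]     # min-heap of (negated priority, URL)
    visited: set[str] = set()             # separate hash table of visited pages
    while frontier and len(visited) < max_pages:
        _, url = heapq.heappop(frontier)  # best-scoring URL first
        if url in visited:
            continue
        visited.add(url)
        for link in fetch_links(url):
            if link not in visited:
                heapq.heappush(frontier, (-score(link), link))
    return visited                        # halts once the queue is empty
```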

5. Let us look at what happens once pages are crawled. There must be some mechanism to store the links in. It can be compared to the history mechanism in our web browsers: the toolbar stores the links of visited websites, and you can find out which pages were visited in the past week or month, or even on a particular day. Similarly, an entry can be made for each URL that was visited.

The ultimate purpose of all this is to calculate the ranking of web pages. Each web page is given an identifier and a value to associate with it, and there can be provisions for marking special events. Pages can then be fetched by their index value, and the whole list is made available to the search engine. [11]

We have seen how pages are stored and what techniques can be applied to them, keeping memory and speed in mind. Fetching is the process of retrieving a web page [11]. Since the internet is based on the client-server model, HTTP (HyperText Transfer Protocol) is used for sending a query and receiving a response.

An HTTP client sends an HTTP request and gets a response from the server. There will be some issues, such as very big pages, which take more time to scan in full; the time spent waiting on servers should also be checked. [11]
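A minimal fetch sketch addressing both issues, with a read cap for big pages and a timeout for slow servers (the one-megabyte cap and five-second timeout are arbitrary assumptions):

```python
from urllib.request import urlopen

def fetch(url: str, max_bytes: int = 1_000_000, timeout: float = 5.0) -> str:
    """Fetch a page over HTTP, bounding both download size and server wait time."""
    with urlopen(url, timeout=timeout) as resp:  # raises on network or HTTP errors
        body = resp.read(max_bytes)              # avoid scanning overly big pages whole
    return body.decode("utf-8", errors="replace")
```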

6. Error checking and exception handling are two important domains to be assessed while retrieving pages [11]. Both tasks are performed after pages are retrieved. This checking is of the utmost importance: the basic principle of optimization is that good web pages get good rankings and bad pages bad rankings, so pages need to be verified for authenticity.

Parsing is the breaking down of the URLs of web pages to extract valuable information [11]. It is generally the next step after a page has been retrieved. URLs on the web can be meaningful, but they can also be garbage links, and the good need to be separated from the bad.

Once a page is retrieved, it is parsed and the result is sent back to the crawler for further processing. Parsing involves breaking up the received URL; mechanisms are also used to remove certain words or phrases before the URL is handed to the crawler. [11]

HTML parsers are freely available and easy to use; their job here is to extract URLs. One problem in such a system is that a single web page can carry many long links that all share the same parent domain, and it makes no sense to record the same link twice. So there can be a provision that such a big page, possibly containing hundreds of links, is recorded as one entry instead of a thousand.

Parsers are used to find anchor tags, and the values of their href attributes are taken. Any relative URLs are converted to absolute URLs after parsing. [11]
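A sketch of this step using Python's built-in html.parser module: it collects href values from anchor tags and resolves relative URLs to absolute ones with urljoin (the example URLs are illustrative):

```python
from html.parser import HTMLParser
from urllib.parse import urljoin

class LinkExtractor(HTMLParser):
    """Collect absolute URLs from the href attributes of anchor tags."""
    def __init__(self, base_url: str):
        super().__init__()
        self.base_url = base_url
        self.links: list[str] = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    # Convert any relative URL to an absolute one.
                    self.links.append(urljoin(self.base_url, value))

extractor = LinkExtractor("http://example.com/a/")
extractor.feed('<a href="page.html">x</a> <a href="http://other.example/">y</a>')
print(extractor.links)  # ['http://example.com/a/page.html', 'http://other.example/']
```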

Robots.txt [9]

Every website can have a robots.txt file. Its significance is that it gives the site owner control over how much of the site crawlers, such as Google's, are allowed to see: specific user-agent entries allow partial or full access. Crawlers fetch and evaluate this file before scanning the site, and such access information is also useful when determining the reputation of a website. [9]

robots.txt implements the Robots Exclusion Protocol, which has been adopted by almost all search engines and crawlers [14].
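A polite crawler checks robots.txt before fetching a page; here is a minimal sketch using Python's built-in urllib.robotparser (the crawler name and URLs are illustrative):

```python
from urllib.robotparser import RobotFileParser

rp = RobotFileParser("http://example.com/robots.txt")
rp.read()  # fetch and parse the site's robots.txt
if rp.can_fetch("MyCrawler", "http://example.com/private/page.html"):
    print("allowed to crawl")
else:
    print("disallowed by robots.txt")  # skip this URL politely
```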

SITE MAP

There is also a file called a sitemap. It is an XML file, made available to search engines, that lists all the web pages in a website. [17]
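A sketch of reading such a sitemap with Python's built-in xml.etree module (the file contents are an illustrative assumption following the sitemaps.org schema):

```python
import xml.etree.ElementTree as ET

SITEMAP = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>http://example.com/</loc><lastmod>2012-01-01</lastmod></url>
  <url><loc>http://example.com/services.html</loc></url>
</urlset>"""

ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
root = ET.fromstring(SITEMAP)
for url in root.findall("sm:url", ns):
    print(url.find("sm:loc", ns).text)  # every page the site wants indexed
```

Let us now take a look at some examples of algorithms that are used to calculate the importance of web pages.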

PAGE RANK ALGORITHM [2]

It is used to identify the importance of web pages. Web pages are rated on levels from 1 to 10, where 1 is considered unpopular and 10 most popular [2]. Suppose pages T1, T2, ..., Tn link to a page A.

Its formula is given as:

PR(A) = (1-d) + d (PR(T1)/C(T1) + ... + PR(Tn)/C(Tn)) [12]

"PR(A) is the PageRank of A.

PR(Ti) is the PageRank of the pages Ti which link to page A.

C(Ti) is the number of outbound links on Ti.

d is the damping factor which can be set between 0 and 1" [12]

As is clear from the formula, PageRank does not rank a website as a whole; it ranks the individual web pages that constitute the site. As the PR(Ti) terms show, the PageRanks of all pages that link to A are taken into consideration. Further, the PageRank of A does not depend solely on the PageRanks of the pages linking to it: each contribution is divided by the number of outbound links, so as the number of outbound links on a linking page increases, its contribution to the PageRank of A decreases. [12]

The PageRank algorithm can be perceived as a probability distribution, and as we all know the sum of all probabilities is 1, so the sum of the PageRanks of all web pages is equal to 1. [1]
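A small iterative sketch of the formula above; the damping factor of 0.85 and the fixed iteration count are conventional assumptions, and the three-page link graph is illustrative:

```python
def pagerank(links: dict[str, list[str]], d: float = 0.85, iters: int = 50) -> dict[str, float]:
    """Iteratively apply PR(A) = (1-d) + d * (PR(T1)/C(T1) + ... + PR(Tn)/C(Tn))."""
    pages = list(links)
    pr = {p: 1.0 for p in pages}  # start every page at the same rank
    for _ in range(iters):
        new = {}
        for page in pages:
            # Sum contributions from every page Ti that links to this page,
            # each divided by C(Ti), the number of outbound links on Ti.
            incoming = sum(pr[t] / len(links[t]) for t in pages if page in links[t])
            new[page] = (1 - d) + d * incoming
        pr = new
    return pr

graph = {"A": ["B", "C"], "B": ["C"], "C": ["A"]}
print(pagerank(graph))  # C collects rank from both A and B
```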

HILLTOP ALGORITHM [2]

As explained earlier, the PageRank algorithm assigns initial values or levels to pages. The Hilltop algorithm differs in this very first step (a loose sketch follows the list below):

- No initial values or levels are given to web pages.

- The algorithm is based on "expert documents" [15].

- Expert documents come into existence through genuine references: when people like a web page, they recommend it to other users publicly by posting it on the web, and the posting page also contains other genuine links posted by other people. Such pages as a whole are categorized as expert documents.

- The website is searched, all of its links are scanned, and these are compared against the expert-document sources.

- A threshold is used to decide whether a page qualifies as an expert page or not.

- The result of this operation is a weight, which is then used when applying the PageRank algorithm.

- If the Hilltop algorithm yields no result, '0' is returned. [2] [15]
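A loose sketch of this scoring idea, under the strong assumption that a set of expert documents is already known and that we simply count expert endorsements against a threshold (the real Hilltop algorithm is considerably more involved):

```python
def hilltop_weight(page: str, expert_docs: dict[str, list[str]], threshold: int = 2) -> float:
    """Weight a page by expert endorsements; return 0 when below the threshold."""
    endorsements = sum(1 for links in expert_docs.values() if page in links)
    return float(endorsements) if endorsements >= threshold else 0.0

# Hypothetical expert documents and the links they recommend.
experts = {
    "http://experts.example/reviews": ["http://a.example/", "http://b.example/"],
    "http://another.example/picks": ["http://a.example/"],
}
print(hilltop_weight("http://a.example/", experts))  # 2.0 -> qualifies
print(hilltop_weight("http://b.example/", experts))  # 0.0 -> below threshold, no result
```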

CURRENT ALGORITHMS:

In this competitive world, providing accurate results is of the utmost importance. Leading search engines use a combined technique that employs both the PageRank and the Hilltop algorithms.

Fig 5) [2]

The above formula is used for calculations by leading search engines.

"a, b and c are regulation weight controls.

d, e and f are damping controls

fb is the factor base.

In addition to this it also contains 3 more parts

RS correlation, PR page rank and LS Industry" [2].

SEO ADVANTAGES

Search engine optimization brings many advantages.

Promotion

Companies normally undertake search engine optimization for promotional purposes. Some companies, when starting a new business, do everything possible to promote themselves, which is not at all surprising. Optimization gives people a much better chance of reaching the website. This is especially helpful when the website is new: people know the URLs of established websites, and optimization helps them reach the new ones.

Cost

Optimizing a website costs its users nothing; the company bears the cost if it chooses to hire an agency to do that work. People just need an internet connection to search for anything they need [3].

Focus [3]

We can safely assume that people who arrive through optimized search results are focused on what they want: it is common sense that people would not normally spend time searching for something unless they were prepared to spend money on it. Online transactions therefore increase as a result of optimizing websites.

Revenue

As explained in the previous point, people generally search when they intend to spend money; as a result, company revenue increases.

URL

This kind of optimization generally does not pay off for big companies: branded companies are easily reachable, so users do not need optimized search to find them. Low- and medium-scale companies, which do not generally benefit from their URL alone, are the ones that mostly rely on search engine optimization for promotion. [3]

DISADVANTAGES

RELIABILITY

Compared with other media, SEO is a slow and less reliable channel. Other forms of media such as television, newspapers and magazines are more direct and effective than SEO, and SEO also requires time to come into effect.

Take the example of an advertising agency that puts out advertisements in newspapers, brochures and discount coupons: there is a possibility it may attract a customer the very same day, or the next. With SEO, by contrast, everything must first pass through all the steps and exhaustive algorithms, and whatever effect is to take place will take place only after many weeks or months. [3]

COMPLEXITY

There is a big difference between the factors major search engines like Google say they use and what actually happens when a website is changed for optimization. This is because of the complex nature of PageRank and the other algorithms used for calculating web-page importance. [3]

RESEARCH

Search engine optimization is still some way from being perfected; research is ongoing.

TIME

As stated under reliability, time is an issue in search engine optimization: it sometimes takes weeks or months before results are obtained.

CONCLUSION

Search Engine Optimization is very widely used in today's world; it is very much today's need, and for that your business must be online. Gone are the days when you could start a company with only a few hundred people knowing about it and make a comfortable profit year after year: this is a very competition-oriented world we live in. SEO is a slow but sure process of gaining popularity and increasing profit; it is not recommended if overnight or instant success is desired. There is absolutely no catch for users, as people do not pay to look things up in search engines, and they are not bound to choose the resource they searched for over a new one: consumer freedom is guaranteed. From a merchant's point of view, an initial investment is expected, but with a little patience and faith the expected results are achieved. Search engine optimization also helps e-commerce in general, supporting the ever-popular quick lifestyle of today's world.