EGovernment: History, Causes and Trends
BENCHMARKING EGOVERNMENT SERVICES
Governments around the world have embraced the use of Information and Communication Technologies (ICTs). This represents a relatively new branch of study within the IT field. EGovernment services are provided through many different means of access and to a variety of audiences: citizens, businesses or even other governmental entities. After clarifying the definitions of and differences among similar terms (i.e. eGovernance, Digital Government and eDemocracy), this paper examines how eGovernment is measured by analyzing the dominant methodologies in use. Furthermore, following specifically the eGovernment benchmarking methodology used by the European Commission, a greater focus is placed on the evolution of eGovernment in Greece. The findings of this assessment were far from satisfactory. In particular, comparing the 20 Basic eGovernment Services offered in Greece from 2007 to 2009, no improvement has taken place. Finally, the measures that governments need to undertake are discussed.
In recent years, assisted by the "invasion" of Information Technology into everyday life, governments all over the world have begun widely using information technologies to increase the effectiveness and quality of the services they provide. These initiatives have become known as “electronic government” or eGovernment services. In most cases, when words gain that attractive “e-” in front of them, the popular belief is that they have become “electronic”, whatever that means, even though in some cases it does not make much sense. This confusion is much more obvious when the original word itself has a conceptual and abstract meaning, as is the case with words like Government and Governance.
Section I presents the most popular definitions, choosing the one that describes each term best, and clarifies the boundaries between the most common terms. Furthermore, the different ways that eGovernment can be classified, depending on the delivery model or the audience, are outlined.
Although the definitions of eGovernment may vary widely, an obvious shared theme emerges: eGovernment involves using information technology, and especially the Internet, to improve the delivery of government services to citizens, businesses, and other government agencies. It acts as an enabler for citizens to interact with and receive services from governments twenty-four hours a day, seven days a week. Monitoring eGovernment development and evaluating its effectiveness is a complex and challenging task, as the phenomenon is new and dynamic.
In Section II, the basics of Benchmarking are presented and its structural elements are analyzed. Focusing on specific examples of the methodologies used, a set of four dominant practices representing the longest running efforts for measuring eGovernment is chosen to be explored further. Using the reports published by each of them on a periodical basis, their inner workings are analyzed and the various developments, changes and evolutions in the methods employed by each one are discussed.
Section III focuses on the benchmarking of eGovernment Services in Europe. In order to recognize how eGovernment has evolved and matured within the European Union, the relevant European directives, initiatives and frameworks for the development of eGovernment Services in the region since 1999 are examined. Following that, the methodology used for benchmarking eGovernment in the European Union is examined in detail. All measuring elements, including some that were used for the first time in the most recently published report, are evaluated.
Having established what eGovernment is, what Benchmarking is and how its methodologies function, Section IV uses the data from the latest European eGovernment Benchmarking Report, which was published in November 2009, to assess how the Greek eGovernment landscape has evolved since the previous report in 2007. The results are disappointing. When comparing the 20 Basic eGovernment Services offered in Greece, there was no improvement whatsoever from 2007 to 2009. Following that, Greek performance in the two new indices introduced in the latest report (eProcurement and User Experience) is reported and compared to the respective EU27+ average.
Finally, in Section V, a general overview is provided along with the conclusions about the (lack of) progress in eGovernment in Greece.
Scope and aims
The scope of this project is to analyze how the meaning of eGovernment has evolved in the past few years and then to review the current trends in benchmarking the penetration and sophistication of eGovernment services in Europe and the rest of the world. Furthermore, this project reports and analyses the level of eGovernment services offered in Greece. The basic aims of this project are:
- Define the eGovernment ecosystem, typology and taxonomy.
- Analyse the dominant methodologies of benchmarking eGovernment services.
- Gather and process existing results about eGovernment in Greece, regarding service penetration and sophistication, along with other relevant metrics.
This project relies heavily on research. In particular, extensive research was conducted on the different and sometimes contradictory terms that define eGovernment, as well as on the rest of the relevant terms appearing in academic papers throughout the previous decade. Following that, further research about the current and past trends in benchmarking in general, and eGovernment benchmarking in particular, was conducted. From there on, having established what eGovernment is and what the provided services should be, along with how they are measured, more research was conducted in order to reveal the actual current level of provided eGovernment services. To accomplish this, reports from many different parties are used. These include reports published by well-known analyst firms as well as by government bodies at various levels, ranging from reports issued at a global level, such as those of the United Nations, to local reports issued by the authorities of each country, such as those of the IT Observatory in Greece.
Throughout the bibliography, or any other sort of resource for that matter, “electronic” terms do not have a consistent representation. So, just like electronic mail can be found abbreviated in quite a few forms, “electronic” Government is abbreviated to eGovernment, e-Government, E-Government etc.
To avoid this inconsistency, throughout this project the term eGovernment will be used (changed to EGovernment only at the beginning of sentences). This convention will also apply to the other “electronic” terms that are used, such as eGovernance.
I. EGovernment, eGovernance and Digital Governance
EGovernment is one more of recent years' “buzzwords”. It is usually either paired with the word “services” at the end or grouped with terms like eGovernance and Digital Government. Like every other (relatively) new and fashionable “buzzword”, these terms are used widely by a broad spectrum of individuals who mostly represent two different backgrounds: information technology and politics. The former because it is a technological issue, the latter because they have come to realize, even though a little late, that these technologies represent an excellent vehicle for providing a better experience to anyone who interacts with the Government. But what do these terms mean? Do they overlap or conflict with each other? Does one cover or include another?
A. EGovernment Definitions
There is not one unique and commonly accepted definition for eGovernment. It is quite difficult to decide on a specific one, but after the research made, the following definition from the World Bank describes it best:
“E-Government refers to the use by government agencies of information technologies (such as Wide Area Networks, the Internet, and mobile computing) that have the ability to transform relations with citizens, businesses, and other arms of government. These technologies can serve a variety of different ends: better delivery of government services to citizens, improved interactions with business and industry, citizen empowerment through access to information, or more efficient government management. The resulting benefits can be less corruption, increased transparency, greater convenience, revenue growth, and/or cost reductions.” ((AOEMA), 2004)
Although other definitions have been provided, this definition is preferred. The reason is that it is the most concise and the easiest to understand since, apart from describing in simple words how eGovernment is utilized, it goes on to offer a very brief, yet to the point, reference to its main advantages.
EGovernment definitions from various other sources are as follows:
* United Nations definition ((AOEMA), 2004): “E-government is defined as utilizing the Internet and the world-wide-web for delivering government information and services to citizens.”
* Global Business Dialogue on Electronic Commerce - GBDe definition ((AOEMA), 2004): “Electronic government (hereafter e-Government) refers to a situation in which administrative, legislative and judicial agencies (including both central and local governments) digitize their internal and external operations and utilize networked systems efficiently to realize better quality in the provision of public services.”
* Gartner Group's definition: “the continuous optimization of service delivery, constituency participation, and governance by transforming internal and external relationships through technology, the Internet and new media.”
* Definition of the Working Group on eGovernment in the Developing World: “E-government is the use of information and communication technologies (ICTs) to promote more efficient and effective government, facilitate more accessible government services, allow greater public access to information, and make government more accountable to citizens. E-government might involve delivering services via the Internet, telephone, community centers (self-service or facilitated by others), wireless devices or other communications systems.”
EGovernment is in the first stages of development. Most governments have already taken or are taking initiatives offering government services online. However, for the true potential of eGovernment to be realized, government needs to restructure and transform its long entrenched business processes. EGovernment is not simply the process of moving existing government functions to an electronic platform. Rather, it calls for rethinking the way government functions are carried out today to improve some processes, to introduce new ones and to replace those that require it. The range of services that may be provided by e-government spans from simple information sites to fully interactive experiences where users and government engage in a dialog mediated by information technology.
Internal information systems of Government agencies, information kiosks, automated telephone information services, SMS services and other systems all comprise e-Government services. All these are applications of Information and Communications Technologies (ICT) to improve the services of the Government towards its primary clients: the citizens. In the last few years, there has been much talk of mobile government or m-government. MGovernment refers to the use of wireless technologies like cellular/mobile phones, laptops and PDAs (Personal Digital Assistants) for offering and delivering government services. MGovernment is not a substitute for e-government, rather it complements it.
1. Benefits of eGovernment
E-Government initiatives contribute to citizen empowerment by making information about government processes and decisions easily available, and by allowing information-sharing among people and organizations, and between citizens and the civil service (Accenture and the Markle Foundation, 2001). Well-informed citizens are better able to hold their governments accountable. Governments are then compelled to improve the quality of services, expand the accessibility of these services, and increase responsiveness to their constituents. Many Government services rely on information passed among different offices within a department or across departments. The large amount of information and paperwork required results in an environment where red tape is rife, the workforce is inefficient and bureaucratic, and the delivery of services is ineffective. With the use of ICT, the government bureaucracy and citizens are both winners in the battle against the paper trail. EGovernment allows government knowledge and data exchange (whether public or secure) to be accessed more easily by the appropriate offices or individuals. In this way, it reduces redundancies in information flows, resulting in overall increased productivity. Another result of the integration of the operations of government agencies is the improvement of transparency in government.
EGovernment minimizes redundant information flows, helps to eliminate duplications of functions, and improves the adherence of public servants to proper government procedures, thereby reducing opportunities for corruption. This, provided it is accompanied by well-informed and active citizens, will assist in limiting the relationship between bureaucracy and corruption and will help lead to a higher sense of accountability among officials.
B. EGovernment Taxonomy
EGovernment can be classified according to different criteria. It can be classified according to its level, its audience and last but certainly not least, according to the delivery mechanism used.
EGovernment can be categorized in five distinct levels depending on how broad it is. These levels are illustrated below (see Figure 1, adapted from Heeks, 2006).
The question of where eGovernment originates is pretty much self-explanatory. Nevertheless, the same does not apply when wondering about who is on the receiving end. The answer that first comes to mind is the citizens. But that is not the whole picture. Apart from citizens, there are other entities that benefit from eGovernment services. According to Backus, “the three main target groups that can be distinguished in eGovernment concepts are government, citizens and businesses/interest groups. The external strategic objectives focus on citizens and businesses and interest groups, the internal objectives focus on government itself” (Backus, 2001).
a) Government to Citizens (G2C)
Government to Citizen activities are those in which the government provides on-line, one-stop access to information and services for citizens. G2C applications allow citizens to ask questions of government agencies and receive answers, as well as to carry out transactions such as:
* File income taxes
* Pay taxes
* Arrange driving tests or renew driver's licenses
* Pay traffic tickets
* Make appointments for vehicle emission inspections and
* Change their address
In addition, a government could:
* Distribute information on the web
* Provide downloadable forms online
* Conduct training (e.g., in some US States, the classes for the driver's tests are offered online)
* Assist citizens in finding employment
* Provide tourist and recreational information
* Provide health advice about safety issues (e.g. warnings for epidemics like the recent H1N1 virus)
* Allow transfer of benefits like food coupons
* File natural disaster relief compensation electronically through the use of smart cards; and the list goes on.
b) Government to Business (G2B)
Government to Business activities refer to those where the government deals with businesses, such as suppliers, using the Internet and other ICTs. It is a bidirectional interaction and transaction: Government to Business (G2B) and Business to Government (B2G). B2G is about businesses selling products and services to government. The most important G2B areas are eProcurement (which is essentially a reverse auction) and the auction of government surpluses.
c) Government to Government (G2G)
Lastly, Government to Government refers to those activities that take place between different government organizations/agencies/entities. Many of these activities aim to improve the effectiveness and efficiency of overall government operations. One such example is Intelink, an intranet that carries classified information shared by different U.S. intelligence agencies.
3. Delivery Mechanism
EGovernment services are provided not only via the Internet. Instead, many other means are often used. In fact, studies and reports indicate that these “other” means of eGovernment services provision show in some cases extremely high utilization. For example:
* Telephony dominates channel usage in some situations: Accenture (2005) reports 63% of industrialized country respondents contacting government by telephone, compared to 31% using the Internet, over a 12-month period.
* In-person visits dominate in other situations: an Australian survey reports half of government contacts to be face-to-face compared to one-fifth undertaken via the Internet (AGIMO 2005).
* Survey data also reflects an ongoing preference for telephone or in-person channels especially for transactional, problem-solving, urgent and complex interactions (AGIMO 2005, Horrigan 2005).
a) Multichannel Examples
Some Governments have embraced this reality and adopted a multichannel approach to the services they offer. In its Progress Reports, the European Commission includes some specific examples:
* In Malta, citizens can access their personal social security records and payments via the internet, and may also opt to be notified about their social security payments via SMS rather than receiving printed payment advice by post. However, the most innovative initiative is the introduction of eGovernment Agents that act as intermediaries to those without access. (ePractice eGovernment Factsheets - Malta, 2009)
* In Austria, all websites that belong to the .gv.at domain are available free of charge or connection fees via wireless hotspots (WLAN), and via public kiosks, thanks to an excellent cooperation between the Austrian Government and two major telecommunication providers. Similar to Malta, Austria also has legislation in place allowing officials to act as intermediaries on behalf of citizens who do not have online access. (ePractice eGovernment Factsheets - Austria, 2009)
* In Spain, 060 is the magic code providing a single access point. Many services provided by different administrations can be accessed via the 060 network, whether they are office-, internet- or phone-based. Citizens can access the network's 2800 physical points of presence in the street or in offices, or reach it on the web, by phone (060) or by SMS. The 060 phone number is intended to replace the over 1000 phone numbers available for citizens to access information of the General Administration of the State. The network is available 24/7 and currently offers 1225 national, regional and local public services. It is worth noting that in August 2007, only 15 months after its creation, the citizen information phoneline 060 had already dealt with 700,000 enquiries. (ePractice eGovernment Factsheets - Spain, 2009)
C. EGovernance Definitions
Just like eGovernment, there is not a single common definition to describe eGovernance. However, UNESCO defines it best: “E-governance is the public sector's use of information and communication technologies with the aim of improving information and service delivery, encouraging citizen participation in the decision-making process and making government more accountable, transparent and effective. E-governance involves new styles of leadership, new ways of debating and deciding policy and investment, new ways of accessing education, new ways of listening to citizens and new ways of organizing and delivering information and services. E-governance is generally considered as a wider concept than e-government, since it can bring about a change in the way citizens relate to governments and to each other. E-governance can bring forth new concepts of citizenship, both in terms of citizen needs and responsibilities. Its objective is to engage, enable and empower the citizen.”
Other definitions include:
* “EGovernance, meaning ‘electronic governance', is using information and communication technologies (ICTs) at various levels of the government and the public sector and beyond, for the purpose of enhancing governance.” (Bedi et al., 2001; Holmes, 2001; Okot-Uma, 2000)
* Whereas according to Backus (2001), eGovernance is defined as the, “application of electronic means in (1) the interaction between government and citizens and government and businesses, as well as (2) in internal government operations to simplify and improve democratic, government and business aspects of Governance.”
D. Digital Government
The term Digital Governance was introduced more than 7 years ago (McIver & Elmagarmid, 2002). Notions such as eGovernment, eGovernance and any future ICT (e.g. Web 2.0 applications) should fall under the Digital Governance umbrella (Schellong, 2009). This term has been preferred by other researchers as well, due to the excessive practice of adding letters like “e” (electronic), “m” (mobile), “u” (ubiquitous) or “2.0” to government-related terms. Schellong goes further to suggest a specific typology (2008), as illustrated below in Figure 2:
EGovernment contains the terms:
* EAdministration - Internal use of ICT
* EServices - External use of ICT
* EDemocracy - Use of ICT for direct public participation in government (decision making or voting)
EGovernance is a completely different branch and deals with government, society and economy.
E. Open Government
In the last decade, there have been many efforts to promote eGovernment. A new initiative has emerged though: Open Government, or OpenGov as it is usually abbreviated. OpenGov efforts have begun not only in the US but also in other countries, like Greece. Although Open Government and eGovernment have similar characteristics and share common goals, the greatest one being the promotion of transparency, they are not the same. Open Government can be argued to be an evolution of eGovernment (Gustetic, 2009), since the only reason it exists as an initiative today is the advances made by eGovernment along with various technological improvements and innovations.
Benchmarking is defined as the process of measuring the performance of an organization, along with the practices it applies in key areas, and subsequently comparing them to those of other organizations. It is widely accepted in the private sector, where it is used as a practical tool for achieving positive results. EGovernment benchmarking means undertaking a review of the comparative performance of eGovernment between nations or agencies. Such studies serve two purposes:
* Internal: Benefit the individual and/or organization undertaking the benchmarking study
* External: Benefit achieved for users of the study.
This project falls into the first category, as described in the Scope and Aims paragraph earlier in the document.
With new expectations about their performance, government entities are being encouraged to look at ways of implementing changes in their practices. Benchmarking provides them with one of their most useful options. In every industry, there are ways of doing things that are broadly recognized as standard practices for that industry. However, every industry has its leaders. These leaders are organizations that outperform others when measured against those standards. They have achieved “best practices” as demonstrated by their results in quality, cost, customer satisfaction and responsiveness.
Benchmarking aims to discover the “best practices” that lead to superior performance. In greater detail, the process of benchmarking eGovernment:
* Fosters accountability for eGovernment projects.
* Helps meet rising public expectations.
* Enables government officials to make more informed decisions and take corrective actions.
* Validates the generated public value.
* Fosters interchange between projects.
Moreover, benchmarking can be distinguished from other traditional forms of evaluation by its attempt to visualize “best practices” through normalizing comparison and by urging public entities to ask themselves what they can do to promote them. Benchmarking enables and motivates them to determine how well their current practices compare to those of others, locate performance gaps, experience best practices in action, and prioritize areas for improvement or other opportunities. It is quite important to note that “Benchmarking is not the same as benchmarks. Benchmarks are performance measures and benchmarking is the action of conducting the evaluation.” (Yasin, 2002)
C. Data Sources
After establishing what benchmarking is, the most common data sources are evaluated.
1. Calculated Indicators
Quite a few benchmarking reports use composite indicators, for example for the purposes of national rankings. Because it is not always clear how they are calculated or researched, composites have been criticized (UIS 2003) for their lack of transparency as well as for their subjectivity. Fortunately, a guide to good practice in the use of composites has been developed (eGEP 2006a:45) and includes:
* Developing a theoretical framework for the composite.
* Identifying and developing relevant variables.
* Standardizing variables to allow comparisons.
* Weighting variables and groups of variables.
* Conducting sensitivity tests on the robustness of aggregated variables.
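The steps above can be illustrated with a minimal sketch. The variables, country values and weights below are hypothetical, and z-score standardization with a simple weighted sum is assumed; actual composites may use different standardization and aggregation rules.

```python
# Illustrative composite-indicator calculation (hypothetical data and weights).
# Steps follow the good-practice guide: standardize variables so they are
# comparable, weight them, then aggregate into a single score per country.

def mean(xs):
    return sum(xs) / len(xs)

def std(xs):
    m = mean(xs)
    return (sum((x - m) ** 2 for x in xs) / len(xs)) ** 0.5

def composite(raw, weights):
    """raw: {variable: {country: value}}; weights: {variable: weight}."""
    scores = {}
    for var, values in raw.items():
        m, s = mean(list(values.values())), std(list(values.values()))
        for country, v in values.items():
            z = (v - m) / s if s else 0.0        # z-score standardization
            scores[country] = scores.get(country, 0.0) + weights[var] * z
    return scores

# Hypothetical variables: online sophistication (%) and internet penetration (%).
raw = {
    "sophistication": {"A": 90, "B": 70, "C": 50},
    "penetration":    {"A": 60, "B": 80, "C": 40},
}
weights = {"sophistication": 0.6, "penetration": 0.4}

ranking = sorted(composite(raw, weights).items(), key=lambda kv: -kv[1])
print(ranking)  # country A ranks first under these assumed weights
```

Sensitivity testing, the last step in the guide, would amount to re-running this calculation with perturbed weights and checking whether the ranking is stable.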
Other than the composite calculation of national rankings, there seems to be little use of calculated indicators in the benchmarking of eGovernment. The most commonly used indicators include:
* Benefit/Cost Ratio.
* Demand/Supply Match.
* Comparative Service Development.
* National Ranking
Some examples, along with the methods used for each indicator, are illustrated in Table 1 below (adapted from Heeks, 2006).

* Benefit/Cost Ratio: Expected financial benefit (impact) / Financial cost (input) (NOIE 2003). Method: Interview (internal self-assessment / internal administrative records).
* Demand/Supply Match: Preference for online channel in particular services versus online sophistication of that service (Graafland-Essers & Ettedgui 2003). Method: Mass citizen survey.
* Comparative Service Development: Stage model level of citizen services versus business services; stage model level of different service cluster areas (Capgemini 2005). Method: Third party Web assessment.
* National Ranking: Composite of features and stage model level for national websites (West 2005); composite of ICT and human infrastructure with stage model level for national/other websites (UN 2005); composite of stage model level, integration and personalization of national websites (Accenture 2005). Method: Third party Web assessment.

Table 1 Calculated Indicators Used in eGovernment Benchmarking (Heeks, 2006).
2. Standard Public Sector Indicators
Apart from calculated indicators, others (Flynn 2002) suggest using a standard indicator set for public sector performance. This set is displayed in Table 2 below (adapted from Flynn 2002).

* The amount of inputs used, e.g. expenditure per capita on IT.
* The ratio of inputs to intermediates, e.g. cost per website produced per year.
* The ratio of inputs to outputs (use), e.g. cost per citizen user of government websites per year.
* The fit between actual outputs (use) and organizational objectives or other set targets, e.g. the extent to which underserved communities are users of eGovernment services.
* The fit between actual impacts and organizational objectives or other set targets, e.g. the extent to which citizens are gaining employment due to use of an eGovernment job search service.
* The quality of intermediates or, more typically, outputs (use), e.g. the quality of eGovernment services as perceived by citizen users.
* The equitability of distribution of outputs and impacts, e.g. the equality of time/money saved by eGovernment service use between rich and poor.

Table 2 Standard Indicators for eGovernment Performance (Flynn 2002)
D. Data Collection Methods

Having described the most commonly used benchmarking methodologies for eGovernment services, the next step is to illustrate how the necessary data is gathered. There are a number of official methods (eGEP 2006b):
* Focus groups
* Internal administrative records
* Internal self-assessment
* Mass user surveys
* Official statistics
* Pop-up surveys
* Third party web assessment
* Web metrics and crawlers
Each of these methods can be compared along four distinct factors (Heeks, 2006). These are:
* Cost: The time and financial cost of the method.
* Value: The value of the method in producing data capable of assessing the downstream value of e-government.
* Comparability: The ease with which data produced can be compared across nations or agencies.
* Data Quality: The level of quality of the method's data. In particular, Heeks suggests using the CARTA (Complete, Accurate, Relevant, Timely, Appropriate) checklist when assessing data quality (2006).
There is also a set of methodologies that are not used as frequently as the ones mentioned earlier. These are:
* Intermediary Surveys.
* Intranet Assessment.
* Public Domain Statistics.
* Public Servant and Politician Surveys.
With new eGovernment services being introduced by Governments every day, benchmarking is gradually becoming a more and more important mechanism for identifying best practices and keeping track of developments; but as the number of offered services increases, data collection becomes more and more difficult. Apart from that, since eGovernment is being expanded to other eGovernment levels, as illustrated earlier in Figure 1, it is only natural that the number of benchmarking studies is increasing fast. Thus, the traditional approach to data collection has become not only a very challenging but also a very resource-intensive task. In order to address this matter, there are projects (eGovMon) which attempt to automate data collection (Research Council of Norway, 2009). In particular, the eGovMon project is co-funded by the Research Council of Norway and “is developing methodology and software for quality evaluation of web services, in particular eGovernment services, concerning four areas:”
Additionally, eGovMon will provide a policy design tool based on simulation models.
Performing the evaluation automatically, even in part, frees up resources and allows a larger number of web sites to be evaluated. Apart from the ability to assess services throughout the levels of eGovernment, it is very important that, by using this approach, outcomes can be gathered and analyzed at much more frequent intervals compared to traditional studies. Yet, there is a drawback. A manual evaluation will always offer a much superior level of detail, together with the ability to address unexpected issues. This is the reason why automated data collection will, for the time being, act as a supplementary tool to manual evaluations.
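A minimal sketch of such automated data collection might look as follows. The indicator checks below are hypothetical illustrations, not the actual eGovMon software; real tools apply far richer accessibility and quality tests to pages they have crawled.

```python
import re

# Hypothetical binary indicator checks applied to a downloaded page's HTML.
# Each check marks a feature as present (1) or absent (0), mirroring the
# presence/absence scoring used in automated web assessments.
INDICATORS = {
    "has_search_form":   lambda html: bool(re.search(r'<input[^>]+type=["\']?search', html, re.I)),
    "has_contact_link":  lambda html: "contact" in html.lower(),
    "declares_language": lambda html: bool(re.search(r'<html[^>]+lang=', html, re.I)),
}

def assess(html):
    """Return {indicator: 0/1} plus an overall score in [0, 1]."""
    results = {name: int(check(html)) for name, check in INDICATORS.items()}
    results["score"] = sum(results.values()) / len(INDICATORS)
    return results

sample = '<html lang="el"><body><a href="/contact">Contact us</a></body></html>'
print(assess(sample))  # two of three indicators present, score of 2/3
```

Because checks like these run in milliseconds per page, they can be applied to thousands of sites at frequent intervals, which is exactly the advantage over manual evaluation discussed above.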
E. Benchmarking Studies
At this point, there are four benchmarking studies that have gained greater traction globally. They represent the longest running efforts to evaluate the development of eGovernment and can be considered successful. They are (in the order in which they will be presented):
* EU eGovernment Benchmark (conducted since 2001 by Capgemini)
* United Nations (conducted since 2002)
* Brown University (conducted since 2001)
* Accenture (conducted since 2000)
It is worthwhile to examine exactly how each of these dominant studies is conducted. Nonetheless, considering that all these reports have been monitoring eGovernment from early on, it is even more intriguing to check if and how the benchmarking methods themselves have evolved and developed over the course of these years.
1. EU eGovernment Benchmark
The European Union realized the significance of eGovernment early on, and it has been on its agenda since 2000. In order to measure progress made by Member States (MS), the European Union eGovernment Benchmark (EUeGovBe) was developed and, in 2002, consulting firm Capgemini was contracted to produce a Benchmark Measurement Report on a yearly basis. The EUeGovBe measures 20 Public (alternatively Basic) Services along with the national portal. For each of them, four indicators are used, with five stages each.
Greece, being a Member State of the EU, is included in these reports, and the evaluation of the current state of the Greek eGovernment services and the analysis of the progress made represent a main focus of this document. For this reason, a detailed analysis of this methodology, in separate chapters, is deemed necessary and is presented later on in the document, following the chapters describing the details of the EU's initiatives around eGovernment along with the framework for eGovernment benchmarking.
2. United Nations eReadiness
The United Nations Department of Economic and Social Affairs has assessed the eGovernment levels of its 191 member nations since 2002. All reports give a detailed description of the research methodology used.
The UN's benchmarking process has evolved before arriving at its current methodology, which is described in detail later on.
The first benchmarking study (2002) introduced the eGovernment Index and examined government web sites of all UN member nations. The web measure assessments were purely quantitative and were conducted by researchers who assigned binary values to specific indicators, representing the presence or absence of the service. In the 2003 survey, the eGovernment Index was renamed to the eGovernment Readiness Index, and qualitative elements were introduced by measuring the level of offered public services.
From there on, the reports for the following years (2004, 2005 and 2008) used the same methodology, with the eGovernment Readiness Index as a centerpiece. The eGovernment Readiness Benchmark is used to compare the UN members and is derived from two indexes:
* eGovernment Readiness
* eParticipation
Both of these are composite themselves. The former, eGovernment Readiness, represents a country's social and economic development along with its eGovernment-related activities and is the average of the following indices (all data come from various UN divisions, supplemented by other entities when needed):
* Telecommunication Infrastructure Index, which is itself derived from the average of five primary indices:
o Internet Users per 1000 persons
o Mobile Phones per 1000 persons
o Online Population
o PCs per 1000 persons
o Telephone Lines per 1000 persons
o Televisions per 1000 persons
* Human Capital Index, which is also a composite index resulting from:
o Adult Literacy Rate (Weighted at 66%)
o Combined Gross Enrolment Ratio (Weighted at 33%)
* Web Measure Index, which is a quantitative analysis of a country's web presence and features. It assesses the following sites in each country:
o Government homepage or National Portal (always measured first)
o Social Welfare
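To make the composition concrete, the averaging and weighting described above can be sketched in a few lines of Python. The function names and all index values below are hypothetical and purely illustrative; real data come from the UN's sources:

```python
# Sketch of the UN eGovernment Readiness composite index (hypothetical values).

def human_capital_index(adult_literacy: float, gross_enrolment: float) -> float:
    """Weighted composite: Adult Literacy Rate at 66%, Gross Enrolment at 33%."""
    return (2 / 3) * adult_literacy + (1 / 3) * gross_enrolment

def telecom_infrastructure_index(primary_indices: list) -> float:
    """Average of the normalized primary indices (internet users, mobile
    phones, online population, PCs, telephone lines, televisions)."""
    return sum(primary_indices) / len(primary_indices)

def egov_readiness_index(web_measure: float, telecom: float, human_capital: float) -> float:
    """eGovernment Readiness as the average of its three component indices."""
    return (web_measure + telecom + human_capital) / 3

# Hypothetical, already-normalized values in [0, 1]:
hci = human_capital_index(adult_literacy=0.96, gross_enrolment=0.90)
tii = telecom_infrastructure_index([0.45, 0.80, 0.50, 0.40, 0.55, 0.70])
print(round(egov_readiness_index(0.62, tii, hci), 4))
```

Since each component is itself normalized to [0, 1], the final index stays directly comparable across countries.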
Just like the EUeGovBe, a five-stage model is used to capture each country's state of eGovernment:
1. Emerging Presence - Limited & Basic Information
2. Enhanced Presence - More Information including Search & Help features.
3. Interactive Presence - Email and Forms
4. Transactional Presence - Online Interaction & Payment
5. Networked Presence - Online Decision Making
As far as eParticipation is concerned, the UN defines it as “the sum total of both the government programs to encourage participation from the citizen and the willingness of the citizen to do so” (United Nations, 2005). The eParticipation Index is based on the analysis of 21 citizen-centric services that focus on three areas:
* EInformation - Information is distributed via various websites (Blogs, Social Networks, Forums, etc).
* EConsultation - Citizens participate directly in Public Policy discussions and feedback is provided.
* EDecision making - Citizens engage in Decision-Making.
The data for the evaluation is gathered during a two-month period by a specialized team of researchers.
The United Nations is currently in the process of updating the methodology used, and some preliminary decisions have been made with regard to introducing new measures in order to assess areas like back-office management, mobile access to government services and inclusiveness.
The latest report, for 2010, is currently pending publication and isn't available at this time.
3. Brown University
The Taubman Center for Public Policy at Brown University has been measuring the evolution of eGovernment since 2001. This report studies the government websites of almost 200 countries. The latest report (2008) covered a total of 1667 national government websites, which were evaluated through a number of measures, grouped into the areas of:
* Service Delivery
* Public Access.
Each web site has been evaluated for the presence of 28 specific features, albeit with no effort to measure the maturity or depth of individual services. Observations were done by native-speaking researchers and in some cases by automatic translation tools, which might explain some of the inconsistencies with regard to the results. Below is a comparative table of Brown University's reports throughout the years, along with the total number of sites measured.
Total Number of Countries
Total Number of Sites Accessed
The data for the evaluation is gathered during a two-month period, and the data sources for the evaluation are executive government offices, judicial offices and major agencies such as administration, health, taxation, education, economic development, foreign affairs, foreign investment, transportation, military, tourism and business regulation. Unlike the EUeGovBe or the United Nations' eReadiness benchmark, results are fairly inconsistent from year to year. Unfortunately, since Brown University's reports don't provide a detailed methodology description, these inconsistencies in country ranks throughout the years cannot be explained. For example, here is Greece's position throughout the eight years:
Table 3 Greece eGovernment ranking (Brown University)
Due to the fact that access to the report's raw data is made available only for a fee, no further analysis with regards to these inconsistencies could be conducted.
(It should be noted that Professor Darrell M. West who was leading this effort in Brown University is no longer affiliated with that institute, so the latest report (2008), and all subsequent ones, was made available from the Brookings Institution where he currently serves as a Director of Governance Studies.)
A report for the year 2009 hasn't been published at this time.
4. Accenture
Accenture is another consulting company (like Capgemini) that has been measuring eGovernment efforts in over 20 countries, most of which are European, since 2000. The data collection is performed by Accenture itself, via its local employees in each country, and all reports are survey reports, meaning that they rely on interviews with citizens and government officials. Each report always describes the methodology that was used; however, Accenture has not used a consistent title for its reports throughout the years. Instead, it has opted for a title that describes each year's findings, making it a little challenging to locate all of them. A list of Accenture's reports, along with their titles, is given in Table 4:
Implementing eGovernment: Rhetoric or Reality
EGovernment Leadership: Rhetoric or Reality
EGovernment Leadership: Realizing the vision
EGovernment Leadership: Engaging the Customer
EGovernment Leadership: High Performance, Maximum Value
Leadership in Customer Service: New Expectations, New Experiences
Leadership in Customer Service: Building Trust
Leadership in Customer Service: Delivering on the Promise
Leadership in Customer Service: Creating Shared Responsibility for Better Outcomes
Table 4 Accenture eGovernment Benchmarking Reports
Although Accenture's reports kept track of a specific number of countries, the methodologies, and consequently the rankings, were not the same each year. Some of the main changes from year to year are highlighted below.
In the first report that started ranking the eGovernment level of each country (2001), there were two indicator sets used:
* Service Maturity (weighted at 70%), which derives from the total number of services offered and their level. Each service could be in one of the following levels:
o Publish - Service is available online.
o Interact - Citizens can submit information online.
o Transact - Government responds electronically to citizens' requests.
* Delivery Maturity (weighted at 30%), which referred to delivery aspects, such as portal capabilities or single point of entry.
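The 2001 overall score can be illustrated as a weighted sum of the two indicator sets. The function name and the sample scores below are hypothetical, not taken from the report; only the 70/30 split comes from the methodology described above:

```python
# Weighted overall maturity score, per the 2001 report's 70/30 split.
SERVICE_WEIGHT = 0.70    # Service Maturity
DELIVERY_WEIGHT = 0.30   # Delivery Maturity

def overall_maturity(service_maturity: float, delivery_maturity: float) -> float:
    """Both inputs expressed as fractions in [0, 1]."""
    return SERVICE_WEIGHT * service_maturity + DELIVERY_WEIGHT * delivery_maturity

# A country at 60% service maturity and 40% delivery maturity:
print(round(overall_maturity(0.60, 0.40), 2))  # 0.54
```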
Then, the countries were ranked and grouped into four categories:
* Innovative Leaders.
* Visionary Followers.
* Steady Achievers.
* Platform Builders.
Following that, the 2002 report substituted the Delivery Maturity indicator with Customer Relationship Management, which had five factors:
* Organization performance.
* Customer offerings.
The weights between the two main indexes remained the same, but in the final ranking, Visionary Followers were renamed to Visionary Challengers and Steady Achievers to Emerging Performers, respectively.
The 2004 report included a new factor for the Customer Relationship Management indicator, named Support.
In 2005 the Service Maturity index's weight changed to 50%. The remaining 50% came from Customer Relationship Management, which changed completely and included four dimensions of customer service:
* Cross Government Service Delivery.
* Proactive Communication (about the services towards users).
A new element of the 2005 report was that it included interviews from citizens, whereas up to that point only government officials were interviewed.
In 2006 Accenture decided to focus on those countries that had performed consistently well, based on the previous years' reports, and to temporarily drop the ranking of individual countries. This was done in order to provide greater visibility for examples of best practice, hence creating an actual benchmarking report.
In 2007, citizens' interviews continued; in fact, they were included as a third main index (weighted at 40%), so as to incorporate their results in the ranking. The weighting of the Service Maturity component was reduced to 10%. Customer Relationship Management included the same aspects as in 2005.
In the last available report, published in 2008, Accenture made some significant changes. It was decided to shift from a quantitative to a qualitative approach, thus dropping the government ranking, as well as the tracking of individual services offered (which peaked in 2004, when 2006 services were measured). Also, no new citizen interviews were conducted, as they were replaced by a second, in-depth analysis of the 2007 set. The main change, however, was that a completely different evaluation system was introduced. Based on best-practice examples from governments around the world, it consists of four key enabling practices:
* Better service starts with better understanding - Differentiate service offerings, based on customer insight and segmentation, to meet people's specific needs and improve equality of outcomes.
* Engage. Listen. Respond. - Actively engage citizens, service users and other stakeholders in defining outcomes and designing services.
* Harness all available resources - Use the experience and resources available across government, non-profits, community groups, private businesses and individual citizens to achieve complex, cross-cutting outcomes.
* Be transparent. Be accountable. Ask for and act on feedback. - Focus on improving transparency, accessibility of information and the means for people to address government directly, so that customers can hold governments accountable for the quality of services delivered.
According to Accenture, these practices “will create opportunities for true customer service transformation that closes the gap between expectations and reality.”
A report for the year 2009 hasn't been published at this time.
III. Europe & eGovernment
The European Union realized early on, the importance of transition into the “digital” era. Since 1999, when the eEurope term was first used, the EU has launched specific initiatives & frameworks in order to prevent the formation of a “digital divide” between Member States with unequal access to new technologies such as the internet.
A. Historical Review
The term eEurope was first used by the European Commission in 1999. Later on, in Lisbon in March 2000, the eEurope initiative was launched, with the intent of “accelerating Europe's transition towards a knowledge based economy and to realize the potential benefits of higher growth, more jobs and better access for all citizens to the new services of the information age” (European Commission, 1999).
Its main objectives were:
* To provide access to an extensive range of services and to a low-cost, effective communications infrastructure for both businesses and citizens.
* To equip citizens with all the necessary skills required for them to live and work in the new information society.
* To give greater priority to life-long learning as a fundamental element of the European social model.
In order to achieve these objectives the following 10 priorities were set:
1. Accelerating E-Commerce
2. Cheaper Internet access
3. eParticipation for the disabled
4. European youth into the digital age
5. Fast Internet for researchers and students
6. Government online
7. Healthcare online
8. Intelligent transport
9. Risk capital for high-tech SMEs
10. Smart cards for secure electronic access
Following the eEurope initiative, the eEurope 2002 Action Plan was agreed by all Member States in 2000. It had 64 targets, broken down into three main focus areas:
1. Faster, cheaper and secure access to the Internet.
2. People development.
3. Increase Internet adoption and usage.
The Action Plan was concluded at the end of 2002 (European Commission, 2003) and proved to be quite successful, promoting Internet connectivity and setting the necessary frameworks in place for electronic communications. However, the effective use of the Internet was not increasing at the same rate as connectivity. For this reason, it was decided that greater focus should be given to the effective use of ICT.
Following the eEurope 2002 Action Plan, the eEurope 2005 Action Plan was launched at the Seville European Council in June 2002. eEurope 2005 aimed to stimulate a positive feedback loop between service development and infrastructure upgrading. In order to achieve that, it focused on creating a more encouraging environment for the deployment of infrastructure (which constitutes the supply side of the broadband equation) and on assisting the development of services (the demand side), all within a secure information infrastructure. Finally, it tried to increase the accessibility of all of the Information Society offerings, so that even people with disabilities could benefit from them.
Following the eEurope 2005 initiative (concluded in 2005), which proved quite successful, the European Information Society in 2010 (i2010) initiative was presented in the i2010 Communication in June 2005. This initiative will provide “an integrated approach to information society and audio-visual policies in the EU, covering regulation, research, and deployment and promoting cultural diversity. It will look for fast and visible results, building on the optimistic outlook for ICT industries and markets. It will encourage fast growth built around the convergence at the levels of networks, services and devices. Its objective will be to ensure that Europe's citizens, businesses and governments make the best use of ICTs in order to improve industrial competitiveness, support growth and the creation of jobs and to help address key societal challenges” (European Commission, 2005).
An important fact is that eGovernment is not only included in the i2010 initiative but constitutes an i2010 Action Plan. In particular, the i2010 eGovernment Action Plan “is designed to make public services more efficient and more modern and to target the needs of the general population more precisely. To do this, it proposes a series of priorities and a roadmap to accelerate the deployment of eGovernment in Europe”.
One of the main aims of the i2010 Action Plan (inter alia) is to greatly increase the provision of tangible benefits for both citizens and businesses that interact with governments at various levels. The five main priorities of the action plan are:
1. No citizen left behind - To promote eGovernment so that every single citizen can benefit from services that are innovative, trustworthy and accessible by everyone.
2. Making efficiency and effectiveness a reality - To improve user satisfaction, increase transparency and accountability, and reduce the administrative burden.
3. Implementing high-impact key services for citizens and businesses - By making 100% of public procurement available online by 2010, and actually using it for 50% of transactions.
4. Putting key enablers in place - To create an interoperable environment for convenient yet secure access to public services across Europe.
5. Strengthening participation and democratic decision-making - Encouraging the use of new and effective tools for public debate and participation in electronic decision-making.
The i2010 is an evolving initiative and undergoes reviews through Annual Reports. The most recent version is available in Europe's Digital Competitiveness Report (Annual Report 2009).
Along with launching the eEurope initiative, the European Commission began the process of defining the indicators necessary to monitor its implementation. A list of indicators was approved:
* Percentage of Basic Services available online.
* Public Use of Government online services for information and submission of forms.
* Percentage of public procurement which can be carried out online.
The latter two are fairly straightforward, but in order to calculate the former, a set of 20 public services, which would be surveyed in great detail, was developed, agreed to by all EU Member States and then published.
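Under the simplifying assumption of a 0/1 availability flag per service (the function name and the sample figures here are invented for illustration), the first indicator reduces to a share of the 20 agreed services:

```python
# "Percentage of Basic Services available online": the share of the 20
# agreed public services whose availability flag equals 1.

def pct_services_online(availability_flags: list) -> float:
    """availability_flags holds one 0/1 entry per basic service."""
    return 100.0 * sum(availability_flags) / len(availability_flags)

# Hypothetical example: 13 of the 20 services are fully available online.
flags = [1] * 13 + [0] * 7
print(pct_services_online(flags))  # 65.0
```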
This benchmark is now known as the EU eGovernment Benchmark (EUeGovBe).
Since 2001, the Commission has utilized the services of consulting firm Capgemini for the measurement of these 20 services, along with their analysis and the possible development of new indicators. In fact, it was recently announced that the European Commission awarded Capgemini a four-year extension of its seven-year eGovernment benchmark contract, meaning that it will keep delivering these reports at least until 2012. The data source used is Eurostat, the Statistical Office of the European Commission.
The survey itself has changed significantly. It began as a simple web-based survey of the Basic Services, conducted on a 6-month basis. Until 2003 the measurements simply reported the new findings. It was not until the 3rd measurement, published in 2003, that comparisons were conducted, along with an assessment of the progress made. In fact, the first report that had a “benchmarking” tone was published as a separate document and wasn't included in the measurement. The progress and evolution of the EUeGovBe is presented in Table 5:
Web-based Survey on Electronic Public Services
Web-based Survey on Electronic Public Services
Web-based Survey on Electronic Public Services
Online Availability of Public Services: How Does Europe Progress? - Comparative Report of the Three first surveys
Online Availability of Public Services: How Is Europe Progressing?
Online Availability of Public Services: How Is Europe Progressing?
The User Challenge Benchmarking The Supply Of Online Public Services
The User Challenge Benchmarking The Supply Of Online Public Services
Smarter, Faster, Better eGovernment
Table 5 EUeGovBe Reports, Prepared by Capgemini
The number of the countries participating in the report evolved from 17 (EU15, Iceland, Norway), to 18 (EU15, Iceland, Norway, Switzerland), to 28 (EU25, Iceland, Norway, Switzerland) to 31 (EU27, Iceland, Norway, Switzerland, Turkey). In the latest 8th measurement, results for Turkey are not included. Instead, Croatia has been added into the report.
1. Methodology Analysis
a) Basic Services
The common set of the 20 Public Services against which the Member States are measured can be grouped into four clusters, or buckets:
* Income Generating
* Permits & licenses
Out of these 20 Public Services, 12 are Services to Citizens and the remaining 8 are Services to Businesses.
Table 6 includes a list of the Services to Citizens along with a short description and the cluster they fall into:
Services to the Citizen
* Job search services
* Social security benefits (unemployment, student benefits)
* Passport and driver's license (Permits & Licenses)
* Car registration (new, used, imported cars)
* New building permit (Permits & Licenses)
* Declaration to police
* Public libraries (search tools, online catalogues)
* Birth, marriage certificates
* Enrollment in higher education (Permits & Licenses)
* Announcement of moving (change of address)
* Health related services (appointments, service availability)
Table 6 Government to Citizens Public Services
Table 7 includes a list of the Services to Businesses along with a short description and the cluster they fall into:
Services to Businesses
* Contributions for employees
* Data submission to statistical offices
* Import and export declarations
* Environment-related permits (mining, logging, fishing permits) (Permits & Licenses)
Table 7 Government to Businesses Public Services
Each of the 20 aforementioned services is graded depending on its online maturity. Up to the report published in 2006, Capgemini used a framework with four online sophistication stages:
* Stage 1: Information. In this level, a service offers a website which contains only informational material.
* Stage 2: One way interaction. In this level, the website provides the user with all the necessary documents and might even allow for electronic submission but the procedure cannot be completed electronically.
* Stage 3: Two way interaction. In this level, the website offers online forms for electronic submission, implying that a way of securely identifying the requestor is in place. Certain procedures can be completed electronically, but not the service as a whole.
* Stage 4: Transaction. In this level, the provided service offers full functionality and can fully replace the “physical” service.
A Stage 0 was also introduced in order to cover two possibilities:
* Inexistence of a website containing information about a service
* A website does exist for the service provider, but contains no information regarding the service it offers.
However, in the 2007 report, a new 5th Stage of online sophistication was introduced. This Stage was originally called Personalization, but in the latest report the name was changed to Targetization, which was deemed more appropriate. At this level, the service can automatically target the user's needs and suggest other services that the user could or should use. Both Stages 4 and 5 are jointly referred to as ‘full online availability'. Figure 3, adapted from Capgemini (2009), illustrates the latest 5-Stage online sophistication levels:
Not all of the sophistication levels apply to each one of the 20 Services that are measured. For example, the highest level that the “Definition of the public service” service can reach is Stage 3. This makes sense because, in this case, in order for the service to be completed, other “physical” actions which cannot be done electronically, such as an investigation, might be required. However, after the inclusion of the Transaction stage, the top levels have changed. These changes are illustrated in Table 8.
Maximal Stage (Until 2006) / Maximal Stage (After 2007):
* Social security benefits
* Declaration to police
* Enrollment in higher education
* Announcement of moving
* Health related services
* Environment-related permits
Table 8 Maximal Sophistication Stages for each Service
It should be highlighted that the “Job Search” and “Public Libraries” services were initially predefined with a maximal Stage of 3 but, based on technology developments, Capgemini defined (from 2007 onwards) a Stage 4 for both of them. Two very important indices are derived from the above, for each service:
* Online Sophistication - The percentage resulting from the division of a service's Current Sophistication Stage by its corresponding Maximal Sophistication Stage.
* Online Availability - A binary entry, with a value of 1 if a service's Online Sophistication equals 100%, meaning the service has already reached its maximal level, and 0 in any other case.
Since 2007, when the 5th Stage was introduced, a service is considered available at both Stage 4 and Stage 5. Effectively, this means that even a service with 80% Online Sophistication (Stage 4 out of a maximal 5) would still be considered available.
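The two per-service indices can be sketched as follows. This is a minimal reading of the methodology, not Capgemini's actual implementation; the availability rule encodes the interpretation that Stages 4 and 5 (or a lower maximal stage, where one applies) both count as available:

```python
# Per-service EUeGovBe indicators (sketch).

def online_sophistication(current_stage: int, maximal_stage: int) -> float:
    """Current Sophistication Stage divided by the Maximal Sophistication Stage."""
    return current_stage / maximal_stage

def online_availability(current_stage: int, maximal_stage: int) -> int:
    """Binary: 1 once a service reaches Stage 4, or its maximal stage if that
    is lower; 0 in any other case."""
    return 1 if current_stage >= min(4, maximal_stage) else 0

# A service at Stage 4 out of a maximal Stage 5:
print(online_sophistication(4, 5))  # 0.8
print(online_availability(4, 5))    # 1 (available despite 80% sophistication)
```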
Apart from the 20 Basic Services, the latest EUeGovBe measurement (2009) introduces the notion of benchmarking the eProcurement services that are offered by each MS. In order to do so with a holistic view, four new composite indices were developed:
* EProcurement Availability Benchmark. A certain set of government websites are visited and for each one, the responsible entity is requested to answer a questionnaire of three weighted questions. The responses add up to 100 and the final indicator is the average of all the websites that were visited in each country.
* EProcurement Development Models. This index categorizes the different approaches that each country has with regard to its National eProcurement Platform into one of four distinct groups:
1. Mandatory National eProcurement Platform. Countries in this group have a centralized e