BENCHMARKING EGOVERNMENT SERVICES

Abstract:

Governments around the world have embraced the use of Information and Communication Technologies (ICTs). This represents a relatively new branch of study within the IT field. EGovernment Services are provided through many different means of access and to a variety of audiences: citizens, businesses or even other governmental entities. After clarifying the definitions of and differences among similar terms (e.g. eGovernance, Digital Government, eDemocracy), this paper examines how eGovernment is measured by analyzing the dominant methodologies in use. Furthermore, following specifically the eGovernment benchmarking methodology used by the European Commission, a greater focus is placed on the evolution of eGovernment in Greece. The findings of this assessment are far from satisfactory: comparing the 20 Basic eGovernment Services offered in Greece between 2007 and 2009, no improvement has taken place. Finally, the measures that governments need to undertake are discussed.

Introduction

In the past years, assisted by the "invasion" of Information Technology in everyday life, governments all over the world have begun widely using information technologies to increase the effectiveness and quality of the services they provide. These initiatives have become known as “electronic government” or eGovernment services. In most cases, when words gain that attractive “e-” in front of them, the popular belief is that they have become “electronic”, whatever that means, even though in some cases it does not make much sense. This confusion is much more obvious when the original word itself has a conceptual and abstract meaning: words like Government and Governance.

Section I presents the most popular definitions, choosing the one that describes each term best, and clarifies the boundaries between the most common terms. Furthermore, the different ways that eGovernment can be classified, depending on the delivery model or the audience, are outlined.

Although the definitions of eGovernment may vary widely, an obvious shared theme emerges: eGovernment involves using information technology, and especially the Internet, to improve the delivery of government services to citizens, businesses, and other government agencies. It acts as an enabler for citizens to interact with and receive services from governments twenty-four hours a day, seven days a week. Monitoring eGovernment development and evaluating its effectiveness is a complex and challenging task, as the phenomenon is new and dynamic.

In Section II, the basics of Benchmarking are presented and its structural elements are analyzed. Focusing on specific examples of methodology used, a set of four dominant practices that represent the longest-running efforts for measuring eGovernment is chosen to be explored further. Using the reports published by each of them on a periodical basis, their inner workings are analyzed and the various developments, changes and evolutions in the methods employed by each one are discussed.

Section III focuses on the benchmarking of eGovernment Services in Europe. In order to recognize how eGovernment has evolved and matured within the European Union, the relevant European directives, initiatives and frameworks for the development of eGovernment Services in the region since 1999 are examined. Following that, the methodology used for benchmarking eGovernment in the European Union is examined in detail. All measuring elements, including some used for the first time in the most recently published report, are evaluated.

Having established what eGovernment is, what Benchmarking is and how its methodologies function, Section IV uses the data from the latest European eGovernment Benchmarking Report, published in November 2009, to assess how the Greek eGovernment landscape has evolved since the previous report in 2007. The results are disappointing: when comparing the 20 Basic eGovernment Services offered in Greece, there was no improvement whatsoever from 2007 to 2009. Following that, Greek performance in the two new indices introduced in the latest report (eProcurement and User Experience) is reported and compared to the respective EU27+ average.

Finally, in Section V, a general overview is provided along with the conclusions about the (lack of) progress in eGovernment in Greece.
Scope and aims

The scope of this project is to analyze how the meaning of eGovernment has evolved in the past few years and then to review the current trends in benchmarking the penetration and sophistication of eGovernment services in Europe and the rest of the world. Furthermore, this project reports on and analyzes the level of eGovernment services offered in Greece. The basic aims of this project are to:

* Define the eGovernment ecosystem, typology and taxonomy.

* Analyze the dominant methodologies for benchmarking eGovernment services.

* Gather and process existing results about eGovernment in Greece, regarding service penetration and sophistication, along with other relevant metrics.

Resources

This project relies heavily on research. In particular, extensive research was made into the different and sometimes contradicting terms that define eGovernment, as well as the rest of the relevant terms, as they appear in academic papers throughout the previous decade. Following that, further research about current and past trends in benchmarking in general, and eGovernment benchmarking in particular, was conducted. From there on, having established what eGovernment is, what the provided services should be, and how they are measured, more research was conducted in order to reveal the actual current level of provided eGovernment services. To accomplish this, reports from many different parties are used. These include reports published both by well-known analyst firms and by government bodies at various levels, ranging from reports issued at a global level, such as those of the United Nations, to local reports issued by the authorities of each country, such as the IT Observatory in Greece.

Typology Convention

Throughout the bibliography, or any other sort of resource for that matter, “electronic” terms do not have a consistent representation. So, just like electronic mail can be found abbreviated in quite a few forms, “electronic” Government is abbreviated to eGovernment, e-Government, E-Government etc.

To avoid this inconsistency, throughout this project the term eGovernment will be used (changed to EGovernment only at the beginning of sentences). This convention will apply to all “electronic” terms used, such as eGovernance.

I. EGovernment, eGovernance and Digital Governance

EGovernment is one more of recent years' “buzzwords”. It is usually paired either with the word “services” at the end or with other terms like eGovernance and Digital Government. Like every other (relatively) new “buzzword”, these terms are used widely by a broad spectrum of individuals, mostly from two different backgrounds: information technology and politics. The first because it is a technological issue; the latter because they have come to realize, even though a little late, that these technologies represent an excellent vehicle for providing a better experience to anyone who interacts with the Government. But what do these terms mean? Do they overlap or conflict with each other? Does one cover or include another?

A. EGovernment Definitions

There is not one unique and commonly accepted definition for eGovernment. It is quite difficult to decide on a specific one, but after the research made, the following definition from the World Bank describes it best:

“E-Government refers to the use by government agencies of information technologies (such as Wide Area Networks, the Internet, and mobile computing) that have the ability to transform relations with citizens, businesses, and other arms of government. These technologies can serve a variety of different ends: better delivery of government services to citizens, improved interactions with business and industry, citizen empowerment through access to information, or more efficient government management. The resulting benefits can be less corruption, increased transparency, greater convenience, revenue growth, and/or cost reductions.” ((AOEMA), 2004)

Although other definitions have been provided, this one is preferred because it is the most concise and the easiest to understand: apart from describing in simple words how eGovernment is utilized, it goes on to offer a very brief, yet to the point, reference to its main advantages.

EGovernment definitions from various other sources are as follows:

* United Nations definition ((AOEMA), 2004): “E-government is defined as utilizing the Internet and the world-wide-web for delivering government information and services to citizens.”

* Global Business Dialogue on Electronic Commerce - GBDe definition ((AOEMA), 2004): “Electronic government (hereafter e-Government) refers to a situation in which administrative, legislative and judicial agencies (including both central and local governments) digitize their internal and external operations and utilize networked systems efficiently to realize better quality in the provision of public services.”

* Gartner Group's definition: “the continuous optimization of service delivery, constituency participation, and governance by transforming internal and external relationships through technology, the Internet and new media.”

* Definition of the Working Group on eGovernment in the Developing World: “E-government is the use of information and communication technologies (ICTs) to promote more efficient and effective government, facilitate more accessible government services, allow greater public access to information, and make government more accountable to citizens. E-government might involve delivering services via the Internet, telephone, community centers (self-service or facilitated by others), wireless devices or other communications systems.”

EGovernment is in the first stages of development. Most governments have already taken or are taking initiatives offering government services online. However, for the true potential of eGovernment to be realized, government needs to restructure and transform its long-entrenched business processes. EGovernment is not simply the process of moving existing government functions to an electronic platform. Rather, it calls for rethinking the way government functions are carried out today in order to improve some processes, to introduce new ones and to replace those that require it. The range of services that may be provided by eGovernment spans from simple information sites to fully interactive experiences where users and government engage in a dialog mediated by information technology.

Internal information systems of government agencies, information kiosks, automated telephone information services, SMS services and other systems all comprise eGovernment services. All these are applications of Information and Communications Technologies (ICT) to improve the services of the Government towards its primary clients: the citizens. In the last few years, there has been much talk of mobile government or mGovernment. MGovernment refers to the use of wireless technologies like cellular/mobile phones, laptops and PDAs (Personal Digital Assistants) for offering and delivering government services. MGovernment is not a substitute for eGovernment; rather, it complements it.

1. Benefits of eGovernment

EGovernment initiatives contribute to citizen empowerment by making information about government processes and decisions easily available, by allowing information-sharing among people and organizations, and between citizens and the civil service (Accenture and the Markle Foundation, 2001). Well-informed citizens are better able to hold their governments accountable. Governments are then compelled to improve the quality of services, expand the accessibility of these services, and increase responsiveness to their constituents. Many government services rely on information passed among different offices within a department or across departments. The large amount of information and paperwork required results in an environment ripe for red tape, where the workforce is inefficient and bureaucratic, and the delivery of services is ineffective. With the use of ICT, the government bureaucracy and citizens are both winners in the battle against the paper trail. EGovernment allows government knowledge and data exchange (whether public or secure) to be accessed more easily by the appropriate offices or individuals, thereby reducing redundant information flows and increasing overall productivity. Another result of the integration of the operations of government agencies is improved transparency in government.

EGovernment minimizes redundant information flows, helps to eliminate duplications of functions, and improves the adherence of public servants to proper government procedures, thereby reducing opportunities for corruption. This, provided it is accompanied by well-informed and active citizens, will assist in limiting the relationship between bureaucracy and corruption and will help lead to a higher sense of accountability among officials.

B. EGovernment Taxonomy

EGovernment can be classified according to different criteria. It can be classified according to its level, its audience and last but certainly not least, according to the delivery mechanism used.

1. Reach

EGovernment can be categorized into the following five distinct levels, depending on its reach. The levels are:

* International

* National

* Regional

* State/Provincial

* Local

These levels are illustrated below (see Figure 1, adapted from Heeks, 2006).

2. Audience

The question of where eGovernment originates is pretty much self-explanatory. Nevertheless, the same does not apply when wondering about who is on the receiving end. The answer that first comes to mind is: the citizens. But that is not the whole picture. Apart from citizens, there are other entities that benefit from eGovernment services. According to Backus, “the three main target groups that can be distinguished in eGovernment concepts are government, citizens and businesses/interest groups. The external strategic objectives focus on citizens and businesses and interest groups, the internal objectives focus on government itself” (Backus, 2001).

a) Government to Citizens (G2C)

Government to Citizen activities are those in which the government provides on-line, one-stop access to information and services to citizens. G2C applications allow citizens to ask questions of government agencies and receive answers, as well as to:

* File income taxes

* Pay taxes

* Arrange driving tests or renew driver's licenses

* Pay traffic tickets

* Make appointments for vehicle emission inspections and

* Change their address

In addition, a government could:

* Distribute information on the web

* Provide downloadable forms online

* Conduct training (e.g., in some US States, the classes for the driver's tests are offered online)

* Assist citizens in finding employment

* Provide tourist and recreational information

* Provide health advice about safety issues (e.g. warnings for epidemics like the recent H1N1 virus)

* Allow transfer of benefits like food coupons

* File natural disaster relief compensation electronically through the use of smart cards; and the list goes on.

b) Government to Business (G2B)

Government to Business activities refer to those in which the government deals with businesses, such as suppliers, using the Internet and other ICTs. It is a bidirectional interaction and transaction: Government to Business (G2B) and Business to Government (B2G). B2G is about businesses selling products and services to the government. The most important G2B areas are eProcurement (which is essentially a reverse auction) and the auction of government surpluses.

c) Government to Government (G2G)

Lastly, Government to Government refers to activities that take place between different government organizations/agencies/entities. Many of these activities aim to improve the effectiveness and efficiency of overall government operations. One such example is Intelink, an intranet that carries classified information shared by different U.S. intelligence agencies.

3. Delivery Mechanism

EGovernment services are not provided only via the Internet. Instead, many other means are often used. In fact, studies and reports indicate that these “other” means of eGovernment service provision show, in some cases, extremely high utilization. For example:

* Telephony dominates channel usage in some situations: Accenture (2005) reports 63% of industrialized country respondents contacting government by telephone; compared to 31% using the Internet over a 12-month period.

* In-person visits dominate in other situations: an Australian survey reports half of government contacts to be face-to-face compared to one-fifth undertaken via the Internet (AGIMO 2005).

* Survey data also reflects an ongoing preference for telephone or in-person channels especially for transactional, problem-solving, urgent and complex interactions (AGIMO 2005, Horrigan 2005).
a) Multichannel Examples

Some Governments have embraced this reality and adopted a multichannel approach to the services they offer. In its Progress Reports, the European Commission includes some specific examples:

* In Malta, citizens can access their personal social security records and payments via the internet, and may also opt to be notified about their social security payments via SMS rather than receiving printed payment advice by post. However, the most innovative initiative is the introduction of eGovernment Agents that act as intermediaries to those without access. (ePractice eGovernment Factsheets - Malta, 2009)

* In Austria, all websites that belong to the .gv.at domain are available free of charge or connection fees via wireless hotspots (WLAN), and via public kiosks, thanks to excellent cooperation between the Austrian Government and two major telecommunications providers. Similarly to Malta, Austria also has legislation in place allowing officials to act as intermediaries for citizens who do not have online access. (ePractice eGovernment Factsheets - Austria, 2009)

* In Spain, 060 is the magic code providing a single access point. Many services provided by different administrations can be accessed via the 060 network, whether they are office-, internet-, or phone-based. Citizens can access the network's 2800 points of presence in the street or their office, on the web, by phone (060) or by SMS. The 060 phone number is intended to replace over 1000 phone numbers available for citizens to access information of the General Administration of the State. The network is available 24/7 and currently offers 1225 national, regional and local public services. It is worth noting that in August 2007, only 15 months after its creation, the citizen information phoneline 060 had already dealt with 700000 enquiries. (ePractice eGovernment Factsheets - Spain, 2009)

C. EGovernance Definitions

Just like eGovernment, there is not a single common definition to describe eGovernance. However, UNESCO defines it best: “E-governance is the public sector's use of information and communication technologies with the aim of improving information and service delivery, encouraging citizen participation in the decision-making process and making government more accountable, transparent and effective. E-governance involves new styles of leadership, new ways of debating and deciding policy and investment, new ways of accessing education, new ways of listening to citizens and new ways of organizing and delivering information and services. E-governance is generally considered as a wider concept than e-government, since it can bring about a change in the way citizens relate to governments and to each other. E-governance can bring forth new concepts of citizenship, both in terms of citizen needs and responsibilities. Its objective is to engage, enable and empower the citizen.”

Other definitions include:

* “EGovernance, meaning ‘electronic governance’, is using information and communication technologies (ICTs) at various levels of the government and the public sector and beyond, for the purpose of enhancing governance.” (Bedi et al., 2001; Holmes, 2001; Okot-Uma, 2000)

* Whereas according to Backus (2001), eGovernance is defined as the “application of electronic means in (1) the interaction between government and citizens and government and businesses, as well as (2) in internal government operations to simplify and improve democratic, government and business aspects of Governance.”

D. Digital Government

The term Digital Governance was introduced more than seven years ago (McIver & Elmagarmid, 2002). Notions such as eGovernment, eGovernance and any future ICT technology (e.g. Web 2.0 applications) should fall under the Digital Governance umbrella (Schellong, 2009). This term has been preferred by other researchers as well, due to the excessive practice of adding letters like “e” (electronic), “m” (mobile), “u” (ubiquitous) or “2.0” to government-related terms. Schellong goes further to suggest a specific typology (2008), as illustrated below in Figure 2:

EGovernment contains the terms:

* EAdministration - Internal use of ICT

* EServices - External use of ICT

* EDemocracy - Use of ICT for direct public participation in government (decision making or voting)

EGovernance is a completely different branch and deals with government, society and economy.

E. Open Government

In the last decade, there have been many efforts to promote eGovernment. A new initiative has emerged, though: Open Government, or OpenGov as it is usually abbreviated. OpenGov efforts have begun not only in the US but also in other countries, like Greece. Although Open Government and eGovernment have similar characteristics and share common goals, the greatest being the promotion of transparency, they are not the same. Open Government can be argued to be an evolution of eGovernment (Gustetic, 2009), since the only reason it exists as an initiative today is the advances made by eGovernment along with various technological improvements and innovations.

II. Benchmarking

A. Definition

Benchmarking is defined as the process of measuring the performance of an organization, along with the practices it applies in key areas, and subsequently comparing them to other organizations. It is widely accepted in the private sector, where it is used as a practical tool for achieving positive results. EGovernment benchmarking means undertaking a review of the comparative performance of eGovernment between nations or agencies. These studies have two purposes:

* Internal: Benefit the individual and/or organization undertaking the benchmarking study

* External: Benefit achieved for users of the study.

This project falls into the first category, as described in the Scope and Aims paragraph earlier in the document.

B. Goals

With new expectations about their performance, government entities are being encouraged to look at ways of implementing changes in their practices. Benchmarking provides them with one of their most useful options. In every industry, there are ways of doing things that are broadly recognized as standard practices for that industry. However, every industry has its leaders. These leaders are organizations that outperform when measured against those standards. They have achieved “best practices” as demonstrated by their results in quality, cost, customer satisfaction and responsiveness.

Benchmarking aims to discover the “best practices” that lead to superior performance. In greater detail, the process of benchmarking eGovernment:

* Fosters accountability for eGovernment projects.

* Helps meet rising public expectations.

* Enables government officials to make more informed decisions and take corrective actions.

* Validates the generated public value.

* Fosters project interchange.

Moreover, benchmarking can be distinguished from other traditional forms of evaluation by its attempt to visualize “best practices” through normalizing comparison and by urging public entities to ask themselves what they can do to promote them. Benchmarking enables and motivates them to determine how well current practices compare to other practices, locate performance gaps, experience best practices in action, and prioritize areas for improvement or other opportunities. It is quite important to note that “Benchmarking is not the same as benchmarks. Benchmarks are performance measures and benchmarking is the action of conducting the evaluation.” (Yasin, 2002)

C. Data Sources

After establishing what benchmarking is, the most common data sources are evaluated.

1. Calculated Indicators

Quite a few benchmarking reports use composite indicators, for example for the purposes of national rankings. Because it is not always clear how they are calculated or researched, composites have been criticized (UIS 2003) for their lack of transparency along with their subjectivity. Fortunately, a guide for good practice in the use of composites has been developed (eGEP 2006a:45) and includes:

* Developing a theoretical framework for the composite.

* Identifying and developing relevant variables.

* Standardizing variables to allow comparisons.

* Weighting variables and groups of variables.

* Conducting sensitivity tests on the robustness of aggregated variables.
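
The construction steps above can be sketched in code. The sketch below min-max standardizes each variable, applies weights, and aggregates into a per-country composite score; the variable names, values and weights are purely illustrative assumptions, not taken from any real study.

```python
# Sketch of composite-indicator construction: standardize variables,
# weight them, and aggregate into a single country score.
# Variable names, values and weights are illustrative only.

def standardize(values):
    """Min-max scale a list of raw values to the 0..1 range."""
    lo, hi = min(values), max(values)
    if hi == lo:
        return [0.0 for _ in values]
    return [(v - lo) / (hi - lo) for v in values]

def composite_scores(countries, variables, weights):
    """countries: list of names; variables: dict name -> list of raw
    values (one per country); weights: dict name -> weight summing to 1."""
    scaled = {name: standardize(vals) for name, vals in variables.items()}
    return {
        country: sum(weights[name] * scaled[name][i] for name in variables)
        for i, country in enumerate(countries)
    }

countries = ["A", "B", "C"]
variables = {
    "online_sophistication": [60.0, 90.0, 75.0],  # e.g. % of max stage
    "internet_penetration":  [40.0, 80.0, 60.0],
}
weights = {"online_sophistication": 0.6, "internet_penetration": 0.4}

scores = composite_scores(countries, variables, weights)
ranking = sorted(countries, key=scores.get, reverse=True)
print(ranking)  # countries ordered by weighted composite score
```

A sensitivity test, as the last bullet suggests, would then rerun the ranking under perturbed weights and check whether the order is robust.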

Other than the composite calculation of national rankings, there seems to be little use of calculated indicators in the benchmarking of eGovernment. The most commonly used indicators include:

* Benefit/Cost Ratio.

* Demand/Supply Match.

* Comparative Service Development.

* National Ranking

Some examples, along with the methods used for each indicator, are illustrated in Table 1 below (adapted from Heeks, 2006).

Benefit/Cost Ratio
  Example: Expected financial benefit (impact) / Financial cost (input) (NOIE 2003)
  Method: Interview (internal self-assessment / internal administrative records)

Demand/Supply Match
  Example: Preference for online channel in particular services versus online sophistication of that service (Graafland-Essers & Ettedgui 2003)
  Method: Mass citizen survey

Comparative Service Development
  Examples: Stage model level of citizen services versus business services (Capgemini 2005); stage model level of different service cluster areas (Capgemini 2005)
  Method: Third party Web assessment

National Ranking
  Examples: Composite of features and stage model level for national websites (West 2005); composite of ICT and human infrastructure with stage model level for national/other websites (UN 2005); composite of stage model level, integration and personalization of national websites (Accenture 2005)
  Method: Third party Web assessment

Table 1 Calculated Indicators Used in eGovernment Benchmarking (Heeks, 2006).
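
To make two of the indicators in Table 1 concrete, the small sketch below computes a benefit/cost ratio and a demand/supply gap. All figures are hypothetical, chosen only to illustrate the arithmetic behind each indicator.

```python
# Illustrative computation of two calculated indicators.
# All figures are hypothetical.

# Benefit/Cost Ratio: expected financial benefit (impact) divided by
# financial cost (input).
expected_benefit = 4_500_000.0  # e.g. yearly savings from an online service
financial_cost = 3_000_000.0    # e.g. yearly cost of running the service
benefit_cost_ratio = expected_benefit / financial_cost

# Demand/Supply Match: citizen preference for the online channel of a
# service versus the online sophistication of that service.
demand = 0.70  # 70% of surveyed citizens prefer the online channel
supply = 0.40  # service scores 40% on the online sophistication scale
demand_supply_gap = demand - supply  # positive gap: demand outstrips supply

print(benefit_cost_ratio)  # 1.5
print(demand_supply_gap)   # about 0.3 (floating point)
```

A ratio above 1.0 indicates that the expected benefit exceeds the cost, while a positive demand/supply gap flags a service whose online sophistication lags behind citizen demand.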

2. Standard Public Sector Indicators

Apart from calculated indicators, others (Flynn 2002) suggest using a standard indicator set for public sector performance. This set is displayed in Table 2 below (adapted from Flynn 2002).

Economy
  Explanation: The amount of inputs used
  eGovernment example: Expenditure per capita on IT
  Benchmark: None

Internal efficiency
  Explanation: The ratio of inputs to intermediates
  eGovernment example: Cost per website produced per year
  Benchmark: Minimization

External efficiency
  Explanation: The ratio of inputs to outputs (use)
  eGovernment example: Cost per citizen user of government websites per year
  Benchmark: Minimization

Internal effectiveness
  Explanation: The fit between actual outputs (use) and organizational objectives or other set targets
  eGovernment example: The extent to which underserved communities are users of eGovernment services
  Benchmark: Maximization

External effectiveness
  Explanation: The fit between actual impacts and organizational objectives or other set targets
  eGovernment example: The extent to which citizens are gaining employment due to use of an eGovernment job search service
  Benchmark: Maximization

Quality
  Explanation: The quality of intermediates or, more typically, outputs (use)
  eGovernment example: The quality of eGovernment services as perceived by citizen users
  Benchmark: Maximization

Equity
  Explanation: The equitability of distribution of outputs and impacts
  eGovernment example: The equality of time/money saved by eGovernment service use between rich and poor
  Benchmark: Maximization

Table 2 Standard Indicators for eGovernment Performance (Flynn 2002)

D. Methodologies

Having described the indicators most commonly used when benchmarking eGovernment services, the next step is to illustrate how the necessary data is gathered. There are a number of official methods (eGEP 2006b):

* Focus groups

* Internal administrative records

* Internal self-assessment

* Mass user surveys

* Official statistics

* Pop-up surveys

* Third party web assessment

* Web metrics and crawlers

Each of these methods can be compared along four distinct factors (Heeks, 2006). These are:

* Cost: The time and financial cost of the method.

* Value: The value of the method in producing data capable of assessing the downstream value of e-government.

* Comparability: The ease with which data produced can be compared across nations or agencies.

* Data Quality: The quality level of the method's data. In particular, Heeks (2006) suggests using the CARTA (Complete, Accurate, Relevant, Timely, Appropriate) checklist when assessing data quality.

There is also a set of methodologies that are not used as frequently as the ones mentioned earlier. These are:

* Intermediary Surveys.

* Intranet Assessment.

* Public Domain Statistics.

* Public Servant and Politician Surveys.

1. Automation

With new eGovernment services being introduced by governments every day, benchmarking is gradually becoming a more and more important mechanism for identifying best practices and keeping track of developments. But as the number of offered services increases, data collection becomes more and more difficult. Apart from that, since eGovernment is expanding to other levels, as illustrated earlier in Figure 1, it is only natural that the number of benchmarking studies is increasing fast. Thus, the traditional approach to data collection has become not only a very challenging but also a very resource-intensive task. In order to address this matter, there are projects, such as eGovMon, which attempt to automate the data collection (Research Council of Norway, 2009). In particular, the eGovMon project is co-funded by the Research Council of Norway and “is developing methodology and software for quality evaluation of web services, in particular eGovernment services, concerning four areas:

* Accessibility

* Transparency

* Efficiency

* Impact

Additionally, eGovMon will provide a policy design tool based on simulation models.”

Performing the evaluation automatically, even in part, frees up resources and allows a larger number of web sites to be evaluated. Apart from the ability to assess services throughout the levels of eGovernment, it is very important that with this approach outcomes can be gathered and analyzed at much more frequent intervals compared to traditional studies. Yet, there is a drawback: a manual evaluation will always offer a much superior level of detail, together with the ability to address unexpected issues. This is the reason why automated data collection will, for the time being, act as a supplemental tool to manual evaluations.
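
A minimal sketch of what the automated side of such an evaluation might look like is given below: a page's HTML is scanned for keyword-based binary indicators. The indicator names and keyword patterns are invented for illustration and are far cruder than what a project like eGovMon actually does.

```python
import re

# Crude sketch of automated web assessment: each indicator is marked
# present (1) or absent (0) based on keyword matches in the page HTML.
# Indicator names and keyword patterns are illustrative only.

INDICATORS = {
    "downloadable_forms": r"\.pdf|download",
    "online_submission":  r"<form",
    "contact_details":    r"contact|e-?mail",
    "search_function":    r"search",
}

def assess_page(html):
    """Return a dict of binary indicator values for one page."""
    text = html.lower()
    return {name: int(bool(re.search(pattern, text)))
            for name, pattern in INDICATORS.items()}

sample = """
<html><body>
  <a href="/forms/tax-return.pdf">Download tax form</a>
  <form action="/submit" method="post">...</form>
  <p>Contact us: info@example.gov</p>
</body></html>
"""

result = assess_page(sample)
print(result)  # 1/0 flags per indicator for this page
```

Running such checks across thousands of sites is cheap, which is exactly why automated collection scales where manual assessment does not; the trade-off, as noted above, is that keyword matching cannot capture nuance the way a human evaluator can.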

E. Benchmarking Studies

At this point, there are four benchmarking studies that have gained the greatest traction globally. They represent the longest-running efforts to evaluate the development of eGovernment and can be considered successful. They are (in the order in which they will be presented):

* EU eGovernment Benchmark (conducted since 2001 by Capgemini)

* United Nations (conducted since 2002)

* Brown University (conducted since 2001)

* Accenture (conducted since 2000)

It is worthwhile to examine exactly how each of these dominating studies is conducted. Moreover, considering that all of these reports have been monitoring eGovernment from early on, it is even more intriguing to check whether and how the benchmarking methods themselves have evolved over the course of these years.

1. EU eGovernment Benchmark

The European Union realized the significance of eGovernment early on, and it has been on its agenda since 2000. In order to measure the progress made by Member States (MS), the European Union eGovernment Benchmark (EUeGovBe) was developed, and in 2002 the consulting firm Capgemini was contracted to produce a Benchmark Measurement Report on a yearly basis. The EUeGovBe measures 20 Public (alternatively, Basic) Services along with the national portal. For each of them, four indicators are used, with five stages each.

Greece being a Member State of the EU is included in these reports and the evaluation of the current state of the Greek eGovernment services and the analysis of the progress made, represent a main focus of this document. For this reason, a detailed analysis of this methodology, in separate chapters, is deemed necessary and is presented later on in the document, following the chapters describing the details of the EU's initiatives around eGovernment along with the framework for eGovernment benchmarking.

2. United Nations eReadiness

The United Nations Department of Economic and Social Affairs has assessed eGovernment levels among the UN's 191 member nations since 2002. All reports give a detailed description of the research methodology used.

The UN's benchmarking process has evolved before arriving at its current methodology, which is described in detail later on.

The first benchmarking study (2002) introduced the eGovernment Index and examined government websites of all UN member nations. The web measure assessments were purely quantitative and were conducted by researchers who assigned binary values to specific indicators, representing the presence or absence of the service. In the 2003 survey, the eGovernment Index was renamed the eGovernment Readiness Index, and qualitative elements were introduced by measuring the level of offered public services.

From there on, the reports for the following years (2004, 2005 and 2008) used the same methodology, with the eGovernment Readiness Index as a centerpiece. The eGovernment Readiness Benchmark is used to compare the UN members and is derived from two indexes:

* eGovernment Readiness

* eParticipation

Both of these are composite indexes themselves. The former, eGovernment Readiness, represents a country's social and economic development along with its eGovernment-related activities and is the average of the following indices (all data come from various UN divisions, supplemented by other entities when needed):

* Telecommunication Infrastructure Index, which is itself derived from the average of six primary indices:

o Internet Users per 1000 persons

o Mobile Phones per 1000 persons

o Online Population

o PCs per 1000 persons

o Telephone Lines per 1000 persons

o Televisions per 1000 persons

* Human Capital Index, which is also a composite index resulting from:

o Adult Literacy Rate (Weighted at 66%)

o Combined Gross Enrolment Ratio (Weighted at 33%)

* Web Measure Index, which is a quantitative analysis of a country's web presence and features. It assesses the following sites in each country:

o Government homepage or National Portal (always measured first)

o Education

o Health

o Labor

o Social Welfare

o Finance

Just like the EUeGovBe, a five-stage model is used to capture each country's state of eGovernment:

1. Emerging Presence - Limited & Basic Information

2. Enhanced Presence - More Information including Search & Help features.

3. Interactive Presence - Email and Forms

4. Transactional Presence - Online Interaction & Payment

5. Networked Presence - Online Decision Making
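As a minimal sketch of the composite structure described above (function names and sample values are illustrative, each index is assumed normalized to 0..1, and the stated 66%/33% weights are taken as 2/3 and 1/3):

```python
def telecom_infrastructure_index(internet_users, mobile_phones, online_population,
                                 pcs, telephone_lines, televisions):
    """Average of the six primary indices (each assumed normalized to 0..1)."""
    parts = [internet_users, mobile_phones, online_population,
             pcs, telephone_lines, televisions]
    return sum(parts) / len(parts)

def human_capital_index(adult_literacy, gross_enrolment):
    """Adult Literacy Rate weighted at 2/3, Combined Gross Enrolment Ratio at 1/3."""
    return (2.0 * adult_literacy + gross_enrolment) / 3.0

def egov_readiness_index(tii, hci, web_measure):
    """eGovernment Readiness: the plain average of the three component indices."""
    return (tii + hci + web_measure) / 3.0
```

For instance, a country with a Telecommunication Infrastructure Index of 0.3, a Human Capital Index of 0.8 and a Web Measure Index of 0.5 would score (0.3 + 0.8 + 0.5) / 3 ≈ 0.53.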

As far as eParticipation is concerned, the UN defines it as “the sum total of both the government programs to encourage participation from the citizen and the willingness of the citizen to do so” (United Nations, 2005). The eParticipation Index is based on the analysis of 21 citizen-centric services that focus on three areas:

* EInformation - Information is distributed via various websites (Blogs, Social Networks, Forums, etc).

* EConsultation - Citizens participate directly in Public Policy discussions and feedback is provided.

* EDecision making - Citizens engage directly in decision-making.

The data for the evaluation is gathered during a two month period by a specialized team of researchers.

The United Nations is currently in the process of updating the methodology used [1], and some preliminary decisions have been made with regard to introducing new measures to assess areas like back-office management, mobile access to government services and inclusiveness.

The latest report, for 2010, is currently pending publication[2] and isn't available at this time.

3. Brown University

The Taubman Center for Public Policy at Brown University has been measuring the evolution of eGovernment since 2001. This report studies the government websites of almost 200 countries. The latest report (2008) covered a total of 1667 national government websites, which were evaluated through a number of measures, grouped into the areas of:

* Availability

* Service Delivery

* Public Access.

Each website was evaluated for the presence of 28 specific features, albeit with no effort to measure the maturity or depth of individual services. Observations were made by native-speaking researchers and, in some cases, by automatic translation tools, which might explain some of the inconsistencies in the results. Below is a comparative table of Brown University's reports through the years, along with the total number of sites measured.

Year | Total Number of Countries | Total Number of Sites Accessed
2001 | 198 | 2288
2002 | 198 | 1197
2003 | 198 | 2166
2004 | 198 | 1935
2005 | 198 | 1796
2006 | 198 | 1782
2007 | 198 | 1687
2008 | 198 | 1667

The data for the evaluation are gathered during a two-month period, and the data sources for the evaluation are executive government offices, judicial offices and major agencies such as administration, health, taxation, education, economic development, foreign affairs, foreign investment, transportation, military, tourism and business regulation. Unlike the EUeGovBe or the United Nations' eReadiness benchmark, the results are fairly inconsistent from year to year. Unfortunately, since Brown University's reports do not provide a detailed methodology description, these inconsistencies in country ranks throughout the years cannot be explained. For example, here is Greece's position throughout the eight years:

Table 3 Greece eGovernment ranking (Brown University)

Due to the fact that access to the report's raw data is made available only for a fee, no further analysis with regards to these inconsistencies could be conducted.

(It should be noted that Professor Darrell M. West, who was leading this effort at Brown University, is no longer affiliated with that institution, so the latest report (2008), and all subsequent ones, was made available by the Brookings Institution, where he currently serves as Director of Governance Studies.)

A report for the year 2009 hasn't been published at this time.
4. Accenture

Accenture is another consulting company (like Capgemini); it has been measuring eGovernment efforts in over 20 countries, most of which are European, since 2000. The data collection is done by Accenture itself, via its local employees in each country, and all reports are survey reports, meaning that they rely on interviews with citizens and government officials. Each report describes the methodology that was used; however, Accenture has not used a consistent title for its reports throughout the years, instead opting for a title that describes each year's findings, which makes it a little challenging to locate all of them. A list of Accenture's reports, along with their titles, is given in Table 4:

Year | Title | Countries
2000 | Implementing eGovernment: Rhetoric or Reality | N/A
2001 | eGovernment Leadership: Rhetoric or Reality | 22
2002 | eGovernment Leadership: Realizing the Vision | 23
2003 | eGovernment Leadership: Engaging the Customer | 22
2004 | eGovernment Leadership: High Performance, Maximum Value | 22
2005 | Leadership in Customer Service: New Expectations, New Experiences | 22
2006 | Leadership in Customer Service: Building Trust | 21
2007 | Leadership in Customer Service: Delivering on the Promise | 22
2008 | Leadership in Customer Service: Creating Shared Responsibility for Better Outcomes | 21

Table 4 Accenture eGovernment Benchmarking Reports

Although Accenture's reports tracked a fairly stable set of countries, the methodologies, and consequently the rankings, were not the same each year. Some of the main changes from year to year are highlighted below.

In the first report that started ranking the eGovernment level of each country (2001), there were two indicator sets used:

* Service Maturity (weighted at 70%), which derives from the total number of services offered and their level. Each service could be in one of the following levels:

o Publish - Service is available online.

o Interact - Citizens can submit information online.

o Transact - Government responds electronically to citizens' requests.

* Delivery Maturity (weighted at 30%), which referred to delivery aspects, such as portal capabilities or single point of entry.

Then, the countries were ranked and grouped into four categories:

* Innovative Leaders.

* Visionary Followers.

* Steady Achievers.

* Platform Builders.
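Under the 2001 scheme, a country's overall score reduces to a simple weighted combination of the two indicator sets. A sketch (the function name and the 0..100 input scale are assumptions; Accenture does not publish its exact formula):

```python
def overall_score_2001(service_maturity, delivery_maturity):
    """Accenture's 2001 ranking score: Service Maturity weighted at 70%,
    Delivery Maturity at 30%. Inputs are assumed to be percentages (0..100)."""
    return 0.7 * service_maturity + 0.3 * delivery_maturity
```

For example, a country with 60% Service Maturity and 40% Delivery Maturity scores 0.7 × 60 + 0.3 × 40 = 54.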

Following that, the 2002 report substituted Customer Relationship Management for the Delivery Maturity indicator; the new indicator had five factors:

* Insight.

* Interaction.

* Organization performance.

* Customer offerings.

* Networks.

The weights between the two main indexes remained the same, but in the final ranking, Visionary Followers were renamed Visionary Challengers and Steady Achievers Emerging Performers, respectively.

The 2004 report included a new factor for the Customer Relationship Management indicator, named Support.

In 2005 the Service Maturity index changed weight to 50%. The remaining 50% came from the Customer Relationship Management which changed completely and included four dimensions of customer service:

* Citizen-Centered.

* Multichannel.

* Cross Government Service Delivery.

* Proactive Communication (about the services towards users).

A new element of the 2005 report was that it included interviews from citizens, whereas up to that point only government officials were interviewed.

In 2006, Accenture decided to focus on the countries that had performed consistently well, based on the previous years' reports, and to temporarily drop the ranking of individual countries. This was done in order to provide greater visibility for examples of best practice, hence creating an actual benchmarking report.

In 2007, citizen interviews continued. In fact, they were included as a third main index (weighted at 40%), so as to incorporate their results in the ranking. The weighting of the Service Maturity component was reduced to 10%. Customer Relationship Management included the same aspects as in 2005.

In the last available report, published in 2008, Accenture made some significant changes. It was decided to shift from a quantitative to a qualitative approach, thus dropping the government ranking, as well as the tracking of individual services offered (which peaked in 2004, when 2006 services were measured). Also, no new citizen interviews were conducted; instead, a second, in-depth analysis of the 2007 interview set was used. The main change, however, was that a completely different evaluation system was introduced. Based on best-practice examples from governments around the world, it consists of four key enabling practices:

* Better service starts with better understanding - Differentiate service offerings, based on customer insight and segmentation, to meet people's specific needs and improve equality of outcomes.

* Engage. Listen. Respond. - Actively engage citizens, service users and other stakeholders in defining outcomes and designing services.

* Harness all available resources - Use the experience and resources available across government, non-profits, community groups, private businesses and individual citizens to achieve complex, cross - cutting outcomes.

* Be transparent. Be accountable. Ask for and act on feedback. - Focus on improving transparency, accessibility of information and the means for people to address government directly, so that customers can hold governments accountable for the quality of services delivered.

According to Accenture, these practices “will create opportunities for true customer service transformation that closes the gap between expectations and reality.”

A report for the year 2009 hasn't been published at this time.

III. Europe & eGovernment

The European Union realized early on the importance of the transition into the “digital” era. Since 1999, when the eEurope term was first used, the EU has launched specific initiatives and frameworks in order to prevent the formation of a “digital divide” between Member States with unequal access to new technologies such as the Internet.

A. Historical Review
1. eEurope

The term eEurope was first used by the European Commission in 1999. Later, in Lisbon in March 2000, the eEurope initiative was launched, with the intent of “accelerating Europe's transition towards a knowledge based economy and to realize the potential benefits of higher growth, more jobs and better access for all citizens to the new services of the information age” (European Commission, 1999).

Its main objectives were:

* To provide access to an extensive range of services and to a low-cost, effective communications infrastructure for both businesses and citizens.

* To equip citizens with all the necessary skills required for them to live and work in the new information society.

* To give greater priority to life-long learning as a fundamental element of the European social model.

In order to achieve these objectives the following 10 priorities were set:

1. Accelerating E-Commerce

2. Cheaper Internet access

3. eParticipation for the disabled

4. European youth into the digital age

5. Fast Internet for researchers and students

6. Government online

7. Healthcare online

8. Intelligent transport

9. Risk capital for high-tech SMEs

10. Smart cards for secure electronic access

2. eEurope2002

Following the eEurope initiative, the eEurope 2002 Action Plan was agreed by all Member States in 2000. It had 64 targets, broken down into three main focus areas:

1. Faster, cheaper and secure access to the Internet.

2. People development.

3. Increase Internet adoption and usage.

The Action Plan was concluded at the end of 2002 (European Commission, 2003) and proved quite successful, promoting Internet connectivity and setting the necessary frameworks in place for electronic communications. However, the effective use of the Internet was not increasing at the same rate as connectivity. For this reason, it was decided that greater focus should be given to the effective use of ICT.

3. eEurope2005

Following the eEurope 2002 Action Plan, the eEurope 2005 Action Plan was launched at the Seville European Council in June 2002. eEurope 2005 aimed to stimulate a positive feedback loop between service development and infrastructure upgrading. In order to achieve that, it focused on creating a more encouraging environment for the deployment of infrastructure (the supply side of the broadband equation) and on assisting the development of services (the demand side), all within a secure information infrastructure. Finally, it tried to increase the accessibility of all Information Society offerings, so that even people with disabilities could benefit from them.

4. i2010

Following the eEurope2005 initiative (concluded in 2005), which proved quite successful, the European Information society in 2010 (i2010) initiative was presented in the i2010 Communication[3] in June 2005. This initiative will provide “an integrated approach to information society and audio-visual policies in the EU, covering regulation, research, and deployment and promoting cultural diversity. It will look for fast and visible results, building on the optimistic outlook for ICT industries and markets. It will encourage fast growth built around the convergence at the levels of networks, services and devices. Its objective will be to ensure that Europe's citizens, businesses and governments make the best use of ICTs in order to improve industrial competitiveness, support growth and the creation of jobs and to help address key societal challenges” (European Commission, 2005).

An important fact is that eGovernment is not only included in the i2010 initiative but constitutes an i2010 Action Plan. In particular, the i2010 eGovernment Action Plan “is designed to make public services more efficient and more modern and to target the needs of the general population more precisely. To do this, it proposes a series of priorities and a roadmap to accelerate the deployment of eGovernment in Europe”[4].

One of the main aims of the i2010 Action Plan (inter alia) is to greatly increase the provision of tangible benefits for both citizens and businesses that interact with governments at various levels. The five main priorities of the action plan are:

1. No citizen left behind - To promote eGovernment so that every single citizen can benefit from services that are innovative, trustworthy and accessible by everyone.

2. Making efficiency and effectiveness a reality - To improve user satisfaction, increase transparency and accountability, and reduce the administrative burden.

3. Implementing high-impact key services for citizens and businesses - By making 100% of public procurement available online by 2010 and actually using it for 50% of transactions.

4. Putting key enablers in place - To create an interoperable environment for convenient yet secure access to public services across Europe.

5. Strengthening participation and democratic decision-making - Engaging the usage of new and effective tools for public debate and participation in electronic decision-making

The i2010 is an evolving initiative and undergoes reviews through Annual Reports. The most recent version is available in Europe's Digital Competitiveness Report (Annual Report 2009).
B. EUeGovBe

Along with launching the eEurope initiative, the European Commission began the process of defining the indicators necessary to monitor its implementation. A list of indicators was approved:

* Percentage of Basic Services available online.

* Public Use of Government online services for information and submission of forms.

* Percentage of public procurement which can be carried out online.

The latter two are fairly straightforward, but in order to calculate the former, a set of 20 public services, to be surveyed in great detail, was developed, agreed to by all EU Member States and then published.

This benchmark is now known as the EU eGovernment Benchmark (EUeGovBe).

Since 2001, the Commission has utilized the services of consulting firm Capgemini for the measurement of these 20 services, along with their analysis and the possible evolution of new indicators. In fact, it was recently announced[5] that the European Commission awarded Capgemini a four-year extension of its seven-year eGovernment benchmark contract, meaning that it will keep delivering these reports at least until 2012. The data source used is Eurostat, the Statistical Office of the European Commission.

The survey itself has changed significantly. It began as a simple web-based survey of the Basic Services, conducted on a six-month basis. Until 2003, the measurements simply reported the new findings. It was not until the 3rd measurement, published in 2003, that comparisons were conducted, along with an assessment of the progress made. In fact, the first report that had a “benchmarking” tone was published as a separate document and was not included in the measurement. The progress and evolution of the EUeGovBe is presented in Table 5.

# | Publish Date | Title | Countries
1 | November 2001 | Web-based Survey on Electronic Public Services | 17
2 | April 2002 | Web-based Survey on Electronic Public Services | 18
3 | February 2003 | Web-based Survey on Electronic Public Services | 18
N/A | January 2003 | Online Availability of Public Services: How Does Europe Progress? - Comparative Report of the Three First Surveys | 18
4 | January 2004 | Online Availability of Public Services: How Is Europe Progressing? | 18
5 | March 2005 | Online Availability of Public Services: How Is Europe Progressing? | 28
6 | June 2006 | The User Challenge Benchmarking The Supply Of Online Public Services | 28
7 | September 2007 | The User Challenge Benchmarking The Supply Of Online Public Services | 31
8 | November 2009 | Smarter, Faster, Better eGovernment | 31

Table 5 EUeGovBe Reports, Prepared by Capgemini

The number of the countries participating in the report evolved from 17 (EU15, Iceland, Norway), to 18 (EU15, Iceland, Norway, Switzerland), to 28 (EU25, Iceland, Norway, Switzerland) to 31 (EU27, Iceland, Norway, Switzerland, Turkey). In the latest 8th measurement, results for Turkey are not included. Instead, Croatia has been added into the report.

1. Methodology Analysis
a) Basic Services

The common set of 20 Public Services against which the Member States are measured can be grouped into four clusters, or buckets:

* Income Generating

* Registration

* Returns

* Permits & licenses

Out of these 20 Public Services[6], 12 are Services to Citizens and the remaining 8 are Services to Businesses.

Table 6 lists the Services to Citizens along with a short description and the cluster each falls into:

Services to the Citizen | Cluster | Description
Income taxes | Income Generating | Declaration, notification
Job search | Returns | Job search services
Social security benefits | Returns | Unemployment, student benefits
Personal document | Permits & Licenses | Passport and driver's license
Car registration | Registration | New, used, imported cars
Building permission | Permits & Licenses | New building permit
Declaration to police | Returns | Theft
Public libraries | Returns | Search tools, online catalogues
Certificates | Registration | Birth, marriage certificates
Enrollment in higher education | Permits & Licenses | Enrolment in higher education
Announcement of moving | Registration | Change of address
Health related services | Returns | Appointments, service availability

Table 6 Government to Citizens Public Services

Table 7 lists the Services to Businesses along with a short description and the cluster each falls into:

Services to Businesses | Cluster | Description
Social contributions | Income Generating | Contributions for employees
Corporate tax | Income Generating | Declaration, notification
VAT | Income Generating | Declaration, notification
Company registration | Registration | New company
Statistical data | Registration | Data submission to statistical offices
Customs declaration | Income Generating | Import and export declarations
Environment-related permits | Permits & Licenses | Mining, logging, fishing permits
Public procurement | Returns | Online purchase

Table 7 Government to Businesses Public Services

Each of the 20 aforementioned services is graded according to its online maturity. Up to the report published in 2006[7], Capgemini used a framework with four online sophistication stages:

* Stage 1: Information. In this level, a service offers a website which contains only informational material.

* Stage 2: One way interaction. In this level, the website provides the user with all the necessary documents and might even allow for electronic submission but the procedure cannot be completed electronically.

* Stage 3: Two-way interaction. At this level, the website offers online forms for electronic submission, implying that a way of securely identifying the requestor is in place. Certain procedures can be completed electronically, but not the service as a whole.

* Stage 4: Transaction. In this level, the provided service offers full functionality and can fully replace the “physical” service.

A Stage 0 was also introduced in order to cover two possibilities:

* The absence of a website containing information about the service.

* A website does exist for the service provider, but contains no information regarding the service it offers.

However, in the 2007 report, a new, fifth Stage of online sophistication was introduced. This Stage was originally called Personalization, but in the latest report the name was changed to Targetization, which was deemed more appropriate. At this level, the service can automatically target the user's needs and suggest other services that the user could or should use. Stages 4 and 5 are jointly referred to as ‘full online availability'. Figure 3, adapted from Capgemini (2009), illustrates the latest five-stage online sophistication levels:

Not all of the sophistication levels apply to each of the 20 Services that are measured. For example, the highest level that the “Declaration to police” service can reach is Stage 3. This makes sense because, in order for the service to be completed, other “physical” actions which cannot be done electronically, such as an investigation, might be required. However, after the introduction of the fifth stage, the top levels have changed. These changes are illustrated in Table 8.

Public Services | Maximal Stage (After 2007) | Maximal Stage (Until 2006)
Income taxes | 5 | 4
Job search | 4 | 4
Social security benefits | 5 | 4
Personal document | 5 | 3
Car registration | 4 | 4
Building permission | 4 | 4
Declaration to police | 3 | 3
Public libraries | 5 | 4
Certificates | 4 | 3
Enrollment in higher education | 4 | 4
Announcement of moving | 4 | 3
Health related services | 4 | 4
Social contributions | 4 | 4
Corporate tax | 4 | 4
VAT | 4 | 4
Company registration | 4 | 4
Statistical data | 5 | 3
Customs declaration | 4 | 4
Environment-related permits | 5 | 4
Public procurement | 4 | 4

Table 8 Maximal Sophistication Stages for each Service

It should be highlighted that the “Job Search” and “Public Libraries” services were initially capped at a maximal Stage 3, but based on technology developments, Capgemini defined (from 2007 onwards) a Stage 4 for both of them. Two very important indices are derived from the above, for each service:

* Online Sophistication - The percentage resulting from dividing a service's Current Sophistication Stage by the corresponding Maximal Sophistication Stage.

* Online Availability - A binary entry, with a value of 1 if a service's Online Sophistication equals 100%, meaning the service has already reached its maximal level, and 0 in any other case.

Since 2007, when the fifth Stage was introduced, a service is considered available at both Stage 4 and Stage 5. Effectively, this means that even a service with 80% Online Sophistication (i.e. at Stage 4 with a maximal Stage of 5) would still be considered available.
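A minimal sketch of the two indices, assuming the post-2007 reading that Stage 4 or above (or the maximal stage, for services capped below Stage 4) counts as available:

```python
def online_sophistication(current_stage, maximal_stage):
    """Online Sophistication: current stage divided by the service's
    maximal stage, expressed as a percentage."""
    return 100.0 * current_stage / maximal_stage

def online_availability(current_stage, maximal_stage, since_2007=True):
    """Online Availability: 1 if the service counts as fully available, else 0.
    Pre-2007 rule: available only at 100% sophistication.
    Post-2007 rule (assumed here): Stages 4 and 5 jointly count as available,
    and services capped below Stage 4 are available at their maximal stage."""
    if since_2007:
        return 1 if current_stage >= min(4, maximal_stage) else 0
    return 1 if current_stage == maximal_stage else 0
```

For example, an income-tax service at Stage 4 (maximal Stage 5) scores 80% Online Sophistication yet still counts as available under the post-2007 rule.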

b) EProcurement

Apart from the 20 Basic Services, the latest EUeGovBe measurement (2009) introduces the notion of benchmarking the eProcurement services that are offered by each MS. In order to do so with a holistic view, four new composite indices were developed:

* EProcurement Availability Benchmark. A certain set of government websites are visited and for each one, the responsible entity is requested to answer a questionnaire of three weighted questions. The responses add up to 100 and the final indicator is the average of all the websites that were visited in each country.

* EProcurement Development Models. This index categorizes the approach that each country takes with regard to its National eProcurement Platform into one of four distinct groups:

1. Mandatory National eProcurement Platform. Countries in this group have a centralized eProcurement policy in place and have made usage of the National Platform mandatory. This approach encourages coordination and centralization, but it does not mean that other platforms, at regional or local levels, are forbidden.

2. Mandatory National eProcurement Portal. Countries in this group require all tenders to be published on a single National Portal. Most of the time, these Portals do not offer any further eProcurement services.

3. Non-Mandatory National eProcurement Platform/Portal. Countries in this group do have a National eProcurement solution and do recommend using it, but do not require it.

4. No National eProcurement Platform/Portal. This group is for the countries that have yet to develop an eProcurement solution.

* EProcurement Pre-Award Process Benchmark. This index is calculated from three subphases:

1. ENotification (weighted at 0.36). A five-question, evenly weighted questionnaire about the notification phase of the procurement process is used.

2. ESubmission (weighted at 0.5). As above, the only difference being that the questionnaire has seven questions.

3. EAwards (weighted at 0.14). Similar, but with only two questions.

The maximum grade for each of the phases is 100.
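The subphase weighting above can be sketched as follows (function names are illustrative; each subphase score is taken as the share of positive answers on its evenly weighted questionnaire, scaled to 100):

```python
def subphase_score(answers):
    """Score of one evenly weighted yes/no questionnaire, scaled to 0..100.
    `answers` is a list of 0/1 values, one per question."""
    return 100.0 * sum(answers) / len(answers)

def preaward_benchmark(enotification, esubmission, eawards):
    """Pre-Award Process Benchmark: eNotification weighted at 0.36,
    eSubmission at 0.50 and eAwards at 0.14 (each subphase scored 0..100)."""
    return 0.36 * enotification + 0.50 * esubmission + 0.14 * eawards
```

A country scoring 100 on all three subphases thus reaches the maximum overall grade of 100, since the weights sum to 1.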

* EProcurement Post-Award Process Benchmark. This index is calculated from three subphases, covered by a total of 5 questions. The subphases are:

1. eOrdering

2. eInvoicing

3. ePayment

Having defined all of the eProcurement assessment values, Figure 4, adapted from Capgemini/European Union (2009), illustrates the eGovernment Value Chain, which shows where each of these values is positioned.

c) User Experience

The User Experience Indicator is a new, pilot indicator, also introduced in the latest measurement. Its main objective is to add the user's perspective to the benchmarking of eGovernment. It measures how easy or difficult it is for the user to fully utilize the offered services. It is calculated from five indicators:

1. Accessibility. This indicator assesses a country's government websites with regard to their compliance with the Web Content Accessibility Guidelines, in order to verify to what degree they are accessible to people with disabilities. It is worth noting that this is a fully automated assessment via specific software (in this instance, a web crawler).

2. Usability. This indicator measures whether the services' delivery is multichannel (See relevant paragraph in the definitions chapter) and whether privacy policies are made clear.

3. User satisfaction monitoring. This indicator is based on whether the government websites employ some sort of feedback mechanism.

4. One-stop-shop approach. This indicator checks the percentage of the 20 basic services that are available from the primary national portal.

5. User-focused portal design. This indicator grades the user-friendliness of the websites and how easy they are to navigate.
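Of the five indicators, the one-stop-shop approach has the most straightforward computation; a sketch (the helper name is my own):

```python
def one_stop_shop_score(services_on_portal, total_services=20):
    """Percentage of the 20 basic services reachable from the primary national portal."""
    return 100.0 * services_on_portal / total_services
```

For example, a national portal offering 12 of the 20 basic services scores 60%.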

IV. Greek eGovernment Assessment
A. Basic Services

In order to evaluate the progress made with regard to the eGovernment services offered in Greece, data from the EUeGovBe were used. In particular, the data were mined from the results of the 8th measurement, titled “Smarter, Faster, Better eGovernment”, which was published in November 2009. In addition, the Observatory for the Greek Information Society proved a very helpful source of information for locating the governmental entities responsible for offering each service, which are shown in Table 9.

Public Service | Government Entities Responsible
Income taxes | Central Government, Ministry of Economy and Finance, General Secretariat for Information Systems
Job search | Central Government, Ministry of Employment and Social Protection, Greek Manpower Employment Organization
Social security benefits | Central Government, Ministry of Employment and Social Protection, Greek Manpower Employment Organization; Ministry of Health and Social Solidarity; Ministry of National Education and Religious Affairs, State Scholarships Foundation
Personal documents | Hellenic Police, National Passport Centre; Central Government, Ministry of Interior
Car registration | Central Government, Ministry of Economy and Finance, General Secretariat for Information Systems
Building permission | Central Government, Ministry of Interior
Declaration to police | Central Government, Ministry of Public Order
Public libraries | Central Government, Ministry of National Education and Religious Affairs
Certificates | Central Government, Ministry of Interior
Enrollment in higher education | Central Government, Ministry of National Education and Religious Affairs
Announcement of moving | Ministry of Economy and Finance
Health-related services | Central Government, Ministry of Health and Social Solidarity
Social contributions | Central Government, Ministry of Employment and Social Protection, Social Insurance Institute (IKA)
Corporate tax | Central Government, Ministry of Economy and Finance, General Secretariat for Information Systems
VAT | Central Government, Ministry of Economy and Finance, General Secretariat for Information Systems
Company registration | Central Government, Ministry of Development, General Secretariat for Commerce
Statistical data | Central Government, Ministry of Economy and Finance, General Secretariat of the National Statistical Service
Customs declaration | Central Government, Ministry of Economy and Finance, General Secretariat for Information Systems
Environment-related permits | Central Government, Ministry for the Environment, Physical Planning and Public Works

Table 7 Public Services and the Government Entities Responsible

The Online Sophistication and Online Availability per Cluster indices are also commonly displayed via a Radar graph (5). This is the visualization EUeGovBe uses for these results, as it offers a broader perspective.

The Online Sophistication in Greece is 68%, and the Online Availability is 45%. 7 illustrates how these indices compare to the EU27+ average for 2009:

Greece is still lagging in comparison to the rest of the European Union, especially regarding Online Availability. On average, more than 14 out of the 20 Basic Services have reached the 4th and 5th sophistication levels, whereas in Greece only 9 have accomplished that.
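The relationship between the two indices can be sketched as follows: Online Availability counts the share of the 20 services that have reached their top sophistication stage, so Greece's 9 fully available services out of 20 correspond exactly to its 45% score. The per-service stages below are invented purely so that both published totals (45% availability, 68% sophistication) are reproduced; only the 9-of-20 count and the two totals come from the report:

```python
def online_availability(stages: list[tuple[int, int]]) -> float:
    """Percentage of services that reached their maximum sophistication stage.

    Each service is a (current_stage, max_stage) pair; the maximum stage
    is 4 or 5 depending on the service.
    """
    full = sum(1 for cur, top in stages if cur == top)
    return 100 * full / len(stages)

def online_sophistication(stages: list[tuple[int, int]]) -> float:
    """Average of current_stage / max_stage over all services, as a percentage."""
    return 100 * sum(cur / top for cur, top in stages) / len(stages)

# Invented per-service stages: 9 of the 20 services at their maximum,
# chosen so the two published Greek totals come out.
greece = [(4, 4)] * 6 + [(5, 5)] * 3 + [(2, 4)] * 2 + [(2, 5)] * 9

print(round(online_availability(greece), 1))    # 45.0
print(round(online_sophistication(greece), 1))  # 68.0
```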

In order to provide a comparative view of Greece's performance against the rest of the countries, a list of the Online Sophistication and Online Availability values for each country is necessary. Unfortunately, this dataset is not offered as part of Capgemini's report, at least not directly, because some of the values are only mentioned within the individual Country Reports. Using the data provided, and by calculating those values which are not, Table 13 including all of the values is compiled:

Country | Online Sophistication | Rank | Online Sophistication, Citizen Services | Rank | Online Sophistication, Business Services | Rank
Malta | 100% | 1 | 100% | 1 | 100% | 1
Portugal | 100% | 1 | 100% | 1 | 100% | 1
Austria | 99% | 3 | 98% | 5 | 100% | 1
Sweden | 99% | 3 | 100% | 1 | 98% | 5
Slovenia | 97% | 5 | 99% | 4 | 94% | 13
Estonia | 95% | 6 | 93% | 6 | 98% | 5
Finland | 94% | 7 | 92% | 9 | 97% | 7
Ireland | 94% | 7 | 93% | 6 | 94% | 13
United Kingdom | 94% | 7 | 93% | 6 | 95% | 10
Denmark | 93% | 10 | 89% | 10 | 100% | 1
France | 90% | 11 | 89% | 10 | 93% | 16
Belgium | 89% | 12 | 85% | 13 | 96% | 9
Germany | 89% | 12 | 83% | 15 | 97% | 7
Spain | 89% | 12 | 85% | 13 | 94% | 13
Norway | 87% | 15 | 81% | 16 | 95% | 10
The Netherlands | 87% | 15 | 89% | 10 | 86% | 22
Luxembourg | 81% | 17 | 76% | 17 | 88% | 19
Italy | 80% | 18 | 76% | 17 | 86% | 22
Czech Republic | 78% | 19 | 66% | 23 | 95% | 10
Latvia | 78% | 19 | 72% | 20 | 89% | 18
Lithuania | 77% | 21 | 75% | 19 | 80% | 28
Hungary | 76% | 22 | 68% | 22 | 86% | 22
Iceland | 76% | 22 | 71% | 21 | 83% | 26
Poland | 74% | 24 | 66% | 23 | 87% | 20
Slovakia | 72% | 25 | 60% | 27 | 90% | 17
Cyprus | 70% | 26 | 59% | 28 | 87% | 20
Greece | 68% | 27 | 62% | 26 | 78% | 29
Switzerland | 67% | 28 | 63% | 25 | 73% | 31
Bulgaria | 65% | 29 | 53% | 29 | 83% | 26
Romania | 61% | 30 | 47% | 30 | 84% | 25
Croatia | 56% | 31 | 44% | 31 | 74% | 30
EU27+ Average | 83% | - | 58% | - | 82% | -

Table 13 Online Sophistication and Ranking (Including Citizen & Business Services) (2009)

For most of the countries all three values are mentioned. Online Sophistication in Greece is at 68% and it is ranked 27th, only 4 places away from the worst-performing country, Croatia, which, even though it was included in the report for the first time, is at 56%. Online Sophistication of the services available to citizens is in the 26th place, whereas Online Sophistication of the services provided to businesses is in the 29th position. The fact that the business indicator is ranked lower even though it has a higher absolute value than the citizen indicator (78% vs. 62%) is indicative of the efforts that all countries have undertaken to provide better services for businesses, effectively raising the bar to the 82% EU27+ average.
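The ranks in Table 13 follow standard competition ranking (tied countries share a rank and the following ranks are skipped, which is why Malta and Portugal are both 1st and Austria 3rd), and the citizen/business paradox above is purely a property of the surrounding distribution. The same ranking function, fed the two per-cluster columns from Table 13, places Greece's higher business score at a lower rank:

```python
def competition_rank(score: float, field: list[float]) -> int:
    """Standard competition ranking: 1 + number of strictly better scores."""
    return 1 + sum(1 for s in field if s > score)

# Online Sophistication per country, from Table 13 (percent, table order).
citizen = [100, 100, 98, 100, 99, 93, 92, 93, 93, 89, 89, 85, 83, 85, 81,
           89, 76, 76, 66, 72, 75, 68, 71, 66, 60, 59, 62, 63, 53, 47, 44]
business = [100, 100, 100, 98, 94, 98, 97, 94, 95, 100, 93, 96, 97, 94, 95,
            86, 88, 86, 95, 89, 80, 86, 83, 87, 90, 87, 78, 73, 83, 84, 74]

# Greece scores higher on business services (78% vs. 62%) yet ranks lower,
# because the business field as a whole is stronger.
print(competition_rank(62, citizen))   # 26
print(competition_rank(78, business))  # 29
```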

A similar task is performed for the Online Availability and the result is Table 14:

Country | Online Availability | Rank
Austria | 100% | 1
Malta | 100% | 1
Portugal | 100% | 1
United Kingdom | 100% | 1
Slovenia | 95% | 5
Sweden | 95% | 5
Estonia | 90% | 7
Finland | 89% | 8
Denmark | 84% | 9
Ireland | 83% | 10
France | 80% | 11
Norway | 80% | 11
Spain | 80% | 11
The Netherlands | 79% | 14
Germany | 74% | 15
Belgium | 70% | 16
Italy | 70% | 16
Luxembourg | 68% | 18
Latvia | 65% | 19
Hungary | 63% | 20
Czech Republic | 60% | 21
Lithuania | 60% | 21
Iceland | 55% | 23
Slovakia | 55% | 23
Poland | 53% | 25
Cyprus | 50% | 26
Greece | 45% | 27
Romania | 45% | 27
Bulgaria | 40% | 29
Croatia | 35% | 30
Switzerland | 32% | 31
EU27+ Average | 71% | -

Table 14 Online Availability and Ranking (2009)

Once again, Greece is located within the 5 worst-performing countries, at 27th place (again), with an Online Availability value of only 45%. Unfortunately, as far as Online Availability is concerned, not enough data are provided to extrapolate the individual indices for businesses and citizens.

Having analyzed the data for the year 2009, and in order to examine how much, and at which levels, the eGovernment Services offered in Greece have changed, they must be compared against their previous values from 2007. First, the Citizen Public Services, in Table 15:

Table 15 Citizen Public Services change from 2007 to 2009

And second, the Business Services in Table 16

Table 16 Business Public Services change from 2007 to 2009

A very alarming discovery is made. Between the previous measurement, which was published in September 2007, and the current one, which was published in November 2009, the Online Sophistication level of every single one of the Public Services has remained unchanged. During a 26-month period, there has been no improvement whatsoever in eGovernment services in Greece!

Table 17 compares the development of Online Sophistication in Greece with the European Average:

Table 17 Online Sophistication Progress from 2001 to 2009

Greece has always remained below the European Average. During the past 2 years the difference almost doubled, from 8% to 15%. Obviously, this confirms that while the rest of Europe was making progress and further improving the eGovernment services on offer, Greece remained stagnant. Another interesting observation is that the only points in time when Greece reduced the gap were the years when new countries entered the benchmark and lowered the average. In the 2004 report the 10 new EU members were added, and in the 2007 one, 3 new countries were included. In both cases the Average index took a hit. Even then, Greece never managed to reach it; the smallest difference was 4%, in 2004.
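The dilution effect described above is simple arithmetic: adding lower-scoring newcomers pulls the EU average down even when no incumbent regresses, temporarily shrinking the gap to a below-average country. A toy sketch with invented scores (the real 2004 and 2007 country values are not reproduced here):

```python
def average(scores: list[float]) -> float:
    return sum(scores) / len(scores)

# Invented scores, for illustration only.
incumbents = [80.0, 75.0, 70.0, 85.0]   # existing members, average 77.5
newcomers = [50.0, 55.0]                # new entrants, below every incumbent
greece = 65.0                           # a below-average country

gap_before = average(incumbents) - greece
gap_after = average(incumbents + newcomers) - greece

# The gap narrows without Greece improving at all.
print(gap_before, round(gap_after, 2))  # 12.5 4.17
```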

The previous findings are confirmed. The distance to the European Average doubled, from 13% to 26%. It goes to show how far behind Greece was left.

Since every single one of the 20 Basic Services remained unchanged, it serves little purpose to perform a similar comparative analysis over the course of time for each of them. Doing so would effectively examine only the progress made by the other countries, since all of Greece's indices are constant. Based on the above results for both the Citizen and the Business Services, where the gap to the European Average doubled, it is safe to assume that more or less the same pattern holds for each individual Service offered.

The analysis of the services concludes with a Scatter graph that plots each country over both Online Sophistication and Online Availability:

This graph is an excellent representation of the current state of eGovernment services in Greece in comparison with the rest of the European countries. All of the countries are rapidly moving towards the upper right-hand corner of the graph, that is, towards Online Sophistication and Online Availability equal to 100%. Two countries (Malta and Portugal) reached that spot in 2009, and others, like Austria, Slovenia and Sweden, are approaching fast. Greece, on the other hand, is dangerously close to the last place. Only Bulgaria, Romania and Switzerland are worse.

This paints a very painful picture for Greece and makes the need for corrective action imperative.

C. Other indices
1. eProcurement

The User Experience index is a new one, described earlier in the EUeGovBe methodology analysis. As far as the eProcurement Development Model is concerned, the distribution of the countries is displayed in Table 19:

Table 19 Country Distribution within the eProcurement Development Model (2009)

The EU27+ eProcurement Pre-Award Process Benchmark average is 59%, while Greece is well below, at 43%, and among the 5 lowest-performing countries. As described above, no data were released for the eProcurement Post-Award Process Benchmark.

The EU27+ average for the overall eProcurement Availability Benchmark is 56%, while Greece is at 21%, surpassing only Iceland, at 15%.

Considering that, according to the i2010 agenda, by 2010 all countries should have implemented a fully operational eProcurement infrastructure and, furthermore, be using it for 50% of transactions, it is pretty obvious that Greece is falling behind in the eProcurement category as well. According to the report, a central eProcurement infrastructure is being developed, and hopefully it will help the Greek Government align with the relevant EU directives.

2. User Experience

As far as the User Experience index is concerned, despite the fact that it is only a pilot index, some data are provided. All of the User Experience indicators are displayed, along with the EU27+ averages and their respective deltas, in Table 20:

Table 20 User experience indicators for Greece (2009)

The 0% in the user-focused portal design indicates an urgent need for the Greek governmental websites to undergo some serious redesigning in order to become more user-friendly and make it easier for users to locate information posted online. On the other hand, this is the first “positive” metric throughout the 2009 benchmarking report. Specifically, regarding One-stop-shop approach, Usability and especially Accessibility, Greece is well above the EU27+ average, demonstrating an unexpected conformance with accessibility standards coupled with sensitivity for citizens with disabilities.

D. Lowlights & Highlights.

It is pretty obvious that the latest EUeGovBe benchmark was anything but favorable for Greece. Over an extremely long period of more than two years there was no improvement in the country's eGovernment services. Greece is once more near the bottom of the ranking (27th), but it is not just that. During these two years, other countries have not merely made progress but actual leaps forward. This only makes it more difficult to reach them, let alone surpass them.

On a more optimistic note, there were two highlights about Greece in the report. The first is the long-awaited social security online services, which are offered by the Social Insurance Institute (IKA). The second is that the National Public Administration Network ‘SYZEFXIS' has become the largest and most sophisticated public administration broadband network in Europe, connecting over 2,000 agencies.

Hopefully, future reports will include more highlights and certainly better results for the evolution of Greek eGovernment.

V. Conclusion

The eGovernment landscape is full of challenges. The very first one is the difficulty to gain a clear view on how precisely all of the terms involved in it interact with each other. Terms such as eGovernment, eGovernance, Digital Government, eDemocracy and more are used back and forth with varying meanings and contexts, making it confusing for anyone who is recently introduced into this extremely important field of research. Moreover, the fact that differences in opinions regarding basic terms exist even among seasoned researchers does not make things any easier.

In this paper, in order to tackle this problem, the prevailing definitions for all the main terms related to eGovernment were presented, choosing the one that describes each term best. Furthermore, the different ways that eGovernment can be classified, depending on the delivery model or the audience, were outlined.

Having a clear idea of what exactly eGovernment is, the particulars of how it is measured were analyzed. Firstly, the basics of benchmarking were stated: what a benchmark is and why it is important. Secondly, four dominant methodologies (representing the longest-running efforts) were chosen to be highlighted. Using the reports they publish on a periodical basis, their inner workings were analyzed and the various developments, changes and evolutions in the methods used were discussed.

Focusing on the benchmarking of eGovernment Services in Europe in general, and in Greece in particular, it was important to recognize how eGovernment has evolved and matured within the European Union. Thus, the relevant European directives, initiatives and frameworks for the development of eGovernment Services in the region since 1999 were examined. Following that, the methodology used for benchmarking eGovernment in the European Union was examined in detail. All measuring elements, including some that were used for the first time in the latest (November 2009) report, were analyzed.

Having established what eGovernment is, what benchmarking is and how its methodologies function, the data from the latest European eGovernment Benchmarking Report, which incidentally is extremely recent, were used to assess how the Greek eGovernment landscape has evolved since the previous report, published over two years earlier. The results were far from satisfactory. When comparing the 20 Basic eGovernment Services offered in Greece, there was no improvement whatsoever from 2007 to 2009. Unfortunately, this limited further analysis options to the newly introduced indices, for which historical data were not available. Nevertheless, after analyzing those indices, eProcurement and User Experience, Greece consistently ranks among the bottom five countries, with the exception of a few indicators in which it manages to rank above the EU27+ average.

Clearly, the government needs to

* Identify the deterring factors and obstacles that prevented eGovernment offerings from improving during the last two years.

* Act swiftly and take all necessary measures in order to put the country back on an improvement track and make up for all the lost time.

As demonstrated, the methodologies are well defined and in place. They can and should be used in order to take advantage of best practices from fellow EU Member States and implement them as soon as possible. Two years is a lot of wasted time that has to be made up for, and the current financial juncture only makes this more difficult, since at this time the chances of securing additional resources are very slim.

However, this is a matter of utmost importance, and one can only hope that it will be recognized as such and addressed properly.
