
Slow and difficult integration of BPS into design practice has been identified as a barrier to the use of simulation tools in the AECO industry. In particular, five obstacles are frequently mentioned: (1) usability and information management (UIM) of the interface, (2) integration of intelligent design knowledge-base (IIKB), (3) integrated building design process (IBDP), (4) interoperability of building modelling (IBM), and (5) accuracy and ability to simulate complex and detailed building components (AADCC). The objective of this paper is to identify those obstacles, better understand users' needs and provide guidance for BPS tool developers. The approach taken is to survey the needs of architects and engineers in the USA through an online questionnaire comparing the capabilities of ten major BPS tools. The survey was administered during two different periods and resulted in two random sample groups. It probes users' perception of the most important usability criteria of ten major tools on the USA market, and how users apply and benefit from the tools in their design decisions. The ten tools were ranked, and respondents were asked to pinpoint the specific strengths and weaknesses of each tool. More importantly, respondents were asked to provide feedback regarding the five obstacles that face the use of BPS tools. The paper provides an up-to-date comparison of the features and capabilities of the ten tools and, based on the experience gained during the survey, a critical review of the potential and problems of current BPS tools. The final results indicate a wide gap between architects' and engineers' priorities and tool rankings. This gap is discussed and suggestions for improvement to current and future BPS tools are presented.


Problem context (positioning)

Since its inception, the building simulation discipline has constantly evolved, producing a variety of Building Performance Simulation (BPS) tools that are scientifically and internationally validated. The foundation work for building simulation was done in the 1960s and 1970s, focusing on building thermal performance, load calculation and energy analysis (Kusuda, 1999; Clarke, 1985; Kusuda, 1970). In the late 1970s, and continuing through the 1980s, efforts were invested in analytical validation and experimental testing methods for simulation codes (Augenbroe 2002). This foundation work was developed mainly within the research community of the mechanical engineering domain. Simulation tools were developed by technical researchers and building scientists aiming to address the needs of engineers. During those early days, the user base of BPS tools was mainly limited to researchers and experts concerned with detailed energy analysis applied during design development phases. For example, simulations were performed to estimate peak hourly loads for heating and cooling seasons, or to predict annual energy consumption in order to size and select mechanical equipment for large buildings.

It was not until the 1990s that the building simulation discipline reached a certain level of maturity, offering a range of tools for building performance evaluation (Hensen, Lamberts et al. 2002). The beginning of the 1990s marked the shift from a focus on energy consumption to many other building performance characteristics (Augenbroe 1992): for example, integrated modelling whereby the heat and mass transfer, air flow, visual and acoustic aspects of performance were considered together. This shift led to the development of a relatively large range of function-complete tools (Clarke, Hensen et al. 1998). By the end of the 1990s, a range of simulation applications had spun out from the research community into professional practice, creating a diverse tool landscape for a variety of users. For the first time, analytical simulation power was at the fingertips of building designers (Papamichael, LaPorta et al. 1996; Tianzhen, Jinqian et al. 1997).

This maturation of building simulation had a major influence on the building design profession and resulted in four major changes:

  • Diversifying the tools' user base to address the whole design team
  • Modifying the tools to suit early as well as late design phases
  • Increasing the number of tools and developing a large range of function-complete tools
  • Localizing the tools' capabilities

The first major change was the trend to encourage the whole design team to use BPS tools. The increased complexity of the building delivery process led to a broader view of BPS, which resulted in a broader user base. Simulation tools moved progressively towards all professions involved in the design of buildings, including architectural designers. Architects, who have regularly been described in the literature as non-specialists, non-professionals, non-experts, novices or generalists (Morbitzer, Strachan et al. 2001; Ibarra and Reinhart 2009; Schlueter 2009; Augenbroe 2002; Mahdavi 2005; Hand 1997), became engaged in the BPS community. Recognition of the implications of design decisions made by the different team members on the energy and environmental performance of the building engaged all design team members in performing simulations. As a consequence, simulation tools became recognized as design support tools within the Architecture-Engineering-Construction-Operations (AECO) industry. In fact, simulation became an integrated element of the design process (Augenbroe 1992; Mahdavi 1998). This resulted in a diverse and growing user uptake that addresses more of the whole design team.

The second major change was the trend to move progressively towards early design phases. Due to the increasing importance of the decisions made early in the design process and their impact on energy performance and cost, several BPS tools have been developed to help architects perform early energy analysis and create more energy-efficient, more sustainable buildings (Hensen 2004).

The third change was the rapid sprawl of BPS tools. Today there is a diverse tool landscape for all building design professionals. The U.S. Department of Energy (DOE) maintains an up-to-date listing of BPS tools on the Building Energy Software Tools Directory (BESTD) website, ranging from research software to commercial products with thousands of users (Crawley 1997). [Figure: BPS tools developed between 1997 and 2009]

The fourth major change was the localization of tools' capabilities. With the localization of BPS tools, incorporating local weather data and providing local building materials, constructions and codes, the number of tool users is growing enormously. High-quality thermal models are uploaded to earth-viewer software (Google Earth) and positioned on 2D and 3D satellite images of terrain and cities. One can literally fly over any location on earth, come to a model and run it using BPS tools. With the rapid advances of computer technology, the internet and building information technology, building simulation will be more often and more widely applied in building design and analysis worldwide, offering design solutions, economic analysis, comparison and optimization of designs, computation of performance and verification of compliance (Ellis 2002).

Definition of problem

By analyzing those four major changes we can observe that BPS tools are, and will continue, penetrating building design practice. As a consequence, simulation tools became recognized as design support tools within the AECO industry. In fact, simulation became an integrated element of the design process (Augenbroe 1992; Mahdavi 1998). However, those changes happened so quickly that they resulted in a growing landscape of tools that is in itself considered a barrier. The continually growing number of BPS tools reflects a broader variety of tool abilities, but it does not necessarily reflect a wider penetration within the building design community. Already there is replication of many tools with striking similarities, with no attempt to develop design-team-friendly, effective and efficient design decision support applications. Most BPS tools are difficult and cumbersome to use, and cater more to engineers. The scope of most existing tools is still oriented mainly towards final design stages. Moreover, most tool developers use engineers' feedback to develop architect-friendly tools. The rapid changes could not bridge the mono-disciplinary R&D inheritance, mostly lacking the architects' viewpoint, which does not make BPS suitable for design (Attia, 2009). Attempts to address architects' and engineers' use of BPS tools have been proposed separately by many researchers. Very little effort has attempted to address the use of BPS tools for both groups, architects and engineers, together.

More importantly, there is no independent evaluation and classification of tool usability and functionality in practice versus users' types and needs (Clarke, 2009). Even tool developers rarely state the tools' capabilities and limitations (Reinhart, 2006). A potential user is faced with the difficulty of choosing a suitable program amid this growing sprawl of BPS tools. Which BPS tool should one use? Is the BPS tool suitable for engineers or architects? Is the BPS tool suitable for early design phases or detailed design phases?


Now, the common objective, and the opportunity, in the BPS discipline is to improve the integration and alliances between engineers, architects and even constructors, to create realistically integrated projects together and overcome the differences between the logical model and the realities of AEC industry practice. However, it is very important to identify the gaps and weaknesses of existing tools and to identify clearly the needs of users. Therefore, the purpose of this study is to:

  • Define generic tool assessment criteria for software developers
  • Identify the most important features of a simulation tool
  • Compare and investigate the requirements of architects vis-à-vis engineers
  • Classify and evaluate ten existing simulation tools
  • Identify whether common needs or difficulties exist across different criteria for particular tools

Importance and significance (contribution)

The final goal of this paper is to provide guidance and feedback to BPS tool developers, with particular focus on the different expectations of architects vis-à-vis engineers. This study brings user feedback, a key source of knowledge, into application development. The paper also provides an up-to-date comparison of ten major BPS tools: ECOTECT, HEED, Energy 10, Design Builder, eQUEST, DOE-2, AutoDesk Green Building Studio, IES VE, EnergyPlus and OpenStudio. This can serve as a resource for simulation tool developers regarding architects' and engineers' needs. The tool comparison is performed to generate use cases that allow recording and identifying the BPS functions required by both groups, in order to present a checklist that forms a design basis for software developers. Finally, the study provides a glimpse into the future, in order to allow the evolution of architecture and engineering education and the profession, aiming to bridge the gaps between architects and engineers in professional design practice.

Organization of the paper

The paper is organized into six sections. The first section positions the research problem within the BPS community. The second section screens usability and functionality criteria of tools' requirements specifications. These criteria are reviewed and collected from the literature and classified in the second section, forming the basis for the two online surveys discussed in the third section. The analysis of the results and the survey findings are presented in section four. This includes the ranking of the ten tools and a comparison of the different priorities and preferences of architects vis-à-vis engineers. The final two sections discuss the survey findings and provide feedback to tool developers and to the architecture and engineering education community.

Tools Assessment Criteria

(Overview of relevant work)

The simulation community does not have clear criteria to classify and evaluate the facilities offered by tools (Crawley, Hand et al. 2008). There is not yet a uniform definition of tool requirements and specifications based on formal consultations with users, practitioners and tool developers (Clarke 2009). For example, there is no common language to describe what the tools can do (Crawley, Hand et al. 2005). We note that there are many nuances of the word 'criteria': for example capabilities, requirements, functionality, specifications, features, factors, etc. There is also no clear methodology to compare BPS tools. Identifying the basic criteria for BPS tools can support architects and engineers in creating more efficient and cost-effective sustainable buildings, as well as facilitating future innovation and the progress of the AECO industry. In order to provide the necessary conditions for an evolutionary cycle of tool development, a critical review of the status quo and in-depth reflections on the tools must be achieved (Lam, Wong et al. 1999). Therefore, as part of this paper a literature review was carried out to identify, classify and group requirements and assessment criteria for the future development of BPS tools. The following review forms an entrée into the literature. This review forms the basis that will ensure the clarity and relevance of the questionnaire content and allow the tool comparison in section three.

Major interested bodies

There are various bodies that can help with building energy modelling and simulation information. Foremost among them stands the International Building Performance Simulation Association (IBPSA), a non-profit international society of building performance simulation researchers, developers and practitioners dedicated to improving the built environment. IBPSA was founded to advance and promote the science of building performance simulation in order to improve the design, construction, operation and maintenance of new and existing buildings worldwide. IBPSA is not particularly occupied with developing tool assessment criteria, but it provides a framework and medium for R&D in the BPS industry. The biennial conference publications are available online and provide a source for many topics including: simulation and users; software environments and paradigms; and tool and interface assessment. However, there is no formal attempt within IBPSA to define a formal tool requirements specification for practitioners and tool developers.

Another important body involved in evaluating BPS is the American Society of Heating, Refrigerating and Air-Conditioning Engineers (ASHRAE). ASHRAE lists the factors to take into consideration when selecting energy analysis tools (ASHRAE 2009). According to ASHRAE, the most important criterion is the capability of the tool to deal with the project requirements. The second is the complexity of input and the third is the quality of output. The availability of weather data is a major feature of a tool. Auxiliary capabilities, such as economic analysis, are a final concern in selecting a tool. Apart from these criteria there are general factors that must be embedded in any energy analysis method, namely accuracy, sensitivity, versatility, speed and cost, reproducibility and ease of use. However, the main focus of ASHRAE is to provide evaluation criteria for BPS tools regarding the accuracy and validity of tool algorithms (ASHRAE 2007).

At the end of 2009, ASHRAE announced the Building Energy Modeling Professional (BEMP) certification program, which is structured around...

In recognition of the significance of energy use in buildings, the International Energy Agency (IEA) has funded wide-ranging R&D activities in the building sector. Within the annexes in table .., it can be seen that building simulation is one of the key technologies contributing to the construction of future buildings that are more energy efficient, health responsive and environmentally friendly (Hong 2000).

During that same period, the US Department of Energy allocated more than $1 billion to R&D projects on energy conservation and renewable energy. The result of this sponsorship is a series of popular, detailed building and energy systems simulation programs such as DOE-2, ESP and TRNSYS (Hong 2000).

The IEA has created a number of tasks for evaluating BPS tools. However, most IEA tasks focus only on assessing the accuracy of BPS tools in predicting performance, neglecting other important criteria. Task 12 (Building Energy Analysis and Design Tools for Solar Applications) created a number of procedures for testing and validating building energy simulation programs. Task 22 (Building Energy Analysis Tools: Analyzing Solar and Low-Energy Buildings) assessed the accuracy of BPS tools in predicting the performance of widely used solar and low-energy concepts. Task 30 (Bringing Simulation to Application) aimed to investigate why BPS tools were not widely used in the design process and to identify ways of overcoming this problem (Warren 2002). Task 34 (Testing and Validation of Building Energy Simulation Tools) evaluated the accuracy and capability of analysis and design tool algorithms and developed the BESTEST procedure (Judkoff and Neymark 1995). However, most of the tasks focus on quantitative evaluation measures, and the audience for most IEA tasks concerned with BPS tools is limited to developers and energy standards development organizations (i.e. ASHRAE, BESTEST and CEN). It is very difficult to estimate the benefit of the IEA tasks for tool users such as architects, engineers, energy consultants, product manufacturers, and building owners and managers.

It is a free-market-like community, with no regulatory body or protocol for software and tool users.

Previous studies

A number of studies and surveys concerned with the criteria and requirements of BPS tools have been carried out in the past. In August 1995 and June 1996 the DOE sponsored workshops to gather input from developers and users for planning efforts on future-generation BPS tools (Crawley and Lawrie 1997). The developers' workshops focused on applications, capabilities, and methods and structures. The users' workshops focused on applications, capabilities and interfaces. However, the users' workshop group included mainly software experts, researchers and engineers. The workshops also did not address the different requirements of different design phases. In fact, the final results focused mainly on identifying criteria for the development of the core calculation engine of EnergyPlus; user interface issues were postponed for the future (Crawley and Lawrie 1997).

Ten years later, Crawley et al. compared the features and capabilities of twenty major BPS tools, grouping the comparison criteria into 18 categories including: results reporting; validation; user interface; and links to other programs (Crawley, Hand et al. 2005; Crawley, Hand et al. 2008). The grouping was based on vendor-supplied information with no uniform and standard assessment criteria.

Aiming to identify the vital capabilities of BPS tools, Tianzhen Hong identified five vital criteria, among them usability, computing capability, data exchange capability and database support. The final notes of his research, published in 2000, highlighted five additional trends on the road ahead: knowledge-based systems; BPS for early design stages; information monitoring and diagnostic systems; integrated building design systems; and virtual reality (Hong 2000).

In 2005, Hopfe et al. identified the features and capabilities of six software tools and interviewed designers to screen the limits and opportunities of using BPS tools during early design phases (Hopfe, Struck et al. 2005). The tool classification was based on six criteria, namely capabilities, geometric modelling, defaulting, calculation process, limitations and optimization. However, the authors did not report what methodology was used to compile these criteria.

In 1996, Lam et al. carried out a survey on the usage of performance-based building simulation tools in Singapore (Lam, Wong et al. 1999). With one hundred and sixty-four valid responses, including architects and engineers, the survey was organized around six main questions. The questions were simple and direct, asking about the reasons for using or not using the tools and about the major limitations and obstructions. Except for the question about the ability of the tools to enhance the design process, no other question could be considered a major BPS tool assessment criterion.

In 2004, Lam conducted a study that involved the development of a comprehensive classification schema for comparing tools and a comparative analysis run by graduate students. Lam conducted a literature review of well-known existing energy modelling tools. A comparison of 22 tools was made based on four major criteria, namely usability, functionality, reliability and prevalence. Under the usability criterion he listed system requirements, interoperability, user interface, learning curve, effort to update the model, conducting parametric studies and processing time. Under the functionality criterion he listed the comprehensiveness of geometric and system modelling, types of energy calculations, types of data analysis and presentation, and availability of other environmental domain simulations (e.g., lighting). The third criterion, reliability, included consistency of results and accuracy of results. The fourth criterion was prevalence, including compliance with industry standards, documentation, user support, and pricing and licensing (Lam 2004).

In 2005, Punjabi et al. identified major BPS tool usage problems by undertaking empirical usability testing. The usability testing was based on usefulness, effectiveness, likeability and learnability (Punjabi and Miranda 2005). The research defined six indicators for usability and information management, including interface design, navigation, saving and reviewing, database creation and learnability. However, the research was limited to usability only and did not include other tool evaluation criteria.

In 2002, Augenbroe reviewed trends in building simulation (Augenbroe 2002).

Further qualitative assessments of energy model usage in building design have been reported, and a considerable amount of additional work comparing and assessing tools has been published in this field.

Summing up, these bodies and previous surveys were capable of identifying general trends and needs in the BPS community. However, all these efforts are scattered and based on individual initiatives, without a unified, consensus-based framework. There is not yet a uniform and clear methodology or outline to assess and define tool specifications and criteria for developers, practitioners and tool users.

The following section presents the five assessment criteria that underpin the current notion of how we can classify and evaluate the facilities offered by BPS tools.

Tools assessment criteria (state of the art)

Summarizing the literature findings, we found that the simulation community at large is thinking about and discussing at least five major challenges. They are: the (1) Usability and Information Management (UIM) of tool interfaces, (2) Integration of Intelligent design Knowledge-Base (IIKB), (3) Integrated Building Design Process (IBDP), (4) Interoperability of Building Modelling (IBM), and (5) Accuracy of tools and Ability to simulate Detailed and Complex building Components (AADCC), as shown in . Under those five titles we classified the sub-criteria and challenges found in the literature that correspond to the five topics.
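To illustrate how such criteria might be operationalized in a survey analysis, the sketch below aggregates respondent ratings per criterion into a single weighted score per tool. This is not the paper's actual scoring method; the function names, weights and ratings are invented placeholders.

```python
# A minimal sketch of aggregating per-criterion survey ratings for one tool.
# The five abbreviations come from the classification above; all numeric
# data and the weighting scheme are hypothetical placeholders.
CRITERIA = ["UIM", "IIKB", "IBDP", "IBM", "AADCC"]

def mean_rating(ratings):
    """Average a list of 1-5 Likert ratings for one criterion."""
    return sum(ratings) / len(ratings)

def tool_score(ratings_by_criterion, weights=None):
    """Combine per-criterion mean ratings into one weighted score.

    ratings_by_criterion: dict criterion -> list of respondent ratings.
    weights: dict criterion -> weight (defaults to equal weights).
    """
    if weights is None:
        weights = {c: 1.0 for c in CRITERIA}
    total_w = sum(weights[c] for c in CRITERIA)
    return sum(weights[c] * mean_rating(ratings_by_criterion[c])
               for c in CRITERIA) / total_w

# Hypothetical example: ratings for one tool from three respondents.
example = {
    "UIM": [4, 3, 5],
    "IIKB": [2, 3, 2],
    "IBDP": [3, 3, 4],
    "IBM": [4, 4, 3],
    "AADCC": [5, 4, 4],
}
```

A weighting step like this would also make it possible to contrast rankings under architects' versus engineers' priorities by simply swapping the weight sets.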

The goal in examining these challenges and criteria is not to conduct an exhaustive analysis. Instead, it is to tease out broad yet critical underlying premises to see if common ones exist. This paper does this and then applies the results to the surveys to assess how the criteria compare.

Usability and Information Management (UIM) of the interface (definition)

The usability and information management of the interface refers to human-computer interaction. A fundamental feature of a simulation tool is to incorporate interfaces that enhance human-computer interaction and the overall effectiveness of simulation (Hefley and Murray 1993). This means expressing information using presentation techniques and media to achieve communicative purposes and support users in performing their task (Maybury and Wahlster 1998). Usability is a broad term that incorporates better graphical representation of simulation input and output, simple navigation and flexible control. Users would like to see results presented in a concise and straightforward way, with a visual format or 3D spatial analysis preferred to numerical tabulation (Attia, Beltran et al. 2009). For example, CFD is very appealing to architects, engineers and clients because of the tremendous explanatory power of its graphical output. Usability also entails the ability to learn the tool easily and quickly, and support for the user with training, online help, look-up tables and error-traps.

Closely related to this criterion is information management, a growing concern for tool users. Information management is responsible for allowing assumptions and for using default values and templates to facilitate data entry (Donn 2001). Issues such as simulation input quality control, creation of comparative reports, flexible data storage, user customization, simple input review and input modification are all considered part of the information management features of any simulation interface (Crawley, Hand et al. 2005).

Integration of Intelligent design Knowledge-Base (IIKB) (definition)

Integration of Intelligent design Knowledge-Base (IIKB), the second criterion, has generated many debates in recent years. The concept of IIKB today trades in other realms under names such as design decision support and design optimization. Since the ultimate wish of BPS users is to have tools that support the design process, the knowledge-base (KB) supports decision making (Yezioro 2008). It should give quantitative answers regarding the influence of design decisions (Ellis and Mathews 2002), because the output is otherwise extremely difficult to interpret and utilize for design decisions (Lam, Wong et al. 1999). A common observation from the literature is that designers cannot estimate the relative importance of design criteria: "They feel it continuously throughout the design process, reformulating it as they compromise between what is desired and what is possible" (Papamichael and Protzen 1993). Therefore, the importance of a KB in tools lies in its ability to provide the user with valuable insights and directions during the design process. Given the complexity of design, the next generation of simulation tools must embrace KB. This will add real power to BPS tools, which will contain descriptive explanations, templates, building and component examples, and procedural methods for determining appropriate installations and systems, e.g. guidelines, case studies, strategies, etc. The KB should comprise facts and heuristic/prescriptive rules for decision taking, at least at the level of compliance with building codes (e.g. ASHRAE, IECC) and rating systems (e.g. LEED®, EnergyStar® and Green Globes®), in addition to being able to assist in adjusting the design parameters to the needs within the framework of existing codes.
Despite the criticism of existing BPS tools that incorporate expert or knowledge-based systems, namely that they may mislead designers due to defaulted subjective preferences (Papamichael and Protzen 1993) or limited pre-processed rules of thumb (Donn, Selkowitz et al. 2009), there is a great advantage in incorporating a knowledge-base in simulation tools as an educational means that helps users better understand the complex thermophysical processes and interactions within building and environmental control systems (Hand and Crawley 1997). A knowledge-base plays the role of a justifier that rationalizes and explains the building's behaviour and at the same time guides the user during the decision-making process. Another very practical ramification of IIKB is the intelligence that is defined as design optimization. The intelligence entails finding quantifiable answers to design questions in order to create context-specific analysis, evaluate complex design strategies, optimize design solutions, engage 'what if' scenarios, verify compliance, and analyze life cycle (LC) and economic aspects. With the increasing analytical power of BPS tools we can examine the sensitivity and uncertainty of key parameters in relation to design decisions (Bambardekar and Poerschke 2009), compare various concepts, rank and quantify parametric alternatives, and even generate design alternatives semi-automatically (Hensen 2004). BPS will never replace good design judgment, but it will calibrate and inspire it. Thus, the new paradigm of BPS is that tools must assist design teams in answering qualitative and quantitative design questions through IIKB.
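As a toy illustration of the kind of heuristic/prescriptive rule such a knowledge-base could contain, the sketch below checks one design parameter against a prescriptive limit and returns an explanation alongside the verdict. The 40% window-to-wall ratio threshold and the advice text are hypothetical placeholders, not taken from any actual code or rating system.

```python
# A toy knowledge-base rule: check a design parameter against a
# prescriptive code-style limit and explain the result to the user.
# The 40% window-to-wall ratio limit here is a hypothetical placeholder.
def check_wwr(window_area: float, wall_area: float,
              max_ratio: float = 0.40) -> tuple[bool, str]:
    """Return (complies, explanation) for a window-to-wall ratio rule."""
    ratio = window_area / wall_area
    if ratio <= max_ratio:
        return True, f"WWR {ratio:.0%} is within the {max_ratio:.0%} limit."
    return False, (f"WWR {ratio:.0%} exceeds the {max_ratio:.0%} limit; "
                   f"consider reducing glazing or adding shading.")
```

The point of the design is that the rule does not merely reject a design: it plays the "justifier" role described above by returning an explanation and a direction for adjustment.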

Accuracy and Ability to simulate Detailed and Complex building Components (AADCC) (definition)

Tools' Accuracy and Ability to simulate Detailed and Complex building Components is the most popular criterion in the field of BPS. This criterion deals with the quality assurance of performance modelling and BPS tools. Since the inception of the discipline, research has been carried out to provide analytical verification, empirical validation and comparative testing studies (ASHRAE, 2007; Judkoff, 1995). However, all building models are simplifications and abstractions of reality; therefore there is no such thing as a completely validated BPS tool. Thus the importance of this criterion is to guarantee an evolving, commonly accepted agreement representing the state of the art in whole-building performance simulation programs. BPS tools are eminently challenged to represent physics accurately through their mathematical and numerical models. The term 'accuracy' is therefore concerned with all aspects connected to empirical validation, analytical verification and the calibration of uncertainty, as defined by the IEA and the BESTEST procedure, in order to provide reliability and a level of quality assurance for the simulation results (Judkoff, 1995).
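The comparative-testing idea behind procedures such as BESTEST can be sketched as a check of whether a tool's annual result falls inside the band spanned by a set of reference programs. The function and the reference values below are illustrative placeholders, not actual BESTEST data.

```python
# Sketch of a BESTEST-style comparative check: a tool's annual result is
# flagged when it falls outside the min-max band of reference programs.
# The reference values below are hypothetical placeholders.
def within_reference_band(result: float, reference_results: list[float],
                          tolerance: float = 0.0) -> bool:
    """True if `result` lies inside the reference band (optionally widened)."""
    lo, hi = min(reference_results), max(reference_results)
    span = (hi - lo) * tolerance  # widen the band by a fraction of its width
    return (lo - span) <= result <= (hi + span)

# Hypothetical annual heating loads (MWh) from four reference programs:
refs = [4.3, 4.9, 5.1, 5.6]
```

In actual comparative testing the band check is only diagnostic: a result outside the band indicates a modelling or algorithmic difference to investigate, not automatically an error.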

Another important feature incorporated under this criterion is the ability to simulate complex and detailed building components, in other words, the ability of BPS tools to perform various and specific functions. With rapidly changing building technologies and the explosion of knowledge, BPS tools are providing new features and functions allowing simulation of the performance of passive design strategies (e.g. natural ventilation, shading, etc.), renewable energy systems, HVAC systems, energy-associated emissions, cost analysis and life cycle cost analysis (LCCA), in addition to new building elements such as green roofs, double-skin facades, chilled beams, atria, concrete core conditioning, etc. AADCC has become a pervasive and persistent criterion for tool evaluation (Hensen).


Interoperability of Building Modelling (IBM) (definition)

Interoperability of Building Modelling (IBM) refers to the ability to manage and communicate building data between collaborating firms, and within individual companies, throughout design, construction and maintenance. IBM is a fundamental criterion for assessing BPS tools because it allows multidisciplinary storage of information in one virtual representation. The need for sharing information and rapid feedback exchange between the various design professions emerged in the 1990s (Ellis and Mathews 2002). Significant research and development has been carried out to integrate simulation tools with computer-aided design (CAD) applications. However, it has been frequently reported that software applications process the same building in different representations and formats, and that the integration of BPS tools with CAD applications is not sufficient (Lam, Wong et al. 1999). Aiming to improve the integration and alliances between engineers, architects and even constructors, to create realistically integrated projects together and to overcome the differences between the logical model and the realities of AECO industry practice, the Industry Foundation Classes (IFC) standard evolved as an international information exchange standard that allows project participants to work across different software applications. It is possible to write IFC interfaces to HVAC design and simulation tools and cost estimation tools, as well as to import building geometry data from CAD. This allows the exchange of HVAC data and performance specifications, construction properties and geometry, and the comparison of performance and cost (Bazjanac 2003; Bazjanac 2004). Later, as an application of the IFC formats, Building Information Modelling (BIM) emerged: a model-based technology linked to a project information database (AIA 2007). BIM technology involves the creation and use of coordinated, consistent information about a building.
It allows better decision making, documentation and accurate prediction of building performance. In the past five years, BIM has become a comprehensive repository of data accessible by many software applications that take part in AECO industry projects (Bazjanac and Kiviniemi 2007). Recent market surveys show that 48% of architectural offices in the US already use building information modelling methods (AIA 2007). Direct links between BIM or non-BIM modelling tools, such as the SketchUp plug-ins for IES and EnergyPlus or the Revit Architecture plug-ins for IES and ECOTECT, are an important ramification of BIM technology. They enable the creation of deliverables that have explicit relationships to each other, resulting in better coordinated and seamless data exchange that saves time, resources and effort, assures quality-based liability and reduces risk.

However, the success of BIM is limited to the detailed design phase, because it gives the design team access to BPS tools only after the whole building design has been completed. Embracing BIM during the early design phases would add complexity by limiting and freezing design choices during the most critical design phase (Eisenberg, Done et al. 2002; Donn, Selkowitz et al. 2009). BPS tools therefore still find limited application during early design phases. We should also keep in mind that BIM is one application within the broader definition and objectives of the interoperability of building modelling. Thus, the challenge facing IBM is to assure the utmost interoperability by making model representation fluid, allowing low- and high-resolution building models that correspond to all design phases, and supporting a design-team-based model.
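As a concrete illustration of the exchange format that underlies IFC, the standard's file encoding (IFC-SPF) is a plain-text STEP file in which each entity occupies one numbered line. The minimal sketch below uses only the Python standard library and an invented four-line sample payload to count entity types; real projects would rely on a dedicated toolkit (e.g. IfcOpenShell) rather than hand-rolled parsing.

```python
import re
from collections import Counter

# A tiny, invented IFC-SPF (STEP) fragment, for illustration only.
SAMPLE_IFC = """
#1=IFCPROJECT('2O2Fr$t4X7Zf8NOew3FLOH',$,'Demo',$,$,$,$,$,$);
#2=IFCWALL('1hOSvn6df7F8_7GcBWlRGQ',$,'Wall-01',$,$,$,$,$,$);
#3=IFCWALL('2hOSvn6df7F8_7GcBWlRGR',$,'Wall-02',$,$,$,$,$,$);
#4=IFCWINDOW('3hOSvn6df7F8_7GcBWlRGS',$,'Win-01',$,$,$,$,$,$);
"""

# Each data line has the form '#<id>=<ENTITY>(...);'
ENTITY_RE = re.compile(r"^#\d+=([A-Z0-9]+)\(")

def count_entities(step_text: str) -> Counter:
    """Count occurrences of each IFC entity type in a STEP payload."""
    counts = Counter()
    for line in step_text.splitlines():
        m = ENTITY_RE.match(line.strip())
        if m:
            counts[m.group(1)] += 1
    return counts

print(count_entities(SAMPLE_IFC))
```

The point of the sketch is simply that the interoperability described above rests on a shared, text-based schema: any participating application that can read these numbered entity records can, in principle, exchange geometry, HVAC data and construction properties with the others.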

Integration of tools in Building Design Process (IBDP) (definition)

The final criterion assesses the tool's Integration in the Building Design Process (IBDP). The building design process is a dynamic process of creating concepts that involve design strategies and technologies, and then predicting and assessing their performance with respect to the various performance considerations within the specific design context (Hien 2003).

Within the building design community there are constant complaints that BPS tools cannot be integrated into the design process (Morbitzer, Strachan et al. 2001; Yezioro 2008). There is evidence that existing tools lack the capabilities to deal with the nature of the design process; the tools do not match it (Lam, Wong et al. 1999; Ellis and Mathews 2002). According to Mahdavi (1998), the increasing complexity involved in the design process gave rise to a mono-disciplinary, specialist-professional approach that emerged as an attempt to address that complexity. The aim was to assign responsibilities to specialists so that they could contribute their specific knowledge. However, this move resulted in the fragmentation and compartmentalization of the design process (Mahdavi 1998). As a consequence, this fragmentation has been echoed in the BPS domain. Today most BPS tools cater to only one discipline or only one design phase. Most BPS tools are still easier to use in developed design phases: they help designers improve their basic concepts, not create them (Donn 2001).

In fact, Balcomb (1992), Hong (2000) and Ellis (2002) classified the BPS tools used during the design process into two main groups: advanced design stage evaluation tools used mainly by engineers, and guidance tools used by architects. The early design phase tools are called design tools (DTs) and the late design phase tools are called detailed simulation programs (DSPs). DTs are more purpose-specific and are often used in the early design phases because they require less, and simpler, input data. For example, DTs are very useful for compliance checking against prescriptive building standards. Because DTs are easy to develop and test, they have proliferated. DSPs, on the other hand, often incorporate computational techniques such as finite difference, finite elements, state space and transfer functions for building load and energy calculation. Besides design, DSPs are also useful for compliance checking against performance-based building energy standards (Hong 2000; Balcomb 1992; Ellis 2002).
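To make the DT/DSP contrast concrete, the kind of finite-difference calculation a DSP performs internally can be sketched in a few lines. The toy example below marches an explicit 1D conduction scheme through a homogeneous wall; the material values and boundary temperatures are illustrative, not taken from any cited tool.

```python
# Explicit 1D finite-difference conduction through a homogeneous wall:
# a toy sketch of one computational technique used by detailed
# simulation programs (DSPs). All numbers are illustrative.

def step_temperatures(T, alpha, dx, dt, t_in, t_out):
    """Advance node temperatures one explicit time step.

    T      : list of node temperatures through the wall thickness [degC]
    alpha  : thermal diffusivity [m^2/s]
    dx     : node spacing [m]
    dt     : time step [s]; scheme is stable only if alpha*dt/dx**2 <= 0.5
    t_in, t_out : fixed surface temperatures (Dirichlet boundaries)
    """
    r = alpha * dt / dx**2
    assert r <= 0.5, "explicit scheme unstable for this dt/dx"
    new = T[:]
    new[0], new[-1] = t_out, t_in
    for i in range(1, len(T) - 1):
        new[i] = T[i] + r * (T[i - 1] - 2 * T[i] + T[i + 1])
    return new

# 20 cm concrete-like wall, 11 nodes, initially 10 degC throughout,
# 20 degC inside and 0 degC outside; march 24 hours in 1-minute steps.
alpha, dx, dt = 7e-7, 0.02, 60.0
T = [10.0] * 11
for _ in range(24 * 60):
    T = step_temperatures(T, alpha, dx, dt, t_in=20.0, t_out=0.0)
print(T[5])  # mid-wall temperature after 24 h
```

Even this minimal version shows why DSPs demand more input than DTs: a realistic run needs per-layer material properties, time-varying boundary conditions from weather data, and much finer spatial and temporal discretization.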

In contrast, many experts have proposed that BPS tools should be design process centric (Hayter, Torcellini et al. 2001; Mendler, Odell et al. 2006; De Wilde and Prickett 2009). Given the growing importance of bridging this gap and integrating BPS tools throughout the whole building design delivery process, simulation should be used as an integrated element of the design process (Augenbroe 1992; Mahdavi 1998). Experience has shown that high performance buildings (e.g. passive houses, low energy and zero energy buildings) cannot depend on intuitive design, and simulation tools should therefore be an integral part of the design process (Torcellini, Hayter et al. 1999; Hayter, Torcellini et al. 2001). For example, integrating BPS tools during the early design phases can foster better designs that achieve our millennium objectives (Robinson 1996; Mahdavi, Silvana et al. 2003; Morbitzer 2003). To encourage designers to use simulation tools, IBDP tools should be provided that allow use for different purposes, by different users and at different design stages (Tianzhen, Jinqian et al. 1997). Thus the IBDP has become a pervasive and persistent criterion for BPS tool evaluation and selection.

Finally, the inherent limits of a synopsis of the five influential assessment criteria are apparent. These five criteria are more interlinked than the categorization suggests. However, in order to guarantee plausible and persuasive assessment criteria, this categorization forms the basis for the survey questionnaire. The survey provides the opportunity to test and critically judge the assessment criteria. The next section explores the questionnaire design and execution.


The tools assessment criteria were used to form the basis of the survey questionnaire. The questionnaire was dedicated to gathering information from beginner simulation tool users, including architects, engineers, designers and recent graduates concerned with sustainable building design in the USA. The survey aimed to probe the users' perception of the most important criteria for the use and function of ten major USA market tools, and how users utilize and benefit from the tools during the design decision process. The objectives of the survey are as follows:

  • To conduct an inter-group comparison between architects and engineers
  • To rank and identify BPS tool assessment criteria
  • To compare the potential challenges and opportunities of using existing BPS tools
  • To compare ten state-of-the-art BPS tools in the USA market

Prior to launching the survey, the authors set up an online test version and requested comments and suggestions from peer reviewers. Reviewers were asked to revise the questionnaire and provide critical feedback in order to optimize its structure, clarity and relevance before the final version was posted online. Reviewers were also asked to screen and list the top ten BPS tools, using the U.S. BESTD list and the comparison study by Crawley et al. (Crawley, Hand et al. 2005). The selection had to represent an overview of state-of-the-art BPS tools used by architects and engineers in the USA (DOE 2009). The list was narrowed down to tools used for evaluating and analyzing building energy performance; lighting and CFD simulations were excluded because they are disconnected from the energy and thermal performance realm. As a result, eight tools were selected, namely ECOTECT (Autodesk 2008; Autodesk 2009), HEED (UCLA 2008; UCLA 2009), Energy 10 (E10) (NREL 2005; NREL 2009), Design Builder (DB) (DesignBuilder 2008; DesignBuilder 2009), eQUEST (LBNL and Hirsch 2009), Autodesk Green Building Studio (GBS) (Autodesk 2008; Autodesk 2009), the IES Virtual Environment Viewer plug-in (IES VE plug-in) (v.5.8.2) and the SketchUp/OpenStudio (OS) plug-in (Google 2009; NREL 2009), plus 'raw' DOE-2 (LBNL and Hirsch 2008; LBNL and Hirsch 2009) and EnergyPlus (EP) (DOE 2009). Reviewers suggested adding DOE-2 and EP to broaden the range of examined tools: first, to allow comparing tools capable of overall energy analysis in the early design phase versus tools capable of detailed analysis in later design phases; second, to allow comparing the sensible use of tools vis-à-vis the amount of knowledge required for and by each tool; and third, to compare tools with a developed graphical user interface (GUI) versus tools with a text-based user interface.

Due to the questionnaire's density and length, reviewers recommended a shorter version, and the questions regarding IIKB and IBDP were merged into one question group. They also advised launching two surveys during different time periods to guarantee maximum participation. The final step, prior to launching the survey, was to incorporate the reviewers' feedback and make several modifications to the format and content of the two final surveys.

Participants were recruited through email invitations to the mailing lists and forums of the ten above-mentioned tools, in addition to the AIA Committee on the Environment (COTE), USGBC, the 2030 Challenge, the 2007 Solar Decathlon entry teams and the building performance simulation mailing lists (Bldg-SIM, Bldg-RATE, IBPSA-USA). Environmental architecture departments, student chapters, blogs and architecture firms in the USA were also approached, including the winning offices of the Top Ten Green Architecture Award between 2005 and 2008.

Survey 1

The first survey was hosted at eSurveyPro.com and ran from mid-December 2008 to mid-February 2009, comprising 22 questions. An invitation letter with a link to the survey was included in the email body. The questionnaire's home page clearly stated its purpose, target group and duration.

As an incentive to complete all the survey questions, respondents were promised the final survey summary report. The average duration of the survey was approximately 8 to 12 minutes. A welcome page explained the objective of the survey, informed participants of the approximate duration, defined the expected target group and listed the tools under investigation. The questionnaire was structured into three parts. The first part screened the respondent's background and experience with BPS tools. The second and third parts of the survey focused on the following key criteria:

(1) the usability and information management (UIM) of the interface and (2) the integration of an intelligent design knowledge-base (IIKB), including (3) the IBDP. Respondents were asked not only to judge the relative importance of the above-mentioned criteria, but also to share their experience by comparing the ten selected tools. An open question followed every part of the questionnaire to allow respondents to share their thoughts and comments. At the end of the survey, respondents were invited to post their ideas about current limitations that should be avoided, or improvements that should be integrated, in the future development of BPS tools.

Survey 2

The second survey was also hosted at eSurveyPro.com and ran from mid-August 2009 to mid-October 2009, comprising 16 questions. Both surveys were structured to include the same introduction of eight questions addressing the respondent's background and experience with BPS tools. The second and third parts of the survey focused on the following key criteria: (3) Interoperability of Building Modelling (IBM) and (4) Accuracy of tools and Ability to simulate Detailed and Complex building Components (AADCC).

Analyzing Results

Each survey was closed after two months online. The users' responses were stored and results summaries were automatically generated. The first survey attracted over 800 interested visitors, but automatic report filtering yielded only 481 eligible respondents. The second survey attracted over 750 interested visitors with 417 eligible respondents; many respondents opted not to complete the survey to the end. Responses from IP addresses outside the USA and incomplete responses were excluded. Questions 4.1-4.8 represent the eight introduction questions for both surveys. The results of both surveys are summarized below. Questions 4.9-4.11 represent the UIM, questions 4.12-4.14 the IIKB and IBDP, questions 4.15-4.17 the AADCC, and questions 4.18-4.19 the IBM. Question 4.20 then ranks the most important features of BPS tools from the point of view of architects and engineers. Finally, the respondents' choices are compiled and the ten tools are ranked automatically according to each group's preferences.

Prior to analyzing the survey results, it is important to question the statistical significance of the survey. The questionnaire is based on an open sample and therefore the four respondent sample groups cannot be proven representative of the engineering or architecture community. However, the number of respondents in both surveys was large enough to allow the identification of patterns and to conduct cross-discipline analysis (Pilgrim, Bouchlaghem et al. 2003).
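For a sense of scale only, the margin of error that would apply if these were simple random samples can be computed directly. An open online sample does not satisfy the random-sampling assumption, so the figures below are purely indicative and not a claim of statistical validity.

```python
import math

def margin_of_error(n: int, p: float = 0.5, z: float = 1.96) -> float:
    """95% margin of error for a sample proportion, assuming
    simple random sampling (worst case at p = 0.5)."""
    return z * math.sqrt(p * (1 - p) / n)

# Eligible respondents reported above: 481 (survey 1), 417 (survey 2).
for n in (481, 417):
    print(n, round(100 * margin_of_error(n), 1), "%")
```

Under that (unmet) assumption, samples of this size would bound a reported proportion to within roughly five percentage points, which is consistent with treating the results as pattern identification rather than precise population estimates.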

How do you describe your current position?

This question allowed respondents to choose from 12 available categories, including architect, engineer, architecture or engineering designer, intern, educator, graduate student and undergraduate student. Remarkably, under the "Other" option a number of construction management professionals and students provided complete responses. However, in order to conduct the inter-group comparison between architects and engineers, all categories were binned into two main groups, and participants that did not fall into any of the 12 categories were excluded. This step was necessary in order to detect any disciplinary difference between the two groups of tool users. The number of respondents in each group was almost equal in both surveys, so on average the magnitude of each group is balanced, which allows us to compare votes.

Which of the following affiliations apply to you?

The second question revealed the participants' affiliation. In the first survey, 18% of the architects were AIA accredited, with almost the same proportion (17%) in the second survey. On the other hand, more than a quarter of the engineers (27%) were ASHRAE Professional Engineers (PE) in the first survey, with a higher representation in the second survey (30%). Next, 21% of the first survey respondents and 19% of the second survey respondents were LEED accredited professionals, including both architects and engineers. The summary report indicates the participation of 44 LEED AP architects in the first survey and 31 in the second; surprisingly, LEED AP engineers outnumbered the architects. The encouraging finding here is that both groups acknowledge LEED as a common ground and are seeking professional accreditation.

Which of the following energy simulation tools do you use?

Most architects have used ECOTECT; eQUEST, DB and the IES plug-in were also commonly used among the architect samples. On the other hand, most engineers have used EP and eQUEST, with DB, DOE-2 and the IES plug-in in the second tier of usage. This question did not aim to rank the tools; the aim was to get a snapshot of the current use of tools by architects and engineers. Notably, eQUEST, DB and the IES plug-in were used by both groups. This match does not necessarily demonstrate a preference of usage by each group; rather, it represents a potential for developing tools that suit and incorporate both architects and engineers.

What CAD/3D modelling software do you use?

Given the advent of BIM and the frequent announcements of direct links between BIM drawing tools and BPS tools, engineers and architects were asked to identify the drawing software they use most frequently for geometric modelling. The objective of this question was to trace the mismatch between geometric and thermal modelling.

The majority of architects (36% and 40%) in both surveys were using Google SketchUp for geometric modelling. The second most used software was AutoCAD, cited by 31% and 35% of the respondents of the two surveys. Revit came in third place, used by 21% and 15% of respondents, and ArchiCAD (10% and 7%) came last. On the other hand, the majority of engineers (48% and 47%) were using AutoCAD, followed by Revit (27% and 30%). In fact, the existing categories of this question did not offer enough choices for engineers: under the "Other" option, Revit MEP, DDS-CAD and Bentley MicroStation products were listed repeatedly.

The two groups use different tools for geometric modelling: architects favor SketchUp while engineers favor CAD applications. Finding a common geometrical modelling medium is therefore still a challenge. Geometrical modelling should not only cater to the whole design team but also cover aspects of building performance (Mahdavi 1998).

How many tools do you use when performing simulations for a project?

The number of tools used per project varies widely. The majority of architects use one tool per project (49% in the first survey and 45% in the second), although a large proportion (38% and 43%) use two tools. Conversely, the majority of engineers (38% and 36%) use at least two BPS tools per project. In the first survey, 29% of engineers confirmed the use of only one tool and 10% the use of three tools; in the second survey, 30% confirmed the use of three tools, followed by 25% confirming the use of only one tool per project.

Architects tend to use a single simulation tool, perhaps because most architects are non-specialists and use BPS tools only during the early design stage. Building services engineers, on the other hand, use tools during different design phases, are more acquainted with BPS tools and rely more on them for thermal energy calculation, system sizing and energy compliance issues. In fact, for building services engineers simulation tools are almost a mandatory instrument, assumed to be a 'sine qua non' of engineering practice (De Wilde and Prickett 2009).

What is the primary building type you model?

The majority of architects run building simulations for residential buildings, followed by office buildings and educational buildings. For engineers, the most common building type is office buildings, followed by educational buildings and retail buildings; residential buildings came last. Under the "Other" option, engineers listed additional types, namely medical, laboratory and religious buildings.

The results indicate that most architects' simulation experience is with residential buildings, while engineers mostly model larger buildings. In fact, this is a true reflection of what happens in practice. Most residential buildings are designed by architects who have no budget or time to hire a consultant and therefore rely on in-house experience. On the other side, large-scale projects such as office, retail and educational buildings require simulation experts and HVAC engineers, who receive the attention and resources to be major players in the design team.

For which design phase would you use the following programs?

In a follow-up question, respondents were asked to indicate the design phases in which they use each tool, yielding the typical usage phases for the ten tools. There was no difference between the architects' and engineers' classifications, so all responses were binned together. HEED, Energy 10 and GBS were considered early design tools that might be used during the pre-schematic design phases, followed by ECOTECT and eQUEST, which were classified for use during the schematic design phase. DB, EP SU, EP and DOE-2 were considered extensive tools used for detailed analysis during the design development and design optimization phases.

What are the parameters you focus on the most, when performing simulation?

This question reveals another contradiction between architects' and engineers' priorities. Both groups were asked to classify and rank 15 design parameters. Both groups agreed that energy consumption is the most important parameter. For architects, comfort, shading, passive solar heating, orientation and natural ventilation filled ranks 2 to 6, and the three least important parameters were efficient lighting, building tightness and controls. Engineers, on the other hand, ranked HVAC systems, controls, comfort, glazing and openings in the top five after energy consumption, and their three least important parameters were natural ventilation, daylighting and photovoltaics.

A common observation is that comfort was placed near the top of both lists. Surprisingly, the largest difference was recorded for controls and HVAC systems: engineers ranked them at the top of the list while architects placed them at the bottom, despite prioritizing the energy consumption parameter. This question indicates a huge gap between the two user groups' preferences. Perhaps the separation of building design practice between engineers and architects is the reason (Deru and Torcellini 2004): architects are occupied with building design issues such as geometry, orientation, natural ventilation and daylighting, while engineers are occupied with mechanical systems and controls.
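The text does not state how the individual rankings of the 15 parameters were aggregated into a single order per group. One common way to combine such rankings is a Borda count, sketched below with invented votes purely to illustrate the mechanism.

```python
from collections import defaultdict

def borda(rankings):
    """Aggregate individual rankings with a Borda count.

    rankings: list of lists, each ordering items from most to
    least important; the top position earns the most points.
    """
    scores = defaultdict(int)
    for ranking in rankings:
        n = len(ranking)
        for pos, item in enumerate(ranking):
            scores[item] += n - pos
    return sorted(scores, key=scores.get, reverse=True)

# Invented example votes (not survey data).
votes = [
    ["energy", "comfort", "shading"],
    ["energy", "hvac", "comfort"],
    ["hvac", "energy", "controls"],
]
print(borda(votes))
```

Whatever the survey's actual aggregation method, the gap described above means the two groups' aggregate orders diverge sharply below the shared top item, energy consumption.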


Indicate how important you think each of the following objectives is, concerning usability and graphical visualization of the interface.

Ranking the sub-criteria was identical for both disciplines. 23% of architects and 26% of engineers agreed that the graphical representation of the output results is the most important feature concerning the usability and graphical visualization of the interface. There was also consensus that flexible use and navigation (17% of architects and 22% of engineers) is the second most important feature, followed by the graphical representation of results in 3D spatial analysis (16% of architects and 17% of engineers). Surprisingly, both groups agreed that easy learnability and a short learning curve are the least important features. The result of this question indicates the importance and urgency of representing simulation results graphically in a way that can be clearly and easily interpreted. However, it is important to point to the risk of being seduced by graphic output that impedes the critical examination of the results (Donn 2001).

Indicate how important you feel each of the following objectives is, concerning information management of the interface.

Architects' first priority (28%), concerning the information management of the interface, is the ability to create comparative reports for multiple alternatives. Quality control of simulation input comes in second place with approximately 24% of the architects' votes. However, it is difficult to draw conclusions here because the remaining features came very close in the number of votes: 18% for allowing assumptions and default values to facilitate data entry and 16% for flexible data storage and user-customizable features. On the other hand, the engineers' first priority (39%) was the quality control of simulation input. Flexible data storage and user-customizable features came in second place with 20% of the engineers' votes, while the creation of comparative reports for multiple alternatives collected 18% of the votes and the ability to allow assumptions and default values to facilitate data entry 16%.

This question revealed an important finding. Engineers clearly identified the quality control of simulation input as the most important feature concerning information management of the interface. This is not surprising, because the issue of attaining quality assurance of simulation input is repeatedly highlighted in the literature (Donn 2001; Augenbroe 2002; Pilgrim, Bouchlaghem et al. 2003; Hensen and Radoševic 2004; Ibarra and Reinhart 2009; Tian, Love et al. 2009). However, architects prioritized the ability to create comparative reports for multiple alternatives above input quality control, which suggests that assigning meaningful and accurate input data is not their priority. An explanation might be that architects engage with BPS tools mainly during the early design phases and use them for decision making and design optimisation. As mentioned by Donn (1987), precision is not so important to architects if all they are looking for is an answer to a 'what-if' question. The two groups' choices therefore differ because of the different design phases they work in and the different types of knowledge they require and process.

What other features should be improved in the future development of Building Energy Simulation Tools concerning friendliness, usability and information management of the Interface? (optional)

The last question in this part was open-ended, aiming to give participants the opportunity to share or clarify their opinions. The respondents reported a range of comments that were classified for each group as follows.

Architects' comments include:

  • Regarding skill level of user
  • Allowing debugging and need wizard like assists in data entry: ready examples, etc.
  • Defaults templates, but also front-and-center delineation and ability to create/modify those templates
  • Error-checking to ensure models are correct
  • Mismatch between the common form of input/output in most tools and the architects expectations
  • User friendly HVAC templates
  • 3D visualization of design strategies, for e.g. daylighting
  • Graphical representation of design parameters (use the language of architects)
  • Easy searchable building inputs database
  • Balance between extensive (deep) and quickly (basic) guaranteeing meaningful input data
  • Modify the floor plan after it is initially drawn, ability to add/remove building features with ease, ability to make custom reports, ability to easily navigate all components with ease.

Engineers' comments include:

  • Provide echo-print of input in a readable format
  • Mapping data entry trees and limiting access to relevant paths to objectives
  • Have a huge amount of customizability in terms of output. It would be great to be able to select output format, delimiters, etc, so that one could optimize output for input into any visualization package one wished with scripting capabilities
  • Transparent default options, no more black box, more background information through links
  • The number of available input parameters in many building models is so great that it is almost certain that errors (some minor, some major) will exist.
  • Convert SI to IP units and vice versa
  • Supporting database management

Part II - Integration of KNOWLEDGE-BASE

Indicate how important you think each of the following objectives is, concerning Integration of Knowledge-Base

Both groups identified the ability to provide guidelines for buildings codes and rating systems compliance as the most important feature in BPS tools. The ability to provide case studies database for decision making came in second place. The result is not surprising and there is a common ground between both disciplines concerning the integration of knowledge-base.

Indicate how important you feel each of the following objectives is, concerning Intelligent Knowledge-base and Design Process

Architects' top priority concerning the integration of the intelligent knowledge-base and compatibility with the design process was the ability to provide quick energy analysis to support decision making (33%). The next priority was the ability to examine the sensitivity and uncertainty of key design parameters (29%), followed by the ability to analyze weather characteristics and suggest suitable climatic design strategies (20%). The fourth and last criterion was the overall embracement of design during most design stages. Engineers, however, had a different order of priorities: the most important feature was the examination of sensitivity and uncertainty of key design parameters, receiving 55% of the votes, while the ability to provide quick energy analysis to support decision making came in second place with 23% of the votes. The last two sub-criteria had the same ranking as for architects.

Here we find a seemingly contradictory result: architects favor the ability to support decision making while engineers favor the ability to examine the sensitivity and uncertainty of design parameters. Although the two criteria seem different, a closer look at the wording and semantics reveals that they convey the same message. Architects chose the words 'support decision making' and 'quick analysis' while engineers chose 'sensitivity and uncertainty' and 'design parameters'. In fact, both criteria support the same thing, namely decision making; the architects chose the general terms and the engineers a specific issue. Perhaps most architects did not know what sensitivity and uncertainty mean, or perhaps engineers found the phrase 'support the decision making' too vague.

Another remark relates to the last sub-criterion. The number of architects who chose the ability to embrace the overall design during most design stages is three times the number of engineers. Although the total number of architect respondents (249) exceeds that of engineers (232), the comparison is still valid and a clear difference can be identified. This specific sub-criterion highlights a very important issue that might be veiled behind the limitations of the question type and method: architects are seeking tools that embrace the overall design during both early and late design phases (Attia, Beltran et al. 2009).

What other features should be improved in the future development of Building Energy Simulation Tools concerning Integration of Intelligent Design Knowledge-Base?

The last question in this part was open-ended, aiming to give participants the opportunity to share or clarify their opinions. The respondents reported a range of comments that were classified for each group as follows.

Architects' comments include:

  • Scenario/Alternatives based design approach
  • Define the most influential design parameters in early design phases and their sensitivity range
  • Assisting decision making process through guidance
  • Cases of low-cost, ultra energy efficient buildings and LEED buildings
  • Guidance and rules of thumb on passive design
  • Passive system - simulation , Simulation of traditional passive design strategies and free-running buildings
  • Guide users into sound building science designs the way the Passive House Standard training does.
  • Assist decision on materials to be used in the design
  • Conform to codes and rating systems
  • Large design components libraries e.g. double façade, green roof
  • Contextual material property database
  • Inclusion of various rates of ventilation based upon latest ASHRAE or IMC standard, and ability to compare differences in ventilation based upon the different codes. Also the ability to utilize ASHRAE's intermittent occupancy calculation or air quality calculation based upon intermittent occupancy and advanced filtration, respectively.
  • Explain what the tool is doing
  • Comprehensive HELP menu
  • Developers need to understand the design process through the eyes of non-engineers

Engineers' comments include:

  • Diagnostics to assist with debugging; benchmarking for comparison of results (e.g. EPA databases)
  • Default or built in performance comparisons, benchmarking or ratings such as Energy Star or LEED
  • Multi-objective design optimization
  • Assistance on control settings (e.g. air flow and set point temperatures)
  • System recommendations arrived at through an algorithm of climate and building usage
  • Interface with manufacturers' information, e.g. standard formats for MEP equipment, windows, etc. that can be imported directly. Agencies such as ORNL could output test results on materials and assemblies in this format, and companies could provide product information in it. This would make it easier to incorporate reliable and effective data into simulation modeling, much as manufacturers already offer DXF plans, models, and details of products and components.
  • Introducing optimization models to identify optimal design considering performance and cost

Part III - ACCURACY of the tools

Indicate how important you think each of the following objectives is, concerning tools ACCURACY.

Architects' first preference (40%) is the confidence to create real sustainable designs. This choice is in line with the studies of Holm and Donn, which confirm that many architects performing building simulations doubt the ability of simulation-based design to create truly sustainable designs (Holm 1993; Donn 2001). The second priority of architects (28%) is the ability to provide accurate and realistic results, followed (18%) by the ability to provide validated performance measures. The ability to calibrate uncertainty (8%) and a high resolution of the simulation model (6%) were the least important criteria.

On the other hand, most engineers (31%) agreed that accurate and realistic results are the most important feature concerning tool accuracy. The second most important sub-criterion (29%) is the ability to provide validated performance measures to support design decisions, and the third (21%) is the ability to calibrate uncertainties. The ability of BPS tools to create real sustainable results (10%) and a high resolution of the simulation model (9%) were the least important criteria.

Engineers' answers are not surprising because they are in line with many publications and surveys that stress accuracy, validation and calibration. The architects' responses, however, indicate a problem of confidence in simulation results. Architects are seeking assurance that the building model they have simulated with a BPS tool represents the real building (Donn 2001). There is also a lack of knowledge about accuracy requirements. Perhaps it is a problem of language and nuance: the words 'calibration', 'validation' and 'model resolution' are not common, and many architects have never been exposed to these concepts.

Indicate how important you think each of the following objectives is, concerning tools ability to simulate complex & specific building components

Almost one third (31%) of architects chose the ability to model complex design strategies and elements as the most important feature of BPS tools. The ability to simulate renewable energy systems came in second place (27%), followed by the ability to evaluate the emissions associated with energy consumption (17%) and the ability to support various types of HVAC systems (13%). The ability to perform cost and LCC analysis (8%) and to allow different building types (4%) came last.

Engineers, on the other hand, placed the ability to support various HVAC systems first (33%). Next, they favored cost and LCC analysis (24%). The ability to simulate complex design strategies and construction elements collected 22% of the votes. Fewer votes went to the ability to simulate renewable energy systems (11%) and to evaluate the emissions associated with energy consumption (8%).

This question highlights the contradiction between architects and engineers and, in line with the results of question 4.8, reflects a gap. Most architects are looking for tools to apply passive design strategies and technologies such as double-skin facades, green roofs, heat recovery, thermal storage, atria and concrete core conditioning, including renewables and HVAC types. Engineers, on the other hand, are concerned mainly with HVAC systems, controls and LCC, issues to which architects have rarely been exposed (Holm 1993). The two groups showed different interests, perhaps because of the different design stages each group is concerned with: architects favor criteria that feed the process of energy concept initiation, while engineers favor criteria that feed the process of energy-related building optimization.

What other features should be improved in the future development of Building Energy Simulation Tools concerning the ability to simulate complex & specific building components? (optional)

The last question of this part was open-ended, aiming to give participants the opportunity to share or clarify their opinions. The respondents reported a range of comments, classified for each group as follows.

Architects' comments include:

  • Renewable energy systems calculators should be a part of the package and tied into the overall project's energy performance.
  • Tools cannot be called "BESTs" until they can do all of this and certify Passive Houses
  • Passive strategies such as green roofs and natural ventilation
  • Embodied energy calculation
  • Ability to easily simulate essential elements (i.e. fins overhangs) in sufficient detail
  • Building envelope design optimization
  • Consider natural ventilation to combine with HVAC system design
  • Integration of daylighting & daylight energy with other tools like EP
  • Inform users as to the cost impacts of energy reduction measures

Engineers' comments include:

  • Real-time results, parametric feedback.
  • Collecting realistic data from cases to establish performance based data sets
  • Optimized for small, ultra-efficient buildings
  • Data to measure uncertainty
  • Adapt to the complexities of the real life designs and climatic conditions
  • Model thermal mass, air-to-air heat exchangers, passive and active solar gains, the most efficient lighting and passive drying options, radiant slabs/beams, ground-source HX, heat recovery chillers, etc.
  • Better analysis for double skin facades, VRV systems, dedicated outdoor air systems, and natural ventilation
  • Indication of the degree of error that could be expected in the results
  • Error estimate of models for validation and acceptable error range
  • Validation and Verification of the simulation output
  • Gather data after implementation and feed the performance data back into the software
  • Be built on an underlying database to aid in benchmarking
  • Perform trade-off analysis and an LCA tool to compare different options
  • Ability to model complex HVAC and lighting control strategies
  • Wider range of HVAC and natural ventilation modeling techniques
  • Simulate monitor daylighting, displacement ventilation and chilled beam systems
  • Test cases representing building in reality
  • Robustness of models; features should not be added until they are well-tested and well-considered
  • Allow more than one system per zone
  • Describe uncertainty with the data model
  • Clarity on the algorithms used to perform the simulations and the limitations of those algorithms

Part IV - INTEROPERABILITY of Building Model

Indicate how important you think each of the following objectives is, concerning interoperability of the building model

In first place (39%), architects chose the ability to exchange models with 3D drawing packages such as SketchUp and 3ds Max. The second choice was the exchange of models with CAD programs (25%). The exchange of models for multiple simulation domains and the exchange of models with MEP drawing packages came last, with almost no difference in preference (18.3% and 17.8%).

On the other hand, engineers prioritized different sub-criteria. The most important was the ability to exchange models with MEP drawing packages such as Revit and Bentley products (45%). In second place (35%) came the ability to exchange models for multiple simulation domains, and in third place (18%) the ability to exchange models with CAD programs. Not surprisingly, the least important feature was the ability to exchange models with 3D drawing packages, with less than 2% of the votes.

What other features should be improved in the future development of Building Energy Simulation Tools concerning the interoperability of the building modelling?

The last question of this part was open-ended, aiming to give participants the opportunity to share or clarify their opinions. The respondents reported a range of comments, classified for each group as follows.

Architects' comments include:

  • Allowing organic modeling of curved volumes and non-cubical zones and volumes
  • Ability to easily model essential elements (i.e. fins overhangs) in sufficient detail
  • Ability to directly import .dwg or Revit files.
  • Allowing input from multiple modeling programs (SketchUp, Rhino, 3ds Max, Revit, etc.) easily and with minimal error. There needs to be clear guidance on how to build models in each of these interfaces so that the models can be used quickly and easily.
  • Developing complex geometries
  • Ability to merge architectural CAD drawings into respective thermal zones
  • Change building geometry without having to re-enter all data variables from scratch
  • Importing of detailed geometries with more accuracy and all layers being correctly imported in energy simulation software
  • Proper translation of the geometry in complex models

Engineers' comments include:

  • One common language like gbXML (but more robust) should become an open standard; third-party organizations need to create a standard language
  • 3D parametric modelling
  • Full IFC compliance: Import / Export equally robust, all elements that can be modeled must be able to be exported / imported in IFC with all relevant data (at a minimum name, type, size, material) - this includes MEP as well as Architecture & Structure
  • Exchange of models needs to be more seamless and less frustrating, which would greatly facilitate the iterative process of optimizing the design
  • Library of building components and building assemblies in a common format or formats (gbXML, IDF)
  • Components that include data that describe how they behave
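
To make the engineers' call for "one common language" concrete, the sketch below builds and re-reads a minimal gbXML-style fragment for a single wall surface. The element names (`Campus`, `Surface`, `PlanarGeometry`, `PolyLoop`, `CartesianPoint`, `Coordinate`) follow the public gbXML schema, but this fragment is deliberately simplified and is not a schema-complete document; the geometry values are invented for illustration.

```python
# Illustrative sketch only: a simplified gbXML-style fragment for one wall,
# written out and parsed back, as a tool-to-tool exchange would do.
import xml.etree.ElementTree as ET

GBXML_NS = "http://www.gbxml.org/schema"  # published gbXML namespace

def make_minimal_model() -> ET.Element:
    root = ET.Element("gbXML", {"xmlns": GBXML_NS})
    campus = ET.SubElement(root, "Campus", {"id": "campus-1"})
    surface = ET.SubElement(campus, "Surface",
                            {"id": "su-1", "surfaceType": "ExteriorWall"})
    ET.SubElement(surface, "Name").text = "South wall"
    loop = ET.SubElement(ET.SubElement(surface, "PlanarGeometry"), "PolyLoop")
    # Four corner points of a 10 m x 3 m wall (x, y, z in metres)
    for x, y, z in [(0, 0, 0), (10, 0, 0), (10, 0, 3), (0, 0, 3)]:
        point = ET.SubElement(loop, "CartesianPoint")
        for value in (x, y, z):
            ET.SubElement(point, "Coordinate").text = str(value)
    return root

xml_text = ET.tostring(make_minimal_model(), encoding="unicode")

# A receiving tool parses the same text back into namespaced elements:
parsed = ET.fromstring(xml_text)
corners = parsed.findall(f".//{{{GBXML_NS}}}CartesianPoint")
print(f"recovered {len(corners)} corner points")
```

The point of the exercise is the round trip: the same plain-text model can be written by one tool and read by another without loss, which is exactly the seamless exchange the respondents ask for.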

Part V - Most important features of a simulation tool

What are the most important features of a simulation tool?

This is one of the most important questions of the survey. The question was repeated in both surveys, aiming to benchmark and rank the importance of the major assessment criteria for BPS tools. The results show a strong cross-disciplinary difference. Architects in both surveys agreed on their priorities and ranking of the major criteria. For architects, the most important criterion (31% and 34%) was the ability of the tool to integrate an intelligent design knowledge-base to assist designers in decision making. This was, surprisingly, more important than the friendliness of the interface concerning usability and information management (28% and 30%). In third place came the IBM, and the AADCC came last (18%). These results reveal a very interesting finding: respondents prioritize the IIKB over the UIM of the interface and even over the AADCC. We believe that architects work mostly during the early design phases and therefore need guidance to answer 'what if' scenarios that can assist the design optimization process. More importantly, architects lack knowledge of building science and building behavior and therefore require constant information and educational knowledge that guides them into building science (Attia, Beltran et al. 2009). In this context, the accuracy of simulation results is less important to architects than understanding the relative effect on performance of changes between design alternatives. This finding also suggests that the accuracy of the simulation model should be adaptive and adjustable to the user type and design phase, to correspond to the different needs of architects as well as engineers.

On the other hand, engineers had a different ranking, with agreement between both engineers' samples. Engineers ranked the accuracy of tools and the ability to simulate complex elements first (42% in both surveys). The second most important criterion (25% and 24%) was the friendliness of the interface concerning usability and information management, followed very closely (22% and 24%) by the ability of the tool to integrate an intelligent design knowledge-base to assist designers in decision making. The interoperability of the building model came last, with 11% and 9% in the two surveys.

Part VI - Evaluating and ranking ten tools Architects versus Engineers (primary results)

Besides analyzing the criteria that influence the performance of any tool, the study aimed to compare and evaluate ten existing tools. Needless to say, we believe that it is difficult to compare and evaluate tools in absolute terms, because each tool has its advantages and limitations; however, the comparison makes it possible to identify tendencies. The ranking of the tools results from a compilation of the answers to 7 questions (4.9, 4.10, 4.12, 4.13, 4.15, 4.16 & 4.18). Each question was followed by a follow-up question asking the participants to rank the tools according to the question's sub-criteria. Results were classified into two groups.
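
The paper does not state the exact rule used to compile the seven per-question rankings into one overall ranking. The sketch below assumes a simple mean-rank compilation; the tool names and rank positions are invented for illustration only.

```python
# Hypothetical sketch of compiling an overall tool ranking from per-question
# rankings. The mean-rank rule is an assumption, not the survey's method.
from statistics import mean

# Rank position of each (invented) tool in each of the 7 questions (1 = best)
per_question_ranks = {
    "Tool A": [1, 2, 1, 3, 2, 1, 2],
    "Tool B": [2, 1, 3, 1, 1, 2, 1],
    "Tool C": [3, 3, 2, 2, 3, 3, 3],
}

def overall_ranking(ranks: dict[str, list[float]]) -> list[tuple[str, float]]:
    """Order tools by mean rank across all questions (lower is better)."""
    return sorted(((tool, mean(r)) for tool, r in ranks.items()),
                  key=lambda item: item[1])

for tool, score in overall_ranking(per_question_ranks):
    print(f"{tool}: mean rank {score:.2f}")
```

Any monotone aggregation (median rank, Borda count) would serve the same purpose; the point is only that one compiled score per tool underlies the two classified result groups.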

Most evaluation tools serve advanced design stages, after the architect has already proposed a solution (Yezioro 2008).

Complex tools cause reservations among architects.

IES users have no access to source code and documentation.

EP and DOE-2 came last not only because they lack a friendly GUI, but also because they require extensive input data (Hopfe, Struck et al. 2005).

One thing we can learn is that architects and engineers do not need to use the same interfaces or tools, but they should at least use the same engine.

General Comments

The final survey screen invited participants to comment on what should be done to increase the integration of BPS tools in the design practice.

(Architects & Engineers) Integrated Building Design Process

  • Should include building owner, building users, government regulatory and advisory agents, engineering, construction, facilities management agents.
  • Toolkits for corresponding all design stages
  • Tools are not practical for the design process
  • Being able to work with the software at conceptual and DD level that outputs information that is really useful.
  • Integrate different performance domains
  • The flexibility to provide basic information during pre-design while more complex information in later design phases.
  • Reliable tools address late design phases
  • Integrated tools intended for early phase design decision making
  • Automatic graphic output (plots and graphs) of simulation results
  • Suitability for the entire design process
  • Integration of various analysis features in a single software

(Architects & Engineers) Tools and training cost, learning curve and future development

  • Proper training in building science
  • Easy learning curve
  • Cost of programs for students
  • Tutorials, help menus, courses
  • Video guidance on how to use the tools
  • Adequate help, either at the start of the tool or wherever necessary while performing a simulation/calculation

Discussion

This research reviews the current situation of BPS tools among architects and engineers in the US, considering the use of simulation in building design and the design process. The ultimate objective of this research was to define generic tool assessment criteria for software developers and to compare the requirements of architects vis-à-vis engineers under five classified criteria. These five criteria were tested through two online surveys. The survey results confirmed that there is a large gap between architects and engineers and that the classification into five assessment criteria has inherent limitations.

Concerning the survey results, we should remind ourselves that the survey was aimed at beginner tool users and that the respondent samples are small and not representative. However, to the authors' surprise, the open-ended questions produced the most valuable information. Based on the questionnaire and an in-depth literature review, we can summarize the research findings under two main subjects. The first, discussed in section 5.1, is the tool assessment criteria for BPS tools, addressing mainly software developers; the idea here is to present criteria that can be used to assess BPS tools as a technology, or 'hardware'. The second subject concerns the human factor, or the 'software': the gap between architects and engineers as BPS users. In section 5.2 we try to analyze the reasons for this gap (academia, lack of code enforcement, etc.) and finally discuss suggestions to overcome it in the future.

Tools Assessment Criteria

The literature review conducted in section 2 contains pieces of all five perspectives. These five criteria allowed us to classify and group the users' wishes and needs, and the survey generated very comprehensive and abundant wish lists, as presented in section 4. Tool developers should tap into those wish lists and understand the different perspectives and needs of architects and engineers. Comparing the ten tools might be viewed as insignificant, given the short shelf life of any tool comparison study, but relating the questions to real tools allowed the BPS functions required by both groups to be recorded and identified in order to present these wish lists. The next section discusses the survey results under the five criteria.

Usability and information management of the interface

On the level of usability and graphical representation, the findings of the questionnaire suggest that users commonly need varied and customizable graphical representations of inputs and output results, including 3D visualization of design strategies and analyses, in addition to more flexible use and navigation of the interface. It is important that simulation results be visualized within the 3D model environment.

On the level of information management, there are many capabilities and needs that are not supported by simulation tools. Users need to compare multiple analyses of alternatives, easily manage supporting databases, and ensure quality control of input through data entry mapping and error-checking features. Beginner users are overwhelmed by complex input parameters that require domain expertise, with no guidance to assure a minimum quality of simulation input. There is also an emerging call for debugging facilities and for transparent, modifiable default templates.

The respondents put forward two missing features that were not included in the questionnaire. Both user groups are dissatisfied with the current inflexibility of data input, and further work is required to provide an adaptive GUI. An adaptive interface would balance extensive and basic data input in relation to the user type and skill level. Tools should allow users to go back and forth between simple visual interfaces and detailed models. Different users need to be addressed with different interfaces and graphics: specialists will want an in-depth understanding, while less experienced users will want a quick evaluation. The GUI should be adaptive and flexible to improve usability, allowing simple and basic data entry choices for non-specialists and, at the same time, detailed and complex data entry choices for specialists.

Also, the interfaces of most existing tools are designed with an input/output logic that does not correspond to architects' expectations. The idea here is not to supplement text-based input with a graphical icon and claim that this makes the tool more architect-friendly. Rather, the design of a user-friendly GUI should correspond to the parameters and decisions that the architect is dealing with, and input and output formats should be user oriented. It is recommended that researchers and developers focus on providing tool interfaces that use a language familiar to architects and explicitly support different users' needs.

Integration of intelligent design knowledge-base (IIKB)

Under the second category, integration of an intelligent design knowledge-base (IIKB), the survey findings suggest that users commonly need KB systems that advise on code, rating and certification compliance. The increasing complexity in the design and performance evaluation of buildings has resulted in the need to use BPS tools, and knowledge-based systems can provide decision support systems and databases. Users repeatedly mentioned that tools should embed integrated consideration of passive, ultra-low-energy and LEED buildings. The survey suggests that users are dissatisfied with tools that do not embrace an alternatives-based approach; the performance evaluation process requires the comparison of multiple alternative design schemes. Users are also dissatisfied with tools that over-rely on mechanical systems to achieve comfort and obscure passive design strategies.

On the level of intelligence, the survey findings highlight the importance of providing quick energy analyses that support decision making, in addition to conducting quick parametric studies and examining the sensitivity and uncertainty of key design parameters in a simple way. The survey confirms that architects and engineers generally use different knowledge types. Architects require tools during early design phases that assist decisions on the building geometry and envelope in relation to its physical and climatic context, while engineers require tools that assist decisions on HVAC system design, occupancy energy management and control settings.

Respondents suggested many other capabilities that were not included in the questionnaire but fall under the IIKB criterion: for example, benchmarking and result comparison features, and the inclusion of contextual knowledge bases for material properties, design component libraries (e.g. double façade, green roof), occupant behavior, climatic design characteristics, and local codes and standards. Furthermore, innovative strategies for energy saving such as reflective roofs, daylighting, free cooling, solar hot water heating, heat recovery and thermal storage could be evaluated before implementation (Hong 2000). Many respondents also suggested the introduction of optimization models that can identify optimal design decisions regarding energy performance and cost for architects and engineers. It is clear that BPS tools of the future must help and inform different users at different design stages in order to identify optimal building design strategies.
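
The "optimization models" respondents ask for typically trade energy performance against cost. As a minimal sketch of the idea, the code below keeps only the Pareto-optimal design alternatives, i.e. those for which no other alternative is better on both objectives. All design names and numbers are invented for illustration.

```python
# Sketch of multi-objective design screening: filter candidate designs down
# to the Pareto front on (energy, cost), where lower is better on both.

def pareto_front(designs: dict[str, tuple[float, float]]) -> list[str]:
    """Return names of designs not dominated on (energy, cost)."""
    front = []
    for name, (e, c) in designs.items():
        dominated = any(e2 <= e and c2 <= c and (e2 < e or c2 < c)
                        for n2, (e2, c2) in designs.items() if n2 != name)
        if not dominated:
            front.append(name)
    return sorted(front)

candidates = {
    "base case":       (120.0, 100.0),  # (kWh/m2.a, cost index) - invented
    "more insulation": (95.0, 110.0),
    "double facade":   (90.0, 140.0),
    "worse option":    (125.0, 115.0),  # dominated by the base case
}
print(pareto_front(candidates))
```

A real optimization model would generate and simulate the candidates automatically (e.g. with a genetic algorithm), but the decision-support value for the user is the same: a short list of non-dominated alternatives instead of a single 'optimal' answer.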

Accuracy and ability to simulate complex and detailed building components (AADCC)

The third assessment criterion investigated the accuracy and ability to simulate complex and detailed building components (AADCC). The findings of the questionnaire suggest that users commonly need accurate and validated performance measures and, above all, confidence that BPS tools can create truly sustainable buildings. The calibration of uncertainty and fluidity of model resolution must be supported by the tools. Survey respondents look forward to simulating the performance of specific design strategies and building components, including complex HVAC systems. Sizing and estimating renewable systems, CO2 emissions, energy cost analysis and LCCA are not commonly supported by simulation tools, so users often employ additional calculation tools that require extra time and cost.

The survey shows that architects define accuracy differently from engineers. Architects, who work during the early design phases, want answers to 'what-if' questions and want to compare different design alternatives; thus they are looking for tendencies. Engineers, on the other hand, who work in later design phases, are looking for high model accuracy with validated and calibrated performance results.

Analyzing the answers to the open question, we found that users suggested improving model robustness. Further work is required to indicate the degree of error that could be expected in the results, to provide error estimates of models for validation and an acceptable error range, and to describe the uncertainty of the data model, the algorithms used and the limitations of those algorithms. Survey responses suggest the need for higher model resolution/detail and better model assumptions that allow integrated sub-system design. Most of these requirements are not new, but according to users they are missing in existing tools. Users also suggested gathering measured operating data after implementation and feeding the performance data back into the software.

Future tools should adopt post-construction monitoring and verification exercises. These would provide opportunities for calibrating models and help in reviewing the design assumptions; subsequently, building simulation can supplement energy auditing to check the energy performance of the as-built building. Collecting realistic data from real cases to establish performance-based data sets, aid benchmarking, measure uncertainty and generate real-time results is not commonly supported by BPS tools. The survey identified a gap between predicted and real energy use, and users suggested more adaptation to the complexities of real-life designs and climatic conditions. Features should not be added to tools until they are well-tested and well-considered. Another identified gap is the widening discrepancy between tool capabilities and new technologies: BPS does not meet users' changing needs for new building components and system applications. The survey suggests that users are dissatisfied with the current obstructions to simulating passive technologies and solutions such as thermal mass, air-to-air heat exchangers, passive and active solar gains, efficient lighting, passive drying options, double-skin facades, VRV systems, radiant slabs/beams, ground-source HX, heat recovery chillers, etc. Future work should bridge this gap and address the need for more detailed simulation of systems and components, assuring higher quality and higher model resolution.
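
The calibration check described above is usually expressed with two statistical indices comparing simulated results against measured data, NMBE and CV(RMSE), as used for example in ASHRAE Guideline 14 (note that normalization conventions, such as dividing by n or n-p, vary between documents). The sketch below uses the simple n-based form, and the measured/simulated series are invented for illustration.

```python
# Sketch of a model-calibration check: comparing simulated energy use against
# measured post-occupancy data with NMBE and CV(RMSE) indices.
from math import sqrt

def nmbe(measured: list[float], simulated: list[float]) -> float:
    """Normalized mean bias error, in percent (negative = over-prediction)."""
    n, mbar = len(measured), sum(measured) / len(measured)
    return 100.0 * sum(m - s for m, s in zip(measured, simulated)) / (n * mbar)

def cv_rmse(measured: list[float], simulated: list[float]) -> float:
    """Coefficient of variation of the root-mean-square error, in percent."""
    n, mbar = len(measured), sum(measured) / len(measured)
    rmse = sqrt(sum((m - s) ** 2 for m, s in zip(measured, simulated)) / n)
    return 100.0 * rmse / mbar

measured_kwh  = [100.0, 120.0, 90.0, 110.0]   # monthly metered energy (invented)
simulated_kwh = [95.0, 130.0, 85.0, 105.0]    # model predictions (invented)

print(f"NMBE     = {nmbe(measured_kwh, simulated_kwh):+.2f} %")
print(f"CV(RMSE) = {cv_rmse(measured_kwh, simulated_kwh):.2f} %")
```

Feeding measured data back into the tool and recomputing these indices after each model adjustment is exactly the calibration loop the respondents describe.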

Interoperability of building modelling (IBM)

Under the fourth category, interoperability of building modelling, the survey findings suggest that users commonly need to exchange the geometric building model accurately with simulation tools. For architects the priority is drawing and CAD packages, while for engineers it is MEP drawing packages. The survey suggests that users are dissatisfied with tools that obstruct organic modeling of curved volumes and detailed elements such as fins and overhangs. Another frustrating obstacle is the difficulty of merging the geometric model into thermal models with full zone representation. Most engineers use more than one tool, according to the survey findings, which entails managing and exchanging the design for every tool; this process is tedious and creates a barrier in practice. Moreover, IBM is only addressed during late design phases, for large-scale and multi-disciplinary team-based projects.

However, IBM should also be addressed for small-scale projects, and IBM in BPS tools should correspond to the user type and design phase. Respondents suggested that one common language like gbXML (CAD) become an open standard, alongside full IFC (BIM) compliance. Architects would like to see fluent building modelling technologies that do not add complexity and that allow exporting simple geometrical models back and forth with little input during early design phases. Engineers prefer seamless model exchange that facilitates iterative design optimization. It is clear that geometric modelling in the future should be attuned to early design phases, allowing concept development. The tool mechanism should encourage the design team approach and allow architects and engineers to input building data within an integrated central building model. It should also emphasize the use of a 3D model from the beginning of the process, with the level of geometric detail gradually increasing.

Integrated building design process (IBDP)

The integrated building design process, as one of the five assessment criteria, was not presented explicitly in the questionnaire. However, the survey findings proved that this criterion is one of the most important ones. Users mentioned that existing tools are not practical for the design process, and the open questions provided a rich source addressing this issue. The two important findings of the survey concerning the integrative design process are (1) the integration of BPS in different design phases and (2) the integration of various users in the design process.

Concerning (1) the integration of BPS during all design phases, users reported that they commonly need fluid tools that can produce initial results from a rough building representation during early design phases while allowing the detailing of building components during later design phases. Users complained that most existing BPS tools address late design phases. In this survey, only a few tools had a GUI that could progressively reveal different levels of pertinent input demands to assist decision making at different design stages. Interfaces for engineers can look different from interfaces for architects; in order to integrate BPS tools in the design process, different user interfaces must communicate with different users in their familiar language.

Users suggested developing toolkits that correspond to all design stages, providing the flexibility to supply basic information during pre-design and more complex information in later design phases. Further work is required to better understand existing building design and delivery processes in order to extend the application of simulation to all design phases. The building thermal model should evolve through the design process as the model resolution becomes more highly specified: as the design progresses, the design team can move the model to higher-resolution options incorporating more accurate computational algorithms in order to produce higher-resolution performance results.

Concerning (2) the integration of various users in the design process, users reported that tools should cater more to design teams. The findings of the questionnaire suggest that architects have more confidence in tools used by engineers, while engineers have more confidence in tools that facilitate multidisciplinary work and can be shared by the design team. Tools that allow integration and interdisciplinary work were ranked higher than tools that focus on individual and mono-disciplinary work. The survey respondents suggested that BPS should also address building owners, users and facility management agents in order to include their feedback in the process.

Further work should support multi-disciplinary collaborative design of building projects. BPS may have started discipline-oriented, but it should become design-oriented in the future: the mono-disciplinary approach should be replaced by a team-driven approach, and architects and engineers should become less discipline-oriented. We consider that BPS tools can play the role of the vehicle that gets the whole design team on board. However, the realm of BPS tool development requires interdisciplinary research based on design teams, the design process and design integration.

Lastly, the classification of five tools assessment criteria was composed to be measured against the theories and practices of building performance simulation. The inherent limits of a synopsis of five BPS tools assessment and evaluation criteria are apparent: several pages are not enough to convey the nuances of the various criteria. But let us ask a question: do the five criteria respond to common themes discussed in practice? In fact, each of these criteria has its critiques. Nonetheless, the five criteria presented in this paper form a basis for tools assessment; as with many classifications, they overlap and are not mutually exclusive.

With the sprawl and diversification of BPS tools, in addition to the growing number of users, it will not be surprising to find entities (public bodies, private magazines, software companies, etc.) concerned solely with facilitating the selection of tools and comparing them against different users' needs using the five assessment criteria. Although new tools must take the five criteria into account, tools satisfying these criteria are not guaranteed to succeed. Tools developers have to look to the future and think beyond the five assessment criteria. To be truly effective, tools have to be based on research and adapted to the experience and background of the different design team members at different design stages. New tools have to be developed in close co-operation with the designers they are intended for.

The five assessment criteria presented in this paper are not new. However, it is clear from the survey results that existing tools have still not met these criteria to the satisfaction of BPS users. There is therefore still considerable scope for improvement in making BPS tools accessible to users.

The Gap

Although tools, considered here as the 'hardware', may be improved based on the criteria discussed previously, those improvements will be wasted if we ignore the human factor, the user, as the 'software'. The survey revealed that some of the barriers to the use of simulation lie outside the realm of tool development. The common finding of the survey indicates a wide gap between architects and engineers as tools users: out of nine questions, architects and engineers agreed only twice. By analyzing the survey results, and in particular questions 4.8 and 4.20, we found that a gap between the two groups is evident.

There are many reasons for this gap, and they can be traced back very early. Since the industrial revolution, with its great developments in science and materials, a clear division between the two professions has become more obvious {Larsen, 2003 #193}. Both groups developed within a mono-disciplinary environment and delivered their services within a linear and fragmented building delivery process {Mahdavi, 1998 #104}. Architects were in charge of architectural issues, whereas engineers were concerned with technical issues. Consequently, computational building performance modeling and simulation formed as a discipline within the womb of engineering, reflecting this mono-disciplinary environment.

Today, we are facing a paradigm shift. With the growing trend towards environmental protection and sustainable development, the design of green buildings using BPS tools is gaining attention. Simulation tools have become a significant part of building design {Donn, 2001 #128}. Together with sustainability, BPS tools have become part of the larger trend toward integration in the AECO industry. We are on the verge of a major revolution, triggered by mandatory codes and standards, that will change the way buildings are designed and constructed. With the 2030 objective and the international Net Zero Energy Buildings objective (IEA 2009), there is a great effort to work together, in a focused way, as a design team that includes the building owner. Discipline-oriented design approaches can no longer achieve exceptional performance. In fact, the typical uni-disciplinary design process, in which the architect and engineers work on separate islands and with no performance goals, cannot achieve the new millennium objectives.

The solution is to change this approach and develop the design goals together, as a unified design team, from the initial stages of the design. Performance goals should be set from the beginning, and every team member has to be in the service of those goals. But where and how do we start to bridge this gap? The answer works at several levels.

In Education

There is a fundamental need for architectural and engineering education to evolve. Evolution has to take place in universities by introducing new classes and updating existing ones. In most architecture schools, students are taught within the design studio to start with a concept and then push that concept towards the design details (Schon 1985). During conceptualization, students jump from one concept to another, aiming to compare alternatives and optimize their designs (Lawson 1997). During this phase the major design characteristics are determined.

In most engineering schools, on the other hand, students are taught to follow systematic, methodological and progressive steps. Designing and sizing energy systems is based on building a simulation model step by step through the accretion of detail (Holm 1993). The procedure moves from the basic parts towards the whole, thus in the opposite direction of the architectural design approach. It is true that the language difference between engineers and architects is a barrier. Engineers are more technically oriented and require verified, accurate models that effectively represent real-world complexity. Engineers are typically educated to think of design as a systematic, methodological process for determining the appropriate energy systems.

Architects, on the other hand, often think of design as the heuristic conception and idealization of a vision for the appearance and function of a building. Architects are more interested in simple, visual, straightforward and intuitive tools. For example, an architect who is unaware of building thermal characteristics will find it difficult to specify the thermodynamic properties of a building, but can easily define the construction materials used; in doing so, some of the thermal characteristics are inherently specified {Marsh, 2004 #145}. The educational system is nourishing this gap. Moreover, not all architecture schools provide a good grounding in building physics, and even where it is provided, much of this knowledge is quickly lost in practice {Marsh, 2004 #145}. User surveys indicate that architects lack simulation know-how (Mahdavi, Silvana et al. 2003). It is necessary that architecture students receive sufficient knowledge of environmental building design to use the tools for quick evaluation of design concepts.

A review of the overall curricula of the 17 architectural engineering programs accredited by the Accreditation Board for Engineering and Technology (ABET) and the 119 architecture programs accredited by the National Architecture Accreditation Board (NAAB) will likely show the same mono-disciplinary approach and lack of interdisciplinary team work. A deeper review will show that the term building performance simulation is missing from most undergraduate course curricula, and in particular from architecture programs. In fact, integration cannot take place in a single course, or during a single project taught only to architects or only to engineers. It must be developed in a team environment over a lifetime of education {Geschwindner, 1995 #196}.

The next generation of architects and engineers has to be trained to work in teams. Architecture and engineering students have to come together as early as possible during their undergraduate education to work and design together in teams. In fact, there are many successful examples of extensive collaboration in education, such as the ecoMOD project at the University of Virginia and the Solar Decathlon entries across 17 universities in the USA {Quale, 2005 #194}{Charles, 2009 #195}. Architecture students should also be introduced to the scientific and technical foundations of BPS tools during their education, to learn how to integrate them into their own practice {Pedrini, 2005 #144}. Engineering students, on the other hand, will need to study the architectural approach to design and find ways to integrate that approach with a diverse set of engineering goals {Geschwindner, 1995 #196}.

In fact, architecture and engineering education takes place in an increasingly computationally rich and diverse world. BPS tools are a part of this environment and must be brought to the students in the classroom and in the studio. Successful high performance buildings are a logical outcome of an integrated process, requiring a connection between design and building performance. Therefore, BPS tools are already a potential medium or vehicle that can bridge this gap.

In Practice

In traditional practice, the building owner and architect create the building program, and the architect has to satisfy the requirements of the design brief. The building services engineers then design the mechanical and other systems. As a result, decisions on mechanical systems and controls are frequently decoupled from the design (Lam, Wong et al. 1999). Most BPS tools are therefore used during late design phases because they mainly cater to engineers: they help engineers design and refine the basic systems, not architects formulate the basic building ideas (Donn, 2001). As a consequence, most major decisions have already been taken by the time BPS is performed, making it impossible to go back (Holm 1993). Holm (1993) points out that "By that time the building owner and/or the architect may have fallen in love with the design or even be married to it, in which case the cost of divorce would exceed the cost of hanging on".

Moreover, in traditional practice most architecture bureaus pass the simulation work to engineers and shift the energy issues away. This may be appropriate in situations where the design team comprises a diverse group of specialists working together right from the beginning. Passing the simulation work to engineers might be due to time limitations, since preparing and interpreting data for thermal building simulation tools can take several days {Ellis, 2002 #138}, or to the great expertise required for understanding, analyzing and interpreting the physical processes involved {Marsh, 2004 #145}. The lack of detailed knowledge of building performance might be the fundamental reason. What is evident is that the confidence of most architects in practice quickly falls when it comes to thermal building performance analysis, incident solar radiation and regulatory compliance {Marsh, 2004 #145}{Reinhart, 2006 #48}{Augenbroe, 1992 #99}{Donn, 2001 #128}.

Another inherent problem in traditional practice relates to small projects with limited budgets. During the design of many medium- and small-scale projects, architects are forced to base their design on intuition. They are obliged to generate a reasonably cohesive design solution without using BPS tools or obtaining general directional advice from an engineering consultant, in order to avoid paying significantly high consultancy fees. The prescriptive nature of many current codes of practice and design guidelines also facilitates this practice. At the same time, not many tools are developed to satisfy architects' needs during early design phases. The survey results confirm this problem and perhaps explain a possible reason behind the gap.

Although the future design trend will entail a multidisciplinary design team approach, this approach will be limited to large projects with sufficient budgets. A huge part of the newly constructed building stock will be small residential units that are, in principle, designed by architects only. Tools developers have to reach those architects who are not energy experts. Instead of focusing on engineers' needs only, developers have to cooperate with architects to create adaptive tools that address the needs of architects, who generally use different types of knowledge than existing tools require during early design. The fragmented building delivery process has resulted in little progress in the development of simulation tools that address architects during conceptual design. We have to foster architects' in-house simulation capability within design practices by developing tools that are centered on architects. As mentioned before, there is no doubt that architects themselves need a fundamental understanding of basic building physics. However, it is not enough to leave the energy issues to an engineering consultant who will never participate in such projects with a tight budget {Marsh, 2004 #145}.

Finally, we strongly believe that building regulation enforcement is a good entry point for solving these problems, and that using BPS tools can help bridge the gap between architects and engineers. The recent development and application of information technology in the building industry is completely changing building design philosophy and methodology. The experience of the National Renewable Energy Laboratory (NREL) with the High Performance Building Initiative (HPBi) highlighted the importance of collaboration between all designers as one team (Deru and Torcellini 2004). There is evidence in the literature that the starting point for the whole team should be the same {Hopfe, 2005 #51; Lam, 1999 #34; Mahdavi, 1998 #104}. Transformative concepts such as integrated delivery and the long-term involvement of architects and engineers in building operations all have a significant role to play in creating a more sustainable built environment.

Much more effort is needed to bring BPS tools into the architecture mainstream and to maximize their usage in the design process {Wong, 2000 #189}. BPS has many qualities, including an embedded feature that not many practitioners recognize: it highlights and reinforces the iterative nature of design. BPS can capture the complex interrelationships between building design and building performance, and it can bring the whole design team together. A better understanding of energy simulation tools, their advantages and their limitations, may give architects the confidence to use them {Yezioro, 2008 #107}. Professional experience with leading architecture firms urges bringing BPS tools into the undergraduate studio environment. BPS therefore offers a common ground, a platform to support the collaboration between architects and engineers in practice and education.


Conclusion

The AECO disciplines are moving towards convergence, and there is evidence that the building services disciplines are merging (Attia, Beltran et al. 2009). Triggered by mandatory codes and rating systems, the rapidly emerging confluence of the multidisciplinary integrated building design process and BPS will accelerate the use and development of BPS tools within architectural and engineering practice and education.

Today, architecture and engineering practice takes place in an increasingly rich BPS tools environment. The purpose of this article has been to identify assessment criteria for building performance tools. The overview presented in this paper aims to introduce criteria for selecting and evaluating building simulation tools and to provide information sources for building simulation developers. The five criteria presented in this paper, namely (1) usability and information management (UIM) of the interface, (2) integration of an intelligent design knowledge-base (IIKB), (3) integrated building design process (IBDP), (4) interoperability of building modelling (IBM) and (5) accuracy and ability to simulate complex and detailed building components (AADCC), continue to resonate and form the basis of much scholarly and professional activity. The survey results provide an overview of the criteria that developers need to address to improve the uptake of simulation in practice. Addressing these criteria will require interdisciplinary research that couples building simulation research and development with design process management {De Wilde, 2009 #120}. To improve the uptake of simulation practice within a rapidly growing BPS tools environment, we believe the BPS community has to set a uniform definition of tools assessment criteria and specifications. Developers, in turn, might create metrics to analyze the costs and benefits of using BPS tools. This will accelerate and improve BPS practice.

While developers can use the survey results to improve their tools and create an innovative bridge between architecture and engineering, both groups have to work together to bridge their interdisciplinary gap at several levels. First, architecture and engineering education should enforce transversal, team-oriented education and equip students with the skills necessary to use BPS tools and to judge their results. Architects and engineers in practice must broaden their skills into 'adjacent' domains or learn to work with other experts to successfully support the integrated design of high performance buildings. Clearly, mastering such skills requires domain knowledge for quality assurance. Improving the use of BPS tools will lead to improved building performance.

Finally, we believe that the next generation of BPS tools has to direct its development within the gestalt of the multidisciplinary design team and of the design process. A quantitative and qualitative understanding of building energy performance should be brought to architects and engineers, who will design buildings using BPS tools that are adaptive, accurate and able to predict, during all design stages, the energy consumption of high performance buildings. BPS will be at the heart of designing and building high performance buildings, informing the design process and evaluating the impact of design decisions. We believe that BPS offers a common ground, a platform to support the collaboration between architects and engineers.


Acknowledgements

The authors express their thanks to all respondents who participated in the survey and appreciate their valuable comments and feedback. This paper is part of an ongoing PhD research project funded by the Université Catholique de Louvain.


References

  1. AIA (2007). Integrated Project Delivery: A Guide.
  2. AIA. (2007). "Interoperability in the Construction Industry ", from
  3. ASHRAE (2007). Standard method of test for the evaluation of building energy analysis computer programs. ASHRAE.
  4. ASHRAE (2009). ASHRAE handbook. Fundamentals. Atlanta, Ga., American Society of Heating, Refrigerating, and Air-Conditioning Engineers: v.
  5. Attia, S., L. Beltran, et al. (2009). "Architect Friendly": A comparison of ten different building performance simulation tools. IBPSA, Glasgow, Scotland.
  6. Augenbroe, G. (1992). "Integrated Building Performance Evaluation in the Early Design Stages." Building and Environment 27(2): 149-161.
  7. Augenbroe, G. (2002). "Trends in building simulation." Building and Environment 37: 891 - 902.
  8. Autodesk (2008). ECOTECT v5.60.
  9. Autodesk (2008). Green Building Studio v3.3.
  10. Autodesk. (2009). "ECOTECT." Retrieved November, 2009, from
  11. Autodesk (2009). ECOTECT.
  12. Autodesk. (2009). "Green Building Studio." Retrieved November, 2009, from
  13. Balcomb, J. D. (1992). Passive solar buildings. Cambridge, Mass., MIT Press.
  14. Bambardekar, S. and U. Poerschke (2009). The architect as performer of energy simulation in the early design stage. BS2009, Glasgow.
  15. Bazjanac, V. (2003). Improving Building Energy Performance Simulation With Software Interoperability. IBPSA, Eindhoven.
  16. Bazjanac, V. (2004). "Building energy performance simulation as part of interoperable software environments." Building and Environment 39 879 - 883.
  17. Bazjanac, V. and A. Kiviniemi (2007). Reduction, simplification, translation and interpretation in the exchange of model data. Bringing ITC knowledge to work, University of Maribor.
  18. Clarke, J. (2009). Integrated Building Performance Simulation: Trends and Requirements. Glasgow, University of Strathclyde.
  19. Clarke, J., J. Hensen, et al. (1998). Integrated Building Simulation: State-of-the-Art. Indoor climate of Buildings, Bratislava, ASHRAE Transactions.
  20. Clarke, J. A. (1985). Energy simulation in building design. Bristol [Avon] ; Boston, A. Hilger.
  21. Crawley, D. (1997). BUILDING ENERGY TOOLS DIRECTORY. IBPSA, Prague.
  22. Crawley, D., J. Hand, et al. (2005). Contrasting the capabilities of building energy performance simulation programs. Washington DC.
  25. Crawley, D. B., J. W. Hand, et al. (2008). "Contrasting the capabilities of building energy performance simulation programs." Building and Environment 43(4): 661-673.
  26. De Wilde, P. and D. Prickett (2009). PRECONDITIONS FOR THE USE OF SIMULATION IN M&E ENGINEERING. BS2009, Glasgow.
  27. Deru, M. and P. Torcellini (2004). Improving Sustainability of Buildings through a performance-based design approach. World Renewable Energy Congress. Denver, Colorado.
  28. DesignBuilder (2008). Design Builder v.
  29. DesignBuilder. (2009). "DesignBuilder " Retrieved November, 2009, from
  30. DOE. (2009). "EnergyPlus." Retrieved November, 2009, from
  31. DOE, U. S. (2009). "Building Energy Software Tools Directory." Retrieved 15 November, 2009, from
  32. Donn, M. (2001). "Tools for quality control in simulation." Building and Environment 36: 673-680.
  33. Donn, M., S. Selkowitz, et al. (2009). Simulation in the service of design - asking the right questions. IBPSA, Glasgow, Scotland: 1314-1321.
  35. Eisenberg, D., R. Done, et al. (2002). Breaking down the barriers: Challenges and solutions to code approval of green buildings. Tucson, AZ.
  36. Ellis, M. and E. Mathews (2002). "Needs and trends in building and HVAC system design tools." Building and Environment 37: 461-470.
  37. Ellis, P., P. Torcellini, et al. (2008). ENERGY DESIGN PLUGIN: AN ENERGYPLUS PLUGIN FOR SKETCHUP. SimBuild 2008, Berkeley, California.
  38. Gale (2001c). General Contractors—Non-residential Buildings, Other Than Industrial Buildings and Warehouses. Encyclopedia of American Industries, 3rd ed., Farmington Hills.
  39. Google. (2009). "SketchUp Pro7." Retrieved November, 2009, from
  40. Hand, J. and D. Crawley (1997). Forget the tool when training new simulation users. IBPSA, Prague, Czech Republic.
  41. Hayter, S., P. Torcellini, et al. (2001). The Energy Design Process for designing and constructing high-performance buildings. Clima 2000. Napoli.
  42. Hefley, W. and D. Murray (1993 ). Intelligent user interfaces. International Conference on Intelligent User Interfaces, Orlando, Florida, United States.
  43. Hensen, J. (2002). Integrated Building (and) airflow simulation: An overview. Computing in Civil and Building Engineering, Taipei, Taiwan.
  44. Hensen, J. (2004 ). Towards more effective use of building performance simulation in design. 7th International Conference on Design & Decision Support Systems in Architecture and Urban Planning, Eindhoven: TU/e.
  45. Hensen, J., R. Lamberts, et al. (2002). "A view of energy and building performance simulation at the start of the third millennium." Energy and Buildings 34(9): 853-855
  46. Hensen, J. and M. Radoševic (2004). Some quality assurance issues and experiences in teaching building performance simulation. IBPSA 2004, Eindhoven.
  47. Hien, W., L. Poh, et al. (2003). "Computer-Based Performance Simulation for Building Design and Evaluation: The Singapore Perspective." Simulation & Gaming 34(3): 457-477.
  48. Holm, D. (1993). "Building Thermal Analyses: What the Industry Needs: The Architect's Perspective." Building and Environment 28(4): 405-407.
  49. Hopfe, C., C. Struck, et al. (2005). Exploration of using building performance simulation tools for conceptual building design. IBPSA-NVL Conference, Delft, The Netherlands, TU-Delft.
  50. Ibarra, D. and C. Reinhart (2009). Daylight factor simulations - how close do simulation beginners 'really' get? BS2009, Glasgow.
  51. IEA (2009). IEA Net Zero Energy. Montreal.
  52. Judkoff, R. and J. Neymark (1995). International Energy Agency Building Energy Simulation Test (BESTEST) and Diagnostic method. Golden, Colorado, NREL.
  53. Kusuda, T. (1999). Early history and future prospects of buildings system simulation. IBPSA, Kyoto Japan.
  54. Lam, K., N. Wong, et al. (1999). A study of the use of performance-based simulation tools for building design and evaluation in Singapore. IBPSA, Kyoto, Japan.
  55. Lawson, B. (1997). How designers think : the design process demystified. Oxford, U.K. ; Boston, Mass., Architectural Press.
  56. LBNL and J. Hirsch (2008). DOE v2.2 Building energy use and cost analysis program. Berkeley, CA, LBNL.
  57. LBNL and J. Hirsch. (2009). "DOE-2." Retrieved November, 2009, from
  58. LBNL and J. Hirsch. (2009). "eQUEST." Retrieved November, 2009, from
  59. LBNL and J. Hirsch (2009). eQUEST v3.61b. Berkeley, LBNL.
  60. Mahdavi, A. (1998). "Computational decision support and the building delivery process: A necessary dialogue." Automation in Construction 7: 205-211.
  61. Mahdavi, A., F. Silvana, et al. (2003). An inquiry into building performance simulation tools usage by architects in Austria. IBPSA, Eindhoven.
  62. Maybury, M. T. and W. Wahlster (1998). Readings in intelligent user interfaces. San Francisco, Calif., Morgan Kaufmann Publishers.
  63. Mendler, S., W. Odell, et al. (2006). The HOK guidebook to sustainable design. Hoboken, N.J., John Wiley & Sons.
  64. Morbitzer, C. (2003). Towards the Integration of Simulation into the Building Design Process. Department of Mechanical Engineering. Glasgow, University of Strathclyde. PhD.
  65. Morbitzer, C., P. Strachan, et al. (2001). Integration of building simulation into the design process of an architecture practice. IBPSA. Rio de Janeiro, Brazil.
  66. NREL (2005). ENERGY-10 v1.8.
  67. NREL. (2009). "Energy-10." Retrieved November, 2009, from
  68. NREL (2009). OpenStudio 1.0.2.
  69. Papamichael, K., J. LaPorta, et al. (1996). The Building Design Advisor. ACADIA, Arizona.
  70. Papamichael, K. and J. Protzen (1993). The limits of Intelligence in Design. 4th International Symposium on System Research, Informatics and Cybernetics, Baden, Germany.
  71. Pedrini, A. and S. Szokolay (2005). The architects approach to the project of energy efficient office building in warm climate and the importance of design methods. IBPSA, Montréal, Canada.
  72. Pilgrim, M., N. Bouchlaghem, et al. (2003). "Towards the efficient use of simulation in building performance analysis: a user survey." Building Services Engineers 24(3): 149-162.
  73. Punjabi, S. and V. Miranda (2005). Development of an integrated building design information interface. IBPSA, Montreal, Canada.
  74. Robinson, D. (1996). "Energy model usage in building design: A qualitative assessment." Building Services Engineering Research and Technology 17(2): 89-95.
  75. Schlueter, A. and F. Thesseling (2009). "Building information model based energy/exergy performance assessment in early design stages." Automation in Construction 18(2): 153-163.
  76. Schon, D. (1985). The Design Studio. London, RIBA.
  77. Tian, Z., J. Love, et al. (2009). "Applying quality control in building energy modelling: comparative simulation of a high performance building " Journal of Building Performance Simulation 2(3): 163 - 178.
  78. Tianzhen, H., Z. Jinqian, et al. (1997). "IISABRE: An Integrated Building Simulation Environment " Building and Environment 32(3): 219-224.
  79. Torcellini, P., S. Hayter, et al. (1999). "Low Energy Building Design - The Process and a Case Study." ASHRAE 105(2): 802-810.
  80. UCLA (2008). HEED v.3.0.
  81. UCLA. (2009). "HEED." Retrieved November, 2009, from
  82. US-DOE. (2009). "Building Energy Software Tools Directory." Retrieved 01 October, 2009, from
  83. Warren, P. (2002). Bringing simulation to application, IEA ECBCS Annex 30. Birmingham.
  84. Yezioro, A. (2008). "A knowledge based CAAD system for passive solar architecture." Renewable Energy 34: 769-779.