
Determination Of Cost For Local Loops Management Essay

This research project is focussed on establishing how best to determine the cost of local loops in the telecommunications industry. As consumer and business demands continue to increase, it has become progressively more apparent that a cost-effective method of costing the last mile, or local loop, must be established. Every incorrect cost determination erodes the profit margin, and when this scenario is multiplied across innumerable operations the cost of error becomes considerable. This project therefore addresses its overall aim, the determination of cost for local loops, through the creation of a generic framework engaging heuristic techniques. The project will address the proposed subject through the following research objectives:

1. A comparison of originally quoted costs for local loops against the final costs of local loops, to establish where deviation occurs and at which points within the quote breakdown the deviation is greatest. The objective is to enable an understanding of the strengths and weaknesses of the currently employed costing methods.

2. Use the information generated under objective 1 to establish the relative accuracy of each method (using heuristic principles) and determine the most suitable combination of methods relative to a cost / risk / benefit analysis.

3. Establish a fully built-up cost of the most accurate method and specifically ascertain the fiscal risk associated with that method.

4. Determine through robust cost / benefit analysis the relative benefits of creating and maintaining an automated method against the current conventional manual method of processing enquiries, and use this information to establish the point at which it becomes more economically feasible to pursue an automated rather than a manual process (a simple break-even sketch follows this list).

5. Use the information generated from objectives 1-4 above to create a generic model of local loop costing during process enquiries which can be used in a multi-national context.
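As flagged under objective 4, the following minimal sketch shows how the automated-versus-manual break-even point could be computed. It is offered purely as an illustration: the function name and all figures are hypothetical assumptions, not organisational data.

```python
# Illustrative break-even sketch for objective 4; all figures are
# hypothetical assumptions, not organisational data.

def break_even_volume(build_cost, maintain_per_year, cost_per_auto_quote,
                      cost_per_manual_quote, years):
    """Return the annual quote volume above which automation is cheaper.

    Automated total cost: build_cost + years * (maintain_per_year
                          + volume * cost_per_auto_quote)
    Manual total cost:    years * volume * cost_per_manual_quote
    Setting the two equal and solving for volume gives the break-even point.
    """
    saving_per_quote = cost_per_manual_quote - cost_per_auto_quote
    fixed_cost = build_cost + years * maintain_per_year
    return fixed_cost / (years * saving_per_quote)

# Hypothetical inputs: £120k build, £20k/yr upkeep, £2 vs £35 per quote, 5 years.
print(break_even_volume(120_000, 20_000, 2.0, 35.0, 5))  # ≈ 1333 quotes/yr
```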

It is believed that by logically addressing each aspect of the issue under review it will be possible to use this information to create a generic framework model which can engender significant long-term cost and efficiency savings in the telecommunications industry, benefiting both practitioners and consumers. The following chapter is therefore devoted to a review of the relevant literature, which will be used to create a wider contextual understanding.


This chapter is devoted to a review of the most pertinent literature in regard to procurement and costing models. The literature is drawn from a combination of academic and practitioner journals so as to provide a broad contextual overview of the current situation, and also to critique, where appropriate, the existing models and theories. The literature is divided into five main topic areas in alignment with the overall aims and objectives of this research, viz: procurement for re-selling; creation of cost models; evaluation and management of risk; generic models of pricing which can serve as proxies for the foundations of this research; and best practice in regard to procurement and advance costing in uncertainty. Given that in some areas there is little in the way of formalised academic study, some proxies have been adopted to serve as comparators for the proposed framework costing model.

Procurement for Re-Selling

The first area under review is the literature and theory which currently relates to procurement, and specifically procurement for re-selling. Considerable literature exists with regard to procurement as a discipline, and indeed the Chartered Institute of Purchasing and Supply (CIPS, 2011) has devoted considerable resource to researching various methods and models of best practice with regard to the procurement of goods and services in a wide variety of industries and niche disciplines. Procurement as a discipline has been defined as “the acquisition of appropriate goods and/or services at the best possible total cost of ownership to meet the needs of the purchaser in terms of quality and quantity, time, and location” (CIPS, 2011). From this definition it can be suggested that the procurement function is often aligned with project management techniques and disciplines, given that time and quality are two of the central principles of project management. This view is echoed by Raghunathan and Yeh (2000) and also Krasnokutskaya and Seim (2006) in their working papers, which consider the issue of procurement within a project management context. The significance of their findings for this review concerns the application of procurement in a B2B context, given that in this project the main consideration is the procurement of local services in a closed-loop environment. Both have taken the context of discrete project management as a proxy for a closed-loop environment, considering the implications of re-selling back into an operation within a locality, and highlight that in the main this is a positive experience both for the organisation procuring the services and for the customers and end-users. Not only does it retain profit within a local environment, which is to be encouraged, but the management and distribution costs are generally considerably lower, which of course enhances the overall profit margins.

A similar yet slightly contrasting view is taken by Mukhopadhyay and Kekre (2002:1301-1313), who broadly agree with the principles of closed-loop or local-loop procurement but raise some concerns about the simplistic assumption that management costs are lower due to outsourcing. They indicate that the greater number of suppliers engendered by local-loop operations in fact accrues hidden costs which are seldom built into the true cost calculations. This point is also echoed by Levin and Ozdenoren (2004:229-251) and Li (2005:173-200). Although they approach the issue of procurement for re-selling from a slightly different angle (specifically lowest-cost auction or tender for services), they caution against assuming that lowest cost is necessarily best value or quality. Their evidence and research suggest that lowest cost is quite often a false economy in the longer term and that this should be given consideration throughout the tender and procurement process. Finally, it is interesting to consider the forecasts of Carter et al (2000:14-26) in their ten-year forecast of the future of purchasing and supply. Whilst they were unable to envisage quite how reliant many organisations would become on lowest-cost options in the immediate future, they were correct in highlighting the increasing significance and importance of procurement as an organisational strategy. The lesson from this is that in the longer term procurement decisions should be incorporated within the organisational strategy.

Creation of Cost Models

The next area of consideration concerns the creation of cost models, and specifically some of the challenges facing organisations or project groups seeking to create an accurate yet flexible model which will suit the various competing needs of the business. According to Jonker et al (2004) in their working paper, a cost model refers to the practice of calculating the “whole life cost” or “lifecycle cost” of an asset or project. They observe that this is not a simplistic or straightforward exercise: for the model to be as accurate as possible at the time of use it must incorporate a wide variety of significant variables and include discount factors to allow for estimation of economic usefulness in the immediate and longer term. Their main concern is that different stakeholders place different emphasis on the relative value of the data incorporated in the model, and therefore the application of subjective judgements such as risk factors can have a significant influence on the overall outcome.
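As an illustration of the “whole life cost” calculation described by Jonker et al (2004), the short sketch below discounts a stream of annual running costs to present value; the asset figures and discount rate are hypothetical assumptions, not values drawn from the literature.

```python
# Minimal whole-life ("lifecycle") cost sketch: discount each year's cost
# back to present value. The cash flows and rate below are illustrative only.

def whole_life_cost(acquisition, annual_costs, discount_rate):
    """Present value of acquisition plus a stream of annual running costs."""
    pv_running = sum(cost / (1 + discount_rate) ** year
                     for year, cost in enumerate(annual_costs, start=1))
    return acquisition + pv_running

# Hypothetical asset: £50k purchase, £8k/yr running costs over 5 years, 7% rate.
print(round(whole_life_cost(50_000, [8_000] * 5, 0.07)))  # ≈ 82802
```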

Mankiw and Reis (2002:1295-1328) had in fact reached a similar conclusion, for different reasons, some years earlier in their work “Sticky Information Versus Sticky Prices: A Proposal to Replace the New Keynesian Phillips Curve”. They established that whilst this established economic model had great validity in setting base points in cost models, as the external parameters and assumptions on which the Phillips Curve is based had shifted, it would be prudent to re-shape its application. They refer to the phenomenon of “sticky priors”, a tendency to draw on previous knowledge and personal assumption; human influence in the cost model based on prior experience thus exposes the model to the risk of subjective bias. However, both Konieczny and Skrzypacz (2005:621-632) and Vermeulen et al (2006) subsequently indicated in their working papers that in a natural cost environment such as the conditions of this project, the risks associated with subjectivity in cost model design were not as great as Mankiw and Reis (2002) had feared. In fact they established that long-standing principles of economics and costing still held true and that over time the likelihood of distortion was largely eliminated. Vermeulen et al (2006) nevertheless sounded a note of caution, indicating that trying to replicate models in different local economies was likely to result in some distortion due to local factors; to mitigate this, a greater number of iterations of the model may be required to reduce the error percentage, a view echoed by Dhyne et al (2006:171-192).

Evaluation and Management of Risk

The discussion of risk whilst building and applying cost models leads neatly to the next area for review, viz, the evaluation and management of risk. This is a complex subject area which can be applied to many industries and disciplines. As observed by Ekelhart et al (2007:156-162), one of the central challenges is establishing precisely what a risk might be. Whilst the dictionary definition of “risk” is “the possibility of incurring misfortune or loss” (Collins English Dictionary, 2007:1336), what is of far greater value is determining precisely how risk averse or otherwise an organisation or individual might be. For example, when someone tries something new for the first time there is a high risk of failure, yet when someone has completed the same task many times the risk of failure is small. Precisely the same analogy can be set against commercial operations: if an activity is a core function and there is great organisational and operational experience, then the risk of failure is low and the tolerance will be high. However, with regard to new projects and specifically new technologies, such as the creation of a new cost model using untested software, the aversion to risk is likely to be extremely high indeed, and thus great care will be required in the creation of the model and the determination of the relative parameters and their influence, as discussed previously.

Risk management in software and modelling operations is subtly different from generic risk management as associated with project management or health and safety operations. Risk in the context of software and modelling refers to the ability to compare and contrast a considerable number of variables from a wide variety of sources (Baker et al, 2007:101-106). Comparators for the purposes of this exercise would include insurance risk models or other financial instruments (Stoneburner et al, 2002). Such instruments would make an ideal starting point for the creation of a bespoke model for the purposes of closed-loop communication costing. According to both Neubauer and Stummer (2007) and Peng et al (2009:749-767), the main issues for consideration would be accessibility, which in turn engenders its own risk factors (Computer Economics, 2005), as well as flexibility, responsiveness, support capability, pricing and, perhaps most significantly for this model, modular applications. Once the model has been built and populated it can be applied according to the propensity to risk and the level of experience, hence sufficient adaptability will be required as organisational experience increases. Possible models to consider in the first instance are the Binomial Options Pricing Model and the Black-Scholes Model, which are relatively straightforward and can be used as a generic foundation (Hausman and Wise, 1981; Amemiya, 1985).
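By way of illustration, a minimal implementation of the closed-form Black-Scholes call price, one of the two generic foundations named above, is sketched below. The parameters are hypothetical, and the sketch is not offered as the model the project would necessarily adopt, only as an indication of how compact such a starting point can be.

```python
# Closed-form Black-Scholes price for a European call option, offered only
# as an illustration of a generic pricing foundation; inputs are assumptions.
from math import erf, exp, log, sqrt

def norm_cdf(x):
    """Standard normal cumulative distribution via the error function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def black_scholes_call(spot, strike, rate, volatility, maturity):
    """Price of a European call under Black-Scholes assumptions."""
    d1 = ((log(spot / strike) + (rate + 0.5 * volatility ** 2) * maturity)
          / (volatility * sqrt(maturity)))
    d2 = d1 - volatility * sqrt(maturity)
    return spot * norm_cdf(d1) - strike * exp(-rate * maturity) * norm_cdf(d2)

# Hypothetical inputs: spot 100, strike 105, 5% rate, 20% volatility, 1 year.
print(round(black_scholes_call(100, 105, 0.05, 0.20, 1.0), 2))  # ≈ 8.02
```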

Generic Models of Pricing

This section of the literature review considers literature which examines generic pricing models, as opposed to the specifically financial instruments of the previous section. As discussed by Elmaghraby et al (2008:126-148), there is a vast range of tools and models available to organisations wishing to discern the optimum pricing point for their goods and services. These range from cost-plus (Xu and Hopp, 2006:1098-1109) and competition pricing (Su, 2007:726-741) through to premium pricing (Granot et al, 2006), with variations such as predatory pricing (Fershtman and Pakes, 2000:207-236) and contribution-based pricing (Aviv and Pazgal, 2008:339-359). All of these models and their variations have benefits and applications, and equally they have disadvantages and areas for improvement. As has been a running theme throughout this review, much depends on the specifics of the overall objectives and strategies of the organisation and on its propensity to risk. A pricing model is not something that can be considered in isolation from the overall aims and objectives of the business, for the simple reason that the cost structure will drive the actions and behaviours of the organisation and also the underlying metrics and measurements. As a simple illustration, if the overall driver of the business is the lowest-cost option (as discussed under procurement above), this will doubtless result in up-front cost savings which will be reflected in the profit margin of the service provided. However, as observed by Levina et al (2008), such a simplistic approach to pricing models will serve as a constraint on the flexibility of the model and its responsiveness to changes in consumer and service demands. Moreover, extrapolating the work of Stoneburner et al (2002), it could be suggested that slavishly following lowest-cost models will eventually result in far greater total cost of ownership over the lifecycle, as considerably more costly configurations and customisations become necessary. This is particularly the case in the context of the project under review, where it is known that localised variations of the model will be required.
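A simple worked comparison may clarify the total-cost-of-ownership argument. The figures below are entirely hypothetical, but they show how a lower up-front price can be outweighed over the lifecycle once configuration and customisation costs are counted.

```python
# Illustrative total-cost-of-ownership comparison: the "cheap" option wins
# up front but loses over the lifecycle. All figures are hypothetical.

def total_cost_of_ownership(upfront, annual_support, customisation_per_year,
                            years):
    return upfront + years * (annual_support + customisation_per_year)

lowest_cost = total_cost_of_ownership(40_000, 6_000, 9_000, 5)   # heavy rework
value_option = total_cost_of_ownership(70_000, 4_000, 1_000, 5)  # fits locally
print(lowest_cost, value_option)  # 115000 vs 95000 over five years
```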

Best Practice for Advance Costing in Uncertainty

The final section to consider is current best practice for advance costing in uncertainty, which forms the crux of the project under review. For this section of the review the literature has been selected for its relevance to current models and applications which have been proven to work, and not necessarily theoretical models. This is quite deliberate: although theory in this context is valuable for pushing the boundaries of knowledge, there is usually a strong reluctance among organisations to be used as a “beta test” unless they have specifically chosen to enter a collaborative partnership with a software development company (Singh et al, 2007:249-268).

Broadly speaking, the models for resource distribution and allocation of cost are based on the premise of efficiency rather than the lowest-cost option. As such, these second-generation cost models recognise some of the realities of operation, such as ramp-up and ramp-down, which are frequently lacking from pure fiscal models (Deelman et al, 2006). Similarly, Cirne et al (2002:1571-1601) highlight the requirement for flexibility when costing in uncertainty and with imperfect information when seeking to enhance performance and efficiency, a concept they refer to as “mouldability”. This echoes the findings of the review in relation to risk aversion as discussed previously, and is also noted by authors such as Buyya et al (2005:698-714) and Huang et al (2007). The over-riding consensus from this section is that a fully holistic picture should be created before the generation of a model, to ensure that as many variables as possible are included. The main criticism of fiscal models in the application of “messy problems” is their failure to take account of real-world practicalities, and thus best practice advocates the use of experience tempered by analysis to guide such models and prevent costly errors in the longer term (Sulistio et al, 2007:396-405).


Thus in summary of this literature review the following closing comments are observed. Firstly, whilst there is a vast array of literature on the subject of procurement and costing, there are many niche elements within the topic which should be noted and considered. This review has addressed issues pertaining to procurement, cost models, pricing strategies, risk evaluation as it relates to software and current best practice with regard to costing in uncertainty. Organisation X will be exposed to all of these issues during the course of this project and this review has been invaluable in generating a holistic understanding of the impact of all of these elements on one another.

In overview it can be seen that there is precedent for the use of existing “tried and tested” models; however, the experience of the studies and literature is that this should be tempered with experience and knowledge which directly relates to the issue under review. It is unwise to assume that bespoke models, or even generic ones for that matter, will translate directly into discrete and applicable models. Another strong theme which emerged from the review was that lowest-cost options were best suited to the short term only, and that it was better to search for value in pricing; the long-term ramifications of lowest cost frequently included costly configuration at a subsequent point. Thus, bearing this contextual knowledge in mind, the following chapter will discuss how best to inspect the issues currently facing Organisation X, drawing on the best practice examples and body of knowledge examined above.


This chapter is given over to an analysis of the most suitable research methods for a project of this nature. It is critically important to consider the relative merits and disadvantages of any research approach when embarking on academic research, as various methods and research philosophies will return alternative perspectives (Horn, 2009:79-82). It is therefore necessary to consider the research methodology at the outset of any research in conjunction with the aims and objectives of the research project to ensure that the research methodology is aligned with the overall approach. This chapter will therefore set out the determination of the proposed methodology and subsequently briefly analyse the strengths and limitations of the selected approach.

Saunders et al (2009:110-120) have established that conducting research of this nature is akin to peeling back the layers of an onion as the research methodology is built up and designed. The schematic below in figure 1 serves as a useful illustration as to how this thought process can be applied during research analysis.

Figure 1: The Research Process Onion (source: Saunders et al, 2009:123)

As can be seen from the schematic, Saunders et al (2009) advise that the first issues to be considered should relate to the research philosophy and the research approach. Given that this project is focussed on the resolution of an existing dilemma of an empirical nature, a positivist philosophy is most suited to the research. Creswell (2003:119-120) indicates that as the school of positivism holds that rigorous scientific analysis is the most suitable method of establishing why events occur, it is the most apt method for this project. Similarly, as the nature of this research is focussed on establishing a robust model for cost determination, a deductive approach to logical interrogation is most suited. This approach is aligned with the scientific and mathematical nature of the study, as opposed to research into social principles or activities.

Given the contextual setting of the research it is considered that a hybrid combination of quantitative and qualitative methods is best suited to the problem under review (Saunders et al, 2009). Thus, quantitative and specifically heuristic methods will be engaged to consider the scientific and mathematical nature of the cost / risk / benefit analysis. This will take the form of a series of analyses of existing databases and information as set out below:

Analysis of the quote database, which contains detailed information about all quotes that have been processed, including the quoted cost, the method by which the quote was generated, and the actual cost when the circuit was ordered. The intention is to create a comprehensive quantitative picture of the results produced by each combination of methods (a sketch of this analysis follows this list).

Analysis of the time reporting database to determine the effort involved in the creation and maintenance of heuristics and automated methods.
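As a sketch of how the quote database analysis might proceed, the following fragment computes the relative deviation between quoted and final costs per costing method. The record layout and field names are hypothetical stand-ins, not the organisation's actual schema, and the rows shown are invented placeholders.

```python
# Sketch of the quoted-versus-final cost deviation analysis (objective 1).
# Field names and rows are hypothetical stand-ins for the quote database.
from collections import defaultdict
from statistics import mean

quotes = [  # stand-in rows for the quote database
    {"method": "manual",    "quoted_cost": 1200.0, "final_cost": 1450.0},
    {"method": "manual",    "quoted_cost": 900.0,  "final_cost": 880.0},
    {"method": "automated", "quoted_cost": 1100.0, "final_cost": 1130.0},
]

deviations = defaultdict(list)
for q in quotes:
    # Relative deviation of final cost from quoted cost, grouped by method.
    deviations[q["method"]].append(
        (q["final_cost"] - q["quoted_cost"]) / q["quoted_cost"])

for method, devs in deviations.items():
    print(method, f"mean deviation {mean(devs):+.1%}")
```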

The second stage of the research process will engage qualitative methods, which will be used to triangulate the findings of the quantitative research and establish the level of subjectivity and/or bias which may be present in the risk analysis of such problems. This qualitative research will take the form of semi-structured interviews with key members of the organisation, and the insights gained from these interviews will be used as the basis for determining various local interpretations of the proposed generic framework (Willis et al, 2007). For example, it is known that for the model to function on an international basis there must be various adjustments in respect of localised restrictions and external determinants. It will not be possible to build these into the model from the outset, as there would be too great a number of variables; however, by acknowledging their existence at the outset through the data gathered from the interviews, it will be possible to make due provision for their subsequent addition.

As with any research, it is also necessary to consider whether there are any limitations to the proposed research method, so as to take all steps necessary to control or mitigate their effects. The main limitations of heuristic principles concern the limits of the assumptions on which the algorithms are based. Whilst there is little argument that heuristic methods are an excellent means of reducing the cycle time in a problem-solving scenario, there is a strong risk that the algorithms will be based on current understanding and will not necessarily reflect future states or situations. It is therefore necessary to conduct a series of statistical analyses to establish that the parameters of the assumptions are acceptable (Kahneman and Frederick, 2002:49-81). In essence this problem is contained by a series of parameters, more colloquially termed a “knapsack problem” (Kellerer et al, 2004), and it is of critical importance to establish a series of simulations which also recognise possible future states in addition to the current scenario. Moreover, from a pragmatic perspective, given that the burdens on telecommunications infrastructure are likely to increase, an exponential “worst-case” scenario must be included within these calculations.
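To illustrate the knapsack framing and the heuristic risk just described, the sketch below contrasts an exact 0/1 knapsack dynamic programme with a simple greedy heuristic; the data are hypothetical, and the gap between the two answers is precisely the kind of error the proposed statistical analyses would need to bound.

```python
# Minimal 0/1 knapsack illustration: an exact dynamic programme beside a
# greedy value-density heuristic. Data are hypothetical textbook values.

def knapsack_exact(values, weights, capacity):
    """Classic dynamic programme: best value achievable within capacity."""
    best = [0] * (capacity + 1)
    for value, weight in zip(values, weights):
        for cap in range(capacity, weight - 1, -1):
            best[cap] = max(best[cap], best[cap - weight] + value)
    return best[capacity]

def knapsack_greedy(values, weights, capacity):
    """Heuristic: take items by value density until capacity is exhausted."""
    total = 0
    for value, weight in sorted(zip(values, weights),
                                key=lambda vw: vw[0] / vw[1], reverse=True):
        if weight <= capacity:
            capacity -= weight
            total += value
    return total

values, weights = [60, 100, 120], [10, 20, 30]
print(knapsack_exact(values, weights, 50),   # 220 (optimal)
      knapsack_greedy(values, weights, 50))  # 160 (heuristic shortfall)
```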


References

Amemiya, T. (1985) Advanced Econometrics. Harvard University Press, Cambridge, MA.

Baker, W., Rees, L. and Tippett, P. (2007) "Necessary measures: metric-driven information security risk assessment and decision making", Communications of the ACM, vol. 50, no. 10, pp. 101–106.

Buyya, R. et al (2005) "The grid economy", Proceedings of the IEEE, vol. 93, pp. 698–714.

Carter, J. R., Carter, P. L., Monczka, R. M., Slaight, T. H. and Swan, A. J. (2000) "The Future of Purchasing and Supply: A Ten-Year Forecast", The Journal of Supply Chain Management, Winter 2000, pp. 14–26.

Chartered Institute of Purchasing and Supply (2011) CIPS Resources. Retrieved 12th February 2011.

Cirne, W. et al (2002) "Using Moldability to Improve the Performance of Supercomputer Jobs", Journal of Parallel and Distributed Computing, vol. 62, pp. 1571–1601.

Collins English Dictionary (2007) Definition of "risk". HarperCollins, p. 1336.

Computer Economics, Inc. (2006) "2005 malware report: Executive summary", January 2006 [online]. Retrieved 12th February 2011.

Creswell, J. (2003). Research Design: Qualitative, Quantitative, and Mixed Methods Approaches. Thousand Oaks, California: Sage Publications

Deelman, E. et al (2006) "Pegasus: Mapping Large-Scale Workflows to Distributed Resources", in Workflows for e-Science. Springer.

Dhyne, E., Álvarez, L. J., Le Bihan, H., Veronese, G., Dias, D., Hoffmann, J., Jonker, N., Lünnemann, P., Rumler, F. and Vilmunen, J. (2006) "Price Setting in the Euro Area and the United States: Some Facts From Individual Consumer Price Data", Journal of Economic Perspectives, 20(2), pp. 171–192.

Ekelhart, A., Fenz, S., Klemen, M. and Weippl, E. (2007) "Security Ontologies: Improving Quantitative Risk Analysis", in Proceedings of the 40th Hawaii International Conference on System Sciences (HICSS'07). Los Alamitos, CA: IEEE Computer Society, pp. 156–162.

Elmaghraby, W., Gülcü, A. and Keskinocak, P. (2008) "Designing optimal pre-announced markdowns in the presence of rational customers with multi-unit demands", Manufacturing & Service Operations Management, 10(1), pp. 126–148.

Fershtman, C. and Pakes, A. (2000) "A dynamic oligopoly with collusion and price wars", RAND Journal of Economics, 31(2), pp. 207–236.

Granot, D., Granot, F. and Mantin, B. (2006) "Revenue management of perishable products under competition", Working Paper, University of British Columbia, Vancouver.

Hausman, J. and Wise, D. (1981) "Stratification on endogenous variables and estimation: The Gary income maintenance experiment", in Structural Analysis of Discrete Data with Econometric Applications. MIT Press, Cambridge, MA.

Horn, R. (2009) Researching and Writing Dissertations: A Complete Guide for Business and Management Students. Chartered Institute of Personnel and Development.

Huang et al (2007) "Automatic Resource Specification Generation for Resource Selection", in Proceedings of the Super Computing Conference, Reno.

Jonker, N., C. Folkertsma, and H. Blijenberg (2004): “An Empirical Analysis of Price Setting Behavior in the Netherlands in the Period 1998-2003 Using Micro Data,” Working Paper No. 413, European Central Bank.

Kahneman, D. and Frederick, S. (2002) "Representativeness Revisited: Attribute Substitution in Intuitive Judgment", in Gilovich, T., Griffin, D. and Kahneman, D. (eds.) Heuristics and Biases: The Psychology of Intuitive Judgment. Cambridge: Cambridge University Press, pp. 49–81.

Kellerer, H., Pferschy, U. and Pisinger, D. (2004) Knapsack Problems. Springer.

Konieczny, J. D., and A. Skrzypacz (2005): “Inflation and Price Setting in a Natural Experiment,” Journal of Monetary Economics, 52(3), 621–632.

Krasnokutskaya, E. and Seim, K. (2006) "Bid Preference Programs and Participation in Highway Procurement Auctions", Working Paper, University of Pennsylvania.

Levin, D. and Ozdenoren, E. (2004) "Auctions with Uncertain Number of Bidders", Journal of Economic Theory, 118, pp. 229–251.

Levina, T., Levin, Y., McGill, J. and Nediak, M. (2008) "Dynamic pricing with online learning and strategic consumers: An application of the aggregating algorithm", Operations Research, forthcoming.

Li, T. (2005) "Econometrics of First-Price Auctions with Entry and Binding Reservation Prices", Journal of Econometrics, 126, pp. 173–200.

Mankiw, N. G., and R. Reis (2002): “Sticky Information Versus Sticky Prices: A Proposal to Replace the New Keynesian Phillips Curve,” Quarterly Journal of Economics, 117(4), 1295–1328.

Mukhopadhyay, T. and Kekre, S. (2002) "Strategic and Operational Benefits of Electronic Integration in B2B Procurement Processes", Management Science, 48(10), pp. 1301–1313.

Neubauer, T. and Stummer, C. (2007) "Interactive Decision Support for Multi-Objective COTS Selection", in Proceedings of the 40th Annual Hawaii International Conference on System Sciences.

Raghunathan, S. and Yeh, A. B. (2000) "Beyond EDI: Impact of continuous replenishment program", Working Paper, University of Texas at Dallas, Dallas, TX.

Saunders, M., Lewis, P., and Thornhill, A., (2009) Research Methods for Business Students (5th Edition), Financial Times Prentice Hall

Singh, G. et al (2007) "Optimizing workflow data footprint", Scientific Programming, vol. 15, pp. 249–268.

Stoneburner, G., Goguen, A. and Feringa, A. (2002) Risk Management Guide for Information Technology Systems, NIST Special Publication 800-30. National Institute of Standards and Technology, Gaithersburg, MD.

Su, X. (2007) "Intertemporal pricing with strategic customer behavior", Management Science, 53(5), pp. 726–741.

Sulistio, A. et al (2007) "Using Revenue Management to Determine Pricing of Reservations", in Proceedings of e-Science 2007, pp. 396–405.

Vermeulen, P., D. Dias, M. Dossche, E. Gautier, I. Hernando, R. Sabbatini, and H. Stahl (2006): “Price setting in the euro area: Some stylised facts from Individual Producer Price Data and Producer Surveys,” Working Paper.

Willis, J., Jost, M. and Nilakanta, R. (2007) Foundations of Qualitative Research. New York: M. E. Sharpe.

Xu, X. and Hopp, W. (2006) "A monopolistic and oligopolistic stochastic flow revenue management model", Operations Research, 54(6), pp. 1098–1109.

Peng, Y., Kou, G., Wang, G., Wang, H. and Ko, F. I. S. (2009) "Empirical Evaluation of Classifiers for Software Risk Management", International Journal of Information Technology & Decision Making, 8(4), pp. 749–767.
