Key Points for Decision Makers

The implementation of major reorganisations of clinical services can carry substantial cost, partly as direct financial expenditure and partly as staff time, spent both within working hours and as discretionary out-of-hours effort.

We present a framework for costing implementation of major system change, suggesting what information should be collected and how/when/from whom.

This framework can support different stakeholders, including service planners, researchers, and policymakers, in collecting and analysing implementation costs that are often considered too complex to measure or are excluded as sunk costs.

Cross-disciplinary collaborations involving health economists, qualitative researchers, clinicians, managers, and patients and the public are recommended for this work.

1 Introduction

Major system change (MSC) involves the reorganisation or reconfiguration of services at a regional level. Reconfiguration has been defined as “a deliberately induced change of some significance in the distribution of medical, surgical, diagnostic and ancillary specialties that are available in each hospital or other secondary or tertiary acute care unit in locality, region or health care administrative area” [1]. A key reason why healthcare systems undertake MSC is the view that the new service configuration will result in improved clinical outcomes at lower cost [1, 2]. In particular, service centralisations, leading to fewer services providing specialist care to more patients per service, have been found to improve outcomes through improved quality of care, increased service volume and better trained staff [3,4,5]. If centralised services deliver improved care at a similar or reduced cost, potentially through economies of scale [6, 7], then the centralised model should be more cost effective than the preceding model. Clear evidence for this via high-quality economic evaluations is however lacking, particularly regarding implementation cost, i.e. the initial outlays required to effect change [8].

Economic evaluations of MSC tend to fall outside health technology assessment (HTA) and within service evaluation, where evaluation of costs and consequences is often neglected [9]. Service evaluations preclude randomisation due to ethical considerations, as they are designed to assess the current standard of a service and not to compare technologies; hence, challenges exist in attributing causal links between service delivery changes and improvements in outcomes [10]. Data collection can be challenging, with most MSC studies relying on patient-level data collected routinely or via audit systems [9]. A recent systematic review of service centralisations found their economic evaluations were generally of poor quality, sometimes neglecting relevant outcomes and not using statistical methods to sufficiently account for bias given the lack of randomisation [2]. Furthermore, the implementation costs of a healthcare intervention or service delivery change of any type tend to be omitted from its economic evaluation [11]. In some cases, these costs can be assumed to be low, for example adding an uncontroversial drug to a procurement list. However, in other cases they may be substantial, for example due to training or engagement activities or acquiring new equipment. This is likely to be the case for MSC. Some argue implementation is a sunk cost as it occurs once, albeit over a period of time, and cannot be recovered. This, however, assumes that the change has already been rolled out to all possible regions, which is unlikely to be true, and ignores different ways of implementing the same MSC, which might differ in costs and effectiveness. For the organisation deciding whether to undertake MSC, these upfront costs can be substantial and must be met.

Data on implementation processes may be difficult to obtain, and require additional skills and resources not normally associated with quantitative economic evaluation. Some information may be captured as part of research into organisational processes, although perhaps inconsistently and without yielding specific information amenable to costing. If consideration of resources associated with implementation is given only later in the evaluation, some information may no longer exist if the organisation has dissolved or there is poor organisational memory [12].

When evaluating complex interventions, the Medical Research Council (MRC) guidance [13] has implementation built into the research cycle, with iterative steps and data collection planned throughout, and an analogous approach has been applied in the development of our framework. While performing the economic evaluation work for stroke [12, 14], we realised that contemporaneous data collection would have made a bottom-up implementation cost analysis possible instead of a top-down one. This work also showed a possible implementation pathway used in MSC and confirmed who the relevant stakeholders were, both in National Health Service (NHS) bodies and allied organisations, regarding time and money spent on designing, planning, and implementing MSC. Published work detailed in Sect. 2.1 was also used to further develop this framework [11, 15,16,17,18,19], and we incorporated feedback on draft versions of the framework from multiple stakeholders as part of the RESPECT-21 programme (see Sect. 3.1) [20], as well as at the UK Health Economists’ Study Group.

The aim of this paper was to present the partly contemporaneous evaluation of the cost of the design, planning and implementation of MSC in specialist cancer surgical services in London [20] as a case study, demonstrating the importance of considering implementation upfront when designing economic evaluations alongside MSC. This work illustrates costs to be accounted for in planning MSC and the impact they might have on the cost effectiveness of the overall change, which is important for planners and commissioners. The case study, taken from the RESPECT-21 programme [20], illustrates use of our framework, which is designed to help planners and researchers to elucidate the implementation pathway and to therefore consider early on what data to collect.

2 Principles of Costing the Implementation of Major System Change

2.1 Existing Frameworks for Incorporating Implementation Strategies into Economic Evaluations

Implementation science is “the scientific study of methods to promote the systematic uptake of evidence-based interventions into practice and policy and hence improve health” [8], and encompasses MSC, which aims to increase the proportion of patients receiving high-quality evidence-based care in a region. The National Institute for Health and Care Excellence (NICE) recommends that implementation costs be included in economic evaluations as sensitivity analysis [21], and comments that the cost of implementing a new intervention would be important for the organisation paying for implementation [22]. Systematic reviews found a limited number of economic evaluations of implementation strategies, which were generally of poor quality and mostly not related to MSC [16, 17]. Hoomans, Severens and others further highlighted the lack of evidence in this area and suggested establishing the net monetary benefit of the new intervention or system compared with existing or standard care, and comparing the expected benefit with the calculated implementation cost to inform the implementation decision [11, 23]. Frameworks for calculating implementation costs have considered calculating patient-level costs to inform value-of-information analysis [18], and discussed data collection and reporting in specific scenarios [15, 19]. We consider it timely to develop these ideas further and provide researchers and health service organisations with a detailed and tested framework to apply to implementation cost analysis in MSC.
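
To make the decision rule suggested by Hoomans, Severens and colleagues concrete, the sketch below (in Python, with purely illustrative numbers rather than values from any study cited here) compares the expected population-level net monetary benefit of a reconfigured service over its lifetime with a one-off implementation cost.

```python
# Illustrative sketch only: compare the expected population net monetary benefit (NMB)
# of a new service configuration with its one-off implementation cost. All inputs are
# assumptions for demonstration, not values from the studies cited in the text.

def incremental_nmb_per_patient(delta_qaly, delta_cost, wtp_threshold=20_000):
    """Incremental NMB per patient: threshold x QALY gain minus extra ongoing cost."""
    return wtp_threshold * delta_qaly - delta_cost

def population_nmb(nmb_per_patient, patients_per_year, years, discount_rate=0.035):
    """NMB accrued over the expected lifetime of the change, discounted to present value."""
    return sum(
        nmb_per_patient * patients_per_year / (1 + discount_rate) ** t
        for t in range(1, years + 1)
    )

nmb_pp = incremental_nmb_per_patient(delta_qaly=0.02, delta_cost=150)
total_nmb = population_nmb(nmb_pp, patients_per_year=3000, years=10)
implementation_cost = 5_000_000  # hypothetical one-off outlay for the reconfiguration
print(f"Expected population NMB: £{total_nmb:,.0f}")
print("Implement" if total_nmb > implementation_cost else "Do not implement")
```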

2.2 Audience and Perspective

Key to any evaluation is identifying the intended audience, addressing the important question of the analysis perspective. This determines which costs and benefits are included. As stated by Meacock [9], HTA agencies such as NICE are not traditionally the intended audience of service evaluations as they are not generally the topic of reimbursement decisions. This may have contributed to the lack of methodological guidance [9]. The main audiences, and therefore perspectives, tend to be (a) local providers responsible for providing some or all of the services involved in the MSC; (b) a local/regional payer and/or health authority responsible for the planning, performance management and total cost of providing services involved in an MSC from more than one provider and usually across a whole system; and (c) national healthcare policy makers requiring information on the expected costs and benefits of providing resources for MSC who may prefer a societal perspective possibly including wider non-healthcare costs. These different viewpoints are summarised in Fig. 1.

Fig. 1 Framework and principles of implementation cost analysis in MSC. MSC major system change

Evaluations may, from the point of view of a specific audience, be retrospective or prospective. In the retrospective case, the audience are interested in confirming delivery of the expected clinical benefits of the changes and may prefer to focus on efficiency savings rather than sunk implementation costs. Collecting and reporting this information can still be helpful for audiences weighing up the costs and benefits of undergoing a future MSC, where implementation is anticipated. For prospective evaluations and service planning, where the audience are decision makers deciding whether to implement MSC, implementation costs can influence the implementation decision/approach [19, 24]. With a year-on-year fixed budget, there is less interest in analyses focusing on cost-effectiveness thresholds per gain in outcome (e.g. quality-adjusted life-years [QALYs]) and greater focus on the extra resources required, either upfront or overall. Hence, regional decision makers may prefer a budget impact, return-on-investment or programme budget marginal analysis [12].

2.3 Costing Methodology

Broadly, two main methodologies are available for costing implementation: top-down or bottom-up. Top-down costing, which considers the costs of specific ‘units’, e.g. salaries or consultation events, may be less precise here; staff could be responsible for a range of activities besides reconfiguration, and event costs may reflect estimates only. It also excludes less well-defined costs, such as relocating staff and reorganising rotas and contracts. The strength of top-down costing, however, is that costs can be calculated more readily and cheaply. Conversely, bottom-up costing (also called micro-costing) identifies each implementation component and assigns unit costs, requiring more resources and planning to conduct. Collecting this information poses challenges, and mixed-methods approaches, including realist reviews, qualitative methods or documentary analysis, may be required in addition to traditional quantitative methods.

Figure 1 sets out the framework and principles of bottom-up implementation cost analysis in MSC, conceived during our work on stroke [4, 5, 12, 14]. We list key costing components and considerations, with information on the perspectives within which they may be relevant, i.e. (a) local provider, (b) local/regional payer and/or health authority, or (c) national. The total implementation cost (adjusted to a common financial year) can be calculated as the sum of all implementation activities [sum of items (A) to (E) in Fig. 1]. Depending on the perspective, non-healthcare costs such as costs to patients (G) could be included, and the total could be annuitized [25] according to the lifetime of the changes or assets purchased (I) and divided across the relevant patient population (H). We recommend using standard accounting techniques and recommendations from Drummond et al. [25] to make the best estimate of the implementation cost, including appropriate methods for accounting for capital costs and assets, i.e. annuitizing over their lifetimes and over the lifetime of the MSC. Discounting of future costs and outcomes is not appropriate here, but would be applicable in economic evaluations looking at future costs and benefits with time horizons longer than 12 months, including for future implementation costs; we do not propose to include this kind of future discounting when calculating upfront or retrospective implementation costs. Variable costs such as tariffs (F) may be incorporated directly into the full cost-effectiveness analysis (CEA) but not into the implementation cost; they are mentioned in the framework because they can be collected more easily during the reconfiguration than after it.
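
As a minimal illustration of the aggregation just described, the following sketch sums hypothetical component totals (A)–(E), annuitizes them over an assumed lifetime, and divides by an assumed annual patient population; the component values, lifetime, rate and population are placeholders, not figures from the framework or the case study.

```python
# Illustrative sketch of the framework-level calculation: sum components (A)-(E)
# (plus, if the perspective requires it, costs to patients (G)), annuitize over the
# lifetime of the change (I), and divide across the relevant patient population (H).
# All figures are placeholders.

def annuity_factor(rate, years):
    """Converts a one-off cost into an equivalent annual cost over 'years' at 'rate'."""
    return (1 - (1 + rate) ** -years) / rate

def implementation_cost_per_patient(components, patient_costs_g=0.0,
                                    lifetime_years=10, rate=0.035,
                                    patients_per_year=1_000):
    total = sum(components.values()) + patient_costs_g   # one-off cost, common price year
    equivalent_annual_cost = total / annuity_factor(rate, lifetime_years)
    return equivalent_annual_cost / patients_per_year

example_components = {   # hypothetical totals for components (A)-(E)
    "A_options_appraisal": 500_000,
    "B_stakeholder_engagement": 80_000,
    "C_planning_meetings": 150_000,
    "D_purchases_and_hiring": 2_000_000,
    "E_monitoring_systems": 20_000,
}
print(f"£{implementation_cost_per_patient(example_components):,.2f} per patient per year")
```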

3 Case Study

3.1 Background to Case Study: RESPECT-21

From February 2012 to April 2016, an integrated network of cancer providers in North Central London, North East London and West Essex (initially known as ‘London Cancer’, population 3.2 million, 8 NHS Trusts) worked to centralise specialist surgery services for eight cancer pathways across urology, head and neck, brain, oesophagogastric (OG) and haematological cancer services [26]. Cardiovascular service reconfiguration took place simultaneously [27]. The RESPECT-21 study focuses on four pathways: prostate, bladder, renal and OG, and includes full CEAs comparing the new centralised services with the previous systems [20]. RESPECT-21 is a retrospective analysis but ran contemporaneously with the MSC, allowing collection of implementation cost data from the local/regional payer perspective [(b) above], i.e. that of the regional health authority, in this case London Cancer. RESPECT-21 also covered similar changes in Greater Manchester, but delays there meant that only the London changes feature here. The reconfigurations involved the following moves: OG, from three sites to two; prostate and bladder, from four sites to one; and renal, from nine sites to one.

3.2 Methods for Case Study

3.2.1 Data Sources

Identification of components according to Fig. 1 used a mixed-methods approach. The team’s qualitative researchers undertook a programme of interviews and meeting observations and collected extensive documentation for qualitative analyses of process and impact on staff. Extra interview questions on financial considerations were included in collaboration with the team’s health economists, and a comprehensive timeline of events was generated. The health economist used this jointly produced work to identify and cost implementation events according to the framework, obtaining additional details and clarifications from NHS senior finance staff identified by the qualitative team as potential sources of further information. Documentary sources (n = approximately 100) included meeting minutes from the various boards that discussed aspects of the reconfiguration and some sites’ business cases (see Table 1). Further details on the qualitative aspect of RESPECT-21 can be found in recently published work [26]. Where documentation was unavailable or incomplete, estimates of some details were made in collaboration with senior managerial and clinical staff from the NHS and associated organisations. Published NHS salary scales for average relevant grades were applied to monetise estimates of staff time spent [28].

Table 1 Boards included in the implementation cost analysis, and the dates during which they were included

3.2.2 Board Structure

We included costs for boards where we could obtain evidence or estimates regarding meeting frequencies and numbers, using information from minutes and attendees’ memories (Table 1). We did not include time spent by members of the public or local authority staff, as this did not fall within the cost perspective. Meetings were coded using the identity, number and type of NHS attendees, length of meetings, and the estimated proportion of time spent on the MSCs of interest, to generate a mean cost per meeting, which was multiplied by the total number of meetings to give the total reconfiguration cost per board.
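
A minimal sketch of this per-board calculation is given below; the attendee hourly rate, meeting length, proportion of time spent on the MSCs of interest and number of meetings are all hypothetical, chosen only to show how the mean cost per meeting and the total reconfiguration cost per board are built up.

```python
# Illustrative sketch of the per-board costing: mean cost per meeting from attendees,
# meeting length and the proportion of the meeting spent on the MSCs of interest,
# multiplied by the number of meetings. All input values are hypothetical.

def mean_cost_per_meeting(attendee_hourly_rates, meeting_hours, proportion_on_msc):
    """Cost of one meeting attributable to the reconfigurations of interest."""
    return sum(attendee_hourly_rates) * meeting_hours * proportion_on_msc

def board_reconfiguration_cost(attendee_hourly_rates, meeting_hours,
                               proportion_on_msc, number_of_meetings):
    return mean_cost_per_meeting(attendee_hourly_rates, meeting_hours,
                                 proportion_on_msc) * number_of_meetings

# Hypothetical board: eight NHS attendees at a rate derived from a salary-scale
# midpoint plus on-costs, two-hour meetings, half of each meeting on the four
# cancers of interest, 30 meetings over the analysis timeline.
print(f"£{board_reconfiguration_cost([49.0] * 8, 2.0, 0.5, 30):,.0f}")
```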

The landscape for the various boards is complex, and changes took place during the analysis period. An approximate and incomplete summary is this: the Cancer Commissioning Board (CCB) was established on 1 October 2011 and attended by commissioners, holding full statutory responsibilities from 1 April 2013. Work also started in 2011 under the London Health Programmes of NHS London (as was), and the Joint Development Group (JDG) for cancer, chaired by a senior commissioner, oversaw work to develop the specialist cancer reconfigurations to the point at which NHS East London region was instituted in June 2013. The Joint Cancer Cardiac Programme Board (JCCPB) took over from the CCB and JDG in 2014 and oversaw reconfigurations in cancer and cardiac pathways. The London Cancer Board (LCB) was tasked in February 2012 with designing and implementing the cancer reconfigurations, and Pathway Boards were created in 2012 by LCB to oversee pathway development. Pathway Directors and Managers were hired to run Pathway Boards. Pathway Boards reported to the CCB then the JCCPB, including presenting Gateway Reviews that assessed and documented Trusts’ and Pathways’ readiness for reconfiguration. Operational Steering Groups (OSGs) for each Pathway met regularly from 2014 to 2015/2016 to discuss the reconfigurations. The Cancer Unification Board (CUB), which discussed aspects of the reconfiguration, met from 2014 to 2016, when its functions were transferred to the Cancer Vanguard Programme Board. The London Clinical Senate reviewed the decision-making process. Some members of these boards were included in the ‘Key Actors’ list in this analysis. Besides these boards, there were also local authority bodies, such as the council Health Overview and Scrutiny Committees (OSCs) and Joint OSCs, which included some NHS staff.

3.2.3 Implementation Timeline

Implementation activities for MSC can have (1) a ‘before’ period, when design and planning begins; (2) a ‘during’ period, often > 12 months, where initially only implementation activities happen but later both implementation and the new intervention occur simultaneously; and (3) an ‘after’ period where the new intervention is happening and implementation activities have stopped. In our case-study scenario, the ‘during’ period for renal specialist surgery began on 1 January 2015, and for prostate and bladder surgery on 1 July 2015, and ended on 1 April 2016 for these three cancers. The OG reconfiguration had only ‘before’ and ‘after’ periods, as the changes took place on a single day (1 January 2016).

The start date for this implementation cost analysis was 28 February 2012, coinciding with LCB’s first meeting after being tasked with leading the reconfiguration, and ended on 1 April 2016, the start of the ‘after’ period for the three urological cancers. These dates (summarised in Table 2) were used for all four cancers, despite OG’s ‘after’ period having begun 4 months earlier, to simplify data collection. For the purposes of this retrospective evaluation, we combined all costs occurring during the 4 years into a single one-off cost, adjusted to a common price year, that could be applied at the start of the ‘after’ period in a future economic evaluation.

Table 2 The implementation cost analysis ran from 28 February 2012 to 1 April 2016 for the four cancers

3.2.4 Data Collection and Cost Adjustment

The qualitative work identified boards, observed meetings, collected minutes, and interviewed NHS and related staff. One author (CL) estimated the approximate total number of meetings per board during the timeline to supplement the documentary evidence (see Table 1). The average salary grade of meeting attendees was estimated as the midpoint of NHS Band 9 [29, 30]. Estimates of the costs of engagement events were made similarly, using information from current and former North and East London Commissioning Support Unit (NELCSU) staff (the internal change support agency for the NHS), Transforming Cancer Services Team (TCST) staff, and other NHS clinicians and managers. Business cases containing information on capital expenditure on equipment, facilities and other items were obtained from Trust senior finance staff.

Costs in categories A–E (Fig. 1) were adjusted to 2017–2018 prices using the new Health Services Index, which is based on the Consumer Price Index (CPI; Health), and the previous Hospital and Community Health Services indices [31, 32], and were then summed to give total costs.
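
The sketch below illustrates this price-year adjustment; the index values and yearly expenditures are placeholders rather than the published index figures and study costs actually used.

```python
# Illustrative sketch of adjusting each year's expenditure to a common price year
# using a pay/price index, then summing. Index values and costs are placeholders,
# not the published HCHS or CPI (Health) figures used in the analysis.

hypothetical_index = {2012: 95.0, 2013: 96.5, 2014: 98.0, 2015: 99.5, 2016: 101.0, 2017: 103.0}

def adjust_to_price_year(costs_by_year, index, target_year=2017):
    return {year: cost * index[target_year] / index[year]
            for year, cost in costs_by_year.items()}

raw_costs = {2012: 200_000, 2013: 450_000, 2014: 900_000, 2015: 1_200_000}  # illustrative
adjusted = adjust_to_price_year(raw_costs, hypothetical_index)
print(f"Total at 2017-2018 prices: £{sum(adjusted.values()):,.0f}")
```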

3.2.5 Incorporating Implementation Costs into Full Economic Evaluation

Certain considerations were made to ensure that the results of this implementation cost analysis would be suitable for inclusion in a full CEA. When calculating capital costs such as expenditure on new equipment or changes to buildings, we considered the potential lifetime of the asset and its eventual resale value, and used these to annuitize the implementation cost over these time periods [25]. We assumed that both the MSC and any capital assets had a lifetime of 10 years before they would be ‘replaced’ (Consideration I), and applied an interest rate matching the standard 3.5% per year discount rate used in the economic evaluations in which this implementation cost is expected to feature [33]. Sensitivity analysis was performed to evaluate the implications of using different lifetime durations. Total costs will be divided across the four different cancer specialities according to annual incidence in the relevant population (Consideration H) for inclusion in the full CEAs for RESPECT-21, as these will be done separately for each of the four cancers.
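
For the capital items, the equivalent-annual-cost approach described in Drummond et al. [25] can be sketched as follows; the purchase price, resale value and lifetimes below are illustrative (the £1.9 million figure echoes the robot cost reported later, but the calculation itself is not a result from this study).

```python
# Sketch of the equivalent-annual-cost method for a capital asset, following
# Drummond et al.: purchase price net of discounted resale value, spread over the
# assumed lifetime at 3.5%. Resale value and resulting figures are illustrative.

def equivalent_annual_cost(purchase_price, resale_value=0.0, lifetime_years=10, rate=0.035):
    annuity_factor = (1 - (1 + rate) ** -lifetime_years) / rate
    net_present_cost = purchase_price - resale_value / (1 + rate) ** lifetime_years
    return net_present_cost / annuity_factor

for years in (5, 10, 15):   # 10 years was the base case; others reflect sensitivity analysis
    eac = equivalent_annual_cost(1_900_000, resale_value=0.0, lifetime_years=years)
    print(f"{years}-year lifetime: equivalent annual cost £{eac:,.0f}")
```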

3.3 Results for Case Study

3.3.1 Using the Major System Change Implementation Cost Framework

3.3.1.1 People’s Time

Using board meeting minutes alone to calculate staff time was insufficient as staff also spent substantial time outside meetings. We therefore created a list of 19 Key Actors based on the qualitative work and additional conversations with key central figures and included a weighted portion of their salary and on-costs, on top of the time spent by other staff in board meetings. These Key Actors spent an estimated 12.5% (11 people), 25% (5 people), 40% (1 person) or 50% (2 people) of their time across the four specified cancer reconfigurations. Time spent in board meetings by these specific people was excluded to avoid double counting.
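
The way these weighted portions accumulate can be sketched as follows; the time fractions mirror those quoted above, but the durations and salary figures are hypothetical, so the totals are illustrative rather than the values reported later for the case study.

```python
# Illustrative sketch of costing Key Actors' time: each actor contributes a flat
# fraction of their annual salary plus on-costs for the years they were involved.
# The fractions mirror those quoted above; durations and salaries are hypothetical.

key_actors = [
    # (fraction of time on the four reconfigurations, years involved, salary + on-costs)
    (0.125, 3.0, 105_000),
    (0.25, 4.0, 105_000),
    (0.50, 2.5, 110_000),
]

person_years = sum(fraction * years for fraction, years, _ in key_actors)
total_cost = sum(fraction * years * salary for fraction, years, salary in key_actors)
print(f"{person_years:.2f} person-years costing £{total_cost:,.0f}")
```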

On top of the above, other informal discussions and planning tasks took place among other staff. This would have constituted substantial time for clinicians and managers, but it was not possible to quantify and therefore represents missing information.

3.3.1.2 Costs to Sites

We initially considered obtaining financial documentation from all 14 sites in the region to confirm expenditure, but conversations with key management and clinical staff suggested no direct external expenditure during the timeframe, except at the new prostate/bladder and renal specialist centres.

Some specialist renal, bladder and prostate surgeries are increasingly performed using robotic techniques, and replacement robots were purchased by two sites during the timeline of the reconfiguration study.

3.3.2 Results by Cost Component

All costs for components A–E (see Fig. 1) quoted in this section are unadjusted. Some board meetings straddled cost components A, B, C and E; they are all reported under C for convenience. Some costs covered all eight pathways across the five services listed above, some covered the urology pathways plus OG (i.e. the four cancers of interest here), some covered urology only, and some also covered engagement events or meetings where the cardiac reconfigurations were also discussed. The base-case analysis attributed all costs collected to the four services of interest, regardless of any overlap with the other four cancer services or the cardiac services, as it was not clear that costs for, for example, engagement events or consultancy providing transport analyses would have been reduced had only the four services of interest been involved. Breakdowns and sensitivity analyses with some reductions in these costs where they were shared with the cardiac reconfiguration are discussed below and in Sect. 3.3.4.

3.3.2.1 Component A: Options Appraisal

As part of the appraisal process, work was performed at the NELCSU and by external consultants. This included a complex business case covering capital works across different hospital sites, programme management support, and competition (market) and transport analyses. The total cost was estimated at £1,850,000 across all eight cancers and the cardiac services. The base case included this total amount, and sensitivity analysis considered including only half of it (i.e. half attributed instead to the cardiac changes).

Regarding bidding preparations at prospective specialist sites, we obtained no information beyond some Key Actors writing bids as part of their role. These costs were therefore included within Key Actors.

3.3.2.2 Component B: Stakeholder Engagement

There were two engagement phases: October–December 2013 and May–June 2014. These were led by NHS England and Clinical Commissioning Groups and included workshops attended by clinicians and the public, as well as planning and engagement meetings with NHS staff. Provider staff time at workshops totalled 475 person-hours (£23,248) in the first phase (urology only) and 220 person-hours (£10,768) in the second phase (all eight cancers). A further 520 person-hours (£25,451) were spent in ongoing meetings covering the joint cancer and cardiac changes, and £23,713 was spent by the LCB on room hire, catering, etc. for events (all eight cancers). The base-case analysis included all these costs; in sensitivity analysis, a similar reduction was made for the overlap here with the cardiac reconfiguration.

There were no minutes available for engagement events, therefore information came from memories and calendar invitations from current and former TCST and NELCSU staff and mentions in various documentation, including archived news items. LCB direct expenditure figures came from a report discussed at a 2014 LCB meeting. We could not exclude Key Actors’ time here as event attendee lists were not available. No clinic sessions were cancelled for these events, and almost all occurred outside working hours. No distinction was made in this analysis between staff time spent during working hours and during leisure time.

3.3.2.3 Component C: Planning Meetings

Staff, excluding Key Actors, spent an estimated 1459 person-hours (£71,309) on board meetings (four cancers of interest), including during the options appraisal period (component A), the engagement period (component B), planning and monitoring (component C), and for auditing and monitoring performance (component E). Expenditure on internal change support for planning and monitoring totalled £100,000 (eight cancers and cardiac) with the base case including this total amount, and sensitivity analysis including only half of this amount (£50,000).

3.3.2.4 Component D: Making Purchases or Hiring Staff

There were some new hires later due to increased patient volumes at the specialist sites, and some sharing or movement of surgeons between sites, but these were not included in the analysis: only the one-off costs of new roles created specifically for the design, planning and implementation were included, and these all fell within Key Actors’ time.

Robots: Two specialist sites obtained old robots from associated sites in the years leading up to the reconfiguration, one for renal surgery and one for bladder/prostate. Both robots were later replaced, one in 2014 and one in 2017, and cost £1.9 million each, according to figures from the confidential business case for purchase at one of the new specialist sites. In the absence of information for the robot at the other site, it was assumed that its purchase price was the same. They were intended for exclusive use for these surgeries at each site.

Other equipment purchases: For renal cancer, an itemised business case discussing the reconfiguration included £0.16 million for additional theatre equipment. In OG, there were some costs for purchasing new theatre kit at one new specialist OG centre for surgeons now operating at that site as a result of the reconfiguration, but specific cost information was unavailable. No costs of this type were reported for prostate or bladder cancer.

3.3.2.5 Key Actors

Each of the 19 Key Actors was assigned a flat percentage of their time spent on the reconfiguration of the four pathways over a number of years; these portions summed to 10.7 person-years (£1,081,602). Salaries were taken from budget documents or estimated from published figures in consultation with NHS colleagues.

3.3.2.6 Component E: Implementing Monitoring Systems

Time spent on implementing audit and monitoring systems in board meetings was included under Component C. These covered the four cancers of interest only.

3.3.3 Other Cost Considerations

3.3.3.1 Consideration F: Tariff Changes

Temporary top-up tariffs for robotic surgeries were used at the prostate/bladder and renal specialist centres. Changes in tariffs were not included in this cost analysis, but will be considered in sensitivity analysis in future full CEAs.

3.3.3.2 Consideration G: Costs to Patients

The perspective of this study was the regional healthcare authority (b), therefore costs to patients were not included.

3.3.3.3 Consideration H: Population of Interest

The relevant population was the catchment area, or 3.2 million people. The patient numbers required to adapt this cost analysis for inclusion in the main CEA approximate the relative incidence of the four cancers in North Central and East London and West Essex. According to International Classification of Diseases, 10th Revision (ICD-10) codes and using 2017 figures from CancerStats (PHE, 13 September 2019), these were: 511 renal cancers (codes C64–66, kidney, renal pelvis, ureter), 343 bladder cancers (code C67, bladder), 2077 prostate cancers (code C61, prostate), and 482 OG cancers (codes C15–16, upper gastrointestinal).
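
A sketch of this allocation (Consideration H) is given below: an annuitized annual implementation cost is split across the four cancers in proportion to these incidence counts and then expressed per patient. The annual cost is a placeholder, and the real apportionment may differ where particular components (e.g. the robots) relate only to specific pathways.

```python
# Illustrative sketch of Consideration H: split an annuitized annual implementation
# cost across the four cancers in proportion to annual incidence, then express it
# per patient. Incidence counts are the 2017 figures quoted above; the annual cost
# is a placeholder, not a figure from the study's results tables.

incidence_2017 = {"renal": 511, "bladder": 343, "prostate": 2077, "OG": 482}
annual_implementation_cost = 850_000  # hypothetical equivalent annual cost

total_patients = sum(incidence_2017.values())
for cancer, n in incidence_2017.items():
    share = annual_implementation_cost * n / total_patients
    print(f"{cancer}: £{share:,.0f} allocated, £{share / n:,.2f} per patient")
```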

3.3.3.4 Consideration I: Lifetime of the Changes

The perspective of this study was the regional healthcare authority (b), therefore we considered the implementation cost as a one-off cost, annuitized over 10 years in agreement with the estimated lifetime of the assets purchased (robots) and the MSC itself. We have not projected future costs of subsequent reconfigurations; however, this could be relevant under the national perspective (c).

3.3.4 Total Implementation Cost: Overall and Per Patient

Total implementation cost was £6.9 million by expenditure year or £7.2 million adjusted to 2017/2018 prices (see Table 3). This included some costs that could potentially be attributable to the cardiac reconfiguration or to the other four cancer pathways. Sensitivity analysis removing half of the shared costs where there was overlap with cardiac reconfigurations reduced the total to £6.2 million in 2017/2018 costs, mostly due to halving the Component A costs. In other reconfigurations, it is possible that such a large equipment cost might not be required; excluding the robot costs gave an adjusted total cost of £3.2 million.

Table 3 Breakdown of expenditure by year and type, for actual expenditure (Raw) and adjusted to 2017–2018 prices (Adjusted)

Total costs per patient for incorporation into an economic evaluation are reported in Table 4. The costs have been broken down by cancer type, given that the MSC will have a different impact on the costs and consequences/clinical effectiveness for each cancer type in the ‘after’ period. The assumption regarding the lifetime of the MSC is likely to have implications for the economic evaluation, given the magnitude of the difference between the results for the different assumptions.

Table 4 Cost per patient by cancer specialist area using a 10-year lifetime for both capital costs and non-capital costs in 2017/2018 prices as the base case, and over shorter and longer time horizons as sensitivity analysis

4 Discussion

4.1 Context of the Results

Based on a bottom-up, single-arm cost analysis, the design, planning and implementation stages of the reconfiguration of four specialist cancer surgery services in the London Cancer region cost approximately £7.2 million in 2017–2018 prices. A framework for calculating implementation cost for use in economic evaluations for MSC planning and other research purposes (Fig. 1), conceived during our previous work on stroke [4, 5, 12, 14], was used to guide the methods and analysis for this work. Considerations such as cost perspective (who is the target audience for the analysis), whether the analysis is prospective or retrospective, the lifetime of the changes, the relevant patient population, and how capital and non-capital costs are incorporated into the analysis all have important implications for how the cost per patient for use in an economic evaluation is calculated. These types of implementation costs have not historically been included in economic evaluations, but their inclusion can be important to decision makers, depending on the cost perspective.

We have produced a best estimate of the implementation cost, to be used in future work. We performed deterministic scenario analysis to provide alternative best estimates for alternative scenarios. We expect that a joint probabilistic sensitivity analysis would be performed when this implementation cost is included in a full CEA, as in that context there would be a research question to be answered, comparing two alternative courses of action and requiring information regarding their relative cost effectiveness.

4.2 Overview and Limitations for Case Study

There are a number of limitations to this cost analysis, the most important being the lack of counterfactual data as it is a single-arm analysis; it implicitly assumes that maintaining the previous system involves zero cost, which may not be appropriate and also ignores other possible changes and innovations taking place at this time. The reconfigurations were expected to deliver benefits in patient care and outcomes, and these will be assessed in RESPECT-21 [20] as part of a full comparative CEA per cancer, using patient-level outcomes and resource-use data from the ‘before’ and ‘after’ periods, in the reconfigured London Cancer region and the non-reconfigured ‘Rest of England’ region. Other limitations are that some work began before the analysis time period, especially in prostate cancer, and some potentially relevant costs appeared after the cut-off and were therefore excluded. For example, one non-specialist site hired locums for a while after moving services to the specialist site due to short-term difficulties in recruiting permanent staff, which incurred higher costs than filling permanent positions. Oversight and facilitation of ‘bedding in’ after the main changes had been effected could also be seen as a key part of implementation, but we captured only limited information on this in the form of staff time in component E.

In addition, the data collected were incomplete due to the timescales involved and the analysis complexity. Assumptions were made around how to attribute costs, unit costs used, and time spent by staff. Staff time is likely to be the biggest underestimate, as we could not gather accurate information on exactly how long was spent discussing and reflecting on planning and implementation decisions.

The inclusion of costs for robotic surgery implicitly raises questions regarding the relative cost effectiveness of robotic and traditional surgery, on which there is limited evidence to date. In both renal [34] and prostate [35] cancer, there is some evidence that better clinical outcomes can be obtained with robotic surgery, and a reduction in length of stay has been observed at these sites. Use of the robot to avoid open surgery, with its prolonged length of stay, could be a contributing factor to this reduction, and the budget for the purchase of the replacement robot was partly justified by the high case volume created through the service reconfiguration.

The impact of discontinuing specialist surgeries at some sites was also hard to measure. It could attract other interventions to fill the space, which may result in longer or shorter procedures and lengths of stay, different use of consumables, loss of expertise in some fields and development of new expertise in others. No staff were made redundant as a result of the MSC; they moved either to the new service or to other clinical areas, although this may not be the case for other service reconfigurations in the future.

4.3 Limitations Regarding Onward Use of the Implementation Costs

The total cost of implementation is particularly important for a prospective analysis of a reconfiguration, where the audience of the analysis is one that is weighing up the costs and benefits of an MSC. The total outlay of costs is likely to be important in this context, to budget for the additional costs such as staff time and capital that will be required to implement the MSC. However, this case study is part of a retrospective analysis of a reconfiguration that took place in London Cancer, where the aim was to evaluate the clinical and cost effectiveness of the MSC, including the implementation cost in the ‘after’ group in the reconfigured region.

The theoretical basis for deciding the cost year and annuitizing non-capital costs requires careful consideration in each individual case, in terms of when the expenditure is likely to take place. In the case study reported here, the choice to annuitize the implementation cost had little impact on the total per-patient cost, but assumptions regarding the lifetime of the MSC were important. However, the implications for the results of the full economic evaluation subsequently being performed for RESPECT-21 remain to be seen. It is possible, for example, that incidence rates and the total population might change over time.

The aims of this paper were to illustrate the importance of implementation costs for MSC, with a view to their inclusion in a full CEA, and to discuss how relevant costs can be categorised to frame discussions with colleagues during data collection; we have presented these categories in our framework for application by planners and researchers in similar MSC contexts in the future. Certain items are very specific to this context, namely the replacement robots, while others are more widely applicable, e.g. staff time, consultancy fees, and engagement activities. We annuitized costs according to the estimated lifetime of the assets and the MSC, and varied this in sensitivity analysis. We note that the perspective of RESPECT-21’s main CEAs will be that of the NHS and Personal Social Services [i.e. perspective (c)], so we will consider how to include assets, as the costs of the robots to the NHS were also partly accounted for via temporary tariff changes at some of the specialist sites.

Use of the framework principles and components described above greatly facilitated data collection across the various NHS services and sites. We have broken down the implementation costs into categories (see Sect. 3.3.2 and Table 3) that we hope will assist future service designers and planners, as well as researchers, in making estimates of the likely costs of implementation, ahead of performing their own MSC.

5 Conclusions

This was the first time this framework (Fig. 1), conceived during the stroke reconfiguration, was used to explore and capture the components and costs of designing, planning and implementing MSC. We have provided a model scenario describing the investigation of these costs.

Inclusion of implementation costs in CEA is likely to make MSC appear less cost effective, potentially influencing future decisions regarding MSC. Regional and subregional decision-making is likely to become more prevalent with increased devolution of the type currently underway in England, making implementation cost analyses at these levels even more relevant and important in future as regional authorities aim to harmonise MSC planning.

Implementation is often thought of as a ‘black box’ and not fully investigated. With this work, we have begun to explore, and to present, the components and associated costs of designing, planning and implementing MSC, yet there is still further work to be done. It is often assumed that implementation costs can be ignored; we hope that this work goes some way towards challenging that assumption.