Full article title Laboratory demand management strategies: An overview
Journal Diagnostics
Author(s) Mrazek, Cornelia; Haschke-Becher, Elisabeth; Felder, Thomas K.; Keppel, Martin H.; Oberkofler, Hannes; Cadamuro, Janne
Author affiliation(s) Paracelsus Medical University
Primary contact Email: c dot mrazek at salk dot at
Year published 2021
Volume and issue 11(7)
Article # 1141
DOI 10.3390/diagnostics11071141
ISSN 2075-4418
Distribution license Creative Commons Attribution 4.0 International
Website https://www.mdpi.com/2075-4418/11/7/1141/htm
Download https://www.mdpi.com/2075-4418/11/7/1141/pdf (PDF)

Abstract

Inappropriate laboratory test selection, in the form of overutilization as well as underutilization, frequently occurs despite available guidelines. There is broad approval among laboratory specialists and clinicians that demand management (DM) strategies are useful tools to avoid this issue. Most of these tools are based on automated algorithms or other types of machine learning. This review summarizes the available DM strategies that may be adapted to local settings. We believe that artificial intelligence (AI) may help to further improve these available tools.

Keywords: appropriate laboratory test ordering, overutilization, pre-analytical phase, underutilization

Introduction

Laboratory tests are fundamental for medical diagnosis, prognosis, and treatment decisions[1] and are being ordered in rising numbers each year due to increased availability, mostly based on technological advances.[2] However, because laboratory orders increase along with convenient availability, it seems that a certain proportion of laboratory tests are ordered inappropriately.[3][4] On the one hand, inappropriate orders may present as overutilization, where tests with doubtful contribution to further patient management are ordered; on the other hand, there may be underutilization, when required tests are not ordered.[5] Even if studies estimating over- or underuse are rarely comparable due to differences in study design, it seems that the extent is not negligible. In a systematic review, Zhi et al.[5] estimated an overall mean rate of overutilization of 20.6%. Subgroup analysis revealed a higher mean rate, around 44%, for inappropriate initial testing. However, individual studies state that up to 70% of ordered tests may be of doubtful importance for patient management.[6][7] A workup of closed malpractice claims conducted by Gandhi et al.[8], as well as Kachalia et al.[9], revealed that failure to order the appropriate diagnostic or laboratory test contributed to missed or delayed diagnoses in 55% and 58% of cases in an ambulatory setting and the emergency department, respectively. Zhi et al.[5] state that the overall mean rate of underutilization is 44.8%.

Together with the findings of Sarkar et al.[10], who confirmed the high proportion of errors in test selection by evaluating orders for coagulation disorders in real time, these data suggest that inappropriate ordering may be considered a substantial threat to patient safety. Overutilization may lead to unnecessary follow-up investigations or treatments, increased workload and costs, and increased patient anxiety, while underutilization may result in missed or delayed diagnoses.[5][11][12] Lack of knowledge, insecurity, pure habit, patient pressure, or fear of lawsuits are possible causes of inappropriate testing.[13][14][15] The lack of knowledge is reflected by various studies, which observed inappropriate orders despite available guidelines or recommendations on the implementation of demand management (DM) tools.[12][14][16][17][18]

This review summarizes available DM strategies, which may be implemented into local settings to reduce inappropriate test utilization.

Possible strategies to avoid inappropriate test utilization

DM tools may help to prevent overutilization and underutilization. An attempt to categorize the different DM strategies as appropriate tools to overcome over- and underutilization is depicted in Figure 1.


Fig1 Mrazek Diagnostics21 11-7.png

Figure 1. Categorization of DM strategies: Stratification of whether DM tools prevent overutilization and underutilization. LOP = laboratory ordering profile.

Many studies combine several tools[14][17][19], which has been shown to have an additive effect on the overall outcome.[20] In addition, the collaboration of laboratory specialists and clinicians together with audits, feedback, reminders, and multiple plan-do-study-act (PDSA) cycles will further improve efficiency in terms of a continuous improvement process.[12][14][18][19]

Alerts at the stage of order entry

Alerts appearing in the form of pop-up windows in the computerized physician order entry (CPOE) system may be designed to avoid various causes of overutilization.

Lippi et al.[21] implemented alerts for biological implausibility concerning age (e.g., beta human chorionic gonadotropin in patients < 9 and > 60 years) or gender (e.g., prostate-specific antigen [PSA] in females) at two university hospital wards. In addition, alerts for minimum retesting intervals (MRIs) were implemented (addressed in a further subsection). The alert provides an explanation as to why the order is deemed inappropriate and enables the ordering provider to choose order cancellation or acceptance.
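
To illustrate how such order-entry rules might be encoded, the following is a minimal sketch of a plausibility check; the rule set, test codes, age limits, and function names are illustrative assumptions, not the configuration of the cited CPOE system.

```python
# Illustrative sketch of order-entry plausibility rules similar in spirit to
# those described by Lippi et al.; test codes, limits, and messages are
# hypothetical and would need to reflect local policy.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Patient:
    age: int          # years
    sex: str          # "F" or "M"

PLAUSIBILITY_RULES = {
    # test code: (check function, explanation shown in the pop-up alert)
    "BHCG": (lambda p: 9 <= p.age <= 60,
             "beta-hCG is biologically implausible below 9 or above 60 years"),
    "PSA":  (lambda p: p.sex == "M",
             "PSA is not indicated in female patients"),
}

def check_order(test_code: str, patient: Patient) -> Optional[str]:
    """Return an alert text if the order is implausible, otherwise None."""
    rule = PLAUSIBILITY_RULES.get(test_code)
    if rule is None:
        return None                      # no rule defined: order passes
    is_plausible, explanation = rule[0](patient), rule[1]
    return None if is_plausible else explanation

# The ordering provider may still accept or cancel the order after the alert:
alert = check_order("PSA", Patient(age=54, sex="F"))
if alert:
    print(f"ALERT: {alert} (continue or cancel?)")
```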

Similarly, Juskewitch et al.[16] implemented an alert, triggered by the concomitant order of erythrocyte sedimentation rate (ESR) and C-reactive protein (CRP) in a community health system. Again, the user is informed about the inappropriate request and has the choice to cancel ESR or to proceed with the order. The implementation of this DM strategy resulted in a 42% relative rate reduction of ESR/CRP co-ordering.

Alerts may also help to suggest an alternative test, as Parkhurst et al.[22] showed. The authors reduced genetic testing of methylene tetrahydrofolate reductase (MTHFR) by informing the ordering physician about the latest recommendations of MTHFR testing, including the suggestion of homocysteine as an alternate test. In this study, the choice of overruling or adopting the suggestion was left with the user. Overall, there was a significant decrease of average monthly MTHFR tests from 12.93 per million patients in the year before the intervention to 7.08 per million patients afterwards.

Larochelle et al.[17] aimed to improve ordering of cardiac biomarkers according to guidelines for the diagnosis of acute coronary syndrome (ACS). As part of a multimodal intervention, including education and several changes in the CPOE system (see later subsections), a pop-up alert was introduced, triggered by the order of creatine kinase (CK) and its MB isoform (CK-MB), informing the user about the recommended indications for these tests.

MRIs, which may also be implemented in the form of alerts at the stage of order entry, are discussed in the subsection about minimum retesting intervals.

Hold back orders in the laboratory information system

Informing the ordering provider through alerts at the stage of order entry would be the preferred solution; however, it may not always be possible to reject inappropriate orders in the CPOE system due to technical issues. In these cases, orders may be screened for appropriateness upon arrival in the laboratory information system (LIS).

Cadamuro et al.[23] selected the analysis of anti-PF4/heparin antibodies (HIT-Ab) as the objective for a so-called gatekeeping strategy. This test is used in cases of suspected heparin-induced thrombocytopenia (HIT), type II. However, before ordering the HIT-Ab test, pretest probability may be assessed with the 4T-score.[24] The four questions of this scoring system were incorporated into the CPOE system, and the appropriate answers had to be selected from a drop-down menu as a mandatory part of the HIT-Ab ordering process. Subsequently, the score was calculated automatically within the LIS, and depending on the result, the LIS rejected or submitted the order for testing. In the case of rejection, the ordering physician was informed that the probability of a positive HIT-Ab test was below 2% and that the laboratory’s decision could be overruled. This intervention resulted in a reduction of HIT-Ab testing of about 50%, without jeopardizing patient safety.
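
The gatekeeping logic lends itself to a simple rule: the LIS sums the four 4T items and rejects the order when pretest probability is low. The following is a minimal sketch, assuming the conventional 4T cut-off of ≤ 3 points for low probability; the function names and threshold handling are illustrative, not taken from the cited implementation.

```python
# Illustrative sketch of a 4T-score gatekeeper for HIT-Ab orders; the score
# items are the published 4T categories (each 0-2 points), but the data
# structures and the threshold handling here are simplified assumptions.
def four_t_score(thrombocytopenia: int, timing: int,
                 thrombosis: int, other_causes: int) -> int:
    """Sum of the four 4T items, each scored 0-2 by the ordering physician."""
    for item in (thrombocytopenia, timing, thrombosis, other_causes):
        if item not in (0, 1, 2):
            raise ValueError("each 4T item must be scored 0, 1, or 2")
    return thrombocytopenia + timing + thrombosis + other_causes

def gatekeep_hit_ab(score: int, overruled_by_clinician: bool = False) -> str:
    """Decide in the LIS whether the HIT-Ab order is performed or rejected."""
    if score <= 3 and not overruled_by_clinician:
        # Low pretest probability: reject and inform the ordering physician
        # that a positive HIT-Ab result is unlikely and that the decision
        # can be overruled.
        return "rejected"
    return "submitted for testing"

score = four_t_score(thrombocytopenia=1, timing=0, thrombosis=0, other_causes=1)
print(gatekeep_hit_ab(score))   # -> "rejected"
```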

Mrazek et al.[4], who aimed to collect cases indicating a relationship between availability and the number of ordered tests, described an example, provided by Maria Salinas, where the LIS held back orders in which at least four tumor markers (TMs) were requested concomitantly. The laboratory specialist then decided upon the appropriateness of the order in synopsis with the patient’s medical record. Samples from inappropriate orders were stored until the necessity of the order was clarified with the general practitioner. Three years after implementation of this DM strategy, annual requests containing four or five TMs declined by 66%.

MRIs, which may be considered as a subset of holding back orders, are discussed in the following section.

Minimum retesting intervals

Minimum retesting intervals (MRIs) are defined as “the minimum time before a test should be repeated, based on the properties of the test and the clinical situation in which it is used.”[25] Recommendations for MRIs are freely available, for example, from the collaboration of the Royal College of Pathologists, the Association for Clinical Biochemistry and Laboratory Medicine, and the Institute of Biomedical Science.[25] MRIs may be implemented in the LIS, depending on available technical possibilities. Salinas et al.[26] implemented an MRI in the form of a comment on the laboratory report. In case a ferritin test was re-ordered within three days of the last order for inpatients or within three months for outpatients, the LIS rejected the request and stated an explanation in the comment, including the previous ferritin value. The results showed that 3.9% and 12% of requested ferritin tests were not measured in inpatients and outpatients, respectively.
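
Such an MRI rule is essentially a date comparison in the LIS. The following minimal sketch models the ferritin example above (three days for inpatients, three months for outpatients); the function and field names are illustrative assumptions.

```python
# Minimal sketch of an automated minimum retesting interval (MRI) rule for
# ferritin, modeled on the description of Salinas et al.; the interval values
# follow the text, while the function and field names are illustrative.
from datetime import date, timedelta

FERRITIN_MRI = {"inpatient": timedelta(days=3), "outpatient": timedelta(days=90)}

def ferritin_mri_check(setting: str, last_order: date, last_value_ug_l: float,
                       new_order: date) -> tuple[bool, str]:
    """Return (perform_test, report_comment) for a new ferritin request."""
    if new_order - last_order < FERRITIN_MRI[setting]:
        comment = (f"Ferritin not measured: repeated within the minimum "
                   f"retesting interval; previous result on {last_order}: "
                   f"{last_value_ug_l} ug/L.")
        return False, comment
    return True, ""

perform, comment = ferritin_mri_check("inpatient", date(2021, 3, 1), 250.0,
                                      date(2021, 3, 2))
print(perform, comment)
```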

A similar approach was used by Mrazek et al.[27], who implemented an MRI of 60 days for hemoglobin A1c (HbA1c) at one site of a university hospital (Landeskrankenhaus [LKH]). Inappropriate orders were automatically rejected by the LIS, and an automatically generated comment explained the inappropriateness, stated the date as well as the result of the last HbA1c test, and advised calling the laboratory if the analysis was needed in a special situation. This resulted in a decline of HbA1c measurements by 15.8%. After the implementation of the MRI, only 1.1% of ordered HbA1c tests were measured within 60 days, compared to 15% before the intervention. At another site, the Landesklinik St. Veit (STV), the MRI was implemented by educational measures only (see the later subsection on "Education").

One drawback of rejecting tests in the LIS is that unnecessary blood collections may be performed for cancelled tests. Therefore, it would be favorable if the requesting physician were at least alerted during order entry. Waldron et al.[28] implemented an MRI of 48 hours for CRP testing. The ordering provider was alerted; however, as blocking the order was not possible at the stage of order entry, the LIS rejected the test and provided an accompanying comment on the report. Requests within the MRI were only possible through direct consultation with a consultant microbiologist. Over one year, CRP requests dropped by 7.0%, and analyzed CRP tests decreased by 12.3%. The results of Larochelle et al.[17], who implemented a duplicate order pop-up warning for troponin re-orders within six hours as part of a multifaceted approach, are discussed in the next subsection.

Different outcomes are reported with regard to the reactions to such alerts. As previously mentioned, Lippi et al.[21] introduced pop-up alerts, with the possibility to override the rule, for biological implausibility as well as MRIs for 15 different tests at two university hospital wards. In the observational period of six months, 22% of the orders generated an alert, and 77% of these tests were cancelled. Lapić et al.[29] implemented MRIs for 53 tests for inpatients at a university hospital. The pop-up alert gave information about the inappropriateness, referred to the date as well as the status of the previous test request, and included the possibility to override the warning. In the observational period of one year, 106,780 orders, which accounted for 14.8% of all requests, violated the defined MRIs. The percentage of ignored alerts depended on the test, but for high-volume tests—including complete blood count, CRP, alanine aminotransferase (ALT), gamma-glutamyltransferase (GGT), and total bilirubin, which together accounted for 65% of alerts—the alert was ignored in > 85% of cases. Therefore, outcomes may depend on the clinical setting and may not be generalizable.

Moyer et al.[30] implemented MRIs for ionized calcium (iCa), magnesium (Mg), and N-terminal pro brain natriuretic peptide (NT-proBNP) for intensive care unit inpatients. The alert at the stage of order entry depended not only on the MRI but also on the previous results for iCa and Mg. The iCa alert was triggered if iCa was re-ordered within 24 hours and the previous iCa result was within the reference range. The pop-up alert informed the user about the date and result of the previous order, provided information about clinical situations in which iCa might still be indicated, and left the choice to cancel the request or to continue with the order to the user. In the latter case, an indication for the re-order had to be provided. Comparison of 90-day periods before and after the implementation of this DM strategy revealed a decrease in test numbers of between 28% for NT-proBNP and 48% for iCa. In a six-month period after the implementation, 6,110 alerts were triggered, with the majority for Mg (5,160). Overall, alerts were dismissed in 66% of the cases, again with the majority for Mg testing (88%). iCa and NT-proBNP were re-ordered in only 5% and 7% of cases, respectively. Regarding patient safety, the authors examined International Classification of Diseases, Ninth Revision (ICD-9) codes that may be associated with electrolyte disturbances. Despite the decline in electrolyte measurements, no increase in these ICD-9 codes was observed.
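
The iCa rule differs from a plain MRI in that it also inspects the previous result. A minimal sketch of such a result-aware re-order check is shown below; the reference limits and names are assumptions for illustration only.

```python
# Illustrative sketch of the result-aware re-order rule described by Moyer et
# al. for ionized calcium: an alert fires only if the test is re-ordered within
# 24 hours AND the previous result was within the reference range.
from datetime import datetime, timedelta

ICA_REFERENCE_RANGE = (1.12, 1.32)   # mmol/L, illustrative limits
ICA_MRI = timedelta(hours=24)

def ica_reorder_alert(previous_time: datetime, previous_result: float,
                      new_order_time: datetime) -> bool:
    """True if a pop-up alert should be shown for the new iCa order."""
    within_interval = new_order_time - previous_time < ICA_MRI
    previous_normal = ICA_REFERENCE_RANGE[0] <= previous_result <= ICA_REFERENCE_RANGE[1]
    return within_interval and previous_normal

# A re-order six hours after a normal result triggers the alert; the clinician
# may still proceed but is asked to document an indication.
print(ica_reorder_alert(datetime(2021, 5, 1, 8), 1.20, datetime(2021, 5, 1, 14)))
```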

Riley et al.[31] aimed to avoid duplicate genetic testing, as this is generally indicated only once in a patient’s lifetime. If the ordered test had already been performed, the ordering provider was informed about the date of the previous result. Repeated analyses could be ordered by phone only. Evaluation after the intervention revealed that 82% of repeated orders were justified because the previous order had yielded no result due to errors in the testing process. The authors mention that they subsequently adjusted the programming according to these results, but this was not evaluated in the study.

Revision of laboratory ordering forms and profiles

The position where tests are placed in the order entry system may affect the number of placed orders.[3] Furthermore, laboratory ordering profiles (LOPs), which are used to order a bundle of defined analytes with one click in the CPOE system, seem to be a source of overutilization; studies show that the number of orders drops after removing tests from such LOPs. An example provided by Michael Cornes describes a reduction of GGT orders by 82% after the test, included to assess liver function, was removed from the LOP.[4] Keppel et al.[32] retrospectively evaluated a DM strategy implemented to reduce unnecessary testing of the cardiac markers high-sensitivity troponin T (hsTropT) and NT-proBNP. This intervention was conducted in collaboration with clinicians at three wards of the Department of Cardiology, Clinic of Internal Medicine II, University Hospital Salzburg. The implementation started in one ward with an educational approach (see the later subsection on "Education"). Later, both cardiac markers were removed from all LOPs of the three wards, along with the distribution of information about the correct use of hsTropT and NT-proBNP in the form of guidelines and oral presentations. Despite the opportunity to order both tests without restrictions in the CPOE system, separately from the LOPs, monthly orders decreased by 66.1% and 75.8% for hsTropT and NT-proBNP, respectively, on all three wards. These results indicate that LOPs may indeed be a source of overutilization, since they are often not used correctly (e.g., for specific indications) but merely for convenience. Regarding patient safety, length of patient stay and the 30-day all-cause re-admission rate were evaluated as surrogate markers, without adverse outcomes.

Similarly, Larochelle et al.[17] removed cardiac markers from LOPs. While CK and CK-MB were entirely removed, troponin remained in two LOPs for evaluation of new symptoms, suggesting ACS. As indicated above, this DM strategy was implemented in a multifaceted approach. Altogether, the percentage of patients per month with guideline-concordant ordering of cardiac markers for ACS increased from 57.1% to 95.5%. Annually, ordered tests decreased by 16%, 87%, and 95% for troponin, CK, and CK-MB, respectively.

Along with educational sessions, audits, and feedback, Bartlett et al.[14] introduced a panel for CRP and ESR testing. CRP was preselected, and an explanation referred to the recommended indications for these tests. Overall, ESR as well as combined ESR/CRP testing were reduced by 33% and 25%, respectively, while the mean number of CRP tests remained unchanged. However, further examination of patients’ charts revealed that inappropriate ESR orders remained after the intervention.

Other studies focus on LOPs for specific indications or diagnoses. Delvaux et al.[33] conducted a randomized controlled trial among general practitioners (GPs). LOPs were created for 17 selected indications, based on available guidelines. In the intervention group, GPs received suggested analyses through the CPOE system after selecting an indication, and modifications were allowed before submitting the request. The control group also stated the indication of their orders but did not receive suggestions for test ordering. In the intervention group, the proportion of appropriate tests significantly increased by 0.21 for all tests. In the intervention arm, only 24 tests were ordered per panel, compared to 31 tests in the control arm. The evaluation of potentially delayed diagnoses revealed no difference between the groups. This is an example of how laboratory specialists may aid in test requesting and of how physicians are willing to accept their expert opinion.

Whiting et al.[12] aimed to standardize blood tests and introduced the possibility for primary care physicians to order “test groups” for monitoring patients with chronic diseases. Compared to previous habits, full blood counts (FBCs) and liver function tests (LFTs) were not required for this indication. The implementation comprised several PDSA cycles, educational sessions, and regular meetings for discussion and feedback. Requests per 1,000 patients significantly decreased by 14% for hemoglobin (as a surrogate for FBC requests) and by 22% for bilirubin (as a surrogate for LFT requests). Sodium, which was not affected by the DM strategy, and ALT values ≥ 120 IU/L, which were assessed as a marker of possible significant pathology, did not show significant changes. Therefore, the authors concluded that the measures may not lead to more missed diagnoses.

In conclusion, LOPs should be revised to suggest appropriate tests for specific indications or diagnoses[34] rather than for unspecific “routine” panels.

Removal of outdated tests

Apart from giving an alert for inappropriate orders, tests may also be entirely removed from the order entry system. One example within the publication of Mrazek et al.[4], provided by Ana-Maria Simundic, refers to a stepwise elimination of the CK-MB isoform. According to an expert consensus document, CK-MB measurement may not be necessary when a high-sensitivity troponin assay is available.[35]

Display costs

Some studies evaluated the effect of displaying costs during the order entry process. Horn et al.[36] selected 27 laboratory tests that yielded overall high costs, due either to the high price of a single analysis or to frequent ordering. During the intervention period, the costs were displayed to primary care physicians of a group practice (“intervention physicians”), while physicians of other group practices, who received no information about prices, served as controls. In addition, the intervention physicians were informed about the aim of the project via e-mail. The results showed that for five of the twenty-seven tests, the display of cost information was associated with a statistically significant reduction in monthly laboratory ordering rates.

Similarly, Feldman et al.[37] focused on laboratory tests that were either frequently ordered or expensive. In contrast to the above-mentioned study, this intervention was conducted in a tertiary care hospital, and 61 laboratory tests were randomized to having their costs displayed (“active” arm) or not (control arm). The ordering physicians were not actively informed about why fees were displayed. This intervention resulted in a 9.1% reduction of orders in the active arm, while orders of the control tests increased by 5.1%.

Silvestri et al.[38] conducted a similar study in an academic health system comprising three hospitals. The evaluation of laboratory orders before and after the implementation of cost display for 1,032 laboratory tests revealed a decreased likelihood that a patient had any of these tests ordered during an encounter. Even when tests were ordered, the proportion of requests on a given hospital day, as well as the number of tests ordered in one day, decreased. In addition, in-hospital mortality, which was assessed for patient safety, did not increase in the post-intervention period.

Overall, the interventional impact was rated as “modest” by the authors.[36][37][38] Furthermore, investigation of the appropriateness of test selection was not part of the study designs. Costs should never be the sole decision criterion for laboratory test ordering, not only because patient wellbeing should always be the number one priority of each physician, but also because laboratory costs contribute only up to 2.5% of overall healthcare costs while laboratory results inform the majority of medical decisions.[39] Therefore, cost reduction in the laboratory would have a minor impact on the total budget but a major impact on the quality of patient care. This opinion is supported by the survey of Horn et al.[36]: only a minority of surveyed clinicians stated that cost information frequently influenced their decisions. In general, cost control is endorsed by clinicians, but reductions in expenditure may also be achieved by implementing DM strategies that combat overutilization.[14][16][17][19][21][26][28][29][30][31]

Adding tests

Adding tests may be one attempt to prevent delayed or missed diagnoses. Salinas et al.[40] added calcium testing to orders from primary care patients older than 45 years of age and without a previous calcium test within the last three years. Using this approach, several cases of primary hyperparathyroidism could be detected.

Reflex and reflective testing

Another possibility of adding tests is through reflex or reflective testing. While reflex testing refers to the automated addition of tests according to a fixed algorithm within the LIS, reflective testing is the approach of adding tests and comments after the laboratory specialist has interpreted the results in synopsis with available clinical information.[41]

In general, reflex testing may be used to prevent over- as well as underutilization. For example, reflex testing may be suitable for the stepwise analysis of thyroid hormones, where thyroid-stimulating hormone (TSH) is the initial test, and subsequent analysis of free thyroid hormones should only be performed in the case of abnormal TSH results.[18][19][42] By implementing this reflex, subsequent tests cannot be missed (preventing underutilization), and clinicians do not have to order all tests at once, which would be a source of overutilization. With regard to patient safety, retrospective data analysis may help to identify appropriate cut-off points for reflex testing.[42] Furthermore, two studies address the important topic of quality improvement and balancing measures. In the study of Taher et al.[18], continuous data monitoring and collaboration with clinicians resulted in improvements of the TSH reflex algorithm through two PDSA cycles. Gilmour et al.[19] accompanied the two-stage process by a root cause analysis, feedback through a survey and baseline data analysis to evaluate potentially missed diagnoses after the implementation of the reflex system.
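
A TSH-gated reflex of this kind can be expressed as a single rule in the LIS. The following minimal sketch illustrates the principle; the reference interval and the choice of add-on tests are assumptions and do not reproduce the algorithms of the cited studies.

```python
# Minimal sketch of a TSH-gated reflex rule: free thyroid hormones are added
# only when TSH is abnormal, preventing both underutilization (the follow-up
# cannot be forgotten) and overutilization (fT4/fT3 are not ordered up front).
# The reference interval and the set of add-on tests are illustrative.
TSH_REFERENCE_RANGE = (0.4, 4.0)   # mIU/L, assumed adult interval

def thyroid_reflex(tsh_result: float) -> list[str]:
    """Return the tests the LIS should add automatically after a TSH result."""
    low, high = TSH_REFERENCE_RANGE
    if tsh_result < low:
        return ["fT4", "fT3"]   # suppressed TSH: assess for hyperthyroidism
    if tsh_result > high:
        return ["fT4"]          # elevated TSH: assess for hypothyroidism
    return []                    # normal TSH: no reflex testing

print(thyroid_reflex(9.2))   # -> ['fT4']
```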

Reflex and reflective approaches may also be combined, as described by Elnenaei et al.[43] for the early detection of pituitary dysfunction. Reflex rules for selected hormones were defined in collaboration with laboratory and endocrine physicians and implemented in the LIS. Lists of identified test results were then further evaluated by a laboratory physician, who decided if defined follow-up tests were indicated or not (reflective testing) based on previous results and available clinical information. In the case where added tests yielded abnormal results, the laboratory physicians would add an appropriate comment, including possible causes as well as a referral to an endocrine physician, if required.

Oosterhuis et al.[44] conducted a randomized controlled trial on reflective testing. The laboratory specialist performed reflective testing according to a routine procedure. Afterwards, it was randomly assigned whether the primary care physician received the added results or comments (intervention arm) or not (control arm). The evaluation of medical records revealed a significant positive outcome for patient management in the intervention arm.

In general, the evaluation of a questionnaire with clinical scenarios revealed that reflective testing is appreciated by physicians, but acceptance depends on the test and on associated ethical questions. For pregnancy tests or PSA, physicians wish to be consulted before the tests are added. Furthermore, after adding tests, it must be ensured that the results are not overlooked.[45] In a survey among patients attending a general practice surgery or hospital outpatient clinics, the majority endorsed the concept of reflective testing.[46]

Algorithms

Algorithms are an advancement of reflex and reflective testing; several concatenated if-then queries are addressed until a diagnostic decision is possible.[47] One practical example of such an algorithm is the PTT Advisor, a mobile application that helps to choose the appropriate follow-up tests in patients with a prolonged partial thromboplastin time (PTT) and normal prothrombin time.[48] However, an evaluation of such apps with regard to their impact on test ordering would be meaningful. Meyer et al.[49] propose an approach with patient vignettes that have to be solved either with the PTT Advisor or with usual clinical decision support. The results indicate a superiority of the PTT Advisor regarding test ordering and diagnostic decision-making. In addition, a questionnaire may be used to identify further fields of improvement for the mobile application.

Another way to implement diagnostic algorithms is to program the corresponding if-then rules directly into the LIS. Furundarena et al.[50] introduced a new possibility for physicians to order an “initial study of anemia.” The order starts with the analysis of a hemogram, and only upon detection of anemia are further tests added automatically in the LIS according to predefined rules. Since the ordering provider does not know beforehand whether the patient is anemic, follow-up tests are often ordered simultaneously with the hemogram. Therefore, many of these overused analyses were prevented with the algorithm, since anemia was present in only 20% of these requests.
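
The following minimal sketch illustrates how such an anemia algorithm might branch on the hemogram result before adding follow-up tests; the hemoglobin cut-offs and the follow-up panel are assumptions for illustration, not the published rule set.

```python
# Illustrative sketch of an "initial study of anemia" algorithm in the LIS,
# following the idea of Furundarena et al.: follow-up tests are added only when
# the hemogram shows anemia. Thresholds and the follow-up panel are assumptions.
def anemia_workup(hemoglobin_g_dl: float, sex: str) -> list[str]:
    """Decide which follow-up tests the LIS adds after the hemogram."""
    threshold = 13.0 if sex == "M" else 12.0     # WHO-style cut-offs, assumed
    if hemoglobin_g_dl >= threshold:
        return []                                # no anemia: no further tests
    # Anemia present: add a predefined follow-up panel automatically.
    return ["ferritin", "iron", "transferrin", "vitamin B12", "folate"]

print(anemia_workup(10.8, "F"))
```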

Defining these algorithms requires considerable knowledge and time, as they need to be based on current evidence and revised annually. Additionally, evidence may not be available for every step or may be contradictory; in these cases, expert opinions would be the method of choice. Information technology (IT) aids in constructing these algorithms, and constant improvement in the form of artificial intelligence (AI) solutions would be the next logical step.

Education

The impact of educational interventions in the form of a workshop for general practice trainees was assessed by Morgan et al.[15] In two of three clinical scenarios, inappropriate testing was reduced after the workshop; however, in one scenario appropriate testing even decreased. A distributed questionnaire revealed that the attitudes of the trainees had changed. The proportion of GP trainees believing that over-testing is a problem and that tests can harm patients increased after the workshop, while fewer stated that ordering more tests reassured them, that they felt pressured by patients, or that they believed they were less likely to be sued if more tests were ordered.

Other authors have used educational measures as the first step of a two-stage process. Gilmour et al.[19] aimed to reduce inappropriate free thyroxine (fT4) and free triiodothyronine (fT3) testing. In a first step, physicians were informed about the appropriate utilization of fT3/fT4 via oral presentations, emails, and postings. In a second step, an automated reflex system was implemented, adding fT4 in the case of an abnormal TSH. FT3 was only available by providing a clinical justification. The median numbers of weekly free thyroid hormones did not decrease significantly from baseline (90 and 39 for fT4 and fT3, respectively) compared to the period after the educational intervention (78 and 34 for fT4 and fT3, respectively). However, after the implementation of the reflex system, fT4 and fT3 testing could be reduced significantly to a weekly median of 59 and 14 for fT4 and fT3 tests, respectively. In the previously mentioned study of Keppel et al.[32], the first step in reducing cardiac markers was to inform the medical staff of one of the wards about the recommended indications for NT-proBNP and ask the ordering physicians to deselect NT-proBNP from LOPs where the order was not indicated, resulting in a reduction of 52.8% of respective orders. After further removal of NT-proBNP from LOPs, the overall decrease was 84.6% of NT-proBNP tests per month at this ward. These studies used a stepwise approach to show that education may reduce the test volume, but the effect may be more pronounced by an additional implementation of IT-based solutions.

The observation that education as a sole measure seems to be inferior to automated solutions was also made by our study group, which compared educational measures to an automated re-testing interval at two sites of a university hospital.[27] The implementation of the automated MRI for HbA1c in the LIS at the LKH was mentioned previously. In STV, only educational measures were realized, comprising oral presentations on the evidence-based use of HbA1c in daily meetings and posters reminding the medical staff of the 60-day re-testing interval. Compared to the baseline period before the intervention, HbA1c measurements dropped by 21.1% after the educational measures in STV. The decline from 7.4% to 3.6% of HbA1c measured within 60 days was significant, but less pronounced compared to the automated MRI implemented at the LKH.

On the one hand, as already mentioned above, educational measures—e.g., transfer of information orally in several meetings or in written form via pocket-sized cards or brochures—may accompany IT-based solutions to fill the gap in knowledge concerning appropriate utilization.[12][14][17] The effectiveness of education was not evaluated separately in these studies. On the other hand, IT-based solutions may also serve as a learning tool, as stated for the algorithm in the form of a mobile application.[49] Lippi et al.[21], who implemented alerts for biological implausibility and MRIs, observed that the number of requests violating the rules decreased. Likewise, Waldron et al.[28] attribute the decline of CRP requests after the implementation of the MRI to an altered behavior of ordering providers. However, we could not observe an educational side effect of the automated MRI for HbA1c, implemented in the form of a comment on the laboratory report. As already mentioned, measurements of HbA1c significantly decreased after the implementation of the automated MRI, but the number of orders remained nearly unchanged at the LKH.[27]

Discussion and conclusions

There is broad approval that laboratory DM approaches are useful for appropriate test utilization, and several tools are already in use.[51] However, there are still a number of challenges. Due to different outcome criteria and settings, results may not be generalizable or comparable, which is why DM approaches have to be adapted to local settings. Therefore, harmonization strategies would be desirable. However, a survey conducted by the European Federation of Clinical Chemistry and Laboratory Medicine (EFLM) Working Group on Harmonization of the total testing process (WG-H) among national society members of the EFLM revealed that existing harmonization activities are not coordinated. MRIs are one example mentioned, for which the EFLM WG-H wants to start initiatives to produce official documents in European countries.[52] In addition, the third EFLM Strategic Conference under the chair of EFLM president Ana-Maria Simundic was planned to focus on DM only and to generate several task-and-finish groups that would lead the profession in this direction. Sadly, this conference had to be postponed due to the COVID-19 pandemic. We believe that the topics of harmonization as well as DM are recognized by laboratory specialists and that progress will be made over the coming decade.

As mentioned above, another challenge is that inappropriate orders remain.[14] One possibility for achieving appropriate test selection may be to conduct a health technology assessment prior to test implementation. Landaas et al.[53] describe an approach whereby the Laboratory Formulary Committee, comprising professionals from different medical disciplines, and the Smart Innovation staff of the local hospital analyzed a new molecular bladder cancer test according to a locally implemented health technology assessment program. In conclusion, the committee did not support system-wide use but decided to start a small pilot study, the results of which indicate that the test could have benefits for selected patients.

However, these evidence-based assessments and further recommendations proposed for successful implementation, such as the selection of quality indicators for monitoring and improvement as well as ensuring regular updates, are time-consuming.[54] We believe that AI solutions are the next logical step, aiding in the development as well as the improvement of DM strategies, as they could help to manage large data sets. The synopsis of results from laboratory medicine, diagnostic imaging, and pathology is necessary for the purpose of integrated diagnostics. Furthermore, the patient’s history, comorbidities, symptoms, and treatments have to be taken into account for correct interpretation.[55] Currently, few published articles deal with the issue of applying AI algorithms to laboratory test selection. Islam et al.[56][57] have published two such studies, one of which appears in this issue, in which they developed a deep learning algorithm based on retrospective patient data to predict appropriate laboratory tests. Xu et al.[58] aimed to identify superfluous tests in existing lab orders by estimating normal test results within a retrospective dataset. Machine learning (ML) models may also be used to identify prognostic factors. Tseng et al.[59] incorporated clinical, pathological, and cancer-related gene features of patients with advanced oral cancer and found that only 6 of the 44 genes analyzed were necessary for further prognostic risk stratification. Therefore, costs and resources for molecular analysis could be reduced with targeted requests. The MRIs mentioned above are implemented as pre-defined alerts, which can create a substantial alert burden. Addressing this challenge, Baron et al.[60] describe an approach in which logistic regression models may be used to predict whether alerts will be accepted or overruled. The aim is to reduce the alert burden for the ordering clinician by showing only alerts that have a high probability of being accepted. However, not all questions can be solved with AI. For example, using serum tumor markers alone for cancer screening may currently not be recommended, even if data were retrospectively evaluated using various ML models.[61]
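
As an illustration of the alert-triage idea attributed to Baron et al., the following sketch trains a logistic regression model on past alert outcomes and suppresses alerts with a low predicted acceptance probability; it assumes scikit-learn, and the features, toy data, and threshold are purely illustrative.

```python
# Minimal sketch, assuming scikit-learn, of predicting alert acceptance:
# train on past alert outcomes (accepted vs. overruled) and show a new alert
# only if its predicted acceptance probability exceeds a threshold.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Toy training data: [hours since last result, previous result abnormal (0/1),
# ordering location is ICU (0/1)]; label 1 = alert was accepted (order cancelled).
X = np.array([[2, 0, 1], [30, 1, 0], [4, 0, 0], [48, 1, 1],
              [1, 0, 1], [20, 0, 0], [3, 1, 1], [60, 1, 0]])
y = np.array([1, 0, 1, 0, 1, 0, 0, 0])

model = LogisticRegression().fit(X, y)

def show_alert(features: list[float], threshold: float = 0.7) -> bool:
    """Suppress alerts that are unlikely to be accepted, reducing alert burden."""
    p_accept = model.predict_proba(np.array([features]))[0, 1]
    return p_accept >= threshold

print(show_alert([2, 0, 1]))
```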

Furthermore, it has to be acknowledged that AI is only a tool of assistance.[57] A combination of computerized and physician-guided processes may be better than either one on its own. Wang et al.[62] supported this notion when evaluating the efficiency of a deep learning system and experienced pathologists in detecting breast cancer cells: AUROC values were 0.925 for the former and 0.966 for the latter, but 0.995 when combined. Therefore, AI solutions may complement the recommended collaborations with clinicians for successful implementation.[54] Intensifying collaborations should be a feasible task, since a survey indicates that interest from both professions exists.[51] An advantage of complementary AI solutions would be that these systems, fed with unfiltered patient data, may be capable of finding completely new diagnostic strategies that humans have not yet thought of. For example, Lien et al.[63] compared different ML models concerning the prediction of the two-day mortality of thrombocytopenic patients on the basis of hematological tests only.

In conclusion, the implementation of DM tools by laboratory specialists in collaboration with clinicians is increasing, and the incorporation of AI solutions has been emerging in recent years. We believe that these solutions will help us to overcome technical barriers, a lack of harmonization, and other challenges.

Acknowledgements

Author contributions

Conceptualization, C.M. and J.C.; writing—original draft preparation, C.M.; writing—review and editing, C.M., E.H.-B., T.K.F., M.H.K., H.O. and J.C.; visualization, C.M.; supervision, J.C.; project administration, C.M. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Conflicts of interest

The authors declare no conflict of interest.

References

  1. Whiting, Penny; Toerien, Merran; de Salis, Isabel; Sterne, Jonathan A.C.; Dieppe, Paul; Egger, Matthias; Fahey, Tom (1 October 2007). "A review identifies and classifies reasons for ordering diagnostic tests" (in en). Journal of Clinical Epidemiology 60 (10): 981–989. doi:10.1016/j.jclinepi.2007.01.012. https://linkinghub.elsevier.com/retrieve/pii/S0895435607000820. 
  2. Fryer, Anthony A; Hanna, Fahmy W (1 November 2009). "Managing demand for pathology tests: financial imperative or duty of care?" (in en). Annals of Clinical Biochemistry: International Journal of Laboratory Medicine 46 (6): 435–437. doi:10.1258/acb.2009.009186. ISSN 0004-5632. http://journals.sagepub.com/doi/10.1258/acb.2009.009186. 
  3. 3.0 3.1 Blumberg, Gari; Kitai, Eliezer; Vinker, Shlomo; Golan-Cohen, Avivit (1 June 2019). "Changing electronic formats is associated with changes in number of laboratory tests ordered". The American Journal of Managed Care 25 (6): e179–e181. ISSN 1936-2692. PMID 31211550. https://pubmed.ncbi.nlm.nih.gov/31211550. 
  4. 4.0 4.1 4.2 4.3 Mrazek, Cornelia; Simundic, Ana-Maria; Salinas, Maria; von Meyer, Alexander; Cornes, Michael; Bauçà, Josep Miquel; Nybo, Mads; Lippi, Giuseppe et al. (1 June 2020). "Inappropriate use of laboratory tests: How availability triggers demand – Examples across Europe" (in en). Clinica Chimica Acta 505: 100–107. doi:10.1016/j.cca.2020.02.017. https://linkinghub.elsevier.com/retrieve/pii/S0009898120300723. 
  5. 5.0 5.1 5.2 5.3 Zhi, Ming; Ding, Eric L.; Theisen-Toupal, Jesse; Whelan, Julia; Arnaout, Ramy (15 November 2013). Szecsi, Pal Bela. ed. "The Landscape of Inappropriate Laboratory Testing: A 15-Year Meta-Analysis" (in en). PLoS ONE 8 (11): e78962. doi:10.1371/journal.pone.0078962. ISSN 1932-6203. PMC PMC3829815. PMID 24260139. https://dx.plos.org/10.1371/journal.pone.0078962. 
  6. Cadamuro, Janne; Gaksch, Martin; Wiedemann, Helmut; Lippi, Giuseppe; von Meyer, Alexander; Pertersmann, Astrid; Auer, Simon; Mrazek, Cornelia et al. (1 April 2018). "Are laboratory tests always needed? Frequency and causes of laboratory overuse in a hospital setting" (in en). Clinical Biochemistry 54: 85–91. doi:10.1016/j.clinbiochem.2018.01.024. https://linkinghub.elsevier.com/retrieve/pii/S0009912017312274. 
  7. Miyakis, S.; Karamanof, G.; Liontos, M.; Mountokalakis, T. D (1 December 2006). "Factors contributing to inappropriate ordering of tests in an academic medical department and the effect of an educational feedback strategy" (in en). Postgraduate Medical Journal 82 (974): 823–829. doi:10.1136/pgmj.2006.049551. ISSN 0032-5473. PMC PMC2653931. PMID 17148707. https://pmj.bmj.com/lookup/doi/10.1136/pgmj.2006.049551. 
  8. Gandhi, Tejal K.; Kachalia, Allen; Thomas, Eric J.; Puopolo, Ann Louise; Yoon, Catherine; Brennan, Troyen A.; Studdert, David M. (3 October 2006). "Missed and Delayed Diagnoses in the Ambulatory Setting: A Study of Closed Malpractice Claims" (in en). Annals of Internal Medicine 145 (7): 488–96. doi:10.7326/0003-4819-145-7-200610030-00006. ISSN 0003-4819. http://annals.org/article.aspx?doi=10.7326/0003-4819-145-7-200610030-00006. 
  9. Kachalia, Allen; Gandhi, Tejal K.; Puopolo, Ann Louise; Yoon, Catherine; Thomas, Eric J.; Griffey, Richard; Brennan, Troyen A.; Studdert, David M. (1 February 2007). "Missed and delayed diagnoses in the emergency department: a study of closed malpractice claims from 4 liability insurers". Annals of Emergency Medicine 49 (2): 196–205. doi:10.1016/j.annemergmed.2006.06.035. ISSN 1097-6760. PMID 16997424. https://pubmed.ncbi.nlm.nih.gov/16997424. 
  10. Sarkar, Mayukh K.; Botz, Chad M.; Laposata, Michael (1 March 2017). "An assessment of overutilization and underutilization of laboratory tests by expert physicians in the evaluation of patients for bleeding and thrombotic disorders in clinical context and in real time". Diagnosis (Berlin, Germany) 4 (1): 21–26. doi:10.1515/dx-2016-0042. ISSN 2194-802X. PMID 29536907. https://pubmed.ncbi.nlm.nih.gov/29536907. 
  11. Cornes, Michael (15 June 2017). "Case report of unexplained hypocalcaemia in a slightly haemolysed sample" (in en). Biochemia Medica 27 (2): 426–429. doi:10.11613/BM.2017.046. ISSN 1330-0962. PMC PMC5493164. PMID 28694734. http://www.biochemia-medica.com/en/journal/27/2/10.11613/BM.2017.046. 
  12. 12.0 12.1 12.2 12.3 12.4 Whiting, Darunee; Croker, Richard; Watson, Jessica; Brogan, Andy; Walker, Alex J; Lewis, Tom (1 March 2019). "Optimising laboratory monitoring of chronic conditions in primary care: a quality improvement framework" (in en). BMJ Open Quality 8 (1): e000349. doi:10.1136/bmjoq-2018-000349. ISSN 2399-6641. PMC PMC6440689. PMID 30997410. https://qir.bmj.com/lookup/doi/10.1136/bmjoq-2018-000349. 
  13. Vrijsen, B.E.L.; Naaktgeboren, C.A.; Vos, L.M.; van Solinge, W.W.; Kaasjager, H.A.H.; ten Berg, M.J. (1 March 2020). "Inappropriate laboratory testing in internal medicine inpatients: Prevalence, causes and interventions" (in en). Annals of Medicine and Surgery 51: 48–53. doi:10.1016/j.amsu.2020.02.002. PMC PMC7021522. PMID 32082564. https://linkinghub.elsevier.com/retrieve/pii/S2049080120300157. 
  14. 14.0 14.1 14.2 14.3 14.4 14.5 14.6 14.7 Bartlett, Kristen J; Vo, Ann P; Rueckert, Justin; Wojewoda, Christina; Steckel, Elizabeth H; Stinnett-Donnelly, Justin; Repp, Allen B (1 February 2020). "Promoting appropriate utilisation of laboratory tests for inflammation at an academic medical centre" (in en). BMJ Open Quality 9 (1): e000788. doi:10.1136/bmjoq-2019-000788. ISSN 2399-6641. PMC PMC7047503. PMID 32098777. https://qir.bmj.com/lookup/doi/10.1136/bmjoq-2019-000788. 
  15. 15.0 15.1 Morgan, Simon; Morgan, Andy; Kerr, Rohan; Tapley, Amanda; Magin, Parker (1 September 2016). "Test ordering by GP trainees: Effects of an educational intervention on attitudes and intended practice". Canadian Family Physician Medecin De Famille Canadien 62 (9): 733–741. ISSN 1715-5258. PMC 5023346. PMID 27629671. https://pubmed.ncbi.nlm.nih.gov/27629671. 
  16. 16.0 16.1 16.2 Juskewitch, Justin E.; Norgan, Andrew P.; Johnson, Ryan D.; Trivedi, Vipul A.; Hanson, Curtis A.; Block, Darci R. (1 April 2019). "Impact of an electronic decision support rule on ESR/CRP co-ordering rates in a community health system and projected impact in the tertiary care setting and a commercially insured population" (in en). Clinical Biochemistry 66: 13–20. doi:10.1016/j.clinbiochem.2019.01.009. https://linkinghub.elsevier.com/retrieve/pii/S0009912018311652. 
  17. 17.0 17.1 17.2 17.3 17.4 17.5 17.6 Larochelle, Marc R.; Knight, Amy M.; Pantle, Hardin; Riedel, Stefan; Trost, Jeffrey C. (1 November 2014). "Reducing Excess Cardiac Biomarker Testing at an Academic Medical Center" (in en). Journal of General Internal Medicine 29 (11): 1468–1474. doi:10.1007/s11606-014-2919-5. ISSN 0884-8734. PMC PMC4238205. PMID 24973056. http://link.springer.com/10.1007/s11606-014-2919-5. 
  18. 18.0 18.1 18.2 18.3 Taher, Jennifer; Beriault, Daniel R.; Yip, Drake; Tahir, Shafqat; Hicks, Lisa K.; Gilmour, Julie A. (1 July 2020). "Reducing free thyroid hormone testing through multiple Plan-Do-Study-Act cycles" (in en). Clinical Biochemistry 81: 41–46. doi:10.1016/j.clinbiochem.2020.05.004. https://linkinghub.elsevier.com/retrieve/pii/S0009912020303106. 
  19. 19.0 19.1 19.2 19.3 19.4 19.5 Gilmour, Julie A.; Weisman, Alanna; Orlov, Steven; Goldberg, Robert J.; Goldberg, Alyse; Baranek, Hayley; Mukerji, Geetha (1 June 2017). "Promoting resource stewardship: Reducing inappropriate free thyroid hormone testing" (in en). Journal of Evaluation in Clinical Practice 23 (3): 670–675. doi:10.1111/jep.12698. https://onlinelibrary.wiley.com/doi/10.1111/jep.12698. 
  20. Mostofian, Fargoi; Ruban, Cynthiya; Simunovic, Nicole; Bhandari, Mohit (1 January 2015). "Changing physician behavior: what works?". The American Journal of Managed Care 21 (1): 75–84. ISSN 1936-2692. PMID 25880152. https://pubmed.ncbi.nlm.nih.gov/25880152. 
  21. 21.0 21.1 21.2 21.3 Lippi, Giuseppe; Brambilla, Marco; Bonelli, Patrizia; Aloe, Rosalia; Balestrino, Antonio; Nardelli, Anna; Ceda, Gian Paolo; Fabi, Massimo (1 November 2015). "Effectiveness of a computerized alert system based on re-testing intervals for limiting the inappropriateness of laboratory test requests". Clinical Biochemistry 48 (16-17): 1174–1176. doi:10.1016/j.clinbiochem.2015.06.006. ISSN 1873-2933. PMID 26074445. https://pubmed.ncbi.nlm.nih.gov/26074445. 
  22. Parkhurst, Emily; Calonico, Elise; Noh, Grace (31 July 2020). "Medical Decision Support to Reduce Unwarranted Methylene Tetrahydrofolate Reductase (MTHFR) Genetic Testing". Journal of Medical Systems 44 (9): 152. doi:10.1007/s10916-020-01615-5. ISSN 1573-689X. PMID 32737598. https://pubmed.ncbi.nlm.nih.gov/32737598. 
  23. Cadamuro, Janne; Mrazek, Cornelia; Wiedemann, Helmut; Felder, Thomas Klaus; Oberkofler, Hannes; Haschke-Becher, Elisabeth; Lippi, Giuseppe (1 September 2017). "Effectiveness of a Laboratory Gate-Keeping Strategy to Overcome Inappropriate Test Utilization for the Diagnosis of Heparin-Induced Thrombocytopenia". Seminars in Thrombosis and Hemostasis 43 (6): 645–648. doi:10.1055/s-0037-1604054. ISSN 1098-9064. PMID 28750423. https://pubmed.ncbi.nlm.nih.gov/28750423. 
  24. Cuker, Adam; Arepally, Gowthami M.; Chong, Beng H.; Cines, Douglas B.; Greinacher, Andreas; Gruel, Yves; Linkins, Lori A.; Rodner, Stephen B. et al. (27 November 2018). "American Society of Hematology 2018 guidelines for management of venous thromboembolism: heparin-induced thrombocytopenia" (in en). Blood Advances 2 (22): 3360–3392. doi:10.1182/bloodadvances.2018024489. ISSN 2473-9529. PMC PMC6258919. PMID 30482768. https://ashpublications.org/bloodadvances/article/2/22/3360/16129/American-Society-of-Hematology-2018-guidelines-for. 
  25. 25.0 25.1 Lang, T.; Croal, B. (March 2021). "National minimum retesting intervals in pathology" (PDF). Royal College of Pathologists. pp. 1–80. https://www.acb.org.uk/resource/g147-national-minimum-retesting-intervals-in-pathology.html. 
  26. 26.0 26.1 Salinas, Maria; López-Garrigós, Maite; Flores, Emilio; Blasco, Alvaro; Leiva-Salinas, Carlos (25 October 2020). "Successful implementations of automated minimum re-test intervals to overcome ferritin over-requesting in a Spanish hospital laboratory" (in en). Clinical Chemistry and Laboratory Medicine (CCLM) 58 (11): e287–e289. doi:10.1515/cclm-2020-0668. ISSN 1434-6621. https://www.degruyter.com/document/doi/10.1515/cclm-2020-0668/html. 
  27. 27.0 27.1 27.2 Mrazek, Cornelia; Stechemesser, Lars; Haschke-Becher, Elisabeth; Hölzl, Bertram; Paulweber, Bernhard; Keppel, Martin H.; Simundic, Ana-Maria; Oberkofler, Hannes et al. (1 June 2020). "Reducing the probability of falsely elevated HbA1c results in diabetic patients by applying automated and educative HbA1c re-testing intervals" (in en). Clinical Biochemistry 80: 14–18. doi:10.1016/j.clinbiochem.2020.03.014. https://linkinghub.elsevier.com/retrieve/pii/S0009912019314018. 
  28. 28.0 28.1 28.2 Waldron, Jenna L; Ford, Clare; Dobie, Donald; Danks, Graham; Humphrey, Richard; Rolli, Alain; Gama, Rousseau (1 August 2014). "An automated minimum retest interval rejection rule reduces repeat CRP workload and expenditure, and influences clinician-requesting behaviour" (in en). Journal of Clinical Pathology 67 (8): 731–733. doi:10.1136/jclinpath-2014-202256. ISSN 0021-9746. http://jcp.bmj.com/lookup/doi/10.1136/jclinpath-2014-202256. 
  29. 29.0 29.1 Lapić, Ivana; Rogić, Dunja; Fuček, Mirjana; Galović, Ružica (15 October 2019). "Effectiveness of minimum retesting intervals in managing repetitive laboratory testing: experience from a Croatian university hospital". Biochemia medica 29 (3): 531–558. doi:10.11613/BM.2019.030705. PMC PMC6784426. PMID 31624458. https://www.biochemia-medica.com/en/journal/29/3/10.11613/BM.2019.030705. 
  30. 30.0 30.1 Moyer, Ann M.; Saenger, Amy K.; Willrich, Maria; Donato, Leslie J.; Baumann, Nikola A.; Block, Darci R.; Botz, Chad M.; Khan, Munawwar A. et al. (1 June 2016). "Implementation of Clinical Decision Support Rules to Reduce Repeat Measurement of Serum Ionized Calcium, Serum Magnesium, and N-Terminal Pro-B-Type Natriuretic Peptide in Intensive Care Unit Inpatients". Clinical Chemistry 62 (6): 824–830. doi:10.1373/clinchem.2015.250514. ISSN 1530-8561. PMID 27022069. https://pubmed.ncbi.nlm.nih.gov/27022069. 
  31. 31.0 31.1 Riley, Jacquelyn D; Stanley, Glenn; Wyllie, Robert; Burt, Holly L; Horwitz, Sandra B; Cooper, Donna D; Procop, Gary W (29 October 2019). "An Electronic Strategy for Eliminating Unnecessary Duplicate Genetic Testing" (in en). American Journal of Clinical Pathology 153 (3): 328–332. doi:10.1093/ajcp/aqz163. ISSN 0002-9173. https://academic.oup.com/ajcp/advance-article/doi/10.1093/ajcp/aqz163/5608999. 
  32. 32.0 32.1 Keppel, Martin H.; Kolbitsch, Tobias; Hoppe, Uta C.; Auer, Simon; Felder, Thomas K.; Oberkofler, Hannes; Mrazek, Cornelia; Haschke-Becher, Elisabeth et al. (27 August 2020). "The clinically effective use of cardiac markers by restructuring laboratory profiles at Cardiology wards". Clinical Chemistry and Laboratory Medicine 58 (9): 1565–1571. doi:10.1515/cclm-2019-1229. ISSN 1437-4331. PMID 32305953. https://pubmed.ncbi.nlm.nih.gov/32305953. 
  33. Delvaux, Nicolas; Piessens, Veerle; Burghgraeve, Tine De; Mamouris, Pavlos; Vaes, Bert; Stichele, Robert Vander; Cloetens, Hanne; Thomas, Josse et al. (4 November 2020). "Clinical decision support improves the appropriateness of laboratory test ordering in primary care without increasing diagnostic error: the ELMO cluster randomized trial". Implementation science: IS 15 (1): 100. doi:10.1186/s13012-020-01059-y. ISSN 1748-5908. PMC 7640389. PMID 33148311. https://pubmed.ncbi.nlm.nih.gov/33148311. 
  34. Smellie, W. S. A.; Association for Clinical Biochemistry’s Clinical Practice Section (20 March 2012). "Time to harmonise common laboratory test profiles". BMJ (Clinical research ed.) 344: e1169. doi:10.1136/bmj.e1169. ISSN 1756-1833. PMID 22434088. https://pubmed.ncbi.nlm.nih.gov/22434088. 
  35. Thygesen, Kristian; Alpert, Joseph S; Jaffe, Allan S; Chaitman, Bernard R; Bax, Jeroen J; Morrow, David A; White, Harvey D; ESC Scientific Document Group et al. (14 January 2019). "Fourth universal definition of myocardial infarction (2018)" (in en). European Heart Journal 40 (3): 237–269. doi:10.1093/eurheartj/ehy462. ISSN 0195-668X. https://academic.oup.com/eurheartj/article/40/3/237/5079081. 
  36. 36.0 36.1 36.2 Horn, Daniel M.; Koplan, Kate E.; Senese, Margaret D.; Orav, E. John; Sequist, Thomas D. (1 May 2014). "The Impact of Cost Displays on Primary Care Physician Laboratory Test Ordering" (in en). Journal of General Internal Medicine 29 (5): 708–714. doi:10.1007/s11606-013-2672-1. ISSN 0884-8734. PMC PMC4000348. PMID 24257964. http://link.springer.com/10.1007/s11606-013-2672-1. 
  37. 37.0 37.1 Feldman, Leonard S.; Shihab, Hasan M.; Thiemann, David; Yeh, Hsin-Chieh; Ardolino, Margaret; Mandell, Steven; Brotman, Daniel J. (27 May 2013). "Impact of Providing Fee Data on Laboratory Test Ordering: A Controlled Clinical Trial" (in en). JAMA Internal Medicine 173 (10): 903–908. doi:10.1001/jamainternmed.2013.232. ISSN 2168-6106. http://archinte.jamanetwork.com/article.aspx?doi=10.1001/jamainternmed.2013.232. 
  38. 38.0 38.1 Silvestri, Mark T.; Xu, Xiao; Long, Theodore; Bongiovanni, Tasce; Bernstein, Steven L.; Chaudhry, Sarwat I.; Silvestri, Julia I.; Stolar, Marilyn et al. (1 August 2018). "Impact of Cost Display on Ordering Patterns for Hospital Laboratory and Imaging Services" (in en). Journal of General Internal Medicine 33 (8): 1268–1275. doi:10.1007/s11606-018-4495-6. ISSN 0884-8734. PMC PMC6082197. PMID 29845468. http://link.springer.com/10.1007/s11606-018-4495-6. 
  39. Lippi, Giuseppe; Plebani, Mario (25 February 2019). "Cost, profitability and value of laboratory diagnostics: in God we trust, all others bring data". Journal of Laboratory Medicine 43 (1): 1–3. doi:10.1515/labmed-2018-0321. ISSN 2567-9449. https://www.degruyter.com/document/doi/10.1515/labmed-2018-0321/html. 
  40. Salinas, Maria; López-Garrigós, Maite; Pomares, Francisco; Lugo, Javier; Asencio, Alberto; López-Penabad, Luis; Dominguez, Jose Ramón; Leiva-Salinas, Carlos (1 September 2013). "Serum calcium (S-Ca), the forgotten test: Preliminary results of an appropriateness strategy to detect primary hyperparathyroidism (pHPT)" (in en). Bone 56 (1): 73–76. doi:10.1016/j.bone.2013.05.011. https://linkinghub.elsevier.com/retrieve/pii/S8756328213001932. 
  41. Venne, Wilhelmine P. H. G. Verboeket-van de; Aakre, Kristin M.; Watine, Joseph; Oosterhuis, Wytze P. (1 July 2012). "Reflective testing: adding value to laboratory testing" (in en). Clinical Chemistry and Laboratory Medicine 50 (7): 1249–1252. doi:10.1515/cclm-2011-0611. ISSN 1437-4331. https://www.degruyter.com/document/doi/10.1515/cclm-2011-0611/html. 
  42. 42.0 42.1 Gill, Jasmine; Barakauskas, Vilte E.; Thomas, Dylan; Rodriguez-Capote, Karina; Higgins, Trefor; Zhang, Don; VanSpronsen, Amanda; Babenko, Oksana et al. (26 October 2017). "Evaluation of thyroid test utilization through analysis of population-level data". Clinical Chemistry and Laboratory Medicine 55 (12): 1898–1906. doi:10.1515/cclm-2016-1049. ISSN 1437-4331. PMID 28306523. https://pubmed.ncbi.nlm.nih.gov/28306523. 
  43. Elnenaei, Manal; Minney, Derek; Clarke, David B.; Kumar-Misir, Andrew; Imran, Syed Ali (1 April 2018). "Reflex and reflective testing strategies for early detection of pituitary dysfunction". Clinical Biochemistry 54: 78–84. doi:10.1016/j.clinbiochem.2018.02.014. ISSN 1873-2933. PMID 29486187. https://pubmed.ncbi.nlm.nih.gov/29486187. 
  44. Oosterhuis, Wytze P.; Venne, Wilhelmine Phg Verboeket-van de; Deursen, Cees Tbm van; Stoffers, Henri Ejh; Acker, Bernadette Ac van; Bossuyt, Patrick Mm (1 March 2021). "Reflective testing - A randomized controlled trial in primary care patients". Annals of Clinical Biochemistry 58 (2): 78–85. doi:10.1177/0004563220968373. ISSN 1758-1001. PMID 33040573. https://pubmed.ncbi.nlm.nih.gov/33040573. 
  45. Darby, Denise; Kelly, Anne-Marie (1 September 2006). "Reflective testing - what do our service users think?" (in en). Annals of Clinical Biochemistry: International Journal of Laboratory Medicine 43 (5): 361–368. doi:10.1258/000456306778520016. ISSN 0004-5632. http://journals.sagepub.com/doi/10.1258/000456306778520016. 
  46. Paterson, Sarah G; Robson, Jean E; McMahon, Michael J; Baxter, Gwen; Murphy, Michael J; Paterson, John R (1 September 2006). "Reflective testing: what do patients think?" (in en). Annals of Clinical Biochemistry: International Journal of Laboratory Medicine 43 (5): 369–371. doi:10.1258/000456306778520098. ISSN 0004-5632. http://journals.sagepub.com/doi/10.1258/000456306778520098. 
  47. Hoffmann, Georg; Aufenanger, Johannes; Födinger, Manuela; Cadamuro, Janne; von Eckardstein, Arnold; Kaeslin-Meyer, Martha; Hofmann, Walter (1 December 2014). "Benefits and limitations of laboratory diagnostic pathways". Diagnosis (Berlin, Germany) 1 (4): 269–276. doi:10.1515/dx-2014-0045. ISSN 2194-802X. PMID 29540006. https://pubmed.ncbi.nlm.nih.gov/29540006. 
  48. Savel, Thomas G.; Lee, Brian A.; Ledbetter, Gregory S.; Brown, Sara E.; Taylor, Julie; Thompson, Pamela J. (1 June 2013). "PTT Advisor: A CDC-supported initiative to develop a mobile clinical laboratory decision support application for the iOS platform". Online Journal of Public Health Informatics 5 (2): 1–9. doi:10.5210/ojphi.v5i2.4363. http://journals.uic.edu/ojs/index.php/ojphi/article/view/4363. 
  49. Meyer, Ashley N. D.; Thompson, Pamela J.; Khanna, Arushi; Desai, Samir; Mathews, Benji K.; Yousef, Elham; Kusnoor, Anita V.; Singh, Hardeep (1 July 2018). "Evaluating a mobile application for improving clinical laboratory test ordering and diagnosis" (in en). Journal of the American Medical Informatics Association 25 (7): 841–847. doi:10.1093/jamia/ocy026. ISSN 1067-5027. PMC 6947660. PMID 29688391. https://academic.oup.com/jamia/article/25/7/841/4980803. 
  50. Furundarena, J. R.; Uranga, Alasne; González, Carmen; Martínez, Bruno; Iriondo, June; Ondarra, Laida; Arambarri, Amaia; San Vicente, Ricardo et al. (24 November 2020). "Initial study of anaemia profile for primary care centres with automated laboratory algorithms reduces the demand for ferritin, iron, transferrin, vitamin B12 and folate tests" (in en). Journal of Clinical Pathology: jclinpath–2020–207130. doi:10.1136/jclinpath-2020-207130. ISSN 0021-9746. https://jcp.bmj.com/lookup/doi/10.1136/jclinpath-2020-207130. 
  51. Ibarz, Mercedes; Cadamuro, Janne; Sumarac, Zorica; Guimaraes, Joao Tiago; Kovalevskaya, Svetlana; Nybo, Mads; Cornes, Michael P.; Vermeersch, Pieter et al. (23 February 2021). "Clinicians’ and laboratory medicine specialists’ views on laboratory demand management: a survey in nine European countries" (in en). Diagnosis 8 (1): 111–119. doi:10.1515/dx-2019-0081. ISSN 2194-802X. https://www.degruyter.com/document/doi/10.1515/dx-2019-0081/html. 
  52. Ceriotti, Ferruccio; Barhanovic, Najdana Gligorovic; Kostovska, Irena; Kotaska, Karel; Perich Alsina, Maria Carmen (1 January 2016). "Harmonisation of the laboratory testing process: need for a coordinated approach". Clinical Chemistry and Laboratory Medicine (CCLM) 54 (12): e361–e363. doi:10.1515/cclm-2016-0244. ISSN 1437-4331. https://www.degruyter.com/document/doi/10.1515/cclm-2016-0244/html. 
  53. Landaas, Erik J.; Eckel, Ashley M.; Wright, Jonathan L.; Baird, Geoffrey S.; Hansen, Ryan N.; Sullivan, Sean D. (1 January 2020). "Application of Health Technology Assessment (HTA) to Evaluate New Laboratory Tests in a Health System: A Case Study of Bladder Cancer Testing" (in en). Academic Pathology 7: 237428952096822. doi:10.1177/2374289520968225. ISSN 2374-2895. PMC 7656863. PMID 33225061. http://journals.sagepub.com/doi/10.1177/2374289520968225. 
  54. Cadamuro, Janne; Ibarz, Mercedes; Cornes, Michael; Nybo, Mads; Haschke-Becher, Elisabeth; von Meyer, Alexander; Lippi, Giuseppe; Simundic, Ana-Maria (26 March 2019). "Managing inappropriate utilization of laboratory resources". Diagnosis (Berlin, Germany) 6 (1): 5–13. doi:10.1515/dx-2018-0029. ISSN 2194-802X. PMID 30096052. https://pubmed.ncbi.nlm.nih.gov/30096052. 
  55. Lippi, Giuseppe; Plebani, Mario (15 February 2020). "Integrated diagnostics: the future of laboratory medicine?". Biochemia Medica 30 (1): 010501. doi:10.11613/BM.2020.010501. ISSN 1846-7482. PMC 6904966. PMID 31839719. https://pubmed.ncbi.nlm.nih.gov/31839719. 
  56. Islam, Md Mohaimenul; Yang, Hsuan-Chia; Poly, Tahmina Nasrin; Li, Yu-Chuan Jack (18 November 2020). "Development of an Artificial Intelligence-Based Automated Recommendation System for Clinical Laboratory Tests: Retrospective Analysis of the National Health Insurance Database". JMIR Medical Informatics 8 (11): e24163. doi:10.2196/24163. ISSN 2291-9694. PMC 7710445. PMID 33206057. https://pubmed.ncbi.nlm.nih.gov/33206057. 
  57. Islam, Md. Mohaimenul; Poly, Tahmina Nasrin; Yang, Hsuan-Chia; Li, Yu-Chuan (Jack) (29 May 2021). "Deep into Laboratory: An Artificial Intelligence Approach to Recommend Laboratory Tests" (in en). Diagnostics 11 (6): 990. doi:10.3390/diagnostics11060990. ISSN 2075-4418. PMC 8227070. PMID 34072571. https://www.mdpi.com/2075-4418/11/6/990. 
  58. Xu, Song; Hom, Jason; Balasubramanian, Santhosh; Schroeder, Lee F.; Najafi, Nader; Roy, Shivaal; Chen, Jonathan H. (11 September 2019). "Prevalence and Predictability of Low-Yield Inpatient Laboratory Diagnostic Tests" (in en). JAMA Network Open 2 (9): e1910967. doi:10.1001/jamanetworkopen.2019.10967. ISSN 2574-3805. PMC 6739729. PMID 31509205. https://jamanetwork.com/journals/jamanetworkopen/fullarticle/2749559. 
  59. Tseng, Yi-Ju; Wang, Hsin-Yao; Lin, Ting-Wei; Lu, Jang-Jih; Hsieh, Chia-Hsun; Liao, Chun-Ta (3 August 2020). "Development of a Machine Learning Model for Survival Risk Stratification of Patients With Advanced Oral Cancer". JAMA Network Open 3 (8): e2011768. doi:10.1001/jamanetworkopen.2020.11768. ISSN 2574-3805. PMC 7442932. PMID 32821921. https://pubmed.ncbi.nlm.nih.gov/32821921. 
  60. Baron, Jason M.; Huang, Richard; McEvoy, Dustin; Dighe, Anand S. (1 January 2021). "Use of machine learning to predict clinical decision support compliance, reduce alert burden, and evaluate duplicate laboratory test ordering alerts". JAMIA Open 4 (ooab006). doi:10.1093/jamiaopen/ooab006. ISSN 2574-2531. PMC 7935497. PMID 33709062. https://doi.org/10.1093/jamiaopen/ooab006. 
  61. Wang, Hsin-Yao; Hsieh, Chia-Hsun; Wen, Chiao-Ni; Wen, Ying-Hao; Chen, Chun-Hsien; Lu, Jang-Jih (29 June 2016). Tan, Min-Han. ed. "Cancers Screening in an Asymptomatic Population by Using Multiple Tumour Markers" (in en). PLOS ONE 11 (6): e0158285. doi:10.1371/journal.pone.0158285. ISSN 1932-6203. PMC 4927114. PMID 27355357. https://dx.plos.org/10.1371/journal.pone.0158285. 
  62. Wang, D.; Khosla, A.; Gargeya, R. et al. (2016). "Deep Learning for Identifying Metastatic Breast Cancer". arXiv preprint arXiv:1606.05718v1. https://arxiv.org/abs/1606.05718v1. 
  63. Lien, Frank; Wang, Hsin-Yao; Lu, Jang-Jih; Wen, Ying-Hao; Chiueh, Tzong-Shi (1 March 2021). "Predicting 2-Day Mortality of Thrombocytopenic Patients Based on Clinical Laboratory Data Using Machine Learning". Medical Care 59 (3): 245–250. doi:10.1097/MLR.0000000000001421. ISSN 1537-1948. PMC 7993911. PMID 33027237. https://pubmed.ncbi.nlm.nih.gov/33027237. 

Notes

This presentation is faithful to the original, with only a few minor changes to presentation, spelling, and grammar. In some cases, important information missing from the references has been added.