Title: Considerations in the Automation of Laboratory Procedures

Author for citation: Joe Liscouski, with editorial modifications by Shawn Douglas

License for content: Creative Commons Attribution 4.0 International

Publication date: January 2021

Introduction

Scientists have been dealing with the issue of laboratory automation for decades, and during that time the meaning of those words has expanded from the basics of connecting an instrument to a computer, to the possibility of a fully integrated informatics infrastructure beginning with sample preparation and continuing on to the laboratory information management system (LIMS), electronic laboratory notebook (ELN), and beyond. Throughout this evolution there has been one underlying concern: how do we go about doing this?

The answer to that question has changed from a focus on hardware and programming, to today’s need for a lab-wide informatics strategy. We’ve moved from the bits and bytes of assembly language programming to managing terabytes of files and data structures.

The high-end of the problem—the large informatics database systems—has received significant industry-wide attention in the last decade. The stuff on the lab bench, while the target of a lot of individual products, has been less organized and more experimental. Failed or incompletely met promises have to yield to planned successes. How we do it needs to change. This document is about the considerations required when making that change. The haphazard "let's try this" method has to give way to more engineered solutions and a realistic appraisal of the human issues, as well as the underlying technology management and planning.

Why is this important? Whether you are conducting intense laboratory experiments to produce data and information or making chocolate chip cookies in the kitchen, two things remain important: productivity and the quality of the products. In either case, if the productivity isn’t high enough, you won’t be able to justify your work; if the quality isn’t there, no one will want what you produce. Conducting laboratory work and making cookies have a lot in common. Your laboratories exist to answer questions. What happens if I do this? What is the purity of this material? What is the structure of this compound? The field of laboratories asking these questions is extensive, basically covering the entire array of lab bench and scientific work, including chemistry, life sciences, physics, and electronics labs. The more efficiently we answer those questions, the more likely it will be that these labs will continue operating and that you’ll achieve the goals your organization has set. At some point, it comes down to performance against goals and the return on the investment organizations make in lab operations.

In addition to product quality and productivity, there are a number of other points that favor automation over manual implementations of lab processes. They include:

  • lower costs per test;
  • better control over expenditures;
  • a stronger basis for better workflow planning;
  • reproducibility;
  • predictability; and
  • tighter adherence to procedures, i.e., consistency.

Lists similar to the one above appear routinely in justifications for lab automation (and cookie production) without further comment. It’s simply assumed that everyone agrees and that the reasoning is obvious. Since we are going to use those items to justify the cost and effort that goes into automation, we should take a closer look at them.

Let’s begin with reproducibility, predictability, and consistency, three closely related concerns that reflect automation’s ability to produce the same product with the desired characteristics over and over. For data and information, that means that the same analysis on the same materials will yield the same results, that all the steps are documented, and that the process is under control. The variability that creeps into the execution of a process by people is eliminated. Variability in human labor can result from the quality of training, equipment setup and calibration, readings from analog devices (e.g., meters, pipette meniscus, charts), and a long list of other potential issues.

Concerns with reproducibility, predictability, and consistency are common to production environments, general lab work, manufacturing, and even food service. There are several pizza restaurants in our area using one of two methods of making the pies. Both start the preparation the same way, spreading dough and adding cheese and toppings, but they differ in how the pies are cooked. One method uses standard ovens (e.g., gas, wood, or electric heating); the pizza goes in, the cook watches it, and removes it when the cooking is complete. This leads to a lot of variability in the product, some a function of the cook’s attention, some depending on requests for over- or undercooking the crust, and some based on "have it your way" customization. The second method uses a metal conveyor belt to move the pie through an oven. The oven temperature and the speed of the belt are set, and as long as the settings are the same, you get a reproducible, consistent product order after order. It’s a matter of priorities: manual versus automated, consistent product quality versus how the cook feels that day. In the end, reducing variability and being able to demonstrate consistent, accurate results gives people confidence in your product.

Lower costs per test, better control over expenditures, and better workflow planning also follow from automation. Automated processes are more cost-efficient because sample throughput is higher and labor cost is lower. The cost per test and the material usage are predictable because variability in the components used in testing is reduced or eliminated, and workflow planning improves because, with the time per test known, work can be better scheduled. Additionally, process scale-up should be easier if there is high demand for particular procedures. However, there is a lot of work that has to be considered before automation is realizable, and that is where this discussion is headed.
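The cost argument above can be made concrete with a toy model. The sketch below compares cost per test for a manual and an automated implementation of the same procedure by amortizing fixed costs over annual test volume and adding direct labor; every number in it is a hypothetical assumption chosen for illustration, not a benchmark.

```python
# Illustrative cost-per-test model. All figures (fixed costs, labor rates,
# throughput) are hypothetical assumptions, not real lab benchmarks.

def cost_per_test(annual_fixed_cost, labor_cost_per_hour, hours_per_test, tests_per_year):
    """Cost per test = amortized fixed costs + direct labor per test."""
    fixed = annual_fixed_cost / tests_per_year
    labor = labor_cost_per_hour * hours_per_test
    return fixed + labor

# Manual process: low fixed cost, high analyst time per test, low throughput.
manual = cost_per_test(annual_fixed_cost=5_000, labor_cost_per_hour=60.0,
                       hours_per_test=1.5, tests_per_year=2_000)

# Automated process: higher fixed cost (equipment, maintenance, validation),
# far less analyst time per test, much higher throughput.
automated = cost_per_test(annual_fixed_cost=40_000, labor_cost_per_hour=60.0,
                          hours_per_test=0.1, tests_per_year=10_000)

print(f"manual: ${manual:.2f}/test, automated: ${automated:.2f}/test")
```

The point of the model is the structure, not the numbers: automation trades higher fixed costs for lower marginal cost per test, so the break-even point depends heavily on sustained test volume, which is why the static, long-running processes discussed later are the best automation candidates.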

How does this discussion relate to previous work?

This work follows on the heels of two previous works:

  • A Guide for Management: Successfully Applying Laboratory Systems to Your Organization's Work (2018): This webinar series complements the above text. It begins by introducing the major topics in informatics (e.g., LIMS, ELN, etc.) and then discusses their use from a strategic viewpoint. Where and how do you start planning? What is your return on investment? What should get implemented first, and then what are my options? The series then moves on to developing an information management strategy for the lab, taking into account budgets, support, ease of implementation, and the nature of your lab’s work.

The material in this write-up picks up where the last part of the webinar series ends. The final session covers lab processes, and this piece picks up that thread, going into more depth on a basic issue: how do you move from manual methods to automated systems?

Productivity has always been an issue in laboratory work. Until the 1950s, a lab had little choice but to add more people if more work needed to be done. Since then, new technologies have afforded wider options, including new instrument technologies. The execution of the work was still done by people, but the tools were better. Now we have other options. We just have to figure out when, if, and how to use them.

Before we get too far into this...

With elements such as productivity, return on investment (ROI), data quality, and data integrity as driving factors in this work, you shouldn’t be surprised if a lot of the material reads like a discussion of manufacturing methodologies; we’ve already seen some examples. We are talking about scientific work, but the same factors that drive those elements in labs have very close parallels in product manufacturing. The work we are describing here will be referenced as "scientific manufacturing," manufacturing or production in support of scientific programs.[a]

The key points of a productivity conversation in lab and material production environments are almost exact overlays; the only significant difference is that the results of the effort are data and information in one case, and a physical item you might sell in the other. Product quality and integrity are valued considerations in both. For scientists, this may require an adjustment to their perspectives when dealing with automation. On the plus side, the lessons learned in product manufacturing can be applied to lab bench work, making the path to implementation a bit easier while providing a framework for understanding what a successful automation effort looks like. People with backgrounds in product manufacturing can be a useful resource in the lab, with a bit of an adjustment in perspective on their part.

Transitioning from typical lab operations to automated systems

Transitioning a lab from its current state of operations to one that incorporates automation can raise a number of questions, and people’s anxiety levels. Several questions should be considered to set expectations for automated systems, their impact on jobs, and the introduction of new technologies. They include:

  • What will happen to people’s jobs as a result of automation?
  • What is the role of artificial intelligence (AI) and machine learning (ML) in automation?
  • Where do we find the resources to carry out automation projects/programs?
  • What equipment would we need for automated processes, and will it be different from what we currently have?
  • What role does a laboratory execution system (LES) play in laboratory automation?
  • How do we go about planning for automation?

What will happen to people’s jobs as a result of automation?

Stories are appearing in print, online, and in television news reporting about the potential for automation to replace human effort in the labor force. It seems like an all-or-none situation: either people will continue working in their occupations, or automation (e.g., mechanical, software, AI) will replace them. The storyline is that people are expensive and automated work can be less costly in the long run. If commercial manufacturing is a guide, automation is a preferred option from both a productivity and an ROI perspective. In order to make productivity gains from automation similar to those seen in commercial manufacturing, there are some basic requirements and conditions that have to be met:

  • The process has to be well documented and understood, down to the execution of each step without variation, while error detection and recovery have to be designed in.
  • The process has to remain static and be expected to continue over enough execution cycles to make it economically attractive to design, build, and maintain.
  • Automation-compatible equipment has to be available. Custom-built components are going to be expensive and could represent a barrier to successful implementation.
  • There has to be a driving need to justify the cost of automation; economics, the volume of work that has to be addressed, working with hazardous materials, and lack of educated workers are just a few of the factors that would need to be considered.

There are places in laboratory work where production-scale automation has been successfully implemented; life sciences applications for processes based on microplate technologies are one example. When we look at the broad scope of lab work across disciplines, most lab processes don’t lend themselves to that level of automation, at least not yet. We’ll get into this in more detail later. But that brings us back to the starting point: what happens to people's jobs?

In the early stages of manufacturing automation, as well as in fields such as mining where work was labor-intensive and repetitive, people did lose jobs when new methods of production were introduced. That shift from a human workforce to automated task execution is expanding as system designers probe markets from retail to transportation.[1] Lower-skilled occupations gave way first, and automation efforts are now moving up the skills ladder; the most recent example is automated driving, a technology that has yet to be fully embraced but is moving in that direction. The problem that leaves us with is providing displaced workers with a means of employment that gives them at least a living income, and the purpose, dignity, and self-worth that they’d like to have. This is going to require significant education, and people are going to have to come to grips with the realization that education never stops.

Due to the push for increased productivity, lab work has seen some similar developments in automation. The development of automated pipettes, titration stations, auto-injectors, computer-assisted instrumentation, and automation built to support microplate technologies represents just a few places where specific tasks have been addressed. However, these developments haven’t moved people out of the workplace as has happened in manufacturing, mining, and the like. In some cases they’ve changed the work, replacing repetitive, time-consuming tasks with equipment that allows lab personnel to take on different tasks. In other cases the technology addresses work that couldn’t be performed cost-effectively with human effort; without automation, that work might simply not be feasible, whether due to the volume of work (whose delivery might be limited by the availability of the right people, equipment, and facilities) or the need to work with hazardous materials. Automation may prevent the need for hiring new people while giving those currently working more challenging tasks.

As noted in the previous paragraph, much of the automation in lab work is at the task level: equipment designed to carry out a specific function such as Karl Fischer titrations. Some equipment designed around microplate formats can function both at the task level and as part of a user-integrated robotics system. This gives the planner useful options for introducing automation, making it easier for personnel to get accustomed to it before moving into scientific manufacturing.

Overall, laboratory people shouldn’t be losing their jobs as a result of lab automation, but they do have to be open to changes in their jobs, and that could require an investment in their education. Take someone whose current job is to carry out a lab procedure, someone who understands all aspects of the work, including troubleshooting equipment, reagents, and any special problems that may crop up. Someone else may have developed the procedure, but that person is the expert in its execution.

First, you need these experts to help plan and test the automated systems if you decide to undertake such a project. They are also the best people to educate as automated systems managers; they know how the process is supposed to work and should be in a position to detect problems. If the system crashes, you’ll need someone who can cover the work while problems are being addressed. Second, if lab personnel get the idea that they are watching their replacement being installed, they may leave before the automated systems are ready. In the event of a delay, you’ll have a backlog and no one to handle it.

Beyond that, people will be freed from the routine of carrying out processes and be able to take on work that had been put on a back burner until time and resources allowed. As we move toward automated systems, jobs will expand to accommodate typical lab work, as well as the management, planning, maintenance, and evolution of laboratory automation and computing.

Automation in lab work is not an "all or none" situation. Processes can be structured so that the routine work is done by systems, and the analyst can spend time reviewing the results, looking for anomalies and interesting patterns, while being able to make decisions about the need for and nature of follow-on efforts.

What is the role of AI and ML in automation?

When we discuss automation here, we are referring to basic robotics and programming. AI may, and likely will, play a role in the work, but first we have to get the foundations right before we consider the next step; we need to put in the human intelligence first. Part of the issue with AI is that the term lacks a settled definition.

Science fiction aside, many of today's applications of AI have only a limited role in lab work. Here are some examples:

  • Having a system that can bring up all relevant information on a research question—a sort of super Google—or a variation of IBM’s Watson could have significant benefits.
  • Analyzing complex data or large volumes of data could be beneficial, e.g., the analysis of radio astronomy data to find fast radio bursts (FRB). After discovering 21 FRB signals upon analyzing five hours of data, researchers at Green Bank Telescope used AI to analyze 400 terabytes of older data and detected another 100.[2]
  • "[A] team at Glasgow University has paired a machine-learning system with a robot that can run and analyze its own chemical reaction. The result is a system that can figure out every reaction that's possible from a given set of starting materials."[3]
  • HelixAI is using Amazon's Alexa as a digital assistant for laboratory work.[4]

Note that the points above are research-based applications, not routine production environments where regulatory issues are important. Research applications may be more forgiving of AI systems because the results are evaluated by human intelligence, and problematic results can be subjected to further verification. Data entry systems such as voice entry, by contrast, have to be carefully tested, and the results of that data entry verified and shown to be correct.

Pharma IQ continues to publish material on advanced topics in laboratory informatics, including articles on how labs are benefiting from new technologies[5] and survey reports such as AI 2020: The Future of Drug Discovery. In that report they note[6]:

  • "94% of pharma professionals expect that intelligent technologies will have a noticeable impact on the pharmaceutical industry over the next two years."
  • "Almost one fifth of pharma professionals believe that we are on the cusp of a revolution."
  • "Intelligent automation and predictive analytics are expected to have the most significant impact on the industry."
  • "However, a lack of understanding and awareness about the benefits of AI-led technologies remain a hindrance to their implementation."

Note that these are expectations, not a reflection of current reality. That same report makes comments about the impact of AI on headcount disruption, asking, "Do you expect intelligent enterprise technologies[b] to significantly cut and/or create jobs in pharma through 2020?" Among the responses, 47 percent said they expected those technologies to do both, 40 percent said it will create new job opportunities, and 13 percent said there will be no dramatic change, with zero percent saying they expected solely job losses.[6]

While there are high levels of expectation and hope for results, we need to approach the idea of AI in labs with some caution. We read about examples based on machine learning (ML), for example using computer systems to recognize cats in photos or to recognize faces in a crowd. We don’t know how they accomplish their tasks, and we can’t analyze their algorithms and decision-making. That leaves us trying to test quality in, which at best is an uncertain process with qualified results ("it has worked so far"). One problem with testing AI systems based on ML is that they continually evolve, so testing may affect the ML processes by introducing a bias. It may also force continued, redundant testing, because something we thought was evaluated was changed by the “experiences” the AI based its learning on. As one example, could the AI modify the science through process changes without our knowing, because it didn’t understand the science or the goals of the work?

AI is a black box with ever-changing contents. That shouldn’t be taken as a condemnation of AI in the lab, but rather as a challenge to human intelligence in evaluating, proving, and applying the technology. That application includes defining the operating boundaries of an AI system. Rather than creating a master AI for a complete process, we may elect to divide the AI’s area of operation into multiple, independent segments, with segment integration occurring in later stages once we are confident in their ability to work and they show clear evidence of system stability. In all of this we need to remember that our goal is the production of high-quality data and information in a controlled, predictable environment, not gee-whiz technology. One place where AI (or clever programming) could be of use is in better workflow planning, which takes into account current workloads and assignments, factors in the inevitable panic-level testing need, and, perhaps in a QC/production environment, anticipates changes in analysis requirements based on changes in production operations.
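Much of that workflow planning is within reach of "clever programming" rather than AI. As a minimal sketch, the code below schedules a queue of pending tests into a day's available hours, handling panic-level (rush) requests first; the sample IDs, durations, and capacity are hypothetical, and a real planner would also weigh instrument availability, analyst assignments, and due dates.

```python
# Minimal workflow-planning sketch. The queue contents, durations, and
# available hours are hypothetical; predictable time-per-test (a benefit
# of automated procedures) is what makes this kind of scheduling reliable.

from dataclasses import dataclass

@dataclass
class TestRequest:
    sample_id: str
    hours: float        # known, predictable duration of the automated test
    rush: bool = False  # panic-level request?

def plan_day(queue, available_hours):
    """Pick the tests to run today: rush work first, then first-in, first-out."""
    ordered = sorted(queue, key=lambda t: not t.rush)  # rush=True sorts first
    scheduled, used = [], 0.0
    for req in ordered:
        if used + req.hours <= available_hours:
            scheduled.append(req)
            used += req.hours
    return scheduled, used

queue = [
    TestRequest("S-101", 2.0),
    TestRequest("S-102", 3.0),
    TestRequest("S-103", 1.0, rush=True),  # panic-level request jumps the line
    TestRequest("S-104", 4.0),
]
today, hours_used = plan_day(queue, available_hours=6.0)
print([t.sample_id for t in today], hours_used)
```

Because Python's `sorted` is stable, non-rush work keeps its submission order behind the rush requests, which is the FIFO-with-priority behavior a simple lab scheduler usually wants.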

Throughout this section I've treated “AI” as “artificial intelligence,” its common meaning. There may be a better way of looking at it for lab use, as noted in this excerpt from the October 2018 issue of Wired magazine[7]:

Augmented intelligence. Not “artificial,” but how Doug Engelbart[c] envisioned our relationship with computers: AI doesn’t replace humans. It offers idiot-savant assistants that enable us to become the best humans we can be.

Augmented intelligence (AuI) is a better term for what we might experience in lab work, at least in the near future. It suggests something more realistic and attainable, with a synergism that would make it, and automation, attractive to lab management and personnel: a tool they can work with to improve lab operations, one that doesn’t carry the specter of something going on that they don’t understand or control. OPUS/SEARCH from Bruker might be just such an entry in this category.[8] AuI may serve as a first-pass filter for large data sets, as in the radio astronomy and chemistry examples cited earlier, reducing those sets of data and information to smaller collections that human intelligence can and should evaluate. However, that does put a burden on the AuI to avoid excessive false positives or negatives, something that can be adjusted over time.

Beyond that, there is the possibility of more cooperative work between people and AuI systems. An article in Scientific American titled “My Boss the Robot”[9] describes the advantage of a human-robot team, with the robot doing the heavy work and the human, under the robot's guidance, doing the work he was more adept at, versus a team of experts with the same task. The task, welding a Humvee frame, was completed by the human-machine pair in 10 hours at a cost of $1,150; the team of experts took 89 hours at a cost of $7,075. That might translate into laboratory work by having a robot do routine, highly repetitive tasks while the analyst oversees the operation and does higher-level analysis of the results.

Certainly, AI/AuI is going to change over time as programming and software technology becomes more sophisticated and capable; today’s example of AuI might be seen as tomorrow’s clever software. However, a lot depends on the experience of the user.

There is something important to ask about laboratory technology development, and AI in particular: is the direction of development going to be the result of someone’s innovation that people look at and embrace, or will it be the result of a deliberate choice of lab people saying “this is where we need to go, build systems that will get us there”? The difference is important, and lab managers and personnel need to be in control of the planning and implementation of systems.

Where do we find the resources to carry out automation projects/programs?

Given the potential scope of work, you may need people with skills in programming, robotics, instrumentation, and possibly mechanical or electrical engineering if off-the-shelf components aren’t available. The biggest need is for people who can do the planning and optimization that is needed as you move from manual to semi- or fully-automated systems, particularly specialists in process engineering who can organize and plan the work, including the process controls and provision for statistical process control.

We need to develop people who are well versed in laboratory work and the technologies that can be applied to that work, as assets in laboratory automation development and planning. In the past, this role has been filled by lab personnel with an interest in the subject, IT people willing to extend their responsibilities, and/or outside consultants. A 2017 report by Salesforce Research states that "77% of IT leaders believe IT functions as an extension/partner of business units rather than as a separate function."[10] The report makes no mention of laboratory work or manufacturing aside from those being functions within the businesses surveyed. Unless a particular effort is made, IT personnel rarely have the backgrounds needed to meet the needs of lab work. In many cases, they will try to fit lab needs into software they are already familiar with, rather than extend their backgrounds into new computational environments. Office and pure database applications are easily handled, but when we get to the lab bench, it's another matter entirely.

The field is getting complex enough that we need people whose responsibilities span both science and technology. This subject is discussed in the webinar series A Guide for Management: Successfully Applying Laboratory Systems to Your Organization's Work, Part 5 "Supporting Laboratory Systems."

What equipment would we need for automated processes, and will it be different from what we currently have?

This is an interesting issue and it directly addresses the commitment labs have to automation, particularly robotics. In the early days of lab automation when Zymark (Zymate and Benchmate), Perkin Elmer, and Hewlett Packard (ORCA) were the major players in the market, the robot had to adapt to equipment that was designed for human use: standard laboratory equipment. They did that through special modifications and the use of different grippers to handle test tubes, beakers, and flasks. While some companies wanted to test the use of robotics in the lab, they didn’t want to invest in equipment that could only be used with robots; they wanted lab workers to pick up where the robots left off in case the robots didn’t work.

Since then, equipment has evolved to support automation more directly. In some cases it is a device (e.g., a balance, pH meter, etc.) that has front panel human operator capability and rear connectors for computer communications. Liquid handling systems have seen the most advancement through the adoption of microplate formats and equipment designed to work with them. However, the key point is standardization of the sample containers. Vials and microplates lend themselves to a variety of automation devices, from sample processing to auto-injectors/samplers. The issue is getting the samples into those formats.

One point that labs in any scientific discipline have to come to grips with is the commitment to automation. That commitment isn’t going to be made on a lab-wide basis, but on a procedure-by-procedure basis. Full automation may not be appropriate for all lab work; partial automation may be a better choice, and in some cases no automation may be needed (we’ll get into that later). The point that needs to be addressed is the choice of equipment. In most cases, equipment is designed for use by people, with options for automation and electronic communications. However, if you want to maximize throughput, you may have to follow examples from manufacturing and commit to equipment that is only used by automation. That will mean a redesign of the equipment, a shared risk for both vendors and users. The upside is that equipment can be specifically designed for a task, be more efficient, have the links needed for integration, use less material, and, most likely, take up less space. One example is the microplate, which allows for tens, hundreds, or thousands (depending on the plate used) of sample cells in a small space. What used to take many cubic feet of space as test tubes (the precursor to microplates) is now a couple of cubic inches, using much less material and working space. Note, however, that while microplates are used by lab personnel, their use in automated systems provides greater efficiency and productivity.

The idea of equipment used only in an automated process isn’t new. The development and commercialization of segmented flow analyzers—initially by Technicon in the form of the AutoAnalyzers for general use, and the SMA (Sequential Multiple Analyzer) and SMAC (Sequential Multiple Analyzer with Computer) in clinical markets—improved a lab's ability to process samples. Those systems were eventually phased out in favor of new equipment that consumed less material. Products like these are provided today by Seal Analytical[11] for environmental work and by Bran+Luebbe (a division of SPX Process Equipment in Germany).[12]

The issue in committing to automated equipment is that vendors and users will have to agree on equipment specifications and use them within procedures. One place this has been done successfully is in clinical chemistry labs. What other industry workflows could benefit? Do the vendors lead or do the users drive the issue? Vendors need to be convinced that there is a viable market for product before making an investment, and users need to be equally convinced that they will succeed in applying those products. In short, procedures that are important to a particular industry have to be identified, and both users and vendors have to come together to develop automated procedure and equipment specifications for products. This has been done successfully in clinical chemistry markets to the extent that equipment is marketed for use as validated for particular procedures.

What role does a LES play in laboratory automation?

Before ELNs settled into their current role in laboratory work, the initial implementations differed considerably from what we have now. LabTech Notebook was released in 1986 (and discontinued in 2004) to provide communications between computers and devices that used RS-232 serial communications. In the early 2000s, SmartLab from Velquest was the first commercial product to carry the "electronic laboratory notebook" identifier. That product became a stand-alone entry in the laboratory execution system (LES) market; since its release, the same conceptual functionality has been incorporated into LIMS and into ELNs that fit the more current expectation of an ELN.

At its core, an LES is a set of scripted test procedures that an analyst follows to carry out a laboratory method, essentially functioning as the programmed execution of a lab process. Each step in a process is described and followed exactly, and provision is made within the script for data collection. In addition, the LES can or will (depending on the implementation; "can" in the case of SmartLab) check that the analyst is qualified to carry out the work and that the equipment and reagents are current, calibrated, and suitable for use. The system can also provide access to help files that an analyst can reference if there are questions about how to carry out a step or resolve issues. Beyond that, the software can work with lab instruments and automatically acquire data, either through direct interfaces (e.g., balances, pH meters) or by parsing PDF files of instrument reports.
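The LES concept described above can be sketched in a few lines of code: steps defined up front, prerequisites checked before work begins, and every step logged as it completes. This is a conceptual illustration only; the function names, checks, and simulated instrument readings are invented for the sketch and do not reflect any vendor's actual API.

```python
# Conceptual LES-style scripted procedure: prerequisite checks (analyst
# qualification, instrument calibration), strict step ordering, and a
# timestamped log of every step. All names and values are illustrative.

from datetime import datetime, timezone

def run_procedure(steps, analyst, qualified_analysts, instrument_calibrated):
    """Execute scripted steps in order; refuse to start if prerequisites fail."""
    if analyst not in qualified_analysts:
        raise PermissionError(f"{analyst} is not qualified for this method")
    if not instrument_calibrated:
        raise RuntimeError("instrument calibration is out of date")
    log = []
    for number, (description, action) in enumerate(steps, start=1):
        result = action()  # e.g., prompt for a reading or query an instrument
        log.append({
            "step": number,
            "description": description,
            "result": result,
            "timestamp": datetime.now(timezone.utc).isoformat(),
        })
    return log

steps = [
    ("Weigh sample on balance",        lambda: 1.0042),   # grams (simulated)
    ("Record pH of buffer",            lambda: 7.02),     # simulated reading
    ("Confirm reagent lot is current", lambda: "lot OK"),
]
audit_trail = run_procedure(steps, analyst="jdoe",
                            qualified_analysts={"jdoe", "asmith"},
                            instrument_calibrated=True)
print(len(audit_trail), "steps logged")
```

The timestamped log is the point: it is exactly the documented evidence of proper execution that makes these systems attractive to regulated labs, as the next paragraph notes.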

There are two reasons that these systems are attractive. First, they provide for a rigorous execution of a process with each step being logged as it is done. Second, that log provides a regulatory inspector with documented evidence that the work was done properly, making it easier for the lab to meet any regulatory burden.

Since the initial development of SmartLab, the product has changed ownership and is currently in the hands of Dassault Systèmes as part of the BIOVIA product line. As noted above, LIMS and ELN vendors have incorporated similar functionality into their products. Using those features requires “scripting” (in reality, software development), but it provides access to the database structures within those products. The SmartLab software needed programmed interfaces to other vendors' LIMS and ELNs to gain access to the same information.

What does this have to do with automation?

When we think about automated systems, particularly full automation with robotic support, we mean a programmed process from start to finish. The samples are introduced at the start, and the process continues until the final data/information is reported and stored. These can be large-scale systems using microplate formats, including tape-based systems from Douglas Scientific[13], programmable autosamplers such as those from Agilent[14], or systems built around robotic arms from a variety of vendors that move samples from one station to another.

Both the LES and the automation noted in the previous paragraph have one point in common: there is a strict process that must be followed, with no provision for variation. The difference is that in one case the process is implemented completely through computers and electronic and mechanical equipment, while in the other the process is carried out by lab personnel using computers and electronic and mechanical lab equipment. In essence, people take the place of mechanical robots, which conjures up all kinds of images going back to the 1927 film Metropolis.[d] Though the LES represents a step toward more sophisticated automation, both methods still require:

  • programming, including “scripting” (the LES methods are a script that has to be followed);
  • validated, proven processes; and
  • qualified staff, though the qualifications differ. (In both cases they have to be fully qualified to carry out the process in question. However, in the full-automation case, they will require more education on running, managing, and troubleshooting the systems.)

In the case of full automation, there has to be sufficient justification for automating the process, including a sufficient sample load. The LES-human implementation can be run for a single sample if needed, and the operating personnel can be trained on multiple procedures, switching tasks as needed. Electro-mechanical automation would require a change in programming, verification that the system is operating properly, and possibly equipment re-configuration. Which method is better for a particular lab depends on trade-offs between sample load, throughput requirements, cost, and flexibility. People are adaptable, easily moving between tasks, whereas equipment has to be adapted to a task.

How do we go about planning for automation?

There are three forms of automation to be considered:

  1. No automation – Instead, the lab relies on lab personnel to carry out all steps of a procedure.
  2. Partial automation – Automated equipment is used to carry out steps in a procedure. Given the current state of laboratory systems, this is the most prevalent form, since most lab equipment has computer components in it to facilitate its use.
  3. Full automation – The entire process is automated. The definition of “entire” is open to each lab’s interpretation and may vary from one process to another. For example, some samples may need handling before they are suitable for use in a procedure. That might be a selection process from a freezer, grinding materials prior to a solvent extraction, and so on, representing cases where the available equipment isn’t suitable for automated interaction. One goal is to minimize this effort, since it can limit the productivity of the entire process. This is also an area where negotiation between the lab and the sample submitter can be useful. Take plastic pellets, for example, which often need to be ground into a coarse powder before they can be analyzed; having the submitter provide them in this form will reduce the time and cost of the analysis. Standardizing on the sample container can also facilitate the analysis (having the lab provide the submitter with standard sample vials using barcodes or RFID chips can streamline the process).

One common point that these three forms share is the need for a well-described method (procedure, process). That method should be fully developed, tested, and validated, as it is the reference point for evaluating any form of automation (Figure 1).


Fig1 Liscouski ConsidAutoLabProc21.png

Figure 1. Items to be considered in automating systems

The documentation for the chosen method should include the bulleted list of items from Figure 1, as they describe the science aspects of the method. The last four points are important. The method should be validated since the manual procedure is a reference point for determining if the automated system is producing useful results. The reproducibility metric offers a means of evaluating at least one expected improvement in an automated system; you’d expect less variability in the results. This requires a set of reference sample materials that can be repeatedly evaluated to compare the manual and automated systems, and to periodically test the methods in use to ensure that there aren’t any trends developing that would compromise the method’s use. Basically, this amounts to statistical quality control on the processes.
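The statistical quality control mentioned above can be sketched in a few lines of code. The measurements below are hypothetical results for a single reference sample run repeatedly under the manual and automated versions of a method; the point is only to show the kind of statistic (percent relative standard deviation) a lab might track when comparing the two.

```python
import statistics

# Hypothetical potency results (%) for the same reference sample,
# measured repeatedly by the manual and automated versions of a method.
manual = [99.1, 100.4, 98.7, 101.2, 99.8, 100.9, 98.9, 100.2]
automated = [99.8, 100.1, 99.9, 100.2, 100.0, 99.7, 100.1, 99.9]

for label, results in (("manual", manual), ("automated", automated)):
    mean = statistics.mean(results)
    sd = statistics.stdev(results)
    rsd = 100 * sd / mean  # relative standard deviation, %
    print(f"{label:9s} mean={mean:.2f}  sd={sd:.2f}  %RSD={rsd:.2f}")
```

Tracked over time on a control chart, the same statistic can reveal trends that would compromise a method's use.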

The next step is to decide what improvements you are looking for in an automated system: increased throughput, lower cost of operation, the ability to off-load human work, reduced variability, etc. In short, what are your goals?

That brings us to the matter of project planning. We’re not going to go into a lot of depth in this piece about project planning, as there are a number of references[e] on the subject, including material produced by the former Institute for Laboratory Automation.[f] There are some aspects of the subject that we do need to touch on, however, and they include:

  • justifying the project and setting expectations and goals;
  • analyzing the process;
  • scheduling automation projects; and
  • budgeting.

Justification, expectations, and goals

Basically, why are you doing this, and what do you expect to gain? What arguments are you going to use to justify the work and expense involved in the project? How will you determine whether the project is successful?

Fundamentally, automation efforts are about productivity and the bulleted items noted in the introduction of this piece, repeated below with additional commentary:

  • Lower costs per test, and better control over expenditure: These can result from a reduction in labor and materials costs, including more predictable and consistent reagent usage per test.
  • Stronger basis for better workflow planning: Informatics systems can provide better management over workloads and resource allocation, while key performance indicators can show where bottlenecks are occurring or if samples are taking too long to process. These can be triggers for procedure automation to improve throughput.
  • Reproducibility: The test results from automated procedures can be expected to be more reproducible by eliminating the variability that is typical of steps executed by people. Small variations in dispensing reagents, for example, could be eliminated.
  • Predictability: The time to completion for a given test is more predictable in automated processes; once the process starts, it keeps going without the interruptions found in human-centered activities.
  • Tighter adherence to procedures: Automated procedures have no choice but to be consistent in procedure execution; that is what programming and automation is about.

Of these, which are important to your project? If you achieved these goals, what would it mean to your lab’s operations and the organization as a whole? This is part of the justification for carrying out the project.

As noted earlier, there are several things to consider in order to justify a project. First, there has to be a growing need that supports a procedure’s automation, one that can’t be satisfied by other means, such as adding people, equipment, and lab space, or outsourcing the work (with the added burden of ensuring data quality and integrity, and integrating that work with the lab’s data/information). Second, the cost of the project must be balanced by its benefits. This includes any savings in cost, people (not reducing headcount, but avoiding new hires), material, and equipment, as well as improvements in the timeliness of results and overall lab operations. Third, when considering project justification, the automated process’s useful lifetime has to be long enough to justify the development work. And finally, the process has to be stable so that you aren’t in a constant re-development situation (this differs from periodic upgrades and performance improvements, EVOP in manufacturing terms). One common point of failure in projects is change in the underlying procedures; if the basic process model changes, you are trying to hit a moving target. That ruins schedules and causes budgets to inflate.

This may seem like a lot to think about for something that could be as simple as moving from manual pipettes to automatic units, but in such cases the total effort to do the work will simply be small. The thinking is still important, since it impacts data quality and integrity, and your ability to defend your results should they be challenged. And, by the way, the issue of automated pipettes isn’t simple; there is a lot to consider in properly specifying and using these products.[g]

Analyzing the process

Assuming that you have a well-described, thoroughly tested, and validated procedure, that process has to be analyzed for optimization and suitability for automation. This is an end-to-end evaluation, not just an examination of isolated steps. This is an important point: looking at a single step without taking into account the rest of the process may improve that portion of the process but have consequences elsewhere.

Take a common example: working in a testing environment where samples are being submitted by outside groups (Figure 2).


Fig2 Liscouski ConsidAutoLabProc21.png

Figure 2. Lab sample processing, initial data entry through results

Most LIMS will permit sample submitters (with appropriate permissions) to enter the sample description information directly into the LIMS, reducing some of the clerical burden. Standardizing on sample containers, with barcodes, reduces the effort and cost in some aspects of sample handling. A barcode scanner could be used to scan samples as they arrive into the lab, letting the system know that they are ready to be tested.

That brings us to an evaluation of the process as a whole, as well as an examination of the individual steps in the procedure. As shown in Figure 1, automation can be done in one of two ways: automating the full process or automating individual steps. Your choice depends on several factors, not the least of which is your comfort level and confidence in adopting automation as a strategy for increasing productivity. For some, concentrating on improvements in individual steps is an attractive approach. The cost and risk may be lower, and if problems occur you can always fall back to a fully manual implementation until they are resolved.

Care does have to be taken in choosing which steps to improve. From one perspective, you’d want to do the step-wise implementation of automation as close to the end of the process as possible. The problem with doing it earlier is that you may create a backup in later stages of the process. Optimizing step 2, for example, doesn’t do you much good if step 3 is overloaded and requires more people, or additional (possibly unplanned) automation to relieve a bottleneck there. In short, before you automate or improve a given step, you need to be sure that downstream processing can absorb the increase in materials flow. In addition, optimizing all the individual steps, one at a time, doesn’t necessarily add up to a well-designed full-system automation; the transitions between steps may not be as effective or efficient as they would be if the system were designed as a whole. If the end of the process is carried out by commercial instrumentation, the ability to absorb more work comes more easily, since most of these systems are automated with computer data acquisition and processing, and many have auto-samplers available to accumulate samples that can be processed automatically. Some of those auto-samplers have built-in robotics for common sample handling functions. If the workload builds, additional instruments can pick up the load, and equipment such as Baytek International’s TurboTube[15] can accumulate sample vials in a common system and route them to individual instruments for processing.

Another consideration for partial automation is where the process is headed in the future. If the need for the process persists over a long period of time, will you eventually get to the point of needing to redo the automation to an integrated stream? If so, is it better to take the plunge early on instead of continually expending resources to upgrade it?

Other considerations include the ability to re-purpose equipment. If a process isn’t used full-time (a justification for partial automation) the same components may be used in improving other processes. Ideally, if you go the full-process automation route, you’ll have sufficient sample throughput to keep it running for an extended period of time, and not have to start and stop the system as samples accumulate. A smoothly running slower automation process is better than a faster system that lies idle for significant periods of time, particularly since startup and shutdown procedures may diminish the operational cost savings in both equipment use and people’s time.

All these points become part of both the technical justification and budget requirements.

Analyzing the process: Simulation and modeling

Simulation and modeling have been part of science and engineering for decades, supported by increasingly powerful computing hardware and software. Continuous-systems simulations have shown us the details of how machinery works, how chemical reactions occur, and how chromatographic systems and other instrumentation behave.[16] There is another aspect to modeling and simulation that is appropriate here.

Discrete-event simulation (DES) is used to model and understand processes in business and manufacturing applications, evaluating the interactions between service providers and customers, for example. One application of DES is to determine the best way to distribute incoming customers to a limited number of servers, taking into account that not all customers have the same needs; some will tie up a service provider a lot longer than others, as represented by the classic bank teller line problem. This form of simulation and modeling is appropriate to event-driven processes where the action is focused on discrete steps (like materials moving from one workstation to another) rather than on a continuous function of time (most naturally occurring systems fall into the latter category, e.g., heat flow and models using differential equations).

The processes in your lab can be described and analyzed via DES systems.[17][18][19] Those laboratory procedures are a sequence of steps, each having a precursor, variable duration, and following step until the end of the process is reached; this is basically the same as a manufacturing operation where modeling and simulation have been used successfully for decades. DES can be used to evaluate those processes and ask questions that can guide you on the best paths to take in applying automation technologies and solving productivity or throughput problems. For example:

  • What happens if we tighten up the variability in a particular step; how will that affect the rest of the system?
  • What happens at the extremes of the variability in process steps; does it create a situation where samples pile up?
  • How much of a workload can the process handle before one step becomes saturated with work and the entire system backs up?
  • Can you introduce an alternate path to process those samples and avoid problems (e.g., if samples are held for too long in one stage, do they deteriorate)?
  • Can the output of several parallel slower procedures be merged into a feed stream for a common instrumental technique?

In complex procedures, some steps may be sensitive to small delays, and DES can help test for and uncover them. Note that setting up these models requires the collection of a lot of data about the processes and their timing, so this is not something to be taken casually.
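As a minimal illustration of the kind of question DES answers, the sketch below simulates a single instrument serving a queue of arriving samples and reports how long samples wait. The arrival and service rates are invented assumptions (exponentially distributed, via Python's standard library), not measured lab data; a real model would be driven by the timing data collected from the actual process.

```python
import random

random.seed(1)

# Assumed rates for illustration only: samples arrive on average every
# 10 minutes; the instrument takes on average 8 minutes per sample.
MEAN_ARRIVAL = 10.0
MEAN_SERVICE = 8.0
N = 1000  # samples to simulate

arrival = 0.0   # clock time of the current sample's arrival
free_at = 0.0   # clock time when the instrument next becomes free
waits = []
for _ in range(N):
    arrival += random.expovariate(1.0 / MEAN_ARRIVAL)
    start = max(arrival, free_at)      # wait if the instrument is busy
    waits.append(start - arrival)
    free_at = start + random.expovariate(1.0 / MEAN_SERVICE)

print(f"mean wait: {sum(waits)/N:.1f} min, max wait: {max(waits):.1f} min")
```

Even this toy model shows how waiting time balloons as the instrument's utilization approaches its limit, which is exactly the saturation question posed in the bullets above.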

Previous research[16][17][18][19] suggests a few areas where simulation can be effective, including one case where an entire lab’s operations were evaluated. Models that extensive can be used not only to look at procedures, but also at the introduction of informatics systems. This may appear to be a significant undertaking, and it can be, depending on the complexity of the lab processes. However, simple processes can be initially modeled on spreadsheets to see if a more significant effort is justified. Operations research, of which DES is a part, has been usefully applied in production operations to increase throughput and improve ROI. It might be successfully applied to some routine, production-oriented lab work.

Most lab processes are linear in their execution, one step following another, with the potential for loop-backs should problems be recognized with samples, reagents (e.g., being out-of-date, not looking right, needing replacement), or equipment (e.g., not functioning properly, out of calibration, busy with other work). On one level, the modeling of a manually implemented process should appear to be simple: each step takes a certain amount of time, and if you add up the times, you have a picture of the process execution through time. However, the reality is quite different once you take into account the problems (and their resolution) that can occur in each of those steps. The data collected to model the procedure can change how that picture looks and your ability to improve it. By monitoring the process over a number of iterations, you can find out how much variation there is in the execution time for each step and whether that variation follows a normal distribution or is skewed (e.g., if one step is skewed, how does it impact the others?).

Questions to ask about potential problems that could occur at each step include:

  • How often do problems with reagents occur and how much of a delay does that create?
  • Is instrumentation always in calibration (do you know?), are there operational problems with devices and their control systems (what are the ramifications?), are procedures delayed due to equipment being in use by someone else, and how long does it take to make changeovers in operating conditions?
  • What happens to the samples; do they degrade over time? What impact does this have on the accuracy of results and their reproducibility?
  • How often are workflows interrupted by the need to deal with high-priority samples, and what effect does it have on the processing of other samples?

Just the collection of data can suggest useful improvements before automation is even considered, perhaps negating the need for it. The answer to a lab’s productivity might be as simple as adding another instrument if that is the bottleneck. It might also suggest that an underutilized device would be more productive if sample preparation for different procedures’ workflows were organized differently. Underutilization might be a consequence of the amount of time needed to prepare the equipment for service: doing so for one sample might be disproportionately time-consuming (and expensive) and cause other samples to wait until there are enough of them to justify the preparation. It could also suggest that some lab processes should be outsourced to groups that have a more consistent sample flow and turn-around time (TAT) for that technique. Some of these points are illustrated in Figures 3a and 3b below.


Fig3a Liscouski ConsidAutoLabProc21.png

Figure 3a. Simplified process views versus some modeling considerations. Note that the total procedure execution time is affected by the variability in each step, plus equipment and material availability delays; these can change from one day to the next in manual implementations.

Fig3b Liscouski ConsidAutoLabProc21.png

Figure 3b. The execution times of each step include the variable execution times of potential issues that can occur in each stage. Note that because each factor has a different distribution curve, the total execution time has a much wider variability than the individual factors.

How does the simulation system work? Once you have all the data set up, the simulation runs thousands of times, using random number generators to pick values for the execution times of each component in each step. For example, if there is a one-in-ten chance a piece of equipment will be in use when needed, 10% of the runs will reflect that, each one picking a delay time based on the input delay distribution function. With a large number of runs, you can see where delays exist and how they impact the overall process’s behavior. You can also adjust the factors (e.g., what happens if equipment delays are cut in half?) and see the effect of doing so. By testing the system, you can make better judgments on how to apply your resources.
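A Monte Carlo version of this idea can be sketched in a few lines. The step-time distributions, the 10% equipment-busy probability, and the delay range below are all illustrative assumptions, following the equipment-in-use example in the text; a lab would substitute distributions fitted to its own collected timing data.

```python
import random

random.seed(7)

# Monte Carlo run of a two-step procedure with an occasional
# equipment-busy delay. All numbers are assumed, not measured.
RUNS = 10_000
totals = []
for _ in range(RUNS):
    prep = random.gauss(20, 3)        # sample prep time, minutes
    analysis = random.gauss(35, 2)    # instrument run time, minutes
    # One-in-ten chance the instrument is busy, adding a 5-30 min delay.
    delay = random.uniform(5, 30) if random.random() < 0.10 else 0.0
    totals.append(prep + analysis + delay)

totals.sort()
mean = sum(totals) / RUNS
p95 = totals[int(0.95 * RUNS)]
print(f"mean={mean:.1f} min  95th percentile={p95:.1f} min")
```

Halving the equipment-busy probability or the delay range and re-running the loop shows the "what if" adjustments described above.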

Some of the issues that surface may be things that lab personnel know about and just deal with. It isn’t until the problems are examined that their impact on operations is fully realized and addressed. Modeling and simulation may appear to be overkill for lab process automation, something reserved for large-scale production projects. However, the physical size of the project is not the key factor; it is the complexity of the system that matters, along with the potential for optimization.

One benefit of a well-structured simulation of lab processes is that it would provide a solid basis for making recommendations for project approval and budgeting. The most significant element in modeling and simulation is the initial data collection, asking lab personnel to record the time it takes to carry out steps. This isn’t likely to be popular if they don’t understand why it is being done and what the benefits will be to them and the lab; accurate information is essential. This is another case where “bad data is worse than no data.”

Guidelines for process automation

There are two types of guidelines that will be of interest to those conducting automation work: those that help you figure out what to do and how to do it, and those that must be met to satisfy regulatory requirements (whether evaluated by internal or external groups).

The first type is going to depend on the nature of the science and the automation being done to support it. Equipment vendor community support groups can be of assistance, as can professional groups like the Pharmaceutical Research and Manufacturers of America (PhRMA), International Society for Pharmaceutical Engineering (ISPE), and Parenteral Drug Association (PDA) in the pharmaceutical and biotechnology industries, with similar organizations in other industries and other countries. This may seem like a large jump from laboratory work, but it is appropriate when we consider the ramifications of full-process automation. You are essentially developing a manufacturing operation on a lab bench, and the same concerns that apply to large-scale production also apply here; you have to ensure that the process is maintained and in control. The same is true of manual or semi-automated lab work, but it is more critical in fully automated systems because of the potentially high volume of results that can be produced.

The second type consists of regulatory guidelines from groups appropriate to your industry, such as the Food and Drug Administration (FDA), Environmental Protection Agency (EPA), and International Organization for Standardization (ISO), as well as international guidance (e.g., GAMP, GALP). The interesting point is that we are looking at a potentially complete automation scheme for a procedure; does that come under manufacturing or laboratory guidelines? The likelihood is that laboratory guidelines will apply, since the work is being done within the lab's footprint; however, there are things that can be learned from their manufacturing counterparts that may assist in project management and documentation. One interesting consideration is what happens when fully automated testing, such as on-line analyzers, becomes integrated with both the lab and the production or process control data/information streams. Which regulatory guidelines apply then? It may come down to who is responsible for managing and supporting those systems.

Scheduling automation projects

There are two parts to the schedule issue: how long is it going to take to complete the project (dependent on the process and people), and when do you start? The second point will be addressed here.

The timing of an automated process coming online is important. If it comes online too soon, there may not be enough work to justify its use, and startup/shutdown procedures may create more work than the system saves. If it comes too late, people will be frustrated with a heavy workload while the system that was supposed to provide relief is under development.

In Figure 4, the blue line represents the growing need for sample/material processing using a given laboratory procedure. Ideally, you’d like the automated version to be available when that blue line crosses the “automation needed on-line” level of processing requirements; this is the point where the current (likely manual) implementation can no longer meet the demands of sample throughput requirements.


Fig4 Liscouski ConsidAutoLabProc21.png

Figure 4. Timing the development of an automated system

Those throughput limits are something you are going to have to evaluate and measure on a regular basis and use to make adjustments to the planning process (accelerating or slowing it as appropriate). How fast is the demand growing and at what point will your current methods be overwhelmed? Hiring more people is one option, but then the lab's operating expenses increase due to the cost of people, equipment, and lab space.
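The crossing point in Figure 4 can be estimated with a back-of-the-envelope projection. The demand, growth rate, and capacity figures below are hypothetical placeholders; each lab would substitute its own throughput measurements and update the projection as new data comes in.

```python
# Rough projection of when demand will exceed current capacity
# (the crossing point in Figure 4). All figures are hypothetical.
current_demand = 120   # samples/week processed today
growth = 0.04          # assumed 4% demand growth per month
capacity = 200         # max samples/week with current methods

months = 0
demand = current_demand
while demand < capacity:
    demand *= 1 + growth
    months += 1

print(f"capacity is reached in ~{months} months; "
      "automation should be online before then")
```

The result defines the far edge of the yellow box in Figure 4, and re-running the projection with fresh demand data tells you whether to accelerate or slow the plan.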

Once we have an idea of when something has to be working, we can begin the process of planning. Note that while the planning can begin at any point, it would be good to get the preliminaries done as soon as a manual process is finalized so that you have an idea of what you’ll be getting into. Those preliminaries include looking at equipment that might be used (keeping track of its development), training requirements, developer resources, and implementation strategies, all of which would be updated as new information becomes available. The “we’ll-get-to-it-when-we-need-it” approach is just going to create a lot of stress and frustration.

You need to put together a first-pass project plan so that you can detail what you know, and more importantly what you don’t know. The goal is to have enough information, updated as noted above, so that you can determine if an automated solution is feasible, make an informed initial choice between full and partial automation, and have a timeline for implementation. Any time estimate is going to be subject to change as you gather information and refine your implementation approach. The point of the timeline is to figure out how long the yellow box in Figure 4 is because that is going to tell you how much time you have to get the plan together and working; it is a matter of setting priorities and recognizing what they are. The time between now and the start of the yellow box is what you have to work with for planning and evaluating plans, and any decisions that are needed before you begin, including corporate project management requirements and approvals.

Those plans have to include time for validation and the evaluation of the new implementation against the standard implementation. Does it work? Do we know how to use and maintain it? And are people educated in its use? Is there documentation for the project?

Budgeting

At some point, all the material above and following this section comes down to budgeting: how much will it cost to implement a program and is it worth it? Of the two points, the latter is the one that is most important. How do you go about that? (Note: Some of this material is also covered in the webinar series A Guide for Management: Successfully Applying Laboratory Systems to Your Organization's Work in the section on ROI.)

What a lot of this comes down to is explaining and justifying the choices you’ve made in your project proposal. We’re not going to go into a lot of depth, but just note some of the key issues:

  • Did you choose full or partial automation for your process?
  • What drove that choice? If partial automation was chosen because it would be less expensive than full automation of the process, how long will it be until the next upgrade to another stage is needed?
  • How independent are the potential, sequential implementation efforts that may be undertaken in the future? Will there be a need to connect them, and if so, how will the incremental costs compare to just doing it once and getting it over with?

There is a tendency in lab work to treat problems, and the products that might be used to address them, in isolation. You see the need for a LIMS or ELN, or an instrument data system, and the focus is on those issues alone. Effective decisions have to consider both the immediate and longer-term aspects of a problem. If you want to acquire a LIMS, have you considered how it will affect other aspects of lab work, such as connecting instruments to it?

The same holds true for partial automation as a solution to a lab process productivity problem. While you are addressing a particular step, should you be looking at the potential for synergy by addressing other concerns as well? Modeling and simulation of processes can help resolve that issue.

Have you factored in the cost of support and education? The support issue needs to address the needs of lab personnel in managing the equipment and the options for vendor support, as well as the impact on IT groups. Note that the IT group will require access to vendor support, as well as being educated on their role in any project work.

What happens if you don’t automate? One way to justify the cost of a project is to help people understand what the lab’s operations will be like without it. Will more people, equipment, space, or added shifts be needed? At what cost? What would the impact be on those who need the results and how would it affect their programs?

Build, buy, or cooperate?

In this write-up and some of the referenced materials, we’ve noted several times the benefits that clinical labs have gained through automation, although crediting it all to automation alone isn’t fair. The clinical laboratory industry recognized both that automation was needed to solve problems with the operational costs of running labs, and that it could benefit further by coming together and cooperatively addressing lab operational problems.

It’s that latter point that made the difference, resulting in standardized communications and purpose-built commercial equipment that could be used to implement automation in their labs. They also had common sample types, common procedures, and common data processing. That same commonality applies to segments of industrial and academic lab work. Take the life sciences as an example. Where possible, that industry has standardized on microplates for sample processing. The result is a wide selection of instruments and robotics built around that sample-holding format, greatly improving lab economics and throughput. While it isn’t the answer to everything, it’s a good answer to a lot of things.

If your industry segment came together and recognized that you used common procedures, how would you benefit by creating a common approach to automation instead of each lab doing it on its own? It would open the development of common products or product variations from vendors and relieve each lab of developing its own answer. The result could be more effective and more easily supportable solutions.

Project planning

Once you’ve decided on the project you are going to undertake, the next stage is looking at the steps needed to manage your project (Figure 5).


Fig5 Liscouski ConsidAutoLabProc21.png

Figure 5. Steps in a laboratory automation project. This diagram is modeled after the GAMP V model for systems validation.

The planning begins with the method description from Figure 1, which describes the science behind the project, and with the specification of how the automation is expected to be put into effect: as full-process automation, or as a specific step or steps in the process. The provider of those documents is considered the “customer,” consistent with GAMP V nomenclature (Figure 6); that consistency is important due to the need for system-wide validation protocols.


Fig6 Liscouski ConsidAutoLabProc21.png

Figure 6. GAMP V model for showing customer and supplier roles in specifying and evaluating project components for computer hardware and software.

From there the “supplier” (e.g., an internal development group, consultant, or IT services) responds with a functional specification that is reviewed by the customer. The “analysis, prototyping, and evaluation” step, represented in the third box of Figure 5, is not the same as the process analysis noted earlier in this piece. That earlier analysis was to help you determine what work needed to be done, documented in the user requirements specification; the analysis and associated tasks here are specific to the implementation of this project. The colored arrows refer to the diagram in Figure 7. That process defines the equipment needed, the dependencies, and the options and technologies for automation implementations, including robotics, instrument design requirements, pre-built automation (e.g., titrators), and any custom components. The documentation and specifications are part of the validation protocol.


Fig7 Liscouski ConsidAutoLabProc21.png

Figure 7. Defining dependencies and qualification of equipment

The prototyping function is an important part of the overall process. It is rare that someone will look at a project and come up with a working solution on the first pass. There are always tinkering and modifications as you move from a blank slate to a working system. You make notes along the way about what should be done differently in the final product, and about places where improvements or adjustments are needed. These all become input to the system design specification that will be reviewed and approved by the customer and supplier. The prototype can be considered a proof of concept, or a demonstration of what will occur in the finished product. Remember also that prototypes do not have to be validated, since they aren’t used in a production environment; they are simply a test bed used prior to the development of a production system.

The component design specifications are the refined requirements for elements that will be used in the final design. Those refinements could point to updated models of components or equipment used, modifications needed, or recommendations for products with capabilities other than those used in the prototype.

The boxes on the left side of Figure 5 are documents that go into increasing depth as the system is designed and specified. The details in those items will vary with the extent of the project. The right side of the diagram is a series of increasingly sophisticated tests and evaluations against the corresponding steps on the left side, culminating in the final demonstration that the system works, has been validated, and is accepted by the customer. Acceptance also means that lab and support personnel have been educated in their roles.

Conclusions (so far)

“Laboratory automation” has to give way to “laboratory automation engineering.” From the initial need to the completion of the validation process, we have to plan, design, and implement successful systems on a routine basis. Just as the manufacturing industries transitioned from cottage industries to production lines and then to integrated production-information systems, the execution of laboratory science has to tread a similar path if the demands for laboratory results are going to be met in a financially responsible manner. The science is fundamental; however, we need to pay attention now to efficient execution.

Abbreviations, acronyms, and initialisms

AI: Artificial intelligence

AuI: Augmented intelligence

DES: Discrete-events simulation

ELN: Electronic laboratory notebook

EPA: Environmental Protection Agency

FDA: Food and Drug Administration

FRB: Fast radio bursts

GALP: Good automated laboratory practices

GAMP: Good automated manufacturing practice

ISO: International Organization for Standardization

LES: Laboratory execution system

LIMS: Laboratory information management system

ML: Machine learning

ROI: Return on investment

SDMS: Scientific data management system

TAT: Turn-around time


Footnotes

  1. The term "scientific manufacturing" was first mentioned to the author by Mr. Alberto Correia, then of Cambridge Biomedical, Boston, MA.
  2. Intelligent enterprise technologies referenced in the report include robotic process automation, machine learning, artificial intelligence, the internet of things, predictive analysis, and cognitive computing.
  3. Doug Engelbart founded the field of human-computer interaction and is credited with the invention of the computer mouse and with the “Mother of All Demos” in 1968.
  4. See Metropolis (1927 film) on Wikipedia.
  5. See for example https://www.projectmanager.com/project-planning; the simplest thing to do is put “project planning” into a search engine and browse the results for something interesting.
  6. See for example https://theinformationdrivenlaboratory.wordpress.com/category/resources/; note that any references to the ILA should be ignored as the original site is gone, with the domain name perhaps having been leased by another organization that has no affiliation with the original Institute for Laboratory Automation.
  7. As a starting point, view the Artel, Inc. site as one source. Also, John Bradshaw gave an informative presentation on “The Importance of Liquid Handling Details and Their Impact on your Assays” at the 2012 European Lab Automation Conference, Hamburg, Germany.

About the author

Initially educated as a chemist, author Joe Liscouski (joe dot liscouski at gmail dot com) is an experienced laboratory automation/computing professional with over forty years of experience in the field, including the design and development of automation systems (both custom and commercial systems), LIMS, robotics and data interchange standards. He also consults on the use of computing in laboratory work. He has held symposia on validation and presented technical material and short courses on laboratory automation and computing in the U.S., Europe, and Japan. He has worked/consulted in pharmaceutical, biotech, polymer, medical, and government laboratories. His current work centers on working with companies to establish planning programs for lab systems, developing effective support groups, and helping people with the application of automation and information technologies in research and quality control environments.

References

  1. Frey, C.B.; Osborne, M.A. (17 September 2013). "The Future of Employment: How Susceptible Are Jobs to Computerisation?" (PDF). Oxford Martin School, University of Oxford. https://www.oxfordmartin.ox.ac.uk/downloads/academic/The_Future_of_Employment.pdf. Retrieved 04 February 2021. 
  2. Hsu, J. (24 September 2018). "Is it aliens? Scientists detect more mysterious radio signals from distant galaxy". NBC News MACH. https://www.nbcnews.com/mach/science/it-aliens-scientists-detect-more-mysterious-radio-signals-distant-galaxy-ncna912586. Retrieved 04 February 2021. 
  3. Timmer, J. (18 July 2018). "AI plus a chemistry robot finds all the reactions that will work". Ars Technica. https://arstechnica.com/science/2018/07/ai-plus-a-chemistry-robot-finds-all-the-reactions-that-will-work/5/. Retrieved 04 February 2021. 
  4. "HelixAI - Voice Powered Digital Laboratory Assistants for Scientific Laboratories". HelixAI. http://www.askhelix.io/. Retrieved 04 February 2021. 
  5. PharmaIQ News (20 August 2018). "Automation, IoT and the future of smarter research environments". PharmaIQ. https://www.pharma-iq.com/pre-clinical-discovery-and-development/news/automation-iot-and-the-future-of-smarter-research-environments. Retrieved 04 February 2021. 
  6. PharmaIQ (14 November 2017). "The Future of Drug Discovery: AI 2020". PharmaIQ. https://www.pharma-iq.com/pre-clinical-discovery-and-development/whitepapers/the-future-of-drug-discovery-ai-2020. Retrieved 04 February 2021. 
  7. Rossetto, L. (2018). "Fight the Dour". Wired (October): 826–7. https://www.magzter.com/stories/Science/WIRED/Fight-The-Dour. 
  8. "OPUS Package: SEARCH & IDENT". Bruker Corporation. https://www.bruker.com/en/products-and-solutions/infrared-and-raman/opus-spectroscopy-software/search-identify.html. Retrieved 04 February 2021. 
  9. Bourne, D. (2013). "My Boss the Robot". Scientific American 308 (5): 38–41. doi:10.1038/scientificamerican0513-38. PMID 23627215. 
  10. SalesForce Research (2017). "Second Annual State of IT" (PDF). SalesForce. https://a.sfdcstatic.com/content/dam/www/ocms/assets/pdf/misc/2017-state-of-it-report-salesforce.pdf. Retrieved 04 February 2021. 
  11. "Seal Analytical - Products". Seal Analytical. https://seal-analytical.com/Products/tabid/55/language/en-US/Default.aspx. Retrieved 04 February 2021. 
  12. "Bran+Luebbe". SPX FLOW, Inc. https://www.spxflow.com/bran-luebbe/. Retrieved 04 February 2021. 
  13. "Array Tape Advanced Consumable". Douglas Scientific. https://www.douglasscientific.com/Products/ArrayTape.aspx. Retrieved 04 February 2021. 
  14. "Agilent 1200 Series Standard and Preparative Autosamplers - User Manual" (PDF). Agilent Technologies. November 2008. https://www.agilent.com/cs/library/usermanuals/Public/G1329-90012_StandPrepSamplers_ebook.pdf. Retrieved 04 February 2021. 
  15. "iPRO Interface - Products". Baytek International, Inc. https://www.baytekinternational.com/products/ipro-interface/89-products. Retrieved 05 February 2021. 
  16. Joyce, J. (2018). "Computer Modeling and Simulation". Lab Manager (9): 32–35. https://www.labmanager.com/laboratory-technology/computer-modeling-and-simulation-1826. 
  17. Costigliola, A.; Ataíde, F.A.P.; Vieira, S.M. et al. (2017). "Simulation Model of a Quality Control Laboratory in Pharmaceutical Industry". IFAC-PapersOnLine 50 (1): 9014-9019. doi:10.1016/j.ifacol.2017.08.1582. 
  18. Meng, L.; Liu, R.; Essick, C. et al. (2013). "Improving Medical Laboratory Operations via Discrete-event Simulation". Proceedings of the 2013 INFORMS Healthcare Conference. https://www.researchgate.net/publication/263238201_Improving_Medical_Laboratory_Operations_via_Discrete-event_Simulation. 
  19. "Application of discrete-event simulation in health care clinics: A survey". Journal of the Operational Research Society 50: 109–23. 1999. doi:10.1057/palgrave.jors.2600669.