This blog will give you an introduction to the basics of ISO 17025 and some of the requirements a LIMS can help with. What is ISO 17025? ISO 17025 is a laboratory accreditation standard that testing and calibration laboratories pursue in order to demonstrate their technical competence in performing an array of testing, […]
February 11, 2020 - Tips on Implementing, Maintaining and Validating Empower 3 Software Custom Fields to Maximize Data Integrity
Waters Empower 3 software is a chromatography data system (CDS) that links to chromatographic instruments to help manage chromatography test results through data acquisition, processing, reporting and distribution. Waters Empower 3 generates large volumes of data, supports complex calculations and detailed reports, and includes several tools that address compliance and data integrity concerns, such as electronic signatures and audit trails.
Given that Waters Empower 3 typically generates large amounts of data, transcribing the data from Empower 3 to an external system can present huge risks from a data integrity perspective. As a result, scientific laboratories are looking for ways to maximize their results output from Empower and ideally utilize this system as their only solution for chromatographic results.
February 11, 2020 - MELLODDY Consortium Employs Federated Learning and Blockchain to Enhance AI Drug Discovery
A new consortium of pharmaceutical, technology and academic partners is hoping to improve collaboration and data sharing among competing pharmaceutical companies in ways that address IP concerns. Utilizing blockchain and federated learning technologies, the MELLODDY (Machine Learning Ledger Orchestration for Drug Discovery) Consortium aims to apply deep learning methods to the chemical libraries of 10 pharma companies to create a modeling platform that can more quickly and accurately predict promising compounds for development, all without sacrificing the data privacy of the participating companies.
Over the last few years, pharma and biotech companies have been adopting artificial intelligence (AI) and big data approaches, particularly machine learning and deep learning, in hopes that these technologies will change the way the industry discovers, develops, and manufactures medicines. Advances in computational power have opened up the possibility of creating digital models utilizing machine learning that can dramatically accelerate innovation and yield inexpensive predictions from expensive data.
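MELLODDY's actual platform combines deep learning with blockchain-based orchestration and is far more sophisticated, but the core federated-learning idea can be sketched in a few lines: each participant trains on its own private data, and only model weights, never raw data, leave each site to be averaged by a central coordinator. The three simulated "companies", the simple linear model, and all numbers below are illustrative assumptions, not MELLODDY's implementation.

```python
import numpy as np

def local_update(weights, X, y, lr=0.1, epochs=5):
    """One participant trains on its private data; only the weights leave the site."""
    w = weights.copy()
    for _ in range(epochs):
        grad = X.T @ (X @ w - y) / len(y)  # least-squares gradient
        w -= lr * grad
    return w

def federated_round(weights, datasets):
    """The coordinator averages locally trained weights (FedAvg), weighted by dataset size."""
    sizes = [len(y) for _, y in datasets]
    updates = [local_update(weights, X, y) for X, y in datasets]
    return np.average(updates, axis=0, weights=sizes)

# Three simulated "companies", each holding private data drawn from the same true model.
rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])
datasets = []
for _ in range(3):
    X = rng.normal(size=(50, 2))
    datasets.append((X, X @ true_w + rng.normal(scale=0.01, size=50)))

w = np.zeros(2)
for _ in range(100):
    w = federated_round(w, datasets)
# w now approximates true_w, even though no raw data was ever pooled.
```

The design choice that matters here is that `local_update` is the only function that ever touches raw data; everything the coordinator sees is an aggregate.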
Modern Life Science laboratories typically utilize many different applications and legacy systems to manage their workflows and data, but unfortunately, there is often very little integration between these systems. As such, managing all the IT dependencies in these modern labs is a significant undertaking.
Today’s Life Science IT professionals need to be both knowledgeable across scientific domains and expert in informatics. The reality is that creating the necessary integrated laboratory ecosystem, and then supporting, maintaining and extending it, is such a significant undertaking that only large companies with sufficient budget and resources have been able to attempt it.
With the rate of data generation increasing exponentially in Life Science R&D laboratories, there has never been a time when laboratories are so tightly coupled to informatics systems and scientists so dependent on IT. In this challenging environment, IT professionals often get bogged down in supporting current infrastructure instead of developing and maintaining strategic roadmaps to be ready for future demand. In order to future-proof your lab, a third-party informatics vendor with industry experience and expertise in laboratory informatics systems can be a valuable partner to your organization by providing Laboratory Informatics as a Service (LIaaS).
As part of the U.S. Department of Health and Human Services, the National Institutes of Health (NIH) is our nation’s medical research agency and strives to make important discoveries that improve health and save lives. Founded in 1887, the NIH conducts its own scientific research through its Intramural Research Program (IRP), which supports approximately 1,200 principal investigators and more than 4,000 postdoctoral fellows conducting basic, translational and clinical research. The NIH also provides biomedical research funding for non-NIH research facilities via its Extramural Research Program, investing nearly $39.2 billion in external medical research in 2019.
In a study conducted by the Nature Index in 2019, the NIH ranked #2 in the world in terms of published scientific papers, behind only Harvard University. In this blog, we will highlight recent ground-breaking research that has been conducted through the NIH IRP, while also providing links to upcoming NIH events designed to keep researchers abreast of the latest discoveries.
Ensuring regulatory compliance, reducing errors, streamlining workflows and enabling growth through the introduction, consolidation and integration of IT systems will ultimately pay dividends for the business and help achieve a competitive advantage. However, for many reasons, including a lack of resources, the absence of a formal business plan and the difficulty of articulating the potential business benefits, laboratories often struggle to drive the introduction and modernization of IT solutions.
Autoscribe Informatics works with companies worldwide to improve business efficiency. Our Matrix Gemini Laboratory Information Management System (LIMS) drives consolidation, improves data integrity and increases laboratory efficiency. Laboratory process automation and workflow management can reduce transcription errors, improve turnaround times and increase resource utilization. These are just a few of the business benefits associated with adopting a LIMS.
OnQ Software is pleased to announce the release of our new off-the-shelf integration with WaterOutlook, an operational data management system, combining lab results with data from SCADA systems and field staff.
The addition to our integrations catalogue allows customers to maximise the use and benefits of both LIMS and WaterOutlook by allowing data to transfer securely and easily between the two. They may then use the data to make better decisions while having the peace of mind that the integration is fully supported.
Traditional on-premise informatics systems in scientific laboratories are often accompanied by significant operational costs – securing the data, applying patches, providing backup and disaster recovery, hardware maintenance, etc. Public, multi-tenant cloud-based systems accessed “as a service” share infrastructure across several customers and deliver value by managing those systems with shared resources and procedures that drive efficiency. For customers, costs are accrued as a monthly operating expense as opposed to a capital-intensive purchase that requires months of planning.
While the public cloud software as a service (SaaS) model is well established, there are other cloud computing models.
Digital transformation uses digital technologies throughout an organization to fundamentally improve or change how businesses operate and provide value to their customers. It may encompass multiple systems and technologies, including robotics, workflow automation, advanced data analytics, artificial intelligence (AI) and cloud technologies. While this may require the implementation of new systems, there is great potential for improvement in applying the same ideas to existing systems. Digital transformation is a natural extension of the so-called fourth industrial revolution, and as such requires seamless multi-directional flows of data and information from every facet of the business. This creates a data ecosystem characterized by complex relationships. Done correctly, this has the potential to link tacit data (data known only to a single person) and group or tribal data (data known to a limited number of people, often within a single team) to help create and protect that most valuable resource – corporate knowledge. Nowhere are these ideas more applicable than in the laboratory environment.
The human construct of time provides us with a frame of reference for discussing a sequence of events. It helps our consciousness make sense of the physical world around us. In some schools of thought, the future doesn’t even exist. As scientists, engineers, and laboratory informatics professionals, however, it is our job to plan for the future, if not to chart a path for getting there.
In the run-up to New Year’s Day, we saw many retrospective takes on what changes the last decade brought. In this blog post, we’ll take a look at some possible advances that the coming decade might bring to the world of laboratory informatics. In almost all of the writing that has already been done about the Lab of the Future, data has a central role. As informatics consultants, we can’t disagree with this assessment.
Since 1995, Astrix Technology Group has been serving the scientific community by helping our customers improve laboratory business processes, source the right talent, and harmonize quality processes by employing the right technology. Our core mission is to build value and trust with the clients we serve, and ultimately help our customers do the important scientific work that improves the world we live in.
2019 was a really great year for Astrix. Over this past year, we doubled our practice servicing federal agencies, reached an important milestone of 750 total employees, and grew by an amazing 34%. Along the way, we added new services, conducted industry surveys, produced helpful white papers and blog posts, sponsored and gave presentations at several important industry conferences, and provided a number of information-packed webinars to support the scientific community. Let’s take a closer look at some of our key contributions and achievements which helped make 2019 such an exciting year for the Astrix family.
So far in our LIMS master data best practices series, we have discussed how to define master data and create a Master Data Plan, how to effectively extrapolate master data from current records to configure your system, and how to configure your master data so it will be easy to maintain and scale as your organization grows and the system matures.
The Master Data Plan, along with the other documents we have discussed in previous blogs in this series, is part of an overall quality control process.
In today’s global economy, mergers and acquisitions have become a dominant strategy to improve profitability, maintain competitive edge, and expand services and reach. This practice is common in several industries, such as pharmaceutical, biotech, food and beverage, and oil and gas. While corporate mergers certainly can provide several benefits for the organizations involved, they can also present significant challenges, not the least of which is harmonization and optimization of the laboratory environment. This often leads to the need to integrate multiple LIMS applications to support a global enterprise.
Scientific organizations that have recently undergone a merger, and oftentimes even those that have not, are frequently in the situation where different labs in different locations are using different LIMS technologies and solutions. This scenario inhibits process efficiency, cross-organization data reporting and regulatory compliance, and can result in high IT demand.
Given the advanced capabilities of modern LIMS, and the competitive advantages gained through establishing digital continuity across the product lifecycle, there is a strong incentive for modern scientific organizations with disparate LIMS to harmonize their laboratory environment by integrating the multiple LIMS into a single system. In this blog, we will discuss best practices for a project of this nature.
People have been asking, “What is a LIMS?,” practically since LIMS was invented. (We’ve asked it ourselves!) In 1998, Dr. Alan McLelland of the Institute of Biochemistry, Royal Infirmary, Glasgow, penned his famous—at least in informatics circles—essay offering four viewpoints on this question; those of the analytical staff, the laboratory manager, the IT group, and the finance team.
Whether clinical, analytical or research, all laboratories operate under time constraints: information must be provided to a customer, client or management, and business decisions are then made based on that information. Making such information available instantly is nearly impossible without a LIMS; traditional methods of record-keeping using Excel or paper-based files make it very difficult to source the required data.
The pace of change in the pharmaceutical and biotech industries means that most companies eventually must face the daunting task of relocating their laboratories. Whether due to mergers, acquisitions, funding changes or simply organic growth, a laboratory relocation is an extraordinarily complex undertaking that will impact your laboratory’s scientists, research and business goals.
A laboratory relocation involves moving high-end analytical instrumentation, hazardous materials, products and samples, and sometimes even live animals. It will require shutting down laboratory equipment, along with safely packing and shipping instruments, samples, materials, devices, computer hardware and potentially data to the new location. Once all necessary items have been moved to the new location, they will need to be unpacked, and equipment will need to be requalified and/or validated.
Whether you are moving your lab across the hall, street or country, a laboratory relocation is never a routine exercise. The reality is that no two laboratories are alike – each will have a set of unique challenges that will need to be addressed with care. In this blog, we will discuss critical best practices that should be followed in all cases in order to make sure your laboratory relocation is a smooth, safe and efficient process that minimizes downtime and disruption for your business.
Master data design has very important impacts over the lifecycle of a LIMS, as nearly every piece of functionality in the system revolves around the design of the master data. One of the most important aspects to any LIMS implementation is designing the master data so that it is easy to maintain and scale as the organization grows and business needs change. Some of the key benefits of configuring your master data to be maintainable and scalable include:
- Easier to add and/or modify master data down the road
- Increased system efficiency and reliability
- Future system enhancements are less resource intensive
- Better management for large volumes of data
- Increased user acceptance
- Increased ROI
In short, focusing on maintainability and scalability when configuring your master data really helps improve the lifespan and usability of your LIMS. In this blog, we will provide some best practice tips on how to set up master data so it will be easy to maintain and scale as your organization grows and the system matures.
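LIMS master data is normally configured inside the LIMS itself rather than in code, but the maintainability principle above can be illustrated with a small sketch: specifications are kept as data records that shared logic looks up, so adding a product or test later means adding a record rather than modifying code. All names below (`TestDefinition`, `PROD-001`, the limits) are hypothetical, not taken from any particular LIMS.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class TestDefinition:
    code: str
    name: str
    unit: str
    low_limit: float
    high_limit: float

# Master data kept as records, not code: adding a test or product later
# means adding an entry here, not changing the evaluation logic.
TESTS = {
    "PH": TestDefinition("PH", "pH", "pH units", 6.5, 7.5),
    "ASSAY": TestDefinition("ASSAY", "Assay", "%", 98.0, 102.0),
}

PRODUCT_SPECS = {
    "PROD-001": ["PH", "ASSAY"],
}

def evaluate(product, results):
    """Return pass/fail per test by looking up the shared master data."""
    outcome = {}
    for code in PRODUCT_SPECS[product]:
        t = TESTS[code]
        outcome[code] = t.low_limit <= results[code] <= t.high_limit
    return outcome

print(evaluate("PROD-001", {"PH": 7.0, "ASSAY": 97.5}))
# {'PH': True, 'ASSAY': False}
```

The same separation applies whether master data lives in Python, in LIMS configuration tables or in a dedicated master data management tool: the evaluation logic stays stable while the records grow.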
Imagine: historic rainfalls flood your production facility and its lab, while knocking out power for miles around. Your analytical equipment is ruined. Your server is under water. Or, wildfires ignite suddenly after a lightning strike or car accident. The fires spread rapidly. Lab personnel have just enough time to escape with what they can carry. Your facility is burned to the ground. These are just two scenarios that have occurred in the last couple of years. And all of us have experienced a network failure or the dreaded blue screen of death at some point. System downtime is going to happen. You may even have a total loss on your hands in the future. How you plan for it is the key.
Do you know how to protect your LIMS or ELN in case of a catastrophic failure? It’s important to have a detailed plan in place in case a disaster happens.
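As a concrete illustration of one small piece of such a plan, here is a minimal sketch of an offsite backup step that verifies each copy by checksum. The file names are placeholders, and a real disaster recovery plan would also cover scheduling, retention, geographic separation and, critically, regularly tested restores.

```python
import hashlib
import shutil
import tempfile
from pathlib import Path

def backup_with_verify(source: Path, dest_dir: Path) -> Path:
    """Copy an export offsite and verify the copy by checksum before trusting it."""
    dest_dir.mkdir(parents=True, exist_ok=True)
    dest = dest_dir / source.name
    shutil.copy2(source, dest)
    if (hashlib.sha256(source.read_bytes()).hexdigest()
            != hashlib.sha256(dest.read_bytes()).hexdigest()):
        raise RuntimeError(f"backup of {source} failed checksum verification")
    return dest

# Demo with a throwaway file standing in for a LIMS database export.
with tempfile.TemporaryDirectory() as tmp:
    src = Path(tmp) / "lims_export.db"
    src.write_bytes(b"sample data")
    copy = backup_with_verify(src, Path(tmp) / "offsite")
    print(copy.read_bytes() == src.read_bytes())  # True
```

The point of the checksum is that a backup that has never been verified (or restored) offers only the illusion of protection.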
Due to technological advances in laboratory instruments and higher throughput processes, data volumes in modern analytical laboratories have increased dramatically over the last several decades. While this increased data volume presents the opportunity to improve innovation and enable timely and effective business decisions, it also presents significant data management and processing challenges. In order to meet the challenge of turning this data into knowledge, laboratories are looking to automate and integrate laboratory operations and processes as much as possible in order to provide digital continuity throughout the product lifecycle.
Integrating laboratory instruments with Laboratory Information Management Systems (LIMS) is one of the best ways to automate laboratory processes. Instruments that are commonly integrated with LIMS in laboratories include:
- Particle Counters
- DNA Sequencers
- AA Analyzers
Unfortunately, many LIMS implementation projects either run out of time or lose momentum before they are able to accomplish their initial instrument integration goals. In this blog, we will discuss both the benefits of instrument integration and best practices that help ensure instruments are integrated effectively during a LIMS implementation.
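Integration details vary by instrument and LIMS vendor, but a common pattern is parsing a flat result export from the instrument into per-sample records before loading them through the LIMS interface, eliminating manual transcription. The CSV layout, column names and sample IDs below are hypothetical stand-ins for a real instrument export.

```python
import csv
import io

# Hypothetical instrument export: one row per sample/analyte measurement.
RAW_EXPORT = """sample_id,analyte,value,unit
S-1001,Pb,0.012,mg/L
S-1001,Cd,0.003,mg/L
S-1002,Pb,0.020,mg/L
"""

def parse_export(text):
    """Turn the flat instrument file into per-sample result records,
    ready to load through a LIMS interface (endpoint details are vendor-specific)."""
    results = {}
    for row in csv.DictReader(io.StringIO(text)):
        results.setdefault(row["sample_id"], {})[row["analyte"]] = {
            "value": float(row["value"]),
            "unit": row["unit"],
        }
    return results

parsed = parse_export(RAW_EXPORT)
print(parsed["S-1001"]["Pb"]["value"])  # 0.012
```

Grouping results by sample ID up front mirrors how most LIMS store results, which keeps the final loading step a straightforward mapping rather than a second parsing exercise.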