IT in the Lab: The Instrument Interface... Revisited

Source: Galactic Industries Corp.

By James Duckworth

As the first author to write in this space after Gordon Logan, I realize that I have a high standard to uphold in terms of the insight and relevance he brought to his chosen topics. While I don't have the same bird's-eye business perspective as Gordon, there was one issue that ran as a theme throughout many of his columns and on which I can offer some expertise: analytical instrumentation.

Looked at from a certain angle, the primary output product of a lab is data. Generating data is the first step in the analytical decision-making and learning process that forms the basis of what we call research. Vast amounts of money are being spent on research infrastructures that generate data, and companies are pushing instrumentation and lab automation vendors to deliver hardware capable of higher throughput to shorten the product development cycle, generating even more data.

With instruments that allow a single person to generate vast quantities of data, it is clear that companies engaged in scientific research need to consider IT a core part of their infrastructure and not simply a support function. Advances have been made on this front as research-oriented companies have invested in "-informatics" systems (insert "chem," "bio," or your own favorite prefix) to bring the flow and storage of data under control and make it accessible to the groups that need it. Significant progress has been made in areas such as chemical synthesis, genomics, and high-throughput screening (HTS). In the absence of industry standards for data handling and systems integration, many of these advances have been made by Independent Software Vendors (ISVs), just as Gordon predicted in July 1999 (The More Things Change...).

However, there is still a major gap in IT support when it comes to analytical instrumentation. The data generated by these instruments is often key to the decision-making process in research. For example, LC-MS instruments are now ubiquitous throughout pharmaceutical development for rapid synthesis confirmation. Yet once an interesting compound has been entered in the corporate registration system, the instrument data files are effectively locked away on a backup CD or tape, or worse, simply left on the workstation. One can only hope the data can be found again years later, when the drug enters clinical trials and an FDA regulator asks to see the supporting evidence, and that the instrument software needed to read the files is still around.

Developing proprietary software for a vertical market such as analytical instrumentation is time-consuming and thus expensive. For manufacturers, adding features beyond what customers require to run analyses doesn't necessarily sell more instruments (they have shareholders and need to show a profit too). Through their purchase decisions, users have been telling analytical instrumentation vendors that throughput, sensitivity, and suitability to task are what really matter; software compatibility, data portability, automation interfacing, and system integration always seem to land lower on the checklist. But the fact remains that the software is the primary interface to a piece of hardware that will be in the lab for a 5- to 8-year life cycle, and it alone determines what can ultimately be done with the instrument and its data.

There have been past efforts to define standardized data formats for instrumentation, sponsored by industry consortia and standards bodies such as the International Union of Pure and Applied Chemistry (IUPAC) and the American Society for Testing and Materials (ASTM). However, while some vendors have implemented these formats, often as an afterthought, the standards have generally failed to make an impact on the industry overall. The primary reasons are consumer apathy and the inability of such standards to keep pace with the rapid development of new instrument technologies.
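To make the format question concrete, consider IUPAC's JCAMP-DX, one of the better-known of these efforts: a plain-text format built from labeled data records of the form ##LABEL= value. What follows is a minimal sketch of reading the header records from such a file; the file name and the specific records retrieved are illustrative assumptions, not part of any vendor's software.

    # Minimal sketch: reading the header of a JCAMP-DX file.
    # JCAMP-DX files consist of labeled data records, ##LABEL= value,
    # followed by a numeric data table. The file name is hypothetical.

    def read_jcamp_header(path):
        """Collect ##LABEL= value records up to the start of the data table."""
        header = {}
        with open(path, "r") as f:
            for line in f:
                line = line.strip()
                if not line.startswith("##"):
                    continue  # comments and continuation lines
                label, _, value = line[2:].partition("=")
                label = label.strip().upper()
                if label in ("XYDATA", "PEAK TABLE"):
                    break  # the numeric data table follows; header is done
                header[label] = value.strip()
        return header

    if __name__ == "__main__":
        hdr = read_jcamp_header("sample_spectrum.jdx")  # hypothetical file
        print(hdr.get("TITLE"), hdr.get("DATA TYPE"))

Even this toy reader shows why such formats appeal to users: the metadata survives in human-readable form long after the original workstation is gone.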

While this may seem like an endless morass with no solution in sight, I don't believe that to be true. More recently, some interesting trends have caught my attention. With the Wintel PC now the dominant platform for instrument workstations, most vendors have effectively abandoned other computer platforms for their software development efforts (with some obvious exceptions, e.g., UNIX remains the major OS on NMR instrument workstations). As a consequence, they are starting to adopt modular software development methodologies, which let them divide the development workload. This has an interesting side benefit: vendors can supply the same modules to their users for building their own software. Rather than exert extra effort supporting an ill-defined standard, manufacturers are letting customers integrate these software components directly into their in-house data management solutions, as sketched below. A number of third-party and consulting companies are now in this new business of "instrument integration," partly because manufacturers have opened the doors to their software.
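As an illustration only, here is a hedged sketch of what such component-based integration might look like on a Wintel workstation: a vendor-supplied COM automation component driven from a Python script (via the pywin32 package), with a pointer to the raw data file recorded in an in-house database. The ProgID "Acme.LCInstrument," its methods, and the database table are hypothetical stand-ins for whatever a given vendor actually ships.

    # Hedged sketch: driving a hypothetical vendor COM component and
    # logging the result in an in-house database. "Acme.LCInstrument"
    # and its methods are invented; a real vendor component defines
    # its own ProgID and object model.
    import sqlite3
    import win32com.client

    def run_and_record(sample_id, method_file, db_path="lab_results.db"):
        # Instantiate the vendor's automation object by its ProgID (hypothetical).
        instrument = win32com.client.Dispatch("Acme.LCInstrument")
        instrument.LoadMethod(method_file)             # hypothetical call
        result_file = instrument.RunSample(sample_id)  # hypothetical call

        # Record where the raw data file lives, so the result stays
        # findable long after the workstation is retired.
        db = sqlite3.connect(db_path)
        db.execute("CREATE TABLE IF NOT EXISTS runs (sample_id TEXT, data_file TEXT)")
        db.execute("INSERT INTO runs VALUES (?, ?)", (sample_id, result_file))
        db.commit()
        db.close()

The design point is that the customer's script, not the vendor's monolithic application, decides where the data goes.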

There is actually one emerging instrument interface standard that has the backing of various instrument vendors, automation equipment manufacturers, and end users. Developed by the ASTM E49.52 subcommittee (Computerization of Analytical Sciences Data), it is known as the Laboratory Equipment Control Interface Specification, or "LECIS." The standard takes an interesting approach to applying modern software architectures to equipment that is inherently unique: define a common software interface that can "talk" to different pieces of laboratory equipment without really knowing their "language." With a unified instrument software interface, programmers can more easily develop automated laboratory systems that integrate data handling with instrument control across hardware from multiple vendors. The promise of LECIS is the ability to quickly develop new software applications that automate laboratory tasks for new synthesis and analysis protocols and integrate the output into data management systems. This technology is well worth watching, and extensive, up-to-date information is available at http://www.lecis.org.
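To show the underlying idea, not the LECIS protocol itself (which defines its own states, commands, and messages), here is a minimal sketch of a common control interface: every device implements the same operations, and a simple sequencer drives any compliant device without knowing its native command set. All class and method names are invented for illustration.

    # Conceptual sketch of a unified control interface. Each vendor
    # supplies an adapter that translates the common operations into
    # the instrument's native command set (serial, COM, etc.).
    from abc import ABC, abstractmethod

    class LabDevice(ABC):
        """Common interface the controller talks to, regardless of vendor."""
        @abstractmethod
        def initialize(self): ...
        @abstractmethod
        def run(self, protocol: str) -> str: ...
        @abstractmethod
        def shutdown(self): ...

    class AcmeHPLC(LabDevice):
        # Hypothetical vendor adapter for illustration only.
        def initialize(self):
            print("HPLC: pumps primed, column equilibrated")
        def run(self, protocol):
            print(f"HPLC: running {protocol}")
            return "hplc_run_001.raw"
        def shutdown(self):
            print("HPLC: idle")

    def run_sequence(devices, protocol):
        """A trivial task sequencer: the same code drives any compliant device."""
        results = []
        for dev in devices:
            dev.initialize()
            results.append(dev.run(protocol))
            dev.shutdown()
        return results

The payoff is in run_sequence: it never mentions a vendor, which is precisely what lets one data management system orchestrate hardware from many manufacturers.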

While I don't like to mention specific vendors, one has taken a unique approach to developing customer-oriented solutions in this field. In July 1999, Gilson Inc. opened the Center for Integrated Discovery Technology (CIDT) in Rhode Island. The goal of the project is to "contribute to the discovery of new medicines through the creation of integrated laboratory instrument solutions based on partnerships with the pharmaceutical and laboratory instrument industries." Part of their strategy is obviously to sell more equipment, but pulling together key groups of customers and other manufacturers strikes me as the right balance for solving at least part of the integration problem.

Note that both the problem and the proposed solutions stem from software issues, not the instrument hardware. If there is a lesson in all this, it is that any company with a mandate to manage its analytical instruments and data as a knowledge resource will eventually make this discovery: the IT solution to a software incompatibility is usually more software. Ultimately, each piece of equipment will be custom-integrated at some level, and if the value of the instrument and its data to the overall research process is high, integration may require considerable spending. But if those outlays allow an organization to get the next Viagra to market two years ahead of a competitor, all the effort will seem worth it.

About the Author
James Duckworth is currently Vice President of Sales and Marketing at Galactic Industries Corp. (Salem, NH), a commercial supplier of spectroscopy and chromatography software. During his 12 years with Galactic he has worked in a variety of capacities including product design and development, technical support, sales, and marketing.

James is also active in various committees and standards bodies including ASTM E01, The Coblentz Society, and The Society for Applied Spectroscopy. While his formal training is in chemistry, he has publications and presentations to his credit that span many different disciplines including micro-sampling ion chromatography, data processing algorithms and chemometric data analysis, and data handling technologies.

An independent software vendor for over 14 years, Galactic sells products found at companies in nearly every research market sector. In addition, Galactic has OEM and VAR relationships with nearly every major spectroscopy and chromatography instrument vendor. This position gives Galactic a unique perspective on the state of technology in the analytical instrumentation marketplace.

The author can be reached at: Galactic Industries Corp., 395 Main St., Salem, NH 03079. Tel: 603-898-7600 x325. Fax: 603-898-6228. Email: jhd@galactic.com.