Online literature references used for this study, grouped under three headings: About transcription errors, LIMS business case, and Open Source TCO and ROI
About transcription errors
“The use of reliable instrument interfacing with LIMS systems provides a significant route to reducing errors introduced through human mistakes. It is a generally accepted statistic that there is a 3% error rate at each level of transcription, which reduces to 0.5% with checking in a manual-process working culture. By removing the need for transcription, this source of error is eliminated.
Reduced errors in reporting results also add to the cost/benefit analysis: for example, the recall of a product batch could cost $150k per recall, let alone any damage to the company’s image if the recall becomes common knowledge.”
Vishal Rosha, www.limsfinder.com
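The compounding effect behind the figures quoted above can be sketched with a short calculation. This is a hypothetical illustration, not part of the quoted source: it assumes each transcription step independently introduces an error with the stated per-level probability, so the chance a value survives n steps error-free is (1 − p)^n.

```python
def error_free_probability(p: float, levels: int) -> float:
    """Probability that a value survives `levels` independent
    transcription steps unchanged, given per-level error rate p."""
    return (1.0 - p) ** levels

# Quoted per-level rates: 3% unchecked, 0.5% with manual checking.
for p in (0.03, 0.005):
    for levels in (1, 2, 3):
        ok = error_free_probability(p, levels)
        print(f"p={p:.3f}, levels={levels}: "
              f"{ok:.4f} error-free, {1 - ok:.4f} at least one error")
```

Under these assumptions, even the checked 0.5% rate still accumulates across successive transcription steps, which is the argument the quote makes for eliminating transcription entirely.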
“In the real world there are going to be errors. A common estimate is that each level of transcription incurs an error rate of 3–5%. Samples will be incorrectly entered or even missed in the transcription process, resulting in errors, confusion over missing sample results and the need for reruns.”
Author unknown, www.labtronics.com
“Manual transcription of numerical data is prone to error. We quantified the transcription error rate for blood results recorded in a critical care setting by comparing the handwritten and printed laboratory results in 100 consecutive patients in the intensive care unit, Glasgow Royal Infirmary. 954 Sets of results with 4664 individual values were analysed.
There was complete and accurate transcription in 67.6% of cases, a failure to transcribe in 23.6% and inaccurate transcription in 8.8%. Transcription was significantly more accurate in the morning.
This study highlights that our current system of recording blood results is unreliable. These results strengthen the case for computerisation of the patient record in terms of data retrieval and transcription accuracy.”
R. Black, P. Woolman and J. Kinsella, University Department of Anaesthesia, Glasgow Royal Infirmary
“Pathology laboratories had an error rate of up to 26% for analytical results. The worst-performing laboratory had errors (of patient identification or results of analysis) in 46% of requests. The three best-performing laboratories achieved 85% error-free reporting, with one achieving 95%.”
Mounira Khoury, Leslie Burnett and Mark A Mackay, www.mja.com.au
LIMS business case
“Compared to the lab’s pre-automation processes, sample processing efficiency increased 115 percent, analytical productivity increased 53 percent and productivity in handling requisitions increased 111 percent. Meanwhile, test volume increased 18 percent, and yet the number of opportunities for human error decreased by 71 percent.”
Author unknown, www.beckmancoulter.com
“Raw data was maintained in paper logbooks or entered into Microsoft Excel spreadsheets and reports were often generated in Microsoft Word. While this method was sufficient when the laboratory was smaller, as the laboratory grew managing information and trying to obtain trending data was proving to be increasingly time consuming, error prone and complex. Recording and extracting appropriate data for analysis and reporting was becoming difficult at best.
The Gwinnett laboratory now enjoys increased control of their samples, data management, data access, analysis and meeting regulatory requirements. This represents a significant improvement over the previous islands of information that were located in separate logbooks and Excel spreadsheets. With the ability to set permissions, automatically e-mail PDF reports, and integrate with other applications, the LIMS offers a centralized data repository from which final analysis reports can be generated and data exchanged with others in the organization.”
Alan Serrero, Ted Paczek and Christine Paszko, Waterworld - www.pennnet.com
Mark Fish, www.scientificcomputing.com
Ron Kasner, www.pharmtech.com
Randy Perry, Michael Swenson and Brock Reeve. October 2004. www.thermo.com
Author unknown, www.msc-lims.com
Open Source TCO and ROI
“Open Source (OSS/FS) has significant market share in many markets, is often the most reliable software, and in many cases has the best performance. OSS/FS scales, both in problem size and project size. It has far better security, perhaps due to the possibility of worldwide review. Total cost of ownership is often far less than proprietary software, especially as the number of platforms increases. These statements are not merely opinions; these effects can be shown quantitatively, using a wide variety of measures. This doesn’t even consider other issues that are hard to measure, such as freedom from control by a single source and freedom from licensing management.”
David A. Wheeler
Bryon Cottrell and James Dixon
David HM Spector, Bittersweet Labs
“As the past decade has shown, standardization with a proprietary flavour (think Microsoft) has its drawbacks: bloatware, security loopholes, eye-popping license fees and an unsettling reliance upon a single vendor. In offices around the globe, an era of open-source standardization, determined to condemn such drawbacks to history, may be dawning.”
Malcolm Wheatley, www.cio.com