Data integrity is the foundation of a quality system in the pharmaceutical industry and beyond. With the introduction of automatic data collection and storage systems, the link between corporate information systems (basic and complex IT processes and production-support systems) and quality procedures has become closer. These activities may be mistakenly perceived as adding complexity, operations and, ultimately, cost. In reality, the integrity of information is a genuine tool for accelerating the business: it is a system of continuous improvement, a management element that gives a detailed understanding of the process. Not least, it is an extremely powerful element of competitiveness. The data collected at the various points of the process, covering all the phases of purchasing, production, analysis, processing and marketing, becomes increasingly relevant and, in aggregate, takes the name of Big Data.

Data Integrity

Data Integrity means the protection of data in any aggregated form. According to the Data Integrity approach, the data in a process must be “ALCOA”, i.e. meet five general requirements:

A – Attributable to the person generating the data

L – Legible and permanent

C – Contemporaneous

O – Original record (or “true copy”)

A – Accurate

The “ALCOA” requirements are extended with four further “CCEA” requirements:

C – Complete (everything generated on the single lot, for example)

C – Consistent with everything collected and throughout its lifecycle

E – Enduring, i.e. long-lasting (saved on a resistant, reliable and safe medium)

A – Available, i.e. accessible for consultation during the entire life cycle of the data.
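The requirements above can be expressed as a simple completeness check on a data record. The sketch below is purely illustrative: the field names and the dict-based record are assumptions, not a real eBR schema, and each field is mapped to the ALCOA requirement it supports.

```python
from datetime import datetime, timezone

# Illustrative mapping of record fields to ALCOA requirements;
# a real system would map these onto its own validated schema.
ALCOA_FIELDS = {
    "operator_id": "Attributable: who generated the data",
    "value":       "Legible: the recorded value itself",
    "recorded_at": "Contemporaneous: timestamp at the moment of recording",
    "source":      "Original: reference to the original record or true copy",
    "checksum":    "Accurate: integrity check on the stored value",
}

def alcoa_check(record: dict) -> list:
    """Return the ALCOA fields that are missing or empty in `record`."""
    return [field for field in ALCOA_FIELDS if not record.get(field)]

record = {
    "operator_id": "op-042",
    "value": "pH 7.1",
    "recorded_at": datetime.now(timezone.utc).isoformat(),
    "source": "batch-2024-0113/step-4",
    "checksum": "a3f1c0de",  # placeholder value
}
print(alcoa_check(record))  # an empty list means all five requirements are covered
```

A real implementation would also enforce the CCEA requirements, e.g. by storing records on durable, access-controlled media and keeping them consistent across their lifecycle.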

Electronic Batch Record (eBR)

The Batch Record, in compliance with Current Good Manufacturing Practice (CGMP), is the document that traces every single phase and operation of the production process for each batch. The current scenario often relies on a certified paper copy of a Master Batch Record, printed and signed by the responsible figures. Data capture is performed manually, with responsibility resting on the compiler, and any form of data control is possible only after the fact. The introduction of the electronic Batch Record allows the certified copy of the Master Batch Record to be handled as a digital document with a digital signature, and gives immediate access to the compiled data, both atomic and aggregated. The validation and traceability of the data are guaranteed by electronic signatures that confirm each operation and electronically identify the operator.
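The core idea of binding an operation to an operator and a timestamp can be sketched with a keyed hash. This is a minimal illustration, not how any particular eBR product works: real systems use certificate-based digital signatures and validated audit trails, and all names here are assumptions.

```python
import hashlib
import hmac
import json
from datetime import datetime, timezone

def sign_step(step_data: dict, operator_id: str, secret_key: bytes) -> dict:
    """Attach an electronic signature to one batch-record step.

    The signature binds the step content, the operator and the timestamp,
    so later tampering with any of them invalidates the record.
    """
    payload = {
        "step": step_data,
        "operator": operator_id,
        "signed_at": datetime.now(timezone.utc).isoformat(),
    }
    message = json.dumps(payload, sort_keys=True).encode()
    payload["signature"] = hmac.new(secret_key, message, hashlib.sha256).hexdigest()
    return payload

def verify_step(signed: dict, secret_key: bytes) -> bool:
    """Recompute the signature over the payload and compare."""
    payload = {k: v for k, v in signed.items() if k != "signature"}
    message = json.dumps(payload, sort_keys=True).encode()
    expected = hmac.new(secret_key, message, hashlib.sha256).hexdigest()
    return hmac.compare_digest(signed["signature"], expected)

key = b"demo-key"  # illustrative only
rec = sign_step({"phase": "granulation", "result": "pass"}, "op-042", key)
assert verify_step(rec, key)
rec["step"]["result"] = "fail"   # any tampering with the content...
assert not verify_step(rec, key)  # ...invalidates the signature
```

The same mechanism supports the ALCOA requirements: the record is attributable (operator), contemporaneous (timestamp) and accurate (any alteration is detectable).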


Overall Equipment Effectiveness (OEE)

The OEE (Overall Equipment Effectiveness), or “overall efficiency of the plant”, is a percentage indicator that represents the global performance of a productive resource, or a set of resources, whether human or technical, during the time when these are available to produce. Typically, a production system that has never faced an efficiency-improvement project stands at OEE values no higher than 50-60%. The best producers, however, reach and maintain over time an OEE of 85%, considered a world-class objective. Calculating the OEE does not automatically improve productivity: it must be combined with a detailed and accurate analysis of the reasons for reduced productivity. To reach the 85% “world class” level, not only good technical management of resources is needed, but also, and above all, excellent organizational management.
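The text does not spell out the calculation, but OEE is conventionally the product of three factors: Availability, Performance and Quality. A minimal sketch, with illustrative numbers:

```python
def oee(planned_time, run_time, ideal_cycle_time, total_count, good_count):
    """Compute OEE as Availability x Performance x Quality.

    planned_time:     planned production time (minutes)
    run_time:         actual running time (minutes)
    ideal_cycle_time: ideal time to produce one unit (minutes)
    total_count:      total units produced
    good_count:       units passing quality control
    """
    availability = run_time / planned_time
    performance = (ideal_cycle_time * total_count) / run_time
    quality = good_count / total_count
    return availability * performance * quality

# Example: 480 min planned, 400 min actually running,
# 0.5 min/unit ideal cycle, 700 units produced, 665 good
score = oee(480, 400, 0.5, 700, 665)
print(f"OEE = {score:.1%}")  # roughly 69%, within the typical 50-60%+ range
```

Because OEE multiplies three ratios, a modest loss in each factor compounds: 83% availability, 88% performance and 95% quality already pull the overall figure down to about 69%, well below the 85% world-class target.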

Data governance

Transforming the mass of data held in information systems into the “unique and true” knowledge of what has happened, and is happening, in a company is a process that absorbs significant human, economic and technological resources. From this observation the concept of Data Governance was born. It can be defined as the set of activities aimed at managing people, processes, methodologies and information technologies so as to achieve the constant and correct treatment of all data that matter to an organization. It is not a technology, but a set of strategies, processes and rules to be defined upstream of the use of the data, in order to exercise effective control over the processes and methods used, prevent errors, and suggest the interventions needed to resolve the problems created by poor-quality data.