Data Management

Data Management is the central requirement for enabling subsequent processes, with the intent of providing structured data consistently for analysis and reporting. It is a critical phase in clinical research that leads to the generation of high-quality, reliable, and statistically sound data from clinical trials. Adopting innovative data management methodologies helps to ensure data integrity, quality, and accountability in our process. At OrciMed, we have set standards and put governance mechanisms in place to ensure the quality of data with a reduced turnaround time. We provide tailor-made solutions across clinical data management procedures, including Case Report Form (CRF) design, CRF annotation, database design, data entry, data validation, discrepancy management, medical coding, data extraction, and database lock.

Our expert team proactively recognizes pain points, highlights them to customers, and provides a custom-built approach to ensure quality data submission to regulatory authorities. Our goal is to deliver high-quality data with minimal or no missing data and with a level of variation low enough that it does not affect the conclusions of the study on statistical analysis. We provide comprehensive data management solutions with well-designed outsourcing models and experience across all therapeutic areas and all phases of clinical trials. Our strategic focus is on data flow for real-time data cleaning. We comply with 21 CFR Part 11 regulations and have acquired proficiency in various CDM tools such as ORACLE CLINICAL, CLINTRIAL, MACRO, and RAVE, as well as open-source tools such as OpenClinica, OpenCDMS, TrialDB, and PhOSCo. Our team members hold clearly defined roles and responsibilities and perform them effectively and efficiently.

A case report form (CRF) is a specialized document in a clinical trial. The CRF is protocol-driven, robust in content, and designed to collect study-specific data. It can be paper-based or electronic (eCRF). The purpose of the CRF is to facilitate accurate data collection and reporting. Strong collaboration between the statistical programmer and other key stakeholders is imperative, as it has a major impact on accurate CRF design for a clinical study. The CRF is the link between the protocol, the database, and the statistical analysis plan. Database design plays a vital role in clinical research: the database is a structured form with defined rows and columns that captures all the data in the CRF, and its design must be compliant with 21 CFR Part 11. We have highly experienced professionals who can support paper CRF and eCRF design, and we serve as a one-stop shop for any customized database design requirement.
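
As a rough illustration of how a single CRF page might map to a study database table, the sketch below creates a hypothetical vital-signs table in SQLite; the field names, units, and audit columns are assumptions for illustration only, not a prescribed schema.

```python
# A minimal sketch (illustrative only): mapping a hypothetical vital-signs
# CRF page to a relational table. Field names and audit columns are assumed.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE vital_signs (
        subject_id   TEXT NOT NULL,   -- subject identifier from the CRF header
        visit        TEXT NOT NULL,   -- protocol-defined visit label
        assess_date  TEXT NOT NULL,   -- date of assessment (ISO 8601)
        systolic_bp  INTEGER,         -- mmHg
        diastolic_bp INTEGER,         -- mmHg
        pulse        INTEGER,         -- beats per minute
        entered_by   TEXT NOT NULL,   -- audit trail: who entered the record
        entered_at   TEXT NOT NULL,   -- audit trail: when it was entered
        PRIMARY KEY (subject_id, visit)
    )
""")
conn.commit()
```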

Every clinical trial study needs a data management plan: a written document that describes the plan for the collection and management of data throughout the lifecycle of the trial. The plan also covers the storage of the data and data protection provisions, and should include detailed information on the data archival process. It is important to review and finalize the document for the clinical trial study, and to design it so that the trial data remain available and accessible even after project completion. Over the years we have developed expertise in designing and developing data management plans that cover all parameters, producing a comprehensive document detailing the end-to-end process.

Clinical data management is a critical phase in clinical research that leads to the generation of high-quality, consistent, and statistically sound data from a clinical trial study. Data entry is the key step of accumulating all the information from the CRF/eCRF, or any other sources designated in the study, into the approved database. Our experts follow robust practices to perform accurate data entry, and we provide a two-level quality check as part of the process to ensure data accuracy. The primary objective of the clinical data management process is to provide high-quality data with minimal errors and no missing data. Data validation is at the heart of clinical data management: it tests the validity of the data against the protocol specifications. Our expert teams also perform edit checks to support discrepancy management and provide validated data, as sketched below.
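
The sketch below illustrates what a programmed edit check might look like in Python; the field names and plausibility ranges are hypothetical and would in practice come from the study's data validation specification.

```python
# A simplified sketch of a programmed edit check, using assumed field names
# and plausibility ranges; real checks follow the study's validation spec.
def check_vital_signs(record):
    """Return a list of discrepancy messages for one vital-signs record."""
    queries = []
    sbp = record.get("systolic_bp")
    dbp = record.get("diastolic_bp")
    if sbp is None:
        queries.append("Systolic BP is missing.")
    elif not 60 <= sbp <= 250:  # assumed plausibility range
        queries.append(f"Systolic BP {sbp} outside expected range 60-250 mmHg.")
    if sbp is not None and dbp is not None and dbp >= sbp:
        queries.append("Diastolic BP should be lower than systolic BP.")
    return queries

print(check_vital_signs({"systolic_bp": 300, "diastolic_bp": 80}))
# ['Systolic BP 300 outside expected range 60-250 mmHg.']
```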

Double data entry is a long-established practice in clinical trials in which the data are entered twice to ensure the integrity of the entered data. Data are first recorded on the paper CRF and then keyed into the prescribed database in two independent passes, and the two entries are compared. The efficient clinical data managers at our company play a key role in ensuring this data entry process, which is considered effective in reducing data entry errors. Although it is a conventional approach to data entry in clinical data management, some companies prefer to capture data through double data entry, and we provide expert support in performing it using best practices to deliver more accurate data.
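
As a simple illustration, the Python sketch below compares two independently keyed versions of the same record and reports the fields that disagree; the field names and values are hypothetical.

```python
# A minimal sketch of the comparison step in double data entry: two
# independently keyed versions of a record are compared field by field
# and mismatches are flagged for adjudication. Field names are assumed.
def compare_entries(first_pass, second_pass):
    """Return the fields whose values differ between the two entries."""
    mismatches = {}
    for field in first_pass.keys() | second_pass.keys():
        v1, v2 = first_pass.get(field), second_pass.get(field)
        if v1 != v2:
            mismatches[field] = (v1, v2)
    return mismatches

entry_1 = {"subject_id": "001", "weight_kg": 72.5, "height_cm": 180}
entry_2 = {"subject_id": "001", "weight_kg": 75.2, "height_cm": 180}
print(compare_entries(entry_1, entry_2))  # {'weight_kg': (72.5, 75.2)}
```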

Medical coding is performed to categorize reported medical terms appropriately. It is an important activity in data management that enables the analysis of verbatim terms. Our expert medical coding professionals have rich experience in MedDRA (Medical Dictionary for Regulatory Activities) and WHO DD (World Health Organization Drug Dictionary), and they ensure that reported verbatim terms are coded appropriately as per the CRF/eCRF. MedDRA is used to code the medical terminology generated during all phases of a clinical trial, including therapeutic indications, signs and symptoms, and disease diagnoses. It also plays a vital role in coding results of investigations, surgical procedures, and medical and family history. WHO DD is the most comprehensive dictionary of medicinal product information. Our certified team members provide a holistic approach to coding.
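
The sketch below shows, in highly simplified form, the idea of mapping reported verbatim terms to preferred terms; the dictionary is a small placeholder, since real coding relies on the licensed MedDRA or WHO DD dictionaries and routes uncoded terms to a medical coder for manual review.

```python
# A highly simplified sketch of verbatim-to-preferred-term coding. The
# dictionary below is a placeholder, not actual MedDRA content; uncoded
# terms would be routed to a medical coder for manual review.
CODING_DICTIONARY = {
    "headache": "Headache",
    "head ache": "Headache",
    "feeling sick": "Nausea",
    "high blood pressure": "Hypertension",
}

def code_verbatim(verbatim):
    """Return the preferred term for a reported verbatim, or None if uncoded."""
    return CODING_DICTIONARY.get(verbatim.strip().lower())

for term in ["Head ache", "Feeling sick", "Dizzyness"]:
    print(term, "->", code_verbatim(term) or "manual review required")
```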

Query resolution is a detailed process for ensuring that data discrepancies are addressed. Discrepancies can arise at various levels of the data management process, and as part of validation it is the responsibility of the data managers to raise queries to the sponsor, the sponsor's representative, or the investigator to resolve the errors or inconsistencies discovered during data review. We also provide a solution for automating structured query emails, which allows data managers to edit the email as required and send it to the appropriate stakeholders for query resolution; a simplified sketch follows below. This speeds up the query process, tracks all raised queries through automation, and helps address the discrepancies. Database lock is the action taken to prevent any further changes to the database after query resolution, confirming that the study is ready for further analysis.
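
The sketch below illustrates, under assumed field names and a hypothetical template, how a structured query email could be drafted from a discrepancy record; in practice the data manager reviews and edits the draft before it is sent.

```python
# A minimal sketch of drafting a structured query notification from a
# discrepancy record. Field names and the message template are assumptions.
from email.message import EmailMessage

def build_query_email(discrepancy):
    """Draft a query email for one discrepancy; the data manager edits it before sending."""
    msg = EmailMessage()
    msg["Subject"] = (f"Data query {discrepancy['query_id']} - "
                      f"Subject {discrepancy['subject_id']}")
    msg["To"] = discrepancy["site_contact"]
    msg.set_content(
        f"Visit: {discrepancy['visit']}\n"
        f"Field: {discrepancy['field']}\n"
        f"Issue: {discrepancy['issue']}\n\n"
        "Please review and respond in the EDC system or by reply."
    )
    return msg

draft = build_query_email({
    "query_id": "Q-0042",
    "subject_id": "001",
    "visit": "Week 4",
    "field": "systolic_bp",
    "issue": "Value 300 mmHg is outside the expected range.",
    "site_contact": "site.coordinator@example.org",
})
print(draft)
```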

Safety databases are designed to collect study information to accurately assess the safety of the medicinal product. They also facilitate the reporting of individual and aggregate safety data to regulatory authorities and other stakeholders, act as a key source of information for safety signals, and serve as a repository for evaluating the benefit-risk of the study drug. Global compliance must be considered when designing a safety database, along with parameters such as a common code base, periodic reporting, E2B exchange, scientific querying, and submissions. Deciding on the size of the database during design is also important: a safety database for a medicinal product intended for use in a life-threatening disease, especially when there is no alternative treatment, can be smaller than one used for products treating conditions that are not life-threatening. We provide customized support in designing safety databases based on customer needs.

Serious adverse event (SAE) data reconciliation is the comparison of key safety data variables between the clinical data management system and the sponsor's pharmacovigilance (PV) safety database. Reconciliation is performed to ensure that events residing in both systems are consistent. The data managers are responsible for performing reconciliation on a periodic basis, with the objective of reconciling all discrepancies before the final database lock of the clinical trial. Our expert team provides an automated solution that reviews reports from the data management system and line listings from the safety database and presents the highlighted discrepancies for further validation by the data managers, ensuring consistency in the serious adverse events; the example below outlines the idea. This reduces the effort required to perform SAE reconciliation and supports data consistency throughout the clinical trial process.
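
The sketch below outlines the comparison step of SAE reconciliation under assumed key variables (subject, event term, onset date); the field names are illustrative only.

```python
# A simplified sketch of SAE reconciliation: events extracted from the
# clinical data management system are compared with the safety-database
# line listing on a few key variables. Key and field names are assumptions.
def reconcile_sae(cdms_events, safety_events):
    """Return events missing from either system and onset-date mismatches."""
    cdms = {(e["subject_id"], e["event_term"]): e for e in cdms_events}
    safety = {(e["subject_id"], e["event_term"]): e for e in safety_events}
    missing_in_safety = sorted(cdms.keys() - safety.keys())
    missing_in_cdms = sorted(safety.keys() - cdms.keys())
    date_mismatches = [
        key for key in cdms.keys() & safety.keys()
        if cdms[key]["onset_date"] != safety[key]["onset_date"]
    ]
    return missing_in_safety, missing_in_cdms, date_mismatches

cdms = [{"subject_id": "001", "event_term": "Pneumonia", "onset_date": "2023-03-01"}]
pv   = [{"subject_id": "001", "event_term": "Pneumonia", "onset_date": "2023-03-02"}]
print(reconcile_sae(cdms, pv))  # onset-date mismatch is flagged
```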

Electronic data capture (EDC) is a computerized system designed to collect clinical data in electronic format. An EDC system replaces traditional paper-based data collection to streamline the data collection process. EDC solutions are widely accepted and adopted by pharma companies and CROs, benefiting late-phase clinical studies and post-marketing surveillance in particular. In line with current US FDA recommendations, the trend is to capture clinical trial data electronically from the beginning of the trial and to move to the cloud. Our experts also promote solutions to transform structured and unstructured data from different sources into a trusted system, and we provide data integration tools and solutions with a strong focus on data accuracy.

Data listings are reports generated on the raw datasets, and they play an important role in data validation. The main objective of data listings is to surface discrepancies in the raw data. Every study is different, with its own objectives, and maintaining and validating data quality is an important step on the data management platform. The data validation specification describes the number of listings, their purpose, the fields to be presented in the output, the specific raw datasets to be used, and the discrepancies that need to be flagged. Listings can be customized to the study's requirements, and our skilled professionals provide a thorough solution for tailoring data listings to the needs of each clinical study; a brief sketch follows below.
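
The sketch below shows a minimal, hypothetical listing that flags missing or out-of-range values in a raw dataset; the dataset, fields, and flag rule are assumptions for illustration.

```python
# A minimal sketch of a custom data listing: records from a raw dataset are
# filtered per a simple specification and printed for review. The dataset
# and the flag rule are illustrative assumptions.
raw_vitals = [
    {"subject_id": "001", "visit": "Week 2", "systolic_bp": 132},
    {"subject_id": "002", "visit": "Week 2", "systolic_bp": None},
    {"subject_id": "003", "visit": "Week 2", "systolic_bp": 310},
]

def listing_missing_or_out_of_range(records, low=60, high=250):
    """Flag records where systolic BP is missing or outside the assumed range."""
    return [r for r in records
            if r["systolic_bp"] is None or not low <= r["systolic_bp"] <= high]

for row in listing_missing_or_out_of_range(raw_vitals):
    print(f"{row['subject_id']:>4}  {row['visit']:<8}  systolic_bp={row['systolic_bp']}")
```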
