Monday, May 11, 2009
You can use the DEFINE.PDF/DEFINE.XML files that are created for electronic submission to review your own data. This review is usually performed by an independent reviewer outside of the development team. The fresh perspective of the reviewer adds a layer of redundancy that helps ensure the accuracy and integrity of your data, allowing you to catch discrepancies that might otherwise surface only during review by a regulatory agency. The following steps will help you verify that your domain documentation is accurate and that the data it describes is accurate as well.
Step 1: Verify that any hyperlinks such as the one to external transport (XPT) files link to the right files. This is to ensure that the domain document itself has accurate hyperlinks.
Step 2: In the dataset-level listing at the top of the document, verify the key fields. Ensure that the following criteria are met:
- The key field exists and is listed first in the list of variables.
- The dataset is sorted by the key fields.
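A minimal sketch of checking the sort order programmatically (the dataset name AE and key fields USUBJID and AESEQ are hypothetical placeholders for your own domain and keys):

* Re-sort the dataset by its documented key fields and;
* compare against the original. A clean PROC COMPARE;
* run confirms the data was already in key order.;
proc sort data=ae out=ae_sorted;
   by usubjid aeseq;
run;

proc compare base=ae compare=ae_sorted;
run;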
Step 3: Verify all decoded formats. Verify that the decode values match what was defined in the analysis plan or the original case report form. Review the data for any values that do not match the formatted codes and therefore were not properly decoded.
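One way to flag raw values that fall outside a decode format is to rely on PUT returning the value unchanged when the format does not cover it. This sketch assumes a hypothetical dataset DM, variable SEX, and format $SEXF. (your names will differ):

* Values not covered by $SEXF. come back unchanged;
* from PUT, so they can be flagged for review.;
data bad_codes;
   set dm;
   decoded = put(sex, $sexf.);
   if decoded = sex then output;  * decode did not apply;
run;

proc print data=bad_codes;
run;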
Step 4: All derived variables need to be verified. You can choose to do some or all of the following recommended verification tasks to ensure the integrity of the derived variables:
- Systematic review of program code pertaining to the derivation, according to a predetermined checklist of verification criteria.
- Testing of SAS programs pertaining to the derivation, supplying valid and invalid inputs and verifying the expected output.
- Evaluation of the SAS log for errors, warnings and other unexpected messages.
- Visual or programmatic review of report outputs related to the derivation, as compared to expected results.
- Review of the attributes and contents of output data for accuracy and integrity.
- Independent programming to produce the same derivation and output for comparison.
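The last task, independent programming, is often implemented as a double-programming comparison. A minimal sketch, in which the dataset ADSL, the variable ELDERLY and the age cutoff are all hypothetical examples:

* Production derivation (simplified): flag subjects;
* aged 65 or older.;
data derived_prod;
   set adsl;
   elderly = (age >= 65);
run;

* Independent re-derivation, written from the;
* specification alone without seeing the production code.;
data derived_qc;
   set adsl;
   if age >= 65 then elderly = 1;
   else elderly = 0;
run;

* Any differences reported by PROC COMPARE indicate a;
* discrepancy between the two derivations.;
proc compare base=derived_prod compare=derived_qc;
   var elderly;
run;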
There are many tasks performed in the process of verifying and validating SAS programs to ensure the quality of your data. The significance of many of these tasks in maintaining the accuracy and integrity of the program logic and its output is often overlooked. Their repetitive nature gives them a bad reputation as unglamorous grunt work that must be done to meet departmental SOPs. However, verification is an essential step, and it can be performed and directed by what is documented in the domain documentation.
The domain documentation originally was specified to be a DEFINE.PDF file. The PDF format is good in that it is not intended to be edited and can be viewed both on screen and on printed paper on many computing platforms. This is a good file format for the final electronic submission but if it is used for other purposes, other formats may be more suitable. PDF does have limitations in that it is not extensible. You cannot add extra information. For example, if you wanted to store information about the user name and date and time as to when they have last updated a particular variable, you cannot easily do this within the current DEFINE.PDF. However, XML file format is extensible which allows you to add more information. The new standard therefore is calling for the documentation to be stored in a more vendor neutral, universal and extensible structure of DEFINE.XML.
If you are to use the domain documentation for project management, the information can be stored in either an Excel spreadsheet or a Word document. For electronic submissions, the DEFINE.XML or DEFINE.PDF is a better choice. There may be slight variations, but the main core information is the same across these files.
Since the content of the data is similar, file format becomes less significant. The file can be converted from one format to the other while maintaining all the same information. There are tools that will make this a transparent process. The goal is to make use of the information stored within the domain documentation and not be restricted by being forced to use one particular file format.
Tuesday, May 5, 2009
- Backward compatibility issues with older versions of datasets and format catalogs.
- Validating multi use macros and standardized code templates.
- Verifying stand alone or project specific programming and output.
- Effects on standard operating procedures and programming practices.
The interconnectedness of the SAS computing environment does require considerable effort in validating a SAS System. However, if this is executed successfully, it allows for greater traceability between output, programs and source data. The performance qualification also sheds light on ways of optimizing the work and data flow of your computing environment. The many benefits of validating the SAS System outweigh the costs. In addition, it is a requirement within a regulated environment, so it is best to be prepared.
Validation of a SAS system most commonly occurs during an upgrade from an older version of SAS or a move to a new platform. The examples used in this paper include migrating from SAS 8.2 to SAS 9.1.3 and moving from a legacy operating system to the Windows platform. In either case, similar validation challenges are confronted. It is recommended that you first acquire a global view of the system and identify the architecture. Only after gaining this perspective is it useful to zoom in on individual components. This allows you to assess the scope and interconnectedness of each component so that your validation efforts are balanced and thorough. Once the architecture is clearly understood, the requirements and functional specifications of each component are documented. These functional specifications then drive the validation testing.
It is important to follow these steps in a systematic and orderly fashion since they are interdependent. Documentation of each step in the validation process is also essential in capturing and proving that the validation effort was done properly. Besides documenting each step, it is also important to capture the traceability of each validation task. For each test case that is performed, there is an associated functional specification which then is connected to the requirements for a particular component of the system as a whole. The map or traceability matrix that ties all these validation components together is pivotal to an auditor. Proper documentation will make the difference between a successful validation audit and a complete failure.
The main goal of the validation effort is to ensure that the installation and implementation of the SAS system and its associated tools function as intended by the vendor (SAS Institute) and by your organization. In addition to this goal, the documentation of your validation effort will ensure the integrity of your computing environment and compliance with regulatory requirements such as 21 CFR Part 11 within the biotechnology and pharmaceutical industries.
The first step in your validation effort is to understand what it is that you are working with. The SAS System, as delivered to you in a series of CDs, is a system which contains modules such as Base, Stat, Graph and other components of SAS. This however only makes up part of the system that you are implementing in your organization. The SAS software fits into a computing environment that interacts with other software and hardware. If you were to take into account all the associated hardware and software that SAS interacts with, this is what is considered the “SAS System” from a validation perspective. It is therefore important for you to take the right steps to identifying and documenting all these components.
Step 1: Identify all the hardware components of your computing environment. For example:
SAS Application Server
SAS File Server
Monday, May 4, 2009
The SAS® System gives you the ability to create a wide range of web-ready reports. This paper walks through a series of examples showing what you can do with Base SAS and when you need SAS/IntrNet. Starting with the simplest HTML reports, this paper shows how you can jazz up your output by using the STYLE= options, traffic-lighting and hyper-linking available in the reporting procedures: PRINT, REPORT, and TABULATE. With the use of SAS/IntrNet you can add functionality to reports with features such as drill-down links that are data-driven, and you can produce dynamic reports created on-the-fly for individual users. Using this technique, your clients can navigate to the exact information needed to fulfill your business objective.
SAS ODS Overview
In the SAS System, both the Output Delivery System (ODS) and SAS/IntrNet produce documents for viewing over the Internet. All SAS users have ODS because it is part of Base SAS, but SAS/IntrNet is a separate product which you must have installed in addition to Base SAS.
If all you want to do is produce reports and post them on the Internet for people to view, then you probably don't need SAS/IntrNet. With a few ODS statements you can send any SAS output to the HTML (Hyper Text Markup Language) destination. You can also change the way HTML output looks by choosing one of the built-in style definitions that come with Base SAS, or by creating your own style definition using the TEMPLATE procedure. (Unfortunately, we don't have room to cover PROC TEMPLATE in this paper. For more information on PROC TEMPLATE or ODS basic concepts, see Slaughter and Delwiche (2001).) Using the STYLE= option in the TABULATE, REPORT, and PRINT procedures, you can change the color, font, and many other features of reports. You can even insert images and hyperlinks.
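For example, a minimal program routes any procedure output to the HTML destination with a built-in style (the output file name is a hypothetical placeholder):

* Open the HTML destination with a built-in style,;
* run any reporting procedure, then close the destination.;
ods html file='report.html' style=sasweb;

proc print data=sashelp.class noobs;
   title 'Class Roster';
run;

ods html close;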
Using ODS to insert hyperlinks, you can create a pseudo-dynamic effect. When a person clicks on one of these hyperlinks, the browser takes them to a new page. While this has a dynamic feel, the new page is in fact static because you have created it in advance. To create truly dynamic reports, you need SAS/IntrNet.
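A sketch of this pseudo-dynamic effect: PROC PRINT can attach a hyperlink to a column through the URL= style attribute (the target page detail.html is a hypothetical, pre-built static page):

ods html file='summary.html';

* Clicking a value in the Name column jumps to a;
* pre-built static page, giving a dynamic feel.;
proc print data=sashelp.class noobs;
   var name / style(data)=[url='detail.html'];
   var age height weight;
run;

ods html close;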
With SAS/IntrNet you can create reports on the fly, based on the needs of individual users. The advantage of combining SAS/IntrNet with your ODS programs is that your program is dynamically executed when the user clicks on your hyperlink. This means that if your data is changing, the drill-down will capture the most up-to-date results. If your reports are not time dependent, the static approaches of generating HTML reports with ODS will suffice. However, if your report helps decision makers decide upon time-sensitive information, the marriage of ODS and SAS/IntrNet is the perfect solution.
For the first part of this paper we will be using basically the same table produced from PROC TABULATE to show you how you can use ODS and the various STYLE options to modify the look of the table. Here are the SAS statements that produce this basic table and the listing output is shown in Table 2.
* The PROC FORMAT header and the column dimension of the;
* TABLE statement are reconstructed here to make the;
* fragment runnable; the original lines were truncated.;
PROC FORMAT;
   VALUE $flav 'P' = 'Pecan Pie'
               'B' = 'Banana Bash'
               'A' = 'Apple Spice'
               'M' = 'Mango'
               'C' = 'Choco Mint';
RUN;

TITLE 'Jelly Bean Production in 2001';
TITLE2 'Millions of Pounds';
PROC TABULATE DATA=production FORMAT=4.1;
   CLASS Factory Flavor;
   FORMAT Flavor $flav.;
   TABLE Flavor ALL,
         Factory;
RUN;

Complete paper found at "ODS Meets SAS/IntrNet?", related CDISC Software and CDISC Standards...
Friday, May 1, 2009
Controlled Terminology Overview
Coding decisions for adverse events and medications are part science and part art. There is room for interpretation left up to the person deciding which preferred term or hierarchical System Organ Class (SOC) is associated with the verbatim term. This may differ slightly between projects with different drugs and indications. The differences in coding decisions are compounded when more than one person is making the decisions. This is further exacerbated when the individuals work in different organizations, such as various CROs with different operating procedures. The many variables contributing to different coding decisions create a challenge for the data manager, who needs to pull all these coding decisions into one coherent and consistent set of coded data for analysis and submission. This paper will describe an approach to managing and reconciling these differences, referred to as “ThesQA” or Thesaurus Quality Assurance. The workflow of this methodology is shown here:
The first and pivotal step in the workflow is to manage all the dictionaries centrally by registering them. This is also referred to as “Setup”. Setup gives you the ability to track change control and manage the metadata pertaining to each dictionary. Once you have identified all the versions of the dictionaries and their related coding decisions and stored the information centrally, you can start to work towards reviewing and reconciling their differences. The goal is to manage all the changes while maintaining change control during updates.
Dictionary Setup and Management
The first step in managing your dictionary is to manage the metadata pertaining to each set of data. The metadata is stored in a SAS dataset so that it can be easily updated by SAS tools. An example view of the data would look like:
The SAS dataset named DICTDB, which stands for dictionary database, does not contain the actual values of the dictionary, but rather it captures information about each thesaurus dictionary to be managed. The following steps describe the approach towards setting up the dictionaries.
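A minimal sketch of what such a metadata dataset might contain; all variable names, lengths and values here are illustrative assumptions, not the paper's actual structure:

* DICTDB holds one row of metadata per registered;
* dictionary version, not the dictionary terms themselves.;
data dictdb;
   length dictname $20 version $10 path $200;
   format updated date9.;
   input dictname $ version $ path $ updated :date9.;
   datalines;
MedDRA 9.1 /dict/meddra91 01JAN2009
WHODrug 2008Q4 /dict/who2008 15MAR2009
;
run;

proc print data=dictdb;
run;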
Complete paper at: "Effective Ways to Manage Thesaurus Dictionaries", AE Coding Software and Coding Dictionary.