Wednesday, September 30, 2009
The part I find challenging about reviewing data against CDISC is that it has to be done multiple times as the data is being developed. There are times when I need to review the latest update and I am not at my desk. I have found a solution for when I am out and about: I can review the latest data on my iPhone. This lets me see how standardized my data is and check the status of my CDISC effort at any moment, even while standing in line at Starbucks. :)
I captured a YouTube video showing how this is done, which you can see below.
Monday, September 28, 2009
Rather than going into a verbose description here, I recorded a video to illustrate this. I find that the iPhone, with its multi-touch graphical user interface, is best explained by showing rather than telling, so I hope you find this video useful.
Sunday, September 27, 2009
Friday, September 18, 2009
Tuesday, September 15, 2009
I recently submitted an iPhone application to the Apple Developer Program for approval. This was my first time, so the hoops that Apple had me jump through were interesting. The initial application asked for general information, which seems pretty normal. This included things such as:
- Application Name: BI Flash
- Application Description: BI Flash delivers business intelligence information to the iPhone. It takes SAS macros, programs, or data on a server and delivers them to the iPhone. The user registers the programs or SAS libraries and then makes them available to specified users, who can view them through BI Flash on the iPhone. This is executed in real time, so the user always gets the latest data.
- Primary Category: Business
- Keywords: Business Intelligence, SAS, Analytics
- Application URL: http://meta-x.com/biflash
These seem pretty standard, since they would probably be used for the App Store listing. The next step is more interesting: it asked me to rate each of the following content descriptions as None, Infrequent/Mild, or Frequent/Intense:
- Cartoon or Fantasy Violence
- Realistic Violence
- Sexual Content or Nudity
- Profanity or Crude Humor
- Alcohol, Tobacco, or Drug Use or References
- Mature/Suggestive Themes
- Simulated Gambling
- Horror/Fear Themes
- Prolonged Graphic or Sadistic Realistic Violence
- Graphic Sexual Content and Nudity
I then uploaded the binary for my application along with a large 512x512 icon and a screenshot. It took Apple a couple of weeks to review my app. Since they receive thousands of apps, I wondered if they would even check it at all. I was surprised to find that they did do a detailed review. They found two things that they wanted changed because "it does not adhere to the iPhone Human Interface Guidelines as outlined in the iPhone Developer Program License Agreement section 3.3.5."
The first issue was that when the application is run without a network connection, it just presented a blank screen. Apparently, I missed the guideline that says the app is supposed to display a clear message telling the user what to do. I have now created a friendlier screen:
The second point was that the small icon of the application did not match the large icon.
They have very specific guidelines. Our graphic artist had worked on the icons at different stages of the application, so they looked different, and Apple thought this would confuse the user. I thought these suggestions were very good, and I appreciate that someone actually took the time to do a review.
I then updated the large icon as shown here:
I just submitted an updated version, and they sent me a friendly but generic message saying that someone is reviewing it and will contact me with any additional requests. I will keep my fingers crossed, but hopefully this round will take less than two weeks.
Monday, September 14, 2009
Thursday, September 10, 2009
- Transformation Model Validation - A transformation model documenting the source data and how it was transformed, confirming the destination and source variables.
- Data Value Subset Review - An automated report printing out a subset of the data before and after the transformation is reviewed and validated. This may catch truncation.
- Categorical Aggregate Review - An automated summary report is generated summarizing the frequency counts of categorical variables verifying the counts are the same. This catches missing or dropped values.
- Continuous Aggregate Review - An automated summary report is generated summarizing the min, max, and median of continuous variables, verifying the statistics are the same. This catches missing or dropped values.
- CDISC Rules PROC CDISC - SAS tools such as PROC CDISC provide a short list of deviations or guidelines that may have been violated. This review is applied programmatically and a report is generated.
- Variable Lengths - An evaluation of all variable lengths and a report is generated with recommendations on standardizing lengths for variables across all data to adhere to standards.
- Deviation Summary - A summary report documenting all deviations and their resolutions.
- Test Plan - A formal test plan document is used to document all the related tests and deviations.
- CDISC Builder Rule Test - An 18-criteria checklist. The list is shown here with an example report shown below:
- Required Fields: Required identifier variables including DOMAIN, USUBJID, STUDYID, and --SEQ.
- Subject Variable: For variable names, labels, and comments, use the word "Subject" when referring to "patients" or "healthy volunteers".
- Variable Length: Variable names are limited to 8 characters, with labels up to 40 characters.
- Yes/No: Variables where the response is Yes or No (Y/N) should normally be populated for both Yes and No responses.
- Date Time Format: Dates or datetimes must be in ISO 8601 format.
- Study Day Variable: The study day variable has the name --DY.
- Variable Names: (3.2.3) If any variable names used matches CDISC variables, the associated label has to match.
- Variable Label: (3.2.3) If any variable labels match that of CDISC labels, the associated variable has to match.
- Variable Type: (3.2.3) If any variables match that of CDISC variables, the associated type has to match.
- Dataset Names: (3.2.3) If any of the dataset names match CDISC, the associated data label has to match.
- Dataset Labels: (3.2.3) If any of the dataset label match CDISC, the associated dataset name has to match.
- Abbreviations: (10.3.1) (10.4) The following abbreviations are suggested for variable names and data sets.
Dataset domain codes: AE = Adverse Events; AU = Autopsy; BM = Bone Mineral Density (BMD) Data; BR = Biopsy; CM = Concomitant Meds; CO = Comments; DA = Drug Accountability; DC = Disease Characteristics; DM = Demographics; DS = Disposition; DV = Protocol Deviations; EE = EEG; EG = ECG; EX = Exposure; HU = Healthcare Resource Utilization; IE = Inclusion/Exclusion; IM = Imaging; LB = Laboratory Data; MB = Microbiology Specimens; MH = Medical History; ML = Meal Data; MS = Microbiology Susceptibility; OM = Organ Measurements; PC = PK Concentration; PE = Physical Exam; PP = PK Parameters; PG = Pharmacogenomics; QS = Questionnaires; SC = Subject Characteristics; SE = Subject Elements; SG = Surgery; SK = Skin Test; SL = Sleep (Polysomnography) Data; SL = Signs and Symptoms; ST = Stress (Exercise) Test Data; SU = Substance Use; SV = Subject Visits; TA = Trial Arms; TE = Trial Elements; TI = Trial Inclusion/Exclusion Criteria; TS = Trial Summary; TV = Trial Visits; VS = Vital Signs.
Variable name fragments: CAN = Action; ADJ = Adjustment; ADJ = Analysis Dataset; BL = Baseline; BRTH = Birth; BOD = Body; CAN = Cancer; CAT = Category; C = Character; CND = Condition; CLAS = Class; CD = Code; COM = Comment; CON = Concomitant; CONG = Congenital; DTC = Date Time - Character; DY = Day; DTH = Death; DECOD = Decode; DRV = Derived; DESC = Description; DISAB = Disability; DOS = Dose; DOS = Dosage; DOSE = Dose; DOSE = Dosage; DUR = Duration; EL = Elapsed; ET = Element; EM = Emergent; END = End; EN = End; ETHNIC = Ethnicity; X = External; EVAL = Evaluator; EVL = Evaluation; FAST = Fasting; FN = Filename; FL = Flag; FRM = Formulation, Form; FREQ = Frequency; GR = Grade; GRP = Group; HI = Higher Limit; HOSP = Hospitalization; ID = Identifier; INDC = Indication; INDC = Indicator; INT = Interval; INTP = Interpretation; INV = Investigator; LIFE = Life-Threatening; LOC = Location; LOINC = LOINC Code; LO = Lower Limit; MIE = Medically-Important Event; NAM = Name; NST = Non-Study Therapy; NR = Normal Range; ND = Not Done; NUM = Number; N = Numeric; ONGO = Ongoing; ORD = Order; ORIG = Origin; OR = Original; OTH = Other; O = Other; OUT = Outcome; OD = Overdose; PARM = Parameter; PATT = Pattern; POP = Population; POS = Position; QUAL = Qualifier; REAS = Reason; REF = Reference; RF = Reference; RGM = Regimen; REL = Related; R = Related; REL = Relationship; R = Relationship; RES = Result; RL = Rule; SEQ = Sequence; S = Serious; SER = Serious; SEV = Severity; SPEC = Specimen; SPC = Specimen; SPEC = Sponsor; SPC = Sponsor; ST = Standard; STD = Standard; ST = Start; STD = Start; STAT = Status; SCAT = Subcategory; SUBJ = Subject; SUPP = Supplemental; SYS = System; TXT = Text; TM = Time; TPT = Timepoint; TOT = Total; TOX = Toxicity; TRANS = Transition; TRT = Treatment; U = Unit; U = Unique; UP = Unplanned; VAR = Variable; VAL = Value; V = Vehicle.
- SEQ Values: When the --SEQ variable is used, it must have unique values for each USUBJID within each domain.
- Label Casing: For dataset labels and variable labels, all non-trivial words (more than three characters) must start with a capital letter, with the rest of the characters lowercase.
- Required Values: For required fields such as the ones specified in number 1, check to see that values are present. If any are missing, report the observation number where the value is missing.
- Similar Parentheses: For labels with matching values inside parentheses, such as (Yes/No), within the same dataset, check that the variables have the same type and length. If not, report the differences.
- Required Variables: A Required variable is any variable that is basic to the identification of a data record (i.e., essential key variables and a topic variable) or is necessary to make the record meaningful. Required variables should always be included in the dataset and cannot be null for any record.
- Expected Variables: An Expected variable is any variable necessary to make a record useful in the context of a specific domain. Columns for Expected variables are assumed to be present in each submitted dataset even if some values are null.
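To give a feel for how rule checks like the ones above can be automated, here is a small sketch in Python (not the SAS implementation used by CDISC Builder) of three representative rules: required-value reporting, --SEQ uniqueness, and label casing. The record layout, rule names, and the AESEQ variable are assumptions for the example.

```python
def check_required_values(records, required=("STUDYID", "DOMAIN", "USUBJID")):
    """'Required Values' rule: report the observation number and variable
    name wherever a required identifier variable is missing."""
    deviations = []
    for obs, rec in enumerate(records, start=1):
        for var in required:
            if not rec.get(var):
                deviations.append((obs, var))
    return deviations

def check_seq_unique(records, seq_var="AESEQ"):
    """'SEQ Values' rule: --SEQ must be unique for each USUBJID
    within a domain; report any duplicate (USUBJID, SEQ) pairs."""
    seen, duplicates = set(), []
    for rec in records:
        key = (rec["USUBJID"], rec[seq_var])
        if key in seen:
            duplicates.append(key)
        seen.add(key)
    return duplicates

def check_label_casing(label):
    """'Label Casing' rule: every non-trivial word (more than three
    characters) must start with a capital, rest lowercase."""
    return all(w == w.capitalize() for w in label.split() if len(w) > 3)

# Small demonstration dataset with deliberate deviations:
ae = [
    {"STUDYID": "S01", "DOMAIN": "AE", "USUBJID": "S01-001", "AESEQ": 1},
    {"STUDYID": "S01", "DOMAIN": "AE", "USUBJID": "S01-001", "AESEQ": 1},  # duplicate SEQ
    {"STUDYID": "S01", "DOMAIN": "AE", "USUBJID": "",        "AESEQ": 2},  # missing USUBJID
]
print(check_required_values(ae))             # -> [(3, 'USUBJID')]
print(check_seq_unique(ae))                  # -> [('S01-001', 1)]
print(check_label_casing("Adverse Events"))  # -> True
print(check_label_casing("ADVERSE EVENTS"))  # -> False
```

In practice each check would emit its findings into the deviation summary report rather than printing, so all rules feed one consolidated document.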
Wednesday, September 9, 2009
I realize that since I was holding the camera in my hand rather than on a tripod, the footage was not entirely stable, and because I was looking at the person I was talking to, the composition was not optimal, but this adds to the whole amateur video blogging experience. This experiment in one-man, on-the-fly reporting was interesting. I hope to do it again at future conferences.
I will include some of the videos below for your viewing.
Claire Castell, Mgr. Information Mgmt, Consumer and Small Business Deposit Risk at Wells Fargo, shares the history of WUSS and its name.
Marci Russel, Web Project Manager at SAS Institute, talks about Online Support at the WUSS 2009 conference.
Lora Delwiche, author of The Little SAS Book, shares debugging techniques.
Art Carpenter talks about SAS macros at WUSS 09 in San Jose. Art is a consultant with California Occidental Consultants (CALOXY).
WUSS Opening Session Keynote Address by Michael A. Raithel: It's Not Easy Being a SAS Programmer.
Vincent DelGobbo, Senior Systems Developer for the Web Tools Group at SAS Institute, talks about SAS export to Excel XML format at WUSS.
Heather Brown, Senior Recruiter, Advanced Clinical Services LLC