SAS DI Studio Best Practices
1. Take a metadata backup on a frequent basis (as appropriate for your environment) using %OMABAKUP. Ensure that a metadata backup is taken before any partial promotion of metadata from one repository to another.
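A minimal sketch of a backup call is shown below; the paths are illustrative and the parameter names should be verified against the %OMABAKUP documentation for your SAS release:

   /* Back up the metadata repositories to a dedicated backup location (paths are examples only). */
   %omabakup(DestinationPath="D:\SASBackup",
             ServerStartPath="D:\SAS\Config\Lev1\SASMain",
             RposmgrPath="MetadataRepositories\rposmgr");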
2. Use the Export/Import Wizards for partial promotion of metadata.
3. Choose the load technique in the “Loader” transformation according to the target (a rough code equivalent follows this list):
a) Truncate – works well for DBMS tables
b) Drop – works well for SAS datasets
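For orientation only, the two techniques amount to roughly the following hand-written code; the actual code is generated by the Loader transformation, and the library and table names here are illustrative:

   /* Truncate: keep the DBMS table definition, empty it, then append the new rows. */
   proc sql;
      delete from oralib.target_table;
   quit;
   proc append base=oralib.target_table data=work.staged_rows;
   run;

   /* Drop: replace the SAS dataset outright. */
   data dwlib.target_table;
      set work.staged_rows;
   run;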
4. To significantly improve performance when integrating with Oracle / Teradata, one can use the following (a combined sketch follows this list):
a) READBUFF=1000
b) OPTIONS DBSLICEPARM=(THREADED_APPS,3);
c) DATA <dataset-name> (COMPRESS=YES);
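A hedged sketch of how these options fit together; the libref, credentials, and table names are hypothetical:

   /* Enable threaded reads from the DBMS. */
   options dbsliceparm=(threaded_apps,3);

   /* Larger read buffer on the Oracle connection. */
   libname oralib oracle user=etl_user password=etl_pwd path=orcl readbuff=1000;

   /* Compress the SAS copy to reduce I/O on the SAS side. */
   data work.orders_stage (compress=yes);
      set oralib.orders;
   run;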
5. Use the “user-written” transformation for one-off user-written code that is specific to a particular job.
6. Avoid unnecessary I/O (I/O is typically the greatest bottleneck in a SAS job):
a) Subset data first at the top of the job
b) Pay attention to sorts and joins: place the UTILLOC and WORK (SASWORK) locations on separate file systems, set SORTSIZE to 1 GB or higher, set MEMSIZE about 50% above SORTSIZE (for example, 1.5 GB), and set UBUFNO=20 (a configuration sketch follows this item).
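A minimal sketch of these settings as SAS invocation/configuration-file options; the paths are illustrative and should point at separate, fast file systems:

   /* sasv9.cfg (or invocation options) */
   -work     /saswork/volume1
   -utilloc  /saswork/volume2
   -sortsize 1G
   -memsize  1536M
   -ubufno   20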
7. When debugging with large data files, specify OPTIONS OBS=<n>; in the Pre and Post Processing tab. Reset the option afterwards with OPTIONS OBS=MAX;
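For example (the observation count is illustrative):

   /* Precode: process only the first 1,000 observations while testing. */
   options obs=1000;

   /* Postcode: restore processing of all observations. */
   options obs=MAX;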
8. If the SAS logs are too large, suppress the MPRINT option in the production environment by putting %LET ETLS_DEBUG=0; in the autoexec.sas file.
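The autoexec entry is the piece this tip calls for; the small macro below is only a sketch of the underlying idea (the macro name is made up here, not the exact code DI Studio generates), showing how generated code can key MPRINT off the flag:

   /* autoexec.sas in the production environment */
   %let etls_debug = 0;

   /* Hypothetical illustration of honoring the flag. */
   %macro set_mprint_from_flag;
      %if &etls_debug = 1 %then %do;
         options mprint;     /* development: keep macro-generated code in the log */
      %end;
      %else %do;
         options nomprint;   /* production: keep the log small */
      %end;
   %mend set_mprint_from_flag;
   %set_mprint_from_flag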
9. A process flow with no intermediate tables is more efficient than one that writes intermediate tables, because each intermediate table adds an extra write to disk and a subsequent read.
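One way to avoid a physical intermediate table is to make the intermediate step a view; a minimal sketch with illustrative library, table, and column names:

   /* The intermediate result is a view, so nothing is written to disk for it. */
   proc sql;
      create view work.v_recent_orders as
         select order_id, customer_id, amount
         from srclib.orders
         where order_date >= '01JAN2013'd;
   quit;

   data tgtlib.orders_mart;
      set work.v_recent_orders;   /* rows stream straight from source to target */
   run;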
10. To capture additional details in the SAS log, use
OPTIONS FULLSTIMER MSGLEVEL=I SOURCE2 MPRINT NOTES;
11. Ask your SAS BI administrator to apply the requisite hot fixes in a timely manner.