Recently in the SAS Community Library: SAS' @StuartRogers provides a close look at the new Microsoft Entra Gallery application and details how it can be used.
Good morning. I'm currently analyzing a much larger database than I'm used to. For this analysis I need to use PROC MIXED with repeated measures; however, when I test the covariance structures and use DDFM=KR or DDFM=KR2, I have two problems: 1) the script fails due to lack of memory, no matter how much I allocate with OPTIONS MEMSIZE= (max memory used around 20G); 2) the denominator DF appears as infinite. I was also unable to understand the differences between DDFM= BETWITHIN, CONTAIN, KENWARDROGER, KENWARDROGER2, RESIDUAL, and SATTERTHWAITE. Here is the model used (I believe "dday" and "ddai" in my original code were typos for "day"):

PROC MIXED;
   CLASS id_Animal TRT day;
   MODEL visit = TRT day TRT*day / DDFM= ?? ;
   RANDOM id_Animal;
   REPEATED day / TYPE= ?? SUBJECT=id_Animal;
RUN;
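For context: DDFM=KR and DDFM=KR2 (Kenward-Roger) compute an adjusted covariance estimator that can be very memory-intensive on large data, while DDFM=SATTERTHWAITE uses a cheaper approximation. A minimal sketch of the same model with the Satterthwaite approximation is below; the data set name MYDATA and TYPE=CS are placeholders, not a recommendation for your data:

PROC MIXED DATA=mydata;                 /* MYDATA is a placeholder name     */
   CLASS id_Animal TRT day;
   MODEL visit = TRT day TRT*day / DDFM=SATTERTHWAITE;
   RANDOM id_Animal;
   REPEATED day / TYPE=CS SUBJECT=id_Animal;  /* TYPE=CS is only a starting point */
RUN;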
Hi everyone, I'm reaching out to the SAS community today seeking some guidance. Our team is currently in the process of transitioning from base SAS to SAS Studio and migrating our data storage from a local file server to Amazon S3. We've successfully uploaded all our historical datasets (in .sas7bdat format) from the local server to S3. However, we're encountering an issue when trying to load/read this data directly from S3 into CAS using SAS Studio. While I can perform various S3 bucket management tasks like creating, deleting, uploading data from CAS, and copying data between buckets, I'm unable to use PROC S3 to load .sas7bdat files from S3 into CAS:

proc s3 keyid=&accesskey secret=&scaccesskey region=&regi;
   get "/viya-data/AAI/Pertape/datasets/pt31mar24.sas7bdat"
       "/nfsshare/data/sasdata/S3Transit/pt31mar24";
run;

While using PROC S3, if the file size is smaller than 2 MB I get a different error than the one above. I cross-checked with my SAS admin that we have read and write permissions. Here's the interesting part: I can successfully use PROC CASUTIL's LOAD function to read data from S3 in formats like .sashdat, CSV, or Excel. It's specifically the .sas7bdat format that seems to be causing the issue.

caslib amz dataSource=(srcType="s3"
   accesskeyid=&accesskey
   secretaccesskey=&scaccesskey
   region=&regi
   bucket="viya-data"
   objectpath="/AAI/Pertape/datasets/");

proc casutil incaslib=amz outcaslib="S3Transit";
   load casdata="pt31mar24.sas7bdat" casout="tl_website_30apr24";
quit;

Error message while using PROC CASUTIL: I'm wondering if anyone in the community has faced similar challenges when working with .sas7bdat files in S3 and CAS. If so, any insights or solutions you could share would be greatly appreciated. We are currently using SAS Viya LTS 2023.03. Thank you in advance for your time and assistance! -Bhaskar
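One possible two-step workaround, sketched below and not tested on LTS 2023.03: copy the .sas7bdat out of S3 with PROC S3 to a path the CAS server can see, then load it through a PATH-type caslib. The caslib name "local" and the NFS path are illustrative; the macro variables are the ones from the post above:

/* Step 1: pull the file out of S3 onto shared storage */
proc s3 keyid=&accesskey secret=&scaccesskey region=&regi;
   get "/viya-data/AAI/Pertape/datasets/pt31mar24.sas7bdat"
       "/nfsshare/data/sasdata/S3Transit/pt31mar24.sas7bdat";
run;

/* Step 2: load it from a PATH caslib instead of the S3 caslib */
caslib local path="/nfsshare/data/sasdata/S3Transit" subdirs;

proc casutil incaslib="local" outcaslib="S3Transit";
   load casdata="pt31mar24.sas7bdat" casout="pt31mar24";
quit;

This sidesteps the question of whether the S3 caslib data source supports .sas7bdat at all on that release, at the cost of an extra copy.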
Hi all, Data steps and most SAS procedures require many statements terminated by a RUN; before they can run. PROC SQL is different: every statement is complete and can be executed immediately, without knowing what follows. So I wonder, what is the role of the QUIT; statement? Does it ever make a difference, besides writing the true running time to the log? I never used to bother with the QUIT; statement. Now I tend to use it for neatness and to avoid being scolded. But the question remains: is it ever necessary? PG
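To make the question concrete, here is a minimal sketch (table and variable names are illustrative). Each SQL statement runs as soon as it is complete; QUIT; only ends the procedure, which would otherwise stay active until the next DATA or PROC statement forces a step boundary:

proc sql;
   create table work.tall as        /* runs immediately, no RUN; needed */
      select name, height
      from sashelp.class
      where height > 60;
quit;                               /* ends PROC SQL; without it, the next
                                       DATA/PROC statement ends it implicitly */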
Hi all,
At https://developer.sas.com/apis/rest/v3.5/#filters there is supposed to be a link that explains filter expressions:
For a complete description of filter expressions, see the Filtering reference.
However, when clicking this link (on the word "Filtering") I do not see any such page. It looks like this "got lost" during the switch to the new developers.sas.com web site.
Does anyone know where I can find the documentation about API filter expressions?
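In the meantime, a minimal sketch of what such a filter looks like in practice, assuming the common Viya filter functions (eq, contains, and, or); the host and endpoint below are placeholders, not from the missing reference page:

proc http
   url="https://myviya.example.com/folders/folders?filter=contains(name,'Sales')"
   method="GET"
   oauth_bearer=sas_services;
run;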
Hello Experts,
I am running a DataFlux job which contains only SAS code and am getting the error below.
The code is very simple and the environment has enough workspace, but I still get this error:
ERROR: Close error:
ERROR: Error allocating statement handle:
WARNING: The data set WORK.ONE may be incomplete. When this step was stopped there were 0 observations and 0 variables.
NOTE: Compression disabled for the WORK.ONE table because the compression overhead would increase the size of the table.
Additionally, the code works fine in SAS Studio, but when running it through the SAS DataFlux Data Management Server I get the error in the log.
Has anyone faced a similar issue? If so, please let me know what the resolution is.
Code ::
data work.&tblenm(compress=yes);
   set hdp.&tblenm;
   where input(load_date, ddmmyy10.) >= "&purge_dt."d;
run;
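One hedged thing worth trying: "Error allocating statement handle" suggests the database engine is involved, and a WHERE clause can be pushed down to the hdp library's engine, where the INPUT function may not translate. A sketch that keeps the date test inside the DATA step (same names as the original code) instead of the WHERE clause:

/* Sketch: a subsetting IF runs in SAS rather than being pushed to the
   external engine, at the cost of reading all rows across the connection. */
data work.&tblenm(compress=yes);
   set hdp.&tblenm;
   if input(load_date, ddmmyy10.) >= "&purge_dt."d;
run;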
Thank you in advance.