I consistently get "A lock is not available" errors when running SAS programs. It usually happens when I perform operations on the same dataset multiple times in one program. From researching this error, my understanding is that it means two processes are trying to access the same dataset at the same time. In other words, it's similar to trying to open a document that is already in use by someone else (or by yourself). Here is an example of code that gives me this error:
data TSTONE.map;
    infile <PATH> delimiter=',' truncover firstobs=2 dsd TERMSTR=cr LRECL=32760;
    format assessment_edition $45.;
    format ...
    input
        assessment_edition :$45.
        ...
    ;
run;

data tstone.map;
    set tstone.map;
    drop DistrictName ...;
run;
I entered "..." in a few places where I have long lists of fields to import or drop. So, first I'm imported a CSV file, and then performing a data step to overwrite the file dropping some fields I don't need. I should note that sometimes when I run programs like this I do not get any lock errors. No other users are accessing these datasets, they are local to my machine. Also, if I highlight and run the 2 data steps sequentially I get no issues.
EDIT 10/8/2015: The macros in this answer were updated on 10/8/2015, with generally better debugging info plus options to terminate SAS nicely regardless of whether it is running in batch or interactive mode.
EDIT 12/8/2014: The macros in this answer were updated on 12/8/2014. Versions prior to that showed slow lock times if your libname contained hundreds or thousands of datasets; the updated versions below fix this issue.
Answer: We have datasets that are updated every so many minutes, and at the same time we need ad hoc and scheduled reports to access those datasets. To overcome locking issues we created macros to lock and unlock the tables prior to using them. It's been running without any reported errors for almost a year now (we've ironed out all the bugs we've encountered, anyway).
Usage:
Program in session A:
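(The original snippet isn't reproduced here; the following is only a rough sketch of what the session A program, a report that reads the table, might look like. The parameter names member= and timeout=, in seconds, are assumptions, not the author's actual interface.)

/* Session A: read tstone.map for a report */
%lock(member=tstone.map, timeout=300);   /* wait up to 5 minutes for the lock */

proc means data=tstone.map;
run;

%unlock(member=tstone.map);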
Program in session B:
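(Again only a hedged sketch, this time of the session B program that performs the periodic update, using the same assumed interface.)

/* Session B: update tstone.map */
%lock(member=tstone.map);

data tstone.map;
    set tstone.map;
    /* ... apply the periodic update ... */
run;

%unlock(member=tstone.map);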
How it works: let's say session B launches first. It will lock the dataset prior to performing the update. Session A wants to use the same table (while B is still updating it). Before session A tries to do anything with it, it checks whether the table is locked. If it is locked, session A will wait for a certain period of time (default = 5 minutes) before deciding to give up. If session B finishes within the 5 minutes, it releases the lock; session A then locks the table, continues none the wiser, and unlocks it when it is done. There's a bunch of other options you can pass in as necessary to customize how to handle things when a lock can't be attained.
The %lock macro:

The unlock macro:
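(The original macro code is not shown here. As a minimal sketch of the general technique rather than the author's actual macros, a retry loop around the LOCK statement that checks the &SYSLCKRC return code might look like this; the parameter names and defaults are assumptions, with the 5-minute default timeout taken from the description above.)

%macro lock(member=, timeout=300, retry=5);
    %local start rc;
    %let start = %sysfunc(datetime());
    /* keep trying to lock the dataset until it is obtained or the timeout expires */
    %do %until (&syslckrc = 0 or %sysevalf(%sysfunc(datetime()) - &start > &timeout));
        lock &member;                               /* sets &syslckrc; 0 = lock obtained */
        %if &syslckrc ne 0 %then %do;
            %let rc = %sysfunc(sleep(&retry, 1));   /* pause &retry seconds before retrying */
        %end;
    %end;
    %if &syslckrc ne 0 %then
        %put ERROR: Could not lock &member within &timeout seconds.;
%mend lock;

%macro unlock(member=);
    lock &member clear;                             /* release the lock */
%mend unlock;

The real macros add richer diagnostics and can terminate the session when a lock can't be attained, which is where the utility macros below come in.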
These programs also require some other utility macros to exist.
The IsDir macro:
The FileList macro:
The stop_sas macro:
I have been having the same problem when I run proc append in a %do loop like so:
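(The original loop isn't shown; below is a minimal sketch of the pattern, with hypothetical dataset and macro names.)

%macro append_parts(n);
    %local i;
    %do i = 1 %to &n;
        /* append each part onto the same base table in quick succession */
        proc append base=tstone.alldata data=work.part&i force;
        run;
    %end;
%mend append_parts;

%append_parts(12);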
This was on my local machine, so there was nobody else attempting to access the dataset. What was happening was that the loop was running so fast that the base file hadn't closed before it started writing the next part of the loop. After a lot of hair pulling and trying MANY complicated solutions, it turns out that you can just extend the time SAS waits before declaring a failed lock. You do this when you create the library:
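For example (a hedged sketch; the path is a placeholder, and FILELOCKWAIT= is the Base engine LIBNAME option, available in SAS 9.4, that sets how many seconds SAS waits for a locked file before raising the error):

libname tstone "<PATH>" filelockwait=5;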
That just extends the wait time to 5 seconds before SAS decides it's a lock error (from the default of 0, I believe). This simple option fixed all the lock problems I had.
First off, lock errors may simply be issues with timing. The server may not have unlocked the file from the earlier import. The workaround there is to not reuse the dataset name: if you need to do some temporary work, import the file into a temporary (work) dataset and then set that dataset into a tstone dataset. I find lock errors more likely on a server (using NAS in particular) because there are more delays possible there; on a local disk the delays in file locking are usually so short that the error does not occur (but it is not impossible).

Second, DROP (like KEEP and RENAME) can be done in data set options, almost anywhere OUT= or SET or anything similar exists. For example, you could do:
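(A sketch based on the import step from the question, with the unneeded variables dropped via a DROP= data set option on the output dataset; the elided field lists are left elided.)

data tstone.map (drop=DistrictName ...);
    infile <PATH> delimiter=',' truncover firstobs=2 dsd TERMSTR=cr LRECL=32760;
    format assessment_edition $45.;
    format ...
    input
        assessment_edition :$45.
        ...
    ;
run;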
There's no reason to force SAS to reprocess the entire dataset just to drop a few variables.