This partly depends on what your macro does. If we assume that your macro does something designed to run outside the data step (i.e., it does not just assign a data step variable), then you have several options.
CALL EXECUTE has been covered already and is a good option for some cases. It has some drawbacks, though, particularly around macro timing, which requires some extra care in certain cases - especially when you create macro variables inside your macro. Quentin in his comments shows a way around this (adding %NRSTR to the call), but I find I prefer to use CALL EXECUTE only when it has a real advantage over the other methods - particularly when I want to use data step techniques (FIRST./LAST. processing, say, or some form of looping) in building my macro calls, or when I have to do something in a data step anyway and can avoid the overhead of reading the dataset an extra time. If I'm just going to write a data step like yours - data _null_, set dataset, call execute, run - I wouldn't use it.
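As a minimal sketch of the %NRSTR technique Quentin describes (the macro name %mymacro and the dataset/column names here are placeholders, not from the original question):

```sas
data _null_;
  set dataset;
  /* %nrstr() masks the macro name so the call is only queued, not
     executed while the data step is still running; the macro then
     runs after the step ends, so macro variables it creates resolve
     at the right time */
  call execute(cats('%nrstr(%mymacro)(', catx(',', arg1, arg2, arg3), ')'));
run;
```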
PROC SQL SELECT INTO is my usual choice when I'm processing a list (which is pretty much what this is). I like the simplicity of SQL when I'm doing something not too complicated; for example, you can easily generate one call per distinct combination of arguments using DISTINCT, without explicitly writing a proc sort nodupkey or using first./last. processing. It also has a debugging advantage: you can print all of your macro calls to the results window (by leaving off noprint), which for me is a little easier to read than the log when I'm trying to work out why my calls aren't being generated properly (and it doesn't require any extra PUT statements).
proc sql;
  select cats('%macro(', catx(',', arg1, arg2, arg3), ')')
    into :mvarlist separated by ' '
    from dataset;
quit;

&mvarlist.
This then executes the calls quite simply, and there are no timing issues (since you are just writing out a batch of macro calls in open code).
The main disadvantage of this method is that a macro variable is limited to 64K characters, so if you are generating a very large number of calls you will run into that limit. In that case, fall back to CALL EXECUTE or %INCLUDE files.
%INCLUDE is mostly useful either as a replacement for SELECT INTO when the calls exceed that character limit, or when you find it helpful to have a text file of your calls to look at (if you are running this in batch mode, for example, a file can be easier to retrieve and/or parse than the log or listing output). You simply write your calls out to a file, and then %INCLUDE that file.
filename myfile temp; *or a real file if you want to look at it;

data _null_;
  set dataset;
  file myfile;
  call_text = cats('%macro(', catx(',', arg1, arg2, arg3), ')');
  put call_text;
run;

%include myfile;
I don't use this much anymore, but it is a commonly used technique, particularly among longtime SAS programmers, so it's worth knowing.
DOSUBL is a relatively new method that can, to some extent, be used to replace CALL EXECUTE, since its default behavior is usually closer to what you would intuitively expect than CALL EXECUTE's. The example on its documentation page is really the best illustration of how the two differ; essentially, DOSUBL fixes the timing issue by letting each individual call import and export macro variables from/to the calling environment. That means each iteration of DOSUBL runs at a distinct point in time, as opposed to CALL EXECUTE, where everything runs in one batch and the macro environment is "fixed" (that is, any macro variable reference resolves at the time CALL EXECUTE runs, unless you escape it with %NRSTR).
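A minimal sketch of the DOSUBL version (again, %mymacro and the dataset/column names are placeholders):

```sas
data _null_;
  set dataset;
  /* each dosubl() call runs immediately in a side session, with macro
     variables synchronized to and from the calling environment, so no
     %nrstr() masking is needed */
  rc = dosubl(cats('%mymacro(', catx(',', arg1, arg2, arg3), ')'));
run;
```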
Another thing worth mentioning is RUN_MACRO, part of the FCMP function library. It lets you run a macro in its entirety and import its results back into the data step, which is an interesting option in some cases (for example, you could wrap a PROC SQL call that selects a count of something, and then import that count into the dataset as a variable, all in one data step). It is mainly applicable if your goal in calling the macro is to assign a data step variable, rather than to kick off a process whose output you don't need back in the data step, but it's worth considering when you do want that data returned to the calling data step.
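To illustrate the pattern, a hedged sketch (all names here - count_macro, get_count, work.funcs - are illustrative, not from the original question):

```sas
/* the macro reads &tablename and leaves its result in &count */
%macro count_macro;
  proc sql noprint;
    select count(*) into :count from &tablename;
  quit;
%mend count_macro;

proc fcmp outlib=work.funcs.demo;
  function get_count(tablename $);
    /* run_macro copies the listed variables into same-named macro
       variables, runs the macro, then copies the values back */
    count = 0;
    rc = run_macro('count_macro', tablename, count);
    return (count);
  endsub;
run;

options cmplib=work.funcs;

/* the macro's result lands directly in a data step variable */
data result;
  n_obs = get_count('sashelp.class');
run;
```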