Tue 15 Feb 2011 01:15:01 PM UTC, original submission:
Hi,
We have a real performance issue caused by the shell's command-line length limit. By design we need to write a list of targets to a file in order to prepare a tar archive, and the list can be very long. I have used the recommended approach of splitting the list into pieces and calling $(shell ...) once per piece (a rough sketch of that workaround is shown below), but that many shell invocations completely kills the performance of this operation. I propose adding a new function to function.c that can write arbitrary text to a file.
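For illustration only, the current workaround looks roughly like this; TARGETS and LIST_FILE are placeholder names, not the exact code from our build. Each $(shell ...) call forks a shell, so with ~10000 targets that is ~10000 fork/exec calls, which is what takes minutes:

# Hypothetical sketch of the $(shell)-based workaround: one short
# shell command per target keeps every command line under the limit,
# but the sheer number of shell invocations dominates the build time.
_dummy := $(shell rm -f $(LIST_FILE))
_dummy := $(foreach t,$(TARGETS),$(shell echo '$(t)' >> $(LIST_FILE)))

The proposed function for function.c: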
/**
 * Output information to a file.
 *   argv[0]     - file name
 *   argv[1]     - fopen() mode, e.g. "w" or "a"
 *   argv[2] ... - text to write to the file
 */
static char *
func_file (char *o, char **argv, const char *funcname)
{
  char **argvp;
  FILE *pFile = fopen (argv[0], argv[1]);

  if (pFile != 0)
    {
      for (argvp = argv + 2; *argvp != 0; ++argvp)
        {
          char *param = *argvp;

          /* Expand escape sequences written literally in the makefile.
             replace_all () is a small helper added by this patch;
             more escapes could be handled here.  */
          replace_all (param, "\\n", "\n");
          replace_all (param, "\\t", "\t");
          fputs (param, pFile);
        }
      fclose (pFile);
    }
  else
    error (reading_file, _("Could not open file: `%s'"), argv[0]);

  return o;
}
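Presumably the function also needs to be registered in the function table in function.c so that make recognizes $(file ...). A minimal sketch, assuming the table layout used around make 3.82 (name/length tuple, minimum args, maximum args where 0 means unlimited, expand flag, handler); the exact macro and field order depend on the Make version:

/* Assumed entry for function_table_init[] in function.c:
   at least 2 arguments (file name and mode), no upper limit
   on the text arguments, arguments expanded before the call.  */
{ STRING_SIZE_TUPLE("file"), 2, 0, 1, func_file },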
After this patch, the operation looks like this:
$(file $(outputfile),w,$(patsubst %,%\n,$@))
You can even append to an existing file:
$(file $(outputfile),a,$(patsubst %,%\n,$@))
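Put together, the rule that prepares the tar archive could look roughly like this. This is only a sketch using the proposed syntax; LIST_FILE, TARGETS and archive.tar are placeholder names, and tar's --files-from (-T) option reads the member list from the file (recipe lines are indented with a TAB):

LIST_FILE := targets.list

archive.tar: $(TARGETS)
	$(file $(LIST_FILE),w,$(patsubst %,%\n,$^))
	tar -cf $@ --files-from=$(LIST_FILE)

The whole list is written by make itself in a single call, with no shell command line involved.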
Now, instead of minutes(!), it takes less than a second to write 10000 build targets to the file.