Issue
I am writing a Perl script to extract certain data using curl commands, e.g.:
my $raw_json = `curl -X GET <some website url> -H <some parameters>`;
The issue is that this website sometimes goes down and my code gets stuck at this line for a long time. I want the code to skip this line and go to the next one if the request takes more than a specified time, say 30 seconds.
I tried using $SIG{ALRM} in my script as follows:
my $timeout = 30;
eval {
    local $SIG{ALRM} = sub { die "alarm\n" };   # NB: \n required
    alarm $timeout;
    my $raw_json = `curl -X GET <some website url> -H <some parameters>`;
    alarm 0;
};
if ($@) {
    print "\nERROR!\n";
    die unless $@ eq "alarm\n";   # propagate unexpected errors
    # timed out
}
else {
    # didn't
}
I expected the request to stop after 30 seconds. The "ERROR" statement does get printed after 30 seconds, but the curl request keeps running even after that.
Solution
The curl is happening in a subprocess, so you need to stop that subprocess; Perl isn't going to stop it for you.
Use the --connect-timeout or --max-time options to curl so you don't need the alarm and curl cleans itself up.
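For example, a minimal sketch of the same backtick call using curl's own timeout (the URL and header are the question's placeholders):

my $timeout = 30;

# curl enforces the timeout itself: --max-time caps the whole transfer,
# while --connect-timeout would cap only the connection phase.
my $raw_json = `curl --max-time $timeout -sS -X GET <some website url> -H <some parameters>`;

# $? holds curl's exit status; exit code 28 means the operation timed out.
if ( ($? >> 8) == 28 ) {
    warn "curl timed out after $timeout seconds\n";
}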
As @ikegami suggested, the next simplest thing is IPC::Run, which can handle the details of a timeout for an external process.
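A sketch using IPC::Run's documented run/timeout interface (the command is the question's placeholder curl call):

use strict;
use warnings;
use IPC::Run qw(run timeout);

my @cmd = ('curl', '-sS', '-X', 'GET', '<some website url>');
my ($in, $out, $err) = ('', '', '');

# run() kills the child itself when the timer expires and dies with an
# exception, so the eval catches the timeout.
my $raw_json;
if ( eval { run \@cmd, \$in, \$out, \$err, timeout(30) } ) {
    $raw_json = $out;
}
else {
    warn "curl failed or timed out: $@";
}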
Or, if you want to handle the alarm yourself, you need to work at a lower level so you have the PID of the subprocess and can kill it yourself. See perlipc.
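A sketch of that lower-level approach, assuming a piped open() so the parent knows the child's PID (again with the question's placeholder URL):

use strict;
use warnings;

my $timeout = 30;
my $raw_json;

# A piped open forks; in the parent it returns the child's PID,
# which is what the alarm handler needs in order to kill curl.
my $pid = open(my $fh, '-|', 'curl', '-sS', '<some website url>')
    or die "Cannot fork: $!";

eval {
    local $SIG{ALRM} = sub { die "alarm\n" };
    alarm $timeout;
    local $/;                        # slurp all of curl's output at once
    $raw_json = <$fh>;
    alarm 0;
};
if ($@) {
    die $@ unless $@ eq "alarm\n";   # propagate unexpected errors
    kill 'TERM', $pid;               # stop the still-running curl
    close $fh;
    warn "curl killed after $timeout seconds\n";
}
else {
    close $fh;
}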
Answered By - brian d foy