I am developing an application that needs to check whether the computer can ping google.com. To do this, I use the Python subprocess module and make the call shown below:
response = subprocess.call("ping -c 1 google.com -q", shell=True)
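For reference, here is the same check pulled out into a self-contained snippet (the can_ping helper name is only for illustration):

import subprocess

def can_ping(host="google.com"):
    # A single quiet ping; exit code 0 means the host answered
    return subprocess.call("ping -c 1 %s -q" % host, shell=True) == 0

if __name__ == "__main__":
    print(can_ping())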
However, after running for some time, the program terminates with a segmentation fault.
This is the code:
daemon.py
import datetime
# presence, temperature, luminosity, smartphone, gateway and check_presence
# come from elsewhere in the project (omitted here)

def dataset_save(smartphone, mote):
    print("DATA LOGGER:\tStarting Now")
    with open('dataset.csv', 'a+') as dataset:
        dataset.write(str(datetime.datetime.today()) + ',' + \
                      str(datetime.datetime.today().weekday()) + ',' + \
                      str(smartphone.connected) + ',' + \
                      str(mote.A0_pw) + ',' + \
                      str(mote.B00_pw) + ',' + \
                      str(mote.B01_pw) + ',' + \
                      str(mote.B10_pw) + ',' + \
                      str(mote.B11_pw) + ',' + \
                      str(presence.get_value()) + ',' + \
                      str(temperature.get_value()) + ',' + \
                      str(luminosity.get_value()) + '\n')
    print("DATA LOGGER: \tData successfully logged @ %s!" % str(datetime.datetime.today()))
    return

def run():
    check_internet()
    while True:
        dataset_save(smartphone, gateway)
        check_presence()
check_internet.py
import subprocess
import threading

def check_internet():
    response = subprocess.call("ping -c 1 google.com -q", shell=True)
    print(response)
    if response == 0:
        print("CONNECTIVITY: \tConnected to internet")
        threading.Timer(1, check_internet).start()
        return
    else:
        print("CONNECTIVITY: \tUnable to connect to internet")
        threading.Timer(1, check_internet).start()
        return
Running this under GDB, I get the following output when the segmentation fault occurs:
--- google.com ping statistics ---
1 packets transmitted, 1 received, 0% packet loss, time 0ms
rtt min/avg/max/mdev = 146.626/146.626/146.626/0.000 ms
0
CONNECTIVITY: Connected to internet
[New Thread 0xb55ffb40 (LWP 4064)]
Program received signal SIGSEGV, Segmentation fault.
[Switching to Thread 0xb65ffb40 (LWP 4043)]
PING google.com (216.58.222.110) 56(84) bytes of data.
__deallocate_stack (pd=0xb65ffb40) at allocatestack.c:760
760 allocatestack.c: No such file or directory.
(gdb)
--- google.com ping statistics ---
1 packets transmitted, 1 received, 0% packet loss, time 0ms
rtt min/avg/max/mdev = 146.504/146.504/146.504/0.000 ms
(gdb) bt
(gdb)
Is there a reason why I should not use threading.Timer the way I do here? It seems to me that repeatedly spawning a new Timer thread every second is responsible for this segmentation fault.
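For comparison, this is the alternative I am considering: one long-lived background loop that sleeps between checks, instead of scheduling a fresh threading.Timer on every call (check_internet_loop is just an illustrative name, and the print messages mirror the ones above):

import subprocess
import threading
import time

def check_internet_loop(interval=1):
    # One long-lived loop instead of a new threading.Timer per check
    while True:
        response = subprocess.call("ping -c 1 google.com -q", shell=True)
        if response == 0:
            print("CONNECTIVITY: \tConnected to internet")
        else:
            print("CONNECTIVITY: \tUnable to connect to internet")
        time.sleep(interval)

# started once from run(), e.g.:
#   t = threading.Thread(target=check_internet_loop)
#   t.daemon = True
#   t.start()

Would switching to something like this avoid the crash?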
Thanks.