Multiprocessing in Python, working with several files


I have written a script that should convert several files simultaneously, but instead of converting the 4 files with several processes, the code converts the files one by one. Here is the code:

import os
import multiprocessing as mp

def convert(directoryname):
    for path, dirs, files in os.walk(directoryname):
        for f in files:
            if f.endswith(".txt"):
                f1 = f
                path1 = path
                p = mp.Process(target=convert1, args=(path1, f1,))
    p.start()

Does anyone have an idea?

Your code overwrites p on every iteration, so only one process is started once the loop has finished.

Instead, call p.start() when you create the process, and store the processes in a list so you can call join() on all of them at the end:

def convert(directoryname):
    process_list = []
    for path, dirs, files in os.walk(directoryname):
        for f in files:
            if f.endswith(".txt"):
                f1 = f
                path1 = path
                p = mp.Process(target=convert_stdf_hdf5, args=(path1, f1,))
                p.start()
                process_list.append(p)
    # wait for all processes to finish
    for p in process_list:
        p.join()
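As a side note, if the directory contains many .txt files, this starts one process per file. A multiprocessing.Pool caps the number of workers at the CPU count and handles the start/join bookkeeping for you. Here is a minimal sketch, assuming the same convert_stdf_hdf5(path, filename) worker and a hypothetical directory name:

import os
import multiprocessing as mp

def convert_with_pool(directoryname):
    # Collect (path, filename) pairs for every .txt file under directoryname.
    jobs = [(path, f)
            for path, dirs, files in os.walk(directoryname)
            for f in files
            if f.endswith(".txt")]
    # The Pool spawns a fixed number of worker processes (CPU count by default)
    # and distributes the jobs among them.
    with mp.Pool() as pool:
        pool.starmap(convert_stdf_hdf5, jobs)  # worker assumed to be defined elsewhere

if __name__ == "__main__":
    convert_with_pool("input_data")  # hypothetical directory name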
