
multiprocessing.Process vs concurrent.futures.ProcessPoolExecutor #210

Open
@smvazirizade

Description

Hi,

I wanted to discuss the multiprocessing video and script with you.

From what I understand, using multiprocessing.Process with start and join should produce the same behavior as using concurrent.futures.ProcessPoolExecutor with map. However, I'm seeing different performance between the two and was wondering if you have any insight into why. The code for both versions is below.

Thank you.

import time
import multiprocessing

start = time.perf_counter()

def do_something(seconds):
    print(f'Sleeping {seconds} second(s)...')
    time.sleep(seconds)
    ans = f'Done Sleeping...{seconds}'
    print(ans)
    return ans

# Start one process per task, then wait for all of them to finish.
processes = []
for i in range(5):
    p = multiprocessing.Process(target=do_something, args=[1 + i/10])
    p.start()
    processes.append(p)

for process in processes:
    process.join()

finish = time.perf_counter()
print(f'Finished in {round(finish - start, 2)} second(s)')

[screenshot: timing output of the multiprocessing.Process version]
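
Note: the snippet above creates processes at module top level. On platforms that use the spawn start method (Windows, and macOS since Python 3.8), that code has to sit under an if __name__ == '__main__': guard, otherwise each child re-imports the module and the run will typically fail with a RuntimeError. A minimal guarded sketch of the same measurement (the guard is the only intended change):

import time
import multiprocessing


def do_something(seconds):
    print(f'Sleeping {seconds} second(s)...')
    time.sleep(seconds)
    return f'Done Sleeping...{seconds}'


if __name__ == '__main__':
    # The guard keeps spawned children from re-running the process-creation code on import.
    start = time.perf_counter()
    processes = [multiprocessing.Process(target=do_something, args=[1 + i/10]) for i in range(5)]
    for p in processes:
        p.start()
    for p in processes:
        p.join()
    finish = time.perf_counter()
    print(f'Finished in {round(finish - start, 2)} second(s)')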

import time
import concurrent.futures

start = time.perf_counter()

def do_something(seconds):
    print(f'Sleeping {seconds} second(s)...')
    time.sleep(seconds)
    ans = f'Done Sleeping...{seconds}'
    print(ans)
    return ans

secs = [1 + i/10 for i in range(5)]

# The with block does not exit until every submitted task has completed.
with concurrent.futures.ProcessPoolExecutor() as executor:
    results = executor.map(do_something, secs)

finish = time.perf_counter()
print(f'Finished in {round(finish - start, 2)} second(s)')

[screenshot: timing output of the ProcessPoolExecutor version]
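
If the pool version comes out slower, one likely reason (an assumption on my part, not something confirmed above) is that ProcessPoolExecutor caps its worker count at the machine's CPU count when max_workers is not given, while the first snippet always starts five processes. On a machine with fewer than five cores, the pool runs some of the sleeps back to back. A sketch that pins the pool to five workers so both approaches launch the same number of processes (max_workers=5 is my own choice for this comparison):

import time
import concurrent.futures


def do_something(seconds):
    print(f'Sleeping {seconds} second(s)...')
    time.sleep(seconds)
    return f'Done Sleeping...{seconds}'


if __name__ == '__main__':
    start = time.perf_counter()
    secs = [1 + i/10 for i in range(5)]

    # max_workers=5 mirrors the five Process objects in the first snippet,
    # so every task can sleep concurrently regardless of the CPU count.
    with concurrent.futures.ProcessPoolExecutor(max_workers=5) as executor:
        results = list(executor.map(do_something, secs))

    finish = time.perf_counter()
    print(f'Finished in {round(finish - start, 2)} second(s)')

With that change both versions should be bounded by the longest sleep, so the measured times should come out close to each other.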
