Comment by halfcat

3 days ago

> I don't want to pickle and re-instantiate a cloud client every time I download some bytes for instance

Have you tried multiprocessing.shared_memory to address this?

I haven't played with that much! This isn't really a problem with the way I usually write this sort of code: when I use multiprocessing, I use a Process class or a worker task function with a setup step followed by a while loop that pulls from a work/control queue (roughly the pattern sketched below). But in the Pyper functional programming world, it would be a concern.
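
For what it's worth, here's a minimal sketch of that pattern. `CloudClient` is a made-up stand-in for whatever expensive client you'd otherwise have to pickle; the point is that it gets constructed once inside the worker process, not shipped across the process boundary:

```python
import multiprocessing as mp

class CloudClient:
    """Hypothetical stand-in for an expensive-to-construct cloud client."""
    def download(self, key):
        return f"bytes for {key}"

def worker(work_queue: mp.Queue, result_queue: mp.Queue) -> None:
    # Setup step: build the client once per process, so it is never pickled.
    client = CloudClient()
    # Then loop, pulling work items until a sentinel tells us to stop.
    while True:
        key = work_queue.get()
        if key is None:
            break
        result_queue.put(client.download(key))

if __name__ == "__main__":
    work_q, result_q = mp.Queue(), mp.Queue()
    procs = [mp.Process(target=worker, args=(work_q, result_q)) for _ in range(4)]
    for p in procs:
        p.start()
    keys = ["a", "b", "c", "d"]
    for key in keys:
        work_q.put(key)
    for _ in procs:
        work_q.put(None)  # one shutdown sentinel per worker
    results = [result_q.get() for _ in keys]
    for p in procs:
        p.join()
    print(results)
```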

IIRC multiprocessing.shared_memory is a much lower level of abstraction than most Python stuff, so I think I'd need to figure out how to make the client use the shared memory, and I'm not sure that I could.
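
To illustrate what I mean by low-level, a minimal sketch of the API as I understand it: you get a raw, fixed-size byte buffer shared between processes, and anything richer than bytes is yours to serialize and manage.

```python
from multiprocessing import shared_memory

# Create a raw 16-byte shared buffer -- nothing object-shaped about it.
shm = shared_memory.SharedMemory(create=True, size=16)
try:
    shm.buf[:5] = b"hello"          # you read and write plain bytes yourself
    print(bytes(shm.buf[:5]))       # b'hello'
    # Another process would attach via SharedMemory(name=shm.name) and see
    # the same bytes, but turning a client object into something that lives
    # in this buffer is entirely up to you.
finally:
    shm.close()
    shm.unlink()
```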