Concurrency considerations between pipes and non-pipes code

I'm in the process of wrapping a C encoding library in a pipes interface, but I'm stuck on some design decisions that need to be made.

After the C library is configured, we hold onto the encoder context. At the same time, we can either encode or change some parameters (let us refer to the Haskell interface to this latter function as tune :: Context -> Int -> IO ()). There are two parts to my question:

  • The encoding part is easily wrapped as Pipe Foo Bar IO (), but I would also like to expose tune. Since concurrent use of the encoder context must be prevented, I would need to take a lock on every iteration in the pipe and protect tune with the same lock. But now I feel like I'm imposing hidden locks on the user. Am I barking up the wrong tree here? How is this usually resolved in the pipes ecosystem? In my case, I expect the pipe part of my code to always run in its own thread, with tuning happening concurrently, but I don't want to force this point of view on users. Other packages in the pipes ecosystem do not seem to force it on their users either.
  • An encoder context that is no longer in use must be properly de-initialized. How does one, in the pipes ecosystem, ensure that such finalization (in this case, performing some IO actions) happens when the pipe is torn down?
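For concreteness, a minimal sketch of the locking scheme described in the first bullet, using base's MVar as the lock. The Context type, tune, and encodeStep here are illustrative stand-ins, not the real library API:

```haskell
import Control.Concurrent.MVar

-- Illustrative stand-in: a real Context would also carry the foreign
-- pointer to the C encoder state alongside the lock.
data Context = Context (MVar ())

-- tune takes the lock so it cannot race with an in-flight encode step.
tune :: Context -> Int -> IO ()
tune (Context lock) n = withMVar lock $ \_ ->
    putStrLn ("tune: " ++ show n)   -- stand-in for the C tuning call

-- Each pipe iteration would take the same lock around the C call.
encodeStep :: Context -> Int -> IO Int
encodeStep (Context lock) x = withMVar lock $ \_ ->
    return (x + 1)                  -- stand-in for the C encode call
```

The cost is exactly the hidden locking the question worries about: every pipe iteration pays for a withMVar even when no tuning is happening.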

A concrete example would be wrapping a compression library, in which case the above becomes:

  • Compression strength is tunable. We set up the pipe and off it goes. What is the best way to allow the compression settings to be changed while the pipe is running, assuming that concurrent access to the compression codec context must be serialized?
  • The compression library allocated a bunch of memory outside the Haskell heap during setup, and we will need to call some library function to free it when the pipe is torn down.

Thanks! All of this may be obvious, but I'm completely new to the pipes ecosystem.

Edit: Reading this back after posting, I'm pretty sure this is the vaguest question I have ever asked. Ugh! Sorry ;-)

1 answer

As for (1), the general solution is to change the Pipe type to:

    Pipe (Either (Context, Int) Foo) Bar IO ()

In other words, it accepts both Foo inputs and tune requests, which it processes internally.
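A sketch of what such a pipe could look like, with dummy Context/Foo/Bar types and stand-in bodies in place of the real C calls (everything here except the Either input shape is a made-up illustration):

```haskell
import Control.Monad (forever)
import Pipes
import qualified Pipes.Prelude as P

-- Dummy stand-ins for the wrapper's real types
data Context = Context
newtype Foo = Foo Int deriving (Eq, Show)
newtype Bar = Bar Int deriving (Eq, Show)

-- One pipe handles both kinds of input, so all access to the encoder
-- context is serialized by whichever single thread runs the pipe.
encoder :: Pipe (Either (Context, Int) Foo) Bar IO ()
encoder = forever $ do
    x <- await
    case x of
        Left (_ctx, n) -> lift (putStrLn ("tune: " ++ show n)) -- stand-in for tune
        Right (Foo i)  -> yield (Bar (i + 1))                  -- stand-in for encoding

-- Running it over a mixed list of inputs: the Left requests are
-- handled as effects, and only the encoded Bars come out.
runExample :: IO [Bar]
runExample = P.toListM $
    each [Right (Foo 1), Left (Context, 5), Right (Foo 2)] >-> encoder
```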

So, let's assume that you have two concurrent Producers, one for the inputs and one for the tuning requests:

    producer1 :: Producer Foo IO ()
    producer2 :: Producer (Context, Int) IO ()

You can use pipes-concurrency to create a buffer that they both feed into, like this:

    example = do
        (output, input) <- spawn Unbounded
        -- input  :: Input  (Either (Context, Int) Foo)
        -- output :: Output (Either (Context, Int) Foo)
        let io1 = runEffect $ producer1 >-> Pipes.Prelude.map Right >-> toOutput output
            io2 = runEffect $ producer2 >-> Pipes.Prelude.map Left  >-> toOutput output
        as <- mapM async [io1, io2]
        runEffect (fromInput input >-> yourPipe >-> someConsumer)
        mapM_ wait as

You can learn more about the pipes-concurrency library by reading this tutorial.

By routing all tuning requests through the same single-threaded Pipe, you ensure that there are never two simultaneous calls to the tune function.

As for (2), there are two ways to acquire a resource with pipes. The more sophisticated approach is to use the pipes-safe library, which provides a bracket function that you can use inside a Pipe, but that is probably overkill for your purposes and mainly exists for acquiring and releasing resources over the lifetime of the pipe. The simpler solution is to use the following with idiom to acquire the pipe:

    withEncoder :: (Pipe Foo Bar IO () -> IO r) -> IO r
    withEncoder k = bracket acquire release $ \resource ->
        k (createPipeFromResource resource)

Then the user simply writes:

    withEncoder $ \yourPipe ->
        runEffect (someProducer >-> yourPipe >-> someConsumer)
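As a self-contained illustration of why the with idiom is safe, here is the same shape with an IORef standing in for the foreign encoder context; acquire, release, and withContext are hypothetical stand-ins for the C library's init/free calls, not a real API:

```haskell
import Control.Exception (bracket)
import Data.IORef

-- Stand-in for the C context: True means "still allocated".
type FakeContext = IORef Bool

acquire :: IO FakeContext
acquire = newIORef True            -- stand-in for the C init call

release :: FakeContext -> IO ()
release ctx = writeIORef ctx False -- stand-in for the C free call

-- The continuation only ever sees a live context, and release runs
-- even if the continuation throws an exception.
withContext :: (FakeContext -> IO r) -> IO r
withContext = bracket acquire release
```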

You can additionally use the managed package, which simplifies the types a bit and makes it easier to acquire multiple resources. You can learn more about it by reading this blog post.


Source: https://habr.com/ru/post/1201751/
