Let's say we are given N jobs and K workers to complete them. Some jobs require 2 workers, while others require only one. Also, not every worker can do every job. For example, worker 1 can do jobs 1, 2 and 5, but not jobs 3 and 4. In addition, if we hire worker 1 for job 1, then we want him to also do jobs 2 and 5, since we have already paid him once.
So, suppose we have 5 jobs and 6 workers. Jobs 1, 2 and 4 each require 2 workers, while jobs 3 and 5 require only one. Here is the list of jobs each worker can do, and the wage he requires:
Worker 1 can do jobs 1, 3, 5 and requires $1000.
Worker 2 can do jobs 1, 5 and requires $2000.
Worker 3 can do jobs 1, 2 and requires $1500.
Worker 4 can do jobs 2, 4 and requires $2500.
Worker 5 can do jobs 4, 5 and requires $1500.
Worker 6 can do jobs 3, 5 and requires $1000.
After a little calculation and logical thinking, we can conclude that we must hire workers 1, 3, 4 and 5 (workers 4 and 5 are the only ones who can do job 4, which needs two people; then worker 3 is forced for job 2, and worker 1 is the cheapest way to finish job 1 while also covering job 3). So the minimum total wage we have to pay is 1000 + 1500 + 2500 + 1500 = $6500.
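For an instance this small, the answer can be verified by brute force over all subsets of workers. The sketch below (names like `need`, `skills` and `covers` are my own encoding of the example, not anything from the problem statement) checks every team and keeps the cheapest feasible one:

```python
from itertools import combinations

# Encoding of the example: workers required per job, each worker's
# job set, and each worker's wage (paid once, however many jobs he does).
need = {1: 2, 2: 2, 3: 1, 4: 2, 5: 1}
skills = {1: {1, 3, 5}, 2: {1, 5}, 3: {1, 2},
          4: {2, 4}, 5: {4, 5}, 6: {3, 5}}
wage = {1: 1000, 2: 2000, 3: 1500, 4: 2500, 5: 1500, 6: 1000}

def covers(team):
    # A team is feasible if every job j has at least need[j]
    # hired workers able to perform it.
    return all(sum(j in skills[w] for w in team) >= need[j] for j in need)

# Try every non-empty subset of workers and take the cheapest feasible one.
best_cost, best_team = min(
    (sum(wage[w] for w in team), team)
    for r in range(1, len(wage) + 1)
    for team in combinations(wage, r)
    if covers(team)
)
print(best_team, best_cost)  # (1, 3, 4, 5) 6500
```

This confirms the hand calculation, but it is exponential in the number of workers, which is exactly why an efficient algorithm is needed.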
But how can we find an efficient algorithm that computes this amount? It somehow reminds me of the Hungarian algorithm, but all these additional restrictions do not allow me to apply it directly.