Python multiprocessing.Pool.map(): "TypeError: string indices must be integers, not str"

I am trying to use multiprocessing.Pool to process a list of dictionaries in parallel. Example below:

(Please note: this is a toy example; my real code will do CPU-intensive processing on the values in each dictionary.)

import multiprocessing

my_list = [{'letter': 'a'}, {'letter': 'b'}, {'letter': 'c'}]

def process_list(list_elements):
    ret_list = []
    for my_dict in list_elements:
        ret_list.append(my_dict['letter'])
    return ret_list

if __name__ == "__main__":
    pool = multiprocessing.Pool()
    letters = pool.map(process_list, my_list)
    print letters

If I run the code above, I get the following error:

Traceback (most recent call last):
  File "multiprocess_fail.py", line 13, in <module>
    letters = pool.map(process_list, my_list)
  File "/usr/lib/python2.7/multiprocessing/pool.py", line 250, in map
    return self.map_async(func, iterable, chunksize).get()
  File "/usr/lib/python2.7/multiprocessing/pool.py", line 554, in get
    raise self._value
TypeError: string indices must be integers, not str

I do not know what string indices it is referring to. Shouldn't pool.map iterate over the items of my_list (i.e. the dictionaries)? Do I need to change how the data is passed to the map function to get this to run?

1 answer

pool.map() calls the function once for each item of the iterable, passing that single item as the argument.

So in your case, process_list() is being called like this:

process_list({'letter': 'a'})
process_list({'letter': 'b'})
# etc.
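
Leaving the parallelism aside, pool.map() hands items to the function the same way the built-in map() does: one at a time. A small illustrative snippet (the show() helper is made up for the demonstration, not from your code):

my_list = [{'letter': 'a'}, {'letter': 'b'}, {'letter': 'c'}]

def show(arg):
    print arg  # each call receives a single dict, never the whole list

map(show, my_list)
# prints:
# {'letter': 'a'}
# {'letter': 'b'}
# {'letter': 'c'}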

That means list_elements is a single dictionary, not a list of dictionaries. When you then loop over it with for:

for my_dict in list_elements:

you are iterating over the dictionary's keys, so my_dict is bound to the string 'letter'. When you then evaluate:

my_dict['letter']

you are effectively doing 'letter']['letter'], i.e. indexing a string with another string, which is exactly what raises the TypeError.
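
You can confirm both pieces of that behaviour without multiprocessing at all; this is just an illustrative snippet, not code from your script:

my_dict = {'letter': 'a'}

# Iterating over a dict yields its keys, which are strings.
for key in my_dict:
    print key  # prints: letter

# Indexing a string with another string is the actual failure.
try:
    'letter'['letter']
except TypeError as exc:
    print exc  # string indices must be integers, not str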

The fix is to make the function handle a single element:

def process_list(list_element):
    return list_element['letter']

Return just the single value; map() collects the per-item results into a list for you.
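
Putting it together, a sketch of the corrected script in the Python 2.7 style of your example (pool.close()/pool.join() are added only for a tidy shutdown):

import multiprocessing

my_list = [{'letter': 'a'}, {'letter': 'b'}, {'letter': 'c'}]

def process_list(list_element):
    # Receives one dict per call; the CPU-heavy work goes here.
    return list_element['letter']

if __name__ == "__main__":
    pool = multiprocessing.Pool()
    letters = pool.map(process_list, my_list)
    pool.close()
    pool.join()
    print letters  # ['a', 'b', 'c']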

