JSON in Python: getting / checking for duplicate key error

Python's json module stays within the letter of the specification when a JSON object contains duplicate keys: it silently keeps only the last value:

    >>> import json
    >>> json.loads('{"a": "First", "a": "Second"}')
    {u'a': u'Second'}

I know that this behavior is documented:

The RFC specifies that names within a JSON object should be unique, but does not mandate how repeated names should be handled. By default, this module does not raise an exception; instead, it ignores all but the last name-value pair for a given name:

In my current project I absolutely need to be sure that a file contains no duplicate keys, and to get an error/exception if it does. How can I do that?

I am still stuck on Python 2.7, so a solution that also works with older versions would help me the most.

1 answer

Well, you can try using the JSONDecoder class and passing a custom object_pairs_hook, which receives the key-value pairs before any duplicates are collapsed.

    import json

    def dupe_checking_hook(pairs):
        result = dict()
        for key, val in pairs:
            if key in result:
                raise KeyError("Duplicate key specified: %s" % key)
            result[key] = val
        return result

    decoder = json.JSONDecoder(object_pairs_hook=dupe_checking_hook)

    # Raises a KeyError
    some_json = decoder.decode('''{"a":"hi","a":"bye"}''')

    # Works
    some_json = decoder.decode('''{"a":"hi","b":"bye"}''')
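Since the question is about checking a file, note that json.load and json.loads also accept object_pairs_hook (available since Python 2.7), so you do not need to instantiate a JSONDecoder yourself. A minimal sketch reusing the hook above; the file name data.json is just an example:

    import json

    def dupe_checking_hook(pairs):
        result = {}
        for key, val in pairs:
            if key in result:
                raise KeyError("Duplicate key specified: %s" % key)
            result[key] = val
        return result

    # "data.json" is a placeholder path; any file-like object works.
    with open('data.json') as f:
        data = json.load(f, object_pairs_hook=dupe_checking_hook)

Either form raises as soon as a duplicate key is encountered, which satisfies the Python 2.7 requirement.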

Source: https://habr.com/ru/post/943442/

