If the dict values are all serializable
Simply passing the DictProxy through the dict constructor converts it to a plain dict, which can then be serialized to JSON. The example below is from Python 3.6:
>>> import multiprocessing, json
>>> m = multiprocessing.Manager()
>>> d = m.dict()
>>> d["foo"] = "bar"
>>> d
<DictProxy object, typeid 'dict' at 0x2a4d630>
>>> dict(d)
{'foo': 'bar'}
>>> json.dumps(d)
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
...
TypeError: Object of type 'DictProxy' is not JSON serializable
>>> json.dumps(dict(d))
'{"foo": "bar"}'
As you can see, d itself is a DictProxy and cannot be passed to json.dumps directly, but json.dumps(dict(d)) serializes the data without a problem. The same applies if you are using json.dump.
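For instance, here is a minimal sketch that writes the converted dict to a file (the filename data.json is only an example):
>>> with open("data.json", "w") as f:
...     json.dump(dict(d), f)
...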
If some dict values are also DictProxies
Unfortunately, the above method does not work if a value in the DictProxy is also a DictProxy. Such a value is created in this example:
>>> import multiprocessing
>>> m = multiprocessing.Manager()
>>> d = m.dict()
>>> d["foo"] = m.dict()
The solution is to extend the json.JSONEncoder class to handle DictProxy objects, like so:
>>> import multiprocessing, multiprocessing.managers, json
>>> class JSONEncoderWithDictProxy(json.JSONEncoder):
...     def default(self, o):
...         if isinstance(o, multiprocessing.managers.DictProxy):
...             return dict(o)
...         return json.JSONEncoder.default(self, o)
...
>>> m = multiprocessing.Manager()
>>> d = m.dict()
>>> d["foo"] = m.dict()
>>> d["foo"]["bar"] = "baz"
>>> json.dumps(d, cls=JSONEncoderWithDictProxy)
'{"foo": {"bar": "baz"}}'
>>> # This also works:
>>> JSONEncoderWithDictProxy().encode(d)
'{"foo": {"bar": "baz"}}'