MessagePack serializer implementation for Python (msgpack.org)
MessagePack is an efficient binary serialization format. It lets you exchange data among multiple languages, like JSON, but it is faster and smaller. This package provides CPython bindings for reading and writing MessagePack data.
```
$ pip install msgpack
```
The extension module in msgpack (`msgpack._cmsgpack`) does not support PyPy, but msgpack provides a pure Python implementation (`msgpack.fallback`) for PyPy.

When you can't use a binary distribution on Windows, you need to install Visual Studio or the Windows SDK to build the extension. Without the extension, the pure Python implementation runs slowly on CPython.
Use `packb` for packing and `unpackb` for unpacking. msgpack provides `dumps` and `loads` as aliases for compatibility with `json` and `pickle`.

`pack` and `dump` pack to a file-like object. `unpack` and `load` unpack from a file-like object.
```python
>>> import msgpack
>>> msgpack.packb([1, 2, 3])
b'\x93\x01\x02\x03'
>>> msgpack.unpackb(_)
[1, 2, 3]
```
Read the docstring for options.
`Unpacker` is a "streaming unpacker". It unpacks multiple objects from one stream (or from bytes provided through its `feed` method).
```python
import msgpack
from io import BytesIO

buf = BytesIO()
for i in range(100):
    buf.write(msgpack.packb(i))

buf.seek(0)

unpacker = msgpack.Unpacker(buf)
for unpacked in unpacker:
    print(unpacked)
```
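The `feed` method mentioned above accepts raw bytes (for example, chunks received from a socket) instead of wrapping a stream; a minimal sketch:

```python
import msgpack

# Create an Unpacker with no stream and push bytes into it via feed().
unpacker = msgpack.Unpacker()
unpacker.feed(msgpack.packb(1) + msgpack.packb(2))

# Iterating yields each complete object that has been fed so far.
values = list(unpacker)
print(values)  # [1, 2]
```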
It is also possible to pack/unpack custom data types. Here is an example for `datetime.datetime`.
```python
import datetime
import msgpack

useful_dict = {
    "id": 1,
    "created": datetime.datetime.now(),
}

def decode_datetime(obj):
    if '__datetime__' in obj:
        obj = datetime.datetime.strptime(obj["as_str"], "%Y%m%dT%H:%M:%S.%f")
    return obj

def encode_datetime(obj):
    if isinstance(obj, datetime.datetime):
        return {'__datetime__': True, 'as_str': obj.strftime("%Y%m%dT%H:%M:%S.%f")}
    return obj

packed_dict = msgpack.packb(useful_dict, default=encode_datetime)
this_dict_again = msgpack.unpackb(packed_dict, object_hook=decode_datetime)
```
`Unpacker`'s `object_hook` callback receives a dict; the `object_pairs_hook` callback may instead be used to receive a list of key-value pairs.
NOTE: msgpack can now encode a datetime with tzinfo into the standard ext type. See the `datetime` option in the `Packer` docstring.
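A sketch of that option, assuming msgpack >= 1.0, where `datetime=True` on the packer and `timestamp=3` on the unpacker round-trip timezone-aware datetimes through the Timestamp ext type:

```python
import datetime
import msgpack

# Only aware datetimes (with tzinfo) can be packed this way.
dt = datetime.datetime(2024, 1, 2, 3, 4, 5, tzinfo=datetime.timezone.utc)

packed = msgpack.packb(dt, datetime=True)        # encode as Timestamp ext type
roundtripped = msgpack.unpackb(packed, timestamp=3)  # decode back to datetime
print(roundtripped)
```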
It is also possible to pack/unpack custom data types using the ext type.
```python
>>> import msgpack
>>> import array
>>> def default(obj):
...     if isinstance(obj, array.array) and obj.typecode == 'd':
...         return msgpack.ExtType(42, obj.tobytes())
...     raise TypeError("Unknown type: %r" % (obj,))
...
>>> def ext_hook(code, data):
...     if code == 42:
...         a = array.array('d')
...         a.frombytes(data)
...         return a
...     return msgpack.ExtType(code, data)
...
>>> data = array.array('d', [1.2, 3.4])
>>> packed = msgpack.packb(data, default=default)
>>> unpacked = msgpack.unpackb(packed, ext_hook=ext_hook)
>>> data == unpacked
True
```
As an alternative to iteration, `Unpacker` objects provide `unpack`, `skip`, `read_array_header` and `read_map_header` methods. The former two read an entire message from the stream, respectively deserializing and returning the result, or ignoring it. The latter two methods return the number of elements in the upcoming container, so that each element in an array, or key-value pair in a map, can be unpacked or skipped individually.
Early versions of msgpack didn't distinguish string and binary types. The type for representing both string and binary types was named raw.

You can pack into and unpack from this old spec using the `use_bin_type=False` and `raw=True` options.
```python
>>> import msgpack
>>> msgpack.unpackb(msgpack.packb([b'spam', 'eggs'], use_bin_type=False), raw=True)
[b'spam', b'eggs']
>>> msgpack.unpackb(msgpack.packb([b'spam', 'eggs'], use_bin_type=True), raw=False)
[b'spam', 'eggs']
```
To use the ext type, pass a `msgpack.ExtType` object to the packer.
```python
>>> import msgpack
>>> packed = msgpack.packb(msgpack.ExtType(42, b'xyzzy'))
>>> msgpack.unpackb(packed)
ExtType(code=42, data=b'xyzzy')
```
You can use it with `default` and `ext_hook`. See the `array.array` example above.
To unpack data received from an unreliable source, msgpack provides two security options.

`max_buffer_size` (default: `100*1024*1024`) limits the internal buffer size. It is also used to limit the preallocated list size.
`strict_map_key` (default: `True`) limits the type of map keys to bytes and str. While the msgpack spec doesn't limit the types of map keys, there is a risk of hashdos. If you need to support other types for map keys, use `strict_map_key=False`.
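For example, an integer map key is rejected by default but accepted with `strict_map_key=False`:

```python
import msgpack

packed = msgpack.packb({1: "one"})  # map with an integer key
try:
    msgpack.unpackb(packed)          # strict_map_key=True by default
    rejected = False
except ValueError:
    rejected = True

result = msgpack.unpackb(packed, strict_map_key=False)
print(result)  # {1: 'one'}
```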
CPython's GC starts when the number of allocated objects grows. This means unpacking a large message may trigger useless GC runs. You can use `gc.disable()` when unpacking a large message.
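A minimal sketch of that pattern (the payload here is just a stand-in for a large message):

```python
import gc
import msgpack

large_payload = msgpack.packb(list(range(100_000)))

gc.disable()        # avoid GC runs triggered by the many new objects
try:
    obj = msgpack.unpackb(large_payload)
finally:
    gc.enable()     # always re-enable GC afterwards

print(len(obj))
```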
List is the default sequence type in Python, but tuple is lighter than list. You can use `use_list=False` while unpacking when performance is important.
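For example:

```python
import msgpack

packed = msgpack.packb([1, 2, 3])
as_tuple = msgpack.unpackb(packed, use_list=False)  # arrays decode as tuples
print(as_tuple)  # (1, 2, 3)
```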
The package name on PyPI was changed from `msgpack-python` to `msgpack` in 0.5.

When upgrading from msgpack-0.4 or earlier, run `pip uninstall msgpack-python` before `pip install -U msgpack`.
Python 2 support
The extension module does not support Python 2 anymore. The pure Python implementation (`msgpack.fallback`) is used for Python 2.

msgpack 1.0.6 dropped official support of Python 2.7, as pip and GitHub Actions (setup-python) no longer support Python 2.7.
Packer

- Packer uses `use_bin_type=True` by default. Bytes are encoded in the bin type in msgpack.
- The `encoding` option is removed. UTF-8 is always used.
Unpacker

- Unpacker uses `raw=False` by default. It assumes str types are valid UTF-8 strings and decodes them to Python str (unicode) objects. The `encoding` option is removed. You can use `raw=True` to support the old format (e.g. unpack into bytes, not str).
- The default value of `max_buffer_size` is changed from 0 to 100 MiB to avoid DoS attacks. You need to pass `max_buffer_size=0` if you have large but safe data.
- The default value of `strict_map_key` is changed to True to avoid hashdos. You need to pass `strict_map_key=False` if you have data containing map keys whose type is neither bytes nor str.