Treat UTF-16 strings in binary VDF as little-endian
Integers in binary VDF are already treated as little-endian (least
significant byte first) regardless of CPU architecture, but the 16-bit
code units in UTF-16 strings didn't get the same treatment. This led to a test failure
on big-endian machines.
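
For illustration only (this snippet is not part of the change, it just shows
CPython's documented codec behaviour): when no byte order mark is present,
the 'utf-16' codec falls back to the host's native byte order, while
'utf-16le' always reads the least significant byte first, which is what
binary VDF stores.

    data = u'A'.encode('utf-16le')   # b'A\x00', as stored in a binary VDF wide string

    data.decode('utf-16le')          # u'A' on every architecture
    data.decode('utf-16')            # u'A' on little-endian hosts, but
                                     # u'\u4100' on big-endian hosts (the reported failure)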

Resolves: ValvePython#33
Signed-off-by: Simon McVittie <[email protected]>
smcv committed Dec 15, 2023
1 parent d762926 commit fe85ab8
Showing 1 changed file with 2 additions and 2 deletions.
vdf/__init__.py: 2 additions and 2 deletions

--- a/vdf/__init__.py
+++ b/vdf/__init__.py
@@ -361,7 +361,7 @@ def read_string(fp, wide=False):
         result = buf[:end]
 
         if wide:
-            result = result.decode('utf-16')
+            result = result.decode('utf-16le')
         elif bytes is not str:
             result = result.decode('utf-8', 'replace')
         else:
@@ -469,7 +469,7 @@ def _binary_dump_gen(obj, level=0, alt_format=False):
                 value = value.encode('utf-8') + BIN_NONE
                 yield BIN_STRING
             except:
-                value = value.encode('utf-16') + BIN_NONE*2
+                value = value.encode('utf-16le') + BIN_NONE*2
                 yield BIN_WIDESTRING
             yield key + BIN_NONE + value
         elif isinstance(value, float):
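
The dump path needs the same fix for the same reason (again only an
illustration of CPython's codec behaviour, not code from this commit):
encoding with 'utf-16' prepends a byte order mark and uses the host's byte
order, while 'utf-16le' produces the BOM-free little-endian bytes the
format expects.

    u'A'.encode('utf-16le')   # b'A\x00' on every architecture: no BOM, least significant byte first
    u'A'.encode('utf-16')     # b'\xff\xfeA\x00' on little-endian hosts,
                              # b'\xfe\xff\x00A' on big-endian hosts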
