This repository has been archived by the owner on Jan 9, 2022. It is now read-only.

AttributeError: 'NoneType' object has no attribute 'encode' #9

Open
ohheyrj opened this issue Oct 2, 2021 · 4 comments

Comments


ohheyrj commented Oct 2, 2021

Hello,

I am trying to get my backups into S3; however, when I run the add-on I get the following error:

[s6-init] making user provided files available at /var/run/s6/etc...exited 0.
[s6-init] ensuring user provided files have correct perms...exited 0.
[fix-attrs.d] applying ownership & permissions fixes...
[fix-attrs.d] done.
[cont-init.d] executing container initialization scripts...
[cont-init.d] 00-banner.sh: executing... 
-----------------------------------------------------------
 Add-on: Amazon S3 Backup
 Automatically backup Home Assistant snapshots to Amazon S3
-----------------------------------------------------------
 Add-on version: 1.1
 You are running the latest version of this add-on.
 System: Home Assistant OS 6.4  (aarch64 / raspberrypi4-64)
 Home Assistant Core: 2021.9.7
 Home Assistant Supervisor: 2021.09.6
-----------------------------------------------------------
 Please, share the above information when looking for help
 or support in, e.g., GitHub, forums or the Discord chat.
-----------------------------------------------------------
[cont-init.d] 00-banner.sh: exited 0.
[cont-init.d] 01-log-level.sh: executing... 
Log level is set to DEBUG
[cont-init.d] 01-log-level.sh: exited 0.
[cont-init.d] done.
[services.d] starting services
[services.d] done.
[20:35:09] INFO: Starting Amazon S3 Backup...
DEBUG:__main__:Local file /backup/0a98d79d.tar found in S3 with matching size of 101488640 bytes
WARNING:__main__:Local file /backup/4286cc97.tar not found in S3
Traceback (most recent call last):
  File "/usr/bin/amazon-s3-backup/amazon-s3-backup.py", line 201, in <module>
    upload_file(file, s3_bucket, supervisor_api)
  File "/usr/bin/amazon-s3-backup/amazon-s3-backup.py", line 153, in upload_file
    s3_bucket.upload_file(str(file), metadata)
  File "/usr/bin/amazon-s3-backup/s3bucket.py", line 64, in upload_file
    self.s3_client.upload_file(Filename=file,
  File "/usr/lib/python3.8/site-packages/boto3/s3/inject.py", line 129, in upload_file
    return transfer.upload_file(
  File "/usr/lib/python3.8/site-packages/boto3/s3/transfer.py", line 279, in upload_file
    future.result()
  File "/usr/lib/python3.8/site-packages/s3transfer/futures.py", line 106, in result
    return self._coordinator.result()
  File "/usr/lib/python3.8/site-packages/s3transfer/futures.py", line 265, in result
    raise self._exception
  File "/usr/lib/python3.8/site-packages/s3transfer/tasks.py", line 126, in __call__
    return self._execute_main(kwargs)
  File "/usr/lib/python3.8/site-packages/s3transfer/tasks.py", line 150, in _execute_main
    return_value = self._main(**kwargs)
  File "/usr/lib/python3.8/site-packages/s3transfer/upload.py", line 692, in _main
    client.put_object(Bucket=bucket, Key=key, Body=body, **extra_args)
  File "/usr/lib/python3.8/site-packages/botocore/client.py", line 357, in _api_call
    return self._make_api_call(operation_name, kwargs)
  File "/usr/lib/python3.8/site-packages/botocore/client.py", line 648, in _make_api_call
    request_dict = self._convert_to_request_dict(
  File "/usr/lib/python3.8/site-packages/botocore/client.py", line 694, in _convert_to_request_dict
    api_params = self._emit_api_params(
  File "/usr/lib/python3.8/site-packages/botocore/client.py", line 723, in _emit_api_params
    self.meta.events.emit(
  File "/usr/lib/python3.8/site-packages/botocore/hooks.py", line 356, in emit
    return self._emitter.emit(aliased_event_name, **kwargs)
  File "/usr/lib/python3.8/site-packages/botocore/hooks.py", line 228, in emit
    return self._emit(event_name, kwargs)
  File "/usr/lib/python3.8/site-packages/botocore/hooks.py", line 211, in _emit
    response = handler(**kwargs)
  File "/usr/lib/python3.8/site-packages/botocore/handlers.py", line 529, in validate_ascii_metadata
    value.encode('ascii')
AttributeError: 'NoneType' object has no attribute 'encode'
[cont-finish.d] executing container finish scripts...
[cont-finish.d] 99-message.sh: executing... 
[cont-finish.d] 99-message.sh: exited 0.
[cont-finish.d] done.
[s6-finish] waiting for services.
[s6-finish] sending all processes the TERM signal.
[s6-finish] sending all processes the KILL signal and exiting.

I thought it was just one bad apple, but after deleting that backup, another one failed with the same error.

If you need any more info, please let me know.
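Reading the bottom of the traceback: botocore's validate_ascii_metadata handler calls value.encode('ascii') on every S3 metadata value, so a None value (instead of a string) in the metadata dict produces exactly this AttributeError. A minimal standalone sketch, with hypothetical metadata values rather than the add-on's actual code:

```python
# Hypothetical metadata for a backup; "homeassistant" ends up as None.
metadata = {"name": "4286cc97.tar", "homeassistant": None}

# botocore's validate_ascii_metadata does roughly this per value,
# which crashes when a value is None instead of a string:
try:
    for value in metadata.values():
        value.encode("ascii")
except AttributeError as exc:
    print(exc)  # 'NoneType' object has no attribute 'encode'

# Dropping None-valued keys before the upload call avoids the crash:
clean_metadata = {k: v for k, v in metadata.items() if v is not None}
```

This only illustrates the failure mode; whether dropping the key is the right fix is discussed further down the thread.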

@ScottG489

I'm also getting this error. It uploaded one backup but failed on the second, just like in your stack trace.

@ScottG489

If I disable upload_missing_files in the add-on's configuration, this error goes away. However, that makes it skip existing backups, which means it could miss some in the future, which seems dangerous.

@ScottG489

I think I figured it out!

The upload_file method in amazon-s3-backup.py retrieves the snapshot JSON from supervisorapi.py. In Home Assistant, using something like the Terminal add-on, you can run ha backups --raw-json and see that "homeassistant" isn't a top-level JSON field; it's under "content". Perhaps the supervisor changed this at some point after the add-on's initial release.
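The shape change described above could be sketched like this (field names and values other than "homeassistant" and "content" are illustrative; the real ha backups --raw-json output has more fields):

```python
# Older supervisor output: "homeassistant" at the top level (illustrative).
old_shape = {"slug": "4286cc97", "homeassistant": "2021.9.7"}

# Newer supervisor output: "homeassistant" nested under "content" (illustrative).
new_shape = {"slug": "4286cc97", "content": {"homeassistant": True}}

# A top-level lookup, as the add-on appears to do, now returns None:
print(new_shape.get("homeassistant"))  # None
```

That None is then placed into the S3 metadata dict, which is what botocore chokes on in the traceback above.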

If you look at the S3 properties of one of the files that did upload, you can see this information there. None of it is actually necessary, so you could remove the lookup of that field entirely. The more correct approach, preserving the functionality, would be to read the homeassistant field under content, which I might take a swing at. I've forked the repo and have it working.

I'm happy to submit a PR here, but it doesn't appear that @gdrapp is paying attention to PRs, since other critical PRs haven't been merged. The aforementioned PR is from a repo that I actually forked, not this one.

In any case, I'll be using my fork for now, and if I don't hear from @gdrapp in a while, I'll create a new repo with his code (credited, of course) that I'll try to actively maintain, incorporating all the outstanding PRs here.

Also, I'm not a Python guy, so if anyone knows how to actually get the content.homeassistant field from the JSON (code here), let me know and I'll add that as well. For now I'm just leaving it out of the metadata uploaded to S3 with the file.
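A nested lookup that tolerates the field being absent could look like this (a sketch only; the function name and sample inputs are made up, not the add-on's code):

```python
def get_homeassistant(backup_json):
    """Fetch content.homeassistant, returning None if either level is missing."""
    return (backup_json or {}).get("content", {}).get("homeassistant")

print(get_homeassistant({"content": {"homeassistant": True}}))  # True
print(get_homeassistant({"slug": "4286cc97"}))  # None
```

Because dict.get with a default never raises, this degrades to None instead of crashing when the supervisor's JSON shape changes again; the caller would still need to skip None before adding it to the S3 metadata.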


ohheyrj commented Dec 13, 2021

@ScottG489 good find!
