Updates for v0.6
See the changelog for details
sabeechen committed Apr 3, 2019
1 parent 49d8636 commit e726d17
Showing 15 changed files with 499 additions and 182 deletions.
14 changes: 7 additions & 7 deletions .vscode/tasks.json
@@ -4,7 +4,7 @@
"version": "2.0.0",
"tasks": [
{
"label": "Run Sync Server Locally",
"label": "RUN: Add-on",
"type": "shell",
"command": "python run.py ../dev/data/options.json",
"options": {
@@ -17,7 +17,7 @@
}
},
{
"label": "Run Mock Hass.io Server",
"label": "RUN: Mock Hass.io Server",
"type": "shell",
"command": "python mock_hassio.py",
"options": {
@@ -26,7 +26,7 @@
"problemMatcher": []
},
{
"label": "Run local Docker Instance",
"label": "RUN: Add-on in Docker Instance",
"type": "shell",
"command": "${workspaceFolder}/run_local_docker.bat",
"options": {
@@ -35,13 +35,13 @@
"problemMatcher": []
},
{
"label": "Stop Local Docker Instance",
"label": "STOP: Add-on in docker",
"type": "shell",
"command": "docker kill $(docker ps -q --filter \"label=run-from-vscode=1\")",
"problemMatcher": []
},
{
"label": "Publish a new version to Docker Hub",
"label": "PUBLISH: New version to Docker Hub",
"type": "shell",
"command": "${workspaceFolder}/deploy.bat",
"problemMatcher": [],
@@ -51,9 +51,9 @@
}
,
{
"label": "Publish local add-on",
"label": "DEPLOY: Add-on to local hassio",
"type": "shell",
"command": "python deploy_local_addon.py \\\\192.168.1.144",
"command": "python deploy_local_addon.py \\\\hassio",
"problemMatcher": [],
"options": {
"cwd": "${workspaceFolder}"
61 changes: 48 additions & 13 deletions README.md
@@ -42,24 +42,30 @@ The add-on is installed like any other.
## Configuration Options
In addition to the options described in the instructions above:
* **snapshot_time_of_day** (default: None): The time of day (local time) that new snapshots should be created in 24 hour "HH:MM" format. When not specified (the default), snapshots are created at the same time of day of the most recent snapshot.
> #### Example: Create snapshots at 1:30pm
> `"snapshot_time_of_day": "13:30"`
* **snapshot_stale_minutes** (default: 180): How long to wait after a snapshot should have been created before considering snapshots stale and in need of attention. Setting this too low can cause you to be notified of transient errors, e.g. the internet, Google Drive, or Home Assistant being offline briefly.
> #### Example: Notify after 12 hours of staleness
> `"snapshot_stale_minutes": 720`
* **require_login** (default: true): When true, your Home Assistant username and password are required to access the backup status page. Turning this off isn't recommended.
> #### Example: Don't require login
> `"require_login": false`
* **certfile** (default: /ssl/certfile.pem): The path to your SSL certificate file
* **keyfile** (default: /ssl/keyfile.pem): The path to your SSL keyfile
> #### Example: Use certs you keep in a weird place
> `"certfile": "/ssl/weird/path/cert.pem"`,
>
> `"keyfile": "/ssl/weird/path/key.pem"`
> #### Example: Use certs you keep in a weird place
> ```json
> "certfile": "/ssl/weird/path/cert.pem",
> "keyfile": "/ssl/weird/path/key.pem"
> ```
* **verbose** (default: false): If true, enable additional debug logging. Useful if you start seeing errors and need to file a bug with me.

> #### Example: Turn on verbose logging
> `"verbose": true`
* **generational_*** (default: None): When set, older snapshots will be kept longer using a [generational backup scheme](https://en.wikipedia.org/wiki/Backup_rotation_scheme). See the [question below](#can-i-keep-older-backups-for-longer) for configuration options.
> #### Example: Keep one snapshot a day for 3 days and one a week for 4 weeks
> ```json
> "generational_days": 3,
> "generational_weeks": 4
> ```
## FAQ
### How will I know this will be there when I need it?
@@ -124,6 +130,35 @@ You might need to change the `url:` if you use ssl or access Home Assistant through
### Can I specify what time of day snapshots should be created?
You can add `"snapshot_time_of_day": "13:00"` to your add-on configuration to make snapshots always happen at 1pm. Specify the time in 24 hour format of `"HH:MM"`. When unspecified, the next snapshot will be created at the same time of day as the last one.
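The scheduling semantics can be sketched with a small standalone helper (a hypothetical illustration, not the add-on's actual scheduler, which also factors in `days_between_snapshots`):

```python
from datetime import datetime, timedelta

def next_snapshot_time(now, time_of_day):
    """Return the next local time matching time_of_day ("HH:MM", 24-hour)."""
    hour, minute = map(int, time_of_day.split(":"))
    candidate = now.replace(hour=hour, minute=minute, second=0, microsecond=0)
    if candidate <= now:
        # Today's slot has already passed, so schedule for tomorrow.
        candidate += timedelta(days=1)
    return candidate
```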

### Can I keep older backups for longer?
The add-on can be configured to keep [generational backups](https://en.wikipedia.org/wiki/Backup_rotation_scheme) on daily, weekly, monthly, and yearly intervals instead of just deleting the oldest snapshot. This can be useful if, for example, you've made an erroneous change but haven't noticed for several days and all the backups before the change are gone. With a configuration setting like this...
```json
"generational_days": 3,
"generational_weeks": 4,
"generational_months": 12,
"generational_years": 5
```
... a snapshot will be kept for each of the last 3 days, the last 4 weeks, the last 12 months, and the last 5 years. Additionally, you may configure the day of the week, day of the month, and day of the year on which weekly, monthly, and yearly snapshots are maintained.
```json
"generational_days": 3,

"generational_weeks": 4,
"generational_day_of_week": "mon", // Can be 'mon', 'tue', 'wed', 'thu', 'fri', 'sat' or 'sun' (defaults to 'mon')

"generational_months": 12,
"generational_day_of_month": 1, // Can be 1 through 31 (defaults to 1)

"generational_years": 5,
"generational_day_of_year": 1, // Can be 1 through 365 (defaults to 1)
```
* Any combination of days, weeks, months, and years can be used. They all default to 0.
* It's highly recommended to set `"days_between_snapshots": 1` to ensure a snapshot is available for each day.
* Ensure you've set `max_snapshots_in_drive` appropriately high to keep enough snapshots (24 in the example above).
* Once this option is enabled, it may take several days or weeks to see older snapshots get cleaned up. Old snapshots are only deleted when the number present exceeds `max_snapshots_in_drive` or `max_snapshots_in_hassio`.

### I already have something that creates snapshots on a schedule. Can I use this just to backup to Google Drive?
If you set `"days_between_snapshots": 0`, the add-on won't try to create new snapshots, but it will still back up any it finds to Google Drive and clean up old snapshots in both Home Assistant and Google Drive. This can be useful if, for example, you already have an automation that creates snapshots on a schedule.

### Does this store any personal information?
As a matter of principle, I only keep track of and store information necessary for the add-on to function. To the best of my knowledge the scope of this is:
* Once authenticated with Google, your Google credentials are only stored locally on your Home Assistant instance. This isn't your actual username and password, only an opaque token returned from Google used to verify that you previously gave the Add-on permission to access your Google Drive. Your password is never seen by me or the add-on.
15 changes: 8 additions & 7 deletions dev/data/options.json
@@ -2,7 +2,7 @@
"max_snapshots_in_hassio": 5,
"max_snapshots_in_google_drive": 5,
"days_between_snapshots": 1,
"verbose" : true,
"verbose" : false,
"use_ssl": false,
"hassio_base_url": "http://127.0.0.1:5000/",
"ha_base_url": "http://127.0.0.1:5000/homeassistant/api/",
@@ -13,10 +13,11 @@
"folder_file_path": "../dev/data/folder.dat",
"credentials_file_path": "../dev/data/credentials.dat",
"snapshot_time_of_day": "21:24",
"generational_backup": {
"days": 3,
"weeks": 2,
"months": 2,
"years": 2
}
"generational_days": 1,
"generational_weeks": 2,
"generational_months": 3,
"generational_years": 4,
"generational_day_of_week": "tue",
"generational_day_of_month": 5,
"generational_day_of_year": 6
}
15 changes: 15 additions & 0 deletions hassio-google-drive-backup/CHANGELOG.md
@@ -1,4 +1,19 @@
# Changelog
## [0.6] - 2019-04-03
### Added
- Adds config options for generational backup to keep daily, weekly, monthly, and yearly snapshots. See the [FAQ](https://github.com/sabeechen/hassio-google-drive-backup#can-i-keep-older-backups-for-longer) on GitHub for details.
- Adds the ability to turn off automatic snapshots by setting `"days_between_snapshots": 0`
- Adds uniform logging (timestamps and level) throughout the project.
- Adds a top-level menu for viewing the add-on debug logs and "simulating" backup errors.
- Adds better error messaging when Drive runs out of space or credentials are invalidated.

### Fixes
- Fixes a configuration error that caused the default configuration options to be invalid.

### Changes
- Delegates Google credential authentication entirely to the domain so project credentials aren't stored in the add-on.
- Changes the "Manual" authentication workflow to require users to generate their own client ID and client secret.

## [0.52] - 2019-03-31
### Added
- Adds a config option for setting the time of day snapshots should be created; for example, add `"snapshot_time_of_day": "13:00"` to your config to schedule snapshots at 1pm.
1 change: 0 additions & 1 deletion hassio-google-drive-backup/backup/backupscheme.py
@@ -6,7 +6,6 @@
from datetime import timedelta
from calendar import monthrange


class BackupScheme(ABC):
def __init__(self):
pass
62 changes: 39 additions & 23 deletions hassio-google-drive-backup/backup/config.py
@@ -1,6 +1,7 @@
import os.path
import pprint
import json
import logging
from .logbase import LogBase
from typing import Dict, List, Tuple, Any, Optional

HASSIO_OPTIONS_FILE = '/data/options.json'
@@ -32,7 +33,7 @@
}


class Config(object):
class Config(LogBase):

def __init__(self, file_paths: List[str] = [], extra_config: Dict[str, any] = {}):
self.config: Dict[str, Any] = DEFAULTS
@@ -43,12 +44,20 @@ def __init__(self, file_paths: List[str] = [], extra_config: Dict[str, any] = {}
for config_file in file_paths:
if os.path.isfile(config_file):
with open(config_file) as file_handle:
print("Loading config from " + config_file)
self.info("Loading config from " + config_file)
self.config.update(json.load(file_handle))

self.config.update(extra_config)
print("Loaded config:")
pprint.pprint(self.config)
self.info("Loaded config:")
self.info(json.dumps(self.config, sort_keys=True, indent=4))
if self.verbose():
self.setConsoleLevel(logging.DEBUG)
else:
self.setConsoleLevel(logging.INFO)
gen_config = self.getGenerationalConfig()
if gen_config:
self.info("Generational backup config:")
self.info(json.dumps(gen_config, sort_keys=True, indent=4))

def maxSnapshotsInHassio(self) -> int:
return int(self.config['max_snapshots_in_hassio'])
@@ -116,22 +125,29 @@ def snapshotTimeOfDay(self) -> Optional[str]:
return None

def getGenerationalConfig(self) -> Optional[Dict[str, Any]]:
if 'generational_backup' in self.config:
base = self.config['generational_backup']
if 'days' not in base:
base['days'] = 0
if 'weeks' not in base:
base['weeks'] = 0
if 'months' not in base:
base['months'] = 0
if 'years' not in base:
base['years'] = 0
if 'day_of_week' not in base:
base['day_of_week'] = 'mon'
if 'day_of_month' not in base:
base['day_of_month'] = 1
if 'day_of_year' not in base:
base['day_of_year'] = 1
return base
else:
if 'generational_days' not in self.config and 'generational_weeks' not in self.config and 'generational_months' not in self.config and 'generational_years' not in self.config:
return None
base = {
'days': 0,
'weeks': 0,
'months': 0,
'years': 0,
'day_of_week': 'mon',
'day_of_month': 1,
'day_of_year': 1
}
if 'generational_days' in self.config:
base['days'] = self.config['generational_days']
if 'generational_weeks' in self.config:
base['weeks'] = self.config['generational_weeks']
if 'generational_months' in self.config:
base['months'] = self.config['generational_months']
if 'generational_years' in self.config:
base['years'] = self.config['generational_years']
if 'generational_day_of_week' in self.config:
base['day_of_week'] = self.config['generational_day_of_week']
if 'generational_day_of_month' in self.config:
base['day_of_month'] = self.config['generational_day_of_month']
if 'generational_day_of_year' in self.config:
base['day_of_year'] = self.config['generational_day_of_year']
return base
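Stripped of the `Config` class plumbing, the new flat-key handling above amounts to folding `generational_*` options into one dict with defaults (a simplified standalone sketch; the function and constant names are assumptions, not the add-on's API):

```python
GEN_DEFAULTS = {'days': 0, 'weeks': 0, 'months': 0, 'years': 0,
                'day_of_week': 'mon', 'day_of_month': 1, 'day_of_year': 1}

def generational_config(options):
    # Return None when no generational interval is configured at all.
    if not any('generational_' + k in options for k in ('days', 'weeks', 'months', 'years')):
        return None
    base = dict(GEN_DEFAULTS)
    for key in base:
        # Copy each "generational_<key>" option over its default.
        if 'generational_' + key in options:
            base[key] = options['generational_' + key]
    return base
```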
28 changes: 15 additions & 13 deletions hassio-google-drive-backup/backup/drive.py
@@ -6,6 +6,7 @@
from datetime import datetime
from googleapiclient.discovery import build # type: ignore
from googleapiclient.discovery import Resource
from googleapiclient.errors import ResumableUploadError
from apiclient.http import MediaIoBaseUpload  # type: ignore
from apiclient.errors import HttpError # type: ignore
from oauth2client.client import Credentials # type: ignore
@@ -21,6 +22,7 @@
from requests import Response
from .config import Config
from .responsestream import ResponseStream
from .logbase import LogBase

# Defines the retry strategy for calls made to Drive
# max # of times to retry a call to Drive
@@ -41,7 +43,7 @@
CREATE_FIELDS = "id,name,appProperties,size,trashed"


class Drive(object):
class Drive(LogBase):
"""
Stores the logic for making calls to Google Drive and managing the credentials necessary to do so.
"""
@@ -103,16 +105,16 @@ def saveSnapshot(self, snapshot: Snapshot, download_url: str, parent_id: str) ->
if last_percent != new_percent:
last_percent = new_percent
snapshot.uploading(last_percent)
print("Uploading {1} {0}%".format(last_percent, snapshot.name()))
self.info("Uploading {1} {0}%".format(last_percent, snapshot.name()))
snapshot.uploading(100)
snapshot.setDrive(DriveSnapshot(drive_response))

def deleteSnapshot(self, snapshot: Snapshot) -> None:
print("Deleting: {}".format(snapshot))
self.info("Deleting: {}".format(snapshot))
if not snapshot.driveitem:
raise Exception("Drive item was null")
self._retryDriveServiceCall(self._drive().files().delete(fileId=snapshot.driveitem.id()))
print("Deleted snapshot backup from drive '{}'".format(snapshot.name()))
self.info("Deleted snapshot backup from drive '{}'".format(snapshot.name()))
snapshot.driveitem = None

def _timeToRfc3339String(self, time: datetime) -> str:
@@ -147,13 +149,13 @@ def _retryDriveServiceCall(self, request: Any, func: Any=None) -> Any:
except HttpError as e:
if attempts >= DRIVE_MAX_RETRIES:
# fail, too many retries
print("Too many calls to Drive failed, so we'll give up for now")
self.error("Too many calls to Drive failed, so we'll give up for now")
raise e
# Only retry 403 and 5XX errors, see https://developers.google.com/drive/api/v3/manage-uploads
if e.resp.status != 403 and int(e.resp.status / 100) != 5:
print("Drive returned non-retryable error code: {0}".format(e.resp.status))
self.error("Drive returned non-retryable error code: {0}".format(e.resp.status))
raise e
print("Drive returned error code: {0}:, we'll retry in {1} seconds".format(e.resp.status, backoff))
self.error("Drive returned error code: {0}; we'll retry in {1} seconds".format(e.resp.status, backoff))
sleep(backoff)
backoff *= DRIVE_EXPONENTIAL_BACKOFF

@@ -168,30 +170,30 @@ def getFolderId(self) -> str:
folder = self._retryDriveServiceCall(self._drive().files().get(fileId=folder_id, fields='id,trashed,capabilities'))
caps = folder.get('capabilities')
if folder.get('trashed'):
print("The Drive Snapshot Folder is in the trash, so we'll make a new one")
self.info("The Drive Snapshot Folder is in the trash, so we'll make a new one")
return self._createDriveFolder()
elif not caps['canAddChildren']:
print("Can't add Snapshot to the Drive backup folder (maybe you lost ownership?), so we'll create a new one.")
self.info("Can't add Snapshot to the Drive backup folder (maybe you lost ownership?), so we'll create a new one.")
return self._createDriveFolder()
elif not caps['canListChildren']:
print("Can't list Snapshot of the Drive backup folder (maybe you lost ownership?), so we'll create a new one.")
self.info("Can't list Snapshot of the Drive backup folder (maybe you lost ownership?), so we'll create a new one.")
return self._createDriveFolder()
elif not caps['canRemoveChildren']:
print("Can't delete Snapshot of the Drive backup folder (maybe you lost ownership?), so we'll create a new one.")
self.info("Can't delete Snapshot of the Drive backup folder (maybe you lost ownership?), so we'll create a new one.")
return self._createDriveFolder()
return folder_id
except HttpError as e:
# 404 means the folder doesn't exist (maybe it got moved?)
if e.resp.status == 404:
print("The Drive Snapshot folder is gone, so we'll create a new one.")
self.info("The Drive Snapshot folder is gone, so we'll create a new one.")
return self._createDriveFolder()
else:
raise e
else:
return self._createDriveFolder()

def _createDriveFolder(self) -> str:
print('Creating folder "{}" in "My Drive"'.format(FOLDER_NAME))
self.info('Creating folder "{}" in "My Drive"'.format(FOLDER_NAME))
file_metadata: Dict[str, str] = {
'name': FOLDER_NAME,
'mimeType': FOLDER_MIME_TYPE