Last active January 27, 2026 18:43
Script to import output of bitwarden-backup.py into 1Password
#!/usr/bin/env python3
# Script for importing bitwarden-backup.py output into 1Password
#
# Previously, I wrote and shared bitwarden-backup.py, a script for exporting
# a comprehensive backup of your private and organization Bitwarden data,
# including attachments:
#
# https://gist.github.com/jikamens/15f4b25cec019cb81ddeeee8dacbcfb9
#
# More recently I found myself considering switching from Bitwarden to
# 1Password. But how do I migrate my Bitwarden data, including folders and
# attachments, into 1Password? 1Password's Bitwarden import functionality
# doesn't support attachments, since the stock Bitwarden export doesn't
# include attachments, and I'm not sure whether it does anything with folders.
# I wanted to do better, hence this script.
#
# The script maps Bitwarden folders to 1Password tags and maps Bitwarden
# collections to 1Password vaults. When a Bitwarden item is in multiple
# collections, a new vault is created by joining the names of those collections
# together, separated by slashes. You can easily plug in your own code to
# customize the collection-to-vault mapping; search for `collections_to_vault`
# below for details.
#
# The script stores the Bitwarden ID of imported items in a custom field in
# 1Password and uses that to avoid importing items that have already been
# imported. You can override this behavior with `--force`. Note that `--force`
# will create new items in 1Password, not overwrite existing ones.
#
# To use this script, unpack your Bitwarden backup ZIP file into a
# directory and then run this script in that directory. Run the script with
# `-h` or `--help` for additional usage information.
#
# The script uses the 1Password CLI to do all the work of interacting with
# 1Password, so you need that installed, in your search path, and signed into
# the 1Password account you want to import into. If you have multiple 1Password
# accounts you may need to set the `OP_ACCOUNT` environment variable as
# described in the CLI documentation.
#
# Depending on what kind of 1Password account you are importing into, you
# may need to change `PRIVATE_VAULT` below from 'Private' to 'Personal' or
# 'Employee'. Or you can specify a different `PRIVATE_VAULT` value in your
# `--customize` script.
#
# RUNNING THE SCRIPT MULTIPLE TIMES
#
# As noted above, in its default mode the script won't import an item again
# if it encounters one that was already imported. Note that this check is
# vault- and title-specific, i.e., if you import an item from Bitwarden and
# then move it into a different vault or change its title in 1Password without
# making the corresponding change in Bitwarden, a rerun of the import will
# think the item is new and import it again.
#
# Also as noted above, you can specify `--force` to force everything to be
# imported again, but the script can do much more than that.
#
# If the script gets part-way through an import and then crashes or is
# interrupted, and you want to restart it where it left off, you can do that
# with `--resume`. Make sure you're using the same Bitwarden backup you used
# the first time, or the order of items in the backup may be different and
# some items may not be imported properly.
#
# You can use `--interactive` to get the script to prompt you before adding new
# items or before updating existing items to match what's in Bitwarden. For the
# latter, the script diffs the Bitwarden and 1Password contents and prompts you
# about each difference. The reason it prompts rather than just migrating
# changes over automatically is that the data in 1Password may have been
# updated and be more correct than the data in Bitwarden, so overwriting a
# change is not necessarily the right thing to do.
#
# Note that `--interactive` checks file names but not file contents, so if you
# change the contents of a file attachment in Bitwarden while preserving its
# name, the script will not treat that as a new change that needs to be
# migrated.
#
# `--interactive` mode is slow because it has to fetch every already-existing
# item from 1Password to compare its contents to Bitwarden's. You can
# greatly speed this up by exporting your 1Password items into a ".1pux" file
# with the desktop app and then passing the path to that file into this script
# with `--1password-export`. If you do this, the script gets the item data
# from the 1pux file quickly instead of using the CLI to fetch it.
#
# If you are using `--interactive` and you don't want to keep answering
# the same questions over and over about differences you don't want to migrate
# into 1Password, you can specify `--remember`. The script will then store
# your answers in the file `remembered.txt` and read that file on its next
# run to get the answers to questions you've already answered.
#
# In summary, if you want to run this script multiple times to detect and
# migrate changes made in Bitwarden while you're in the process of migrating to
# 1Password, then before each run you'll want to export and unpack a new
# Bitwarden backup (but make sure not to lose `remembered.txt`), export from
# 1Password, and use the `--interactive`, `--1password-export`, and
# `--remember` flags.
#
# CAVEATS AND LIMITATIONS
#
# * Password history, revision date, creation date, master password
#   reprompting, favorite designations, and custom checkbox fields aren't
#   migrated.
# * Passkeys aren't migrated (I don't think this is possible).
# * Identities and SSH keys aren't migrated (if you want to send me a patch to
#   add them, I'm happy to take it, but I don't use them so didn't bother to
#   write code to handle them).
# * There are a bunch of exceptions scattered throughout the code wherever the
#   script encounters data it doesn't know how to handle because I've never
#   seen it myself and therefore don't know the right way to handle it. If you
#   encounter one of these, feel free to reach out and I'll try to help.
#
# The logic in the script for transforming and mapping data was determined
# empirically by looking at data exported from Bitwarden and 1Password and
# comparing them. To my knowledge there is no documentation of the Bitwarden
# and 1Password export formats to consult to identify all the fields that need
# to be transformed and what their possible values are.
#
# By Jonathan Kamens <jik@kamens.us>. This script is in the public domain.
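As a concrete illustration of the `--customize` hook described above, here is a sketch of what a customization file might contain. The file name `my_customize.py` and the alternate mapping policy are hypothetical; both definitions are optional, and either one, if present, overrides the script's default.

```python
# Hypothetical customization file, passed via `--customize my_customize.py`.

# Use 'Employee' instead of 'Private' as the vault for items that are
# not in any Bitwarden collection.
PRIVATE_VAULT = 'Employee'


def collections_to_vault(collections):
    """Map a list of Bitwarden collection names to one vault name."""
    if not collections:
        return PRIVATE_VAULT
    # Example policy: use the alphabetically first collection rather
    # than joining all of them with slashes.
    return sorted(collections)[0]
```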
import argparse
import bisect
import copy
from deepdiff import DeepDiff
from deepdiff.helper import CannotCompare
from hashlib import sha256
import importlib.util
import json as json_package  # So that we can have a kwarg `json`
import os
import pprint
import re
import secrets
import string
import subprocess
import sys
import tempfile
import zipfile

# If you specify a script with `--customize` and define `PRIVATE_VAULT` in it,
# it will override this.
PRIVATE_VAULT = 'Private'

remembered_prompts = None


class UnwantedException(Exception):
    pass
# Takes an item dictionary. Returns filled-in template, vault name,
# and dict containing file names and paths. Raises UnwantedException for
# items we don't want to import.
def parse_item(args, item, collections, folders):
    item = copy.deepcopy(item)
    id_ = item.pop('id')
    item.pop('passwordHistory', None)
    item.pop('revisionDate', None)
    item.pop('creationDate', None)
    if item.pop('deletedDate', False):
        raise Exception(f'Unexpectedly encountered deleted item {id_}')
    obj = item.pop('object', None)
    if obj != 'item':
        raise Exception(f'Unknown object value {obj} in item {id_}')
    folder = folders[item.pop('folderId', None)]
    # I think a recent update to the CLI changed the behavior of returning
    # "No Folder" to remove the field from the JSON instead, but I'm not
    # going to remove the "No Folder" check for the time being. - 2026-01-27
    if folder == 'No Folder':
        folder = None
    tp = item.pop('type')
    # 1 == Login
    # 2 == Note
    # 3 == Card
    # 4 == Identity
    if tp not in (1, 2, 3, 4):
        raise Exception(f'Unknown type {tp} in item {id_}')
    if tp == 4:
        raise UnwantedException("Don't want to import identities")
    item.pop('reprompt', None)
    item.pop('organizationId', None)
    name = item.pop('name')
    notes = item.pop('notes')
    item.pop('favorite', None)
    login = item.pop('login', None) if tp == 1 else None
    if login:
        urls = [u['uri'] for u in login.pop('uris', [])]
        username = login.pop('username', None)
        password = login.pop('password', None)
        totp = login.pop('totp', None)
        login.pop('passwordRevisionDate', None)
        fido2Credentials = login.pop('fido2Credentials', None)
        if fido2Credentials:
            raise Exception(f'Unexpected fido2Credentials in item {id_}')
        check_empty(login, f'login data for item {id_}')
    collections = [collections[c] for c in item.pop('collectionIds', [])]
    secureNote = item.pop('secureNote', None) if tp == 2 else None
    if secureNote and (len(secureNote) > 1 or
                       secureNote.get('type', None) != 0):
        raise Exception(f'Unexpected secureNote value in item {id_}')
    attachments = item.pop('attachments', [])
    card = item.pop('card', None) if tp == 3 else None
    if card:
        cardholderName = card.pop('cardholderName', None)
        brand = card.pop('brand', None)
        number = card.pop('number', None)
        expMonth = card.pop('expMonth', None)
        expYear = card.pop('expYear', None)
        code = card.pop('code', None)
        check_empty(card, f'card data for item {id_}')
    fields = item.pop('fields', [])
    if fields:
        try:
            next((f for f in fields if f.get('linkedId', False)))
        except StopIteration:
            pass
        else:
            raise Exception(f"Don't know what to do with non-empty linkedId "
                            f"in id {id_}")
        try:
            next((k for f in fields for k in f.keys()
                  if k not in ('name', 'value', 'type', 'linkedId')))
        except StopIteration:
            pass
        else:
            raise Exception(f'Unrecognized field key in id {id_}')
    # I have no idea what this field does, so no choice but to ignore it. I
    # think it was added recently. - 2026-01-27
    item.pop('key', None)
    check_empty(item, f'item {id_}')
    # OK, now we've parsed the entire item and we need to figure out what to
    # do with its data. Here are the data we need to account for, boiled down
    # from above: folder, tp, name, notes, urls, username, password, totp,
    # collections, attachments, cardholderName, brand, number, expMonth,
    # expYear, code, fields. Some of these may be unexpected for some item
    # types so we need to check for that as well.
    # This takes care of `name`.
    template = {'title': name,
                'fields': []}
    template['fields'].append({
        'id': 'BitwardenID',
        'type': 'STRING',
        'label': 'Bitwarden ID',
        'value': id_
    })
    # This takes care of `notes`.
    if notes:
        template['fields'].append({
            'id': 'notesPlain',
            'type': 'STRING',
            'purpose': 'NOTES',
            'label': 'notesPlain',
            'value': notes
        })
    # This takes care of `attachments`.
    paths = {}
    if attachments:
        dir = f'attachments/{folder}/{name} {id_}'
        if not os.path.exists(dir):
            dir = f'attachments/{folder}/{name}'
        for a in attachments:
            fileName = a['fileName']
            aid = a['id']
            path = f'{dir}/{fileName} {aid}'
            if not os.path.exists(path):
                path = f'{dir}/{fileName}'
            if not os.path.exists(path):
                raise Exception(
                    f'Missing attachment {fileName} for id {id_}')
            paths[fileName] = path
    # This takes care of `fields`.
    for field in fields:
        if field['type'] == 2:  # checkbox, not transferable
            continue
        elif field['type'] == 0:  # visible text
            # This might be a form-fill field that isn't transferable, but
            # it's possible that it contains important information so we need
            # to preserve it.
            template['fields'].append({
                'type': 'STRING',
                'label': field['name'],
                'value': field['value']
            })
        elif field['type'] == 1:  # concealed text
            template['fields'].append({
                'type': 'CONCEALED',
                'label': field['name'],
                'value': field['value']
            })
    # Figure out what vault this is going into. This takes care of
    # `collections`.
    # You can customize how vault names are determined by writing a
    # Python function called `collections_to_vault` that takes a list of
    # Bitwarden collection names and returns a single 1Password vault name,
    # saving this function into a file, and passing the path of that file to
    # this script with the `--customize` argument.
    def collections_to_vault(collections):
        if not collections:
            return getattr(args.customize, 'PRIVATE_VAULT', PRIVATE_VAULT)
        else:
            return '/'.join(sorted(collections))
    vault = getattr(args.customize, 'collections_to_vault',
                    collections_to_vault)(collections)
    # This takes care of `folder`.
    if folder:
        template['tags'] = [folder]
    # This takes care of `urls`, `username`, `password`, `totp`.
    if tp == 1:  # login
        if card is not None:
            raise Exception(f'Unexpected non-login data in id {id_}')
        if username:
            template['category'] = 'LOGIN'
            template['fields'].append({
                'id': 'username',
                'type': 'STRING',
                'purpose': 'USERNAME',
                'label': 'username',
                'value': username
            })
        else:
            template['category'] = 'PASSWORD'
        if password is not None:
            template['fields'].append({
                'id': 'password',
                'type': 'CONCEALED',
                'purpose': 'PASSWORD',
                'label': 'password',
                'value': password
            })
        if totp:
            template['fields'].append({
                'id': 'TOTP',
                'type': 'OTP',
                'label': 'one-time password',
                'value': totp
            })
        if urls:
            template['urls'] = [{'primary': True, 'href': urls.pop(0)}]
            for url in urls:
                template['urls'].append({'label': 'website', 'href': url})
    elif tp == 2:  # note
        if login is not None or card is not None:
            raise Exception(f'Unexpected non-note data in id {id_}')
        template['category'] = 'SECURE_NOTE'
    # This takes care of `cardholderName`, `brand`, `number`, `code`,
    # `expMonth`, and `expYear`.
    elif tp == 3:  # card
        if login is not None:
            raise Exception(f'Unexpected non-card data in id {id_}')
        template['category'] = 'CREDIT_CARD'
        if cardholderName:
            template['fields'].append({
                'id': 'cardholder',
                'type': 'STRING',
                'label': 'cardholder name',
                'value': cardholderName
            })
        if brand:
            template['fields'].append({
                'id': 'type',
                'type': 'CREDIT_CARD_TYPE',
                'label': 'type',
                'value': brand
            })
        if number:
            template['fields'].append({
                'id': 'ccnum',
                'type': 'CREDIT_CARD_NUMBER',
                'label': 'number',
                'value': number
            })
        if code:
            template['fields'].append({
                'id': 'cvv',
                'type': 'CONCEALED',
                'label': 'verification number',
                'value': code
            })
        if (not expMonth) != (not expYear):
            raise Exception(f'Missing expiration month or year in item {id_}')
        elif expMonth:
            template['fields'].append({
                'id': 'expiry',
                'type': 'MONTH_YEAR',
                'label': 'expiry date',
                'value': f'{expYear}{int(expMonth):02d}'
            })
    return (template, vault, paths)
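As a small illustration of the field mapping performed above (a sketch of one step, not the function itself): Bitwarden stores a card's expiration month and year as separate strings, while 1Password's MONTH_YEAR field wants a single YYYYMM value.

```python
# Sketch of the expiry-date mapping in parse_item(): '3'/'2027' from
# Bitwarden becomes the single value '202703' for 1Password.
expMonth, expYear = '3', '2027'
expiry_field = {
    'id': 'expiry',
    'type': 'MONTH_YEAR',
    'label': 'expiry date',
    'value': f'{expYear}{int(expMonth):02d}',
}
print(expiry_field['value'])  # 202703
```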
def check_empty(dct, desc):
    try:
        extra_key = next((k for k in dct.keys()))
    except StopIteration:
        pass
    else:
        raise Exception(f'Unexpected key "{extra_key}" in {desc}')
def add_existing_item(item, items):
    vault = item['vault']['name']
    title = item['title']
    if vault not in items:
        items[vault] = {}
    if title not in items[vault]:
        items[vault][title] = [item]
    elif not any((item['id'] == already['id']
                  for already in items[vault][title])):
        items[vault][title].append(item)
def get_existing_items(args, exported_items):
    indexed_items = {}
    if exported_items:
        for item in exported_items.values():
            add_existing_item(item, indexed_items)
    else:
        items = op(args, ('op', 'item', 'list', '--format', 'json'),
                   'listing items', safe=True)
        for item in items:
            add_existing_item(item, indexed_items)
    return indexed_items
def check_existing_item(args, vault, title, bitwarden_id, existing_vaults,
                        existing_items, bisecting=False):
    '''In interactive mode (bisecting==False, args.interactive==True),
    returns the item fetched from 1Password if it already exists, so the
    caller can figure out whether/how to merge our differences into
    1Password.'''
    force = False if bisecting else args.force
    interactive = False if bisecting else args.interactive
    if force:
        return False
    if vault not in existing_items:
        return False
    if title not in existing_items[vault]:
        return False
    for item in existing_items[vault][title]:
        id_ = item['id']
        data = None
        if args.export:
            data = item
        elif interactive:
            data = op(
                args, ('op', 'item', 'get', id_, '--format', 'json'),
                f'fetching item {id_}', check=False, safe=True)
        if data:
            item_id = next((f for f in data.get('fields', [])
                            if f['label'] == 'Bitwarden ID'))['value']
            if bitwarden_id == item_id:
                return data if interactive else True
        else:
            vault_id = existing_vaults[vault]
            item_id = op(args,
                         ('op', 'read', f'op://{vault_id}/{id_}/Bitwarden ID'),
                         f'getting bitwarden ID for {id_}',
                         json=False, check=False, safe=True)
            if bitwarden_id == item_id:
                return True
    return False
def get_existing_vaults(args):
    items = op(args, ('op', 'vault', 'list', '--format', 'json'),
               'listing vaults', safe=True)
    indexed_items = {}
    for item in items:
        indexed_items[item['name']] = item['id']
    return indexed_items
def ensure_vault(args, name, existing_vaults):
    if name in existing_vaults:
        return existing_vaults[name]
    print(f'Creating vault {name}')
    result = op(args, ('op', 'vault', 'create', '--format', 'json', name),
                f'creating vault {name}')
    # We still call op() above because it generates useful --debug output.
    if args.dryrun:
        result = {'id': f'fake-id-for-vault-{name}'}
    existing_vaults[name] = result['id']
    return result['id']
def op(args, cmd, description, input=None, json=True, check=True, safe=False):
    if args.debug:
        print(f'Running: {cmd}')
        if input:
            print(f'with input: {input}')
    if args.dryrun and not safe:
        return None
    result = subprocess.run(cmd,
                            stdin=None if input else subprocess.DEVNULL,
                            input=input,
                            capture_output=True,
                            encoding='utf8')
    if result.returncode == 0:
        if json:
            try:
                data = json_package.loads(result.stdout)
            except Exception:
                print(f'For {description}, expected well-formatted JSON\n'
                      f'in output of {cmd}, got this instead:\n'
                      f'{result.stdout}', file=sys.stderr)
                save_input(input)
                sys.exit(1)
            return data
        else:
            # We never read binary data so .strip() is reasonable.
            return result.stdout.strip()
    if not check:
        return None
    print(f'For {description}, {cmd} failed.', file=sys.stderr)
    if result.stderr:
        print(f'Command stderr:\n{result.stderr}', file=sys.stderr)
    if result.stdout:
        print(f'Command stdout:\n{result.stdout}', file=sys.stderr)
    save_input(input)
    sys.exit(1)
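The op() wrapper above follows a common pattern: run a CLI with captured output and parse its stdout as JSON. A self-contained sketch of that pattern, substituting the Python interpreter itself for the 1Password CLI (which may not be installed where you try this):

```python
import json
import subprocess
import sys

# Stand-in for an `op ... --format json` invocation: any command that
# emits JSON on stdout is handled the same way by op() above.
cmd = (sys.executable, '-c',
       'import json; print(json.dumps({"id": "abc123"}))')
result = subprocess.run(cmd, capture_output=True, encoding='utf8')
assert result.returncode == 0
data = json.loads(result.stdout)
print(data['id'])  # abc123
```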
def save_input(input):
    if not input:
        return
    with tempfile.NamedTemporaryFile(delete=False) as f:
        f.write(input.encode())
    print(f'Command input saved in {f.name}.\n'
          f'Make sure to delete it if it contains sensitive content!',
          file=sys.stderr)
# https://medium.com/@david.bonn.2010/
# dynamic-loading-of-python-code-2617c04e5f3f with changes from me.
def gensym(length=32, prefix="gensym_"):
    """
    generates a fairly unique symbol, used to make a module name,
    used as a helper function for load_module
    :return: generated symbol
    """
    alphabet = string.ascii_uppercase + string.ascii_lowercase + string.digits
    symbol = "".join([secrets.choice(alphabet) for i in range(length)])
    return prefix + symbol


def load_module(source, module_name=None):
    """
    reads file source and loads it as a module
    :param source: file to load
    :param module_name: name of module to register in sys.modules
    :return: loaded module
    """
    if module_name is None:
        module_name = gensym()
    if not os.path.exists(source):
        sys.exit(f'Customization script {source} does not exist')
    spec = importlib.util.spec_from_file_location(module_name, source)
    module = importlib.util.module_from_spec(spec)
    sys.modules[module_name] = module
    spec.loader.exec_module(module)
    return module
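load_module() can be exercised in isolation. This sketch (module name `demo_customize` is made up) writes a throwaway customization file to a temp path and loads it with the same importlib call sequence used above: spec_from_file_location, module_from_spec, then exec_module.

```python
import importlib.util
import os
import sys
import tempfile

# Write a tiny module to disk, then load it the way load_module() does.
with tempfile.NamedTemporaryFile('w', suffix='.py', delete=False) as f:
    f.write("PRIVATE_VAULT = 'Personal'\n")
    path = f.name
try:
    spec = importlib.util.spec_from_file_location('demo_customize', path)
    module = importlib.util.module_from_spec(spec)
    sys.modules['demo_customize'] = module
    spec.loader.exec_module(module)
    print(module.PRIVATE_VAULT)  # Personal
finally:
    os.unlink(path)
```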
def parse_args():
    parser = argparse.ArgumentParser(
        description='Import bitwarden-backup.py backup into 1Password')
    parser.add_argument('--dryrun', action='store_true', help='Work through '
                        'the data without actually changing anything in '
                        '1Password')
    parser.add_argument('--debug', action='store_true', help='Print extra '
                        'debugging information')
    group = parser.add_mutually_exclusive_group()
    group.add_argument('--force', action='store_true', help="Don't check "
                       "if items already exist in 1Password")
    group.add_argument('--interactive', action='store_true', help='Ask before '
                       'importing new items and what to do about differences '
                       'between existing items in Bitwarden and 1Password')
    parser.add_argument('--remember', action='store_true', help='Remember '
                        'answers to prompts in "remembered.txt" so if you '
                        'need to kill and restart the script you won\'t need '
                        'to answer the same questions again')
    parser.add_argument('--resume', action='store_true', help='Assume that we '
                        'are resuming an import and items.json has not been '
                        'modified and quickly find the correct resumption '
                        'point via a binary search')
    parser.add_argument('--1password-export', dest='export', action='store',
                        help='Current 1Password 1pux export file to speed up '
                        'checks for existing or modified items')
    parser.add_argument('--customize', metavar='FILE', type=load_module,
                        action='store', help='Path to file containing '
                        'customization functions (see docs in script)')
    return parser.parse_args()
def main():
    args = parse_args()
    collections = {c['id']: c['name']
                   for c in json_package.load(open('collections.json'))}
    folders = {f['id']: f['name']
               for f in json_package.load(open('folders.json'))}
    items = json_package.load(open('items.json'))
    existing_vaults = get_existing_vaults(args)
    if args.export:
        exported_items = {
            item['id']: item for item in parse_1pux(
                args, args.export)}
    else:
        exported_items = {}
    existing_items = get_existing_items(args, exported_items)
    if args.resume:
        def resume_key(item):
            template, vault, attachments = \
                parse_item(args, item, collections, folders)
            exists = check_existing_item(
                args, vault, template['title'], item['id'],
                existing_vaults, existing_items, bisecting=True)
            return not exists
        start = bisect.bisect_left(items, True, key=resume_key)
    else:
        start = 0
    for item in items[start:]:
        try:
            template, vault, attachments = \
                parse_item(args, item, collections, folders)
        except UnwantedException:
            continue
        title = template['title']
        ensure_vault(args, vault, existing_vaults)
        action = 'create'
        if existing := check_existing_item(
                args, vault, title, item['id'], existing_vaults,
                existing_items):
            if args.interactive:
                # The more fields there are in the 1Password version of the
                # data that aren't in the Bitwarden template we create, the
                # more likely it is that the DeepDiff algorithm will get
                # confused and just decide the entire documents are different.
                # To avoid this, we stash fields we don't care about before
                # doing the diff and restore them afterward.
                saved = {}
                for field in ('additional_information', 'created_at',
                              'id', 'last_edited_by', 'updated_at', 'vault',
                              'version', 'favorite'):
                    if field in existing:
                        saved[field] = existing.pop(field)

                def compare_func(x, y, level=None):
                    for field in ('id', 'name', 'label'):
                        try:
                            return x[field] == y[field]
                        except KeyError:
                            pass
                        except TypeError:
                            raise CannotCompare() from None
                    raise CannotCompare() from None
                # Temporarily add attachments to template so we can compare
                # them.
                if attachments:
                    template['files'] = []
                    for file_name in attachments.keys():
                        try:
                            template['files'].append(
                                next(f for f in existing.get('files', [])
                                     if f['name'] == file_name))
                        except StopIteration:
                            template['files'].append({'name': file_name})
                ddiff = DeepDiff(template, existing, ignore_order=True,
                                 iterable_compare_func=compare_func,
                                 cutoff_distance_for_pairs=1,
                                 cutoff_intersection_for_pairs=1,
                                 view='tree')
                existing.update(saved)
                ddiff.pop('dictionary_item_added', None)
                ddiff.pop('iterable_item_added', None)
                if not ddiff:
                    existing = True
            if existing is True:
                print(f'Skipping existing item {title} in {vault}')
                continue
            elif existing:
                action = 'edit'
                made_changes = False
                potential_changes = []
                changed = ddiff.pop('values_changed', [])
                for change in changed:
                    path = change.path(use_t2=True)
                    msg = f'{title}: {path} in 1Password is "{change.t2}", ' \
                        f'but "{change.t1}" in Bitwarden.'

                    # Bind path and change as default arguments so each fixer
                    # captures this iteration's values, not the loop's final
                    # ones.
                    def fixer(path=path, change=change):
                        exec(f'{path} = value',
                             globals={'root': existing, 'value': change.t1})
                    potential_changes.append((msg, fixer))
                removed = ddiff.pop('iterable_item_removed', [])
                new_attachments = {}
                for change in removed:
                    path = change.up.path(use_t2=True)
                    if path == "root['files']":
                        file_name = change.t1['name']
                        msg = f'{title}: Bitwarden attachment {file_name} ' \
                            f'missing from 1Password.'

                        def fixer(file_name=file_name):
                            new_attachments[file_name] = \
                                attachments[file_name]
                    else:
                        msg = f'{title}: {path} in 1Password does not ' \
                            f'have: {pprint.pformat(change.t1)}.'

                        def fixer(path=path, change=change):
                            exec(f'{path}.append(value)',
                                 globals={'root': existing,
                                          'value': change.t1})
                    potential_changes.append((msg, fixer))
                if ddiff:
                    raise Exception(f'Unexpected changes: {ddiff}')
                skip = False
                for msg, fixer in potential_changes:
                    result = ask_about_change(args, template, existing, msg,
                                              fixer)
                    if result == 'skip':
                        skip = True
                        break
                    elif result == 'migrate':
                        made_changes = True
                    elif result == 'keep':
                        pass
                if skip or not made_changes:
                    continue
                attachments = new_attachments
                template = existing
        if action == 'create':
            if args.interactive:
                while True:
                    response = remember_input(
                        args, f'Add {title} to {vault} (y/n)? ')
                    if response.lower().startswith('y'):
                        response = True
                        break
                    elif response.lower().startswith('n'):
                        response = False
                        break
                    else:
                        print('Bad response.')
                if not response:
                    continue
            else:
                print(f'{"Adding" if action == "create" else "Updating"} '
                      f'{title} to vault {vault}')
        cmd = ['op', 'item', action]
        if action == 'edit':
            cmd.append(template['id'])
        cmd.extend(('--format', 'json', '--vault', vault))
        cmd.append('-')
        for file_name, path in attachments.items():
            file_name = re.sub(r'([.=\\])', r'\\\1', file_name)
            cmd.append(f'{file_name}[file]={path}')
        result = op(args, cmd, f'importing item {item["id"]}',
                    input=json_package.dumps(template))
        # We still call op() above because it generates useful output when
        # --debug is specified.
        if args.dryrun:
            result = {
                'vault': {'name': vault},
                'title': title,
                'id': f'fake-id-for-item-{item["id"]}'
            }
        add_existing_item(result, existing_items)
def ask_about_change(args, template, existing, msg, fixer):
    while True:
        print(msg)
        print('What do you want to do?')
        print(' Display (B)itwarden data.')
        print(' Display (1)Password data.')
        print(' (S)kip this item.')
        print(' (M)igrate this change from Bitwarden to 1Password.')
        print(' (K)eep 1Password as-is.')
        response = remember_input(args, 'Enter b, 1, s, m, or k: ', key=msg,
                                  dont_remember_re=r'^(?i:[1b])')
        if response.lower().startswith('b'):
            pprint.pprint(template)
        elif response.startswith('1'):
            pprint.pprint(existing)
        elif response.lower().startswith('s'):
            return 'skip'
        elif response.lower().startswith('m'):
            fixer()
            return 'migrate'
        elif response.lower().startswith('k'):
            return 'keep'
        else:
            print('Bad response.')
def remember_input(args, prompt, key=None, dont_remember_re=None):
    global remembered_prompts
    if not args.remember:
        return input(prompt)
    if remembered_prompts is None:
        if os.path.exists('remembered.txt'):
            remembered_prompts = {
                hash: value for line in open('remembered.txt')
                for (hash, value) in (line.strip().split(':', 1),)}
        else:
            remembered_prompts = {}
    hash = sha256((prompt if key is None else key).encode()).hexdigest()
    try:
        print(f'{prompt}REMEMBERED: {remembered_prompts[hash]}')
        return remembered_prompts[hash]
    except KeyError:
        pass
    response = input(prompt)
    if not (dont_remember_re and re.search(dont_remember_re, response)):
        remembered_prompts[hash] = response
        with open('remembered.txt', 'a') as f:
            f.write(f'{hash}:{response}\n')
    return response
| def parse_1pux(args, file_name): | |
| '''Parses a 1Password export file and returns a list of items in the | |
| format that `op item get` returns / `op item edit` expects.''' | |
| with zipfile.ZipFile(file_name) as z, z.open('export.data') as f: | |
| data = json_package.load(f) | |
| accounts = data['accounts'] | |
| if len(accounts) > 1: | |
| raise Exception("Can't handle 1pux with multiple accounts") | |
| return filter(None, | |
| (transform_1pux_item(item) | |
| for item in flatten_1pux(accounts[0]['vaults']))) | |
| def transform_1pux_item(from_item): | |
| to_item = { | |
| 'id': from_item.pop('uuid'), | |
| 'fields': [], | |
| 'files': [] | |
| } | |
| if from_item.pop('favIndex', 0) != 0: | |
| to_item['favorite'] = True | |
| # Some fields don't get transformed; we just remove those. | |
| from_item.pop('createdAt', None) | |
| from_item.pop('updatedAt', None) | |
| from_item.pop('state', None) | |
| # 001 - Login | |
| # 002 - Card | |
| # 003 - Note | |
| # 004 - Identity | |
| # 005 - Also login? | |
| # 006 - Document | |
| categoryUuid = from_item.pop('categoryUuid') | |
| if categoryUuid == '001': | |
| to_item['category'] = 'LOGIN' | |
| elif categoryUuid == '002': | |
| to_item['category'] = 'CREDIT_CARD' | |
| elif categoryUuid == '003': | |
| to_item['category'] = 'SECURE_NOTE' | |
| elif categoryUuid == '004': | |
| # I'm not dealing with identities in this script. | |
| return None | |
| elif categoryUuid == '005': | |
| to_item['category'] = 'PASSWORD' | |
| elif categoryUuid == '006': | |
| to_item['category'] = 'DOCUMENT' | |
| details = from_item.pop('details', {}) | |
| loginFields = details.pop('loginFields', []) | |
| for field in loginFields: | |
| transform_1pux_field(field, to_item) | |
| if notesPlain := details.pop('notesPlain', None): | |
| to_item['fields'].append({ | |
| "id": "notesPlain", | |
| "type": "STRING", | |
| "purpose": "NOTES", | |
| "label": "notesPlain", | |
| "value": notesPlain, | |
| }) | |
| sections = details.pop('sections', []) | |
| for section in sections: | |
| section_title = section.pop('title', None) | |
| # No, I don't know why the section id is called "name" in the export | |
| section_id = section.pop('name', None) | |
| section_spec = { | |
| 'title': section_title, | |
| 'id': section_id | |
| } if section_title and section_id else None | |
| fields = section.pop('fields', []) | |
| for field in fields: | |
| transform_1pux_field(field, to_item, section=section_spec) | |
| if password := details.pop('password', None): | |
| to_item['fields'].append({ | |
| 'id': 'password', | |
| 'type': 'CONCEALED', | |
| 'purpose': 'PASSWORD', | |
| 'label': 'password', | |
| 'value': password | |
| }) | |
| if attributes := details.pop('documentAttributes', None): | |
| to_item['files'].append({ | |
| 'id': attributes.pop('documentId'), | |
| 'name': attributes.pop('fileName'), | |
| 'size': attributes.pop('decryptedSize') | |
| }) | |
| if attributes: | |
| raise Exception(f'Extra data in documentAttributes: {attributes}') | |
| for unwanted in ('passwordHistory',): | |
| details.pop(unwanted, None) | |
| if details: | |
| raise Exception(f'Extra data in details: {details}') | |
| overview = from_item.pop('overview', {}) | |
| if subtitle := overview.pop('subtitle', None): | |
| to_item['additional_information'] = subtitle | |
| url = overview.pop('url', None) | |
| if urls := overview.pop('urls', None): | |
| to_item['urls'] = [] | |
| for url_spec in urls: | |
| new_url = {} | |
| if label := url_spec.pop('label', None): | |
| new_url['label'] = label | |
| if url_string := url_spec.pop('url', None): | |
| new_url['href'] = url_string | |
| if url_string == url: | |
| new_url['primary'] = True | |
| if mode := url_spec.pop('mode', None): | |
| # Note: I don't know about other URL modes, but "host", at | |
| # least, is not included when you fetch an item with the CLI | |
| # with `op item get`, so all we can do here is ignore it. | |
| # D'oh. | |
| if mode not in ('default', 'host'): | |
| raise Exception( | |
| f"Don't know how to handle URL mode {mode}") | |
| if url_spec: | |
| raise Exception(f'Extra data in URL details: {url_spec}') | |
| to_item['urls'].append(new_url) | |
| if tags := overview.pop('tags', None): | |
| to_item['tags'] = tags | |
| if title := overview.pop('title'): | |
| to_item['title'] = title | |
| for unwanted in ('icons', 'ps', 'pbe', 'pgrng', 'watchtowerExclusions', | |
| 'b5UserUuid'): | |
| overview.pop(unwanted, None) | |
| if overview: | |
| raise Exception(f'Extra data in overview: {overview}') | |
| to_item['vault'] = from_item.pop('vault') | |
| if from_item: | |
| raise Exception(f'Extra data in 1pux item: {from_item}') | |
| return to_item | |
| def transform_1pux_field(from_field, to_item, section=None): | |
| to_field = {} | |
| to_file = None | |
| if section: | |
| to_field['section'] = section | |
| if fieldType := from_field.pop('fieldType', None): | |
| if fieldType in ('T', 'E'): | |
| to_field['type'] = 'STRING' | |
| elif fieldType == 'P': | |
| to_field['type'] = 'CONCEALED' | |
| else: | |
| raise Exception(f'Unknown fieldType in 1pux: "{fieldType}"') | |
| if value := from_field.pop('value', None): | |
| if isinstance(value, dict): | |
| if file := value.pop('file', None): | |
| to_file = { | |
| 'id': file.pop('documentId'), | |
| 'name': file.pop('fileName'), | |
| 'size': file.pop('decryptedSize') | |
| } | |
| if file: | |
| raise Exception(f'Unexpected data in file: {file}') | |
| elif totp := value.pop('totp', None): | |
| to_field['value'] = totp | |
| to_field['type'] = 'OTP' | |
| elif url := value.pop('url', None): | |
| to_field['value'] = url | |
| to_field['type'] = 'URL' | |
| else: | |
| types = { | |
| 'string': 'STRING', | |
| 'creditCardType': 'CREDIT_CARD_TYPE', | |
| 'creditCardNumber': 'CREDIT_CARD_NUMBER', | |
| 'concealed': 'CONCEALED', | |
| 'monthYear': 'MONTH_YEAR', | |
| 'email': 'EMAIL', | |
| 'menu': 'MENU', | |
| 'phone': 'PHONE', | |
| } | |
| for type_ in types.keys(): | |
| if type_ in value: | |
| # str() because monthYear is a number in the 1pux | |
| # file but a string everywhere else. Grr. | |
| to_field['value'] = str(value.pop(type_)) | |
| if 'type' not in to_field: | |
| to_field['type'] = types[type_] | |
| if value: | |
| raise Exception(f'Unexpected data in value dict: {value}') | |
| else: | |
| to_field['value'] = value | |
| if name := from_field.pop('name', None): | |
| to_field['label'] = name | |
| if title := from_field.pop('title', None): | |
| if name: | |
| raise Exception('Field has both name and title') | |
| to_field['label'] = title | |
| if designation := from_field.pop('designation', None): | |
| if designation == 'username': | |
| to_field['purpose'] = 'USERNAME' | |
| elif designation == 'password': | |
| to_field['purpose'] = 'PASSWORD' | |
| else: | |
| raise Exception( | |
| f'Unknown field designation in 1pux: "{designation}"') | |
| if id_ := from_field.pop('id', None): | |
| to_field['id'] = id_ | |
| elif designation: | |
| to_field['id'] = designation | |
| for unwanted in ('guarded', 'multiline', 'dontGenerate', 'inputTraits', | |
| 'capitalization', 'clipboardFilter'): | |
| from_field.pop(unwanted, None) | |
| if from_field: | |
| raise Exception(f'Extra data in 1pux field: {from_field}') | |
| if to_file: | |
| to_item['files'].append(to_file) | |
| elif 'value' in to_field: | |
| to_item['fields'].append(to_field) | |
| def flatten_1pux(vaults): | |
| for vault in vaults: | |
| vault_spec = { | |
| 'id': vault['attrs']['uuid'], | |
| 'name': vault['attrs']['name'] | |
| } | |
| for item in vault['items']: | |
| item['vault'] = vault_spec | |
| yield item | |
| if __name__ == '__main__': | |
| main() |