migration update

tonyrewin 2022-07-13 19:01:48 +03:00
parent 82fe60f764
commit 51c7f68ceb
2 changed files with 12 additions and 12 deletions

View File

@@ -62,7 +62,6 @@ def shouts_handle(storage):
     discours_author = 0
     pub_counter = 0
     for entry in storage['shouts']['data']:
-        oid = entry['_id']
         # slug
         slug = get_shout_slug(entry)
@@ -82,7 +81,7 @@ def shouts_handle(storage):
         # print('[migration] ' + shout['slug'] + ' with author ' + author)
         if entry.get('published'):
-            export_mdx(shout)
+            if 'mdx' in sys.argv: export_mdx(shout)
            pub_counter += 1
    # print main counter
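
The effective change in this file: `export_mdx(shout)` is now gated behind an `mdx` command-line flag instead of running for every published entry. A minimal standalone sketch of that gating pattern; the stub `export_mdx` and the sample data are illustrative, not the project's real exporter:

```py
import sys

def export_mdx(shout):
    # stub standing in for the project's real mdx exporter
    print('exporting', shout['slug'])

shouts = [{'slug': 'example-slug', 'published': True}]  # illustrative data
pub_counter = 0
for entry in shouts:
    if entry.get('published'):
        # opt-in export: only runs when 'mdx' was passed on the command line
        if 'mdx' in sys.argv:
            export_mdx(entry)
        pub_counter += 1
```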

View File

@@ -12,8 +12,7 @@ pipenv install -r requirements.txt
 Put the unpacked mongodump into the `data` folder and work from
 `pipenv shell && python`
-1. get old data jsons
+#### get old data jsons
 ```py
 import bson2json
@@ -21,22 +20,24 @@ import bson2json
 bson2json.json_tables() # creates all the needed data jsons from the bson mongodump
 ```
-2. migrate users
+#### migrate all
 ```sh
 pipenv install
-pipenv run python migrate.py users
+pipenv run python migrate.py all
 ```
+#### or migrate all with mdx exports
+```sh
+pipenv install
+pipenv run python migrate.py all mdx
+```
 Note: this will create db entries and it is not tolerant of emails that
 already exist (unique constraint).
-3. then topics and shouts
+#### or one shout by slug
 ```sh
-pipenv run python migrate.py topics
-pipenv run python migrate.py shouts
+pipenv run python migrate.py - <shout-slug>
 ```
 Now you have the `*.dict.json` files, which contain all the data with both
 old and new(!) ids.
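
For the `bson2json.json_tables()` step above, a minimal sketch of what a bson-to-json conversion can look like, assuming pymongo's `bson` package is installed; the file layout and collection names here are guesses for illustration, not the repo's actual `bson2json` code:

```py
from bson import decode_all
from bson.json_util import dumps

def json_table(name):
    # read one mongodump collection and re-serialize it as extended json
    with open('data/%s.bson' % name, 'rb') as f:
        docs = decode_all(f.read())
    with open('data/%s.json' % name, 'w') as out:
        out.write(dumps(docs))

for table in ['users', 'topics', 'shouts']:  # collection names are assumptions
    json_table(table)
```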
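
The updated commands imply an argv-based dispatch in `migrate.py` (`all`, an optional `mdx` flag, and `- <shout-slug>`). A hedged sketch of that dispatch; the handler functions are hypothetical, and only the `'mdx' in sys.argv` check is confirmed by the first file's diff:

```py
import sys

def migrate_all(export_mdx=False):
    # hypothetical handler: migrate users, topics and shouts in one pass
    print('migrating all, mdx exports:', export_mdx)

def migrate_shout(slug):
    # hypothetical handler: migrate a single shout by its slug
    print('migrating shout:', slug)

if __name__ == '__main__':
    if 'all' in sys.argv:
        migrate_all(export_mdx='mdx' in sys.argv)
    elif '-' in sys.argv:
        # `migrate.py - <shout-slug>`: the slug follows the dash
        migrate_shout(sys.argv[sys.argv.index('-') + 1])
```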