Commit b657241

chore: support importing/exporting uploads folder
- ignore folder used for staging upload imports and exports
- add script for exporting/backing up uploads
- update naming convention for import/export npm scripts
- add script to import uploads
- group sql dumps and uploads zip files in sync folder
- support simultaneous named exports/backups
1 parent b453832 commit b657241

8 files changed, +84 -26 lines changed

.gitignore

Lines changed: 2 additions & 1 deletion
```diff
@@ -49,5 +49,6 @@ plugins/rollbar/
 
 src/plugins/**/build/
 
-# DB imports/exports
+# import and export files for syncing between environments
+sync
 sql
```
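
For orientation, everything the new scripts read and write is staged under the newly ignored `sync` folder. A rough sketch of the layout, assembled from the paths used in the scripts below (most of these directories are created on demand by the scripts; this is only to show where files land):

```sh
# staging layout used by the import/export scripts
mkdir -p sync/sql/exports sync/sql/backups sync/sql/previous-imports
mkdir -p sync/uploads/exports sync/uploads/backups sync/uploads/previous-imports
```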

README.md

Lines changed: 9 additions & 9 deletions
````diff
@@ -108,33 +108,33 @@ This theme uses the following files for linting:
 The preferred mechanism for syncing your environment with others is to use database imports and exports. This repo has a few scripts to make this process as easy as possible. While your containers are running, you can run any of these commands to import, export, or backup a database. Here are the quick commands, with more instructions below.
 
 ```sh
-# import a DB from the `sql` folder
-npm run import-db
+# import a DB from the `sync/sql` folder
+npm run import:db
 
 # export your DB
-npm run export-db
+npm run export:db
 
 # export your DB with a custom name
-npm run export-db validation-data
+npm run export:db validation-data
 
 # backup your DB in case you need to restore it later
-npm run backup-db
+npm run backup:db
 
 # backup your DB with a custom name
-npm run backup-db work-in-progress
+npm run backup:db work-in-progress
 ```
 
 #### Importing Databases
 
-You can import databases from production, a saved backup, or another developer's DB export with the `import-db` script. To use it, put a `*.sql.gz` file in a top-level `sql` folder in the repo and run `npm run import-db`. This will first back up your existing database in case you need to revert back to it, and then it will import the database from the given file, effectively replacing your database with a new one.
+You can import databases from production, a saved backup, or another developer's DB export with the `import:db` script. To use it, put a `*.sql.gz` file in a top-level `sync/sql` folder in the repo and run `npm run import:db`. This will first back up your existing database in case you need to revert back to it, and then it will import the database from the given file, effectively replacing your database with a new one.
 
 #### Exporting Databases
 
-You can export your database for another developer to import or to import to a staging environment by running `npm run export-db`. By default, this will create a timestamped and gzipped file in `sql/exports`, but you can specify a name by running `npm run export-db <your-db-name-here>`. The exported file will still be timestamped, but it will use the name you give it instead of the default prefix.
+You can export your database for another developer to import or to import to a staging environment by running `npm run export:db`. By default, this will create a timestamped and gzipped file in `sync/sql/exports`, but you can specify a name by running `npm run export:db <your-db-name-here>`. The exported file will still be timestamped, but it will use the name you give it instead of the default prefix.
 
 #### Backing Up Databases
 
-This will happen automatically when you import a database, but if you want to manually backup your database, you can run `npm run backup-db`. This functions nearly identically to the `export-db` script, except for using a different prefix and putting the file in `sql/backups`. As with `export-db`, you can specify a name for your DB backup if you want.
+This will happen automatically when you import a database, but if you want to manually backup your database, you can run `npm run backup:db`. This functions nearly identically to the `export:db` script, except for using a different prefix and putting the file in `sync/sql/backups`. As with `export:db`, you can specify a name for your DB backup if you want.
 
 ### Atom
````
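
The README excerpt above only documents the database commands; the same commit adds matching uploads scripts in `package.json`. Assuming they follow the same calling convention as the DB scripts, the uploads equivalents would look like:

```sh
# import the most recent uploads archive from the `sync/uploads` folder
npm run import:uploads

# export or back up the uploads folder, optionally with a custom name
npm run export:uploads
npm run backup:uploads work-in-progress
```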

package.json

Lines changed: 11 additions & 4 deletions
```diff
@@ -31,10 +31,17 @@
     "lint:twig": "./scripts/run.sh vendor/bin/twigcs src/php/views",
     "lint": "run-s lint:*",
     "php:run": "./scripts/run.sh",
-    "export-db": "./scripts/export-db.sh",
-    "backup-db": "BACKUP=true ./scripts/export-db.sh",
-    "import-db": "./scripts/import-db.sh",
-    "preimport-db": "npm run backup-db",
+    "export": "run-s \"export:* {1}\" --",
+    "backup": "run-s \"backup:* {1}\" --",
+    "import": "run-s import:*",
+    "export:db": "./scripts/export-db.sh",
+    "backup:db": "BACKUP=true ./scripts/export-db.sh",
+    "import:db": "./scripts/import-db.sh",
+    "export:uploads": "./scripts/export-uploads.sh",
+    "backup:uploads": "BACKUP=true ./scripts/export-uploads.sh",
+    "import:uploads": "./scripts/import-uploads.sh",
+    "preimport:db": "npm run backup:db",
+    "preimport:uploads": "npm run backup:uploads",
     "generate:custom-block": "node ./generators/custom-block.js",
     "generate:custom-blocks-plugin": "node ./generators/custom-blocks-plugin.js",
     "generate:page-template": "node ./generators/page-template.js",
```
scripts/.gitkeep

Whitespace-only changes.

scripts/export-db.sh

Lines changed: 2 additions & 2 deletions
```diff
@@ -3,12 +3,12 @@
 export $(grep -v '^#' .env | xargs)
 
 timestamp=$(date -u +%Y-%m-%dT%H-%M-%S_%Z)
-path='sql/exports'
+path='sync/sql/exports'
 prefix='db-export'
 
 if [ $BACKUP ]
 then
-  path='sql/backups'
+  path='sync/sql/backups'
   prefix='db-backup'
 fi
 
```
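
For reference, the timestamp both export scripts embed in their file names comes from the `date` call above; it renders like this (exact value and zone will differ):

```sh
date -u +%Y-%m-%dT%H-%M-%S_%Z   # e.g. 2024-05-01T13-45-09_UTC
```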

scripts/export-uploads.sh

Lines changed: 29 additions & 0 deletions
```diff
@@ -0,0 +1,29 @@
+#!/bin/bash
+
+export $(grep -v '^#' .env | xargs)
+
+timestamp=$(date -u +%Y-%m-%dT%H-%M-%S_%Z)
+path='sync/uploads/exports'
+prefix='uploads-export'
+
+if [ $BACKUP ]
+then
+  path='sync/uploads/backups'
+  prefix='uploads-backup'
+fi
+
+dirname=$prefix-$timestamp
+filename=../$prefix-$timestamp.tar.gz
+
+if [ $1 ]
+then
+  dirname=$1-$timestamp
+  filename=../$1-$timestamp.tar.gz
+fi
+
+mkdir -p $path/$dirname
+cp -rv uploads/ $path/$dirname
+cd $path/$dirname
+tar -czf $filename .
+cd ..
+rm -rf $dirname
```
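
Because the script tars from inside a staging copy and writes the archive one level up, each run leaves a single `.tar.gz` behind and removes the staging directory. Roughly, using the npm wrappers from `package.json` (timestamps abbreviated):

```sh
npm run export:uploads              # -> sync/uploads/exports/uploads-export-<timestamp>.tar.gz
npm run backup:uploads              # -> sync/uploads/backups/uploads-backup-<timestamp>.tar.gz
npm run export:uploads client-demo  # -> sync/uploads/exports/client-demo-<timestamp>.tar.gz
```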

scripts/import-db.sh

Lines changed: 11 additions & 10 deletions
```diff
@@ -2,26 +2,27 @@
 
 export $(grep -v '^#' .env | xargs)
 
-# copy the most recent .sql.gz file in the sql folder for import
-cp "$(ls -t sql/*.sql.gz | head -1)" sql/db-import-raw.sql.gz
+# copy the most recent .sql.gz file in the sync/sql folder for import
+cp "$(ls -t sync/sql/*.sql.gz | head -1)" sync/sql/db-import-raw.sql.gz
 if [ $? -ne 0 ]
 then
-  echo "There must be at least one .sql.gz file in the sql folder to import"
-  exit 1
+  echo "There must be at least one .sql.gz file in the sync/sql folder to import"
+  echo "Skipping database import"
+  exit 0
 fi
 
-gunzip sql/db-import-raw.sql.gz
+gunzip sync/sql/db-import-raw.sql.gz
 
 # replace environment-specific URLs with localhost URL
-sed "s/$SITE_URL/http:\/\/localhost:8000/g" sql/db-import-raw.sql > sql/db-import.sql
+sed "s/$SITE_URL/http:\/\/localhost:8000/g" sync/sql/db-import-raw.sql > sync/sql/db-import.sql
 
 # drop existing database, create a new one, and load it up with data
 docker exec -i sparkpress_db mysql --user=$MYSQL_USER --password=$MYSQL_PASSWORD -e "drop database if exists $MYSQL_DATABASE"
 docker exec -i sparkpress_db mysql --user=$MYSQL_USER --password=$MYSQL_PASSWORD -e "create database $MYSQL_DATABASE"
-docker exec -i sparkpress_db mysql --user=$MYSQL_USER --password=$MYSQL_PASSWORD $MYSQL_DATABASE < sql/db-import.sql
+docker exec -i sparkpress_db mysql --user=$MYSQL_USER --password=$MYSQL_PASSWORD $MYSQL_DATABASE < sync/sql/db-import.sql
 
 # clean up files that aren't useful after import
-rm sql/db-import*
+rm sync/sql/db-import*
 
-mkdir -p sql/previous-imports
-mv sql/*.sql.gz sql/previous-imports
+mkdir -p sync/sql/previous-imports
+mv sync/sql/*.sql.gz sync/sql/previous-imports
```
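
A typical database sync with the updated paths might look like this (the snapshot filename is illustrative):

```sh
mkdir -p sync/sql                                 # in case the folder doesn't exist yet
cp ~/Downloads/staging-snapshot.sql.gz sync/sql/  # drop in the dump you want to import
npm run import:db                                 # preimport:db backs up the current DB, then the newest .sql.gz is imported
```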

scripts/import-uploads.sh

Lines changed: 20 additions & 0 deletions
```diff
@@ -0,0 +1,20 @@
+#!/bin/bash
+
+export $(grep -v '^#' .env | xargs)
+
+# copy the most recent .tar.gz file in the sync/uploads folder for import
+cp "$(ls -t sync/uploads/*.tar.gz | head -1)" sync/uploads/uploads-import.tar.gz
+if [ $? -ne 0 ]
+then
+  echo "There must be at least one .tar.gz file in the sync/uploads folder to import"
+  echo "Skipping uploads import"
+  exit 0
+fi
+
+rm -rf uploads/*
+mkdir -p uploads
+tar -zxvf sync/uploads/uploads-import.tar.gz -C uploads
+rm sync/uploads/uploads-import.tar.gz
+
+mkdir -p sync/uploads/previous-imports
+mv sync/uploads/*.tar.gz sync/uploads/previous-imports
```
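
Since the script always imports the newest `.tar.gz` sitting directly in `sync/uploads`, one way to roll back to the automatic backup (filename illustrative) is to copy it up a level and re-import:

```sh
cp sync/uploads/backups/uploads-backup-2024-05-01T13-45-09_UTC.tar.gz sync/uploads/
npm run import:uploads
```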
