Data is priceless. Important business data usually lives in a database, and we need regular automatic backups to guard against unexpected, irreparable data loss.
WeChat Mini Program cloud development provides a convenient cloud database that we can use directly. Cloud development uses the cloud database provided by Tencent Cloud and has a solid data protection mechanism, so platform-level data loss is not a real concern. But we still have to worry about the safety of the data itself, such as accidentally deleting a collection or writing dirty data.
Fortunately, the cloud development console can export and import collections, so we can back up the database manually. Manual backups, however, are tedious, and anything repetitive should be handled by code. Let me walk through how to back up the cloud development database automatically.
Looking through the WeChat documentation, we can find that cloud development provides a data export interface, databaseMigrateExport:
POST https://api.weixin.qq.com/tcb/databasemigrateexport?access_token=ACCESS_TOKEN
Combining this interface with the timed trigger feature of cloud functions, we can back up the database automatically on a schedule. The general process:
Create a cloud function that is triggered at regular intervals.
The cloud function calls the interface to export database backup files.
Upload backup files to cloud storage for use.
1. Get access token
Calling WeChat's HTTP interfaces requires an access_token, so we need to obtain one first. From the documentation we know that the auth.getAccessToken interface exchanges the Mini Program's appid and secret for an access_token.
// get access_token
request.get(
  `https://api.weixin.qq.com/cgi-bin/token?grant_type=client_credential&appid=${appid}&secret=${secret}`,
  (err, res, body) => {
    if (err) {
      // handle the error
      return;
    }
    const data = JSON.parse(body);
    // data.access_token
  }
);
2. Create a database export task
After obtaining the access_token, we can use the databaseMigrateExport interface to export the data for backup.
The databaseMigrateExport interface creates a database export task and returns a job_id; how to use this job_id is covered below. Clearly, the export is not synchronous: it takes time, and the more data there is, the longer it takes. In my own testing, exporting 20,000 records (about 2 MB) took roughly 3 to 5 seconds.
Calling the databaseMigrateExport interface requires the environment ID, a storage file path, the export file type (1 for JSON, 2 for CSV), and a query statement.
Since this is a database backup, exporting JSON is the more compatible choice. The data to back up is constrained by the query, which is quite flexible: it can be the whole collection or just a specified subset. Here we use db.collection('data').get() to back up everything in the data collection, and we use the current time as the file name for later reference.
request.post(
  `https://api.weixin.qq.com/tcb/databasemigrateexport?access_token=${accessToken}`,
  {
    body: JSON.stringify({
      env,
      file_path: `${date}.json`,
      file_type: 1,
      query: 'db.collection("data").get()'
    })
  },
  (err, res, body) => {
    if (err) {
      // handle the error
      return;
    }
    const data = JSON.parse(body);
    // data.job_id
  }
);
3. Query the task status and get the file address
After creating the database export task, we get a job_id. The more data the collection holds, the longer the export takes. We can use the databaseMigrateQueryInfo interface to check the progress of the export.
Once the export has completed, the response includes a file_url, a temporary link for downloading the exported file.
request.post(
  `https://api.weixin.qq.com/tcb/databasemigratequeryinfo?access_token=${accessToken}`,
  {
    body: JSON.stringify({
      env,
      job_id: jobId
    })
  },
  (err, res, body) => {
    if (err) {
      // handle the error
      return;
    }
    const data = JSON.parse(body);
    // data.file_url
  }
);
After obtaining the download link, we can download the file and store it in our own cloud storage as the backup; a sketch of that follows the snippet below. If you do not need to keep backups long term, you do not even have to download the file: just store the job_id. When you need to restore a backup, query a fresh link with the job_id and download the data for recovery.
Where to keep the job_id is a matter of preference; I chose to store it in the database.
await db.collection('db_back_info').add({
  data: {
    date: new Date(),
    jobId: jobId
  }
});
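If you do want to keep the exported file itself, you can download it from the temporary link and write it into your own cloud storage. Below is a minimal sketch, not the article's original code: it assumes the request library and an initialized wx-server-sdk cloud object as in the complete code later, and fileUrl stands in for the link returned by databaseMigrateQueryInfo.
// download the exported JSON (encoding: null keeps body as a Buffer)
request.get({ url: fileUrl, encoding: null }, async (err, res, body) => {
  if (err) {
    // handle the error
    return;
  }
  // save the file into cloud storage under a backup/ prefix (hypothetical path)
  await cloud.uploadFile({
    cloudPath: `backup/${Date.now()}.json`,
    fileContent: body
  });
});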
4. Trigger the function on a schedule
Cloud functions support timed triggers and run automatically at the configured times. Cloud development's timed triggers use Cron expression syntax and can be precise down to the second. For detailed usage, see the official document: Timed Trigger | WeChat Open Document.
Here we configure the function to fire at 2 a.m. every day, so the database gets backed up daily. Create a config.json file in the cloud function's directory and write the following:
{
  "triggers": [
    {
      "name": "dbTrigger",
      "type": "timer",
      "config": "0 0 2 * * * *"
    }
  ]
}
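For reference, the seven fields of this Cron expression are, from left to right, second, minute, hour, day of month, month, day of week, and year:
// second minute hour day month week year
//    0      0    2    *    *    *    *   → every day at 02:00:00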
Complete code
Finally, here is the complete code for the cloud function. To use it, just create a timer-triggered cloud function and set the relevant environment variables:
appid
secret
backupColl: the name of the collection to back up, for example "data".
backupInfoColl: the name of the collection that stores backup information, for example "db_back_info".
Note that the default timeout for cloud functions is 3 seconds. When creating the backup function, it is recommended to set a larger timeout, for example 20 seconds, to leave enough time for polling the task result.
/* eslint-disable */
const request = require('request');
const cloud = require('wx-server-sdk');

// environment ID
const env = 'xxxx';

cloud.init({
  env
});

// exchange appid and secret for an access_token
async function getAccessToken(appid, secret) {
  return new Promise((resolve, reject) => {
    request.get(
      `https://api.weixin.qq.com/cgi-bin/token?grant_type=client_credential&appid=${appid}&secret=${secret}`,
      (err, res, body) => {
        if (err) {
          reject(err);
          return;
        }
        resolve(JSON.parse(body));
      }
    );
  });
}

// create the database export task
async function createExportJob(accessToken, collection) {
  const date = new Date().toISOString();
  return new Promise((resolve, reject) => {
    request.post(
      `https://api.weixin.qq.com/tcb/databasemigrateexport?access_token=${accessToken}`,
      {
        body: JSON.stringify({
          env,
          file_path: `${date}.json`,
          file_type: 1,
          query: `db.collection("${collection}").get()`
        })
      },
      (err, res, body) => {
        if (err) {
          reject(err);
          return;
        }
        resolve(JSON.parse(body));
      }
    );
  });
}

// poll the export task status until it finishes
async function waitJobFinished(accessToken, jobId) {
  return new Promise((resolve, reject) => {
    // poll the task status every 500 ms
    const timer = setInterval(() => {
      request.post(
        `https://api.weixin.qq.com/tcb/databasemigratequeryinfo?access_token=${accessToken}`,
        {
          body: JSON.stringify({
            env,
            job_id: jobId
          })
        },
        (err, res, body) => {
          if (err) {
            clearInterval(timer);
            reject(err);
            return;
          }
          const { status, file_url } = JSON.parse(body);
          console.log('query');
          if (status === 'success') {
            clearInterval(timer);
            resolve(file_url);
          }
        }
      );
    }, 500);
  });
}

exports.main = async (event, context) => {
  // read appid, secret and collection names from the cloud function's environment variables
  const { appid, secret, backupColl, backupInfoColl } = process.env;
  const db = cloud.database();
  try {
    // get access_token
    const { errcode, errmsg, access_token } = await getAccessToken(appid, secret);
    if (errcode && errcode !== 0) {
      throw new Error(`Failed to get access_token: ${errmsg || 'access_token is empty'}`);
    }
    // export the database
    const { errmsg: jobErrMsg, errcode: jobErrCode, job_id } = await createExportJob(access_token, backupColl);
    // print the job id to the log
    console.log(job_id);
    if (jobErrCode !== 0) {
      throw new Error(`Failed to create database backup task: ${jobErrMsg}`);
    }
    // save the task info to the database
    const res = await db.collection(backupInfoColl).add({
      data: {
        date: new Date(),
        jobId: job_id
      }
    });
    // wait for the task to finish
    const fileUrl = await waitJobFinished(access_token, job_id);
    console.log('export succeeded', fileUrl);
    // store the file url on the backup record
    await db
      .collection(backupInfoColl)
      .doc(res._id)
      .update({
        data: {
          fileUrl
        }
      });
  } catch (e) {
    throw new Error(`Database export error: ${e.message}`);
  }
};
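When you need to restore a backup later, the stored job_id is all you need: query a fresh download link through the same databaseMigrateQueryInfo interface. Here is a minimal sketch reusing the waitJobFinished helper above; it assumes an access_token obtained as in the main function, and docId is a hypothetical _id of a record in the backupInfoColl collection.
// look up a stored backup record and get a fresh temporary link
const record = await db.collection(backupInfoColl).doc(docId).get();
const freshUrl = await waitJobFinished(access_token, record.data.jobId);
// download freshUrl and import the JSON through the console, or feed it into your own restore flow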