
If you're running an application in production, it's critical to have an automated backup system for your database. Such a system could automatically dump database backups and upload them to the cloud every couple of hours.
In this tutorial, we'll create this script using Node.js. The script will run the `mysqldump` command as a Node.js child process. First, let's examine the command.
```shell
mysqldump -u <username> -p<password> <database-name>
```
Running this command would generate a record of the table structure and the data from the specified database in the form of a list of SQL statements.
Let's create the node.js backup script that would run this command as a child process. First, we'll import all the modules we need.
```javascript
const fs = require('fs')
const spawn = require('child_process').spawn
```
- We need the `fs` module to write the dump content to the dump file.
- We need the `spawn` method from the `child_process` module to run the `mysqldump` command.
The `spawn` method runs the command and returns a stream. It does not wait for the command to finish before handing us the output, which is important because dumping a large database can take hours.
Next, we'll need a unique name for the database dump.
```javascript
const dumpFileName = `${Math.round(Date.now() / 1000)}.dump.sql`
```
This uses JavaScript's `Date` object to get the current epoch time in seconds and appends `.dump.sql` to it. We'll use this as the dump file name.
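As an optional variation (not part of the original script), an ISO timestamp also sorts chronologically and is easier to read at a glance. The colons are replaced because they are invalid in file names on some filesystems.

```javascript
// Optional variation: a human-readable, sortable dump file name.
// Colons are replaced because some filesystems reject them in file names.
const dumpFileName = `${new Date().toISOString().replace(/:/g, '-')}.dump.sql`
console.log(dumpFileName) // e.g. "2024-01-15T10-30-00.000Z.dump.sql"
```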
Next, let's create a write stream. When we stream output from the spawn method, we'll pass the output to the write stream, which would write the output to a file.
```javascript
const dumpFileName = `${Math.round(Date.now() / 1000)}.dump.sql`
const writeStream = fs.createWriteStream(dumpFileName)
```
The write stream will create a file with the specified file name.
Next, let's create the child process using `spawn`.
```javascript
const dump = spawn('mysqldump', [
  '-u',
  '<username>',
  '-p<password>',
  '<database-name>',
])
```
The first argument to the `spawn` method is the command, and the second is a list of arguments to pass to it. As seen above, we're passing the same arguments we used on the command line.
This method returns a child process, and we can now stream every chunk of output it emits.
```javascript
dump.stdout
  .pipe(writeStream)
  .on('finish', function () {
    console.log('Completed')
  })
  .on('error', function (err) {
    console.log(err)
  })
```
Here, we're `pipe`-ing the output from the dump as input to the write stream. So as the child process runs, every time there's a new chunk of output, the write stream writes it to the file.
We can also listen to the finish and error events and pass callbacks to handle them. In this case we just log a message.
Here's the complete script:
```javascript
const fs = require('fs')
const spawn = require('child_process').spawn

const dumpFileName = `${Math.round(Date.now() / 1000)}.dump.sql`
const writeStream = fs.createWriteStream(dumpFileName)

const dump = spawn('mysqldump', ['-u', 'ghost', '-pghost', 'ghost'])

dump.stdout
  .pipe(writeStream)
  .on('finish', function () {
    console.log('Completed')
  })
  .on('error', function (err) {
    console.log(err)
  })
```
To automate this process, you can create a cron job that executes this script at a fixed interval.
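For example, a crontab entry along these lines would run the backup every six hours; the paths below are placeholders you'd adjust for your own setup.

```shell
# Run the backup script every 6 hours (paths are placeholders).
0 */6 * * * /usr/bin/node /path/to/backup.js
```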