Backup Redis Database to Telegram Automatically with Cronjob Setup

Daily short news for you
  • Previously, I mentioned openai/codex - an agent from OpenAI that runs conveniently in the terminal. Even better, it is open source, and it now supports other providers instead of being limited to ChatGPT models as before.

    Recently, Anthropic also introduced Claude Code, which is quite similar to Codex, except it is not open source and you are required to use their API. Since I don't have the money to experiment, I've only heard people in the programming community praise it a lot, saying it might even be better than Cursor. On the flip side, there's the risk of burning a hole in your wallet at any moment 😨

    » Read more
  • For a long time, I have been thinking about how to grow the blog's brand presence and user base. After much contemplation, it seemed the only options were sharing on social media or hoping people would find it themselves, until...

    Wearing this shirt means no more worries about traffic jams; the more crowded it gets, the more fun it is, because hundreds of eyes are watching 🤓

    (It really works, you know 🤭)

    » Read more
  • There is an interesting cycle in how many projects develop. Summarized in 3 steps: see something complex -> simplify it -> add features until it becomes complex again... -> back to a new loop.

    Why is that? Let me give you 2 examples to illustrate.

    Markdown was created with the aim of producing a plain text format that is "easy to write, easy to read, and easy to convert into something like HTML." At the time, no one had the patience to write text while also fiddling with how it would be formatted on the web. Yet now, people keep "stuffing" new formats into Markdown, or creating variations of it, to the point that… they can't even remember all the syntax.

    React is also an example. Since the days of PHP, there has been a desire to cleanly separate the user interface from the core application logic into two distinct parts for better readability and maintainability. As a result, UI/UX libraries developed very robustly, providing excellent user interaction, while the application logic lived on a separate server. The Front-end and Back-end duo emerged from this, with the REST API as the indispensable go-between. Yet now, React doesn't look much different from PHP, and Vue, Svelte... are all converging back to the same point.

    However, the loop is not bad; on the contrary, this loop is more about evolution than "regression." Sometimes, it creates something good from something old, and people rely on that goodness to continue the loop. In other words, it’s about distilling the essence little by little 😁

    » Read more

Problem

Have you ever "accidentally" deleted a running database? I haven't, but only because I'm always careful and responsible. I can't afford such a mistake: it could cost me my job, not to mention the aftermath of fixing things and compensating for the damage. So I always take precautions when working with databases or running commands that modify data, such as UPDATE, DELETE, and so on. I make sure to have a backup plan before executing any of these commands in case something goes wrong, double-check the commands for accuracy, and even run them several times in a development environment first.

Understanding the importance of data, most systems have a backup plan for their databases. Backup becomes crucial because if there's any issue that causes data loss, there's still a way to recover. When it comes to implementing backups, there are various methods, depending on the project and the frequency of data updates. Some methods include master-slave replication, outbox, cluster, etc.

2coffee.dev uses Redis as its database. It holds very little data (articles), and readers far outnumber writers (comments), so backing up the data is relatively quick and easy. I could simply spend a little time each day copying the backup file and be done with it. However, if something can be automated, it should be, so I set up a way to back up the data to Telegram automatically.

Redis Data Storage Mechanism

Redis has two mechanisms for persisting data to disk: RDB and AOF. As you may know, Redis stores data in RAM to speed up queries, but data in RAM is not persistent and can be lost if the computer is shut down or if the power is cut off. Therefore, Redis needs a mechanism to persist data to disk. Both methods provided by Redis can achieve this, with differences in how they work.

RDB (Redis Database) stores point-in-time snapshots of the dataset in a single, very compact file, which makes RDB files perfect for backups. You can configure how often snapshots are taken (every hour, every day, and so on) and keep several generations, so you can easily restore different versions of the dataset in case of any issues. Simply put, whenever the scheduled time comes, a new RDB file is created to replace the old one, and you can use this file to restore the data quickly. In addition, RDB restores data faster than AOF.

Append Only File (AOF) takes a different approach: it appends every write command or data change to a file, and when restoring, Redis simply replays all the commands stored in that file. AOF is great when you want near-instant durability. Its sync policy can be set to never (leaving it to the operating system), every second, or on every query. With the every-second option, write performance is still impressive, and if the power is suddenly cut you lose at most the last second of data.

Redis has detailed documentation on these two mechanisms and how to set them up. If you're interested, refer to Redis persistence.
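
For reference, here is roughly what the relevant settings look like in redis.conf. This is a minimal sketch with illustrative values, not my actual configuration; only the directive names are standard.

# redis.conf - example persistence settings (illustrative values)
dir /data                 # directory where the RDB and AOF files are written
dbfilename dump.rdb       # name of the RDB snapshot file
save 3600 1               # snapshot if at least 1 key changed within 1 hour
save 300 100              # ...or at least 100 keys changed within 5 minutes
appendonly yes            # enable AOF
appendfsync everysec      # fsync the AOF once per second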

Currently, I'm using a combination of both methods. The backup data is stored in the /data directory, so all I need to do is set up a cronjob that compresses /data into a file and then sends it to Telegram through a Telegram bot. Telegram limits files sent by a bot to 50MB, but that's far larger than my backup, which is only about 2MB as a .zip file. At this rate, it would take another 10 years for the data to exceed 50MB :D. Just kidding, if the file size ever grows that much, there are alternative solutions, such as sending it to Google Drive instead.

Implementation

You can implement the cronjob in any programming language you prefer; all it has to do is compress the /data directory and send it to Telegram through a single API call. I chose Go because it's lightweight and efficient.

The main functions include zipFile, generateFilename, createTelegramDocument, and removeFile.

The zipFile function is used to compress the /data directory, and it takes a filename parameter as the name of the compressed file.

func zipFile(filename string) (*os.File, error) {
  // Compress the backup directory (/data) into an in-memory buffer.
  var buf bytes.Buffer
  err := utils.Compress(config.DIR_TO_BACKUP, &buf)
  if err != nil {
    return nil, err
  }

  // Write the archive to disk under the given file name.
  fileToWrite, err := os.OpenFile(fmt.Sprintf("./%s", filename), os.O_CREATE|os.O_RDWR, os.FileMode(0777))
  if err != nil {
    return nil, err
  }
  // The caller only checks the error, so close the handle before returning.
  defer fileToWrite.Close()

  if _, err := io.Copy(fileToWrite, &buf); err != nil {
    return nil, err
  }

  return fileToWrite, nil
}
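
The utils.Compress helper that actually walks the backup directory and produces the archive isn't shown here. Below is a minimal sketch of what it might look like, using the standard archive/zip package and assuming the signature seen above (a source directory and an io.Writer); it is not the blog's actual code.

package utils

import (
  "archive/zip"
  "io"
  "os"
  "path/filepath"
)

// Compress walks dir and writes all of its files into buf as a zip archive.
func Compress(dir string, buf io.Writer) error {
  zw := zip.NewWriter(buf)

  err := filepath.Walk(dir, func(path string, info os.FileInfo, err error) error {
    if err != nil {
      return err
    }
    if info.IsDir() {
      return nil
    }

    // Store entries with paths relative to the backup directory.
    rel, err := filepath.Rel(dir, path)
    if err != nil {
      return err
    }
    w, err := zw.Create(rel)
    if err != nil {
      return err
    }

    f, err := os.Open(path)
    if err != nil {
      return err
    }
    defer f.Close()

    _, err = io.Copy(w, f)
    return err
  })
  if err != nil {
    zw.Close()
    return err
  }

  // Closing the writer flushes the archive's central directory into buf.
  return zw.Close()
}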

The generateFilename function is used to generate the file name. In my case, I want the file name to be in the format 2coffee concatenated with the creation date.

func generateFilename() string {
  // Go date layouts use the reference time "2006-01-02" (year-month-day).
  return fmt.Sprintf("estacks-%s.zip", time.Now().Format("2006-01-02"))
}

The removeFile function deletes the compressed file after it has been successfully sent to Telegram, to free up disk space.

func removeFile(filePath string) error {
  // Delete the local archive once it is no longer needed.
  return os.Remove(filePath)
}

Combining these functions, I created a TeleBackupRedis struct with a run method to perform the backup and send the message.

type TeleBackupRedis struct{}

func (t TeleBackupRedis) run() {
  teleBot := utils.TeleBot{}
  teleBot.NewBot(config.TELE_REQUEST_BOT)

  generationFilename := generateFilename()
  backupFilePath := fmt.Sprintf("%s%s", config.ROOT_PATH, generationFilename)

  // Compress the /data directory into the backup archive.
  _, err := zipFile(generationFilename)
  if err != nil {
    fmt.Println("Error when zip file", err)
    return
  }

  // Send the archive to the Telegram channel as a document.
  caption := fmt.Sprintf("Redis data backup on %s", time.Now().Format("2006-01-02"))
  teleFile := &tb.Document{File: tb.File{FileLocal: backupFilePath}, FileName: generationFilename, Caption: caption}
  err = teleBot.SendChannelMessage(config.TELE_REQUEST_CHANNEL_ID, teleFile)
  if err != nil {
    fmt.Println("Error when send file", err)
  }

  // Remove the local archive to free up disk space.
  err = removeFile(backupFilePath)
  if err != nil {
    fmt.Println("Error when remove zip file", err)
  }

  fmt.Println("Last running:", time.Now().Format(time.RFC3339))
}
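
The utils.TeleBot wrapper isn't shown either. Judging by the tb.Document and tb.File types, it is likely a thin layer over the gopkg.in/telebot.v3 library; the sketch below is my assumption of how it could look, and it further assumes the channel ID is stored as an int64.

package utils

import (
  "time"

  tb "gopkg.in/telebot.v3"
)

// TeleBot is a small wrapper around a telebot.v3 bot instance.
type TeleBot struct {
  bot *tb.Bot
}

// NewBot creates the underlying bot from the bot token.
func (t *TeleBot) NewBot(token string) error {
  b, err := tb.NewBot(tb.Settings{
    Token:  token,
    Poller: &tb.LongPoller{Timeout: 10 * time.Second},
  })
  if err != nil {
    return err
  }
  t.bot = b
  return nil
}

// SendChannelMessage sends a document to a channel identified by its chat ID.
func (t *TeleBot) SendChannelMessage(channelID int64, doc *tb.Document) error {
  _, err := t.bot.Send(&tb.Chat{ID: channelID}, doc)
  return err
}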

Finally, I schedule the run function to execute at 00:01 every day.
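
The article doesn't show how this schedule is wired up. One option in Go is the github.com/robfig/cron/v3 package, as in the illustrative sketch below; a plain system crontab entry that runs the binary would work just as well.

package main

import (
  cron "github.com/robfig/cron/v3"
)

func main() {
  c := cron.New()

  // "1 0 * * *" = minute 1, hour 0 -> 00:01 every day.
  c.AddFunc("1 0 * * *", func() {
    TeleBackupRedis{}.run()
  })

  // Block forever so the scheduler keeps running.
  c.Run()
}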

Summary

Backing up data is crucial, and how you implement it depends on the type of database. For Redis, configure its persistence mechanism accordingly and keep the backup files it produces. In my case, I added an extra step that sends the backup file to Telegram for convenient monitoring and future recovery.
