Thursday, March 22, 2018

How do I back up a MySQL database that has a 105 GB table?

I have a MySQL database that uses the InnoDB engine. When I back up this database, which is about 150 GB, with the mysqldump command on a Linux server, I get the error message:

mysqldump Lost connection.

One table is 105 GB, with about 420,000,000 rows, and on average about 100,000 new rows are inserted per hour. I get the error while the backup is running against this table. How can I take the backup, and what is a foolproof process?

3 Answers

Answer 1

This is very similar to "Lost connection to MySQL server during query?" and https://dba.stackexchange.com/questions/97846/mysqldump-error-2013

Add these options to your mysqldump command line:

--max-allowed-packet=1G --net-read-timeout=7200 --net-write-timeout=7200 --net-buffer-length=32704
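Put together, a full invocation might look like the sketch below. The database name and output file are placeholders, and --single-transaction and --quick are additions beyond the options above: for InnoDB, --single-transaction gives a consistent snapshot without locking tables, and --quick streams rows instead of buffering each table in memory. Whether your mysqldump build accepts the two net timeout options may depend on the version; if it rejects them, set net_read_timeout and net_write_timeout as server variables instead.

# Sketch only: dbName is a placeholder, and option support varies by version.
mysqldump -u root -p --single-transaction --quick \
  --max-allowed-packet=1G --net-buffer-length=32704 \
  --net-read-timeout=7200 --net-write-timeout=7200 \
  dbName > dbName_full.sql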

Answer 2

You can do the backup in parts, using a WHERE clause to split it into several files.

You should look for a suitable index, or use the primary key, and run several mysqldump commands. Note that the result will not be a snapshot of the whole table at a single point in time.

mysqldump -u root -p  dbName tableName --where="id>=10000 AND id<20000" > dbName_tableName_10_20.sql 
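To cover the whole 420-million-row table, that per-range command can be wrapped in a loop. The following is only a sketch: dbName, tableName, and MAX_ID are placeholders, and it assumes id is an integer primary key; tune the chunk size to your hardware.

#!/bin/bash
CHUNK=10000000     # id range per output file; adjust as needed
MAX_ID=420000000   # placeholder: find the real value with SELECT MAX(id)
for ((start=0; start<MAX_ID; start+=CHUNK)); do
  end=$((start + CHUNK))
  mysqldump -u root -p dbName tableName \
    --where="id>=${start} AND id<${end}" \
    > "dbName_tableName_${start}_${end}.sql"
done

In practice you would put the credentials in a ~/.my.cnf option file so the loop does not prompt for a password on every iteration.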

Consider splitting the table in the future for easier maintenance.

Answer 3

The "Lost Connection" problem is caused by too much data through the dumping channel - you need to increase max_allowed_packet in my.cnf.

However, backing up will lock the database (or the tables) and cause service problems. So you might be better off first setting up replication, so that a slave is kept in a consistent state by continuous updates from its master. When you need a backup, you lock the slave and set it to read only; then you can either run mysqldump against it or just back up its binary data files (which the manual recommends for larger databases).
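A minimal sketch of that workflow, assuming a classic asynchronous replica (the pre-8.0 "slave" terminology used in this answer) and placeholder host and credentials:

# Pause the replica's SQL thread so its data stops changing.
mysql -h replica-host -u root -p -e "STOP SLAVE SQL_THREAD;"

# Dump from the now-quiescent replica while the master keeps serving writes.
mysqldump -h replica-host -u root -p dbName > dbName.sql

# Resume replication; the replica catches up from its relay log.
mysql -h replica-host -u root -p -e "START SLAVE SQL_THREAD;"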
