This is a multi-part series with tips and solutions to roadblocks that I faced in dealing with the "MySQL in App" service.
Part 2 - Migrating a really large MySQL database to Azure's "MySQL in App"
You can use the phpMyAdmin interface for most operations on your database, but it has never handled imports or exports over 8MB well, and these days 8MB is not that big.
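That 8MB ceiling almost certainly comes from PHP's own defaults rather than from phpMyAdmin itself (post_max_size defaults to 8M, and upload_max_filesize to 2M). If you only need a little more headroom, these are the standard php.ini directives you would raise; whether App Service actually applies such overrides to the bundled phpMyAdmin instance is an assumption you should verify yourself:

```ini
; Sketch of php.ini-style overrides to raise phpMyAdmin's import cap.
; The directive names are standard PHP settings; the values are illustrative.
upload_max_filesize = 64M
post_max_size = 64M
max_execution_time = 600
```

For a 3GB dump, though, no amount of tuning makes phpMyAdmin practical, hence the mysql.exe approach described below.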
I had to work with a 3GB database, and restoring it through phpMyAdmin was not an option at all.
Export your database from your source with the "mysqldump.exe" command:
mysqldump -u [user_name] -p[db_password] [database_name] --routines > 20170416_dbbackup_full.sql
(notice the -p[db_password] syntax: it looks a bit weird, but there really is no space between -p and the password).
NOTE: if your .sql file is "huge", see splitting a .sql file on Windows below. "Huge" means the import times out given the size of the App Service resources that you have provisioned.
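What counts as "huge" you can only learn empirically, but a small guard like the one below saves you from kicking off an import that is doomed to time out. The 500MB threshold is purely an assumption; replace it with whatever your App Service tier has proven able to import:

```shell
# check_dump_size FILE LIMIT_BYTES - warns when FILE is larger than LIMIT_BYTES.
check_dump_size() {
  size=$(wc -c < "$1")
  if [ "$size" -gt "$2" ]; then
    echo "split it first"
  else
    echo "ok to import"
  fi
}

# Example (500MB is an assumed limit, not a documented one):
# check_dump_size 20170416_dbbackup_full.sql $((500 * 1024 * 1024))
```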
Soon we will use the Kudu console, but first access the FTP service that is enabled on your Azure App Service. Get the FTP hostname and credentials from the Overview screen, and upload your large database.
FTP your large database to the D:\home folder:
Once your file is there, you can manipulate it with mysql.exe.
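FTP transfers of multi-gigabyte files do occasionally truncate or corrupt, so a cheap sanity check before restoring is to hash the file on both ends and compare. This sketch assumes sha256sum is available locally; on the Kudu side, Windows' built-in certutil can produce the same digest:

```shell
# Hash the dump locally before uploading; compare against the digest of the
# uploaded copy, e.g. by running
#   certutil -hashfile D:\home\20170416_dbbackup_full.sql SHA256
# from the Kudu console (certutil ships with Windows).
DUMP=20170416_dbbackup_full.sql
if [ -f "$DUMP" ]; then
  sha256sum "$DUMP"
fi
```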
The key things to gather are the path and port of your MySQL instance (phpMyAdmin can tell you both). The accounts you need to know are "root" and "azure", both with password "password". You also need to set the --bind-address parameter to 127.0.0.1.
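One way to find the port without opening phpMyAdmin: MySQL in App publishes its connection string (including the current, randomized port) in the MYSQLCONNSTR_localdb environment variable, readable from the Kudu console. Treat that variable name as an assumption and verify it on your instance. A sketch of pulling the port out of such a string:

```shell
# Sample connection string in the format MySQL in App is believed to use;
# substitute the real value of MYSQLCONNSTR_localdb from your Kudu console.
CONN="Database=localdb;Data Source=127.0.0.1:56172;User Id=azure;Password=password"

# Extract the digits that follow "Data Source=<host>:".
PORT=$(printf '%s' "$CONN" | sed -n 's/.*Data Source=[^:]*:\([0-9]*\).*/\1/p')
echo "$PORT"
```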
Pretty much your command looks like this (substitute [version] with the MySQL folder name you find under D:\Program Files (x86)\mysql, and the port with your own):
D:\home\sql_restore>"D:\Program Files (x86)\mysql\[version]\bin\mysql.exe" --user=root --password=password --port=56172 --bind-address=127.0.0.1 mydbname-test-01 < 20170401_dbbackup_mini.sql
This is enough for you to restore a large database, but there is one big problem with this approach: the Kudu console times out (after 30 or 60 minutes, I believe) and kills the process it is hosting. BAD. Basically it looks like this:
If the restore runs longer than that, the console times out before the operation completes, terminating it half-way.
See my next post for a solution to the above problem: Part 3 - Splitting a really large MySQL database dump/export file into smaller pieces