When you want to import a large SQL file in phpMyAdmin, you can run into problems: the maximum upload size is whatever php.ini allows (via settings like `upload_max_filesize`), and often you cannot change this, or simply don’t want to. It tends to default to a measly 2MB.
However, there’s a feature that will allow you to FTP your SQL file to an upload directory of your choosing, and then instruct phpMyAdmin to run the query file. Here’s how you do it.
- Open up your FTP program, and navigate to the phpmyadmin folder on your server. Inside that folder is a file called config.inc.php – open this up in your trusty text editor.
- While you’re still in your FTP program, create a directory inside the phpmyadmin directory called upload.
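If you happen to have shell access as well as FTP, the same step is a one-liner. The path here is a placeholder – use wherever phpMyAdmin actually lives on your server:

```shell
# Create the upload directory next to config.inc.php.
# "phpmyadmin" is a stand-in for your real install path.
mkdir -p phpmyadmin/upload
```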
- Towards the bottom of the config file, you’ll notice a setting for `$cfg['UploadDir']`. Change it so that it reads `$cfg['UploadDir'] = './upload';`.
Don’t forget to add the “./” before “upload” – this just tells phpMyAdmin to look in its own directory (phpmyadmin) for a folder called “upload” – there’s no need to change file permissions or anything.
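After the edit, the relevant part of config.inc.php should end up looking something like this (the surrounding lines are just context):

```php
<?php
/* ... the rest of your phpMyAdmin configuration ... */

// Offer files from ./upload (relative to the phpMyAdmin
// directory) in the Import tab's file list.
$cfg['UploadDir'] = './upload';
```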
- Save it, then go back to your FTP program and upload an SQL file to the new upload directory.
- When you go to the Import tab in phpMyAdmin, you’ll now see a drop-down list of available files within your upload directory. Pick the one you want to run.
- Sit back, have some coffee.
- You can compress the file first, which will make your FTP upload time considerably shorter. I found that gzip works best for that; a zip file would sometimes give me an error.
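The compression step is just gzip on the dump before you upload it. A minimal sketch, using a stand-in dump file (the filename is made up):

```shell
# Stand-in for your real database dump.
printf 'CREATE TABLE demo (id INT);\n' > demo.sql

# Compress; this produces demo.sql.gz and removes demo.sql.
gzip demo.sql

# Verify the archive before spending time on the FTP upload.
gzip -t demo.sql.gz
```

phpMyAdmin recognizes the .sql.gz file in the upload directory and decompresses it during import, so there’s nothing extra to do on the server side.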
- Note that you can still run into trouble if your query is HUGE. If the execution time and timeout settings (such as `max_execution_time`) in your php.ini are not big enough, running your query can be problematic. You can always chop up your data inserts into numerous files, although that can be a pain. (Or, if you have SSH access, you can execute the query file that way.) I encountered this problem while trying to import a big Drupal database. I solved it by emptying (NOT dropping) all the search-related database tables, and then re-indexing the site once all the other data was inserted.
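If you do have SSH access, `split` makes quick work of chopping a dump into pieces, and you can bypass phpMyAdmin’s limits entirely by feeding the file straight to the mysql client. A sketch with made-up filenames and credentials:

```shell
# Stand-in for a large dump: 12,000 one-line INSERT statements.
seq 1 12000 | sed 's/^/INSERT INTO t VALUES (/; s/$/);/' > big_dump.sql

# Chop into 5,000-line pieces: part_aa, part_ab, part_ac.
# This only works cleanly if every statement fits on one line;
# otherwise, split on statement boundaries by hand.
split -l 5000 big_dump.sql part_

# With SSH, you can skip the upload directory and import
# directly instead (placeholder credentials and database name):
#   mysql -u dbuser -p my_database < big_dump.sql
```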
That’s it – I was pleased to discover this feature and hope it helps someone else out too!