Saving neural network weights to a MySQL database (AML part 5)

I’ve tested two options for saving and loading weights: the first works with a MySQL database, the second saves to a CSV file.

To take a closer look at this approach, you will need my DataBase library, which I developed for working with neural networks: it lets you quickly connect to a database and work with it. The library is available on GitHub at https://github.com/scisoftdev/Python/blob/master/Database/Database.py.

Before saving the weight coefficients, you need to create a database and prepare the tables the weights will be written to. The DataBase library can create the database itself if it does not already exist on the server; you only need to supply the connection settings and the database name. Tables are a little more complicated, since each table stores only one set of weights between two layers. So if there are four layers, there will be three sets of weights. I wrote a simple program for this task.

After connecting to the database server, this program reads the list of network parameters and adds the corresponding tables to the database.

How does it work? Suppose a neural network has the parameters [784, 200, 10]. In this case, two tables will be created, with columns named wn, where n is the column number. The first table will have 784 columns and 200 rows, the second 200 columns and 10 rows. The data type of each column is DOUBLE. The program shows clearly how the SQL query for creating the tables is formed.
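The original listing is not reproduced here, but the idea can be sketched as follows. The table name weights0 is only an illustration, not necessarily the name the program generates:

```python
params = [784, 200, 10]  # neural network layer sizes

# Columns w1 .. w784, each of type DOUBLE, for the first weight matrix.
n_columns = params[0]
columns = ", ".join("w{} DOUBLE".format(n + 1) for n in range(n_columns))
sql = "CREATE TABLE IF NOT EXISTS weights0 ({})".format(columns)
print(sql)  # CREATE TABLE IF NOT EXISTS weights0 (w1 DOUBLE, w2 DOUBLE, ...)
```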

For this code to work correctly, it needs to be inserted into the class implementation, and the initial parameters need to be written into an array. However, when testing the program I did not do this, since the main task was to save the weight coefficients correctly. At this stage, I run the table-creation program separately from the neural network that stores the weights. In the future, I will add a function to the program that saves a neural network with all its parameters and any number of layers. At this point, it is worth taking a closer look at some useful functions from the DataBase library.

One of these functions is createTable_weights. It contains a special SQL query that adds a table.
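The library source is linked above; as a rough sketch of the idea (the actual signature in the DataBase library may differ), such a function could look like this:

```python
def createTable_weights(cursor, table_name, n_columns):
    # Build "CREATE TABLE t (w1 DOUBLE, ..., wN DOUBLE)" and execute it.
    columns = ", ".join("w{} DOUBLE".format(n + 1) for n in range(n_columns))
    cursor.execute("CREATE TABLE IF NOT EXISTS {} ({})".format(table_name,
                                                               columns))
```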

The name_w array stores the table names, which are then used in the createTable_weights function.
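Putting the two together, the table-creation step could be driven like this. This is a sketch assuming mysql-connector-python; the connection settings and the table-name scheme are illustrative:

```python
import mysql.connector

conn = mysql.connector.connect(host="localhost", user="root",
                               password="secret", database="nn_weights")
cursor = conn.cursor()

params = [784, 200, 10]
# One table per pair of adjacent layers: len(params) - 1 names in total.
name_w = ["weights{}".format(i) for i in range(len(params) - 1)]

for i, table in enumerate(name_w):
    createTable_weights(cursor, table, params[i])

conn.commit()
```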

To write the obtained weights into the prepared tables, you need to use the following functions.

The showColumnInfo function creates an array containing complete information about all the columns in a table. To build an SQL query that adds data to the database, you need to know which columns exist in the table. The getColumnInfo function extracts the column name, which sits at index 0 of each entry in that array. The last function is insertInto, which writes data to the table. If you need a more detailed description of the DataBase library, please write in the comments.
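Based on that description, minimal sketches of the three helpers might look like this (the real implementations in the DataBase library may differ):

```python
def showColumnInfo(cursor, table_name):
    # Full metadata for every column in the table; each returned row
    # looks like (Field, Type, Null, Key, Default, Extra).
    cursor.execute("SHOW COLUMNS FROM {}".format(table_name))
    return cursor.fetchall()

def getColumnInfo(column_info):
    # The column name sits at index 0 of each metadata row.
    return [col[0] for col in column_info]

def insertInto(cursor, table_name, column_names, values):
    # Build "INSERT INTO t (w1, ..., wN) VALUES (%s, ..., %s)" and
    # execute it with one row of weight values.
    cols = ", ".join(column_names)
    placeholders = ", ".join(["%s"] * len(values))
    cursor.execute("INSERT INTO {} ({}) VALUES ({})".format(
        table_name, cols, placeholders), values)
```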

In the program's source code, the saving step uses these query functions.
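As a sketch of that step: here the trained weight matrices are assumed to be NumPy arrays collected in a list called weights_list, one per table in name_w (the name is hypothetical), and float_type is the conversion helper described below:

```python
# Save each weight matrix row by row into its table.
for table, matrix in zip(name_w, weights_list):
    column_names = getColumnInfo(showColumnInfo(cursor, table))
    for row in matrix:
        insertInto(cursor, table, column_names,
                   [float_type(x) for x in row])

conn.commit()
```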

The complete code of the neural network that saves its weights to the database is available on GitHub at https://github.com/scisoftdev/Python/blob/master/Save_weights_of_NN/NN_save_mysql.py.

One more detail: the float_type function converts the NumPy float64 data type into a form the database can accept as DOUBLE. Otherwise, an error occurs during insertion.
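A minimal sketch of such a conversion (the library's actual implementation may differ):

```python
def float_type(value):
    # The MySQL connector cannot bind numpy.float64 values directly, so
    # cast each one to a plain Python float, which maps to DOUBLE.
    return float(value)
```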

In the future, I will describe how to save the neural network weights to a CSV file. It should be noted that saving to and later working with CSV files is more convenient than MySQL. I already have code for a neural network that works with a variable number of layers, saves and loads its weights, and saves the network's metadata.

Fine! It works!