Today we will look into JDBC batch insert and update examples for MySQL and Oracle databases. Sometimes we need to run a bulk of similar queries against a database, for example when loading data from CSV files into relational database tables. As we know, we can use Statement or PreparedStatement to execute queries. In addition, JDBC provides a Batch Processing feature through which we can execute a bulk of queries against the database in one go.
JDBC Batch
JDBC batch statements are processed through the Statement and PreparedStatement addBatch() and executeBatch() methods. This tutorial provides JDBC batch insert examples for both MySQL and Oracle databases.
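Before diving into the full programs, here is the shape of the API in isolation. The class and method names below are only for illustration, and the sketch assumes an Employee(empId, name) table like the one created below:

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;

// Core pattern: queue statements with addBatch(), then run them with executeBatch().
public class BatchPatternSketch {

    static int[] insertTwoRows(Connection con) throws SQLException {
        try (PreparedStatement ps =
                     con.prepareStatement("insert into Employee (empId, name) values (?,?)")) {
            ps.setInt(1, 1);
            ps.setString(2, "Name1");
            ps.addBatch();            // queue the first insert
            ps.setInt(1, 2);
            ps.setString(2, "Name2");
            ps.addBatch();            // queue the second insert
            return ps.executeBatch(); // executes the queued inserts and returns per-statement update counts
        }
    }
}
```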
We will look into several different programs, all part of the same project. Notice that I have both the MySQL and Oracle DB JDBC driver jars in the project build path, so that we can run the application against either database.
Let’s first create a simple table for our test programs. We will run a bulk of JDBC insert queries against it and compare the performance of the different approaches.
```sql
--Oracle DB
CREATE TABLE Employee (
  empId NUMBER NOT NULL,
  name varchar2(10) DEFAULT NULL,
  PRIMARY KEY (empId)
);

--MySQL DB
CREATE TABLE `Employee` (
  `empId` int(10) unsigned NOT NULL,
  `name` varchar(10) DEFAULT NULL,
  PRIMARY KEY (`empId`)
) ENGINE=InnoDB DEFAULT CHARSET=utf8;
```
We will read the database configuration details from a properties file so that switching from one database to another is quick and easy.
db.properties
```properties
#mysql DB properties
DB_DRIVER_CLASS=com.mysql.jdbc.Driver
DB_URL=jdbc:mysql://localhost:3306/UserDB
#DB_URL=jdbc:mysql://localhost:3306/UserDB?rewriteBatchedStatements=true
DB_USERNAME=pankaj
DB_PASSWORD=pankaj123

#Oracle DB Properties
#DB_DRIVER_CLASS=oracle.jdbc.driver.OracleDriver
#DB_URL=jdbc:oracle:thin:@localhost:1871:UserDB
#DB_USERNAME=scott
#DB_PASSWORD=tiger
```
Before we move into the actual JDBC batch insert examples that load bulk data into the Employee table, let’s write a simple utility class to get a database connection.
DBConnection.java
```java
package com.journaldev.jdbc.batch;

import java.io.FileInputStream;
import java.io.IOException;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.SQLException;
import java.util.Properties;

public class DBConnection {

    public static Connection getConnection() {
        Properties props = new Properties();
        FileInputStream fis = null;
        Connection con = null;
        try {
            fis = new FileInputStream("db.properties");
            props.load(fis);

            // load the Driver Class
            Class.forName(props.getProperty("DB_DRIVER_CLASS"));

            // create the connection now
            con = DriverManager.getConnection(props.getProperty("DB_URL"),
                    props.getProperty("DB_USERNAME"),
                    props.getProperty("DB_PASSWORD"));
        } catch (IOException e) {
            e.printStackTrace();
        } catch (ClassNotFoundException e) {
            e.printStackTrace();
        } catch (SQLException e) {
            e.printStackTrace();
        }
        return con;
    }
}
```
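On Java 7 and later, the same lookup can be written more compactly with try-with-resources so the FileInputStream is always closed. A minimal alternative sketch of getConnection(), not part of the original project (imports are the same as above):

```java
// Alternative sketch using try-with-resources; the caller is still responsible for closing the Connection.
public static Connection getConnection() {
    Properties props = new Properties();
    try (FileInputStream fis = new FileInputStream("db.properties")) {
        props.load(fis);
        Class.forName(props.getProperty("DB_DRIVER_CLASS"));
        return DriverManager.getConnection(props.getProperty("DB_URL"),
                props.getProperty("DB_USERNAME"), props.getProperty("DB_PASSWORD"));
    } catch (IOException | ClassNotFoundException | SQLException e) {
        e.printStackTrace();
        return null;
    }
}
```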
Now let’s look at the different approaches we can take for the JDBC batch insert example.
- Use Statement to execute one query at a time.
JDBCStatement.java
```java
package com.journaldev.jdbc.batch;

import java.sql.Connection;
import java.sql.SQLException;
import java.sql.Statement;

public class JDBCStatement {

    public static void main(String[] args) {
        Connection con = null;
        Statement stmt = null;
        try {
            con = DBConnection.getConnection();
            stmt = con.createStatement();

            long start = System.currentTimeMillis();
            for (int i = 0; i < 10000; i++) {
                String query = "insert into Employee values (" + i + ",'Name" + i + "')";
                stmt.execute(query);
            }
            System.out.println("Time Taken=" + (System.currentTimeMillis() - start));
        } catch (SQLException e) {
            e.printStackTrace();
        } finally {
            try {
                stmt.close();
                con.close();
            } catch (SQLException e) {
                e.printStackTrace();
            }
        }
    }
}
```

- Use PreparedStatement to execute one query at a time.
JDBCPreparedStatement.java
```java
package com.journaldev.jdbc.batch;

import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;

public class JDBCPreparedStatement {

    public static void main(String[] args) {
        Connection con = null;
        PreparedStatement ps = null;
        String query = "insert into Employee (empId, name) values (?,?)";
        try {
            con = DBConnection.getConnection();
            ps = con.prepareStatement(query);

            long start = System.currentTimeMillis();
            for (int i = 0; i < 10000; i++) {
                ps.setInt(1, i);
                ps.setString(2, "Name" + i);
                ps.executeUpdate();
            }
            System.out.println("Time Taken=" + (System.currentTimeMillis() - start));
        } catch (SQLException e) {
            e.printStackTrace();
        } finally {
            try {
                ps.close();
                con.close();
            } catch (SQLException e) {
                e.printStackTrace();
            }
        }
    }
}
```
This approach is similar to using Statement, but PreparedStatement provides performance benefits and avoids SQL injection attacks.

- Using the Statement Batch API for bulk processing.
JDBCStatementBatch.java
```java
package com.journaldev.jdbc.batch;

import java.sql.Connection;
import java.sql.SQLException;
import java.sql.Statement;

public class JDBCStatementBatch {

    public static void main(String[] args) {
        Connection con = null;
        Statement stmt = null;
        try {
            con = DBConnection.getConnection();
            stmt = con.createStatement();

            long start = System.currentTimeMillis();
            for (int i = 0; i < 10000; i++) {
                String query = "insert into Employee values (" + i + ",'Name" + i + "')";
                stmt.addBatch(query);

                // execute the batch after every 1000 queries
                if (i % 1000 == 0) stmt.executeBatch();
            }
            // execute the remaining queries in the batch
            stmt.executeBatch();
            System.out.println("Time Taken=" + (System.currentTimeMillis() - start));
        } catch (SQLException e) {
            e.printStackTrace();
        } finally {
            try {
                stmt.close();
                con.close();
            } catch (SQLException e) {
                e.printStackTrace();
            }
        }
    }
}
```
We are processing 10,000 records with a batch size of 1,000 records. Once the batch size is reached, we execute the batch and keep processing the remaining queries. A small helper that makes the batch size configurable is sketched after this list.

- Using the PreparedStatement Batch Processing API for bulk queries.
JDBCPreparedStatementBatch.java
```java
package com.journaldev.jdbc.batch;

import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;

public class JDBCPreparedStatementBatch {

    public static void main(String[] args) {
        Connection con = null;
        PreparedStatement ps = null;
        String query = "insert into Employee (empId, name) values (?,?)";
        try {
            con = DBConnection.getConnection();
            ps = con.prepareStatement(query);

            long start = System.currentTimeMillis();
            for (int i = 0; i < 10000; i++) {
                ps.setInt(1, i);
                ps.setString(2, "Name" + i);
                ps.addBatch();

                // execute the batch after every 1000 queries
                if (i % 1000 == 0) ps.executeBatch();
            }
            // execute the remaining queries in the batch
            ps.executeBatch();
            System.out.println("Time Taken=" + (System.currentTimeMillis() - start));
        } catch (SQLException e) {
            e.printStackTrace();
        } finally {
            try {
                ps.close();
                con.close();
            } catch (SQLException e) {
                e.printStackTrace();
            }
        }
    }
}
```
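As a side note, the flush-every-N logic can be pulled into a small helper if you want to tune the batch size. Below is a minimal sketch; BatchInsertHelper and BATCH_SIZE are illustrative names, not part of the original project:

```java
package com.journaldev.jdbc.batch;

import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;

// Hypothetical helper: flushes the batch every BATCH_SIZE rows and once more at the end.
public class BatchInsertHelper {

    private static final int BATCH_SIZE = 1000; // assumed value; tune it for your data and driver

    public static void insertAll(Connection con, int rows) throws SQLException {
        String query = "insert into Employee (empId, name) values (?,?)";
        try (PreparedStatement ps = con.prepareStatement(query)) {
            for (int i = 0; i < rows; i++) {
                ps.setInt(1, i);
                ps.setString(2, "Name" + i);
                ps.addBatch();
                // executeBatch() clears the batch, so we can simply keep adding afterwards
                if ((i + 1) % BATCH_SIZE == 0) ps.executeBatch();
            }
            ps.executeBatch(); // flush whatever is left over
        }
    }
}
```

It can be invoked with a Connection from DBConnection and the number of rows to insert.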
Let’s see how our programs perform against the MySQL database. I have executed each of them separately multiple times, and the table below contains the results.
| MySQL DB | Statement | PreparedStatement | Statement Batch | PreparedStatement Batch |
|---|---|---|---|---|
| Time Taken (ms) | 8256 | 8130 | 7129 | 7019 |
When I looked at the response times, I was not sure they were right, because I was expecting a significant improvement from batch processing. So I looked online for an explanation and found out that, by default, MySQL processes a batch in much the same way as it executes the statements individually. To get the real benefit of batch processing in MySQL, we need to set rewriteBatchedStatements to true while creating the DB connection; see the commented-out MySQL URL in the db.properties file above.
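The flag can be enabled either as a URL parameter (as in db.properties) or through connection properties passed to DriverManager. A minimal sketch of the latter, reusing the URL and credentials shown earlier (the class name is just for illustration):

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.SQLException;
import java.util.Properties;

// Sketch: enable rewriteBatchedStatements through connection properties instead of the URL.
public class MySQLBatchConnection {

    public static Connection getConnection() throws SQLException {
        Properties info = new Properties();
        info.setProperty("user", "pankaj");
        info.setProperty("password", "pankaj123");
        info.setProperty("rewriteBatchedStatements", "true"); // MySQL Connector/J specific flag
        return DriverManager.getConnection("jdbc:mysql://localhost:3306/UserDB", info);
    }
}
```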
With rewriteBatchedStatements set to true, the table below shows the response times for the same programs.
| MySQL DB | Statement | PreparedStatement | Statement Batch | PreparedStatement Batch |
|---|---|---|---|---|
| Time Taken (ms) | 5676 | 5570 | 3716 | 394 |
As you can see, PreparedStatement batch processing is very fast when rewriteBatchedStatements is true. So if your application involves a lot of batch processing, you should enable this feature for faster processing.
Oracle Batch Insert
When I executed the above programs against the Oracle database, the results were in line with the MySQL results, and PreparedStatement batch processing was much faster than any other approach.
JDBC Batch Processing Exceptions
Let’s see how the batch programs behave when one of the queries throws an exception.
JDBCBatchExceptions.java
```java
package com.journaldev.jdbc.batch;

import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;
import java.util.Arrays;

public class JDBCBatchExceptions {

    public static void main(String[] args) {
        Connection con = null;
        PreparedStatement ps = null;
        String query = "insert into Employee (empId, name) values (?,?)";
        try {
            con = DBConnection.getConnection();
            ps = con.prepareStatement(query);

            String name1 = "Pankaj";
            String name2 = "Pankaj Kumar"; // longer than column length
            String name3 = "Kumar";

            ps.setInt(1, 1);
            ps.setString(2, name1);
            ps.addBatch();

            ps.setInt(1, 2);
            ps.setString(2, name2);
            ps.addBatch();

            ps.setInt(1, 3);
            ps.setString(2, name3);
            ps.addBatch();

            int[] results = ps.executeBatch();
            System.out.println(Arrays.toString(results));
        } catch (SQLException e) {
            e.printStackTrace();
        } finally {
            try {
                ps.close();
                con.close();
            } catch (SQLException e) {
                e.printStackTrace();
            }
        }
    }
}
```
When I executed the above program against the MySQL database, I got the exception below and none of the records were inserted into the table.
```
com.mysql.jdbc.MysqlDataTruncation: Data truncation: Data too long for column 'name' at row 2
	at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:2939)
	at com.mysql.jdbc.MysqlIO.sendCommand(MysqlIO.java:1623)
	at com.mysql.jdbc.MysqlIO.sqlQueryDirect(MysqlIO.java:1715)
	at com.mysql.jdbc.Connection.execSQL(Connection.java:3249)
	at com.mysql.jdbc.PreparedStatement.executeInternal(PreparedStatement.java:1268)
	at com.mysql.jdbc.PreparedStatement.executeUpdate(PreparedStatement.java:1541)
	at com.mysql.jdbc.PreparedStatement.executeUpdate(PreparedStatement.java:1455)
	at com.mysql.jdbc.PreparedStatement.executeUpdate(PreparedStatement.java:1440)
	at com.mysql.jdbc.PreparedStatement.executeBatchedInserts(PreparedStatement.java:1008)
	at com.mysql.jdbc.PreparedStatement.executeBatch(PreparedStatement.java:908)
	at com.journaldev.jdbc.batch.JDBCBatchExceptions.main(JDBCBatchExceptions.java:37)
```
When I executed the same program against the Oracle database, I got the exception below.
```
java.sql.BatchUpdateException: ORA-12899: value too large for column "SCOTT"."EMPLOYEE"."NAME" (actual: 12, maximum: 10)
	at oracle.jdbc.driver.OraclePreparedStatement.executeBatch(OraclePreparedStatement.java:10070)
	at oracle.jdbc.driver.OracleStatementWrapper.executeBatch(OracleStatementWrapper.java:213)
	at com.journaldev.jdbc.batch.JDBCBatchExceptions.main(JDBCBatchExceptions.java:38)
```
But in the Oracle case, the rows before the exception were inserted into the database successfully. Although the exception clearly says what the error is, it doesn’t tell us which query caused the issue. So we should either validate the data before adding it to the batch, or use JDBC transaction management to make sure that either all or none of the records get inserted when an exception occurs.
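One diagnostic option, not used in the original program, is to catch java.sql.BatchUpdateException (which executeBatch() throws) and inspect its getUpdateCounts() array; how much it reports before the failing statement is driver-specific, so treat it only as a rough hint. A sketch of such a catch block, which would replace the catch (SQLException e) block in JDBCBatchExceptions and needs an extra import of java.sql.BatchUpdateException:

```java
} catch (BatchUpdateException bue) {
    // update counts for the statements the driver managed to run before (or despite) the failure
    int[] counts = bue.getUpdateCounts();
    System.out.println("Partial update counts=" + Arrays.toString(counts));
    bue.printStackTrace();
} catch (SQLException e) {
    e.printStackTrace();
}
```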
The same program with JDBC transaction management is shown below.
JDBCBatchExceptions.java
```java
package com.journaldev.jdbc.batch;

import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;
import java.util.Arrays;

public class JDBCBatchExceptions {

    public static void main(String[] args) {
        Connection con = null;
        PreparedStatement ps = null;
        String query = "insert into Employee (empId, name) values (?,?)";
        try {
            con = DBConnection.getConnection();
            con.setAutoCommit(false);
            ps = con.prepareStatement(query);

            String name1 = "Pankaj";
            String name2 = "Pankaj Kumar"; // longer than column length
            String name3 = "Kumar";

            ps.setInt(1, 1);
            ps.setString(2, name1);
            ps.addBatch();

            ps.setInt(1, 2);
            ps.setString(2, name2);
            ps.addBatch();

            ps.setInt(1, 3);
            ps.setString(2, name3);
            ps.addBatch();

            int[] results = ps.executeBatch();
            con.commit();
            System.out.println(Arrays.toString(results));
        } catch (SQLException e) {
            e.printStackTrace();
            try {
                con.rollback();
            } catch (SQLException e1) {
                e1.printStackTrace();
            }
        } finally {
            try {
                ps.close();
                con.close();
            } catch (SQLException e) {
                e.printStackTrace();
            }
        }
    }
}
```
As you can see, I am rolling back the transaction if any SQLException occurs. If the batch processing is successful, I explicitly commit the transaction.
Summary
That’s all for the JDBC batch insert and update examples. Make sure to experiment with your data to find the optimal batch size for bulk queries. One limitation of JDBC batch processing is that we can’t mix different types of queries in a single PreparedStatement batch.
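Since all of the examples above are inserts, here is what the same pattern looks like for a bulk update, as a minimal sketch against the Employee table created earlier (the class name is illustrative and not part of the original project):

```java
package com.journaldev.jdbc.batch;

import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;

// Sketch: batch update using the same addBatch()/executeBatch() pattern as the inserts above.
public class JDBCPreparedStatementBatchUpdate {

    public static void main(String[] args) {
        String query = "update Employee set name = ? where empId = ?";
        try (Connection con = DBConnection.getConnection();
             PreparedStatement ps = con.prepareStatement(query)) {
            for (int i = 0; i < 10000; i++) {
                ps.setString(1, "NewName" + i);
                ps.setInt(2, i);
                ps.addBatch();
                if (i % 1000 == 0) ps.executeBatch(); // flush every 1000 updates
            }
            ps.executeBatch(); // flush the remaining updates
        } catch (SQLException e) {
            e.printStackTrace();
        }
    }
}
```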