Analytics Issue - Streaming result set com.mysql.jdbc.RowDataDynamic@ee075cf4 is still active - Please Help!

We currently have Kaltura Community Edition 10.21.0 set up on a cloud server running Red Hat Enterprise Linux 6 (PV).

This had been running fine for the past few months; however, last week the drive filled to 100%, and after spending some time clearing space, we noticed that the stats no longer work.

Looking in the log files, I'm seeing errors like:

KalturaLogs - Error disconnecting from database

and

ERROR 14-03 13:00:14,932 - KalturaLogs - org.pentaho.di.core.exception.KettleDatabaseException:
Error comitting connection
Streaming result set com.mysql.jdbc.RowDataDynamic@e213f663 is still active. No statements may be issued when any streaming result sets are open and in use on a given connection. Ensure that you have called .close() on any active streaming result sets before attempting more queries.

at org.pentaho.di.core.database.Database.commit(Database.java:711)
at org.pentaho.di.core.database.Database.commit(Database.java:680)
at org.pentaho.di.core.database.Database.disconnect(Database.java:571)
at org.pentaho.di.job.Job.beginProcessing(Job.java:876)
at org.pentaho.di.job.entries.job.JobEntryJob.execute(JobEntryJob.java:786)
at org.pentaho.di.job.Job.execute(Job.java:503)
at org.pentaho.di.job.Job.execute(Job.java:642)
at org.pentaho.di.job.Job.execute(Job.java:642)
at org.pentaho.di.job.Job.execute(Job.java:642)
at org.pentaho.di.job.Job.execute(Job.java:420)
at org.pentaho.di.job.entries.job.JobEntryJobRunner.run(JobEntryJobRunner.java:63)
at java.lang.Thread.run(Thread.java:809)

Caused by: java.sql.SQLException: Streaming result set com.mysql.jdbc.RowDataDynamic@e213f663 is still active. No statements may be issued when any streaming result sets are open and in use on a given connection. Ensure that you have called .close() on any active streaming result sets before attempting more queries.
at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:880)
at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:876)
at com.mysql.jdbc.MysqlIO.checkForOutstandingStreamingData(MysqlIO.java:3099)
at com.mysql.jdbc.MysqlIO.sendCommand(MysqlIO.java:2354)
at com.mysql.jdbc.MysqlIO.sqlQueryDirect(MysqlIO.java:2582)
at com.mysql.jdbc.ConnectionImpl.execSQL(ConnectionImpl.java:2531)
at com.mysql.jdbc.ConnectionImpl.commit(ConnectionImpl.java:1612)
at org.pentaho.di.core.database.Database.commit(Database.java:700)
… 11 more

ERROR 14-03 13:00:14,933 - Run kjbVar job - Error running job entry ‘job’ :
ERROR 14-03 13:00:14,934 - Run kjbVar job - org.pentaho.di.core.exception.KettleJobException:
Unable to begin processing by logging start in logtable etl_log

Couldn’t execute SQL: DELETE FROM etl_log WHERE ID_JOB= -1

Streaming result set com.mysql.jdbc.RowDataDynamic@e213f663 is still active. No statements may be issued when any streaming result sets are open and in use on a given connection. Ensure that you have called .close() on any active streaming result sets before attempting more queries.

at org.pentaho.di.job.Job.beginProcessing(Job.java:874)
at org.pentaho.di.job.entries.job.JobEntryJob.execute(JobEntryJob.java:786)
at org.pentaho.di.job.Job.execute(Job.java:503)
at org.pentaho.di.job.Job.execute(Job.java:642)
at org.pentaho.di.job.Job.execute(Job.java:642)
at org.pentaho.di.job.Job.execute(Job.java:642)
at org.pentaho.di.job.Job.execute(Job.java:420)
at org.pentaho.di.job.entries.job.JobEntryJobRunner.run(JobEntryJobRunner.java:63)
at java.lang.Thread.run(Thread.java:809)

Caused by: org.pentaho.di.core.exception.KettleDatabaseException:
Couldn’t execute SQL: DELETE FROM etl_log WHERE ID_JOB= -1

Can somebody please tell me what I need to do to fix this issue?
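For context, my understanding (from the MySQL Connector/J documentation) is that a "streaming" result set is one opened on a forward-only, read-only statement with fetch size `Integer.MIN_VALUE`, and that while it is open no other statement (not even a commit) may run on the same connection. A minimal sketch of the pattern the driver is complaining about, with hypothetical connection details:

```java
import java.sql.*;

public class StreamingDemo {
    public static void main(String[] args) throws SQLException {
        // Hypothetical URL/credentials, for illustration only.
        try (Connection conn = DriverManager.getConnection(
                "jdbc:mysql://localhost/kalturadw", "etl", "secret")) {
            conn.setAutoCommit(false);
            Statement st = conn.createStatement(
                ResultSet.TYPE_FORWARD_ONLY, ResultSet.CONCUR_READ_ONLY);
            st.setFetchSize(Integer.MIN_VALUE); // puts Connector/J in streaming mode

            ResultSet rs = st.executeQuery("SELECT ID_JOB FROM etl_log");
            // While rs is open and streaming, any other statement on this
            // connection -- including conn.commit() -- fails with
            // "Streaming result set ... is still active".
            rs.close(); // the stream must be fully read or closed first
            st.close();
            conn.commit(); // safe only once the streaming result set is closed
        }
    }
}
```

So it looks like one of the ETL jobs is leaving such a result set open on the connection it later tries to commit, which matches the `Database.commit` / `checkForOutstandingStreamingData` frames in the trace.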

Many Thanks

Hi,

Please run /opt/kaltura/bin/kaltura-run-dwh.sh

Then let's take another look at /opt/kaltura/dwh/logs/: check for strings starting with "ERROR:" and examine the lines leading up to them. Please post the specific log in which you saw the final error, along with the lines leading up to it.
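To pull those out in one pass, something like this should work (assuming the default /opt/kaltura layout):

```shell
# Show every ERROR line in the dwh logs, with 20 lines of leading context
# and the file name / line number of each match.
grep -rn -B 20 "ERROR" /opt/kaltura/dwh/logs/
```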

Thanks,

Hi Jess,

I've run /opt/kaltura/bin/kaltura-run-dwh.sh, and this is what appears in the hourly log:

/opt/kaltura/pentaho/pdi/kitchen.sh: line 11: cd: /root: Permission denied
INFO 18-03 08:01:41,852 - Kitchen - Start of run.
INFO 18-03 08:01:42,016 - hourly - Start of job execution
INFO 18-03 08:01:42,022 - hourly - Starting entry [Register etl_server]
INFO 18-03 08:01:42,024 - Register etl_server - Loading transformation from XML file [/opt/kaltura/dwh/etlsource//common/register_etl_server.ktr]
INFO 18-03 08:01:42,136 - register_etl_server - Dispatching started for transformation [register_etl_server]
INFO 18-03 08:01:42,259 - print_hostname_to_stream - Dispatching started for transformation [print_hostname_to_stream]
INFO 18-03 08:01:42,683 - Generate Row - Finished processing (I=0, O=0, R=0, W=1, U=0, E=0)
INFO 18-03 08:01:42,689 - Mapping input specification - Finished processing (I=0, O=0, R=1, W=1, U=0, E=0)
INFO 18-03 08:01:42,758 - Set hostname command - Optimization level not specified. Using default of 9.
INFO 18-03 08:01:42,911 - Set hostname command - Finished processing (I=0, O=0, R=1, W=1, U=0, E=0)
INFO 18-03 08:01:42,935 - Get hostname - Finished processing (I=0, O=0, R=1, W=1, U=0, E=0)
INFO 18-03 08:01:42,938 - Select hostname - Finished processing (I=0, O=0, R=1, W=1, U=0, E=0)
INFO 18-03 08:01:42,940 - Mapping output specification - Finished processing (I=0, O=0, R=1, W=1, U=0, E=0)
INFO 18-03 08:01:42,941 - Get hostname - Finished processing (I=0, O=0, R=1, W=1, U=0, E=0)
INFO 18-03 08:01:42,963 - Copy rows to result - Finished processing (I=0, O=0, R=1, W=1, U=0, E=0)
INFO 18-03 08:01:42,965 - Register server - Finished processing (I=1, O=0, R=1, W=1, U=0, E=0)
INFO 18-03 08:01:42,969 - hourly - Starting entry [Seize hourly etl server lock]
INFO 18-03 08:01:42,971 - Seize hourly etl server lock - Loading transformation from XML file [/opt/kaltura/dwh/etlsource//common/seize_lock_by_name.ktr]
INFO 18-03 08:01:43,027 - seize_lock_by_name - Dispatching started for transformation [seize_lock_by_name]
INFO 18-03 08:01:43,123 - get_etl_server_hourly_lock_name - Dispatching started for transformation [get_etl_server_hourly_lock_name]
INFO 18-03 08:01:43,181 - print_hostname_to_stream - Dispatching started for transformation [print_hostname_to_stream]
INFO 18-03 08:01:43,215 - Generate Row - Finished processing (I=0, O=0, R=0, W=1, U=0, E=0)
INFO 18-03 08:01:43,222 - Mapping input specification - Finished processing (I=0, O=0, R=1, W=1, U=0, E=0)
INFO 18-03 08:01:43,223 - Set hostname command - Optimization level not specified. Using default of 9.
INFO 18-03 08:01:43,225 - Mapping input specification - Finished processing (I=0, O=0, R=1, W=1, U=0, E=0)
INFO 18-03 08:01:43,285 - Set hostname command - Finished processing (I=0, O=0, R=1, W=1, U=0, E=0)
INFO 18-03 08:01:43,293 - Get hostname - Finished processing (I=0, O=0, R=1, W=1, U=0, E=0)
INFO 18-03 08:01:43,294 - Select hostname - Finished processing (I=0, O=0, R=1, W=1, U=0, E=0)
INFO 18-03 08:01:43,296 - Mapping output specification - Finished processing (I=0, O=0, R=1, W=1, U=0, E=0)
INFO 18-03 08:01:43,297 - Get hostname - Finished processing (I=0, O=0, R=1, W=1, U=0, E=0)
INFO 18-03 08:01:43,299 - Add hourly lock prefix - Finished processing (I=0, O=0, R=1, W=1, U=0, E=0)
INFO 18-03 08:01:43,306 - Create hourly lock name - Finished processing (I=0, O=0, R=1, W=1, U=0, E=0)
INFO 18-03 08:01:43,307 - Mapping output specification - Finished processing (I=0, O=0, R=1, W=1, U=0, E=0)
INFO 18-03 08:01:43,308 - Get lock name - Finished processing (I=0, O=0, R=1, W=1, U=0, E=0)
INFO 18-03 08:01:43,309 - Create lock states - Finished processing (I=0, O=0, R=1, W=1, U=0, E=0)
INFO 18-03 08:01:43,313 - Register lock - Finished processing (I=1, O=0, R=1, W=1, U=0, E=0)
INFO 18-03 08:01:43,316 - Get free lock - Finished processing (I=1, O=0, R=1, W=1, U=0, E=0)
INFO 18-03 08:01:43,317 - Lock is free - Finished processing (I=0, O=0, R=1, W=1, U=0, E=0)
INFO 18-03 08:01:43,318 - Get Now - Finished processing (I=0, O=0, R=1, W=1, U=0, E=0)
INFO 18-03 08:01:43,322 - Seize lock - Finished processing (I=1, O=0, R=1, W=1, U=1, E=0)
INFO 18-03 08:01:43,332 - hourly - Starting entry [Set jobs sequence file name]
INFO 18-03 08:01:43,333 - hourly - Starting entry [Execute hourly sequence]
INFO 18-03 08:01:43,368 - Sequence executioner - Starting entry [get jobs]
INFO 18-03 08:01:43,371 - get jobs - Loading transformation from XML file [file:///opt/kaltura/dwh/etlsource/execute/get_jobs.ktr]
INFO 18-03 08:01:43,420 - get_jobs - Dispatching started for transformation [get_jobs]
INFO 18-03 08:01:43,439 - get etls - Opening file: /opt/kaltura/dwh/etlsource/execute/hourly_etl_execution_sequence.txt
INFO 18-03 08:01:43,442 - get etls - Finished processing (I=1, O=0, R=0, W=1, U=1, E=0)
INFO 18-03 08:01:43,444 - Filter # - Finished processing (I=0, O=0, R=1, W=1, U=0, E=0)
INFO 18-03 08:01:43,446 - Copy rows to result - Finished processing (I=0, O=0, R=1, W=1, U=0, E=0)
INFO 18-03 08:01:43,448 - Sequence executioner - Starting entry [execute kettle-job file]
INFO 18-03 08:01:43,478 - execute kjb file - Starting entry [set kjb and log path into vars]
INFO 18-03 08:01:43,480 - set kjb and log path into vars - Loading transformation from XML file [file:///opt/kaltura/dwh/etlsource/execute/set_file_var.ktr]
INFO 18-03 08:01:43,540 - set_file_var - Dispatching started for transformation [set_file_var]
INFO 18-03 08:01:43,547 - Get rows from result - Finished processing (I=0, O=0, R=1, W=1, U=0, E=0)
INFO 18-03 08:01:43,549 - jobName from fileName - Optimization level not specified. Using default of 9.
INFO 18-03 08:01:43,585 - Set Variables - Setting environment variables…
INFO 18-03 08:01:43,586 - Set Variables - Set variable fileVar to value [events/events.kjb]
INFO 18-03 08:01:43,586 - Set Variables - Set variable jobVar to value [events_events]
INFO 18-03 08:01:43,589 - jobName from fileName - Finished processing (I=0, O=0, R=1, W=1, U=0, E=0)
INFO 18-03 08:01:43,591 - Set Variables - Finished after 1 rows.
INFO 18-03 08:01:43,591 - Set Variables - Finished processing (I=0, O=0, R=1, W=1, U=0, E=0)
INFO 18-03 08:01:43,596 - execute kjb file - Starting entry [Write To Log - start job]
INFO 18-03 08:01:43,600 - Status - Executing events/events.kjb
Etl base path: /opt/kaltura/dwh/etlsource/
Writing to log [/opt/kaltura/dwh/logs/log_events_events]

INFO 18-03 08:01:43,602 - execute kjb file - Starting entry [Run kjbVar job]
ERROR 18-03 08:01:43,791 - KalturaLogs - Error disconnecting from database:

Error comitting connection
Streaming result set com.mysql.jdbc.RowDataDynamic@bb64c6bc is still active. No statements may be issued when any streaming result sets are open and in use on a given connection. Ensure that you have called .close() on any active streaming result sets before attempting more queries.

ERROR 18-03 08:01:43,792 - KalturaLogs - org.pentaho.di.core.exception.KettleDatabaseException:
Error comitting connection
Streaming result set com.mysql.jdbc.RowDataDynamic@bb64c6bc is still active. No statements may be issued when any streaming result sets are open and in use on a given connection. Ensure that you have called .close() on any active streaming result sets before attempting more queries.

at org.pentaho.di.core.database.Database.commit(Database.java:711)
at org.pentaho.di.core.database.Database.commit(Database.java:680)
at org.pentaho.di.core.database.Database.disconnect(Database.java:571)
at org.pentaho.di.job.Job.beginProcessing(Job.java:876)
at org.pentaho.di.job.entries.job.JobEntryJob.execute(JobEntryJob.java:786)
at org.pentaho.di.job.Job.execute(Job.java:503)
at org.pentaho.di.job.Job.execute(Job.java:642)
at org.pentaho.di.job.Job.execute(Job.java:642)
at org.pentaho.di.job.Job.execute(Job.java:642)
at org.pentaho.di.job.Job.execute(Job.java:420)
at org.pentaho.di.job.entries.job.JobEntryJobRunner.run(JobEntryJobRunner.java:63)
at java.lang.Thread.run(Thread.java:809)

Caused by: java.sql.SQLException: Streaming result set com.mysql.jdbc.RowDataDynamic@bb64c6bc is still active. No statements may be issued when any streaming result sets are open and in use on a given connection. Ensure that you have called .close() on any active streaming result sets before attempting more queries.
at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:880)
at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:876)
at com.mysql.jdbc.MysqlIO.checkForOutstandingStreamingData(MysqlIO.java:3099)
at com.mysql.jdbc.MysqlIO.sendCommand(MysqlIO.java:2354)
at com.mysql.jdbc.MysqlIO.sqlQueryDirect(MysqlIO.java:2582)
at com.mysql.jdbc.ConnectionImpl.execSQL(ConnectionImpl.java:2531)
at com.mysql.jdbc.ConnectionImpl.commit(ConnectionImpl.java:1612)
at org.pentaho.di.core.database.Database.commit(Database.java:700)
… 11 more

ERROR 18-03 08:01:43,793 - Run kjbVar job - Error running job entry ‘job’ :
ERROR 18-03 08:01:43,795 - Run kjbVar job - org.pentaho.di.core.exception.KettleJobException:
Unable to begin processing by logging start in logtable etl_log

Couldn’t execute SQL: DELETE FROM etl_log WHERE ID_JOB= -1

Streaming result set com.mysql.jdbc.RowDataDynamic@bb64c6bc is still active. No statements may be issued when any streaming result sets are open and in use on a given connection. Ensure that you have called .close() on any active streaming result sets before attempting more queries.

at org.pentaho.di.job.Job.beginProcessing(Job.java:874)
at org.pentaho.di.job.entries.job.JobEntryJob.execute(JobEntryJob.java:786)
at org.pentaho.di.job.Job.execute(Job.java:503)
at org.pentaho.di.job.Job.execute(Job.java:642)
at org.pentaho.di.job.Job.execute(Job.java:642)
at org.pentaho.di.job.Job.execute(Job.java:642)
at org.pentaho.di.job.Job.execute(Job.java:420)
at org.pentaho.di.job.entries.job.JobEntryJobRunner.run(JobEntryJobRunner.java:63)
at java.lang.Thread.run(Thread.java:809)

Caused by: org.pentaho.di.core.exception.KettleDatabaseException:
Couldn’t execute SQL: DELETE FROM etl_log WHERE ID_JOB= -1

Streaming result set com.mysql.jdbc.RowDataDynamic@bb64c6bc is still active. No statements may be issued when any streaming result sets are open and in use on a given connection. Ensure that you have called .close() on any active streaming result sets before attempting more queries.

at org.pentaho.di.core.database.Database.execStatement(Database.java:1650)
at org.pentaho.di.core.database.Database.execStatement(Database.java:1595)
at org.pentaho.di.core.database.BaseDatabaseMeta.getNextBatchIdUsingLockTables(BaseDatabaseMeta.java:1738)
at org.pentaho.di.core.database.BaseDatabaseMeta.getNextBatchId(BaseDatabaseMeta.java:1763)
at org.pentaho.di.core.database.DatabaseMeta.getNextBatchId(DatabaseMeta.java:2600)
at org.pentaho.di.job.Job.beginProcessing(Job.java:805)
... 8 more

Caused by: java.sql.SQLException: Streaming result set com.mysql.jdbc.RowDataDynamic@bb64c6bc is still active. No statements may be issued when any streaming result sets are open and in use on a given connection. Ensure that you have called .close() on any active streaming result sets before attempting more queries.
at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:880)
at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:876)
at com.mysql.jdbc.MysqlIO.checkForOutstandingStreamingData(MysqlIO.java:3099)
at com.mysql.jdbc.MysqlIO.sendCommand(MysqlIO.java:2354)
at com.mysql.jdbc.MysqlIO.sqlQueryDirect(MysqlIO.java:2582)
at com.mysql.jdbc.ConnectionImpl.execSQL(ConnectionImpl.java:2535)
at com.mysql.jdbc.PreparedStatement.executeInternal(PreparedStatement.java:1911)
at com.mysql.jdbc.PreparedStatement.execute(PreparedStatement.java:1203)
at com.mysql.jdbc.StatementImpl.createResultSetUsingServerFetch(StatementImpl.java:666)
at com.mysql.jdbc.StatementImpl.execute(StatementImpl.java:810)
at com.mysql.jdbc.StatementImpl.execute(StatementImpl.java:742)
at org.pentaho.di.core.database.Database.execStatement(Database.java:1618)
… 13 more

INFO 18-03 08:01:43,802 - execute kjb file - Starting entry [Write To Log on error]
INFO 18-03 08:01:43,804 - Status - Failed events/events.kjb

INFO 18-03 08:01:43,807 - execute kjb file - Starting entry [Abort job 1]
ERROR 18-03 08:01:43,809 - Abort job 1 - Aborting job.
INFO 18-03 08:01:43,810 - execute kjb file - Finished job entry [Abort job 1] (result=[false])
INFO 18-03 08:01:43,810 - execute kjb file - Finished job entry [Write To Log on error] (result=[false])
INFO 18-03 08:01:43,811 - execute kjb file - Finished job entry [Run kjbVar job] (result=[false])
INFO 18-03 08:01:43,811 - execute kjb file - Finished job entry [Write To Log - start job] (result=[false])
INFO 18-03 08:01:43,811 - execute kjb file - Finished job entry [set kjb and log path into vars] (result=[false])
INFO 18-03 08:01:43,831 - Sequence executioner - Finished job entry [execute kettle-job file] (result=[false])
INFO 18-03 08:01:43,832 - Sequence executioner - Finished job entry [get jobs] (result=[false])
INFO 18-03 08:01:43,844 - hourly - Starting entry [Release hourly etl server lock ]
INFO 18-03 08:01:43,846 - Release hourly etl server lock - Loading transformation from XML file [/opt/kaltura/dwh/etlsource//common/release_lock_by_name.ktr]
INFO 18-03 08:01:43,886 - release_lock_by_name - Dispatching started for transformation [release_lock_by_name]
INFO 18-03 08:01:43,959 - get_etl_server_hourly_lock_name - Dispatching started for transformation [get_etl_server_hourly_lock_name]
INFO 18-03 08:01:44,012 - print_hostname_to_stream - Dispatching started for transformation [print_hostname_to_stream]
INFO 18-03 08:01:44,031 - Generate Row - Finished processing (I=0, O=0, R=0, W=1, U=0, E=0)
INFO 18-03 08:01:44,033 - Set hostname command - Optimization level not specified. Using default of 9.
INFO 18-03 08:01:44,034 - Mapping input specification - Finished processing (I=0, O=0, R=1, W=1, U=0, E=0)
INFO 18-03 08:01:44,044 - Mapping input specification - Finished processing (I=0, O=0, R=1, W=1, U=0, E=0)
INFO 18-03 08:01:44,066 - Set hostname command - Finished processing (I=0, O=0, R=1, W=1, U=0, E=0)
INFO 18-03 08:01:44,072 - Get hostname - Finished processing (I=0, O=0, R=1, W=1, U=0, E=0)
INFO 18-03 08:01:44,074 - Select hostname - Finished processing (I=0, O=0, R=1, W=1, U=0, E=0)
INFO 18-03 08:01:44,076 - Mapping output specification - Finished processing (I=0, O=0, R=1, W=1, U=0, E=0)
INFO 18-03 08:01:44,076 - Get hostname - Finished processing (I=0, O=0, R=1, W=1, U=0, E=0)
INFO 18-03 08:01:44,077 - Add hourly lock prefix - Finished processing (I=0, O=0, R=1, W=1, U=0, E=0)
INFO 18-03 08:01:44,079 - Create hourly lock name - Finished processing (I=0, O=0, R=1, W=1, U=0, E=0)
INFO 18-03 08:01:44,081 - Mapping output specification - Finished processing (I=0, O=0, R=1, W=1, U=0, E=0)
INFO 18-03 08:01:44,082 - Get lock name - Finished processing (I=0, O=0, R=1, W=1, U=0, E=0)
INFO 18-03 08:01:44,083 - Set lock free state - Finished processing (I=0, O=0, R=1, W=1, U=0, E=0)
INFO 18-03 08:01:44,085 - Get Now - Finished processing (I=0, O=0, R=1, W=1, U=0, E=0)
INFO 18-03 08:01:44,089 - Free lock - Finished processing (I=1, O=0, R=1, W=1, U=1, E=0)
INFO 18-03 08:01:44,094 - hourly - Starting entry [Success 1]
INFO 18-03 08:01:44,095 - hourly - Finished job entry [Success 1] (result=[true])
INFO 18-03 08:01:44,095 - hourly - Finished job entry [Release hourly etl server lock ] (result=[true])
INFO 18-03 08:01:44,096 - hourly - Finished job entry [Execute hourly sequence] (result=[true])
INFO 18-03 08:01:44,096 - hourly - Finished job entry [Set jobs sequence file name] (result=[true])
INFO 18-03 08:01:44,096 - hourly - Finished job entry [Seize hourly etl server lock] (result=[true])
INFO 18-03 08:01:44,096 - hourly - Finished job entry [Register etl_server] (result=[true])
INFO 18-03 08:01:44,096 - hourly - Job execution finished
INFO 18-03 08:01:44,097 - Kitchen - Finished!
INFO 18-03 08:01:44,098 - Kitchen - Start=2016/03/18 08:01:41.854, Stop=2016/03/18 08:01:44.097
INFO 18-03 08:01:44,098 - Kitchen - Processing ended after 2 seconds.

Thanks