Analytics not working on version 10.16

Analytics has actually not been working since I was on 10.4. I would like to stay on 10.16 for now, so I hope you will bear with me. I have read https://github.com/kaltura/platform-install-packages/blob/Kajam-11.0.0/doc/kaltura-packages-faq.md#analytics-issues and followed all the steps up to dropping the DBs. I did try dropping them, but kaltura-dwh-config.sh then always errors out and causes a lot of problems, so I just rolled the Virtual Machine back (we use SCVMM here).

When I do follow those steps, /opt/kaltura/dwh/logs/log_events_events.log shows this:
2016/08/12 10:06:20 - Run kjbVar job - ERROR (version 4.2.1-stable, build 15952 from 2011-10-25 15.27.10 by buildguy) : Streaming result set com.mysql.jdbc.RowDataDynamic@527ddc43 is still active. No statements may be issued when any streaming result sets are open and in use on a given connection. Ensure that you have called .close() on any active streaming result sets before attempting more queries.
2016/08/12 10:06:20 - Run kjbVar job - ERROR (version 4.2.1-stable, build 15952 from 2011-10-25 15.27.10 by buildguy) :
2016/08/12 10:06:20 - Run kjbVar job - ERROR (version 4.2.1-stable, build 15952 from 2011-10-25 15.27.10 by buildguy) :
2016/08/12 10:06:20 - Run kjbVar job - ERROR (version 4.2.1-stable, build 15952 from 2011-10-25 15.27.10 by buildguy) : at org.pentaho.di.job.Job.beginProcessing(Job.java:874)
2016/08/12 10:06:20 - Run kjbVar job - ERROR (version 4.2.1-stable, build 15952 from 2011-10-25 15.27.10 by buildguy) : at org.pentaho.di.job.entries.job.JobEntryJob.execute(JobEntryJob.java:786)
2016/08/12 10:06:20 - Run kjbVar job - ERROR (version 4.2.1-stable, build 15952 from 2011-10-25 15.27.10 by buildguy) : at org.pentaho.di.job.Job.execute(Job.java:503)
2016/08/12 10:06:20 - Run kjbVar job - ERROR (version 4.2.1-stable, build 15952 from 2011-10-25 15.27.10 by buildguy) : at org.pentaho.di.job.Job.execute(Job.java:642)
2016/08/12 10:06:20 - Run kjbVar job - ERROR (version 4.2.1-stable, build 15952 from 2011-10-25 15.27.10 by buildguy) : at org.pentaho.di.job.Job.execute(Job.java:642)
2016/08/12 10:06:20 - Run kjbVar job - ERROR (version 4.2.1-stable, build 15952 from 2011-10-25 15.27.10 by buildguy) : at org.pentaho.di.job.Job.execute(Job.java:642)
2016/08/12 10:06:20 - Run kjbVar job - ERROR (version 4.2.1-stable, build 15952 from 2011-10-25 15.27.10 by buildguy) : at org.pentaho.di.job.Job.execute(Job.java:420)
2016/08/12 10:06:20 - Run kjbVar job - ERROR (version 4.2.1-stable, build 15952 from 2011-10-25 15.27.10 by buildguy) : at org.pentaho.di.job.entries.job.JobEntryJobRunner.run(JobEntryJobRunner.java:63)
2016/08/12 10:06:20 - Run kjbVar job - ERROR (version 4.2.1-stable, build 15952 from 2011-10-25 15.27.10 by buildguy) : at java.lang.Thread.run(Thread.java:745)
2016/08/12 10:06:20 - Run kjbVar job - ERROR (version 4.2.1-stable, build 15952 from 2011-10-25 15.27.10 by buildguy) : Caused by: org.pentaho.di.core.exception.KettleDatabaseException:
2016/08/12 10:06:20 - Run kjbVar job - ERROR (version 4.2.1-stable, build 15952 from 2011-10-25 15.27.10 by buildguy) : Couldn't execute SQL: DELETE FROM etl_log WHERE ID_JOB= -1
2016/08/12 10:06:20 - Run kjbVar job - ERROR (version 4.2.1-stable, build 15952 from 2011-10-25 15.27.10 by buildguy) :
2016/08/12 10:06:20 - Run kjbVar job - ERROR (version 4.2.1-stable, build 15952 from 2011-10-25 15.27.10 by buildguy) : Streaming result set com.mysql.jdbc.RowDataDynamic@527ddc43 is still active. No statements may be issued when any streaming result sets are open and in use on a given connection. Ensure that you have called .close() on any active streaming result sets before attempting more queries.
2016/08/12 10:06:20 - Run kjbVar job - ERROR (version 4.2.1-stable, build 15952 from 2011-10-25 15.27.10 by buildguy) :
2016/08/12 10:06:20 - Run kjbVar job - ERROR (version 4.2.1-stable, build 15952 from 2011-10-25 15.27.10 by buildguy) : at org.pentaho.di.core.database.Database.execStatement(Database.java:1650)
2016/08/12 10:06:20 - Run kjbVar job - ERROR (version 4.2.1-stable, build 15952 from 2011-10-25 15.27.10 by buildguy) : at org.pentaho.di.core.database.Database.execStatement(Database.java:1595)
2016/08/12 10:06:20 - Run kjbVar job - ERROR (version 4.2.1-stable, build 15952 from 2011-10-25 15.27.10 by buildguy) : at org.pentaho.di.core.database.BaseDatabaseMeta.getNextBatchIdUsingLockTables(BaseDatabaseMeta.java:1738)
2016/08/12 10:06:20 - Run kjbVar job - ERROR (version 4.2.1-stable, build 15952 from 2011-10-25 15.27.10 by buildguy) : at org.pentaho.di.core.database.BaseDatabaseMeta.getNextBatchId(BaseDatabaseMeta.java:1763)
2016/08/12 10:06:20 - Run kjbVar job - ERROR (version 4.2.1-stable, build 15952 from 2011-10-25 15.27.10 by buildguy) : at org.pentaho.di.core.database.DatabaseMeta.getNextBatchId(DatabaseMeta.java:2600)
2016/08/12 10:06:20 - Run kjbVar job - ERROR (version 4.2.1-stable, build 15952 from 2011-10-25 15.27.10 by buildguy) : at org.pentaho.di.job.Job.beginProcessing(Job.java:805)
2016/08/12 10:06:20 - Run kjbVar job - ERROR (version 4.2.1-stable, build 15952 from 2011-10-25 15.27.10 by buildguy) : ... 8 more
2016/08/12 10:06:20 - Run kjbVar job - ERROR (version 4.2.1-stable, build 15952 from 2011-10-25 15.27.10 by buildguy) : Caused by: java.sql.SQLException: Streaming result set com.mysql.jdbc.RowDataDynamic@527ddc43 is still active. No statements may be issued when any streaming result sets are open and in use on a given connection. Ensure that you have called .close() on any active streaming result sets before attempting more queries.
2016/08/12 10:06:20 - Run kjbVar job - ERROR (version 4.2.1-stable, build 15952 from 2011-10-25 15.27.10 by buildguy) : at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:880)
2016/08/12 10:06:20 - Run kjbVar job - ERROR (version 4.2.1-stable, build 15952 from 2011-10-25 15.27.10 by buildguy) : at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:876)
2016/08/12 10:06:20 - Run kjbVar job - ERROR (version 4.2.1-stable, build 15952 from 2011-10-25 15.27.10 by buildguy) : at com.mysql.jdbc.MysqlIO.checkForOutstandingStreamingData(MysqlIO.java:3099)
2016/08/12 10:06:20 - Run kjbVar job - ERROR (version 4.2.1-stable, build 15952 from 2011-10-25 15.27.10 by buildguy) : at com.mysql.jdbc.MysqlIO.sendCommand(MysqlIO.java:2354)
2016/08/12 10:06:20 - Run kjbVar job - ERROR (version 4.2.1-stable, build 15952 from 2011-10-25 15.27.10 by buildguy) : at com.mysql.jdbc.MysqlIO.sqlQueryDirect(MysqlIO.java:2582)
2016/08/12 10:06:20 - Run kjbVar job - ERROR (version 4.2.1-stable, build 15952 from 2011-10-25 15.27.10 by buildguy) : at com.mysql.jdbc.ConnectionImpl.execSQL(ConnectionImpl.java:2535)
2016/08/12 10:06:20 - Run kjbVar job - ERROR (version 4.2.1-stable, build 15952 from 2011-10-25 15.27.10 by buildguy) : at com.mysql.jdbc.PreparedStatement.executeInternal(PreparedStatement.java:1911)
2016/08/12 10:06:20 - Run kjbVar job - ERROR (version 4.2.1-stable, build 15952 from 2011-10-25 15.27.10 by buildguy) : at com.mysql.jdbc.PreparedStatement.execute(PreparedStatement.java:1203)
2016/08/12 10:06:20 - Run kjbVar job - ERROR (version 4.2.1-stable, build 15952 from 2011-10-25 15.27.10 by buildguy) : at com.mysql.jdbc.StatementImpl.createResultSetUsingServerFetch(StatementImpl.java:666)
2016/08/12 10:06:20 - Run kjbVar job - ERROR (version 4.2.1-stable, build 15952 from 2011-10-25 15.27.10 by buildguy) : at com.mysql.jdbc.StatementImpl.execute(StatementImpl.java:810)
2016/08/12 10:06:20 - Run kjbVar job - ERROR (version 4.2.1-stable, build 15952 from 2011-10-25 15.27.10 by buildguy) : at com.mysql.jdbc.StatementImpl.execute(StatementImpl.java:742)
2016/08/12 10:06:20 - Run kjbVar job - ERROR (version 4.2.1-stable, build 15952 from 2011-10-25 15.27.10 by buildguy) : at org.pentaho.di.core.database.Database.execStatement(Database.java:1618)
2016/08/12 10:06:20 - Run kjbVar job - ERROR (version 4.2.1-stable, build 15952 from 2011-10-25 15.27.10 by buildguy) : ... 13 more

The etl_hourly log shows:
/opt/kaltura/pentaho/pdi/kitchen.sh: line 11: cd: /root: Permission denied
INFO 12-08 10:06:10,613 - Kitchen - Start of run.
INFO 12-08 10:06:10,956 - hourly - Start of job execution
INFO 12-08 10:06:10,965 - hourly - Starting entry [Register etl_server]
INFO 12-08 10:06:10,969 - Register etl_server - Loading transformation from XML file [/opt/kaltura/dwh/etlsource//common/register_etl_server.ktr]
INFO 12-08 10:06:11,211 - register_etl_server - Dispatching started for transformation [register_etl_server]
INFO 12-08 10:06:11,458 - print_hostname_to_stream - Dispatching started for transformation [print_hostname_to_stream]
INFO 12-08 10:06:12,406 - Generate Row - Finished processing (I=0, O=0, R=0, W=1, U=0, E=0)
INFO 12-08 10:06:12,416 - Mapping input specification - Finished processing (I=0, O=0, R=1, W=1, U=0, E=0)
INFO 12-08 10:06:12,588 - Set hostname command - Optimization level not specified. Using default of 9.
INFO 12-08 10:06:12,896 - Set hostname command - Finished processing (I=0, O=0, R=1, W=1, U=0, E=0)
INFO 12-08 10:06:12,914 - Get hostname - Finished processing (I=0, O=0, R=1, W=1, U=0, E=0)
INFO 12-08 10:06:12,918 - Select hostname - Finished processing (I=0, O=0, R=1, W=1, U=0, E=0)
INFO 12-08 10:06:12,924 - Mapping output specification - Finished processing (I=0, O=0, R=1, W=1, U=0, E=0)
INFO 12-08 10:06:12,925 - Get hostname - Finished processing (I=0, O=0, R=1, W=1, U=0, E=0)
INFO 12-08 10:06:13,029 - Copy rows to result - Finished processing (I=0, O=0, R=1, W=1, U=0, E=0)
INFO 12-08 10:06:13,029 - Register server - Finished processing (I=1, O=0, R=1, W=1, U=0, E=0)
INFO 12-08 10:06:13,035 - hourly - Starting entry [Seize hourly etl server lock]
INFO 12-08 10:06:13,036 - Seize hourly etl server lock - Loading transformation from XML file [/opt/kaltura/dwh/etlsource//common/seize_lock_by_name.ktr]
INFO 12-08 10:06:13,171 - seize_lock_by_name - Dispatching started for transformation [seize_lock_by_name]
INFO 12-08 10:06:13,307 - get_etl_server_hourly_lock_name - Dispatching started for transformation [get_etl_server_hourly_lock_name]
INFO 12-08 10:06:13,398 - print_hostname_to_stream - Dispatching started for transformation [print_hostname_to_stream]
INFO 12-08 10:06:13,415 - Generate Row - Finished processing (I=0, O=0, R=0, W=1, U=0, E=0)
INFO 12-08 10:06:13,426 - Mapping input specification - Finished processing (I=0, O=0, R=1, W=1, U=0, E=0)
INFO 12-08 10:06:13,428 - Set hostname command - Optimization level not specified. Using default of 9.
INFO 12-08 10:06:13,435 - Mapping input specification - Finished processing (I=0, O=0, R=1, W=1, U=0, E=0)
INFO 12-08 10:06:13,452 - Set hostname command - Finished processing (I=0, O=0, R=1, W=1, U=0, E=0)
INFO 12-08 10:06:13,459 - Get hostname - Finished processing (I=0, O=0, R=1, W=1, U=0, E=0)
INFO 12-08 10:06:13,464 - Select hostname - Finished processing (I=0, O=0, R=1, W=1, U=0, E=0)
INFO 12-08 10:06:13,471 - Mapping output specification - Finished processing (I=0, O=0, R=1, W=1, U=0, E=0)
INFO 12-08 10:06:13,475 - Get hostname - Finished processing (I=0, O=0, R=1, W=1, U=0, E=0)
INFO 12-08 10:06:13,478 - Add hourly lock prefix - Finished processing (I=0, O=0, R=1, W=1, U=0, E=0)
INFO 12-08 10:06:13,483 - Create hourly lock name - Finished processing (I=0, O=0, R=1, W=1, U=0, E=0)
INFO 12-08 10:06:13,486 - Mapping output specification - Finished processing (I=0, O=0, R=1, W=1, U=0, E=0)
INFO 12-08 10:06:13,487 - Get lock name - Finished processing (I=0, O=0, R=1, W=1, U=0, E=0)
INFO 12-08 10:06:13,492 - Create lock states - Finished processing (I=0, O=0, R=1, W=1, U=0, E=0)
INFO 12-08 10:06:13,510 - Register lock - Finished processing (I=1, O=0, R=1, W=1, U=0, E=0)
INFO 12-08 10:06:13,518 - Get free lock - Finished processing (I=1, O=0, R=1, W=1, U=0, E=0)
INFO 12-08 10:06:13,523 - Lock is free - Finished processing (I=0, O=0, R=1, W=1, U=0, E=0)
INFO 12-08 10:06:13,526 - Get Now - Finished processing (I=0, O=0, R=1, W=1, U=0, E=0)
INFO 12-08 10:06:13,533 - Seize lock - Finished processing (I=1, O=0, R=1, W=1, U=1, E=0)
INFO 12-08 10:06:13,541 - hourly - Starting entry [Set jobs sequence file name]
INFO 12-08 10:06:13,544 - hourly - Starting entry [Execute hourly sequence]
INFO 12-08 10:06:13,583 - Sequence executioner - Starting entry [get jobs]
INFO 12-08 10:06:13,584 - get jobs - Loading transformation from XML file [file:///opt/kaltura/dwh/etlsource/execute/get_jobs.ktr]
INFO 12-08 10:06:13,653 - get_jobs - Dispatching started for transformation [get_jobs]
INFO 12-08 10:06:13,681 - get etls - Opening file: /opt/kaltura/dwh/etlsource/execute/hourly_etl_execution_sequence.txt
INFO 12-08 10:06:13,694 - get etls - Finished processing (I=1, O=0, R=0, W=1, U=1, E=0)
INFO 12-08 10:06:13,703 - Filter # - Finished processing (I=0, O=0, R=1, W=1, U=0, E=0)
INFO 12-08 10:06:13,707 - Copy rows to result - Finished processing (I=0, O=0, R=1, W=1, U=0, E=0)
INFO 12-08 10:06:13,710 - Sequence executioner - Starting entry [execute kettle-job file]
INFO 12-08 10:06:13,771 - execute kjb file - Starting entry [set kjb and log path into vars]
INFO 12-08 10:06:13,772 - set kjb and log path into vars - Loading transformation from XML file [file:///opt/kaltura/dwh/etlsource/execute/set_file_var.ktr]
INFO 12-08 10:06:13,803 - set_file_var - Dispatching started for transformation [set_file_var]
INFO 12-08 10:06:13,813 - Get rows from result - Finished processing (I=0, O=0, R=1, W=1, U=0, E=0)
INFO 12-08 10:06:13,816 - jobName from fileName - Optimization level not specified. Using default of 9.
INFO 12-08 10:06:13,838 - Set Variables - Setting environment variables…
INFO 12-08 10:06:13,838 - Set Variables - Set variable fileVar to value [events/events.kjb]
INFO 12-08 10:06:13,838 - Set Variables - Set variable jobVar to value [events_events]
INFO 12-08 10:06:13,843 - jobName from fileName - Finished processing (I=0, O=0, R=1, W=1, U=0, E=0)
INFO 12-08 10:06:13,848 - Set Variables - Finished after 1 rows.
INFO 12-08 10:06:13,849 - Set Variables - Finished processing (I=0, O=0, R=1, W=1, U=0, E=0)
INFO 12-08 10:06:13,855 - execute kjb file - Starting entry [Write To Log - start job]
INFO 12-08 10:06:13,858 - Status - Executing events/events.kjb
Etl base path: /opt/kaltura/dwh/etlsource/
Writing to log [/opt/kaltura/dwh/logs/log_events_events]

INFO  12-08 10:06:13,860 - execute kjb file - Starting entry [Run kjbVar job]
ERROR 12-08 10:06:20,398 - KalturaLogs - Error disconnecting from database:

Error comitting connection
Streaming result set com.mysql.jdbc.RowDataDynamic@527ddc43 is still active. No statements may be issued when any streaming result sets are open and in use on a given connection. Ensure that you have called .close() on any active streaming result sets before attempting more queries.

ERROR 12-08 10:06:20,400 - KalturaLogs - org.pentaho.di.core.exception.KettleDatabaseException:
Error comitting connection
Streaming result set com.mysql.jdbc.RowDataDynamic@527ddc43 is still active. No statements may be issued when any streaming result sets are open and in use on a given connection. Ensure that you have called .close() on any active streaming result sets before attempting more queries.

        at org.pentaho.di.core.database.Database.commit(Database.java:711)
        at org.pentaho.di.core.database.Database.commit(Database.java:680)
        at org.pentaho.di.core.database.Database.disconnect(Database.java:571)
        at org.pentaho.di.job.Job.beginProcessing(Job.java:876)
        at org.pentaho.di.job.entries.job.JobEntryJob.execute(JobEntryJob.java:786)
        at org.pentaho.di.job.Job.execute(Job.java:503)
        at org.pentaho.di.job.Job.execute(Job.java:642)
        at org.pentaho.di.job.Job.execute(Job.java:642)
        at org.pentaho.di.job.Job.execute(Job.java:642)
        at org.pentaho.di.job.Job.execute(Job.java:420)
        at org.pentaho.di.job.entries.job.JobEntryJobRunner.run(JobEntryJobRunner.java:63)
        at java.lang.Thread.run(Thread.java:745)
Caused by: java.sql.SQLException: Streaming result set com.mysql.jdbc.RowDataDynamic@527ddc43 is still active. No statements may be issued when any streaming result sets are open and in use on a given connection. Ensure that you have called .close() on any active streaming result sets before attempting more queries.
        at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:880)
        at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:876)
        at com.mysql.jdbc.MysqlIO.checkForOutstandingStreamingData(MysqlIO.java:3099)
        at com.mysql.jdbc.MysqlIO.sendCommand(MysqlIO.java:2354)
        at com.mysql.jdbc.MysqlIO.sqlQueryDirect(MysqlIO.java:2582)
        at com.mysql.jdbc.ConnectionImpl.execSQL(ConnectionImpl.java:2531)
        at com.mysql.jdbc.ConnectionImpl.commit(ConnectionImpl.java:1612)
        at org.pentaho.di.core.database.Database.commit(Database.java:700)
        ... 11 more

ERROR 12-08 10:06:20,400 - Run kjbVar job - Error running job entry 'job' :
ERROR 12-08 10:06:20,401 - Run kjbVar job - org.pentaho.di.core.exception.KettleJobException:
Unable to begin processing by logging start in logtable etl_log

Couldn't execute SQL: DELETE FROM etl_log WHERE ID_JOB= -1

Streaming result set com.mysql.jdbc.RowDataDynamic@527ddc43 is still active. No statements may be issued when any streaming result sets are open and in use on a given connection. Ensure that you have called .close() on any active streaming result sets before attempting more queries.


        at org.pentaho.di.job.Job.beginProcessing(Job.java:874)
        at org.pentaho.di.job.entries.job.JobEntryJob.execute(JobEntryJob.java:786)
        at org.pentaho.di.job.Job.execute(Job.java:503)
        at org.pentaho.di.job.Job.execute(Job.java:642)
        at org.pentaho.di.job.Job.execute(Job.java:642)
        at org.pentaho.di.job.Job.execute(Job.java:642)
        at org.pentaho.di.job.Job.execute(Job.java:420)
        at org.pentaho.di.job.entries.job.JobEntryJobRunner.run(JobEntryJobRunner.java:63)
        at java.lang.Thread.run(Thread.java:745)
Caused by: org.pentaho.di.core.exception.KettleDatabaseException:
Couldn't execute SQL: DELETE FROM etl_log WHERE ID_JOB= -1

Streaming result set com.mysql.jdbc.RowDataDynamic@527ddc43 is still active. No statements may be issued when any streaming result sets are open and in use on a given connection. Ensure that you have called .close() on any active streaming result sets before attempting more queries.

        at org.pentaho.di.core.database.Database.execStatement(Database.java:1650)
        at org.pentaho.di.core.database.Database.execStatement(Database.java:1595)
        at org.pentaho.di.core.database.BaseDatabaseMeta.getNextBatchIdUsingLockTables(BaseDatabaseMeta.java:1738)
        at org.pentaho.di.core.database.BaseDatabaseMeta.getNextBatchId(BaseDatabaseMeta.java:1763)
        at org.pentaho.di.core.database.DatabaseMeta.getNextBatchId(DatabaseMeta.java:2600)
        at org.pentaho.di.job.Job.beginProcessing(Job.java:805)
        ... 8 more
Caused by: java.sql.SQLException: Streaming result set com.mysql.jdbc.RowDataDynamic@527ddc43 is still active. No statements may be issued when any streaming result sets are open and in use on a given connection. Ensure that you have called .close() on any active streaming result sets before attempting more queries.
        at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:880)
        at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:876)
        at com.mysql.jdbc.MysqlIO.checkForOutstandingStreamingData(MysqlIO.java:3099)
        at com.mysql.jdbc.MysqlIO.sendCommand(MysqlIO.java:2354)
        at com.mysql.jdbc.MysqlIO.sqlQueryDirect(MysqlIO.java:2582)
        at com.mysql.jdbc.ConnectionImpl.execSQL(ConnectionImpl.java:2535)
        at com.mysql.jdbc.PreparedStatement.executeInternal(PreparedStatement.java:1911)
        at com.mysql.jdbc.PreparedStatement.execute(PreparedStatement.java:1203)
        at com.mysql.jdbc.StatementImpl.createResultSetUsingServerFetch(StatementImpl.java:666)
        at com.mysql.jdbc.StatementImpl.execute(StatementImpl.java:810)
        at com.mysql.jdbc.StatementImpl.execute(StatementImpl.java:742)
        at org.pentaho.di.core.database.Database.execStatement(Database.java:1618)
        ... 13 more

INFO  12-08 10:06:20,405 - execute kjb file - Starting entry [Write To Log on error]
INFO  12-08 10:06:20,407 - Status - Failed events/events.kjb

INFO  12-08 10:06:20,408 - execute kjb file - Starting entry [Abort job 1]
ERROR 12-08 10:06:20,410 - Abort job 1 - Aborting job.
INFO  12-08 10:06:20,411 - execute kjb file - Finished job entry [Abort job 1] (result=[false])
INFO  12-08 10:06:20,412 - execute kjb file - Finished job entry [Write To Log on error] (result=[false])
INFO  12-08 10:06:20,412 - execute kjb file - Finished job entry [Run kjbVar job] (result=[false])
INFO  12-08 10:06:20,412 - execute kjb file - Finished job entry [Write To Log - start job] (result=[false])
INFO  12-08 10:06:20,412 - execute kjb file - Finished job entry [set kjb and log path into vars] (result=[false])
INFO  12-08 10:06:20,422 - Sequence executioner - Finished job entry [execute kettle-job file] (result=[false])
INFO  12-08 10:06:20,423 - Sequence executioner - Finished job entry [get jobs] (result=[false])
INFO  12-08 10:06:20,434 - hourly - Starting entry [Release hourly etl server lock ]
INFO  12-08 10:06:20,435 - Release hourly etl server lock  - Loading transformation from XML file [/opt/kaltura/dwh/etlsource//common/release_lock_by_name.ktr]
INFO  12-08 10:06:20,467 - release_lock_by_name - Dispatching started for transformation [release_lock_by_name]
INFO  12-08 10:06:20,500 - get_etl_server_hourly_lock_name - Dispatching started for transformation [get_etl_server_hourly_lock_name]
INFO  12-08 10:06:20,535 - print_hostname_to_stream - Dispatching started for transformation [print_hostname_to_stream]
INFO  12-08 10:06:20,546 - Generate Row - Finished processing (I=0, O=0, R=0, W=1, U=0, E=0)
INFO  12-08 10:06:20,556 - Set hostname command - Optimization level not specified.  Using default of 9.
INFO  12-08 10:06:20,556 - Mapping input specification - Finished processing (I=0, O=0, R=1, W=1, U=0, E=0)
INFO  12-08 10:06:20,565 - Mapping input specification - Finished processing (I=0, O=0, R=1, W=1, U=0, E=0)
INFO  12-08 10:06:20,584 - Set hostname command - Finished processing (I=0, O=0, R=1, W=1, U=0, E=0)
INFO  12-08 10:06:20,593 - Get hostname - Finished processing (I=0, O=0, R=1, W=1, U=0, E=0)
INFO  12-08 10:06:20,597 - Select hostname - Finished processing (I=0, O=0, R=1, W=1, U=0, E=0)
INFO  12-08 10:06:20,601 - Mapping output specification - Finished processing (I=0, O=0, R=1, W=1, U=0, E=0)
INFO  12-08 10:06:20,602 - Get hostname - Finished processing (I=0, O=0, R=1, W=1, U=0, E=0)
INFO  12-08 10:06:20,604 - Add hourly lock prefix - Finished processing (I=0, O=0, R=1, W=1, U=0, E=0)
INFO  12-08 10:06:20,609 - Create hourly lock name - Finished processing (I=0, O=0, R=1, W=1, U=0, E=0)
INFO  12-08 10:06:20,615 - Mapping output specification - Finished processing (I=0, O=0, R=1, W=1, U=0, E=0)
INFO  12-08 10:06:20,616 - Get lock name - Finished processing (I=0, O=0, R=1, W=1, U=0, E=0)
INFO  12-08 10:06:20,618 - Set lock free state - Finished processing (I=0, O=0, R=1, W=1, U=0, E=0)
INFO  12-08 10:06:20,624 - Get Now - Finished processing (I=0, O=0, R=1, W=1, U=0, E=0)
INFO  12-08 10:06:20,628 - Free lock - Finished processing (I=1, O=0, R=1, W=1, U=1, E=0)
INFO  12-08 10:06:20,633 - hourly - Starting entry [Success 1]
INFO  12-08 10:06:20,635 - hourly - Finished job entry [Success 1] (result=[true])
INFO  12-08 10:06:20,635 - hourly - Finished job entry [Release hourly etl server lock ] (result=[true])
INFO  12-08 10:06:20,635 - hourly - Finished job entry [Execute hourly sequence] (result=[true])
INFO  12-08 10:06:20,636 - hourly - Finished job entry [Set jobs sequence file name] (result=[true])
INFO  12-08 10:06:20,636 - hourly - Finished job entry [Seize hourly etl server lock] (result=[true])
INFO  12-08 10:06:20,636 - hourly - Finished job entry [Register etl_server] (result=[true])
INFO  12-08 10:06:20,636 - hourly - Job execution finished
INFO  12-08 10:06:20,639 - Kitchen - Finished!
INFO  12-08 10:06:20,640 - Kitchen - Start=2016/08/12 10:06:10.615, Stop=2016/08/12 10:06:20.639
INFO  12-08 10:06:20,640 - Kitchen - Processing ended after 10 seconds.

As always, thank you for any help you can provide.

Any thoughts on this issue?

Hi @rpelletier,

Can you try removing the two lines in /opt/kaltura/dwh/etlsource/events/events.kjb that reference:
KalturaLogs
etl_log

and rerun?
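Before editing by hand, it may help to back the file up and print the matching line numbers first; a small sketch (same path as above):

```shell
# Back up events.kjb, then list the line numbers that mention the
# KalturaLogs connection or the etl_log log table, so you know exactly
# which lines to remove before rerunning.
KJB=/opt/kaltura/dwh/etlsource/events/events.kjb
cp "$KJB" "$KJB.bak"
grep -n -e 'KalturaLogs' -e 'etl_log' "$KJB"
```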

Yesterday I decided the only way to move forward was to push through and upgrade Kaltura to the latest version. It took a while to hunt down and fix several errors along the way, but I was finally able to get Kaltura upgraded to 11.21. I had to completely drop the dwh databases per this article: https://github.com/kaltura/platform-install-packages/blob/Kajam-11.21.0/doc/kaltura-packages-faq.md#analytics-issues

When I tried to drop the tables I kept getting "ERROR 1051 (42S02): Unknown table" for many of them. The only workaround I found was to delete the files directly with rm -Rf /var/lib/mysql/kalturadw/*
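In hindsight, a cleaner route than deleting files under /var/lib/mysql would probably have been to drop and recreate the whole schema, since IF EXISTS sidesteps the "Unknown table" error (a sketch; the credentials are placeholders for your own):

```shell
# Drop and recreate the kalturadw schema in one statement; IF EXISTS
# makes the drop idempotent, so a half-dropped schema does not error out
# the way per-table DROP TABLE did.
SQL='DROP DATABASE IF EXISTS kalturadw; CREATE DATABASE kalturadw;'
mysql -uroot -p -e "$SQL"
```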

I also kept getting quite a few PHP fatal errors. Based on similar problems others have faced, I put together this script, and it worked:

#!/bin/bash
# Clear stale config-generator locks
rm -Rf /opt/kaltura/app/base-config-generator.lock
rm -Rf /opt/kaltura/app/base-config.lock
# Stop the services before regenerating
service httpd stop
service memcached stop
service kaltura-monit stop
# Regenerate the configuration and reinstall the plugins
php /opt/kaltura/app/generator/generate.php
php /opt/kaltura/app/deployment/base/scripts/installPlugins.php
# Bring everything back up
service kaltura-monit start
service httpd start
service memcached start

I then had to delete my cache because I was now getting permission errors:
> find /opt/kaltura/app/cache/ -type f -exec rm {} \;

My Kaltura ENV is now 11.21, and I no longer get any errors in my DWH logs. However, the only statistic that works is Top Contributors; nothing else updates. I have played several videos (embedded on a separate site) and the stats do not change. Should I open another thread?

Hi,

Glad to hear you upgraded; that is better in any case.
As for Analytics, remember it does not update in real time. If you wish to force a run, call:
# /opt/kaltura/bin/kaltura-run-dwh.sh

I neglected to mention that I have already run that script several times, with no difference in the results.

Here is the result of > grep ERROR /opt/kaltura/dwh/logs/*

/opt/kaltura/dwh/logs/etl_hourly-20160818-10.log:ERROR 18-08 10:08:33,198 - Copy to originals - Because of an error, this step can't continue: Could not copy "file:///opt/kaltura/web/logs/emlnx-media1.emcc.edu-kaltura_apache_access_ssl.log-20160818-10.gz" to "file:///opt/kaltura/dwh/cycles/originals/3/emlnx-media1.emcc.edu-kaltura_apache_access_ssl.log-20160818-10.gz".
/opt/kaltura/dwh/logs/etl_hourly-20160818-10.log:ERROR 18-08 10:08:33,284 - get files for registered cycles - Errors detected!
/opt/kaltura/dwh/logs/etl_hourly-20160818-10.log:ERROR 18-08 10:08:33,285 - get files for registered cycles - Errors detected!
/opt/kaltura/dwh/logs/etl_hourly-20160818-10.log:ERROR 18-08 10:08:33,589 - Abort job 1 - Aborting job.
/opt/kaltura/dwh/logs/log_events_events.log:2016/08/18 10:08:33 - Copy to originals.0 - ERROR (version 4.2.1-stable, build 15952 from 2011-10-25 15.27.10 by buildguy) : Because of an error, this step can't continue: Could not copy "file:///opt/kaltura/web/logs/emlnx-media1.emcc.edu-kaltura_apache_access_ssl.log-20160818-10.gz" to "file:///opt/kaltura/dwh/cycles/originals/3/emlnx-media1.emcc.edu-kaltura_apache_access_ssl.log-20160818-10.gz".
/opt/kaltura/dwh/logs/log_events_events.log:2016/08/18 10:08:33 - get files for registered cycles - ERROR (version 4.2.1-stable, build 15952 from 2011-10-25 15.27.10 by buildguy) : Errors detected!
/opt/kaltura/dwh/logs/log_events_events.log:2016/08/18 10:08:33 - get files for registered cycles - ERROR (version 4.2.1-stable, build 15952 from 2011-10-25 15.27.10 by buildguy) : Errors detected!

Looks like a permissions issue to me.
Who owns /opt/kaltura/dwh/cycles/originals/3?

It should be owned by the 'kaltura' user.

Nope, it is owned by root:root. What is the proper user:group, kaltura:apache or kaltura:kaltura?

kaltura:kaltura should be fine.
Apache shouldn't write to it.
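A one-liner to fix it and then verify, assuming the standard layout:

```shell
# Recursively hand the cycles tree back to the kaltura user/group,
# then confirm the new owner on the directory the ETL failed to write to.
chown -R kaltura:kaltura /opt/kaltura/dwh/cycles
ls -ld /opt/kaltura/dwh/cycles/originals/3
```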

Is there a script that checks the permissions on all Kaltura folders? This seems to be a very common issue. Considering I have never copied any of the folders or changed the permissions myself, it is odd that they are incorrect.

In the particular case of /opt/kaltura/dwh, the ownership is set in the RPM spec.
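For context, file ownership in an RPM is declared in the spec's %files section; an illustrative fragment (not the actual kaltura-dwh spec) would look like:

```
%files
%defattr(-,kaltura,kaltura,-)
/opt/kaltura/dwh
```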

Works fine in all the ENVs I have tested on. If you are able to reproduce this issue with a fresh install, do let me know.

I have the same problem with this one too. I tried the solution, but it does not really work, and I don't know what went wrong.

Hello,

This thread discusses several different issues.
What exactly is your current one? Please go over https://github.com/kaltura/platform-install-packages/blob/Kajam-11.21.0/doc/kaltura-packages-faq.md#analytics-issues

Open a new thread and provide log output as well as output for the relevant queries specified in the link above so I can help you further.