Analytics - Strange Problem


Actually, etl_update_dims can’t connect to the DB.

ERROR 13-11 21:00:13,435 - Update Dimensions - A serious error occurred during job execution: org.pentaho.di.core.exception.KettleDatabaseException: Error occured while trying to connect to the database

Error connecting to database: (using class
You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near ‘???’ at line 1

  1. I would consider dropping the entire kalturadw DB and re-running the entire DDL script.
  2. Are you using OpenJDK or the official Oracle Java?


I have installed the official Java environment as per Orly’s recommendation at the head of this thread.


I have the same problem. Have you solved it?


I followed the instructions on this page:
and all the posts in this thread.
I have records in the table kalturadw.dwh_dim_entries
and no errors in etl_update_dims-YYYYMMDD-HH.log,
but in etl_daily-YYYYMMDD.log
I get this error:

INFO 23-10 00:59:09,604 - Table input - Finished processing (I=10, O=0, R=0, W=10, U=0, E=0)
ERROR 23-10 00:59:09,668 - Execute SQL script - Unexpected error
ERROR 23-10 00:59:09,668 - Execute SQL script - org.pentaho.di.core.exception.KettleStepException:
Error while running this step!

Couldn’t execute SQL: Call kalturadw.calc_entries_sizes(20131001)

Table has no partition for value 20131001

at org.pentaho.di.trans.steps.sql.ExecSQL.processRow(

Caused by: org.pentaho.di.core.exception.KettleDatabaseException:
Couldn’t execute SQL: Call kalturadw.calc_entries_sizes(20131001)

Table has no partition for value 20131001

at org.pentaho.di.core.database.Database.execStatement(
at org.pentaho.di.core.database.Database.execStatement(
at org.pentaho.di.core.database.Database.execStatements(
at org.pentaho.di.trans.steps.sql.ExecSQL.processRow(
... 2 more

Caused by: java.sql.SQLException: Table has no partition for value 20131001
at com.mysql.jdbc.SQLError.createSQLException(
at com.mysql.jdbc.MysqlIO.checkErrorPacket(
at com.mysql.jdbc.MysqlIO.checkErrorPacket(
at com.mysql.jdbc.MysqlIO.sendCommand(
at com.mysql.jdbc.MysqlIO.sqlQueryDirect(
at com.mysql.jdbc.ConnectionImpl.execSQL(
at com.mysql.jdbc.ConnectionImpl.execSQL(
at com.mysql.jdbc.StatementImpl.execute(
at com.mysql.jdbc.StatementImpl.execute(
at org.pentaho.di.core.database.Database.execStatement(
... 5 more

INFO 23-10 00:59:09,672 - Execute SQL script - Finished reading query, closing connection.
ERROR 23-10 00:59:09,672 - calc_storage_usage - Errors detected!
ERROR 23-10 00:59:09,673 - calc_storage_usage - Errors detected!
INFO 23-10 00:59:09,674 - Execute SQL script - Finished processing (I=0, O=0, R=1, W=0, U=0, E=1)
INFO 23-10 00:59:09,675 - calc_storage_usage - calc_storage_usage
INFO 23-10 00:59:09,675 - calc_storage_usage - calc_storage_usage
INFO 23-10 00:59:09,677 - calc_storage_usage - Finished job entry [Calc storage usage] (result=[false])
INFO 23-10 00:59:09,677 - calc_storage_usage - Finished job entry [Add Storage Usage Missing Days] (result=[false])
INFO 23-10 00:59:09,717 - execute kjb file - Starting entry [Write To Log on error]
INFO 23-10 00:59:09,719 - Status - Failed storage_usage/calc_storage_usage.kjb

I’m running the Kaltura CE 6.2.0 dev packager version on CentOS 6.4.
Any idea how to fix the problem?
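
The error above means MySQL could not find a partition covering the date value 20131001 in whatever table calc_entries_sizes writes to. A hedged sketch of how one might inspect which date ranges are actually covered — the schema name kalturadw comes from the log, INFORMATION_SCHEMA.PARTITIONS is standard MySQL, and the mysql invocation is commented out since it needs your live server:

```shell
# Build a query listing all partitioned tables in the kalturadw schema
# and the value ranges their partitions cover. Compare the output against
# the failing date (20131001) to see which table is missing a partition.
query="SELECT TABLE_NAME, PARTITION_NAME, PARTITION_DESCRIPTION
FROM INFORMATION_SCHEMA.PARTITIONS
WHERE TABLE_SCHEMA = 'kalturadw'
  AND PARTITION_NAME IS NOT NULL;"
echo "$query"
# mysql -uroot -p -e "$query"   # run this against your live server
```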


I have the same problem. Have you solved it?


Hey, actually, I’ve got the same problem… have you come across any solution?


Thanks, my site works now.


I actually just found a fix to this problem. It has also been plaguing me for months and I’d like to share my solution here.

The first step is to run, from the terminal:

mysql_upgrade -uroot -p

Enter your password, and any issues with your Kaltura tables should be fixed. Please see this page for reference on mysql_upgrade:

That should do it. To test it out run:


It may take a while if calc_storage_usage hasn’t run for some time. Mine took two minutes for six months of data. Now all the storage usage stats are working in analytics!

Good luck!
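
One way to sanity-check a fix like this, sketched under the assumption that the procedure name and date format from the error log earlier in the thread still match your install, is to call the failing procedure by hand for a single day:

```shell
# Re-run the procedure that failed in the log, for one day (YYYYMMDD).
# kalturadw.calc_entries_sizes and the date format are taken from the
# error log above; the mysql line needs your live server, so it is commented.
day=20131001
sql="CALL kalturadw.calc_entries_sizes(${day});"
echo "$sql"
# mysql -uroot -p -e "$sql"
```

If the partition (or mysql_upgrade) problem is really gone, the call should complete without the "Table has no partition" error.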


What I did to solve it was upgrade Kaltura to the latest version, which comes with an upgrade of Pentaho.

And that did it.


I did upgrade to the latest version and the problem still existed, so I’m unsure why it didn’t get fixed, but all good!

I do have a question for you though, if you can help… Plays and Views statistics are not being updated at all (Top Content, Content Drop-off and Content Interactions). I’m trying to find the exact bug, but it looks as though dwh isn’t successfully updating these stats in the database to begin with… there’s nothing in kaltlog at all…

Do you have any idea, @aquileasfx1?


Hello, I was able to solve it in one instance on my virtual machine,
but not in a different Kaltura instance on another machine.

I have the same problem as you; Total Entries also shows 0 for me.


Hi @aquileasfx1. Turns out that after waiting for logrotate and the daily crons (overnight), the Plays and Views updated! Old Plays and Views stats aren’t showing, but the new ones from the day I fixed the above issue are logging perfectly now.

I’ve got a handle on what needs to be where for these stats to show up, having spent a lot of time analysing how the cron jobs work as well as picking apart the errors in all the Kaltura logs to hunt down any bugs. My kaltlog is now perfectly free of errors (apart from hacker attempts). I’m really happy; it’s the first time in years that I’ve had a perfectly working copy of Kaltura CE. I’m almost nervous to upgrade to Jupiter when it’s ready, and will take a snapshot of my VM prior to testing it out.


That’s great to hear!

The same thing happens to me!

I can see plays now, from this point on, but not from older days.
I will also try to find a way to recover plays from past days
and share with you if I find anything.


Hi @aquileasfx1, I looked into updating older plays, but it’s difficult because the logs have already been rotated. I tried to uncompress the old rotated logs, rename them, and put them back in the original folder, but they weren’t processed, only rotated out again. It doesn’t worry me; at least all the stats are working from this point on. I haven’t had any issues for 22 days now.



If you want to reprocess older days just go to /opt/kaltura/web/logs and rename the files.
For instance, rename web01-kaltura_apache_access.log-20190312-03.gz to web01-kaltura_apache_access.log-20190312-04.gz

That should do the trick.

Hope this helps,
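
The rename above can be scripted. A minimal sketch, assuming the rotated logs follow the web01-kaltura_apache_access.log-YYYYMMDD-NN.gz pattern shown in the example (only the echo runs here; uncomment the mv to actually rename on your server):

```shell
# Bump the trailing sequence number of a rotated access log so the DWH
# import treats it as a new file. Filename pattern taken from the example above.
src="web01-kaltura_apache_access.log-20190312-03.gz"
seq="${src##*-}"                          # "03.gz"
num="${seq%.gz}"                          # "03"
next=$(printf '%02d' $((10#$num + 1)))    # "04" (10# forces base-10, avoids octal)
dst="${src%-*}-${next}.gz"
echo "$dst"
# mv "/opt/kaltura/web/logs/$src" "/opt/kaltura/web/logs/$dst"
```

The `10#` prefix in the arithmetic matters: without it, a leading zero makes the shell parse `08` and `09` as invalid octal numbers.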