Analytics - Strange Problem

Yeppa! Analytics works now, thanks!
But analytics only counts the videos viewed with the Kaltura player, not the videos viewed through the API.
I'm using another player to view the content and fetching it via the API (e.g. http://video.xxxxx.xxx/p/106/sp/0/playManifest/entryId/0_kg16uygk/format/url/flavorParamId/32/video.mp4), but those views don't appear in the statistics.
Is that expected?
How can I get analytics in that case?
Regards
Sam

You will have to do it yourself using the API:

  1. Look at the stats service and its collect action (a rough sketch follows this list).
  2. Look at Kaltura's statistics plugin to see what it does: https://github.com/kaltura/kdp/blob/master/plugins/statisticsPlugin/src/com/kaltura/kdpfl/plugin/component/StatisticsMediator.as
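
Below is a minimal sketch of such a collect call, assuming a plain HTTP GET against the api_v3 endpoint; the host, partner id and entry id are the placeholders from this thread, the eventType value and field names follow KalturaStatsEvent, and the exact set of fields your player should send is best taken from StatisticsMediator.as linked above.

```python
# Sketch only: report a "play" event for an entry watched outside the
# Kaltura player, so it is counted by analytics. Field names follow the
# KalturaStatsEvent object; the host and ids below are placeholders.
import time
import requests

BASE_URL = "http://video.xxxxx.xxx/api_v3/"   # your Kaltura server

params = {
    "service": "stats",
    "action": "collect",
    "event:objectType": "KalturaStatsEvent",
    "event:eventType": 3,                      # 3 = PLAY in KalturaStatsEventType
    "event:eventTimestamp": int(time.time()),
    "event:partnerId": 106,
    "event:entryId": "0_kg16uygk",
    "event:sessionId": "custom-player-session-1",   # any unique per-view id
    "event:clientVer": "custom-player-1.0",
}

resp = requests.get(BASE_URL, params=params)
print(resp.status_code, resp.text)
```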

I have had some issues with the MySQL driver supplied by OpenJDK, and so proceeded to install the official Java package as suggested by Orly.

Connecting to the DB is now successful, but two ETL jobs report errors.

etl_daily:

ERROR 13-11 16:06:11,737 - Execute SQL script - Unexpected error
ERROR 13-11 16:06:11,737 - Execute SQL script - org.pentaho.di.core.exception.KettleStepException:
Error while running this step!

Couldn’t execute SQL: Call kalturadw.calc_entries_sizes(20131106)

Table has no partition for value 20131106

at org.pentaho.di.trans.steps.sql.ExecSQL.processRow(ExecSQL.java:208)
at org.pentaho.di.trans.step.RunThread.run(RunThread.java:40)
at java.lang.Thread.run(Unknown Source)

Caused by: org.pentaho.di.core.exception.KettleDatabaseException:
Couldn’t execute SQL: Call kalturadw.calc_entries_sizes(20131106)

Table has no partition for value 20131106

at org.pentaho.di.core.database.Database.execStatement(Database.java:1650)
at org.pentaho.di.core.database.Database.execStatement(Database.java:1595)
at org.pentaho.di.core.database.Database.execStatements(Database.java:1800)
at org.pentaho.di.trans.steps.sql.ExecSQL.processRow(ExecSQL.java:182)
... 2 more

Caused by: java.sql.SQLException: Table has no partition for value 20131106
at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:1073)
at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3597)
at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3529)
at com.mysql.jdbc.MysqlIO.sendCommand(MysqlIO.java:1990)
at com.mysql.jdbc.MysqlIO.sqlQueryDirect(MysqlIO.java:2151)
at com.mysql.jdbc.ConnectionImpl.execSQL(ConnectionImpl.java:2619)
at com.mysql.jdbc.ConnectionImpl.execSQL(ConnectionImpl.java:2569)
at com.mysql.jdbc.StatementImpl.execute(StatementImpl.java:824)
at com.mysql.jdbc.StatementImpl.execute(StatementImpl.java:667)
at org.pentaho.di.core.database.Database.execStatement(Database.java:1618)
… 5 more
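
For anyone hitting the same "Table has no partition" error, a quick way to see which partitions MySQL actually has for the kalturadw tables is to query information_schema. A minimal sketch, assuming Python with pymysql and the default kalturadw ETL credentials (adjust host/user/password for your install):

```python
# Sketch only: list the partitions MySQL knows about for the kalturadw
# tables, to confirm whether a partition covering a date id such as
# 20131106 exists. Credentials below are assumptions; adjust as needed.
import pymysql

conn = pymysql.connect(host="localhost", user="etl", password="etl", database="kalturadw")
try:
    with conn.cursor() as cur:
        cur.execute(
            """
            SELECT TABLE_NAME, PARTITION_NAME, PARTITION_DESCRIPTION
            FROM information_schema.PARTITIONS
            WHERE TABLE_SCHEMA = 'kalturadw'
              AND PARTITION_NAME IS NOT NULL
            ORDER BY TABLE_NAME, PARTITION_ORDINAL_POSITION
            """
        )
        for table, partition, high_value in cur.fetchall():
            # For RANGE-partitioned tables, PARTITION_DESCRIPTION is the
            # upper bound of the partition (e.g. a date id like 20131201).
            print(f"{table}\t{partition}\t{high_value}")
finally:
    conn.close()
```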

etl_perform_retention_policy:

ERROR 13-11 16:06:38,576 - Move old partitions to archive - An error occurred executing this job entry :
Couldn’t execute SQL: CALL move_innodb_to_archive()

Cannot remove all partitions, use DROP TABLE instead

The hourly and update_dimensions ETL jobs appear to be working as intended.
The analytics dashboard remains conspicuously empty, likely due to the errors I am now reporting.

Look at my comment from November 3rd.

Actually, etl_update_dims can’t connect to the DB.

ERROR 13-11 21:00:13,435 - Update Dimensions - A serious error occurred during job execution: org.pentaho.di.core.exception.KettleDatabaseException: Error occured while trying to connect to the database

Error connecting to database: (using class org.gjt.mm.mysql.Driver)
You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near ‘???’ at line 1

  1. I would consider dropping the entire kalturadw DB and re-running the entire DDL script (a rough sketch follows this list).
  2. Are you using OpenJDK or the official Java package?
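
A minimal sketch of suggestion 1, assuming the dwh DDL .sql files live under /opt/kaltura/dwh/ddl (that path is a guess; check your own install, and back up the database first):

```python
# Sketch only: drop kalturadw and replay the dwh DDL scripts.
# DDL_DIR is an assumed path; verify where your install keeps the DDL files.
import glob
import subprocess

DDL_DIR = "/opt/kaltura/dwh/ddl"   # assumed location of the dwh DDL .sql files
MYSQL = ["mysql", "-uroot", "-p"]  # prompts for the MySQL root password each time

subprocess.run(MYSQL + ["-e", "DROP DATABASE IF EXISTS kalturadw"], check=True)
for script in sorted(glob.glob(f"{DDL_DIR}/**/*.sql", recursive=True)):
    print(f"applying {script}")
    with open(script) as f:
        subprocess.run(MYSQL, stdin=f, check=True)
```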

I have installed the official Java environment as per Orly’s recommendation at the head of this thread.

I have the same problem. Have you solved it?

Hi,
I followed the instructions on this page:
https://github.com/kaltura/ce-packager/issues/35
and all the posts in this thread.
I have the records in the table kalturadw.dwh_dim_entries and no errors in the file etl_update_dims-YYYYMMDD-HH.log, but in etl_daily-YYYYMMDD.log I get this error:

INFO 23-10 00:59:09,604 - Table input - Finished processing (I=10, O=0, R=0, W=10, U=0, E=0)
ERROR 23-10 00:59:09,668 - Execute SQL script - Unexpected error
ERROR 23-10 00:59:09,668 - Execute SQL script - org.pentaho.di.core.exception.KettleStepException:
Error while running this step!

Couldn’t execute SQL: Call kalturadw.calc_entries_sizes(20131001)

Table has no partition for value 20131001

at org.pentaho.di.trans.steps.sql.ExecSQL.processRow(ExecSQL.java:208)
at org.pentaho.di.trans.step.RunThread.run(RunThread.java:40)
at java.lang.Thread.run(Thread.java:679)

Caused by: org.pentaho.di.core.exception.KettleDatabaseException:
Couldn’t execute SQL: Call kalturadw.calc_entries_sizes(20131001)

Table has no partition for value 20131001

at org.pentaho.di.core.database.Database.execStatement(Database.java:1650)
at org.pentaho.di.core.database.Database.execStatement(Database.java:1595)
at org.pentaho.di.core.database.Database.execStatements(Database.java:1800)
at org.pentaho.di.trans.steps.sql.ExecSQL.processRow(ExecSQL.java:182)
... 2 more

Caused by: java.sql.SQLException: Table has no partition for value 20131001
at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:1073)
at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3597)
at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3529)
at com.mysql.jdbc.MysqlIO.sendCommand(MysqlIO.java:1990)
at com.mysql.jdbc.MysqlIO.sqlQueryDirect(MysqlIO.java:2151)
at com.mysql.jdbc.ConnectionImpl.execSQL(ConnectionImpl.java:2619)
at com.mysql.jdbc.ConnectionImpl.execSQL(ConnectionImpl.java:2569)
at com.mysql.jdbc.StatementImpl.execute(StatementImpl.java:824)
at com.mysql.jdbc.StatementImpl.execute(StatementImpl.java:667)
at org.pentaho.di.core.database.Database.execStatement(Database.java:1618)
… 5 more

INFO 23-10 00:59:09,672 - Execute SQL script - Finished reading query, closing connection.
ERROR 23-10 00:59:09,672 - calc_storage_usage - Errors detected!
ERROR 23-10 00:59:09,673 - calc_storage_usage - Errors detected!
INFO 23-10 00:59:09,674 - Execute SQL script - Finished processing (I=0, O=0, R=1, W=0, U=0, E=1)
INFO 23-10 00:59:09,675 - calc_storage_usage - calc_storage_usage
INFO 23-10 00:59:09,675 - calc_storage_usage - calc_storage_usage
INFO 23-10 00:59:09,677 - calc_storage_usage - Finished job entry [Calc storage usage] (result=[false])
INFO 23-10 00:59:09,677 - calc_storage_usage - Finished job entry [Add Storage Usage Missing Days] (result=[false])
INFO 23-10 00:59:09,717 - execute kjb file - Starting entry [Write To Log on error]
INFO 23-10 00:59:09,719 - Status - Failed storage_usage/calc_storage_usage.kjb

I have Kaltura CE 6.2.0 (dev packager version) on CentOS 6.4.
Any idea how to fix the problem?
Regards
Sam

I have the same problem. Have you solved it?

Hey, actually, I've got the same problem… have you come across any solution?

Thanks, my site works now.

I actually just found a fix for this problem. It has been plaguing me for months too, and I'd like to share my solution here.

The first step is to run, from the terminal:

mysql_upgrade -uroot -p

Enter your password and any issues with your Kaltura tables should be fixed. Please see this page for reference on mysql_upgrade: http://dev.mysql.com/doc/refman/5.5/en/mysql-upgrade.html

That should do it. To test it out, run:

/opt/kaltura/dwh/etlsource/execute/etl_daily.sh

It will take some time if calc_storage_usage hasn't run successfully for a while. Mine took 2 minutes for 6 months of data. Now all the storage usage stats are working in analytics!

Good luck!

What I did to solve it was to upgrade Kaltura to the latest version, which comes with an upgrade of Pentaho, and that did it.

I did upgrade to the latest version and the problem still existed, so I'm unsure why it didn't get fixed, but all good!

I do have a question for you, though, if you can help… Plays and Views statistics are not being updated at all (Top Content, Content Drop-off and Content Interactions). I'm trying to find the exact bug, but it looks as though the DWH isn't writing these stats into the database in the first place… nothing in kaltlog at all…

Do you have any idea, @aquileasfx1?

Hello, I was able to solve it in one instance on my virtual machine,
but not in a different Kaltura instance on another machine.

I have the same problem as you; for total entries I also get 0.

Hi @aquileasfx1. It turns out that after waiting for logrotate and the daily crons (overnight), the Plays and Views updated! Old Plays and Views stats aren't showing, but the new ones from the day I fixed the above issue are logging perfectly now. I've got a handle on what needs to be where for these stats to show up, and have spent a lot of time analysing how the cron jobs work as well as picking apart any errors in all the Kaltura logs to hunt down any bugs. My kaltlog is now perfectly free of errors (apart from hacker attempts). I'm really happy; it's the first time in years that I have a perfectly working copy of Kaltura CE. I'm almost nervous to upgrade to Jupiter when it is ready, and will take a snapshot of my VM prior to testing it out.

That's really great to hear!

The same thing happens to me!

I can see plays now from this point on, but not from older days.
I will also try to find a way to recover plays from past days, and will share it with you if I find anything.

Hi @aquileasfx1, I looked into updating older plays, but the reason it is difficult is that the logs have already been rotated. I tried uncompressing old rotated logs, renaming them and putting them back in the original folder, but they weren't processed, only rotated out again. It doesn't worry me; at least all the stats are working from this point on. I haven't had any issues for 22 days now.

Hello,

If you want to reprocess older days, just go to /opt/kaltura/web/logs and rename the files.
For instance, rename web01-kaltura_apache_access.log-20190312-03.gz to web01-kaltura_apache_access.log-20190312-04.gz.
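
A small sketch of doing that rename in bulk, assuming the filenames follow the pattern in the example above (adjust the directory and pattern to your own rotated logs before running):

```python
# Sketch only: bump the trailing sequence number on already-rotated access
# logs so the DWH cycle treats them as new files and processes them again.
# The directory and filename pattern are taken from the example above.
import os
import re
import glob

LOG_DIR = "/opt/kaltura/web/logs"
pattern = re.compile(r"^(?P<stem>.+-\d{8})-(?P<seq>\d{2})\.gz$")

for path in glob.glob(os.path.join(LOG_DIR, "*kaltura_apache_access.log-*.gz")):
    name = os.path.basename(path)
    m = pattern.match(name)
    if not m:
        continue
    # e.g. web01-kaltura_apache_access.log-20190312-03.gz -> ...-04.gz
    new_name = f"{m.group('stem')}-{int(m.group('seq')) + 1:02d}.gz"
    new_path = os.path.join(LOG_DIR, new_name)
    if not os.path.exists(new_path):   # don't clobber an existing file
        print(f"renaming {name} -> {new_name}")
        os.rename(path, new_path)
```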

That should do the trick.

Hope this helps,

David