Analytics - Strange Problem

Hi,

I have a CE 6 installation up and running.
Analytics is running (logs, Pentaho plugins, …) and there are no errors in the logs.
The dwh_fact_events table has data in it.

But nothing shows up in the graphs.

I’m guessing that the aggregation phase filters out the data, I just don’t understand why.
I’m going to drill down into the queries that do that, but I wonder if someone has an idea that could save me some time.

Thanks,

Ofer

After checking the tables and comparing them to another installation, this is the info I have:

  1. The dwh_dim_entries table contains no entries.
  2. The widget_id is -1 on all records in dwh_fact_events.

Any hints? (Maybe a missing parameter in the player initialization?)

Hi Ofer,

The reason you don’t see data in the reports is that your entries dimension table (kalturadw.dwh_dim_entries) is empty.

  1. Please verify that your DB timezone settings match your PHP timezone settings (php.ini).
  2. Re-sync the dimension tables from the day you installed your server (see the sketch after this list):
    2.1 Update kalturadw_ds.parameters where parameter_name = 'dim_sync_last_update': set date_value to the date you installed your server.
    2.2 Run /opt/kaltura/dwh/etlsource/execute/etl_update_dims.sh
    2.3 Verify that kalturadw.dwh_dim_entries was populated (if not, check /opt/kaltura/dwh/logs/etl_update_dims-YYYYMMDD-HH.log for errors).
  3. Update kalturadw.aggr_managment so that all aggregations run again: UPDATE kalturadw.aggr_managment SET data_insert_time = NOW();
  4. Run /opt/kaltura/dwh/etlsource/execute/etl_daily.sh
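
For reference, the whole sequence boils down to something like this (a minimal sketch, assuming a mysql client configured with suitable credentials; the date is a placeholder you must replace with your actual install day):

```bash
# Step 1: compare the DB timezone with PHP's (they must match)
mysql -e "SELECT @@global.time_zone, @@system_time_zone;"
php -r 'echo date_default_timezone_get(), PHP_EOL;'

# Step 2.1: rewind the dimension-sync marker to your install date (placeholder!)
mysql -e "UPDATE kalturadw_ds.parameters
          SET date_value = '2013-01-01'
          WHERE parameter_name = 'dim_sync_last_update';"

# Step 2.2: re-sync the dimension tables
/opt/kaltura/dwh/etlsource/execute/etl_update_dims.sh

# Step 3: mark all aggregations for a re-run
mysql -e "UPDATE kalturadw.aggr_managment SET data_insert_time = NOW();"

# Step 4: run the daily ETL
/opt/kaltura/dwh/etlsource/execute/etl_daily.sh
```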

Thanks,
Orly.

Worked like a charm.
Any hint as to what may have gone wrong in the first place? (The DB timezone was OK.)

Check whether there are errors in the update_dims logs (/opt/kaltura/dwh/logs/etl_update_dims-YYYYMMDD-HH.log). A quick way to do that is sketched below.
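
For example (a simple grep over the log path above):

```bash
# Surface any ERROR lines across all update_dims logs
grep -i "error" /opt/kaltura/dwh/logs/etl_update_dims-*.log
```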

Hi,

I’m experiencing exactly the same issue, but I have errors in my etl_update_dims-YYYYMMDD-HH.log:

ERROR 15-07 09:23:08,113 - Mapping with Field Input - Unexpected error
ERROR 15-07 09:23:08,113 - Mapping with Field Input - org.pentaho.di.core.exception.KettleException:
java.lang.NoSuchMethodError: org.pentaho.di.core.logging.LogWriter.logDetailed(Ljava/lang/String;Ljava/lang/String;[Ljava/lang/Object;)V
org.pentaho.di.core.logging.LogWriter.logDetailed(Ljava/lang/String;Ljava/lang/String;[Ljava/lang/Object;)V

        at org.kaltura.mapping.MappingFiledRunner.processRow(MappingFiledRunner.java:161)
        at org.pentaho.di.trans.step.RunThread.run(RunThread.java:40)
        at java.lang.Thread.run(Thread.java:662)
Caused by: java.lang.NoSuchMethodError: org.pentaho.di.core.logging.LogWriter.logDetailed(Ljava/lang/String;Ljava/lang/String;[Ljava/lang/Object;)V
        at org.kaltura.mapping.MappingFiledRunnerMeta.loadMappingMeta(MappingFiledRunnerMeta.java:470)
        at org.kaltura.mapping.MappingFiledRunner.processRow(MappingFiledRunner.java:83)
        ... 2 more

I’ve copied the PDI 4.2 plugins under /usr/local/pentaho/pdi/plugins/steps/,

but I have a strange directory structure:

├── DimLookup
│   ├── CMB.png
│   ├── lookup.jar
│   └── plugin.xml
├── DimLookup32
│   ├── DimLookup
│   │   ├── CMB.png
│   │   ├── lookup.jar
│   │   └── plugin.xml
│   ├── resources
│   │   ├── commons-collections-3.1.jar
│   │   └── swt.jar
│   └── src
│       ├── CMB.png
│       ├── org
│       │   └── kaltura
│       │       └── dimensionlookup
│       │           ├── CombinationLookupData.java
│       │           ├── CombinationLookupDialog.java
│       │           ├── CombinationLookup.java
│       │           ├── CombinationLookupMeta.java
│       │           ├── messages
│       │           │   └── messages_en_US.properties
│       │           └── Messages.java
│       └── plugin.xml
└── DimLookup42
    ├── DimLookup
    │   ├── CMB.png
    │   ├── lookup.jar
    │   └── plugin.xml
    ├── resources
    │   ├── commons-collections-3.1.jar
    │   └── swt.jar
    └── src
        ├── CMB.png
        ├── org
        │   └── kaltura
        │       └── dimensionlookup
        │           ├── CombinationLookupData.java
        │           ├── CombinationLookupDialog.java
        │           ├── CombinationLookup.java
        │           ├── CombinationLookupMeta.java
        │           └── messages
        │               └── messages_en_US.properties
        └── plugin.xml

What should it look like?

Thanks in advance & BR,

You shouldn’t have DimLookup42 and DimLookup32 there.
You just need to copy /opt/kaltura/dwh/pentaho-plugins/DimLookup42/DimLookup to /usr/local/pentaho/pdi/plugins/steps/.
The same is true for all other plugins in /opt/kaltura/dwh/pentaho-plugins/.
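
For the DimLookup case that copy looks like this (repeat the same pattern for each of the other plugins; whether they all share the DimLookup42-style layout is an assumption you should verify first):

```bash
# Copy only the PDI 4.2 flavor of the plugin into Pentaho's steps directory
cp -r /opt/kaltura/dwh/pentaho-plugins/DimLookup42/DimLookup \
      /usr/local/pentaho/pdi/plugins/steps/
```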

Thanks Orly,

I’ve modified the folder structure to match the original one, but it seems the location of the plugins is not the problem, because PDI is using them (if I remove the plugins I get more, and different, errors).

I don’t know why, but I’m getting this error when executing the /opt/kaltura/dwh/etlsource/execute/etl_update_dims.sh script. The dimension table is empty, so I have no play statistics.

I’m using jre-1.6.0-openjdk.x86_64 (after switching to it).

I have Kaltura CE 6.2.0, dev packager version.

Everything else is working properly, but I cannot solve this issue. :frowning:

CentOS 6.4
Kaltura CE 6.2.0 dev Kitchen

@orly.lampert

When I attempt to run /opt/kaltura/dwh/etlsource/execute/etl_update_dims.sh and still end up with no entries, I check the update log, and the errors I see there are about lock files:

INFO  17-10 11:46:17,483 - Kitchen - Start of run.
INFO  17-10 11:46:17,639 - Update Dimensions - Start of job execution
INFO  17-10 11:46:18,123 - Update Dimensions - Starting entry [Check if daily_lock is seized]
INFO  17-10 11:46:18,147 - Update Dimensions - Starting entry [Seize update dims lock]
INFO  17-10 11:46:18,148 - Seize update dims lock - Loading transformation from XML file [/opt/kaltura/dwh/etlsource//common/seize_lock_by_name.ktr]
INFO  17-10 11:46:18,287 - seize_lock_by_name - Dispatching started for transformation [seize_lock_by_name]
INFO  17-10 11:46:18,378 - get_update_dims_lock_name - Dispatching started for transformation [get_update_dims_lock_name]
INFO  17-10 11:46:18,389 - Generate Row - Finished processing (I=0, O=0, R=0, W=1, U=0, E=0)
INFO  17-10 11:46:18,392 - Mapping input specification - Finished processing (I=0, O=0, R=1, W=1, U=0, E=0)
INFO  17-10 11:46:18,394 - Set lock name = dims_lock - Finished processing (I=0, O=0, R=1, W=1, U=0, E=0)
INFO  17-10 11:46:18,395 - Mapping output specification - Finished processing (I=0, O=0, R=1, W=1, U=0, E=0)
INFO  17-10 11:46:18,396 - Get lock name - Finished processing (I=0, O=0, R=1, W=1, U=0, E=0)
INFO  17-10 11:46:18,397 - Create lock states - Finished processing (I=0, O=0, R=1, W=1, U=0, E=0)
INFO  17-10 11:46:18,403 - Register lock - Finished processing (I=1, O=0, R=1, W=1, U=0, E=0)
ERROR 17-10 11:46:18,406 - Lock is already seized - Row nr 1 causing abort : [update_dims_lock], [1], [0], [null]
ERROR 17-10 11:46:18,406 - Lock is already seized - Aborting after having seen 1 rows.
ERROR 17-10 11:46:18,407 - seize_lock_by_name - Errors detected!
ERROR 17-10 11:46:18,407 - seize_lock_by_name - Errors detected!
INFO  17-10 11:46:18,407 - Lock is already seized - Finished processing (I=0, O=0, R=1, W=1, U=0, E=1)
INFO  17-10 11:46:18,408 - seize_lock_by_name - seize_lock_by_name
INFO  17-10 11:46:18,408 - Lock is free - Finished processing (I=0, O=0, R=1, W=1, U=0, E=0)
INFO  17-10 11:46:18,408 - seize_lock_by_name - seize_lock_by_name
INFO  17-10 11:46:18,409 - Get free lock - Finished processing (I=0, O=0, R=1, W=1, U=0, E=0)
INFO  17-10 11:46:18,592 - Update Dimensions - Starting entry [Success 1]
INFO  17-10 11:46:18,593 - Update Dimensions - Finished job entry [Success 1] (result=[true])
INFO  17-10 11:46:18,593 - Update Dimensions - Finished job entry [Seize update dims lock] (result=[true])
INFO  17-10 11:46:18,593 - Update Dimensions - Finished job entry [Check if daily_lock is seized] (result=[true])
INFO  17-10 11:46:18,594 - Update Dimensions - Job execution finished
INFO  17-10 11:46:18,655 - Kitchen - Finished!
INFO  17-10 11:46:18,656 - Kitchen - Start=2013/10/17 11:46:17.484, Stop=2013/10/17 11:46:18.656
INFO  17-10 11:46:18,656 - Kitchen - Processing ended after 1 seconds.

Hi,
I followed the instructions on this page:
https://github.com/kaltura/ce-packager/issues/35
and all the posts in this thread.
I have records in the kalturadw.dwh_dim_entries table
and no errors in etl_update_dims-YYYYMMDD-HH.log,
but in etl_daily-YYYYMMDD.log
I have this error:

INFO 23-10 00:59:09,604 - Table input - Finished processing (I=10, O=0, R=0, W=10, U=0, E=0)
ERROR 23-10 00:59:09,668 - Execute SQL script - Unexpected error
ERROR 23-10 00:59:09,668 - Execute SQL script - org.pentaho.di.core.exception.KettleStepException:
Error while running this step!

Couldn’t execute SQL: Call kalturadw.calc_entries_sizes(20131001)

Table has no partition for value 20131001

at org.pentaho.di.trans.steps.sql.ExecSQL.processRow(ExecSQL.java:208)
at org.pentaho.di.trans.step.RunThread.run(RunThread.java:40)
at java.lang.Thread.run(Thread.java:679)

Caused by: org.pentaho.di.core.exception.KettleDatabaseException:
Couldn’t execute SQL: Call kalturadw.calc_entries_sizes(20131001)

Table has no partition for value 20131001

at org.pentaho.di.core.database.Database.execStatement(Database.java:1650)
at org.pentaho.di.core.database.Database.execStatement(Database.java:1595)
at org.pentaho.di.core.database.Database.execStatements(Database.java:1800)
at org.pentaho.di.trans.steps.sql.ExecSQL.processRow(ExecSQL.java:182)
... 2 more

Caused by: java.sql.SQLException: Table has no partition for value 20131001
at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:1073)
at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3597)
at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3529)
at com.mysql.jdbc.MysqlIO.sendCommand(MysqlIO.java:1990)
at com.mysql.jdbc.MysqlIO.sqlQueryDirect(MysqlIO.java:2151)
at com.mysql.jdbc.ConnectionImpl.execSQL(ConnectionImpl.java:2619)
at com.mysql.jdbc.ConnectionImpl.execSQL(ConnectionImpl.java:2569)
at com.mysql.jdbc.StatementImpl.execute(StatementImpl.java:824)
at com.mysql.jdbc.StatementImpl.execute(StatementImpl.java:667)
at org.pentaho.di.core.database.Database.execStatement(Database.java:1618)
... 5 more

INFO 23-10 00:59:09,672 - Execute SQL script - Finished reading query, closing connection.
ERROR 23-10 00:59:09,672 - calc_storage_usage - Errors detected!
ERROR 23-10 00:59:09,673 - calc_storage_usage - Errors detected!
INFO 23-10 00:59:09,674 - Execute SQL script - Finished processing (I=0, O=0, R=1, W=0, U=0, E=1)
INFO 23-10 00:59:09,675 - calc_storage_usage - calc_storage_usage
INFO 23-10 00:59:09,675 - calc_storage_usage - calc_storage_usage
INFO 23-10 00:59:09,677 - calc_storage_usage - Finished job entry [Calc storage usage] (result=[false])
INFO 23-10 00:59:09,677 - calc_storage_usage - Finished job entry [Add Storage Usage Missing Days] (result=[false])
INFO 23-10 00:59:09,717 - execute kjb file - Starting entry [Write To Log on error]
INFO 23-10 00:59:09,719 - Status - Failed storage_usage/calc_storage_usage.kjb

I have Kaltura CE 6.2.0, dev packager version, on CentOS 6.4.
Any ideas on how to fix the problem?
Regards
Sam

Well, I solved the lock errors and the empty entries table issue…

  1. The lock errors were solved by following the advice here: http://www.kaltura.org/stats-stop-working-after-db-error
     I updated the status of the daily_lock record in the locks table to 0; to be more accurate, I had to update all records in the locks table to status 0 (see the sketch after this list).
  2. Then I followed the steps outlined above by @orly.lampert.
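
For reference, the reset amounted to something like this (a hedged sketch: I’m assuming the fully qualified lock table is kalturadw_ds.locks; verify against your schema before running it):

```bash
# Free every seized lock (status 0 = free).
# Table name is an assumption -- check your schema first.
mysql -e "UPDATE kalturadw_ds.locks SET status = 0;"
```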

I now have entries in kalturadw.dwh_dim_entries, but still nothing in KMC > Analytics. When I look in /opt/kaltura/dwh/logs/etl_daily-20131101.log I see:

ERROR 01-11 00:59:09,697 - Execute SQL script - Unexpected error
ERROR 01-11 00:59:09,697 - Execute SQL script - org.pentaho.di.core.exception.KettleStepException:
Error while running this step!

Couldn't execute SQL: Call kalturadw.calc_entries_sizes(20130901)

Table has no partition for value 20130901


at org.pentaho.di.trans.steps.sql.ExecSQL.processRow(ExecSQL.java:208)
at org.pentaho.di.trans.step.RunThread.run(RunThread.java:40)
at java.lang.Thread.run(Thread.java:679)
Caused by: org.pentaho.di.core.exception.KettleDatabaseException: 
Couldn't execute SQL: Call kalturadw.calc_entries_sizes(20130901)

Table has no partition for value 20130901

at org.pentaho.di.core.database.Database.execStatement(Database.java:1650)
at org.pentaho.di.core.database.Database.execStatement(Database.java:1595)
at org.pentaho.di.core.database.Database.execStatements(Database.java:1800)
at org.pentaho.di.trans.steps.sql.ExecSQL.processRow(ExecSQL.java:182)
... 2 more
Caused by: java.sql.SQLException: Table has no partition for value 20130901
at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:1073)
at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3597)
at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3529)
at com.mysql.jdbc.MysqlIO.sendCommand(MysqlIO.java:1990)
at com.mysql.jdbc.MysqlIO.sqlQueryDirect(MysqlIO.java:2151)
at com.mysql.jdbc.ConnectionImpl.execSQL(ConnectionImpl.java:2619)
at com.mysql.jdbc.ConnectionImpl.execSQL(ConnectionImpl.java:2569)
at com.mysql.jdbc.StatementImpl.execute(StatementImpl.java:824)
at com.mysql.jdbc.StatementImpl.execute(StatementImpl.java:667)
at org.pentaho.di.core.database.Database.execStatement(Database.java:1618)
... 5 more


So, similar to @samuelef. Perhaps the Java MySQL connector? Has anyone else dealt with this?

v

My best guess is that when running the installation, the end of the dwh script didn’t run.

So if you look under ce-packager/git-repositories/ce-dwh/setup (in the installer),

you will see a file named dwh_ddl_install.sh.

I believe the last 3 lines were not executed; try running them manually (especially populate_table_partitions.sql).
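
A hedged way to run that script by hand (I’m assuming the .sql file sits next to dwh_ddl_install.sh and that the script selects its own schema; adjust the path and credentials to your install):

```bash
# Re-run the partition population step that dwh_ddl_install.sh should have executed.
# Path and credentials are illustrative assumptions.
mysql -u kaltura -p < ce-packager/git-repositories/ce-dwh/setup/populate_table_partitions.sql
```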

Thanks @oferc , I appreciate your suggestions

populate_table_partitions.sql did seem to do something: it took some time and reported success. populate_time_dim.sql and populate_dwh_dim_ip_ranges.sql both returned duplicate key errors, so I assume they had already been run at some point.

I let it all sit for a day, but still nothing shows in KMC > Analytics. I’m not sure what to try now. There do seem to be correct lines in kaltura_api_v3_analytics.log.

I ran /opt/kaltura/dwh/etlsource/execute/etl_update_dims.sh again, and this time when checking /opt/kaltura/dwh/logs/etl_daily-20131101.log I don’t see any errors. I don’t want to quote the entire thing here, but will if you like. Essentially there is a lot of info and then a lot of “0 rows inserted/updated” blocks. Here’s a sample:

INFO  05-11 10:16:20,715 - Calculator - Finished processing (I=0, O=0, R=6, W=6, U=0, E=0)
INFO  05-11 10:16:20,719 - Table input - Finished processing (I=10, O=0, R=5, W=10, U=0, E=0)
INFO  05-11 10:16:20,720 - Table input - Finished processing (I=6, O=0, R=5, W=6, U=0, E=0)
INFO  05-11 10:16:20,727 - input - Finished processing (I=0, O=0, R=1, W=1, U=0, E=0)
INFO  05-11 10:16:20,786 - parse_partner_custom_data - Dispatching started for transformation [parse_partner_custom_data]
INFO  05-11 10:16:20,826 - Read Table Names - Finished reading query, closing connection.
INFO  05-11 10:16:20,827 - Read Table Names - Finished processing (I=27, O=0, R=0, W=27, U=0, E=0)
INFO  05-11 10:16:21,004 - create Sqls - Optimization level not specified.  Using default of 9.
INFO  05-11 10:16:21,018 - input - Finished processing (I=0, O=0, R=1, W=1, U=0, E=0)
INFO  05-11 10:16:21,020 - Read Table Partners - Finished reading query, closing connection.
INFO  05-11 10:16:21,142 - Update Locations - Dispatching started for transformation [Update Locations]
INFO  05-11 10:16:21,223 - Table input - Finished reading query, closing connection.
INFO  05-11 10:16:21,230 - input - Finished processing (I=0, O=0, R=1, W=1, U=0, E=0)
INFO  05-11 10:16:21,265 - Write to log - 
------------> Linenr 1------------------------------
table_name = entry_media_source
inserted rows = 0
updated rows = 0

====================
INFO  05-11 10:16:21,266 - Write to log - 
------------> Linenr 2------------------------------
table_name = entry_media_type
inserted rows = 0
updated rows = 0

====================
INFO  05-11 10:16:21,268 - Write to log - 
------------> Linenr 3------------------------------
table_name = entry_status
inserted rows = 0
updated rows = 0

====================
INFO  05-11 10:16:21,270 - Write to log - 
------------> Linenr 4------------------------------
table_name = entry_type
inserted rows = 0
updated rows = 0

I’m not really sure where to go from here… any pointers if I were to just blow away all the analytics data and start fresh?

Thanks,
v

Thanks to @oferc. Yes, I’m in the same situation as verdonv,
but the strange thing is that nothing in the analytics section works except the “Top Contribution” section…
Thanks
Sam

Redo the process suggested by Orly above.

Yeppa! Analytics works now!
Thanks!
But analytics only counts video views made with the Kaltura player, not views made through the API.
I’m using another player to view the content and fetching it via the API (e.g. http://video.xxxxx.xxx/p/106/sp/0/playManifest/entryId/0_kg16uygk/format/url/flavorParamId/32/video.mp4), but those views don’t appear in the statistics.
Is that expected?
How can I get analytics in that case?
Regards
Sam

You will have to do it yourself using the API (see the sketch after this list):

  1. Look at the stats service, collect function.
  2. Look at the statistics plugin of Kaltura to see what it does: https://github.com/kaltura/kdp/blob/master/plugins/statisticsPlugin/src/com/kaltura/kdpfl/plugin/component/StatisticsMediator.as
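
As a rough illustration only: a stats event is an HTTP call to the api_v3 endpoint carrying KalturaStatsEvent fields. The host, IDs, field names, and the event type code below are assumptions; confirm them against the stats service reference and the StatisticsMediator source linked above before relying on this.

```bash
# Hypothetical sketch: report a PLAY event for an entry from a custom player.
# Every value here is a placeholder/assumption -- verify field names and
# event type codes against your server's stats service documentation.
curl "http://video.xxxxx.xxx/api_v3/index.php" \
  --data-urlencode "service=stats" \
  --data-urlencode "action=collect" \
  --data-urlencode "event:objectType=KalturaStatsEvent" \
  --data-urlencode "event:eventType=3" \
  --data-urlencode "event:partnerId=106" \
  --data-urlencode "event:entryId=0_kg16uygk" \
  --data-urlencode "event:sessionId=my-session-id" \
  --data-urlencode "event:eventTimestamp=$(date +%s)"
```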

I have had some issues with the MySQL driver when running under OpenJDK, so I proceeded to install the official Java package, as suggested by Orly.

Connecting to the DB now succeeds, but two ETL jobs report errors.

etl_daily:

ERROR 13-11 16:06:11,737 - Execute SQL script - Unexpected error
ERROR 13-11 16:06:11,737 - Execute SQL script - org.pentaho.di.core.exception.KettleStepException:
Error while running this step!

Couldn’t execute SQL: Call kalturadw.calc_entries_sizes(20131106)

Table has no partition for value 20131106

at org.pentaho.di.trans.steps.sql.ExecSQL.processRow(ExecSQL.java:208)
at org.pentaho.di.trans.step.RunThread.run(RunThread.java:40)
at java.lang.Thread.run(Unknown Source)

Caused by: org.pentaho.di.core.exception.KettleDatabaseException:
Couldn’t execute SQL: Call kalturadw.calc_entries_sizes(20131106)

Table has no partition for value 20131106

at org.pentaho.di.core.database.Database.execStatement(Database.java:1650)
at org.pentaho.di.core.database.Database.execStatement(Database.java:1595)
at org.pentaho.di.core.database.Database.execStatements(Database.java:1800)
at org.pentaho.di.trans.steps.sql.ExecSQL.processRow(ExecSQL.java:182)
... 2 more

Caused by: java.sql.SQLException: Table has no partition for value 20131106
at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:1073)
at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3597)
at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3529)
at com.mysql.jdbc.MysqlIO.sendCommand(MysqlIO.java:1990)
at com.mysql.jdbc.MysqlIO.sqlQueryDirect(MysqlIO.java:2151)
at com.mysql.jdbc.ConnectionImpl.execSQL(ConnectionImpl.java:2619)
at com.mysql.jdbc.ConnectionImpl.execSQL(ConnectionImpl.java:2569)
at com.mysql.jdbc.StatementImpl.execute(StatementImpl.java:824)
at com.mysql.jdbc.StatementImpl.execute(StatementImpl.java:667)
at org.pentaho.di.core.database.Database.execStatement(Database.java:1618)
... 5 more

etl_perform_retention_policy:

ERROR 13-11 16:06:38,576 - Move old partitions to archive - An error occurred executing this job entry :
Couldn’t execute SQL: CALL move_innodb_to_archive()

Cannot remove all partitions, use DROP TABLE instead

The hourly and update_dimensions ETL jobs appear to be working as intended.
The analytics dashboard remains conspicuously empty, likely due to the errors reported above.

Look at my comment from November 3rd.