DWH not working; permission denied

Hello Community,

I am testing Kaltura and am experiencing a problem with analytics: the reports stay empty. Looking at the mail spool, I see cron is sending the following error to the kaltura user:

/opt/kaltura/dwh -k /opt/kaltura/pentaho/pdi/kitchen.sh/etlsource/execute/etl_update_dims.sh -p /opt/kaltura/dwh -k /opt/kaltura/pentaho/pdi/kitchen.sh -k @KETTLE_SH@

/bin/sh: 1: /opt/kaltura/dwh: Permission denied

this happens for all cronjobs.
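For reference, I'm reading these from the local mail spool (on Ubuntu I believe it lands in /var/mail/kaltura):

# tail -n 30 /var/mail/kaltura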

If I execute a cron job manually, like this one:

su kaltura -c "/opt/kaltura/dwh/etlsource/execute/etl_update_dims.sh -p /opt/kaltura/dwh -k /opt/kaltura/pentaho/pdi/kitchen.sh"

It works as expected.

I upgraded to 10.21.0 yesterday evening, but the problem remains.

Please help!

Hello,

After upgrading, you need to run the configure scripts once more; among other things, they replace the tokens in the config files, @KETTLE_SH@ for example. Until that happens, the cron entries still contain unreplaced tokens, so the command line is garbled and /bin/sh ends up trying to execute the directory /opt/kaltura/dwh itself, which is exactly the Permission denied you are seeing.
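On deb, for example, run as root:

# dpkg-reconfigure kaltura-base

and on RPM (the script path here is an assumption based on the standard install layout):

# /opt/kaltura/bin/kaltura-config-all.sh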

Please see:


or, for deb:

Hello Jess,

These errors have appeared ever since the initial installation on October 1st on my Ubuntu 14.04 system. Even after upgrading to the 10.21.0 release and running the reconfigure steps as explained, I still get these errors and DWH/analytics is not working!

Hello,

Were the tokens replaced in /opt/kaltura/dwh/.kettle/kettle.properties?
If you look at /var/lib/dpkg/info/kaltura-base.postinst, you will see that replacing them is part of what it does. If the tokens were not replaced, we need to understand why; if all "@.*@" tokens were correctly replaced, we need to look at the logs again. Also see:


for troubleshooting tips regarding Analytics.
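A quick way to check for leftovers (a sketch based on the "@.*@" pattern above):

# grep -E '@[A-Za-z_]+@' /opt/kaltura/dwh/.kettle/kettle.properties

If that prints nothing, all tokens were replaced.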

Hello Jess,

These tokens weren't replaced; the rest I removed from the listing below, as those were configured correctly. Are any of the tokens listed below ones that should have been replaced? I'm running a standard single-server install without an external CDN, etc.

# This file was generated by Pentaho Data Integration version 3.1.0.
#
# Here are a few examples of variables to set:
#
# PRODUCTION_SERVER = hercules
# TEST_SERVER = zeus
# DEVELOPMENT_SERVER = thor
#
# Note: lines like these with a # in front of it are comments
#

DailyLockTimeout = @DAILY_LOCK_TIMEOUT@
HourlyLockPoleInterval = @HOURLY_LOCK_POLE_INTERVAL@

EventsFTPUser = @EVENTS_FTP_USER@
EventsFTPPassword = @EVENTS_FTP_PASSWORD@
EventsFTPServer = @EVENTS_FTP_SERVER@
EventsFTPPort = @EVENTS_FTP_PORT@

AkamaiEventsLogsDir = @AKAMAI_EVENTS_LOGS_DIR@
AkamaiEventsWildcard = @AKAMAI_EVENTS_WILDCARD@
AkamaiEventsFetchMethod = @AKAMAI_EVENTS_FETCH_METHOD@
AkamaiEventsFTPUser = @AKAMAI_EVENTS_FTP_USER@
AkamaiEventsFTPPassword = @AKAMAI_EVENTS_FTP_PASSWORD@
AkamaiEventsFTPServer = @AKAMAI_EVENTS_FTP_SERVER@
AkamaiEventsFTPPort = @AKAMAI_EVENTS_FTP_PORT@

APICallsLogsDir = @APICallsLogsDir@
APICallsWildcard = @APICallsWildcard@
APICallsFetchMethod = @APICallsFetchMethod@
APICallsFTPUser = @APICallsFTPUser@
APICallsFTPPassword = @APICallsFTPPassword@
APICallsFTPServer = @APICallsFTPServer@
APICallsFTPPort = @APICallsFTPPort@

# IMPORTANT: to encrypt passwords, use the following tool: /opt/kaltura/bin/pentaho/encr.sh -kettle <desired-password>

# the DWH DB settings

# the Read DWH connection

# the operational Db settings

# operational DB with write privilege (for exporting plays/views)

# The db containing monitoring data
MonitoringDbHostName = @DB_MONITORING_HOST@
MonitoringDbPort = @DB_MONITORING_PORT@
MonitoringDbUser = @DB_MONITORING_USER@
MonitoringDbPassword = @DB_MONITORING_PASSWORD@
MonitoringSchemaName = @DB_MONITORING_SCHEMA_NAME@

## fms streaming logs settings
FMSLiveStreamingFetchMethod = @FMS_LIVE_STREAMING_FETCH_METHOD@
FMSLiveStreamingFtpUser = @FMS_LIVE_STREAMING_FTP_USER@
FMSLiveStreamingFtpPassword = @FMS_LIVE_STREAMING_FTP_PASSWORD@
FMSLiveStreamingFtpHost = @FMS_LIVE_STREAMING_FTP_HOST@
FMSLiveStreamingLogsDir = @FMS_LIVE_STREAMING_LOGS_DIR@
FMSLiveStreamingFtpPort = @FMS_LIVE_STREAMING_FTP_PORT@
FMSLiveStreamingFileWildcard = @FMS_LIVE_STREAMING_FILE_WILDCARD@

FMSOnDemandStreamingFetchMethod = @FMS_ONDEMAND_STREAMING_FETCH_METHOD@
FMSOnDemandStreamingFtpUser = @FMS_ONDEMAND_STREAMING_FTP_USER@
FMSOnDemandStreamingFtpPassword = @FMS_ONDEMAND_STREAMING_FTP_PASSWORD@
FMSOnDemandStreamingFtpHost = @FMS_ONDEMAND_STREAMING_FTP_HOST@
FMSOnDemandStreamingLogsDir = @FMS_ONDEMAND_STREAMING_LOGS_DIR@
FMSOnDemandStreamingFtpPort = @FMS_ONDEMAND_STREAMING_FTP_PORT@
FMSOnDemandStreamingFileWildcard = @FMS_ONDEMAND_STREAMING_FILE_WILDCARD@


## Bandwidth Usage settings ##
# Akamai
BandwidthUsageAkamaiProcessID = @BANDWIDTH_USAGE_AKAMAI_PROCESS_ID@
# remote or local (This is true for all fetch method bandwidth usage vars)
BandwidthUsageAkamaiFetchMethod = @BANDWIDTH_USAGE_AKAMAI_FETCH_METHOD@
BandwidthUsageAkamaiFTPUser = @BANDWIDTH_USAGE_AKAMAI_FTP_USER@
BandwidthUsageAkamaiFTPPassword = @BANDWIDTH_USAGE_AKAMAI_FTP_PASSWORD@
BandwidthUsageAkamaiFTPServer = @BANDWIDTH_USAGE_AKAMAI_FTP_SERVER@
BandwidthUsageAkamaiLogsDir = @BANDWIDTH_USAGE_AKAMAI_LOGS_DIR@ 
BandwidthUsageAkamaiFTPPort = @BANDWIDTH_USAGE_AKAMAI_FTP_PORT@
BandwidthUsageAkamaiWildCard = @BANDWIDTH_USAGE_AKAMAI_WILDCARD@
# LLN
BandwidthUsageLLNProcessID = @BANDWIDTH_USAGE_LLN_PROCESS_ID@
BandwidthUsageLLNFetchMethod = @BANDWIDTH_USAGE_LLN_FETCH_METHOD@
BandwidthUsageLLNFTPUser = @BANDWIDTH_USAGE_LLN_FTP_USER@
BandwidthUsageLLNFTPPassword = @BANDWIDTH_USAGE_LLN_FTP_PASSWORD@
BandwidthUsageLLNFTPServer = @BANDWIDTH_USAGE_LLN_FTP_SERVER@
BandwidthUsageLLNLogsDir = @BANDWIDTH_USAGE_LLN_LOGS_DIR@
BandwidthUsageLLNFTPPort = @BANDWIDTH_USAGE_LLN_FTP_PORT@
BandwidthUsageLLNWildCard = @BANDWIDTH_USAGE_LLN_WILDCARD@
# Level3
BandwidthUsageLevel3ProcessID = @BANDWIDTH_USAGE_LEVEL3_PROCESS_ID@
BandwidthUsageLevel3FetchMethod = @BANDWIDTH_USAGE_LEVEL3_FETCH_METHOD@
BandwidthUsageLevel3FTPUser = @BANDWIDTH_USAGE_LEVEL3_FTP_USER@
BandwidthUsageLevel3FTPPassword = @BANDWIDTH_USAGE_LEVEL3_FTP_PASSWORD@
BandwidthUsageLevel3FTPServer = @BANDWIDTH_USAGE_LEVEL3_FTP_SERVER@
BandwidthUsageLevel3LogsDir = @BANDWIDTH_USAGE_LEVEL3_LOGS_DIR@
BandwidthUsageLevel3FTPPort = @BANDWIDTH_USAGE_LEVEL3_FTP_PORT@
BandwidthUsageLevel3WildCard = @BANDWIDTH_USAGE_LEVEL3_WILDCARD@

BandwidthUsageLiveURTMPProcessID = @BANDWIDTH_USAGE_LIVE_URTMP_PROCESS_ID@
BandwidthUsageLiveURTMPFetchMethod = @BANDWIDTH_USAGE_LIVE_URTMP_FETCH_METHOD@
BandwidthUsageLiveURTMPFTPUser = @BANDWIDTH_USAGE_LIVE_URTMP_FTP_USER@
BandwidthUsageLiveURTMPFTPPassword = @BANDWIDTH_USAGE_LIVE_URTMP_FTP_PASSWORD@
BandwidthUsageLiveURTMPFTPServer = @BANDWIDTH_USAGE_LIVE_URTMP_FTP_SERVER@
BandwidthUsageLiveURTMPLogsDir = @BANDWIDTH_USAGE_LIVE_URTMP_LOGS_DIR@ 
BandwidthUsageLiveURTMPFTPPort = @BANDWIDTH_USAGE_LIVE_URTMP_FTP_PORT@
BandwidthUsageLiveURTMPWildCard = @BANDWIDTH_USAGE_LIVE_URTMP_WILDCARD@

# Local vars for jobs

Hello,

Not all tokens are relevant; here are the important ones (a manual fallback is sketched after the list):

@DB1_HOST@
@DB1_PORT@
@DB2_HOST@
@DB2_PORT@
@DWH_DIR@
@DWH_HOST@
@DWH_PASS@
@DWH_PORT@
@DWH_USER@
@EVENTS_FETCH_METHOD@
@EVENTS_LOGS_DIR@
@PHP_BIN@
@TIME_ZONE@
@TMP_DIR@
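If any of these are still in token form, a stopgap is to substitute the value by hand; a sketch for @DWH_DIR@, whose value on a default single-server install should be /opt/kaltura/dwh (judging by your working manual command):

# sed -i 's|@DWH_DIR@|/opt/kaltura/dwh|g' /opt/kaltura/dwh/.kettle/kettle.properties

Re-running the configure scripts remains the proper fix, though.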

You can do:
# diff -u /opt/kaltura/dwh/.kettle/kettle.properties /opt/kaltura/dwh/.kettle/kettle.template.properties
or:
# vimdiff /opt/kaltura/dwh/.kettle/kettle.properties /opt/kaltura/dwh/.kettle/kettle.template.properties

to ensure these were replaced. Then also make sure /etc/cron.d/kaltura-dwh looks correct and proceed with the troubleshooting manual I pasted in the previous post.
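For comparison, once the tokens are replaced, the etl_update_dims.sh entry in /etc/cron.d/kaltura-dwh should invoke the same command that worked for you manually; the schedule and exact layout below are illustrative only:

0 * * * * kaltura /opt/kaltura/dwh/etlsource/execute/etl_update_dims.sh -p /opt/kaltura/dwh -k /opt/kaltura/pentaho/pdi/kitchen.sh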


Jess,

Yes, all tokens were replaced successfully. After running the base reconfigure I no longer get any errors from the hourly cron job. I executed /opt/kaltura/bin/kaltura-run-dwh.sh and I can see some reports. I will monitor it over the coming hours/days and let you know!
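To keep an eye on the hourly runs, I'm also tailing the newest ETL logs (log directory assumed from the standard layout):

# ls -ltr /opt/kaltura/dwh/logs | tail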

Thanks so far.

Most welcome && happy Kalturing,