PHP has a limit of 2GB per upload; if you use the chunked upload functionality we have in the API, you can go beyond it.
There are patches to PHP that raise the limit, but we use the standard PHP build.
Also, in general, for large files it’s better to split them into chunks than to raise the limit.
The limitation I’m referring to is on the server side, which in Kaltura’s case runs on PHP.
Interested in your use case though… are you building a desktop application and interfacing with our API using the libcurl bindings?
You might be interested in our CLI client libs here:
If nothing else, it can help you understand how calls are made and you can then translate it into your code.
You can use a drop folder, or else bulk upload with CSV or XML. Even if it’s just one file, bulk upload will work for you, because it processes the CSV/XML and then downloads the actual file from the server side, which means there is no size limitation. I’m suggesting bulk upload because it requires less setup. Not that a drop folder is that hard to set up, but still :)
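To illustrate, a bulk-upload CSV is just a header row plus one line per entry, with a URL the server fetches itself. The exact field names below are an assumption on my part; check the bulk-upload documentation for the schema your account supports:

```csv
*title,description,tags,url
My Large Video,Uploaded via bulk upload,demo,http://example.com/path/to/large-file.mp4
```

Since the download happens on Kaltura’s side, the client never streams the file through PHP, so the 2GB limit doesn’t apply.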
If you use our own API then you can use chunked uploads as I suggested before.
Jess, I am having the same issue. I am using your Python client to upload large files, and get the same error.
I would love to use the chunking, can you direct me to the documentation that covers the chunking with the Kaltura Client?
The uploadToken.upload action supports a ‘resume’ operation.
There are several parameters to control that:
resumeAt - the byte offset at which to append the current chunk
resume - should be set to true
finalChunk - should be set to 0 for every chunk except the last