Upload over 2GB error

Hi,

I’m encountering a strange issue when uploading files over 2GB.
Every time I get:
MISSING_MANDATORY_PARAMETER
uploadTokenId

I’m using HTTP POST /api_v3/?service=uploadToken&action=upload, uploading to a Kaltura SaaS trial account.
Smaller files work, i.e. anything below 2GB uploads fine.
The upload also takes at least an hour.

Any suggestions?
Thanks.

Below is the full response from the server:

<?xml version="1.0" encoding="utf-8"?><xml><result><error><objectType>KalturaAPIException</objectType>
<code>MISSING_MANDATORY_PARAMETER</code><message>Missing parameter &quot;uploadTokenId&quot;</message>
<args><item><objectType>KalturaApiExceptionArg</objectType><name>PARAM_NAME</name><value>uploadTokenId</value></item></args>
</error></result><executionTime>0.0034580230712891</executionTime></xml>

Hello,

PHP has a limit of 2GB per upload; if you use the chunked upload functionality we have in the API, you can go beyond it.
There are patches to PHP that increase the limit, but we are using the standard PHP version.
Also, in general, for large files it’s better to split them into chunks instead of raising the limit.

Jess,

I’m not using PHP. I use C++ and libcurl.
I’ll try to split into chunks.

Thanks,
Michal

The limitation I’m referring to is server side, which in Kaltura’s case runs on PHP.
Interested in your use case though… are you building a desktop application and interfacing with our API using the libcurl bindings?

You might be interested in our CLI client libs here:

If nothing else, it can help you understand how calls are made and you can then translate it into your code.

Thanks Jess,

I thought the 2GB limit in Kaltura applied to desktop upload only. Does that mean the upload limit applies to HTTP calls as well?
How do I upload larger files then? Use the drop folder feature?

are you building a desktop application and interfacing with our API using the libcurl bindings

Yes, I’m using libcurl to make HTTP calls to Kaltura.

Michal

You can use a drop folder, or else bulk upload with CSV or XML. Even if it’s just one file, bulk upload will work for you, because it processes the CSV/XML and then downloads the file from the server side, which means there is no limitation. I’m suggesting bulk upload because it requires less setup. Not that a drop folder is that hard to set up, but still :)
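For reference, a bulk upload CSV can be as small as a single data row. A hypothetical example follows; the field-definition row convention and the exact column names here are assumptions, so double-check them against the bulk upload documentation:

```csv
*title,description,tags,url
My Video,Uploaded via bulk upload,demo,http://example.com/videos/my-movie.mp4
```

The key point is the `url` column: the server fetches the file itself from that location, so the 2GB POST limit never comes into play.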

If you use our own API then you can use chunked uploads as I suggested before.

BTW, you might also want to utilize the Kaltura CLI client libs and simply use one of the C++ exec family functions to call them. Their setup and API are quite simple.

Jess, thanks for the help.
I tried chunked upload. It fails, but in a different way…

<?xml version="1.0" encoding="utf-8"?><xml><result><error><objectType>KalturaAPIException</objectType>
<code>UPLOAD_ERROR</code><message>Upload failed</message><args></args></error></result>
<executionTime>0.0071260929107666</executionTime></xml>

So I assume I hit the 2GB limit.

I’ll have a look at CLI libs.
Again, thanks for the help.

Michal

Jess, I am having the same issue. I am using your Python client to upload large files, and I get the same error.
I would love to use chunking; can you direct me to the documentation that covers chunked upload with the Kaltura client?

Matt

Hello,

You can look at this Java code for reference:

Basically:
the uploadToken.upload action supports a ‘resume’ operation.
There are several parameters to control it:
resumeAt - the offset at which the current chunk is added
resume - should be set to true
finalChunk - should be set to 0 for all chunks except the last.
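A minimal sketch of that loop in Python. The chunk size is an arbitrary choice, I’m assuming resume is false for the very first chunk and true afterwards, and the `client.uploadToken.upload` call in the comment is only indicative; verify the exact signature against your client-lib version:

```python
CHUNK_SIZE = 10 * 1024 * 1024  # 10 MB per chunk (arbitrary choice)

def iter_chunks(total_size, chunk_size=CHUNK_SIZE):
    """Yield (resume_at, length, resume, final_chunk) for each chunk.

    resume is False only for the first chunk; final_chunk is True only
    for the last one, matching the parameter description above.
    """
    offset = 0
    while offset < total_size:
        length = min(chunk_size, total_size - offset)
        is_final = offset + length >= total_size
        yield offset, length, offset > 0, is_final
        offset += length

# Usage sketch (hypothetical token/client; needs a live Kaltura session):
# with open(path, "rb") as f:
#     for resume_at, length, resume, final_chunk in iter_chunks(total_size):
#         chunk = f.read(length)
#         client.uploadToken.upload(token.id, chunk, resume=resume,
#                                   finalChunk=final_chunk, resumeAt=resume_at)
```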

Hope that helps,