How can I upload a large file to S3 in chunks with Ruby, Base64-encoding it along the way?

I need to upload a large media file (larger than the available RAM) to S3, Base64-encoding it first.

Please help me rework this Ruby code so that reading the file, encoding it to Base64, and uploading it to S3 all happen in chunks:
require 'base64'
require 'stringio'

# Current approach: the whole file is read and encoded in memory at once.
blob = StringIO.new(Base64.encode64(IO.binread('big_file.mp4')), 'rb')
...
@client.put_object(
  bucket: @s3_bucket.name,
  key:    target_name,
  body:   blob,
  acl:    'public-read'
)

Judging by the documentation, Base64.encode64 cannot process a file stream in chunks. Are there any tricks?
June 26th 19 at 14:04
1 answer
June 26th 19 at 14:06
The official S3 Ruby client can upload files in chunks (at least, that is what the documentation says).

docs.aws.amazon.com/AWSRubySDK/latest/AWS/S3/S3Obj... (section Uploading Files)
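For reference, a minimal sketch of that chunked upload, assuming the v3 aws-sdk-s3 gem (the thread's put_object call looks like a v2/v3 client, while the linked docs describe the v1 SDK; the bucket name and region are placeholders):

require 'aws-sdk-s3'

s3  = Aws::S3::Resource.new(region: 'us-east-1')
obj = s3.bucket('my-bucket').object('big_file.mp4')

# upload_file switches to a multipart upload for large files automatically,
# so the file is streamed from disk rather than loaded into memory.
obj.upload_file('big_file.mp4', acl: 'public-read')

Note that this uploads the raw file as-is; it does not Base64-encode it, which is what the follow-up comment below points out.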
Yes, I know that. The question was about encoding to Base64 in chunks with Base64.encode64. - alfonso56 commented on June 26th 19 at 14:09
Then you need to write a class that takes a file path as input and implements the IO interface, encoding the contents to Base64 in chunks internally.

ruby-doc.org/core-2.4.1/IO.html#method-i-read - jayme7 commented on June 26th 19 at 14:12
: could you show a simple example? - alfonso56 commented on June 26th 19 at 14:15
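A minimal sketch of such an IO-like wrapper might look like the following (the class name, chunk size, and the choice of Base64.strict_encode64 are assumptions, not something from the thread). The key point is that the raw chunk size is a multiple of 3, so the Base64 pieces concatenate into one valid encoded stream:

require 'base64'

# Exposes the Base64-encoded contents of a file through an IO-like #read,
# so the data never has to be encoded in one piece.
class Base64FileIO
  RAW_CHUNK = 3 * 1024 * 1024 # multiple of 3: chunk encodings join cleanly

  def initialize(path)
    @file   = File.open(path, 'rb')
    @buffer = +'' # already-encoded bytes not yet handed out
  end

  # Encoded length: every 3 raw bytes become 4 Base64 characters (with padding).
  def size
    4 * (@file.size / 3.0).ceil
  end

  def read(length = nil, outbuf = nil)
    fill_buffer(length)
    data =
      if length.nil?
        @buffer.slice!(0, @buffer.bytesize)
      else
        return nil if @buffer.empty? && @file.eof?
        @buffer.slice!(0, length)
      end
    outbuf ? outbuf.replace(data) : data
  end

  def rewind
    @file.rewind
    @buffer.clear
    0
  end

  def close
    @file.close
  end

  private

  def fill_buffer(length)
    while (length.nil? || @buffer.bytesize < length) && !@file.eof?
      raw = @file.read(RAW_CHUNK)
      # strict_encode64 adds no newlines, so the pieces form valid Base64 when joined.
      @buffer << Base64.strict_encode64(raw)
    end
  end
end

It could then be passed as the body instead of the StringIO from the question, assuming the SDK only needs an object that responds to read (and, ideally, size and rewind):

blob = Base64FileIO.new('big_file.mp4')
@client.put_object(
  bucket: @s3_bucket.name,
  key:    target_name,
  body:   blob,
  acl:    'public-read'
)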
