Sunday, May 14, 2017

Streaming-s3 is not uploading a file to AWS properly


I am using Node.js to upload files to AWS S3, and I found that the file does not arrive with its full size: I am getting only 2.1 KB.

Here is my code:

var streamingS3 = require('streaming-s3');

var uploadFile = function (fileReadStream, awsHeader, cb) {
    // set options for the streaming module
    var options = {
        concurrentParts: 2,
        waitTime: 20000,
        retries: 2,
        maxPartSize: 10 * 1024 * 1024
    };
    // call stream function to upload the file to s3
    var uploader = new streamingS3(fileReadStream, config.aws.accessKey, config.aws.secretKey, awsHeader, options);
    // start uploading; important if callback not provided
    uploader.begin();

    // event handlers
    uploader.on('data', function (bytesRead) {
        //console.log(bytesRead, ' bytes read.');
    });

    uploader.on('part', function (number) {
        //console.log('Part ', number, ' uploaded.');
    });

    // all parts uploaded, but upload not yet acknowledged
    uploader.on('uploaded', function (stats) {
        //console.log('Upload stats: ', stats);
    });

    uploader.on('finished', function (response, stats) {
        console.log(response);
        cb(null, response);
    });

    uploader.on('error', function (err) {
        console.log('Upload error: ', err);
        cb(err);
    });
};

The file name does show up in my AWS bucket, but when I try to open the file it says it failed.

I am trying to upload this file from the URL: https://arxiv.org/pdf/1701.00003.pdf
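
For context, a minimal sketch of how such a read stream might be obtained with Node's core https module (the fetching code is an assumption, since it is not shown above, and the bucket/key names are placeholders). Note that if the URL answers with a redirect, the stream carries a small HTML body rather than the PDF, which could produce a file of only a few KB:

var https = require('https');

https.get('https://arxiv.org/pdf/1701.00003.pdf', function (res) {
    if (res.statusCode >= 300 && res.statusCode < 400) {
        // the core https module does not follow redirects by itself,
        // so the response body here would not be the PDF
        return console.error('Redirected to:', res.headers.location);
    }
    var awsHeader = {
        Bucket: 'BUCKET_NAME',
        Key: 'FILE_NAME',
        ContentType: 'application/pdf'
    };
    uploadFile(res, awsHeader, function (err, response) {
        if (err) return console.error(err);
        console.log('Uploaded:', response);
    });
});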

Any suggestions, please?

Thanks

1 Answer

Answer 1

There is no longer any need for an external module to upload streams to S3. The aws-sdk now includes a method, s3.upload, which can upload an arbitrarily sized buffer, blob, or stream. You can check the documentation here.

The code I used:

const aws = require('aws-sdk');
const fs = require('fs');
const got = require('got');

const s3 = new aws.S3({
    credentials: {
        accessKeyId: "ACCESS_KEY",
        secretAccessKey: "SECRET_ACCESS_KEY"
    }
});

// fs stream test
s3.upload({
    Bucket: "BUCKET_NAME",
    Key: "FILE_NAME",
    ContentType: 'text/plain',
    Body: fs.createReadStream('SOME_FILE')
})
.on("httpUploadProgress", progress => console.log(progress))
.send((err, resp) => {
    if (err) return console.error(err);
    console.log(resp);
});

// http stream test
s3.upload({
    Bucket: "BUCKET_NAME",
    Key: "FILE_NAME",
    ContentType: 'application/pdf',
    Body: got.stream('https://arxiv.org/pdf/1701.00003.pdf')
})
.on("httpUploadProgress", progress => console.log(progress))
.send((err, resp) => {
    if (err) return console.error(err);
    console.log(resp);
});
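
If you also want to reproduce the part size and concurrency settings from the streaming-s3 options in the question, s3.upload accepts a second options argument that controls the managed multipart upload (a sketch reusing the placeholder bucket and key names from above):

// s3.upload(params, options, callback): the options argument tunes the
// managed multipart upload, roughly matching streaming-s3's
// maxPartSize/concurrentParts settings from the question
s3.upload({
    Bucket: "BUCKET_NAME",
    Key: "FILE_NAME",
    ContentType: 'application/pdf',
    Body: got.stream('https://arxiv.org/pdf/1701.00003.pdf')
}, {
    partSize: 10 * 1024 * 1024, // upload in 10 MB parts
    queueSize: 2                // at most 2 parts in flight at once
}, (err, resp) => {
    if (err) return console.error(err);
    console.log(resp.Location); // URL of the uploaded object
});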

To prove the point further, I tried the code with the PDF you posted in your question, and here is the link to my test bucket showing that the PDF works as expected.
