Direct Upload to S3 (with PHP & Composer)

This is a continuation of the ‘Direct Upload’ series:

  • First, we looked at how you can upload a file directly to S3, talking the process through in detail (13/10/2013).
  • We later followed up with another post, explaining how to handle multiple files and updating the code to use AWS’s Signature Version 4 (7/3/2015).

Now we’re back with another improvement!

Instead of the copy-and-paste solution we advocated in past blog posts, we’ve now built a Composer package. With all the improvements we’ve made to the script over time, it seemed a shame that it wasn’t easy for users to update and maintain (because we weren’t using a package manager).

All the information on the package, such as how to install, use and contribute to it, can be found on its GitHub page (but we’ll explain how to use it here as well).

  View on GitHub

How to Install

To install, we just need to include the package in our composer.json file. Here’s a command which will add it for you:

composer require eddturtle/direct-upload

We also need to make sure Composer’s autoloader is included for this to work (which, if you’re already using Composer, it likely will be).
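If it isn’t already included, the autoloader can be pulled in with a single require at the top of the script:

```php
<?php
// Load Composer's autoloader so the DirectUpload classes can be found.
require __DIR__ . '/vendor/autoload.php';
```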

How to Use

Once installed, we can just call the package and create our uploader, like so:


$uploader = new \EddTurtle\DirectUpload\Signature(
    "YOUR_S3_KEY",
    "YOUR_S3_SECRET",
    "your-bucket-name",
    "eu-west-1"
);


<form action="<?php echo $uploader->getFormUrl(); ?>" method="POST" enctype="multipart/form-data">
    <?php echo $uploader->getFormInputsAsHtml(); ?>
    <input type="file" name="file" multiple>
    <input type="submit" value="Upload">
</form>
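The Signature constructor also accepts an options array as a fifth argument. As a sketch of what that might look like (the option name below, acl, follows the package’s README at the time of writing; check it against your installed version):

```php
$uploader = new \EddTurtle\DirectUpload\Signature(
    "YOUR_S3_KEY",
    "YOUR_S3_SECRET",
    "your-bucket-name",
    "eu-west-1",
    [
        // e.g. make uploaded objects publicly readable
        // ('private' is the S3 default)
        'acl' => 'public-read',
    ]
);
```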

The Result:

This is an example GIF, showing multiple files being uploaded using our example repository.


41 responses to “Direct Upload to S3 (with PHP & Composer)”

    1. Hi John, if you’re using the Composer package, then leaving the options blank will fill the ‘key’ input with the value ${filename}, which tells S3 to use the file’s original name.

      1. Wow, thanks for the prompt reply. I tried a few variations, but the file name still wasn’t maintained in S3 🙁. Could you provide an example if I want to store it with the name followed by the key? Appreciate it!

  1. This is really amazing. The only thing I can’t figure out how to do is add metadata to the form. I want to add Content-Disposition = attachment, but when I hack it onto the page I get an ‘Invalid according to Policy: Extra input fields: content-disposition’ error. Any advice?

  2. Hi Edd, I’m hoping to lean on you for what I imagine would be a quick one. I’m trying to handle multiple file select with a single button to kick off the process. I’m having trouble with two things: sending all the files selected to S3, and then capturing the results of each upload.
    I add each file to an array when added. Then, when a user hits submit, I loop through and fire filesData[i].submit(); This kicks off the initial transfer but only sends the last file selected. Then, if I look within the data.results object within the done callback, I see the same information for both instances. Let me know if I’m on the right track or if I’m going about this all wrong. Cheers!

    1. Hi Ilya, so sorry for the late response (just got back from a conference abroad). Is it still a problem? Uploading seems to be much easier when the user chooses a file, rather than when a button is clicked, but it should still be possible. If you are still having problems, it might be easier emailing (my email’s edd at …). All the best

      1. Edd, I haven’t played around with this further as I’m still on the fence about the functionality. My users will be uploading large video files, and I can see pros and cons to both upload-on-select and upload-on-button-click. I think my happy medium will be upload on select with the option to cancel at any time. I’ll dig around blueimp’s library to see if I can add this, though the documentation is a bit weedy for me.

  3. Edd,

    Thanks for this uploader.
    It works great. Exactly what I wanted to attach to my little project.
    A detailed analysis of this code would also help in understanding how it works 🙂 I am in the process of doing so.

    I played with the content_type variable. It allows you to specify what file types a user can upload: only images, or PDFs.
    But, as I understand it, it takes only one file type as input, i.e. only JPEGs. What if I’d like to allow JPEGs, GIFs & PNGs to be uploaded? Is there a way to specify in the policy that the incoming file is one of several types?

    Many thanks
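One hedged possibility on the policy side: the S3 POST policy supports ‘starts-with’ conditions on $Content-Type, and assuming the package’s content_type option maps onto such a condition (as its README suggests, though verify against your version), a common prefix such as image/ would match JPEGs, GIFs and PNGs alike:

```php
$uploader = new \EddTurtle\DirectUpload\Signature(
    "YOUR_S3_KEY",
    "YOUR_S3_SECRET",
    "your-bucket-name",
    "eu-west-1",
    // 'starts-with' matching means any image/* MIME type would pass the policy.
    ['content_type' => 'image/']
);
```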

      1. Hi Gizat, I found it easier to control the accepted file types by the following code in the add method.

        var acceptFileTypes = /(gif)|(jpe?g)|(png)/i;
        var type = data.files[0]['type'];

        Unfortunately, for some files the type comes back blank, so I fall back to the file extension:

        if (type.length == 0) {
            type = data.files[0]['name'].substr(data.files[0]['name'].lastIndexOf('.') + 1);
        }
        if (type.length && !acceptFileTypes.test(type)) {
            uploadErrors.push('Not an accepted file type');
        }
        if (uploadErrors.length > 0) {
            alert(uploadErrors.join("\n"));
        }
  4. New to S3, and I stumbled upon your code. First off, thank you! Secondly, a newbie question about security. It seems as though the hidden inputs that are written to the form could be viewed by anyone looking at the HTML source page and then used to upload to the bucket (by creating a form using these inputs). Please let me know if I’m mistaken about this. Obviously, this is just a tutorial, but a lot of folks (including myself) are looking at this code and possibly using it in a more production environment… so I just wanted to touch on the security aspect.

    1. Hi Justin. Sure, if you were able to get a copy of all the inputs and their values, you could use them. For this reason it’s important to set a low expiry time in the policy (so the signature simply invalidates itself after an hour or so). It’s also important to use this alongside HTTPS, so that the data sent across the wire can’t be read or changed. Doing it this way isn’t inherently secure or insecure; it’s simply how you set it up. Hope that helps 🙂
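To illustrate the short-expiry advice above: the package exposes an expires option (the option name follows its README; treat it as an assumption and verify against your installed version), which feeds the policy’s expiration timestamp:

```php
// A short-lived signature: anyone who copies the form inputs out of the
// HTML source only has a small window in which they remain valid.
$uploader = new \EddTurtle\DirectUpload\Signature(
    "YOUR_S3_KEY", "YOUR_S3_SECRET", "your-bucket-name", "eu-west-1",
    ['expires' => '+1 hour']
);
```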

  5. Hi Edd, I’m still not quite sure on getting the JS not to change the file’s name on S3: if I comment out the line that renames the file, placing it in a folder stops working. Any ideas? Bit of a newbie on JS and PHP, it has to be said. I tried commenting out line 87, but that stopped the form from working altogether. I found commenting out line 90 (in your example) sorted the renaming, but disabled the folder part.

    1. Hi Iain, the file name on S3 is determined by a hidden input called key; if the key contains ${filename} then the file’s original name is used. So, at a guess, I’d change line 90 to something like:

  6. Hi there, thank you for the code. Are there any security concerns with this concept? For example, exposing the AWS credentials to the browser… Also, I’m having a problem with 500MB file uploads; I’m just getting an error. 300MB works fine…

    1. Hi, there shouldn’t be any security concerns, because your API key/secret should never be shown or accessible to the web browser (if they are, you’re doing it wrong). They’re simply used to make the signature, which is then used and verified on AWS’s end. If a file of 300MB works and the 500MB one doesn’t, it could be a policy-based thing on the bucket. I’m not sure; have you had any luck with it?
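On the size question: the package’s max_file_size option (per its README, expressed in MB; treat both the name and the unit as assumptions to verify) sets the policy’s content-length-range condition, which is one likely reason a 500MB upload would be rejected while a 300MB one succeeds:

```php
// Allow uploads up to 600 MB; anything larger is rejected by the
// policy's content-length-range condition, not by your server.
$uploader = new \EddTurtle\DirectUpload\Signature(
    "YOUR_S3_KEY", "YOUR_S3_SECRET", "your-bucket-name", "eu-west-1",
    ['max_file_size' => 600]
);
```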

  7. Hi,
    Thank you for this series. But I am receiving an error in the browser console: “Failed to load: Response to preflight request doesn’t pass access control check: No ‘Access-Control-Allow-Origin’ header is present on the requested resource. Origin ‘http://localhost’ is therefore not allowed access.”

    I tried changing the CORS file of the bucket, but I am still getting the error. How do I solve this? Please help.

    1. Hi lakshya, feel free to pop me an email at edd [@] with a copy of your CORS policy and I’d be more than happy to help. My comments aren’t playing nice with xml at the moment (it’s on the todo list to fix).

  8. Sorry for the above CORS file, I didn’t know it would get changed.
    AllowedOrigin: http://*
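For reference (since the XML keeps getting stripped from comments), a minimal bucket CORS configuration along these lines is typically what’s needed for browser uploads; adjust the origin to your own domain rather than using a wildcard:

```xml
<CORSConfiguration>
    <CORSRule>
        <AllowedOrigin>http://localhost</AllowedOrigin>
        <AllowedMethod>POST</AllowedMethod>
        <AllowedHeader>*</AllowedHeader>
    </CORSRule>
</CORSConfiguration>
```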

    1. Hi Edd,

      Thanks for giving us such a nice package for uploading files directly to S3. The problem I am facing is very strange. Please help me.

      I downloaded the code from GitHub, ran the composer update command, set all the necessary S3 config in index.php, and it runs fine. But when I try to use the same script with Laravel, it gives me the error mentioned below:

      Cross-Origin Request Blocked: The Same Origin Policy disallows reading the remote resource at (Reason: CORS header ‘Access-Control-Allow-Origin’ missing).

      I am using following s3 CORS policy :-


      The very strange thing is that I cannot post files to S3 when using this with Laravel, yet it works in my simple setup.

      Thanks in Advance and hoping for your quick response.


  9. I’m glad to have stumbled upon this, and I’m very eager to work this into my project. I’m wondering how this works with very large files (over 100MB): Amazon recommends using multipart uploads for files over 100MB, but since this uploads directly from the browser to S3 and circumvents my server, I’m 50/50 on whether or not that is even necessary, since there is no PHP script to time out. Any insight appreciated.

    1. Going directly from the client makes this slightly less important, but multipart uploads can, I believe, still be useful, as they would allow you to resume from where you left off after a reload. They are, however, complicated, and I can’t quite work out how to build the signature(s) to do this yet.

  10. Hi Edd, I’m glad that I’ve come across your series of tutorials for S3 direct uploads. I’ve followed everything, from setting up the bucket to the code. First I stumbled upon a cross-origin issue, which I fixed. Now I’m stuck with a 403 Forbidden. I’ve checked my region setting, and everything is correct. Any other reason I might get a 403? Help 🙁

    1. Hi Shahril, can you give the region you’re using? I know some regions only support Signature V4. Not sure without a little more info, sorry.

  11. Hey,
    Through this method I am able to upload only 10-15 images at once to my S3 bucket. How do I let my website users upload 500-1000 images in one go into my S3 bucket?

    1. Hi Shailesh, that’s interesting – are you able to give some more info though? What happens if you go over 10-15 images? Does this vary per web browser?

  12. Hi Edd,
    I have got your code working, but my objective is to first make an AJAX request to one of my pages. On this page, I will create an entry in the database, to keep track of who has uploaded the file, and then make a call to form.fileupload({ /* code */ });. Currently it uploads the files directly to S3. I have tried different ways to get this working; can you please give me a hint on how to go ahead?

    1. Hi Edd,
      I have managed to get this working using evaporate.js, which is very useful for uploading large files in the GBs; if anyone needs help with this, please contact me. I have built it using S3 and PHP, and it works smoothly for files larger than 40 GB.

  13. Hi Edd,
    Great library, thanks for sharing it. I had a question: is there any way to upload a file which is generated dynamically, rather than uploading an existing one? I am saving a recording from a webpage using “mattdiamond/Recorderjs”, so the file already exists. I want to then simply upload it to S3 as I stop recording, without selecting the file with a file input. Any help appreciated. Thanks again for a great library.

    1. Hi Hamish, good question. A lot of this setup uses a file input to upload, but you can upload an object (of bytes) directly, which might suit you better. I know you can do this through an SDK from back-end languages, so I don’t see why you couldn’t do this as a direct upload either. But I can’t quite work out how to do it at the moment. Edd
