Thursday, February 21, 2013

Using Java AWS SDK to upload files to Amazon S3

Amazon S3 is a highly available and durable storage service well suited to storing large files that do not change frequently. This post will focus on how to upload files programmatically via the AWS SDK for Java. For an introduction to S3, read What is Amazon Simple Storage Service (Amazon S3)?

My specs:
  • Eclipse Juno
  • SpringMVC 3.1.x
  • Maven 3.0.x

Install AWS Toolkit

In Eclipse, click on Help in the menu bar and then "Install New Software".

In the "Work with:" input box, put " http://aws.amazon.com/eclipse" and Click Add...

Check the AWS Toolkit for Eclipse entry and click Yes to install all the tools.

In the Eclipse toolbar, you will see a red cube icon. Click on the down arrow next to this icon. Click Preferences.

Fill in your Access Key ID and Secret Access Key. Give it an Account Name (Ex. use your email). You can find your keys in the Amazon Management Console (My Account/Console -> Security Credentials). Click on Apply and OK.

In the Eclipse menu bar, click on Window -> Preferences. Expand the AWS Toolkit. Right click on your key. Click Select Private Key File. Associate it with your private key. Click OK.

Click on the down arrow next to the Amazon cube icon. Select Show AWS Explorer View. You should be able to see the Amazon S3 service and all your related buckets (if you have any).


Download and Install the AWS SDK for Java

You can download it here. Click on the AWS SDK for Java button.

Extract the file. Code Samples are located in /samples.

If you are using Maven, you can add the AWS SDK as a dependency in the pom.xml file.


<dependency>
    <groupId>com.amazonaws</groupId>
    <artifactId>aws-java-sdk</artifactId>
    <version>1.3.32</version>
</dependency>


Choose the version you want here.

Alternatively, you can just add it as a library (Right Click on the project -> Java Build Path -> Libraries -> Add External JARs).


Running the default AWS Sample Apps

We will begin by setting up a sample project so you can see how S3 works.

Click on the down arrow next to the Amazon icon.

Select New AWS Java Project.

Give a Project name.

Select your account.

Select Amazon S3 Sample, Amazon S3 Transfer Progress Sample, and AWS Console Application. Click Next.

Expand the newly created project. Select AwsConsoleApp.java. In the Eclipse menu bar, click on Run -> Run.

You should see output like the following:


===========================================
Welcome to the AWS Java SDK!
===========================================
You have access to 3 Availability Zones.
You have 14 Amazon EC2 instance(s) running.
You have 0 Amazon SimpleDB domain(s) containing a total of 0 items.
You have 8 Amazon S3 bucket(s), containing 71841 objects with a total size of 224551364 bytes.



If you run the S3Sample.java, you will get the following:


===========================================
Getting Started with Amazon S3
===========================================

Creating bucket my-first-s3-bucket-39065c55-2ee5-413a-9de1-6814dbb253c1

Listing buckets
 - my-first-s3-bucket-39065c55-2ee5-413a-9de1-6814dbb253c1

Uploading a new object to S3 from a file

Downloading an object
Content-Type: text/plain
    abcdefghijklmnopqrstuvwxyz
    01234567890112345678901234
    !@#$%^&*()-=[]{};':',.<>/?
    01234567890112345678901234
    abcdefghijklmnopqrstuvwxyz

Listing objects
 - MyObjectKey  (size = 135)

Deleting an object

Deleting bucket my-first-s3-bucket-39065c55-2ee5-413a-9de1-6814dbb253c1


Integrate the S3 SDK

To begin, you need the file AwsCredentials.properties at the root of your classpath. You can copy the one generated during the sample project into your project's classpath, or create one with the following content:

secretKey=
accessKey=


Create an authenticated S3 object:

AmazonS3 s3 = new AmazonS3Client(new ClasspathPropertiesFileCredentialsProvider());

Objects in S3 are stored in buckets. Bucket names are globally unique: you cannot create a bucket with a name that another user has already taken. Each bucket contains key/value pairs that you can define in any way you want.


Create a bucket:

String bucketName = "my-s3-bucket-" + UUID.randomUUID();
s3.createBucket(bucketName);

For readability, I have skipped the exception handling; I will come back to it at the end. The name of the bucket must conform to DNS naming rules. I usually name mine using my domain name.
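You can sanity-check those DNS rules locally before calling createBucket. The helper below is my own rough sketch (the class name and regex are mine, not part of the SDK), covering the common rules: 3-63 characters, lowercase letters, digits, hyphens and dots, beginning and ending with a letter or digit, and no consecutive dots.

```java
import java.util.regex.Pattern;

// Hypothetical pre-flight check for DNS-compatible S3 bucket names.
public class BucketNames {

    // Lowercase letters, digits, dots and hyphens; must start and
    // end with a letter or digit.
    private static final Pattern DNS_NAME =
            Pattern.compile("^[a-z0-9](?:[a-z0-9.-]{1,61}[a-z0-9])?$");

    public static boolean isDnsCompatible(String name) {
        return name.length() >= 3
                && name.length() <= 63
                && !name.contains("..")
                && DNS_NAME.matcher(name).matches();
    }
}
```

This is not exhaustive (for example, S3 also rejects names formatted like IP addresses), so treat it as a quick pre-flight check rather than a guarantee.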


Delete a bucket:

s3.deleteBucket(bucketName);


List all buckets:

for (Bucket bucket : s3.listBuckets()) {
    System.out.println(" - " + bucket.getName());
}


Save an object in a bucket:

String key = "myObjectKey";

PutObjectRequest putObject = new PutObjectRequest(bucketName, key, myFile);
s3.putObject(putObject);

Above, myFile is an instance of java.io.File.


Delete an object:

s3.deleteObject(bucketName, key);


Get/Download an object:

String key = "myObjectKey";
GetObjectRequest getObject = new GetObjectRequest(bucketName, key);
S3Object object = s3.getObject(getObject);


List objects by prefix:

ObjectListing objectListing = s3.listObjects(new ListObjectsRequest()
                    .withBucketName(bucketName)
                    .withPrefix("My"));
for (S3ObjectSummary objectSummary : objectListing.getObjectSummaries()) {
    System.out.println(" - " + objectSummary.getKey() + "  " +
                                   "(size = " + objectSummary.getSize() + ")");
}


Uploading large files

Use TransferManager whenever possible. It uses S3 multipart uploads to achieve better throughput, performance, and reliability, uploading multiple parts of a single object concurrently across multiple threads.

AWSCredentials myCredentials = new BasicAWSCredentials(...);
TransferManager tx = new TransferManager(myCredentials);
Upload myUpload = tx.upload(myBucket, myFile.getName(), myFile);

while (!myUpload.isDone()) {
    System.out.println("Transfer: " + myUpload.getDescription());
    System.out.println("  - State: " + myUpload.getState());
    System.out.println("  - Progress: " + myUpload.getProgress().getBytesTransfered());
    // Do work while we wait for our upload to complete...
    Thread.sleep(500);
}


Exceptions

Whenever you call any of the AWS APIs, you should surround the calls with try/catch blocks like the following:

try {
    // AWS requests here

} catch (AmazonServiceException ase) {
    System.out.println("Caught an AmazonServiceException, which means your request made it "
            + "to Amazon S3, but was rejected with an error response for some reason.");
    System.out.println("Error Message:    " + ase.getMessage());
    System.out.println("HTTP Status Code: " + ase.getStatusCode());
    System.out.println("AWS Error Code:   " + ase.getErrorCode());
    System.out.println("Error Type:       " + ase.getErrorType());
    System.out.println("Request ID:       " + ase.getRequestId());
} catch (AmazonClientException ace) {
    System.out.println("Caught an AmazonClientException, which means the client encountered "
            + "a serious internal problem while trying to communicate with S3, "
            + "such as not being able to access the network.");
    System.out.println("Error Message: " + ace.getMessage());
}


If you are interested in securing your S3 content for your authenticated users only, check out AWS Java - Securing S3 content using query string authentication.
