{"id":207,"date":"2008-02-03T20:12:42","date_gmt":"2008-02-04T04:12:42","guid":{"rendered":"http:\/\/www.khaitan.org\/blog\/?p=207"},"modified":"2008-02-03T20:12:42","modified_gmt":"2008-02-04T04:12:42","slug":"using-jets3t-to-upload-larger-number-of-files-to-s3","status":"publish","type":"post","link":"https:\/\/www.khaitan.org\/blog\/2008\/02\/using-jets3t-to-upload-larger-number-of-files-to-s3\/","title":{"rendered":"Using JetS3t to upload larger number of files to S3"},"content":{"rendered":"<p>I was looking for a tool to upload large number of files to S3. While I have been a great fan of the bash tools for browsing and accessing s3 objects and buckets and a managing a limited number of files &#8212; I could not find an easy way of uploading a large number of files (the first batch being around 800K).<\/p>\n<p>Then I downloaded <a href=\"http:\/\/jets3t.s3.amazonaws.com\/index.html\" title=\"JetS3t\">JetS3t<\/a>. It has a nice gui called Cockpit for managing the files on S3. The GUI is pretty neat. However, for simple upload\/download <a href=\"https:\/\/addons.mozilla.org\/en-US\/firefox\/addon\/3247\">S3 organizer<\/a>, a simple Firefox plugin does the job. If you need to extensively manage your files then JetS3t&#8217;s cockpit is the way-to-go.<\/p>\n<p>For uploading a large number of files, I was looking for something which is multi-threaded and configurable. JetS3t S3 suite has a &#8220;synchronize&#8221; application which is meant to synchronize files between a local PC and S3. JetS3t allows you to configure the number of threads and connections to the S3 service. Without reinventing the wheel, I got what\u00a0I wanted. However, one additional thing I needed was the ability to delete the local files once the upload was complete. 
After tinkering with the Java source, I modified Synchronize.java and added the following code fragments:<br \/>\n<code><br \/>\npublic void uploadLocalDirectoryToS3(FileComparerResults discrepancyResults, Map filesMap, Map s3ObjectsMap, S3Bucket bucket, String rootObjectPath, String aclString) throws Exception {<br \/>\n...<br \/>\n\u00a0 \/\/ collect the paths of regular files as they are queued for upload<br \/>\n\u00a0 List filesToDelete = new ArrayList();<br \/>\n...<br \/>\n\u00a0 if (!file.isDirectory()) {<br \/>\n\u00a0 \u00a0 filesToDelete.add(file.getPath());<br \/>\n\u00a0 }<br \/>\n...<\/code><\/p>\n<p><code>\u00a0 \/\/ delete the local files once the objects are on S3<br \/>\n\u00a0 for (Iterator ite = filesToDelete.iterator(); ite.hasNext();) {<br \/>\n\u00a0 \u00a0 String fName = (String) ite.next();<br \/>\n\u00a0 \u00a0 new File(fName).delete();<br \/>\n\u00a0 }<br \/>\n}<br \/>\n<\/code><\/p>\n","protected":false},"excerpt":{"rendered":"<p>I was looking for a tool to upload a large number of files to S3. While I have been a great fan of the bash tools for browsing and accessing S3 objects and buckets and managing a limited number of files &#8212; I could not find an easy way of uploading a large number of 
[&hellip;]<\/p>\n","protected":false},"author":2,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":[],"categories":[51,96,53],"tags":[123,326,312],"_links":{"self":[{"href":"https:\/\/www.khaitan.org\/blog\/wp-json\/wp\/v2\/posts\/207"}],"collection":[{"href":"https:\/\/www.khaitan.org\/blog\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.khaitan.org\/blog\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.khaitan.org\/blog\/wp-json\/wp\/v2\/users\/2"}],"replies":[{"embeddable":true,"href":"https:\/\/www.khaitan.org\/blog\/wp-json\/wp\/v2\/comments?post=207"}],"version-history":[{"count":0,"href":"https:\/\/www.khaitan.org\/blog\/wp-json\/wp\/v2\/posts\/207\/revisions"}],"wp:attachment":[{"href":"https:\/\/www.khaitan.org\/blog\/wp-json\/wp\/v2\/media?parent=207"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.khaitan.org\/blog\/wp-json\/wp\/v2\/categories?post=207"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.khaitan.org\/blog\/wp-json\/wp\/v2\/tags?post=207"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}