Transfer Google Drive data between accounts using Google Apps Script

I had a number of important files on a Google Drive account administered by another organization. Now that I’ve completed the job, I believe that account will be deleted in the near future, and I’m hoping to move my files to a Google Drive account administered by me.

If your operating system supports the Google Drive application (as Windows and Mac do), you can share the files with the other account (where you want the files to end up). You still won’t be the owner of those shared files, but with the Google Drive application you can copy the shared folder into another folder and you are done. This process is time-consuming, though, and it won’t work at all on operating systems that don’t have a Google Drive application. In my case, on Ubuntu, there was no way to do it, but fortunately I found a great solution to this problem.

Here are the steps to make this happen.

Using the Google Drive web interface:

  1. Create a new folder and name it whatever you want.
  2. Go into the pre-existing folders/files, select all the files, and move them into the newly created folder.
  3. Share that folder with the destination account (where the files need to go).
  4. Log in to the destination account; in the Google Drive interface you will see the shared folder under the “Shared with me” link.
  5. In Google Drive there’s no easy way to duplicate a folder. It is possible to make copies of individual files, but there is no command for creating a duplicate folder that mirrors another folder or a folder shared with you. Fortunately we can solve this problem with Google Apps Script. Below is a piece of JavaScript that duplicates a nested folder in Drive.
  6. Visit the Google Apps Script page, click “Start Scripting”, and paste the following code.
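A minimal sketch of such a script could look like the following. It runs inside the Apps Script editor (not in Node), uses Google’s `DriveApp` service, and the two folder IDs as well as the function names are placeholders you would replace with your own:

```javascript
// Entry point: select "duplicate" in the Apps Script function list and run it.
// Replace the two IDs with your source and destination folder IDs.
function duplicate() {
  var source = DriveApp.getFolderById('SOURCE_FOLDER_ID');
  var target = DriveApp.getFolderById('TARGET_FOLDER_ID');
  copyFolder(source, target);
}

// Recursively copy every file and subfolder from source into target.
function copyFolder(source, target) {
  var files = source.getFiles();
  while (files.hasNext()) {
    var file = files.next();
    file.makeCopy(file.getName(), target);
  }
  var folders = source.getFolders();
  while (folders.hasNext()) {
    var sub = folders.next();
    var newSub = target.createFolder(sub.getName());
    copyFolder(sub, newSub);
  }
}
```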

To run this code successfully you will need to grant the script permission to access Google Drive. Then, from the list of functions, select the duplicate function and click Run; this will copy the source folder and its files into the destination folder.

ECMAScript Harmony – Compile ES6 code to ES5 using Traceur/Babel

ECMAScript Edition 6 (ES6) comes with a lot of features such as classes, generators, destructuring, modules, and much more. ES6 is not yet implemented/supported in all browsers and JS engines; you might want to look here. Using the latest Firefox and Chrome developer tools you can test features like arrow functions; for Chrome you will also need to enable the harmony flag: chrome://flags/#enable-javascript-harmony.

It might be a bit early, but if you want to use the latest ES6 features you will need a compiler that compiles your ES6 code down to regular JavaScript that can run in today’s browsers. Currently the two most popular compilers for this purpose are Traceur and Babel.


You can try Traceur in several ways:

  • Include Traceur in a Web page and it will compile ES6 code on the fly.
  • Or use node to compile ES6 to ES5 offline and include the result in Web pages or just run the result in node.

For the browser, you can include the following scripts and Traceur will convert your code automatically.
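A sketch of the page setup (the script paths are assumptions; adjust them to wherever you keep the Traceur build):

```html
<!-- First the Traceur compiler itself, then the small bootstrap
     script that applies it to the page. -->
<script src="bin/traceur.js"></script>
<script src="src/bootstrap.js"></script>

<!-- Traceur compiles any script tagged type="module" on the fly. -->
<script type="module">
  var squares = [1, 2, 3].map(n => n * n);
  console.log(squares); // [1, 4, 9]
</script>
```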

In the above example, the first file included is the Traceur compiler; next comes a small file of JS that applies the compiler to the web page. Traceur itself is written in ES6. My simple example manipulates an array using the higher-order function .map, which takes an arrow function as its callback.

You can also import ES6 modules.
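For example, a `script` of type `module` can import from another file (here `greeter.js` and its `sayHello` export are hypothetical names used for illustration):

```html
<script type="module">
  import {sayHello} from './greeter.js';
  sayHello('world');
</script>
```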

The other way is offline compilation, for which you need Node installed. Traceur includes a shell script, traceur, that compiles ES6 code down to ES5/regular JavaScript.
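Something along these lines (a global install is assumed so the `traceur` command is on your PATH, and `module.js` is the ES6 file being compiled):

```shell
# Install the Traceur compiler and its CLI.
npm install -g traceur

# Compile the ES6 file module.js down to ES5 in out/module.js.
traceur --out out/module.js module.js
```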

We can test the module.js we just compiled like this:
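One way, sketched here for a web page, is to include Traceur’s runtime file alongside the compiled output (the path assumes Traceur was installed via npm):

```html
<!-- The compiled ES5 code still depends on Traceur's small runtime. -->
<script src="node_modules/traceur/bin/traceur-runtime.js"></script>
<script src="out/module.js"></script>
```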

This runtime file contains polyfills as well as helper functions that reduce the generated code size. You can find out more about Traceur here.


Babel is another compiler for ECMAScript 6.

You will need to include the following files in your web page.

  • node_modules/babel-core/browser.js
  • node_modules/es6-module-loader/dist/es6-module-loader.js

The following is an example of importing an ES6 module.
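A sketch, assuming the two files listed above are on the page and that `greeter` is a hypothetical module exposing a `sayHello` export:

```html
<script src="node_modules/babel-core/browser.js"></script>
<script src="node_modules/es6-module-loader/dist/es6-module-loader.js"></script>

<script>
  // Tell the module loader to transpile with Babel (it defaults to Traceur).
  System.transpiler = 'babel';

  // Load and transpile the ES6 module on the fly.
  System.import('greeter').then(function (m) {
    m.sayHello('world');
  });
</script>
```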

Let’s set up a development environment and start using ES6 with Babel in your applications. You’ll need to install the following presets and plugins.
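Installed locally via npm, the setup would look roughly like this:

```shell
# Babel's CLI plus the ES6 preset and the experimental stage-0 preset.
npm install --save-dev babel-cli babel-preset-es2015 babel-preset-stage-0
```

You can then enable the presets project-wide with a `.babelrc` file containing `{ "presets": ["es2015", "stage-0"] }`.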

babel-preset-es2015 covers ES6, and babel-preset-stage-0 covers experimental ES7 features. The latter is not required in our case, but it may be a good idea to have it.

A simple Express application written in ES6 could be:
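For example (a minimal sketch saved as `express.js`; it assumes Express is installed via `npm install express`):

```javascript
// express.js — ES6 modules, const, and arrow functions throughout.
import express from 'express';

const app = express();

app.get('/', (req, res) => res.send('Hello ES6'));

app.listen(3000, () => console.log('listening on 3000'));
```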

Running `babel express.js -o out/express.js --presets es2015` transpiles the ES6 code from express.js into out/express.js, producing normal JavaScript that can run in ES5-and-below environments.

The transpiled output looks like this:
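Roughly like the following, for the small Express app above (the exact output varies by Babel version; the `import` becomes a `require` call and the arrow functions become plain functions):

```javascript
'use strict';

var _express = require('express');

var _express2 = _interopRequireDefault(_express);

function _interopRequireDefault(obj) { return obj && obj.__esModule ? obj : { default: obj }; }

var app = (0, _express2.default)();

app.get('/', function (req, res) {
  return res.send('Hello ES6');
});

app.listen(3000, function () {
  return console.log('listening on 3000');
});
```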

It is also possible to compile a whole directory of ES6 files in the following way:
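With babel-cli this is a single command (the `src` and `out` directory names are assumptions):

```shell
# Transpile every file under src/ into out/, preserving the directory tree.
babel src --out-dir out --presets es2015
```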


Node.js Streams, Pipe and chaining

A stream is an abstract interface implemented by various objects in Node.js. Thanks to its asynchronous, event-driven nature, Node.js is very good at handling I/O-bound tasks; streams are conceptually similar to Unix pipes.

For example, a request to an HTTP server is a stream, as is stdout. The request is a readable stream and the response is a writable stream. Streams can be Readable, Writable, or Duplex (both readable and writable). Readable streams let you read data from a source, while writable streams let you write data to a destination. A duplex stream is one that is both readable and writable, such as a TCP socket connection.

All streams are instances of EventEmitter. Take a look at the following example.

The function fs.createReadStream() gives you a readable stream object. On that object we listen for the data event with an attached callback: as chunks of data are read and passed to the callback, we append them to our text string. At the end, when there is no more data to read, the stream emits an end event; in the code above we listen for this event with another callback and log the text string.

You can set the encoding on the stream by calling readStream.setEncoding(). As a result, the data is interpreted as UTF-8 and passed to your callback as a string.

At this point we also have an interesting function called pipe(), which transfers a stream’s flow from a source to a destination. Below, we pipe a readable stream into a writable stream.

So now we have piped the readable stream of file1.txt into a writable stream for file2.txt. pipe() manages the data flow; you don’t have to handle it yourself with callbacks.

Pipes also support chaining, so multiple destinations can be piped safely.

In the example above we piped readStream into gzip and then into a writable stream. A key purpose of pipe() is to limit the buffering of data to acceptable levels, so that sources and destinations of differing speeds do not overwhelm the available memory.

Angular.js Promises – Deferred and Promise handling

A promise is an object that represents the return value or the thrown exception that the function may eventually provide. A promise can also be used as a proxy for a remote object to overcome latency.

Promises are highly useful when you wish to synchronize multiple asynchronous functions and also want to avoid JavaScript callback hell. The following is an asynchronous example.
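A sketch of the callback style (the `getUser`/`getOrders` functions are hypothetical, with setTimeout standing in for real I/O; the point is the nesting):

```javascript
// Each async step takes a Node-style callback(err, result).
function getUser(id, callback) {
  setTimeout(function () { callback(null, { id: id, name: 'alice' }); }, 10);
}
function getOrders(user, callback) {
  setTimeout(function () { callback(null, ['order1', 'order2']); }, 10);
}

var result;
getUser(1, function (err, user) {
  if (err) throw err;
  // Each dependent step nests one level deeper: callback hell.
  getOrders(user, function (err, orders) {
    if (err) throw err;
    result = user.name + ' has ' + orders.length + ' orders';
    console.log(result);
  });
});
```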

An alternative, using promises, for the above example is:
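The same hypothetical steps rewritten to return promises (sketched here with native Promises; a library such as Q offers the same shape):

```javascript
function getUser(id) {
  return new Promise(function (resolve) {
    setTimeout(function () { resolve({ id: id, name: 'alice' }); }, 10);
  });
}
function getOrders(user) {
  return new Promise(function (resolve) {
    setTimeout(function () { resolve(['order1', 'order2']); }, 10);
  });
}

var result;
getUser(1)
  .then(function (user) {
    return getOrders(user).then(function (orders) {
      result = user.name + ' has ' + orders.length + ' orders';
      console.log(result);
    });
  })
  .catch(function (err) {
    // A single catch handles failure from any step in the chain.
    console.error(err);
  });
```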

Angular.js provides a service, $q, with an implementation of promises/deferred objects inspired by Kris Kowal’s Q. You can simply start using it as a service.
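A sketch of a service function using $q (this lives inside Angular code where `$q` and `$http` are injected; the URL is a placeholder):

```javascript
function getUser(id) {
  var deferred = $q.defer();

  $http.get('/api/users/' + id).then(
    function (response) {
      deferred.resolve(response.data); // success: fulfil with the data
    },
    function (reason) {
      deferred.reject(reason);         // failure: reject with the reason
    }
  );

  return deferred.promise;             // hand the caller the promise
}

// Consuming the promise:
getUser(1).then(
  function (user)   { console.log(user); },
  function (reason) { console.log('failed: ' + reason); }
);
```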

In the code above, getUser() creates a $q.defer() object and returns its promise at the end of the function; this is the proxy we called a remote object earlier. deferred.resolve provides the result in case of success, and deferred.reject tells us the reason why it failed.

$q.defer() also has a method, notify(), which reports the status of the promise’s execution. It may be called multiple times before the promise is either resolved or rejected.
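For example (again inside Angular code with `$q` injected; the countdown is just an illustration):

```javascript
function countdown(from) {
  var deferred = $q.defer();
  var i = from;
  var timer = setInterval(function () {
    if (i > 0) {
      deferred.notify(i--);        // progress: may fire many times
    } else {
      clearInterval(timer);
      deferred.resolve('done');    // final value: fires once
    }
  }, 100);
  return deferred.promise;
}

// then(success, error, notify) — the third callback receives progress.
countdown(3).then(
  function (value)    { console.log(value); },
  null,
  function (progress) { console.log('tick ' + progress); }
);
```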

You can also wrap multiple promises using $q.all. The all function combines multiple promises into a single promise, and later on you can iterate through the responses.
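Sketched with the hypothetical getUser() from above:

```javascript
// $q.all resolves only once every promise in the array has resolved;
// the results arrive as an array in the same order.
$q.all([getUser(1), getUser(2), getUser(3)])
  .then(function (responses) {
    responses.forEach(function (user) {
      console.log(user);
    });
  });
```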

If you want to look at further examples of promises, see Kris Kowal’s Q and the Angular $q service documentation.