How to copy simultaneously from multiple nodes using Fabric?

I have just started using Fabric, and it looks like a very useful tool. I was able to write a tiny script to run some commands in parallel on my Amazon EC2 hosts, something like this:

from fabric.api import parallel, sudo

@parallel
def runs_in_parallel():
    # sudo() already runs the command via sudo, so no "sudo" prefix is needed
    sudo("rm -rf /usr/lib/jvm/j2sdk1.6-oracle")

I have also written another script to copy all the Hadoop logs from all the EC2 nodes to my local machine. This script creates a folder named with the current timestamp, and inside it one folder per node, named after that node's IP address; it then copies each node's logs into its IP-named folder (see the sketch after the tree below). E.g.:

2014-04-22-15-52-55
    50.17.94.170 
         hadoop-logs
    54.204.157.86  
         hadoop-logs
    54.205.86.22 
         hadoop-logs

Now I want to do this copy task using Fabric, so that I can copy the logs in parallel and save time. I thought I could do it the same way as in my first code snippet, but that won't help, since that approach runs commands on the remote server, whereas here the files need to be pulled down to my local machine. As of now I have no clue how to do this. Any help is much appreciated.


ANSWERS:


You could likely use the get() command to handle pulling down files. You'd want to make them into tarballs, and have them pull into unique filenames on your client to keep the gets from clobbering one another.
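Putting that together, a parallel fetch task might look like the sketch below. This assumes Fabric 1.x; the task name fetch_logs and the remote log location /var/log/hadoop are placeholders to adjust for your setup. Fabric's get() expands %(host)s in the local path, which gives each node its own IP-named folder, matching the layout in the question:

import time
from fabric.api import env, parallel, run, get

env.hosts = ["50.17.94.170", "54.204.157.86", "54.205.86.22"]

# One timestamped folder per run, shared by all the parallel tasks
RUN_DIR = time.strftime("%Y-%m-%d-%H-%M-%S")

@parallel
def fetch_logs():
    # Tar the logs on the node so each host transfers a single file
    # (/var/log/hadoop is an assumed location; adjust to your setup)
    run("tar czf /tmp/hadoop-logs.tar.gz -C /var/log/hadoop .")
    # Fabric expands %(host)s per node, so the parallel downloads
    # land in separate folders and cannot clobber one another
    get("/tmp/hadoop-logs.tar.gz",
        RUN_DIR + "/%(host)s/hadoop-logs.tar.gz")

Run it with "fab fetch_logs"; each host's tarball lands in its own IP-named folder under the timestamped directory, and get() creates the missing local directories for you.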



 MORE:


 - Running Fabric from an ec2 instance, to another ec2 instance
 - Python Fabric is hanging
 - Best way to store auth credentials on fabric deploys?
 - Python Fabric Parallel Execution Failure on EC2: Updated
 - Is there possible execute function while fabric SUDO is running
 - python fabric prompted for password everytime i execute a sudo command
 - Possible to use conditional execution in Fabric?
 - execute as sudo in fabric