One Liners

Move it your way

John Marc Imbrescia
2 min read · Nov 12, 2013


Since 2004 I have run a web server for my friends. Recently I wanted to back up all of their domains to a remote server. The requirements are thus:

  • Source: /var/www/vhosts/*
  • Destination: remote.com:/data/bak/
  • There is not enough space on the Source machine to compress all the domains and then transfer them.
  • The domains should be in a domain.tar.gz file on the Destination
  • The domains should be compressed as they are sent over the wire
  • Need to run sudo on Source to see every file

Solution

formulated by Jayson Paul @JaysonMPaul

On the source machine run:

cd /var/www/vhosts; for i in $(sudo find . -maxdepth 1 -mindepth 1 -type d -print); do DEST=$(echo "$i" | cut -d'/' -f2); sudo tar czf - "$i" | ssh jm@remote.com "dd of=/data/source.bak/$DEST.tar.gz"; done

Requires passwordless sudo for find and tar on the Source machine, and key-authenticated ssh to the Destination.
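One caveat with the loop above: the $(sudo find ...) substitution word-splits, so a directory name containing a space would break it. A glob-based variant avoids that. Below is a local dry run of the same pattern: sudo and the ssh hop are dropped, and throwaway /tmp paths stand in for the real vhosts and backup locations.

```shell
# Local stand-in for /var/www/vhosts, including a name with a space
mkdir -p "/tmp/vhosts_demo/a.example.com" "/tmp/vhosts_demo/b example.net" /tmp/vhosts_demo/bak
cd /tmp/vhosts_demo
for i in */; do
  d=${i%/}                      # strip the trailing slash the glob adds
  [ "$d" = bak ] && continue    # skip the destination directory itself
  tar czf "bak/$d.tar.gz" "$d"  # tar to a local file; the real loop pipes to ssh instead
done
ls bak
```

The glob never splits on whitespace, so "b example.net" comes through intact; in the real command you would keep the `sudo tar czf - "$d" | ssh ...` pipeline as the loop body.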

We debated replacing "dd of=" with "cat - >", but dd just seems like it should be faster.
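As a sink for a pipe, dd and cat do the same job: copy stdin to a file byte for byte. A quick local check (throwaway /tmp paths, with the ssh hop removed) confirms the resulting archives are identical either way:

```shell
# Build one archive, then replay it through both candidate sinks
mkdir -p /tmp/sink_demo/site.example.com
echo "index" > /tmp/sink_demo/site.example.com/index.html
cd /tmp/sink_demo
tar czf site.tar.gz site.example.com
cat site.tar.gz | dd of=dd_copy.tar.gz 2>/dev/null   # dd as the sink
cat site.tar.gz | cat - > cat_copy.tar.gz            # cat as the sink
cmp site.tar.gz dd_copy.tar.gz && cmp site.tar.gz cat_copy.tar.gz
```

One practical difference: dd defaults to 512-byte blocks, so something like `dd bs=1M of=...` can cut syscall overhead, though over ssh the network and gzip usually dominate anyway.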

Source output: as each directory finishes, dd prints its records in/out summary to the terminal, which doubles as a per-domain status update.

