Question. Suppose you’ve got a server (Linux, of course) that’s at 94% storage capacity and you want to remotely back up 40% of it, with no physical access. But because there are hundreds of thousands of files on that bad sucka, an FTP download would take forever, you dig?
So you want to tar it all up, and because you’re afraid you’ll get disorganized and you have limited time you don’t want to chop up the 40% into a bunch of tars to grab and delete one by one. Ain’t nobody got time for that.
What you gonna do?
The voices in my head argue that it has got to be theoretically possible and surely others have been in this situation, which by my math means someone figured it out, and possibly like this: You somehow connect your home computer to your server, perhaps involving SCP and SSH. You get the server to start tarring or catting the files.
But here’s the magic, you ready boss?
Tar (or cat) doesn’t write anything to the server’s disk; instead it pipes its output over that connection, with your computer on the other end collecting the bytes and happily writing them into a tar file. Something like that. Maybe?
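That is, as best I can tell, exactly the trick, and SSH makes it nearly a one-liner. A minimal sketch (the `user@server` address and directory names below are placeholders, not from the question): tar writes its archive to stdout when you name the archive `-`, and the shell redirects that byte stream wherever you point it. The local demo shows the principle without needing a server:

```shell
# Local demonstration of the principle: "-" tells tar to write the
# archive to stdout, and the shell redirects the stream into a file.
mkdir -p /tmp/demo_src
echo "hello" > /tmp/demo_src/file.txt
tar czf - -C /tmp demo_src > /tmp/demo.tar.gz

# Verify the archive by listing its contents.
tar tzf /tmp/demo.tar.gz

# The remote version is the same idea, with ssh carrying the stream
# across the wire. Nothing is written to the server's disk; the tar
# file materializes only on your end.
# (user@server and /var/www are hypothetical placeholders.)
#
#   ssh user@server 'tar czf - /var/www' > www-backup.tar.gz
```

Once the archive lands on your machine, a `tar tzf` listing against it is a cheap sanity check before you start deleting anything server-side.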