cmd line help

About writing shell scripts and making the most of your shell
Forum rules
Topics in this forum are automatically closed 6 months after creation.
Phuz

cmd line help

Post by Phuz »

Greetings!
I use the cmd line so infrequently that I need some help with ssh.
I have a website that is 30 GB, 9000+ pages and many directories deep. Filezilla fails when attempting to download the site for backup; it gets 2 dirs down and chokes.

I think what I need to do is go in with ssh and create a giant tar.gz file of the public directory and save it to a directory I made called "backup". Then, though it'll take a while, Filezilla should be able to grab that.

So I am requesting the input to tar.gz all of /public and deposit it in /backup.

Thanks in advance!
Last edited by LockBot on Wed Dec 28, 2022 7:16 am, edited 1 time in total.
Reason: Topic automatically closed 6 months after creation. New replies are no longer allowed.
Habitual

Re: cmd line help

Post by Habitual »

You would be correct, try:

Code: Select all

tar -pczvf /backup/archive.tar.gz /public &
in a terminal on the remote host.
This will send it to the "background" until it finishes, so your terminal isn't tied up while it runs.
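One caveat worth adding: a job backgrounded with & is still attached to the ssh session, and many servers will kill it when you disconnect. A minimal sketch using nohup to avoid that, demonstrated here against throwaway /tmp paths rather than the real /public and /backup:

```shell
# Build a tiny stand-in tree so this runs anywhere; on the real server
# you would point tar at /public and /backup instead of the /tmp paths.
mkdir -p /tmp/demo/public /tmp/demo/backup
echo "index" > /tmp/demo/public/index.html

# nohup detaches the job from the terminal's hangup signal, so a dropped
# ssh session no longer kills the archive run.
nohup tar -pczf /tmp/demo/backup/archive.tar.gz -C /tmp/demo public \
    > /tmp/demo/tar.log 2>&1 &
wait  # in a real session you would simply log out instead of waiting

# List the archive contents to confirm it was written.
tar -tzf /tmp/demo/backup/archive.tar.gz
```

On the real server you'd substitute the real paths, start the job, and then log out safely.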

Code: Select all

ls -hl /backup/archive.tar.gz
will show you its growth.
Phuz

Re: cmd line help

Post by Phuz »

Thanks so much Habitual!!!

You don't happen to know the directory depth limitations of Filezilla do you? I'm guessing that's been the issue...
jimallyn
Level 19
Posts: 9075
Joined: Thu Jun 05, 2014 7:34 pm
Location: Wenatchee, WA USA

Re: cmd line help

Post by jimallyn »

wget goes 5 levels deep by default, but it can be set to any number you like.
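For reference, that depth override looks like this (a sketch only; the URL is a placeholder, not the poster's site, and the destination directory is arbitrary):

```shell
# -r recurses through links, -l inf lifts the default 5-level depth
# limit, and --no-parent keeps wget from climbing above the start URL.
wget --recursive --level=inf --no-parent \
     --directory-prefix=/tmp/mirror http://example.com/
```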
“If the government were coming for your TVs and cars, then you'd be upset. But, as it is, they're only coming for your sons.” - Daniel Berrigan
Phuz

Re: cmd line help

Post by Phuz »

It went what looked like most of the way through creating the .tar file and then gave the message "tar: Error exit delayed from previous errors". I scrolled all the way back up through the terminal output looking for any error messages and found none.

Thoughts? All help appreciated!
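One way to find errors that get buried in thousands of lines of -v output is to redirect tar's stderr to a log file. A sketch using throwaway /tmp paths and a deliberately missing file name:

```shell
# One real file plus one missing name, to provoke a per-file tar error.
mkdir -p /tmp/terr
echo "hi" > /tmp/terr/ok.txt

# 2> sends stderr (the error messages) to a log; "|| true" keeps the
# non-zero tar exit status from aborting a script run under set -e.
tar -czf /tmp/terr/out.tar.gz -C /tmp/terr ok.txt missing.txt \
    2> /tmp/terr/errors.log || true

# The log names exactly which files tar choked on.
cat /tmp/terr/errors.log
```

Running the real command with 2> /backup/tar-errors.log would make the culprit files easy to spot afterwards.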
jimallyn

Re: cmd line help

Post by jimallyn »

Got your email, Phuz, and have responded to it.
rene
Level 20
Posts: 12212
Joined: Sun Mar 27, 2016 6:58 pm

Re: cmd line help

Post by rene »

Please note that if you already have ssh access to the server you might as well use scp or, better, rsync to download the files to your own system. That is, if you normally log in to your server as, say, ssh myserver, and what you need to get is the directory "~/public" on it, you can simply use, on your own machine,

Code: Select all

rsync -av myserver:~/public /where/ever/backup/
to verbosely synchronize the ~/public directory on "myserver" to the local directory /where/ever/backup/public. If a regular ssh login asks for a password (i.e., if you're not logging in through a public key) so will that rsync, and conversely, if not then neither. Syncing it back after doing local changes is of course just as easy.
Habitual

Post by Habitual »

Phuz wrote:Went what looked like most of the way through creating the .tar file and gave message "tar: Error exit delayed from previous errors". Scrolled all the way back up through the terminal output lines looking for any error messages and found none.

Thoughts? All help appreciated!
You need help with what, exactly: ssh, Filezilla, recursion depth, or the tar errors? Help can't begin until we know....

Disk Space...?
Do you know how to check?
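For completeness, the usual quick checks look like this (paths here are illustrative; substitute the real /backup and /public):

```shell
# Free space on the filesystem that will hold the archive
# (on the real server: df -h /backup).
df -h /tmp

# Total size of the tree being archived, for comparison
# (on the real server: du -sh /public).
mkdir -p /tmp/sizedemo
du -sh /tmp/sizedemo
```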
Phuz

Re:

Post by Phuz »

Habitual wrote:
Phuz wrote:Went what looked like most of the way through creating the .tar file and gave message "tar: Error exit delayed from previous errors". Scrolled all the way back up through the terminal output lines looking for any error messages and found none.

Thoughts? All help appreciated!
You need help? With ssh, filezilla, recursive, or tar errors?
Help doesn't begin....

Disk Space...?
Do you know how to check?
Disk space on the server where I'm trying to remotely .tar.gz this directory cluster is not an issue. The errors occur while the command

Code: Select all

tar -pczvf /backup/archive.tar.gz /public &
is running.

Yes, I know how to check space. Filezilla won't dig deep enough through the directory structure. I think the error is gz-related at this point.

I'm going to try just making a tarball with no compression and see if it succeeds. At least then, it will essentially be one file to download.
Let you know.

Failed again at same point... here is screen shot
http://stevespages.com/Not/TarError.png
Last edited by Phuz on Fri Mar 24, 2017 6:32 pm, edited 1 time in total.
rene

Re: cmd line help

Post by rene »

Phuz wrote:I'm going to try just making a tarball with no compression and see if it succeeds.
rsync too difficult?
Phuz

Re: cmd line help

Post by Phuz »

Never used it... I'll review and test

Definitely nervous about it, as there is currently no backup for this 16-year collection of data that would be impossible to duplicate. That's why I'm desperate to get a transfer done.
Phuz

Re: cmd line help

Post by Phuz »

OK, just saw an issue with rsync, which is one reason I've been trying to create a .tar.gz or even just a .tar.

I am on a crappy DSL line and, if I understand correctly, since rsync is going to get one file at a time (like ftp), it equates to a 3-day download due to start/end sequences on each file. Am I understanding rsync wrong?

I have a 300 Mb-down/30 Mb-up line coming, but it's not here yet. Limited to 12 Mb/s down right now. This may be a partial reason ftp has also failed. ???
rene

Re: cmd line help

Post by rene »

Well, yes, rsync is indeed going to transfer files one at a time, but I believe you are overestimating the overhead involved. Moreover, generally speaking rsync is advisable in circumstances where there might be interruptions/dropouts, as seems might be the case for you. I tend to advise thinking of rsync as "a better cp" rather than as anything special, but certainly one of those better things is that you can just restart a failed/interrupted rsync (with the same options) and have it pick up from where it left off. The "one giant file" scenario is much more fragile in that situation.

I'd just start the rsync, syntax provided above, and if you find it interrupted, just repeat the same command to have it pick up from where it left off. Make the "-av" be "-avz" to use compression; possibly useful in your situation.
Phuz

Re: cmd line help

Post by Phuz »

Thanks Rene! I do have a 20-times-fatter pipe coming tomorrow. I'll tug on the rsync handle then and see what happens. With all the issues I've had trying to get this archive d/l'd, it could be this crummy circuit I'm on, or it could be a corrupt file on the other end (though I doubt it, since the troubles have appeared at different locations, download-wise...).

I'll post back after attempting again tomorrow afternoon. New pipe install scheduled for 8-10 am.
Habitual

Re: Re:

Post by Habitual »

Phuz wrote:Failed again at same point... here is screen shot
http://stevespages.com/Not/TarError.png
Try it as root.

See also Simple, Secure Backups for Linux with rsync
Phuz

Re: Re:

Post by Phuz »

Habitual wrote:
Phuz wrote:Failed again at same point... here is screen shot
http://stevespages.com/Not/TarError.png
Try it as root.

See also Simple, Secure Backups for Linux with rsync
Tried 7 times today. Tarball creation has failed and created 6 different sizes of files, anywhere from 2.4 GB to 25.2 GB. Tried rsync with su and it told me permission denied; it didn't say wrong password. I am going to be hollering at their tech support tomorrow. Just got off the phone and the only tech online said "I don't know any Linux or Unix". They are a Unix-based server farm, for crying out loud. They have now had webmail shut down for all 9 domains I have over there, for 2 weeks, saying it "should" be fixed on the 29th or later with the new version. So now MY customers are calling and complaining. I've built forwards from the webmail interface for all of them and have them on POP clients, which is my personal preference anyway... but GEEZ!

I'm going to tell them tomorrow that if their level 2 guys can't/won't even make a clean zip file of the main site and drop it in a directory for me to D/L, I'll go elsewhere.

So, who has great hosting with phone support? :)

Sorry for ranting, just getting really frustrated at not being able to get what should be a simple backup. Also with my MUCH higher speed today, discovered they have ftp speed capped at 20 Mb/s.
rene

Re: cmd line help

Post by rene »

Phuz wrote:Tried rsync with su and it told me permission denied.
Don't use su(do). Why are you? You said/indicated you have ssh access to the server and are using it from the command line. How do you log in with ssh to your server? If it's as ssh myserver you can quickly test things: from your own machine, and without any sort of su(do), use

Code: Select all

$ echo example >example.txt
$ rsync -av example.txt myserver:~/
$ rsync -av myserver:~/example.txt example.new 
$ cat example.new
Both rsyncs should ask you for your password if the regular "ssh myserver" also does, and not if it doesn't. Anything exciting to report? If not, the previously quoted method of grabbing whatever it is that you need to get from "myserver" will work as well. Note, there's no magic: "myserver:~/" is just your home directory on "myserver". If you need to grab, say, /var/www/mysite.com just use "myserver:/var/www/mysite.com".
Habitual

Re: Re:

Post by Habitual »

Phuz wrote:I'm going to tell them tomorrow if their level 2 guys can't/won't even make a clean zip file of the main site and drop it in a directory for me
to D/L I'll go elsewhere.

So, who has great hosting with phone support? :)

Sorry for ranting, just getting really frustrated at not being able to get what should be a simple backup. Also with my MUCH higher speed today, discovered they have ftp speed capped at 20 Mb/s.
Leave "support" alone. You can do this without them, IF this isn't a shared hosting environment. eg: Do you "own" the server and/or have root privileges?

http://stevespages.com/Not/TarError.png shows a dollar sign prompt after the failed tar command.
This is a user environment.

Also, do you as the "user" have rights to write to /backup and /public?

There are at least two possible reasons for the failure.

Elevate

Code: Select all

sudo su -


Operate

Code: Select all

tar -pcvf /backup/archive.tar /public &
if you're not concerned about saving disk space. Adding the -z switch enables gzip compression, which will save disk space and transfer time:

Code: Select all

tar -pczf /backup/archive.tar.gz /public &
Both of these send the tar process into the background with "&", because do you really want to watch "30 GB, 9000+" pages fly by? Lose the -v (verbose)? Yawn.

Code: Select all

man tar
-p preserve permissions
-c create
-f file (the archive name; keep -f last so the filename can follow it)
-z gzip compression
-v verbose


Investigate

Code: Select all

lsof +D /backup/
You could just as easily do this with

Code: Select all

lsof +D /public
also.
If tar is still running, it will show in the output here; otherwise the output is empty.
Wait 3 minutes and check again.
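That wait-and-recheck step can be scripted as a simple poll (a sketch; it assumes lsof is installed and uses the thread's /backup path):

```shell
# Poll until no process holds any file open under the directory;
# lsof exits non-zero when it finds nothing, which ends the loop.
while lsof +D /backup/ > /dev/null 2>&1; do
    sleep 180   # the suggested 3 minutes between checks
done
echo "tar has finished (or never started)"
```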

Code: Select all

su - 
sudo su - 
There is a difference

This server, shared? "shell40" doesn't bode well for your login. It looks like you need root, and since su failed, try sudo su -
If it's shared hosting, is there a "Control Panel" like cPanel/WHM or whatnot?

That should get you started.
Good Luck.
Phuz

Re: cmd line help [solved]

Post by Phuz »

OK. Got it solved. There were many issues, mostly behind the scenes. First, I finally reached a guy there on phone support who used to admin a Solaris network and actually knew what he was talking about, vs the other guys that work there on support. To cut server overhead they had instituted a process timeout, which was responsible for the ssh failures (when tar actually tried to capture the entire /public directory). Second, they had my download bandwidth capped at 20 Mb/s, so FTP would time out long before I finished a 30 GB transfer. He (Solaris guy) talked to senior management, and senior management jumped on the chests of a bunch of lazy level 2 guys there.

Problem was solved! Happy camper now! Have a full backup (including the cgi-bin stuff) and have it saved here. In the last week I have added 3.5 more GB of data (mostly .pdfs) over there and have that all on DVD, so in case of absolute catastrophe, I can still recover. This is good!!! I can get back (and have) to my real biz of making knives.

Thanks SO much to those who contributed here... it has not fallen on deaf ears! :mrgreen: Super Green (5th Element)

Chris
Phuz

Re: cmd line help

Post by Phuz »

Habitual,
Many thanks for the options printout. I've saved it and will be using it. After all, practice makes perfect, and I go by the 3-time rule these days: 1st time is introduction, doesn't mean I can duplicate it; 2nd time is refresher; 3rd time it sticks!

Return to “Scripts & Bash”