cmd line help
Forum rules
Topics in this forum are automatically closed 6 months after creation.
Greetings!
I use cmd line so infrequently, I need some help with ssh
I have a website that is 30 GB, 9000+ pages and many directories deep. Filezilla fails attempting to download the site for backup. Gets 2 dirs down and chokes.
I think what I need to do is go in with ssh and create a giant tar.gz file of the public directory and save it to a directory I made called "backup". Then, though it'll take a while, Filezilla should be able to grab that.
So I am requesting the input to tar.gz all of /public and deposit it in /backup.
Thanks in advance!
Last edited by LockBot on Wed Dec 28, 2022 7:16 am, edited 1 time in total.
Reason: Topic automatically closed 6 months after creation. New replies are no longer allowed.
Re: cmd line help
You would be correct, try:
Code: Select all
tar -pczvf /backup/archive.tar.gz /public &
in a terminal on the remote host. Note the -z: without it, tar writes an uncompressed archive even if you name the file .tar.gz.
This will send it to the "background" until it finishes, so you aren't trapped by the operation.
Code: Select all
ls -hl /backup/archive.tar.gz
will show you its growth.
Re: cmd line help
Thanks so much Habitual!!!
You don't happen to know the directory depth limitations of Filezilla do you? I'm guessing that's been the issue...
Re: cmd line help
wget goes 5 levels deep by default, but it can be set to any number you like.

“If the government were coming for your TVs and cars, then you'd be upset. But, as it is, they're only coming for your sons.” - Daniel Berrigan
Re: cmd line help
It went what looked like most of the way through creating the .tar file, then gave the message "tar: Error exit delayed from previous errors". I scrolled all the way back up through the terminal output looking for any error messages and found none.
Thoughts? All help appreciated!
Re: cmd line help
Got your email, Phuz, and have responded to it.
“If the government were coming for your TVs and cars, then you'd be upset. But, as it is, they're only coming for your sons.” - Daniel Berrigan
Re: cmd line help
Please note that if you already have ssh access to the server, you might as well use scp or, better, rsync to download the files to your own system. That is, if you normally log in to your server as, say,
Code: Select all
ssh myserver
and what you need to get is the directory "~/public" on it, you can simply use, on your own machine,
Code: Select all
rsync -av myserver:~/public /where/ever/backup/
to verbosely synchronize the ~/public directory on "myserver" to the local directory /where/ever/backup/public. If a regular ssh login asks for a password (i.e., if you're not logging in through a public key), then so will that rsync; conversely, if not, then neither will. Syncing it back after doing local changes is of course just as easy.
Phuz wrote:
Went what looked like most of the way through creating the .tar file and gave message "tar: Error exit delayed from previous errors". Scrolled all the way back up through the terminal output lines looking for any error messages and found none.
Thoughts? All help appreciated!

You need help? With ssh, filezilla, recursive, or tar errors?
Help doesn't begin....
Disk Space...?
Do you know how to check?
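For anyone following along, checking is a one-liner on most Linux servers. A generic sketch; /backup and /public are just the paths used in this thread, so substitute your own:

```shell
# Free space on the filesystem holding the backup target
df -h /backup
# Total size of the tree you want to archive, for comparison
du -sh /public
```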
Re:
Habitual wrote:
You need help? With ssh, filezilla, recursive, or tar errors?
Help doesn't begin....
Disk Space...?
Do you know how to check?

Disk space on the server where I'm trying to remotely .tar.gz this directory cluster is not an issue. The errors occur while the command
Code: Select all
tar -pczvf /backup/archive.tar.gz /public &
is running.
Yes, I know how to check space. Filezilla won't dig deep enough through the directory structure. I think the error is gz-related at this point.
I'm going to try just making a tarball with no compression and see if it succeeds. At least then, it will essentially be one file to download.
Let you know.
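For reference, the no-compression variant would look something like this (run on the server; /public and /backup are the paths used earlier in the thread):

```shell
# Create an uncompressed tarball; -p preserves permissions, -c create, -f filename
tar -pcf /backup/archive.tar /public
# Sanity check: listing the archive's contents reads it end to end,
# so a clean exit means the tar file itself is intact.
tar -tf /backup/archive.tar > /dev/null && echo "archive reads cleanly"
```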
Failed again at same point... here is screen shot
http://stevespages.com/Not/TarError.png
Last edited by Phuz on Fri Mar 24, 2017 6:32 pm, edited 1 time in total.
Re: cmd line help
Phuz wrote:
I'm going to try just making a tarball with no compression and see if it succeeds.

rsync too difficult?
Re: cmd line help
Never used it... I'll review and test.
Definitely nervous about it, as there is currently no backup for this 16-year collection of data that would be impossible to duplicate. That's why I'm desperate to get a transfer done.
Re: cmd line help
OK, just saw an issue with rsync, which is one reason I've been trying to create a .tar.gz, or even just a .tar.
I am on a crappy DSL line and, if I understand correctly, since rsync is going to get one file at a time (like ftp), it equates to a 3-day download due to start/end sequences on each file. Am I understanding rsync wrong?
I have a 300Mb-down/30Mb-up line coming, but it's not here yet. Limited to 12Mb/s down right now. This may be a partial reason ftp has also failed. ???
Re: cmd line help
Well, yes, rsync is indeed going to transfer files one at a time, but I believe you are overestimating the overhead involved. Moreover, generally speaking, rsync is to be advised in circumstances where there might be interruptions/dropouts, as seems to be the case for you. I tend to advise thinking of rsync as "a better cp" rather than as anything special, but certainly one of those better things is that you can just restart a failed/interrupted rsync (with the same options) and have it pick up from where it left off. The "one giant file" scenario is much more fragile in that situation.
I'd just start the rsync, syntax provided above, and if you find it interrupted, just repeat the same command to have it pick up from where it left off. Make the "-av" be "-avz" to use compression; possibly useful in your situation.
Re: cmd line help
Thanks Rene! I do have a 20-times-fatter pipe coming tomorrow. I'll tug on the rsync handle then and see what happens. With all the issues I've had trying to get this archive downloaded, it could be this crummy circuit I'm on, or it could be a corrupt file on the other end (though I doubt it, since the troubles have appeared at different points in the download...).
I'll post back after attempting again tomorrow afternoon. New pipe install scheduled for 8-10 am.
Re: Re:
Phuz wrote:
Failed again at same point... here is screen shot
http://stevespages.com/Not/TarError.png

Try it as root.
See also Simple, Secure Backups for Linux with rsync
Re: Re:
Habitual wrote:
Try it as root.
See also Simple, Secure Backups for Linux with rsync

Tried 7 times today. Tarball creation has failed and created 6 different sizes of files, anywhere from 2.4 GB to 25.2 GB.
Tried rsync with su and it told me permission denied. It didn't say wrong password. I am going to be hollering at their tech support tomorrow. Just got off the phone, and the only tech online said "I don't know any Linux or Unix". They are a Unix-based server farm, for crying out loud. They have now had webmail shut down for all 9 domains I have over there, for 2 weeks. They say it "should" be fixed on the 29th or later with a new version. So now MY customers are calling and complaining. I've built forwards from the webmail interface for all of them and have them on POP clients, which is my personal preference anyway... but GEEZ!
I'm going to tell them tomorrow that if their level 2 guys can't/won't even make a clean zip file of the main site and drop it in a directory for me to D/L, I'll go elsewhere.
So, who has great hosting with phone support?
Sorry for ranting, just getting really frustrated at not being able to get what should be a simple backup. Also with my MUCH higher speed today, discovered they have ftp speed capped at 20 Mb/s.
Re: cmd line help
Phuz wrote:
Tried rsync with su and it told me permission denied.

Don't use su(do). Why are you? You said/indicated you have ssh access to the server and are using it from the command line. How do you log in with ssh to your server? If it's as
Code: Select all
ssh myserver
you can quickly test things: from your own machine, and without any sort of su(do), use
Code: Select all
$ echo example >example.txt
$ rsync -av example.txt myserver:~/
$ rsync -av myserver:~/example.txt example.new
$ cat example.new
Re: Re:
Phuz wrote:
I'm going to tell them tomorrow if their level 2 guys can't/won't even make a clean zip file of the main site and drop it in a directory for me to D/L I'll go elsewhere.
So, who has great hosting with phone support?
Sorry for ranting, just getting really frustrated at not being able to get what should be a simple backup. Also with my MUCH higher speed today, discovered they have ftp speed capped at 20 Mb/s.

Leave "support" alone. You can do this without them, IF this isn't a shared hosting environment. e.g.: Do you "own" the server and/or have root privileges?
http://stevespages.com/Not/TarError.png shows a dollar sign prompt after the failed tar command.
This is a user environment.
Also do you as the "user" have rights to write to /backups and /public?
There are at least 2 reasons for the failure.
Elevate
Code: Select all
sudo su -
Operate
Code: Select all
tar -pcvf /backup/archive.tar /public &
Add gzip compression using tar's -z switch/option, as shown; it will save disk space and download time.
Code: Select all
tar -pczf /backup/archive.tar.gz /public &
lose the -v verbose? yawn.
Code: Select all
man tar
-p preserve permissions
-c create
-f file (the archive filename)
-z compress
-v verbose
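Those switches can be sanity-checked end to end with throwaway data instead of the thread's real /public. A harmless sketch; everything here lives under /tmp:

```shell
# Build a tiny sample tree
mkdir -p /tmp/demo_public
echo "hello" > /tmp/demo_public/index.html
# Archive it with the same switches (-p -c -z -f); GNU tar strips the
# leading / from member names and says so on stderr, hence 2>/dev/null
tar -pczf /tmp/demo_archive.tar.gz /tmp/demo_public 2>/dev/null
# List the compressed archive's contents to confirm it round-trips
tar -tzf /tmp/demo_archive.tar.gz
```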
Investigate
Code: Select all
lsof +D /backup/
Code: Select all
lsof +D /public
If tar is still operating, it will show in the output here, otherwise empty output.
Wait 3 minutes and check again.
This server, shared? shell40 doesn't bode well for your login. Looks like you need root, and since
Code: Select all
su -
failed, try
Code: Select all
sudo su -
If it's shared hosting, is there a "Control Panel" like cPanel/WHM or whatnot?
That should get you started.
Good Luck.
Re: cmd line help [solved]
OK. Got it solved. There were many issues, mostly behind the scenes. First, I finally reached a guy there on phone support who used to admin a Solaris network and actually knew what he was talking about, versus the other guys that work there on support. To cut server overhead, they had instituted a process timeout, which was responsible for the ssh failures (when tar actually tried to capture the entire /public directory). Second, they had my download bandwidth capped at 20 Mb/s, so FTP would time out long before I finished a 30 GB transfer. He (the Solaris guy) talked to senior management, and senior management jumped on the chests of a bunch of lazy level 2 guys there.
Problem was solved! Happy camper now! Have full backup (including the cgi.bin stuff) and have it saved here. In the last week I have added 3.5 more GB in data (mostly .pdf's) over there and have that all on DVD, so in case of absolute catastrophe, I can still recover. This is good!!! I can get back (and have) to my real biz of making knives.
Thanks SO much to those who contributed here... it has not fallen on deaf ears! Super Green (5th Element)
Chris
Re: cmd line help
Habitual,
many thanks for the options printout. I've saved and will be using. After all, practice makes perfect and I go by the 3 time rule these days. 1st time is introduction, doesn't mean I can duplicate it. 2nd time is refresher, 3rd time it sticks!