General :: Forcing Tar To Create An Empty Archive?
Apr 21, 2010
I'm trying to use tar to tar files before transfer, so I can keep the entire file path rather than losing it along the way. However, when I try to tar an empty folder, it tells me that it is cowardly refusing to create an empty archive. I want to keep the empty folder on the other end, but don't want to put anything else into the archive to make it non-empty. Is there any way to do this?
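For what it's worth, GNU tar only prints "Cowardly refusing to create an empty archive" when it is given no file operands at all; naming the empty directory itself should work, and the directory entry is preserved on extraction (paths here are just examples):

Code:
mkdir -p emptydir
tar -cf transfer.tar emptydir
tar -tvf transfer.tar    # lists the directory entry, nothing else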
I'm trying to back up/archive my Ubuntu 10.04 system files (so I can restore them in case my system gets corrupted). More specifically, I'm trying to zip the important files in my root directory, not including my home directory (which holds my documents, which I back up separately and more frequently), to an external hard drive attached via USB (called 'My Book').
Since File Roller didn't give me quite the level of control I was looking for, I created a script that I could execute to back up and archive regularly. Here's a snippet:

Code:
cd /media/"My Book"/"Linux Backups"
NOW=$(date +"%b-%d-%y")
LOGFILE=Backup_Root_FileSystem-$NOW.log
sudo zip -r -T -v Backup_Root_FileSystem-$NOW / -x /media/'My Book'* /media* /proc* /sys* /mnt* /dev* /cdrom* /home* /'lost+found'* | tee -a $LOGFILE
I have been searching for a way to create an empty or blank iso file, so that I can mount it and have a backup application think it's a blank CD. I am tired of wasting CDs by having the application write a recovery CD, just for me to turn around and export it to an iso image to be stored in an online archive, and then throw away the physical CD.
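I'm not sure a plain file can fully impersonate blank optical media (the application may need a virtual drive such as CDEmu to see it as writable), but creating an empty ISO image itself is straightforward; a sketch using genisoimage on an empty directory (names are examples):

Code:
mkdir -p /tmp/empty
genisoimage -o blank.iso /tmp/empty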
I am using Ubuntu 10.10 and I have started to have a rather annoying problem with file-roller. I do not know what I attempted to unpack or pack from my USB HDD a couple of days ago, but when I did, I got a message from the archive manager: "Could not create the archive. Archive type not supported." Well, I just shrugged and continued doing whatever I was doing, but later I noticed that whenever I tried to open the HDD from "Places" in the main panel, I got the same error message over and over again.
It is not produced when I open the HDD from the link on my Desktop; that link opens the HDD with no problems. But if I try to access the HDD from "Places" in the main panel, I always get the same error message. OK, what have I tried: since the archive manager window shows the "file-roller" logo in the menu button, I assumed that file-roller was stuck on the previous process for which it produced an error, so I tried to kill any pending processes in several ways: logging off, restarting, and turning the computer off.
That did not work. Whenever I attempted to open the HDD from "Places", the same error "Could not create the archive. Archive type not supported." returned. Assuming that Nautilus was stuck on the process, I tried to kill any pending processes by using Alt+F2, finding the pending Nautilus process with ps -e | grep nautilus, and then running kill {####}.
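As an aside, the same thing can be done in one step; these are standard commands, though I can't say they would have helped here:

Code:
pkill nautilus    # kill by process name
nautilus -q       # ask Nautilus to quit cleanly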
That did not work either. I was tired of it and uninstalled file-roller using the software manager, and this solved the problem. But I actually use file-roller, so I re-installed the program and the problem returned. I tried uninstalling the program in Synaptic with the "Mark for complete removal" option, then re-installing, and the problem was still there. I uninstalled all the other recommended packages that had been installed along with file-roller using Synaptic and then installed file-roller alone; the problem is still here.
Then I thought it was a problem related to the HDD, so I went ahead and re-formatted it using the EXT4 file system with no partitions. I re-booted, and the problem is still here. Whatever process I started with file-roller or one of its companion packages is still active somewhere; I believe it is "alive" on the system's HDD, because it returns after a complete uninstall/reboot/install cycle. I do not want to mess with Nautilus itself. Do I need to uninstall/reinstall the nautilus-data package?
System: Ubuntu 10.10, running solo on a single EXT4 partition.
Desktop: GNOME v2.32.0
Programs: file-roller v2.32.0-0ubuntu1; unzip v6.0-4; zip v3.0-3; xz-utils 4.999.9beta+20100527-1
USB HDD: 500GB, single partition, EXT4 file system.
Is it possible to have Archive Manager create the specified folder when I enter it into the path field? For example, if I want to put something in "/Main/Sub1/Sub2" but "Sub2" doesn't exist, can I just have Archive Manager create it and put the files there?
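I don't know of an Archive Manager option for this, but creating the target tree first from a terminal sidesteps it; mkdir -p creates all missing parent directories in one go (path taken from the post):

Code:
mkdir -p "/Main/Sub1/Sub2"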
I am new to the world of Linux, and when attempting to verify a tar archive I am shown the following error. Running the command tar cvfW archivename.tar filename directoryname itself does not yield any errors.
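For reference, GNU tar can verify in two ways: -W (--verify) checks the archive immediately after writing it, and -d (--diff) compares an existing archive against the filesystem. A sketch using the names from the post:

Code:
tar -cvWf archivename.tar filename directoryname   # create and verify in one pass
tar -df archivename.tar                            # compare archive contents against disk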
I'd like to ask about the archive mounter feature: can I mount a zip file in read-write mode? Can gvfsd-archive do that, or must I use fuse-zip to mount it? If I must use fuse-zip, how do I wrap it so I can use it via Nautilus or via gvfs-fuse-daemon?
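In case fuse-zip turns out to be necessary: its command-line form is just archive plus mountpoint, and changes are written back to the zip when it is unmounted (the mountpoint path here is an example):

Code:
mkdir -p ~/zipmount
fuse-zip archive.zip ~/zipmount    # read-write mount
fusermount -u ~/zipmount           # unmount; modifications are saved at this point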
1. I have a TCP connection (an SSH session to some computer, for example).
2. The network suddenly goes down and drops all packets (disconnected cable, out of range).
3. TCP resends packets again and again, retrying with increasing delays.
4. I see the problem and plug the cable back in (or restore the network somehow).
5. The TCP connection finally resends some packet successfully and continues.
The problem is that I need to wait out some timeout at point 5. I want to use my open SSH session now, not wait 5-10 seconds until it figures out that the connection is working again.
How do I force all TCP connections to resend data without delays in GNU/Linux?
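I don't believe Linux exposes a knob that forces an immediate retransmit of queued data, but the retry behaviour is tunable; for example, net.ipv4.tcp_retries2 bounds how many backed-off retransmissions are attempted before the connection is dropped. A sketch (this shortens the give-up time rather than the delays between attempts):

Code:
sysctl net.ipv4.tcp_retries2            # inspect the current value (default 15)
sudo sysctl -w net.ipv4.tcp_retries2=8  # give up, and surface errors, sooner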
I have set up a Linux box running Slackware 12.0, along with Apache 2.2.4, and on my LAN I have a couple of computers. I want to force them to a webpage under the document root; the webpage will be an agreement page. Is this possible to do with Apache? This will not be a real domain, so my guess is that I would have to tell my DNS server to resolve the IP address to the hostname of my Slackware box.
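If mod_rewrite is enabled, one hedged sketch is to rewrite every request to the agreement page until clients should be let through (the filename agreement.html is my placeholder; the rules go in the vhost config or an .htaccess file):

Code:
RewriteEngine On
RewriteCond %{REQUEST_URI} !^/agreement\.html$
RewriteRule ^ /agreement.html [R=302,L]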
In the upper left corner, I click Places > Documents (or anything besides Computer) and I get the error,
"Could not create the archive, archive type not supported" from archive manager.
I saw a thread from a while back claiming I could just right-click and "Open With," but that simply isn't possible in these drop-down menus; right-clicks are treated as left-clicks.
Does anybody have any ideas? It's very annoying trying to quickly access my files and having to go through Computer every time to get to them.
PS: I also deleted an "inode/directory=" line in ~/.local/share/applications/mimeapps.list, as explained in a forum thread I found. Still no luck...
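If that line was the association between folders and the file manager, it may be cleaner to set it back explicitly with xdg-mime (assuming nautilus.desktop is the right handler name on this system):

Code:
xdg-mime default nautilus.desktop inode/directory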
I am a final-year student doing Computer Systems Engineering and have just been introduced to Linux. While still struggling to catch up with the commands, I have now been given an assignment on shell scripting. I am seriously struggling to understand this question; can you please assist me? Here follows the assignment:
Operating Systems III. Some tips, e.g. (test if a file is empty; if it is, then display "file is empty", otherwise display
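The assignment text is cut off above, but the visible tip points at the shell's -s file test; a minimal sketch, with "file is not empty" as my assumed completion of the missing branch:

Code:
#!/bin/bash
# -s is true when the file exists and has a size greater than zero
if [ ! -s "$1" ]; then
    echo "file is empty"
else
    echo "file is not empty"   # assumed wording; the original is truncated
fi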
I have this nasty habit of refreshing the desktop in quick succession by right-clicking and selecting 'Refresh' on my XP system at the office. (And I am sure most of us do the same.) With Ubuntu, if I right-click on the desktop slowly and select 'Align by...', it simulates the XP refresh action as explained above. But if I perform the same action rapidly, it takes the first option from the right-click context menu, which is 'Create Folder', and results in an empty folder being created on the desktop. I tried double right-clicking and again it created an empty folder. Is there any workaround to handle this? I mean: can the right-click context menu items be shuffled so that the 'Create Folder' option is moved from 1st place?
I have a program that takes a relative path as input and appends it to some path string to get the actual path.
Now all I can input is the relative path. So if I want to go one level up, my input will be ../mypath.
If I know the depth of the path used internally, I can use .. as many times as needed to reach the root directory and then give the absolute path. But suppose I do not know the depth of the directory: can I construct a path string that the program still treats as relative but that resolves to the location I want? One way could be to have enough .. in the path string to force an absolute path for some maximum depth of path.
Is there some path string syntax that I am not aware of that can achieve this?
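Not a special syntax as such, but one property to lean on: .. applied at the root directory stays at the root, so padding the input with more ../ components than the internal prefix could ever be deep effectively forces an absolute location. A quick demonstration (paths are just illustrative):

Code:
$ realpath /tmp/../../../../..
/
$ realpath /var/log/../../../../../../etc/passwd
/etc/passwd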
I'm new to Ubuntu, and every time I've tried to download a program like iTunes, the "Archive Manager" comes up and says "An Error Has Occurred While Loading the Archive". How do I fix this, or how do I download programs?
I am stuck trying to write a script that does the following:
1. loop through all subdirectories of a given directory
2. for each subdirectory found, first create an archive carrying the same name as the subdir itself
3. then move the actual subdirectory to a different path
Here is what I have so far: my base dir is /home/bob/Bureau, and it contains two subdirs, "florissant 86 a" and "saule 84". I would like to create one archive for each subdir in /media/public/atelierPhotoArchive and then move the folder entirely to /tmp/photo.write.
Everything goes well until I have to either tar or loop through files with spaces in their names.
This last statement outputs a separate chunk after each space in my lst file.
Question 1: Is it possible to make it output once for each line in the file?
Question 2: Is it possible to do some exec tar in the find command? I had difficulties extracting the "short" name for the archive ("saule 84") without /home/bob/Bureau; it is possible only with printf %f, but how in the world can I get this value into the exec option?
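A hedged sketch of the whole script, with the base and target paths taken from the post: null-delimiting find's output keeps names with spaces intact, -maxdepth 1 restricts the loop to immediate subdirectories, and tar's -C option makes the archive members relative to the base dir, which yields the "short" name without /home/bob/Bureau. (find can also run tar itself via -exec, but getting the basename into the archive name is easier in a shell loop.)

Code:
#!/bin/bash
src="/home/bob/Bureau"
arch="/media/public/atelierPhotoArchive"
dest="/tmp/photo.write"
mkdir -p "$arch" "$dest"

find "$src" -mindepth 1 -maxdepth 1 -type d -print0 |
while IFS= read -r -d '' dir; do
    name=$(basename "$dir")                          # e.g. "saule 84"
    tar -czf "$arch/$name.tar.gz" -C "$src" "$name"  # archive relative to base dir
    mv "$dir" "$dest"/
done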
I want to untar a package from one directory into another directory, directly from the command prompt. I want to untar the Joomla package into the htdocs directory of XAMPP. How do I do that directly from the command prompt? The reason I am asking is that if I try the "drag and drop" way, it won't be possible, as XAMPP is stored in the /opt directory and without super-user authentication nothing can be saved into it. You could ask why I untarred XAMPP in su mode in the first place, but that had to be done so that Apache doesn't give me any start-up problems.
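A minimal sketch, assuming the Joomla tarball is named Joomla.tar.gz and XAMPP lives in /opt/lampp (both assumptions): tar's -C flag changes to the target directory before extracting, and sudo supplies the super-user rights:

Code:
sudo tar -xzf Joomla.tar.gz -C /opt/lampp/htdocs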
Is it possible to compress the mysqldump output into, say, db_backup.sql.tgz, and then add that to an existing archive, e.g. backup.tgz, in one command or on the fly, to save space and avoid having an intermediate file to delete?
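A sketch, with one important caveat: gzip-compressed tars (.tgz) cannot be appended to, so the append step only works on an uncompressed backup.tar (the database name mydb is my placeholder):

Code:
# compress the dump on the fly, no intermediate plain .sql file
mysqldump mydb | gzip > db_backup.sql.gz
# append it to an existing *uncompressed* tar, then clean up
tar -rf backup.tar db_backup.sql.gz && rm db_backup.sql.gz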
I need files to be <= 5GB to put on S3. Right now I have an ugly tar / gzip / cut before upload, then cat / zcat / tar on download, but it's really ugly - and nearly every archiver should support archive splitting, right? What's the best way?
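One common approach (a sketch; file names are hypothetical) is to stream the compressed tar into split, which caps each piece at a chosen size, and to concatenate the pieces back together on download:

Code:
# split into <=4GB pieces while archiving
tar -czf - mydir | split -b 4G - backup.tgz.part-
# reassemble and extract
cat backup.tgz.part-* | tar -xzf -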
I'm searching for a tutorial on how to convert my videos (AVI, WMV and MKV formats) to a format that offers both good quality and small size. The audio part should be 128 kbit/s MP3 (LAME). I'm using Fedora 14 with an RPM Fusion repo. It would be especially nice to be able to batch-convert the AVIs.
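With ffmpeg from RPM Fusion, a batch loop along these lines might work. This is a sketch: H.264 video is my assumption for the "good quality, small size" part; the MP3 settings follow the post:

Code:
for f in *.avi; do
    ffmpeg -i "$f" -vcodec libx264 -crf 23 -acodec libmp3lame -ab 128k "${f%.avi}.mkv"
done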
I have been playing around with the tar command and I know this is how to use it: Code: tar -cf [filename] [directory]. But what I want is to make an archive from the current directory. I thought I could just not enter a directory, but that doesn't work; I get an error about creating an empty archive. So how do I tell it to use the current directory?
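Without a file operand tar has nothing to archive, hence the empty-archive error; naming the current directory explicitly as . fixes that. A sketch (the archive name is just an example; writing it outside the current directory avoids archiving the archive into itself):

Code:
tar -cf /tmp/archive.tar .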
The command tar -xvf wpa_tables.tar gives this error:

tar: ./xai-0/\334Regency: Cannot open: Invalid or incomplete multibyte or wide character
tar: Exiting with failure status due to previous errors
Obviously the \334 character is giving tar some problem, but I've been all over the docs and can't figure out how to either make it skip this file or interpret the character literally. Here's my command history to show some of the options I've tried which don't work:
tar --no-wildcards -xkvf wpa_tables.tar
tar --exclude ./xai-0/\334Regency -xkvf wpa_tables.tar
tar --exclude "./xai-0/\334Regency" -xkvf wpa_tables.tar
tar --ignore-command-error -xkvf wpa_tables.tar
tar --ignore-failed-read -xkvf wpa_tables.tar
tar --transform 's/\/slash/g' -xvf wpa_tables.tar ./xai-0/\334Regency
I'm totally out of ideas at this point and would welcome some input from more experienced members.
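Two things that might be worth trying (suggestions, not tested against this archive): forcing a byte-oriented locale so tar stops trying to interpret the name as UTF-8, or excluding the file with a wildcard so the troublesome byte never has to be typed (GNU tar enables wildcards in --exclude patterns by default):

Code:
LC_ALL=C tar -xkvf wpa_tables.tar
tar --exclude='./xai-0/*Regency' -xkvf wpa_tables.tar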
Ubuntu 10.04. As part of my nightly backup script I archive my home directory with the following command:

Code:
tar -cvpzf /quitelarge/_mirror/mirror1/home-ken.gz /home/ken 2>> /quitelarge/_mirror/tar-error.log
It seems to work fine and I have recovered files from the archive on occasion. Actually I keep 7 rolling daily backups and a monthly burn to DVD. I had an sftp connection made by Nautilus to my server. Ubuntu for whatever reason places an icon on the desktop showing the connection. When I ran the script it decided to archive everything on my server - all 1.4 TB. I caught the problem when home-ken.gz was about 5 GB. I stopped the process, closed the sftp connection, rolled back the backups and tried again. This time I got a file of the expected size - about 45 MB.
In the error log I did find that the tar process was trying to suck the entire contents of the server into the archive file:

tar: /home/ken/.gvfs/sftp for ken on taylor10/proc/asound/ICH5/pcm0c/sub0: file changed as we read it
tar: /home/ken/.gvfs/sftp for ken on taylor10/proc/asound/ICH5/pcm0c: file changed as we read it
tar: /home/ken/.gvfs/sftp for ken on taylor10/proc/asound/ICH5: file changed as we read it
tar: /home/ken/.gvfs/sftp for ken on taylor10/proc/asound: file changed as we read it
tar: /home/ken/.gvfs/sftp for ken on taylor10/proc/scsi: file changed as we read it
tar: /home/ken/.gvfs/sftp for ken on taylor10/proc/acpi/event: Cannot open: Permission denied
tar: /home/ken/.gvfs/sftp for ken on taylor10/proc/acpi/fadt: Cannot open: Permission denied
tar: /home/ken/.gvfs/sftp for ken on taylor10/proc/acpi/dsdt: Cannot open: Permission denied
tar: /home/ken/.gvfs/sftp for ken on taylor10/proc/irq/21/smp_affinity: Cannot open: Permission denied
Is there an option I can put on the tar command to tell it NOT to follow the ssh connection sitting on my desktop? The closest thing I see in the documentation is -h, which tells tar to "follow symlinks; archive and dump the files they point to." I am NOT specifying -h, so if the ssh connection is treated as a symlink by tar, I would still not expect the remote contents to be tarred.
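The gvfs sftp share is a FUSE mount rather than a symlink, which would explain why -h makes no difference. Two hedged options with GNU tar: stay on one filesystem so mounts under /home/ken are never entered, or exclude the .gvfs directory explicitly:

Code:
tar --one-file-system -cvpzf /quitelarge/_mirror/mirror1/home-ken.gz /home/ken
tar --exclude=/home/ken/.gvfs -cvpzf /quitelarge/_mirror/mirror1/home-ken.gz /home/ken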