I have about 22 GB of music (MP3 & Ogg) on my laptop hard drive. I also have an unused Sony MP3 player with a 20 GB hard drive. What I want to do is back up the 22 GB into a 20 GB space. The music does not need to be playable on the Sony player... I'm just using it as a backup device. OK, two issues:
1. When I've tried compressing (tar.gz) MP3 files, little to no space is saved; I assume an MP3 is pretty compressed already. Is there another way to compress effectively? I don't want to reduce the bit rates of the individual music tracks.
2. I formatted the Sony HD using ext4, but this leaves me with only 16 GB of usable space. I tried FAT32 and this left me with about 18 GB.
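Issue 1 is expected behavior: MP3 and Ogg are already entropy-coded, so a general-purpose compressor finds almost nothing left to squeeze. A quick sketch using random bytes as a stand-in for already-compressed audio:

```shell
# stand-in for an MP3: random bytes have no redundancy left to remove
head -c 1000000 /dev/urandom > fake.mp3
gzip -9c fake.mp3 > fake.mp3.gz
wc -c fake.mp3 fake.mp3.gz   # the .gz is about the same size, sometimes larger
```

On the ext4 side, about 5% of the partition is reserved for root by default; `sudo tune2fs -m 0 /dev/sdX1` (replace the device name with yours) reclaims that on a pure data disk, and is likely part of why FAT32 showed more free space.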
I notice in the Nautilus file manager that if I right-click on a directory, I can 'compress' that file or directory. Clicking this provides options for different compression formats, but how do I know which one to use? Most of my files are data files, i.e. text, though I do have photo/image folders and a music folder.
Why can't I make a 7z archive with File Roller? I have already installed p7zip, p7zip-plugins, and 7za. I don't understand what the problem is. I can make a 7z archive from the command line with no problems. I keep getting this message but it doesn't tell me what the error is. Quote: An error occurred while adding files to the archive.
I downloaded a file that was 40mb compressed and was almost 700mb when fully extracted. It was inside a .rar file that in turn was inside another .rar file. How can this be done in Ubuntu? Can this also be done with .zip and .7z files?
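An archive is just a file, so any archiver can pack another archive inside it; the 40 MB → 700 MB ratio comes from the data itself, not from the nesting. A sketch with standard tools (zip and 7z work the same way, with made-up filenames):

```shell
mkdir -p demo
printf 'hello world\n' > demo/payload.txt

# inner archive, then an outer archive containing the inner one
tar -czf demo/inner.tar.gz -C demo payload.txt
tar -czf demo/outer.tar.gz -C demo inner.tar.gz

# unpacking reverses the two layers
mkdir -p demo/out
tar -xzf demo/outer.tar.gz -C demo/out
tar -xzf demo/out/inner.tar.gz -C demo/out
cmp demo/payload.txt demo/out/payload.txt && echo "round trip OK"
```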
I have a 500 GB external drive I want to use on a couple of Linux systems, and I'm looking for a filesystem for it. External drives are frequently formatted as FAT32, but I don't need to interoperate with Windows and would rather avoid the ugly, limited kludge that is FAT.
Since I only need to use it on Linux, I would use ext4 or XFS, but they store ownership information. Ideally, I'd use a proper Unix filesystem that doesn't track ownership (files are owned by whoever mounts the device, as when mounting a FAT32 partition), but I don't know of any filesystem that does that. What would be a good filesystem for this disk?
I've been using Unix systems for many years now, and I've always been led to believe that it's best to partition certain directories into separate filesystems, off the main root FS.
For instance, /tmp /var /usr etc
Leaving as little as possible on the main / system.
It's so that you don't fill up the root filesystem by accident, e.g. by some user putting files that are too big into /tmp.
I would presume that filling the / system would not be too good for Linux, as it would not be able to write logs and possibly other things that it needs to.
I believe that if root gets full, there is something like a 5% reserve set aside for root alone to write to, so that it can do its stuff.
However, eventually, / will become full, and writes will fail.
On top of this, certain scripting tools, such as awk, use /tmp to store temporary files; awk won't be able to write to /tmp if it's full, so awk will fail.
However, I'm being advised that there is no need to put /tmp, /var, etc. onto separate filesystems, as there is no problem nowadays with / filling up. So /tmp, /var, and /usr are all on the root FS.
I'm talking about large systems, with terabytes of data (which is on a separate FS), user populations of around 800-1000 users, and 24/7 system access.
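Whichever layout is chosen, the failure mode described above (writes failing once / fills) can at least be caught early. A minimal monitoring sketch, assuming only POSIX df and awk:

```shell
# report how full the root filesystem is; a cron job could mail a warning
usage=$(df -P / | awk 'NR == 2 { sub(/%/, "", $5); print $5 }')
echo "root filesystem is ${usage}% full"
if [ "$usage" -ge 90 ]; then echo "WARNING: / is nearly full"; fi
```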
I have a bunch of disk images, made with ddrescue, on an EXT partition, and I want to reduce their size without losing data, while still being mountable. How can I fill the empty space in the image's filesystem with zeros, and then convert the file into a sparse file so this empty space is not actually stored on disk?
For example:
> du -s --si --apparent-size Jimage.image
120G    Jimage.image
> du -s --si Jimage.image
121G    Jimage.image
This actually only has 50G of real data on it, though, so the second measurement should be much smaller.
This supposedly will fill the empty space with zeros:

cat /dev/zero > zero.file
rm zero.file

But if sparse files are handled transparently, it might actually create a sparse file without writing anything to the virtual disk, ironically preventing me from turning the virtual disk image into a sparse file itself. :) Does it? Note: for some reason, sudo dd if=/dev/zero of=./zero.file works when cat does not on a mounted disk image.
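Whether zeros were actually stored can be checked directly: `du` versus `du --apparent-size` distinguishes allocated blocks from logical size. A small sketch on a plain file (no mounted image needed), assuming GNU coreutils:

```shell
# dd issues ordinary write() calls, so the zero blocks really get allocated
dd if=/dev/zero of=dense.img bs=1M count=8 status=none
# re-copy while detecting zero runs and punching holes in their place
cp --sparse=always dense.img sparse.img
du -k dense.img sparse.img   # sparse.img allocates almost nothing
```

The same idea applied to the ddrescue image (zero-fill inside the mounted filesystem, unmount, then `cp --sparse=always` on the image file, or `fallocate --dig-holes` on newer util-linux) yields the shrunken but still mountable copy.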
One of the good points of Linux is that it's easy to customize the partitioning scheme of the disk and put each directory (/home, /var, etc.) on different partitions and/or different disks. Then we can use different filesystems/configurations for each of them to make them work better. Examples:
noatime is a mount option that avoids writing access times on files. data=writeback is an option to lazily write metadata on new files. ext3/4 have journaling, which makes the partition safer in case of a crash. Bigger blocks make the partition waste more space, but make it faster to read, and it may become more fragmented (not sure). Then: what are the best filesystem/configurations for each directory? Note: given Patches' answer, I will only discuss /, /home and /var.
/var  -> It's modified constantly: it holds logs, caches, temporary data, etc.
/home -> stores important files.
/     -> stores everything else (/etc and /usr should be here)
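To illustrate those per-directory options, a hypothetical /etc/fstab along these lines (devices and option choices are made-up examples, not recommendations; data=writeback in particular trades crash safety for speed):

```
# <device>   <mount>  <type>  <options>                   <dump> <pass>
/dev/sda1    /        ext4    defaults,errors=remount-ro   0      1
/dev/sda2    /home    ext4    defaults,noatime             0      2
/dev/sda3    /var     ext4    noatime,data=writeback       0      2
```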
I'm trying to back up my /home/username folder for the purposes of a clean install. I'm attempting to compress the folder to fit onto my flash drive and am running into problems. Regardless of where I try to put the compressed file, or what compression algorithm I try, they all give me the error "No such file or directory", and that's all the computer will tell me. So far I've tried the tar.gz and tar.bz2 formats and have tried compressing to //, /home/, and /tmp/.
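That error usually means the directory in the -f path doesn't exist or the arguments are in the wrong order (the archive name must immediately follow -f). A minimal sketch, using a stand-in directory so it is runnable anywhere (for the real thing, replace homedemo with /home and username with the actual login):

```shell
# stand-in for /home/username
mkdir -p homedemo/username
printf 'settings\n' > homedemo/username/.profile

# -f takes the archive name immediately; -C sets the directory to read from
tar -czf backup.tar.gz -C homedemo username
tar -tzf backup.tar.gz    # lists username/ and username/.profile
```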
I get some errors when I run the mount -all command:

mount: wrong fs type, bad option, bad superblock on /dev/sdc5,
       missing codepage or helper program, or other error
       In some cases useful info is found in syslog - try dmesg | tail or so
Failed to open /proc/filesystems: No such file or directory
My Ubuntu stops while mounting the system HDD. The screen displays the following messages:
mountall: /etc/fstab: No such file or directory
fsck from util-linux-ng 2.16
WARNING: couldn't open /etc/fstab: No such file or directory
init: mountall main process (545) terminated with status 1
General error mounting filesystems.
A maintenance shell will now be started.
CONTROL-D will terminate this shell and re-try.
udevd[560]: can not read '/etc/udev/rules.d/z80_user.rules'
Ubuntu: clean, 474879/24231936 files, 28016581/96898047 blocks
root@i7:~# exit_
I suspect the disk manager pysdm, which I had installed earlier today and which crashed during the previous session. The /etc/fstab file does not exist anymore, and I can't rename fstab.bak because the disk is read-only, even for my root user.
You right click on a file and choose Compress. How do you select the compression level for algorithms that support different levels? If you can't, what compression level does it use?
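As far as I know, the right-click Compress dialog doesn't expose a level setting, so the backend tool's own default applies (gzip's default, for instance, is level 6). From the command line the level is explicit; a sketch showing the trade-off on highly repetitive input, where the difference is visible:

```shell
# repetitive text compresses well, so level differences show up clearly
yes "the quick brown fox" | head -n 20000 > sample.txt
gzip -1c sample.txt > fast.gz    # fastest, least compression
gzip -9c sample.txt > best.gz    # slowest, best compression
wc -c fast.gz best.gz
```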
I use MPlayer to play my media files. Occasionally I want to take screenshots of media I am playing. I use the -vf screenshot option, which generates shotxxxx.png files. The issue is that those PNGs are not compressed well and are usually large. With hi-def media they are extremely large, 4-5 MB each at the least. Is there a way to set a compression option for the PNG images? If I run the same files through ImageMagick they compress to around 1.5 MB.
I am working on a project where I am dialing out over a modem. Old stuff, yes, but the modem lets my device send info from remote sites out of my database through a phone line, so that IT departments don't have to worry about my device being a security issue on their networks.
Anyway, the modem I'm using isn't incredibly well designed, and when a certain ASCII character reaches the modem, it reads it as an EOF indicator. It is also important that the files I send are compressed.
My question is: does anyone know of a compression format that lets me keep certain ASCII characters out of its output?
just as an illustration:
Device --------> Modem ---------> Off-site
and the Modem stops talking to the device when a certain char is passed to it.
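One common workaround, rather than hunting for a special compression format: compress first, then armor the compressed stream into a safe alphabet before it reaches the modem. Base64 output contains only A-Z, a-z, 0-9, +, /, = (plus line breaks), so no control byte ever crosses the link. A sketch:

```shell
printf 'telemetry from the remote site\n' > report.txt

# compress, then encode into a 64-character alphabet for the modem link
# (base64 wraps lines with newlines; if newline itself is the problem
# byte, GNU base64 -w 0 emits a single unbroken line)
gzip -c report.txt | base64 > wire.txt

# the receiving side reverses the two steps
base64 -d wire.txt | gunzip > received.txt
cmp report.txt received.txt && echo "transfer intact"
```

The encoding costs about 33% in size, which is usually still a big win over sending the data uncompressed.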
I have just installed the FreeArc compression utility from [URL] and it works smoothly, but only in command-line mode.
Commands:
  a       add files to archive
  c       add comment to archive
  ch      modify archive (recompress, encrypt and so on)
  create  create new archive
  cw      write archive comment to file
  d       delete files from archive
  e       extract files from archive ignoring pathnames
  f       freshen archive
  j       join archives
  k       lock archive
  l       list files in archive
  lb      bare list of files in archive
  lt      technical archive listing
  m       move files and dirs to archive
  mf      move files to archive
  r       recover archive using recovery record
  rr      add recovery record to archive
  s       convert archive to SFX
  t       test archive integrity
  u       update files in archive
  v       verbosely list files in archive
  x       extract files from archive
Options:
  --                                        stop processing options
  -ad --adddir                              add arcname to extraction path
  -aeALGORITHM --encryption=ALGORITHM       encryption ALGORITHM (aes, blowfish, serpent, twofish)
  -agFMT --autogenerate=FMT                 autogenerate archive name with FMT
  -apDIR --arcpath=DIR                      base DIR in archive
  -baMODE --BrokenArchive=MODE              deal with badly broken archive using MODE
  -cfgFILE --config=FILE                    use config FILE (default: arc.ini)
  -d --delete                               delete files & dirs after successful archiving
  -df --delfiles                            delete only files after successful archiving
  -diAMOUNT --display=AMOUNT                control AMOUNT of information displayed: [hoacmnwrfdtske]*
  -dmMETHOD --dirmethod=METHOD              compression METHOD for archive directory
  -dpDIR --diskpath=DIR                     base DIR on disk
  -dsORDER --sort=ORDER                     sort files in ORDER
  -ed --nodirs                              don't add empty dirs to archive
  -envVAR                                   read default options from environment VAR (default: FREEARC)
  -epMODE --ExcludePath=MODE                exclude/expand path MODE
  -f --freshen                              freshen files
  -fn --fullnames                           match with full names
  -hpPASSWORD --HeadersPassword=PASSWORD    encrypt/decrypt archive headers and data using PASSWORD
  -iTYPE --indicator=TYPE                   select progress indicator TYPE (0/1/2)
  -ioff --shutdown                          shutdown computer when operation completed
  -k --lock                                 lock archive
  -kb --keepbroken                          keep broken extracted files
  -kfKEYFILE --keyfile=KEYFILE              encrypt/decrypt using KEYFILE
  -lcN --LimitCompMem=N                     limit memory usage for compression to N mbytes
  -ldN --LimitDecompMem=N                   limit memory usage for decompression to N mbytes
  -mMETHOD --method=METHOD                  compression METHOD (-m0..-m9, -m1x..-m9x)
  -maLEVEL                                  set filetype detection LEVEL (+/-/1..9)
  -max                                      maximum compression using external precomp, ecm, .....
For some reason it seems like the lowest it goes is ~64 kbps (~ implying variable bitrate). So yeah, anything that'll let me do the compression limbo better? (How low can you go?) A different program? Unlock bonus stage? What?
I have a printer which works perfectly on my Ubuntu desktop. I shared it on the network, and when trying to print from a Mac I get the following error: "SpliX Compression algorithm 0x0 does not exist". Printer: Samsung SCX-4200.
I compressed a directory containing many image files. The directory amounted to 5.3 GB. Compressed with tar to .tgz, the compression took a couple of minutes at most and got it down to 4.3 GB. Compressed to .bz2, it took about 90 minutes and got down to 4.2 GB. Hardly worth the extra time. Do these numbers look normal to you?
I want to rip all my CDs to FLAC at the lowest compression (space is not an issue) via Banshee. I have tried a few tracks, but they seem to be ripped at a higher compression than the ones I have done via Sound Juicer (set at compression 0).
How do I adjust the FLAC settings in Banshee to do this? The option to edit the settings is not available for FLAC. I guess there is a config file somewhere?
I am trying to convert hundreds of BMPs into a single AVI file, without compression. Since every pixel matters in my case, I don't want any kind of compression. It's fine if the output file is extremely large.
I'd like to know if it's possible to achieve this with packages that come with Ubuntu 11.04. I have tried ffmpeg/mencoder, but they either compressed the output file or the output file was not playable in Totem. I am a new user of both tools. Is there actually a way to get an uncompressed AVI from these tools?
"BMP to AVI Sequencer" [URL] does the job perfectly. I successfully converted 180 1080P BMPs into a 1G avi, running at 30 fps. Unfortunately I need a command line tool this time.
Is there any compression software that supports PAQ8 and its variants with a graphical interface for Fedora (I'm using x86_64)? What's the best compression algorithm known to man? Time taken to compress is not a problem; I just need the best compression available. Is there any such program for Linux?
I'm trying to learn how to do Lempel-Ziv-Welch compression using pen and paper for my algorithms and data structures class. Unfortunately we have only a couple of examples in our book of how it is done. I'd like to practice compressing and decompressing text with it, but I need a way to check whether I'm doing it right or wrong.
So I'm looking for some preferably free/open source program which can compress and decompress LZW for Windows or GNU/Linux. Programs without binary distributions are fine too.
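Since programs without binary distributions are acceptable, the encoder side of LZW fits in a short awk script, which makes it easy to check pen-and-paper work against the usual textbook convention (single characters keep their ASCII codes, newly learned strings start at code 256). This is my own sketch, not a standard tool:

```shell
# LZW-encode one line of text; codes below 256 are raw ASCII, 256+ are learned
printf 'TOBEORNOTTOBEORTOBEORNOT\n' | awk '
BEGIN {
    for (i = 32; i < 127; i++)          # seed dictionary with printable ASCII
        dict[sprintf("%c", i)] = i
    next_code = 256
}
{
    w = ""
    for (i = 1; i <= length($0); i++) {
        c = substr($0, i, 1)
        if ((w c) in dict) { w = w c }  # extend the current match
        else {
            out = out dict[w] " "       # emit code for longest known prefix
            dict[w c] = next_code++     # learn the new string
            w = c
        }
    }
    print out dict[w]
}' > lzw_codes.txt
cat lzw_codes.txt
```

The classic TOBEORNOTTOBEORTOBEORNOT exercise encodes to 84 79 66 69 79 82 78 79 84 256 258 260 265 259 261 263 under this convention, which matches the standard worked example, so the script's output can be compared directly against a hand trace.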
I'm installing fail2ban to improve the security of a home Asterisk server, which from time to time becomes the target of SIP account crackers and/or SSH brute-force attacks. For those not familiar with fail2ban, this utility monitors log files for matches against user-specified expressions that identify a brute-force attack, then configures iptables rules to block the offending IP. Here's an example:
Code: NOTICE[1734] chan_sip.c: Registration from '"613"<sip:613@xx.xxxx.xxx.xxx>' failed for 'yyy.yyy.yyyy.yyy' - No matching peer found
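A filter for that log line would be along these lines (a sketch; the filename is hypothetical, the <HOST> tag is fail2ban's capture convention, and any filter should be checked with fail2ban-regex against the real log before relying on it):

```
# /etc/fail2ban/filter.d/asterisk-sip.conf (hypothetical name)
[Definition]
failregex = NOTICE.* chan_sip\.c: Registration from '.*' failed for '<HOST>' - No matching peer found
ignoreregex =
```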
I finally got K9copy to work and I'm using it to extract DVDs to my hard drive. When I extract a dual-layer DVD, it becomes a 5 GB DVD. In the settings I have the DVD size set to 9 GB, because I assumed this would avoid any compression. When I use the K9copy assistant to change the shrink factor before ripping, I can only change the shrink factor of some titles to 1.67 from 2.50. I want the shrink factor to be 1, since this seems to ensure that the output file is the same size as the title on the DVD. I want the copied DVD to be the same size as the original. Hard drive space is no problem, since I have 6 terabytes free.