Ubuntu :: Printing - SpliX Compression Algorithm 0x0 Does Not Exist
Aug 20, 2010
I have a printer which works perfectly on the Ubuntu desktop. I shared it on the network, and when trying to print from a Mac I get the following error: "SpliX Compression algorithm 0x0 does not exist". Printer - Samsung SCX-4200.
I expected more from Ubuntu 10.04 with regards to printing exact-size photos, and the poor automatic colour printing hasn't improved either! For instance, the photo size configurations for Ubuntu/F-Spot/GIMP and others are not compatible with my printers (HP and Brother). Here in Europe a typical standard photo size (10x15cm, i.e. 150x100mm) is not even on the Ubuntu listing. I have tried all listed possibilities, including "custom" (which never seems to work correctly), and the result at best is photos with uneven borders, or at worst my printer goes a bit crazy and wastes a lot of photo paper and expensive ink. Even photos selected for "no borders" still come out with the same uneven borders.
I have tried pretty much everything over time, following advice in this forum, including using HPLIP and updating the drivers required for my Brother printer, but the root problem seemingly lies with the Ubuntu photo size setup listing. Working with Ubuntu over the years I have found that it can do pretty much everything that Windows can do, except for this damn ongoing photo quality and configuration problem.
For the last two weeks I've been looking for an easy way to work with the Samsung SCX-4x16F, just printing something. Now it works fine. First you have to install "jbig" and "splix" from slackbuilds.org. You will need CUPS and perhaps Gutenprint, but in most installs of Slackware 13 I think that's included.
After that, connect the printer to the PC and restart CUPS:
Code: # /etc/rc.d/rc.cups restart
Finally, install the Samsung SCX-4216F through the CUPS administration interface at http://localhost:631 and choose the Samsung SCX-4200, SpliX V 2.0.0 driver.
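The same queue can also be set up from a terminal. This is only a minimal sketch: the printer URI and the PPD model string below are assumptions, so check the lpinfo output for the real values on your system.
Code:
# list detected printer URIs and the available SpliX drivers
lpinfo -v
lpinfo -m | grep -i splix

# add the queue; replace the URI and model string with the ones reported above
lpadmin -p SCX4200 -E -v usb://Samsung/SCX-4200 -m "samsung/scx4200.ppd"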
You right click on a file and choose Compress. How do you select the compression level for algorithms that support different levels? If you can't, what compression level does it use?
I notice in the Nautilus file manager, if I right click on a directory, I can 'Compress' that file or directory. Clicking this provides options for different compression formats, but how do I know which one to use? Most of my files are data files, i.e. text, though I do have photo/image folders and a music folder.
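As far as I know the Nautilus/File Roller dialog does not expose a level setting and simply uses each tool's default (gzip's default is level 6). From a terminal the level can be set explicitly; the directory name below is just an example.
Code:
# plain text compresses well; force gzip's maximum level
tar -cf - documents/ | gzip -9 > documents.tar.gz

# bzip2 (or xz) usually squeezes text a little harder, at the cost of time
tar -cf - documents/ | bzip2 -9 > documents.tar.bz2

Already-compressed formats such as JPEG, MP3 and Ogg will barely shrink whichever format you pick.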
I have about 22 GB of music (MP3 & Ogg) on my laptop hard drive. I also have an unused Sony MP3 player with a 20 GB hard drive. What I want to do is back up the 22 GB into the 20 GB of space. The music does not need to be playable on the Sony player; I'm just using it as a backup device. OK, two issues:
1. When I've tried compressing (tar.gz) MP3 files, little to no space is saved; I assume an MP3 is pretty compressed already. Is there another way to compress effectively? I don't want to reduce the bitrates of the individual music tracks.
2. I formatted the Sony HD using ext4, but this leaves me with only 16 GB of usable space. I tried FAT32 and this left me with about 18 GB.
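MP3 and Ogg are already compressed, so tar.gz (or bzip2/xz) will not win back meaningful space. The missing ext4 capacity is mostly the roughly 5% of blocks reserved for root, plus the GB-versus-GiB accounting difference; on a data-only disk that reservation can be dropped. A sketch, assuming the player shows up as /dev/sdb1 (check with df or fdisk -l before running tune2fs):
Code:
# a plain, uncompressed tar is enough for a backup copy of the music
tar -cf /media/player/music-backup.tar ~/Music

# reclaim the blocks ext4 reserves for root (roughly 1 GB on a 20 GB disk)
tune2fs -m 0 /dev/sdb1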
I have a copy of the MD5 algorithm and I'm taking a look at the source. It's pretty interesting, but there are a few things that I'm curious about and I was wondering if anyone a bit more intuitive than I am could help me out. The function declarations in the MD5 files are a bit unfamiliar to me. There is a macro called PROTO_LIST, which I'm still not sure about: what exactly is it doing? It's littered everywhere throughout the source. The signature here isn't too unfamiliar to me, with the exception of the position of the PROTO_LIST macro. So here is a function with an unnamed argument of type MD5_CTX*. To me, this resembles an initializer list found in C++ with constructors, but I certainly don't think that is the case here. So my questions about this are (1) how is this legal code in C, and (2) what functionality has the PROTO_LIST macro provided for the function?
I use mplayer to play my media files. Occasionally I want to take screenshots of media I am playing. I use the -vf screenshot option, which generates shotxxxx.png files. The issue is that all those PNGs are barely compressed and usually large. When dealing with hi-def media they are extremely large, each 4-5 MB in size at least. Is there a way to set an option for the compression of the PNG images? If I use ImageMagick, the same files get compressed down to about 1.5 MB.
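I'm not aware of an mplayer switch that sets the PNG compression level for -vf screenshot, but re-compressing the shots afterwards is lossless, so nothing is thrown away. A sketch, assuming the files match shot*.png:
Code:
# losslessly recompress every screenshot in place with ImageMagick
mogrify -define png:compression-level=9 shot*.png

# or with optipng, which only rewrites a file if it gets smaller
optipng -o2 shot*.png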
I am working on a project where I am dialing out of a modem! Old stuff, yes, but the modem allows my device to send info from remote sites, from my database, through a phone line, so that the IT departments there don't have to worry about my device being a security issue on their networks.
Anyway, the modem I'm using isn't incredibly well designed, and when a certain ASCII char is read by the modem, it treats it as an EOF indicator. It is also important that the files I send are compressed.
My question is: does anyone know of a compression format that lets me disallow the use of certain ASCII chars in its output?
just as an illustration:
Device --------> Modem ---------> Off-site
and the Modem stops talking to the device when a certain char is passed to it.
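One common workaround for a byte the link can't carry is to compress first and then armour the compressed stream into a safe character set: base64 output contains only printable ASCII, so as long as the problem character is outside the base64 alphabet it never reaches the modem. The trade-off is roughly a 33% size increase on top of the compressed data. A sketch with placeholder file names:
Code:
# sender: compress, then encode the bytes as printable ASCII
gzip -c report.dat | base64 > report.gz.b64

# receiver: undo both steps
base64 -d report.gz.b64 | gunzip > report.dat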
I have just installed the FreeArc compression utility from [URL] and it works smoothly, but only in command-line mode.
Commands:
  a       add files to archive
  c       add comment to archive
  ch      modify archive (recompress, encrypt and so on)
  create  create new archive
  cw      write archive comment to file
  d       delete files from archive
  e       extract files from archive ignoring pathnames
  f       freshen archive
  j       join archives
  k       lock archive
  l       list files in archive
  lb      bare list of files in archive
  lt      technical archive listing
  m       move files and dirs to archive
  mf      move files to archive
  r       recover archive using recovery record
  rr      add recovery record to archive
  s       convert archive to SFX
  t       test archive integrity
  u       update files in archive
  v       verbosely list files in archive
  x       extract files from archive
Options:
  --                        stop processing options
  -ad --adddir              add arcname to extraction path
  -aeALGORITHM --encryption=ALGORITHM   encryption ALGORITHM (aes, blowfish, serpent, twofish)
  -agFMT --autogenerate=FMT  autogenerate archive name with FMT
  -apDIR --arcpath=DIR       base DIR in archive
  -baMODE --BrokenArchive=MODE   deal with badly broken archive using MODE
  -cfgFILE --config=FILE     use config FILE (default: arc.ini)
  -d --delete                delete files & dirs after successful archiving
  -df --delfiles             delete only files after successful archiving
  -diAMOUNT --display=AMOUNT control AMOUNT of information displayed: [hoacmnwrfdtske]*
  -dmMETHOD --dirmethod=METHOD   compression METHOD for archive directory
  -dpDIR --diskpath=DIR      base DIR on disk
  -dsORDER --sort=ORDER      sort files in ORDER
  -ed --nodirs               don't add empty dirs to archive
  -envVAR                    read default options from environment VAR (default: FREEARC)
  -epMODE --ExcludePath=MODE Exclude/expand path MODE
  -f --freshen               freshen files
  -fn --fullnames            match with full names
  -hpPASSWORD --HeadersPassword=PASSWORD   encrypt/decrypt archive headers and data using PASSWORD
  -iTYPE --indicator=TYPE    select progress indicator TYPE (0/1/2)
  -ioff --shutdown           shutdown computer when operation completed
  -k --lock                  lock archive
  -kb --keepbroken           keep broken extracted files
  -kfKEYFILE --keyfile=KEYFILE   encrypt/decrypt using KEYFILE
  -lcN --LimitCompMem=N      limit memory usage for compression to N mbytes
  -ldN --LimitDecompMem=N    limit memory usage for decompression to N mbytes
  -mMETHOD --method=METHOD   compression METHOD (-m0..-m9, -m1x..-m9x)
  -maLEVEL                   set filetype detection LEVEL (+/-/1..9)
  -max                       maximum compression using external precomp, ecm, .....
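From that listing, basic use in a terminal looks like the sketch below; arc is the name FreeArc's command-line binary usually has (it may differ depending on how it was installed), and the archive and directory names are placeholders.
Code:
# create an archive from a directory (a = add files to archive)
arc a backup.arc ~/documents

# list its contents, then extract with full paths
arc l backup.arc
arc x backup.arc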
For some reason it seems like the lowest it goes is ~64kbps (~ implying variable bitrate). So yeah, anything that'll let me do the compression limbo better? (How low can you go?) A different program? Unlock bonus stage? What?
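If the current tool bottoms out around 64 kbps, the command-line lame encoder goes lower; this is just a sketch with an arbitrary input file, and rates this low will sound noticeably worse:
Code:
# constant 32 kbps MP3
lame -b 32 input.wav output.mp3

# or average-bitrate mode aiming for 48 kbps
lame --abr 48 input.wav output48.mp3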
I compressed a directory containing many image files. The directory amounted to 5.3gig. Compressed with TAR using .tgz the compression took a couple minutes at most and compressed down to 4.3 gig. Compressed using .bz2 the compression took about 90 minutes and compressed down to 4.2 gig. Hardly worth the extra time. Do these numbers look normal to you?
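Those numbers look plausible: most image formats are already compressed, so neither gzip nor bzip2 has much left to squeeze, and bzip2 pays a large time penalty for its extra 0.1 GB. A quick way to compare the two on the same data (paths are placeholders):
Code:
# time both compressors and compare the resulting sizes
time tar -czf images.tar.gz images/
time tar -cjf images.tar.bz2 images/
ls -lh images.tar.gz images.tar.bz2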
I want to rip all my CDs to FLAC at the lowest compression (space is not an issue) via Banshee. I have tried a few tracks, but they seem to be ripped at a higher compression level than the ones I have done via Sound Juicer (set at compression 0).
How do I adjust the FLAC settings in Banshee to do this? The option to edit the settings is not available for FLAC. I guess there is a config file somewhere?
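For reference, Sound Juicer's "compression 0" is the same as running flac -0 on the command line. Banshee builds its ripping pipeline from a GStreamer profile whose flacenc element carries the quality setting; exactly where that profile is edited depends on the Banshee version, so treat the notes below as a sketch rather than the definitive location.
Code:
# what "compression 0" amounts to for a single file
flac -0 -o track01.flac track01.wav

# Banshee's FLAC profile ends in a GStreamer pipeline fragment like
#   ... ! flacenc quality=0 ! ...
# lowering quality to 0 in that profile gives the same low-compression rip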
I am trying to convert hundreds of BMPs into a single AVI file, without compression. Since every pixel matters in my case, I don't want any kind of compression. It's fine if the output file is extremely large.
I'd like to know if it's possible to achieve this with packages that come with Ubuntu 11.04. I have tried ffmpeg/mencoder, but they either compressed the output file or the output file was not playable in Totem. I am a new user of both tools. Is there actually a way to get an uncompressed AVI from these tools?
"BMP to AVI Sequencer" [URL] does the job perfectly. I successfully converted 180 1080P BMPs into a 1G avi, running at 30 fps. Unfortunately I need a command line tool this time.
I want to change the Linux scheduling algorithm for some of my processes, but when I click on processes in KSysGuard and choose renice, all of the processes show normal-level CPU scheduling. Why is that, and why is there no priority?
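Renice only adjusts the niceness within the default (SCHED_OTHER) policy, which is why KSysGuard keeps showing "normal"; moving a process to a different scheduling class is a separate step done with chrt. A sketch with a made-up PID:
Code:
# raise the priority of PID 1234 within the normal policy
renice -n -5 -p 1234

# show which scheduling policy and priority the process currently has
chrt -p 1234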
Is the Linux kernel a priority-preemptive kernel? If it is, where does it use the round-robin scheduling algorithm? When processes are scheduled onto the processor, which scheduling algorithm decides the allocation?
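Ordinary processes are handled by the default SCHED_OTHER policy (CFS in current kernels); round-robin only applies to tasks explicitly placed in the SCHED_RR realtime class, which preempts normal tasks by priority. The class can be set per process with chrt; the command name below is a placeholder.
Code:
# start a command under round-robin realtime scheduling (requires root)
chrt -r 10 my_realtime_task

# or under SCHED_FIFO, which runs until it blocks or yields
chrt -f 10 my_realtime_task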
There are 3 nodes: A, B and C. Node A wants to send information to Node B, but it does so by sending it to Node C first, which then forwards it to B. Similarly, Node B sends to A. C does not send to both A and B simultaneously. This is the built-in algorithm, but I want to change it so that A and B send their packets to C, and C sends both packets to A and B combined by ORing them. On the receiver side, node A can recover the wanted packet, and so can B. Where do I change the algorithm?
Is there any compression software that supports PAQ8 and its variants with a graphical interface for Fedora (I'm using x86_64)? What's the best compression algorithm known to man? Time taken to compress is not a problem; I just need the best compression available. Is there any program for Linux?
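I don't know of a graphical front-end, but PAQ-family compressors are usable from the command line; one that is packaged for several distros is lrzip, whose -z switch uses the ZPAQ back end (a later design from the PAQ author), so treat this as a nearby option rather than PAQ8 itself. A sketch:
Code:
# very slow, very tight compression of a single large file
lrzip -z bigfile.tar

# decompress
lrzip -d bigfile.tar.lrz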
Why can't I make a 7z archive with File Roller? I have already installed p7zip, p7zip-plugins, and 7za. I don't understand what the problem is. I can make a 7z archive from the command line with no problems. I keep getting this message, but it doesn't tell me what the error is. Quote: An error occurred while adding files to the archive.
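Since the command line already works, it may help to compare against the standard 7za invocation below (paths are placeholders); starting file-roller from a terminal sometimes prints the underlying error that the dialog hides.
Code:
# the command-line route that works
7za a archive.7z ~/documents/*

# verify the result
7za t archive.7z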
I'm trying to learn how to do Lempel-Ziv-Welch compression using pen and paper for my algorithms and data structures class. Unfortunately, we have only a couple of examples in our book of how it is done. I'd like to practice compressing and decompressing text using it, but I need a way to check whether I'm doing it right or wrong.
So I'm looking for some preferably free/open source program which can compress and decompress LZW for Windows or GNU/Linux. Programs without binary distributions are fine too.
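The classic Unix compress utility (packaged as ncompress on most distros) implements LZW, so it can serve as a rough sanity check, although its on-disk .Z format adds a header and variable-width codes beyond the fixed-width textbook version. A sketch:
Code:
# LZW-compress a text file, producing example.txt.Z
compress example.txt

# decompress it again
compress -d example.txt.Z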
I'm installing fail2ban to improve the security of a home Asterisk server which from time to time becomes the target of some SIP account cracker and/or SSH brute force attack. For those not familiar with fail2ban, this utility monitors log files to find matches with user-specified expressions in order to identify a brute force attack, then configures iptables rules to block the offending IP. Here's an example:
Code: NOTICE[1734] chan_sip.c: Registration from '"613"<sip:613@xx.xxxx.xxx.xxx>' failed for 'yyy.yyy.yyyy.yyy' - No matching peer found
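For a log line like that, the fail2ban filter just needs a failregex whose <HOST> tag captures the address after "failed for". The snippet below is a minimal sketch; the file names, jail name and log path are assumptions to adapt to the actual setup.
Code: /etc/fail2ban/filter.d/asterisk.conf
[Definition]
failregex = NOTICE.* .*: Registration from '.*' failed for '<HOST>' - No matching peer found
ignoreregex =

Code: addition to /etc/fail2ban/jail.conf
[asterisk-iptables]
enabled  = true
filter   = asterisk
action   = iptables-allports[name=ASTERISK]
logpath  = /var/log/asterisk/messages
maxretry = 5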
I finally got K9copy to work and I'm using it to extract DVDs to my hard drive. When I extract a dual-layer DVD it becomes a 5 GB DVD. In the settings I have the DVD size as 9 GB, because I assumed this would avoid any compression. When I use the K9copy assistant to change the shrink factor before ripping, I can only change the shrink factor of some titles from 2.50 to 1.67. I want the shrink factor to be 1, since that seems to ensure the output file is the same size as the title on the DVD. I want the copied DVD to be the same size as the original. Hard drive space is no problem, since I have 6 terabytes free.
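If the goal is a byte-for-byte copy with no recompression at all, another route is to skip the transcoder and mirror the disc structure directly; dvdbackup does that without touching the video streams. A sketch (device node and output directory are placeholders):
Code:
# mirror the whole DVD to the hard drive, no shrinking involved
dvdbackup -M -i /dev/dvd -o ~/dvd_rips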