How can I create a multipart rar file in Linux using the official console rar client?

RAR 3.90 Copyright (c) 1993-2009 Alexander Roshal 16 Aug 2009
Shareware version Type RAR -? for help

I want a multipart rar with each part size being 150 MB.
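A minimal sketch using rar's volume switch; the archive and folder names are placeholders:

```sh
# -v150m splits the output into 150 MB volumes (archive.part1.rar, ...).
rar a -v150m archive.rar myfolder/
```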
I can do:

mkdir messages

and then:

touch messages/hello.txt

Is there a command that will do both: create the directory if it doesn't exist, and then the empty file? Something like:

touch -p messages/hello.txt
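A sketch of a small helper that behaves like the hypothetical touch -p; the function name is made up:

```sh
# Create the parent directory if needed, then the (empty) file.
mktouch() { mkdir -p -- "$(dirname -- "$1")" && touch -- "$1"; }
mktouch messages/hello.txt
```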
I need to write a short script that will compress a specific folder that's on the Desktop (and all its contents) and also encrypt it with a password that is inside the script, meaning it won't ask for a password and verification when compressing and encrypting.
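A sketch using zip, with a placeholder folder name and password; note that -P passes the password non-interactively but makes it visible in the process list while the command runs, and classic zip encryption is weak:

```sh
#!/bin/sh
# Password hard-coded in the script, as requested (placeholder value).
PASSWORD='s3cret'
zip -r -P "$PASSWORD" "$HOME/backup.zip" "$HOME/Desktop/myfolder"
```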
I am trying to create a self-extracting file for Windows from Ubuntu 10.04 using 7zip.
I tried these commands:
7zr -sfx7z.sfx a output.exe *.txt
7zr -sfx/usr/local/etc/7z.sfx a output.exe *.txt
Both show this error:
Error: can't find specified sfx module
System error: E_FAIL
I have all the .sfx modules in /home/username/.sfx/ because I used them previously with "rar" and it worked fine, but it's not working for 7zip. I also copied 7z.sfx to /usr/local/etc/ to test the second command as shown here, but I get the same error.
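One hedged thing to try: the reduced 7zr build may not include SFX support, so a sketch using the full 7z binary from the p7zip-full package (whether -sfx accepts a full path can depend on the p7zip version; copying 7z.sfx next to the 7z binary itself is another commonly reported fix):

```sh
# Same module path as in the question; 7z (not 7zr) from p7zip-full.
7z a -sfx/usr/local/etc/7z.sfx output.exe *.txt
```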
I use PuTTY to get to my RHEL 5.3 workstation from my Windows laptop.

Typically, if I want a new terminal on my Windows 7 workstation from another terminal or mc, I have to type start and I will see a new terminal window running the default shell.

QUESTION: What is the equivalent command in RHEL 5.3 (and/or Solaris) to create a new terminal window from the command line? I will be entering this command from the shell prompt or mc's command line.

In Windows, if I want to start another terminal and run a program in that terminal, I can do "start program.exe arg1 arg2". This creates a new terminal window and runs program.exe in it; I don't have to create a terminal and then run the program in a separate step. How can I do this in Linux?
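A sketch assuming an X session is available (over PuTTY this means X11 forwarding plus an X server on the Windows side, and DISPLAY must be set); xterm is the lowest-common-denominator terminal on both RHEL and Solaris:

```sh
xterm &                         # new terminal window running the default shell
xterm -e program arg1 arg2 &    # new terminal that runs a program directly
```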
I am currently interning at a place and my job is essentially to learn UNIX. My supervisor gives me problems here and there to help guide my learning, but for the most part I'm teaching myself. Needless to say, I have run into a few obstacles. For instance:

Create a *one* line command that, using tar, will collect the full /usr/local directory (you need to run this as root again) and copy the whole /usr/local structure under /opt. For example /usr/local/bin/hello will become /opt/local/bin/hello, etc. I want this as follows:

1. /usr/local is collected by tar, but the output of this tar command is its stdout
2. what you get from the previous stdout, you compress with gzip and send it to stdout again
3. get this output and decompress with gzip
4. get this output and pipe to tar in a way that will extract the tree under /opt

If anyone knows how I could go about doing this, please let me know, or at the very least point me in the right direction. What I've got so far (which could be completely wrong) is:

tar cvf - usr/local/ | gzip -c - | gunzip -c - | tar xvf -

In theory I feel like this should work, except for extracting the tree under /opt; I'm kinda stuck there.
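A sketch of one way to finish it, run as root; starting tar from /usr makes the tree extract as /opt/local/..., and gzip/gunzip stay in the pipe only to satisfy steps 2 and 3:

```sh
(cd /usr && tar cf - local) | gzip -c | gunzip -c | (cd /opt && tar xf -)
```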
I work with Debian Squeeze on my laptop and I have a 160 GB external hard disk. The disk was formatted FAT32, but I decided to reformat it as ext2. I did it with fdisk from the command line and everything went well. Unfortunately, when I mount the hard drive (which is auto-mounted by Debian), it has root as both owner and group, so I can't write to it because I have no permission. Is there a setting when creating an ext2 partition so that the logged-in system user is the owner, giving me the right permissions every time?
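A sketch of the usual approach: ext2 stores ownership on disk (there is no FAT-style uid= mount option for it), so change the ownership of the filesystem's root directory once while it is mounted; the mount point and user name are placeholders:

```sh
sudo chown youruser:youruser /media/disk
```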
I need to be able to convert HTML email messages saved as text files (.eml or .msg) to PDF documents, one PDF per email, retaining formatting and images.
Are there any Linux tools that will allow me to do this from the command line (so it can be scripted)?
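A sketch of one possible pipeline, assuming the message body is HTML and wkhtmltopdf is installed; a real .eml usually needs its HTML part extracted from the MIME structure first (munpack is from the mpack package, and the extracted file names here are illustrative):

```sh
munpack message.eml                 # writes out the MIME parts
wkhtmltopdf part1.html message.pdf  # render the HTML part to PDF
```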
I have a project due for my Intro to C++ class and we are supposed to generate a file listing: the program takes as input a C++ source file with a .cpp extension and makes a copy of it with a .lst extension that has a line number preceding each and every line.
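For reference, the target output can be mocked up with standard tools (the assignment itself presumably wants the logic written in C++); the file names are placeholders:

```sh
# Number every line, including blank ones, and write the .lst copy.
nl -ba hello.cpp > hello.lst    # or: cat -n hello.cpp > hello.lst
```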
Let's say I have a link to a file: http://www.domain.com/dir/myfile.ext

Is there a command line tool that will allow me to download this file? I'm looking for something like: download <http address> ... is there anything that simple?
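Both of the standard fetchers are essentially that simple; a sketch with the URL from the question:

```sh
wget http://www.domain.com/dir/myfile.ext
curl -O http://www.domain.com/dir/myfile.ext   # -O keeps the remote file name
```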
I want to download a file from the Linux command line. Basically I'm using ssh and I'm trying to download a file to my file system on my laptop. How can I do that from the command line?
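Since the session is already over ssh, a sketch using scp run from the laptop side; the host, user, and paths are placeholders:

```sh
scp user@remote.host:/path/to/file /some/local/directory/
```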
I have a jar and I need to replace a class in it. At the moment I can only open it with Archive Manager and then drag and drop the newly compiled class into the jar, but I find this really tedious. Can I do it with just a command?
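A sketch using the jar tool's update mode; the class path must mirror the package structure, and the names are placeholders:

```sh
# Run from the directory containing com/; replaces (or adds) the class.
jar uf app.jar com/example/Foo.class
```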
I want to list all the files that don't have a copy with the same filename with -1 somewhere in it. So, in the example above, the results would be 3.png.
NB: the file and its copy with "-1" in it will be the same filesize, if that helps.
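A sketch, assuming the copies live in the same directory and are named like 3-1.png next to 3.png (i.e. dotted file names with -1 inserted before the extension):

```sh
for f in *; do
  case $f in *-1*) continue ;; esac                 # skip the copies themselves
  [ -e "${f%.*}-1.${f##*.}" ] || printf '%s\n' "$f" # print files with no copy
done
```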
I've got a Debian Squeeze computer on which the graphics have packed up, but the terminal in single user mode works perfectly fine.

There are a few files on this Debian computer that I want to transfer off to a networked computer, but I have no idea how to do this.

The destination computer is a freshly re-setup Mandriva install, without (as yet) Samba. I don't think it's necessary, though. The Mandriva install works fine, has graphics, etc., but can't see the Debian Squeeze computer on the network, possibly because it's in single user mode, hence the problem of how to transfer the files using only a command line.
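A sketch pushing the files from the Debian side with scp, assuming the Mandriva box runs an SSH server and that the network may still need bringing up in single user mode (the interface name, IP, and paths are placeholders):

```sh
dhclient eth0                                    # only if the interface is down
scp /path/to/files user@192.168.1.10:/home/user/
```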
In Linux, I'd like to know how to find the file(s), if any, which are using a particular sector on the hard drive (ext2/3). There is a similar question here regarding Windows; however, I need a Linux command-line solution (this is a headless system).
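A sketch using debugfs from e2fsprogs: icheck maps a filesystem block to an inode, and ncheck maps that inode to a path. debugfs counts in filesystem blocks rather than disk sectors, so the sector number has to be converted to a block number first (the block and inode numbers below are placeholders):

```sh
debugfs -R 'icheck 123456' /dev/sda1   # block -> inode
debugfs -R 'ncheck 7890' /dev/sda1     # inode -> path
```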
I need to download a file from a website which has a URL formatted like:
[URL]
This redirects to a .zip file which has to be saved. There is also a need to authenticate based on username and password.
I tried to use wget, curl and lynx with no luck.
UPDATE:
wget doesn't handle the redirection: it simply downloads the webpage instead of the zip file. curl gives the error "Maximum redirection exceeded > 50". lynx also gives the same error.
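A hedged sketch of the flags usually involved, assuming the site uses HTTP basic auth (a cookie/session login needs more work, e.g. wget's --save-cookies against the login page first); USER, PASS, and the URL are placeholders:

```sh
wget --user=USER --password=PASS --content-disposition "URL"  # follows redirects
curl -L -J -O -u USER:PASS "URL"  # -L follows redirects, -J -O keeps server name
```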