I have several hundred files on my Windows machine in .nc/.ncs format for a CNC machine. I need to convert them to .txt, which is as easy as opening one in Notepad and saving it as .txt, but there are so many that doing this by hand would take far too long.
The reason I am writing to LinuxQuestions is that I would feel more comfortable loading a live CD and using some sort of terminal command to do this than downloading one of the many "freeware" type programs I have found for Windows (even more so since I have had a rootkit before and had to start all the way over to get rid of it).
I need to know:
1. Is this possible to do from the terminal without super advanced knowledge?
2. Can someone please point me in the right direction: something to read, or an example? (A rough sketch of what I had in mind follows below.)
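This is the kind of loop I was picturing, assuming the files all sit in one directory that I can reach from the live CD (the path is just a placeholder):
Code:
cd /media/windows/cnc-files        # placeholder: wherever the Windows partition is mounted
for f in *.nc *.ncs; do
    [ -e "$f" ] || continue        # skip the literal pattern if nothing matches
    cp -- "$f" "${f%.*}.txt"       # copy each file with a .txt extension, keeping the original
done
Is that the right sort of approach, or is there a cleaner way?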
I have been having a lot of trouble lately with installing from CD/DVD. The DVD reader/writer on this laptop is new. Nevertheless, when trying to install Ubuntu onto an external HD, I get 'input/output error on sr0, logical block (a large number)'. After a long time the boot proceeds to a point, but I never get the actual installation started, and have to shut down manually.
The CD is fine, says the Ubuntu checker. I just installed using my son's laptop, and there was no trouble. Question: does this indicate a motherboard failure? A damaged memory block? Do you know of a diagnostic tool I can use to check the reading of a CD/DVD?
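The only home-grown check I could think of in the meantime is reading the disc back on the problem machine and comparing its checksum with the ISO I burned from; a sketch, assuming the ISO file is ubuntu.iso and the drive is /dev/sr0:
Code:
# number of 2048-byte sectors in the ISO, so we only read back what was actually burned
count=$(( $(stat -c %s ubuntu.iso) / 2048 ))
dd if=/dev/sr0 bs=2048 count=$count | md5sum
md5sum ubuntu.iso                  # the two sums should match if the drive reads cleanly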
I've been looking high and low for a utility program or Perl script or something that can take a Linux directory structure as input and convert it to the MS-DOS 8.3 directory structure.
The purpose of this is to conform to the path format that is expected by my rather old Creative Zen Neeon MP3 player for m3u playlists.
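The closest I've come to rolling it myself is a naive rewrite of each playlist line: upper-case, truncate to 8.3 and swap the slashes. I know this does not reproduce the "~1" collision names a real FAT driver generates, which is why I'd rather find a proper tool. A rough sketch of what I mean (the playlist file name is a placeholder):
Code:
#!/bin/bash
# naive8dot3.sh -- print a DOS-ish 8.3 path for every line of an m3u playlist
while IFS= read -r path; do
    out=""
    IFS='/' read -ra parts <<< "$path"
    for p in "${parts[@]}"; do
        [ -z "$p" ] && continue
        name="${p%.*}"; ext=""
        [ "$p" != "$name" ] && ext=".${p##*.}"
        name=$(printf '%s' "$name" | tr 'a-z' 'A-Z' | tr -d ' ' | cut -c1-8)
        ext=$(printf '%s' "$ext" | tr 'a-z' 'A-Z' | cut -c1-4)
        out="$out\\$name$ext"
    done
    printf '%s\r\n' "$out"         # DOS line endings for the player
done < "$1"
Usage would be something like ./naive8dot3.sh playlist.m3u > new.m3u, but again, it's only an approximation.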
I'm trying to use convert; I have ImageMagick installed. I use this line: convert *.jpg test.pdf but I'm only able to convert a single jpg file to PDF, not multiple files at once. When there's more than one file, I get the following error: Segmentation fault
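The workaround I'm considering, if the all-at-once convert keeps segfaulting, is converting the JPEGs one at a time and then joining the single-page PDFs; pdfunite comes from poppler-utils, which I'm assuming is installed:
Code:
for f in *.jpg; do
    convert "$f" "${f%.jpg}.pdf"   # one PDF per image
done
pdfunite *.pdf combined.pdf        # merge them; the output name is just an example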
I just started Ubuntu 9.04 and it said: "File system check failed. A log is being saved in /var/log/fsck/checkfs if that location is writable. Please repair the file system manually. A maintenance shell will now be started. Ctrl+D will terminate this shell and resume system boot. Give root password for maintenance or type Ctrl+D to continue." I pressed Ctrl+D, and after login it said that it cannot find /home. I started with the live CD:
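From the live CD my plan was to run a manual check on the affected partition, something like the following (assuming /dev/sda1 is the right partition; I'd confirm with fdisk first and keep it unmounted):
Code:
sudo fdisk -l                      # find the partition that holds the installed system
sudo fsck -f -y /dev/sda1          # force a full check and auto-answer the repair prompts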
Notice how big and wide-spaced the fonts on the Clementine playlist are, and how good they look on the appmenu (where my mouse pointer is). This is not because Clementine is Qt4; I've got the same problem with Chrome, Opera, etc. I had been messing with System Settings (the KDE settings tool) the day before the fonts became that wide-spaced, in order to make my KDE apps look more native on my GNOME desktop, but I haven't touched the font settings there.
I was running grsync (an rsync GUI) to back up my root and home partitions to a local external HDD. Home is currently 21.2 GB and root is about 60 GB, but the backup ran for nearly 24 hours before I canceled it, without finishing. How long should an 80-ish GB backup take?
I made sure to disconnect other externals so they wouldn't be backed up as well. It was just my root and home being backed up.
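Next time I'm tempted to skip the GUI and run rsync directly, so I can at least see whether it is actually moving; something like this, where the paths are placeholders for my source and the external drive:
Code:
rsync -avh --progress --stats /source/ /media/external/backup/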
Downloaded the Ubuntu 10.04 ISO from this site. When the download was complete, I got a screen telling me to insert a writable CD, which I did. It went through a format process and then asked me to drag the files to that directory. When I tried to do that, I got a message saying that I was 138 MB short of space. The ISO was 704 MB and the CD had formatted to something over 500 MB. The disc is a CD-RW rewritable CD.
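From reading around, I gather the ISO is supposed to be burned as a disc image rather than formatted and having the files copied onto it. If I ended up doing the burn from an existing Linux box instead, I believe it would be something along these lines (the device name and ISO file name are assumptions):
Code:
wodim dev=/dev/sr0 blank=fast                          # blank the CD-RW first
wodim -v dev=/dev/sr0 ubuntu-10.04-desktop-i386.iso    # burn the ISO as an image, not as a file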
I am running CentOS 5.3. I ran no updates, performed no installs, and changed no configuration immediately prior to this issue. My problem is this: when I run the command startx (default runlevel 3), it takes a long time (5-10 minutes) before GNOME starts, and once it does start, applications will not run. Also, when I try to use sudo (from any environment, even ssh), it takes a long time (5-10 minutes) before the command is executed.
I cannot say for sure, but it seems like this is an intermittent problem. Sometimes X takes a long time to start, but once it starts it will launch programs. Sometimes X takes a long time to start, but once it starts it will only launch certain programs. At present, though, X always takes a long time to start, and I cannot successfully launch any programs.
A while back I had a similar problem to this (X taking a long time to start, sudo taking a long time to execute) and it ended up being a DNS problem. Unfortunately, I cannot remember exactly what it was, and I stupidly did not document it. Maybe this is also DNS related; I don't know.
I don't know what log files to look at for problems with X, Gnome, and sudo taking a long time to start.
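Given the earlier DNS episode, the first thing I'll probably check is whether the machine can resolve its own hostname, since I've read that a missing /etc/hosts entry can make both sudo and X startup hang; roughly:
Code:
hostname                           # note the name printed here
cat /etc/hosts                     # check that the same name appears on the 127.0.0.1 line, e.g.
                                   # 127.0.0.1   localhost   myhostname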
I'm always hesitant to use /var/tmp/, because I never quite know exactly how long the files are kept there for, or even what the directory is used for. What determines when a file gets removed from /var/tmp/, and how is the directory intended to be used?
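My guess is that it is distro-dependent; the places I'd look to see what (if anything) cleans /var/tmp are the tmpwatch cron job on Red Hat-style systems and tmpfiles.d on systemd-based ones, for example:
Code:
cat /etc/cron.daily/tmpwatch       # RHEL/CentOS: shows the age thresholds tmpwatch uses
cat /usr/lib/tmpfiles.d/tmp.conf   # systemd distros: look for an entry like '... /var/tmp 1777 root root 30d'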
Suppose I am almost sure that from last Thursday at 3:00 pm until 10:00 pm the same day I was away from the machine, but not absolutely sure. Linux probably knows better than I do. Maybe there is a text file from which I could infer that the keyboard was idle from Thu 2:40 pm until 11:10 pm; in that case, I would reach absolute certainty. But where could such a file be in the / tree, or what could its name be (in the latter case an updatedb followed by locate would do)?
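The only indirect check I've thought of so far is listing files under my home directory that were modified inside that window (GNU find's -newermt, which I believe needs a reasonably recent findutils), plus the login records; the dates below are placeholders:
Code:
find "$HOME" -newermt "2010-07-15 14:40" ! -newermt "2010-07-15 23:10"
last -F                            # full login/logout timestamps from /var/log/wtmp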
I managed, in a bit of an "Oh shit, I'll lose my data" panic, to be stupid enough to xkill the Ubuntu 10.04 installer while it was making an ext3 file system on my external hard drive. This drive was formatted ext2, I had not set it to be formatted (at least not consciously), and it had a LOT of data on it. Only half an hour after killing it did I realize it might simply have been converting rather than formatting.
Anyway. Because this was my own fault, I started searching for possible solutions. I tried e2fsck, ran testdisk and gpart, as well as several data recovery programs from both Windows and Linux. All I have been able to get back with those is a whole bunch of corrupted files and some music.
Now I have unleashed e2salvage on the beast, which so far has been looking promising, apart from some "directory inside inode table" errors. It found almost 20,000 directories and 174 directory beginnings.
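One thing I haven't tried yet, and am not sure is even worth anything after mkfs was interrupted, is pointing e2fsck at one of the backup superblocks (assuming /dev/sdb1 is the external drive):
Code:
sudo dumpe2fs /dev/sdb1 | grep -i superblock   # list primary and backup superblock locations
sudo e2fsck -b 32768 /dev/sdb1                 # retry the check against a backup superblock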
I use a long mount command to mount a NAS drive but have to retype it every time I need to mount the drive. Because it is on my laptop, I only need to mount the drive from time to time.
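What I'm hoping for is something like an /etc/fstab entry with noauto and user set, so that a plain mount of the mount point does the job on demand. The address, share name, credentials file and filesystem type below are assumptions about my setup (mine happens to be a CIFS share; it could just as well be NFS):
Code:
# /etc/fstab
//192.168.1.10/share  /mnt/nas  cifs  noauto,user,credentials=/home/me/.nascreds  0  0
Then mounting it when I need it should just be: mount /mnt/nas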
I have two minor problems with Ubuntu which I've been running on my aging Fujitsu-Siemens Lifebook for a couple of years now.
First, I recently upgraded to v10.04 with no problems. However, I've just applied the latest updates via Update Manager and the laptop will now hang after the welcome screen.
There are no error messages, just a black screen, and the case fan runs at full tilt until I force a shutdown. I've waited 5 or so minutes to see if it's actually doing anything, but it would appear that it isn't.
The only way to boot the laptop is to choose an older Grub menu option, then it boots up fine. It may very well be a hardware issue because another (newer) laptop in the household has updated no problem.
Next, I tried to change the password of the admin account using "Users and Groups". It appeared to work, but then I had to use the old password to log in again. After logging in I am prompted for the new password, with an error message saying that the "token ring" password (I think it's token ring; I'm doing this from memory) doesn't match.
Again I can live with this quirk but it would be nice to put it right.
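For the password problem, what I've found so far suggests it's the GNOME keyring still holding the old password. The workaround people mention (which I haven't tried yet) is removing the old login keyring so a new one gets created with the current password, at the cost of losing whatever saved passwords were in it:
Code:
rm ~/.gnome2/keyrings/login.keyring   # Ubuntu 10.04 path; a fresh keyring is created at next login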
I'm having trouble with Vim in any terminal emulator I use. I have a link (vi) to vim. Occasionally it takes a very long time to load, whether I use 'vi' or 'vi file'. Previously, when I could, I would restart X and then it would load instantly again, but this time I waited and it did load, after a minute or so. Is this a problem with X or with vim?
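One thing I want to rule out is vim waiting on a connection to the X server (for clipboard and window-title support); I believe -X skips that, so I plan to compare:
Code:
vi -X somefile                     # start without trying to connect to the X server
# if that is instant, this line in ~/.vimrc is said to make it permanent:
# set clipboard=exclude:.*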
I have accidentally created a very large folder that contains probably more than a million files. I have tried to delete it using all of these methods:
(1) rm -rf myfolder/
(2) Using Midnight Commander to try to delete the folder
(3) find myfolder/ -type f | while read -r; do rm -v "$REPLY"; sleep 0.2; done
(4) find myfolder/ -type f -print0 | xargs -0 rm -f
The find command would give an out-of-memory error, and the other methods would just freeze the computer. I know a few of the file names, but even if I try ls myfolder/filename0, it hangs.
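Two more things I'm about to try, in case they behave better than building the whole file list up front (the empty directory path is just a placeholder):
Code:
find myfolder/ -type f -delete                 # let find unlink files as it walks, no pipeline
# or the rsync trick: mirror an empty directory over the full one
mkdir -p /tmp/empty
rsync -a --delete /tmp/empty/ myfolder/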
I'm trying to copy a 6 GB file from my laptop to an external USB drive, but it quits at about 4.2 GB every time with a "file size limit exceeded" error. I have checked the output of ulimit -a and there is no limit on file size there. I'm using the Slax live CD for this, as it always gets the job done.
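I suspect the drive itself, so the plan is to check its filesystem type; if it turns out to be FAT32 (vfat), which I understand caps a single file at 4 GiB, I'd either reformat it or split the file. The mount point and file name below are placeholders:
Code:
df -T /mnt/usb                             # shows the filesystem type of the mounted drive
split -b 2000M bigfile.img bigfile.part.   # split into chunks small enough for FAT32
cat bigfile.part.* > bigfile.img           # reassemble later on the destination machine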
I have Ubuntu 9.04 and just installed Sound Converter. I am trying to convert a bunch of .ogg files to MP3 to play on my iPod and it's not working so well. In the Sound Converter options I have it set to convert to high-quality MP3. I choose the folder that the files are in, and after a moment (slow laptop) Sound Converter populates; I hit 'convert' and it shows that the conversion completes in two seconds. All it did was create the new folder structure of artist/album, but there is nothing in there. Not sure what I am missing. I have used Sound Converter before and it worked fine.
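In case it's the MP3 encoder Sound Converter needs that is missing (I gather it relies on a GStreamer LAME plugin that isn't installed by default), the fallback I'd try is a plain loop with ffmpeg, assuming an ffmpeg build with libmp3lame:
Code:
for f in *.ogg; do
    ffmpeg -i "$f" -acodec libmp3lame -ab 192k "${f%.ogg}.mp3"
done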
I am changing the password of a truecrypt file container. This takes around 1 minute. Why?
time truecrypt --text --change /tmp/user1.tc --keyfiles= --new-keyfiles= --password=known --new-password=known --random-source=/dev/null
If I use strace I see that it basically does not do anything: it simply reads lots of random data from /dev/urandom (even if I specified /dev/null as the random source) and finally changes the password:
The find command is taking too long to complete on my machine. When I use the time command, I find that sys time and user time are very small compared to real time. Is my find process not getting scheduled properly?
I interrupted the never-ending find command and got the following statistics:
Real time: 5 min
Sys time: 1.1 sec
User time: 3 sec
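My current reading is that real time being far above user + sys means the process is mostly waiting on the disk rather than being denied CPU; to confirm, I was going to watch the disk while find runs (iostat is in the sysstat package):
Code:
iostat -x 2                        # sustained high %util on the disk would point at an I/O-bound run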
How do I break a long command string into multiple lines in crontab? e.g.
Code:
# the following is a very long and gruesome command to be run at 09:59, Monday to Friday.
59 09 * * 1-5 source $HOME/some-definitions; sh /usr/local/my/long/name/application/bin/hello $(date +\%Y\%m\%d) >>/var/log/my/long/name/application/log/hello.log
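The workaround I'm leaning towards is moving the whole pipeline into a small shell script, where backslash line continuations and unescaped % signs work normally, and calling that from cron; the names and paths just follow the example above:
Code:
#!/bin/bash
# /usr/local/bin/hello-cron.sh
source "$HOME/some-definitions"
sh /usr/local/my/long/name/application/bin/hello "$(date +%Y%m%d)" \
    >> /var/log/my/long/name/application/log/hello.log
Then the crontab entry shrinks to a single short line (after a chmod +x on the script):
Code:
59 09 * * 1-5 /usr/local/bin/hello-cron.sh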