There are three C programs, double, half, and square, which respectively double, halve, and square an integer. I have placed the .out files in the sbin folder, so they run as processes. Now I have to chain these three processes together, so that, for example, double followed by half returns 8. How do I write the code for that?
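A minimal sketch of what I am imagining, assuming each program reads an integer from standard input and writes the result to standard output (the .out names and the /sbin path are just how I have them installed):

echo 8 | /sbin/double.out | /sbin/half.out    # 8 is doubled to 16, then halved back to 8

Is a shell pipeline like this the right way to chain them, or does it have to be done in C with pipe() and fork()?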
I get an SD card and put it in the SD reader. It's empty. I go to my super-important-pictures-for-a-monthly-report folder and select all the files. I select them for MOVE and paste them onto the SD card. When the move/paste process finishes, I click the "Eject" button next to the SD card's name. The card is ejected and I can't access it anymore. I take out the card and put it in my other computer. Of 300 pictures, only 10 are available; the rest are there, but with 0 bytes and unrecoverable. I panic. I go back to my main computer, and my pictures are not there anymore. They were in my Home folder. I panic again. I reboot the computer and boot the LiveCD. I install foremost, scalpel, photorec and about everything else until my USB drive complains about being full. I run everything and I still can't recover my files. I'm in danger of getting fired. Things like this make Windows sound more appealing: when you safely remove a pendrive there, everything really gets written out to it before the removal can ruin anything.
I am used to Ubuntu's simple sharing with Samba: just install it, reboot, and then share the files. Then I click on the Network folder and see all the shared files on the computers in the network.
How do I install it so that I only need to go into the Network folder to see the other computers' shared files? And then, how do I share my own files?
I hope it's not so difficult that I have to change config files by hand.
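For reference, this is the kind of minimal setup I mean, assuming a Debian/Ubuntu-style system (the share name and path are placeholders):

sudo apt-get install samba

Then add a stanza like this to /etc/samba/smb.conf:

[shared]
   path = /home/youruser/shared
   read only = no
   guest ok = yes

and reload it with:

sudo service smbd restart    # on older releases the service may be called samba

After that, shares on the other machines should show up under the Network folder (or via smb://hostname/). Is that really all there is to it?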
I downloaded some e-books in .rar format. When I extract them I get the error: "There is no command installed for RAR archive files. Do you want to search for a command to open this file?"
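In case it matters, this is the kind of thing I expected to work, assuming a Debian/Ubuntu-style system and ebook.rar as a placeholder name:

sudo apt-get install unrar
unrar x ebook.rar

If unrar isn't in the enabled repositories, are unar or p7zip-rar reasonable alternatives?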
In a KDE 4 environment, after downloading some music from Rapidshare with JDownloader, the archives self-extracted with a garbled character in some of the file names. Those files couldn't be renamed or deleted; the file manager said the files didn't exist, which is very weird. The files should have had some Swedish characters in their names. Now I'm stuck with those files on my machine. Does anyone know how to get rid of them?
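One workaround I have seen suggested for undeletable names is removing the file by its inode number instead of its name; a sketch, where 123456 is a placeholder for the inode that ls reports:

ls -li
find . -maxdepth 1 -inum 123456 -delete

Is that safe here, or is there a cleaner fix?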
Before installing Fedora on my laptop I backed up my files onto a USB key. I wiped the hard disk and installed Fedora 12. I copied my files back to my home directory, but all the text files are now binary! Any ideas why this would be, or how to fix it?

Edit (10:54 AM CST): Hmm, it seems that only some files are binary. It looks like some .tgz files are corrupted too.

Edit (11:02 AM CST): Damn. I wonder if the files weren't completely written to the USB key before I removed it. I don't remember there being a safe unmount option in Fedora 10, but I definitely unmounted it before removing it from the laptop.
I recently upgraded from FC12 to FC14 due to errors using yum and rpm. I keep running into mirror sites with RPM files that don't install. These sites have what appear to be valid RPM files, but if I use wget to retrieve a file and run "file" on it, it turns out to be an HTML file. I understand that if a file is corrupted it won't install, but I can open these "RPM" files in Mozilla after changing the extension from .rpm to .html. When I open the renamed HTML file, the browser prompts me to open or save the real RPM file. If I save the RPM file that the HTML file points to, I can download and install it with rpm -ivvh, unless a dependency check fails. How does anyone install with yum if the package is an HTML file with an RPM extension? Here is a link as an example:
[URL]
If I use wget on this file and then run "file" on it, here is the output:
file openoffice.org-calc-core-3.3.0-20.2.fc14.i686.rpm
openoffice.org-calc-core-3.3.0-20.2.fc14.i686.rpm: HTML document text
If I change the extension of this file to .html and open it with my browser, I am prompted to download the RPM file, which turns out to be a true RPM v3.0 file. Is anyone else having this problem, or can anyone point me in the right direction to download the correct RPM file? I am running behind a proxy server and have installed cntlm and configured wgetrc, yum.conf, and .curlrc to use the proxy. I am able to download and install most of the simpler applications, where the dependencies are either already installed or not required.
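For completeness, this is the proxy line I mean under the [main] section of /etc/yum.conf, assuming cntlm is listening on its default local port 3128 (the port may differ on other setups):

proxy=http://127.0.0.1:3128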
I have installed this program OK, but I am new to command lines in the terminal.
I want to convert some wav files to wma files. I have the wav files currently in a folder called Test to make it easy. So I have entered the following command line:
ajpearson@ajpearson-laptop:~/Desktop/pacpl-4.0.5$ pacpl --to wma home/ajpearson/Desktop/Test
and the error message I get is:
error: the following is not a file or directory: home/ajpearson/Desktop/Test
It does not matter what directory I use; I get the same error. I am sure the answer is obvious, but not to me.
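Is the problem simply that the path is missing the leading slash, so it is taken as relative to the current directory? In other words, should the command be:

pacpl --to wma /home/ajpearson/Desktop/Test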
I have a Kingston 8 GB DataTraveler that has been giving me trouble lately. For some reason, after I delete files from it, it still shows up as full, and the files appear in the hidden trash folder. How do I get rid of these files? I can't delete them, as they just show back up. Also, I tried to format the drive with GParted and it won't unmount. When I right-click and select Information, at the bottom it says: "Unable to find mount point. Unable to read the contents of the file system. Because of this, some operations may be unavailable."
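Is it just a matter of deleting the hidden trash folder from the command line and then unmounting, something like this (the mount point and label are a guess for my setup):

rm -rf /media/KINGSTON/.Trash-1000
sudo umount /media/KINGSTON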
I am using the diff command with the -r option to compare a large number of files, including files in subdirectories. My main interest is to find out which files have been changed, not what the actual changes are, and since a lot of files have been changed it would be much easier to view the file names only. Is there an option for diff that does this, or is there a similar tool/command that could do the job?
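The -q (--brief) option looks like it might be what I am after, something like:

diff -rq old_dir new_dir    # old_dir and new_dir are placeholders

which should only report which files differ. Is that the right approach, or is there a better tool?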
1. Every Sunday
2. Find all files older than 1 day
3. Gzip these files
4. Tar up the gzipped files into one tar file
5. Name the tarball with a date stamp indicating what day it was created, so we know which week's files are in it
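A rough sketch of what I think the script could look like, assuming the data lives in /path/to/data (a placeholder) and the script runs from cron early every Sunday:

#!/bin/bash
cd /path/to/data || exit 1
stamp=$(date +%Y%m%d)
# gzip every plain file at least one day old that is not already compressed
find . -maxdepth 1 -type f -mtime +0 ! -name '*.gz' -exec gzip {} \;
# bundle the gzipped files into a date-stamped tarball
tar -cf "archive-$stamp.tar" ./*.gz

with a crontab entry such as:

0 2 * * 0 /path/to/weekly_archive.sh

Is that roughly the right approach?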
I have bought an external USB hard drive on which I back up my three computers every once in a while. Space will quickly be used up, and I can never find that little bit of research I needed yesterday. Here is what I would like to find: an application that eliminates duplicates of identical files and renames files that have changed by appending the last-saved date (yyyymmdd) to the file name. Does such an application already exist?
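For the duplicate-elimination part, fdupes looks like a possible fit (assuming the drive is mounted at /media/backup, which is a placeholder):

fdupes -r /media/backup     # list duplicate files
fdupes -rd /media/backup    # prompt for which copy to keep and delete the rest

The date-stamp renaming would presumably still need a small script on top of that.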
Until now I haven't had to dabble with bash scripts.
I have a program that reads in data files. These are named datafile01_R, datafile01_G, datafile01_B, and they then increment: datafile02_R, etc. I have about 600 of these. The program reads in three data sets per run, so files 01 R, G, and B.
The program then does its magic and outputs about 40 different files which, depending on the file, go to folders named R, G, B, psa, or tracking.
The program has configuration files that say where the output files should go when analyzed, and also config files that specify which data sets to read in.
At the moment I have to run one set of data, then go in and manually change the input file location, and run again. But even though it is a different data set, the new set overwrites the old set in one of the output folders. So I need a way to increment the output filenames after they are written and before the program is run again with the new data set.
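A rough sketch of the kind of script I imagine running between data sets, assuming the outputs land in the folders named above and that a run number is passed in (the .dat extension is only a placeholder for whatever the output files are really called):

#!/bin/bash
# tag one run's output files with a run number so the next run cannot overwrite them
run=$1    # e.g. 01, 02, ...
for dir in R G B psa tracking; do
    for f in "$dir"/*.dat; do
        [ -e "$f" ] || continue
        mv "$f" "${f%.dat}_run${run}.dat"
    done
done

Is that the sort of thing people do, or is there a neater way?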
I've discovered that Dolphin seems to lose random files when copying many large folders.
I first noticed this a few months ago when I tried to copy my music library from one folder to another on the same HDD. It consisted of around 600 folders and 6500 files. During the copy there were no errors but after the copy I found that some of the newly copied folders were missing files. I put it down to human error or a glitch.
Yesterday I tried to copy 13 folders containing rips of some of my DVDs. Each folder basically had one film of either 700MB or 1.4GB. Again no errors showed up during the copy but I found 3 of the newly copied folders were empty.
It's not so critical with music or films but I can't afford to lose work data like this.
Has anyone experienced or seen a similar problem with Dolphin? I'm going to have to do some more extensive testing but this is not good.
The first time I noticed the problem I was running KDE 4.3.4 (I think), and the latest occurrence was with KDE 4.4.0.
The VOB file that I want to burn to a DVD is around 7.9 GB. I want to know if it's possible to split it in order to burn the pieces onto two different DVDs.
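A sketch with split and cat, assuming the file is called movie.VOB (a placeholder) and that the pieces only need to be stored on the DVDs as data, not played directly from them:

split -b 4300m movie.VOB movie.VOB.part_
# later, to rebuild the original on a hard disk:
cat movie.VOB.part_* > movie.VOB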
I changed to Wubi last night. I decided to go that way because I was told it was a safe way to give Linux and Ubuntu a try, with no special effort needed and easily deleted if I'm not pleased. Thankfully I am pretty satisfied with the results; I've dealt with most of the issues I've faced so far successfully, and it is running fine apart from one important thing. I'm using my laptop, which has a 250 GB HDD. Under Vista there are two drives, C: and E:, by default; they split the disk between them. Windows is on C: and E: is used for my additional data (movies, music, etc.). When I installed Wubi I installed it on E:, since I thought that would be better and it had more space.
Now, though, while I can access through Ubuntu all my files that were located on C: (Vista), I can't access any of the ones on E:, and searching for the files doesn't help either. The "Vista HDD", as Ubuntu describes it, is a 250 GB disk in Computer, which means it should contain both of the Vista drives. But sadly that's not the case, and I can't seem to find them anywhere.
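Presumably E: is a second NTFS partition that simply isn't mounted. A sketch of how I imagine checking and mounting it manually, where /dev/sda3 is only a guess at the device name:

sudo fdisk -l                                  # look for the second NTFS partition
sudo mkdir -p /media/vista_e
sudo mount -t ntfs-3g /dev/sda3 /media/vista_e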
What would be a nice, simple command to go through all files in a directory (no sub-directories), and change all the MP4 Video files I have to MP3 audio files (keeping the original filenames except for changing the "mp4" extension to "mp3")?
The files in question are videos taken with one of those Flip cameras, but I only need the audio from them.
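A minimal sketch of what I have in mind, assuming ffmpeg is installed with MP3 (libmp3lame) support and the command is run inside the directory in question:

for f in *.mp4; do
    ffmpeg -i "$f" -vn -acodec libmp3lame -q:a 2 "${f%.mp4}.mp3"
done

Here -vn drops the video stream and -q:a 2 picks a reasonable VBR quality.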
I have noticed that the files and folders search in Unity only shows those files which have been amended (or possibly just opened) since the install.
I was wondering if there is a way I could have the search index (or something roughly equivalent) all the files on my machine. This is especially important given that I reinstall the OS every six months on each new distribution cycle, copying all my old files across.
Without being able to see my old files, the search is pretty much reduced to a recent-history search.
The problem I have is that I need to replace a more complex string, like this: old string /mnt/stor6-wc2-dfw1/627896/982574/, new string /mnt/stor8-wc2-dfw1/369587/302589/. There I don't know how to do it, since the / is what separates the old string from the new one, and the strings I want to replace contain / themselves. I would also like to know how to specify in which folder to do the replacement; for example, I want it to search and replace in all files under the /var/www/mysite/htdocs folder.
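A minimal sketch of what I mean, using a different sed delimiter (a pipe) so the slashes in the paths don't need escaping; whether to restrict it to certain file types is still an open question:

find /var/www/mysite/htdocs -type f -exec sed -i 's|/mnt/stor6-wc2-dfw1/627896/982574/|/mnt/stor8-wc2-dfw1/369587/302589/|g' {} +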
I was reading [URL], which has the following in a "Warning" box: "Warning: It is not a good idea to configure InnoDB to use data files or log files on NFS volumes. Otherwise, the files might be locked by other processes and become unavailable for use by MySQL." What does that mean, and how can one configure or check to ensure the above is being followed?
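A sketch of one way I imagine checking it, assuming the default datadir of /var/lib/mysql (adjust to whatever the first command reports):

mysql -e "SHOW VARIABLES LIKE 'datadir';"
df -T /var/lib/mysql    # the Type column should not say nfs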
The permissions for my home directory were accidentally changed from 'access files' to 'create and delete files'. I changed them back, but ever since then I am not able to change any preferences or settings at all: power management, themes, panels, Emerald, anything. My user account is supposed to be the administrator, and all the user privileges are checked. How do I get control of my computer back?
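One thing that is often suggested in this situation is making sure the home directory is owned by the account again; a sketch, where username is a placeholder for the actual login:

sudo chown -R username:username /home/username
chmod 700 /home/username

Would that explain the settings not sticking, or is something else going on?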
Is there a method at the command line to copy files from one location to another and retain the source files' group and owner? I'm migrating some MySQL files from one machine to another, and I first want to back up the original files currently in the directory. Some have owner:group mysql:mysql, some have root:mysql, and so on. When I copy them at the CLI or in Nautilus, everything changes to root, because I run sudo cp, or gksudo nautilus and copy via the GUI.
Since it is MySQL data I could simply do a dump of the database and restore it on the other machine. But there are about 20 databases, and I want to do this via a copy because it will be faster, or at least that is what I think.
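This is the kind of copy I mean, with placeholder paths; cp -a (or rsync -a run as root on both ends) keeps the owner, group, permissions, and timestamps:

sudo cp -a /var/lib/mysql /backup/mysql
# or, to push it straight to the other machine:
sudo rsync -a /var/lib/mysql/ root@otherhost:/var/lib/mysql/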