Now I want to append the contents of list2.cfg to list1.cfg (which is possible with cat list2.cfg >> list1.cfg), but I want to check whether each record in list2.cfg is already present in list1.cfg: if it is, don't append it; otherwise append it.
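A minimal sketch of the conditional append, assuming both files hold one record per line (the sample records here are made up):

```shell
# Sample data standing in for the two config files from the question.
printf 'alpha\nbeta\n'  > list1.cfg
printf 'beta\ngamma\n'  > list2.cfg

# Append only the records of list2.cfg that list1.cfg does not already
# contain: -F fixed strings, -x whole-line match, -v invert, -f patterns.
grep -Fxvf list1.cfg list2.cfg >> list1.cfg

cat list1.cfg   # alpha, beta, gamma -- beta is not duplicated
```

grep loads the pattern file before it starts writing, so appending to list1.cfg in the same command is safe here.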
I need a script that can do this: a script that searches all directories and subdirectories for .html files. When a .html file is found, it creates an index.html file in that folder. It then edits the index.html file and inserts links to all of the .html files in that folder into the body. If no .html files are found, it searches for folders and creates an index.html file with links to all of the folders.
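A minimal sketch of such a script, here run against a small made-up demo tree ("demo", "docs", "one.html" are invented names); it overwrites any existing index.html:

```shell
#!/bin/sh
# Build a demo tree, then walk every directory and write an index.html:
# links to the .html files in that folder if any exist, otherwise links
# to its subfolders.
mkdir -p demo/docs demo/empty/inner
printf '<p>one</p>\n' > demo/docs/one.html

find demo -type d | while read -r dir; do
    {
        echo '<html><body>'
        found=0
        for f in "$dir"/*.html; do
            [ -e "$f" ] || continue
            name=$(basename "$f")
            # Skip the index file itself (the redirection creates it).
            [ "$name" = index.html ] && continue
            printf '<a href="%s">%s</a><br>\n' "$name" "$name"
            found=1
        done
        # No .html files here: link the subfolders instead.
        if [ "$found" -eq 0 ]; then
            for d in "$dir"/*/; do
                [ -d "$d" ] || continue
                name=$(basename "$d")
                printf '<a href="%s/">%s</a><br>\n' "$name" "$name"
            done
        fi
        echo '</body></html>'
    } > "$dir/index.html"
done
```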
How can I write to a file multiple times using fwrite without affecting the previous writes? The method shown below accepts a file name, buffer, and offset. It opens the file in read/write mode and writes the contents of the buffer at that offset.
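The question is about C's fwrite, but the same write-at-offset idea can be sketched from the shell with dd: seek= positions the write and conv=notrunc keeps the rest of the file intact, which is what opening in read/write ("r+") mode and seeking does in C. The file name and offset below are made up:

```shell
# Overwrite 3 bytes at offset 4 of a file without truncating it,
# the shell analogue of fopen("r+") + fseek + fwrite.
printf '0123456789' > sample.bin
printf 'XYZ' | dd of=sample.bin bs=1 seek=4 conv=notrunc 2>/dev/null
cat sample.bin   # 0123XYZ789
```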
I've got a bash script I'm using to download a text-file list of links via axel. What I'd like to do is automate moving completed links inside the for loop once axel has successfully finished the download. This is what I've got. I figure I can just echo-append the line to a new file, but what is the easiest way to delete the line with the link I just downloaded?
#!/bin/bash
for i in $( cat $1 ); do
    axel --alternate --num-connections=6 $i
    export RC=$?
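One way to sketch the missing piece, assuming one URL per line: rewrite the list without the finished link after each successful download, instead of echo-appending to a second file.

```shell
#!/bin/bash
# Sketch: download every URL in the list file with axel and, on success,
# delete that URL from the list so the file holds only unfinished links.
download_list() {
    local list="$1" url
    while read -r url; do
        axel --alternate --num-connections=6 "$url" || continue
        # Download succeeded: drop exactly this line from the list.
        # grep -F -x matches the whole line literally, so URL characters
        # like / and . cannot misfire the way a sed regex could.
        grep -F -x -v "$url" "$list" > "$list.tmp"
        mv "$list.tmp" "$list"
    done < "$list"
}

if [ -n "$1" ]; then
    download_list "$1"
fi
```

The loop keeps reading from the original file descriptor while the file is replaced, so every link in the original list is still attempted.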
I noticed that Ubuntu takes some time building the file search index. It seems to do it overnight, while my PC is powered down. Is there a way to build the index with a command so that I could do searches whenever I wanted, especially immediately after installing a new program?
I use Nautilus at the moment on Ubuntu 10.10 Maverick. I use the grep CLI command to locate words or comments in a folder containing hundreds of files, each having 5 or 6 text documents in Greek and English.
I am looking for a manager that will index all entries and enable me to find a phrase or word within these documents quickly. I seem to recall that some time ago I saw a manager capable of this, which generated an index as data was added, but I have been unable to locate it with Google.
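For the grep side, a recursive search prints every hit with file name and line number; the "docs" tree here is a made-up stand-in for the real folder:

```shell
# Sample tree standing in for the folder of documents.
mkdir -p docs
printf 'first line\nthe search phrase is here\n' > docs/a.txt

# Recursive (-r), case-insensitive (-i) search, with file name and
# line number (-n) for every match.
grep -rin 'search phrase' docs/
```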
I am running Debian Lenny with Apache (latest), and I am trying to set up a webserver. I put the index.html file in my www folder, and it shows up when I go to my site (localhost). I put an index.php file in my www folder, and when I try to go to my site it downloads the index.php file instead of showing it.
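Apache serving a .php file as a download usually means no PHP handler is loaded. A sketch of the usual fix on Debian of that era (the package and module names assume PHP 5; run as root):

```shell
# Install mod_php so Apache executes .php files instead of offering
# them for download, enable the module, and restart Apache.
apt-get install libapache2-mod-php5
a2enmod php5
/etc/init.d/apache2 restart
```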
I have my home network up and running. I can access my index file by the IP I gave to eth0 - 192.168.0.2
I put this same IP into /etc/hosts: 192.168.0.2 localhost localhost.localdomain. One space separates the hostnames from the IP in /etc/hosts. The hostname command returns: localhost.localdomain
I have /etc/resolv.conf configured the right way, so I can use my ISP's DNS servers. I know this because yum works and I can ping anywhere by URL. The next step is to learn to access the index file by URL locally on my own network.
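A sketch of that next step: add a name for the server's IP to /etc/hosts on each client machine (needs root), after which the browser can use the name instead of the address. "myserver" is a made-up hostname:

```shell
# On each client, map a made-up name to the server's address.
echo '192.168.0.2  myserver' >> /etc/hosts

# Then the index page is reachable by name, e.g. http://myserver/
ping -c 1 myserver
```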
Is there a program available that would allow me to create an index in a PDF file that has no security restrictions on it? I know people can lock their files, so I am not worried about those; but if I have open permissions on a PDF file, how do I go about creating an index? It seems that by default you get the thumbnail view, but I would like to be able to click on an index entry to go to a page.
I have a set of PDFs I'd like to index and make searchable. Most importantly, I'd like a web interface so I can search and access them remotely. This morning I researched several different options, but all seemed to fail.
Google Desktop Search - no good because the web interface is only available to the local host
Beagle - no good because it is no longer supported and there are execution errors when running on Maverick
Recoll - no web interface
Strigi - no web interface(?)
I recall using one about 2 years ago that was deployed on Apache Tomcat, but I can't remember the name.
My setup is two laptops connected by a crossover cable. One runs Windows XP, the other Fedora 13. Neither is connected to the internet. I'm using a subnet of 192.168.1.x. On Fedora, eth0 is up, and Apache runs because it creates a pidfile. Everything pings fine; the Windows XP IP pings fine from the command line. I gave Windows XP a static IP of 192.168.1.1, mask 255.255.255.0, and gateway 192.168.1.2, same as eth0. XP says it sees the server. eth0 is up.
DirectoryIndex looks at index.html. I created that file with very simple code and put it in the document root. Document root permissions are 755, access_log 770, error_log 644, Apache user 755. Listen 80. When I type the IP for eth0 (192.168.1.2) into Firefox, Firefox gives me an error message - can't find the server. The connection status says it's connected.
The error log includes the line: [warn] ./mod_dnssd.c: No services found to register - I don't know what this means. Apache is not writing to access_log; when I cat the path to access_log I get nothing, just a command prompt. I'm looking for the part I'm missing that will let Apache serve that index.html file to Firefox, so I can see how my code looks in Firefox as I go.
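A couple of diagnostic commands that may help narrow this down, using the eth0 address from above; they check whether Apache answers locally before blaming the network:

```shell
# From the Fedora laptop itself: does Apache answer at all?
# -I asks curl for just the response headers.
curl -I http://192.168.1.2/

# If that works but the XP machine cannot connect, the firewall is the
# usual suspect; list the current rules (run as root).
iptables -L -n
```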
I am trying to install flash-plugin-10.0.45.2-release.i386.rpm, so I entered the following command and got the following error output:
Code:
bash-3.1$ rpm -i *.rpm
error: cannot open Basenames index using db3 - No such file or directory (2)
error: cannot open Providename index using db3 - No such file or directory (2)
error: cannot open Conflictname index using db3 - No such file or directory (2)
error: Failed dependencies:
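Those "cannot open ... index using db3" messages usually mean the rpm database's index files are missing or damaged. A common fix (run as root; the package name is the one from the question):

```shell
# Rebuild the rpm database indexes, then retry the install.
rpm --rebuilddb
rpm -i flash-plugin-10.0.45.2-release.i386.rpm
```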
I have a weird issue with my web server. Every time I enter my domain name, instead of processing my index file it downloads it. What could be the problem? This is the first time this has ever occurred. I recently installed Webmin.
Dear all, I am new to git (and to storing multiple copies of my code).
I was using git for one month (git status, git add, git commit) to store my files.
I have problems adding more files to my branch:
git add BasisExpansion.R fatal: Unable to write new index file
This was working great until the day my system administrator destroyed my main partition (it was an accident) and installed everything from scratch. My home directory stayed intact (I did not lose any files, nor the git files stored under /home). But from that point on, git always returns that error. My system administrator created the same user and group as my old user, but this did not fix the problem.
I am not sure if I can recover my old git structure, or whether it would be better to wipe out the old git directory and start a new one.
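Before wiping anything, it may be worth checking ownership and write permission under .git, since "Unable to write new index file" is almost always a permissions problem there: a recreated user can have a different numeric UID even with the same name. A sketch on a scratch directory ("repo" is a made-up path; fixing a real repo may need sudo):

```shell
# "repo" stands in for the real repository. Simulate the broken state,
# then apply the fix: ownership and write permission on everything
# under .git for the current user.
mkdir -p repo/.git
touch repo/.git/index
chmod a-w repo/.git/index            # simulate: index not writable

chown -R "$(id -un)" repo/.git       # re-take ownership (sudo for real repos)
chmod -R u+w repo/.git               # restore write permission
test -w repo/.git/index && echo ok
```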
Can anyone point me to already built and mostly working utilities that will catalog the contents of (external) USB and flash media in a way that they can be searched? With cheap (under $100 US) terabyte external drives, it is too easy to get another drive and fill it. I'm looking for utilities or applications that will help me know what I have, cull the duplicates, and avoid the need to spin up a drive just to see if it holds what I seek.
Years ago there were utilities that would read the contents of diskette media and create a printable "index" or "catalog" page. The pages were conveniently sized to match the diskette, so that one could store the page in the diskette sleeve for future reference. Later, the pages were replaced by applications that stored a diskette ID along with each file name. One could then search for a name, or pattern, and discover which diskette held the file(s) of interest. Today we don't have diskette sleeves, but we do have cases for our external USB drives and wallets for our flash media. A printed index would be nice to have.
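A poor man's version of such a catalog can be sketched with find and grep; the mount point below is simulated with a sample tree, and all of the names are made up:

```shell
# Record every file path on a "drive" into a catalog file named after the
# volume, then grep the catalogs instead of spinning up each disk.
mkdir -p media/usb1/photos catalogs
touch media/usb1/photos/vacation-2010.jpg

find media/usb1 -type f > catalogs/usb1.txt

# Which volume holds the file? -l prints only the matching catalog name.
grep -il 'vacation' catalogs/*.txt
```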
I want to change something inside the post-install script of an existing rpm. Is there any way to create an approximate spec file from this rpm, so that I can change the post-install script in that spec and then rebuild the rpm from the fixed spec file? Has nobody written a program that can create a spec file (99% identical to the original)?
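Two commands that may help: rpm itself can show the scriptlets, and the separate rpmrebuild tool (packaged in most distros) can regenerate an editable spec from a built rpm. "foo.rpm" is a placeholder for the real package file:

```shell
# Show the %pre/%post/%preun/%postun scriptlets of a package file.
rpm -qp --scripts foo.rpm

# Regenerate a spec from the rpm, open it in $EDITOR for changes, then
# repack: -e edits the spec, -p works from a .rpm file rather than an
# installed package.
rpmrebuild -e -p foo.rpm
```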