fgrep -iRn -C 2 'print' *.txt
---also recurses one level and displays all kinds of data
---------------------------------------------------------------
in the root (/) directory, on ext4:
Code:
fgrep -iRn -C 2 'print' *.* or fgrep -iRn -C 2 'print' *.txt
---displays messages:
fgrep: *.*: No such file or directory
and
fgrep: *.txt: No such file or directory
I am probably missing something very basic. Recursive search does not seem to work from the root directory. Is this (as they used to say) a feature? This is on an 11.2 system, upgraded to 11.3 (I ran zypper verify to check; according to it, all is well).
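The shell expands *.txt in the current directory before fgrep ever runs; in / nothing matches, so fgrep receives the literal string *.txt and reports "No such file or directory". -R recurses into the directories it is given as arguments, so the trick is to name a directory and filter file names with --include. A sketch, assuming GNU grep:
Code:
# recurse from /, but only look inside files matching *.txt
grep -iRn -C 2 --include='*.txt' 'print' /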
I know grep can search recursively (i.e. through all subdirectories to the bottom of the directory tree), but is it possible to ask grep to only search, say, 3 levels down? That means the current directory, any directories in the current directory, and any directories within those.
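grep itself has no depth option, but find can enforce the depth and hand the survivors to grep. A sketch, with 'pattern' as a placeholder; to find, files in the current directory are depth 1, so three levels means -maxdepth 3:
Code:
find . -maxdepth 3 -type f -exec grep -l 'pattern' {} +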
The NS triggering the warning is running openSUSE 10.2 and the other one 11.1. Both configuration files /etc/named.conf are equivalent (well, forwarders are different). There is no such warning for the NS with 11.1. When I add "recursion no;" to the options in /etc/named.conf the warning goes away, but FF or SeaMonkey running on the server no longer get their DNS requests resolved.
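If the warning is about recursion being open to the world, one option is to keep recursion on but limit who may use it, rather than switching it off entirely. A minimal sketch for the options block in /etc/named.conf, where 192.168.1.0/24 is an assumed local network:
Code:
options {
        # still answer recursive queries, but only for local clients
        allow-recursion { 127.0.0.1; 192.168.1.0/24; };
};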
Here's a silly question: suppose I want to "grep" an entire directory. Real life example: we reconfigured our mail server to a different IP address, so I needed to search for each occurrence of the original to make sure they'd all been changed.
A simple
Code:
grep -i -l -r "192.168.1.200" *
... works great until it hits a socket. This particular mail server has sockets scattered at random through the directory. When grep hits one of these, it hangs.
I've tried every command line option I can think of to force it to ignore these sockets, but to no avail. Any of you grep gurus out there want to give me a hand?
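GNU grep's -D (--devices) option chooses what to do when an input file is a device, FIFO, or socket, so the same search with sockets skipped looks like this:
Code:
grep -i -l -r -D skip "192.168.1.200" *
Alternatively, find . -type f -exec grep -il "192.168.1.200" {} + only ever hands grep regular files.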
I'm still trying to learn, but seem to be hitting a few walls when it comes to man pages. I don't know where else to look so I hope to get some help here - or at least another idea. I'm trying to learn c++ as well as *nix terminal commands, and I like to make a backup of my compiled program and .c files each time I reach a successful compile (99-100% bug free).
My problem is I like to use an alias instead of a script to backup my work onto a USB. I don't want the hundreds of log files (10K+) copied since I usually delete them anyway.
My alias:
How do I tell cp to ignore the folder '~/work/log' during the backup?
If cp doesn't have such an option, I am open to other ideas, but I would still prefer something that I can alias since I already backup my .bash_aliases file.
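cp has no exclude option, but rsync does, and it fits in an alias just as well. A sketch, where the USB mount point /media/usb/work-backup is an assumption:
Code:
# sync ~/work to the stick, skipping the top-level log directory
alias bkup='rsync -a --exclude=/log/ ~/work/ /media/usb/work-backup/'
The leading slash in the pattern anchors it to the transfer root, so only ~/work/log is skipped, not every directory named log.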
I want to write a simple non-recursive makefile, but I am not getting the syntax of it. Please give me an example with a simple description. I have read the docs and HTML pages, but I am not getting how it works. Consider that I have the following directory structure.
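Since the directory structure didn't survive the post, here is a minimal non-recursive sketch assuming a made-up layout of src/main.c and src/util/helper.c: a single top-level Makefile lists every source file itself instead of invoking make in each subdirectory (recipe lines must be indented with a literal tab):
Code:
# one top-level Makefile; no $(MAKE) calls into subdirectories
CC      := gcc
CFLAGS  := -Wall -Wextra
SRCS    := src/main.c src/util/helper.c
OBJS    := $(SRCS:.c=.o)

app: $(OBJS)
	$(CC) $(CFLAGS) -o $@ $(OBJS)

%.o: %.c
	$(CC) $(CFLAGS) -c $< -o $@

clean:
	rm -f app $(OBJS)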
I can call this routine and it works fine when I enter a valid name for $PROJ. If I enter an invalid name it goes to the else block and prints the statement. However, it does not call itself; instead the script just exits. I've googled 'perl recursive subroutines' and the examples don't appear to be doing anything different.
And I have a very long debugging log file which I will not post unless requested. I have also added a ufw "allow from all to all" rule, for testing purposes only.
To do this, I would like to recursively download all the pages within the domain, such that their link structure is preserved. This would be tedious to do by hand, however. As it stands, I could probably use wget for this, but I would prefer something more specially designed for site downloading. I have already tried webHTTrack, but found it unsuitable. Perhaps httrack with a particular set of command line parameters would work better?
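For what it's worth, wget handles this case reasonably well on its own; a sketch, with example.com standing in for the real domain:
Code:
# mirror the site and rewrite links so the local copy browses offline
wget --recursive --level=inf --page-requisites --convert-links \
     --no-parent --domains=example.com http://example.com/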
I need to find a command to search among all *.dat files in a certain path (including subdirectories), looking for the following text in them:
Code:
--------------------------------------------------------------------------------
 Elements with small area
 Element      Adjusted nodes
 ---------    --------------
 16294        NO
 17889        NO
and getting the list of elements with small area printed in a file "ErrorEl.txt". The output should have this form:
"/path/01/A.dat bad-el#01 bad-el#02
[code]....
I already know how to find the .dat files containing a certain string:
Code:
c=/path/; grep -R --include="*.dat" "Elements with small area" "$c" | cut -d: -f1 >> ErrorEl.txt
but I don't know how to then get the element numbers (16294 and 17889 in the example above).
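A sketch of one way to pull them out, assuming the block always looks like the sample above (title line, headers, then one element per line until a blank line) and that the .dat file names contain no spaces:
Code:
for f in $(grep -Rl --include="*.dat" "Elements with small area" /path/); do
  printf '%s' "$f" >> ErrorEl.txt
  awk '/Elements with small area/ {grab=1; next}
       grab && $1 ~ /^[0-9]+$/    {printf " bad-el#%s", $1}
       grab && NF == 0            {grab=0}' "$f" >> ErrorEl.txt
  echo >> ErrorEl.txt
done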
I am looking for a way to set up sudo access for a user so that he can change the permissions of all files under a given dir.
eg:
With this, the user can change ownership of files one level below the given dir (i.e. /etc/userA-conf/), but when trying to change permissions of /etc/userA-conf/../user-conf2 he gets an error: user userA doesn't have that permission.
Let me know what would be the right regex/pattern to achieve this.
In Solaris it's working fine, but I am trying it on Linux RHEL5.
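On the sudoers side, something along these lines is the usual shape (a sketch only, to be edited with visudo; note that sudoers wildcards also match spaces, so patterns like these are fairly permissive):
Code:
# allow userA to run chown/chmod -R, but only under /etc/userA-conf/
userA ALL=(root) NOPASSWD: /bin/chown -R * /etc/userA-conf/*, /bin/chmod -R * /etc/userA-conf/*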
I'm trying to make a recursive makefile work, but it's giving me two problems. I have a top folder with the main Makefile and one Makefile for each subfolder, 'one' and 'two'. The Makefiles in subfolders 'one' and 'two' are identical. The top Makefile (still a bit messy) looks like this:
Code:
# Directories
CC = gcc
CFLAGS = -Wall -Wextra
TARGET_DIR = bin
MAIN_FILE = one.c
Trying to configure gadmin-bind, I changed the user and group of my entire filesystem. I have made some progress getting it all back, but for now sudo leaves me with a problem about the GID; I changed sudoers back to root, but I still don't have everything back. I don't have a network connection, because nm-applet doesn't start for my user, and when I run it in an X server as the root user it gives me: "The device is not ready."
If I run ls -R1 I get a recursive listing of all files under the current directory. However, if I do ls -R1 *.avi, i.e. I want to search only for files with the extension .avi, I get an error: ls: cannot access *.avi: No such file or directory. So it seems I am using ls incorrectly. What's the correct way to use wildcard pattern matching with the -R switch? Or maybe that isn't possible?
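The shell expands *.avi against the current directory before ls ever runs, and -R only recurses into the directories it is actually given, so the pattern never reaches the subdirectories. find applies the test in every directory itself:
Code:
# list every .avi at any depth below the current directory
find . -name '*.avi'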
I am writing a shell script that finds all files named <myFile> in a directory <dir> or any of its subdirectories, recursively. I also need to take care of symbolic links that may form cycles, to avoid infinite loops. I am not supposed to use the find command for this.
I started writing the code but got stuck. I thought using recursion might be a smart way, but it's not working.
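A sketch of one way to do it in bash, assuming bash 4+ (for associative arrays) and GNU stat; the script name and argument order are hypothetical. Cycles are broken by remembering each directory's device:inode pair before descending:
Code:
#!/bin/bash
# Usage (hypothetical): ./search.sh <dir> <myFile>
declare -A seen

search() {
    local dir=$1 name=$2 id entry
    id=$(stat -Lc '%d:%i' "$dir" 2>/dev/null) || return
    [[ -n ${seen[$id]} ]] && return        # already visited: symlink cycle
    seen[$id]=1
    for entry in "$dir"/* "$dir"/.[!.]*; do
        [[ -e $entry || -L $entry ]] || continue   # glob matched nothing
        [[ ${entry##*/} == "$name" ]] && printf '%s\n' "$entry"
        [[ -d $entry ]] && search "$entry" "$name"
    done
}

search "$1" "$2"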
However, can I run a command and create symbolic links for all files in a given folder and its subfolders and have all the links be in one folder?
I have a file structure such as:
FolderA
FolderB
FolderC
and I want to have symbolic links for all the files in A, B, and C, all in one new folder (FolderALL, for example). I have hundreds of folders that need to be done, so a simple one-line command would be ideal if possible.
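A sketch with GNU find and ln, assuming the folders sit under the current directory and that no two files share a name (same-named files would collide in FolderALL):
Code:
mkdir -p FolderALL
find "$PWD"/FolderA "$PWD"/FolderB "$PWD"/FolderC -type f -exec ln -s -t FolderALL {} +
Using "$PWD" makes the link targets absolute, so the links resolve no matter where FolderALL ends up.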
I'm having problems with compiling recursive Makefiles in my directory structure. My folder layout is:

top/
|- one/
|  |- one.c (with main function)
|  |- zero.c
|- two/
|  |- two.c

In my top folder the Makefile looks like:
Code:
MAKE_DIRECTORIES = one two

.PHONY: all
all: $(MAKE_DIRECTORIES)
.PHONY: $(MAKE_DIRECTORIES)
$(MAKE_DIRECTORIES):
	@echo $@
	$(MAKE) --directory=$@

In my 'one' and 'two' folders I have the following Makefile:
Code:
.PHONE: all
all:
	@echo $@
	$(CC) $(CFLAGS) *.c

But when I compile it from the top folder: make
I get the following output:
Code:
one
two
which shows that the echo in the main Makefile runs, but the files in 'one' and 'two' are never compiled.
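Two things stand out in the paste, hedged since forum formatting tends to eat tabs: every recipe line must start with a literal tab (make aborts with "missing separator" otherwise), and the subfolder Makefile declares .PHONE where it means .PHONY. A minimal known-good pair, assuming the layout above ('prog' is a placeholder output name):
Code:
# top/Makefile
MAKE_DIRECTORIES = one two

.PHONY: all $(MAKE_DIRECTORIES)
all: $(MAKE_DIRECTORIES)

$(MAKE_DIRECTORIES):
	@echo $@
	$(MAKE) --directory=$@

# one/Makefile and two/Makefile
.PHONY: all
all:
	$(CC) $(CFLAGS) *.c -o prog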
I need a program which generates events when a file is moved, removed, or its extended attributes are changed. I'm running Ubuntu Karmic Koala 32-bit desktop. inotify is the standard solution for such problems, but inotify cannot install a recursive watch, so the only option is to do the equivalent of find on the filesystem and add an inotify watch on each node. This is what e.g. inotifywatch does. This won't work for me, because my filesystem has 1 million files, and installing watches on all of them takes forever.
fanotify could work, except that I would have to patch the kernel for that (I'm currently running 2.6.31-20), and maintaining patches to the Linux kernel is beyond my time commitment. I used to use rfsdelta (whose kernel module is similar to rlocate's), but it just doesn't compile on 2.6.31, because it uses obsolete Linux security framework APIs.
I would like to make a cronjob that makes a tar.gz of everything inside a directory, recursively. BUT there is a HUGE directory full of jpgs that I don't want in the backup. Additional points if it can back up symbolic links.
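A sketch of the crontab entry, with /data and /backup as assumed paths. GNU tar archives symbolic links as links by default, so they come along without extra flags; the % in the date format must be escaped, because cron treats a bare % as a newline:
Code:
# nightly at 02:00: tar.gz everything under /data except the jpg directory
0 2 * * * cd /data && tar -czf /backup/data-$(date +\%F).tar.gz --exclude='./jpgs' .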