General :: Finding The Most Recently Written File In A Directory Of 1000 Files?
Apr 13, 2011. How do I find the most recently written file in a directory of 1000 files?
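One common approach is to sort by modification time; a minimal sketch (the directory path is a placeholder):
Code:
# newest-modified file in the directory (non-recursive)
ls -t /path/to/dir | head -1
# or with GNU find, printing the timestamp and full path of the newest file
find /path/to/dir -maxdepth 1 -type f -printf '%T@ %p\n' | sort -nr | head -1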
How do I find the package that was installed most recently (i.e. last)? Is there any command for this on RHEL 5.0?
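On an RPM-based system such as RHEL 5, something like this should list packages in reverse order of installation:
Code:
# most recently installed packages first
rpm -qa --last | head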
I have an iMac 2.4GHz with rEFIT installed. I installed Unity on one of the partitions. The kernel is still 2.6.38.8. I get an error message when I run the command
sudo gedit /etc/x11/xorg.conf
The error message is
(gedit:2139): Gtk-WARNING **: Attempting to store changes into `/root/.local/share/recently-used.xbel', but failed: Failed to create file '/root/.local/share/recently-used.xbel.AC7YXV': No such file or directory
(gedit:2139): Gtk-WARNING **: Attempting to set the permissions of `/root/.local/share/recently-used.xbel', but failed: No such file or directory
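As far as I know, this warning just means that /root/.local/share does not exist when gedit runs as root via plain sudo. A possible workaround is to create the directory and to prefer gksudo for graphical programs (note the config path is normally /etc/X11 with a capital X):
Code:
sudo mkdir -p /root/.local/share
gksudo gedit /etc/X11/xorg.conf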
Assume my directory structure looks as follows:
Code:
-- root
   -- dir1
      -- dir1-1
   -- dir2
      -- dir2-1
      -- dir2-2
Under each subdirectory there are some log files ending with .log. Now I want to list all these log files. How do I do that?
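find handles this from the top-level directory; a minimal sketch (the root path is a placeholder):
Code:
# list every *.log file anywhere under the tree
find /path/to/root -type f -name '*.log'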
I have the following cron entry but it doesn't seem to be running:
Code:
The script does exist, and so does the directory /home/usr/log, which is writable. /var/log/syslog only has a bunch of these:
Code:
I don't see any file being written to the log directory, which suggests to me that cron didn't run the job, as /var/log/syslog seems to confirm.
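Since the original crontab entry isn't shown, here is a hypothetical one for illustration: the usual culprits are relative paths and output that vanishes, so use absolute paths and redirect stdout/stderr somewhere visible, then check what cron actually ran:
Code:
# did cron run anything? (Debian/Ubuntu-style syslog)
grep CRON /var/log/syslog
# hypothetical crontab line: absolute paths, output captured for debugging
*/10 * * * * /bin/bash /home/usr/bin/myscript.sh >> /home/usr/log/cron-debug.log 2>&1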
Basically, I am trying to locate and copy the newest .json bookmark backup in my .mozilla/firefox/w987sdg9.default/bookmarkbackups directory.
I tried this
Code:
ls -t ~/.mozilla/firefox/b1ahb1ah.default/bookmarkbackups/ | head -1
which does return the newest file, but only the filename itself. I found readlink, but I haven't gotten it to output a full path that I can then feed to cp. So it seems to me that find might work well here, and I know how to find based on absolute dates, but not relative ones.
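One way to keep the full path is to let the glob supply it: because the pattern itself is absolute, ls -t prints absolute paths. A minimal sketch (the destination file is just an example):
Code:
newest=$(ls -t ~/.mozilla/firefox/b1ahb1ah.default/bookmarkbackups/*.json | head -1)
cp "$newest" ~/bookmarks-latest.json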
I am totally new to Linux, as I have worked mostly on an RTOS (Symbian). My problem is that I need to find the file IOSTREAM.H, and I am running the commands below:
1) cd /
2) find . iostream.h ( finds the file / directory from the current path)
It shows "No such file or directory".
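find needs the -name (or -iname) test; without it, the second argument is treated as another path to search, hence the error. A sketch:
Code:
# case-sensitive and case-insensitive searches; errors from unreadable dirs suppressed
find / -name iostream.h 2>/dev/null
find / -iname iostream.h 2>/dev/null
# locate is faster if its database is up to date
locate iostream.h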
I realise KPDF is quite old now, but as this issue may recur when I move to a newer distro (well, newer than Hardy) with Okular, I thought I'd better ask. I use Gnome, but prefer KPDF to evince when viewing PDF files. However, KPDF's "Open Recent" list behaves very oddly - there's no apparent way to clear it, and items which were on the list one day aren't on it another day (coinciding with old items reappearing on the list).
Is there any way to clear this list? Similarly, is there any way to clear the list of recently opened files in the "Location" drop-down box in File-Open? (which also seems to mysteriously lose list items in between reboots).
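I'm not certain of the exact keys, but KDE 3 applications generally keep their recent-file history in the application's rc file under ~/.kde/share/config. As an assumption, KPDF's would be kpdfrc; backing it up and deleting the recent-files entries while KPDF is closed may clear the list:
Code:
cp ~/.kde/share/config/kpdfrc ~/.kde/share/config/kpdfrc.bak
nano ~/.kde/share/config/kpdfrc   # delete the entries in the recent-files group, then save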
I once had a script that, when run, would find the first 800GB of files in a directory (including subdirectories) and write them to a file (i.e.: ./800gb.sh > manifest.txt). I used this to create manifests of 800GB worth of data from large directories in order to dump to tape (LTO4). I'm sure it's got to be a pretty simple script, but I am not very skilled at writing bash scripts.
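A minimal sketch of such a script, assuming "first 800GB" simply means files listed until the running total of sizes reaches 800GB (file names with embedded newlines are not handled):
Code:
#!/bin/bash
# Usage: ./800gb.sh /path/to/dir > manifest.txt
limit=$((800 * 1024 * 1024 * 1024))   # 800 GB in bytes
total=0
find "${1:-.}" -type f -printf '%s %p\n' | while read -r size path; do
    total=$((total + size))
    [ "$total" -gt "$limit" ] && break
    printf '%s\n' "$path"
done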
I'm new to Ubuntu and Linux, and using Lucid 10.04 LTS. I'm trying to get XBMC going, following a nice step-by-step instruction on wiki.xbmc.org, but now I'm stuck at this step: "In Ubuntu the SVN Repositories are not automatically added. You must add them manually. First, download the SVN Repo Installer from: [URL] Extract it to the ~/.xbmc/plugins/programs directory. If this directory does not exist, run XBMC one time and then exit back to Ubuntu. The directory should now exist."
I ran XBMC once (after the video driver problem was solved, though it messed up my dual monitors; I still have to figure that one out). I got the zip file; it's in my Downloads. Now my newness really shows: I can't find an 'extract' command for GNOME, can't find the .xbmc directory using the file browser, can't figure out how to search for folders instead of files, and don't know how to look inside folders I don't have permission to - that is, I don't know if there's a 'sudo'-like option for the file browser. I've been searching the forums, but without the correct search terms I'm wading in an ocean. I really want to give Ubuntu an honest try, but I feel like a foreigner. EDIT: BTW, up to this point the forums have been invaluable; you all are great.
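From the terminal this can be done with unzip; the archive name below is just an example, so substitute whatever is actually in ~/Downloads. ~/.xbmc is a hidden folder, which is why the file browser doesn't show it (Ctrl+H toggles hidden files in Nautilus):
Code:
mkdir -p ~/.xbmc/plugins/programs
unzip ~/Downloads/SVN_Repo_Installer.zip -d ~/.xbmc/plugins/programs/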
I have 1000 JPG files, all of which have a white background. Is it possible to change the white background color to red (for example) in all of the files, so I don't have to do it one by one?
I would prefer to use Linux but I can handle Windows.
For example, change this logo with a white background to a red background.
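On Linux, ImageMagick's mogrify can do this in bulk; a sketch (work on a copy of the files first, since mogrify overwrites the originals, and adjust -fuzz to cope with JPEG noise around "white"):
Code:
# replace (near-)white pixels with red in every JPG in the current directory
mogrify -fuzz 10% -fill red -opaque white *.jpg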
The system currently has a directory with all the invalid files. How bad is it to move a single file into a directory that already contains 3 million files?
View 2 Replies View RelatedI would like to create a cronjob that will delete all files within a directory 1 hours after it is created to the folderI found this cron find /path/to/file/* -ctime +1 -exec rm {} ; but it's deleted all files.I want to make an exception, all file should be deleted except one file (letsay file a.zip)
View 16 Replies View RelatedI want to find maximum length file in a given directory. It should search recursivley. I want this to be done using ls and simple looping constructs.
View 6 Replies View RelatedI have a router that is 1000 Base T and two computers each with ethernet cards that support 1000 Base T. All are equipped with Cat 5e cable. Before I had a router that only went up to 100 Base T and I would setup one box with linux running proftpd. On the other box,I would use win xp pro and use firefox to ftp into the other box and download a file. Download speeds went up to 11.2 MB/sec. Now when I switched routers, I expected something like 120 MB/sec but I'm only getting 5.3 MB/sec. What do I need to change?
View 12 Replies View RelatedWorking fine: ==> scp my_log-bin.01393[0-9] root@192.168.103.66:/backup/ error - No such file or directory: ==> scp my_log-bin.0139[30-99] root@192.168.103.66:/backup/
Use file globbing to copy all the files in the /labs/data directory that end with .out to the lab05 directory.
Use file globbing to copy all the files in the /labs/data directory that start with a c, d, or n into the lab05 directory.
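A sketch of both tasks with shell globs (assuming lab05 is a directory in the current location):
Code:
# files ending in .out
cp /labs/data/*.out lab05/
# files starting with c, d, or n
cp /labs/data/[cdn]* lab05/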
I have a small Ubuntu server set up and I would like to create a directory that can be written to by a select number of users. I have a backup directory set up and I want to enable my account as well as three others to read/write to that directory. So far I haven't had any luck.
The backups folder, a directory on a separate disk mounted under /srv/storage, was owned by root and under the root group. I added the group backups and then changed the backups directory's group to backups. I then used chmod to change the backups directory to 775 to enable group members to write to it. I then tried to touch a file in the backups folder, but no such luck. I did notice that when I run groups, my user account isn't shown as belonging to backups, but it is shown in the /etc/group file. I even made sure the GID of backups is in fact below 1000.
Does anyone have an idea how to create a shared directory where everyone can create, modify, and delete any file? I believe my problem is related to the fact that root owns the backups directory.
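The fact that groups doesn't show the new membership suggests the session predates the change; group membership only takes effect in a new login session (or via newgrp). A sketch of the whole setup, with example path, group, and user names:
Code:
sudo groupadd backups                       # skip if the group already exists
sudo usermod -aG backups alice              # repeat for each user that needs access
sudo chgrp backups /srv/storage/backups
sudo chmod 2775 /srv/storage/backups        # leading 2 = setgid: new files keep the group
# then log out and back in (or run: newgrp backups) before testing with touch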
How can I find a particular directory in a terminal window in Linux? I think it involves using grep, but I'm not sure how.
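find (rather than grep) is usually the tool for this; a sketch, with the directory name as a placeholder:
Code:
# search the whole filesystem for a directory by name, hiding permission errors
find / -type d -name 'dirname' 2>/dev/null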
I need a sed command to print a list of files in the /home directory ending in ".sh".
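One way is to use sed as a filter on a directory listing; a sketch:
Code:
# print only the names that end in .sh
ls /home | sed -n '/\.sh$/p'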
I have 2 external HDDs on which I have all my files. Yesterday I copied all the files from hdd2 to hdd1, and I want to eliminate duplicates, so I used FSlint to find them. Now I have a txt file that looks like this:
Code:
/media/My Book/!!!MIS DOCUMENTOS/Documentos/2 sep2003-jun2009 USB/!TESIS/TESIS/TESIS CVT LABVIEW Y CODEWARRIOR/LabVIEW85RuntimeEngineFull.exe
/media/My Book/HDD_Toshiba/Borrable/Pen_Drive_4GB/Tesis/Super CD de la tesis/LabView/LabVIEW85RuntimeEngineFull.exe
...multiplied by millions of entries.
Now I want to make a shell script to delete all the files/entries (read from the log file) that begin with:
Code:
/media/My Book/HDD_Toshiba/****
since HDD_Toshiba is the folder on hdd1 (My Book) that contains all the files from hdd2.
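A minimal sketch that reads the FSlint output line by line and removes only paths under the copied tree; the log file name is an example, and it is worth replacing rm with echo for a dry run first:
Code:
#!/bin/bash
while IFS= read -r file; do
    case "$file" in
        "/media/My Book/HDD_Toshiba/"*) rm -v -- "$file" ;;   # quoted, so spaces are safe
    esac
done < duplicates.txt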
Is there any way I can recover the files that used to be on a FAT partition which I recently formatted to ext4?
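Possibly, with file carving tools such as testdisk/photorec, as long as the data hasn't been overwritten; stop writing to that partition and recover onto a different disk. A sketch (the device name is a placeholder):
Code:
sudo apt-get install testdisk
sudo photorec /dev/sdXN    # replace sdXN with the old FAT partition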
I'm constantly typing 'cd ../../../../'. Is there a command/alias that would let me type 'cmd 4' and be taken up 4 directories?
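An alias can't take an argument, but a small bash function in ~/.bashrc can; the name "up" is arbitrary:
Code:
up() {
    local path="" i
    for ((i = 0; i < ${1:-1}; i++)); do
        path="../$path"
    done
    cd "$path" || return
}
# usage: up 4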
I've been having trouble writing to DVDs on one computer, and I am trying to figure out whether the problem is hardware or software related, or perhaps both. The Brasero CD/DVD writer puts the md5 checksums of each of the original files in a hidden file on the DVD. I think it then calculates the sums of the copies and compares, but the last time I did it, it told me it had permission problems with the drive, and that failed. When I put the DVD in, udev finds it and mounts it in /media with me as its owner. Apparently Brasero mounted it some other way, and presumably that caused the problem.
Secondly, after I brought it up again, I went into the /media mount point and ran md5sum on the files there. One of them, a gzipped tar file, produced an input/output error. Yet if I use cmp to compare that file with the original, it finds no difficulty. So what is going on? Finally, I have tried to write shell scripts which use cmp to compare the original files with the copies. The general form is
cd XXXX
find $* >> /tmp/X
for i in `cat /tmp/X`
do
cmp $i YYYY/$i
done
where XXXX is the mount point in /media and YYYY is the original directory containing the relevant files. This works fine except that if the file names have spaces in them, the shell script treats each word as a new file name. So I need some way to arrange things so that `cat /tmp/X` returns complete file names. Nothing I've tried works.
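A sketch of the same loop reworked to read whole lines instead of words, which copes with spaces (though not with newlines inside file names); XXXX and YYYY as in the original:
Code:
cd XXXX || exit 1
find . -type f > /tmp/X
while IFS= read -r i; do
    cmp "$i" "YYYY/$i"
done < /tmp/X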
I recently accidentally (permanently) deleted a bunch of files off my computer. I used "foremost" to recover all my images, but there are still a bunch of videos that need to be recovered. The problem is that foremost seems to have also recovered a crapload of files from before I switched to Ubuntu (I just removed Windows today), so I have a LOT of jpg images right now (over 400,000) and I don't want to deal with that many video files! How do I recover my recently deleted videos without getting a bunch that I don't want? (Can I specify the folder they were deleted from or something?) PS: I used this command to recover my pictures:
Code:
sudo foremost -t jpg -i /dev/sda1
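foremost carves files from the raw device, so as far as I know it cannot be limited to the folder the files were deleted from; it can, however, be limited to video types and sent to a separate output directory (the type list and output path below are examples):
Code:
sudo foremost -t avi,mpg,mov,wmv -i /dev/sda1 -o ~/recovered-videos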
How do you perform a long directory listing of all files in the /bin directory that have exactly three characters in their name?
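The ? glob matches exactly one character, so three of them match three-character names; a sketch:
Code:
# -d keeps ls from descending into any matching directories
ls -ld /bin/???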
Is there a way, on Linux, to cause all new files created in a directory to be owned by the directory's group instead of the creating user's group?
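Yes, the setgid bit on a directory does exactly this:
Code:
# new files (and subdirectories) created inside inherit the directory's group
chmod g+s /path/to/dir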
I want to run a cronjob every 15 minutes that checks a directory for files. If the directory contains more than ten files I want it to send an email to me.
All I have is this...
*/15 * * * * ls -l | wc -l | [filename] | mail -s "This is just a test" [email address]
I would rather not write a bash script. Is there an easier way to do this? I was looking into some commands like find and grep.
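It can stay a one-liner in the crontab; a sketch, with the directory and address as placeholders:
Code:
*/15 * * * * [ $(ls /path/to/dir | wc -l) -gt 10 ] && ls /path/to/dir | mail -s "More than 10 files" you@example.com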
I'm quite new to linux but I have configured a simple ftp server and it's working great. I have a FTP-Shared folder with upload and download subfolders. Under upload's and download's I have identical category subfolders like mp3's, movies, software etc. in both. As the guy's upload, I would like to create a line crontab where I can move all the content under /FTP-Shared/upload/mp3/* older than 14 day's to FTP-Shared/downloads/mp3/ recursively (Like in cp command), but the timestamp must be searched on the first directory and not sub files example: /mp3/Club Dance/CD1/Hallo world.mp3This is how far I got:[root@clients ~]# /usr/bin/find /FTP_Shared/upload/Mp3s/ -depth -mindepth 1 -mtime +14 -type d -exec mv -f {} /FTP_Shared/download/Mp3s/ ;This command moves the directory and files, but it is not recursively
Apologies if this has been asked before, which I'm sure it has from what I see googling around, but I can't understand this fully.
I have piles of files in the .Trash-1000 folder on my flash drive that I want to delete. I can see them if I go in as root, using the command line and entering "gksu nautilus", but it still won't allow me to delete them.
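From a terminal, removing the trash folder as root usually works; the mount point below is a placeholder, so double-check the path before running rm -rf:
Code:
sudo rm -rf "/media/FLASHDRIVE/.Trash-1000"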