General :: Creating Watch On One Directory In Which Files Are Continuously Coming?
May 4, 2010
Actually, I am watching one directory into which files are continuously arriving. Is there any command that can list all the files that have arrived in the last 24 hours?
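A minimal sketch, assuming the watched directory is /path/to/dir (a placeholder): find can filter on modification time, and -mtime -1 matches files modified less than 24 hours ago.
Code:
find /path/to/dir -type f -mtime -1
find /path/to/dir -type f -mtime -1 -exec ls -l {} \;   # same, with timestamps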
Not sure if this is the right place to post this. I recently installed Ubuntu Netbook Remix on my netbook. On Windows I used IIS: I pointed my website at a directory full of video files, all in .avi containers, enabled directory browsing, and could watch the files in the browser on my PS3.
I'm wondering how I go about doing this in Apache, since there isn't any GUI to do it for me like in IIS :P
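For reference, a sketch of the Apache-side equivalent, assuming the videos live under /home/user/videos and an Apache 2.2-style setup (both assumptions):
Code:
# e.g. in /etc/apache2/conf.d/videos.conf (location varies by distro)
Alias /videos /home/user/videos
<Directory /home/user/videos>
    Options +Indexes          # directory browsing, like the IIS checkbox
    Order allow,deny
    Allow from all
</Directory>
Reload Apache afterwards (e.g. sudo /etc/init.d/apache2 reload) and browse to http://server/videos.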
Scenario: an IDE is set up on a Linux desktop box, editing PHP files locally. Every time I save a file, I want the change to appear on the Linux server where Apache is running. The server has ssh (and Samba and NFS, for that matter). As a reference: when I edited files on Windows, I finally came across WinSCP as exactly the tool I needed; WinSCP has just this feature, with an initial sync and then continuous updates using the filesystem watch service ("Keep Remote Directory up to Date").
On Linux, one could argue that sshfs could be employed to sidestep the need for synchronization entirely; on Windows, a Samba share would do the same. However, I want the IDE to work with local files (on an SSD!) rather than having to go over the network for PHP indexing and whatnot, which takes ages. But sshfs might be part of the solution nevertheless, so that the continuous synchronization only needs to happen between two local directories.
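A sketch of one common approach, assuming the inotify-tools package is installed and with placeholder paths: watch the local tree with inotifywait and push each change with rsync over ssh, much like WinSCP's "Keep Remote Directory up to Date".
Code:
#!/bin/bash
SRC=/home/user/project/             # local working copy (placeholder)
DEST=user@server:/var/www/project/  # remote Apache docroot (placeholder)
rsync -az "$SRC" "$DEST"            # initial synchronization
# block until something changes, then push the delta; repeat forever
while inotifywait -r -e modify,create,delete,move "$SRC"; do
    rsync -az --delete "$SRC" "$DEST"
done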
I'm using openSUSE 11.4 (x86_64) with KDE 4.6.00 (4.6.0) "release 6". Every time I visit a directory, a hidden file ".directory" is created. How do I disable that? Is there a way to disable that behaviour only for the public_html directory?
I've just discovered that crontab is creating a new file in the root directory every time it executes a cron job, and it doesn't overwrite the old file, so there are now thousands of files in the root directory. They have the same name as the script file (appended with a numeral) but are all blank. Here is what one of the cron jobs looks like: [URL]
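The actual entry is elided above, so this is only a guess: blank copies named after the script with numeric suffixes (script.1, script.2, ...) are the classic footprint of wget saving its output on every run. If that is the cause, sending the download to stdout and discarding it stops the litter (URL and schedule are placeholders):
Code:
*/5 * * * * wget -q -O - http://example.com/script.php > /dev/null 2>&1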
I can see that the owner and group IDs are shown numerically because there are no corresponding entries in /etc/passwd and /etc/group respectively. I don't know much about Linux and dare not edit these files. Does Linux map the owner IDs of files coming from other computers to account names in /etc/passwd and display them when needed (for example, when using ls -al)?
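Yes: ls resolves numeric IDs through /etc/passwd and /etc/group (via the name-service switch) and only prints raw numbers when no entry matches. A quick way to check, using uid 1005 as an example value:
Code:
ls -l file1           # shows e.g. "1005 1005" when the uid/gid is unknown here
getent passwd 1005    # asks every configured name source about this uid
ls -ln file1          # -n forces numeric IDs even for known users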
I have a user account on a remote machine, but it doesn't have a home directory on that machine. Is it possible to create a home directory without having root account details? If yes, how can it be done?
I have heard that creating a hard link to a directory is not possible; however, the "ln" man page says the "-d/-F" option will "hard link directories (super-user only)". Does this mean the superuser, i.e. root, can create hard links to directories while a normal user cannot? Even when specifying the above options, I get "operation not permitted" as the superuser.
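On Linux the refusal comes from the kernel/filesystem, not from ln itself, which is why even root gets "operation not permitted" despite what the man page suggests. A bind mount is the usual substitute (root only; paths are examples):
Code:
# /srv/alias now shows the same tree as /srv/original,
# much as a directory hard link would
mount --bind /srv/original /srv/alias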
This may be a rookie mistake, but I created a user on an Ubuntu system and didn't actually create the home directory for this user. Now, when I log in, it says there are problems... If I delete the path /home/<new user> and try to log in, the system tells me I can use root as the home directory but will likely experience problems, and then it won't let me log in. What is the best way to create this directory with the appropriate permissions? Should I just create another user and delete this one?
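A sketch of the usual manual fix, run as root, keeping <new user> as the placeholder: create the directory, seed it from /etc/skel, and hand it over to the user.
Code:
mkdir /home/<new user>
cp -r /etc/skel/. /home/<new user>/     # default dotfiles (.bashrc, .profile, ...)
chown -R <new user>:<new user> /home/<new user>
chmod 750 /home/<new user>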
I jumped into a Linux class in college with only 3 weeks left in the course. I thought I would be able to catch on and, go figure, it didn't exactly happen that way. I was given an assignment to do, and I am so far lost it isn't even funny. I need to create a directory structure, set up file security, create a step-by-step instruction manual on how to copy/delete said files, and create a guide to common Linux commands. How would I create these files as root and share them with the other users? And where can I find a list of common commands and their functions?
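A sketch of the directory-structure part with made-up names; keeping shared material out of root's home and using a group is the conventional pattern:
Code:
mkdir -p /srv/coursework/{manuals,scripts,shared}   # invented layout
groupadd students                                   # common group for classmates
chgrp -R students /srv/coursework/shared
chmod 2775 /srv/coursework/shared   # group-writable; setgid keeps group on new files
chmod 755  /srv/coursework          # everyone may list and enter
For the command guide, each command's own man page (man cp, man rm, and so on) is the canonical reference.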
After some time I always see a trojan in my Ubuntu machine's shared folder. It is an .exe detected by ClamAV as Trojan.Autokit-77. I thought I was getting it from some Windows machine on the network, but that isn't the case. I deleted the virus and removed my computer from the network, and still the virus comes back. My computer, however, is still connected to the internet through an independent mobile broadband USB stick.
So where is the virus coming from, and why does it end up in my shared folder? I thought Ubuntu would not allow a virus to do something like this without me giving it permission. I am running 10.04.
I want to run a cron job every 15 minutes that checks a directory for files. If the directory contains more than ten files, I want it to send an email to me.
All I have is this...
*/15 * * * * [ "$(ls [directory] | wc -l)" -gt 10 ] && echo "Directory has more than ten files" | mail -s "This is just a test" [email address]
I would rather not write a bash script. Is there an easier way to do this? I was looking into some commands like find and grep.
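No script is needed; the whole test can live in the crontab line, for example with find as hinted at (placeholders kept from above):
Code:
*/15 * * * * [ "$(find [directory] -maxdepth 1 -type f | wc -l)" -gt 10 ] && echo "More than ten files" | mail -s "This is just a test" [email address]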
I'm quite new to Linux, but I have configured a simple FTP server and it's working great. I have an FTP-Shared folder with upload and download subfolders. Under upload and download I have identical category subfolders, like mp3s, movies, software etc., in both. As the guys upload, I would like a crontab line that moves everything under /FTP-Shared/upload/mp3/* older than 14 days to /FTP-Shared/downloads/mp3/, recursively (like the cp command), but the timestamp must be checked on the first-level directory and not on the files below it, for example /mp3/Club Dance/CD1/Hallo world.mp3. This is how far I got:
Code:
[root@clients ~]# /usr/bin/find /FTP_Shared/upload/Mp3s/ -depth -mindepth 1 -mtime +14 -type d -exec mv -f {} /FTP_Shared/download/Mp3s/ \;
This command moves directories and files, but not recursively as whole trees.
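A sketch of a fix: limiting find to the first directory level makes the age test apply only to the top-level category folders, and mv then carries each folder across as a whole tree.
Code:
/usr/bin/find /FTP_Shared/upload/Mp3s/ -mindepth 1 -maxdepth 1 -type d -mtime +14 -exec mv -f {} /FTP_Shared/download/Mp3s/ \;
One caveat: a directory's mtime changes when entries directly inside it are added or removed, not when files deeper down are touched, so test that this matches what "older than 14 days" should mean here.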
If I have a directory /foo with a few files in it, how do I symlink each entry in /foo into /bar/? For instance, if /foo has the files a, b and c, I want to create three symlinks: /bar/a, /bar/b and /bar/c, each pointing at the corresponding file in /foo.
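With absolute paths this is a one-liner, since ln accepts several sources when the last argument is a directory:
Code:
ln -s /foo/* /bar/
# result: /bar/a -> /foo/a, /bar/b -> /foo/b, /bar/c -> /foo/c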
I have a full snapshot of the Arch Linux repo for x86_64. I want to use this as my restore backup in case I need to reinstall Arch without network support. How do I build several tar volumes of my /mount/my_repo to fit onto 4.5 GB DVDs? The thing is 18 GB in size... And how do I extract all the tar volumes to a folder later on? Is it the same as extracting a single tar? Will tar find all the volumes at the same directory level and continue extracting, or do I need to merge them in some way?
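tar has a multi-volume mode (-M), but a simpler sketch is to pipe a single archive through split into DVD-sized pieces; to restore, the pieces only need to sit in the same directory so the shell glob picks them up in order (the target path is a placeholder):
Code:
# create ~4.3 GB pieces named repo.tar.aa, repo.tar.ab, ...
tar -cf - /mount/my_repo | split -b 4400M - repo.tar.
# restore: concatenate the pieces back into one stream and untar it
cat repo.tar.* | tar -xf - -C /restore/target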
How can I create a 1 GB file of random data in bash to test disk / network I/O? I was told I could use the 'dd' command, but I don't know whether there are better ways, or what the 'dd' invocation looks like.
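A sketch of both common dd variants (the output file name is a placeholder):
Code:
# 1 GB of zeros: fast, but compresses trivially, so less useful for network tests
dd if=/dev/zero of=testfile bs=1M count=1024
# 1 GB of random data: slower to generate, but realistic to transfer
dd if=/dev/urandom of=testfile bs=1M count=1024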
How is it possible to concatenate two files without a newline ending up between them? I've tried the following:
Code:
echo 123 > file1
echo 456 > file2
cat file1 file2 > file3
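cat itself inserts nothing between the files; the newline comes from echo, which terminates each file with one. A sketch that strips the first file's trailing newline before appending the second file:
Code:
# $(...) strips trailing newlines and printf '%s' adds none back,
# so file3 ends up containing "123456"
printf '%s' "$(cat file1)" > file3
cat file2 >> file3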
I am directly creating "qf" and "df" files in the sendmail queue folder and then processing the queue from the command line. This is the only way to export data and email from the old application I am stuck with. It works quite well in my test environment, but I am really new to Linux/sendmail and am just looking for feedback on this process. Is this direct creation of queue files safe, and are there any pitfalls I should be aware of?
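The main pitfall is a queue run picking up a half-written file. A cautious sketch (directory names, the queue ID and the generator commands are all assumptions): build both files in a staging directory on the same filesystem, then rename them into the queue, since mv within one filesystem is atomic.
Code:
STAGE=/var/spool/mqueue/.staging   # assumed to be on the queue's filesystem
QUEUE=/var/spool/mqueue
ID=o4AbCdEf012345                  # hypothetical queue ID
mkdir -p "$STAGE"
generate_df > "$STAGE/df$ID"       # hypothetical exporters from the old app
generate_qf > "$STAGE/qf$ID"
mv "$STAGE/df$ID" "$QUEUE/"        # data file first,
mv "$STAGE/qf$ID" "$QUEUE/"        # then the control file that references it
sendmail -q                        # process the queue from the command line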
How would I go about copying files to a directory while skipping the files that already exist in that directory, and also removing the files that are in it? For example:
Code:
$ ls /dir1
img001.jpg img002.jpg
[code]....
Now I would like to copy from dir1 to dir2, but the contents of dir2 would then be:
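The expected listing is lost above, but the "skip what already exists" half is the no-clobber copy; a sketch with the directory names from the example:
Code:
cp -n /dir1/* /dir2/                        # -n: never overwrite an existing file
rsync -av --ignore-existing /dir1/ /dir2/   # equivalent, and handles subdirectories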
Why is my scanning always creating huge 50 MB to 100 MB PDF files? Each A4 pnm file is about 6.5 MB at a resolution of 150 dpi. If I decrease the resolution below 100, my text starts to become unreadable...
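A sketch of one common remedy, assuming ImageMagick is available and using placeholder file names: recompress the scanned pnm pages as JPEG inside the PDF instead of embedding them uncompressed, which keeps 150 dpi readable at a fraction of the size.
Code:
convert page1.pnm page2.pnm -compress jpeg -quality 75 scan.pdf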