I'm trying to find all zip files timestamped within the past 7 days, then unzip them into a different directory. I tried the following, but it only unzipped one of the three files that meet the 7-day criteria. What am I missing?
Code:
find /home/user/public_html/zip_files/ -iname "*.zip" -mtime -7 -print0 | xargs -n10 unzip -LL -o -d /home/user/public_html/another_directory/
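A possible fix, keeping the paths from the post: -print0 needs a matching -0 on the xargs side, and unzip treats any extra arguments after the first as member names to extract from that archive rather than as additional archives, so each zip should be passed on its own with -n1:
Code:
find /home/user/public_html/zip_files/ -iname "*.zip" -mtime -7 -print0 | xargs -0 -n1 unzip -LL -o -d /home/user/public_html/another_directory/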
I would like to overwrite files in a directory tree, recursively. The ones I would like to overwrite match the filename "x_alpha*.png" and have a size of exactly 456 bytes. Is there any way to search for these recursively in a directory tree and overwrite them with a reference file, for example "e:\mydir\good.png"?
I am using Windows 7, but I have UnxUtils, so I can use those too. What I am looking for is something like this, generated automatically:
copy /y e:\mydir\good.png e:\mydir\ac\x_alpha0023.png
copy /y e:\mydir\good.png e:\mydir\efg\x_alpha0045.png
copy /y e:\mydir\good.png e:\mydir\h\x_alpha0248.png
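One possible way to get the effect of those copy commands directly with the UnxUtils find (a sketch, assuming it supports the GNU-style -size 456c test; forward slashes avoid backslash-escaping trouble, and under cmd.exe the terminating ; does not need to be escaped):
Code:
find e:/mydir -type f -name "x_alpha*.png" -size 456c -exec cp e:/mydir/good.png {} ;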
I need a little help. I want to find all files with the extensions "*.tar", "*.gz" and "*.zip" and move all of those files into the "/opt/old" directory. I've tried this command:
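A sketch of one way to do it (/path/to/search is a placeholder for wherever the archives live, not something from the post):
Code:
find /path/to/search -type f \( -name "*.tar" -o -name "*.gz" -o -name "*.zip" \) -exec mv {} /opt/old/ \;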
I am a member of a group which has written a program whose source code is held in a specific directory (~cs252/Assignments/basicAsst/project), and we want to go through and change the parameters for the function "sequentialInsert." My job is to find all occurrences of the call to "sequentialInsert" and also to list the files the calls came from. Also, I have to be in the commandsAsst directory when I do this. I have tried grep and find combined together, and I am at a loss.
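A possible starting point, run from the commandsAsst directory (a sketch; -rn assumes GNU grep and prints the file name and line number for every match):
Code:
grep -rn "sequentialInsert" ~cs252/Assignments/basicAsst/project
# or, listing only the file names that contain the call:
grep -rl "sequentialInsert" ~cs252/Assignments/basicAsst/project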
I have the CentOS 5.5 operating system files copied from the DVD; it's not an ISO image. My CentOS DVD is not working, so I would like to make a bootable image from the files, burn it to a DVD, and install CentOS on another machine. I have tried creating an ISO image using the mkisofs command, but it does not boot.
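A commonly used recipe for a bootable CentOS image (a sketch; it assumes the copied tree still contains the isolinux/ directory from the original DVD, and /path/to/centos_files is a placeholder):
Code:
cd /path/to/centos_files
mkisofs -o /tmp/CentOS-5.5.iso -b isolinux/isolinux.bin -c isolinux/boot.cat \
  -no-emul-boot -boot-load-size 4 -boot-info-table -R -J -T -v .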
I'm trying to set up an Apache server on my computer which will allow browsing of files in a specific directory and its subdirectories, without needing any sort of authentication.
I've got the Apache2 server up and running through YaST, and everything works fine as long as I point it at the /www/htdocs folder. However, I want to point it at another folder, which is on another partition. This partition is formatted as NTFS, if that matters at all (here's some background on some permissions issues I had with the NTFS partitions recently).
When I change the "Directory" setting in the YaST HTTP server configuration utility to the directory on the NTFS partition I wish to use, attempting to access the server results in the following error:
Code: Access Forbidden: You don't have permission to access the requested directory. There is either no index document or the directory is read-protected. If you think this is a server error, please contact the webmaster.
Error 403
192.168.1.100
Mon Jun 13 23:43:29 2011
Apache/2.2.17 (Linux/SUSE)
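For reference, on Apache 2.2 the directory usually needs an explicit block along these lines (a sketch; /mnt/ntfs/share is a placeholder for the NTFS mount point, and the partition must also be mounted so that the Apache user can read it). Options +Indexes is what allows a listing when there is no index document:
Code:
Alias /files /mnt/ntfs/share
<Directory "/mnt/ntfs/share">
    Options +Indexes +FollowSymLinks
    AllowOverride None
    Order allow,deny
    Allow from all
</Directory>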
I am trying to design an application which deliberately violates the DHCP protocol. Specifically, the difficulty in writing this application is physically sending the raw packet. I need some documentation on either a library that supports this or where to look for support for raw packet creation. I am not trying to create a raw datagram; that doesn't meet my needs, because a raw datagram is still at layer 3 and I need to craft a raw layer 2 PDU.
Specifically, I want to send a very specific DHCPDISCOVER, receive a DHCPOFFER, and pull apart the offer, while never sending a DHCPREQUEST.
Specifically, I am pulling apart various options that are sent in the DHCPOFFER. I have a raw DHCPDISCOVER already crafted and the struct sockaddr_ll formatted; where I'm stuck is that I can't send the damn thing. Getting the file descriptor after calling socket() is fine, but what now? How would I write to that file descriptor and have it transmit?
Code:
int connfd;
struct sockaddr_ll bcast;   /* needs <linux/if_packet.h>, <linux/if_ether.h>, <net/if.h> */

memset(&bcast, 0, sizeof(bcast));
bcast.sll_family  = AF_PACKET;
bcast.sll_ifindex = if_nametoindex("eth0");   /* interface to transmit on -- "eth0" is a guess */
bcast.sll_halen   = ETH_ALEN;
memset(bcast.sll_addr, 0xff, ETH_ALEN);       /* destination MAC: broadcast */
...
connfd = socket(PF_PACKET, SOCK_RAW, htons(ETH_P_ALL)); /* protocol 0 can send but never receives */
//now what
I have some basic experience creating simple scripts/making directories/changing permissions/etc., but I'm stumped on this one.
I have two Linux boxes. I have a script set up on box 'A' to SCP into box 'B', grab a copy of a database backup, and store it on box 'A'. It looks like this:
I have generated a public key on box 'A' and placed it into the authorized_keys file on box 'B', so a password is not required and the file copies over successfully when the script is run. On to my problem...
I need to know what date the 'dump.23.gz' file was originally created when I'm viewing it after it's been copied to box 'A'. If I ls -l on box 'A', it only shows me the date the copy was created on box 'A'.
What would I need to add to my script to append the backup's original creation date on box 'B' to the filename, so that when it gets copied to box 'A' I know when the backup was created on box 'B'? I'm sure this is probably confusing. I've done lots of searching and can only find information on how to append the current date and time to a file name. I need to append its original creation timestamp to the filename when it copies over.
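A sketch of one way to do it (hostnames and paths are placeholders; it assumes GNU date on box 'B' and uses the file's modification time, which for a freshly written backup is effectively its creation time):
Code:
ts=$(ssh user@boxB "date -r /path/to/dump.23.gz +%Y%m%d-%H%M%S")
scp -p user@boxB:/path/to/dump.23.gz /backups/dump.23.${ts}.gz
# scp -p also preserves the original modification time on the copied file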
I have a folder with hundreds of .txt files (logs of some Java application) that I have to merge into one single .txt file. This application produces a new log file every day:
day1: logFriday10September2010.txt
day2: logSaturday11September2010.txt
...
day8: logFriday17September2010.txt
... and so on...
I could merge the files easily with "cat" and ">>"; however, the problem is that I have to do it taking into account the date (creation or modification) of each file.
If I simply use the cat command, the output file will receive, for example, all Fridays in a row, then all Saturdays, etc., and that way I'm not respecting the dates.
I've searched through the options of the find command, since the files are not modified after creation... I tried to use this, for example:
$ find . -newer <some old file>
but that lists all files newer than <some old file>; it doesn't give them to me in the correct date order.
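One possible approach that sorts by modification time instead of using -newer (a sketch; it assumes the log file names contain no spaces and that they all sit in one directory):
Code:
ls -1tr /path/to/logs/log*.txt | xargs cat > merged.txt
# -t sorts by modification time, -r reverses it so the oldest file comes first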
If there's some file, how can I forbid a user to open it? E.g. some text file or video file: forbid regular users (everyone except root) to open/play it. Wouldn't chmod work here, or should I use some other method?
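chmod should be enough, assuming the file is owned by root (the path is a placeholder):
Code:
chown root:root /path/to/file
chmod 600 /path/to/file   # only the owner (root) can read it; everyone else gets "Permission denied"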
I have used Awk in the past to isolate the file name from a given path; that is to say, I may have a list of files contained in list.txt. Can someone please post the Awk command that would do this? (I assume it will be very similar in form to the Awk command I showed above.) The point is, sometimes I may want to isolate the second directory, sometimes the third or the tenth or whatever - so I am hoping that if someone posts the Awk command to isolate the second-level directory (to produce the output I showed in Fig.3), it should be fairly obvious from the form of that command how to alter it and so isolate any other directory I want.
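A sketch, assuming list.txt holds absolute paths of the form /first/second/third/file.ext:
Code:
awk -F/ '{ print $3 }' list.txt
# with "/" as the field separator, $2 is the first directory level, $3 the second,
# $4 the third, and so on -- change the field number to isolate a different level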
I would like to find the command that copies my Eclipse options to another workspace.
Code: ...
It doesn't work, and writing the .metadata/.plugins path by hand could be a source of error. Wouldn't it be a better idea to create a complete script?
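A rough sketch of such a script, assuming the goal is to carry over the workspace preferences stored under .metadata/.plugins/org.eclipse.core.runtime/.settings (both workspace paths are placeholders):
Code:
SRC=/path/to/old_workspace
DST=/path/to/new_workspace
mkdir -p "$DST/.metadata/.plugins/org.eclipse.core.runtime"
cp -r "$SRC/.metadata/.plugins/org.eclipse.core.runtime/.settings" \
      "$DST/.metadata/.plugins/org.eclipse.core.runtime/"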
I'm trying to add users to my NFS server with a specific home directory that already exists. Can this be done? I've done some research on Google and other forums but can't seem to find the answer.
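A possible approach with useradd (username and path are placeholders; -M tells useradd not to create the home directory, since it already exists):
Code:
useradd -d /export/home/alice -M alice
chown -R alice:alice /export/home/alice   # make sure the existing directory belongs to the new user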
If I am in the root directory and I need to search for a specific file in the sysconfig directory, is there any way to search that directory for a file?
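A sketch using find (the file name "network" is only an example; on most distributions the directory in question is /etc/sysconfig):
Code:
find /etc/sysconfig -iname "network*"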
I have .jpg files in many subdirectories, from which I need to copy all the images into one specific directory. I have used `cp -rf *.jpg media/sik/`, which only copies the .jpg files of the directory I was working in.
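A sketch with find, run from the top of the directory tree (media/sik/ is the destination from the post; note that files with identical names will overwrite one another in the destination):
Code:
find . -type f -iname "*.jpg" -exec cp {} media/sik/ \;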
So here it is: I have a file server set up on Ubuntu Server 9.10 AMD64. In this system are four 1.5TB SATA II drives, configured in RAID 5 using md. Everything is working beautifully, except when you need to create a directory. When you do, the system hangs for about 30 seconds and the I/O wait in top jumps up as well. Then the directory is created and everything works once again.
The odd part is that it is only directory creation that does this. I can copy, move, download and stream files off the server perfectly. I am baffled as to what is causing it. It might be related to the fact that I expanded the array from 3 to 4 drives; it was after that that I first noticed the problem.
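One thing that may be worth checking, given the recent grow from 3 to 4 drives, is whether the array is still reshaping or resyncing in the background (a sketch; /dev/md0 is a guess at the array device name):
Code:
cat /proc/mdstat          # shows reshape/resync progress, if any
mdadm --detail /dev/md0   # state, level and sync status of the array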
I started an upgrade from Ubuntu 8.04 to 10.04 and it stopped with the message "Ubuntu desktop is listed to be removed but is on the removal blacklist", then it restored back to 8.04. I don't know how to resolve this - it would be alright to remove the old Ubuntu desktop.
I'm trying to do something like this: I created a group called www and made this group the owner of the directory /var/www/html so I can read and write to it. Of course I've added myself to this group, but it seems I can't read and write. The syntax I used was something like chown :www /var/www/html, which didn't work; only when I used chown samurai:www /var/www/html could I finally create a new file. The reason I don't want to specify the user name is that I'm thinking of a scenario where I need to give permission to a large group of people and don't want to do it user by user.
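A possible approach (group and path taken from the post): group ownership alone is not enough - the directory also needs group write permission, and a freshly added group membership only takes effect after logging out and back in (or running newgrp www):
Code:
chgrp -R www /var/www/html
chmod -R g+rwX /var/www/html   # group read/write, plus execute on directories
chmod g+s /var/www/html        # optional: files created here inherit the www group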
I need to search a bunch of files in a specific folder for a specific number and add all the numbers together to get a total sum. I use rsync every day, and every time I run rsync I get a log file (rsync output) which contains the text string "Total bytes sent: xxxxxx".
The "xxxxxx" can vary in length. I need to extract the "xxxxxx" from each file and add the numbers together to get a total size over a week or a month. Is this possible? And I wish to only use bash. One thing at a time, my friends.
I have a directory on my server at /home/dave/www/images/site (ext3) which I want to mount directly on my Windows computer, so that I can transfer data easily via a command-line tool. Is that possible?
My home directory's permissions allow only myself access to it. Is it possible to put a file inside my home directory with, say, full permissions, and create a symlink to it so other users can access that one file inside my home folder? The system is Ubuntu Karmic.
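It can work, but not through the symlink alone: other users also need execute (search) permission on the home directory itself to reach anything inside it. A sketch (file and link names are made up):
Code:
chmod 711 /home/me                          # others can traverse the directory but not list it
chmod 644 /home/me/shared.txt               # the one file to expose
ln -s /home/me/shared.txt /tmp/shared.txt   # optional convenience link elsewhere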
When I create a user (say sandy) on my FC14 system, I find that the default permissions for her home directory (/home/sandy) are 700. Can I somehow set up my system so that these permissions are 711 instead of 700?
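One possibility (hedged: useradd derives the home directory mode from the UMASK value in /etc/login.defs as 0777 & ~UMASK, so 077 gives 700 and 066 gives 711, but that UMASK may also affect the default shell umask on some setups). Otherwise, simply fix it per user after creation:
Code:
grep ^UMASK /etc/login.defs   # change "UMASK 077" to "UMASK 066" for 711 home directories
chmod 711 /home/sandy         # or adjust an already-created home directory directly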
I want to run a cron job every 15 minutes that checks a directory for files. If the directory contains more than ten files, I want it to send an email to me.
All I have is this...
*/15 * * * * ls -l | wc -l | [filename] | mail -s "This is just a test" [email address]
I would rather not write a bash script. Is there an easier way to do this? I was looking into some commands like find and grep.
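A one-liner sketch that could go straight into the crontab (directory and email address are placeholders):
Code:
*/15 * * * * [ $(ls -1 /path/to/dir | wc -l) -gt 10 ] && echo "More than 10 files in /path/to/dir" | mail -s "File count alert" you@example.com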
I'm quite new to Linux, but I have configured a simple FTP server and it's working great. I have an FTP-Shared folder with upload and download subfolders. Under upload and download I have identical category subfolders, like mp3s, movies, software etc., in both. As the guys upload, I would like to create a crontab line that moves all the content under /FTP-Shared/upload/mp3/ older than 14 days to FTP-Shared/downloads/mp3/ recursively (like with the cp command), but the timestamp must be checked on the first directory and not on the files below it, for example: /mp3/Club Dance/CD1/Hallo world.mp3
This is how far I got:
Code:
[root@clients ~]# /usr/bin/find /FTP_Shared/upload/Mp3s/ -depth -mindepth 1 -mtime +14 -type d -exec mv -f {} /FTP_Shared/download/Mp3s/ \;
This command moves the directories and files, but it is not recursive.
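A possible refinement (paths as in the command above): restrict the age test to the first directory level with -maxdepth 1; mv on a directory already takes everything inside it along, so no extra recursion is needed:
Code:
/usr/bin/find /FTP_Shared/upload/Mp3s/ -mindepth 1 -maxdepth 1 -type d -mtime +14 \
  -exec mv -f {} /FTP_Shared/download/Mp3s/ \;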