General :: Write A Shell Script Which Can Read Content Of The Folder And Place Files On Remote FTP Server?
May 9, 2011
I need to write a shell script which can read the content of a folder and place the files on a remote FTP server. I need to make sure that a file which has already been placed on the remote FTP server is not uploaded a second time. The file names will be something like Records-2011-05-09. The files will be generated by MySQL every hour.
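A minimal sketch of such a script, assuming a plain command-line ftp client is available and that the local folder, FTP host, credentials, remote directory and the .uploaded_list tracking file are all placeholders for your real values:
Code:
#!/bin/bash
# Upload new files from a local folder to an FTP server, skipping files already sent.
SRC=/data/export                  # assumed local folder with the hourly MySQL dumps
LIST=/data/export/.uploaded_list  # local record of file names already uploaded
HOST=ftp.example.com              # placeholder FTP server
FTPUSER=ftpuser
FTPPASS=secret

touch "$LIST"
for f in "$SRC"/Records-*; do
    name=$(basename "$f")
    # skip files we have already uploaded
    grep -qxF "$name" "$LIST" && continue
    ftp -n "$HOST" <<EOF
user $FTPUSER $FTPPASS
binary
cd /incoming
put $f $name
bye
EOF
    echo "$name" >> "$LIST"
done
Run it from cron once an hour, shortly after the MySQL job finishes.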
I would like to find and back up all *.mp4 files from /Pictures and its sub-directories and move them to a single directory on a remote server. I can find and move the files, but I don't want the directory structure, just the files placed in the remote directory.
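A minimal sketch using find and scp; user@remotehost:/backup/mp4/ is a placeholder destination. Because each file is transferred individually, the local directory structure is flattened automatically:
Code:
find /Pictures -type f -name '*.mp4' -exec scp {} user@remotehost:/backup/mp4/ \;
# to move rather than copy, remove each file only after a successful transfer:
find /Pictures -type f -name '*.mp4' -exec sh -c 'scp "$1" user@remotehost:/backup/mp4/ && rm "$1"' _ {} \;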
I want to write a shell script which will simultaneously collect OS user information and write it to individual text files. Can anyone tell me the syntax of the script? N.B. The user names will be mentioned in an array within the shell script.
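A minimal sketch, assuming the user names in the array are placeholders and that "OS user information" means the usual id/passwd/login details; each user's report goes to its own file and the lookups run in parallel:
Code:
#!/bin/bash
users=(alice bob charlie)            # placeholder user names
for u in "${users[@]}"; do
    {
        id "$u"                      # UID, GID and group membership
        getent passwd "$u"           # passwd entry: home directory, shell, ...
        last -n 5 "$u"               # recent logins
    } > "${u}_info.txt" 2>&1 &       # one text file per user, collected in the background
done
wait                                 # let all the background collectors finish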
I have to write a shell script that will delete all the .dat files in /var/oracle/etl/incoming whose creation date is more than 7 days before the current date.
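A minimal sketch; note that standard Linux filesystems do not record a creation date, so the modification time (-mtime) is the usual stand-in:
Code:
find /var/oracle/etl/incoming -maxdepth 1 -type f -name '*.dat' -mtime +7 -exec rm -f {} \;
# -mtime +7 matches files last modified more than 7 days ago;
# drop -maxdepth 1 if the .dat files can also sit in subdirectories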
I need to write a shell script which can take a number of files, count the total rows from all the CSVs, and display the total number of rows counted across all files. Is there any possibility of doing that using a shell script, and if yes, then how?
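A minimal sketch: pass the CSV files as arguments (e.g. ./countrows.sh *.csv, where countrows.sh is just a placeholder name):
Code:
#!/bin/bash
total=0
for f in "$@"; do
    rows=$(wc -l < "$f")             # rows in this CSV (header lines are counted too)
    total=$((total + rows))
done
echo "Total rows in $# file(s): $total"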
Terribly new to Linux and finding it mind-boggling. I work on brain imaging and unfortunately all of the analysis runs on Linux, and coming as I do from a medical background I do not understand computers well. So my question: there are various folders of patient MRI scans (folders called P1, P2, P3, etc.), and within them are certain files that I am interested in (always with the same name in every folder, say image001). I would like a script that enables me to copy and move this image001 from all these individual folders to another folder altogether.
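A minimal sketch, assuming the patient folders live under /data/scans and the collection folder is /data/collected (both placeholders):
Code:
#!/bin/bash
mkdir -p /data/collected
for d in /data/scans/P*/; do
    patient=$(basename "$d")
    # prefix each copy with its folder name so files from different patients don't overwrite each other
    cp "$d/image001" "/data/collected/${patient}_image001"
done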
There is a folder, and it's empty. Whenever I drag a new file into it, I want it to echo "there is a file in there" and keep monitoring the folder. How can I do it?
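A minimal sketch using inotifywait from the inotify-tools package; /path/to/watched is a placeholder for the folder being monitored:
Code:
#!/bin/bash
inotifywait -m -e create -e moved_to /path/to/watched |
while read dir event file; do
    echo "there is a file in there: $file"   # -m keeps inotifywait running, so monitoring continues
done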
Is there a command to move the content of a folder (only image files, no subfolders) to another destination folder? Actually, I had a look at the following post:
Post title: Using mv to move the content of one directory into another
From it, I tried this:
Code: # rsync -a SOURCE/ DEST/
But it copies the whole folder to the destination.
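rsync -a SOURCE/ DEST/ copies everything, subfolders included; to move only the image files sitting directly in the folder, a sketch like this should do (extend the -name patterns to whatever image types you actually have):
Code:
find SOURCE/ -maxdepth 1 -type f \( -name '*.jpg' -o -name '*.png' -o -name '*.gif' \) -exec mv {} DEST/ \;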
I'm wondering how you can post web content on your desktop like you can in Windows, like having your XBL gamercard on your desktop, that sort of thing. But what I mainly want it for is to put the countdown to 10.10 on there.
Is there a recursive shell or Perl script to delete files with the same name as the parent folder? I wish to pass the starting folder name as an argument to the script.
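A minimal sketch in shell, assuming "same name" means an exact match between the file name and its parent directory's name:
Code:
#!/bin/bash
start="$1"                                   # starting folder passed as the first argument
find "$start" -type f | while IFS= read -r f; do
    parent=$(basename "$(dirname "$f")")
    [ "$(basename "$f")" = "$parent" ] && rm -v "$f"
done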
I am trying to show library content (ASP files, IIS server, MySQL database on Windows Server 2003 - Inetpub/wwwroot/library - 192.168.0.3) publicly, but the Apache webserver is on a Linux machine. In one of the previous topics I was advised to mount this remote machine's web folder into Linux /var/www. Well, done that way it won't work. I was then advised it can be done through redirecting.
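For reference, a minimal sketch of the mount approach; the share name, credentials and mount point are placeholders. Note that Apache would only serve the ASP files as static text, so pages that must be executed by IIS really do need the redirect/proxy route instead:
Code:
# requires the cifs-utils package on the Linux machine; assumes Inetpub/wwwroot is shared as "wwwroot"
mkdir -p /var/www/library
mount -t cifs //192.168.0.3/wwwroot/library /var/www/library -o username=winuser,password=secret,ro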
I want to sync 2 folders, one on a desktop and the other on my server. My objective is to keep the desktop folder always updated with the content of the server folder. If I get this working, I can do the same with the rest of my desktop and laptop users. When online, they can run a script with rsync and update the data. Is it possible to get 2-way sync?
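A minimal sketch; user@server:/srv/shared/ and ~/shared/ are placeholder paths:
Code:
# one-way: keep the desktop folder updated from the server
rsync -avz --delete user@server:/srv/shared/ ~/shared/
# a rough 2-way sync can be had by running rsync in both directions with --update (-u):
# newer files win and deletions are not propagated; a tool built for this, e.g. unison, handles conflicts better
rsync -avzu ~/shared/ user@server:/srv/shared/
rsync -avzu user@server:/srv/shared/ ~/shared/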
The following script is named 123.sh and I need to put it in the background. If I do 123.sh -bg, this will not bring me back to the prompt, but it echoes whatever I put (using echo hello >> /tmp/123) into the /temp/123 file. The only way that I have found of doing this is "nohup 123.sh &" to put it into the background. Is this okay, or is there a better way of doing it?
#!/bin/bash
# file name is 123.sh
tail -f /temp/123 | while read line; do
    echo "$line"    # loop body not shown in the original; echoing each line is assumed
done
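"nohup 123.sh &" is a perfectly reasonable way to do it; a couple of equivalent alternatives, sketched on the assumption that the script sits in the current directory:
Code:
nohup ./123.sh > /dev/null 2>&1 &    # detach from the terminal and survive logout
# or, from an interactive bash session:
./123.sh &
disown                               # drop the job from the shell's job table so it is not sent SIGHUP on exit
# or:
setsid ./123.sh > /dev/null 2>&1 &   # start it in its own session, fully detached from the terminal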
I am calling a URL from a shell script and passing a few arguments. Here I have to pass file content as one argument. How can I pass the file content through the URL?
e.g.:
Code:
content=`cat /Users/test1.txt`
open http://localhost:8080?filecontent=$content
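The file content has to be URL-encoded before it can travel in a query string; a minimal sketch using curl (-G turns --data-urlencode parameters into the query string), reusing the path and parameter name from the example above:
Code:
content=$(cat /Users/test1.txt)
curl -G "http://localhost:8080" --data-urlencode "filecontent=$content"
# or let curl read and encode the file itself:
curl -G "http://localhost:8080" --data-urlencode "filecontent@/Users/test1.txt"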
I'm looking for a way to insert an SD memory card into my computer and have it copy the files from it (a specific directory) in the background while I view the images from the desktop.
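A minimal sketch, assuming the card is auto-mounted at /media/SDCARD and the photos live in DCIM (both placeholders):
Code:
#!/bin/bash
mkdir -p ~/Pictures/incoming
rsync -av /media/SDCARD/DCIM/ ~/Pictures/incoming/ &    # copy in the background
xdg-open /media/SDCARD/DCIM/                            # browse the images while the copy runs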
I have two folders, such as:
nonserved/
    folder1/
    folder2/
And a folder served via Apache:
media/
    js/
    css/
    img/
In the end, I want to include/append the contents of /nonserved to /media so that [URL] will be as such:
/media
    /js
    /css
    /img
    /folder1
    /folder2
I am running Ubuntu Server, and I am up for either an Apache config or a symbolic-link-based answer :) Plus, the nonserved folder is rather dynamic, thus manually symbolic-linking each folder is impossible.
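Since nonserved changes over time, one approach is a tiny script that refreshes the symlinks and runs from cron; a minimal sketch, assuming /srv/nonserved and /srv/media are the real paths and that the Apache config allows FollowSymLinks under the media directory:
Code:
#!/bin/bash
# link every top-level folder of nonserved into the served media folder
for d in /srv/nonserved/*/; do
    ln -sfn "$d" "/srv/media/$(basename "$d")"
done
An Apache Alias per folder would also work, but it needs a config edit every time a new folder appears, which is exactly what the symlink refresh avoids.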
I want to move all files inside the folder moving to the folder public_html. Which command should I use? I'm using CentOS 5 64-bit. Tell me the full command which I should write in the SSH client, so all my files will be moved from the moving folder to public_html.
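A minimal sketch, assuming both folders sit in the directory you land in after logging in over SSH; adjust the paths if they live somewhere else:
Code:
mv moving/* public_html/
# include hidden files (dotfiles) as well, if there are any:
mv moving/.[!.]* public_html/ 2>/dev/null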
OK, trying to use grip but it keeps giving me this message:
Code: No write access to write encoded file
So I have a folder in my home folder called flac, and I want to modify the folder so that all files within it can be written to. I have tried
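A minimal sketch of the permission fix, assuming the encoded files are meant to go into ~/flac:
Code:
chmod -R u+w ~/flac                  # give yourself write permission on the folder and everything in it
# if the files are owned by another user (e.g. root), ownership is the real problem:
sudo chown -R "$USER" ~/flac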
I have a problem while copying files from a remote computer to my local one using the scp command. I am sure that I am using it correctly; please check it below:
Code:
blah@blah.com:~/g4work> scp blah2@blah2.com:IndirectMethod_Spher...s/H_1.mac.root .
What I get in return (instead of the statement saying 100% of the file was copied) is:
Code:
On this machine the G4SYSTEM=Linux-g++
The interesting point is that the returned statement above involves one of the environment variables set on both machines, which are necessary to work with a toolkit called Geant4. Here is what I get when I type 'printenv | grep G4' (note the G4SYSTEM line):
Code:
G4LEVELGAMMADATA=/home/blah/geant4/geant4.9.3.p02/data/PhotonEvaporation2.0
G4INSTALL=/home/blah/geant4/geant4.9.3.p02
G4LEDATA=/home/blah/geant4/geant4.9.3.p02/data/G4EMLOW6.9
G4NEUTRONHPDATA=/home/blah/geant4/geant4.9.3.p02/data/G4NDL3.13
G4VIS_BUILD_OPENGLX_DRIVER=1
G4RADIOACTIVEDATA=/home/blah/geant4/geant4.9.3.p02/data/RadioactiveDecay3.2
G4ABLADATA=/home/blah/geant4/geant4.9.3.p02/data/G4ABLA3.0
G4LIB=/home/blah/geant4/geant4.9.3.p02/lib
G4VIS_BUILD_RAYTRACERX_DRIVER=1
G4LIB_BUILD_SHARED=1
G4VIS_USE_OPENGLX=1
G4UI_USE_TCSH=1
G4VIS_USE_RAYTRACERX=1
G4REALSURFACEDATA=/home/blah/geant4/geant4.9.3.p02/data/RealSurface1.0
G4SYSTEM=Linux-g++
G4WORKDIR=/home/blah/g4work
The other thing that I would like to mention is that these Geant4 environment variables are loaded each time a new (bash) shell is started, as a result of the bash login script.
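scp uses the remote shell's stdin/stdout for its own protocol, so it tends to fail whenever the remote login scripts print something in a non-interactive session, which is exactly what the stray G4SYSTEM line suggests. A minimal sketch of the usual fix, assuming the echo comes from ~/.bashrc (or a Geant4 setup script sourced from it) on the remote machine:
Code:
# in the remote ~/.bashrc: only print messages when the shell is interactive
case $- in
    *i*) echo "On this machine the G4SYSTEM=$G4SYSTEM" ;;
esac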
I needed to install a new OS on a new HD, but I also need the data on the old HD, which has some problems (it doesn't boot anymore). The problem is that on the old one I had Linux (Slackware), and so it doesn't allow me to view the content of the folder /home/myself. Is there anything I can do to recover all the data, or is it lost forever?
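If the old drive is still readable, the "cannot view /home/myself" part is just ordinary Unix ownership, which root on the new system can ignore. A minimal sketch, assuming the old disk shows up as /dev/sdb1 (a placeholder, check with fdisk -l) and carries a normal ext filesystem:
Code:
sudo mkdir -p /mnt/oldhd
sudo mount -o ro /dev/sdb1 /mnt/oldhd            # mount the old root or home partition read-only
sudo cp -a /mnt/oldhd/home/myself ~/recovered    # root can read it regardless of the old permissions
sudo chown -R "$USER" ~/recovered                # hand the copy over to your current user
If the disk itself is failing rather than just unbootable, imaging it first with ddrescue is the safer route.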