Debian :: Ftpsync Still Downloading Old Files Despite Exclude?
Sep 6, 2015
I was thinking of migrating my apt-mirror repository to the recommended ftpsync scripts: [URL] .....
I pre-populated my pool with the already downloaded files and set up the scripts.
However, when I run bin/ftpsync and monitor rsync with lsof -p, I can see that it is still downloading files from oldstable (wheezy) despite the exclude options.
I'm guessing it's a configuration error, but I can't seem to figure it out. Any thoughts? My etc/ftpsync.conf is as follows:
Code:
MIRRORNAME=`hostname -f`
TO="/server_storage/srv/mirrors/debian"
RSYNC_PATH="debian"
RSYNC_HOST=ftp.us.debian.org
LOGDIR="${BASEDIR}/log"
[Code] ....
Actually, I don't think it works like I thought it did. A few guides I found listed the exclude options, but the sample config file has this:
Code:
## If you do want to exclude files from the mirror run, put --exclude statements here.
## See rsync(1) for the exact syntax, these are passed to rsync as written here.
## DO NOT TRY TO EXCLUDE ARCHITECTURES OR SUITES WITH THIS, IT WILL NOT WORK!
#EXCLUDE=""
So it looks like it doesn't exclude the suites at all.
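For what it's worth, ftpsync does offer architecture-level filtering through ARCH_EXCLUDE / ARCH_INCLUDE in etc/ftpsync.conf, but as far as I know there is no corresponding option for suites, so wheezy stays in the mirror either way. A minimal sketch of the architecture options, assuming a current ftpsync sample config (the architecture lists here are only examples):
Code:
## exclude unwanted architectures from the mirror run
ARCH_EXCLUDE="armel armhf mips mipsel s390x"
## or, alternatively, list only the ones you want
#ARCH_INCLUDE="amd64 i386"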
View 5 Replies
Feb 3, 2010
I have created a folder on my server for uploading some regular status files. I want users to be able to modify or upload the already stored files, but they should not upload any unwanted files or folders. For that I want to use the "rm" command as an automatic scheduler (putting it in the crontab), so that all files will be removed except the required files/folders for which this upload facility is activated. Users are uploading data via secure shell.
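A minimal sketch of that kind of cleanup job, assuming an upload directory /srv/upload and two whitelisted names (both hypothetical); find with explicit exclusions is safer than a bare rm pattern here:
Code:
#!/bin/sh
# cleanup.sh - remove everything in the upload area except the whitelisted names
find /srv/upload -mindepth 1 -maxdepth 1 ! -name 'states' ! -name 'README.txt' -exec rm -rf {} +
It can then be scheduled with a crontab line such as 0 * * * * /usr/local/bin/cleanup.sh.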
View 1 Replies
View Related
Aug 4, 2015
I have some scripts that need a newer version of PHP. I'm running Debian 6, which ships PHP 5.3.3, and I found I could install PHP 5.4 using [URL]. This worked and updated my PHP to a newer version; the only issue is that since the install completed, Apache now downloads PHP files instead of rendering them.
I'm guessing this has something to do with the Apache configuration, but I don't know what to change.
Code:
tom@vps:~$ dpkg --list | grep -E '(apache)|(php5-)'
ii  apache2       2.2.16-6+squeeze12   Apache HTTP Server metapackage
ii  apache2-doc   2.2.16-6+squeeze12   Apache
[Code] ...
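A few generic checks that usually apply when Apache serves PHP source instead of executing it, assuming the stock libapache2-mod-php5 packaging (module and package names may differ for a third-party PHP 5.4 build):
Code:
# is the PHP module actually loaded?
apache2ctl -M | grep php
# re-enable the module/handler config if the upgrade disabled it
a2enmod php5
/etc/init.d/apache2 restart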
View 1 Replies
View Related
Mar 21, 2010
opensuse v11.2, linux 2.6.31.12-0.1-desktop x86_64
ZIP v2.32
I wish to exclude some files from a zip archive. On other OSes to exclude an entire directory I would use the "-x" option like so:
Code:
zip -r archive-name * -x dir1/*
Simple, and just add "-x" options as needed (or use an exclusion file).
Not how it works here, it would seem. AFAICT all "-x" options are ignored. (The entries in an exclusion file also.) For instance, "-x diy/mplayer/*" should ignore everything in the <diy/mplayer> directory. It does not. I have tried fully qualified paths as well; no joy.
What is different about ZIP on linux?
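One likely culprit, rather than a Linux-specific difference: if the -x pattern is left unquoted, the shell expands the wildcard before zip ever sees it, so the exclusion silently fails to match. A sketch assuming Info-ZIP:
Code:
# quote the exclusion pattern so zip, not the shell, interprets the wildcard
zip -r archive-name * -x 'diy/mplayer/*'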
View 7 Replies
View Related
Dec 7, 2010
I'm trying to find all Java files in bash that contain the method "assign()". I would like to retrieve the same list except without the Test* files. How can I do that?
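A sketch of one way to do this with GNU grep alone, assuming you want file names only and that all the test files start with Test:
Code:
# list .java files containing assign(), skipping files named Test*
grep -rl --include='*.java' --exclude='Test*' 'assign(' .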
View 3 Replies
View Related
Dec 26, 2010
In reading the rsync man page and browsing a lot of websites, I ended up a bit confused, or maybe it was just too much eggnog. Anyway, how do I exclude a directory "videos" with everything in it, which is /home/user1/camera/videos, when I'm rsyncing the whole user1 directory to an external drive?
[code]...
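The detail that usually trips people up is that --exclude patterns are matched relative to the root of the transfer, not as absolute paths. A minimal sketch, assuming /home/user1 is being copied to a mounted external drive (the destination path is a placeholder):
Code:
# 'camera/videos/' is relative to /home/user1/, the root of the transfer
rsync -av --exclude='camera/videos/' /home/user1/ /mnt/backup/user1/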
View 1 Replies
View Related
Feb 11, 2010
I have a Linksys AG300 "Adsl gateway" router/modem. When I download files with Iceweasel, the connection to the internet drops out (the internet connection light goes off, downloading stops). It's been happening for a while with Etch (and whatever version of Iceweasel Etch has), but I've today installed Lenny and it is still happening in Lenny.
My ISP said it could be a problem with the phone line because my computer is connected to an extension, but it does not happen at all if I download with Opera, and I would have thought that the browser wouldn't matter if it was the phone line or something in the router/modem. I'm not that fussed because I can use Opera to get my downloads, and the new version of Iceweasel will let you continue on if the download stops so all is not lost if it stops (it's just annoying). I'd be interested if anyone has any ideas as to why this happens. It seems to be an "Iceweasel thing".
View 4 Replies
View Related
Jun 28, 2010
I have the following command which finds all files that have changed in the last day and lists them. How can I exclude hidden files like .bash_history?
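One way to filter them out, assuming the search starts from the current directory and "hidden" means anything under a dot-file or dot-directory:
Code:
# files modified in the last 24 hours, skipping hidden files and hidden directories
find . -type f -mtime -1 ! -path '*/.*'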
View 3 Replies
View Related
Nov 28, 2010
I'd like to back up my whole system to a second disk using rsync (other tools are not possible). Which paths should I exclude from the backup? I was thinking about /proc, /dev, the lost+found directories... What other paths am I forgetting?
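A commonly used exclusion set for this kind of whole-system rsync, as a sketch (the destination path is a placeholder); pseudo-filesystems and mount points are skipped and can be recreated as empty directories on restore:
Code:
rsync -aAXv \
    --exclude={"/proc/*","/sys/*","/dev/*","/run/*","/tmp/*","/mnt/*","/media/*","/lost+found"} \
    / /mnt/backup/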
View 2 Replies
View Related
May 24, 2011
I have successfully created an ISO of my current running system using live-build with the --bootstrap copy option. As expected, the image is gigantic. I would like to be able to use live-build to create copy-of-host ISOs, but with options to exclude specific paths (e.g. music folders, picture folders, etc.). Is there a way to do this? I did run a configuration and build using an option similar to the one found in tar (something like --exclude=/home/user/music) and it ran through without any apparent errors; however, there was no ISO image to be found.
View 2 Replies
View Related
Apr 25, 2016
I'm trying to configure auditd to monitor "strange" events with apache2 weberver on Wheezy (though same problem occurs on Jessie), tried both with "vanilla" 3.2 and backports 3.16 kernel I am actually using.
Here's auditd rules I have problem with:
Code:
-a exit,never -F arch=b64 -S stat -F path=/var/www/server-status -k web
-a exit,always -F arch=b64 -S stat -F uid=www-data -F success=0 -k web
So to recap, I want to log stat syscall failures for www-data user, but excluding some "known" issues, such as that "/var/www/server-status" (after a2enmod status, /server-status path can be accessed for statistics, though apache2 still tries to find physical file for that path and fails).
But the problem is.. excluding does not work.
Here's "auditctl -l" output:
Code:
# auditctl -l
LIST_RULES: exit,never arch=3221225534 (0xc000003e) watch=/var/www/server-status key=web syscall=stat
LIST_RULES: exit,always arch=3221225534 (0xc000003e) uid=33 (0x21) success=0 key=web syscall=stat
But when I execute:
Code:
# wget -O - http://localhost/server-status
audit.log appears:
Code:
type=SYSCALL msg=audit(1461591557.077:365): arch=c000003e syscall=4 success=no exit=-2 a0=7f1bedab9358 a1=7ffef316ac20 a2=7ffef316ac20 a3=7f1bedab91f8 items=1 ppid=2398 pid=2451 auid=4294967295 uid=33 gid=33 euid=33 suid=33 fsuid=33 egid=33 sgid=33 fsgid=33 tty=(none) ses=4294967295 comm="apache2" exe="/usr/lib/apache2/mpm-prefork/apache2" key="web"
type=CWD msg=audit(1461591557.077:365): cwd="/"
type=PATH msg=audit(1461591557.077:365): item=0 name="/var/www/server-status" nametype=UNKNOWN
type=UNKNOWN[1327] msg=audit(1461591557.077:365): proctitle=2F7573722F7362696E2F61706163686532002D6B007374617274
So, syscall=4 (stat) is still captured. Looks like "path" is known for auditd, but not excluded.
I've tried various rule combinations, for example simpler, more generic one:
Code:
-a exit,never -F path=/var/www/server-status
But it's the same.
Sadly, man audit.rules and man auditctl do not have "exit,never" examples, only some (sometimes similarly unsuccessful) Google results.
Could it be that Debian kernel does not support some audit features?
View 1 Replies
View Related
Oct 12, 2010
I have to administer a few mail servers, a mail log server, 4 nameservers and a web server, all running CentOS 5 server distributions. Now I have a task: to avoid accidental crashes on the production servers while installing updates, my boss asked me to make clones of the servers (these clones will all be VMware virtual machines), EXCLUDING the actual e-mails and mail log contents, and then to run those clones on VMware Server. This way, I will first install and test updates on the clones and, if they run without crashes, I will apply the updates on the real production servers themselves. I have already installed VMware Server 2.0. I have a few questions:
- How do I build the virtual machines to exclude the actual mail files and mail logs? Can I use VMware Converter for this purpose, or do I have to use another program?
- How do I actually do this cloning? Is there a tutorial on how to do this?
View 3 Replies
View Related
Feb 1, 2011
I understand wget is used to download files. Is there a way I can search a URL to see what files are available for me to download? I need to install a plug-in from an Adobe website.
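wget itself can't really browse a directory, but you can dump the index page and pull the links out of it to see what is on offer before fetching anything. A rough sketch (the URL is a placeholder):
Code:
# list the href targets on an index page
wget -qO- http://example.com/downloads/ | grep -oE 'href="[^"]+"'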
View 2 Replies
View Related
Nov 14, 2010
I've been using PAN for quite some time - recently installed 0.133 from the Ubuntu Software Centre. It worked fine for a while, no issues. Then, a week or so ago, it started downloading .msg files along with the binary files I was getting. Sometimes one .msg file for one binary, sometimes quite a few. There seems to be some correlation with the size of the binary: the larger the binary, the greater the number of .msg files downloaded. This morning, it would ONLY download .msg files. I could see the decoded binary in the PAN viewer pane, but it isn't present on my system. I have made NO changes to any configuration files, other than installing the recommended updates.
How do I correct this? Can anyone tell me what these .msg files are, and how to stop them from downloading? Are there as-good or better newsreaders out there that I can/should try?
View 3 Replies
View Related
Mar 27, 2011
How do I set up a script to log on to a server through SSH, copy a file from the server to the local machine, and then run a script on the downloaded file? More specifically, I've got a Minecraft server that runs on Ubuntu.
I know that I can do
# ssh username@ipaddress
to log on to the server through the terminal.
After this it asks me for a password. Once I type in the password I have access to the directories on the server. How can I set up a script to log on to the server and enter password? Once I do this, how do I automate it to copy a file from that server to ~/Desktop? If I can do this, I have a script that will run from there.
I've learned that I can do
scp -r remoteuser@remotebox:/remote/directory /local/directory
to copy files from a server to my local machine, but it still asks me for the server password. How do I make it so that the password is entered automatically?
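The usual answer is key-based authentication rather than embedding a password in the script; a minimal sketch, assuming OpenSSH on both ends:
Code:
# generate a key pair once (accept the defaults; leave the passphrase empty for unattended use)
ssh-keygen -t rsa
# copy the public key to the server; after this, ssh and scp stop prompting
ssh-copy-id username@ipaddress
# the copy step can now run unattended from a script
scp -r remoteuser@remotebox:/remote/directory /local/directory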
View 2 Replies
View Related
Mar 23, 2010
I want users of my website to be able to download files from it.
When they request a download, I would like them to see a window with the following attributes: ability to choose where downloaded file goes; progress of the download, and estimate of remaining time; a button to cancel the download.
There are hundreds of sites out there that do this, so obviously it's possible, but can it be done from a PHP script on the server? If not, does anyone know of a script in Java(Script)?
I would also like to monitor when a download failed or succeeded, and if the failure was not because the cancel button was pressed, pick up whatever system information is available about the cause of the failure.
View 4 Replies
View Related
Sep 20, 2009
I keep downloading tar.gz files into my Downloads folder and I can't do anything with them. What do I need to do to install the file so I can use it? As an example, I am trying to install Frets on Fire, and am failing badly.
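A generic sketch of what to do with a source tarball; the exact steps depend on the project (some games only need to be run from the unpacked directory rather than built), and the file name here is a placeholder:
Code:
tar -xzf package.tar.gz                    # unpack into its own directory
cd package/                                # read the README or INSTALL file first
./configure && make && sudo make install   # classic source build, if the project provides one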
View 9 Replies
View Related
Apr 1, 2011
I need to download some very large files (circa 75 GB) from a remote server via SFTP. I've been using SFTP via the command line on my Linux netbook. Around halfway through, the transfer stops and says "stalled." Can anybody recommend a reliable way to download these files?
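For transfers this size a resumable tool is usually the safer bet; a sketch using rsync over SSH, assuming rsync is installed on the remote server (host and path are placeholders):
Code:
# --partial keeps the partially transferred file, so a re-run resumes instead of restarting
rsync --partial --progress -e ssh user@remote:/path/to/bigfile.iso .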
View 1 Replies
View Related
Nov 19, 2010
When I try to load a PHP file my browser downloads the file instead of displaying the page. Here is my Apache conf file.
#
# Based upon the NCSA server configuration files originally by Rob McCool.
# This is the main Apache server configuration file. It contains the configuration directives that give the server its instructions.
# See [URL] for detailed information about the directives.
# Do NOT simply read the instructions in here without understanding what they do. They're here only as hints or reminders. If you are unsure consult the online docs. You have been warned.
# The configuration directives are grouped into three basic sections:
# 1. Directives that control the operation of the Apache server process as a whole (the 'global environment').
# 2. Directives that define the parameters of the 'main' or 'default' server, which responds to requests that aren't handled by a virtual host.
# These directives also provide default values for the settings of all virtual hosts.
# 3. Settings for virtual hosts, which allow Web requests to be sent to different IP addresses or hostnames and have them handled by the same Apache server process.
# Configuration and logfile names: If the filenames you specify for many of the server's control files begin with "/" (or "drive:/" for Win32), the server will use that explicit path. If the filenames do *not* begin with "/", the value of ServerRoot is prepended -- so "/var/log/apache2/foo.log" with ServerRoot set to "" will be interpreted by the
# server as "//var/log/apache2/foo.log".
#
View 7 Replies
View Related
Apr 8, 2010
How do I download all the files from here: [URL]? I am on FreeBSD 7.0 and I tried wget with the -r switch, but it gives me URLs only. Maybe this is simply not an FTP site, I don't know. How can I download all those files with the same directory structure?
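If it is actually an HTTP index rather than an FTP site, wget can still mirror it recursively while keeping the directory layout; a sketch (the URL is a placeholder):
Code:
# -r recursive, -np don't ascend to the parent directory, -nH drop the hostname directory,
# -R skip the generated index pages
wget -r -np -nH -R 'index.html*' http://example.com/files/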
View 5 Replies
View Related
Feb 17, 2010
Is there an easy way to get all of the multilib files from [URL]? I've tried:
1.) wget - won't use wildcards on an http server, and trying to get the whole folder just gets me index.html
2.) ftp - it's not an ftp server and I can't login anonymously via gftp. I wasn't expecting this to work, it was just something else to try.
3.) rsync - If this could work, I may not be using the right syntax. I tried (-n for a dry run first):
Code:
rsync -avn [URL]
And it just sat there doing nothing until I hit ctl-c.
Obviously I could download each file from my web browser but I figured there had to be a more elegant Unix-y way without all the clicky-clicky. Are the files hosted on an ftp server anywhere?
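On the rsync attempt specifically: rsync only speaks to rsync daemons (or over a remote shell), so pointing it at an http:// URL won't get anywhere. If the mirror exposes an rsync module at all, the syntax looks like this sketch (host and module name are placeholders):
Code:
# list what the daemon exports, then do a dry run against a module
rsync rsync://mirror.example.org/
rsync -avn rsync://mirror.example.org/multilib/ ./multilib/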
View 5 Replies
View Related
Feb 26, 2010
I'm trying to download Star Wars: The Clone Wars torrents from two different feeds. The scantime below is temporary while I'm trying to get it to work; I will change it to 15 minutes later on. Below is the verbose output.
Quote:
INFO --- RSSDler 0.4.2
DEBUG writing daemonInfo
INFO [Waking up] Wed Oct 27 18:12:11 2010
DEBUG checking working dir, maybe changing dir
[code]....
View 2 Replies
View Related
Mar 24, 2011
I currently mirror the updates repository to my computer using rsync. I was wondering if I could save space and bandwidth by only rsyncing the .delta.rpm files? Are there any disadvantages to this or does zypper/YAST handle updates just fine with it?
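The filtering itself is straightforward with rsync's include/exclude rules; whether zypper is happy with a deltas-only tree is a separate question. A sketch (mirror URL and local path are placeholders):
Code:
# keep the directory structure, take only .delta.rpm files, skip everything else
rsync -av --include='*/' --include='*.delta.rpm' --exclude='*' \
    rsync://mirror.example.org/update/ /srv/mirror/update/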
View 3 Replies
View Related
Mar 17, 2010
I have an Ubuntu 9.10 server set up at my house. I have Apache2 and PHP5 installed on it. Every time I go to the server on a web page and try to load the PHP index page it downloads instead of displaying.
I have virtual servers set up and have the files stored at /home/cusinndzl. If anyone needs to take a look I can let them into the webmin panel.
View 4 Replies
View Related
Apr 24, 2011
Ubuntu Lucid 32-bit desktop machine. I run "aptitude update" and "aptitude safe-upgrade" manually from the command-line regularly to upgrade my machine. I usually don't pay that much attention to the downloading, but I noticed yesterday that aptitude appeared to be downloading four files simultaneously, rather than the one at a time that I've always seen. Would this be correct? How can I change that? I can't say I've ever noticed it before.
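If the parallelism is unwanted, APT's download queueing is controlled by Acquire::Queue-Mode as documented in apt.conf(5): "host" (the default) opens one connection per target host, while "access" opens one connection per URI type. A sketch, assuming the standard apt.conf.d layout (the file name is just an example):
Code:
# /etc/apt/apt.conf.d/99queue-mode
Acquire::Queue-Mode "access";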
View 3 Replies
View Related
Jan 23, 2010
I want to limit bandwidth for downloading files with Squid, and reserve bandwidth for other traffic (especially web browsing). I know about delay pools but I don't understand them well. Some users use download managers to download large movie files. I don't want to block downloading, but I want to give them limited bandwidth, maybe 5 KBps or 6 KBps, because I have only a 512 Kbps (64 KBps) connection.
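A minimal delay-pool sketch for capping aggregate download speed at roughly 6 KB/s, assuming delay pools are compiled into the Squid build (the ACL name and subnet are placeholders; class 1 is a single aggregate bucket shared by everyone it matches):
Code:
acl limited_users src 192.168.1.0/24
delay_pools 1
delay_class 1 1
delay_access 1 allow limited_users
# aggregate bucket: 6000 bytes/s sustained fill rate, 6000 bytes maximum burst
delay_parameters 1 6000/6000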
View 1 Replies
View Related
Jun 5, 2011
I am looking for a file sharing program to install on my dedicated server that will allow me to upload large MP3 files and allow my clients to download them. These files are recordings of counseling sessions for families who are seeking help for their children.
What I am looking for is similar to the system this company uses [URL].
View 4 Replies
View Related
Oct 17, 2009
New Apache install; PHP files are downloading rather than displaying. Does anyone know what's causing this? When I compiled PHP with apxs it automatically added this line:
Code:
LoadModule php5_module modules/libphp5.so
I restarted Apache, and also tried adding this line:
Code:
AddType application/x-httpd-php5 .php5 .php4 .php
and then restarting.
View 2 Replies
View Related
Jul 13, 2011
For my project, I'm interested in the Scatterometer Products of Oceansat 2 from an Indian page [URL]
It's no problem for students to get a password to access and download their data for free. Nevertheless, it's quite complicated to download the files by hand, since you have to mark every file individually and click on a download button at the end of the page. When I tried it with my script (which is below), an internal server error 500 occurred. I hope you're not too busy and could have a look at the script, where the cookie and IP are entered manually for trial purposes.
The construction of the page is:
The adress, where you have to login:
http://218.248.0.134:8080/OCMWebSCAT...controller.jsp
The adress, when your're logged in:
http://218.248.0.134:8080/OCMWebSCAT...controller.jsp
[Code]....
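Since the script itself is truncated above, here is a generic sketch of the usual cookie-based login-then-download flow with wget; the form field names and URLs are placeholders that would have to match the actual login form on the site:
Code:
# 1) log in once and store the session cookie
wget --save-cookies cookies.txt --keep-session-cookies \
     --post-data 'username=USER&password=PASS' \
     'http://example.com/login.jsp' -O /dev/null
# 2) reuse the cookie for each download request
wget --load-cookies cookies.txt 'http://example.com/download?file=F1' -O F1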
View 3 Replies
View Related
Apr 12, 2011
I put a few iTunes songs into my music folder and tried to play them. They didn't play. Then I got to reading the forums and noticed that there was a "one-click" configuration for GNOME users. After doing some more reading, I saw a lot of references to restricted codecs and things of that nature. After beginning the one-click process and then aborting midway through installation, three questions come to mind:
1) Why all the warnings about the legalities of downloading the codecs/files? If I bought the music and I use the codecs to listen to it, where do legalities come into play?
2) While installing some of the different files, etc in the one-click process, I received a few warnings that a particular file was not from a trusted source (I don't remember the info verbatim) and then it gave me an email address, presumably from the developer, and asked if I wanted to install it anyway.
3) If I do go through the one-click process, will I be able to listen to I-tunes or am I pretty much screwed on I-tunes on linux?
View 9 Replies
View Related