I've got a bachelor's in Computer Information Systems from Metro State in MN, but sadly, we didn't spend much time familiarizing ourselves with UNIX-based OSes. I'm sure I'll have plenty more questions to pose later as well. I'll pay it back with help for anyone who's working on an evil Nortel router.
I believe I've got vsftpd set up and configured (mostly) the way I want. Here is the config file:
vsftpd 2.3.2-3. After a user uploads a file it has -rw------- (0600) permissions. Of course the user can manually change the permissions to anything he likes, but how do I set, for example, 0700 by default?
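For reference, vsftpd derives upload permissions from file_open_mode masked by local_umask (both are real vsftpd options; the values below are one possible choice that yields 0700):

Code:
# in /etc/vsftpd/vsftpd.conf
file_open_mode=0777
local_umask=077
# uploads get file_open_mode & ~local_umask = 0777 & ~077 = 0700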
I am using Endian Firewall 2.4 and Squid 2.6, and everything is working fine for me. In the Squid report I am getting the download list by user or IP. Now my question is: is it possible to track user-based file uploads sent via HTTP POST? Does any configuration have to be done in squid.conf? If the answer is yes, then please tell me what configuration I have to add. Example: if a user uploads an image file to facebook.com, I want to track the file name, which IP address it came from, and where it was uploaded.
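A sketch of what Squid 2.6 can log, with a caveat: Squid sees the POST method, destination URL, client IP, user name, and request size, but not the uploaded file's name, since that lives inside the request body, which Squid does not log:

Code:
# in squid.conf -- log POST requests to their own file
acl post_req method POST
logformat uploads %ts.%03tu %>a %un %rm %ru %>st
access_log /var/log/squid/post.log uploads post_req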
I'm rather new to Fedora server, but I'm attempting to run a music FTP server where anonymous users can submit songs into one particular folder (so I can personally tag them), while other user accounts have full read-write access. Here we go: I have 2 directories, /music and /untagged.
I want anonymous users to be able to read both directories, but only be able to upload to /untagged, and not be able to delete anything. I want users that I select to have full read-write-create-delete privileges. How would I go about this with vsftpd?
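A sketch of the relevant vsftpd.conf options (directory paths and ownership are assumptions about your layout):

Code:
# /etc/vsftpd/vsftpd.conf
anonymous_enable=YES
local_enable=YES
write_enable=YES             # selected local users get full write access
anon_upload_enable=YES       # anonymous may upload files...
anon_mkdir_write_enable=NO   # ...but not create directories
anon_other_write_enable=NO   # ...and not delete or rename anything
#
# then make only /untagged writable by the ftp user, e.g.:
#   chown ftp:ftp /untagged && chmod 755 /untagged
#   (leave /music root-owned and world-readable, so it stays read-only)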
I am trying to read a file uploaded by a simple <input type="file"> form and write it directly to a MySQL blob, without saving it to the filesystem. I tried something like:
Code:
It writes a few bytes to the DB with no error, but what ends up stored is not the actual file.
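A common cause is inserting the wrong variable (the file name, or the $_FILES entry itself) instead of the temp file's contents. A minimal sketch with mysqli; the table and column names are made up for illustration:

Code:
<?php
// read the uploaded file's bytes from its temporary location
$data = file_get_contents($_FILES['userfile']['tmp_name']);

$db = new mysqli('localhost', 'dbuser', 'dbpass', 'mydb');
$stmt = $db->prepare('INSERT INTO uploads (name, data) VALUES (?, ?)');
$null = null;                                 // placeholder for the blob parameter
$stmt->bind_param('sb', $_FILES['userfile']['name'], $null);
$stmt->send_long_data(1, $data);              // send the blob contents separately
$stmt->execute();
?>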
I'm having a very strange problem. I use lampp on CentOS Linux. My application has an upload script (in PHP) which uploads files to the file system. After the upload, and after moving the file to the correct location, the uploaded files are getting deleted. I checked the upload and the move by writing all the status messages to a text log file: files are getting uploaded properly, and after the upload I'm able to move them to the correct location.
Alright, I'm running an Ubuntu-based web server. The app will be accepting user-uploaded files from my client's clients. My client will then need to download and access the files. I'm looking for a solution that scans for Windows malware at the time of upload, so I never expose her machine directly to her clients' uploads.
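ClamAV is the usual answer here, since it detects Windows malware even when running on a Linux host. A sketch of hooking it in at upload time (the file path is a placeholder):

Code:
# scan each uploaded file before making it available for download
clamscan --no-summary /tmp/uploaded_file
# exit status: 0 = clean, 1 = virus found, 2 = error
# have the upload handler quarantine or delete the file on any non-zero status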
I have an rsync server and am now setting up cwRsync on my Windows machine. I want to be able to run cwRsync over SSH with public/private keys. I followed a tutorial over here to set this up. It, however, still prompts me for the server password and works only if I provide the password. For some reason the public/private key process is not working.
- I generated a key using the command: ssh-keygen -t rsa -N '' (I verified the key gets generated on my Windows machine)
- I uploaded the generated file id_rsa.pub to the server's /root/.ssh/authorized_keys
I am also prompted for a password if I run this command from the command prompt to log into the server: ssh -i c:\docume~1\user\.ssh\id_rsa root@<server_ip_address>. On the server I have changed the configuration file (/etc/ssh/ssh_config) to say:
RSAAuthentication yes
PubkeyAuthentication yes
I then restarted the sshd service, however, to no avail.
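Two things worth checking, as a sketch: sshd silently ignores authorized_keys when its permissions are too loose, and the server daemon reads sshd_config, not ssh_config:

Code:
# on the server:
chmod 700 /root/.ssh
chmod 600 /root/.ssh/authorized_keys
# note: /etc/ssh/ssh_config configures the ssh *client*;
# the daemon's options live in /etc/ssh/sshd_config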
Finally I managed to install my printer/scanner drivers. The last thing I need to do is add the following two lines to 40-libsane.rules (which is a read-only file):

# Brother scanners
ATTRS{idVendor}=="04f9", ENV{libsane_matched}="yes"

How can I change the permissions on this file, or add these lines without changing its permissions?
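The file is only read-only for regular users; root can append to it without touching its permissions. A sketch (the rules directory may be /etc/udev/rules.d or /lib/udev/rules.d depending on the distribution):

Code:
echo '# Brother scanners' | sudo tee -a /lib/udev/rules.d/40-libsane.rules
echo 'ATTRS{idVendor}=="04f9", ENV{libsane_matched}="yes"' | sudo tee -a /lib/udev/rules.d/40-libsane.rules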
I am trying to use cron and FTP to back up files regularly from my main server to a backup server. The backup server was a "bare bones" setup with no control panel or even FTP. It is running CentOS 5.3.
I installed vsftpd, which appears to be running OK, and I can connect via FTP from my other server, but when I try to run my backup script (it uses mput) I get a "553 Could not create file" error.
Some relevant info:
The user I have created for this is "ftz" with home directory /home/ftz/
Running ls -l shows: drwxrwxrwx 3 ftz ftz 4096 Dec 18 07:46 ftz
so permissions and directory ownership don't seem to be the problem.
vsftpd.conf was left in default form:
Code: # Example config file /etc/vsftpd/vsftpd.conf # # The default compiled in settings are fairly paranoid. This sample file # loosens things up a bit, to make the ftp daemon more usable. # Please see vsftpd.conf.5 for all compiled in defaults. #
I've just installed Ubuntu Server 10.04 and have set up vsftpd somewhat successfully. I can connect as my root user, and I only have access to my home directory. However, whenever I try to copy a file across, I receive a 550 Permission denied error.
I've run the command ls -l /home/directory and the result returned is "total 0", when I was expecting details of permissions per user. I've tried running chmod -w /home/directory, but this doesn't return any result, nor does it change the result of the above ls command.
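Worth noting: "total 0" just means the directory is empty, since ls -l lists the contents rather than the directory itself, and chmod -w removes write permission rather than granting it. A sketch:

Code:
ls -ld /home/directory     # -d shows the directory entry's own permissions
chmod u+w /home/directory  # grants the owner write access (-w takes it away)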
I have a CentOS server installation running, and have installed and configured vsftpd. FileZilla works great: I am able to connect and transfer files both ways. I used this just for testing purposes.
What I need to do is get Fling File Transfer working. I can connect to vsftpd with Fling, but that is as far as it goes. The log shows:

Sep 20 11:18:44 ftp vsftpd[28286]: warning: can't get client address: Socket operation on non-socket
Sep 20 11:31:03 ftp avahi-daemon[2240]:
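For what it's worth, that vsftpd warning typically appears when the daemon is started without a network socket on stdin, e.g. launched by hand or through an (x)inetd entry when the config expects the opposite. A sketch of the standalone setting to check:

Code:
# in /etc/vsftpd/vsftpd.conf
listen=YES   # run standalone; also disable any ftp entry under /etc/xinetd.d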
I have installed vsftpd with "yum -y install vsftpd", disabled anonymous login, and set . When I use a Linux client's file browser to log in with the user account "ftpacc" via ftp://ip_address, its location is "/" instead of "/home/ftpacc". When I use a Windows client to log in, its location is "/home/ftpacc".
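If the setting cut off above was chroot_local_user, this is expected behavior rather than a bug: inside a chrooted session the home directory is presented as "/", so both clients are showing the same place under different labels:

Code:
# in /etc/vsftpd/vsftpd.conf
chroot_local_user=YES   # jails each user into their home, which then appears as "/"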
I want to back up an entire Linux system onto a 3 TB external Western Digital USB3 drive.
I do not want to reformat it from what it is, apparently NTFS.
Is there a utility that can act like a file manager (like mc) and will permit me to create an ever-expanding (to 320 GB) TAR file that retains all the original file permissions? I have had nothing but disappointment with Linux backup utilities on a FAT32 external drive, and I am concerned that if I just try to tar the entire drive at once, with around 3 million files, I might run out of memory.
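For what it's worth, tar streams to the archive as it goes, so the file count doesn't drive memory use, and Unix permissions are preserved inside the archive even though NTFS can't store them natively. A sketch (the mount point is an assumption):

Code:
# archive the whole system, preserving permissions, without crossing filesystems
tar -cpf /mnt/usb/backup.tar --one-file-system /
# or split into smaller pieces on the fly:
tar -cpf - --one-file-system / | split -b 100G - /mnt/usb/backup.tar.part-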
I have an NTFS partition that I wish to access as a normal user (non-root). For this I did the following: as root I created a folder /windows and did a chmod 777 -R on /windows. Then I added the following line to /etc/fstab:
Now, the partition is mounted all right, but the problem is that when any other user (non-root) creates a file in /windows (say by executing touch newfile), the newly created file has its owner and group set to root. The non-root user can create the file and can also delete it; however, he cannot change the permissions of the file, and the owner:group is always set to root:root. How do I get around this problem, i.e. how do I mount the partition so that a non-root user can also change the permissions and ownership of the files he creates?
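This is inherent to NTFS mounts: ntfs-3g normally presents one fixed owner/group/umask for the whole filesystem, chosen at mount time, so chmod/chown on individual files has no effect. A sketch of an fstab line that at least hands ownership to a normal user (the device and uid are assumptions):

Code:
# /etc/fstab -- every file appears owned by uid/gid 1000 with modes 755/644
/dev/sda5  /windows  ntfs-3g  uid=1000,gid=1000,umask=022  0  0

Genuine per-file Unix permissions on NTFS require ntfs-3g's "permissions" mount option plus a user mapping file, which is a bigger undertaking.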
I also added umask 022 to the user login script. The problem I have: I log in with a user and password that exist as a local user on my SUSE machine. I can read and download from my home directory, but I cannot upload with FileZilla. I then get the error: 550 permission denied critical file transfer error.
I am trying to set the file permissions for the log files /var/log/Xorg.0.log and /var/log/gdm/:0.log. These files seem to be created when a user logs into a workstation (my guess so far). I am trying to comply with a security mandate that all log files in the directory /var/log are set to 0640. The two mentioned files always seem to have the permissions 0644. Does anyone know where and when these files are created, and how I might set the permissions when the files are created?
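As a workaround sketch: the X server and GDM recreate these logs at each login with their own modes, so one blunt way to satisfy the mandate is to re-apply the mode periodically from cron:

Code:
# /etc/crontab -- re-tighten the X logs every five minutes
*/5 * * * * root chmod 0640 /var/log/Xorg.0.log /var/log/gdm/:0.log 2>/dev/null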
I have tried to back up some material to Ubuntu One. I went through the sequence Nautilus -> right click -> Synchronise this folder. The computer was working for some time, and now the folder's symbol has a green and blue arrow on it. Yet, when I log on to my Ubuntu One account, I see only the folders, but not the files. Also, the quota counter shows that I use 0% of the allotted 2 GB. (The folder is something like 200 MB on my computer.) My question is whether I am missing a step here. Is there something else that I should do in order to back up the contents of a directory? I don't want to move everything to the Ubuntu One folder on my computer.
I have a system at work that I am setting up which runs on Linux. It was powered up back in September, but we didn't get the details to configure it until this week. Unfortunately, /var filled up to 100% space used due to a log file that keeps being written to until it is initialized. I can't just delete the file (it would not be recreated), so I pulled it off, took it home, and split it into a smaller file (from 740 MB down to a 15 MB chunk). I'm really just a Linux newbie, so can someone explain what the permissions are on the current file, and then what chmod would make the smaller file the same? clusternet.log is the original and clusternet1.log is the one I made from split. I know it's read, write and execute (what's the r right after the x on clusternet.log?), but I'm not sure what it means in the position it's in. clusternet.log should have permissions only for root, correct?
Code:
-rw-r--r-- 1 luke luke  16613376 2011-01-06 20:10 clusternet1.log
-rwxr----- 1 luke luke 740130816 2011-01-06 06:39 clusternet.log
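Reading that listing: clusternet.log is -rwxr-----, i.e. owner read/write/execute (7), group read (4), others nothing (0), so mode 0740; the "r right after the x" is the group read bit, and both files are owned by user luke rather than root. To give the split file the same mode:

Code:
chmod 740 clusternet1.log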
I am using Ubuntu 10.04 and have attached my desktop screenshot. My problem is that I use no software that uses the network, but the OS still uploads data from my PC. Is this something to worry about for my network security? And is there any way to check which program is using the network and how much it is using? [URL]...
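A couple of standard tools for seeing which process is using the network, as a sketch (they may need installing first, e.g. sudo apt-get install nethogs):

Code:
sudo nethogs eth0   # live per-process bandwidth usage
sudo lsof -i -n     # which processes currently hold network connections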
I registered here and got Ubuntu a couple of days ago. After setting up and connecting to my wireless LAN, I went on the internet and uploaded the WINE source. However, after about 5-10 seconds of being on the internet, it suddenly stopped working (the internet connection dropped, that is), and I was still connected to the wireless LAN. I tried to reconnect like most people would, and I was not able to connect again (it would prompt me for the password continuously, to no avail). There isn't a problem on my Windows 7 machine (I'm able to connect to the LAN/internet).
I have an Apache installation with /var/www/bob as the document root and the only site served. Bob is the directory's owner, and he is able to upload his files to that folder via FTP (vsftpd on the server). When a browser tries to access the pages, it simply gets a 403 Forbidden. The problem occurs when apache2 tries to access the files bob uploads. The www-data user (the Apache daemon) gets permission denied when I try to cat bob's files in the shell, so it's purely a permissions issue. What I can't figure out is how to give the Apache daemon the ability to read bob's files while also making certain it does not have the ability to modify any of them.
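One common arrangement, as a sketch: keep bob as owner with write access, give Apache's group read-only access, and shut everyone else out:

Code:
chgrp -R www-data /var/www/bob
chmod -R g+rX,o-rwx /var/www/bob   # capital X: search bit on directories only
chmod g+s /var/www/bob             # new uploads inherit the www-data group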
There is a suspicious amount of data (more than a megabyte) being uploaded from my computer whenever I log onto a commercial web site on which I advertise rental properties.
Is there any way I can see the data being uploaded? I am pretty familiar with the Unix/Linux system and its commands.
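A sketch with tcpdump (the interface and host names are placeholders): capture the outbound traffic to a file, then inspect it in Wireshark:

Code:
sudo tcpdump -i eth0 -w upload.pcap host www.example-rentals.com
# open upload.pcap in Wireshark and use "Follow TCP Stream" to see the payload
# (if the site uses HTTPS, the payload will be encrypted)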