Security :: Unable To Upload File Via Browser To The Server?
Feb 12, 2010
One of my users wants to be able to upload files via a browser to the server. For that, I need to grant Apache read and write access to a folder. How secure is giving Apache full read and write access to a folder?
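A minimal sketch of one common way to contain the risk, assuming the Apache user is named "apache" (on Debian/Ubuntu it is "www-data") and a hypothetical docroot layout; run as root:

```shell
# Give Apache write access only to one dedicated upload directory,
# never to the whole document root.
mkdir -p /var/www/site/uploads /var/www/site/html
chown apache:apache /var/www/site/uploads
chmod 770 /var/www/site/uploads      # apache can write here; "others" get nothing
# Keep the rest of the docroot read-only for everyone:
chmod -R a-w /var/www/site/html
```

That way a compromised upload script can only litter one directory, and nothing Apache writes is served as executable content from the main docroot.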
I have a VPS and I installed an X server on it, and all is OK. I created a new user for my client, but I need to limit his access as follows:
he can download and upload to his home folder (browsing with Firefox); he can't install or use any application (just the ones I installed); he can't see or browse the file system; if I can give him a specific amount of space on the hard disk, even better; he can extract and compress files; he can't edit the settings.
I also have other sensitive folders and settings I don't want him to see, so how do I limit his access?
I configured a website on a dedicated server using WHM/cPanel. The site was uploaded using the master account for the website.
The security issue is that public users are able to upload files onto my server via the website. They could even access root and execute whatever they want on the server.
I have consulted with 2-3 Linux experts. According to them, the PHP user has rights to execute anything on the server and to upload and store files in whichever folder it wants.
Can I protect my folders to avoid file uploads via the website? The application has security vulnerabilities, and I want to keep hackers out of my site until the vulnerabilities are fixed.
I am using the "curl" command-line tool to upload a file to an FTP server through FTPS. I have also tried the "Secure FTP" software from Windows using implicit mode, which works fine when transferring files. The command is as follows:
curl -vk --ftp-ssl -u [username]:[password] ftps://ftp.hostname/directory/test.txt -T /tmp/text.txt --ftp-pasv --disable-epsv
The login to the server succeeds, but I get an error when the data transfer starts. The verbose output shows the failure.
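For comparison, a sketch of the two TLS variants with placeholder host and credentials (note the option is spelled --disable-epsv):

```shell
# Explicit TLS on the normal FTP port (21):
curl -v --ssl-reqd -u user:pass -T /tmp/test.txt ftp://ftp.example.com/directory/

# Implicit TLS on port 990 (ftps://); forcing passive mode and disabling EPSV
# often helps when a NAT/firewall breaks the data connection:
curl -vk --ftp-pasv --disable-epsv -u user:pass -T /tmp/test.txt ftps://ftp.example.com/directory/
```

If implicit mode keeps failing at the data-transfer stage, trying the explicit-TLS form is a quick way to tell whether the problem is the data connection rather than the TLS handshake.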
I need to upload a file to a business partner of ours in another country. Currently they have an SFTP server set up for us that I am using to download a daily generated file from a previous requirement. I use a bash script to download it, since it's fairly simple.
sftp's man page only gives a hint about using a batch file, and I still cannot get it to work. Does anyone know another way? Or can you suggest another method or application? It seems like a bit of a cop-out that you can EASILY download using the sftp command but can't upload.
EDIT - forgot to mention I already have passwordless login set up using SSH keys.
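A sketch of the batch-file approach, assuming key-based auth is already working (as mentioned) and that the host and paths are placeholders:

```shell
# Write the scripted sftp session to a file, then run it with -b:
cat > /tmp/upload.batch <<'EOF'
put /home/me/daily-report.csv /incoming/daily-report.csv
bye
EOF
sftp -b /tmp/upload.batch user@partner.example.com
```

With -b, sftp aborts and exits non-zero on the first failed command, which makes it safe to call from a cron job or the existing bash script.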
I have been trying to get my message modem to read .g3 files under Ubuntu (now 10.04) for over a year, and have tried absolutely every application I can find to no avail. The output is always garbage. Every application tried gives the same result, yet the fax is perfect on Windows - but I don't have a Windows machine any more.
I am unable to upload the .g3 file here as it is an "invalid attachment", but I have saved it to a URL. I have (hopefully) attached the resultant output from GIMP, which is exactly the same with tkusr and efax. The fax is the weekly debt collection agency touting for business (no, I don't owe them any money - yet).
I believe they all use a common tool, something like g3topbm or g3topdf. If I try g3topbm on the command line, every second line has an error, e.g.: g3topbm: bad code word at row 202, col 5 (len 14 code 0x32), skipping to EOL
The .g3 files are created by a USR message modem which stores the files for later download ('cos the computer gets switched off at night).
I have an Apache server running on my laptop and am trying to upload files to it. I tried from the terminal, first connecting to the server by ftp and then using put, but I am getting an error.
Code:
vivek@NEO:~$ ftp localhost
Connected to localhost.localdomain.
I have just started using linux. I have setup an ubuntu apache2 server. It has been running brilliantly and I am highly impressed with the Linux system. My box is an HTTP server and I am hosting a website on it. I have VSFTPD installed and functioning as my FTP software. It has worked fine so far but I have been a bit annoyed that I have had to set permissions for each file I have put on there.
Now I have run into a serious issue: permissions are being set to 600, and I really need them to be 755 because I am running an automatic upload for a webcam, and the image can't be accessed due to the automatic permissions of 600 being set on it. My extensive Windows background tells me that I need to apply the correct permissions to the WWW folder and have the files inherit those permissions automatically.
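The 600 permissions most likely come from vsftpd itself rather than the filesystem: uploads are created with file_open_mode (default 0666) masked by local_umask (default 077), which yields 600. A sketch of the relevant lines, assuming the config lives at /etc/vsftpd.conf:

```
# uploads become 0666 & ~022 = 644; new directories get 755
local_umask=022
file_open_mode=0666
```

Restart vsftpd after editing; files already on disk still need a one-time chmod.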
I am trying to use AppArmor to restrict my file browser, Thunar, to only let me view the files that are in the home directory and on removable media. I tried following the AppArmor sticky with no success. I created the profile and tried editing it, and it either started and let me do pretty much everything, or did not start at all. Would it be possible for someone to help me step by step to set up a profile for Thunar that would only show the home directory and removable media?
We have a web server running Apache and a custom web app that we log into from a web browser; it asks you to accept the certificate and all is well. I now have a user on Windows Server 2008 who wants to manually import the *.cer file into his browser to be able to log in. My questions are:
1 - What is the file format being imported into the browser? *.pem? *.crt?
2 - I see that we keep our certs in /etc/pki/tls/certs; the OpenRADIUS servers I have created also store theirs in that directory. Is this the typical placement for certs?
3 - If the file is a *.crt or *.pem, could I use openssl to convert it to the appropriate *.cer file for IE7?
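For question 3, a sketch using openssl (filenames are placeholders): a ".cer" file is just an X.509 certificate in either PEM (base64 text) or DER (binary) encoding, and IE accepts both, though DER is the safer bet for import dialogs.

```shell
# PEM -> DER, renamed to .cer for the Windows import dialog:
openssl x509 -in server.crt -outform DER -out server.cer
# And back again, DER -> PEM:
openssl x509 -in server.cer -inform DER -out server.pem
```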
After a few hours of work I can connect to FTP and download files from the server, but I cannot upload files or create directories. I checked my configuration file several times and could not figure it out.
I'm configuring a new public mail server running CentOS 5.5 and Sendmail 8.13.8. I would like to verify that the Sendmail configuration is correct. The server will handle many domains using Virtualmin. Everyone can send mail only if authenticated, which I have already tested.
How can I upload a configuration file? I receive an error if I try to upload a zip file.
Well, I have vsftpd working flawlessly in a file browser or FTP client: I can log in, upload and download. But I cannot upload or download through a web browser. I have tried IE8/9, Firefox, Iceweasel and Epiphany to no avail.
I need to upload a single file to FTP server from Ubuntu. This operation should be done in a script (in non-interactive mode). What is the right syntax for ftp?
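A sketch of the usual non-interactive idiom, assuming the stock ftp client; host, credentials and paths are placeholders:

```shell
#!/bin/sh
# -i disables interactive prompting, -n suppresses auto-login so the
# here-doc can supply the credentials itself, -v is verbose.
HOST=ftp.example.com
FTPUSER=myuser
PASS=mypassword
ftp -inv "$HOST" <<EOF
user $FTPUSER $PASS
cd /remote/dir
put /local/path/file.txt
bye
EOF
```

The here-doc is fed to ftp as if it were typed at the prompt, so any sequence of ftp commands (binary, mput, etc.) works the same way.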
The browser can't find the server at att.yahoo.com, so no Internet. My Folding@home client with Stanford can't download (an upload went OK). I have 2 other Fedora boxes and 3 Windows boxes through the same router and they are all fine.
I can manually ping Stanford OK, and Add/Remove Software within Fedora works OK. I can type in 192.168.0.1 and get the page for my router. The only thing I did between working and not working was to install the Nvidia CUDA driver for my GTX275.
My guess is something in the firewall got tweaked, but I've compared it to the 2 working boxes and nothing jumps out at me.
I have been trying to make it possible to upload images to my website via a web browser. I have created the HTML form, and I can browse to my desktop and upload an image. But my question is: where is it on my server after I upload it? I am running 8.04 on my server.
I configured a non-anonymous FTP server on my Ubuntu 10.04. Downloading and uploading work through third-party software like FileZilla. Now I want to upload and download the FTP content in the browser itself, without using any other software. I heard that using Webmin I can share FTP uploads and downloads through the browser.
I am new to Linux and decided to try out Ubuntu. Love it so far. Running Ubuntu 64-bit 10.04 on a Toshiba Satellite L300. Network browsing worked fine to both my other machines (one running XP, the other Win7). The next day I switched the laptop on, and the other two systems don't show automatically in the File Browser under Network like they used to. It now only shows a Windows Network icon which, when I click on it, gives the following error: Unable to mount location, Failed to retrieve share list from server. I can access the machines' shared folders perfectly through the terminal using smbclient.
I can't access any of the sub-folders or files in my home folder on Ubuntu. I've checked the folder associations, that doesn't seem to be the issue. I've also opened the mimeapps.list file and the inode/directory association seems fine - inode/directory=nautilus-folder-handler.desktop;
I'm running Intrepid (8.10) (please don't ask me to upgrade!) and the issue started after using Qtpfsgui 1.9.3 a couple of times to create HDR images. I guess Qtpfsgui broke an association somewhere, but where? I can access other folders, on Computer and Filesystem, but not in my home folder.
I've got used to using the ftp command from the terminal, which is useful, especially with macros. But it requires user input, and what I want to do now is upload a specific file to a server once I've finished working with it every day. It's the same file every day. I would like to be able to do this semi-automatically: I just give the command, and it connects to the server and uploads the file. (I will probably want to encrypt the file before uploading it.) I don't know how I could use ftp without any user input: I want it to be automatic.
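One way to get there, sketched with placeholder host and credentials: curl can read FTP credentials from a netrc file, so the whole upload becomes a single command and nothing sensitive appears on the command line (the encryption step is optional):

```shell
# One-time setup: credentials in a private netrc file.
cat > ~/.upload-netrc <<'EOF'
machine ftp.example.com
login myuser
password mypassword
EOF
chmod 600 ~/.upload-netrc

# The daily command (optionally run "gpg -c daily.txt" first and
# upload daily.txt.gpg instead):
curl --netrc-file ~/.upload-netrc -T daily.txt ftp://ftp.example.com/incoming/
```

Once this works by hand, the same line drops straight into a cron entry or a shell alias.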
I am using a Linux Fedora 12 box with an L7 filter and proxy as the main firewall for my system, which is composed of several hundred PCs. Port 80 is open only for certain MAC addresses; that is, only a few of these computers have access to the Internet and the others have been denied. However, they all have access to two specific websites on the Internet.
I would like to know: if there is a virus attack through these websites in the form of executable adware or malware, can this Linux firewall detect any information that might be directed out of those computers to the attacking source? In other words, is there a tuning in the L7 filter, or any other filter, that can detect transfer of files or some bytes through port 80 unrelated to normal HTTP requests?
I have a remote VPS with 9.10 installed and would like to host some files on it. I'd like to be able to download the files from a browser using a login name and password.
I want to be able to create directories and upload files (images mostly) via a PHP web page. The directory structure is a throwback to Windows, and I really, really don't want to have to change it because there are so many files/links already there.
/cust/cust_name/site/version/web (all html/php files go here)
I want to be able to edit the files with a 3rd-party tool (SSH-based). These are small orgs - my church, a local community club, a sports team, etc. - so file ownership needs to stay in sync with the editor, not apache.
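A sketch of the usual group-ownership approach, run as root; the group name "web" and the paths are assumptions. Files stay owned by the human editors, and Apache only needs read access through the "others" bits:

```shell
groupadd web
usermod -aG web editor1                    # repeat per editor account
chgrp -R web /cust
chmod -R g+rwX,o+rX /cust                  # group read/write, others (apache) read-only
find /cust -type d -exec chmod g+s {} +    # setgid: new files inherit the "web" group
```

The setgid bit on the directories is the key piece: anything an editor creates over SSH automatically belongs to the shared group, so no per-file chown is needed afterwards.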
I just installed Fedora 12 on a Core i3 machine... everything looks fine, but I have a huge problem... every time I upload a file (using ftp or sftp) some weird characters are included inside the file... for example.
I am testing my FTP server configuration. Anonymous download works; however, anonymous upload does not. I am getting the following error message from both Windows and Linux 5.4 clients: 553 Cannot create file. And I am running Fedora 12.
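For what it's worth, a 553 on anonymous upload usually means the daemon (assuming vsftpd here) isn't configured to allow it, or the ftp user can't write to the target directory. A sketch with Fedora-typical paths as assumptions:

```
# /etc/vsftpd/vsftpd.conf
anonymous_enable=YES
write_enable=YES
anon_upload_enable=YES
anon_mkdir_write_enable=YES
# The upload directory must be writable by the "ftp" user, e.g.:
#   mkdir /var/ftp/incoming && chown ftp /var/ftp/incoming && chmod 730 /var/ftp/incoming
# On Fedora also check the SELinux boolean, e.g.:
#   setsebool -P allow_ftpd_anon_write=1
```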
I have been trying to complete the following project: 1) configure an FTP server where we can upload and download files; 2) the server must start at 9 PM and stop at 9 AM automatically. Although the first task was easy, I have no idea how to accomplish the second (not to mention I'm a new user).
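The second task is a job for cron rather than the FTP server itself. A sketch of root crontab entries (crontab -e), assuming vsftpd and the service(8) wrapper; adapt to your init system:

```
# m  h   dom mon dow  command
0   21  *   *   *    /sbin/service vsftpd start
0   9   *   *   *    /sbin/service vsftpd stop
```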
In Nautilus I select a directory on a local NTFS volume. I'm logged in as root, right-click > Properties > Permissions, and I set "Others" to "None". But it doesn't work. I want my friends and visitors to use and enjoy Ubuntu, but without access to my NTFS volumes.
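A likely explanation: NTFS has no native Unix permission bits, so chmod/chown from Nautilus silently fails; access rights are fixed when the volume is mounted. A sketch of an /etc/fstab line (the UUID and mount point are placeholders):

```
UUID=XXXX-XXXX  /media/windows  ntfs-3g  uid=1000,gid=1000,umask=077  0  0
```

umask=077 makes the whole volume accessible only to the user with uid 1000; everyone else gets permission denied.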
I am trying to upload some pics to my Facebook account using Firefox. When I click on Facebook's file upload icon, Firefox brings up a 'File Upload' window. I noticed that a smaller image file is previewed in the lower right-hand corner, while a bigger image file is not. Is there any way I can change this behavior, or maybe change what Firefox uses to browse my files?
I am facing another problem with the Squid server: the IPs that pass through Squid are unable to update the Microsoft Security Essentials they are maintaining on their Windows 7 PCs.