Ubuntu Servers :: Add Files And Make Chmod A Little Easier
Sep 13, 2010
I have set up a home server with Ubuntu Desktop, since I'm new to this and need the GUI. I have no problems with the LAMP package or setting it up. My question is: can I set up an FTP server and then connect to it from a Windows PC on the same network with a client like Core FTP? I want to do this to add files and make chmod a little easier, if possible.
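If it helps, here is a minimal sketch of that setup, assuming vsftpd as the FTP server (any FTP daemon would do):
Code:
# install the FTP daemon
sudo apt-get install vsftpd
# in /etc/vsftpd.conf, let local users log in and upload:
#   local_enable=YES
#   write_enable=YES
sudo service vsftpd restart
Core FTP on the Windows box can then connect to the server's LAN IP on port 21 with a normal user account, and most Windows clients, Core FTP included, expose chmod through the remote file's properties dialog.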
Can this be done the way some other OSes handle it, so Nouveau doesn't pre-claim the kernel before third-party drivers load? It would be great in F16 if we were given a choice about wanting Nouveau or not. I simply cannot get F14 or F15 to work with the 270.xx drivers on recent equipment.
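For what it's worth, the usual workaround on Fedora is to blacklist Nouveau by hand before installing the third-party driver; a minimal sketch (run as root):
Code:
# keep nouveau from loading at boot
cat > /etc/modprobe.d/blacklist-nouveau.conf <<'EOF'
blacklist nouveau
options nouveau modeset=0
EOF
# rebuild the initramfs so the blacklist applies early in boot
dracut --force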
I was trying to make a script to redo a chmod on a folder every 15 minutes by using crontab: */15 * * * * /(link to script.sh)
Now, I didn't know much about the use of scripts, so I just wrote in the command I would use in a terminal: chmod -R 664 /data/
I tried to run the script in a terminal, and now I can't see the files in that folder. But according to the folder's properties, I am still using the disk space where the data used to be, so the data in the folder doesn't seem to be deleted.
The main thing I want is how to get the data back; scripting and the rest can come later.
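In case it is the same trap others have hit: chmod -R 664 also strips the execute bit from the directories themselves, and a directory without x cannot be entered or listed, which makes it look empty even though the files are intact. A minimal sketch of a fix, assuming the directories should end up mode 775:
Code:
# restore the execute bit on directories only; file contents are untouched
find /data -type d -exec chmod 775 {} +
# or: a+X adds execute to directories, and to files already executable by someone
chmod -R a+X /data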
I just finished setting up my Ubuntu home server. I installed LAMP and it works beautifully. The problem is that every time I upload a file through FTP to the server, the file's permissions change even though I did chmod -R 755 www. So every time I upload a file to my server I need to run the command chmod -R 755 /var/www.
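The usual culprit is the FTP daemon's umask rather than anything under /var/www itself; vsftpd, for example, defaults local_umask to 077, which turns uploads into mode 600. A sketch of the fix, assuming vsftpd (other daemons have an equivalent setting):
Code:
# /etc/vsftpd.conf -- uploads land as 644 (files) / 755 (directories),
# which is enough for Apache to read them
local_umask=022
Restart the daemon afterwards for the setting to take effect.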
I have spent the last 2 hours trying to get this to work and it is driving me crazy. I have an 11.04 box and have set up some ZFS filesystems for data storage. I have 2 users and have created a group called media and added both of the users to the group. I have changed the group of the directory to media and have set chmod g+s:
Code:
root@saturn:/tank/data# ls -l
total 8
drwxrws--- 2 root media 2 2011-06-18 13:59 Backups
drwxrws--- 2 root media 2 2011-06-18 14:26 Music
drwxrws--- 2 root media 2 2011-06-18 12:44 Pictures
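If the goal is for both users to be able to modify each other's files, g+s only keeps new files group-owned by media; whether they come out group-writable is still decided by each user's umask. A default ACL can pin that down; a sketch using the Music directory from the listing (assuming the ZFS build in use supports POSIX ACLs, which is worth verifying first):
Code:
# new files inherit group rwx regardless of the creator's umask
setfacl -d -m g::rwx /tank/data/Music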
I made a Bash script that is fired by a cron job every morning. It dumps an SVN backup to a Samba shared drive. I would like to know how I can make sure the job worked correctly without having to check the shared drive every morning. Right now, I take the job's output, save it to a log file and send the file by email, but the output isn't very useful.
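One pattern that helps is to test the exit status and the resulting dump explicitly, and only send mail when something is wrong. A minimal sketch with made-up paths and addresses:
Code:
#!/bin/bash
# hypothetical repository and share paths -- adjust to the real ones
DUMP=/mnt/samba/svn-backup-$(date +%F).dump
if svnadmin dump /srv/svn/repo > "$DUMP" 2>/tmp/svnbackup.err && [ -s "$DUMP" ]
then
    echo "svn backup OK: $(du -h "$DUMP" | cut -f1)"
else
    # mail the captured stderr only on failure
    mail -s "SVN backup FAILED" admin@example.com < /tmp/svnbackup.err
fi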
I've been searching the web without finding any solution to my problem. vsFTPd is acting really weird; I've never seen this problem before, and I've been using vsftpd for some years now. The thing is, I've made a user that chroots to the folder /var/www on my server. When I then try to chmod the file /var/www/htdocs/testsite/index.html through my FTP client, I only get the error "550 SITE CHMOD command failed.", and when I check /var/log/vsftpd.log it says:
Code:
FAIL CHMOD: Client "192.168.50.58", "/htdocs/testsite/index.html 777"
which I think means it tries to chmod "/htdocs/testsite/index.html" instead of the actual file under /var/www.
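For what it's worth, the path in the log may be a red herring: inside a chroot, the client-visible path is relative to /var/www, so "/htdocs/testsite/index.html" is the intended file. One setting worth ruling out is whether SITE CHMOD is allowed at all:
Code:
# /etc/vsftpd.conf -- SITE CHMOD is refused unless this is on
# (the default is YES, but some distro configs disable it)
chmod_enable=YES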
I have a separate data partition on my F12 box with one directory for my children and subdirectories for each of them. Because they had no rw- rights, and because they sometimes use one of the other logins to do school work, I changed the permissions on their directory so that anyone has access. I used:
Code:
# chmod -R 666 [their directory]
After that, Nautilus displayed an empty folder, even with 'show hidden files' on. However, with
Code:
ls -lh
on the dir and subdirs all the files seem to be present (luckily).
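This looks like the same missing-execute-bit symptom as the /data case above: mode 666 removes the x bit a directory needs to be entered, so nothing is actually gone. A sketch of a fix, substituting the real path (777 matches the "anyone has access" intent here; pick a tighter mode if that was too broad):
Code:
# give directories their execute bit back without touching file modes
find /path/to/their-directory -type d -exec chmod 777 {} +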
I have a Qnap 219p NAS to which I have connected a USB external hard drive. I can access the external hard drive from my Windows box using the network share, but at first I couldn't access the folders. The permissions set in the NAS GUI for the external drive are correct and identical to the permissions set on the 2 internal drives.
I SSH'ed to the NAS and used 'chmod -R 770 /share/external/sds1'; this granted me access to the folders and some files. I can open all files in the root, but if I go just 2 folders deeper, I can't open the files in that folder, or in the folders after that.
In SSH, if I navigate to the folder where I cannot open the files and use 'ls -l', I can see that the permissions (770) haven't been applied to these files. How can I get chmod to apply the 770 permissions to all files, folders, subfolders, and files in subfolders, without having to chmod every folder one by one?
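chmod -R normally recurses all the way down, so if deep files are unchanged it is either stopping on errors or the external drive uses a filesystem (FAT/NTFS) that ignores Unix modes. A sketch that applies the mode item by item and keeps any errors for inspection:
Code:
# apply 770 to everything below the share; errors (if any) end up in the log
find /share/external/sds1 -exec chmod 770 {} + 2>/tmp/chmod-errors.txt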
I want to install the VocProc auto-tuner. I'm having problems; how do I do this exactly? Does anybody have some step-by-step terminal commands? I've included the link to it: URL... Or does anybody have a better auto-tuner that's easier to install?
I am using livecd-creator with the fedora-livecd-desktop.ks file to create live CD/USB images. The image that is created includes several languages (some are listed below). How can I include only one or two specific languages? Will I have to specify each language group to remove? If so, what is the syntax for yum?
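In a kickstart %packages section, groups can be excluded with a leading minus, using the same group names yum reports; a sketch (the group names here are examples, check 'yum grouplist' for the ones the kickstart actually pulls in):
Code:
%packages
# drop unwanted language-support groups
-@french-support
-@german-support
-@japanese-support
%end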
I use a long mount command to mount a NAS drive but have to retype it every time I need to mount the drive. Because it is on my laptop, I only need to mount the drive from time to time.
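An /etc/fstab entry with noauto reduces the long command to a short one; a sketch with made-up share names (CIFS assumed, but NFS works the same way):
Code:
# /etc/fstab -- 'noauto' skips it at boot, 'user' lets a normal user mount it
//nas/share  /mnt/nas  cifs  noauto,user,credentials=/home/me/.nascreds  0  0
After that, mounting is just 'mount /mnt/nas'.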
An application I'm attempting to install (URL...) requires a version of gcc above 4.3.2; however, the only rpms for CentOS I can find are 4.1.2. I began trying to install a newer version by compiling the latest one from the GNU gcc site, but I started running into problems, especially in installing newer versions of gmp and mpfr (apparently upgrading from certain versions of gmp will overwrite the header files but won't change the path locations for the lib files?). After examining the problem a little closer, I got worried that I was going to end up making the system unstable, so I stopped fiddling.
So my question is: is there an easier way to install a newer version of gcc? I'm not completely new to Linux, but I'm far from a master of the system, so I'd appreciate it if anyone knows an easier, slightly more foolproof way of upgrading gcc.
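A lower-risk route is to build gcc into its own prefix, so nothing under /usr is replaced, and to let gcc's own helper script fetch gmp/mpfr versions known to match (recent gcc releases ship it). A sketch with placeholder version numbers:
Code:
tar xf gcc-4.x.y.tar.gz && cd gcc-4.x.y
# builds matching gmp/mpfr in-tree, leaving the system copies alone
./contrib/download_prerequisites
mkdir ../build && cd ../build
../gcc-4.x.y/configure --prefix=/opt/gcc-4.x --enable-languages=c,c++
make && sudo make install
# invoke it explicitly as /opt/gcc-4.x/bin/gcc, or put it first in PATH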
I've been trying to figure out a way to more easily color text in Perl like I do in Bash on a Linux box. In Bash, what I'll do is set color variables to the escape sequences, then echo out with those escape sequences to print it exactly how I want it. Typically I'll want a character or a word in a different color, not the whole line. For example:
Code:
RED=$'\e[31m'; UNCOLOR=$'\e[0m'
echo -n -e "My face is turning ${RED}red${UNCOLOR} like a lobster."
In Perl, with the Term::ANSIColor module, it seems to just do a whole line. Am I being dense? Is there a way that I can do it like I do it in Bash that's fairly easy to read after the fact?
I've got a folder that I transfer stuff to all the time. The folder is chmod 775, but when I upload folders and files, they are given chmod 700; I want them to be 775 every time I upload something. So far I have logged in to my Linux computer and done a chmod -R 775 on the folder every time I uploaded something to it. Is there a setting somewhere to make it 775 every time I upload, or can I have something run a script, so I don't have to go in and type it every time I upload something?
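Uploads arriving as 700 point at a 077 umask on the daemon doing the receiving. If the transfers come in over SFTP, one sketch is to set the umask on the sftp subsystem (the -u flag needs OpenSSH 5.4 or newer; FTP daemons have their own umask settings, as noted above):
Code:
# /etc/ssh/sshd_config -- uploads land as 775 (dirs) / 664 (files)
Subsystem sftp internal-sftp -u 0002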
I need to modify fs/open.c and fs/read_write.c to make my modifications, but I cannot find any options in 'make menuconfig' to build these files as modules rather than built-in elements. I'm thinking these cannot be modules because the filesystem won't work without open.c and read_write.c. Is this correct, that I cannot compile fs/open.c and fs/read_write.c as modules, only as built-in code? Or is there some way for a module to override these routines when the module is installed and re-enable them when the module is removed?
I downloaded Ubuntu Server and I absolutely love it. It is easy to set up and I've already learned a lot while doing so.
Now, I want to get a bit further. I want my Ubuntu Server to be accessible from the internet, so I can access my files and webpages from everywhere.
Everything works locally, but I don't have any clue how to make it visible from the WAN. Is everything from here set in the router settings, or will I also have to make adjustments in the server's settings?
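Broadly, the server half is already done if it works locally; the router needs port forwards pointing at the server's LAN address, and the outside world connects to the router's public IP (or a dynamic-DNS name for it). A quick sketch for the server-side checks (ports assumed):
Code:
# confirm the services listen on all interfaces, not just 127.0.0.1
sudo netstat -tlnp | grep -E ':(22|80) '
# note the LAN address to use as the port-forward target in the router
ip addr show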
I posted this yesterday, but my post completely disappeared (I looked high and low; nothing). I am using Ubuntu Server 10.04 with all the latest updates. For an FTP server, I use ProFTPD.
One specific directory, and its subdirectories, on my server will not download at a reasonable rate; they move at about 17-50 KB/s. All other folders work fine, at around 1.5-2.5 MB/s.
What is going on? I have no idea how to troubleshoot this. The files being transferred are in a directory under home. They should have no permission issues (I've already reapplied the permissions I want), I tried restarting ProFTPD, and the files vary in size (from a few kilobytes to about 120 megabytes). I use Webmin for most web management.
Neither my network card nor my CPU shows any sign of overload while downloading these files. They are being accessed from the local network.
This issue is taxing because the files in question are backup files.
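When a single directory is slow and everything else is fast, the disk area holding it is a prime suspect; a rough sketch for taking FTP out of the picture (file names are made up):
Code:
# raw read speed from the slow directory, bypassing ProFTPD entirely
dd if=/home/user/backups/bigfile of=/dev/null bs=1M
# compare against a file from a directory that downloads fast
dd if=/home/user/other/bigfile of=/dev/null bs=1M
If dd is equally slow, it is a disk or filesystem problem; if it is fast, the focus shifts back to the ProFTPD configuration.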
I'm hosting my own dedicated server on Ubuntu Server 10.10. I have it set up with a static local IP, and I've configured DynDNS to link up with my router and allow my server to go live on the internet. I have all the appropriate ports open, with the exception of port 80; that port is blocked by my ISP (Charter) and I can't use it. Because of this, I configured my router to listen on port 81 and direct it to my server.
So, in order to view it, you need to go to the IP XXX.xxx.XXX.xxx:81. Today, I registered www.online-self.com in hopes of getting around my current hostname provided by DynDNS.com (omegame.selfip.com). So here is my dilemma: when I go to the host of my domain name, I want to point my DNS at my server's IP.
I can't seem to do it, though; they want a bare IP address, no port suffix. How do I get around this so that my domain name and IP address link up? I'm thinking I may be missing a step, or maybe I needed to register a domain name that simply redirects? I'm starting to get confused about what I should do next. Can I even do this?
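DNS can only map a name to an IP address; an A record has no field for a port, which is why the registrar refuses ':81'. The usual workarounds are the registrar's URL-forwarding feature, or a tiny redirect page hosted anywhere that can serve port 80; a sketch of the latter, using the hostname from the post:
Code:
<!-- redirect.html -- host on any machine reachable on port 80 -->
<meta http-equiv="refresh" content="0; url=http://omegame.selfip.com:81/">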