Software :: Combine Remote Shares To One Large Contiguous Directory?
Feb 17, 2010
I know it is possible to do... but I am not sure how to go about the whole thing. Here's the scenario. I run a lab. Lots of PCs. As time goes by, the older ones don't have the memory or disk space to run more modern apps. But I want to put them to use...
What I am trying to do, and have started, is the following: 1. Install Linux on a bunch of them and make a share on each. I've already installed FreeNAS on four machines (let's call these ClientA, ClientB, ClientC, and ClientD) and have made all the available disk space on each into its share.
2. Install Linux on a fifth machine (call this Machine1), and on this machine combine over the network all the shares from ClientA, ClientB, ClientC, and ClientD into one large "virtual" directory on Machine1. I know this is doable, but what I hope for is that the total disk space from all the machines in step 1 is combined for the purposes of saving files. I'm not sure which file system to use. For example, if the other four machines have 2GB of space each, I want to be able to save a 7GB file.
3. And then allow sharing of this one large directory using Samba.
4. Then allow lab users (not on any of the above-mentioned machines) to access the Samba-enabled large shared directory on Machine1 to read and write files. The users will have no idea that the files are not actually on Machine1, or that they may be segmented in some way, nor should they care.
I understand the risks (if any one of ClientA, ClientB, ClientC, or ClientD goes down, I'll probably lose everything). I am considering throwing mirroring into the mix (mirror Machine1's large directory), but that can wait.
So in the above scenario, what file system can I use on Machine1 to combine all the shares from ClientA, ClientB, ClientC, and ClientD to make one large "virtual" directory?
I've looked at UnionFS, but from my understanding while it combines directories, the maximum file size is the size of the largest share. Is this true?
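One way this is commonly approached is to mount the four shares on Machine1 (NFS or CIFS, whichever FreeNAS exports) and pool the mount points with something like mhddfs; a minimal sketch, where the hostnames, export paths, and mount points are placeholders rather than anything from this setup:
Code:
# mount the four remote shares (assumes NFS exports named /mnt/share)
sudo mkdir -p /mnt/clientA /mnt/clientB /mnt/clientC /mnt/clientD /srv/pool
sudo mount -t nfs clientA:/mnt/share /mnt/clientA
sudo mount -t nfs clientB:/mnt/share /mnt/clientB
sudo mount -t nfs clientC:/mnt/share /mnt/clientC
sudo mount -t nfs clientD:/mnt/share /mnt/clientD
# pool the four mounts into one virtual directory; new files land on whichever
# branch has room, but a single file still cannot exceed one branch's free space
sudo mhddfs /mnt/clientA,/mnt/clientB,/mnt/clientC,/mnt/clientD /srv/pool -o allow_other
Note that mhddfs, like UnionFS, places each file entirely on one branch, so the 7GB-file goal would still need something that stripes data across machines (a clustered filesystem such as GlusterFS, for example) rather than a union of directories.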
View 3 Replies
Jan 9, 2010
I ran fsck on one of my backup drives, one that is almost full, and I'm not quite sure if that drive is going bad or not.
It's running ext3 and has around 1.6TB used (2.0TB drive).
This is what the output was:
Code:
charles@thor:~$ sudo fsck -rV /dev/sdc1
fsck from util-linux-ng 2.16
[/sbin/fsck.ext3 (1) -- /dev/sdc1] fsck.ext3 -r /dev/sdc1
e2fsck 1.41.9 (22-Aug-2009)
I found some info about how ext3 is supposed to avoid fragmentation on its own, and that it cannot handle large files very well. Source.
Kinda makes me wonder how badly fragmented my 4TB partition is going to be. >.<
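For a non-destructive look at how bad things actually are, e2fsck can be run read-only (its summary reports a "non-contiguous" percentage) and filefrag reports per-file fragmentation; the paths below are placeholders:
Code:
# read-only forced check; -n answers "no" to every prompt, so nothing is modified
# (unmount the partition first)
sudo umount /dev/sdc1
sudo e2fsck -fn /dev/sdc1
# once remounted, check fragmentation of an individual large file
sudo filefrag -v /media/backup/some-large-file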
View 5 Replies
View Related
Apr 4, 2010
I have a RHEL 5 server. The ABC directory is 57GB; after taking a backup on the same disk under the name ABC.bkp, it shows as 56GB. I used the command below to copy/back up: # cp -r ABC ABC.bkp (different sizes after copying). I checked both directory sizes with # du -sh ABC and # du -ks ABC.bkp; in both GB and KB there is a sizeable difference (about 200MB). Why does this happen when copying? What is the solution? What is the correct way to copy one directory to a new directory exactly?
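A likely explanation is that cp -r does not preserve all metadata (timestamps, ownership, hard links), and du measures allocated blocks, which can differ between original and copy even when the data is identical. For an archive-style copy plus a verification pass, something along these lines is the usual route:
Code:
# archive copy: preserves permissions, ownership, timestamps, symlinks, etc.
cp -a ABC ABC.bkp
# or with rsync, which also preserves hard links and prints a summary
rsync -aH --stats ABC/ ABC.bkp/
# confirm the contents really are identical regardless of what du reports
diff -r ABC ABC.bkp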
View 4 Replies
View Related
Jun 18, 2010
I have Cygwin on Windows XP running rsync to a remote Ubuntu server over SSH using ADSL. My data set is about 20GB! But Cygwin will back up incrementally, so after the first backup the process should be relatively quick. With ADSL, the first backup will take too long. I was thinking about doing the first backup by copying the files to an external hard drive, then attaching the hard drive to my remote server and copying the files there. The idea is that rsync will pick up the files as if it had created them in the first instance, and the incremental backups will then pick up from there.
Does anyone have any experience with this and/or can provide any advice? The external HD is FAT32, which is okay with Windows and should be okay with Ubuntu? From XP, right-click copy and then paste keeps the file dates intact on the external HD; is this enough to get rsync going incrementally?
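The seed-then-sync approach generally works; the follow-up rsync run just needs options that tolerate what FAT32 and a Windows copy lose (2-second timestamp granularity, no permissions). A sketch, with paths and hostname as placeholders:
Code:
# --modify-window=1 tolerates FAT32's 2-second timestamp resolution;
# add --size-only (or -c for checksums) if the copy did not keep dates intact
rsync -rtvz --modify-window=1 /cygdrive/c/data/ user@server:/backup/data/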
View 1 Replies
View Related
May 6, 2010
Every once in a while on a computer I'm ssh'd into, I will accidentally type "cat largefile.txt" and my screen will start rushing with text for the next 10 minutes. I'm always working in a screen session, so my current solution is to just log out and then log back in; since the output goes 100x faster when I'm logged out, it finishes in the short time it takes me to type my password again. Is there a better way, either using the fact that I'm in a screen session, or a way to do this within SSH? What doesn't work: detaching from the screen session (doesn't respond until the file is done outputting), trying the command to move to a different window in the screen session (also doesn't respond), typing Ctrl+C to kill the cat command (also doesn't respond, probably because the command is already done and the buffers just have to catch up).
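Prevention tends to be easier than interrupting the flood: paging the file, or printing only part of it, never fills the terminal buffers in the first place.
Code:
# page through the file instead of dumping it; press q to quit instantly
less largefile.txt
# or look at just the beginning or the end
head -n 50 largefile.txt
tail -n 50 largefile.txt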
View 3 Replies
View Related
Feb 7, 2010
We have an existing Windows 2000 network that I am trying to add an Ubuntu 8.04 server to. I have put links into the windows domain DFS to the linux machine's samba shares.
The shares work fine for local users who are physically on the same network (192.168.0.x). Remote users from other offices, or those dialing in with a VPN client, cannot access these particular folders off the DFS. However, they can map them directly from the Ubuntu server.
View 5 Replies
View Related
Jan 21, 2010
If I wanted to back up a large directory (13 GB) to DVD, what would be the best way to do this? Basically, what is the easiest way to make an archive that is split into volumes small enough to burn to disc?
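The usual trick is to pipe tar straight into split so no single huge archive ever has to exist; the size and names below are only examples:
Code:
# compressed archive cut into DVD-sized pieces
tar -czf - /path/to/directory | split -b 4300m - backup.tar.gz.part-
# to restore later, concatenate the pieces and untar
cat backup.tar.gz.part-* | tar -xzf -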
View 3 Replies
View Related
Oct 9, 2010
This thread was nearly titled "The volume Filesystem root has only 128 KB free space remaining", then I discovered the cause: my Encrypted Private Directory had grown to 20GB, eating all the free space on my Ubuntu system partition. Here's what happened: all was well with my system last night; I left it downloading 2GB of files from the internet to an NTFS drive, and returned to low-space errors this morning. I checked and nothing had been downloaded to my Ubuntu partition, and even if it had, it could have handled the 2GB without issue. Did some reading on here, and the first step I tried found the problem:
Code:
mark@media:~$ df -h
Filesystem Size Used Avail Use% Mounted on
[code]....
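To track this kind of disappearing space down, du run per directory usually points at the culprit; with the Encrypted Private Directory feature the underlying data lives in ~/.Private. The paths below assume a standard Ubuntu ecryptfs layout:
Code:
# what is using space on the root filesystem only (-x stays on this filesystem)
sudo du -xh --max-depth=1 / | sort -h
# the encrypted backing store for the Private directory
sudo du -sh ~/.Private ~/Private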
View 2 Replies
View Related
Feb 10, 2010
I have a directory containing around 2.8 lakh (280,000) files. I want to move them to another directory. If I use cp or mv, I get the error 'argument list too long'. If I write a script like
for file in $(ls); do
    cp "$file" /path/to/destination/
done
then, because of the ls command, its performance degrades. How can I do this?
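The limit being hit is the kernel's argument-list size, not cp itself, so the fix is to hand the file names to the command in batches instead of expanding them all on one command line; the paths below are placeholders:
Code:
# xargs batches the arguments; mv -t (GNU coreutils) puts the target directory
# first so many source files can be appended after it
find /path/to/source -maxdepth 1 -type f -print0 | xargs -0 mv -t /path/to/destination/
# or let find do the batching itself
find /path/to/source -maxdepth 1 -type f -exec mv -t /path/to/destination/ {} +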
View 7 Replies
View Related
Oct 18, 2010
I'm having a bit of an issue with Lucid installed via Wubi. I stuck the OS on its own partition (30 GB in size), and don't store any large files in the Ubuntu file system (when I download something large I move it to another hard drive.) I don't have anything wacky or esoteric installed on my system.
I've been consistently having a problem where, after a few hours or a few days of being booted up, Ubuntu warns me that my available HD space is dangerously small. The amount of available HD space Ubuntu sees then shrinks from a few GB to nothing within a few minutes, and the only way I can seem to solve this is to reboot. Taking a closer look at what's happening, my Home folder balloons in size until there's no more writable space recognized. But there are no files being created or added to, so it looks like there's a bug of some sort. This SEEMS to be correlated with watching videos (or maybe it's the pulling of large files from a mounted directory into RAM? My videos are all on another HD, as mentioned before). I can generally go a few days without getting the "low space" message, but I can't seem to make it through a full 2-hour movie without getting the error.
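Space that vanishes with no visible files growing, and comes back after a reboot, is often a process holding deleted files open; a quick diagnostic sketch:
Code:
# compare what df reports with what du can actually find on disk
df -h /
sudo du -sh /home
# files that are deleted but still held open by some process; their space
# is not released until that process exits (which a reboot forces)
sudo lsof +L1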
View 3 Replies
View Related
Mar 19, 2011
I'm trying to crawl a directory on a website and basically download everything in it. The structure is simple enough (though there are multiple folders), but there is one thing that makes wget choke up. Both of the links work, but they are both the same thing, so wget will download the same file twice. How can I make wget ignore the first one? What I tried doesn't seem to actually do anything; it will still download the duplicate URLs.
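Assuming the duplicates are the usual Apache directory-index "sort" links (?C=N;O=A and friends), wget's reject options can filter them out; the URL and patterns below are placeholders, since the actual links are not shown:
Code:
# newer wget (1.14+) can reject by URL pattern
wget -r -np -nH --cut-dirs=1 --reject-regex '\?C=' http://example.com/files/
# older wget can usually get by rejecting the generated index pages instead
wget -r -np -R 'index.html*' http://example.com/files/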
View 1 Replies
View Related
May 19, 2010
I am currently trying to copy a directory of roughly 400GB to DVD and have gotten myself stuck. I tried to tar and then split; however, I don't have enough room on my hard drive to make a compressed tar, split it up, and then burn to disc, so I need a way to tar and compress the directory, split it, and burn to disc every 4.3GB.
I went ahead and installed DAR as an alternative, as I hear it is designed for this type of task, but I can't figure out which way is heads or tails.
My OS is the newest version of Ubuntu 10.x.
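dar's slicing is aimed at exactly this; a rough sketch under these assumptions: archive basename "backup", 4300MB slices, gzip compression, and the source at /path/to/directory:
Code:
# -s sets the slice size, -z compresses, -p pauses before each new slice
# so a finished slice can be burned to DVD and deleted before the next is written
dar -c /tmp/backup -s 4300M -z -p -R /path/to/directory
# slices come out as backup.1.dar, backup.2.dar, ...; to restore later:
dar -x /tmp/backup -R /path/to/restore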
View 5 Replies
View Related
Feb 9, 2009
Something that has been in the pipeline at work for a while is user-based web directories. The main PDCs are running Windows Server 2003 using Active Directory; ideally, users would have a web share under [URL], with the server behind it being Linux (either Fedora or CentOS).
What kind of configuration would be needed for Apache to make this possible? The way I have planned it so far is to have the Linux box authenticate against the AD domain (possibly joined), with Apache set up to share local public_html folders. I'm not sure how to get rid of the tilde from the start of the username, but it should be pretty easy.
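A sketch of the Apache side, assuming Apache 2.2 and home directories under /home; the /users/ prefix is only an example of a tilde-less URL:
Code:
<IfModule mod_userdir.c>
    UserDir public_html
</IfModule>
# serve the same directories without the tilde
AliasMatch ^/users/([^/]+)/?(.*)$ /home/$1/public_html/$2
<Directory /home/*/public_html>
    Options Indexes FollowSymLinks
    AllowOverride None
    Order allow,deny
    Allow from all
</Directory>
Authentication against AD would sit on top of this (mod_authnz_ldap, or Kerberos if the box is joined), but that part depends on how the domain is set up.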
View 1 Replies
View Related
Jul 20, 2010
I am the IT Manager at a research facility. We have a fairly unique network configuration in order to support all of the different projects we have going on. We have Red Hat, Ubuntu, Windows XP/Vista/7, Windows Servers 2003, Ubuntu servers, Red Hat servers, and even a few Netgear ReadyNAS and Buffalo Terastations. Over the last few years, I have been migrating all of my users and accounts to a single ACL list, which I chose to be a Windows AD 2003 server. 95% of my users work on Windows platforms and just use ssh tunnels to develop on our linux boxes.
However, I ran into a problem with our Linux boxes not being able to create symbolic links on my Windows 2003 file shares. Of course, this is a problem with Windows not supporting symbolic links. I know 2008 does support this feature, but given the economy and the budget restraints, we cannot afford to purchase the updates we would need, so now I am moving all of my shares to an Ubuntu 10.04 server using Samba. I have joined the server to my AD domain successfully; I can log in using my AD credentials, and can even assign ownership and group permissions using AD users/groups.
Here is my question.
I would like to keep the AD permission schemes intact. I have several shares that contain folders that have individual permission settings. For example, I have a /shared directory that contains about 50 different folders. Some of these folders I allow my users to write data to, some just read, and others I deny access to complete groups and just allow key groups to access (for example, personnel data should only be accessed by the Administrative staff).
Is there a way to make this work?
I can assign a uid and gid manually per folder in Samba, but I would like to be able to add multiple users and groups with permissions to folders, which I do not believe can be done with the standard chown commands. Currently, I can see the folder permissions from my Windows box, but when I try to edit the permission settings, it defaults back to full access, so my AD permissions are not being saved.
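One way this is usually handled with Samba 3 is to let Windows ACL edits map onto POSIX ACLs, which do allow several users and groups per folder; a sketch, assuming the filesystem is mounted with acl,user_xattr and the share lives at /srv/shared:
Code:
# smb.conf share section (sketch)
[shared]
    path = /srv/shared
    read only = no
    nt acl support = yes
    inherit acls = yes
    map acl inherit = yes
    store dos attributes = yes

# on the Linux side, grant multiple domain groups per folder with setfacl
sudo setfacl -m g:"DOMAIN\admin staff":rwx -m g:"DOMAIN\hr":r-x /srv/shared/personnel
getfacl /srv/shared/personnel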
View 9 Replies
View Related
Jun 24, 2011
I am running CentOS 5.5 with a 14TB ext4 volume. We are sharing out a few sub-directories via NFS. Our customer was doing some penetration testing with our web app, which writes to one of those NFS shares. We are not sure if they did something to cause the metadata to grow so large or if it is corrupt. Here is the listing:
Code:
drwxrwxr-x 1 owner owner 470M Jun 24 18:15 temp.bad
I guess the metadata could actually be that large; however, we have been unable to perform any operations on that directory to determine whether it is just loaded with files or corrupted. We have not run an fsck on the volume because we would need to schedule downtime for our customers to do so. Has anyone come across this before?
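One known ext3/ext4 behaviour worth ruling out first: a directory's on-disk size never shrinks, so a directory that once held millions of entries keeps a multi-hundred-megabyte size even after the files are removed. Counting the entries without sorting them is usually possible even when a plain ls hangs; the path below is a placeholder:
Code:
# ls -f skips the sort, so it streams entries instead of loading them all first
ls -f /path/to/temp.bad | wc -l
# or stream the names with find, limited to this one directory
find /path/to/temp.bad -maxdepth 1 | wc -l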
View 2 Replies
View Related
Feb 3, 2011
We have 2 servers: one is the web server and the other is the MySQL server.
When transferring a 2GB file from the web server to the MySQL server, the web server's connection to the MySQL DB server dies completely.
We need to restart the MySQL process in order for it to come back online.
During this connection downtime, using phpMyAdmin on the MySQL server itself shows no problem running queries, etc.
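If the copy is simply saturating the link or the disk, throttling the transfer is a cheap thing to try; the file name and host below are placeholders:
Code:
# cap the transfer at roughly 10 MB/s (rsync's --bwlimit is in KB/s)
rsync -av --bwlimit=10000 bigfile user@dbserver:/var/tmp/
# scp's equivalent limit is expressed in Kbit/s
scp -l 80000 bigfile user@dbserver:/var/tmp/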
View 2 Replies
View Related
Dec 11, 2008
I run openSUSE 11 at work. I'm trying to see the list of shares on a "server" that is running Windows 2000 Server. If I try smb://server, it doesn't show any shares, but I can browse directly to a share such as smb://server/share1. If I use smbclient, it returns the list of shares correctly. I just don't understand why smbclient shows the list of shared folders but Nautilus cannot.
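For comparison, this is the smbclient query that works, plus one smb.conf tweak that sometimes helps GNOME/Nautilus browsing when NetBIOS name resolution is the culprit (a guess, not a confirmed fix):
Code:
# list the shares from the command line
smbclient -L //server -U username
# in /etc/samba/smb.conf on the client, forcing the resolution order can help:
#   name resolve order = bcast host lmhosts wins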
View 3 Replies
View Related
Jul 28, 2010
I am working as a Linux administrator in a very small data centre with 5 servers, handling the following routine tasks.
1. Managing Samba shares and giving user-specific access to the shares.
2. Scheduling backups of some mount points with rsync to store data on a remote hard disk.
3. User and group administration, with sudo access.
4. Creating and Managing Xen Virtual machines and giving access to other project teams.
5. Automating some tasks with Shell Scripting.
6. Managing FTP server for user uploads.
I have practiced a lot on my home laptop without RHEL training, and have cleared RHCE and LPIC-1. I want to do some advanced system admin tasks, but do not have the option in my current data centre. With the above skills, is it possible to get a job?
View 9 Replies
View Related
May 13, 2010
The company I work for, as usual, is Microsoft-centric. I'm attempting to integrate my Ubuntu server into the domain to allow domain users to authenticate to the server and access file shares using Samba. Here's my current configuration:
[Code].....
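Since the posted configuration is truncated, here is only a generic sketch of the kind of Samba 3 AD-member setup usually used for this (domain names and id ranges are placeholders, not the poster's values):
Code:
[global]
    workgroup = MYDOMAIN
    realm = MYDOMAIN.LOCAL
    security = ads
    idmap uid = 10000-20000
    idmap gid = 10000-20000
    winbind use default domain = yes
    winbind enum users = yes
    winbind enum groups = yes
    template shell = /bin/bash

# join the domain and confirm winbind can see the users
sudo net ads join -U Administrator
wbinfo -u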
View 9 Replies
View Related
Aug 4, 2011
How do I copy a file from a remote host to a local directory, and vice versa, using an ftp bat script file launched from telnet?
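A sketch of the usual Windows pattern: a batch file writes the ftp commands to a temporary script and feeds it to ftp with -s (host, credentials, and file names are placeholders; swap get for put to upload instead):
Code:
rem getfile.bat - drive ftp non-interactively from a batch file
(
  echo open ftp.example.com
  echo user myname mypassword
  echo binary
  echo lcd C:\downloads
  echo get remotefile.dat
  echo bye
) > ftpcmds.txt
ftp -n -s:ftpcmds.txt
del ftpcmds.txt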
View 1 Replies
View Related
Oct 17, 2010
I have a Debian server which I use to hold the home directory for my user account. I used to use Windows 7 and connect to my /home/username directory via Samba, which worked great. I could access all of my files as if they were sitting on my local PC, but they were actually sitting on my Debian server.
Now I have decided to give Ubuntu 10.10 a try (looks promising so far!). One thing I'm not sure how to do is mount my home directory from my server. I am able to open an sftp connection to my server, but not able to access the files natively as /home/username on my local machine. I'm assuming I need to mount my home directory somewhere in my fstab before it starts up, but which protocol should I use? I'm used to Windows networking, but am trying to get more into Linux. Should I use NFS?
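NFS is the usual choice for a trusted home network and can be mounted from fstab at boot; the sketch below assumes a 192.168.0.x LAN and a stock Debian/Ubuntu setup:
Code:
# on the Debian server, in /etc/exports:
#   /home/username  192.168.0.0/24(rw,sync,no_subtree_check)
# then reload the export table
sudo exportfs -ra

# on the Ubuntu client, in /etc/fstab:
#   server:/home/username  /home/username  nfs  rw,hard,intr  0  0
sudo mount -a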
View 2 Replies
View Related
Jan 15, 2010
If you use Nautilus then you can just use "Connect to Server" from the File menu. However, if your file manager does not support connecting to servers (like Thunar), then you can use sshfs.
Code:
sudo apt-get install sshfs
You should create a directory as your mount point, say
Code:
sudo mkdir /media/Server
[Code]....
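The step that was cut off above is presumably the mount itself; a sketch with hostname and paths as placeholders:
Code:
# mount the remote directory onto the mount point created above
sshfs username@server:/home/username /media/Server
# and to unmount again later
fusermount -u /media/Server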
View 1 Replies
View Related
Jul 15, 2010
I have a computer (C1) to which I can connect through the Internet (ssh, for instance); it has a static IP and, though it sits behind a router, the appropriate ports are all forwarded. I have another computer (C2) that doesn't have a fixed IP address and sits behind a router that I cannot fiddle with (so no port forwarding here). I would like to know if there is any way to connect from C2 to C1 such that a directory on C2 would be mounted on C1.
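One approach that works without touching C2's router is a reverse ssh tunnel from C2 to C1, with sshfs on C1 mounting through that tunnel; the port, users, and paths below are placeholders:
Code:
# on C2 (no port forwarding available): expose C2's own sshd to C1 as port 2222
ssh -N -R 2222:localhost:22 user@C1 &

# on C1: mount C2's directory through that tunnel
mkdir -p ~/c2-share
sshfs -p 2222 user@localhost:/path/to/share ~/c2-share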
View 5 Replies
View Related
Jul 21, 2011
I have a folder on my workstation, and I currently have an identical copy on my NAS mounted via ifs (I'll be using this as my backup). The folder contains many virtual machines that are usually powered off. I like to think of the copy on my NAS as a backup. The benefit of having two copies of my VMs is that if I boot up too many on my workstation and the drive starts to become a bottleneck, I can simply boot the VMs from the NAS instead (gigabit Ethernet).
I would like changes I make to my local copy to be reflected on the nas.
I want this to happen in the background, i.e. if I turn off my machine it shouldn't cause a problem; the next time I boot up, it just re-checks the files and continues syncing the two directories.
What is the best tool for the job? Rsync?
I've never really used it before so a few pointers to get me going would be great, or obviously other recommendations if there are any.
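rsync is a good fit here: it only copies what changed, can be interrupted safely, and is trivial to run from cron. A sketch, with both paths as placeholders; note that --delete also mirrors deletions to the NAS copy:
Code:
# one-way sync of the local VM folder to the NAS copy; safe to re-run any time
rsync -aH --partial --delete /path/to/vms/ /mnt/nas/vms/
# an hourly cron entry (schedule is just an example):
#   0 * * * * rsync -aH --partial --delete /path/to/vms/ /mnt/nas/vms/ >/dev/null 2>&1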
View 1 Replies
View Related
Feb 21, 2011
I've set up ssh passwordless logins using keygen etc. before, so I know the routine.
The problem I'm currently having is setting up passwordless logins when I don't have write permission to my "root" on the remote machine; more specifically, the slice provided by a commercial web hosting provider. I can ssh and sftp just fine by keying in the password manually, but since I'm unable to create a .ssh directory in my "root", I'm unsuccessful in scripting logins. What I'm wondering is whether the .ssh directory and its associated security files can be placed in an alternate location, such as the httpdocs directory, and that location passed to ssh as a command-line parameter.
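The client-side files can indeed live anywhere and be passed on the command line; the catch is that the remote side reads keys from wherever that host's sshd_config points (normally ~/.ssh/authorized_keys), and only the hosting provider can change that. A sketch of the client-side part:
Code:
# identity file and known_hosts in a non-default location
ssh -i /path/to/alt/id_rsa \
    -o UserKnownHostsFile=/path/to/alt/known_hosts \
    user@host.example.com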
View 8 Replies
View Related
Mar 10, 2010
I am running an OpenLDAP server on Fedora Core 10 and am now running into a need to get all user data from Active Directory. I have a PHP-based application which will be using that data from OpenLDAP, and it will need to be updated on a weekly basis. How can I do it, and is there any script for this?
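A common pattern is a weekly cron job that pulls the accounts out of AD with ldapsearch and feeds the result to a conversion script; the server, bind account, and base DN below are placeholders:
Code:
ldapsearch -x -LLL \
  -H ldap://ad.example.com \
  -D "syncuser@example.com" -w 'password' \
  -b "dc=example,dc=com" \
  "(&(objectCategory=person)(objectClass=user))" \
  sAMAccountName mail displayName > /tmp/ad-users.ldif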
View 1 Replies
View Related
Jan 29, 2011
I would like to find and back up all *.mp4 files from /Pictures and its sub-directories and move them to a single directory on a remote server. I can find and move the files, but I don't want the directory structure; just the files, placed in the remote directory.
To find my files I use
rsync -r -a -v -e "ssh -l user" --delete --include '*/' --include '*.mp4' --exclude '*' /home/drew/Pictures/ remoteserver:/Users/drew/mp4
but this creates all the subdirectories
I also tried
find ~/Pictures -name "*.mp4" -exec rsync -r -a -v -e "ssh -l user" --delete {} remote:/Users/drew/mp4 \;
This works but takes forever
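A middle ground is to push everything over a single ssh connection and flatten the paths as they are archived; this copies every file on each run (no rsync-style delta), and duplicate basenames will overwrite each other:
Code:
find ~/Pictures -name '*.mp4' -print0 |
  tar --null -T - --transform 's|.*/||' -cf - |
  ssh user@remoteserver 'tar -xf - -C /Users/drew/mp4'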
View 3 Replies
View Related
May 26, 2011
I managed to screw up the ownership of many directories on a remote server whilst installing a piece of new software. Basically, I set all ownership from / downwards to apache:apache.
I spotted the error quite quickly and managed to abort it, but am now unable to change to root to put anything right. Is there any way to restore the ownership of the underlying Slackware to 'factory default', as it were? I had a quick Google and found some links to a script that is supposed to do this, but it appears broken.
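A recursive chown also clears the set-uid bit on binaries such as su, which would explain suddenly being unable to become root. From a root console or rescue media, restoring the critical binaries at least gets root back (the paths below are the usual ones, worth verifying on Slackware); reinstalling the affected packages afterwards, for example with slackpkg reinstall, restores the ownership recorded in the stock package tarballs:
Code:
# first aid from a root shell or rescue media
chown root:root /bin/su /usr/bin/sudo /usr/bin/passwd
chmod 4755 /bin/su /usr/bin/sudo /usr/bin/passwd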
View 8 Replies
View Related
Feb 18, 2011
Notice how big and wide-spaced the fonts in the Clementine playlist are, and how good they look in the appmenu (where my mouse pointer is). This is not because Clementine is Qt4; I have the same problem with Chrome, Opera, etc. I had been messing with System Settings (the KDE settings tool) a day before the fonts became that wide-spaced, trying to make my KDE apps look more native on my GNOME desktop, but I haven't touched the font settings there.
View 9 Replies
View Related
May 24, 2010
I'm trying to set up rsync to back up a remote directory to my local drive.
I cd to the directory that I want to pull the files to, then I enter:
rsync -vrtW account@remote.com:~/public_html
I enter the password then it starts running. I get all the files listed, but none of them actually transfer. What am I missing?
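With a remote source and no destination argument, rsync just lists the source files (the same as --list-only), which matches exactly what is being seen; adding a destination fixes it:
Code:
# note the trailing dot: without a destination rsync only lists the remote files
rsync -vrtW account@remote.com:~/public_html .
# or spell out the local destination
rsync -vrtW account@remote.com:~/public_html/ /local/backup/public_html/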
View 1 Replies
View Related