I need to install Windows on my computer, but I don't want to lose my Linux either. Since my filesystem is ext4, what I wonder is: which filesystem can be used by both operating systems? Do I need FAT32 or NTFS, or can I still use ext4 under Windows? I have a large collection of movies on one partition, and if I have to convert it, that will be kind of a pain.
I have a 2TB USB drive which I use as a backup device - I dump two filesystems onto it, totalling around 1TB. However, doing the dump trashes my F11 system, making it basically unusable, not only during the dump but also afterwards. I have 8GB of RAM, all of which is needed and normally in use, but when dumping, the system starts hogging huge amounts of it as buffer space - up to 1GB of RAM is reported to be allocated. And rather than using free memory for buffer space, it seems to aggressively swap processes out to get it. The system tends to melt down as a result, and just switching virtual desktops can take 5 minutes.
But after the dumps finish, the problems continue - the system is currently trying to keep around 700-800MB free, and continually swapping out processes to do so, even after the buffer space in use has gone back to about 100MB. This seems like strange behaviour for a fairly common type of activity. Presumably a lot of the buffer space is used to store what is being read from the filesystem (which will never be needed again), and some is used to cache the writes to the USB drive which is slower than the internal hard drives.
I have spent a lot of time experimenting with kernel parameters after reading articles about them. Of the ones I've tried, setting vm.dirty_ratio to 1 instead of 5 helps a bit, and setting vm.dirty_background_ratio to 5 instead of 20 makes some improvement, I think. Setting vm.swappiness to 0 doesn't seem to help at all.
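For reference, the tuning described above corresponds to a sysctl fragment like this (the values are the ones from my experiments, not general recommendations):

```
# /etc/sysctl.conf - apply with `sysctl -p`
vm.dirty_ratio = 1               # down from 5; helped a bit
vm.dirty_background_ratio = 5    # down from 20; some improvement
vm.swappiness = 0                # tried, but didn't seem to help
```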
So my question (at last!) is - how can I back up my filesystems without my system dying? In particular, can I limit the space used for buffers somehow, or turn off buffering for the dump process? And why does dumping result in the system artificially keeping huge amounts of space free afterwards, so that I have to reboot to make the system usable again?
I have a 500GB external drive I want to use on a couple of Linux systems, and I'm looking for a filesystem for it. External drives are frequently formatted as FAT32, but I don't need to interoperate with Windows and would rather avoid the ugly, limited kludge that is FAT.
Since I only need to use it on Linux, I would use ext4 or XFS, but they store ownership information. Ideally, I'd use a proper Unix file system that doesn't track ownership (files are owned by whoever mounts the device, as they are when mounting a FAT32 partition), but I don't know of any file system that does that. What would be a good file system for this disk?
Is it possible to write/edit files on an HFS+ drive from Linux? I know I need to disable journaling, but how can I disable journaling from Linux? I don't have access to a Mac.
I've been using *nix systems for many years now, and I've always been led to believe that it's best to partition certain directories into separate filesystems, off the main root FS.
For instance /tmp, /var, /usr, etc., leaving as little as possible on the main / filesystem.
It's so that you don't fill up the root filesystem by accident - by some user putting overly big files in /tmp, for example.
I would presume that filling the / system would not be too good for Linux, as it would not be able to write logs and possibly other things that it needs to.
I believe that if root gets full, there is something like a 5% reserve saved just for 'root' to write to, so that it can do its stuff.
However, eventually, / will become full, and writes will fail.
On top of this, certain scripting tools, such as awk, use /tmp to store temporary files; awk won't be able to write to /tmp once it's full, so awk will fail.
However, I'm being advised that there is no need to put /tmp, /var, etc. onto separate filesystems, as there is no problem nowadays with / filling up. So /tmp, /var and /usr are all on the root FS.
I'm talking about large systems with TBs of data (which is on a separate FS), a user population of around 800-1000, and 24/7 system access.
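The reserve mentioned above is the ext2/3/4 reserved-blocks percentage (5% by default), which tune2fs can inspect and change. A sketch on a small file-backed image rather than a real device (the image name is made up; no root is needed for an image file):

```shell
# Build a throwaway ext4 image to poke at
truncate -s 16M demo.img
mkfs.ext4 -F -q demo.img
# The default reserve is 5% of the blocks
tune2fs -l demo.img | grep 'Reserved block count'
# Lower it to 1%, as is sometimes done on large data filesystems
tune2fs -m 1 demo.img
tune2fs -l demo.img | grep 'Reserved block count'
```

The same `tune2fs -m` invocation works on a real device node, run as root.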
My home server runs Debian Lenny, and I'm about to upgrade the system drive to a larger one. In the process, I want to take the opportunity to reorganize and resize the partitions. For learning purposes, I'm planning to migrate from an MBR partition table to GPT. Because of those two changes, I can't just run "dd if=/old/drive of=/new/drive" (well, not without lots more work afterwards). I could use the debootstrap process to get a fresh installation on the new system drive, but I used that technique during the last system upgrade and it's probably overkill for this. Can I just copy the partitions from the old drive to the new? Will "dd if=/dev/hda1 of=/dev/hdb2" work, assuming /dev/hdb2 is larger than /dev/hda1? (If so, the filesystem can be resized to take advantage of the new, larger partition, right?) Would parted (or gparted) be a better tool for copying the contents of the partitions?
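One way to picture the dd-then-resize approach the question describes, with plain files standing in for /dev/hda1 and /dev/hdb2 (all names here are stand-ins, not real devices):

```shell
# Stand-ins for the old (smaller) and new (larger) partitions
truncate -s 16M old.img
truncate -s 32M new.img
# Copy the smaller one in; conv=notrunc keeps the target's full size
dd if=old.img of=new.img conv=notrunc status=none
stat -c %s new.img   # still 32M - the extra space is waiting
# On real devices, the filesystem would then be grown with e.g.:
#   e2fsck -f /dev/hdb2 && resize2fs /dev/hdb2
```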
I have an Internet modem, and I want to share my Internet connection with others - I mean, others could connect to my Internet from another computer, with key security. Is that possible? It's possible from Windows, so Linux should be able to do it too, but I don't know how.
Your scanner is installed properly and fully functional. If you need to install your scanner first, have a look at the ScanningHowTo. To check whether your scanner is connected and installed properly, you can use the command "scanimage -L". The machine we're configuring as the scanner server has the IP address 192.168.1.1 and subnet mask 255.255.255.0. This means that this machine is on the 192.168.1.0/24 (CIDR notation) network/subnet. code...
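For what the elided configuration presumably covers: sharing a SANE scanner over the network usually comes down to two small files - saned.conf on the server and net.conf on the clients. A sketch using the addresses above:

```
# /etc/sane.d/saned.conf on the scanner server (192.168.1.1):
# allow the whole local subnet to reach saned
192.168.1.0/24

# /etc/sane.d/net.conf on each client:
# tell the "net" backend where the scanner server lives
192.168.1.1
```

saned itself is typically started from inetd/xinetd; after that, `scanimage -L` on a client should list the remote scanner.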
I am running Windows 7 on an HP laptop. I recently used Wubi to set up a dual-boot with Ubuntu. Everything is working fine - Ubuntu is great - but I am somewhat of a newbie and want to know: is there any way I can transfer, or share, my Windows 7 files (music, pictures, etc.) in Ubuntu? I can't seem to figure it out.
I have Ubuntu 10.04 on my laptop, and at the same time I have Windows 7 (partitioned disk). I mostly use Ubuntu, but I need Windows for some things. I want to share Windows' files with Ubuntu (it's weird, but when I installed Ubuntu it never gave me a "share files from Windows" option; I don't know why). Anyway, I can see the disk in Ubuntu, and I can see the /Documents and Settings/ folder that Windows creates by default, with my files in it. However, the path is too long to reach from Ubuntu using the Terminal.
I created a shadow link using lndir to reach my files more easily. It works fine; however, sometimes when I go to the files via this route, they are highlighted in red, and when I try to enter one of those folders, the system doesn't recognize it. After a while, they turn blue and I can go into them. Why is this happening? Is what I did the "correct" way to do it?
I'm getting ready to install Ubuntu Studio alongside my regular Ubuntu on some extra space on my hard drive, and it seems to make sense to share /home between both Ubuntu systems. All ext4. /home is on its own partition, so all I should have to do is point the installer at it and tell it not to format.
I'm trying to set up a share on my system, which dual-boots Windows 7 64-bit and Ubuntu 10.10 32-bit. What I did:
- Installed Samba from the Ubuntu Software Center
- In System->Admin->Samba, added the share for my /Home/Downloads
- In smb.conf, edited the workgroup to match my Windows 7 workgroup
- In the home folder, right-clicked on Downloads and, in the sharing options, allowed others create and delete privileges and guest access (security is not an issue), then allowed Nautilus to add the permissions automatically.
Now, from what I have seen, if I go into the Network section in Windows 7, an Ubuntu share should appear, but it doesn't. The only devices that show are the router and the computer itself (Ubuntu isn't among them). Is there anything I missed, or how do I access these files?
I have a tutorial question to do and don't know where to start. The question is: install a workable NFS file-share system between your system and a remote system, using optimum values for rsize and wsize.
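For the rsize/wsize part: these are transfer-size mount options set on the client. A sketch (the server name, export path and the 32k values are examples; modern NFS implementations negotiate sensible defaults on their own):

```
# /etc/fstab on the client - NFS mount with explicit transfer sizes
server:/srv/share  /mnt/share  nfs  rsize=32768,wsize=32768  0  0
```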
How do I give an NFS share to only one particular user on a particular system? That is, for example, if 192.168.0.5 has many users, I want only one particular user to be able to access that share.
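One caveat here: classic NFS exports grant access per host, not per user. A common workaround is to export only to that one host and then rely on ordinary ownership and mode bits, since NFS carries the client's uid through. A sketch (the path is an example):

```
# /etc/exports on the server - export to the one client only
/srv/share  192.168.0.5(rw,sync,no_subtree_check)
```

Then chown the directory to the intended user and `chmod 700` it, so only the matching uid on 192.168.0.5 can enter; this assumes the uid is the same on both machines (or that id mapping is set up).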
I just installed the 10.04 LTS beta with GNOME version 2.29.92. I want to create a network share, but when I navigate to System > Preferences > Personal File Sharing, I cannot create one. The message indicates that the feature is not available because the required package is not installed. I tried reinstalling gnome-user-share, and it reinstalled without issue, but I still get the same thing. What am I missing? How can I create a network share?
I have a computer running Ubuntu 10.10 32-bit with Gnome and another computer running Windows 7 64-bit. How can I share folders between these? I can use Samba to view my Windows shares, if I specify the IP address by going to the "Connect to Server..." option in the Places menu. Going to "Network" in Nautilus and trying to open "Windows Network" fails because it could not retrieve the share list from the server.
My question, though, is: how can I set up Samba (or some other software) so that my Windows box will be able to see my shared folders? And also, what is the difference between Samba and Samba4 (both are in the repositories)?
How do I back up both the system and the data from my Red Hat server to a share on a Windows SBS server, where the regular network backup will take it off-site every night? The Red Hat server is basically a web server on the LAN, using RAID 1 to ensure some redundancy in the discs, and is used as an internal web server for an administration system.
The Windows SBS server is the domain controller, and the rest of the network is totally Windows-based. The SBS server has large discs which host all of the general data to be backed up, and it is controlled by a third-party IT company. We have a share created on the SBS server which is to hold all of the backed-up information from the Red Hat box, to ensure we can potentially restore the system quickly in case of a major problem.
The thorn here is that I am working at a different site from the Red Hat box and have to do everything over SSH. The Red Hat server has Samba running, but I have never used it to mount an external Windows share. Is there 'out of the box' backup software on the Red Hat server that I can control over SSH to do the job? The biggest worry for me is backing the system up to a state where it can be popped onto a disc and restored from the backup - or most of it, anyway - enough to get the server back on its legs.
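A hedged sketch of the mount-and-archive idea, all doable over SSH. The share path, mount point and account below are invented, and a local directory stands in for the mounted share so the archiving step is runnable as-is:

```shell
# On the Red Hat box (as root), the share would be mounted with e.g.:
#   mount -t cifs //sbs-server/redhat-backup /mnt/sbsbackup -o username=backupuser
# Then archive onto it. Here ./sbsbackup stands in for /mnt/sbsbackup:
backupdir=./sbsbackup
mkdir -p "$backupdir"
# Archive /etc as a small demonstration; unreadable files are skipped
tar czf "$backupdir/etc-$(date +%F).tar.gz" -C / etc 2>/dev/null || true
tar tzf "$backupdir/etc-$(date +%F).tar.gz" | head -n 3
```

For a full-system restore image, the same tar would run against / with excludes for /proc, /sys, /dev and the mount point itself.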
Does anyone here know how to share files/folders between two different GNU/Linux OSes?
I have a notebook and a desktop computer here. The operating system on my notebook is Easypeasy, and my desktop computer runs Linux Mint. I have already installed Samba on both computers. I can open the folders that I shared under my Documents. But when I try to open a shared folder under the NTFS drives, or even the drives themselves (on my notebook or desktop computer), I get an error.
This is the error: "Unable to mount location. Failed to mount Windows share."
I have about 22 GB of music (MP3 & Ogg) on my laptop hard drive. I also have an unused Sony MP3 player with a 20 GB hard drive. What I want to do is back up the 22 GB into the 20 GB of space; the music does not need to be playable on the Sony player - I'm just using it as a backup device. OK, two issues:
1. When I've tried compressing (tar.gz) MP3 files, little to no space is saved; I assume an MP3 is pretty compressed already. Is there another way to compress effectively? I don't want to reduce the bit rates of the individual music tracks.
2. I formatted the Sony HD using ext4, but this leaves me with only 16 GB of usable space. I tried FAT32, and this left me with about 18 GB.
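On issue 1, the suspicion is right: MP3 and Ogg are already compressed, so the gzip stage of tar.gz has almost nothing left to squeeze, and no general-purpose compressor will do much better without re-encoding. Simulated here with incompressible random bytes standing in for a track (the filename is made up):

```shell
# Random data is incompressible, much like already-compressed audio
head -c 1000000 /dev/urandom > track.mp3
gzip -c track.mp3 > track.mp3.gz
ls -l track.mp3 track.mp3.gz   # the .gz ends up no smaller
```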
I need a command-line method of copying files from a Linux box to a Windows machine that is in a domain and requires authentication. I cannot install additional software or services on the Windows XP machine. I can install any software on the Linux machine. I've tried scp, but the connection failed; if my understanding is correct, that is because scp requires the target (the Windows machine) to be running an SSH service. Is there a command-line Linux utility that can pass a Windows domain user and password and then copy a file from the Linux machine to a share on the Windows machine?
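One tool that fits these constraints is smbclient from the Samba package: it talks to the Windows share directly, needs nothing installed on the XP side, and can take domain credentials from a file. A sketch - the host, share and account names are invented:

```
# ~/.smbcred (chmod 600) - credentials for the domain account
username=alice
password=secret
domain=MYDOMAIN
```

Copying a file up is then e.g. `smbclient //winbox/share -A ~/.smbcred -c 'put report.txt'`.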
I want to know how I can use data that sits on the Windows hard-disk partitions from the Red Hat Linux 5 operating system. I'm using the dual-boot concept and have installed both Windows and Linux properly. Three partitions of the hard disk are used by Windows and one by Linux. My data, such as songs, sits on one of the Windows partitions. Now I want to know how I can use that data when I'm working in Linux.
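The usual route is mounting the Windows (NTFS) partition with the ntfs-3g driver; note that on Red Hat 5 this driver comes from a third-party repository such as EPEL rather than the stock install. A sketch (the device name, mount point and uid are examples):

```
# /etc/fstab - mount the Windows data partition at boot
/dev/sda5  /mnt/windata  ntfs-3g  defaults,uid=500  0  0
```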
Today I tried to share a folder with Windows XP using the GUI in openSUSE 11.1. I created a folder, right-clicked it, clicked Share, and checked "Share with Samba (Microsoft Windows)", "Public" and "Writable". Then I connected from Windows using //192.168.100.1/sharefolder. It opens up, but I can only read (no write access, even though I checked "Writable"). Did I miss something?
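A likely culprit: Samba can never grant more access than the Unix permissions on the directory allow, so "Writable" in the GUI is not enough on its own. A sketch of the share section the GUI generates, plus the matching permissions (the path is an example):

```
# /etc/samba/smb.conf
[sharefolder]
    path = /home/user/sharefolder
    public = yes
    guest ok = yes
    writable = yes
```

The directory itself must also be writable by the account the connection maps to, e.g. `chmod 777 /home/user/sharefolder` for a quick test.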
When I go to 'Places' on my panel, my Windows drives are listed. I can click on one; it then asks me for a password, then it puts a hard-drive icon on my desktop. This is excellent. They are listed as '200 GB Filesystem', etc. My only problem is how to make these icons stay after I shut down. I don't want to do this every time I boot up.
Generally, I LOVE Ubuntu 10.04 - the best Ubuntu yet, IMO. But there's this one thing about it that really bugs me: all executable files on a CD/DVD are set with very restricted permissions, including the 'Allow executing file as program' checkbox being left blank. Since CDs/DVDs are read-only, I can't change these permissions the normal way, or even just execute the files as root!
So far I've been able to get by with just copying the disk's contents to the hard drive and then running the program with altered permissions from there, but right now I want to install Unreal Tournament 2004 (the DVD version, if that makes any difference) and its Linux installer will not function properly from a local directory, so I'm stuck on this one.
Surely there's some way to alter the permissions for a read-only filesystem! Can't I just set system-wide permissions that would even apply to CD's and DVD's?
When I try to start up 9.10, I can get past GRUB but not fsck. A file check starts, but no progress is made, and finally I get 'General error mounting filesystems'. Trying recovery mode, I get this just before the fsck check:
Quote:
fsck from util-linux-ng 2.16
[8.016378] ACPI: I/O resource piix4_smbus [0xb00-0xb07] conflicts with ACPI region SOR1 [0xb00-0xb0f]
/dev/sda1 goes fine
Quote:
/dev/sda3 has been mounted 34 times without being checked, check forced
mountall: canceled
[Code]...
This seems to be slightly different than the other threads I've seen discussing this issue. It just all of a sudden happened, I didn't upgrade anything or have any crashes immediately prior.
Currently I'm using Arch Linux. I'm more used to Linux now, but for communication, I still have some problems:
The main software I used on Windows is Skype and TeamViewer, for screen sharing and voice chat to do pair programming with my friends. imo.im is too simple, and it doesn't help much with only chat.