Ubuntu Installation :: Which Disks Can Be Main Boot Disks
Mar 31, 2010
I have/had a PC with several hard drives and a mix of Ubuntu and Windows in a multi-boot setup. The old boot drive died screaming, and I need to start again. (But my data is safe! Yay!)
Is there anything special about which drive can be the main drive to start booting from? Or to put it another way, can I install to any of the other 3 and expect it to work, or do I need to switch them around so a different drive is on the connections for the recently dead one?
I have servers containing both SATA and SAS disks. I was testing write speed on these servers, and I found that the 10,000 rpm SAS disks are much slower than the 7200 rpm SATA disks. What do you think about this slowness? What could the reasons for it be?
Below are the figures I got from my test comparing the 10,000 rpm SAS and 7200 rpm SATA disks.
When I ran dd if=/dev/zero of=bigfile.txt bs=1024 count=1000000 on the SAS disk server (10,000 rpm; a new server with 2 CPUs, 8 cores each, and 8 GB RAM; hardware RAID1; I have not used this server yet), I got this output:
1000000+0 records in
1000000+0 records out
1024000000 bytes (1.0 GB) copied, 12.9662 s, 79.0 MB/s
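One caveat about the test above: with bs=1024, dd issues a million tiny writes, and its MB/s figure mostly measures the page cache rather than the disk. A fairer sequential-write sketch (file name and sizes here are arbitrary examples) uses a larger block size and conv=fdatasync so the data is flushed to disk before dd reports:

```shell
# Rough sequential-write check. conv=fdatasync forces the data out of
# the page cache to the disk before dd prints its throughput figure,
# so slow-seeking but cache-friendly setups can no longer hide.
dd if=/dev/zero of=ddtest.bin bs=1M count=64 conv=fdatasync
rm -f ddtest.bin
```

With buffered 1 KB writes, a RAID controller's write-back cache or the kernel page cache can make a 10,000 rpm SAS array look slower or faster almost at random; direct, synced, large-block runs are more comparable between machines.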
I've installed Windows 7 onto one hard drive, and then installed Ubuntu 9.10 onto a second hard drive. The installations seemed to go fine, and I can boot into Ubuntu from the GRUB menu. However when I try to boot into Windows from the GRUB menu, I get a message saying "error: no such device: 446e94786e946488".
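GRUB 2 records the Windows partition by filesystem UUID in its generated config. A guess at what the relevant menuentry in /boot/grub/grub.cfg looks like (the exact lines vary by GRUB version; the UUID shown is the one from the error message):

```
menuentry "Windows 7" {
    insmod ntfs
    # GRUB locates the partition by this filesystem UUID. If it no
    # longer matches what `sudo blkid` reports for the Windows
    # partition, GRUB prints "error: no such device: <uuid>".
    search --no-floppy --fs-uuid --set 446e94786e946488
    chainloader +1
}
```

Comparing that UUID against `sudo blkid` output and then regenerating the config with `sudo update-grub` is the usual first step when the UUID has gone stale.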
I recently tried installing Lucid x86 on my system beside Windows 7 and managed to screw it up.
My disk setup is this;
Disk A = 3 partitions (1st partition = Windows; 2nd & 3rd partitions = Data)
Disk B = 1 partition
Disk C = 3 partitions (1st partition = Data; 2nd & 3rd partitions = Ubuntu & swap)

Disk A = SATA, internal
Disk B = SATA-to-USB, external
Disk C = SATA-to-USB, external
I want to install Lucid on the 2nd partition on Disk C. And dual boot it with Windows on Disk A.
During Lucid setup I specified the partition for installation (C2) and asked for GRUB to be installed on Disk A (no partition specified), so GRUB would always be used as the dual-boot manager even if the Lucid disk (Disk C) were ejected. Once installed and rebooted, I was taken to the GRUB rescue prompt because no installation drive could be found (a long string of numbers, which looked like a disk ID, was also shown). Obviously, I could not access either OS at this point. I had my W7 DVD handy, so it was just a case of recovering the Windows boot manager and I could use my PC again. But how do I go about installing Lucid with this setup? Should I specify a partition for GRUB to install to? I have a hunch this is where I am going wrong, but I am too scared to try again and potentially balls things up.
After upgrade to 10.04, my disks are randomly named (sda, sdb, sdc) at each boot. My drive labeled "XP" is sometimes named "sdb" and sometimes "sdc", while my other drive "DATA" is respectively "sdc" or "sdb". This wasn't the case before upgrade with KUbuntu 9.10.
Due to this random naming, my auto-mounts in fstab often fail at boot time!
Is there any solution for this (I haven't found one here myself)?
Is this linked to the GRUB troubles reported many times here?
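Since the kernel's sda/sdb/sdc ordering is not guaranteed across boots, the usual fix is to mount by UUID or LABEL in /etc/fstab; both stay attached to the filesystem no matter which device node it gets. A sketch with placeholder UUIDs (the real ones come from `sudo blkid` or `ls -l /dev/disk/by-uuid/`):

```
# /etc/fstab -- UUIDs below are placeholders, substitute your own
UUID=1234-ABCD                             /media/XP    ntfs  defaults  0 0
UUID=0a1b2c3d-0000-1111-2222-333344445555  /media/DATA  ext4  defaults  0 2
# Mounting by label works too, since your drives are labeled:
# LABEL=DATA  /media/DATA  ext4  defaults  0 2
```

With entries like these, it no longer matters whether the "DATA" drive comes up as sdb or sdc.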
I have a fully operational PXE boot server. The client boots up and begins the setup process; however, it fails to detect the hard disk. I have tried with Ubuntu 8.10, 9.10 and 10.10, and none of them will see my hard disk. When I boot from the CD, it sees the hard disks with no problem, so apparently the PXE boot server isn't serving up the necessary drivers (or something) to detect my hard disks properly. They are just IDE drives and, like I said, a regular CD install detects them just fine. If anyone here has any information that may help shed some light on this issue, I would be so grateful.
I have no hard drives in my computer, so I have been trying to boot Ubuntu 11.04 from an 8 GB USB flash drive. Is this possible? So far the best result I have gotten is that it sits on the loading screen for a while and then dumps out. I was only able to catch the last little bit, which reads: mounting /dev on /root/dev failed: no such file or directory. mounting /sys on /root/sys failed: no such file or directory. mounting /proc on /root/proc failed: no such file or directory. Target filesystem doesn't have requested /sbin/init.
Dell 600SC running an Adaptec 39160 dual channel SCSI controller which has 2 disks connected to it. The machine also has 2 IDE drives connected to it. The boot order of the disks is set to the SCSI disks as the first in boot order (after CD).
I am trying to set it up to maximize performance from the SCSI config, so I have XP on the first SCSI disk and I set up Ubuntu 9.10 on the second SCSI disk in a dual-boot configuration.
In this setup, the machine goes straight to XP (on the first SCSI disk) when rebooting and does not even see the Ubuntu installation, although the installation went fine with no complaints. On the same machine, if I just have Ubuntu on the first SCSI disk, the machine boots fine (albeit after a long pause looking for the bootloader).
So with XP on the first disk (which I need, in order to have XP), the Ubuntu bootloader does not seem to set the right parameters to be able to boot.
Again, this is with 9.10. I'm not trying 10.04, because with 10.04 I don't even get to boot with standalone Ubuntu (no XP): it installs fine, but the bootloader is not found, so we will stick with 9.10 for now. I am, however, open to working with 10.04 if there is a solution for dual-booting with XP in my configuration.
So again: 9.10 installs fine with XP on the 1st SCSI disk and Ubuntu 9.10 on the 2nd SCSI disk, but then the bootloader does not get activated and the machine goes straight to XP.
One of my disks in my computer crashed, it was the one containing /boot and some data partitions. The other system and /home partitions were on a second disk, which is ok.
I was wondering, can I create a new /boot partition, and keep on using the rest of the system? Can I somehow do it with a chroot from a live/installer disk, run grub, and use my system again? I have another disk which I can put in the system, but there is even an unused partition on the disk which is ok (but it is rather big for /boot).
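Yes, a chroot from a live/installer disk is the standard route. A rough sketch, assuming the surviving root filesystem is /dev/sdb1 and the new /boot partition is /dev/sdb2 (both are placeholders, and these commands are destructive, so adjust before running anything):

```
# From a live CD. /dev/sdb1 = surviving root fs, /dev/sdb2 = new /boot.
sudo mount /dev/sdb1 /mnt
sudo mkfs.ext4 /dev/sdb2                  # format the new /boot
sudo mount /dev/sdb2 /mnt/boot
for d in /dev /proc /sys; do sudo mount --bind $d /mnt$d; done
sudo chroot /mnt
# Inside the chroot: reinstall the kernel packages so /boot gets
# repopulated, then put GRUB back on the disk's MBR.
apt-get install --reinstall linux-image-generic
grub-install /dev/sdb
update-grub
```

Don't forget to add the new /boot partition to /etc/fstab (by UUID) inside the chroot before rebooting. The big unused partition on the good disk would work fine; /boot wastes most of the space but nothing stops you.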
Having installed 9.10 onto a laptop, my cherubic daughter switched off the power (no battery), and upon restarting I am faced with "Starting init: crypto disks... OK" and there it stops!! I had hoped I could go to recovery mode and fix it, but I hit the same stalling point. I see others have this unresolved too.
Ever since my upgrade from 9.10 to 10.04, every time I reboot the system it does a full disk check. /var/log/boot.log tells me that fsck thinks the file systems contain errors or were not cleanly unmounted. And yet it doesn't seem to actually find errors, and a clean reboot starts another check (again with it thinking something is dirty). I dual-boot with Windows, and rebooting from there gives the same problem. Again, all of this is new with 10.04 and was not happening with 9.10. Is there a way to find out when/how/why the disks are not being unmounted cleanly?
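The fields fsck keys off (the clean/dirty state flag, the mount count, and the maximum-mount-count and check-interval thresholds) live in the ext superblock and can be read with tune2fs. On the real system you would run `sudo tune2fs -l /dev/sdXN` for each filesystem in /etc/fstab; the sketch below uses a throwaway ext4 image instead so it needs no root access:

```shell
# Build a small scratch ext4 image and dump the fsck-related fields.
dd if=/dev/zero of=demo.img bs=1M count=8 status=none
mkfs.ext4 -F -q demo.img
tune2fs -l demo.img | grep -E 'Filesystem state|Mount count|Maximum mount count|Check interval'
rm -f demo.img
```

If "Filesystem state" shows "not clean" right after a normal shutdown, the unmount really is failing; if it shows "clean" but checks still happen, look at whether the mount count has hit "Maximum mount count" (tune2fs -c and -i adjust those thresholds).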
Disk 0 (500 GB): Windows Vista
Disk 1 (1 TB): Windows 7
Disk 2 (160 GB): Ubuntu
My boot disk is Disk 0. Currently when I turn on the PC, GRUB loads from Disk 0. I can then choose either Ubuntu or Windows Loader. If I choose Windows Loader (also located on Disk 0), I can choose to load Windows Vista or Windows 7. I like this setup, but I would like to move the loaders (exactly as they are) to Disk 1 so that I can format Disk 0.
I get the pinkish Ubuntu screen and a message such as "checking disk 1 of 4". I assume it is doing an fsck. However, the time it takes does not seem to relate to the time it takes if I do a manual fsck (almost instantaneous) or fsck -c (several minutes to half an hour depending on the drive). I also wonder what counts as a "disk". I have in the system:
I installed XP on the main HDD, which has 3 partitions. Then I installed Kubuntu 10.04 on the slave HDD. When I boot, it doesn't recognize Kubuntu. When I looked in My Computer in XP, it didn't recognize the slave HDD either. I switched the drives (slave to master and vice versa) and that didn't go well either.
I have a very strange boot problem. But first: I have openSUSE 11.4 with KDE installed, an AMD64 dual-core CPU, and 2 hard disks. I was able to boot from both of those disks (on the second disk I have openSUSE 11.2 in case something goes wrong with the first disk). Then I decided to install openSUSE 11.4 from DVD to a USB key (just as I would to a hard disk). I succeeded, and I did not involve any partition of the hard disks in this install. But now I cannot boot from either of my hard disks anymore, although the BIOS still finds them as it did before. After the BIOS I get the following message: Loading stage 1.5 error 21.
Error 21 means: "Selected disk does not exist. This error is returned if the device part of a device or full file name refers to a disk or BIOS device that is not present or not recognized by the BIOS in the system." But I am still able to boot from the USB key. I have even modified the menu.lst on the USB key to boot openSUSE 11.4 from the first hard disk, and this works fine. I have also tried to install GRUB again on my first hard disk with grub.install.unsupported and with yast2, but the installation stops with an error message like "hard disk not found by BIOS".
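Since the system boots fine via the USB key, one option worth trying is reinstalling GRUB legacy's stage1 from GRUB's own shell rather than through YaST, because the GRUB shell uses the running kernel's device access instead of BIOS probing. A sketch, where (hd0,0) is a placeholder for whichever partition actually holds /boot/grub:

```
# From a root shell on the running system (booted via the USB key):
grub                           # start the GRUB legacy shell
grub> find /boot/grub/stage1   # shows which (hdX,Y) holds the files
grub> root (hd0,0)             # substitute what `find` reported
grub> setup (hd0)              # write stage1 to the first disk's MBR
grub> quit
```

One thing to double-check first: installing to the USB key may have shifted GRUB's idea of which disk is (hd0), since the BIOS renumbers drives when a bootable USB device is present. Comparing /boot/grub/device.map against the current disk layout often explains "stage 1.5 error 21".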
I'm just curious - why do all linux distros (all I've seen) run their periodic disk checks during boot? I mean, I understand that a disk should be checked now and then, but why does the system do it during boot, when I'm waiting for it to load, instead of checking them during shutdown, when (most probably) user doesn't need the computer anymore.
Upon installing 4 2 TB drives, my server will not boot. I have tried booting from a Slackware 11 DVD and passing these boot parameters:
huge26.s root=/dev/sda1 noinitrd ro
in addition to just trying to boot from the DVD using the huge26.s kernel. The kernel starts to load, says "Ready.", and then sits there with a flashing cursor... The problem only exists with the new 2 TB drives installed; I never had any problems when I had 750 GB drives installed. Also, everything works fine if I boot from the DVD using "huge26.s root=/dev/sda1 noinitrd ro" as boot parameters and hot-add the 4 2 TB drives after the system starts booting.
I have also tried booting from a backtrack 3 cd but experience the same problem (boot halt after loading initrd)
I have a PC with 4 hard disks and one SSD. At present the PC boots from the 1st hard disk, and the other hard disks (sometimes 1, 2 or 3, depending on the requirement) are used for storage. Now I want to boot the PC from the SSD and use the hard disks for storage only. My problem is that when the system boots, it takes the 1st hard disk as sda and the SSD as sdb if I am using only one hard disk, and if I use 2 hard disks the SSD becomes sdc. So I am not able to give a fixed boot entry in the menu.lst file if I wish to use the root filesystem from the SSD. I am using the 18.104.22.168 kernel and the GRUB bootloader. I have tried using an initrd with udev, but I was not able to include and start udev properly in the initrd. I am trying to boot by UUID or LABEL, but with no success. Am I missing something in the kernel needed to get the UUID or LABEL?
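One likely explanation for the failed UUID/LABEL attempts: the kernel itself does not resolve root=UUID= or root=LABEL= on the command line; that lookup is done in userspace by the initramfs (which is why a working udev inside the initrd matters). Once the initrd is functional, a menu.lst entry pinned to the SSD would look roughly like this (the UUID is a placeholder; the real one comes from `blkid` or `ls -l /dev/disk/by-uuid/`):

```
# /boot/grub/menu.lst -- sketch with a placeholder UUID
title  Linux (root on SSD)
root   (hd0,0)
kernel /boot/vmlinuz root=UUID=0a1b2c3d-0000-1111-2222-333344445555 ro
initrd /boot/initrd.img
```

Without any initramfs at all, the only stable alternative is to keep the SSD on a controller port that always enumerates first, since plain /dev/sdX names are assigned by probe order.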
The HDD is brand new and hasn't been formatted, partitioned, or previously had any other OS on it. The HDD is recognized in the BIOS. A list of possible drivers is shown, none of which match except for two, but those don't lead anywhere beneficial for completing the installation process. At this point, I have no idea how to proceed.
I did a fresh install of 11.04. It didn't let me choose custom mount points for my disks, so I left non-system partitions unmounted in the installer. Big mistake. It seemingly screwed with two of my disks. I can see them in disk utility, but disk utility cannot identify partition information.
@fridge:~$ sudo mount /dev/sde1 /m2 -t ext4
mount: wrong fs type, bad option, bad superblock on /dev/sde1, missing codepage or helper program, or other error
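A "bad superblock" error on an ext partition can sometimes be recovered from one of the backup superblocks that mke2fs scatters across the filesystem. The sketch below demonstrates the mechanics on a throwaway image so it needs no root; on the real disk the device would be /dev/sde1 (unmounted), and the backup locations printed by `mke2fs -n` are only valid if the listed parameters match how the filesystem was originally made:

```shell
# Build a scratch ext4 image, then show how to find and use a backup
# superblock. truncate creates a sparse 512 MB file.
truncate -s 512M demo.img
mkfs.ext4 -F -q demo.img
mke2fs -n -F demo.img               # -n: print the layout only, writes nothing
# The "Superblock backups stored on blocks:" line above gives the
# block numbers to feed e2fsck; 32768 is typical for 4 KB blocks.
e2fsck -f -y -b 32768 -B 4096 demo.img
rm -f demo.img
```

If e2fsck with a backup superblock can't make sense of the partition either, the partition table itself may have been altered, and tools like testdisk are the next stop.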
Is it possible to create a boot CD to boot external volumes on an Apple iMac 7.1? (It has an older firmware version and cannot boot external disks, unlike the MacBook Pro 5.1, which can, at least with grub-legacy, which is all I'll ever use until EFI boot becomes available.) There is some promising stuff on www.pendrivelinux.com, and I'll try it, but the instructions are for Windows, and I am not sure how to translate the menu.lst entry to Linux (I suppose it would have to be entered in the "automagic" section). Of course, I don't want to create a bootable flash drive, but to use my external volumes that already boot on the MacBook Pro, without altering them except for installing the ATI video driver (though I have no problem booting in low-graphics mode).
Until karmic there was a trick to make the iMac mistake the external volume for an internal one (the root partition had to have the same UUID as the internal root partition), but this does not seem to work for lucid. Anyway this UUID trick is dirty and causes problems when you want to edit the internal partition (which is the point of the external boot - you get a customized maintenance environment that boots much faster than the CD).
Reformatting two drives before re-installation: I'm new to Linux generally. Can someone give me instructions on formatting the hard drives? I installed Ubuntu on a machine that had been running Windows for a while. Since Windows had worked the boot drive for years, I decided to switch the positions of the two drives, putting the old data drive into the boot position.
Apparently, Ubuntu was too smart for me. I guess it discovered the Windows installation and decided that's where it should be installed too. Now, what I would like to do (and I would really like to do it this way) is reformat both disks, wiping out everything on both of them, before running another installation.
By the way: the old data drive is still NTFS, and I don't want installations on both disks. Reformatting and starting fresh seems nice and clean, if not entirely necessary.
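One blunt way to get the clean slate, done from the Ubuntu live CD before starting the installer: zero out the first sector of each drive, which destroys the MBR and partition table so the installer sees two blank disks and can partition them itself. The device names below are placeholders and these commands are irreversible, so verify them first:

```
# From a live CD, with nothing mounted. Check which disk is which
# with `sudo fdisk -l` before running -- this wipes the disk's
# partition table permanently.
sudo dd if=/dev/zero of=/dev/sda bs=512 count=1
sudo dd if=/dev/zero of=/dev/sdb bs=512 count=1
```

Alternatively, the live CD's GParted (System > Administration > Partition Editor) can delete all partitions with a GUI, which is friendlier for a first-timer and shows disk sizes and labels so you can't as easily wipe the wrong drive.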
*-storage
     description: SATA controller
     product: 82801GBM/GHM (ICH7 Family) SATA AHCI Controller
     vendor: Intel Corporation
I want to add Ubuntu + swap in the 90 GB or so range; I'm fairly new to partitioning. Creating recovery disks using the system tools takes over 16 GB; for that kind of expense I may as well just order OEM recovery disks if (when) Windows falls apart.
I installed Ubuntu 11.04 inside my Windows 7 using the live CD. All was working fine until three hours ago, when I started my PC and selected Ubuntu: it gave me a GRUB command line. I figured it was because GRUB couldn't find Ubuntu. So I rebooted into Win7, and now all I can see is ubuntu.icon, wubi-uninstallation.exe, the ubuntu folder and another folder. ubuntu/disk is missing.
I just installed Ubuntu Server edition on my computer (brand new, no OS) and finished the installation. In the terminal I used apt-get to install ubuntu-desktop for a desktop interface. In my rig I have two 500 GB HDDs. I set them up through my computer's BIOS as RAID1 drives, yet as I understand it I still need to configure Ubuntu software RAID for it to work correctly. Unfortunately, I have already partitioned my drives! I used the easy way (guided with LVM or whatever) and let it do it for me. Now, RAID1 is very important to me! Is there any way to repartition the disks to use RAID1, or do I need to wipe my computer and reinstall Ubuntu?
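For context, BIOS "RAID" on desktop boards is usually fakeraid (the OS still does the work), and the Linux-native alternative is an mdadm software array. A sketch of what creating an md RAID1 looks like, with placeholder partitions /dev/sda1 and /dev/sdb1; note that mdadm --create destroys what is on those partitions, which is why a guided LVM install that already owns the disks generally does mean repartitioning or reinstalling:

```
# Sketch only -- partition names are placeholders, and --create
# wipes the member partitions.
sudo mdadm --create /dev/md0 --level=1 --raid-devices=2 /dev/sda1 /dev/sdb1
cat /proc/mdstat                        # watch the initial mirror sync
sudo mkfs.ext4 /dev/md0                 # then put a filesystem on the array
sudo mdadm --detail --scan | sudo tee -a /etc/mdadm/mdadm.conf
```

The cleanest path for a system this new is usually to reinstall, choose manual partitioning, and mark matching partitions on both disks as "physical volume for RAID" before creating the md device in the installer.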
I have burned DVDs with DivX for Windows and liked them because they were compatible with most DVD players out there, where .AVI is not. I need to find a GUI for DivX 6.1.1, or another application that will burn DivX.
I'd like to know what happens to the disks we include in the installation process of Fedora 15. Obviously, the HD we choose for the system is entirely formatted and reconfigured, but what about the others? What happens to them? I'm asking because I included two brand-new 2 TB disks, one of them with 900,000 files and the other empty, both formatted with ext4 and completely functional on Fedora 14. However, after the upgrade, on the very first Fedora 15 boot, these disks disappeared from "Places" and could not be mounted even via the terminal. Opening the GNOME Disk Utility, I found they had been changed to LVM (instead of ext4) and that they had also lost their original labels. I am starting to be concerned, because all the data there is important to me.