GNU GRUB - Bugs: bug #10727, Boot fails after adding a new IDE hdd, in IDE/SCSI configuration

 
 


bug #10727: Boot fails after adding a new IDE hdd, in IDE/SCSI configuration

Submitter:  None
Submitted:  Tue 19 Oct 2004 11:10:12 AM UTC

Category:  Booting
Severity:  Major
Priority:  5 - Normal
Item Group:  Action Request
Status:  None
Privacy:  Public
Assigned to:  None
Originator Name:  Teodor Mihai
Originator Email:  -email is unavailable-
Open/Closed:  Closed
Release:  grub-0.94-25.i586.rpm
Reproducibility:  Every Time
Planned Release:  None

Sat 15 Dec 2007 04:40:07 PM UTC, comment #2: 

We've moved to GRUB 2 as our development platform. Can you please check whether this bug still applies there and, if it does, reopen it?

Thanks

Robert Millan <robertmh>
Group administrator
Fri 14 Jan 2005 11:47:40 PM UTC, comment #1: 

I got a similar problem here.

I added a new [unpartitioned] IDE drive to my system, started it up and GRUB refused to boot, showing error 5 during stage 1.5. Removing the drive also removed the error.

I am a little inexperienced with hardware-related problems under Linux, so I thought the error might be caused by the unpartitioned drive.

So I booted from a CD-ROM (which works just fine) with the new drive connected and partitioned the drive to my needs. This changed absolutely nothing. Same error.

Benedikt Spellmeyer <chimera26>
Tue 19 Oct 2004 11:10:12 AM UTC, original submission:  

When I first put together my PC, it had the following drive setup:

SCSI ID 0 /dev/sda
SCSI ID 1 /dev/sdb

Due to some problem (or incompatibility) with either the PC BIOS or the SCSI BIOS when booting from SCSI drive 0 (basically, I never managed to boot from SCSI at all), I had to add an old IDE drive as secondary master, which I used for the boot loader. That added a new disk:

Secondary Master IDE /dev/hdc, 8 GB Maxtor (an old disk; the BIOS defaulted to "Large" addressing mode when set to "Auto")

I have a dual installation of SuSE Linux 9.1 and Windows XP Professional, both residing on the first SCSI disk, /dev/sda.
Windows recognized the setup as unusual (it was installed on the SCSI disk, but the boot disk was /dev/hdc) and specifically explained that it had to put its boot loader on /dev/hdc. That was surprisingly smart, given the manufacturer, and it worked fine.

I installed Linux, and GRUB was configured on /dev/hdc; in this dual-boot setup it booted Linux from /dev/sda5 or Windows from /dev/hdc. Everything worked fine. The GRUB device map was as follows:

hd0 /dev/hdc
hd1 /dev/sda
hd2 /dev/sdb
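
For reference, GRUB Legacy keeps this mapping in a device map file (typically /boot/grub/device.map; the exact path depends on the distribution). Written in that file's own syntax, the same mapping looks roughly like this:

    (hd0)   /dev/hdc
    (hd1)   /dev/sda
    (hd2)   /dev/sdb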

However, due to storage constraints I had to purchase a new 200 GB disk, and I shrewdly planned to copy the old IDE disk, sector by sector, onto the new one (using Norton Ghost 9.0), then remove the old drive while keeping the new one as /dev/hdc, and still be able to boot Windows after re-installing GRUB on the new disk.

The first step was adding the new drive as Secondary Slave, /dev/hdd.
The BIOS recognized it as a 200 GB drive (fortunately, the motherboard was made in 2003 and its BIOS was up to date) and defaulted to LBA addressing when set to "Auto".
I left the old drive, /dev/hdc, in place, planning to boot Windows and copy the drives.

However, GRUB refused to boot, showing error 5 (Partition table invalid or corrupt) during stage 1.5.

For some reason GRUB took the new disk as hd1, even though according to its device map hd1 should have been /dev/sda.
Physically disconnecting /dev/hdd had the expected result (GRUB booted fine).

I know it was referring to the new disk because I booted off a Linux rescue disk, ran fdisk just to write an empty partition table, booted again, and GRUB showed error 15 (File not found) during stage 1.5. Therefore it was obviously reading /dev/hdd (probably as hd1) instead of using /dev/sda.
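
For the record, wiping the table took nothing more than fdisk's interactive commands, roughly along these lines (the "o" command creates a new, empty DOS partition table and "w" writes it to disk):

    fdisk /dev/hdd
    Command (m for help): o    (create a new, empty DOS partition table)
    Command (m for help): w    (write the table and exit)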

I also tried connecting the new disk as Primary Master or Primary Slave. Same result: GRUB had to read it and declare itself unhappy.

When I understood the issue, I reinstalled GRUB using a new device map containing all four disks (/dev/hdc, /dev/hdd, /dev/sda, /dev/sdb), kept the same boot configuration (using /dev/hdc and /dev/sda), and it booted my existing Linux and Windows installations without any problems.
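
In GRUB Legacy terms, the fix boiled down to listing all four drives in the device map and re-running setup from the grub shell. A rough sketch follows; the (hd0,0) in the root command is only illustrative, since it depends on which partition actually holds the /boot/grub stage files:

    # /boot/grub/device.map, now listing all four drives
    (hd0)   /dev/hdc
    (hd1)   /dev/hdd
    (hd2)   /dev/sda
    (hd3)   /dev/sdb

    # re-install stage1/stage1.5 into the MBR of the IDE boot disk
    grub --device-map=/boot/grub/device.map
    grub> root (hd0,0)
    grub> setup (hd0)
    grub> quit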

=> I think this is a bug; adding a new [unpartitioned] drive, which basically has no role in the boot process, should not cause the PC to stop booting. <=

Of course, after I copied the drives, removed the old disk, and installed GRUB on the new disk using the same configuration as before (/dev/hdc, /dev/sda, /dev/sdb), it refused to boot Windows, and nothing short of a Windows reinstall (which wiped out GRUB) fixed the problem; but that is another issue.

Anonymous

 


