From f797f9cc5485b50c35c106b462e1bc432ec37f90 Mon Sep 17 00:00:00 2001
From: Olof Johansson
Date: Mon, 13 Jun 2005 15:52:27 -0700
Subject: [PATCH] Fix PCI BAR size interpretation on 64-bit arches

On 64-bit machines, PCI_BASE_ADDRESS_MEM_MASK and other mask constants
passed to pci_size() are 64-bit (for example ~0x0fUL). However, pci_size()
does comparisons between the u32 arguments and the mask, which will fail
even though any result from pci_size() is still just 32-bit.

Changing the mask argument to u32 seems the obvious thing to do, since all
arithmetic in the function is 32-bit and having a larger mask makes no
sense.

This triggered on a PPC64 system here where an adapter (VGA, as it
happened) had a memory region base of 0xfe000000 and a sz of the same,
matching the if (base == maxbase ...) test at the bottom of pci_size() but
failing the mask comparison. Quite a corner case, which I guess explains
why we haven't seen it until now.

Signed-off-by: Olof Johansson
Acked-by: Greg KH
Signed-off-by: Andrew Morton
Signed-off-by: Linus Torvalds
---
 drivers/pci/probe.c | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

(limited to 'drivers')

diff --git a/drivers/pci/probe.c b/drivers/pci/probe.c
index b7ae87823c6..fd48b201eb5 100644
--- a/drivers/pci/probe.c
+++ b/drivers/pci/probe.c
@@ -125,7 +125,7 @@ static inline unsigned int pci_calc_resource_flags(unsigned int flags)
 /*
  * Find the extent of a PCI decode..
  */
-static u32 pci_size(u32 base, u32 maxbase, unsigned long mask)
+static u32 pci_size(u32 base, u32 maxbase, u32 mask)
 {
 	u32 size = mask & maxbase;	/* Find the significant bits */
 	if (!size)
--
cgit v1.2.3
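
For reference, a minimal user-space sketch of the failure mode, assuming an
LP64 host where unsigned long is 64 bits. The function bodies below are
reconstructed from the hunk and the commit text, not quoted verbatim from
probe.c, and the values are the ones from the report (base == maxbase ==
0xfe000000): with the 64-bit mask the final comparison can never match and
the computed size collapses to 0, while the u32 mask yields the expected
0x01ffffff extent (a 32 MB decode).

/*
 * Standalone illustration (not kernel code) of the integer-promotion
 * problem: with a 64-bit mask such as ~0x0fUL, the u32 operands are
 * promoted to 64 bits, the upper 32 mask bits can never appear in
 * (base | size) & mask, and the size is wrongly reported as 0.
 */
#include <stdio.h>
#include <stdint.h>

typedef uint32_t u32;

/* pci_size() with a 64-bit mask argument, modeled on the pre-patch code. */
static u32 pci_size_old(u32 base, u32 maxbase, unsigned long mask)
{
	u32 size = mask & maxbase;	/* Find the significant bits */
	if (!size)
		return 0;
	size = size & ~(size - 1);	/* lowest set bit = decode size */
	size -= 1;			/* extent = size - 1 */
	/*
	 * base and size are promoted to 64 bits here; mask has its upper
	 * 32 bits set, so the inequality is always true on LP64 and the
	 * function returns 0 even for a correctly programmed BAR.
	 */
	if (base == maxbase && ((base | size) & mask) != mask)
		return 0;
	return size;
}

/* The same logic with a u32 mask, as after the patch. */
static u32 pci_size_new(u32 base, u32 maxbase, u32 mask)
{
	u32 size = mask & maxbase;
	if (!size)
		return 0;
	size = size & ~(size - 1);
	size -= 1;
	if (base == maxbase && ((base | size) & mask) != mask)
		return 0;
	return size;
}

int main(void)
{
	unsigned long mask64 = ~0x0fUL;	/* PCI_BASE_ADDRESS_MEM_MASK on a 64-bit arch */
	u32 base = 0xfe000000, maxbase = 0xfe000000;

	printf("old (unsigned long mask): 0x%08x\n",
	       (unsigned)pci_size_old(base, maxbase, mask64));
	printf("new (u32 mask):           0x%08x\n",
	       (unsigned)pci_size_new(base, maxbase, (u32)mask64));
	return 0;
}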