Subject: Re: [hwloc-users] hwloc on systems with more than 64 cpus?
From: Brice Goglin (Brice.Goglin_at_[hidden])
Date: 2010-05-16 15:44:15


No, there is no such limit. If you have 128 cores, the cpuset string will
be 0xffffffff,0xffffffff,0xffffffff,0xffffffff.

As long as you have fewer than 1024 cores, everything should work fine.
For more than 1024, you'll need to rebuild with a manual change to the
source code, or wait for hwloc 1.1.
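For reference, here is a minimal sketch of printing that string from the
hwloc C API. It uses the bitmap calls as named in hwloc 1.1 and later
(the 1.0 series spelled the same calls hwloc_cpuset_*), so adjust to your
version; build with something like "cc print_cpuset.c -lhwloc" (the file
name is just an example):

    #include <hwloc.h>
    #include <stdio.h>
    #include <stdlib.h>

    int main(void)
    {
        hwloc_topology_t topology;
        char *str;

        /* Discover the topology of the current machine. */
        hwloc_topology_init(&topology);
        hwloc_topology_load(topology);

        /* The root object's cpuset covers every processor in the
         * machine. hwloc bitmaps have no fixed width: their string
         * form is a comma-separated list of 32-bit hex words, so it
         * simply grows past 64 CPUs instead of truncating. */
        hwloc_bitmap_asprintf(&str, hwloc_get_root_obj(topology)->cpuset);
        printf("machine cpuset: %s\n", str);

        free(str);
        hwloc_topology_destroy(topology);
        return 0;
    }

On a 128-core machine with all processors online, this should print the
four-word string shown above.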

Brice

On 14/05/2010 23:51, Jirka Hladky wrote:
> Thanks Samuel!!
>
> The data looks fine. hwloc rocks.
>
> I assume the --cpuset option (of the lstopo command) is not supported
> on such systems, right?
>
> My understanding is that cpuset masks work only up to 64 cores. Is that
> correct?
>
> Thanks
> Jirka
>
> On Friday 14 May 2010 08:06:12 pm Samuel Thibault wrote:
>
>> Jeff Squyres wrote on Fri 14 May 2010 09:09:44 -0400:
>>
>>> I believe that Brice / Samuel (the two main developers) have tested hwloc
>>> on an old Altix 4700 with 256 Itanium cores.
>>>
>>> I don't have their exact results, and I don't see them on IM right now,
>>> so I don't know if they're around today or not...
>>>
>> It was tested on a 256-core Itanium machine; see
>> tests/linux/256ia64-64n2s2c.tar.gz.output
>>
>> Samuel