
Hardware Locality Users' Mailing List Archives


Subject: Re: [hwloc-users] hwloc on systems with more than 64 cpus?
From: Jirka Hladky (jhladky_at_[hidden])
Date: 2010-05-16 18:23:08


Hi Brice,

thanks a lot for the clarification!

I got access to a 64-core system and you are indeed right! There is, however,
an issue: taskset does not support the 0x80000000,0x0 format.

taskset 0x80000000,0x0 sleep 100
failed to parse CPU mask 0x80000000,0x0

However,
taskset 0x8000000000000000 sleep 100
works fine:-)
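For now, a shell one-liner along these lines should do the conversion (just a
sketch, assuming every chunk in hwloc's cpuset string is a 32-bit hex word
with the most significant chunk first):

# Strip the 0x prefixes and the commas, pad each 32-bit chunk to 8 hex
# digits (space-pad with awk, then turn the padding into zeros), and
# concatenate the chunks into a single taskset-style mask.
hwloc_mask="0x80000000,0x0"
taskset_mask="0x$(echo "$hwloc_mask" | tr ',' '\n' | sed 's/^0x//' \
    | awk '{printf "%8s", $0}' | tr ' ' '0')"
taskset "$taskset_mask" sleep 100    # same as taskset 0x8000000000000000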

Can I suggest an enhancement to hwloc to support the taskset format? taskset
is currently the standard utility for setting CPU affinity. Some colleagues of
mine don't want to switch to hwloc-bind yet, so supporting the taskset format
would be great.
You can certainly work around it with

hwloc-calc --proclist --physical 0x80000000,0x0

but that makes things unnecessarily complex.
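For the record, the full workaround would look something like this (a sketch;
taskset's -c option takes the comma-separated list of processor numbers that
--proclist prints):

# Bind via taskset, using hwloc-calc to translate the cpuset string
taskset -c "$(hwloc-calc --proclist --physical 0x80000000,0x0)" sleep 100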

Could you either add a new option, --cpuset-taskset-compatible, or perhaps
change the --cpuset output from 0x80000000,0x0 to 0x8000000000000000?

Please let me know your opinion.

=====================================
hwloc-1.0rc5.tar.bz2 used for testing
hwloc-ls --merge --cpuset
PU p#63 cpuset=0x80000000,0x0

hwloc-bind 0x80000000,0x0 sleep 1000

hwloc-ls --top
PU #63 (phys=63) + 12147 sleep

taskset -p 12147
pid 12147's current affinity mask: 8000000000000000
======================================
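As an extra cross-check, the kernel itself reports the affinity mask in /proc
(the number of 32-bit words in Cpus_allowed depends on the kernel's configured
CPU limit, so take the exact output below as indicative only):

# On this machine I would expect something like
#   Cpus_allowed:   80000000,00000000
grep Cpus_allowed /proc/12147/status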

Thanks a lot!
Jirka

On Sunday 16 May 2010 09:44:15 pm Brice Goglin wrote:
> No, there is no such limit. If you have 128 cores, the cpuset string will
> be 0xffffffff,0xffffffff,0xffffffff,0xffffffff
>
> As long as you have less than 1024 cores, everything should work fine.
> For more than 1024, you'll need to rebuild with a manual change in the
> source code, or wait for hwloc 1.1.
>
> Brice
>
> On 14/05/2010 23:51, Jirka Hladky wrote:
> > Thanks Samuel!!
> >
> > The data looks fine. hwloc rocks.
> >
> > I assume the --cpuset option (of the lstopo command) is not supported on
> > such systems, right?
> >
> > My understanding is that cpuset masks work only up to 64 cores. Is that
> > correct?
> >
> > Thanks
> > Jirka
> >
> > On Friday 14 May 2010 08:06:12 pm Samuel Thibault wrote:
> >> Jeff Squyres, on Fri 14 May 2010 09:09:44 -0400, wrote:
> >>> I believe that Brice / Samuel (the two main developers) have tested
> >>> hwloc on an old Altix 4700 with 256 Itanium cores.
> >>>
> >>> I don't have their exact results, and I don't see them on IM right now,
> >>> so I don't know if they're around today or not...
> >>
> >> It was tested on a 256-core Itanium machine, see
> >> tests/linux/256ia64-64n2s2c.tar.gz.output
> >>
> >> Samuel