Open MPI User's Mailing List Archives

Subject: Re: [OMPI users] Prototypes for Fortran MPI_ commands using 64-bit indexing
From: Jim Parker (jimparker96313_at_[hidden])
Date: 2013-10-30 19:00:56


Jeff,
  Here's what I know:
1. Checked FAQs. Done
2. Version 1.6.5
3. config.log file has been removed by the sysadmin...
4. ompi_info -a output from the head node is attached as headnode.out
5. N/A
6. Compute node info is attached as compute-x-yy.out
7. As discussed, local variables are being overwritten after calls to
MPI_RECV from Fortran code
8. ifconfig output from head node and computes listed as *-ifconfig.out

Cheers,
--Jim

On Wed, Oct 30, 2013 at 5:29 PM, Jeff Squyres (jsquyres) <jsquyres_at_[hidden]
> wrote:

> Can you send the information listed here:
>
> http://www.open-mpi.org/community/help/
>
>
> On Oct 30, 2013, at 6:22 PM, Jim Parker <jimparker96313_at_[hidden]> wrote:
>
> > Jeff and Ralph,
> > Ok, I downshifted to a helloWorld example (attached), bottom line
> after I hit the MPI_Recv call, my local variable (rank) gets borked.
> >
> > I have compiled with -m64 -fdefault-integer-8 and even have assigned
> kind=8 to the integers (which would be the preferred method in my case)
> >
> > Your help is appreciated.
> >
> > Cheers,
> > --Jim
> >
> >
> >
> > On Wed, Oct 30, 2013 at 4:49 PM, Jeff Squyres (jsquyres) <
> jsquyres_at_[hidden]> wrote:
> > On Oct 30, 2013, at 4:35 PM, Jim Parker <jimparker96313_at_[hidden]>
> wrote:
> >
> > > I have recently built a cluster that uses the 64-bit indexing
> feature of OpenMPI following the directions at
> > >
> http://wiki.chem.vu.nl/dirac/index.php/How_to_build_MPI_libraries_for_64-bit_integers
> >
> > That should be correct (i.e., passing -i8 in FFLAGS and FCFLAGS for OMPI
> 1.6.x).
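[For readers following along: the configure invocation the linked wiki page describes looks roughly like the sketch below. The install prefix and Intel compiler names are placeholders, not taken from this thread; gfortran users would pass -fdefault-integer-8 instead of -i8.]

```shell
# Sketch: building Open MPI 1.6.x with 8-byte default Fortran INTEGERs.
# Prefix path and compiler names are illustrative placeholders.
./configure --prefix=/opt/openmpi-1.6.5-i8 \
            FC=ifort F77=ifort \
            FFLAGS=-i8 FCFLAGS=-i8
make -j4 && make install
```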
> >
> > > My question is what are the new prototypes for the MPI calls ?
> > > specifically
> > > MPI_RECV
> > > MPI_Allgatherv
> >
> > They're the same as they've always been.
> >
> > The magic is that the -i8 flag tells the compiler "make all Fortran
> INTEGERs be 8 bytes, not (the default) 4." So Ralph's answer was correct
> in that all the MPI parameters are INTEGERs -- but you can tell the
> compiler that all INTEGERs are 8 bytes, not 4, and therefore get "large"
> integers.
> >
> > Note that this means that you need to compile your application with -i8,
> too. That will make *your* INTEGERs also be 8 bytes, and then you'll match
> what Open MPI is doing.
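[A sketch of what that application compile line might look like; the mpif90 wrapper name is assumed, and gfortran's equivalent of Intel's -i8 is -fdefault-integer-8.]

```shell
# Sketch: compile the application with the same integer-width flag used
# when Open MPI was built, so both sides agree INTEGER is 8 bytes.
mpif90 -i8 -o hello hello.f90                    # ifort spelling
# mpif90 -fdefault-integer-8 -o hello hello.f90  # gfortran spelling
```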
> >
> > > I'm curious because some of my local variables get killed (set to
> null) upon my first call to MPI_RECV. Typically, this is due (in Fortran)
> to someone not setting the 'status' variable to an appropriate array size.
> >
> > If you didn't compile your application with -i8, this could well be
> because your application is treating INTEGERs as 4 bytes, but OMPI is
> treating INTEGERs as 8 bytes. Nothing good can come from that.
> >
> > If you *did* compile your application with -i8 and you're seeing this
> kind of wonkiness, we should dig deeper and see what's going on.
> >
> > > My review of mpif.h and mpi.h seems to indicate that the functions are
> defined as C int types and therefore, I assume, the coercion during the
> compile makes the library support 64-bit indexing, i.e., int -> long int.
> >
> > FWIW: We actually define a type MPI_Fint; its actual type is determined
> by configure (int or long int, IIRC). When your Fortran code calls C, we
> use the MPI_Fint type for parameters, and so it will be either a 4 or 8
> byte integer type.
> >
> > --
> > Jeff Squyres
> > jsquyres_at_[hidden]
> > For corporate legal information go to:
> http://www.cisco.com/web/about/doing_business/legal/cri/
> >
> > _______________________________________________
> > users mailing list
> > users_at_[hidden]
> > http://www.open-mpi.org/mailman/listinfo.cgi/users
> >
> > <mpi-test-64bit.tar.bz2>
>
>
> --
> Jeff Squyres
> jsquyres_at_[hidden]
> For corporate legal information go to:
> http://www.cisco.com/web/about/doing_business/legal/cri/
>
>