why does an FFI function return nil instead of an integer?


Postby TedWalther » Sun Jun 21, 2015 12:10 am

Hi, I've been doing some FFI work. I have a C function that returns an integer, but from time to time the call returns nil instead. How can this happen, and what is the right way to deal with it?
Cavemen in bearskins invaded the ivory towers of Artificial Intelligence. Nine months later, they left with a baby named newLISP. The women of the ivory towers wept and wailed. "Abomination!" they cried.
TedWalther
 
Posts: 605
Joined: Mon Feb 05, 2007 1:04 am
Location: Abbotsford, BC

Re: why does an FFI function return nil instead of an integer

Postby ralph.ronnquist » Sun Jun 21, 2015 12:35 pm

"from time to time" sounds interesting :-)
How sure are you about the function?

Thanks, by the way; this pushed me towards learning new stuff, which has been fun. By the looks of it, only a small amount of stack corruption is needed for an FFI-imported function to return nil, because the FFI layer then misinterprets the return type when the function returns. At least, that looks to me like the first contender to explain this.

A bad declaration alone might possibly be enough, but I ran out of puff before seeing the light here.

And then there might be all sorts of other things still over my head.
ralph.ronnquist
 
Posts: 202
Joined: Mon Jun 02, 2014 1:40 am
Location: Melbourne, Australia

Re: why does an FFI function return nil instead of an integer

Postby TedWalther » Sun Jun 21, 2015 1:00 pm

Ralph, thanks for looking into this.

I fixed my problem by making sure no nils were being passed in. There is no situation where a nil should be passed to an FFI function; it just isn't in the spec. Perhaps this is the kind of error that would be useful for newLISP to catch and report. An FFI function returning nil isn't quite Heisenbug territory, but it is getting there.

Re: why does an FFI function return nil instead of an integer

Postby TedWalther » Sun Jun 21, 2015 1:09 pm

I've been meditating on how to port newLISP to Plan9. FFI just flat out isn't available on Plan9. The Python approach is to compile in whatever libraries you need at build time. So if newLISP follows that path, we'll need a little script that takes an FFI-style spec and turns it into a C header with appropriate wrappers, turning the desired functions into newLISP primitives. It might not be too hard; I might be able to whip something up in a couple of days.

I think it would need just a one-line change to primes.h and a one-line change to the generated makefiles to pull in the glue wrappers.

I also see a use for a more precise layer that spells out exactly the sizes of the native C types, in addition to being able to specify exact byte widths, as newLISP does now.

Currently, I find it a little confusing that newLISP reuses the names of C types even though they don't map 100% onto the underlying C types.

I'd be happy with something like U8, U16, U32, U64, etc., instead of byte, short, int, and long long. Then the platform can define byte, short, int, and long long however it wants, and that information is available to the newLISP programmer when he wants to map FFI calls to the underlying platform portably; if he wants exact byte widths, that is an option too.

