127 char limit in get-url

nigelbrown
Posts: 429
Joined: Tue Nov 11, 2003 2:11 am
Location: Brisbane, Australia


Post by nigelbrown »

I'm porting some Amazon Web Services code from their Perl examples to newLISP (of course). To retrieve the info from Amazon, a 206-character URL is generated (below is the string as formed by newLISP, with my SubscriptionId X'ed out), viz:
> (length "http://webservices.amazon.com/onca/xml? ... dition=All")
206
Trying get-url on it gives:
> (get-url "http://webservices.amazon.com/onca/xml? ... dition=All")
"ERR: bad formed URL"
>
I see in nl-web.c the code:
int parseUrl(char* url, char* host, int* port, char* path)
{
    char* colonPtr;
    char* slashPtr;
    int len;

    /* trim trailing whitespace like '\r\n' from url */
    if((len = strlen(url)) > 127) return(FALSE);
    while(*(url + len) <= ' ' && len > 0)
    {
        *(url + len) = 0;
        len--;
    }
    ...
and
/* parse URL for parameters */
if(parseUrl(url, host, &port, path) == FALSE)
    return(stuffString(ERROR_BAD_URL));
...
So I suspect the long URL is the problem.
I've not looked further into what would be required to accept longer URLs, but could this be done easily (in which case I could try coding it)? Or would it need string-buffer structures etc., or different system calls?
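To illustrate the sort of change I have in mind, here is a rough sketch of parsing into heap-allocated host and path buffers instead of fixed-size ones. This is not newLISP's actual code: parseUrlDynamic and everything in main are made up for illustration, and the real parser doubtless handles more cases (other schemes, trimming, proxies).

#include <stdio.h>
#include <stdlib.h>
#include <string.h>

#define FALSE 0
#define TRUE  1

/* Parse http://host[:port][/path?query] with no length limit by
   allocating host and path on the heap; caller frees both. */
int parseUrlDynamic(const char* url, char** host, int* port, char** path)
{
    const char* hostStart;
    const char* slashPtr;
    char* colonPtr;
    size_t hostLen;

    *port = 80;

    /* only the http:// form is handled in this sketch */
    if(strncmp(url, "http://", 7) != 0) return(FALSE);
    hostStart = url + 7;

    /* host part ends at the first '/', or at the end of the string */
    slashPtr = strchr(hostStart, '/');
    hostLen = slashPtr ? (size_t)(slashPtr - hostStart) : strlen(hostStart);
    if(hostLen == 0) return(FALSE);

    *host = malloc(hostLen + 1);
    if(*host == NULL) return(FALSE);
    memcpy(*host, hostStart, hostLen);
    (*host)[hostLen] = 0;

    /* optional :port inside the host part */
    colonPtr = strchr(*host, ':');
    if(colonPtr != NULL)
    {
        *port = atoi(colonPtr + 1);
        *colonPtr = 0;
    }

    /* everything from the first '/' on is path plus query; default "/" */
    *path = strdup(slashPtr ? slashPtr : "/");
    if(*path == NULL) { free(*host); return(FALSE); }

    return(TRUE);
}

int main(void)
{
    char* host;
    char* path;
    int port;
    /* stand-in for the 206-char Amazon URL, SubscriptionId X'ed out */
    const char* url = "http://webservices.amazon.com/onca/xml?SubscriptionId=XXXX&Operation=ItemSearch";

    if(parseUrlDynamic(url, &host, &port, &path))
    {
        printf("host=%s port=%d path=%s\n", host, port, path);
        free(host);
        free(path);
    }
    return(0);
}

Whether something like that would drop straight into nl-web.c I don't know, so treat it as a starting point only.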

Regards
Nigel

Lutz
Posts: 5289
Joined: Thu Sep 26, 2002 4:45 pm
Location: Pasadena, California

Post by Lutz »

Yes, there is a limitation of 127 chars per URL+query; I will post a corrected version later today.
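In essence the fixed length test in parseUrl has to go, with the host and path buffers then sized to match. Purely as a sketch (illustrative names only, not the final nl-web.c code), the trimming logic itself works fine without the cap:

#include <stdio.h>
#include <string.h>

#define TRUE 1

/* Sketch only: the test 'if(len > 127) return(FALSE);' is simply
   dropped; the trailing-whitespace trim stays as before. */
int trimUrl(char* url)
{
    int len = strlen(url);

    /* trim trailing whitespace like '\r\n' from url */
    while(len > 0 && *(url + len) <= ' ')
    {
        *(url + len) = 0;
        len--;
    }
    return(TRUE);
}

int main(void)
{
    char url[] = "http://webservices.amazon.com/onca/xml?Foo=Bar\r\n";
    trimUrl(url);
    printf("'%s'\n", url);   /* prints the URL without the trailing CR/LF */
    return(0);
}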

Lutz

nigelbrown
Posts: 429
Joined: Tue Nov 11, 2003 2:11 am
Location: Brisbane, Australia

Post by nigelbrown »

Thanks Lutz, and Happy New Year (to all).
I'll post some code when I have something useful.
It will hopefully be a CGI or Tk/Tcl-interfaced program to maintain a personal library "database", with ISBNs being converted to full entries.
Regards
Nigel
