127 char limit in get-url

I'm porting some Amazon Web Services code from their Perl examples to newLISP (of course). To retrieve the info from Amazon, a 206-character URL is generated (below is the string formed from newLISP, with my SubscriptionId X'ed out), viz:
> (length "http://webservices.amazon.com/onca/xml? ... dition=All")
206
Trying to get the URL gives:
> (get-url "http://webservices.amazon.com/onca/xml? ... dition=All")
"ERR: bad formed URL"
>
I see in nl-web.c the code:
int parseUrl(char* url, char* host, int* port, char* path)
{
    char* colonPtr;
    char* slashPtr;
    int len;

    /* trim trailing whitespace like '\r\n' from url */
    if((len = strlen(url)) > 127) return(FALSE);
    while(*(url + len) <= ' ' && len > 0)
    {
        *(url + len) = 0;
        len--;
    }
...
and
/* parse URL for parameters */
if(parseUrl(url, host, &port, path) == FALSE)
    return(stuffString(ERROR_BAD_URL));
...
So I suspect the long URL is the problem.
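A quick (untested) way to confirm the suspicion, using example.com as a stand-in host: pad any URL past 127 characters and the same error should come straight back, since parseUrl rejects it before any connection is attempted:

> (get-url (append "http://example.com/?q=" (dup "a" 120)))
"ERR: bad formed URL"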
I've not looked further into what would be required to accept longer URLs, but could this be done easily (in which case I could try coding it)? Or would it need string buffer structures, different system calls, etc.?
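If it's just a matter of raising the limit, something like this sketch against nl-web.c is what I had in mind (untested; MAX_URL is my own name, not one from nl-web.c, and the host and path buffers in the callers would need to be enlarged to match, so the copies inside parseUrl can't overflow them):

#define MAX_URL 1024    /* assumed new limit, replacing the hard-coded 127 */

int parseUrl(char* url, char* host, int* port, char* path)
{
    int len;

    /* reject anything longer than the enlarged limit */
    if((len = strlen(url)) > MAX_URL) return(FALSE);

    /* trim trailing whitespace like '\r\n' from url */
    while(*(url + len) <= ' ' && len > 0)
    {
        *(url + len) = 0;
        len--;
    }

    /* ... rest of the parsing unchanged ... */
    return(TRUE);
}

with the callers then declaring, e.g., char host[MAX_URL + 1]; char path[MAX_URL + 1]; instead of the current fixed sizes.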
Regards
Nigel