Can you change get-url and related functions so that, instead of returning a string containing "ERR: ..." on an error, they throw an error that can be caught, similar to what is done with invalid list references?
As it stands right now, I came across two different errors last night when running a site scraper, and I don't feel comfortable checking the output for a specific error string: 1) what if the file I'm fetching happens to contain that error string? 2) I don't know in advance what all the possible error strings are. I came across two last night ("connect failed" and "server refused connection"). How many more are there? What if a hostname lookup fails?
Feature request for 10.0.6, error exceptions for get-url
-
- Posts: 608
- Joined: Mon Feb 05, 2007 1:04 am
- Location: Abbotsford, BC
- Contact:
newLISP distinguishes between errors caused by programming mistakes (wrong syntax or parameter choice) and soft errors caused by network conditions, timeouts, etc. Only the first kind throws catchable errors; the other kind is detected using either (net-error) or (sys-error).
Another advantage is that you still get meaningful output from 'get-url' without any error handling, similar to a web browser displaying a 404 page.
Code: Select all
> (net-error)
nil
> (get-url "http://blahblah")
"ERR: could not connect"
> (net-error)
(2 "ERR: Host name not known")
>
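In the meantime, a small wrapper can turn these soft errors into catchable ones. This is a sketch, assuming 'net-error' is set by the failing 'get-url' call as in the session above; the helper name 'get-url-or-throw' is made up for illustration:

Code: Select all
; sketch: make soft get-url errors catchable by rethrowing them
; as user errors via 'throw-error'
; note: net-error reports the last network error, so check it
; immediately after the call
(define (get-url-or-throw url)
    (let (result (get-url url))
        (if (net-error)
            (throw-error (net-error))
            result)))

; caller side: 'catch' with a symbol traps the thrown error
; (if (catch (get-url-or-throw "http://blahblah") 'page)
;     (println page)             ; success: page holds the body
;     (println "failed: " page)) ; failure: page holds the error

This keeps the normal return value untouched on success, while giving the site-scraper a single catch point instead of string matching on "ERR: ...".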