get-url -> ERR: HTTP document empty
Posted: Tue Mar 26, 2013 10:52 pm
I'm using some code in a program to get the links of a website, including subpages. But it's not working: I often get "ERR: HTTP document empty". I put an until loop into the code so that it retries several times, waiting a few minutes between attempts. My first thought was that I'm blocked by the server, but that doesn't seem to be the case: if I open a newLISP shell and write (get-url url), I get the site, even though I still have the same IP.
Code:
(define (pick-links url)
  (setq page (get-url url))
  (println (1 20 page))                 ; testing: print part of the response
  (write-file "page" page)              ; also testing: dump the raw page
  ;; retry every 10 minutes while get-url reports an empty document
  (while (starts-with page "ERR: HTTP document empty")
    (sleep 600000)
    (setq page (get-url url)))
  ;; collect all anchor tags, one per line
  (setq linklist (join (find-all "<a href=([^>]+)>([^>]*)</a>" page) "<br>\n"))
  (setq linklist (replace {"} linklist "`")) ;"
  ;; split into a list of individual links
  (setq parsedlist (parse linklist "\n"))
  (setq page nil))
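
One thing worth trying: some servers answer with an empty document when the request carries no browser-like User-Agent header. get-url accepts an optional timeout and a custom header string (each header line terminated by \r\n), so a retry helper along these lines might work. This is only a sketch; the User-Agent value, the 10-second timeout, and the retry limits are guesses, not something I've verified against your server:

Code:
; sketch: retry get-url with a timeout and a browser-like User-Agent
; (header value and retry limits are just guesses)
(define (fetch-page url (max-tries 5))
  (let (page "ERR:" tries 0)
    (while (and (starts-with page "ERR:") (< tries max-tries))
      (setq page (get-url url 10000 "User-Agent: Mozilla/5.0\r\n"))
      (inc tries)
      (when (starts-with page "ERR:")
        (sleep 60000)))                 ; wait a minute before the next attempt
    page))                              ; returns the page, or the last ERR: string

If I read the manual right, get-url also takes a "debug" option, as in (get-url url "debug"), which prints the outgoing request on the console and makes it easier to compare with what a browser sends.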