lazy iteration over strings in memory???
Posted: Tue Jun 30, 2009 4:54 am
Programming language operators are as a rule very inconsistent. Some return meaningful values, others exist for "side effects"; some are greedy, others are lazy or can be made lazy, and so on.
Functionally aware languages, such as Lisps and newLISP in particular, are great: they have nice uniformity.
However, there is one task I do not seem to be able to solve easily: lazy iteration over strings.
That is, for a great number of tasks I want the ability to do with strings in memory exactly what I can do with newLISP's "search" and files: lazy processing.
"search" gives me a result and advances the offset in the file (which I can read and use later, if I need it). The next invocation of the same search, say in a loop, gives me the second match, and I can pick up its offset.
The file can be arbitrarily large, but the operator is not "greedy": it does not try to process all of it. It is "lazy"; it yields a result only when asked to.
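To make the file-side behaviour concrete, here is a sketch in Python (just an illustration, not newLISP; `search_stream` is a made-up helper that mimics what newLISP's "search" does with a file handle: scan forward from the current position, return the match offset, and leave the file pointer just past the match):

```python
import io

def search_stream(f, pattern, chunk_size=4096):
    """Lazily search a file-like object for a (non-empty) pattern,
    starting at the current file position.  Returns the absolute
    offset of the next match and leaves the file positioned just
    past it, or returns None at end of file.  Only one chunk is
    in memory at a time, so the file is never read in whole."""
    overlap = len(pattern) - 1
    buf = ""
    buf_start = f.tell()              # absolute offset of buf[0]
    while True:
        data = f.read(chunk_size)
        if not data:
            return None
        buf += data
        pos = buf.find(pattern)
        if pos != -1:
            abs_pos = buf_start + pos
            f.seek(abs_pos + len(pattern))   # leave pointer after the match
            return abs_pos
        # keep only the tail that could start a match spanning two chunks
        if overlap:
            buf_start += len(buf) - overlap
            buf = buf[-overlap:]
        else:
            buf_start += len(buf)
            buf = ""

f = io.StringIO("abracadabra")
print(search_stream(f, "ab"))   # 0
print(search_stream(f, "ab"))   # 7 -- the second call resumes past the first
```

Each call picks up where the previous one left off, which is exactly the lazy, on-demand behaviour described above.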
I do not seem to be able to do the same easily and efficiently with strings in memory. E.g. if I read that file (or a part of it) into memory, I do not know how to iterate over the string in such a way that a second invocation of the same matching operator yields the second match and advances an offset which I can pick up.
My attempts at constructing such behaviour from the existing newLISP primitives were clumsy and terribly inefficient (e.g. there is an operator that gives only the first match, but then truncating the string before searching again is very inefficient, etc.).
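For contrast, the behaviour I am after, again sketched in Python for illustration (`lazy_find_all` is a made-up name): keep an explicit integer offset and resume the search from it, so the string is never copied or truncated.

```python
def lazy_find_all(text, pattern):
    """Lazily yield the offset of each (non-overlapping) occurrence
    of pattern in text.  The string is never copied or truncated;
    only an integer offset advances between calls."""
    offset = 0
    while True:
        pos = text.find(pattern, offset)   # search resumes at offset
        if pos == -1:
            return
        yield pos
        offset = pos + len(pattern)        # advance past this match

matches = lazy_find_all("abracadabra", "ab")
print(next(matches))   # 0 -- first match, produced on demand
print(next(matches))   # 7 -- second match; the offset was remembered
```

Each `next` does only the work for one match, which is the string-in-memory analogue of the file behaviour above.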
Am I missing something? Is there a way to do efficient, lazy, file-like processing of largish strings in memory?