I don't really mind Scheme's monomorphic nature and chose to adhere to it, leaving a generic transduce form for whenever there is a good way to implement it portably. There is really nothing stopping anyone from extending SRFI-171 with a generic transduce form (and some more transducers that I left out because I didn't really think them through). Such a project has my blessing, and I would be happy to mark the SRFI as superseded.
A vector-reduce form would be trivial but icky, and I chose not to do it to avoid having the continuation safety discussion. I have an idea for thread- and continuation-safe transducers with immutable and visible state, but the first PoC was pretty slow. (I am going to say it... I miss C++ move semantics. Ouch)
Anyway, if I read things correctly, the complaint that srfi-171 has delete dupes and delete neighbor dupes forgets that transducers are not always used to or from a data structure. They are oblivious to context. That is why both are necessary.
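To illustrate with SRFI-171's own names (a sketch, assuming (srfi 171) is imported):

```
;; tdelete-neighbor-duplicates only remembers the previous element,
;; so it works on unbounded streams; tdelete-duplicates has to
;; remember every element it has ever seen.
(list-transduce (tdelete-neighbor-duplicates) rcons '(1 1 2 2 1))
;; => (1 2 1)
(list-transduce (tdelete-duplicates) rcons '(1 1 2 2 1))
;; => (1 2)
```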
The SRFI document was written for someone who already knows what a transducer is, and specifies an API that implementers are to follow. I did not intend for it to be user documentation. User documentation is severely lacking. I was hoping for it to make it into r7rs-large (hubris. I know) and then I would make some kind of push to document it better. As it is now I have very little computer time.
Regarding why transducers are faster, I am still pretty certain it has to do with mutation and boxing. Looking at the assembly generated by srfi-171 in Chez, I don't really see that much aggressive inlining - and I don't think Chez would fare much worse with srfi-158. Generators and accumulators use set! everywhere, meaning Chez (and Guile) doesn't really try to keep the values unboxed or typed. That incurs quite a slowdown. It does use more state, though.
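For comparison, here is roughly what a SRFI-158-style list generator looks like; a sketch, not the SRFI's actual implementation:

```
;; The mutable binding updated by set! is what forces the compiler
;; to box the state rather than keep it in a register.
(define (list->generator lst)
  (lambda ()
    (if (null? lst)
        (eof-object)
        (let ((x (car lst)))
          (set! lst (cdr lst))
          x))))
```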
Sorry about the messy response. Typing this while walking home.
In short: his library looks fine. Use it. From what I can see the only differences are ordering of clauses to make the transduce form generic and naming conventions. His library shadows a bunch of bindings in a non-compatible way. The transduce form is still not generic but moves the list-, vector-, generator- part of transduce into a "folder". Which is fine. But a generic dispatch would be nicer.
Ask me anything I guess.
ThatGeoGuy 148 days ago [-]
Author of the post here, hi! Funny to see this resurface again, I have made a number of changes to the transducers library since this blog post (see: https://wiki.call-cc.org/eggref/5/transducers).
> A vector-reduce form would be trivial but icky, and I chose not to do it to avoid having the continuation safety discussion.
I am not sure what "continuation safety" refers to in this context but I wanted a library that would give me a good out-of-the-box experience and have support for commonly used Scheme types. I have not yet added any folders/collectors/transducers specific to some types (like anything related to streams or SRFI-69), but I think a broad swath of types and patterns are currently covered.
I think in particular my gripes regarding vectors were that collectors such as `collect-vector`, `collect-u8vector`, etc. were not implemented. There is a chance to break out of these collectors using continuations, but that's not really a good argument for not having them (I hope this is not what you're referring to!).
> Anyway, if I read things correctly, the complaint that srfi-171 has delete dupes and delete neighbor dupes forgets that transducers are not always used to or from a data structure. They are oblivious to context. That is why both are necessary.
I think this is exactly my argument: they are oblivious to context and actually do the wrong thing by default. I've seen this happen in Rust with users preferring `dedup` or `dedup_by` (from the Itertools crate) rather than just constructing a HashSet or BTreeSet. It almost always is used as a shortcut to save on a data structure, and time and again I've seen it break workflows because it requires that the chain of items is first sorted.
I think this is particularly damning for a library that means to be general purpose. If users want to implement this themselves and maintain it within their own code-bases, they're certainly welcome to; however, I don't personally think making this kind of deduping "easy" helps folks in the general sense. You'd be better off collecting into a set or bag of some kind, and then transducing a second time.
> From what I can see the only differences are ordering of clauses to make the transduce form generic and naming conventions. His library shadows a bunch of bindings in a non-compatible way. The transduce form is still not generic but moves the list-, vector-, generator- part of transduce into a "folder". Which is fine. But a generic dispatch would be nicer.
Shadowing bindings in a "non-compatible" way can be bad, but it also helps to make programs cleaner. If you're using transducers across your codebase, you almost certainly aren't also using e.g. SRFI-1's filter.
As for generic dispatch: I agree wholeheartedly. I wish we had something like Clojure protocols that didn't suck. I've looked into ways to (ab)use variant records for this sort of thing, but you run into an open/closed problem on extending the API. This is really something that needs to be solved at the language level and something like COOPS / GOOPS incurs a degree of both conceptual and actual performance overhead that makes them somewhat unsatisfying :(
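For what it's worth, the naive closed-world version is easy to write; a sketch over the SRFI-171 entry points, where the cond is exactly the open/closed problem in question:

```
;; Works until someone defines a new collection type: the cond
;; cannot be extended without editing this function.
(define (transduce xform f init coll)
  (cond ((list? coll)   (list-transduce xform f init coll))
        ((vector? coll) (vector-transduce xform f init coll))
        ((string? coll) (string-transduce xform f init coll))
        (else (error "transduce: unknown collection type" coll))))
```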
And also: thank you for SRFI-171. I disagree with some of the design decisions but had it not been written I probably wouldn't have even considered transducers as something worth having.
I don't have much time to read through all this now but I'll check later, looks like great write-ups!
ThatGeoGuy 147 days ago [-]
Thanks for the heads up! Seems my excerpt urls had some errors!
bjoli 147 days ago [-]
> It almost always is used as a shortcut to save on a data structure, and time and again I've seen it break workflows because it requires that the chain of items is first sorted.
I am not sure I understand. I almost never use transducers to create data structures. I use them as a way to create general processing steps. The standard example is how they are used in clojure's channels. In such a context you need both dedup and dedup-neighbors.
To be frank, I don't really care much for the *-transduce functions. I think a general-purpose looping facility is almost always a better choice. For those things I use https://git.sr.ht/~bjoli/goof-loop which is always going to be faster than transducers unless you have a very, very smart compiler (or perhaps a tracing JIT).
I think that transducers need to be integrated into the standard library to really make sense, so that you can, for example, pass them to a port constructor.
Anyway, your library looks much more complete, and pretty similar to the SRFI. The differences are mostly cosmetic.
taeric 148 days ago [-]
Fun article. I'm somewhat still in the camp of loving Common Lisp's LOOP over many of the newer tools for looping over things. Articles like this do a good job of shining light on a lot of the concerns.
Quick nit/question. For the fold method, I don't think I've seen it called sentinel value. Usually it is seed or initial?
Now, my main question. Transducer? I'm curious about the etymology of that word. By itself, I don't think I could ever guess what it was referencing. :(
zappacino 148 days ago [-]
AFAIK, the term was popularized by Rich Hickey in Clojure's implementation [1]. His talk introducing the concept goes into the etymology specifically. If I remember correctly, it's something like "to carry across."
Transducers are not loops. You can create transducers and pass them as arguments to functions, which can in turn prepend or append them.
They are composable algorithmic transformations.
[1] https://clojure.org/reference/transducers
for example: https://www.cs.cmu.edu/Groups/AI/html/cltl/clm/node352.html#...
drcode 148 days ago [-]
composable algorithmic transformation in the streets
but mostly an alternative to LOOP in the sheets
bjoli 148 days ago [-]
Sure. But in that case it is somewhere between map(car)-and-friends and LOOP.
Which is a situation where they add very little. Being able to use them as an intermediate step wherever data flows is probably the only place I use them myself: in channels, in file-readers, etc. In places where you really need speed you should of course reach for whatever loop construct you prefer.
RhysU 148 days ago [-]
> composable algorithmic transformations
This phrasing always feels pointlessly fancy. I may be understeeped in the right lore.
What is a non-composable, non-algorithmic transformation? A composable, non-algorithmic transformation? A non-composable, algorithmic transformation?
Maybe there's some constraint on the domain vs the range buried in "transformation"?
bjoli 147 days ago [-]
There are lots of examples of non-composable transformations. Two functions are not necessarily composable. Call/cc does not compose with the current continuation, whereas delimited continuations do.
Algorithmic transformations are algorithmic. The result is well-defined.
It is a succinct way of saying that a transducer is a well-defined (in terms of input and output) transformation that can be combined with other transducers.
I didn't make up the lingo.
Zambyte 148 days ago [-]
You are correct in your question: SRFI 1[0] describes the knil argument as the "seed" or fold state (the latter for a recursive implementation). A sentinel value usually refers to a final value.
Regarding the etymology: transform + reduce = transduce
[0] https://srfi.schemers.org/srfi-1/srfi-1.html#fold
> Quick nit/question. For the fold method, I don't think I've seen it called sentinel value. Usually it is seed or initial?
I think in Scheme it is common to call it knil, mirroring how lists use nil as the "sentinel" value which marks the end of a proper list. I opted to name it sentinel in that article (and in the docs) for two reasons:
1. Sentinel values are a common topic in many languages: https://en.wikipedia.org/wiki/Sentinel_value
2. Transducers necessarily abstract across a lot more than loops / lists. Lisps do a lot of really cool (and optimized) stuff with lists alone, and Scheme is no different in this regard. However, because Scheme primarily exports list operations in (scheme base), it is really easy to run into a situation where lists are used in an ad-hoc way where another data structure is more appropriate. This includes vectors, sets, mappings, hashmaps, etc. Transducers-the-library is meant to be general across operations that work on all of these types, so I chose language that intentionally departs from thinking in a list-focused way.
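For a concrete reference point, SRFI-1's fold and its seed (the knil argument); a minimal sketch assuming (srfi 1):

```
;; The second argument is the seed/knil: the initial state
;; threaded through the reduction.
(fold cons '() '(1 2 3))   ; => (3 2 1)
(fold + 0 '(1 2 3))        ; => 6
```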
> Now, my main question. Transducer? I'm curious about the etymology of that word. By itself, I don't think I could ever guess what it was referencing. :(
This is from Rich Hickey's presentation: https://www.youtube.com/watch?v=6mTbuzafcII
It's not a reducer, because transducers serve as higher-order functions that operate on reducers. Instead, the values they accept are transient through the function(s), so they transduce. You should watch the video; I think Rich explains the origins of his terminology very well.
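A hand-rolled tmap makes this concrete; a sketch following SRFI-171's three-arity reducer convention:

```
;; A transducer takes a reducer and returns a new reducer with the
;; same three arities: init, completion, and step.
(define (my-tmap f)
  (lambda (reducer)
    (case-lambda
      (() (reducer))                                  ; init
      ((result) (reducer result))                     ; completion
      ((result input) (reducer result (f input))))))  ; step
```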
darby_nine 148 days ago [-]
I can't be alone in thinking that LOOP usage produces nearly unreadable code. It's extremely difficult to figure out what the intended behavior is if you don't have decades of reading them under your belt.
taeric 148 days ago [-]
Meh. It is like any other construct in this regard. If you let it get complicated, it can be complicated. Most of the time, you should be able to keep it short.
Pointedly, though, it was hilarious to me how much Java began to look like typical LOOP constructs with Collectors.
darby_nine 147 days ago [-]
Why not make multiple forms, then? The readability would certainly be increased if the writer were able to move beyond "loop".
This smells more like for-loop insecurity than anything.
taeric 147 days ago [-]
I'm not sure I understand the question. If you think I was arguing against any new forms, that is a failing of my post. I am a fan of LOOP for places it works. I am also all for people exploring new things, and my intent on my post was to give credit in that direction.
0x3444ac53 148 days ago [-]
I'm a fan of the
```
(let [loop (fn [...] ...)]
  (loop ...))
```
lkuty 146 days ago [-]
It is Clojure, right? Does it imply a `recur` in the anonymous function? I don't really understand what the code will do, since I have only seen examples of `loop` with a condition. Why the `fn`?
It's just a way of simulating a loop with a recursive anonymous function within a let binding. As for the `fn`, it's because lately I've been programming quite a lot in Fennel[1], which borrows syntax from Clojure. Although it should be said that this also isn't valid Fennel code unless it's in a macro, but that's beside the point.
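In Scheme, the same shape is spelled directly with named let; a minimal sketch:

```
;; The binding `loop` names the recursion, just like the fn bound
;; in the let above.
(let loop ((i 0) (acc '()))
  (if (= i 5)
      (reverse acc)
      (loop (+ i 1) (cons i acc))))
;; => (0 1 2 3 4)
```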
the (let [loop (fn ...)]) is something I learned here[2]
[1] https://fennel-lang.org/ [2] http://www.phyast.pitt.edu/~micheles/scheme/scheme5.html#the...
> Transducers were an “invention” of Rich Hickey, the creator of Clojure. I put “invention” in quotes here because I think that a lot of the groundwork and ideas somewhat predate Rich Hickey in the literature that he references in the original video I linked.
The word that springs to mind for me is "distillation" - certainly in my head that gives Rich the significant level of credit he deserves for both naming and overall design while acknowledging all that came before. Whether it works in anybody else's head must inevitably be left as an exercise to the reader.
jiehong 148 days ago [-]
Nice to see Scheme being active. Is it really used much in companies?
While the author speaks about Rust iterators at first, transducers seem better in the end.
In Java 23, the introduction of gatherers seems to be an attempt at opening up the otherwise closed set of stream operations, a limitation transducers don't suffer from.
Rust iterators seem to also have this limited set of actions available (but some crates, like tap, seem to allow you to .pipe() a function).
veqq 148 days ago [-]
> Is it really used much in companies?
Cisco seems to use Chez for something mysterious; a video game company scripted in Scheme, stopped, and restarted again. I know a guy who does option trading in Racket, and I have some large valuation models in Racket (most of the system is in Common Lisp). I know a few guys who do vague data analysis and scripting in Scheme (especially Guile or Gerbil, for some reason) where their clients only care about the result. https://www.itasoftware.com/ is in Scheme. There are some dead webshops(?) like https://www.greghendershott.com/2018/05/extramaze-llc-using-... or https://defn.io/2019/08/20/racket-ecommerce/
That's about it.
Jtsummers 148 days ago [-]
ITA used Common Lisp. The game company you mentioned is probably Naughty Dog, who created "Game Oriented Assembly Lisp" (GOAL) for their games. GOAL was created in Common Lisp but was Scheme-like. They switched away after being acquired by Sony and needing to fit in better, though they brought it back later on.
https://en.wikipedia.org/wiki/Game_Oriented_Assembly_Lisp
Never try to fit in after being acquired. Be what they acquired.
bitwize 147 days ago [-]
Sony had a policy: games are implemented in C++, no ifs, ands, or buts. Once you're acquired, you play by the parent's rules.
That said, they did sneak Scheme (specifically, Racket) in the back door, by making their C++ engine data-driven and using Racket programs to generate and munge the data consumed by the engine.
pjmlp 146 days ago [-]
Naturally, a policy only added after the PlayStation 2, as the PlayStation 1 only did C and assembly, famously being the first devkit to offer C support.
Also, the SPUs on the PlayStation 3 could in theory be programmed in C, but in practice only assembly was usable for the goals of using them.
Finally, some Playstation Studios also use Unity with C#.
kazinator 147 days ago [-]
Given the policy, simply don't acquire stuff not written in C++, nor people who don't work in C++. Unless it's competition and you're trying to bury it or something. Otherwise it doesn't make sense to buy an operation and its staff, and then change their tooling and have them rework everything.
bitwize 146 days ago [-]
They were a hitmaker for Sony before the acquisition. The cost of a forced transition was perceived as worth it (and probably was; Uncharted and TLOU made more money than all their previous titles put together).
mst 147 days ago [-]
Curiosity: How did the mixture of Racket and Common Lisp come about?
(there are plenty of plausible guesses but I enjoy reading about specific examples if you have a moment to explain)
pjmlp 147 days ago [-]
SISCOG in Lisbon used to be a big Common Lisp shop in the early 2000s; not sure how much of it they still use 20 years later.
aag 147 days ago [-]
SRFI* Editor here. I enjoyed reading this post, and wish I had seen it sooner. I'm a big fan of transducers, and would like to see them used more in Scheme.
I want to respond to one comment in the post about the SRFI process:
> Unlike a normal library, the whole point of SRFIs is that once they’re finalized, that SRFI is locked in stone. You can’t just add or remove or deprecate functions. You can change the default implementation if there’s a bug or add a post-finalization note about how the implementation should behave if there’s ambiguity, but you can’t just start picking and ripping at an SRFI. This kind of attitude towards software / APIs is anathemic to change.
The idea behind SRFIs is that an author comes up with an idea, writes it down clearly, provides a sample implementation, gets feedback from the community, and tries to persuade Scheme implementations to adopt it. Because this is the goal, once a SRFI is published, it needs to be stable, not a moving target. That's why we treat it like a standard rather than a library. But there's no reason at all that new versions of a proposal can't be published as new SRFIs. In fact, some SRFIs, e.g. SRFI 231: Intervals and Generalized Arrays**, are the result of several rounds of refinement, each published as a separate SRFI. But if your Scheme implementation says that it supports SRFI N, you know what you're getting.
I would welcome a SRFI from Thatgeoguy to help bring his work from CHICKEN Scheme to other Scheme implementations. We've brought many ideas from one implementation to others this way.
* Scheme Requests for Implementation, https://srfi.schemers.org/
** https://srfi.schemers.org/srfi-231/srfi-231.html
My big takeaway is that Google has a Common Lisp Style Guide. Makes you wonder what they use Common Lisp for.
bitwize 147 days ago [-]
They acquired ITA Software in like 2012. ITA's QPX search engine is largely written in Common Lisp. Unlike Sony, q.v. in another comment thread, Google decided not to mess too much with a good thing and let them keep their Lisp codebase without following bitwize's corollary to Greenspun's Tenth Rule (every sufficiently complicated Common Lisp or Scheme program will eventually be rewritten in C++, Java, or Python).
YeGoblynQueenne 147 days ago [-]
So what's a "transducer" in this context, and does it have anything to do with Finite State Transducers?
https://en.wikipedia.org/wiki/Finite-state_transducer
Unified iterators help a lot with improving code quality. Most of the time when consuming sequences, I couldn't care less what kind of collection I'm dealing with.
I wonder if the author considered SRFI-42 at all? It looks, at a glance, similar.
https://github.com/codr7/sharpl#iterators
bjoli 148 days ago [-]
They are not similar at all, except that they might overlap. Transducers are functions that perform one-way transformations of data. Srfi-42 is a couple of loop macros.
I wrote both srfi-171 (mentioned in the post) and this looping macro: https://git.sr.ht/~bjoli/goof-loop which is probably the most powerful one Scheme has to offer (at least among those that generate fast code), until someone reimplements Olin's loops.
tonyg 148 days ago [-]
> They are not similar at all, except that they might overlap.
(Well, which is it?)
Perhaps I was misled by the intro to the article, which wished for something like Rust's iterators: generic, data-source-neutral map/filter/zip/etc. utilities, static dispatch, extensible. Srfi-42, Racket's for/* system, and your goof-loop all seem to fit the bill. Obviously the article ended up at a dynamically dispatched system, like -171, but at the start it didn't seem like it had to head in that direction.
> Srfi-42 is a couple of loop macros.
This is decidedly ungenerous, especially given its historical relationship to Barzilay's work and descendants.
bjoli 147 days ago [-]
> (Well, which is it?)
The more I have thought about these things, the less I see them as similar. They overlap because they are sold as an alternative to map, filter, etc. I think that is a pretty non-exciting use case that is better served by proper looping facilities.
What is interesting about transducers is that they can be the basis for a generic protocol for transformations. You can make a transducer and use it on a list, pass it to a port constructor to have it preprocess everything read from the port, pass it to an asynchronous channel, make it into a generator.
That is something not served by looping macros.
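A minimal sketch of that reuse, with SRFI-171 and SRFI-158 names assumed:

```
;; One transducer, several contexts (assumes (srfi 171) and (srfi 158)).
(define (compose2 f g) (lambda (x) (f (g x))))  ; plain function composition
(define xf (compose2 (tfilter odd?) (tmap (lambda (x) (* x x)))))

(list-transduce xf rcons '(1 2 3 4 5))                     ; => (1 9 25)
(generator-transduce xf rcons (make-range-generator 1 6))  ; => (1 9 25)
```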
> This is decidedly ungenerous
I did not mean to disparage srfi-42. It _is_ a couple of looping macros (ones that set the bar for their descendants), which is what sets it apart from the more exciting use cases of transducers. I think a language should have both.
Racket took srfi-42 and showed that it doesn't have to use mutation. The reference implementation of SRFI-42 uses set! quite a lot which tanks performance in many schemes.