chriswarbo 2 days ago

Looks very interesting. I played around with (dependently) typed combinatory logic a while ago, using a system called "illative combinatory logic" http://www.chriswarbo.net/blog/2012-12-01-typed_combinators.... and I agree that it's hard to actually use ;)

I was approaching typed combinator expressions as a target for AI systems (this predated transformers; I was thinking more like genetic programming, inductive programming, etc.), but in order to trust the results it would need to be free from paradoxes (like Type : Type).

Yusefmosiah 2 days ago

I wonder if combinators could be useful for neurosymbolic AI, either in the backward pass (e.g., training models on synthetic data) or the forward pass (e.g., iterative code generation with evolutionary algorithms).

Combinators feel alien, making even Haskell or APL seem intuitive, but maybe that's because they don't align with human working memory. Language models, with their massive context windows, handle long-range dependencies in sequences well, even if their understanding is shallower in some ways. Could combinators, with their compositional and deductive nature, be a better fit for machines than humans?

For example, instead of generating Python functions in an evolutionary approach[0], could we use combinators as the building blocks? They're compact, formal, and inherently step-by-step, which might make them ideal for tasks requiring structured reasoning and generalization. What do you think?
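To make "combinators as building blocks" concrete, here is a minimal sketch of an untyped SKI-combinator evaluator (terms as nested tuples; the representation and names are illustrative assumptions, not from any of the systems discussed here):

```python
# Minimal untyped SKI-combinator evaluator (illustrative sketch).
# Terms are "S", "K", "I", or a 2-tuple (f, x) meaning application f x.

def step(t):
    """Perform one leftmost reduction step, or return None if t is normal."""
    if isinstance(t, tuple):
        f, x = t
        if f == "I":                                   # I x -> x
            return x
        if isinstance(f, tuple) and f[0] == "K":       # K a b -> a
            return f[1]
        if (isinstance(f, tuple) and isinstance(f[0], tuple)
                and f[0][0] == "S"):                   # S a b c -> (a c) (b c)
            a, b, c = f[0][1], f[1], x
            return ((a, c), (b, c))
        f2 = step(f)                                   # otherwise reduce inside
        if f2 is not None:
            return (f2, x)
        x2 = step(x)
        if x2 is not None:
            return (f, x2)
    return None

def normalize(t, limit=1000):
    """Reduce until normal form or a step limit (SK terms may diverge)."""
    for _ in range(limit):
        t2 = step(t)
        if t2 is None:
            return t
        t = t2
    return t

# S K K behaves like the identity:  ((S K) K) x  reduces to  x
assert normalize(((("S", "K"), "K"), "x")) == "x"
```

Three opcodes plus application already give Turing completeness, which is part of why they look attractive as a tiny, well-formed-by-construction search space for program induction.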

[0]: https://jeremyberman.substack.com/p/how-i-got-a-record-536-o...

  • ocean_moist 2 days ago

    Yeah, training data is the major issue. You could programmatically brute-force/generate random well-formed (compilable) programs and train on those, then heuristically pick the "exogenous" programs out of the trained model's output to train a new "generation". The hard part is whether the algorithm can actually generate such "exogenous" programs, and whether a heuristic can reliably pick them out.
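    A toy sketch of that generate-and-filter loop, assuming SKI terms as the program representation (the "heuristic" here is just a size check standing in for a real fitness function; all names are illustrative):

```python
import random

# Toy generate-and-filter loop over random SKI terms (illustrative only).
# Every term built this way is well-formed by construction, which is the
# combinator-language analogue of "compilable".

def random_term(depth):
    """Build a random well-formed SKI term as nested tuples."""
    if depth == 0 or random.random() < 0.4:
        return random.choice(["S", "K", "I"])
    return (random_term(depth - 1), random_term(depth - 1))

def size(t):
    return 1 if isinstance(t, str) else size(t[0]) + size(t[1])

def keep(t):
    """Stand-in heuristic: prefer small-but-not-trivial terms.
    A real system would score behaviour, not shape."""
    return 3 <= size(t) <= 15

random.seed(0)
generation = [random_term(4) for _ in range(100)]
survivors = [t for t in generation if keep(t)]
# survivors would seed the next "generation" in the scheme described above
```

    In a real pipeline the survivors would come from a trained model's samples rather than a uniform generator, but the well-formed-by-construction property is the same.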

  • Vecr 2 days ago

    There's probably not enough training data. People use Python with LLMs because it's what works, not because it's the best language. Anything rarer starts having issues.

munchler 2 days ago

> I also cannot write this language, like I am not close to grokking it.

The hallmark of any good esoteric language. :)

4ad 2 days ago

Related: https://treecalcul.us

https://github.com/barry-jay-personal/tree-calculus/blob/mas...

TBH, I am not sure I understand how this is different from Tree Calculus. Is it just the addition of dependent types?

  • jbhn 2 days ago

    Reflection in Tree Calculus works differently from the Lisp quote/unquote used here. It's just a hunch, but I think Tree Calculus can implement this (if it is sound; see book pg 64 ff on quote), but not vice versa.

    As far as I know there is no Tree Calculus with (dependent) types, because types in Tree Calculus work differently from mainstream type theory: you internalize the type checker using reflection (see book pg 58), a bit like I did here with Scheme: https://github.com/JanBessai/tcscheme

anonzzzies 2 days ago

I like these experiments. I did many of my own over Christmas, and some I will keep working on. It's just fun playing with PL theory and concrete implementations.