As an avid Prolog fan, I would have to agree with a lot of Mr. Wayne's comments! There are some things about the language that are now part of the ISO standard that are a bit unergonomic.
On the other hand, you don't have to write Prolog like that! The only shame is that there are 10x more examples (at least) of bad Prolog on the internet than good Prolog.
If you want to see some really beautiful stuff, check out The Power of Prolog[1] (which Mr. Wayne courteously links to in his article!)
If you are really wondering "why Prolog?", the thing that makes it special among all languages is metainterpretation. No, seriously, I would strongly recommend you check it out[2].
This is all that it takes to write a metainterpreter in Prolog:
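A minimal "vanilla" interpreter, sketched along the lines of the one on the acomip page linked below (note that clause/2 may require the interpreted predicates to be declared dynamic in some systems):

    mi(true).
    mi((A, B)) :-
        mi(A),
        mi(B).
    mi(Goal) :-
        Goal \= true,
        Goal \= (_, _),
        clause(Goal, Body),   % fetch a clause whose head unifies with Goal
        mi(Body).             % and prove its body the same way

mi(G) proves G just as the underlying Prolog engine would, and because the interpreter is itself ordinary Prolog, you can tweak it to change the search strategy, add tracing, and so on.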
Writing your own Prolog-like language in Prolog is nearly as fundamental as for-loops in other languages.
[1] https://www.youtube.com/@ThePowerOfProlog
https://www.metalevel.at/prolog
[2] https://www.youtube.com/watch?v=nmBkU-l1zyc
https://www.metalevel.at/acomip/
https://github.com/Seeker04/plwm
It actually has quite good UX affordances. More than that, however, I find the code eminently hackable, even as someone with very little Prolog experience. Reading through the plwm code really demystified the apparent gap between toy and practical Prolog for me. Heck, even the SWI-Prolog codebase itself is quite approachable!
I'm also mildly surprised at some of OG's gripes. A while back, I ran through Triska's The Power of Prolog[0], which crisply grounds Prolog's mental model and introduces standard conventions. In particular, it covers desugaring syntax into normal predicates, e.g. -/2 as pairs, [,]/2 as special syntax for ./2 cons cells, etc. Apparently, I just serendipitously stumbled into good pedagogical resources!
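To make the desugaring concrete, a couple of toplevel queries (roughly; the exact list representation is system-dependent, as noted):

    % -/2 is an ordinary compound term, conventionally used for pairs:
    % ?- Pair = a-1, Pair = Key-Value.
    % Pair = a-1, Key = a, Value = 1.

    % List notation is sugar for nested cons cells:
    % ?- X = [a|[b|[]]], X == [a, b].
    % X = [a, b].
    %
    % In ISO systems such as Scryer or GNU Prolog the cons functor is ./2;
    % SWI-Prolog 7+ uses '[|]'/2 instead.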
I'd be interested in ways that people bring logic programming concepts and techniques into non-LP languages.
[0]: https://www.metalevel.at/prolog
To me, it feels like a data description language that someone discovered could be tricked into performing computation.
>> I expect by this time tomorrow I'll have been Cunningham'd and there will be a 2000 word essay about how all of my gripes are either easily fixable by doing XYZ or how they are the best possible choice that Prolog could have made.
In that case I won't try to correct any of the author's misconceptions, but I'll advise anyone reading the article to not take anything the author says seriously because they are seriously confused and have no idea what they're talking about.
Sorry to be harsh, but it seems to me the author is trying their damnedest to misunderstand everything ever written about Prolog, and to instead apply entirely the wrong abstractions to it. I don't want to go into the weeds, since the author doesn't seem ready to appreciate that, but Prolog isn't Python, or Java, or even Picat, and to say e.g. that Prolog predicates "return true or false" is a strong hint that the author failed to read any of the many textbooks on Prolog programming, because they all make sure to drill into you the fact that Prolog predicates don't "return" anything because they're not functions. And btw, Prolog does have functions, but like I say, not going into the weeds.
Just stay away. Very misinformed article.
I guess we are supposed to pile on, so I'll add that the author should read "The Art of Prolog" (Sterling & Shapiro) and then "The Craft of Prolog" (O'Keefe).
And also "Prolog Programming for AI" by Bratko and "Programming in Prolog" by Clocksin and Mellish.
Although these days I'd recommend anyone interested in Prolog start at the deep end with "Foundations of Logic Programming" by John W. Lloyd, because I've learned the hard way that teaching Prolog as a mere programming language, without explaining the whole logic programming thing, fails.
In short, "Here are my gripes about Prolog, a language that I don't understand."
It's perfectly fine to not like Prolog, but I do feel that if you're going to write an article about why you don't like it, you should at least spend some time figuring it out first.
He says of the cut operator "This is necessary for optimization but can lead to invalid programs." Imagine if a programmer new to C++ said the same thing of the "break" keyword. That's how ridiculous it sounds. Yes, cut can be used to prune backtracking and eliminate unneeded work, but that's hardly its purpose. It leads to "invalid" programs (by which I assume he means programs that do something other than what he wants) only in cases where you are using it wrong. Cut is no more "necessary for optimization" than break is. It's a control structure that you don't understand.
Negation (\+) is confusing, and the author correctly provides examples where its meaning is unintuitive when applied to unbound variables. That's because it's not strictly speaking a negation predicate, but rather a "not provable" predicate. In that light, the examples in the article make perfect sense. Yes, Prolog is a programming language, so the order of goals matters, even if the order wouldn't matter in pure logic.
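For instance, with a hypothetical one-fact database, just to make the behaviour concrete:

    p(1).

    % ?- \+ p(X).          % fails: p(X) is provable (with X = 1), so "not provable" fails
    % ?- X = 2, \+ p(X).   % succeeds: once X is bound to 2, p(2) is not provable
    % ?- \+ p(X), X = 2.   % fails: same goals, different order, different result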
Look, Prolog is a weird language. It has a learning curve. It's not "just another language" in the Java, C++, Pascal, Python mold. I get it. But this article has the flavor of an impatient newbie getting frustrated because he can't be bothered to read the documentation.
> In short, "Here are my gripes about Prolog, a language that I don't understand."
> this article has the flavor of an impatient newbie getting frustrated because he can't be bothered to read the documentation.
The author has written about Prolog in a positive light before (and linked to it in the post), so I don't get the impression that this is all a case of "the author doesn't understand what they're doing".
Their first complaint, that "strings are not standardised, so code working with strings in SWI-Prolog is not compatible with Scryer Prolog", seems an appropriate thing to be unhappy about (unless the author is just wrong?).
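On the strings point, the difference is easy to check (a sketch, assuming default flags in both systems):

    % SWI-Prolog (double_quotes defaults to string):
    % ?- string("abc").
    % true.

    % Scryer Prolog (double_quotes defaults to chars):
    % ?- "abc" = [a, b, c].
    % true.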
Your response to their gripe about \+ being "not provable" instead of "negation" notes it's a subtle difference, and that Prolog differs from pure logic there.
The author even notes that doing due diligence, they found a solution to a complaint they had. This doesn't strike me as "can't be bothered to read the documentation".
>> Code logic is expressed entirely in rules, predicates which return true or false for certain values.
Open any Prolog programming textbook (Clocksin & Mellish, Bratko, Sterling & Shapiro, O'Keefe, anything) and the first thing you learn about Prolog is that "code logic" is expressed in facts and rules, and that Prolog predicates don't "return" anything.
The confusion only deepens after that. There are no boolean values? In an untyped language? Imagine that. true/0 and false/0 are not values? In a language where everything is a predicate? Imagine that. Then there's the complete lack of understanding that "=" is a unification operator, and that unification is not assignment. Like, really, it's not. It's not just a fancy way to pretend you don't do assignment while sneaking it in by the back door to be all smug and laugh at the noobs who aren't in the in-group. It's unification, and it doesn't work the way you think it should if you think it should work like assignment, because everything is immutable, so you really, really don't need assignment. Add to that a complete misunderstanding of the cut and its real dangers, a complete misunderstanding of Negation as Failure, a central concept in logic programming (including in ASP), and so on and so on and so on.
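For anyone following along, two toplevel queries show the difference:

    % Unification solves equations over terms; it works in both directions:
    % ?- f(X, b) = f(a, Y).
    % X = a, Y = b.

    % And a variable, once bound, cannot be "reassigned":
    % ?- X = 1, X = 2.
    % false.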
The author failed to do due diligence. And if they've written "in a positive light" about Prolog, I would prefer not to read it because I'll pull my remaining hair out, which is not much after reading this.
Is it your contention that the author doesn't understand that Prolog predicates don't "return" anything, and that they were expecting assignment rather than unification? I would read it again; their examples clearly demonstrate these points (noting that the author does say "return", but also clearly shows bidirectional examples).
Both you and GP have had some fairly strong responses to what looked like mild complaints, the kind I would expect anyone to have with a language they've used enough to find edges to.
The original example in the last section was this:
foo(A, B) :-
    \+ (A = B),
    A = 1,
    B = 2.
foo(1, 2) returns true, so you'd expect foo(A, B) to return A=1, B=2. But it returns false.
foo(A,B) fails because \+(A = B) fails, because A = B succeeds. That's because = is not an assignment but a unification, and in the query foo(A,B), A and B are variables, so they always unify.
In fact here I'm not sure whether the author expects = to work as an assignment or an equality. In \+(A = B) they seem to expect it to work as an equality, but in A = 1, B = 2, they seem to expect it to work as an assignment. It is neither.
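For what it's worth, the relation the author seems to want can be written by binding first and checking afterwards, or with dif/2 where available (e.g. SWI-Prolog, Scryer); a sketch with hypothetical foo2/foo3:

    % Bind first, then check: \+ now sees ground arguments.
    foo2(A, B) :-
        A = 1,
        B = 2,
        \+ (A = B).

    % Or state the disequality as a constraint that must hold from the start.
    foo3(A, B) :-
        dif(A, B),
        A = 1,
        B = 2.

    % ?- foo2(A, B).
    % A = 1, B = 2.
    % ?- foo3(A, B).
    % A = 1, B = 2.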
I appreciate unification is confusing and takes effort to get one's head around, but note that the author is selling a book titled LOGIC FOR PROGR∀MMERS (in small caps) so they should really try to understand what the damn heck this logic programming stuff is all about. The book is $30.
I'm wildly out of my depth here, but sometimes I find I learn quickly if I try out my intuition publicly and fail spectacularly :)
> "This is necessary for optimization but can lead to invalid programs."
Is this not the case? It feels right in my head, but I assume I'm missing something.
My understanding:
- Backtracking gets used to find other possible solutions
- Cut stops backtracking early, which means you might miss valid solutions
- Cut is often useful for pruning search branches that you know are a waste of time but Prolog doesn't
- But if you're wrong, you might cut a branch containing solutions you actually wanted, and if Prolog then enumerates the remaining solutions, I guess you could say it has produced an invalid solution/program?
Again, please be gentle. This sounded reasonable to me and I'm trying to understand why it wouldn't be. It's totally possible that it feels reasonable because it might be a common misconception I've seen other places. My understanding of how Prolog actually works under-the-hood is very patchy.
>> Cut stops backtracking early which means you might miss valid solutions
That's right, but missing valid solutions doesn't mean that your program is "invalid", whatever that means. The author doesn't say.
Cuts are difficult and dangerous. The danger is that they make your program behave in unexpected ways. Then again, Prolog programs behave in unexpected ways even without the cut, and once you understand why, you can use the cut to make them behave.
In my experience, when one begins to program in Prolog, they pepper their code with cuts to try and stop unwanted backtracking, which can often be avoided by understanding why Prolog is backtracking in the first place. But that's a hard thing to get one's head around, so everyone who starts out makes a mess of their code with the cut.
There are very legitimate and safe ways to use cuts. Prolog textbooks sometimes introduce a terminology of "red" and "green" cuts. Red cuts change the set of answers found by a query, green cuts don't. And that, in itself, is already hard enough to get one's head around.
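The textbook illustration is max/3 (a sketch; max_red is a deliberately broken variant):

    % Green cut: the second clause could not have succeeded anyway, so the cut
    % removes no answers; it only saves Prolog a wasted attempt.
    max(X, Y, X) :- X >= Y, !.
    max(X, Y, Y) :- X < Y.

    % Red cut: the cut is what makes the catch-all second clause "safe", so it
    % changes the set of answers, and unexpected query modes go wrong:
    max_red(X, Y, X) :- X >= Y, !.
    max_red(_, Y, Y).

    % ?- max_red(5, 3, M).   % M = 5, as intended
    % ?- max_red(5, 3, 3).   % succeeds(!): the first head doesn't unify
    %                        % (the 1st and 3rd arguments differ), so the cut
    %                        % is never reached and the catch-all fires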
At first, don't use the cut, until you know what you're doing, is I think the best advice to give to beginner Prolog programmers. And to advanced ones sometimes. I've seen things...
> Red cuts change the set of answers found by a query, green cuts don't.
Ohhh, interesting. So a green cut is basically what I described as cutting branches you know are a waste of time, and red cuts are the ones where you're wrong and cut real solutions?
> At first, don't use the cut, until you know what you're doing, is I think the best advice to give to beginner Prolog programmers. And to advanced ones sometimes. I've seen things...
Yeah, I'm wondering how much of this is almost social or use-case in nature?
E.g., I'm experimenting with Prolog strictly as a logic language, playing (at a really novice level) with things like program synthesis or model-to-model transformations to emulate macro systems that flow kind of like how JetBrains MPS handles similar things. I'm basically just trying to bend and flex bidirectional pure relations (I'm probably conflating FP terms here) because it's just sort of fun to me, yeah?
So cut _feels_ like something I'd only use if I were optimizing and largely just as something I'd never use because for my specific goals, it'd be kind of antithetical--and also I'm not an expert so it scares me. Basically I'm using it strictly because of the logic angle, and cut doesn't feel like a bad thing, but it feels like something I wouldn't use unless I created a situation where I needed it to get solutions faster or something--again, naively anyway.
Whereas if I were using Prolog as a daily GP language to actually get stuff done, which I know it's capable of, it makes a lot of sense to me to see cut and `break` as similar constructs for breaking out of a branch of computation that you know doesn't actually go anywhere?
I'm mostly spit-balling here and could be off base. Very much appreciate the response, either way.
thisWouldBeGreat :- !, fail.
Yeah, exactly why I'm not writing the same sort of article about Haskell or Prolog. I'm inexperienced in both and the effort to learn them was more than I wanted to spend.
I'll warn you that Picat is very much a "research language" and a lot of the affordances you'd expect with a polished PL just aren't there yet. There's also this really great "field notes" repo from another person who learned it: https://github.com/dsagman/picat
Side note: Just clocked your name. Read through Practical TLA+ recently modeling a few things at work. Incredibly helpful book for working through my first concrete model in practice.
Bidirectionality has always been super fascinating.
Didn't know about Picat. 100% going to check it out.
I'm mostly a language tourist who likes kicking the tires on modes of modeling problems that feel different to my brain.
Started skimming those notes. Really solid info. Appreciate it!
I liked the idea behind Prolog, but I absolutely detest the syntax.
IMO it would be better to have something like Prolog as part of a "better designed" language per se. I can't come up with a good proposal myself - language design is hard, including syntax design - but imagine if Prolog were a part of Python. That feature would then be used by more people. (This is just an example; just randomly creeping features into a more successful language also often won't work. I am just giving this as an example that MIGHT be better.)
The syntax of Prolog is (a fragment of) the syntax of First Order Logic. It's not supposed to look like your friendly neighbourhood programming language because it's mathematical notation.
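For instance, a rule is just a universally quantified implication written right-to-left (using a made-up ancestor/2 relation):

    ancestor(X, Y) :- parent(X, Y).
    ancestor(X, Y) :- parent(X, Z), ancestor(Z, Y).

    % The second clause is the first-order formula
    %   ∀X ∀Y ∀Z (parent(X, Z) ∧ ancestor(Z, Y) → ancestor(X, Y))
    % with ":-" read as reverse implication and "," as conjunction.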
Count yourself lucky you (probably) learned programming in a language like Java or Python, and not, say, FORTRAN. Because then you'd really pray for the simplicity and elegance of definite clauses.
(Or not. FORTRAN programmers can write FORTRAN in any language, even FORTRAN).
“Something like Prolog” as a part of a more traditional language is kind of the idea of miniKanren, which has been implemented for many languages: https://minikanren.org/
The line reorder issue is evergreen, and it seems all languages need to either go through this phase and fix it, or gaslight their users forever that it's "not really a problem".
Prolog: "Mistakes were made"