2025/04/20

Newest at the top

2025-04-20 22:21:11 +0200 <monochrom> On every path, after 10 transitions or less?
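[Editorial aside on the "within 10 steps" question: plain LTL has no numeric bounds, but a bounded "eventually" can be encoded by unfolding the next operator X; metric extensions such as MTL make the bound first-class. A sketch:]

```latex
% "phi holds within k steps", by unfolding X (next):
F^{\le k}\varphi \;\equiv\; \varphi \,\vee\, X\varphi \,\vee\, X^{2}\varphi \,\vee\, \dots \,\vee\, X^{k}\varphi
% In Metric Temporal Logic (MTL) the bound is an interval annotation:
F_{[0,10]}\varphi
```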
2025-04-20 22:19:30 +0200 Square3(~Square@user/square) (Ping timeout: 252 seconds)
2025-04-20 22:18:31 +0200 <haskellbridge> <hellwolf> But "10 steps" needs to be defined properly too, since in a labeled state transition diagram, it can be a tree, not a linear line.
2025-04-20 22:18:31 +0200 <haskellbridge> My half-arsed LTL skill suggests to me that this should be easily expressible; it seems rather trivial.
2025-04-20 22:18:22 +0200 <haskellbridge> <hellwolf> > but can never say "within 10 steps".
2025-04-20 22:16:51 +0200 <haskellbridge> <hellwolf> But if you want to say that the data you obtained is "outdated" and must not be used in the current labeled state, Ur becomes inadequate.
2025-04-20 22:16:26 +0200 <monochrom> Wait a second. Back to previous topic. Does any temporal logic count time? I thought they only order events and say "eventually" but can never say "within 10 steps".
2025-04-20 22:16:19 +0200 merijn(~merijn@host-vr.cgnat-g.v4.dfn.nl) merijn
2025-04-20 22:15:49 +0200 merijn(~merijn@host-vr.cgnat-g.v4.dfn.nl) (Ping timeout: 248 seconds)
2025-04-20 22:12:48 +0200 <monochrom> More than nice. :)
2025-04-20 22:12:19 +0200 <monochrom> Ur pattern is nice, yes.
2025-04-20 22:11:16 +0200 merijn(~merijn@host-vr.cgnat-g.v4.dfn.nl) merijn
2025-04-20 22:11:11 +0200 <haskellbridge> <hellwolf> Managing resource uniqueness + Ur pattern <-- this is how linear arrows typically are used in GHC. That's my understanding.
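[A minimal sketch of the Ur ("unrestricted") pattern mentioned here, with a hand-rolled `Ur` instead of the one from linear-base; `dup` is an illustrative name:]

```haskell
{-# LANGUAGE LinearTypes, GADTs #-}
module Main where

-- Ur wraps a value whose multiplicity is unrestricted: in GADT syntax,
-- a plain -> field is a Many (unrestricted) field even under -XLinearTypes.
data Ur a where
  Ur :: a -> Ur a

-- A linear consumer of Ur may still use the payload any number of times,
-- because the linearity restriction applies to the Ur box, not its contents.
dup :: Ur a %1 -> (a, a)
dup (Ur x) = (x, x)

main :: IO ()
main = print (dup (Ur (42 :: Int)))  -- (42,42)
```

This is the same shape as `Data.Unrestricted.Linear.Ur` in linear-base.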
2025-04-20 22:10:35 +0200 <haskellbridge> <hellwolf> Contrary to some commonly held understanding, I believe LinearTypes in GHC does more than manage uniqueness of resources; people may easily dismiss that and say ST does it more or less, with less type safety but probably equal or better performance.
2025-04-20 22:09:56 +0200 haritz(~hrtz@user/haritz) haritz
2025-04-20 22:09:56 +0200 haritz(~hrtz@152.37.68.178) (Changing host)
2025-04-20 22:09:31 +0200 haritz(~hrtz@152.37.68.178)
2025-04-20 22:09:17 +0200 <haskellbridge> <hellwolf> Instead, it gives you a type-safe way to build linear temporal logic.
2025-04-20 22:09:04 +0200 <haskellbridge> So I'd say nothing prevents you from having lazy semantics, since you don't have to consume.
2025-04-20 22:09:02 +0200 <haskellbridge> <hellwolf> GHC's LinearTypes only says "A function f is linear if: when its result is consumed exactly once, then its argument is consumed exactly once."
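[That definition can be illustrated with a small example; `swap` here is just an illustrative name:]

```haskell
{-# LANGUAGE LinearTypes #-}
module Main where

-- swap is linear: consuming its result exactly once consumes the argument
-- pair exactly once. Built-in tuple fields are linear under -XLinearTypes,
-- so this pattern match uses x and y exactly once each.
swap :: (a, b) %1 -> (b, a)
swap (x, y) = (y, x)

main :: IO ()
main = print (swap (1 :: Int, 'a'))  -- ('a',1)
```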
2025-04-20 22:07:06 +0200 <haskellbridge> <hellwolf> maybe you can give an example, and let me give it a try and learn?
2025-04-20 22:06:46 +0200 haritz(~hrtz@user/haritz) (Ping timeout: 265 seconds)
2025-04-20 22:05:55 +0200 <haskellbridge> <hellwolf> why not? It's just lambda with linear arrows. You would need to take care of duplicating/discarding data to produce valid diagrams; the type system forces you to anyway.
2025-04-20 22:03:55 +0200 <haskellbridge> <hellwolf> I am curious. I don't know the answer.
2025-04-20 22:03:53 +0200 <monochrom> Or maybe I assumed it didn't matter. The question I would ask: Can I obtain compositional reasoning from linear temporal logic?
2025-04-20 22:02:29 +0200 <monochrom> Maybe I did.
2025-04-20 22:02:03 +0200 <haskellbridge> <hellwolf> Do you assume laziness cannot be obtained from linear temporal logic?
2025-04-20 22:01:59 +0200 <monochrom> Sure! Now "take n (repeat ())" takes Theta(n)-time.
2025-04-20 22:01:35 +0200 <mauke> why do I need a statement about repeat ()?
2025-04-20 22:01:26 +0200 <lambdabot> []
2025-04-20 22:01:25 +0200 <mauke> > take 0 undefined
2025-04-20 22:00:30 +0200 <monochrom> (I did, it's my PhD thesis: https://www.cs.utoronto.ca/~trebla/albert-thesis-ut.pdf )
2025-04-20 21:59:17 +0200 <monochrom> How do you prove that in Haskell "take 0 (repeat ())" is O(1)-time, without a whole-program analysis of "take 0 (repeat ())"? Can you just prove one statement about "take 0", and one statement about "repeat ()", and combine them to deduce O(1)-time?
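[The non-strictness under discussion can be observed directly: under lazy evaluation `take 0` never demands its second argument, which is exactly what lambdabot's `take 0 undefined` demonstrates above:]

```haskell
module Main where

main :: IO ()
main = do
  print (take 0 (repeat ()))          -- [] : returns without touching the infinite list
  print (take 0 (undefined :: [()])) -- [] : the second argument is never demanded
  print (take 3 (repeat ()))          -- [(),(),()] : Theta(n) in the prefix length
```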
2025-04-20 21:57:52 +0200 merijn(~merijn@host-vr.cgnat-g.v4.dfn.nl) (Ping timeout: 252 seconds)
2025-04-20 21:57:36 +0200 <monochrom> People like eager languages because running times are compositional. The running time of "take 0 . repeat" is factorable to (time of take 0) + (time of repeat) = infinity.
2025-04-20 21:54:46 +0200 <haskellbridge> <hellwolf> At least you didn't say it's false. I feel relieved :D
2025-04-20 21:54:41 +0200 j1n37(~j1n37@user/j1n37) j1n37
2025-04-20 21:53:33 +0200 <monochrom> That is tautological.
2025-04-20 21:53:03 +0200 <haskellbridge> In some cases, when timeless equations are an advantage, denotational semantics is really great; but the moment you want to deal with performance or, worse, side effects, you inevitably need help from the "timely" logic.
2025-04-20 21:53:03 +0200 <haskellbridge> <hellwolf> I am still conceiving this idea. And I am punching above my weight:
2025-04-20 21:51:02 +0200 <haskellbridge> <hellwolf> it's just boxes and wires with very rigorous rules.
2025-04-20 21:50:58 +0200 <monochrom> Compositional reasoning means if I want to prove "foo . bar" correct for example, I can break it down into proving a statement about foo, a statement about bar, and a combination of them to obtain overall correctness.
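[A toy instance of that style, with hypothetical `foo` and `bar`: each lemma mentions only one of the two functions, and the combined property follows without inspecting `foo . bar` as a whole:]

```haskell
module Main where

-- Hypothetical components, purely for illustration.
foo :: [Int] -> Int
foo = sum

bar :: [Int] -> [Int]
bar = map (*2)

-- Lemma about bar alone: sum (map (*2) xs) = 2 * sum xs
-- Lemma about foo alone: foo = sum
-- Combined compositionally: (foo . bar) xs = 2 * sum xs
main :: IO ()
main = print ((foo . bar) [1, 2, 3] == 2 * sum [1, 2, 3])  -- True
```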
2025-04-20 21:50:48 +0200 <haskellbridge> linear-logic and symmetric monoidal category can represent operations in quite simple diagrams.
2025-04-20 21:50:48 +0200 <haskellbridge> <hellwolf> You sure?
2025-04-20 21:50:47 +0200 merijn(~merijn@host-vr.cgnat-g.v4.dfn.nl) merijn
2025-04-20 21:50:37 +0200 j1n37(~j1n37@user/j1n37) (Read error: Connection reset by peer)
2025-04-20 21:48:53 +0200 j1n37(~j1n37@user/j1n37) j1n37
2025-04-20 21:48:51 +0200 <monochrom> OK, here is the reason for my preference, but I can't speak for others. It is easier to do compositional reasoning in denotational semantics than in operational semantics.
2025-04-20 21:48:35 +0200 j1n37(~j1n37@user/j1n37) (Read error: Connection reset by peer)
2025-04-20 21:43:51 +0200 <monochrom> Then don't.