2025/02/10

Newest at the top

2025-02-10 15:42:04 +0100 <dminuoso> Very roughly you could just say
2025-02-10 15:41:54 +0100 <dminuoso> euouae: Oh that's quite easy.
2025-02-10 15:41:39 +0100 <dminuoso> But I find them all confusing.
2025-02-10 15:41:30 +0100 <dminuoso> I know there's a *bunch* of extensions that try to give you ways around that..
2025-02-10 15:41:29 +0100 <euouae> But I also read that lenses deal with this problem, not sure how.
2025-02-10 15:41:22 +0100 <euouae> I read that there's some new GHC extension that solves this, or maybe a proposal: <https://ghc-proposals.readthedocs.io/en/latest/proposals/0282-record-dot-syntax.html>
2025-02-10 15:41:00 +0100 <dminuoso> I just do what most others do: data X = X { xName :: String } and data Y = Y { yName :: String }
2025-02-10 15:40:36 +0100 <euouae> well, data X = X {name :: String} and then data Y = Y {name :: String}
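For reference, the clash described above and both workarounds (the prefix convention and the record-dot proposal, which shipped as `OverloadedRecordDot` in GHC 9.2+) can be sketched like this — the types and field names are invented for illustration:

```haskell
{-# LANGUAGE DuplicateRecordFields #-}
{-# LANGUAGE OverloadedRecordDot #-}

-- Without DuplicateRecordFields, the second 'name' field would be
-- rejected: both would try to define a top-level accessor 'name'.
data X = X { name :: String }
data Y = Y { name :: String }

main :: IO ()
main = do
  let x = X { name = "ex" }
      y = Y { name = "why" }
  -- OverloadedRecordDot disambiguates the field by the record's type.
  putStrLn x.name  -- prints "ex"
  putStrLn y.name  -- prints "why"
```

The prefix convention (`xName`, `yName`) needs no extensions, which is why it remains common.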
2025-02-10 15:40:19 +0100 <dminuoso> What namespace problem?
2025-02-10 15:40:11 +0100 <euouae> field accessors
2025-02-10 15:40:08 +0100 <euouae> One thing I didn't understand, and maybe there's some GHC extension for it: how do you beat the namespace problem for record accessors?
2025-02-10 15:39:54 +0100 <dminuoso> If it's not, I would refrain.
2025-02-10 15:39:39 +0100 <dminuoso> euouae: In general lens/optics is best when your data is deeply nested.
2025-02-10 15:39:36 +0100 <euouae> It might not be of serious use to me but the book does teach me some haskell too in between
2025-02-10 15:39:19 +0100 <euouae> Hmm... neat.
2025-02-10 15:39:01 +0100 <dminuoso> And `optics` gives us a tool to concisely manipulate that large structure in passes.
2025-02-10 15:38:58 +0100 <euouae> There's ways around the wackiness of the operators, one off the top of my head is to color-code them
2025-02-10 15:38:45 +0100 <dminuoso> We have one big use case, which is a networking compiler. In our intermediate representation we have deeply nested data types (around 10 layers deep), with lists/maps, most have plenty of fields..
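A minimal sketch of that nested-update use case with the `lens` package — the record types here are invented, and `optics` offers an analogous API:

```haskell
{-# LANGUAGE TemplateHaskell #-}

import Control.Lens

-- Two toy layers of nesting; real IRs would be far deeper.
data Address = Address { _city :: String } deriving (Eq, Show)
data Person  = Person  { _addr :: Address } deriving (Eq, Show)
makeLenses ''Address
makeLenses ''Person

main :: IO ()
main = do
  let p = Person (Address "Oslo")
  -- Compose lenses to update deep inside, no manual record surgery:
  print (p & addr . city .~ "Bergen")
  -- prints: Person {_addr = Address {_city = "Bergen"}}
```

With ten layers of nesting, the equivalent manual record update syntax grows quadratically in noise, which is where lenses pay off.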
2025-02-10 15:37:45 +0100 <euouae> they do remind me also of lisp's SETF, which I also always liked but it is limited
2025-02-10 15:37:44 +0100 <dminuoso> But the DSL (especially all the operators) can look confusing, there's only so much %~~.! and ?!~..! that my eyes can tolerate.
2025-02-10 15:37:32 +0100 <euouae> maybe it's a style thing, but I like them personally
2025-02-10 15:36:53 +0100 <dminuoso> Execution performance is in general really good.
2025-02-10 15:36:40 +0100 <dminuoso> The latter.
2025-02-10 15:36:29 +0100 <euouae> by outperform are you talking about code performance or generally that they're the better tool?
2025-02-10 15:35:52 +0100 <dminuoso> Although I find the use cases where they outperform manual accessors are limited.
2025-02-10 15:35:04 +0100 <dminuoso> euouae: Yeah optics/lenses are some of the most novel libraries in Haskell.
2025-02-10 15:30:42 +0100 <euouae> Though lenses do make Haskell look like APL a bit, they are really cool. Like view/span of C++ but more powerful and expressive
2025-02-10 15:29:09 +0100 <euouae> I've already written this app in the past in Python so it'll be exciting to see how much I can improve with Haskell
2025-02-10 15:28:51 +0100 <euouae> Doing some leetcode I realized I wanted something that looked a lot like lenses so I got a book on them (Penner's Optics by Example) and after that I'll try writing a web app
2025-02-10 15:27:58 +0100 <euouae> I read half of the STG paper before I felt that I'd been going too deep for my own good
2025-02-10 15:24:48 +0100 <dminuoso> It's just that in my parser, each step produces a bytestring buffer, and I refocus (not with the primitive I linked above, that one is a bit more special) into that slice.
2025-02-10 15:24:04 +0100 <dminuoso> (Which is just a path description of the decoding tree)
2025-02-10 15:23:53 +0100 <dminuoso> Plus some labels and context as to what that Int, Bool or String is.
2025-02-10 15:23:40 +0100 <dminuoso> until you reach a primitive, and then that primitive gets turned into Int, Bool, String
2025-02-10 15:23:18 +0100 <dminuoso> essentially it's slowly zooming into a bytestring buffer (though that zooming action sometimes includes fusing some chunks together over the parsing)
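The refocusing idea might be toy-sketched like this — the length-prefixed encoding and the `refocus` helper are invented for illustration, not dminuoso's actual parser:

```haskell
import qualified Data.ByteString.Char8 as BS

-- "Zoom" one step into a buffer: read a one-digit ASCII length prefix
-- (toy encoding) and narrow to that many bytes of payload.
refocus :: BS.ByteString -> Maybe BS.ByteString
refocus bs = do
  (c, rest) <- BS.uncons bs
  let n = fromEnum c - fromEnum '0'
  if BS.length rest >= n
    then Just (BS.take n rest)  -- the narrowed slice for the next step
    else Nothing                -- buffer too short: malformed input

main :: IO ()
main = print (refocus (BS.pack "3abcdef"))  -- prints: Just "abc"
```

Each parsing step would apply such a narrowing to its slice, descending the decoding tree until a primitive slice is decoded into an Int, Bool, or String.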
2025-02-10 15:22:58 +0100 <haskellbridge> <Profpatsch> Ah, I see
2025-02-10 15:22:40 +0100 <dminuoso> What I do is still a kind of vertical parsing in that sense.
2025-02-10 15:22:34 +0100 <dminuoso> haskellbridge: No I get the idea of vertical parsing.
2025-02-10 15:22:26 +0100 <haskellbridge> <Profpatsch> 420 :)
2025-02-10 15:22:20 +0100 <haskellbridge> <Profpatsch> dminuoso: fwiw I’m not talking conventional parsers (stream of tokens to data struct) but “vertical” parsers (value of high entropy to value of lower entropy)
2025-02-10 15:22:12 +0100 <dminuoso> Profpatsch: We should compare notes when I have a little more time, I gotta blaze.
2025-02-10 15:21:44 +0100 <dminuoso> This should have been named :.:
2025-02-10 15:21:34 +0100 <dminuoso> Gah I find prefix Compose hard to read.
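The `Compose` being discussed is functor composition from `Data.Functor.Compose` (GHC.Generics does export an infix type operator spelled `:.:`). A quick illustration of what it does:

```haskell
import Data.Functor.Compose (Compose (..))

main :: IO ()
main = do
  -- Compose [] Maybe is itself a Functor: one fmap reaches
  -- through both layers at once.
  let xs = Compose [Just 1, Nothing, Just 3]
  print (getCompose (fmap (* 10) xs))
  -- prints: [Just 10,Nothing,Just 30]
```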