2024/11/14

2024-11-14 00:01:51 +0100falafel(~falafel@2600:1700:99f4:2050:41b3:d17e:817a:4e83) falafel
2024-11-14 00:02:55 +0100Everything(~Everythin@46.211.104.82) (Quit: leaving)
2024-11-14 00:28:22 +0100Xe_(~Xe@perl/impostor/xe) Xe
2024-11-14 00:29:11 +0100peterbecich(~Thunderbi@syn-047-229-123-186.res.spectrum.com) peterbecich
2024-11-14 00:29:57 +0100acidjnk_new3(~acidjnk@p200300d6e7283f7100fa0b96aa6639bf.dip0.t-ipconnect.de) (Ping timeout: 248 seconds)
2024-11-14 00:32:33 +0100acidjnk_new3(~acidjnk@p200300d6e7283f717cba866c0fa9f7cd.dip0.t-ipconnect.de)
2024-11-14 00:47:47 +0100 <jackdk> I want to provide a type family-shaped helper that identifies the type of a record field, something like `FieldType "foo" MyRecord` reducing to `Bar`. I can get at the type of a field by looking at the fundep on the `HasField` class (GHC gives me `Bar` in `instance HasField "foo" MyRecord Bar`), but is there a good idiom for binding and returning that type variable using a type family?
2024-11-14 00:48:23 +0100alexherbo2(~alexherbo@2a02-8440-3117-f07c-987b-fc29-77ee-addd.rev.sfr.net) (Remote host closed the connection)
2024-11-14 00:49:22 +0100CoolMa7(~CoolMa7@ip5f5b8957.dynamic.kabel-deutschland.de) (Quit: My Mac has gone to sleep. ZZZzzz…)
2024-11-14 00:49:54 +0100athostFI(~Atte@176-93-56-50.bb.dnainternet.fi)
2024-11-14 00:50:52 +0100alp(~alp@2001:861:e3d6:8f80:8dec:7d0f:9187:87d0) (Remote host closed the connection)
2024-11-14 00:51:04 +0100 <Axman6> This feels like it might be easier with generics-sop, but it's been a long time since I looked at any of these things
2024-11-14 00:51:40 +0100alp(~alp@2001:861:e3d6:8f80:cd0a:c39d:37b7:c1a3)
2024-11-14 00:53:05 +0100Sgeo(~Sgeo@user/sgeo) Sgeo
2024-11-14 00:53:24 +0100alp(~alp@2001:861:e3d6:8f80:cd0a:c39d:37b7:c1a3) (Remote host closed the connection)
2024-11-14 00:54:14 +0100alp(~alp@2001:861:e3d6:8f80:c1d0:6a01:957c:3af2)
2024-11-14 00:55:56 +0100alp(~alp@2001:861:e3d6:8f80:c1d0:6a01:957c:3af2) (Remote host closed the connection)
2024-11-14 01:01:19 +0100falafel(~falafel@2600:1700:99f4:2050:41b3:d17e:817a:4e83) (Ping timeout: 260 seconds)
2024-11-14 01:10:06 +0100acidjnk_new3(~acidjnk@p200300d6e7283f717cba866c0fa9f7cd.dip0.t-ipconnect.de) (Read error: Connection reset by peer)
2024-11-14 01:10:17 +0100alexherbo2(~alexherbo@2a02-8440-3117-f07c-987b-fc29-77ee-addd.rev.sfr.net) alexherbo2
2024-11-14 01:11:02 +0100 <Leary> jackdk: I doubt there's anything like an 'idiom' for this. Does `class HasField f r (Field f r) => HasFieldF f r where { type Field f r }; instance HasField f r t => HasFieldF f r where { type Field f r = t }` work?
2024-11-14 01:14:53 +0100Tuplanolla(~Tuplanoll@91-159-69-59.elisa-laajakaista.fi) (Quit: Leaving.)
2024-11-14 01:16:38 +0100Lord_of_Life(~Lord@user/lord-of-life/x-2819915) (Ping timeout: 245 seconds)
2024-11-14 01:17:43 +0100 <glguy> jackdk: You could do something like this: https://bpa.st/AYDQ
2024-11-14 01:18:38 +0100Lord_of_Life(~Lord@user/lord-of-life/x-2819915) Lord_of_Life
2024-11-14 01:20:20 +0100arahael(~arahael@user/arahael) (Quit: Lost terminal)
2024-11-14 01:25:58 +0100sprotte24(~sprotte24@p200300d16f059400e8d39b8ffa006815.dip0.t-ipconnect.de) (Quit: Leaving)
2024-11-14 01:26:14 +0100xff0x(~xff0x@2405:6580:b080:900:ca42:e655:d7e4:ec2b) (Ping timeout: 272 seconds)
2024-11-14 01:32:10 +0100peterbecich(~Thunderbi@syn-047-229-123-186.res.spectrum.com) (Ping timeout: 252 seconds)
2024-11-14 01:35:18 +0100 <jackdk> Leary: alas no: "The RHS of an associated type declaration mentions out-of-scope variable ‘t’ All such variables must be bound on the LHS"; glguy: Yeah, recursing through the `Rep` seems like the best bet. Thanks to you both.
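A minimal sketch of the Rep-recursion idea glguy suggests above (his actual paste isn't shown here); FieldType, GFieldType, OrElse and FromJust are made-up names for illustration, and the record is assumed to derive Generic:

```haskell
{-# LANGUAGE DataKinds #-}
{-# LANGUAGE KindSignatures #-}
{-# LANGUAGE TypeFamilies #-}
{-# LANGUAGE TypeOperators #-}

module FieldTypeSketch (FieldType) where

import Data.Kind (Type)
import GHC.Generics
import GHC.TypeLits (Symbol)

-- FieldType "foo" MyRecord reduces to the type of the "foo" field, e.g. for
--   data MyRecord = MyRecord { foo :: Bar } deriving Generic
-- we get  FieldType "foo" MyRecord ~ Bar.
type FieldType (name :: Symbol) (r :: Type) = FromJust (GFieldType name (Rep r))

-- Walk the generic representation, returning 'Just the field's type when the
-- record selector name matches, 'Nothing otherwise.
type family GFieldType (name :: Symbol) (rep :: Type -> Type) :: Maybe Type where
  GFieldType name (D1 meta f) = GFieldType name f
  GFieldType name (C1 meta f) = GFieldType name f
  GFieldType name (f :*: g)   = OrElse (GFieldType name f) (GFieldType name g)
  GFieldType name (S1 ('MetaSel ('Just name) su ss ds) (Rec0 t)) = 'Just t
  GFieldType name other       = 'Nothing

type family OrElse (a :: Maybe Type) (b :: Maybe Type) :: Maybe Type where
  OrElse ('Just t) b = 'Just t
  OrElse 'Nothing  b = b

-- Left stuck (or it could emit a TypeError) when no such field exists.
type family FromJust (m :: Maybe Type) :: Type where
  FromJust ('Just t) = t
```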
2024-11-14 01:36:34 +0100MironZ3(~MironZ@nat-infra.ehlab.uk) (Quit: Ping timeout (120 seconds))
2024-11-14 01:36:34 +0100Square(~Square@user/square) Square
2024-11-14 01:39:10 +0100tromp(~textual@92-110-219-57.cable.dynamic.v4.ziggo.nl) (Read error: Connection reset by peer)
2024-11-14 01:43:55 +0100MironZ3(~MironZ@nat-infra.ehlab.uk)
2024-11-14 01:44:15 +0100ljdarj1(~Thunderbi@user/ljdarj) ljdarj
2024-11-14 01:47:34 +0100ljdarj(~Thunderbi@user/ljdarj) (Ping timeout: 265 seconds)
2024-11-14 01:47:34 +0100ljdarj1ljdarj
2024-11-14 01:48:46 +0100athostFI(~Atte@176-93-56-50.bb.dnainternet.fi) (Read error: Connection reset by peer)
2024-11-14 01:49:48 +0100 <jle`> does anybody know if there has been any updates on https://github.com/haskell/cabal/issues/9577 ? is there a good way to get haddock to do multiple sublibraries?
2024-11-14 01:50:03 +0100 <glguy> getting ready for aoc? ;-)
2024-11-14 01:51:19 +0100 <geekosaur> https://github.com/haskell/cabal/pull/9821 maybe?
2024-11-14 01:51:52 +0100 <jle`> glguy: heh how did you guess
2024-11-14 01:52:04 +0100 <jle`> i am merging all of my aoc libs into a single master cabal project
2024-11-14 01:52:14 +0100 <glguy> jle`: I can't think of any other reason to use multiple sublibraries ^_^
2024-11-14 01:52:38 +0100 <geekosaur> amazonka and recent HLS use them
2024-11-14 01:52:59 +0100 <geekosaur> HLS for all its plugins, amazonka for all its generated service packages
2024-11-14 01:53:06 +0100 <glguy> geekosaur: Maybe someone used those libraries to solve an aoc problem then
2024-11-14 01:53:09 +0100 <Leary> jle`: There's some discussion on it here: https://discourse.haskell.org/t/best-practices-for-public-cabal-sublibraries/10272
2024-11-14 01:53:26 +0100 <jle`> geekosaur: ah that does seem promising, do you know if it's in any cabal releases?
2024-11-14 01:54:06 +0100 <geekosaur> not yet but it should be in 3.14.1.0
2024-11-14 01:54:16 +0100 <geekosaur> which is due around when ghc 9.12.1 GA is
2024-11-14 01:54:32 +0100 <jle`> ooh, that's in a matter of days right?
2024-11-14 01:54:36 +0100 <geekosaur> the release process has already begun
2024-11-14 01:54:47 +0100 <jle`> woo hoo
2024-11-14 01:55:19 +0100 <jle`> rate of cabal improvements has been amazing
2024-11-14 02:14:21 +0100peterbecich(~Thunderbi@syn-047-229-123-186.res.spectrum.com) peterbecich
2024-11-14 02:18:56 +0100xff0x(~xff0x@fsb6a9491c.tkyc517.ap.nuro.jp)
2024-11-14 02:21:07 +0100ljdarj(~Thunderbi@user/ljdarj) (Quit: ljdarj)
2024-11-14 02:27:07 +0100CrunchyFlakes_(~CrunchyFl@ip1f13e94e.dynamic.kabel-deutschland.de) (Ping timeout: 264 seconds)
2024-11-14 02:28:51 +0100yin(~z@user/zero) (Read error: Connection reset by peer)
2024-11-14 02:29:30 +0100zero(~z@user/zero) zero
2024-11-14 02:33:06 +0100califax(~califax@user/califx) (Remote host closed the connection)
2024-11-14 02:43:43 +0100califax(~califax@user/califx) califx
2024-11-14 02:45:51 +0100jero98772(~jero98772@190.158.28.32)
2024-11-14 02:47:45 +0100telser(~quassel@user/telser) (Quit: https://quassel-irc.org - Chat comfortably. Anywhere.)
2024-11-14 02:55:16 +0100jero98772(~jero98772@190.158.28.32) (Ping timeout: 244 seconds)
2024-11-14 03:00:23 +0100housemate(~housemate@146.70.66.228) (Quit: "I saw it in a tiktok video and thought that it was the most smartest answer ever." ~ AnonOps Radio [some time some place] | I AM THE DERIVATIVE I AM GOING TANGENT TO THE CURVE!)
2024-11-14 03:13:26 +0100Smiles(uid551636@id-551636.lymington.irccloud.com) Smiles
2024-11-14 03:14:45 +0100machinedgod(~machinedg@d108-173-18-100.abhsia.telus.net) (Ping timeout: 246 seconds)
2024-11-14 03:30:17 +0100myxos(~myxos@syn-065-028-251-121.res.spectrum.com) myxokephale
2024-11-14 04:06:25 +0100alexherbo2(~alexherbo@2a02-8440-3117-f07c-987b-fc29-77ee-addd.rev.sfr.net) (Remote host closed the connection)
2024-11-14 04:13:00 +0100arahael(~arahael@user/arahael) arahael
2024-11-14 04:16:04 +0100arahael_(~arahael@user/arahael) arahael
2024-11-14 04:29:59 +0100td_(~td@i53870901.versanet.de) (Ping timeout: 260 seconds)
2024-11-14 04:31:21 +0100td_(~td@i5387092A.versanet.de) td_
2024-11-14 04:46:17 +0100Pozyomka(~pyon@user/pyon) (Quit: Reboot.)
2024-11-14 04:52:54 +0100agent314(~quassel@static-198-44-129-53.cust.tzulo.com) agent314
2024-11-14 05:09:25 +0100bitdex(~bitdex@gateway/tor-sasl/bitdex) bitdex
2024-11-14 05:10:33 +0100agent314(~quassel@static-198-44-129-53.cust.tzulo.com) (Ping timeout: 276 seconds)
2024-11-14 05:12:25 +0100divya(~user@139.5.11.223) divya
2024-11-14 05:19:01 +0100mange(~user@user/mange) mange
2024-11-14 05:19:01 +0100mange(~user@user/mange) (Excess Flood)
2024-11-14 05:25:13 +0100mc47(~mc47@xmonad/TheMC47) (Remote host closed the connection)
2024-11-14 05:25:33 +0100mc47(~mc47@xmonad/TheMC47) mc47
2024-11-14 05:27:30 +0100divya(~user@139.5.11.223) (Remote host closed the connection)
2024-11-14 05:30:03 +0100Pozyomka(~pyon@user/pyon) pyon
2024-11-14 05:30:40 +0100mange(~user@user/mange) mange
2024-11-14 05:31:26 +0100mange(~user@user/mange) (Client Quit)
2024-11-14 05:32:37 +0100pavonia(~user@user/siracusa) (Quit: Bye!)
2024-11-14 05:36:48 +0100housemate(~housemate@146.70.66.228) housemate
2024-11-14 05:37:01 +0100stiell_(~stiell@gateway/tor-sasl/stiell) (Ping timeout: 260 seconds)
2024-11-14 05:41:15 +0100stiell_(~stiell@gateway/tor-sasl/stiell) stiell
2024-11-14 05:54:07 +0100vanishingideal(~vanishing@user/vanishingideal) vanishingideal
2024-11-14 06:06:35 +0100mikko(~mikko@user/mikko) (Ping timeout: 255 seconds)
2024-11-14 06:13:50 +0100agent314(~quassel@static-198-44-129-53.cust.tzulo.com) agent314
2024-11-14 06:14:44 +0100visilii_(~visilii@213.24.127.47)
2024-11-14 06:14:54 +0100visilii(~visilii@213.24.133.209) (Ping timeout: 276 seconds)
2024-11-14 06:36:59 +0100Guest16(~Guest16@2401:4900:65c9:bca3:883d:d42c:cc19:7f95)
2024-11-14 06:37:48 +0100 <Guest16> hi
2024-11-14 06:37:56 +0100 <Guest16> Does anyone know why http://wiki.haskell.org/ is down
2024-11-14 06:39:54 +0100 <Axman6> There have been issues with the machine it runs on lately, which I believe are proving to be quite hard to fix. sm I think knows more (there's also #haskell-infrastructure)
2024-11-14 06:41:03 +0100 <probie> Guest16: https://mail.haskell.org/pipermail/haskell-cafe/2024-November/136929.html
2024-11-14 06:41:44 +0100 <Guest16> thank you
2024-11-14 06:52:22 +0100peterbecich(~Thunderbi@syn-047-229-123-186.res.spectrum.com) (Remote host closed the connection)
2024-11-14 06:52:41 +0100Guest16(~Guest16@2401:4900:65c9:bca3:883d:d42c:cc19:7f95) (Quit: Client closed)
2024-11-14 06:58:06 +0100takuan(~takuan@178-116-218-225.access.telenet.be)
2024-11-14 07:00:42 +0100 <haskellbridge> <sm> https://github.com/haskell/haskell-wiki-configuration/issues/43
2024-11-14 07:02:55 +0100misterfish(~misterfis@84.53.85.146) misterfish
2024-11-14 07:04:06 +0100michalz(~michalz@185.246.207.203)
2024-11-14 07:08:06 +0100philopsos(~caecilius@user/philopsos) (Quit: Lost terminal)
2024-11-14 07:08:12 +0100peterbecich(~Thunderbi@syn-047-229-123-186.res.spectrum.com) peterbecich
2024-11-14 07:12:25 +0100Smiles(uid551636@id-551636.lymington.irccloud.com) (Quit: Connection closed for inactivity)
2024-11-14 07:16:45 +0100Square(~Square@user/square) (Remote host closed the connection)
2024-11-14 07:17:09 +0100Square(~Square@user/square) Square
2024-11-14 07:25:25 +0100peterbecich(~Thunderbi@syn-047-229-123-186.res.spectrum.com) (Ping timeout: 248 seconds)
2024-11-14 07:31:18 +0100Square2(~Square4@user/square) Square
2024-11-14 07:34:04 +0100Square(~Square@user/square) (Ping timeout: 252 seconds)
2024-11-14 07:51:43 +0100alp(~alp@2001:861:e3d6:8f80:c4b2:beb0:f361:d694)
2024-11-14 07:57:54 +0100hellwolf(~user@0e2f-3a3b-aecf-adb3-0f00-4d40-07d0-2001.sta.estpak.ee) (Ping timeout: 246 seconds)
2024-11-14 07:58:37 +0100divya(~user@139.5.11.223) divya
2024-11-14 07:59:20 +0100divya(~user@139.5.11.223) (Quit: ERC 5.6.0.30.1 (IRC client for GNU Emacs 30.0.91))
2024-11-14 08:00:12 +0100divya(~user@139.5.11.223) divya
2024-11-14 08:00:12 +0100tv(~tv@user/tv) (Read error: Connection reset by peer)
2024-11-14 08:06:51 +0100Sgeo(~Sgeo@user/sgeo) (Read error: Connection reset by peer)
2024-11-14 08:09:38 +0100Xe(~cadey@perl/impostor/xe) (Ping timeout: 248 seconds)
2024-11-14 08:09:59 +0100Xe_(~Xe@perl/impostor/xe) (Ping timeout: 252 seconds)
2024-11-14 08:16:08 +0100Xe(~Xe@perl/impostor/xe) Xe
2024-11-14 08:17:18 +0100Cadey(~cadey@perl/impostor/xe) Xe
2024-11-14 08:30:14 +0100acidjnk(~acidjnk@p200300d6e7283f73687bc11ede7922f8.dip0.t-ipconnect.de) acidjnk
2024-11-14 08:34:21 +0100petrichor(~znc-user@user/petrichor) petrichor
2024-11-14 08:45:10 +0100vanishingideal(~vanishing@user/vanishingideal) (Ping timeout: 265 seconds)
2024-11-14 08:45:17 +0100ubert(~Thunderbi@178.165.164.236.wireless.dyn.drei.com) ubert
2024-11-14 08:51:11 +0100ft(~ft@p4fc2a216.dip0.t-ipconnect.de) (Quit: leaving)
2024-11-14 08:51:35 +0100vanishingideal(~vanishing@user/vanishingideal) vanishingideal
2024-11-14 08:51:46 +0100lortabac(~lortabac@2a01:e0a:541:b8f0:55ab:e185:7f81:54a4) lortabac
2024-11-14 08:53:54 +0100kuribas(~user@2a02:1808:84:5008:bc1f:a609:eab5:5cb9) kuribas
2024-11-14 08:55:57 +0100ubert(~Thunderbi@178.165.164.236.wireless.dyn.drei.com) (Quit: ubert)
2024-11-14 08:58:52 +0100kuribas(~user@2a02:1808:84:5008:bc1f:a609:eab5:5cb9) (Remote host closed the connection)
2024-11-14 08:59:05 +0100kuribas(~user@2a02:1808:84:5008:61f:fb32:d5a4:cce1) kuribas
2024-11-14 09:00:02 +0100caconym(~caconym@user/caconym) (Quit: bye)
2024-11-14 09:00:39 +0100caconym(~caconym@user/caconym) caconym
2024-11-14 09:01:47 +0100kuribas`(~user@ip-188-118-57-242.reverse.destiny.be) kuribas
2024-11-14 09:03:43 +0100kuribas(~user@2a02:1808:84:5008:61f:fb32:d5a4:cce1) (Ping timeout: 264 seconds)
2024-11-14 09:08:39 +0100vanishingideal(~vanishing@user/vanishingideal) (Ping timeout: 252 seconds)
2024-11-14 09:10:24 +0100vanishingideal(~vanishing@user/vanishingideal) vanishingideal
2024-11-14 09:16:00 +0100falafel(~falafel@2600:1700:99f4:2050:1cad:26ba:1279:135d) falafel
2024-11-14 09:16:18 +0100tv(~tv@user/tv) tv
2024-11-14 09:22:38 +0100mceresa(~mceresa@user/mceresa) (Remote host closed the connection)
2024-11-14 09:22:47 +0100mceresa(~mceresa@user/mceresa) mceresa
2024-11-14 09:25:02 +0100misterfish(~misterfis@84.53.85.146) (Ping timeout: 255 seconds)
2024-11-14 09:27:15 +0100Smiles(uid551636@id-551636.lymington.irccloud.com) Smiles
2024-11-14 09:38:42 +0100falafel(~falafel@2600:1700:99f4:2050:1cad:26ba:1279:135d) (Remote host closed the connection)
2024-11-14 09:51:18 +0100hellwolf(~user@2001:1530:70:545:809e:22e1:baa3:1e4c) hellwolf
2024-11-14 09:58:59 +0100machinedgod(~machinedg@d108-173-18-100.abhsia.telus.net) machinedgod
2024-11-14 10:02:51 +0100alphazone(~alphazone@2.219.56.221) (Ping timeout: 246 seconds)
2024-11-14 10:05:57 +0100Maxdamantus(~Maxdamant@user/maxdamantus) (Ping timeout: 248 seconds)
2024-11-14 10:08:06 +0100rvalue(~rvalue@user/rvalue) (Read error: Connection reset by peer)
2024-11-14 10:08:36 +0100rvalue(~rvalue@user/rvalue) rvalue
2024-11-14 10:13:18 +0100misterfish(~misterfis@31-161-39-137.biz.kpn.net) misterfish
2024-11-14 10:14:38 +0100CrunchyFlakes(~CrunchyFl@ip1f13e94e.dynamic.kabel-deutschland.de)
2024-11-14 10:19:37 +0100Maxdamantus(~Maxdamant@user/maxdamantus) Maxdamantus
2024-11-14 10:24:45 +0100vanishingideal(~vanishing@user/vanishingideal) (Quit: leaving)
2024-11-14 10:31:28 +0100chele(~chele@user/chele) chele
2024-11-14 10:32:08 +0100alp(~alp@2001:861:e3d6:8f80:c4b2:beb0:f361:d694) (Remote host closed the connection)
2024-11-14 10:32:14 +0100tzh(~tzh@c-76-115-131-146.hsd1.or.comcast.net) (Quit: zzz)
2024-11-14 10:32:25 +0100alp(~alp@2001:861:e3d6:8f80:c18:bc99:f25e:38cc)
2024-11-14 10:41:23 +0100tromp(~textual@92-110-219-57.cable.dynamic.v4.ziggo.nl)
2024-11-14 10:42:06 +0100favalex(~favalex@176.200.207.41)
2024-11-14 10:50:16 +0100mari-estel(~mari-este@user/mari-estel) mari-estel
2024-11-14 10:53:06 +0100favalex(~favalex@176.200.207.41) (Quit: Client closed)
2024-11-14 11:05:52 +0100mari18976(~mari-este@user/mari-estel) mari-estel
2024-11-14 11:07:14 +0100hgolden(~hgolden@2603:8000:9d00:3ed1:6c70:1ac0:d127:74dd) (Ping timeout: 260 seconds)
2024-11-14 11:08:17 +0100mari-estel(~mari-este@user/mari-estel) (Ping timeout: 248 seconds)
2024-11-14 11:10:27 +0100mari-estel(~mari-este@user/mari-estel) mari-estel
2024-11-14 11:10:45 +0100mari18976(~mari-este@user/mari-estel) (Read error: Connection reset by peer)
2024-11-14 11:12:07 +0100mari24610(~mari-este@user/mari-estel) mari-estel
2024-11-14 11:14:04 +0100lxsameer(~lxsameer@Serene/lxsameer) lxsameer
2024-11-14 11:15:10 +0100xff0x(~xff0x@fsb6a9491c.tkyc517.ap.nuro.jp) (Ping timeout: 252 seconds)
2024-11-14 11:15:12 +0100mari-estel(~mari-este@user/mari-estel) (Ping timeout: 276 seconds)
2024-11-14 11:20:44 +0100mari-estel(~mari-este@user/mari-estel) mari-estel
2024-11-14 11:23:39 +0100mari24610(~mari-este@user/mari-estel) (Ping timeout: 276 seconds)
2024-11-14 11:24:27 +0100ash3en(~Thunderbi@149.222.147.110) ash3en
2024-11-14 11:27:17 +0100Digitteknohippie(~user@user/digit) Digit
2024-11-14 11:27:49 +0100Digit(~user@user/digit) (Ping timeout: 260 seconds)
2024-11-14 11:28:05 +0100ash3en(~Thunderbi@149.222.147.110) (Client Quit)
2024-11-14 11:32:25 +0100Smiles(uid551636@id-551636.lymington.irccloud.com) (Quit: Connection closed for inactivity)
2024-11-14 11:33:37 +0100DigitteknohippieDigit
2024-11-14 11:49:20 +0100mikko(~mikko@user/mikko) mikko
2024-11-14 11:57:08 +0100mari-estel(~mari-este@user/mari-estel) (Remote host closed the connection)
2024-11-14 11:57:18 +0100mari-estel(~mari-este@user/mari-estel) mari-estel
2024-11-14 11:58:42 +0100mari-estel(~mari-este@user/mari-estel) (Remote host closed the connection)
2024-11-14 11:58:53 +0100mari-estel(~mari-este@user/mari-estel) mari-estel
2024-11-14 12:00:01 +0100Smiles(uid551636@id-551636.lymington.irccloud.com) Smiles
2024-11-14 12:05:29 +0100mari96334(~mari-este@user/mari-estel) mari-estel
2024-11-14 12:06:53 +0100mari96334(~mari-este@user/mari-estel) (Remote host closed the connection)
2024-11-14 12:07:05 +0100mari89179(~mari-este@user/mari-estel) mari-estel
2024-11-14 12:07:42 +0100mari-estel(~mari-este@user/mari-estel) (Ping timeout: 252 seconds)
2024-11-14 12:11:27 +0100xff0x(~xff0x@ai080132.d.east.v6connect.net)
2024-11-14 12:22:06 +0100jero98772(~jero98772@190.158.28.32)
2024-11-14 12:26:32 +0100__monty__(~toonn@user/toonn) toonn
2024-11-14 12:38:40 +0100mari-estel(~mari-este@user/mari-estel) mari-estel
2024-11-14 12:40:42 +0100mari89179(~mari-este@user/mari-estel) (Ping timeout: 252 seconds)
2024-11-14 12:42:19 +0100mari29333(~mari-este@user/mari-estel) mari-estel
2024-11-14 12:43:13 +0100mari-estel(~mari-este@user/mari-estel) (Read error: Connection reset by peer)
2024-11-14 12:43:53 +0100mari-estel(~mari-este@user/mari-estel) mari-estel
2024-11-14 12:46:03 +0100mari-estel(~mari-este@user/mari-estel) (Client Quit)
2024-11-14 12:47:34 +0100mari29333(~mari-este@user/mari-estel) (Ping timeout: 260 seconds)
2024-11-14 12:52:05 +0100pavonia(~user@user/siracusa) siracusa
2024-11-14 12:57:24 +0100 <hellwolf> Is "IOPhobia" a pathological case? After decades of programming, I find pure joy in writing the main part of the code so that it deals with zero IO. And only Haskell can guarantee that, to the extent that I am questioning whether I am sick.
2024-11-14 12:58:54 +0100jero98772(~jero98772@190.158.28.32) (Remote host closed the connection)
2024-11-14 13:00:04 +0100caconym(~caconym@user/caconym) (Quit: bye)
2024-11-14 13:01:18 +0100 <Rembane> hellwolf: Nah, it's sound. Not having to deal with side effects makes code so much easier to write, read and test.
2024-11-14 13:02:11 +0100caconym(~caconym@user/caconym) caconym
2024-11-14 13:03:23 +0100 <hellwolf> I hesitate to make a connection with germophobia, since I personally am the opposite of a germophobe.
2024-11-14 13:04:00 +0100 <Leary> hellwolf: Welcome to the oasis of sanity.
2024-11-14 13:04:09 +0100 <Rembane> Some germs are quite good to not be in contact with IMO
2024-11-14 13:05:28 +0100 <hellwolf> like unsafePerformIO?
2024-11-14 13:07:45 +0100acidjnk(~acidjnk@p200300d6e7283f73687bc11ede7922f8.dip0.t-ipconnect.de) (Ping timeout: 248 seconds)
2024-11-14 13:09:11 +0100misterfish(~misterfis@31-161-39-137.biz.kpn.net) (Ping timeout: 252 seconds)
2024-11-14 13:09:23 +0100mari-estel(~mari-este@user/mari-estel) mari-estel
2024-11-14 13:13:49 +0100 <dminuoso> unsafePerformIO is indeed quite unsafe. :-)
2024-11-14 13:16:33 +0100hellwolflike when the label holds true to itself.
2024-11-14 13:17:57 +0100 <dminuoso> It was a simple case of something like `replicate n (unsafePerformIO (newIORef []))`, which GHC happily refactored into `let x = unsafePerformIO (newIORef []) in replicate n x`
2024-11-14 13:18:14 +0100housemate(~housemate@146.70.66.228) (Quit: "I saw it in a tiktok video and thought that it was the most smartest answer ever." ~ AnonOps Radio [some time some place] | I AM THE DERIVATIVE I AM GOING TANGENT TO THE CURVE!)
2024-11-14 13:18:15 +0100 <dminuoso> (In reality the code was far more sophisticated, so it was not obvious how or why this happened)
2024-11-14 13:18:46 +0100 <dminuoso> I mean actually there was a `traverse_` in there too.
2024-11-14 13:19:48 +0100 <dminuoso> Yeah I think it was something like `unsafePerformIO (traverse_ (\_ -> newIORef []) xs)` and GHC successfully floated that IORef out
2024-11-14 13:20:06 +0100 <dminuoso> Ill have to dig through the commit history to find this one.
2024-11-14 13:20:14 +0100 <hellwolf> which code base?
2024-11-14 13:20:26 +0100 <dminuoso> An internal compiler of ours.
2024-11-14 13:20:32 +0100 <dminuoso> No, those examples I named are both wrong. Mmm.
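The original code isn't shown (and the examples above are, as noted, not quite it), but a minimal sketch of the general hazard, assuming GHC's full-laziness pass is what did the floating:

```haskell
import Data.IORef (IORef, newIORef)
import System.IO.Unsafe (unsafePerformIO)

-- Intent: one fresh mutable cell per element of xs.
freshRefs :: [a] -> [IORef [b]]
freshRefs xs = map (\_ -> unsafePerformIO (newIORef [])) xs

-- Under -O, full laziness may float the constant subexpression out of the
-- lambda, roughly:
--
--   freshRefs xs = let r = unsafePerformIO (newIORef [])
--                  in map (\_ -> r) xs
--
-- after which every "fresh" cell is the same IORef.  The usual mitigations
-- are {-# NOINLINE #-} plus -fno-cse / -fno-full-laziness, or doing the
-- allocation in IO in the first place.
```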
2024-11-14 13:22:02 +0100 <dminuoso> hellwolf: Anyway, IO can still be a useful tool, especially if you want any kind of introspectability of whats going on (say logging or debugging)
2024-11-14 13:22:21 +0100 <dminuoso> Pure code is often cumbersome to debug
2024-11-14 13:22:42 +0100 <dminuoso> Consider something like GHC, where large portions work in IO
2024-11-14 13:23:59 +0100mari59415(~mari-este@user/mari-estel) mari-estel
2024-11-14 13:24:25 +0100arahael_(~arahael@user/arahael) (Quit: leaving)
2024-11-14 13:25:23 +0100arahael_(~arahael@user/arahael) arahael
2024-11-14 13:26:03 +0100mari-estel(~mari-este@user/mari-estel) (Ping timeout: 252 seconds)
2024-11-14 13:26:39 +0100tromp(~textual@92-110-219-57.cable.dynamic.v4.ziggo.nl) (Quit: My iMac has gone to sleep. ZZZzzz…)
2024-11-14 13:27:19 +0100 <hellwolf> runTrace your_pure_fn your_trace_filters ...currying your_pure_fn_args...
2024-11-14 13:28:06 +0100 <dminuoso> What is `runTrace` supposed to be here?
2024-11-14 13:28:25 +0100 <hellwolf> that'd be my ideal way of tracing into your pure fn in a principled way. I am not entirely sure how feasible/difficult it could be; I did some things that involve some aspects of such a thing.
2024-11-14 13:28:32 +0100misterfish(~misterfis@31-161-39-137.biz.kpn.net) misterfish
2024-11-14 13:28:48 +0100 <hellwolf> sorry, typed too slow. I meant to propose a hypothetical
2024-11-14 13:31:46 +0100haskellbridge(~hackager@syn-024-093-192-219.res.spectrum.com) (Remote host closed the connection)
2024-11-14 13:31:51 +0100 <lortabac> "Pure code is often cumbersome to debug" *with GHC*
2024-11-14 13:32:28 +0100 <lortabac> I don't think we should see lack of observability as an intrinsic property of pure computations
2024-11-14 13:32:36 +0100haskellbridge(~hackager@syn-024-093-192-219.res.spectrum.com) hackager
2024-11-14 13:32:36 +0100ChanServ+v haskellbridge
2024-11-14 13:32:42 +0100 <mari59415> no mentions about pure code being amounts easier to test
2024-11-14 13:33:53 +0100 <hellwolf> But I find the habit of spending more time thinking than examining what happened is a better use of time. Of course, on the contrary, Linus notoriously promoted the idea of printf debugging. So I guess the tools influence how you do troubleshooting.
2024-11-14 13:34:38 +0100 <hellwolf> exactly, mari59415, it is a problem most applicable to impure code. For pure code, you write properties (which means thinking a lot about what you are writing.)
2024-11-14 13:35:34 +0100mari-estel(~mari-este@user/mari-estel) mari-estel
2024-11-14 13:36:10 +0100 <hellwolf> fwiw, @dminuoso, I had a small example here https://discourse.haskell.org/t/variable-arity-currying-helper/10659 that decorates "let foo' = curry' (MkFn foo)" but that assumes all arguments are "showable". to make it runTrace, you'd need to have a default instance for all types, and then overlapping instances for Show, Num, Functor, etc.
2024-11-14 13:37:49 +0100mari59415(~mari-este@user/mari-estel) (Read error: Connection reset by peer)
2024-11-14 13:38:40 +0100 <mari-estel> huh properties help equally with pure and monadic
2024-11-14 13:38:40 +0100 <mari-estel> prints or traces are a good way to collect test samples while troubleshooting
2024-11-14 13:41:35 +0100 <hellwolf> Does Trace.trace help?
2024-11-14 13:41:53 +0100 <hellwolf> Debug.Trace (trace)
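For reference, a minimal sketch of what Debug.Trace gives you on pure code; note that output follows demand rather than source order, because trace only fires when its result is forced:

```haskell
import Debug.Trace (trace)

-- Each call logs its argument when (and only when) the result is demanded.
fib :: Int -> Integer
fib n = trace ("fib " ++ show n) go
  where
    go | n < 2     = fromIntegral n
       | otherwise = fib (n - 1) + fib (n - 2)
```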
2024-11-14 13:43:13 +0100mari73904(~mari-este@user/mari-estel) mari-estel
2024-11-14 13:44:19 +0100mari-estel(~mari-este@user/mari-estel) (Read error: Connection reset by peer)
2024-11-14 13:44:51 +0100mari-estel(~mari-este@user/mari-estel) mari-estel
2024-11-14 13:45:55 +0100 <kuribas`> The problem is that the GHC debugger follows the imperative model of debugging (stepping through, etc.)
2024-11-14 13:46:10 +0100 <kuribas`> A more useful pure debugger would allow you to choose which expression to evaluate.
2024-11-14 13:46:40 +0100 <kuribas`> In the end, laziness doesn't specify an order of execution.
2024-11-14 13:47:00 +0100 <kuribas`> As long as the semantics are preserved.
2024-11-14 13:47:01 +0100mari-estel(~mari-este@user/mari-estel) (Client Quit)
2024-11-14 13:47:50 +0100mari73904(~mari-este@user/mari-estel) (Ping timeout: 255 seconds)
2024-11-14 13:48:41 +0100 <__monty__> That would also cause confusion though. Since sometimes referential transparency is a lie. And it's easy to convince yourself that the expressions must surely be evaluating in the order you think they are.
2024-11-14 13:58:57 +0100 <kuribas`> __monty__: how can it be a lie with "pure" code?
2024-11-14 13:59:08 +0100 <kuribas`> Assuming it doesn't use unsafePerformIO.
2024-11-14 13:59:51 +0100alphazone(~alphazone@2.219.56.221)
2024-11-14 14:02:26 +0100 <__monty__> There's the rub : )
2024-11-14 14:05:11 +0100 <haskellbridge> <hellwolf> the lie is limited to the extent that, if your program is not total, the debugger might hit a bottom you intended to leave untouched.
2024-11-14 14:12:04 +0100 <kuribas`> > head [1, undefined]
2024-11-14 14:12:05 +0100 <lambdabot> 1
2024-11-14 14:12:30 +0100 <kuribas`> If you would evaluate the second element of the list, the debugger should not halt the whole expression.
2024-11-14 14:13:44 +0100 <bailsman> Huh, are mutable vectors a scam? `VM.iforM_ mv $ \i x -> VM.write mv i (updateValue x)` is considerably slower for simple objects, and barely faster than `map updateValue` even for large complex objects.
2024-11-14 14:14:28 +0100 <geekosaur> they will definitely have costs you don't incur with immutable vectors
2024-11-14 14:15:25 +0100 <bailsman> So the use cases are considerably more niche than I thought. Like if you need to exchange two elements or something, the pure version would have to copy the entire thing and the mutable version only two elements. But for most cases, it's a bait?
2024-11-14 14:16:30 +0100 <bailsman> If you expect to touch every element, just use map.
2024-11-14 14:16:51 +0100 <geekosaur> pretty much
2024-11-14 14:17:21 +0100 <geekosaur> it's still going to do copies, I think, and more of them the more elements you touch. but I'm not sure how that plays out for vector
2024-11-14 14:18:05 +0100 <geekosaur> for Array it's split into "cards" and modifications within a single card are batched so only a single copy needs to be done by the mutator, AIUI
2024-11-14 14:18:13 +0100 <bailsman> I tried looking at it with -ddump-simpl and the mutable version doesn't compile to simple code at all. What should be like 5 assembly instructions turns into several pages of assembly.
2024-11-14 14:18:30 +0100 <geekosaur> but that's built into GC and I don't think vector can take advantage of it
2024-11-14 14:18:58 +0100 <bailsman> I think if you need a mutable algorithm maybe you should use the C FFI or something.
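For concreteness, a sketch of the two shapes being compared here; SmallRecord and updateValue are hypothetical stand-ins for the (unshown) benchmark code, using boxed vectors as in the VM.iforM_ snippet above:

```haskell
import Control.Monad (forM_)
import Control.Monad.ST (runST)
import qualified Data.Vector as V
import qualified Data.Vector.Mutable as VM

-- Hypothetical stand-ins for the benchmark's record and update function.
data SmallRecord = SmallRecord { value :: !Int, label :: !Char }

updateValue :: SmallRecord -> SmallRecord
updateValue r = r { value = value r + 1 }

-- Pure shape: one pass, one freshly allocated boxed vector.
bumpPure :: V.Vector SmallRecord -> V.Vector SmallRecord
bumpPure = V.map updateValue

-- "In-place" shape: thaw (i.e. copy) the vector, overwrite every slot,
-- freeze it again.  The copy plus per-element boxed reads and writes is a
-- large part of why this often loses to V.map or plain map.
bumpInPlace :: V.Vector SmallRecord -> V.Vector SmallRecord
bumpInPlace v = runST $ do
  mv <- V.thaw v
  forM_ [0 .. VM.length mv - 1] $ \i ->
    VM.modify mv updateValue i
  V.freeze mv
```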
2024-11-14 14:24:13 +0100alexherbo2(~alexherbo@2a02-8440-3313-668b-a9ec-921f-0511-ee3f.rev.sfr.net) alexherbo2
2024-11-14 14:32:56 +0100weary-traveler(~user@user/user363627) (Remote host closed the connection)
2024-11-14 14:33:10 +0100tromp(~textual@92-110-219-57.cable.dynamic.v4.ziggo.nl)
2024-11-14 14:35:46 +0100acidjnk(~acidjnk@p200300d6e7283f73687bc11ede7922f8.dip0.t-ipconnect.de) acidjnk
2024-11-14 14:38:28 +0100mari-estel(~mari-este@user/mari-estel) mari-estel
2024-11-14 14:48:24 +0100misterfish(~misterfis@31-161-39-137.biz.kpn.net) (Ping timeout: 276 seconds)
2024-11-14 14:50:45 +0100weary-traveler(~user@user/user363627) user363627
2024-11-14 14:56:06 +0100bitdex(~bitdex@gateway/tor-sasl/bitdex) (Quit: = "")
2024-11-14 15:01:26 +0100ash3en(~Thunderbi@149.222.147.110) ash3en
2024-11-14 15:05:34 +0100ash3en(~Thunderbi@149.222.147.110) (Client Quit)
2024-11-14 15:06:31 +0100L29Ah(~L29Ah@wikipedia/L29Ah) (Ping timeout: 265 seconds)
2024-11-14 15:10:02 +0100 <dminuoso> bailsman: Do you have the actual code and the generated core to look at?
2024-11-14 15:16:32 +0100mari-estel(~mari-este@user/mari-estel) (Quit: errands)
2024-11-14 15:18:37 +0100Sgeo(~Sgeo@user/sgeo) Sgeo
2024-11-14 15:30:28 +0100mari-estel(~mari-este@user/mari-estel) mari-estel
2024-11-14 15:35:09 +0100ash3en(~Thunderbi@149.222.147.110) ash3en
2024-11-14 15:35:40 +0100alexherbo2(~alexherbo@2a02-8440-3313-668b-a9ec-921f-0511-ee3f.rev.sfr.net) (Remote host closed the connection)
2024-11-14 15:35:45 +0100ash3en(~Thunderbi@149.222.147.110) (Client Quit)
2024-11-14 15:35:59 +0100alexherbo2(~alexherbo@2a02-8440-3313-668b-a9ec-921f-0511-ee3f.rev.sfr.net) alexherbo2
2024-11-14 15:36:27 +0100yaroot(~yaroot@2400:4052:ac0:d901:1cf4:2aff:fe51:c04c) (Read error: Connection reset by peer)
2024-11-14 15:36:41 +0100yaroot(~yaroot@2400:4052:ac0:d901:1cf4:2aff:fe51:c04c) yaroot
2024-11-14 15:39:55 +0100Cadey(~cadey@perl/impostor/xe) (Quit: WeeChat 4.4.2)
2024-11-14 15:41:12 +0100weary-traveler(~user@user/user363627) (Quit: Konversation terminated!)
2024-11-14 15:47:02 +0100billchenchina(~billchenc@2a0d:2580:ff0c:1:e3c9:c52b:a429:5bfe) billchenchina
2024-11-14 15:48:48 +0100 <bailsman> Plain old lists are consistently the fastest. I find that somewhat confusing, since in imperative languages linked lists are often slow.
2024-11-14 15:49:41 +0100 <geekosaur> if all you're doing is iterating through them, consider that ghc is optimized for that case: think of a list as a loop encoded as data
2024-11-14 15:49:51 +0100 <hellwolf> I mean, if you need to do a lot of random indexing, it's got to be slow. but for stream processing, it is probably the most efficient
2024-11-14 15:50:23 +0100 <geekosaur> allocation, gc, and iteration are all optimized because it's so common
2024-11-14 15:50:37 +0100 <haskellbridge> <Bowuigi> Reasoning imperatively in functional languages leads to bad performance in general
2024-11-14 15:50:40 +0100misterfish(~misterfis@31-161-39-137.biz.kpn.net) misterfish
2024-11-14 15:51:04 +0100ph88(~ph88@2a02:8109:9e26:c800:7ee4:dffc:4616:9e2a)
2024-11-14 15:52:00 +0100 <bailsman> I thought I needed to do a lot of random indexing. But, now I'm not sure if I shouldn't instead redesign everything so that it does not require random access.
2024-11-14 15:52:55 +0100 <haskellbridge> <Bowuigi> Have you tried any functional random access data structures?
2024-11-14 15:53:14 +0100 <haskellbridge> <Bowuigi> Data.Map is the first one that comes to mind
2024-11-14 15:53:38 +0100 <bailsman> Data.Vector.Map over a vector is consistently 4x slower than regular map over []. (Data.Map is 10x slower)
2024-11-14 15:54:06 +0100hgolden(~hgolden@2603:8000:9d00:3ed1:6c70:1ac0:d127:74dd) hgolden
2024-11-14 15:54:11 +0100 <hellwolf> "data Array i e" is also under rated.
2024-11-14 15:54:16 +0100 <geekosaur> right, map's going to be one of those cases that [] will work very well for
2024-11-14 15:55:03 +0100 <geekosaur> it actually compiles down to a tight loop in most cases, not the C-style linked list you might expect
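A sketch of the "list as a loop encoded as data" point; with -O2 a pipeline shaped like this usually fuses so that no intermediate (:) cells are allocated (checking -ddump-simpl confirms it case by case):

```haskell
-- sum, map and [1 .. n] are all fusion-friendly, so GHC typically turns
-- this into a tight accumulating loop rather than building two lists.
sumSquares :: Int -> Int
sumSquares n = sum (map (\x -> x * x) [1 .. n])
```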
2024-11-14 15:55:14 +0100 <ph88> when i have some code more or less in the shape of this thing https://hackage.haskell.org/package/containers-0.7/docs/Data-Tree.html#t:Tree how can i write code that changes `a` with State but there are two points to change it, when going down (into the leafs) and going up (back to the root)? also known as visitor pattern
2024-11-14 15:55:38 +0100 <geekosaur> ph88, are you aware of tree zippers?
2024-11-14 15:55:42 +0100 <ph88> no
2024-11-14 15:55:53 +0100 <geekosaur> sadly the first reference that comes to mind is on the wiki…
2024-11-14 15:56:06 +0100 <bailsman> I have some parts right now that use random access. But was thinking maybe I don't want to pay a 4x performance penalty just for random access.
2024-11-14 15:56:08 +0100 <hellwolf> (wiki has been fixed)
2024-11-14 15:56:16 +0100 <geekosaur> just found that, yes
2024-11-14 15:56:29 +0100 <geekosaur> actually hgolden in #h-i said there are still some style issues
2024-11-14 15:56:30 +0100 <bailsman> Awesome! Thank you to whoever fixed it
2024-11-14 15:56:30 +0100 <geekosaur> https://wiki.haskell.org/Zipper
2024-11-14 15:56:57 +0100 <geekosaur> it uses a tree as the example data structure, where most of them focus on lists which are the easiest case
2024-11-14 15:57:50 +0100 <haskellbridge> <Bowuigi> Gérard Huet's pearl "The Zipper" is also good if you don't mind OCaml
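For reference, a minimal rose-tree zipper over Data.Tree in the spirit of the wiki page and Huet's paper (names are illustrative); the moves can be run inside State so labels get updated both on the way down and on the way back up:

```haskell
import Data.Tree (Tree (..))

-- A crumb records everything around the focused subtree: the parent's
-- label, the siblings to its left (reversed) and to its right.
data Crumb a = Crumb a [Tree a] [Tree a]

type Zipper a = (Tree a, [Crumb a])

-- Descend into the i-th child, if it exists.
down :: Int -> Zipper a -> Maybe (Zipper a)
down i (Node x cs, bs) =
  case splitAt i cs of
    (ls, c : rs) -> Just (c, Crumb x (reverse ls) rs : bs)
    _            -> Nothing

-- Move back to the parent, reknitting the tree around the (possibly
-- modified) focus.
up :: Zipper a -> Maybe (Zipper a)
up (_, [])                 = Nothing
up (t, Crumb x ls rs : bs) = Just (Node x (reverse ls ++ t : rs), bs)

-- Edit the label (or replace the whole subtree) at the focus.
modifyLabel :: (a -> a) -> Zipper a -> Zipper a
modifyLabel f (Node x cs, bs) = (Node (f x) cs, bs)
```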
2024-11-14 15:58:09 +0100 <bailsman> What do you mean by tight loop? Surely it still has to allocate all the elements for the new list?
2024-11-14 15:58:25 +0100 <bailsman> Or does it turn into an in-place algorithm?
2024-11-14 15:58:44 +0100 <geekosaur> if your generation and consumption are written correctly, they get pipelined
2024-11-14 15:59:04 +0100 <bailsman> I don't know what any of those words mean
2024-11-14 15:59:10 +0100 <ph88> wiki got a makeover? i remember it being uglier
2024-11-14 15:59:34 +0100 <geekosaur> ph88, that's what I meant by style but also a mediawiki upgrade is what started the whole outage thing
2024-11-14 15:59:42 +0100 <bailsman> I am just doing [SmallRecord] -> [SmallRecord] by updating a field in the record
2024-11-14 15:59:43 +0100 <haskellbridge> <Bowuigi> GHC does dark magic to not actually use a linked list
2024-11-14 16:00:07 +0100 <geekosaur> bailsman, construction of the list vs. mapping through the list
2024-11-14 16:00:40 +0100 <geekosaur> in the optimal case, the list is never constructed as such, elements are fed directly to map as they are created
2024-11-14 16:01:05 +0100 <bailsman> Hey, no, that's cheating. Then I've written my benchmark wrong
2024-11-14 16:01:11 +0100 <bailsman> I need to benchmark the list already existing
2024-11-14 16:01:46 +0100 <bailsman> It has to actually be stored and loaded from memory to be a fair comparison.
2024-11-14 16:02:19 +0100 <bailsman> Why is understanding the performance of things so difficult aaargh
2024-11-14 16:02:30 +0100 <EvanR> yes, when you "write C in any language" in haskell, it's not optimal. Surprise
2024-11-14 16:02:33 +0100 <geekosaur> because everyone wants speeeeeeed
2024-11-14 16:02:59 +0100 <EvanR> haskell is weird that way. But it's actually not smart to write C in any language generally
2024-11-14 16:03:18 +0100 <geekosaur> (including C /gd&r)
2024-11-14 16:03:24 +0100weary-traveler(~user@user/user363627) user363627
2024-11-14 16:03:33 +0100 <bailsman> EvanR: That would be helpful advice if I automatically understood how to write idiomatic-and-performant code in Haskell - but unfortunately that wisdom is as yet inaccessible to me :P
2024-11-14 16:03:49 +0100 <EvanR> advice: forget anything you know about C and C++ and learn haskell
2024-11-14 16:04:02 +0100 <EvanR> also forget python for good measure
2024-11-14 16:04:47 +0100 <geekosaur> I think maybe if you want to understand idiomatic-and-performant, it might be worth looking at Chris Okasaki's thesis on functional data structures
2024-11-14 16:04:53 +0100 <haskellbridge> <Bowuigi> Because it is different to what you are used to. Functional languages can do optimizations that imperative langs can't, like list/fold/map/hylo fusion (AKA removing intermediate computations while traversing or creating stuff), safe(-ish) inlining, laziness stuff, etc
2024-11-14 16:05:47 +0100 <geekosaur> IIRC it's in OCaml instead of Haskell so it won't cover things like laziness, but it'll still teach you the zen of functional programming
2024-11-14 16:06:43 +0100 <bailsman> How do I force it to actually create the list? `smallRecs = force [... | ... <- ...]` did not change anything, map is still as fast as it was before. Maybe it wasn't cheating?
2024-11-14 16:06:57 +0100 <haskellbridge> <Bowuigi> Laziness is something you will want to learn at some point but for now you can use "{-# LANGUAGE Strict #-}" if you don't want laziness
2024-11-14 16:07:23 +0100 <bailsman> Or did the compiler optimize that out
2024-11-14 16:07:27 +0100 <EvanR> why are we trying to cripple haskell again by "actually creating lists" and enabling Strict xD
2024-11-14 16:08:02 +0100L29Ah(~L29Ah@wikipedia/L29Ah) L29Ah
2024-11-14 16:08:14 +0100 <geekosaur> there's multiple levels of cheating
2024-11-14 16:08:18 +0100 <geekosaur> build/foldr is one
2024-11-14 16:08:20 +0100 <haskellbridge> <Bowuigi> You can force the outermost constructor with "seq" (IIRC), a list's spine with "length", and the entire thing with "deepseq". Yeah Haskell has evaluation control
2024-11-14 16:08:48 +0100 <geekosaur> optimizing lists by treating them as loops is another
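To make Bowuigi's three levels concrete, a small demonstration (note that length forces a list's spine, not the elements inside it):

```haskell
import Control.DeepSeq (deepseq)

main :: IO ()
main = do
  let xs = [1, undefined, 3] :: [Int]
  -- seq forces only the outermost constructor (WHNF): the head (:) cell.
  xs `seq` putStrLn "seq: fine, elements untouched"
  -- length forces the whole spine, but still none of the elements.
  length xs `seq` putStrLn "length: fine, spine forced, elements untouched"
  -- deepseq forces the elements too, so this line throws from undefined.
  xs `deepseq` putStrLn "deepseq: never printed"
```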
2024-11-14 16:08:52 +0100 <ph88> geekosaur, i was mistaken, i have actually not one data structure to fit all of the tree but multiple like `data Program = Program a [Statement]` and `data Statement = Statement a Expression` (dummy examples). Can tree zippers work with this? or do i need another technique?
2024-11-14 16:09:51 +0100 <geekosaur> I don't know of any examples, but that doesn't seem much different from (say) a zipper for red-black trees
2024-11-14 16:10:04 +0100 <haskellbridge> <Bowuigi> You might need the slightly more general idea of the "derivative of a data structure" but it is essentially the same idea
2024-11-14 16:10:43 +0100 <bailsman> doing smallRecsDeep = smallRecs `deepseq` smallRecs did not change anything either
2024-11-14 16:11:03 +0100Square2(~Square4@user/square) (Ping timeout: 246 seconds)
2024-11-14 16:11:06 +0100 <geekosaur> right, I'm not sure it's the place to start buit the fundamentals of the zipper technique are http://strictlypositive.org/diff.pdf
2024-11-14 16:11:06 +0100 <bailsman> the benchmark is using `nf` so that should be forcing both the source list and the destination list to be actually created now, right? But it's exactly as fast as before
2024-11-14 16:11:37 +0100 <EvanR> that's just a definition, it would have to be evaluated to cause the normal form to be realized
2024-11-14 16:11:54 +0100 <geekosaur> given the stuff in that paper you should be able to construct a derivative-based zipper for any list-like or tree-like structure
2024-11-14 16:12:13 +0100 <EvanR> it might also be that the non deepseq version was "just as slow" for some reason
2024-11-14 16:12:16 +0100 <haskellbridge> <Bowuigi> I think that usage of deepseq means "fully evaluate smallRecs when smallRecs is evaluated" but I am probably wrong
2024-11-14 16:12:28 +0100 <lortabac> Bowuigi: probably worth mentioning that the Strict pragma only makes user definitions strict. So the rest of the ecosystem (including lists) will still be lazy
2024-11-14 16:12:51 +0100 <geekosaur> not even that, actually. "strict" in Haskell means WHNF
2024-11-14 16:13:00 +0100 <geekosaur> not `rnf`
2024-11-14 16:13:02 +0100 <lortabac> it won't magically make Haskell a strict language
2024-11-14 16:13:33 +0100 <haskellbridge> <Bowuigi> So it is StrictData but also for functions? Huh
2024-11-14 16:13:40 +0100 <lortabac> geekosaur: if you only use functions and data types that you define it shouldn't make a difference I guess
2024-11-14 16:13:56 +0100 <haskellbridge> <Bowuigi> Oh well, you can't make Haskell strict on a single pragma then
2024-11-14 16:14:27 +0100 <bailsman> How do I write this benchmark to ensure the list is already created when map runs and not streamed
2024-11-14 16:14:33 +0100 <bailsman> and the output list is created as well
2024-11-14 16:14:33 +0100 <geekosaur> and you really don't want to because a fair amount of the Prelude assumes laziness and will bottom if you somehow forced them to be strict
2024-11-14 16:14:38 +0100 <haskellbridge> <Bowuigi> AutomaticBang might have been a clearer name lol
2024-11-14 16:15:06 +0100 <bailsman> When I hear someone say AutomaticBang something different comes to mind than was probably intended
2024-11-14 16:15:25 +0100 <haskellbridge> <Bowuigi> Fair enough
2024-11-14 16:15:43 +0100acidjnk(~acidjnk@p200300d6e7283f73687bc11ede7922f8.dip0.t-ipconnect.de) (Ping timeout: 264 seconds)
2024-11-14 16:16:01 +0100 <lortabac> AutomaticExclamationMark
2024-11-14 16:16:20 +0100mari-estel(~mari-este@user/mari-estel) (Quit: on the move)
2024-11-14 16:16:33 +0100 <bailsman> Did I write this correctly? https://paste.tomsmeding.com/B6koT8Nx
2024-11-14 16:16:48 +0100 <bailsman> In my real-world-use-case I'm pretty sure the lists are going to have to be loaded from memory and cannot be streamed.
2024-11-14 16:16:58 +0100 <haskellbridge> <Bowuigi> bailsman foldr/build uses a rule, so just creating the list in a function that is not inlined (with "{-# NOINLINE createList #-}") may work, I don't have a GHC at hand to test though
2024-11-14 16:17:04 +0100 <EvanR> in IO somewhere realList <- evaluate (force list)
2024-11-14 16:17:09 +0100 <geekosaur> consider that loading can be streamed
2024-11-14 16:17:10 +0100 <EvanR> should do it
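The benchmark source is only in the paste, and the log never names the library, but the defaultMain/bench/nf names match criterion (tasty-bench exposes the same API); a sketch of EvanR's suggestion combined with env, which builds and fully forces the input outside the timed body:

```haskell
import Control.DeepSeq (force)
import Control.Exception (evaluate)
import Criterion.Main (bench, defaultMain, env, nf)

main :: IO ()
main =
  defaultMain
    [ -- env builds (and NFData-forces) the input once, outside the timed
      -- body, so only the map over an already-built list is measured.
      env (evaluate (force [1 .. 1000000 :: Int])) $ \xs ->
        bench "map (+1) over a pre-built list" (nf (map (+ 1)) xs)
    ]
```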
2024-11-14 16:17:14 +0100 <geekosaur> as can writing
2024-11-14 16:17:39 +0100 <geekosaur> in fact that's where streaming frameworks came from
2024-11-14 16:18:18 +0100 <bailsman> geekosaur: I'm fine that it streams loading and writing. But streaming the list generator into the update and never actually constructing the intermediate list is cheating for the purposes of the benchmark, since that won't be possible in the real use case.
2024-11-14 16:18:20 +0100 <EvanR> usually when you load a big list of stuff from I/O, the whole list will exist just because
2024-11-14 16:18:26 +0100 <EvanR> unless you use lazy I/O which is weird
2024-11-14 16:19:25 +0100 <EvanR> (this is not the case for writing a big list out to I/O, this is a case where you can get streaming, which is good)
2024-11-14 16:19:34 +0100 <geekosaur> or a streaming framework (conduit, pipes, streamly, …)
2024-11-14 16:19:55 +0100acidjnk(~acidjnk@p200300d6e7283f73687bc11ede7922f8.dip0.t-ipconnect.de) acidjnk
2024-11-14 16:20:27 +0100 <geekosaur> anyway if you really want to know if the compiler is "cheating", look at the Core (intermediate representation language, use `-ddump-ds -ddump-to-file`)
2024-11-14 16:20:43 +0100 <geekosaur> or for quick and dirty, play.haskell.org has a button to generate Core
2024-11-14 16:21:54 +0100 <geekosaur> sorry, `-ddump-simpl`
2024-11-14 16:22:01 +0100 <haskellbridge> <Bowuigi> If it is fusing anything it will be fairly obvious there. Reading Core is very necessary for doing very fast Haskell code
2024-11-14 16:22:18 +0100 <bailsman> Using `smallRecs <- evaluate $ force [... | ... <- ...]` makes no difference whatsoever. Map is still faster 4x, absolutely no difference in performance. So can I now conclude it was not cheating?
2024-11-14 16:23:32 +0100 <EvanR> no you should still read the Core dump
2024-11-14 16:24:05 +0100 <EvanR> haskell is so high level you can't conclude anything from the source code
2024-11-14 16:24:15 +0100 <EvanR> on the subject of low level optimizations
2024-11-14 16:24:21 +0100 <bailsman> I printed the -ddump-simpl output to a file but I have no real clue how to interpret what I'm looking at
2024-11-14 16:24:31 +0100 <EvanR> I think there was a core primer somewhere
2024-11-14 16:24:50 +0100 <EvanR> but essentially it's a simplified low level language that haskell is translated to
2024-11-14 16:25:01 +0100 <EvanR> before it's compiled and assembled
2024-11-14 16:25:05 +0100 <bailsman> The pure function just translates to: updatePure_r2HI = map @SmallRecord @SmallRecord updateValue_r2HH
2024-11-14 16:25:24 +0100 <bailsman> sorry list function, I guess they're all pure except the mutable vector one
2024-11-14 16:25:39 +0100 <EvanR> @SmallRecord is a type, updateValue_r2HH should be another thing defined in the dump somewhere
2024-11-14 16:25:39 +0100 <lambdabot> Unknown command, try @list
2024-11-14 16:26:07 +0100 <bailsman> EvanR: I posted the source code of my benchmark here. https://paste.tomsmeding.com/B6koT8Nx
2024-11-14 16:26:13 +0100 <bailsman> please point out any beginner mistakes there
2024-11-14 16:26:42 +0100 <bailsman> All are using the same updateValue function.
2024-11-14 16:26:49 +0100 <EvanR> trying to +1 everything in the collection?
2024-11-14 16:28:39 +0100 <EvanR> it's not clear what defaultMain and bench do
2024-11-14 16:28:44 +0100Inst_(~Inst@user/Inst) (Ping timeout: 272 seconds)
2024-11-14 16:29:18 +0100kuribas`(~user@ip-188-118-57-242.reverse.destiny.be) (Remote host closed the connection)
2024-11-14 16:29:31 +0100 <EvanR> or `nf' ?
2024-11-14 16:29:37 +0100 <bailsman> I copied that from some example code to do a benchmark somewhere
2024-11-14 16:29:46 +0100 <bailsman> I don't understand either but it printed some numbers to my console output
2024-11-14 16:29:48 +0100Inst(~Inst@user/Inst) Inst
2024-11-14 16:31:16 +0100alexherbo2(~alexherbo@2a02-8440-3313-668b-a9ec-921f-0511-ee3f.rev.sfr.net) (Remote host closed the connection)
2024-11-14 16:31:36 +0100alexherbo2(~alexherbo@2a02-8440-3313-668b-a9ec-921f-0511-ee3f.rev.sfr.net) alexherbo2
2024-11-14 16:32:01 +0100 <EvanR> well that will have a big effect on performance
2024-11-14 16:32:27 +0100 <EvanR> code doesn't do anything in isolation, the evaluation is on demand
2024-11-14 16:32:33 +0100 <bailsman> I'd like to understand exactly what's going on to make map so much faster.
2024-11-14 16:33:36 +0100 <EvanR> well, mapping a list to get another list is much simpler than building a big tree or copying a vector so you can mutate it
2024-11-14 16:33:49 +0100 <bailsman> Why is it simpler? It's the same operation
2024-11-14 16:33:57 +0100 <EvanR> even simpler if the source list already exists and doesn't need to be evaluated
2024-11-14 16:33:58 +0100 <bailsman> It should be harder because you need to allocate and create a linked list
2024-11-14 16:34:44 +0100 <bailsman> My intuitions are completely wrong, but I don't know exactly why.
2024-11-14 16:35:07 +0100 <ph88> geekosaur, i went back and forth with chatgpt for a bit. Could you take a peek at this document, specifically on line 490 https://bpa.st/MSVA it made an example with tree zippers to implement something for each type, which i don't want. Is there a way to use tree zippers without resorting to generic programming solutions such as GHC.Generics, syb, lens or Data.Data ?
2024-11-14 16:35:20 +0100 <EvanR> you may or may not be allocating any list nodes due to fusion, but even if you did, that's 1 node per item. Meanwhile the IntMap has a more complex structure and the Vector is larger, even if you ignore the fact that you have to copy it
2024-11-14 16:35:31 +0100 <bailsman> Why is the vector larger?
2024-11-14 16:35:46 +0100 <EvanR> it's larger than 1 list node
2024-11-14 16:35:55 +0100 <bailsman> but there's only 1 of them, not 1 million
2024-11-14 16:36:40 +0100 <EvanR> and 1 megabyte chunk of Vector might not play as nice with the GC
2024-11-14 16:37:04 +0100 <EvanR> it goes back to how your "bench" thing is processing the final list, 1 by 1, it's nicer on the GC
2024-11-14 16:37:56 +0100 <haskellbridge> <flip101> Bowuigi: could you please take a look as well?
2024-11-14 16:38:01 +0100philopsos(~caecilius@user/philopsos) philopsos
2024-11-14 16:38:45 +0100 <bailsman> I'm expecting the vector version to compile to something like `nv = new Vector(v.length); for (int i = 0; i < v.length; ++i) nv[i] = updateValue(v[i])`. One allocation, extremely simple update. Whereas the linked list version has to allocate 1M nodes and set up each of their 'next' pointers, so it seems like it should be doing more work.
2024-11-14 16:38:58 +0100 <EvanR> and again, the benchmark code might have gotten optimized so there are no list nodes, other than the source list
2024-11-14 16:39:07 +0100 <bailsman> How do I prevent it from doing that?
2024-11-14 16:39:24 +0100 <EvanR> go to the benchmark code and cripple that
2024-11-14 16:39:37 +0100 <EvanR> fully evaluate the final list before doing whatever it does with it
2024-11-14 16:39:46 +0100 <bailsman> Isn't that what I'm doing already?
2024-11-14 16:39:52 +0100 <bailsman> That's what the nf was for right?
2024-11-14 16:40:04 +0100 <EvanR> I have no idea, I don't see what nf is or bench is
2024-11-14 16:40:20 +0100 <EvanR> right now all I see is "map updateValue someList"
2024-11-14 16:41:16 +0100 <EvanR> finalList <- evaluate (force (map updateValue someList)) ought to slow it down more
2024-11-14 16:41:23 +0100 <bailsman> nf :: NFData b => (a -> b) -> a -> Benchmarkable
2024-11-14 16:41:39 +0100 <EvanR> I'm not familiar with Benchmarkable
2024-11-14 16:42:07 +0100 <EvanR> if nf works, computes full normal form, sounds bad for performance
2024-11-14 16:42:16 +0100 <EvanR> in the case of list
2024-11-14 16:42:26 +0100 <geekosaur> ph88, it's doable without any of those but it's harder since you have to write it all yourself. those libraries exist for a reason
2024-11-14 16:43:17 +0100 <EvanR> when I was tooling with the profiling and performance I would make sure to write my own main IO action so I know what's what
2024-11-14 16:43:32 +0100 <EvanR> control what ultimately is demanding evaluation
2024-11-14 16:43:39 +0100 <geekosaur> especially when you have multiple data types
2024-11-14 16:43:53 +0100 <bailsman> Anyway, I guess we can assume that it isn't cheating, it is actually constructing the intermediate list, and most of the performance difference is going to come from map being a builtin and the vector code not compiling to anything nearly as simple as what I expected. So it's not map being fast, it's map being slowish, and vector being slower, I think.
2024-11-14 16:44:05 +0100 <ph88> geekosaur, doable .. would i have to write code for each data type?
2024-11-14 16:44:14 +0100 <geekosaur> exactly, yes
2024-11-14 16:44:17 +0100misterfish(~misterfis@31-161-39-137.biz.kpn.net) (Ping timeout: 248 seconds)
2024-11-14 16:44:34 +0100 <ph88> that's going to take so much time, the AST is absolutely huge
2024-11-14 16:44:43 +0100 <geekosaur> that's where generics or syb come in, they generate the necessary code for you
2024-11-14 16:44:49 +0100 <EvanR> 4x faster isn't that much of a difference, it seems plausible you're creating the whole structure for everything. It's not like a 1000x speedup that you'd normally see when you switch from full evaluation to lazy evaluation
2024-11-14 16:45:50 +0100 <ph88> geekosaur, do you think it's still worth to use zippers but then to combine them with a generic approach? i am not sure whether i can go up and down with other approaches such as lens or GHC.Generics
2024-11-14 16:46:30 +0100tromp(~textual@92-110-219-57.cable.dynamic.v4.ziggo.nl) (Quit: My iMac has gone to sleep. ZZZzzz…)
2024-11-14 16:46:34 +0100 <EvanR> bailsman, Vector shines when you start to combine chains of operations together; it fuses away intermediate vectors
2024-11-14 16:46:45 +0100 <bailsman> I only do one operation.
2024-11-14 16:46:55 +0100 <EvanR> so you won't see that benefit there
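A sketch of the kind of chain where vector's stream fusion pays off (a single map, as in the benchmark, gets no such benefit):

```haskell
import qualified Data.Vector.Unboxed as U

-- These three passes usually fuse into one loop over the input, allocating
-- only the final vector rather than two intermediates.
pipeline :: U.Vector Int -> U.Vector Int
pipeline = U.map (* 2) . U.filter even . U.map (+ 1)
```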
2024-11-14 16:46:58 +0100 <geekosaur> you're conflating things, syb/generics/uniplate are mechanism, lens uses the mechanism. and lens should indeed be able to navigate up/down
2024-11-14 16:47:46 +0100 <EvanR> again, "I don't know how this benchmark library works, but I'll assume a bunch of conclusions" isn't as good as writing your own code then profiling
2024-11-14 16:48:08 +0100 <EvanR> and looking at the core, of your own code
2024-11-14 16:53:17 +0100 <geekosaur> ph88, it's easier to replace lens there with something else (such as a zipper) than it is to replace the generics mechanism needed to make lens/a zipper/whatever useful
2024-11-14 16:54:05 +0100 <geekosaur> if, as you say, "that's going to take so much time, the AST is absolutely huge", you need generics of some variety to escape that
2024-11-14 16:54:18 +0100 <geekosaur> that's why generics packages exist
2024-11-14 16:54:45 +0100 <ph88> why would i want this? "it's easier to replace lens there with something else (such as a zipper)"
2024-11-14 16:55:03 +0100 <ph88> i have neither, and i like something to traverse while not having to write traversal code for each type
2024-11-14 16:55:26 +0100 <ph88> as i understood it can be ghc.generics with zipper, or lens or maybe something else
2024-11-14 16:55:35 +0100 <geekosaur> then use generics to derive the traversal (all of the generics packages do so in some fashion)
2024-11-14 16:56:11 +0100 <ph88> and you still recommend to do the traversal with zipper yes? (with code derived with generics)
2024-11-14 16:56:27 +0100 <geekosaur> although the default traversals are all of the Traversable variety, unlike a zipper which lets you move at will
2024-11-14 16:56:38 +0100 <geekosaur> which it sounded like you wanted
2024-11-14 16:56:55 +0100 <geekosaur> if you just want something Traversable-style, any generics library will give you that
2024-11-14 16:56:59 +0100 <ph88> what if i don't only want to change the variable `a` but i also want to inspect the nodes and modify/replace them ?
2024-11-14 16:57:24 +0100 <ph88> can zipper do this too ?
2024-11-14 16:57:24 +0100 <geekosaur> that'd be a zipper
2024-11-14 16:57:30 +0100 <ph88> ok cool, thanks geekosaur !
2024-11-14 16:57:58 +0100 <geekosaur> you can do anything to the focused node including remove or replace it, and moving the zipper will reknit the tree
2024-11-14 16:58:42 +0100lortabac(~lortabac@2a01:e0a:541:b8f0:55ab:e185:7f81:54a4) (Quit: WeeChat 4.4.2)
2024-11-14 16:58:51 +0100 <geekosaur> even if it won't work with your structure as is, the wiki page I pointed you to earlier describes what you can do with a zipper
2024-11-14 16:59:12 +0100 <geekosaur> and the tree example is probably closer to your actual AST than a list zipper example would be
2024-11-14 17:00:06 +0100 <bailsman> To test my theory, I wrote a C version of the benchmark. Updating a linked list by allocating nodes one by one and copying over the values takes 14ms, approximately as long as Haskell takes to do map. Updating 1M records inplace in an array takes 2ms.
2024-11-14 17:00:30 +0100 <bailsman> So I think I'm concluding that map is "the best you can do in haskell" because it's optimized and a builtin, and any attempt to do in place algorithms is just going to be massively slow.
2024-11-14 17:00:31 +0100 <EvanR> that's... not going to be an apples to apples comparison
2024-11-14 17:00:38 +0100 <EvanR> are you allocating nodes with malloc
2024-11-14 17:00:57 +0100 <EvanR> allocating nodes in haskell is much faster
2024-11-14 17:01:08 +0100 <bailsman> No it isn't.
2024-11-14 17:01:22 +0100 <EvanR> yes it is
2024-11-14 17:01:56 +0100 <geekosaur> bailsman, what do you think is going on during an allocation?
2024-11-14 17:02:06 +0100 <geekosaur> because it's probably not what actually happens
2024-11-14 17:03:13 +0100 <bailsman> I agree - I'm not really sure. Some GC magic probably. But the point is that it's builtin and optimized, so it's much faster than trying to emulate in-place updates, which compiles to a morass of work and not 5 asm instructions like the c version.
2024-11-14 17:03:20 +0100 <geekosaur> not magic
2024-11-14 17:03:31 +0100 <geekosaur> the nursery/gen 0 is a bump-pointer allocator
2024-11-14 17:03:44 +0100 <geekosaur> gc only gets involved when the pointer reaches the end of the nursery
2024-11-14 17:04:50 +0100 <EvanR> "straight list processing and immutable structures are probably better in haskell than C-like mutable array munging" though is what I've been saying for days
2024-11-14 17:05:01 +0100 <EvanR> but the specific reasons are off
2024-11-14 17:06:10 +0100 <EvanR> before claiming stuff about what stuff compiles to you should check it
2024-11-14 17:06:20 +0100 <bailsman> To me the fact that the Haskell Vector is ~100ms, Haskell map is ~25ms, C allocate-new-linked-list-and-copy version is ~15ms, C array in place is ~2ms is suggestive of the fact that indeed allocating a list is slow, and it's indeed what Haskell is doing, but it's still better than trying to do an array in Haskell.
2024-11-14 17:06:59 +0100 <EvanR> the C version of linked list is just a bad thing to compare to haskell list unless you are careful to emulate what the haskell version did
2024-11-14 17:07:25 +0100 <EvanR> "they are both called list" isn't that inspiring
2024-11-14 17:08:09 +0100 <EvanR> list and arrays in haskell are both good for certain purposes
2024-11-14 17:08:44 +0100 <EvanR> in the case of list, usually not as a data structure
2024-11-14 17:08:53 +0100 <EvanR> but as a looping mechanism
2024-11-14 17:09:22 +0100 <bailsman> I agree with your conclusion - stop trying to be clever and just learn what idiomatic haskell code looks like.
2024-11-14 17:09:25 +0100 <EvanR> in the case of arrays, for lookup tables
2024-11-14 17:10:03 +0100 <bailsman> If you write idiomatic haskell, you get as-slow-as-you-would-expect, if you try to write in-place code, you get way-slower-than-you-would-expect.
2024-11-14 17:10:34 +0100 <EvanR> not necessarily, sometimes idiomatic haskell is faster
2024-11-14 17:11:19 +0100 <EvanR> in any case idiomatic haskell is a starting point for getting into the weeds for optimization
2024-11-14 17:12:30 +0100 <Inst> @bailsman
2024-11-14 17:12:30 +0100 <lambdabot> Unknown command, try @list
2024-11-14 17:12:36 +0100 <Inst> try compile with -fllvm
2024-11-14 17:14:34 +0100 <bailsman> Inst: I compiled my benchmark with -O2 -fllvm. Does not seem meaningfully different. Is -O2 the wrong optimization level?
2024-11-14 17:16:14 +0100 <EvanR> is llvm not the default now anyway
2024-11-14 17:16:16 +0100 <Inst> probably MY skill issue :(
2024-11-14 17:16:35 +0100 <tomsmeding> EvanR: it definitely is not
2024-11-14 17:16:39 +0100 <EvanR> ok
2024-11-14 17:19:01 +0100tromp(~textual@92-110-219-57.cable.dynamic.v4.ziggo.nl)
2024-11-14 17:19:31 +0100 <tomsmeding> :)
2024-11-14 17:20:48 +0100aljazmc(~aljazmc@user/aljazmc) aljazmc
2024-11-14 17:21:27 +0100 <geekosaur> llvm still lacks support for pre-CPSed code
2024-11-14 17:33:44 +0100 <haskellbridge> <Bowuigi> Now that everything is solved, it's time to move to something else
2024-11-14 17:34:26 +0100 <haskellbridge> <Bowuigi> It turns out that first class labels are just Proxy on a kind ranging over every possible label
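A tiny sketch of what Bowuigi's observation can look like in code, under the assumption that the kind of labels is Symbol (that choice is illustrative):

    {-# LANGUAGE DataKinds, KindSignatures #-}

    import Data.Proxy (Proxy (..))
    import GHC.TypeLits (Symbol)

    -- A first-class label is just a Proxy indexed by a type-level label;
    -- here the label kind is assumed to be Symbol.
    type Label (name :: Symbol) = Proxy name

    fooLabel :: Label "foo"
    fooLabel = Proxy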
2024-11-14 17:37:00 +0100Digitteknohippie(~user@user/digit) Digit
2024-11-14 17:37:19 +0100Digit(~user@user/digit) (Ping timeout: 265 seconds)
2024-11-14 17:40:17 +0100 <bailsman> Hmmm. I had Claude.AI write an unboxed small record instance with 50+ lines of code (to my eyes absolutely horrific). Then, using Data.Vector.Unboxed.Mutable the performance is now approaching the C in-place update speed. I don't entirely trust that this won't segfault at some point, but if claude.ai did everything correctly then apparently it *is* possible to write inplace algorithms, you just
2024-11-14 17:40:18 +0100 <bailsman> need to write unboxed instances for all of your data types.
2024-11-14 17:41:51 +0100 <geekosaur> well, yes, that helps
2024-11-14 17:42:00 +0100 <geekosaur> otherwise it'll be chasing a lot of pointers
2024-11-14 17:42:20 +0100 <haskellbridge> <Bowuigi> Oh yeah unboxing and strict data type fields can help in optimizing in general
2024-11-14 17:42:44 +0100 <bailsman> It went from 4x slower to 10x faster than plain `map`
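For reference, a sketch of the "derive the unboxed instances as tuples" pattern being discussed, using the IsoUnbox/As machinery that vector 0.13 and later ship (older versions typically use the vector-th-unbox package's derivingUnbox instead). The Point type and its fields are made-up stand-ins for the real record:

    {-# LANGUAGE TypeFamilies, MultiParamTypeClasses, StandaloneDeriving, DerivingVia #-}

    import qualified Data.Vector.Generic         as VG
    import qualified Data.Vector.Generic.Mutable as VGM
    import qualified Data.Vector.Unboxed         as VU

    -- Hypothetical record standing in for the real data type.
    data Point = Point { px :: !Double, py :: !Double }

    -- Say how Point converts to and from a tuple of already-unboxable fields.
    instance VU.IsoUnbox Point (Double, Double) where
      toURepr (Point a b) = (a, b)
      fromURepr (a, b)    = Point a b

    -- Represent (M)Vector Point by (M)Vector (Double, Double) and derive the rest.
    newtype instance VU.MVector s Point = MV_Point (VU.MVector s (Double, Double))
    newtype instance VU.Vector    Point = V_Point  (VU.Vector    (Double, Double))
    deriving via (VU.As Point (Double, Double)) instance VGM.MVector VU.MVector Point
    deriving via (VU.As Point (Double, Double)) instance VG.Vector VU.Vector Point
    instance VU.Unbox Point

The pure code only ever sees VU.Vector Point; the tuple representation is an internal detail of the instances.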
2024-11-14 17:44:33 +0100Digitteknohippie(~user@user/digit) (Ping timeout: 252 seconds)
2024-11-14 17:46:08 +0100Digit(~user@user/digit) Digit
2024-11-14 17:58:01 +0100aljazmc(~aljazmc@user/aljazmc) (Remote host closed the connection)
2024-11-14 17:58:27 +0100aljazmc(~aljazmc@user/aljazmc) aljazmc
2024-11-14 17:58:53 +0100housemate(~housemate@146.70.66.228) housemate
2024-11-14 18:03:50 +0100mantraofpie_mantraofpie
2024-11-14 18:05:45 +0100machinedgod(~machinedg@d108-173-18-100.abhsia.telus.net) (Ping timeout: 252 seconds)
2024-11-14 18:10:17 +0100mantraofpie(~mantraofp@user/mantraofpie) (Quit: ZNC 1.9.1 - https://znc.in)
2024-11-14 18:10:57 +0100mantraofpie(~mantraofp@user/mantraofpie) mantraofpie
2024-11-14 18:11:25 +0100Inst_(~Inst@user/Inst) Inst
2024-11-14 18:12:03 +0100emfrom(~emfrom@37.168.28.138)
2024-11-14 18:13:09 +0100Inst(~Inst@user/Inst) (Ping timeout: 276 seconds)
2024-11-14 18:21:58 +0100emfrom(~emfrom@37.168.28.138) (Remote host closed the connection)
2024-11-14 18:22:11 +0100 <bailsman> Wait, so apparently I can derive the unboxed instances with minimal boilerplate (as tuples), and the pure world doesn't even need to know or care that I did that at all. I can write it idiomatically. And it's now as fast as C
2024-11-14 18:22:15 +0100 <bailsman> why did nobody tell me :P
2024-11-14 18:23:30 +0100 <bailsman> Please tell me it's not going to segfault on me if I move forward with this in more complex examples
2024-11-14 18:25:10 +0100 <tomsmeding> bailsman: "please tell me" if you show the code, perhaps we can :)
2024-11-14 18:26:10 +0100 <bailsman> updateValue is pure. This is the 'inplace map': `runST $ do; mv <- VU.unsafeThaw v; VUM.iforM_ mv $ \i s -> VUM.write mv i $! updateValue s; VU.unsafeFreeze mv`
2024-11-14 18:26:43 +0100alexherbo2(~alexherbo@2a02-8440-3313-668b-a9ec-921f-0511-ee3f.rev.sfr.net) (Remote host closed the connection)
2024-11-14 18:27:04 +0100alexherbo2(~alexherbo@2a02-8440-3313-668b-a9ec-921f-0511-ee3f.rev.sfr.net) alexherbo2
2024-11-14 18:31:34 +0100 <bailsman> https://paste.tomsmeding.com/yaTzqQA3
2024-11-14 18:31:59 +0100 <bailsman> more readable on multiple lines
2024-11-14 18:32:55 +0100 <bailsman> I should probably find a way to keep it mutable permanently rather than thawing and freezing
2024-11-14 18:33:13 +0100 <tomsmeding> bailsman: yes, mutating an immutable vector is sure to produce very strange issues
2024-11-14 18:33:32 +0100peterbecich(~Thunderbi@syn-047-229-123-186.res.spectrum.com) peterbecich
2024-11-14 18:33:35 +0100 <tomsmeding> GHC assumes that immutable values don't change and sometimes optimises quite aggressively based on that assumption
2024-11-14 18:33:54 +0100 <tomsmeding> please don't do this :p
2024-11-14 18:34:19 +0100 <tomsmeding> work in ST and keep the thing mutable while you're mutating it
2024-11-14 18:34:51 +0100wootehfoot(~wootehfoo@user/wootehfoot) wootehfoot
2024-11-14 18:35:42 +0100 <bailsman> Even if I make sure that the code with mutable reference has fully evaluated before any code with immutable references tries to read?
2024-11-14 18:36:47 +0100 <bailsman> I actually really like the performance now - I'd like to fully understand the dragons on my path.
2024-11-14 18:37:26 +0100 <geekosaur> ST will ensure that for you
2024-11-14 18:37:43 +0100 <bailsman> Replacing the code with the safe versions of freeze and thaw makes it 3x slower
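One way to keep the safety without paying a thaw/freeze pair on every call is to hand the whole loop to VU.modify, which runs the destructive part inside ST on at most one fresh copy. A sketch, assuming updateValue is the pure per-element function from the paste:

    import qualified Data.Vector.Unboxed         as VU
    import qualified Data.Vector.Unboxed.Mutable as VUM

    -- Safe "in-place map": VU.modify copies the vector (the copy may be fused
    -- away) and runs the loop in ST, so no unsafeThaw/unsafeFreeze is needed
    -- and the original immutable vector is never mutated.
    updateAll :: VU.Unbox a => (a -> a) -> VU.Vector a -> VU.Vector a
    updateAll f = VU.modify go
      where
        go mv = loop 0
          where
            loop i
              | i >= VUM.length mv = pure ()
              | otherwise = do
                  x <- VUM.read mv i
                  VUM.write mv i $! f x
                  loop (i + 1)

Usage would be `updateAll updateValue v` in place of the runST/unsafeThaw pipeline above.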
2024-11-14 18:38:56 +0100housemate(~housemate@146.70.66.228) (Quit: "I saw it in a tiktok video and thought that it was the most smartest answer ever." ~ AnonOps Radio [some time some place] | I AM THE DERIVATIVE I AM GOING TANGENT TO THE CURVE!)
2024-11-14 18:39:30 +0100tromp(~textual@92-110-219-57.cable.dynamic.v4.ziggo.nl) (Quit: My iMac has gone to sleep. ZZZzzz…)
2024-11-14 18:44:03 +0100ft(~ft@p4fc2a216.dip0.t-ipconnect.de) ft
2024-11-14 18:46:27 +0100tzh(~tzh@c-76-115-131-146.hsd1.or.comcast.net) tzh
2024-11-14 18:47:10 +0100housemate(~housemate@146.70.66.228) housemate
2024-11-14 18:50:58 +0100 <tomsmeding> bailsman: yes, depending on how you ensure that it is fully evaluated
2024-11-14 18:51:15 +0100 <tomsmeding> GHC may just decide that "fully evaluating" the values can also happen a bit later
2024-11-14 18:51:26 +0100 <tomsmeding> (again depending on the precise code)
2024-11-14 18:51:29 +0100 <bailsman> evaluate $ force
2024-11-14 18:51:32 +0100 <tomsmeding> in IO?
2024-11-14 18:51:35 +0100peterbecich(~Thunderbi@syn-047-229-123-186.res.spectrum.com) (Ping timeout: 255 seconds)
2024-11-14 18:51:42 +0100 <bailsman> sure
2024-11-14 18:51:47 +0100 <tomsmeding> (evaluate runs in IO)
2024-11-14 18:51:50 +0100 <geekosaur> evaluate has to be in IO
2024-11-14 18:51:51 +0100 <tomsmeding> yes IO sequences, so that's fine
2024-11-14 18:52:12 +0100 <tomsmeding> I would still not do this
2024-11-14 18:53:01 +0100 <tomsmeding> bailsman: try putting the whole mutable part in ST, so that you only have one unsafeThaw and one unsafeFreeze; presumably that should still be fast
2024-11-14 18:53:20 +0100 <tomsmeding> then change the unsafeThaw to thaw, because with the unsafeThaw + unsafeFreeze you're still in unsafe world
2024-11-14 18:54:10 +0100 <tomsmeding> perhaps you can ensure that you don't create the initial vector as immutable, but instead as mutable, so that you never have to thaw it
2024-11-14 18:54:16 +0100 <bailsman> The 'challenge' I have is I'm not sure how to store a mutable array outside the monad. Currently I'm freezing it, then doing writeIORef, then when it runs again I do readIORef followed by thaw.
2024-11-14 18:54:17 +0100 <tomsmeding> that would save the introduced copy
2024-11-14 18:54:18 +0100housemate(~housemate@146.70.66.228) (Read error: Connection reset by peer)
2024-11-14 18:54:30 +0100housemate(~housemate@146.70.66.228) housemate
2024-11-14 18:54:32 +0100 <tomsmeding> the point is that a mutable array _must_ live inside the monad :p
2024-11-14 18:54:39 +0100 <tomsmeding> that's the whole point of the interface, and what makes it safe
2024-11-14 18:55:06 +0100 <tomsmeding> a mutable array is a _reference_, you only need a read-only reference to the MVector to be able to modify the underlying storage
2024-11-14 18:55:09 +0100 <bailsman> this part isn't safe. This is the fast part. The rest of my code I want to write idiomatically/pure.
2024-11-14 18:55:27 +0100 <tomsmeding> no code needs to _return_ an MVector, just create one, then pass it _to_ everything
2024-11-14 18:55:45 +0100 <tomsmeding> (think: there's already an "IORef" (ish) inside the MVector type)
2024-11-14 18:55:51 +0100 <bailsman> I need to store it somewhere between invocations from javascript :P
2024-11-14 18:56:01 +0100 <tomsmeding> O.o
2024-11-14 18:56:05 +0100 <tomsmeding> okay that changes the picture
2024-11-14 18:56:17 +0100 <tomsmeding> make it an IOVector instead of an STVector?
2024-11-14 18:56:35 +0100 <tomsmeding> I don't recall the JS FFI well, but can't exported haskell functions run in IO?
2024-11-14 18:56:41 +0100 <tomsmeding> then you can just put the IOVector wherever
2024-11-14 18:57:17 +0100 <bailsman> They totally can. I'm being called from javascript, that JSFFI function is in IO, currently I'm faking a global variable by doing unsafePerformIO (newIORef ...) then reading/writing to the IORef

2024-11-14 18:57:26 +0100 <tomsmeding> that sounds like a decent plan
2024-11-14 18:57:36 +0100 <tomsmeding> just make that an IOVector, not an STVector :p
2024-11-14 18:57:45 +0100 <tomsmeding> never need to thaw/freeze it
2024-11-14 18:58:24 +0100 <tomsmeding> bailsman: make sure to put {-# NOINLINE #-} on that global variable
2024-11-14 18:58:44 +0100 <bailsman> yep I did that. So my global variable is a record with a bunch of vectors in it. I haven't figured out yet how to make those mutable.
2024-11-14 18:58:54 +0100 <tomsmeding> make them IOVectors?
2024-11-14 18:59:02 +0100bailsmangoogles iovectors
2024-11-14 18:59:13 +0100 <tomsmeding> vector:Data.Vector.Mutable (IOVector)
2024-11-14 18:59:22 +0100 <tomsmeding> perhaps Data.Vector.Unboxed.Mutable
2024-11-14 18:59:47 +0100 <bailsman> and that means I can put them in a record and then I can writeIORef that record ? And everything works?
2024-11-14 18:59:56 +0100 <tomsmeding> don't writeIORef that record
2024-11-14 19:00:00 +0100 <tomsmeding> just read the IOVectors
2024-11-14 19:00:08 +0100 <tomsmeding> they're already references
2024-11-14 19:00:23 +0100 <bailsman> so I need to noinline unsafePerformIO the iovectors
2024-11-14 19:00:27 +0100 <tomsmeding> yes
2024-11-14 19:00:42 +0100JuanDaugherty(~juan@user/JuanDaugherty) JuanDaugherty
2024-11-14 19:01:04 +0100housemate(~housemate@146.70.66.228) (Remote host closed the connection)
2024-11-14 19:01:33 +0100housemate(~housemate@146.70.66.228) housemate
2024-11-14 19:01:38 +0100housemate(~housemate@146.70.66.228) (Remote host closed the connection)
2024-11-14 19:01:43 +0100 <bailsman> I see. And then I just use them. And GHC understands that they were mutable the whole time and so it doesn't do anything unsafe.
2024-11-14 19:01:51 +0100 <tomsmeding> yes
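A sketch of the pattern tomsmeding describes, with made-up names and size: a top-level unboxed IOVector created once via unsafePerformIO, marked NOINLINE so GHC does not duplicate the call, then simply read and written from IO (e.g. from the JSFFI entry points), with no IORef, thaw, or freeze anywhere.

    import qualified Data.Vector.Unboxed.Mutable as VUM
    import System.IO.Unsafe (unsafePerformIO)

    -- Hypothetical global mutable state; element type and size are illustrative.
    -- NOINLINE (plus the monomorphic type) keeps GHC from duplicating the
    -- unsafePerformIO call, so there is exactly one vector.
    {-# NOINLINE globalState #-}
    globalState :: VUM.IOVector Double
    globalState = unsafePerformIO (VUM.replicate 1000000 0)

    -- An entry point that mutates it directly in IO; the vector is itself a
    -- reference, so nothing ever needs to be written back anywhere.
    bumpAll :: IO ()
    bumpAll = go 0
      where
        n = VUM.length globalState
        go i
          | i >= n    = pure ()
          | otherwise = do
              x <- VUM.read globalState i
              VUM.write globalState i (x + 1)
              go (i + 1)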
2024-11-14 19:02:26 +0100visilii(~visilii@213.24.126.184)
2024-11-14 19:02:49 +0100visilii_(~visilii@213.24.127.47) (Ping timeout: 260 seconds)
2024-11-14 19:06:26 +0100ljdarj(~Thunderbi@user/ljdarj) ljdarj
2024-11-14 19:09:05 +0100peterbecich(~Thunderbi@syn-047-229-123-186.res.spectrum.com) peterbecich
2024-11-14 19:09:33 +0100tromp(~textual@92-110-219-57.cable.dynamic.v4.ziggo.nl)
2024-11-14 19:09:50 +0100alexherbo2(~alexherbo@2a02-8440-3313-668b-a9ec-921f-0511-ee3f.rev.sfr.net) (Remote host closed the connection)
2024-11-14 19:11:58 +0100housemate(~housemate@146.70.66.228) housemate
2024-11-14 19:25:51 +0100shapr(~user@2601:19c:417e:5434:4627:df70:3bf6:c2cb) shapr
2024-11-14 19:26:17 +0100 <shapr> I'm writing a small blog post about using Haskell for "shell scripting", or maybe "how to run a Haskell source file"
2024-11-14 19:26:37 +0100 <shapr> The only two ways I know are: #!/usr/bin/env runhaskell
2024-11-14 19:27:49 +0100 <shapr> and something similar with cabal: https://github.com/shapr/randomtesting/blob/main/Hedgehog.hs#L3
2024-11-14 19:27:58 +0100 <shapr> Am I missing anything?
2024-11-14 19:29:09 +0100hellwolf(~user@2001:1530:70:545:809e:22e1:baa3:1e4c) (Ping timeout: 246 seconds)
2024-11-14 19:29:19 +0100 <tomsmeding> documentation about the latter here: https://cabal.readthedocs.io/en/stable/cabal-commands.html#cabal-run
2024-11-14 19:29:38 +0100 <shapr> tomsmeding: thank you!
2024-11-14 19:30:24 +0100 <tomsmeding> similarly https://docs.haskellstack.org/en/stable/topics/scripts/
2024-11-14 19:32:59 +0100 <shapr> I'll add that as well
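For readers who have not seen one, a minimal cabal script in the shape those docs describe (the body is illustrative):

    #!/usr/bin/env cabal
    {- cabal:
    build-depends: base
    -}

    main :: IO ()
    main = putStrLn "hello from a cabal script"

Running the file directly (after chmod +x) is equivalent to `cabal run` on it.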
2024-11-14 19:35:30 +0100 <sm> shapr: quite a bit, I must tell you :)
2024-11-14 19:36:09 +0100 <shapr> oh no, what else am I missing?
2024-11-14 19:36:57 +0100 <sm> stack scripts and cabal scripts are important for using deps / reproducibility / sharing, already mentioned by tomsmeding
2024-11-14 19:37:32 +0100 <sm> there have been lots of attempts to bridge haskell and scripting - https://hackage.haskell.org/packages/search?terms=shell probably includes some of them (sorry)
2024-11-14 19:37:52 +0100 <sm> https://chrisdone.github.io/hell is the most recent and a particularly interesting one IMHO
2024-11-14 19:38:41 +0100 <shapr> That makes the subject larger than I can put into a single blog post
2024-11-14 19:38:56 +0100 <shapr> tempting, but for now I'm gonna cover "how to run a Haskell source file like it's a shell script"
2024-11-14 19:39:01 +0100 <sm> Yeah. I hope you'll mention the issue of keeping it running beyond a single ghc release at least. Maybe another post :)
2024-11-14 19:39:17 +0100 <shapr> uh, what is that issue?
2024-11-14 19:39:26 +0100housemate(~housemate@146.70.66.228) (Quit: "I saw it in a tiktok video and thought that it was the most smartest answer ever." ~ AnonOps Radio [some time some place] | I AM THE DERIVATIVE I AM GOING TANGENT TO THE CURVE!)
2024-11-14 19:39:42 +0100 <shapr> tell me more?
2024-11-14 19:39:53 +0100 <sm> if you don't pin down ghc (& base) and any other deps your script uses, it's likely to break eventually (or possibly very soon), unlike normal shell scripts
2024-11-14 19:40:10 +0100 <shapr> oh, interesting
2024-11-14 19:40:23 +0100 <sm> just because of ghc's evolution and tight/fragile dependencies between haskell packages
2024-11-14 19:40:28 +0100 <shapr> Now that I know, I will mention this!
2024-11-14 19:41:33 +0100peterbecich(~Thunderbi@syn-047-229-123-186.res.spectrum.com) (Ping timeout: 276 seconds)
2024-11-14 19:41:37 +0100housemate(~housemate@146.70.66.228) housemate
2024-11-14 19:41:40 +0100 <sm> (and, getting it working may require either some coding to update it, or installing a few gigabytes of old ghc and deps that may or may not run easily on your machine)
2024-11-14 19:41:53 +0100 <sm> sorry, I'm just giving the most negative (but real) case
2024-11-14 19:42:23 +0100 <sm> feel free to ignore me :)
2024-11-14 19:42:28 +0100housemate(~housemate@146.70.66.228) (Max SendQ exceeded)
2024-11-14 19:42:34 +0100 <shapr> I like to know all the bits around the edges!
2024-11-14 19:42:38 +0100shaprhugs sm
2024-11-14 19:43:14 +0100housemate(~housemate@146.70.66.228) housemate
2024-11-14 19:43:22 +0100smbounces
2024-11-14 19:45:17 +0100 <sm> other issues if scripts are to be shared: how to make a robust shebang line
2024-11-14 19:45:23 +0100housemate(~housemate@146.70.66.228) (Max SendQ exceeded)
2024-11-14 19:45:52 +0100 <sm> ah you showed one. If more arguments are needed, some platforms require env -S
2024-11-14 19:46:25 +0100 <shapr> What's env -S ?
2024-11-14 19:46:44 +0100housemate(~housemate@146.70.66.228) housemate
2024-11-14 19:46:56 +0100 <shapr> The benefit of writing blog posts is that I learn so much!
2024-11-14 19:47:11 +0100 <geekosaur> (but some systems that require it don't have it. then again I'm not sure how many people run Illumos)
2024-11-14 19:47:21 +0100 <sm> I think env requires -S on *bsd (& mac) to allow extra arguments after the executable, GNU/linux doesn't. Or vice versa.
2024-11-14 19:47:33 +0100 <geekosaur> the former
2024-11-14 19:47:49 +0100 <geekosaur> historically the entire shebang line after the interpreter name was passed as a single parameter
2024-11-14 19:47:58 +0100 <geekosaur> -S word-splits the parameter
2024-11-14 19:48:11 +0100 <sm> 👍🏻
2024-11-14 19:48:23 +0100 <geekosaur> (shebangs originated on BSD)
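Concretely, that is why a multi-word interpreter line is usually written with -S, so env re-splits it into separate arguments; hellwolf suggests this exact shape further down. The cabal flags here are illustrative:

    #!/usr/bin/env -S cabal run -v1
    -- -S makes env word-split "cabal run -v1" back into separate arguments;
    -- without it, the tail of the shebang line is passed as a single argument.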
2024-11-14 19:48:42 +0100 <shapr> Yeah, I just realized I don't know whether # was a comment BEFORE #! was the magic bytes for "this is an executable"
2024-11-14 19:48:53 +0100housemate(~housemate@146.70.66.228) (Max SendQ exceeded)
2024-11-14 19:49:01 +0100 <geekosaur> it was a comment… for csh
2024-11-14 19:49:21 +0100 <shapr> oh wow, so csh had # comments before #! was magic bytes?
2024-11-14 19:49:22 +0100 <geekosaur> on sh there was a libc hack, if exec() failed it retried it via sh
2024-11-14 19:49:24 +0100 <sm> just remember it's very easy to write a runhaskell script, add some standard-looking imports, then share it or come back in a few months and find it depends on packages which are no longer installed or installed but changed
2024-11-14 19:49:26 +0100 <geekosaur> yes
2024-11-14 19:49:32 +0100 <shapr> wow, thanks
2024-11-14 19:49:43 +0100 <shapr> sm: yeah, I'll mention that
2024-11-14 19:49:44 +0100housemate(~housemate@146.70.66.228) housemate
2024-11-14 19:50:40 +0100 <geekosaur> back then sh's idea of a comment was :, and it was actually a null command so you needed to quote any special characters in the "comment"
2024-11-14 19:50:58 +0100smwonders what shebang line works in native windows shells (cmd, powershell) - none ?
2024-11-14 19:51:17 +0100 <geekosaur> I think powershell has something vaguely shebang-like, cmd doesn't
2024-11-14 19:51:24 +0100 <geekosaur> os/2 used to have EXTPROC
2024-11-14 19:51:53 +0100housemate(~housemate@146.70.66.228) (Max SendQ exceeded)
2024-11-14 19:52:03 +0100 <shapr> ooh this is a fun read https://en.wikipedia.org/wiki/Shebang_%28Unix%29#Version_8_improved_shell_scripts
2024-11-14 19:52:37 +0100 <shapr> I was a small child when #! was added
2024-11-14 19:53:44 +0100housemate(~housemate@146.70.66.228) housemate
2024-11-14 19:54:33 +0100 <sm> I was a small child too when I was a lad
2024-11-14 19:54:36 +0100 <shapr> haha
2024-11-14 19:55:03 +0100smwas 12 and about to see a computer next year
2024-11-14 19:55:08 +0100 <geekosaur> sophomore in high school
2024-11-14 19:55:17 +0100Tuplanolla(~Tuplanoll@91-159-69-59.elisa-laajakaista.fi) Tuplanolla
2024-11-14 19:55:53 +0100housemate(~housemate@146.70.66.228) (Max SendQ exceeded)
2024-11-14 19:55:53 +0100 <geekosaur> banging on a TRS-80 Model I during off periods
2024-11-14 19:56:02 +0100 <geekosaur> teaching myself Z80 assembly language
2024-11-14 19:56:15 +0100 <sm> sweet. Commodore Pet & 6502 checking in!
2024-11-14 19:56:39 +0100 <geekosaur> I had that at home courtesy of my father
2024-11-14 19:56:44 +0100housemate(~housemate@146.70.66.228) housemate
2024-11-14 19:56:47 +0100 <sm> ohhhh
2024-11-14 19:57:08 +0100 <sm> was he a scientist ?
2024-11-14 19:57:19 +0100 <geekosaur> well, OSI SuperBoard II (the mainboard minus components), then he bought the components separately and had someone at work wave-solder it all together
2024-11-14 19:57:25 +0100 <tomsmeding> sm: re base versions with scripts: a cabal script has a build-depends block at the top, so you can put a version constraint on that ;)
2024-11-14 19:57:31 +0100 <tomsmeding> (not that anybody does that)
2024-11-14 19:57:45 +0100 <sm> tomsmeding yes exactly, that's why stack/cabal scripts are important
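For example, the metadata block of a cabal script can carry ordinary version bounds, which pins the dependencies (and, via base, indirectly the usable GHC range). The bounds below are illustrative only:

    #!/usr/bin/env -S cabal run -v1
    {- cabal:
    build-depends:
        base   >=4.18 && <4.19
      , vector >=0.13 && <0.14
    -}

    main :: IO ()
    main = putStrLn "pinned deps"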
2024-11-14 19:58:03 +0100 <geekosaur> salesman, actually, but for a company that sold electronic (for 1980 values of electronic) parts to industry
2024-11-14 19:58:14 +0100chele(~chele@user/chele) (Remote host closed the connection)
2024-11-14 19:58:19 +0100 <sm> nice
2024-11-14 19:59:10 +0100housemate(~housemate@146.70.66.228) (Max SendQ exceeded)
2024-11-14 19:59:40 +0100housemate(~housemate@146.70.66.228) housemate
2024-11-14 20:01:39 +0100 <sm> stack provides the `script` command specifically for this purpose - it will shout at you if you leave anything unspecified, so you won't forget
2024-11-14 20:02:05 +0100housemate(~housemate@146.70.66.228) (Max SendQ exceeded)
2024-11-14 20:02:33 +0100housemate(~housemate@146.70.66.228) housemate
2024-11-14 20:04:46 +0100housemate(~housemate@146.70.66.228) (Max SendQ exceeded)
2024-11-14 20:04:49 +0100Lestat9(~Admin1@47.203.239.77)
2024-11-14 20:04:53 +0100JuanDaugherty(~juan@user/JuanDaugherty) (Quit: JuanDaugherty)
2024-11-14 20:05:14 +0100housemate(~housemate@146.70.66.228) housemate
2024-11-14 20:07:29 +0100housemate(~housemate@146.70.66.228) (Max SendQ exceeded)
2024-11-14 20:07:57 +0100housemate(~housemate@146.70.66.228) housemate
2024-11-14 20:08:49 +0100housemate(~housemate@146.70.66.228) (Remote host closed the connection)
2024-11-14 20:09:14 +0100housemate(~housemate@146.70.66.228) housemate
2024-11-14 20:11:36 +0100housemate(~housemate@146.70.66.228) (Max SendQ exceeded)
2024-11-14 20:12:04 +0100housemate(~housemate@146.70.66.228) housemate
2024-11-14 20:14:27 +0100housemate(~housemate@146.70.66.228) (Max SendQ exceeded)
2024-11-14 20:14:55 +0100housemate(~housemate@146.70.66.228) housemate
2024-11-14 20:15:49 +0100housemate(~housemate@146.70.66.228) (Remote host closed the connection)
2024-11-14 20:16:14 +0100housemate(~housemate@146.70.66.228) housemate
2024-11-14 20:16:38 +0100hellwolf(~user@da7f-daa3-a2f4-21df-0f00-4d40-07d0-2001.sta.estpak.ee) hellwolf
2024-11-14 20:17:04 +0100 <hellwolf> basic question regarding cabal... if you specify a version that is not the "HEAD" of the package, would cabal be able to retrieve it from somewhere still?
2024-11-14 20:18:30 +0100Lestat9(~Admin1@47.203.239.77) (K-Lined)
2024-11-14 20:18:44 +0100housemate(~housemate@146.70.66.228) (Max SendQ exceeded)
2024-11-14 20:25:15 +0100 <sm> sure.. all old versions are stored on hackage
2024-11-14 20:31:55 +0100 <c_wraith> ... give or take some that had security issues so bad it was determined no one should ever use that version
2024-11-14 20:39:58 +0100 <geekosaur> even those are on hackage, just flagged as deprecated
2024-11-14 20:40:17 +0100 <geekosaur> hackage is immutable append-only storage
2024-11-14 20:46:32 +0100 <lxsameer> I just found out about bluefin, and omg it's much much clearer than effectful for me, kudos!!!
2024-11-14 20:47:03 +0100 <shapr> So far, only problem is that using cabal as the interpreter directive takes about two minutes on first run, ouch
2024-11-14 20:48:35 +0100ljdarj1(~Thunderbi@user/ljdarj) ljdarj
2024-11-14 20:50:47 +0100 <c_wraith> I recently made my CI cache the cabal index as well as built packages. Turns out it can load from the cache *marginally* faster than from hackage.
2024-11-14 20:51:08 +0100 <shapr> tiny blog post: https://www.scannedinavian.com/how-to-run-haskell-source-files-like-shell-scripts.html
2024-11-14 20:52:06 +0100ljdarj(~Thunderbi@user/ljdarj) (Ping timeout: 265 seconds)
2024-11-14 20:52:06 +0100ljdarj1ljdarj
2024-11-14 20:52:30 +0100weary-traveler(~user@user/user363627) (Remote host closed the connection)
2024-11-14 20:52:31 +0100mantraofpie_(~mantraofp@user/mantraofpie) mantraofpie
2024-11-14 20:52:46 +0100CoolMa7(~CoolMa7@ip5f5b8957.dynamic.kabel-deutschland.de) CoolMa7
2024-11-14 20:52:48 +0100 <c_wraith> I may need to look into using the partial-hackage-mirror script in the cache as well.
2024-11-14 20:53:26 +0100mantraofpie(~mantraofp@user/mantraofpie) (Ping timeout: 260 seconds)
2024-11-14 20:53:30 +0100chexum_(~quassel@gateway/tor-sasl/chexum) chexum
2024-11-14 20:53:46 +0100mantraofpie_mantraofpie
2024-11-14 20:54:01 +0100chexum(~quassel@gateway/tor-sasl/chexum) (Ping timeout: 260 seconds)
2024-11-14 20:54:39 +0100shapr(~user@2601:19c:417e:5434:4627:df70:3bf6:c2cb) (Quit: walkies)
2024-11-14 20:55:50 +0100 <hellwolf> Cabal scripts are convenient; you just need to be aware of the junk they leave behind: ~/.cabal/script-builds/. I wish it had a convenient way of running the script in interpreted mode though, without creating any artifacts.
2024-11-14 20:55:58 +0100 <hellwolf> that'd be even more "script-ish"
2024-11-14 20:57:14 +0100sprotte24(~sprotte24@p200300d16f0f4e0080b9b718c313bb1e.dip0.t-ipconnect.de)
2024-11-14 20:57:56 +0100 <lxsameer> why don't you folks nix? out of curiosity
2024-11-14 20:59:37 +0100tromp(~textual@92-110-219-57.cable.dynamic.v4.ziggo.nl) (Quit: My iMac has gone to sleep. ZZZzzz…)
2024-11-14 20:59:45 +0100 <lxsameer> * use nix
2024-11-14 21:00:02 +0100 <c_wraith> I like the theory, but every time I've tried to use it it's been immensely more complexity than benefit
2024-11-14 21:00:03 +0100caconym(~caconym@user/caconym) (Quit: bye)
2024-11-14 21:00:43 +0100caconym(~caconym@user/caconym) caconym
2024-11-14 21:00:44 +0100hellwolf(~user@da7f-daa3-a2f4-21df-0f00-4d40-07d0-2001.sta.estpak.ee) (Quit: rcirc on GNU Emacs 29.4)
2024-11-14 21:00:57 +0100 <lxsameer> c_wraith: how come?
2024-11-14 21:01:00 +0100 <c_wraith> (Not helped by attempting to use NixOS at one of the few times it actually was installing broken stuff. Not fair, but I still associate that experience with nix)
2024-11-14 21:01:02 +0100hellwolf(~user@da7f-daa3-a2f4-21df-0f00-4d40-07d0-2001.sta.estpak.ee) hellwolf
2024-11-14 21:01:31 +0100 <geekosaur> cabal rebuilds on first run and when the script changes. sadly its solver is really slow; people are working on that, though
2024-11-14 21:01:56 +0100hellwolf(~user@da7f-daa3-a2f4-21df-0f00-4d40-07d0-2001.sta.estpak.ee) ()
2024-11-14 21:01:58 +0100 <geekosaur> stack is much faster because instead of using a solver it gets fixed versions from the snapshot and `extra-deps`
2024-11-14 21:02:00 +0100hellwolf(~user@da7f-daa3-a2f4-21df-0f00-4d40-07d0-2001.sta.estpak.ee) hellwolf
2024-11-14 21:02:23 +0100 <geekosaur> (consider that a solver in this case is a constraint satisfaction solver, so yes, it's pretty slow)
2024-11-14 21:02:40 +0100 <lxsameer> c_wraith: ahh I see. I use nix with cabal, the good thing is I use nix packages and it's pretty fast
2024-11-14 21:03:27 +0100ash3en(~Thunderbi@193.32.248.167) ash3en
2024-11-14 21:05:09 +0100 <statusbot> Status update: Wiki.haskell.org is serving content again, but the upgrade is ongoing and various configs/css still need to be restored. -- http://status.haskell.org/pages/incident/537c07b0cf1fad5830000093/6728f5b530789205372a3361
2024-11-14 21:05:37 +0100 <lxsameer> geekosaur: I use cabal freeze, and the nix uses that file to setup the env for me. so far so good
2024-11-14 21:06:29 +0100agent314(~quassel@static-198-44-129-53.cust.tzulo.com) (Ping timeout: 260 seconds)
2024-11-14 21:06:57 +0100housemate(~housemate@146.70.66.228) housemate
2024-11-14 21:07:48 +0100housemate(~housemate@146.70.66.228) (Max SendQ exceeded)
2024-11-14 21:08:36 +0100housemate(~housemate@146.70.66.228) housemate
2024-11-14 21:09:08 +0100billchenchina(~billchenc@2a0d:2580:ff0c:1:e3c9:c52b:a429:5bfe) (Ping timeout: 245 seconds)
2024-11-14 21:10:57 +0100housemate(~housemate@146.70.66.228) (Max SendQ exceeded)
2024-11-14 21:11:11 +0100shapr(~user@4.30.215.226) shapr
2024-11-14 21:11:21 +0100ezzieyguywuf(~Unknown@user/ezzieyguywuf) (Ping timeout: 246 seconds)
2024-11-14 21:12:40 +0100housemate(~housemate@146.70.66.228) housemate
2024-11-14 21:12:50 +0100wootehfoot(~wootehfoo@user/wootehfoot) (Read error: Connection reset by peer)
2024-11-14 21:13:15 +0100 <jle`> thanks statusbot
2024-11-14 21:14:08 +0100machinedgod(~machinedg@d108-173-18-100.abhsia.telus.net) machinedgod
2024-11-14 21:14:22 +0100 <hellwolf> nix + cabal freeze file is underrated
2024-11-14 21:14:44 +0100housemate(~housemate@146.70.66.228) (Remote host closed the connection)
2024-11-14 21:14:53 +0100 <hellwolf> I am a reproducibility maximalist.
2024-11-14 21:14:54 +0100weary-traveler(~user@user/user363627) user363627
2024-11-14 21:15:54 +0100 <hellwolf> (in fairness stack freezes a set for you, too)
2024-11-14 21:15:59 +0100hellwolfsometimes living on the edge
2024-11-14 21:16:11 +0100guy(~guy@2a01:4b00:d007:ed00:81c3:85aa:e2c9:6027)
2024-11-14 21:16:23 +0100 <c_wraith> aw, darn. the partial-hackage-mirror script only grabs packages. It doesn't thin the index data.
2024-11-14 21:16:44 +0100 <guy> hi! i have made a recording describing haskell as a "nonlinear graphically complete programming language"
2024-11-14 21:16:46 +0100 <guy> https://voca.ro/14nNu3Nm5FaV
2024-11-14 21:16:57 +0100 <guy> i was wondering if anyone would like to take a listen and we could have a discussion
2024-11-14 21:17:18 +0100 <sm> shapr: nice post. What's cabal doing in the two minutes ?
2024-11-14 21:18:29 +0100 <shapr> sm: I haven't looked, and I didn't see anything obvious in the docs for `cabal run`
2024-11-14 21:18:53 +0100 <shapr> sm: I'd guess it's doing `cabal update` and then `cabal build` but I wouldn't expect it to take that long?
2024-11-14 21:19:13 +0100 <c_wraith> two minutes is not improbable for `cabal update` with no previous state to add to
2024-11-14 21:19:16 +0100 <hellwolf> @shapr, you can use "#!/usr/bin/env -S cabal run -v1"
2024-11-14 21:19:16 +0100 <lambdabot> Unknown command, try @list
2024-11-14 21:19:20 +0100 <hellwolf> shapr, you can use "#!/usr/bin/env -S cabal run -v1"
2024-11-14 21:19:35 +0100 <hellwolf> or -v2
2024-11-14 21:20:06 +0100 <hellwolf> most likely, it was building packages that you hadn't built for that version of GHC
2024-11-14 21:20:41 +0100 <shapr> Yeah, since I'm using NixOS and a just-created empty environment with `nix
2024-11-14 21:20:43 +0100 <sm> `stack script` can also take minutes, possibly many minutes, the first time you run a script, and it might appear hung for part of that time; adding --verbosity=info to the shebang line shows more progress output. Like cabal it could be building half of hackage (say your script uses pandoc or hakyll :). Unlike cabal it could be installing GHC, as well.
2024-11-14 21:20:43 +0100 <shapr> oops
2024-11-14 21:20:56 +0100 <shapr> `nix-shell -p cabal-install ghc` is what I used to test this.
2024-11-14 21:21:12 +0100 <shapr> sm: good point
2024-11-14 21:21:17 +0100guy15(~guy@2a01:4b00:d007:ed00:81c3:85aa:e2c9:6027)
2024-11-14 21:21:35 +0100guy15guy_
2024-11-14 21:21:38 +0100 <guy_> feel free to open up a dm conversation if you're listening along and i can answer any questions you might want to keep off the main channel, otherwise i guess i'll wait for about 20 mins to see if anyone makes it to the end of the voice note, and if anyone enjoys the theory and is interested in the work im doing
2024-11-14 21:21:41 +0100 <sm> more stack trivia: don't miss `stack script --compile`, which will auto (re)compile the script, or run the compiled version if it already exists, for instant startup
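And the stack-flavoured equivalent, for completeness: the options live in a `-- stack` comment under the shebang, the resolver pins GHC and package versions, and (per sm above) `--compile` can be added to those options to cache a compiled binary. Resolver and package names are illustrative:

    #!/usr/bin/env stack
    -- stack script --resolver lts-22.7 --package text
    {-# LANGUAGE OverloadedStrings #-}

    import qualified Data.Text.IO as T

    main :: IO ()
    main = T.putStrLn "hello from a stack script"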
2024-11-14 21:22:07 +0100 <guy_> (here for anyone that cant see the scrollup https://vocaroo.com/14nNu3Nm5FaV)
2024-11-14 21:22:22 +0100 <shapr> guy_: I comprehend text the fastest, is there a transcript?
2024-11-14 21:22:22 +0100 <c_wraith> pandoc and hakyll are... yeah. I patched hakyll to not accidentally build warp in CI (why did that need a patch?) and ripped pandoc out of the build pipeline entirely.
2024-11-14 21:22:43 +0100shaprworriedly checks his blog dependencies
2024-11-14 21:22:51 +0100 <guy_> shapr: not currently! it would be good if i had a voice to text tool, does anyone have a good tool for this?
2024-11-14 21:23:21 +0100 <guy_> it would be good to see what a GPT has to say on the subject
2024-11-14 21:23:33 +0100 <shapr> I don't have such a tool handy.
2024-11-14 21:23:34 +0100 <c_wraith> shapr: you might luckily be on an older version of hakyll that didn't accidentally build warp for a single data type import!
2024-11-14 21:23:53 +0100 <haskellbridge> <Bowuigi> guy_ The Whisper models are good
2024-11-14 21:23:55 +0100ash3en(~Thunderbi@193.32.248.167) (Quit: ash3en)
2024-11-14 21:23:59 +0100 <sm> c_wraith: similar - I abandoned hakyll and now always use pandoc via cli rather than importing
2024-11-14 21:24:03 +0100 <guy_> yeah, sorry about the format, thats currently the only available description i have
2024-11-14 21:24:29 +0100 <guy_> thanks haskellbridge, i'll see if i can find an easy interface to get a transcript together
2024-11-14 21:25:03 +0100 <haskellbridge> <Bowuigi> There's a faster whisper project somewhere, lemme find the link
2024-11-14 21:25:08 +0100 <c_wraith> I thought about using shake instead of hakyll, but even for a small amount of code the porting process seemed huge.
2024-11-14 21:25:13 +0100guy(~guy@2a01:4b00:d007:ed00:81c3:85aa:e2c9:6027) (Ping timeout: 256 seconds)
2024-11-14 21:25:29 +0100 <sm> shake is what I switched to, I love it
2024-11-14 21:26:03 +0100 <shapr> I am using pandoc to convert org-mode to html, but it's not pulling in warp, whew.
2024-11-14 21:26:07 +0100 <sm> with a few caveats, like you can have only one shake file in a project directory and can run it only once at a time
2024-11-14 21:26:09 +0100 <guy_> hmm.. whisper is in pythos so thats inaccessible to me, i can find this blog post about a sort of haskell port
2024-11-14 21:26:09 +0100 <guy_> https://www.reddit.com/r/haskell/comments/102bxc1/voice_assistant_app_in_haskell/
2024-11-14 21:26:14 +0100 <guy_> python*
2024-11-14 21:28:04 +0100 <c_wraith> also, I never really used make, so the process of learning how to use shake seemed large. Much of shake's documentation is very "you already know how to use make"-oriented.
2024-11-14 21:28:42 +0100 <guy_> says it only builds out of the box with nixos... https://gitlab.com/ludflu/vad-audio
2024-11-14 21:29:13 +0100 <guy_> never mind! it would be most simple if people could just listen to the recording, save the hassle! https://vocaroo.com/14nNu3Nm5FaV
2024-11-14 21:30:00 +0100 <guy_> im going afk for ~20 mins to give people time to listen through, see you at about 10 too
2024-11-14 21:30:33 +0100 <haskellbridge> <Bowuigi> Yeah that's easier, otherwise try https://github.com/ggerganov/whisper.cpp which is not Python but C++
2024-11-14 21:31:19 +0100 <sm> c_wraith I hear that. Even if you know make, Shake is not exactly a walk in the park to program, especially if you're not using it regularly.
2024-11-14 21:31:48 +0100 <sm> but anything I have implemented in it has been rock solid and I never had to worry about it again
2024-11-14 21:31:58 +0100 <c_wraith> But it's not like I'm short on free time these days. I should take another shot at it.
2024-11-14 21:32:12 +0100peterbecich(~Thunderbi@syn-047-229-123-186.res.spectrum.com) peterbecich
2024-11-14 21:33:01 +0100 <sm> https://github.com/simonmichael/hledger/blob/master/Shake.hs
2024-11-14 21:34:42 +0100 <sm> shapr: stack script example, if you want one ^
2024-11-14 21:36:20 +0100 <guy_> https://neilmitchell.blogspot.com/2021/09/reflecting-on-shake-build-system.html
2024-11-14 21:38:38 +0100 <shapr> sm: I'll pitch that link into the postscript. Also, I linked to your mastodon in the post, do you have some other preferred link target?
2024-11-14 21:39:59 +0100 <shapr> guy_: oui, c'est ca
2024-11-14 21:40:19 +0100 <shapr> I should stick to Swedish, my French has rusted away.
2024-11-14 21:40:23 +0100 <guy_> https://www.reddit.com/r/haskell/comments/i97lz7/is_there_something_similar_to_hakyll_using_shake/
2024-11-14 21:40:24 +0100 <guy_> this links to:
2024-11-14 21:40:24 +0100 <guy_> rib (as a static site generator - sounds like a good alternative to hakyll) https://github.com/srid/rib
2024-11-14 21:40:25 +0100 <guy_> and
2024-11-14 21:40:25 +0100 <guy_> https://github.com/ChrisPenner/slick (which claims to be a simpler alternative to hakyll/jekyll)
2024-11-14 21:41:05 +0100 <shapr> Oh speaking of hakyll, I had a feature request to include the blog post body in my RSS feed.
2024-11-14 21:41:08 +0100 <guy_> the mustache template specification seems interesting
2024-11-14 21:41:21 +0100 <sm> nice link guy_. I'd love to see Neil's latest build system, whatever it is
2024-11-14 21:41:35 +0100 <guy_> yeah, i wonder what he's up too!
2024-11-14 21:41:38 +0100 <sm> shapr thanks! https://joyful.com is the other
2024-11-14 21:41:59 +0100 <sm> probably not needed
2024-11-14 21:42:12 +0100 <guy_> sm: dead link
2024-11-14 21:42:40 +0100 <sm> holy.... ! thanks!
2024-11-14 21:42:49 +0100 <guy_> i've been away from the haskell community for a while, working on AGI research with sam altman and lex fridman
2024-11-14 21:42:51 +0100 <guy_> its been a blast
2024-11-14 21:43:08 +0100 <shapr> sm: oh no, website down?
2024-11-14 21:43:21 +0100 <guy_> just visiting #haskell to give some of this theory about "functor scheduling" and the implications it has on the haskell prelude
2024-11-14 21:43:28 +0100 <sm> Up but nicely blank. I didn't think you had to monitor a static website 😂
2024-11-14 21:43:43 +0100 <guy_> "nonlinear graphically complete languages" i think is a really strong result
2024-11-14 21:44:03 +0100 <guy_> just off the back of some abstraction i developed for mixture models for the AGI
2024-11-14 21:44:30 +0100 <sm> guy_ maybe worth a post on the haskell discourse / reddit, you might get more input
2024-11-14 21:44:45 +0100 <guy_> the people i want to reach are right here im sure
2024-11-14 21:44:55 +0100 <sm> the idle chatters ? :)
2024-11-14 21:44:56 +0100 <guy_> kind of shy to release a voice note on the open internets
2024-11-14 21:45:32 +0100 <sm> the thing about a 20m voice memo is nobody has time for it probably, unless they know you / your work
2024-11-14 21:45:52 +0100 <sm> a youtube would get more listens
2024-11-14 21:45:54 +0100 <guy_> sm: i was kind of hoping someone could help me cobble it together for a PhD proposal im trying to submit to philip wadler. he insists he wont supervise anything to do with scientific computation, so im trying to make it a pure language consideration
2024-11-14 21:46:39 +0100 <guy_> sm: i used to chat here under the names fog and fen. i was often kicked for giving walls of text on the seti/geti methodology, so i thought a voice note would at least save the users from the normal deluge
2024-11-14 21:47:03 +0100 <guy_> here is the link again if anyones interest is piqued https://vocaroo.com/14nNu3Nm5FaV
2024-11-14 21:48:00 +0100 <sm> well, welcome back and thanks for not deluging :)
2024-11-14 21:48:05 +0100 <guy_> :-)
2024-11-14 21:49:39 +0100emfrom(~emfrom@78.243.183.111)
2024-11-14 21:51:07 +0100 <shapr> I love using nix to re-compile my blog on my beefy laptop and push only the compiled result to my server. Fast and easy updates are pleasant.
2024-11-14 21:53:32 +0100 <guy_> sounds like hotswapping... i was trying to use nix-copy-closure for this
2024-11-14 21:53:49 +0100 <shapr> yeah, similar
2024-11-14 21:54:12 +0100 <shapr> Would be nice if nixos ran on Erlang and I could do real hotswap
2024-11-14 21:55:13 +0100 <c_wraith> you can't really do perfect hot-swap of web sites anyway, unless every deploy has a different URL... and that's really bad for bookmarks.
2024-11-14 21:57:21 +0100 <guy_> i was talking to simon marlow about his work in hotswapping at facebook, i cant find much online but there is this
2024-11-14 21:57:21 +0100 <guy_> https://www.reddit.com/r/haskell/comments/1le4y5/the_haxl_project_at_facebook_slides_from_my_talk/
2024-11-14 21:58:25 +0100 <guy_> it was dealing with the kind of issues where you might eg, have a saved data lib relevant to a previous build, and it somehow did some fancy versioning history to ensure reproducibility, but the details escape me
2024-11-14 21:59:14 +0100 <guy_> like if you have a read and show instance for a save, but you change the datatype...
2024-11-14 22:00:09 +0100 <guy_> something like including the versioning considerations to ensure robust hotswapping... all very complicated, must have been about 7 years ago
2024-11-14 22:01:59 +0100 <guy_> ...
2024-11-14 22:03:26 +0100shapr(~user@4.30.215.226) (Ping timeout: 252 seconds)
2024-11-14 22:03:32 +0100 <guy_> so, its been about 20 mins since i linked the vocaroo voice note (https://vocaroo.com/14nNu3Nm5FaV). has anyone had a chance to listen / is listening
2024-11-14 22:04:04 +0100 <guy_> would be cool to field some questions while i have it fresh in my memory
2024-11-14 22:05:01 +0100 <haskellbridge> <Bowuigi> Definitely interesting but I don't know enough graph theory to understand it lol
2024-11-14 22:05:47 +0100 <guy_> darn. i was hoping it was accessible
2024-11-14 22:07:23 +0100youthlic(~Thunderbi@user/youthlic) (Remote host closed the connection)
2024-11-14 22:07:25 +0100 <haskellbridge> <Bowuigi> I think it's fine though, my knowledge of graph theory is pretty much just the definitions of a graph and a DAG
2024-11-14 22:07:49 +0100youthlic(~Thunderbi@user/youthlic) youthlic
2024-11-14 22:07:49 +0100youthlic(~Thunderbi@user/youthlic) (Remote host closed the connection)
2024-11-14 22:08:14 +0100youthlic(~Thunderbi@user/youthlic) youthlic
2024-11-14 22:08:19 +0100 <guy_> i was hoping that the term "graphically complete language" would become widespread so that haskell could be exemplary as such
2024-11-14 22:08:22 +0100alphazone_(~alphazone@2.219.56.221)
2024-11-14 22:08:53 +0100 <guy_> i didnt realise at the time there was this more complicated idea of the "graphically complete language" being *nonlinear* owing to local scoping considerations
2024-11-14 22:10:29 +0100 <guy_> the turing completeness is basically to do with the "linear"ization of the graph, putting it into a turing machine on a 1d (linearized) turing tape. when you have local scoping, its something like a violation of the 1-1 nature of the traversable laws. i thought it was super interesting!
2024-11-14 22:11:11 +0100 <guy_> (thats a pretty good tldr tbh)
2024-11-14 22:11:12 +0100alphazone(~alphazone@2.219.56.221) (Ping timeout: 246 seconds)
2024-11-14 22:12:42 +0100 <guy_> like "if the variable x corresponds to some Int label on the turing tape, what happens when you locally reassign x within a local scope"
2024-11-14 22:12:43 +0100 <guy_> this kind of overwrite / reuse of a restricted cache of variable names is basically amounting to some "nonlinear" concept
2024-11-14 22:13:13 +0100 <guy_> you end up getting something like seti+geti+rewrites
2024-11-14 22:14:02 +0100 <guy_> overwrites* (rewrite is a protected word to do with dereferencing and the program monad)
2024-11-14 22:14:16 +0100 <guy_> but i dont want to ramble... thats why i made the voice note!
2024-11-14 22:15:43 +0100mange(~user@user/mange) mange
2024-11-14 22:16:08 +0100 <guy_> if anyone has any specific questions i can answer then it would avoid the ire of the moderator!
2024-11-14 22:16:48 +0100peterbecich(~Thunderbi@syn-047-229-123-186.res.spectrum.com) (Ping timeout: 246 seconds)
2024-11-14 22:18:22 +0100 <guy_> also, anyone interested in the AGI stuff can shoot me a DM aswell
2024-11-14 22:20:17 +0100lxsameer(~lxsameer@Serene/lxsameer) (Ping timeout: 252 seconds)
2024-11-14 22:21:02 +0100machinedgod(~machinedg@d108-173-18-100.abhsia.telus.net) (Ping timeout: 252 seconds)
2024-11-14 22:22:21 +0100dolio(~dolio@130.44.140.168) (Quit: ZNC 1.9.1 - https://znc.in)
2024-11-14 22:22:33 +0100 <haskellbridge> <Bowuigi> So linear here is not the type theoretic linearity right?
2024-11-14 22:26:17 +0100manwithluck(manwithluc@gateway/vpn/protonvpn/manwithluck) (Remote host closed the connection)
2024-11-14 22:26:42 +0100manwithluck(manwithluc@gateway/vpn/protonvpn/manwithluck) manwithluck
2024-11-14 22:26:55 +0100machinedgod(~machinedg@d108-173-18-100.abhsia.telus.net) machinedgod
2024-11-14 22:28:10 +0100 <haskellbridge> <Bowuigi> Oh also make a blog and post the usual introductory stuff there, way more compact and easier to follow
2024-11-14 22:28:29 +0100 <haskellbridge> <Bowuigi> I wanted to do that for my research but I got too lazy lol
2024-11-14 22:30:12 +0100dolio(~dolio@130.44.140.168) dolio
2024-11-14 22:30:47 +0100 <guy_> too lazy!?
2024-11-14 22:31:02 +0100 <guy_> are you sure its not the ol' "not the secret societies responsibility" argument!
2024-11-14 22:31:17 +0100 <guy_> i was hoping the advances in transparency and open society would maybe percolate through
2024-11-14 22:31:33 +0100CrunchyFlakes(~CrunchyFl@ip1f13e94e.dynamic.kabel-deutschland.de) (Quit: ZNC 1.8.2 - https://znc.in)
2024-11-14 22:31:53 +0100 <guy_> im always looking for keen people that can offer their services. better than being completely invisibilised imo
2024-11-14 22:32:52 +0100 <guy_> "type theoretic linearity". no thats about some kind of strict purity, right? like, variables used exactly once and then deleted?
2024-11-14 22:32:56 +0100shapr(~user@2601:19c:417e:5434:b5b7:a31:f560:51b7) shapr
2024-11-14 22:33:09 +0100 <guy_> that was something i think we were working on in terms of a "strictly stateful" functional programming language
2024-11-14 22:33:29 +0100 <guy_> like how haskell has "all functions are bivariate functions" where partial application can return a new bivariate function
2024-11-14 22:33:53 +0100 <guy_> but now instead of a bivariate function a -> b, its the stateful function; s -> a -> (s,b)
2024-11-14 22:34:13 +0100 <guy_> since you can always have s~() you can basically make a totally stateful language
2024-11-14 22:34:29 +0100 <guy_> i think the "type theoretic linearity" can be used to great effect here, but i forget how!
2024-11-14 22:35:16 +0100 <guy_> (s,s->a->(s,b)) actually, since you need the state as well
2024-11-14 22:35:34 +0100 <guy_> and then its weird because your functions are replaced by something that has concrete and variable data associated
2024-11-14 22:36:02 +0100 <guy_> i think basically, because the state is updated each time, thats where the linearity comes in
2024-11-14 22:36:10 +0100 <guy_> but yeah, totally different concept of linearity
2024-11-14 22:36:28 +0100 <guy_> its more like basically "because its foldable there is a toList" so its "linearizable"
2024-11-14 22:37:00 +0100 <guy_> seti+geti are abstractions extending around pattern matching on (:) where you get an extra piece of data
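For readers trying to place the `s -> a -> (s, b)` shape above in familiar territory: up to argument order it is the standard state-passing function that the State-monad newtype and Mealy-machine encodings package up. A sketch of that correspondence only, not a claim about the rest of the theory:

    {-# LANGUAGE ExistentialQuantification #-}

    -- The "stateful function" shape, given a name.
    newtype StateFn s a b = StateFn { runStateFn :: s -> a -> (s, b) }

    -- Pairing a current state with such a step function, i.e. the
    -- (s, s -> a -> (s, b)) shape mentioned above, is a Mealy machine.
    data Mealy a b = forall s. Mealy s (s -> a -> (s, b))

    -- Feed one input: produce an output and the machine with its updated state.
    step :: Mealy a b -> a -> (b, Mealy a b)
    step (Mealy s f) a =
      let (s', b) = f s a
       in (b, Mealy s' f)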