2022/06/23

2022-06-23 00:06:40 +0200yauhsien(~yauhsien@61-231-23-53.dynamic-ip.hinet.net) (Remote host closed the connection)
2022-06-23 00:07:19 +0200crazazy(~user@130.89.171.62) (Ping timeout: 256 seconds)
2022-06-23 00:12:26 +0200yauhsien(~yauhsien@61-231-23-53.dynamic-ip.hinet.net)
2022-06-23 00:16:58 +0200yauhsien(~yauhsien@61-231-23-53.dynamic-ip.hinet.net) (Ping timeout: 240 seconds)
2022-06-23 00:23:41 +0200eggplantade(~Eggplanta@2600:1700:bef1:5e10:99c9:a0a4:f69e:b22e)
2022-06-23 00:25:08 +0200ubert(~Thunderbi@p200300ecdf0da56600626fc30d47cd25.dip0.t-ipconnect.de) (Ping timeout: 244 seconds)
2022-06-23 00:33:43 +0200cosimone(~user@93-44-186-171.ip98.fastwebnet.it)
2022-06-23 00:35:48 +0200Midjak(~Midjak@82.66.147.146) (Quit: This computer has gone to sleep)
2022-06-23 00:36:41 +0200Qudit(~user@user/Qudit)
2022-06-23 00:43:04 +0200zer0bitz(~zer0bitz@2001:2003:f748:2000:b968:ef1b:5eee:ca89) (Ping timeout: 248 seconds)
2022-06-23 00:46:22 +0200mikoto-chan(~mikoto-ch@esm-84-240-99-143.netplaza.fi)
2022-06-23 00:47:26 +0200Inoperable(~PLAYER_1@fancydata.science) (Excess Flood)
2022-06-23 00:50:00 +0200yauhsien(~yauhsien@61-231-23-53.dynamic-ip.hinet.net)
2022-06-23 00:55:16 +0200Inoperable(~PLAYER_1@fancydata.science)
2022-06-23 00:56:39 +0200werneta(~werneta@137.78.30.207) (Ping timeout: 268 seconds)
2022-06-23 00:58:12 +0200werneta(~werneta@137.79.203.93)
2022-06-23 00:58:13 +0200mikoto-chan(~mikoto-ch@esm-84-240-99-143.netplaza.fi) (Quit: WeeChat 3.5)
2022-06-23 00:58:44 +0200mon_aaraj(~MonAaraj@user/mon-aaraj/x-4416475)
2022-06-23 01:07:30 +0200Inoperable(~PLAYER_1@fancydata.science) (Excess Flood)
2022-06-23 01:10:48 +0200Inoperable(~PLAYER_1@fancydata.science)
2022-06-23 01:11:44 +0200Inoperable(~PLAYER_1@fancydata.science) (Excess Flood)
2022-06-23 01:15:24 +0200mixfix41(~sdenynine@user/mixfix41) (Quit: out for now)
2022-06-23 01:19:34 +0200Inoperable(~PLAYER_1@fancydata.science)
2022-06-23 01:20:06 +0200bitdex(~bitdex@gateway/tor-sasl/bitdex) (Ping timeout: 268 seconds)
2022-06-23 01:21:21 +0200superz(~superegg@user/superegg)
2022-06-23 01:22:11 +0200sebastiandb(~sebastian@pool-108-31-128-56.washdc.fios.verizon.net)
2022-06-23 01:22:18 +0200bitdex(~bitdex@gateway/tor-sasl/bitdex)
2022-06-23 01:23:29 +0200Lord_of_Life_(~Lord@user/lord-of-life/x-2819915)
2022-06-23 01:23:58 +0200Lord_of_Life(~Lord@user/lord-of-life/x-2819915) (Ping timeout: 240 seconds)
2022-06-23 01:24:24 +0200esrh(~user@sw-10121.atl5.as22384.net)
2022-06-23 01:24:44 +0200Lord_of_Life_Lord_of_Life
2022-06-23 01:26:15 +0200mon_aaraj(~MonAaraj@user/mon-aaraj/x-4416475) (Ping timeout: 268 seconds)
2022-06-23 01:26:15 +0200dlbh^(~dlbh@50.237.44.186) (Ping timeout: 268 seconds)
2022-06-23 01:28:05 +0200mon_aaraj(~MonAaraj@user/mon-aaraj/x-4416475)
2022-06-23 01:28:06 +0200pleo(~pleo@user/pleo) (Quit: quit)
2022-06-23 01:32:17 +0200mixfix41(~sdenynine@user/mixfix41)
2022-06-23 01:34:01 +0200lucifero(~satan@ip-046-223-003-068.um13.pools.vodafone-ip.de) (Ping timeout: 256 seconds)
2022-06-23 01:35:57 +0200lucifero(~satan@ip-037-201-207-048.um10.pools.vodafone-ip.de)
2022-06-23 01:39:29 +0200bucifero(~satan@ip-046-223-003-073.um13.pools.vodafone-ip.de)
2022-06-23 01:39:41 +0200Tuplanolla(~Tuplanoll@91-159-69-97.elisa-laajakaista.fi) (Quit: Leaving.)
2022-06-23 01:42:54 +0200lucifero(~satan@ip-037-201-207-048.um10.pools.vodafone-ip.de) (Ping timeout: 268 seconds)
2022-06-23 01:44:21 +0200pragma-(~chaos@user/pragmatic-chaos) (Bye!)
2022-06-23 01:46:15 +0200justsomeguy(~justsomeg@user/justsomeguy) (Quit: WeeChat 3.5)
2022-06-23 01:46:51 +0200nate4(~nate@98.45.169.16)
2022-06-23 01:46:58 +0200werneta(~werneta@137.79.203.93) (Ping timeout: 240 seconds)
2022-06-23 01:48:55 +0200zeenk(~zeenk@2a02:2f04:a301:3d00:39df:1c4b:8a55:48d3) (Quit: Konversation terminated!)
2022-06-23 01:51:58 +0200jgeerds(~jgeerds@55d45f48.access.ecotel.net) (Ping timeout: 240 seconds)
2022-06-23 01:54:25 +0200nate4(~nate@98.45.169.16) (Ping timeout: 256 seconds)
2022-06-23 01:55:15 +0200machinedgod(~machinedg@66.244.246.252) (Remote host closed the connection)
2022-06-23 01:55:33 +0200alp_(~alp@user/alp) (Ping timeout: 256 seconds)
2022-06-23 01:56:42 +0200 <hololeap> Either x (y,z) -> (Either x y, z)
2022-06-23 01:56:50 +0200machinedgod(~machinedg@66.244.246.252)
2022-06-23 01:57:05 +0200sebastiandb(~sebastian@pool-108-31-128-56.washdc.fios.verizon.net) (Ping timeout: 268 seconds)
2022-06-23 01:57:06 +0200esrh(~user@sw-10121.atl5.as22384.net) (Ping timeout: 264 seconds)
2022-06-23 01:59:42 +0200mvk(~mvk@2607:fea8:5ce3:8500::4588)
2022-06-23 02:00:34 +0200 <hololeap> I was going to ask a question about this, but I think I figured it out
2022-06-23 02:01:45 +0200unit73e(~emanuel@2001:818:e8dd:7c00:32b5:c2ff:fe6b:5291) (Ping timeout: 244 seconds)
2022-06-23 02:03:47 +0200bitdex(~bitdex@gateway/tor-sasl/bitdex) (Remote host closed the connection)
2022-06-23 02:04:02 +0200HackingSpring(~haru@2804:431:c7f5:d4eb:75fd:791c:59a2:7773)
2022-06-23 02:04:47 +0200califax(~califax@user/califx) (Remote host closed the connection)
2022-06-23 02:05:53 +0200califax(~califax@user/califx)
2022-06-23 02:06:03 +0200Inoperable(~PLAYER_1@fancydata.science) (Excess Flood)
2022-06-23 02:06:06 +0200ezzieyguywuf(~Unknown@user/ezzieyguywuf) (Ping timeout: 264 seconds)
2022-06-23 02:06:38 +0200dlbh^(~dlbh@50.237.44.186)
2022-06-23 02:09:00 +0200bitdex(~bitdex@gateway/tor-sasl/bitdex)
2022-06-23 02:09:51 +0200Inoperable(~PLAYER_1@fancydata.science)
2022-06-23 02:10:01 +0200alp_(~alp@user/alp)
2022-06-23 02:12:44 +0200ezzieyguywuf(~Unknown@user/ezzieyguywuf)
2022-06-23 02:13:40 +0200vysn(~vysn@user/vysn)
2022-06-23 02:23:09 +0200werneta(~werneta@70-142-214-115.lightspeed.irvnca.sbcglobal.net)
2022-06-23 02:26:30 +0200gurkenglas(~gurkengla@dslb-002-207-014-022.002.207.pools.vodafone-ip.de) (Ping timeout: 264 seconds)
2022-06-23 02:27:50 +0200sebastiandb(~sebastian@pool-108-31-128-56.washdc.fios.verizon.net)
2022-06-23 02:28:25 +0200[itchyjunk](~itchyjunk@user/itchyjunk/x-7353470) (Ping timeout: 256 seconds)
2022-06-23 02:30:17 +0200ezzieyguywuf(~Unknown@user/ezzieyguywuf) (Ping timeout: 248 seconds)
2022-06-23 02:31:21 +0200Colere(~colere@about/linux/staff/sauvin) (Ping timeout: 248 seconds)
2022-06-23 02:32:30 +0200[itchyjunk](~itchyjunk@user/itchyjunk/x-7353470)
2022-06-23 02:33:55 +0200Colere(~colere@about/linux/staff/sauvin)
2022-06-23 02:36:58 +0200xff0x(~xff0x@b133147.ppp.asahi-net.or.jp) (Ping timeout: 240 seconds)
2022-06-23 02:40:57 +0200alp_(~alp@user/alp) (Ping timeout: 248 seconds)
2022-06-23 02:48:18 +0200mon_aaraj(~MonAaraj@user/mon-aaraj/x-4416475) (Ping timeout: 240 seconds)
2022-06-23 02:50:28 +0200mon_aaraj(~MonAaraj@user/mon-aaraj/x-4416475)
2022-06-23 02:56:04 +0200ezzieyguywuf(~Unknown@user/ezzieyguywuf)
2022-06-23 03:01:52 +0200 <johnw> you wouldn't be able to implement that
2022-06-23 03:02:04 +0200 <johnw> just hoping that's what you figured out :)
2022-06-23 03:02:18 +0200jao(~jao@cpc103048-sgyl39-2-0-cust502.18-2.cable.virginm.net) (Ping timeout: 240 seconds)
2022-06-23 03:02:29 +0200 <dolio> Just ask the djinni.
2022-06-23 03:10:36 +0200arahael(~arahael@118.211.187.178) (Ping timeout: 258 seconds)
2022-06-23 03:10:39 +0200nate4(~nate@98.45.169.16)
2022-06-23 03:24:06 +0200dlbh^(~dlbh@50.237.44.186) (Ping timeout: 264 seconds)
2022-06-23 03:24:27 +0200aeka`(~aeka@2606:6080:1001:d:c59c:6e9a:3115:6f2f)
2022-06-23 03:25:44 +0200aeka(~aeka@user/hiruji) (Ping timeout: 248 seconds)
2022-06-23 03:25:44 +0200aeka`aeka
2022-06-23 03:26:50 +0200hpc(~juzz@ip98-169-32-242.dc.dc.cox.net) (Ping timeout: 240 seconds)
2022-06-23 03:26:59 +0200xff0x(~xff0x@125x103x176x34.ap125.ftth.ucom.ne.jp)
2022-06-23 03:27:03 +0200 <zzz> johnw: i spent 10 minutes playing with sequence before it hit me
2022-06-23 03:28:41 +0200arahael(~arahael@203.63.7.203)
2022-06-23 03:29:02 +0200hpc(~juzz@ip98-169-32-242.dc.dc.cox.net)
2022-06-23 03:30:11 +0200aeka(~aeka@2606:6080:1001:d:c59c:6e9a:3115:6f2f) (Ping timeout: 255 seconds)
2022-06-23 03:31:00 +0200aeka(~aeka@user/hiruji)
2022-06-23 03:33:38 +0200sebastiandb(~sebastian@pool-108-31-128-56.washdc.fios.verizon.net) (Ping timeout: 240 seconds)
2022-06-23 03:33:43 +0200waleee(~waleee@2001:9b0:213:7200:cc36:a556:b1e8:b340) (Ping timeout: 244 seconds)
2022-06-23 03:36:25 +0200Kaipei(~Kaiepi@156.34.47.253) (Read error: Connection reset by peer)
2022-06-23 03:38:36 +0200pavonia(~user@user/siracusa)
2022-06-23 03:39:25 +0200_xor(~xor@74.215.182.83) (Quit: brb)
2022-06-23 03:39:38 +0200zebrag(~chris@user/zebrag) (Ping timeout: 240 seconds)
2022-06-23 03:40:33 +0200Kaiepi(~Kaiepi@156.34.47.253)
2022-06-23 03:40:50 +0200toluene(~toluene@user/toulene) (Ping timeout: 240 seconds)
2022-06-23 03:41:31 +0200machinedgod(~machinedg@66.244.246.252) (Ping timeout: 256 seconds)
2022-06-23 03:43:01 +0200toluene(~toluene@user/toulene)
2022-06-23 03:48:38 +0200nate4(~nate@98.45.169.16) (Ping timeout: 240 seconds)
2022-06-23 03:50:58 +0200yrlnry(~yrlnry@pool-108-2-150-109.phlapa.fios.verizon.net) (Remote host closed the connection)
2022-06-23 03:51:23 +0200HackingSpring(~haru@2804:431:c7f5:d4eb:75fd:791c:59a2:7773) (Remote host closed the connection)
2022-06-23 03:51:35 +0200yrlnry(~yrlnry@pool-108-2-150-109.phlapa.fios.verizon.net)
2022-06-23 03:52:27 +0200nate4(~nate@98.45.169.16)
2022-06-23 03:54:40 +0200kannon(~NK@74-95-14-193-SFBA.hfc.comcastbusiness.net)
2022-06-23 03:55:38 +0200yrlnry(~yrlnry@pool-108-2-150-109.phlapa.fios.verizon.net) (Ping timeout: 240 seconds)
2022-06-23 03:55:57 +0200kannon(~NK@74-95-14-193-SFBA.hfc.comcastbusiness.net) (Read error: Connection reset by peer)
2022-06-23 03:58:54 +0200nate4(~nate@98.45.169.16) (Ping timeout: 264 seconds)
2022-06-23 04:04:33 +0200 <hololeap> johnw, you can, if a) z is a monoid b) you already have a z lying around
2022-06-23 04:05:43 +0200 <hololeap> so, I'm kinda just throwing the INLINE pragma on all of my pointfree functions. is this reasonable?
2022-06-23 04:07:32 +0200 <DigitalKiwi> idk sounds kind of pointless
2022-06-23 04:07:37 +0200 <monochrom> hahaha
2022-06-23 04:07:45 +0200 <monochrom> I think you should benchmark.
2022-06-23 04:07:55 +0200 <DigitalKiwi> ba dum tsch
2022-06-23 04:09:54 +0200 <hololeap> I'm just wondering if this is a reasonable heuristic, not really asking if it will _always_ speed things up, but if it will speed _some_ things up without causing any problems
2022-06-23 04:10:29 +0200 <monochrom> Then I don't know.
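(For context, a minimal sketch of the kind of point-free helper hololeap describes, with the pragma attached; the function and its name are invented here. Whether the annotation actually pays off is workload-dependent, hence the "benchmark" advice.)

    -- A hypothetical point-free helper carrying an INLINE pragma. The pragma
    -- asks GHC to inline it at call sites, which can expose further
    -- simplification; it can also bloat code, so measuring is the only way
    -- to know it helps.
    countMatching :: (a -> Bool) -> [a] -> Int
    countMatching p = length . filter p
    {-# INLINE countMatching #-}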
2022-06-23 04:12:08 +0200 <hololeap> also, is there a wrapper that will transform a semigroup into a monoid, kind of like how MaybeApply will transform an Apply to an Applicative?
2022-06-23 04:12:29 +0200 <hololeap> something in base or semigroupoids that I'm just not spotting
2022-06-23 04:13:00 +0200 <hololeap> oh, I guess Maybe works
2022-06-23 04:13:13 +0200 <hololeap> there it is :)
2022-06-23 04:13:22 +0200 <monochrom> Oh haha it's already in base.
2022-06-23 04:13:33 +0200 <hololeap> thanks, monochrom
2022-06-23 04:13:39 +0200 <EvanR> +1 to that
2022-06-23 04:14:30 +0200 <zzz> lol
2022-06-23 04:14:57 +0200 <zzz> thanks for Nothing
2022-06-23 04:15:09 +0200 <monochrom> hahahaha
2022-06-23 04:15:40 +0200 <monochrom> Secret Santa put it in base!
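(The wrapper hololeap was after is indeed just Maybe from base. A small illustrative snippet, with NonEmpty chosen as an example of a Semigroup that has no Monoid instance of its own:)

    import Data.List.NonEmpty (NonEmpty (..))

    -- base provides: instance Semigroup a => Monoid (Maybe a), with
    -- mempty = Nothing, so wrapping any Semigroup in Maybe yields a Monoid.
    example :: Maybe (NonEmpty Int)
    example = mconcat [Just (1 :| [2]), Nothing, Just (3 :| [])]
    -- example == Just (1 :| [2,3])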
2022-06-23 04:16:16 +0200liz(~liz@host86-159-158-175.range86-159.btcentralplus.com) (Quit: leaving)
2022-06-23 04:20:46 +0200brettgilio(~brettgili@c9yh.net)
2022-06-23 04:21:31 +0200ezzieyguywuf(~Unknown@user/ezzieyguywuf) (Quit: leaving)
2022-06-23 04:26:31 +0200hnOsmium0001(uid453710@user/hnOsmium0001)
2022-06-23 04:29:08 +0200 <hololeap> :t \z = swap . first (fromMaybe z) . traverse (swap . second Just)
2022-06-23 04:29:10 +0200 <lambdabot> error: parse error on input ‘=’
2022-06-23 04:29:18 +0200 <hololeap> :t \z -> swap . first (fromMaybe z) . traverse (swap . second Just)
2022-06-23 04:29:19 +0200 <lambdabot> (Traversable t, Semigroup a) => a -> t (b, a) -> (t b, a)
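(Spelled out as a compilable sketch, this is the function lambdabot just typed, plus its specialisation back to the Either x (y, z) -> (Either x y, z) shape from the start of the exchange; the names pullOut and distribEither are made up here.)

    import Data.Bifunctor (first, second)
    import Data.Maybe (fromMaybe)
    import Data.Tuple (swap)

    -- The generalised version: pull the second components out of any
    -- Traversable, combining them with (<>) via the (,) (Maybe a) Applicative
    -- and falling back to the supplied default when the container is empty
    -- (e.g. a Left).
    pullOut :: (Traversable t, Semigroup a) => a -> t (b, a) -> (t b, a)
    pullOut z = swap . first (fromMaybe z) . traverse (swap . second Just)

    -- Specialised to the original type: possible precisely because a default
    -- z is available, as hololeap noted.
    distribEither :: Semigroup z => z -> Either x (y, z) -> (Either x y, z)
    distribEither = pullOut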
2022-06-23 04:31:52 +0200ezzieyguywuf(~Unknown@user/ezzieyguywuf)
2022-06-23 04:36:06 +0200Kaiepi(~Kaiepi@156.34.47.253) (Ping timeout: 264 seconds)
2022-06-23 04:38:03 +0200 <zzz> ok question:
2022-06-23 04:39:38 +0200 <zzz> traverse is defined in terms of sequence, and sequence is in turn defined in terms of traverse
2022-06-23 04:40:21 +0200nate4(~nate@98.45.169.16)
2022-06-23 04:40:42 +0200 <zzz> more specifically
2022-06-23 04:40:54 +0200_xor(~xor@74.215.182.83)
2022-06-23 04:40:55 +0200 <zzz> traverse f = sequenceA . fmap f
2022-06-23 04:40:58 +0200 <zzz> and
2022-06-23 04:41:13 +0200 <zzz> sequenceA = traverse id
2022-06-23 04:42:04 +0200 <zzz> is this a "lie" or am i missing something?
2022-06-23 04:44:50 +0200 <zzz> ok nvm i was missing something
2022-06-23 04:45:05 +0200terrorjack(~terrorjac@2a01:4f8:1c1e:509a::1)
2022-06-23 04:45:48 +0200leeb(~leeb@KD106155002239.au-net.ne.jp)
2022-06-23 04:45:55 +0200 <zzz> more specifically
2022-06-23 04:46:26 +0200 <zzz> {-# MINIMAL traverse | sequenceA #-}
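(In other words, the two default definitions are mutually recursive, but an instance only has to supply one of them; the other comes from the default method. A small made-up instance defining just traverse:)

    {-# LANGUAGE DeriveFunctor, DeriveFoldable #-}

    -- A toy container: defining traverse alone satisfies
    -- {-# MINIMAL traverse | sequenceA #-}, and sequenceA = traverse id
    -- then falls out of the default.
    data Pair a = Pair a a
      deriving (Show, Functor, Foldable)

    instance Traversable Pair where
      traverse f (Pair x y) = Pair <$> f x <*> f y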
2022-06-23 04:46:51 +0200yauhsien(~yauhsien@61-231-23-53.dynamic-ip.hinet.net) (Remote host closed the connection)
2022-06-23 04:47:23 +0200brettgilio(~brettgili@c9yh.net) (Quit: The Lounge - https://thelounge.chat)
2022-06-23 04:49:33 +0200brettgilio(~brettgili@c9yh.net)
2022-06-23 04:54:05 +0200yrlnry(~yrlnry@pool-108-2-150-109.phlapa.fios.verizon.net)
2022-06-23 04:54:49 +0200mvk(~mvk@2607:fea8:5ce3:8500::4588) (Ping timeout: 248 seconds)
2022-06-23 04:54:58 +0200yrlnry(~yrlnry@pool-108-2-150-109.phlapa.fios.verizon.net) (Remote host closed the connection)
2022-06-23 04:55:36 +0200yrlnry(~yrlnry@pool-108-2-150-109.phlapa.fios.verizon.net)
2022-06-23 04:57:08 +0200jrm(~jrm@user/jrm) (Quit: ciao)
2022-06-23 04:58:28 +0200jrm(~jrm@user/jrm)
2022-06-23 04:58:46 +0200yrlnry(~yrlnry@pool-108-2-150-109.phlapa.fios.verizon.net) (Remote host closed the connection)
2022-06-23 05:01:28 +0200td_(~td@muedsl-82-207-238-203.citykom.de) (Ping timeout: 268 seconds)
2022-06-23 05:02:50 +0200td_(~td@94.134.91.184)
2022-06-23 05:03:06 +0200vysn(~vysn@user/vysn) (Ping timeout: 264 seconds)
2022-06-23 05:03:15 +0200jrm(~jrm@user/jrm) (Client Quit)
2022-06-23 05:04:15 +0200 <dsal> `sequenceA` always lies and `traverse` always tells the truth, so it works.
2022-06-23 05:04:28 +0200jrm(~jrm@user/jrm)
2022-06-23 05:06:35 +0200azimut(~azimut@gateway/tor-sasl/azimut)
2022-06-23 05:07:51 +0200yauhsien(~yauhsien@61-231-23-53.dynamic-ip.hinet.net)
2022-06-23 05:12:25 +0200Unicorn_Princess(~Unicorn_P@93-103-228-248.dynamic.t-2.net) (Quit: Leaving)
2022-06-23 05:18:09 +0200 <zzz> it could work even if you didn't know which is which
2022-06-23 05:22:33 +0200[itchyjunk](~itchyjunk@user/itchyjunk/x-7353470) (Remote host closed the connection)
2022-06-23 05:24:13 +0200eggplant_(~Eggplanta@108-201-191-115.lightspeed.sntcca.sbcglobal.net)
2022-06-23 05:26:08 +0200eggplantade(~Eggplanta@2600:1700:bef1:5e10:99c9:a0a4:f69e:b22e) (Ping timeout: 268 seconds)
2022-06-23 05:28:15 +0200yangby(~secret@115.206.19.11)
2022-06-23 05:28:56 +0200yangby(~secret@115.206.19.11) (Client Quit)
2022-06-23 05:30:24 +0200nate4(~nate@98.45.169.16)
2022-06-23 05:34:45 +0200ezzieyguywuf(~Unknown@user/ezzieyguywuf) (Remote host closed the connection)
2022-06-23 05:35:04 +0200 <Axman6> "SequenceA: I am lying" -> Exception: <<loop>>
2022-06-23 05:36:05 +0200ezzieyguywuf(~Unknown@user/ezzieyguywuf)
2022-06-23 05:39:18 +0200nate4(~nate@98.45.169.16) (Ping timeout: 240 seconds)
2022-06-23 05:40:26 +0200 <monochrom> heh
2022-06-23 05:40:31 +0200zso(~inversed@97e3d74e.skybroadband.com) (Ping timeout: 256 seconds)
2022-06-23 05:41:42 +0200toluene(~toluene@user/toulene) (Quit: Ping timeout (120 seconds))
2022-06-23 05:42:59 +0200inversed(~inversed@97e3d74e.skybroadband.com)
2022-06-23 05:43:12 +0200toluene(~toluene@user/toulene)
2022-06-23 05:47:35 +0200lisbeths(uid135845@id-135845.lymington.irccloud.com)
2022-06-23 05:51:55 +0200 <Axman6> Reminds me of my favourite (and only) logic joke: Three logicians walk into a bar. The bartender asks "Would you all like a drink?". The first one say "I don'
2022-06-23 05:52:12 +0200 <Axman6> "I don't know", the second one says "I don't know", and the third one says "Yes".
2022-06-23 05:52:31 +0200 <Axman6> says*
2022-06-23 05:52:52 +0200causal(~user@50.35.83.177) (Quit: WeeChat 3.5)
2022-06-23 05:56:18 +0200winny(~weechat@user/winny) (Remote host closed the connection)
2022-06-23 05:56:45 +0200winny(~weechat@user/winny)
2022-06-23 05:57:29 +0200ezzieyguywuf(~Unknown@user/ezzieyguywuf) (Ping timeout: 246 seconds)
2022-06-23 06:00:03 +0200ezzieyguywuf(~Unknown@user/ezzieyguywuf)
2022-06-23 06:01:39 +0200jargon(~jargon@184.101.186.108)
2022-06-23 06:05:11 +0200ski(~ski@remote11.chalmers.se)
2022-06-23 06:05:36 +0200ezzieyguywuf(~Unknown@user/ezzieyguywuf) (Ping timeout: 268 seconds)
2022-06-23 06:13:50 +0200Vajb(~Vajb@hag-jnsbng11-58c3a8-176.dhcp.inet.fi) (Read error: Connection reset by peer)
2022-06-23 06:14:03 +0200Vajb(~Vajb@85-76-45-183-nat.elisa-mobile.fi)
2022-06-23 06:21:09 +0200ezzieyguywuf(~Unknown@user/ezzieyguywuf)
2022-06-23 06:29:10 +0200justsomeguy(~justsomeg@user/justsomeguy)
2022-06-23 06:31:05 +0200ezzieyguywuf(~Unknown@user/ezzieyguywuf) (Ping timeout: 246 seconds)
2022-06-23 06:31:18 +0200nate4(~nate@98.45.169.16)
2022-06-23 06:36:24 +0200nate4(~nate@98.45.169.16) (Ping timeout: 272 seconds)
2022-06-23 06:37:22 +0200ezzieyguywuf(~Unknown@user/ezzieyguywuf)
2022-06-23 06:43:16 +0200Kaiepi(~Kaiepi@156.34.47.253)
2022-06-23 06:53:46 +0200arkeet(arkeet@moriya.ca) (Quit: ZNC 1.8.2 - https://znc.in)
2022-06-23 06:57:02 +0200misterfish(~misterfis@ip214-130-173-82.adsl2.static.versatel.nl)
2022-06-23 06:59:02 +0200jargon(~jargon@184.101.186.108) (Remote host closed the connection)
2022-06-23 07:02:30 +0200vglfr(~vglfr@coupling.penchant.volia.net) (Ping timeout: 276 seconds)
2022-06-23 07:07:29 +0200justsomeguy(~justsomeg@user/justsomeguy) (Ping timeout: 246 seconds)
2022-06-23 07:08:11 +0200moet(~moet@mobile-166-171-250-122.mycingular.net)
2022-06-23 07:08:16 +0200yauhsien(~yauhsien@61-231-23-53.dynamic-ip.hinet.net) (Remote host closed the connection)
2022-06-23 07:09:48 +0200 <moet> hi.. i'm running `hoogle server --port=8088` in a virtual machine guest and trying to browse it from the VM host. firefox requests the page over http, but then upgrades all the resources (css, etc) to https and cannot load them (because hoogle isn't serving https)
2022-06-23 07:10:12 +0200 <moet> i can't tell if this is an issue with hoogle or with firefox, so i tried out safari and the same thing is happening.. this makes me think it's an issue with hoogle
2022-06-23 07:11:19 +0200yrlnry(~yrlnry@pool-108-2-150-109.phlapa.fios.verizon.net)
2022-06-23 07:11:40 +0200 <moet> any ideas about what to try next? i'm able to curl (over http) the hoogle html and css and other resources
2022-06-23 07:12:29 +0200triteraflops(~triterafl@user/triteraflops)
2022-06-23 07:12:54 +0200 <triteraflops> Ho! What news?
2022-06-23 07:13:53 +0200yrlnry(~yrlnry@pool-108-2-150-109.phlapa.fios.verizon.net) (Remote host closed the connection)
2022-06-23 07:14:31 +0200yrlnry(~yrlnry@pool-108-2-150-109.phlapa.fios.verizon.net)
2022-06-23 07:14:43 +0200 <triteraflops> I'm trying to wrap my head around large objects.
2022-06-23 07:14:47 +0200 <triteraflops> hm.
2022-06-23 07:15:13 +0200 <triteraflops> Reading this back, maybe I should stop trying to wrap my head around large objects.
2022-06-23 07:15:16 +0200yauhsien(~yauhsien@61-231-23-53.dynamic-ip.hinet.net)
2022-06-23 07:16:07 +0200 <triteraflops> anyway, large objects shouldn't be copied. A mutation can be represented as a pure operation if the large object is not aliased.
2022-06-23 07:16:50 +0200 <triteraflops> And haskell will not automatically mutate large objects, even when it could. Which means the compiler can't detect whether an object is aliased.
2022-06-23 07:16:58 +0200 <triteraflops> But why is that a hard problem?
2022-06-23 07:17:09 +0200 <triteraflops> Why can't GHC detect this?
2022-06-23 07:17:36 +0200yrlnry(~yrlnry@pool-108-2-150-109.phlapa.fios.verizon.net) (Remote host closed the connection)
2022-06-23 07:19:46 +0200 <moet> i'm not sure i understand what you're seeing vs what you want to see triteraflops... can you state it in terms of this example? `let foo = Foo{field1=Just ..., ..., fieldN=...}; bar = foo{field1=Nothing} in ...` assume that Foo contains many large fields
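(For what it's worth, in a sketch like moet's the record update only re-allocates the record cell itself; the large field values are shared between foo and bar rather than copied. Field names below are illustrative.)

    -- A record update copies the constructor (a handful of pointers), not the
    -- fields it leaves alone: 'payload bar' is the very same heap object as
    -- 'payload foo'.
    data Foo = Foo { field1 :: Maybe Int, payload :: [Double] }

    example :: ([Double], [Double])
    example =
      let foo = Foo { field1 = Just 1, payload = replicate 1000000 0 }
          bar = foo { field1 = Nothing }
      in (payload foo, payload bar)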
2022-06-23 07:19:52 +0200takuan(~takuan@178-116-218-225.access.telenet.be)
2022-06-23 07:20:41 +0200 <triteraflops> moet: the cleanest example I can think of is a function I wrote recently which speeds itself up using a hashmap.
2022-06-23 07:20:44 +0200yauhsien(~yauhsien@61-231-23-53.dynamic-ip.hinet.net) (Ping timeout: 272 seconds)
2022-06-23 07:20:51 +0200 <triteraflops> moet: let me see if I can find it.
2022-06-23 07:23:17 +0200vglfr(~vglfr@88.155.20.3)
2022-06-23 07:23:37 +0200jargon(~jargon@184.101.186.108)
2022-06-23 07:23:46 +0200jargon(~jargon@184.101.186.108) (Remote host closed the connection)
2022-06-23 07:24:12 +0200jargon(~jargon@184.101.186.108)
2022-06-23 07:24:44 +0200bilegeek(~bilegeek@2600:1008:b06f:8528:b8b4:9bf9:3a8:ef97) (Quit: Leaving)
2022-06-23 07:25:30 +0200 <triteraflops> moet: aw hell no lol. This example I found is actually needlessly complicated for demonstrating the basic point.
2022-06-23 07:30:52 +0200 <davean> triteraflops: why do you think it is reasonable to tell if something has multiple reference?
2022-06-23 07:31:11 +0200 <triteraflops> davean: not all the time. Just some of the time.
2022-06-23 07:32:13 +0200 <triteraflops> if multiple references can only arise from some kind of fork of the form (f x x), then you should be able to tell.
2022-06-23 07:32:25 +0200 <davean> Thats not the only way it can
2022-06-23 07:32:31 +0200yauhsien(~yauhsien@61-231-23-53.dynamic-ip.hinet.net)
2022-06-23 07:32:32 +0200 <triteraflops> davean: how else?
2022-06-23 07:32:40 +0200 <davean> literally every reference to it creates a reference
2022-06-23 07:33:11 +0200 <davean> They only get cleaned up when computation is forced, or the GC simplifies
2022-06-23 07:33:22 +0200 <davean> the only thing that knows how many things refer to some thing is the GC
2022-06-23 07:33:46 +0200 <davean> every time you say something like "fieldN x" thats a reference that doesn't get resolved until demand
2022-06-23 07:33:53 +0200 <davean> EVERYTHING IS A HELD REFERENCE
2022-06-23 07:33:59 +0200 <davean> EVERYTHING
2022-06-23 07:34:22 +0200 <triteraflops> so maybe it can't be done in runtime
2022-06-23 07:34:31 +0200 <triteraflops> but perhaps it can be done at compile time
2022-06-23 07:34:53 +0200arkeet(arkeet@moriya.ca)
2022-06-23 07:35:20 +0200 <triteraflops> as long as the functions are inspectable by ghc. FFI functions are clearly ineligible
2022-06-23 07:35:37 +0200 <triteraflops> Can't you tell how many times a function uses one of its inputs?
2022-06-23 07:35:40 +0200 <triteraflops> at compile time?
2022-06-23 07:35:57 +0200 <davean> Its not about how many times it uses it
2022-06-23 07:35:58 +0200leeb(~leeb@KD106155002239.au-net.ne.jp) (Ping timeout: 240 seconds)
2022-06-23 07:36:13 +0200 <davean> "f x" that x, how many times is it already referenced?
2022-06-23 07:36:24 +0200 <davean> You can optimize it to produce fewer intermediate copies
2022-06-23 07:36:33 +0200 <davean> but you can't know how many times x is already referenced
2022-06-23 07:37:05 +0200 <triteraflops> Sometimes, you can.
2022-06-23 07:37:37 +0200 <triteraflops> let x' = copy x in g x'
2022-06-23 07:37:45 +0200 <davean> There is no "copy x"
2022-06-23 07:37:49 +0200 <davean> but lets go simpler
2022-06-23 07:37:50 +0200 <triteraflops> but there could be
2022-06-23 07:38:07 +0200 <davean> ok, how many times is the argument referenced if I say "f (Foo ...)"
2022-06-23 07:38:07 +0200leeb(~leeb@KD106154144179.au-net.ne.jp)
2022-06-23 07:38:16 +0200 <triteraflops> if there were, there could be some kind of compiler optimisation for it
2022-06-23 07:38:27 +0200neoatnebula(~neoatnebu@49.206.16.59)
2022-06-23 07:38:42 +0200 <triteraflops> davean: which argument
2022-06-23 07:38:50 +0200 <davean> (Foo ...)
2022-06-23 07:38:56 +0200 <davean> How many times is (Foo ...) referenced?
2022-06-23 07:39:13 +0200 <triteraflops> are we f, or are we calling f?
2022-06-23 07:39:30 +0200mbuf(~Shakthi@122.164.15.160)
2022-06-23 07:39:34 +0200 <davean> Calling f say
2022-06-23 07:39:52 +0200Batzy_(~quassel@user/batzy)
2022-06-23 07:39:59 +0200 <triteraflops> (Foo 45) is brand new, so there's only one of them
2022-06-23 07:40:08 +0200 <davean> there are between 0 and N of them
2022-06-23 07:40:24 +0200 <triteraflops> It's brand new. How could there be?
2022-06-23 07:40:41 +0200mjs22(~mjs22@76.115.19.239) (Quit: Leaving)
2022-06-23 07:41:19 +0200 <davean> well if f is inlined, then Foo is never constructed
2022-06-23 07:41:32 +0200 <davean> if the same code exists elsewhere, it might be lifted and passed as a reference
2022-06-23 07:41:42 +0200 <davean> Litterly between 0 and N copies
2022-06-23 07:41:51 +0200nate4(~nate@98.45.169.16)
2022-06-23 07:41:55 +0200 <davean> 1 is not even the probable case
2022-06-23 07:42:54 +0200triteraflopslooks up litterly in the wiktionary
2022-06-23 07:43:05 +0200Batzy(~quassel@user/batzy) (Ping timeout: 255 seconds)
2022-06-23 07:43:13 +0200 <davean> Literally
2022-06-23 07:43:33 +0200 <davean> I was in no way being anything but exact in that statement and you can't narrow it down more
2022-06-23 07:45:06 +0200 <triteraflops> so... making no demands on the number of references to a potential (Foo 45) allows for optimisations that would be impossible otherwise?
2022-06-23 07:45:14 +0200 <triteraflops> interesting
2022-06-23 07:45:27 +0200 <triteraflops> but would preclude other optimisations that could be done and may be more important
2022-06-23 07:46:13 +0200 <davean> if I say (field1 f)+(field2 f) how many references to f are there?
2022-06-23 07:46:36 +0200 <triteraflops> ah, well, this is a clear example of a form (f x x)
2022-06-23 07:46:58 +0200 <triteraflops> so you couldn't use compile time analysis to allow mutation
2022-06-23 07:47:20 +0200nate4(~nate@98.45.169.16) (Ping timeout: 272 seconds)
2022-06-23 07:48:08 +0200 <triteraflops> on the other hand, if it were h . g . f $ (Foo 45)
2022-06-23 07:48:13 +0200 <davean> well really? then what about "add f = (field1 f)+(field2 f)"
2022-06-23 07:48:41 +0200 <davean> I *really* don't think you get what non-strict means.
2022-06-23 07:48:51 +0200 <davean> I already said it could eliminate intermediate copies
2022-06-23 07:48:51 +0200 <triteraflops> oh great, now it's recursive
2022-06-23 07:48:59 +0200 <davean> theres nothing recursive there
2022-06-23 07:49:14 +0200 <triteraflops> ohhh now I get it
2022-06-23 07:49:29 +0200 <triteraflops> This is a function definition, not one part of a let statement
2022-06-23 07:49:30 +0200 <triteraflops> right
2022-06-23 07:49:54 +0200 <triteraflops> so the add function you define might be OK, or might not, depending on the type of (field1 f)
2022-06-23 07:50:11 +0200 <triteraflops> If it's an Int, say, then maybe you could get away with it.
2022-06-23 07:50:13 +0200 <davean> Say they're both Int
2022-06-23 07:50:38 +0200 <triteraflops> The simplest optimisation algorithm would look at add, see that it is using f twice, and give up
2022-06-23 07:51:15 +0200 <triteraflops> I'm basically looking for the simplest case where haskell could prove mutation were possible
2022-06-23 07:51:54 +0200 <davean> So I'm getting a bit tired, so I'm just going to say again that the compiler can eliminate intermediate copies - thats standard code optimization using local reasoning, but the mutation you claim requires global reasoning.
2022-06-23 07:52:09 +0200 <triteraflops> not always
2022-06-23 07:52:14 +0200 <triteraflops> it can be made local with explicit copy
2022-06-23 07:52:24 +0200 <triteraflops> or with the creation of a constant object
2022-06-23 07:52:25 +0200 <davean> No you can't
2022-06-23 07:52:34 +0200 <davean> haskell is non-strict
2022-06-23 07:52:40 +0200 <davean> you can't even make the copy happen
2022-06-23 07:52:49 +0200 <davean> "copy x" is itself a reference to x
2022-06-23 07:53:06 +0200 <triteraflops> What about seq?
2022-06-23 07:53:11 +0200 <davean> what about seq?
2022-06-23 07:53:18 +0200 <triteraflops> Haskell isn't always non-strict
2022-06-23 07:53:23 +0200 <triteraflops> strictness can be enforced
2022-06-23 07:53:39 +0200 <davean> Here is where I direct you to the report and you learn what seq is
2022-06-23 07:53:54 +0200 <davean> but also, all you're doing here is creating MORE copying not less
2022-06-23 07:54:02 +0200 <triteraflops> davean: You make interesting assumptions of my knowledge.
2022-06-23 07:54:05 +0200 <davean> Because this is *strictly more copies than eliminating intermediate copies*
2022-06-23 07:54:22 +0200 <davean> triteraflops: They're not assumptions, they're responses to what you're saying
2022-06-23 07:54:25 +0200Midjak(~Midjak@82.66.147.146)
2022-06-23 07:54:30 +0200gurkenglas(~gurkengla@dslb-002-207-014-022.002.207.pools.vodafone-ip.de)
2022-06-23 07:55:20 +0200 <triteraflops> If a function needs to change a large object a million times before returning it, it would be nice not having to copy it every time.
2022-06-23 07:55:36 +0200 <triteraflops> So you do one copy, maybe, depending, and use that copy to prove mutation is safe.
2022-06-23 07:55:40 +0200 <davean> So hold up
2022-06-23 07:55:52 +0200 <davean> you just said a different statement, and I keep saying it can in fact eliminate intermediate copies
2022-06-23 07:56:08 +0200 <davean> it never needs to produce anything but the final returned thing
2022-06-23 07:56:23 +0200 <davean> in fact, it almost never will - also for the same reasons as above
2022-06-23 07:57:09 +0200 <triteraflops> so if I insert 100 million ints into a hashmap using a single function, haskell will not make a copy every time?
2022-06-23 07:57:57 +0200 <davean> it depends on the exact code, but often no, Haskell is non-strict
2022-06-23 07:58:28 +0200 <triteraflops> copy elimination does not necessarily follow from laziness.
2022-06-23 07:58:46 +0200 <davean> Correct, I said it depends on the exact code
2022-06-23 07:58:50 +0200 <davean> also I didn't say lazy
2022-06-23 07:59:06 +0200 <triteraflops> You mean to say lazy and strict aren't opposites?
2022-06-23 07:59:10 +0200 <davean> Correct
2022-06-23 07:59:14 +0200 <triteraflops> oh bloody hell
2022-06-23 07:59:32 +0200 <triteraflops> This might help explain some of the confusing.
2022-06-23 07:59:35 +0200 <triteraflops> *ion
2022-06-23 07:59:55 +0200 <davean> lazy and strict are on opposite sides, there are a few other things over with lazy, and there is an entire ocean in between
2022-06-23 08:00:14 +0200coot(~coot@213.134.190.95)
2022-06-23 08:00:17 +0200nate4(~nate@98.45.169.16)
2022-06-23 08:01:05 +0200michalz(~michalz@185.246.204.97)
2022-06-23 08:01:06 +0200 <davean> This is why I referenced the Haskell Report when you brought up seq
2022-06-23 08:01:46 +0200 <triteraflops> "Lazy and strict aren't opposites. They're *opposites*."——davean
2022-06-23 08:03:18 +0200 <davean> I mean Europe and Egypt are on opposite sides of the med. but Egypt isn't the opposite side, it's some of what isn't Europe.
2022-06-23 08:03:33 +0200 <davean> and there is the med in between
2022-06-23 08:03:48 +0200 <davean> Haskell is non-strict, it isn't lazy
2022-06-23 08:04:01 +0200Sgeo(~Sgeo@user/sgeo) (Read error: Connection reset by peer)
2022-06-23 08:04:24 +0200 <triteraflops> This single sentence contradicts every description of haskell I've ever read.
2022-06-23 08:04:52 +0200 <davean> The report is clear about haskell being non-strict
2022-06-23 08:05:15 +0200 <davean> things CAN be eagerly evaluated if it doesn't violate some information rules
2022-06-23 08:05:20 +0200 <davean> and in fact is
2022-06-23 08:05:30 +0200 <davean> in GHC and other implementations
2022-06-23 08:05:41 +0200 <pavonia> What's the difference between non-strict and lazy?
2022-06-23 08:06:25 +0200 <davean> pavonia: basically with non-strict you don't see bottoms from undemanded computations
2022-06-23 08:06:34 +0200 <davean> pavonia: but with laziness non-demanded computations are not evaluated
2022-06-23 08:06:56 +0200 <davean> so Haskell can be strict up to the point that it would prove it wasn't lazy
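(A tiny concrete illustration of the distinction being drawn: the semantics only require that undemanded expressions never blow up; call-by-need laziness is one evaluation strategy that guarantees this, but an implementation is free to evaluate eagerly or speculatively wherever that cannot change the observable result.)

    -- Non-strict semantics: the second component is never demanded, so this
    -- must print 1 rather than crash. Whether the implementation gets there
    -- by building a thunk (lazy, call-by-need) or by some eager/fused
    -- evaluation that provably avoids the bottom is an operational choice.
    main :: IO ()
    main = print (fst (1 :: Int, undefined))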
2022-06-23 08:07:07 +0200 <triteraflops> So, haskell *can* use static analysis to avoid copying large objects?
2022-06-23 08:07:19 +0200 <davean> triteraflops: *head desk*
2022-06-23 08:07:47 +0200 <davean> triteraflops: I can use static analysis to avoid making the copies in the first place, locally
2022-06-23 08:08:07 +0200 <davean> there are some global things one could do, but they're VERY limited
2022-06-23 08:08:12 +0200 <pavonia> I don't understand that explanation, tbh
2022-06-23 08:08:19 +0200 <davean> pavonia: ok, uh
2022-06-23 08:08:25 +0200 <triteraflops> Your explanations are pretty cryptic really.
2022-06-23 08:08:43 +0200 <davean> so like a Haskell implementation can speculatively evaluate something, or fuse computation
2022-06-23 08:09:36 +0200 <triteraflops> It just seems like you're getting really upset at my choice of words, and repeating what I said, only using different words, rather than just saying yes or no
2022-06-23 08:09:54 +0200 <davean> triteraflops: But the details are all that matter here
2022-06-23 08:10:09 +0200 <davean> This is a very exacting thing
2022-06-23 08:11:25 +0200 <triteraflops> It is pretty clear that the answer to "So, haskell *can* use static analysis to avoid copying large objects?" is yes, right now.
2022-06-23 08:11:43 +0200 <triteraflops> or at least a "sometimes"
2022-06-23 08:11:46 +0200 <triteraflops> which is also a yes
2022-06-23 08:11:57 +0200 <davean> triteraflops: It can use static analysis to avoid making copies, it can't use it to allow mutation really.
2022-06-23 08:12:20 +0200 <davean> Not in any practical sense at least for the latter
2022-06-23 08:12:22 +0200 <triteraflops> maybe it depends on your definition of mutation
2022-06-23 08:12:39 +0200 <triteraflops> My idea of mutation is allowing x' = f x to reuse x's memory
2022-06-23 08:12:40 +0200 <davean> one is a code optimization, one is a data dependency thing.
2022-06-23 08:12:57 +0200 <triteraflops> at least sometimes
2022-06-23 08:13:12 +0200 <davean> yah, and that it can't do
2022-06-23 08:13:19 +0200 <davean> Not sanely almost ever, at least
2022-06-23 08:13:36 +0200 <davean> Some VERY local cases, it could, but by not forming them like that in the first place really
2022-06-23 08:13:49 +0200 <triteraflops> and I'm saying that static analysis can be easily conceived which would allow the memory reuse.
2022-06-23 08:14:24 +0200 <davean> and I'm saying you don't get how fast and easily references leak depending on strictness - this depends on EXACTLY how the entire thing compiles
2022-06-23 08:14:28 +0200 <triteraflops> You just need a new object to start or a data dependency-breaking operation. Some kind of copy.
2022-06-23 08:14:49 +0200 <davean> once you're copying it, thats the one copy you might make anyway
2022-06-23 08:14:58 +0200 <davean> because all we need to get out is the end result
2022-06-23 08:15:33 +0200 <Axman6> Haskell gets a hell of a lot easier to understand when you realise it's basically all just case statements - and because of that, lots of optimisation can be built - like if you have case Foo a b c of Bar ... -> ...; Foo x y z -> res - you can eliminate that Foo ever being constructed and just pass in a b c for x y z in res - BAM, no large object recreation
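(A compilable toy version of that case example; the data type and function names are invented here.)

    data Big = Foo Int Int Int | Bar

    sumBig :: Big -> Int
    sumBig big = case big of
      Bar       -> 0
      Foo x y z -> x + y + z

    -- At a call site such as 'sumBig (Foo a b c)', once sumBig is inlined
    -- GHC's case-of-known-constructor transformation drops the Bar branch and
    -- rewrites the whole expression to 'a + b + c', so the Foo value need
    -- never be allocated at all.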
2022-06-23 08:15:46 +0200 <Axman6> triteraflops: coming back to your first question, in your mind, what is a large object?
2022-06-23 08:16:05 +0200 <triteraflops> Axman6: a 4 gigabyte 3D array of float32s
2022-06-23 08:16:25 +0200 <Axman6> Sounds like the perfect thing for the ST monad
2022-06-23 08:16:39 +0200 <davean> right but thats the programmer reusing the memory
2022-06-23 08:17:08 +0200ubert(~Thunderbi@p200300ecdf0da521adf2b2fea6746db1.dip0.t-ipconnect.de)
2022-06-23 08:17:21 +0200 <triteraflops> actually davean, it looks like the ST monad is filling the role of the copy operation that you said didn't exist lol
2022-06-23 08:17:33 +0200 <davean> triteraflops: its ... not
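(A minimal sketch of the ST route Axman6 is pointing at, using Data.Vector.Unboxed.Mutable from the vector package; the function name and fill rule are illustrative. The buffer is allocated once, written in place, and frozen into an ordinary pure value at the end.)

    import Control.Monad (forM_)
    import Control.Monad.ST (runST)
    import qualified Data.Vector.Unboxed as U
    import qualified Data.Vector.Unboxed.Mutable as MU

    -- Fill a large unboxed Float array by in-place writes inside ST, then
    -- freeze it. Exactly one buffer is ever allocated; purity is preserved
    -- because the mutable vector never escapes runST.
    buildArray :: Int -> U.Vector Float
    buildArray n = runST $ do
      mv <- MU.new n
      forM_ [0 .. n - 1] $ \i -> MU.write mv i (fromIntegral i)
      U.unsafeFreeze mv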
2022-06-23 08:18:38 +0200 <triteraflops> Axman6: or a 4GB hashmap being generated incrementally by a single function, then returned
2022-06-23 08:19:11 +0200 <triteraflops> or a hashmap of the same size, requiring updates
2022-06-23 08:19:37 +0200 <triteraflops> this last case, I don't think even the static analysis could allow, but I'd have to think about it.
2022-06-23 08:22:04 +0200 <Axman6> a hashmap isn't a single large object though
2022-06-23 08:22:17 +0200 <davean> It depends on the implementation
2022-06-23 08:22:22 +0200 <davean> which was a point I made above
2022-06-23 08:22:28 +0200 <triteraflops> Suppose it were made a single large object
2022-06-23 08:22:29 +0200 <Axman6> the Haskell unordered-containers implementation anyway
2022-06-23 08:22:38 +0200 <davean> Axman6: well that's one VERY specific implementation
2022-06-23 08:22:52 +0200 <davean> it's not cuckoo hashing, for example
2022-06-23 08:23:38 +0200 <davean> Data.Array cuckoo hashing would be close to a single object
2022-06-23 08:24:34 +0200 <davean> Array !i !i !Int (Array# e)
2022-06-23 08:24:42 +0200 <davean> Not REALLY a single object there, but we're not far off
2022-06-23 08:25:11 +0200 <Axman6> right, but in that case, if you were strictly after the performance and semantics (i.e. not persistent) of a traditional hashmap, then you would need to use true mutation to avoid copying - and luckily, we can do that
2022-06-23 08:25:35 +0200 <davean> Axman6: He was asking about static analysis on Haskell for optimizations though
2022-06-23 08:25:45 +0200 <davean> Not how to implement something well :)
2022-06-23 08:25:55 +0200 <davean> An entirely different discussion
2022-06-23 08:27:06 +0200lortabac(~lortabac@2a01:e0a:541:b8f0:fd56:2c1b:68f1:73e4)
2022-06-23 08:27:40 +0200 <triteraflops> Well, a hashmap implementation may be forced to copy the entire hashmap in order to add one item, right?
2022-06-23 08:28:00 +0200 <Axman6> sure, I get that, but I'm saying the way we avoid copying large objects is by not copying large objects
2022-06-23 08:28:00 +0200 <triteraflops> Depends on implementation, but it *may* be forced. Suppose this is the kind of implementation we have.
2022-06-23 08:28:01 +0200 <davean> depending on the implementation and code ...
2022-06-23 08:28:10 +0200MajorBiscuit(~MajorBisc@c-001-001-031.client.tudelft.eduvpn.nl)
2022-06-23 08:28:34 +0200 <davean> triteraflops: the way to provide the semantics to get what you want is probably fusion BTW
2022-06-23 08:28:40 +0200 <Axman6> Data.HashMap will copy at most O(log n) objects
2022-06-23 08:28:42 +0200 <davean> Its not Haskell
2022-06-23 08:28:56 +0200 <davean> Axman6: He's specifically not talking about that though
2022-06-23 08:29:02 +0200 <Axman6> Haskell isn't some magic language that automatically solves poor coding
2022-06-23 08:29:51 +0200z0k(~z0k@206.84.141.12)
2022-06-23 08:30:03 +0200 <triteraflops> how about something like hm & insert ka va & insert kb vb & insert kc vc
2022-06-23 08:30:16 +0200 <triteraflops> That's one expression
2022-06-23 08:30:30 +0200 <davean> triteraflops: yes, and it can produce a single new hashmap just fine
2022-06-23 08:30:42 +0200 <davean> theres no reason it has to produce intermediate ones, as I've mentioned before
2022-06-23 08:31:44 +0200 <davean> that can collapse the 3 inserts into one function
2022-06-23 08:31:45 +0200nate4(~nate@98.45.169.16) (Ping timeout: 268 seconds)
2022-06-23 08:31:49 +0200 <triteraflops> You may have tried explaining it, but think about what it means when you're not understood. Is it really my fault? Usually there is blame on both sides of a communications failure.
2022-06-23 08:32:59 +0200 <davean> Haskell is pure, so "insert ka va & insert kb vb & insert kc vc" can become a single piece of code that produces no intermediate forms
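(As a concrete point of comparison with Axman6's O(log n) remark above: with unordered-containers, each persistent insert rebuilds only the spine down to the affected slot and shares everything else, so a chain like the one triteraflops wrote never duplicates the whole map. Keys and values below are placeholders.)

    import qualified Data.HashMap.Strict as HM

    -- Three persistent inserts: each copies O(log n) interior nodes and shares
    -- the rest of the HAMT with its input.
    addThree :: HM.HashMap String Int -> HM.HashMap String Int
    addThree hm = HM.insert "kc" 3 (HM.insert "kb" 2 (HM.insert "ka" 1 hm))

    -- For a batch of updates, building them in bulk avoids even the
    -- intermediate spines (union is left-biased, so the new values win):
    addThree' :: HM.HashMap String Int -> HM.HashMap String Int
    addThree' = HM.union (HM.fromList [("ka", 1), ("kb", 2), ("kc", 3)])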
2022-06-23 08:33:04 +0200 <triteraflops> Like, you seem upset right now. My internal audio simulation of your voice has you shouting loudly right now.
2022-06-23 08:33:15 +0200 <davean> I'm not upset, I'm more amused.
2022-06-23 08:33:54 +0200 <triteraflops> That is not what is being communicated right now. Think about what that means.
2022-06-23 08:34:18 +0200yauhsien(~yauhsien@61-231-23-53.dynamic-ip.hinet.net) (Remote host closed the connection)
2022-06-23 08:34:36 +0200bitdex(~bitdex@gateway/tor-sasl/bitdex) (Remote host closed the connection)
2022-06-23 08:35:11 +0200 <davean> so remember when I mentioned optimizing code vs. data flow?
2022-06-23 08:35:21 +0200 <davean> we're right back to that.
2022-06-23 08:35:38 +0200bitdex(~bitdex@gateway/tor-sasl/bitdex)
2022-06-23 08:35:41 +0200 <triteraflops> So, that 3x insert example I wrote up there. That can be done with a single copy.
2022-06-23 08:35:57 +0200echoreply(~echoreply@2001:19f0:9002:1f3b:5400:ff:fe6f:8b8d) (Quit: WeeChat 2.8)
2022-06-23 08:35:57 +0200 <davean> And I agreed it could be, but not by reusing the memory
2022-06-23 08:36:24 +0200 <triteraflops> The memory is being reused. That's what a single copy means.
2022-06-23 08:36:32 +0200 <davean> Is it? :)
2022-06-23 08:36:33 +0200 <triteraflops> When it's one copy and not three, memory is being reused.
2022-06-23 08:36:41 +0200 <triteraflops> Just not the original memory.
2022-06-23 08:36:46 +0200 <davean> Ah no, that's one particular way to implement it
2022-06-23 08:36:55 +0200 <davean> you keep running back into that assumption
2022-06-23 08:37:07 +0200 <davean> you keep thinking about the data, and I keep pointing you at the code
2022-06-23 08:37:10 +0200 <davean> so think about the code for a bit
2022-06-23 08:37:13 +0200echoreply(~echoreply@45.32.163.16)
2022-06-23 08:38:15 +0200 <davean> we don't need to know anything about hm or anything to avoid copies, we can make code that only produces the result
2022-06-23 08:38:36 +0200jakalx(~jakalx@base.jakalx.net) (Error from remote client)
2022-06-23 08:41:36 +0200 <davean> its write once
2022-06-23 08:41:48 +0200 <triteraflops> Let's try a different example. What is a way of storing a 4GB array of floats in haskell? I know there are several, but basically, just pick one that is actually going to store the floats, and not an array of thunks or something.
2022-06-23 08:42:21 +0200 <triteraflops> I would call such an array a strict array
2022-06-23 08:42:21 +0200 <davean> You'll need an unboxed vector or something
2022-06-23 08:42:31 +0200 <triteraflops> so this vector, then.
2022-06-23 08:42:58 +0200 <davean> You're going to run into the same code thing as above, pure code can collapse into a fused function
2022-06-23 08:43:48 +0200 <davean> but happy to get there naturally
2022-06-23 08:44:03 +0200 <triteraflops> This is 4 gigs of data. It will exist. It must in order for the program to make any sense.
2022-06-23 08:44:39 +0200 <davean> yep
2022-06-23 08:44:43 +0200 <triteraflops> so you could have v & mod na xa & mod nb xb & mod nc xc
2022-06-23 08:44:52 +0200 <davean> what is mod?
2022-06-23 08:44:53 +0200yauhsien(~yauhsien@61-231-23-53.dynamic-ip.hinet.net)
2022-06-23 08:45:01 +0200 <triteraflops> oh yeah mod is already a function
2022-06-23 08:45:05 +0200jakalx(~jakalx@base.jakalx.net)
2022-06-23 08:45:22 +0200 <davean> well usually mod is the modulus operator which doesn't make sense here
2022-06-23 08:45:23 +0200 <triteraflops> ok, import qualified Data.Vector as V
2022-06-23 08:45:36 +0200 <triteraflops> so you could have v & V.mod na xa & V.mod nb xb & V.mod nc xc
2022-06-23 08:45:48 +0200 <triteraflops> short for modify
2022-06-23 08:46:12 +0200 <triteraflops> maybe set would be better
2022-06-23 08:46:35 +0200 <triteraflops> v & set na xa & set nb xb & set nc xc
2022-06-23 08:46:50 +0200 <davean> Right so you know what "set na xa & set nb xb & set nc xc" as a function is?
2022-06-23 08:47:27 +0200 <triteraflops> If I were implementing set, I would be forced to copy the whole array. I can't think of any way around that, besides like a
2022-06-23 08:47:29 +0200 <davean> "copy the input until na, copy in xa, resume copying to nb, copy in xb, resume copying to nc, copy in xc, resume copying to end"
2022-06-23 08:47:41 +0200 <davean> Thats what that function is
2022-06-23 08:48:14 +0200 <davean> See how theres no intermediate forms of v?
2022-06-23 08:48:19 +0200 <davean> You just got directly to the result
2022-06-23 08:49:00 +0200 <triteraflops> If set's output is a vector, how am I not forced to copy the input?
2022-06-23 08:49:05 +0200 <triteraflops> I clearly am
2022-06-23 08:49:18 +0200 <davean> You're forced to copy the parts of the input that aren't changed
2022-06-23 08:49:41 +0200yauhsien(~yauhsien@61-231-23-53.dynamic-ip.hinet.net) (Ping timeout: 246 seconds)
2022-06-23 08:50:59 +0200 <triteraflops> I can make a Plan type, which could store a list of operations, then get executed.
2022-06-23 08:51:21 +0200 <triteraflops> Then it would look like
2022-06-23 08:51:48 +0200 <triteraflops> v & plan & set na xa & set nb xb & set nc xc & ex
2022-06-23 08:52:00 +0200 <triteraflops> Then only ex would copy
2022-06-23 08:52:02 +0200 <triteraflops> That's better
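(For reference, the "plan, then execute" idea sketched here already exists in the vector library's API: (//) applies a batch of index/value updates in one pass over a single new vector, and modify runs an ST mutation on a private copy. Indices and values below are placeholders, and the input is assumed to have at least three elements.)

    import qualified Data.Vector.Unboxed as U
    import qualified Data.Vector.Unboxed.Mutable as MU

    -- Batched update: one traversal, one freshly built vector.
    setThree :: U.Vector Float -> U.Vector Float
    setThree v = v U.// [(0, 1.0), (1, 2.0), (2, 3.0)]

    -- The same thing via 'modify': the ST action mutates a copy of v in
    -- place, and the result is still pure because that copy is never shared.
    setThree' :: U.Vector Float -> U.Vector Float
    setThree' = U.modify (\mv -> do
      MU.write mv 0 1.0
      MU.write mv 1 2.0
      MU.write mv 2 3.0)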
2022-06-23 08:52:25 +0200 <davean> You don't need Plan, because Haskell is pure. You only need Plan if you want to implement it yourself.
2022-06-23 08:52:46 +0200 <davean> (Not saying the compiler WILL give you what you want, but morally it can)
2022-06-23 08:52:53 +0200 <triteraflops> morally...
2022-06-23 08:53:02 +0200 <davean> (The code approach I generated is allowed - and in fact you will sometimes get it)
2022-06-23 08:53:19 +0200 <davean> because it will optimize the functions into one function
2022-06-23 08:53:25 +0200 <triteraflops> You didn't write any code
2022-06-23 08:53:48 +0200 <davean> "copy the input until na, copy in xa, resume copying to nb, copy in xb, resume copying to nc, copy in xc, resume copying to end" <-- I'm referring to memory operations here
2022-06-23 08:53:58 +0200 <davean> thats machine pseudo code, not haskell code
2022-06-23 08:54:03 +0200 <davean> the compiler doesn't produce Haskell
2022-06-23 08:54:28 +0200 <davean> a sequence of sets is a fatehr
2022-06-23 08:54:32 +0200 <davean> a sequence of sets is a gather operation
2022-06-23 08:54:33 +0200 <triteraflops> I know the compiler doesn't produce Haskell.
2022-06-23 08:54:58 +0200 <triteraflops> It's not a gather unless it is implemented as such
2022-06-23 08:55:10 +0200 <davean> No, it is a gather, it might just not execute as one
2022-06-23 08:55:13 +0200 <davean> haskell is pure
2022-06-23 08:55:30 +0200 <davean> consider why I keep mentioning that
2022-06-23 08:55:39 +0200 <lortabac> davean: I am trying to follow what you say, but I'm lost
2022-06-23 08:55:52 +0200 <lortabac> are you trying to explain rewrite rules?
2022-06-23 08:55:56 +0200 <davean> lortabac: Oh that's kind of expected, I'm specifically trying to poke at triteraflops' misunderstandings
2022-06-23 08:56:03 +0200 <davean> lortabac: no, I did mention them above though
2022-06-23 08:56:08 +0200 <triteraflops> Ah, Miss Understanding
2022-06-23 08:56:19 +0200 <triteraflops> Sorry, couldn't resist that one.
2022-06-23 08:56:24 +0200 <davean> lortabac: I'm actually trying to explain closer to inlining here
2022-06-23 08:57:02 +0200 <triteraflops> A vector type like this pretty much needs to be internal, and may not be safe.
2022-06-23 08:57:09 +0200 <davean> (I say that since inlining is most of how GHC actually gets this in practice)
2022-06-23 08:57:17 +0200 <triteraflops> It is only made pure by things like mandatory opies
2022-06-23 08:57:19 +0200 <triteraflops> copies
2022-06-23 08:57:28 +0200 <lortabac> how does inlining avoid building intermediate structures?
2022-06-23 08:57:30 +0200 <davean> triteraflops: Thats actually not true! In practice!
2022-06-23 08:58:05 +0200 <triteraflops> There are internal operations that are not pure. I know there are.
2022-06-23 08:58:12 +0200 <triteraflops> It's how the pure stuff is implemented.
2022-06-23 08:58:21 +0200 <triteraflops> some of it, anyway
2022-06-23 08:58:53 +0200 <davean> triteraflops: Some of it, but nothing I mentioned here about that vector stuff goes outside the pure stuff, strictly standard Haskell stuff
2022-06-23 08:59:59 +0200 <davean> lortabac: So the usual analogy I'd use is you just set up the definition of the final thing and you pull it like a thread and leave everything that doesn't end up in it behind, the vector case is a bit more complicated
2022-06-23 09:00:10 +0200 <davean> lortabac: but I think you'd want the setup to get there ...
2022-06-23 09:00:25 +0200 <davean> so lortabac how much do you know coming into this?
2022-06-23 09:01:02 +0200 <davean> and have you ever stepped through inlining manually?
2022-06-23 09:01:30 +0200 <davean> lortabac: do you not know how inlining can avoid intermediate forms in general, or the specific example I gave?
2022-06-23 09:01:39 +0200 <triteraflops> lortabac: so I wrote an example of a series of vector operations that should be possible with a single copy
2022-06-23 09:01:53 +0200 <triteraflops> if v is a 4 GB vector of float32s, say
2022-06-23 09:01:57 +0200 <triteraflops> no thunks
2022-06-23 09:02:26 +0200 <triteraflops> and set n x v modifies v, setting element n to x
2022-06-23 09:02:36 +0200 <triteraflops> you could have an expression of the form
2022-06-23 09:02:50 +0200 <triteraflops> v set na xa & set nb xb & set nc xc
2022-06-23 09:02:53 +0200 <triteraflops> oops
2022-06-23 09:02:56 +0200 <triteraflops> v & set na xa & set nb xb & set nc xc
2022-06-23 09:03:13 +0200 <triteraflops> This should be possible with a single copy
2022-06-23 09:03:51 +0200 <triteraflops> So, davean, you're saying GHC natively supports vector types that can consolidate multiple set operations like this?
2022-06-23 09:04:00 +0200 <lortabac> AFAIK it's not possible with a single copy
2022-06-23 09:04:18 +0200 <davean> triteraflops: I'm saying the ability to do that is inherent to purity
2022-06-23 09:04:31 +0200 <lortabac> not without rewrite rules at least
2022-06-23 09:04:31 +0200 <triteraflops> davean: no it isn't
2022-06-23 09:04:42 +0200 <triteraflops> a pure implementation could copy every time
2022-06-23 09:04:48 +0200 <triteraflops> and still be pure
2022-06-23 09:04:51 +0200 <davean> triteraflops: it *could* it doesn't have to
2022-06-23 09:05:13 +0200vysn(~vysn@user/vysn)
2022-06-23 09:05:14 +0200 <davean> because "set na xa & set nb xb & set nc xc" is equivalent to the fused function
2022-06-23 09:05:25 +0200 <triteraflops> davean: well, you just said that all pure implementations must do this with a single copy
2022-06-23 09:05:26 +0200zeenk(~zeenk@2a02:2f04:a301:3d00:39df:1c4b:8a55:48d3)
2022-06-23 09:05:27 +0200 <davean> so the compiler is strictly allowed to replace that with the fused version
2022-06-23 09:05:28 +0200 <lortabac> davean: can you show exactly how purity would give you this optimization?
2022-06-23 09:05:40 +0200 <davean> triteraflops: I did *not* say they must, I said it was inherent to purity
2022-06-23 09:05:51 +0200 <triteraflops> That's what inherent to purity means.
2022-06-23 09:06:31 +0200 <davean> lortabac: purity *allows* this particular one, you'd get it from the optimizer in this particular case
2022-06-23 09:06:37 +0200yauhsien(~yauhsien@61-231-23-53.dynamic-ip.hinet.net)
2022-06-23 09:06:55 +0200 <lortabac> davean: ok, but how would you get it without rewrite rules?
2022-06-23 09:07:36 +0200 <davean> lortabac: well a SSA style analysis gets it for you
2022-06-23 09:07:39 +0200 <lortabac> I mean, what is the exact transformation that the optimizer will (or can) perform
2022-06-23 09:07:45 +0200 <davean> SSA will do this
2022-06-23 09:07:57 +0200 <davean> Thats an optimization pass that will exactly generate that
2022-06-23 09:08:19 +0200 <lortabac> what is SSA?
2022-06-23 09:08:25 +0200 <davean> Static single assignment
2022-06-23 09:08:37 +0200 <davean> basically what we're doing here is keeping the last writes
2022-06-23 09:08:55 +0200 <davean> but in the abstract
2022-06-23 09:09:44 +0200 <davean> Theres a few other ways to get that vector example above
2022-06-23 09:09:51 +0200 <davean> its actually a pretty easy one to get ...
2022-06-23 09:10:40 +0200 <lortabac> thanks
2022-06-23 09:10:51 +0200 <lortabac> does GHC actually do it?
2022-06-23 09:10:56 +0200 <davean> Almost any symbolic approach to computation will get that
2022-06-23 09:11:29 +0200 <triteraflops> idk why you say that. This clearly requires special consideration at the level of GHC's vector implementation.
2022-06-23 09:11:40 +0200 <davean> That particular one? No, partially because Vector is specifically optimized, I don't THINK GHC would get it if Vector wasn't in the way? ... hum, I know how to write fast code with GHC but I don't know all the passes in depth ...
2022-06-23 09:11:47 +0200 <davean> triteraflops: No, it does NOT
2022-06-23 09:12:02 +0200alp_(~alp@user/alp)
2022-06-23 09:12:14 +0200 <davean> It requires an abstract interpretation step for optimization
2022-06-23 09:12:18 +0200 <davean> which can apply to any code
2022-06-23 09:13:08 +0200 <triteraflops> GHC needs to implement this vector at some point
2022-06-23 09:13:33 +0200 <triteraflops> This implementation will support certain operations.
2022-06-23 09:13:46 +0200 <triteraflops> These operations may or may not be pure
2022-06-23 09:13:52 +0200taeaad(~taeaad@user/taeaad) (Quit: ZNC 1.7.5+deb4 - https://znc.in)
2022-06-23 09:13:54 +0200 <davean> Yes but those operations can be emergent at the assembly level
2022-06-23 09:14:06 +0200 <davean> and the code is pure
2022-06-23 09:14:46 +0200 <triteraflops> so there's some kind of deep optimisation that gives you copy elimination for free?
2022-06-23 09:14:55 +0200 <davean> Its not even deep
2022-06-23 09:15:12 +0200 <triteraflops> or like general
2022-06-23 09:15:18 +0200 <davean> And I explained to you how simpler forms come out explicitly from being lazy, this is a deeper one but it's very shallow
2022-06-23 09:15:20 +0200 <davean> yes
2022-06-23 09:15:29 +0200 <triteraflops> a general optimisation that also works on the assembly operations as they apply to the vector...
2022-06-23 09:15:30 +0200 <davean> in general you can eliminate most of this stuff in pure code
2022-06-23 09:15:40 +0200 <davean> Yes
2022-06-23 09:15:49 +0200 <davean> This is what optimizing compilers do, and what they've done for ages
2022-06-23 09:16:01 +0200 <davean> Haskell has it better because more of these optimizations apply more of the time because its pure
2022-06-23 09:16:04 +0200taeaad(~taeaad@user/taeaad)
2022-06-23 09:16:36 +0200 <davean> but yes, we've basically been over the basics of what happened to compilers in the 1980s
2022-06-23 09:16:36 +0200lortabac(~lortabac@2a01:e0a:541:b8f0:fd56:2c1b:68f1:73e4) (Quit: WeeChat 2.8)
2022-06-23 09:17:21 +0200 <davean> This "magic" can be sensitive to disruption, and GHC isn't the best at doing it well, but yeah, this is very standard, entirely general code optimization stuff
2022-06-23 09:17:43 +0200 <davean> That can, and often does in practice, apply to literally any Haskell you write
2022-06-23 09:18:19 +0200 <davean> which is part of what Axman6's comment about case statements got to actually
2022-06-23 09:18:42 +0200 <triteraflops> This is getting interesting now.
2022-06-23 09:19:14 +0200lortabac(~lortabac@2a01:e0a:541:b8f0:1762:327:502f:fc6c)
2022-06-23 09:19:26 +0200 <davean> https://en.wikipedia.org/wiki/Static_single-assignment_form you'll note it is a subset of continuation passing style, and *that* is deeply related to functional programming
2022-06-23 09:19:34 +0200 <davean> so it shouldn't surprise you it applies :)
2022-06-23 09:19:43 +0200 <davean> SSA is just a particularly easy to understand optimization
2022-06-23 09:20:07 +0200 <triteraflops> aaah, SSA was ticking some memories
2022-06-23 09:20:17 +0200 <triteraflops> ***tickling
2022-06-23 09:20:27 +0200 <merijn> Essentially, the main topic of optimisation in our compiler course was: here's some neat optimisations
2022-06-23 09:20:45 +0200 <merijn> here's why you can't do any of them in C, because they're near impossible to prove correct in a mutable setting
2022-06-23 09:20:57 +0200 <triteraflops> spirv uses ssa
2022-06-23 09:21:21 +0200 <merijn> triteraflops: Nearly any compiler/optimisation tool that is remotely serious/relevant uses SSA ;)
2022-06-23 09:21:33 +0200 <davean> and yeah, triteraflops, we have ways to do stuff even better than the compiler in Haskell, but they're extensions, not standards-compliant Haskell
2022-06-23 09:22:01 +0200 <triteraflops> well, it looks like if something doesn't need extra copies, it likely won't incur them
2022-06-23 09:22:08 +0200 <davean> merijn: right! Optimizations actually apply in general to Haskell because of purity :)
2022-06-23 09:22:13 +0200 <triteraflops> with some provisos lol
2022-06-23 09:22:41 +0200 <davean> triteraflops: I mean uh, shouldn't. Again, GHC not PARTICULARLY great at reliably applying what it has
2022-06-23 09:22:45 +0200 <davean> but its pretty good
2022-06-23 09:22:54 +0200 <davean> you know like 80% of the time it'll work :)
2022-06-23 09:23:12 +0200 <davean> And another 15% of the time a small wiggle gets it to work
2022-06-23 09:23:38 +0200moet(~moet@mobile-166-171-250-122.mycingular.net) (Ping timeout: 246 seconds)
2022-06-23 09:23:41 +0200 <davean> but yah triteraflops when I decided to engage in this conversation I knew it was going to be a long one
2022-06-23 09:23:50 +0200 <davean> perspective changes take a while
2022-06-23 09:24:29 +0200 <triteraflops> well, um, you're also not super good at communicating your ideas lol
2022-06-23 09:24:42 +0200MajorBiscuit(~MajorBisc@c-001-001-031.client.tudelft.eduvpn.nl) (Quit: WeeChat 3.5)
2022-06-23 09:24:45 +0200 <triteraflops> but it's working
2022-06-23 09:24:50 +0200nate4(~nate@98.45.169.16)
2022-06-23 09:24:54 +0200 <triteraflops> I am starting to understand what you mean.
2022-06-23 09:24:55 +0200 <davean> I should get to bed now though
2022-06-23 09:25:12 +0200MajorBiscuit(~MajorBisc@c-001-001-031.client.tudelft.eduvpn.nl)
2022-06-23 09:25:14 +0200jakalx(~jakalx@base.jakalx.net) (Disconnected: Replaced by new connection)
2022-06-23 09:25:15 +0200jakalx(~jakalx@base.jakalx.net)
2022-06-23 09:26:20 +0200 <davean> "merijn here's why you can't do any of them in C, because they're near impossible to prove correct in a mutable setting" <--- it still is sinking in even today just how important purity is for optimizing code.
2022-06-23 09:28:01 +0200 <triteraflops> So, I started with the impression that haskell couldn't mutate large objects, even when it would be safe.
2022-06-23 09:28:12 +0200 <triteraflops> But now, I see that it actually can.
2022-06-23 09:28:44 +0200 <triteraflops> The 3x set example with the vector demonstrates this.
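For reference, a minimal sketch (mine, not the exact example from the conversation) of a "set three slots" on a vector behind a pure interface: Data.Vector.modify runs the supplied ST action destructively, on the original vector when that is safe and on a single copy otherwise, so the three writes cost at most one copy.

    import qualified Data.Vector as V
    import qualified Data.Vector.Mutable as MV

    -- Callers see an ordinary pure function Vector -> Vector; internally the
    -- writes happen in place on (at most) one private copy.
    setThree :: V.Vector Int -> V.Vector Int
    setThree = V.modify $ \mv -> do
      MV.write mv 0 10
      MV.write mv 1 20
      MV.write mv 2 30

    -- e.g. setThree (V.fromList [1..5]) == V.fromList [10,20,30,4,5]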
2022-06-23 09:30:29 +0200odnes(~odnes@5-203-220-108.pat.nym.cosmote.net)
2022-06-23 09:30:50 +0200merijn(~merijn@c-001-001-018.client.esciencecenter.eduvpn.nl) (Quit: leaving)
2022-06-23 09:31:08 +0200caasih(sid13241@id-13241.ilkley.irccloud.com) (Quit: Updating details, brb)
2022-06-23 09:31:17 +0200caasih(sid13241@id-13241.ilkley.irccloud.com)
2022-06-23 09:32:02 +0200gurkenglas(~gurkengla@dslb-002-207-014-022.002.207.pools.vodafone-ip.de) (Ping timeout: 246 seconds)
2022-06-23 09:35:14 +0200vpan(~0@212.117.1.172)
2022-06-23 09:35:20 +0200benin0(~benin@183.82.30.117)
2022-06-23 09:39:00 +0200 <davean> Briefly back to mention you might notice SSA is directly related, as a form of non-strictness, but I have to leave you to consider that on your own.
2022-06-23 09:41:38 +0200 <triteraflops> So, haskell basically sorta can mutate large objects in cases like the 3x set case above. But it would be inaccurate to call it mutation, per se, because the three sets are not time ordered operations being applied individually to an array. SSA is fusing them.
2022-06-23 09:41:40 +0200neoatnebula(~neoatnebu@49.206.16.59) (Quit: Client closed)
2022-06-23 09:42:12 +0200 <triteraflops> into a single copy+modify operation
2022-06-23 09:43:39 +0200 <triteraflops> and yeah, this potential reordering and fusing is a kind of non strictness.
2022-06-23 09:44:36 +0200gmg(~user@user/gehmehgeh)
2022-06-23 09:51:24 +0200machinedgod(~machinedg@66.244.246.252)
2022-06-23 09:54:00 +0200odnes(~odnes@5-203-220-108.pat.nym.cosmote.net) (Ping timeout: 272 seconds)
2022-06-23 09:54:39 +0200odnes(~odnes@5-203-220-108.pat.nym.cosmote.net)
2022-06-23 09:56:35 +0200chele(~chele@user/chele)
2022-06-23 09:58:40 +0200merijn(~merijn@86-86-29-250.fixed.kpn.net)
2022-06-23 10:00:13 +0200tzh(~tzh@c-24-21-73-154.hsd1.or.comcast.net) (Quit: zzz)
2022-06-23 10:04:00 +0200 <merijn> @tell Hecate I disagree with your assessment that ASCII literals for ByteString should die. It's incredibly nice for a lot of the "text" based protocols (HTTP/SMTP/etc.), what should die is error-less partial conversions (see also my comment on the bytestring issue), I already campaigned for that, I dunno, 7 years ago.
2022-06-23 10:04:00 +0200 <lambdabot> Consider it noted.
2022-06-23 10:06:42 +0200 <Hecate> merijn: I see
2022-06-23 10:08:09 +0200superbil(~superbil@1-34-176-171.hinet-ip.hinet.net) (Ping timeout: 265 seconds)
2022-06-23 10:08:25 +0200 <merijn> Hecate: also, if you're curious about semi-cursed application of ByteString's ForeignPtr I got those too :p
2022-06-23 10:08:36 +0200superbil(~superbil@1-34-176-171.hinet-ip.hinet.net)
2022-06-23 10:08:51 +0200 <merijn> Hecate: https://github.com/merijn/Belewitte/blob/master/benchmark-analysis/src/Utils/Vector.hs
2022-06-23 10:11:44 +0200cfricke(~cfricke@user/cfricke)
2022-06-23 10:13:49 +0200 <Hecate> merijn: thank you :)
2022-06-23 10:14:03 +0200 <Hecate> merijn: btw, do you know how I could check for memory fragmentation of pinned bytestrings?
2022-06-23 10:14:57 +0200 <maerwald[m]> https://well-typed.com/blog/2021/01/fragmentation-deeper-look/
2022-06-23 10:15:58 +0200ezzieyguywuf(~Unknown@user/ezzieyguywuf) (Ping timeout: 240 seconds)
2022-06-23 10:16:14 +0200 <Hecate> thanks maerwald[m] <3
2022-06-23 10:17:08 +0200 <merijn> Hecate: High-level guess work from my side says that they shouldn't fragment the haskell heap at all
2022-06-23 10:17:44 +0200 <merijn> Ah, hmm, pack maybe does
2022-06-23 10:19:43 +0200 <merijn> I dunno the details of the mutablePinnedByteArray# stuff enough
2022-06-23 10:20:01 +0200 <lortabac> triteraflops: there are several concepts involved, one is sharing which makes you for example the parts of a data-structure that are not modified
2022-06-23 10:20:52 +0200 <triteraflops> Is there a word missing in that sentence?
2022-06-23 10:20:59 +0200 <triteraflops> I'm getting parse errors lol
2022-06-23 10:21:11 +0200 <lortabac> one is rewrite rules, which are ad-hoc transformations based on templates
2022-06-23 10:21:41 +0200 <triteraflops> I knew about list fusion
2022-06-23 10:21:59 +0200 <lortabac> yes, list fusion is performed through rewrite rules
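For reference, this is roughly what a rewrite rule looks like; the map/map rule below is the textbook example (base's real list fusion is phrased via build/foldr, but it uses the same RULES mechanism), and the module name is made up.

    module MapFusionDemo where

    -- With optimisation on (rewrite rules are enabled by -O), GHC may replace any
    -- occurrence of the left-hand side with the right-hand side, collapsing two
    -- list traversals into one.
    {-# RULES
    "map/map" forall f g xs.  map f (map g xs) = map (f . g) xs
      #-}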
2022-06-23 10:22:29 +0200 <lortabac> from what I understood, the discussion was quite abstract and detached from what GHC really does
2022-06-23 10:22:53 +0200 <triteraflops> But I couldn't see how rewrite rules could fuse multiple inserts into a hashmap or multiple vector mutations
2022-06-23 10:22:55 +0200 <lortabac> in fact GHC does not optimize the vector example through SSA
2022-06-23 10:23:03 +0200 <triteraflops> That was the part I thought haskell couldn't do at all
2022-06-23 10:23:09 +0200 <triteraflops> or rather couldn't optimise at all
2022-06-23 10:23:10 +0200ccntrq(~Thunderbi@dynamic-095-112-145-116.95.112.pool.telefonica.de)
2022-06-23 10:24:09 +0200 <lortabac> triteraflops: oh sorry a word was missing indeed, I meant "makes you reuse the parts..."
2022-06-23 10:25:03 +0200 <triteraflops> yeah I knew about subcomponent reuse
2022-06-23 10:25:23 +0200 <triteraflops> and again considered it irrelevant to the vector example, like fusion
2022-06-23 10:27:10 +0200 <lortabac> yes, sharing does not make vectors fuse magically
2022-06-23 10:28:00 +0200 <triteraflops> I can think of other examples involving vectors. Like an iteration in conway's game of life for instance
2022-06-23 10:29:43 +0200 <triteraflops> or an iteration step comprised of several linear operations on a vector
2022-06-23 10:29:51 +0200 <lortabac> I think you can achieve fusion in vector by using their stream interface
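In practice that mostly means composing the ordinary Data.Vector combinators, which are built on that stream interface; a hypothetical pipeline like the one below is intended to compile (with -O) to a single loop with no intermediate vectors, though how completely it fuses depends on the GHC and vector versions.

    import qualified Data.Vector.Unboxed as U

    -- Written as four separate passes, but vector's fusion framework is designed
    -- to collapse the whole pipeline into one traversal of the input.
    pipeline :: U.Vector Double -> Double
    pipeline = U.sum . U.map (* 2) . U.map (+ 1) . U.filter (> 0)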
2022-06-23 10:30:36 +0200 <triteraflops> well, I think davean is basically right about the assembly level optimisation catching a lot of this.
2022-06-23 10:31:40 +0200 <lortabac> in theory yes, but I don't think GHC actually does it
2022-06-23 10:32:03 +0200 <lortabac> optimizations happen at a much higher level in GHC
2022-06-23 10:32:15 +0200Pickchea(~private@user/pickchea)
2022-06-23 10:32:27 +0200eggplant_(~Eggplanta@108-201-191-115.lightspeed.sntcca.sbcglobal.net) (Remote host closed the connection)
2022-06-23 10:33:03 +0200eggplantade(~Eggplanta@2600:1700:bef1:5e10:99c9:a0a4:f69e:b22e)
2022-06-23 10:33:46 +0200 <triteraflops> The way of doing an iteration 10 linear operations is to do one initial copy to v, allocate a new array, w, then do W = Av, v = Aw, etc.
2022-06-23 10:33:55 +0200 <triteraflops> *iteration comprised of
2022-06-23 10:34:24 +0200 <triteraflops> or maybe v = Bw
2022-06-23 10:35:52 +0200 <triteraflops> If you try inlining 10 matrix vector operations, the result will be a total fucking mess
2022-06-23 10:36:17 +0200 <triteraflops> maybe
2022-06-23 10:36:56 +0200alp_(~alp@user/alp) (Remote host closed the connection)
2022-06-23 10:37:04 +0200 <triteraflops> actually maybe not lol. The compiler may inline it to a single matrix vector op without realising it lol
2022-06-23 10:37:22 +0200dlbh^(~dlbh@50.237.44.186)
2022-06-23 10:37:48 +0200 <triteraflops> so, inlining
2022-06-23 10:37:54 +0200eggplantade(~Eggplanta@2600:1700:bef1:5e10:99c9:a0a4:f69e:b22e) (Ping timeout: 264 seconds)
2022-06-23 10:38:20 +0200 <triteraflops> inlining is transparent at the assembly / SSA level
2022-06-23 10:38:38 +0200alp(~alp@user/alp)
2022-06-23 10:40:51 +0200 <triteraflops> If there are nonlinear operations between the linear operations, inlining will definitely produce a huge mess. Or at least it could, if the operation is expanded once for every element of the output array
2022-06-23 10:41:46 +0200 <triteraflops> assembly level SSA optimisation should be able to determine what temporary variables are the most useful to keep and share.
2022-06-23 10:44:24 +0200dobblego(~dibblego@122-199-1-30.ip4.superloop.com)
2022-06-23 10:44:24 +0200dobblego(~dibblego@122-199-1-30.ip4.superloop.com) (Changing host)
2022-06-23 10:44:24 +0200dobblego(~dibblego@haskell/developer/dibblego)
2022-06-23 10:44:50 +0200machinedgod(~machinedg@66.244.246.252) (Remote host closed the connection)
2022-06-23 10:45:47 +0200dibblego(~dibblego@haskell/developer/dibblego) (Ping timeout: 255 seconds)
2022-06-23 10:45:47 +0200dobblegodibblego
2022-06-23 10:46:07 +0200machinedgod(~machinedg@66.244.246.252)
2022-06-23 10:48:49 +0200mima(~mmh@aftr-62-216-210-68.dynamic.mnet-online.de)
2022-06-23 10:49:18 +0200vglfr(~vglfr@88.155.20.3) (Ping timeout: 264 seconds)
2022-06-23 10:52:11 +0200z0k(~z0k@206.84.141.12) (Ping timeout: 246 seconds)
2022-06-23 10:52:29 +0200ccntrq1(~Thunderbi@dynamic-095-112-145-116.95.112.pool.telefonica.de)
2022-06-23 10:53:49 +0200odnes(~odnes@5-203-220-108.pat.nym.cosmote.net) (Quit: Leaving)
2022-06-23 10:54:10 +0200ccntrq(~Thunderbi@dynamic-095-112-145-116.95.112.pool.telefonica.de) (Ping timeout: 240 seconds)
2022-06-23 10:54:10 +0200ccntrq1ccntrq
2022-06-23 10:54:11 +0200z0k(~z0k@206.84.141.12)
2022-06-23 11:04:16 +0200odnes(~odnes@5-203-220-108.pat.nym.cosmote.net)
2022-06-23 11:04:30 +0200nate4(~nate@98.45.169.16) (Ping timeout: 240 seconds)
2022-06-23 11:05:58 +0200mattil(~mattil@helsinki.portalify.com)
2022-06-23 11:06:26 +0200gurkenglas(~gurkengla@dslb-002-207-014-022.002.207.pools.vodafone-ip.de)
2022-06-23 11:09:05 +0200rembo10_(~rembo10@main.remulis.com) (Quit: ZNC 1.8.2 - https://znc.in)
2022-06-23 11:09:24 +0200ccntrq1(~Thunderbi@dynamic-095-112-145-116.95.112.pool.telefonica.de)
2022-06-23 11:09:41 +0200rembo10(~rembo10@main.remulis.com)
2022-06-23 11:10:54 +0200ccntrq(~Thunderbi@dynamic-095-112-145-116.95.112.pool.telefonica.de) (Ping timeout: 264 seconds)
2022-06-23 11:10:54 +0200ccntrq1ccntrq
2022-06-23 11:16:49 +0200jgeerds(~jgeerds@55d45f48.access.ecotel.net)
2022-06-23 11:19:18 +0200xff0x(~xff0x@125x103x176x34.ap125.ftth.ucom.ne.jp) (Ping timeout: 264 seconds)
2022-06-23 11:21:41 +0200ccntrq1(~Thunderbi@dynamic-095-112-145-116.95.112.pool.telefonica.de)
2022-06-23 11:23:55 +0200ccntrq(~Thunderbi@dynamic-095-112-145-116.95.112.pool.telefonica.de) (Ping timeout: 256 seconds)
2022-06-23 11:23:55 +0200ccntrq1ccntrq
2022-06-23 11:30:36 +0200Vq(~vq@90-227-195-41-no77.tbcn.telia.com) (Ping timeout: 244 seconds)
2022-06-23 11:30:42 +0200ccntrq1(~Thunderbi@dynamic-095-112-145-116.95.112.pool.telefonica.de)
2022-06-23 11:31:54 +0200raehik(~raehik@cpc95906-rdng25-2-0-cust156.15-3.cable.virginm.net)
2022-06-23 11:31:59 +0200mecharyuujin(~mecharyuu@2409:4050:2d4b:a853:8048:c716:f88e:d09f)
2022-06-23 11:33:26 +0200ccntrq(~Thunderbi@dynamic-095-112-145-116.95.112.pool.telefonica.de) (Ping timeout: 272 seconds)
2022-06-23 11:33:26 +0200ccntrq1ccntrq
2022-06-23 11:35:53 +0200nate4(~nate@98.45.169.16)
2022-06-23 11:37:50 +0200chomwitt(~chomwitt@2a02:587:dc0d:e600:4907:a32:4c72:2e8c)
2022-06-23 11:39:15 +0200shriekingnoise(~shrieking@201.212.175.181) (Quit: Quit)
2022-06-23 11:43:49 +0200ccntrq1(~Thunderbi@dynamic-095-112-145-116.95.112.pool.telefonica.de)
2022-06-23 11:45:23 +0200ccntrq(~Thunderbi@dynamic-095-112-145-116.95.112.pool.telefonica.de) (Ping timeout: 268 seconds)
2022-06-23 11:45:23 +0200ccntrq1ccntrq
2022-06-23 11:49:10 +0200odnes(~odnes@5-203-220-108.pat.nym.cosmote.net) (Ping timeout: 240 seconds)
2022-06-23 11:49:43 +0200unit73e(~emanuel@2001:818:e8dd:7c00:32b5:c2ff:fe6b:5291)
2022-06-23 11:53:59 +0200leib(~leib@2405:201:900a:f088:60b3:aae8:bd87:1f5f)
2022-06-23 11:56:18 +0200ccntrq1(~Thunderbi@dynamic-095-112-145-116.95.112.pool.telefonica.de)
2022-06-23 11:56:52 +0200unit73e(~emanuel@2001:818:e8dd:7c00:32b5:c2ff:fe6b:5291) (Ping timeout: 272 seconds)
2022-06-23 11:57:59 +0200ccntrq(~Thunderbi@dynamic-095-112-145-116.95.112.pool.telefonica.de) (Ping timeout: 246 seconds)
2022-06-23 11:57:59 +0200ccntrq1ccntrq
2022-06-23 11:58:56 +0200zaquest(~notzaques@5.130.79.72) (Remote host closed the connection)
2022-06-23 12:00:32 +0200zaquest(~notzaques@5.130.79.72)
2022-06-23 12:01:53 +0200gurkenglas(~gurkengla@dslb-002-207-014-022.002.207.pools.vodafone-ip.de) (Ping timeout: 256 seconds)
2022-06-23 12:15:57 +0200stiell(~stiell@gateway/tor-sasl/stiell) (Remote host closed the connection)
2022-06-23 12:16:36 +0200stiell(~stiell@gateway/tor-sasl/stiell)
2022-06-23 12:18:48 +0200Surobaki(~surobaki@137.44.222.80)
2022-06-23 12:19:16 +0200econo(uid147250@user/econo) (Quit: Connection closed for inactivity)
2022-06-23 12:22:43 +0200ccntrq1(~Thunderbi@dynamic-095-112-145-116.95.112.pool.telefonica.de)
2022-06-23 12:23:11 +0200raehik(~raehik@cpc95906-rdng25-2-0-cust156.15-3.cable.virginm.net) (Ping timeout: 246 seconds)
2022-06-23 12:24:26 +0200vysn(~vysn@user/vysn) (Read error: Connection reset by peer)
2022-06-23 12:25:22 +0200ccntrq(~Thunderbi@dynamic-095-112-145-116.95.112.pool.telefonica.de) (Ping timeout: 272 seconds)
2022-06-23 12:25:22 +0200ccntrq1ccntrq
2022-06-23 12:34:41 +0200raehik(~raehik@cpc95906-rdng25-2-0-cust156.15-3.cable.virginm.net)
2022-06-23 12:35:04 +0200 <dminuoso> triteraflops: By the way, you might be interested in uniqueness types in Clean, which allows for non-heuristic in-place mutation
2022-06-23 12:35:28 +0200yauhsien(~yauhsien@61-231-23-53.dynamic-ip.hinet.net) (Remote host closed the connection)
2022-06-23 12:36:50 +0200 <dminuoso> triteraflops: So there, the compiler can turn a "produce an altered copy" into "mutate in place" if its safe to do so
2022-06-23 12:37:45 +0200jmdaemon(~jmdaemon@user/jmdaemon) (Ping timeout: 248 seconds)
2022-06-23 12:38:44 +0200EggGuest(~EggGuest@n114-74-2-39.bla3.nsw.optusnet.com.au)
2022-06-23 12:39:03 +0200 <EggGuest> Hello
2022-06-23 12:39:21 +0200EggGuest(~EggGuest@n114-74-2-39.bla3.nsw.optusnet.com.au) (Client Quit)
2022-06-23 12:43:11 +0200yauhsien(~yauhsien@61-231-23-53.dynamic-ip.hinet.net)
2022-06-23 12:47:01 +0200ccntrq1(~Thunderbi@dynamic-095-112-145-116.95.112.pool.telefonica.de)
2022-06-23 12:47:39 +0200cosimone(~user@93-44-186-171.ip98.fastwebnet.it) (Remote host closed the connection)
2022-06-23 12:48:11 +0200zmt01(~zmt00@user/zmt00) (Ping timeout: 255 seconds)
2022-06-23 12:48:42 +0200ccntrq(~Thunderbi@dynamic-095-112-145-116.95.112.pool.telefonica.de) (Ping timeout: 264 seconds)
2022-06-23 12:48:42 +0200ccntrq1ccntrq
2022-06-23 12:50:30 +0200geekosaur(~geekosaur@xmonad/geekosaur) (Ping timeout: 264 seconds)
2022-06-23 12:52:24 +0200geekosaur(~geekosaur@xmonad/geekosaur)
2022-06-23 12:56:16 +0200PiDelport(uid25146@id-25146.lymington.irccloud.com)
2022-06-23 12:57:40 +0200cosimone(~user@2001:b07:ae5:db26:57c7:21a5:6e1c:6b81)
2022-06-23 13:00:59 +0200merijn(~merijn@86-86-29-250.fixed.kpn.net) (Ping timeout: 246 seconds)
2022-06-23 13:02:01 +0200ezzieyguywuf(~Unknown@user/ezzieyguywuf)
2022-06-23 13:03:07 +0200lortabac(~lortabac@2a01:e0a:541:b8f0:1762:327:502f:fc6c) (Quit: WeeChat 2.8)
2022-06-23 13:05:12 +0200coot(~coot@213.134.190.95) (Quit: coot)
2022-06-23 13:07:18 +0200raehik(~raehik@cpc95906-rdng25-2-0-cust156.15-3.cable.virginm.net) (Ping timeout: 264 seconds)
2022-06-23 13:09:10 +0200jakalx(~jakalx@base.jakalx.net) ()
2022-06-23 13:10:09 +0200raehik(~raehik@cpc95906-rdng25-2-0-cust156.15-3.cable.virginm.net)
2022-06-23 13:10:29 +0200nate4(~nate@98.45.169.16) (Ping timeout: 268 seconds)
2022-06-23 13:11:02 +0200 <vpan> hi, I'm trying to load CSV data using cassava that has separator records, which I'm trying to detect with a `parseRecord` guard in a `FromRecord` instance. A guard condition like `not $ null (v .! 11)` does not work. How do I check for emptiness of a field?
2022-06-23 13:11:19 +0200king_gs(~Thunderbi@187.201.91.195)
2022-06-23 13:14:14 +0200superbil(~superbil@1-34-176-171.hinet-ip.hinet.net) (Ping timeout: 265 seconds)
2022-06-23 13:14:39 +0200superbil(~superbil@1-34-176-171.hinet-ip.hinet.net)
2022-06-23 13:15:11 +0200ccntrq1(~Thunderbi@dynamic-095-112-145-116.95.112.pool.telefonica.de)
2022-06-23 13:16:47 +0200jakalx(~jakalx@base.jakalx.net)
2022-06-23 13:17:56 +0200ccntrq(~Thunderbi@dynamic-095-112-145-116.95.112.pool.telefonica.de) (Ping timeout: 272 seconds)
2022-06-23 13:17:56 +0200ccntrq1ccntrq
2022-06-23 13:18:17 +0200mecharyuujin(~mecharyuu@2409:4050:2d4b:a853:8048:c716:f88e:d09f) (Ping timeout: 248 seconds)
2022-06-23 13:19:34 +0200mecharyuujin(~mecharyuu@2405:204:302a:37df:1901:27c8:4070:e6e2)
2022-06-23 13:19:36 +0200 <dminuoso> vpan: What do you mean by "does not work"?
2022-06-23 13:21:04 +0200 <vpan> dminuoso: No instance for (Foldable Parser) arising from a use of ‘null’
2022-06-23 13:21:23 +0200 <dminuoso> :t null
2022-06-23 13:21:24 +0200 <lambdabot> Foldable t => t a -> Bool
2022-06-23 13:21:42 +0200 <dminuoso> (.!) :: FromField a => Record -> Int -> Parser a
2022-06-23 13:22:05 +0200 <dminuoso> vpan: You probably meant to apply `null` to the result of the parser (v .! 11), not the parser itself
2022-06-23 13:22:38 +0200 <dminuoso> So do something like `do { r <- v .! 11; guard (not (null r)); .... }`
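A minimal sketch of that suggestion, assuming a hypothetical two-field Row type and that index 11 is the column that is empty on separator records (the real record type and indices are vpan's):

    import Control.Applicative (empty)
    import Control.Monad (guard)
    import qualified Data.ByteString as B
    import Data.Csv

    -- Hypothetical record; it exists only to illustrate the guard on field 11.
    data Row = Row { rowKey :: !B.ByteString, rowField11 :: !B.ByteString }

    instance FromRecord Row where
      parseRecord v
        | length v >= 12 = do
            key <- v .! 0
            f11 <- v .! 11
            guard (not (B.null f11))   -- reject separator records: empty field 11
            pure (Row key f11)
        | otherwise = empty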
2022-06-23 13:25:48 +0200lyle(~lyle@104.246.145.85)
2022-06-23 13:27:48 +0200merijn(~merijn@c-001-001-027.client.esciencecenter.eduvpn.nl)
2022-06-23 13:30:54 +0200ccntrq1(~Thunderbi@dynamic-095-112-145-116.95.112.pool.telefonica.de)
2022-06-23 13:30:54 +0200king_gs(~Thunderbi@187.201.91.195) (Read error: Connection reset by peer)
2022-06-23 13:32:18 +0200king_gs(~Thunderbi@187.201.91.195)
2022-06-23 13:32:41 +0200ccntrq(~Thunderbi@dynamic-095-112-145-116.95.112.pool.telefonica.de) (Ping timeout: 268 seconds)
2022-06-23 13:32:41 +0200ccntrq1ccntrq
2022-06-23 13:35:09 +0200dlbh^(~dlbh@50.237.44.186) (Ping timeout: 268 seconds)
2022-06-23 13:36:16 +0200 <maerwald[m]> guard always feels like goto to me
2022-06-23 13:39:08 +0200leeb(~leeb@KD106154144179.au-net.ne.jp) (Ping timeout: 246 seconds)
2022-06-23 13:39:31 +0200chele_(~chele@user/chele)
2022-06-23 13:40:27 +0200cheleGuest5576
2022-06-23 13:40:27 +0200Guest5576(~chele@user/chele) (Killed (strontium.libera.chat (Nickname regained by services)))
2022-06-23 13:40:27 +0200chele_chele
2022-06-23 13:41:25 +0200 <dminuoso> maerwald[m]: Sure, that's what short-circuiting `if cond then return ... else ...` does in traditional languages. :)
2022-06-23 13:42:33 +0200mecharyuujin(~mecharyuu@2405:204:302a:37df:1901:27c8:4070:e6e2) (Ping timeout: 268 seconds)
2022-06-23 13:43:04 +0200mecharyuujin(~mecharyuu@2409:4050:2d4b:a853:8048:c716:f88e:d09f)
2022-06-23 13:43:05 +0200 <dminuoso> It's a bit bizarre you see this negativity towards `goto` in C, while every single `if/then/else` is essentially just a jnz or equivalent in disguise
2022-06-23 13:46:23 +0200 <vpan> dminuoso: you're right, the process of a Parser becoming the result type is still a bit magic to me. :) Trying to see if I can wrap the do block in a function and use it as a guard condition. Using an `if` in the function definition feels too imperative. :)
2022-06-23 13:47:04 +0200ccntrq1(~Thunderbi@dynamic-095-112-145-116.95.112.pool.telefonica.de)
2022-06-23 13:47:44 +0200 <maerwald[m]> dminuoso: I use goto all over the place in C to jump to cleanup chunks
2022-06-23 13:47:48 +0200 <maerwald[m]> But it confuses me in Haskell
2022-06-23 13:48:10 +0200 <dminuoso> vpan: In Haskell we split things that traditional languages conflate into one thing.
2022-06-23 13:48:56 +0200ccntrq(~Thunderbi@dynamic-095-112-145-116.95.112.pool.telefonica.de) (Ping timeout: 246 seconds)
2022-06-23 13:48:56 +0200ccntrq1ccntrq
2022-06-23 13:49:02 +0200 <dminuoso> vpan: The act of *evaluating* (v .! 11) does not give you the value of the field back. Instead, it can be thought of some parser computation that, if executed against some text, would then give you the field.
2022-06-23 13:50:28 +0200superbil(~superbil@1-34-176-171.hinet-ip.hinet.net) (Ping timeout: 265 seconds)
2022-06-23 13:51:12 +0200 <dminuoso> maerwald[m]: Well you can always ContT into complete confusion.
2022-06-23 13:51:43 +0200 <dminuoso> Twice the power, 8 times the complexity.
2022-06-23 13:51:44 +0200king_gs(~Thunderbi@187.201.91.195) (Read error: Connection reset by peer)
2022-06-23 13:52:11 +0200 <maerwald[m]> Yeah, ContT I refuse to use
2022-06-23 13:52:14 +0200mc47(~mc47@xmonad/TheMC47)
2022-06-23 13:53:02 +0200king_gs(~Thunderbi@187.201.91.195)
2022-06-23 13:53:44 +0200 <dminuoso> vpan: The thing is, <- does not "make it become the result", its just syntax sugar around >>=
2022-06-23 13:53:57 +0200superbil(~superbil@1-34-176-171.hinet-ip.hinet.net)
2022-06-23 13:54:28 +0200 <vpan> dminuoso: right, binding the result to a variable invokes the "execution" you mentioned, similar to IO actions
2022-06-23 13:54:36 +0200 <dminuoso> vpan: Not really.
2022-06-23 13:54:44 +0200 <dminuoso> It doesnt invoke the execution
2022-06-23 13:55:06 +0200 <dminuoso> look at: (v .! 11) >>= \r -> if (null r) then ... else ...
2022-06-23 13:55:12 +0200 <dminuoso> It's really rather a kind of continuation
2022-06-23 13:55:31 +0200 <dminuoso> And the result of >>= computes a larger, more complex "execution"
2022-06-23 13:55:39 +0200 <dminuoso> But it doesnt "invoke" it
2022-06-23 13:56:02 +0200 <maerwald[m]> Aren't nix derivations a form of ContT? xD
2022-06-23 13:56:22 +0200 <dminuoso> maerwald[m]: Mmm in what sense?
2022-06-23 13:56:32 +0200 <dminuoso> Derivations are just simplistic functions
2022-06-23 13:58:53 +0200 <maerwald[m]> dminuoso: what do you think of PEP 383
2022-06-23 13:59:13 +0200 <vpan> dminuoso: ok, so binding to a name is not the point at which the execution happens, we continue to build boxes upon boxes until the result is required and that's when all the boxes spring to life :)
2022-06-23 13:59:45 +0200 <dminuoso> vpan: https://gist.github.com/dminuoso/3f700d36912f5a2932dc9c476d9ede3d
2022-06-23 14:00:01 +0200 <dminuoso> vpan: Do you agree that the mere act of writing Recipe3 does not actually *do* *cooking*?
2022-06-23 14:00:29 +0200 <vpan> sure
2022-06-23 14:00:42 +0200 <dminuoso> And similarly, Recipe3 itself is not *actual* *cooking* right?
2022-06-23 14:00:44 +0200 <dminuoso> It's just a recipe
2022-06-23 14:01:16 +0200shiraeeshi(~shiraeesh@46.34.206.119)
2022-06-23 14:02:16 +0200merijn(~merijn@c-001-001-027.client.esciencecenter.eduvpn.nl) (Ping timeout: 272 seconds)
2022-06-23 14:02:30 +0200 <vpan> the part where you feed in a result you don't yet have feels a bit strange, but I guess that's how lazy evaluation works
2022-06-23 14:02:42 +0200 <dminuoso> It has nothing to do with lazy evaluation
2022-06-23 14:02:47 +0200 <dminuoso> vpan: Is recipe3 lazy?
2022-06-23 14:03:30 +0200 <dminuoso> Let me make a less convoluted example
2022-06-23 14:03:38 +0200 <dminuoso> Or rather
2022-06-23 14:03:55 +0200 <dminuoso> "Given some whipped eggs, add sugar and whip in a bowl" this description itself has nothing to do with lazyness
2022-06-23 14:04:16 +0200 <dminuoso> It's a kind of continuation, where you assume that by some undefined process you already have whipped eggs, how do you carry on
2022-06-23 14:05:11 +0200 <dminuoso> Traditional programming works the same. In C, if a variable `x` is in scope, and you refer to it, you assume that, by some undefined prior process, x has been populated by a value.
2022-06-23 14:05:50 +0200 <dminuoso> This is sometimes even done explicitly, you may know it as callback style
2022-06-23 14:06:32 +0200 <geekosaur> conversely, purescript uses this same mechanism and is strict
2022-06-23 14:06:54 +0200superbil(~superbil@1-34-176-171.hinet-ip.hinet.net) (Ping timeout: 265 seconds)
2022-06-23 14:07:20 +0200superbil(~superbil@1-34-176-171.hinet-ip.hinet.net)
2022-06-23 14:07:30 +0200mjacob(~mjacob@adrastea.uberspace.de) (Ping timeout: 240 seconds)
2022-06-23 14:07:51 +0200stiell(~stiell@gateway/tor-sasl/stiell) (Ping timeout: 268 seconds)
2022-06-23 14:08:06 +0200 <dminuoso> vpan: Note that this "assuming you have something, what do you do with it" is simply described by just a function.
2022-06-23 14:08:33 +0200 <[Leary]> Yeah, it's not about laziness. It's just clever use of higher order functions (taking functions as arguments) to pretend at having a singular result at hand. Then, (>>=) takes that function and maybe doesn't use it at all, maybe applies it multiple times, etc.
2022-06-23 14:08:35 +0200 <dminuoso> `\f -> <expr>` could be read: given some `f`, give <expr>
2022-06-23 14:09:21 +0200mjacob(~mjacob@adrastea.uberspace.de)
2022-06-23 14:10:55 +0200merijn(~merijn@c-001-001-027.client.esciencecenter.eduvpn.nl)
2022-06-23 14:11:25 +0200jpds1jpds
2022-06-23 14:12:18 +0200 <dminuoso> maerwald[m]: Im really unsure, I think PEP383 is ill-guided in the sense that the real problem its trying to address is lack of a bytestring type.
2022-06-23 14:12:28 +0200 <dminuoso> But that's only after a skim of the PEP
2022-06-23 14:12:35 +0200 <vpan> dminuoso: my understanding is that in C if a `x` is assigned the result of a function call, that function is called at the time x is assigned, i.e. when the assignment statement is executed
2022-06-23 14:12:54 +0200 <dminuoso> (or well, not lack of a bytestring type, but lack of mandating bytestring in in the filesystem apis)
2022-06-23 14:13:27 +0200lisbeths(uid135845@id-135845.lymington.irccloud.com) (Quit: Connection closed for inactivity)
2022-06-23 14:14:01 +0200 <dminuoso> vpan: https://gist.github.com/dminuoso/fd4f2f90e228012b719559ebac1f9ec4
2022-06-23 14:14:10 +0200 <merijn> vpan: Not really if we wanna be super precise
2022-06-23 14:14:27 +0200cosimone(~user@2001:b07:ae5:db26:57c7:21a5:6e1c:6b81) (Remote host closed the connection)
2022-06-23 14:14:35 +0200 <dminuoso> Ah sorry bad example
2022-06-23 14:14:41 +0200 <dminuoso> Updated.
2022-06-23 14:14:50 +0200 <merijn> vpan: C has explicit "sequence points" where "everything that happens before the sequence point is guaranteed to be finished happening after"
2022-06-23 14:14:56 +0200misterfish(~misterfis@ip214-130-173-82.adsl2.static.versatel.nl) (Ping timeout: 272 seconds)
2022-06-23 14:15:04 +0200 <merijn> vpan: But in between sequence points, the ordering and observability is unspecified
2022-06-23 14:15:06 +0200 <dminuoso> vpan: Do you note how line 5 merely assumes that `somewhere before, x has been populated by a value, we dont know how and dont care why`
2022-06-23 14:15:31 +0200 <dminuoso> vpan: we could precisely describe this relationship with a function (or even a c routine): \z -> anotherFunc z
2022-06-23 14:16:18 +0200 <dminuoso> In fact `anotherFunc` already is exactly that routine.
2022-06-23 14:16:46 +0200 <dminuoso> It doesnt know how and where its parameter comes from, its just a mere description of "what to do assuming it had that parameter"
2022-06-23 14:16:54 +0200 <vpan> right, the lambda makes the "given z, invoke anotherFunc" more explicit
2022-06-23 14:17:22 +0200 <dminuoso> In Haskell we codify all these relationships in IO effects with such lambdas.
2022-06-23 14:17:28 +0200 <dminuoso> (assuming you have any such dependency)
2022-06-23 14:17:34 +0200 <dminuoso> : (>>)
2022-06-23 14:17:39 +0200 <dminuoso> :t (>>)
2022-06-23 14:17:40 +0200 <lambdabot> Monad m => m a -> m b -> m b
2022-06-23 14:17:58 +0200 <dminuoso> Lets you talk about sequencing two actions where you discard the result of the first, which is roughly the equivalent of calling a routine in C but not assigning the value to a binder.
2022-06-23 14:18:34 +0200 <dminuoso> For everything else, we just provide it into a lambda instead, and then the lambda by scoping can provide the "result" to all subsequent computations
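To make the sugar concrete, a small IO sketch (mine, not from the gist): the two definitions below are the same program, and merely defining either of them executes nothing.

    -- do-notation is sugar for (>>=) and (>>); building the action does not run it.
    recipeDo :: IO ()
    recipeDo = do
      putStrLn "whip the eggs"
      line <- getLine                    -- assume some prior step hands us a line
      putStrLn ("adding " ++ line)
      putStrLn "whip in a bowl"

    recipeDesugared :: IO ()
    recipeDesugared =
      putStrLn "whip the eggs" >>
      (getLine >>= \line ->              -- the rest is a continuation receiving `line`
        putStrLn ("adding " ++ line) >>
        putStrLn "whip in a bowl")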
2022-06-23 14:19:45 +0200 <shiraeeshi> discussing lambdas, huh
2022-06-23 14:19:50 +0200 <shiraeeshi> gud, gud
2022-06-23 14:20:07 +0200 <merijn> tbh, I would strongly caution against all these references to C :p
2022-06-23 14:20:18 +0200 <merijn> As they mostly don't seem to actually be references to C :p
2022-06-23 14:20:21 +0200stiell(~stiell@gateway/tor-sasl/stiell)
2022-06-23 14:20:35 +0200MajorBiscuit(~MajorBisc@c-001-001-031.client.tudelft.eduvpn.nl) (Ping timeout: 244 seconds)
2022-06-23 14:21:14 +0200 <shiraeeshi> I'm reading Purely Functional Data Structures y Chris Okasaki
2022-06-23 14:21:20 +0200 <shiraeeshi> *by
2022-06-23 14:21:26 +0200 <arahael> I thought the C compiler is basically allowed to ignore sequence point if it determines it's not actually significant?
2022-06-23 14:21:49 +0200 <merijn> arahael: Man, I can't answer that
2022-06-23 14:21:50 +0200cosimone(~user@93-44-186-171.ip98.fastwebnet.it)
2022-06-23 14:21:59 +0200 <merijn> because C's memory model is the stuff of Lovecraftian nightmares
2022-06-23 14:22:08 +0200 <dminuoso> arahael: In C things there's an as-if rule
2022-06-23 14:22:10 +0200 <merijn> I'd literally rather quit than be forced to give any definitive answer on that
2022-06-23 14:22:19 +0200 <arahael> dminuoso: Right, it's that rule I'm thinking about there.
2022-06-23 14:22:28 +0200mikoto-chan(~mikoto-ch@esm-84-240-99-143.netplaza.fi)
2022-06-23 14:22:32 +0200 <shiraeeshi> I was surprised when I learned that the examples are in Standard ML, and examples in Haskell are in the appendix
2022-06-23 14:22:49 +0200 <shiraeeshi> I thought that the book would only use Haskell
2022-06-23 14:23:06 +0200leeb(~leeb@KD106154144179.au-net.ne.jp)
2022-06-23 14:23:37 +0200 <dminuoso> arahael: *Very* roughly, outside of `volatile` and memory barriers, it can - for the most part - do whatever it wants to as long as you cant tell the difference.
2022-06-23 14:24:12 +0200coot(~coot@213.134.190.95)
2022-06-23 14:24:17 +0200 <dminuoso> Contrary to popular belief, C operates on an abstract machine and even has notions of "objects and types in memory"
2022-06-23 14:24:26 +0200bontaq(~user@ool-45779fe5.dyn.optonline.net)
2022-06-23 14:24:28 +0200 <dminuoso> (That is, the semantics are defined on an abstract machine)
2022-06-23 14:24:30 +0200 <dminuoso> It'
2022-06-23 14:24:35 +0200 <arahael> I knew C has an abstract machine, but I didn't know it had that notion.
2022-06-23 14:24:35 +0200 <dminuoso> It's most definitely not a high level assembler language.
2022-06-23 14:24:46 +0200 <dminuoso> You can observe this in the aliasing rules for example
2022-06-23 14:25:21 +0200 <arahael> I'm not actually aware of teh aliasing rules. :(
2022-06-23 14:25:32 +0200 <arahael> I just know that aliasing is a thing in C.
2022-06-23 14:25:42 +0200king_gs(~Thunderbi@187.201.91.195) (Ping timeout: 272 seconds)
2022-06-23 14:26:08 +0200 <dminuoso> So if you have a pointer to a struct T, you are not allowed to treat the memory the pointer points at as struct U.
2022-06-23 14:26:12 +0200 <maerwald[m]> dminuoso: https://www.reddit.com/r/haskell/comments/vivjdo/abstract_filepath_coming_soon
2022-06-23 14:26:16 +0200 <merijn> arahael: Strict aliasing rules say that two pointers with different types where neither is "char" *cannot* overlap or it's UB
2022-06-23 14:26:23 +0200 <dminuoso> If you do this you're in undefined behavior, and in fact compilers will generate all kinds of broken assembly if you do this.
2022-06-23 14:26:30 +0200 <dminuoso> Without any diagnostics.
2022-06-23 14:26:46 +0200 <arahael> Nice. :)
2022-06-23 14:27:01 +0200 <arahael> (But I'm expecting that there's a whole bunch of exceptions people end up using in practice)
2022-06-23 14:27:02 +0200 <dminuoso> Things like reordering writes after reads
2022-06-23 14:27:07 +0200 <dminuoso> Stuff you really dont expect. :)
2022-06-23 14:27:21 +0200 <arahael> :)
2022-06-23 14:27:45 +0200 <dminuoso> arahael: Sure, people often use it because they dont know its disallowed, and it just so happens the compiler (in that version used by the author) doesnt apply aggressive optimizations in those regions.
2022-06-23 14:28:09 +0200 <dminuoso> Ive found about a dozen of such bugs by accident when working with C
2022-06-23 14:28:15 +0200 <arahael> Not surprising. :(
2022-06-23 14:28:23 +0200 <arahael> I try to avoid C in business projects.
2022-06-23 14:28:30 +0200 <arahael> (I prefer literally anything else)
2022-06-23 14:28:46 +0200 <dminuoso> merijn: By the way, they *can* overlap.
2022-06-23 14:28:49 +0200 <dminuoso> That's not the problem.
2022-06-23 14:29:43 +0200 <merijn> dminuoso: It's UB if they do
2022-06-23 14:29:45 +0200 <dminuoso> No its not.
2022-06-23 14:29:57 +0200 <dminuoso> As per 6.5p7, the wording is:
2022-06-23 14:30:16 +0200 <dminuoso> An object shall have its *stored* *value* *accessed* only by an lvalue expression that has one of the following types: [...]
2022-06-23 14:30:21 +0200 <dminuoso> (emphasis added by me)
2022-06-23 14:30:40 +0200 <dminuoso> It's the actual value access that's problematic, pointers can overlap
2022-06-23 14:30:47 +0200 <dminuoso> Unless the pointers are function pointers, they may not overlap
2022-06-23 14:30:57 +0200 <merijn> dminuoso: You can't access pointers through the wrong type, sure
2022-06-23 14:31:09 +0200kuribas(~user@ip-188-118-57-242.reverse.destiny.be)
2022-06-23 14:31:17 +0200 <merijn> dminuoso: But strict aliasing refers to the fact that pointers of different types in the same function scope cannot reference the same memory
2022-06-23 14:31:41 +0200 <merijn> dminuoso: It's violated so commonly most compilers don't actually assume people follow strict aliasing
2022-06-23 14:31:54 +0200 <dminuoso> merijn: The C standard is carefully phrased to not talk about pointers, because its about the abstract memory model rather
2022-06-23 14:31:57 +0200 <dminuoso> — a type compatible with the effective type of the object,
2022-06-23 14:32:14 +0200yrlnry(~yrlnry@pool-108-2-150-109.phlapa.fios.verizon.net)
2022-06-23 14:32:20 +0200 <dminuoso> merijn: thats not true
2022-06-23 14:32:29 +0200 <dminuoso> Unless you exclude both clang and GCC from "most compilers"
2022-06-23 14:32:49 +0200 <dminuoso> Both will generate "buggy" (fsvo buggy in UB-land) code if you violate this rule
2022-06-23 14:32:56 +0200alejandro(~alejandro@47.48.23.95.dynamic.jazztel.es)
2022-06-23 14:33:27 +0200 <dminuoso> It's mostly read/write dependency reordering that happens
2022-06-23 14:33:42 +0200 <merijn> dminuoso: gcc disabled strict aliasing for decades, maybe they changed very recently
2022-06-23 14:34:55 +0200 <dminuoso> merijn: -fstrict-aliasing has been part of -O2 for a long time
2022-06-23 14:36:18 +0200yrlnry(~yrlnry@pool-108-2-150-109.phlapa.fios.verizon.net) (Ping timeout: 240 seconds)
2022-06-23 14:36:28 +0200leib(~leib@2405:201:900a:f088:60b3:aae8:bd87:1f5f) (Ping timeout: 272 seconds)
2022-06-23 14:37:14 +0200ccntrq1(~Thunderbi@dynamic-095-112-145-116.95.112.pool.telefonica.de)
2022-06-23 14:38:38 +0200ccntrq(~Thunderbi@dynamic-095-112-145-116.95.112.pool.telefonica.de) (Ping timeout: 246 seconds)
2022-06-23 14:38:38 +0200ccntrq1ccntrq
2022-06-23 14:43:10 +0200vglfr(~vglfr@46.96.172.76)
2022-06-23 14:43:38 +0200jgeerds(~jgeerds@55d45f48.access.ecotel.net) (Ping timeout: 240 seconds)
2022-06-23 14:45:57 +0200ccntrq1(~Thunderbi@dynamic-095-112-145-116.95.112.pool.telefonica.de)
2022-06-23 14:47:18 +0200ccntrq(~Thunderbi@dynamic-095-112-145-116.95.112.pool.telefonica.de) (Ping timeout: 240 seconds)
2022-06-23 14:47:18 +0200ccntrq1ccntrq
2022-06-23 14:47:47 +0200zmt00(~zmt00@user/zmt00)
2022-06-23 14:48:18 +0200coot(~coot@213.134.190.95) (Ping timeout: 240 seconds)
2022-06-23 14:48:47 +0200mikoto-chan(~mikoto-ch@esm-84-240-99-143.netplaza.fi) (Ping timeout: 246 seconds)
2022-06-23 14:48:55 +0200 <kuribas> People complain about compilation being slow, but do they factor in the time saved by not having to write and run the tests that a high-level language makes unnecessary?
2022-06-23 14:49:07 +0200alejandro(~alejandro@47.48.23.95.dynamic.jazztel.es) (Quit: Leaving)
2022-06-23 14:49:35 +0200 <kuribas> yeah, no compilation time in python/clojure/javascript, but the testsuite takes 15 minutes then.
2022-06-23 14:50:21 +0200 <maerwald[m]> kuribas: not writing tests? Uhm
2022-06-23 14:50:52 +0200 <dminuoso> I dont think the type of errors caught by GHCs type system are usually explicitly covered in tests.
2022-06-23 14:50:54 +0200 <kuribas> maerwald[m]: I mean the time you gain from tests you don't need to write.
2022-06-23 14:51:03 +0200 <kuribas> dminuoso: for sure they are.
2022-06-23 14:51:14 +0200 <dminuoso> Not sure what kind of tests you encode into the type system then.
2022-06-23 14:51:19 +0200mikoto-chan(~mikoto-ch@esm-84-240-99-143.netplaza.fi)
2022-06-23 14:51:21 +0200 <kuribas> dminuoso: you end up testing pretty much everything in a dynamic language.
2022-06-23 14:51:28 +0200 <kuribas> Or be happy with bugs in production.
2022-06-23 14:51:46 +0200 <dminuoso> Sure, but there's no sensible way of writing a test suite that ensures `f is called with a string`
2022-06-23 14:51:47 +0200 <kuribas> dminuoso: I don't *encode* tests in the type system.
2022-06-23 14:52:00 +0200 <geekosaur> judging by the javascript console in my browser, runtime bugs are ignored a lot >.>
2022-06-23 14:52:02 +0200 <dminuoso> Other than just testing all code paths that involve `f` and assert no exception is raised.
2022-06-23 14:52:39 +0200 <kuribas> dminuoso: yeah, you don't do that. You just try to cover as many code paths as possible and hope nothing breaks.
2022-06-23 14:52:50 +0200 <kuribas> dminuoso: you don't particularly test if that field has a string.
2022-06-23 14:53:00 +0200 <dminuoso> kuribas: Covering as many code paths as possible is sensible in Haskell as well.
2022-06-23 14:53:01 +0200 <kuribas> Though you could put runtime assertions in important places.
2022-06-23 14:53:10 +0200 <dminuoso> In Haskell you have undefined in pure code and IO exceptions in impure code.
2022-06-23 14:53:18 +0200 <dminuoso> The slow compilation time does not save you from writing tests.
2022-06-23 14:53:21 +0200 <merijn> kuribas: I mean, faster compilation time is always better
2022-06-23 14:53:32 +0200 <maerwald[m]> dminuoso: String ought to be valid Unicode right? Riiight? xD
2022-06-23 14:53:40 +0200 <kuribas> dminuoso: I usually have a few tests, but rely on the type system to guarantee the consistency. And I test on the repl a lot.
2022-06-23 14:53:43 +0200 <maerwald[m]> No need to test
2022-06-23 14:53:50 +0200 <merijn> I don't think GHC's compiles times are prohibitive to stop me from using it. But at the same time if it was 10x slower this would be a good QoL improvement to me
2022-06-23 14:54:01 +0200 <maerwald[m]> Oh wait, you can encode arbitrary invalid surrogate pairs with string
2022-06-23 14:54:03 +0200 <maerwald[m]> Oops
2022-06-23 14:54:19 +0200 <kuribas> merijn: if it is free for me, sure. But what if it means I have to give up on nice abstractions?
2022-06-23 14:54:22 +0200 <dminuoso> And to be fair, -O0 or disabling code generation gets you very far in terms of responsiveness for rapid development
2022-06-23 14:54:30 +0200 <kuribas> good point.
2022-06-23 14:54:31 +0200 <dminuoso> (And I'm sure HLS does these things)
2022-06-23 14:55:17 +0200 <dminuoso> At that point it's mostly just the ability to apply a rapid fix in production that is limited by compilation time with optimizations enabled
2022-06-23 14:55:19 +0200 <maerwald[m]> The sad truth is the type system doesn't enforce most invariants
2022-06-23 14:55:27 +0200 <kuribas> Also, a type checker in the IDE usually doesn't compile the whole system.
2022-06-23 14:56:08 +0200 <kuribas> maerwald[m]: I mean, you *can*, with dependent types. But probably shouldn't.
2022-06-23 14:56:33 +0200mikoto-chan(~mikoto-ch@esm-84-240-99-143.netplaza.fi) (Ping timeout: 268 seconds)
2022-06-23 14:56:38 +0200 <kuribas> maerwald[m]: having some runtime checks is just fine.
2022-06-23 14:56:48 +0200 <kuribas> you can always weigh costs and benefits.
2022-06-23 14:56:54 +0200 <maerwald[m]> You mean TESTS
2022-06-23 14:57:06 +0200 <kuribas> for example.
2022-06-23 14:57:09 +0200 <shiraeeshi> finding bugs at compile time works in case of Rust
2022-06-23 14:57:23 +0200 <shiraeeshi> borrow checker and all that
2022-06-23 14:57:38 +0200 <maerwald[m]> A small subset of bugs
2022-06-23 14:57:39 +0200 <merijn> kuribas: The thing is that GHC 8.x is far slower than it needs to be
2022-06-23 14:57:40 +0200 <shiraeeshi> it's a killer feature of Rust
2022-06-23 14:58:19 +0200 <merijn> kuribas: Nobody is arguing to remove features to speed up compile times, it's just that nobody really put in effort to get things fast. Although the 9.x series has seen pretty big improvements
2022-06-23 14:58:44 +0200 <kuribas> subsumption?
2022-06-23 14:59:10 +0200 <merijn> unrelated to that
2022-06-23 14:59:11 +0200 <geekosaur> a decent part of that was discovering and fixing some "laziness leaks"
2022-06-23 14:59:22 +0200 <merijn> Just boring old plumbing and engineering work
2022-06-23 14:59:24 +0200 <shiraeeshi> I also heard a saying "in languages like haskell, if it compiles, it works"
2022-06-23 14:59:26 +0200mattil(~mattil@helsinki.portalify.com) (Remote host closed the connection)
2022-06-23 14:59:28 +0200 <merijn> Profiling hotspots, improving them
2022-06-23 14:59:34 +0200pleo(~pleo@user/pleo)
2022-06-23 14:59:35 +0200 <geekosaur> shiraeeshi, one could hope
2022-06-23 14:59:41 +0200 <merijn> shiraeeshi: I mean, it's obviously bullshit. But also, kinda not
2022-06-23 14:59:50 +0200 <kuribas> shiraeeshi: it works, but maybe not efficiently :)
2022-06-23 15:00:08 +0200 <merijn> shiraeeshi: It's mostly that you get to focus your testing efforts on less boring kinds of tests
2022-06-23 15:00:14 +0200 <maerwald[m]> merijn: it's pure BS
2022-06-23 15:00:22 +0200 <kuribas> for me more like, it "usually" works.
2022-06-23 15:00:27 +0200 <kuribas> Or "most of the code works".
2022-06-23 15:00:54 +0200 <merijn> maerwald[m]: Overall my compiling code is a closer approximation of "works correctly" than it has ever been in C/C++/Python
2022-06-23 15:00:55 +0200 <kuribas> for clojure, it's "it "usually" doesn't work. and "most of the code doesn't work".
2022-06-23 15:01:13 +0200 <merijn> Of course claiming "compiling code is bug free" is nonsense, but that should be obvious to anyone
2022-06-23 15:01:27 +0200 <merijn> But I waste a whole lot less time chasing down boring stupid shit in haskell
2022-06-23 15:01:33 +0200yrlnry(~yrlnry@pool-108-2-150-109.phlapa.fios.verizon.net)
2022-06-23 15:01:42 +0200 <kuribas> also, "after refactoring, it very likely works.", and in clojure: "after refactoring, it very likely is broken".
2022-06-23 15:01:43 +0200 <merijn> and like, 70% of my debugging efforts in C/C++ go into said "stupid shit"
2022-06-23 15:01:51 +0200 <kuribas> merijn: yeah this
2022-06-23 15:01:57 +0200 <maerwald[m]> merijn: I've spent a lot of time chasing boring bugs in Haskell
2022-06-23 15:02:00 +0200 <merijn> So overall, it *feels* much more correct
2022-06-23 15:02:23 +0200 <merijn> maerwald[m]: Well, find us 50-100 more anecdotal comments and we can discuss it :p
2022-06-23 15:02:44 +0200 <maerwald[m]> It's just that refactoring in Haskell is usually less error prone
2022-06-23 15:02:58 +0200 <maerwald[m]> But my python prototypes don't have significantly more bugs
2022-06-23 15:03:49 +0200 <maerwald[m]> E.g. Python doesn't use PEP 383 incorrectly like Haskell
2022-06-23 15:04:46 +0200king_gs(~Thunderbi@187.201.91.195)
2022-06-23 15:06:03 +0200progress__(~fffuuuu_i@45.112.243.220)
2022-06-23 15:06:33 +0200 <kuribas> maerwald[m]: I find my clojure code much harder to maintain.
2022-06-23 15:07:10 +0200 <maerwald[m]> kuribas: agreed
2022-06-23 15:07:21 +0200 <kuribas> and clojure > Python
2022-06-23 15:07:51 +0200 <maerwald[m]> I will never forget how removing a bracket didn't cause a compile error, but made the webpage go blank when clicking a button
2022-06-23 15:07:58 +0200 <maerwald[m]> Clojurescript is cancer
2022-06-23 15:08:14 +0200 <zzz> in haskell, debugging is called "learning"
2022-06-23 15:08:39 +0200litharge(litharge@libera/bot/litharge) (Quit: restarting)
2022-06-23 15:09:07 +0200litharge(litharge@libera/bot/litharge)
2022-06-23 15:09:47 +0200azimut(~azimut@gateway/tor-sasl/azimut) (Quit: ZNC - https://znc.in)
2022-06-23 15:10:31 +0200azimut(~azimut@gateway/tor-sasl/azimut)
2022-06-23 15:11:58 +0200 <kuribas> debugging in haskell for me is mostly putting trace statements.
2022-06-23 15:12:11 +0200 <kuribas> And trying out functions on the repl.
2022-06-23 15:12:33 +0200 <kuribas> there are no advanced debugging tools, but I rarely miss them.
2022-06-23 15:13:41 +0200 <maerwald[m]> kuribas: ghc-debug
2022-06-23 15:14:17 +0200misterfish(~misterfis@87.215.131.98)
2022-06-23 15:14:38 +0200 <shiraeeshi> kuribas: how about labeling cost centers and then viewing statistics?
2022-06-23 15:14:50 +0200 <kuribas> shiraeeshi: that's profiling :)
2022-06-23 15:14:52 +0200 <shiraeeshi> to debug space leaks
2022-06-23 15:15:38 +0200Unicorn_Princess(~Unicorn_P@93-103-228-248.dynamic.t-2.net)
2022-06-23 15:15:40 +0200 <kuribas> maerwald[m]: cool, I didn't know about that.
2022-06-23 15:15:48 +0200 <shiraeeshi> I wonder if profiling skills are more needed for Haskell developers than others
2022-06-23 15:15:57 +0200 <shiraeeshi> due to laziness
2022-06-23 15:16:02 +0200superbil(~superbil@1-34-176-171.hinet-ip.hinet.net) (Ping timeout: 265 seconds)
2022-06-23 15:16:23 +0200MajorBiscuit(~MajorBisc@wlan-145-94-167-213.wlan.tudelft.nl)
2022-06-23 15:16:27 +0200superbil(~superbil@1-34-176-171.hinet-ip.hinet.net)
2022-06-23 15:19:13 +0200 <kuribas> shiraeeshi: in my opinion: no
2022-06-23 15:20:13 +0200 <shiraeeshi> if only we could gather statistics about space leaks from every running haskell program
2022-06-23 15:20:14 +0200dlbh^(~dlbh@50.237.44.186)
2022-06-23 15:20:27 +0200 <kuribas> I have not yet seen space leaks in my programs.
2022-06-23 15:20:33 +0200 <shiraeeshi> to be able to tell how often they occur compared to other languages
2022-06-23 15:21:47 +0200 <shiraeeshi> kuribas: you are good at avoiding them?
2022-06-23 15:22:04 +0200 <shiraeeshi> or they just don't occur and you don't even worry about them?
2022-06-23 15:22:45 +0200 <kuribas> both probably?
2022-06-23 15:23:04 +0200 <kuribas> well I mean, the programs I write are not sensitive to them maybe.
2022-06-23 15:23:25 +0200 <kuribas> Though I avoid pitfalls, like using foldl instead of foldl'.
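The classic instance of that pitfall, as a small sketch:

    import Data.List (foldl')

    -- foldl builds the whole chain of (+) thunks before anything is evaluated,
    -- so the accumulator grows with the input; foldl' forces it at every step.
    sumLazy, sumStrict :: [Int] -> Int
    sumLazy   = foldl  (+) 0   -- space leak on large inputs
    sumStrict = foldl' (+) 0   -- constant space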
2022-06-23 15:24:13 +0200 <dolio> I think it's just not very hard to preemptively avoid most space leaks once you have experience.
2022-06-23 15:24:40 +0200 <dolio> Doesn't take much thought.
2022-06-23 15:26:18 +0200 <shiraeeshi> I find myself discussing space leaks more and more lately
2022-06-23 15:26:29 +0200 <shiraeeshi> but actually I wanted to ask about something else
2022-06-23 15:26:29 +0200lortabac(~lortabac@2a01:e0a:541:b8f0:1762:327:502f:fc6c)
2022-06-23 15:27:00 +0200ccntrq1(~Thunderbi@dynamic-095-112-145-116.95.112.pool.telefonica.de)
2022-06-23 15:27:03 +0200 <shiraeeshi> I'm reading a Purely Functional Data Structures by Chris Okasaki
2022-06-23 15:27:27 +0200 <shiraeeshi> I'm at the beginning
2022-06-23 15:27:41 +0200 <shiraeeshi> Leftist heaps
2022-06-23 15:28:01 +0200 <shiraeeshi> there is an exercise 3.3
2022-06-23 15:28:07 +0200superbil(~superbil@1-34-176-171.hinet-ip.hinet.net) (Ping timeout: 265 seconds)
2022-06-23 15:28:14 +0200 <shiraeeshi> it's about proving that fromList runs in O(n) time
2022-06-23 15:28:46 +0200 <shiraeeshi> it says: "Instead of merging the heaps in one
2022-06-23 15:28:46 +0200 <shiraeeshi> right-to-left or left-to-right pass using foldr or foldl, merge the heaps in [log n] passes, where each
2022-06-23 15:28:46 +0200 <shiraeeshi> pass merges adjacent pairs of heaps."
2022-06-23 15:28:57 +0200ccntrq(~Thunderbi@dynamic-095-112-145-116.95.112.pool.telefonica.de) (Ping timeout: 248 seconds)
2022-06-23 15:28:57 +0200ccntrq1ccntrq
2022-06-23 15:29:13 +0200 <maerwald[m]> dolio: not everything is a space leak. Evaluating deep thunks in hot loops can kill your performance by orders of magnitude, and forcing the result at the call site won't fix it
2022-06-23 15:30:15 +0200 <maerwald[m]> It's really hard to debug. There are no space leaks, and the profiler doesn't tell you what's going on either
2022-06-23 15:30:48 +0200 <dolio> My comment was about space leaks, because that's what the topic was.
2022-06-23 15:31:33 +0200waleee(~waleee@2001:9b0:213:7200:cc36:a556:b1e8:b340)
2022-06-23 15:31:34 +0200superbil(~superbil@1-34-176-171.hinet-ip.hinet.net)
2022-06-23 15:31:40 +0200coot(~coot@213.134.190.95)
2022-06-23 15:33:08 +0200 <shiraeeshi> earlier in the chapter the book says that merge runs in O(log n) time
2022-06-23 15:33:32 +0200 <shiraeeshi> and we invoke merge ceil(log n) times, right?
2022-06-23 15:33:52 +0200 <shiraeeshi> how does it follow that fromList runs in O(n) time?
2022-06-23 15:35:53 +0200 <maerwald[m]> dolio: the topic was about profiling too ;)
2022-06-23 15:36:11 +0200 <dolio> When I commented people were specifically talking about space leaks, and that's why I wrote what I wrote.
2022-06-23 15:36:43 +0200eggplantade(~Eggplanta@2600:1700:bef1:5e10:99c9:a0a4:f69e:b22e)
2022-06-23 15:37:46 +0200 <shiraeeshi> am I not seeing something obvious here?
2022-06-23 15:37:50 +0200 <[Leary]> shiraeeshi: I'm not saying this is it, but you should be aware that log n is O(n) and so is (log n)^2. O(n) time technically means linear or sub-linear.
2022-06-23 15:38:34 +0200 <geekosaur> besides which, the time is likely dominated by the time it takes to traverse the list which is by definition O(n)
2022-06-23 15:39:23 +0200 <shiraeeshi> [Leary]: yeah, that's what's not clear to me: how do you go from log n to n? because it seems to me that you should make it power of two
2022-06-23 15:40:22 +0200 <shiraeeshi> geekosaur: the exercise says that you should merge adjacent pairs instead of using folds, if I understood you correctly
2022-06-23 15:40:50 +0200 <dolio> shiraeeshi: When you use foldl/r, you'll be repeatedly merging small things into increasingly large things, so the cost of each operation will approach log n, as the size of the large thing is O(n) for much of the time.
2022-06-23 15:40:56 +0200 <shiraeeshi> you divide the list into pairs and merge them, and you keep doing that until only one heap is left
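A sketch of that pair-wise strategy, with a minimal leftist heap filled in so it stands alone (the Haskell names are mine; the book's code is in Standard ML):

    data Heap a = E | T Int a (Heap a) (Heap a)   -- rank, element, left, right

    rank :: Heap a -> Int
    rank E           = 0
    rank (T r _ _ _) = r

    -- Preserve the leftist property: the right spine is always the shorter one.
    makeT :: a -> Heap a -> Heap a -> Heap a
    makeT x a b
      | rank a >= rank b = T (rank b + 1) x a b
      | otherwise        = T (rank a + 1) x b a

    merge :: Ord a => Heap a -> Heap a -> Heap a
    merge h E = h
    merge E h = h
    merge h1@(T _ x a1 b1) h2@(T _ y a2 b2)
      | x <= y    = makeT x a1 (merge b1 h2)
      | otherwise = makeT y a2 (merge h1 b2)

    singleton :: a -> Heap a
    singleton x = T 1 x E E

    -- Exercise 3.3: merge in ceil(log n) passes over adjacent pairs, not via a fold.
    fromList :: Ord a => [a] -> Heap a
    fromList [] = E
    fromList xs = go (map singleton xs)
      where
        go []  = E                            -- unreachable here, kept for totality
        go [h] = h
        go hs  = go (mergePairs hs)           -- each pass halves the number of heaps
        mergePairs (h1:h2:rest) = merge h1 h2 : mergePairs rest
        mergePairs hs           = hs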
2022-06-23 15:41:22 +0200 <[Leary]> This is just the definition of big-O; it's not an "equals", it's a "less than or equals".
2022-06-23 15:41:26 +0200eggplantade(~Eggplanta@2600:1700:bef1:5e10:99c9:a0a4:f69e:b22e) (Ping timeout: 255 seconds)
2022-06-23 15:42:38 +0200 <[Leary]> So anything that's O(n) is automatically O(n^2), but not necessarily O(log n).
2022-06-23 15:42:43 +0200 <[Leary]> (e.g.)
2022-06-23 15:43:15 +0200 <maerwald> dolio: and when you were talking about profiling space leaks, I wrote what I wrote
2022-06-23 15:43:32 +0200 <shiraeeshi> [Leary]: I see your point, but I think the author expects something like an algebraic proof that concludes that the time is O(n)
2022-06-23 15:44:01 +0200 <dolio> Pair-wise, you'll be repeatedly merging things of the same size to double their size, and most of the merges will not be of things that are O(n) size.
2022-06-23 15:44:28 +0200 <dolio> Or at least, needn't be counted that way.
2022-06-23 15:44:57 +0200waleee(~waleee@2001:9b0:213:7200:cc36:a556:b1e8:b340) (Ping timeout: 248 seconds)
2022-06-23 15:45:19 +0200 <shiraeeshi> dolio: hmm, "repeatedly merging small things into increasingly large things" sounds like it could lead to the right answer, but I don't see how would you prove the O(n) time. I think I'm missing something.
2022-06-23 15:46:17 +0200jespada(~jespada@cpc121022-nmal24-2-0-cust171.19-2.cable.virginm.net) (Ping timeout: 256 seconds)
2022-06-23 15:47:06 +0200 <shiraeeshi> oh, right. the "n" when we say that merge takes O(log n) time is not the same as "n" when we say that fromList takes O(n) time
2022-06-23 15:47:25 +0200 <dolio> For example, with foldl, by the time you've processed half the list, your accumulator has size n/2, so the remaining n/2 merges take ~log n time.
2022-06-23 15:47:44 +0200Pickchea(~private@user/pickchea) (Ping timeout: 268 seconds)
2022-06-23 15:47:46 +0200 <dolio> For the pair-wise merges, only the last merge has size n/2.
2022-06-23 15:48:18 +0200 <dolio> The previous one had n/4, before that n/8, which is like n/2^k.
2022-06-23 15:48:30 +0200xff0x(~xff0x@b133147.ppp.asahi-net.or.jp)
2022-06-23 15:49:03 +0200mikoto-chan(~mikoto-ch@esm-84-240-99-143.netplaza.fi)
2022-06-23 15:50:30 +0200Colere(~colere@about/linux/staff/sauvin) (Ping timeout: 264 seconds)
2022-06-23 15:51:00 +0200waleee(~waleee@2001:9b0:213:7200:cc36:a556:b1e8:b340)
2022-06-23 15:51:13 +0200jespada(~jespada@cpc121022-nmal24-2-0-cust171.19-2.cable.virginm.net)
2022-06-23 15:51:31 +0200 <shiraeeshi> ok, so let's say we have a list with 16 elements
2022-06-23 15:51:57 +0200 <shiraeeshi> there are 8 pairs, so we invoke merge 8 times
2022-06-23 15:53:00 +0200 <shiraeeshi> the size of the heaps to be merged is 1, so merge takes O(1) time (or we can say O(log 1))
2022-06-23 15:53:08 +0200 <dolio> So many more of the merges happen at sizes that are 'effectively constant', and only an effectively constant number at the end are O(n). At least, that's sort of an intuitive idea behind it.
2022-06-23 15:53:31 +0200 <shiraeeshi> so going from 16 elements to 8 elements takes 8*O(1) time
2022-06-23 15:54:30 +0200 <shiraeeshi> now going from 8 elements to 4 elements takes 4*O(log2) = 4*O(1) time
2022-06-23 15:55:08 +0200alp(~alp@user/alp) (Ping timeout: 268 seconds)
2022-06-23 15:55:08 +0200mon_aaraj(~MonAaraj@user/mon-aaraj/x-4416475) (Ping timeout: 268 seconds)
2022-06-23 15:55:22 +0200 <shiraeeshi> going from 4 elements to 2 elements takes 2*O(log4) = 2*O(2)
2022-06-23 15:56:07 +0200 <shiraeeshi> and from 2 elements to 1 element should take O(log 8) = O(3) time
2022-06-23 15:56:19 +0200cfricke(~cfricke@user/cfricke) (Quit: WeeChat 3.5)
2022-06-23 15:56:59 +0200progress__(~fffuuuu_i@45.112.243.220) (Ping timeout: 268 seconds)
2022-06-23 15:57:51 +0200 <shiraeeshi> total time is the sum of all those iterations
2022-06-23 15:58:49 +0200 <shiraeeshi> sum (for i from 1 to half of n) (2*i * O(log (n - i)))
2022-06-23 15:59:05 +0200 <geekosaur> I don't think you get to do math with big-Os that way
2022-06-23 15:59:18 +0200Surobaki(~surobaki@137.44.222.80) (Ping timeout: 240 seconds)
2022-06-23 15:59:39 +0200 <geekosaur> in particular, your "8*O(1)" is O(n/2) which is O(n) with a constant factor (that drops out) of 0.5
2022-06-23 15:59:46 +0200 <dolio> Yeah, I mean, reasoning about a fixed example is erroneous, because it's all constant. But it helps see the pattern.
2022-06-23 15:59:49 +0200cfricke(~cfricke@user/cfricke)
2022-06-23 15:59:56 +0200 <[Leary]> Sounds like it's amenable to induction, if you want to do that a bit more rigorously.
2022-06-23 16:01:01 +0200Surobaki(~surobaki@137.44.222.80)
2022-06-23 16:01:52 +0200mon_aaraj(~MonAaraj@user/mon-aaraj/x-4416475)
2022-06-23 16:01:55 +0200 <dolio> The sum is what to think about, though. Then show that it's O(n).
2022-06-23 16:02:00 +0200 <dolio> I think.
2022-06-23 16:03:36 +0200nate4(~nate@98.45.169.16)
2022-06-23 16:05:44 +0200dlbh^(~dlbh@50.237.44.186) (Remote host closed the connection)
2022-06-23 16:06:42 +0200 <dolio> Or maybe, `Σ(i = 1..log n) 2^(n-i) * i` is better?
2022-06-23 16:08:42 +0200causal(~user@50.35.83.177)
2022-06-23 16:08:42 +0200nate4(~nate@98.45.169.16) (Ping timeout: 268 seconds)
2022-06-23 16:08:42 +0200waleee(~waleee@2001:9b0:213:7200:cc36:a556:b1e8:b340) (Ping timeout: 268 seconds)
2022-06-23 16:09:49 +0200 <dolio> Oh, not 2^(n-i), 2^(log n - i)
2022-06-23 16:12:15 +0200lortabac(~lortabac@2a01:e0a:541:b8f0:1762:327:502f:fc6c) (Quit: WeeChat 2.8)
2022-06-23 16:12:18 +0200waleee(~waleee@2001:9b0:213:7200:cc36:a556:b1e8:b340)
2022-06-23 16:12:21 +0200 <dolio> Then turn it into a similar definite integral and ask wolfram alpha to solve it, and say something about correspondences between sums and integrals.
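For reference, the sum dolio wrote above (with the 2^(lg n - i) correction) can be bounded directly, with no integral needed, because the series Σ i/2^i converges to 2:

```latex
\sum_{i=1}^{\lg n} i \, 2^{\lg n - i}
  = n \sum_{i=1}^{\lg n} \frac{i}{2^{i}}
  \le n \sum_{i=1}^{\infty} \frac{i}{2^{i}}
  = 2n
  = O(n)
```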
2022-06-23 16:14:44 +0200jgeerds(~jgeerds@55d45f48.access.ecotel.net)
2022-06-23 16:15:08 +0200 <shiraeeshi> the way I see it, we have a triangle or a pyramid: a decreasing number of terms with increasing coefficients
2022-06-23 16:15:38 +0200 <shiraeeshi> but how you get from that pyramid to n is not clear to me
2022-06-23 16:16:37 +0200 <shiraeeshi> (I should remind myself that it's not exactly n, but something like n in an asymptotic sense)
2022-06-23 16:17:18 +0200Surobaki(~surobaki@137.44.222.80) (Ping timeout: 240 seconds)
2022-06-23 16:18:06 +0200 <shiraeeshi> wait, all the iterations take the same time
2022-06-23 16:18:11 +0200Surobaki(~surobaki@137.44.222.80)
2022-06-23 16:19:11 +0200stefan-_(~cri@42dots.de) (Ping timeout: 268 seconds)
2022-06-23 16:19:56 +0200 <shiraeeshi> no, sounds like it leads to O(n * log n)
2022-06-23 16:21:08 +0200 <shiraeeshi> perhaps I should try induction
2022-06-23 16:22:07 +0200 <[Leary]> It's almost always the simplest way, when it applies.
2022-06-23 16:22:15 +0200ccntrq1(~Thunderbi@dynamic-095-112-145-116.95.112.pool.telefonica.de)
2022-06-23 16:22:39 +0200 <dolio> I guess wolfram can just do the sum, actually. No need for integrals.
2022-06-23 16:23:37 +0200stefan-_(~cri@42dots.de)
2022-06-23 16:23:53 +0200ccntrq(~Thunderbi@dynamic-095-112-145-116.95.112.pool.telefonica.de) (Ping timeout: 248 seconds)
2022-06-23 16:25:02 +0200ccntrq(~Thunderbi@dynamic-077-006-224-164.77.6.pool.telefonica.de)
2022-06-23 16:25:35 +0200[itchyjunk](~itchyjunk@user/itchyjunk/x-7353470)
2022-06-23 16:26:26 +0200ccntrq1(~Thunderbi@dynamic-095-112-145-116.95.112.pool.telefonica.de) (Ping timeout: 246 seconds)
2022-06-23 16:28:38 +0200Surobaki(~surobaki@137.44.222.80) (Ping timeout: 240 seconds)
2022-06-23 16:31:28 +0200Vq(~vq@90-227-195-41-no77.tbcn.telia.com)
2022-06-23 16:31:38 +0200mecharyuujin(~mecharyuu@2409:4050:2d4b:a853:8048:c716:f88e:d09f) (Ping timeout: 240 seconds)
2022-06-23 16:35:55 +0200Surobaki(~surobaki@137.44.222.80)
2022-06-23 16:37:56 +0200vpan(~0@212.117.1.172) (Quit: Leaving.)
2022-06-23 16:43:36 +0200ccntrq1(~Thunderbi@dynamic-077-006-224-164.77.6.pool.telefonica.de)
2022-06-23 16:44:24 +0200Surobaki(~surobaki@137.44.222.80) (Ping timeout: 272 seconds)
2022-06-23 16:45:13 +0200ccntrq(~Thunderbi@dynamic-077-006-224-164.77.6.pool.telefonica.de) (Ping timeout: 256 seconds)
2022-06-23 16:45:13 +0200ccntrq1ccntrq
2022-06-23 16:46:31 +0200Surobaki(~surobaki@137.44.222.80)
2022-06-23 16:46:36 +0200shriekingnoise(~shrieking@201.212.175.181)
2022-06-23 16:49:59 +0200dsrt^(~dsrt@50.237.44.186)
2022-06-23 16:52:54 +0200raehik(~raehik@cpc95906-rdng25-2-0-cust156.15-3.cable.virginm.net) (Ping timeout: 264 seconds)
2022-06-23 16:53:13 +0200 <dolio> shiraeeshi: BTW, if you have access to Knuth's Concrete Mathematics then Σ_k k*2^k is an example he shows how to solve using the calculus of finite differences. And that's the complicated part of Σ_k (lg n - k)*2^k, which is another way to write the sum.
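For anyone without the book at hand, that inner sum has a standard closed form (a textbook identity, not something derived in this log), which finishes the O(n) calculation:

```latex
\sum_{k=1}^{m} k\,2^{k} = (m-1)\,2^{m+1} + 2
% hence, taking m = \lg n,
\sum_{k=1}^{\lg n} (\lg n - k)\,2^{k}
  = \lg n\,(2^{\lg n + 1} - 2) - \bigl((\lg n - 1)\,2^{\lg n + 1} + 2\bigr)
  = 2n - 2\lg n - 2 = O(n)
```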
2022-06-23 16:54:15 +0200Sgeo(~Sgeo@user/sgeo)
2022-06-23 16:55:34 +0200mon_aaraj(~MonAaraj@user/mon-aaraj/x-4416475) (Ping timeout: 268 seconds)
2022-06-23 16:55:35 +0200raehik(~raehik@cpc95906-rdng25-2-0-cust156.15-3.cable.virginm.net)
2022-06-23 16:56:46 +0200Surobaki(~surobaki@137.44.222.80) (Read error: Connection reset by peer)
2022-06-23 16:57:14 +0200mon_aaraj(~MonAaraj@user/mon-aaraj/x-4416475)
2022-06-23 16:59:36 +0200pleo(~pleo@user/pleo) (Ping timeout: 272 seconds)
2022-06-23 17:00:34 +0200ccntrq1(~Thunderbi@dynamic-077-006-224-164.77.6.pool.telefonica.de)
2022-06-23 17:00:36 +0200FinnElija(~finn_elij@user/finn-elija/x-0085643) (Quit: FinnElija)
2022-06-23 17:02:08 +0200ccntrq(~Thunderbi@dynamic-077-006-224-164.77.6.pool.telefonica.de) (Ping timeout: 272 seconds)
2022-06-23 17:02:08 +0200ccntrq1ccntrq
2022-06-23 17:02:11 +0200FinnElija(~finn_elij@user/finn-elija/x-0085643)
2022-06-23 17:03:54 +0200king_gs(~Thunderbi@187.201.91.195) (Read error: Connection reset by peer)
2022-06-23 17:03:55 +0200mecharyuujin(~mecharyuu@2409:4050:2d4b:a853:8048:c716:f88e:d09f)
2022-06-23 17:08:53 +0200king_gs(~Thunderbi@187.201.91.195)
2022-06-23 17:11:39 +0200ccntrq1(~Thunderbi@dynamic-077-006-224-164.77.6.pool.telefonica.de)
2022-06-23 17:13:22 +0200cfricke(~cfricke@user/cfricke) (Quit: WeeChat 3.5)
2022-06-23 17:13:54 +0200ccntrq(~Thunderbi@dynamic-077-006-224-164.77.6.pool.telefonica.de) (Ping timeout: 264 seconds)
2022-06-23 17:13:54 +0200ccntrq1ccntrq
2022-06-23 17:15:46 +0200jao(~jao@cpc103048-sgyl39-2-0-cust502.18-2.cable.virginm.net)
2022-06-23 17:17:21 +0200 <shiraeeshi> dolio: thanks, gonna take a look if I get stuck
2022-06-23 17:20:02 +0200jakalx(~jakalx@base.jakalx.net) (Error from remote client)
2022-06-23 17:20:20 +0200jakalx(~jakalx@base.jakalx.net)
2022-06-23 17:21:29 +0200 <triteraflops> dminuoso: Yeah, I knew about this kind of typing, which is why I thought haskell couldn't do it on its own, without its linear extensions, for example.
2022-06-23 17:28:18 +0200mixfix41(~sdenynine@user/mixfix41) (Ping timeout: 240 seconds)
2022-06-23 17:31:16 +0200ccntrq(~Thunderbi@dynamic-077-006-224-164.77.6.pool.telefonica.de) (Ping timeout: 272 seconds)
2022-06-23 17:33:12 +0200alp(~alp@user/alp)
2022-06-23 17:33:57 +0200hgolden(~hgolden2@cpe-172-251-233-141.socal.res.rr.com) (Remote host closed the connection)
2022-06-23 17:35:00 +0200fryguybob(~fryguybob@cpe-74-67-169-145.rochester.res.rr.com)
2022-06-23 17:37:19 +0200hgolden(~hgolden2@cpe-172-251-233-141.socal.res.rr.com)
2022-06-23 17:38:19 +0200jrm(~jrm@user/jrm) (Quit: ciao)
2022-06-23 17:39:21 +0200jgeerds(~jgeerds@55d45f48.access.ecotel.net) (Ping timeout: 268 seconds)
2022-06-23 17:39:32 +0200jrm(~jrm@user/jrm)
2022-06-23 17:40:10 +0200hgolden(~hgolden2@cpe-172-251-233-141.socal.res.rr.com) (Remote host closed the connection)
2022-06-23 17:43:30 +0200hgolden(~hgolden2@cpe-172-251-233-141.socal.res.rr.com)
2022-06-23 17:45:38 +0200dsrt^(~dsrt@50.237.44.186) (Ping timeout: 240 seconds)
2022-06-23 17:47:20 +0200dsrt^(~dsrt@50.237.44.186)
2022-06-23 17:50:10 +0200mcglk(~mcglk@131.191.49.120) (Ping timeout: 240 seconds)
2022-06-23 17:52:29 +0200eggplantade(~Eggplanta@2600:1700:bef1:5e10:99c9:a0a4:f69e:b22e)
2022-06-23 17:57:21 +0200 <juri_> hmm. bumped GHC to 9.0.2, and now rounded-hw is failing to resolve its dependencies. :(
2022-06-23 17:58:41 +0200juri_pokes it with a stick.
2022-06-23 17:59:33 +0200shaprexplodes
2022-06-23 17:59:35 +0200mcglk(~mcglk@131.191.49.120)
2022-06-23 18:00:11 +0200RudraveerMandal[(~magphimat@2001:470:69fc:105::2:eb9) (Quit: You have been kicked for being idle)
2022-06-23 18:00:29 +0200brettgilio(~brettgili@c9yh.net) (Quit: The Lounge - https://thelounge.chat)
2022-06-23 18:01:46 +0200 <shapr> @seen dons
2022-06-23 18:01:47 +0200 <lambdabot> I saw dons leaving #haskell 1m 2h 46m 2s ago.
2022-06-23 18:01:53 +0200 <shapr> huh, a month?
2022-06-23 18:02:02 +0200 <shapr> I'm glad @seen is working again
2022-06-23 18:02:34 +0200brettgilio(~brettgili@c9yh.net)
2022-06-23 18:02:45 +0200eggplantade(~Eggplanta@2600:1700:bef1:5e10:99c9:a0a4:f69e:b22e) (Remote host closed the connection)
2022-06-23 18:02:46 +0200cherries[m](~cherriesm@2001:470:69fc:105::2:16c0) (Quit: You have been kicked for being idle)
2022-06-23 18:02:57 +0200phuegrvs[m](~phuegrvsm@2001:470:69fc:105::1:65e4) (Quit: You have been kicked for being idle)
2022-06-23 18:03:00 +0200pleo(~pleo@user/pleo)
2022-06-23 18:06:01 +0200brettgilio(~brettgili@c9yh.net) (Client Quit)
2022-06-23 18:08:37 +0200 <lambdabot> ∿∿∿∿∿∿∿∿∿∿∿∿∿
2022-06-23 18:09:11 +0200brettgilio(~brettgili@c9yh.net)
2022-06-23 18:10:22 +0200Tuplanolla(~Tuplanoll@91-159-69-97.elisa-laajakaista.fi)
2022-06-23 18:10:32 +0200 <shapr> heippa hei Tuplanolla
2022-06-23 18:13:23 +0200tzh(~tzh@c-24-21-73-154.hsd1.or.comcast.net)
2022-06-23 18:13:27 +0200benin0(~benin@183.82.30.117) (Quit: The Lounge - https://thelounge.chat)
2022-06-23 18:14:03 +0200 <Tuplanolla> Hey, shapr. What's going on?
2022-06-23 18:14:23 +0200 <shapr> Oh, delaying getting started on some code. What about you?
2022-06-23 18:14:38 +0200 <shapr> I'm also trying to figure out the process for becoming a maintainer for a piece of GHC
2022-06-23 18:15:02 +0200 <shapr> And looking into lambdabot to see how much the code has changed in the past (checks calendar) uh, 20 years
2022-06-23 18:17:04 +0200 <shapr> Kinda awesome to see the initial import into git from, uh, darcs? : https://github.com/lambdabot/lambdabot/commits?author=shapr
2022-06-23 18:17:04 +0200king_gs(~Thunderbi@187.201.91.195) (Read error: Connection reset by peer)
2022-06-23 18:17:38 +0200 <shapr> Was lambdabot even *in* source control in the beginning? I don't remember.
2022-06-23 18:18:43 +0200MajorBiscuit(~MajorBisc@wlan-145-94-167-213.wlan.tudelft.nl) (Ping timeout: 256 seconds)
2022-06-23 18:19:21 +0200king_gs(~Thunderbi@187.201.91.195)
2022-06-23 18:23:20 +0200[itchyjunk](~itchyjunk@user/itchyjunk/x-7353470) (Remote host closed the connection)
2022-06-23 18:24:03 +0200[itchyjunk](~itchyjunk@user/itchyjunk/x-7353470)
2022-06-23 18:25:28 +0200mc47(~mc47@xmonad/TheMC47) (Remote host closed the connection)
2022-06-23 18:26:15 +0200 <dminuoso> triteraflops: The linear extension is orthogonal (dual even, in a sense) to uniqueness types.
2022-06-23 18:26:41 +0200 <dminuoso> The uniqueness types in Clean form a secondary/orthogonal type system if I understand it correctly
2022-06-23 18:27:05 +0200 <dminuoso> And even with linear types, you couldn't do it.
2022-06-23 18:27:13 +0200 <Tuplanolla> Oh, cool.
2022-06-23 18:27:29 +0200 <Tuplanolla> I've been breaking another compiler instead. https://github.com/coq/coq/issues?q=is%3Aissue+author%3ATuplanolla
2022-06-23 18:28:04 +0200 <shapr> Tuplanolla: wow nice! that's a good run of breakage
2022-06-23 18:29:31 +0200 <Tuplanolla> If only it would end.
2022-06-23 18:29:31 +0200 <shapr> Also been working on some fun things with cdsmith, we built a MUD https://github.com/cdsmith/ourmud that's backed by a serializable graph database https://github.com/cdsmith/edgy
2022-06-23 18:29:33 +0200 <lechner> Hi, is there a difference in the "before" and "after" code examples for optparse-applicative here? https://blog.ocharles.org.uk/posts/2022-06-22-list-of-monoids-pattern.html
2022-06-23 18:29:55 +0200[itchyjunk](~itchyjunk@user/itchyjunk/x-7353470) (Remote host closed the connection)
2022-06-23 18:30:41 +0200 <shapr> In the process, we fixed up a few libraries to compile (and work?) with GHC 9.2.2; but I couldn't find a process for uploading "this probably works" kind of releases for packages that haven't been updated in a few years.
2022-06-23 18:30:52 +0200adanwan(~adanwan@gateway/tor-sasl/adanwan) (Remote host closed the connection)
2022-06-23 18:30:54 +0200 <dminuoso> lechner: Not that I can see.
2022-06-23 18:31:05 +0200 <dminuoso> Think some copy/paste bug stole its way in there
2022-06-23 18:31:08 +0200adanwan(~adanwan@gateway/tor-sasl/adanwan)
2022-06-23 18:31:14 +0200 <shapr> @seen ocharles
2022-06-23 18:31:14 +0200 <lambdabot> I saw ocharles leaving #ghc 8d 18h 19m 34s ago.
2022-06-23 18:32:09 +0200[itchyjunk](~itchyjunk@user/itchyjunk/x-7353470)
2022-06-23 18:32:47 +0200 <Tuplanolla> I still have one side project that's being written in Haskell. Oddly enough, it has to do with the modeling and visualization of a particular theory of psychology.
2022-06-23 18:32:50 +0200[itchyjunk](~itchyjunk@user/itchyjunk/x-7353470) (Remote host closed the connection)
2022-06-23 18:33:01 +0200 <shapr> well now I want a link
2022-06-23 18:33:06 +0200dextaa(~DV@user/dextaa) (Read error: Connection reset by peer)
2022-06-23 18:33:18 +0200 <shiraeeshi> shapr: what's MUD?
2022-06-23 18:33:31 +0200 <geekosaur> "multi-user dungeon"
2022-06-23 18:33:35 +0200 <shapr> multi user dungeon
2022-06-23 18:33:36 +0200 <shapr> yeah that
2022-06-23 18:33:45 +0200 <shapr> https://en.wikipedia.org/wiki/MUD
2022-06-23 18:34:10 +0200 <lechner> dminuoso: thanks! as a beginner, i wasn't sure
2022-06-23 18:34:14 +0200misterfish(~misterfis@87.215.131.98) (Ping timeout: 268 seconds)
2022-06-23 18:34:16 +0200mastarija(~mastarija@2a05:4f46:e02:8c00:3026:72db:30b8:5822)
2022-06-23 18:34:25 +0200 <shiraeeshi> wow, interesting
2022-06-23 18:34:28 +0200 <Tuplanolla> It's not public yet, because it doesn't even work, but here is the related organization. https://www.thebowencenter.org/
2022-06-23 18:34:52 +0200eggplantade(~Eggplanta@2600:1700:bef1:5e10:99c9:a0a4:f69e:b22e)
2022-06-23 18:34:54 +0200 <Tuplanolla> It's about "Bowen family systems theory".
2022-06-23 18:35:02 +0200 <shapr> haven't heard of that
2022-06-23 18:35:08 +0200 <Tuplanolla> Neither had I.
2022-06-23 18:35:15 +0200dextaa(~DV@user/dextaa)
2022-06-23 18:35:51 +0200coot(~coot@213.134.190.95) (Quit: coot)
2022-06-23 18:36:22 +0200coot(~coot@213.134.190.95)
2022-06-23 18:36:30 +0200 <shiraeeshi> Tuplanolla: what kind of visualization?
2022-06-23 18:36:48 +0200 <shiraeeshi> I mean, something like histograms, graphs, something else?
2022-06-23 18:36:59 +0200 <Tuplanolla> It looks like a dynamical system on a graph, so FGL, GraphViz and Gnuplot are a perfect fit for it.
2022-06-23 18:37:30 +0200 <Tuplanolla> I just wish there was a little less Henning in them.
2022-06-23 18:37:53 +0200 <shiraeeshi> it's weird that English uses the one word "graph" to talk about graphs and, well, graphs
2022-06-23 18:38:19 +0200 <shiraeeshi> "graphical graphs" and "graphs as edges-and-nodes"
2022-06-23 18:38:27 +0200 <Tuplanolla> Well, all three senses are present, so it all works out!
2022-06-23 18:39:06 +0200 <darkling> I had a lecturer that pronounced them "graaph" (English RP) for the two-axes-and-a-line thing, and "graf" (Northern English, short "a") for the nodes-and-edges thing.
2022-06-23 18:39:41 +0200 <shapr> I'm trying not to make jokes about a Van de Graph generator
2022-06-23 18:39:52 +0200 <Tuplanolla> Personally, I find "network" and "curve" to be more appropriate.
2022-06-23 18:39:59 +0200 <shapr> Is there a good binding to matplotlib or other graph visualization library?
2022-06-23 18:40:12 +0200 <shapr> I never could get my head wrapped around fgl, does it have benefits over alga?
2022-06-23 18:40:12 +0200kuribas(~user@ip-188-118-57-242.reverse.destiny.be) (Remote host closed the connection)
2022-06-23 18:40:13 +0200 <Tuplanolla> No; just bad, shapr.
2022-06-23 18:40:22 +0200 <shapr> oh, too bad
2022-06-23 18:40:25 +0200 <shapr> use diagrams instead?
2022-06-23 18:40:30 +0200 <shapr> or JuicyPixels?
2022-06-23 18:40:46 +0200 <juri_> ug. fighting juicyPixels build trouble atm.
2022-06-23 18:41:26 +0200 <shapr> I'm pushing PRs for the StrictCheck and TCache, though I don't expect them to be merged.
2022-06-23 18:43:17 +0200jakalx(~jakalx@base.jakalx.net) (Error from remote client)
2022-06-23 18:43:28 +0200merijn(~merijn@c-001-001-027.client.esciencecenter.eduvpn.nl) (Ping timeout: 272 seconds)
2022-06-23 18:44:02 +0200jakalx(~jakalx@base.jakalx.net)
2022-06-23 18:46:32 +0200leeb(~leeb@KD106154144179.au-net.ne.jp) (Quit: WeeChat 3.4.1)
2022-06-23 18:47:11 +0200 <shapr> juri_: are you making pretty pictures?
2022-06-23 18:47:39 +0200faitz(~faitz@169.150.201.38)
2022-06-23 18:49:37 +0200 <juri_> shapr: I make things. :)
2022-06-23 18:49:58 +0200 <juri_> i just happen to support output formats that are pixel-based.
2022-06-23 18:52:02 +0200vglfr(~vglfr@46.96.172.76) (Ping timeout: 246 seconds)
2022-06-23 18:54:34 +0200mecharyuujin(~mecharyuu@2409:4050:2d4b:a853:8048:c716:f88e:d09f) (Quit: Leaving)
2022-06-23 18:55:34 +0200kenran(~kenran@200116b82b215d00f12ab9e70125a9a1.dip.versatel-1u1.de)
2022-06-23 18:57:00 +0200coot(~coot@213.134.190.95) (Quit: coot)
2022-06-23 18:57:01 +0200BusConscious(~martin@ip5f5bdf00.dynamic.kabel-deutschland.de)
2022-06-23 18:57:38 +0200kenran(~kenran@200116b82b215d00f12ab9e70125a9a1.dip.versatel-1u1.de) (Client Quit)
2022-06-23 18:57:46 +0200 <dminuoso> lechner: I *think* the intent was to display `flag True False (mconcat [f1, f2, f3])` vs `flag True False [f1, f2, f3]`
2022-06-23 18:59:08 +0200 <dminuoso> Or perhaps `flag True False (f1 <> f2 <> f3)` in the former case
2022-06-23 18:59:48 +0200 <dminuoso> Judging from the final sentence, I think they meant to use (<>) in the before example
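A minimal sketch of the contrast dminuoso describes, relying on optparse-applicative's Mod being a Monoid; the option names are made up for illustration and are not the blog post's actual example:

```haskell
module ListOfMods where

import Options.Applicative

-- "Before": modifiers combined directly with (<>).
quietBefore :: Parser Bool
quietBefore =
  flag True False (long "quiet" <> short 'q' <> help "Suppress output")

-- "After": collect the modifiers in a list and mconcat it, which makes it
-- easy to build the list programmatically (filtering, appending, ...).
quietAfter :: Parser Bool
quietAfter =
  flag True False (mconcat [long "quiet", short 'q', help "Suppress output"])
```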
2022-06-23 19:02:32 +0200mikoto-chan(~mikoto-ch@esm-84-240-99-143.netplaza.fi) (Ping timeout: 246 seconds)
2022-06-23 19:05:14 +0200econo(uid147250@user/econo)
2022-06-23 19:05:54 +0200merijn(~merijn@c-001-001-027.client.esciencecenter.eduvpn.nl)
2022-06-23 19:06:13 +0200[itchyjunk](~itchyjunk@user/itchyjunk/x-7353470)
2022-06-23 19:07:18 +0200notzmv(~zmv@user/notzmv) (Ping timeout: 264 seconds)
2022-06-23 19:09:24 +0200jpds(~jpds@gateway/tor-sasl/jpds) (Ping timeout: 268 seconds)
2022-06-23 19:10:18 +0200jpds(~jpds@gateway/tor-sasl/jpds)
2022-06-23 19:10:56 +0200merijn(~merijn@c-001-001-027.client.esciencecenter.eduvpn.nl) (Ping timeout: 246 seconds)
2022-06-23 19:11:18 +0200tzh(~tzh@c-24-21-73-154.hsd1.or.comcast.net) (Ping timeout: 240 seconds)
2022-06-23 19:12:30 +0200Neuromancer(~Neuromanc@user/neuromancer) (Ping timeout: 240 seconds)
2022-06-23 19:13:43 +0200mastarija(~mastarija@2a05:4f46:e02:8c00:3026:72db:30b8:5822) (Quit: Leaving)
2022-06-23 19:13:59 +0200jpds(~jpds@gateway/tor-sasl/jpds) (Remote host closed the connection)
2022-06-23 19:14:27 +0200jpds(~jpds@gateway/tor-sasl/jpds)
2022-06-23 19:16:43 +0200Henson(~kvirc@107-179-133-201.cpe.teksavvy.com)
2022-06-23 19:19:32 +0200 <Henson> has anybody here had experience with the StackBuilders or Serokell consulting companies for Haskell development? If so, would you be interested in telling me about it?
2022-06-23 19:24:01 +0200eggplantade(~Eggplanta@2600:1700:bef1:5e10:99c9:a0a4:f69e:b22e) (Remote host closed the connection)
2022-06-23 19:24:22 +0200moet(~moet@lib-02-subnet-194.rdns.cenic.net)
2022-06-23 19:26:52 +0200eggplantade(~Eggplanta@108-201-191-115.lightspeed.sntcca.sbcglobal.net)
2022-06-23 19:27:37 +0200king_gs(~Thunderbi@187.201.91.195) (Quit: king_gs)
2022-06-23 19:29:35 +0200pleo(~pleo@user/pleo) (Quit: quit)
2022-06-23 19:30:17 +0200coot(~coot@213.134.190.95)
2022-06-23 19:30:39 +0200hnOsmium0001(uid453710@user/hnOsmium0001) (Quit: Connection closed for inactivity)
2022-06-23 19:33:11 +0200faitz(~faitz@169.150.201.38) (Quit: Lost terminal)
2022-06-23 19:34:54 +0200tzh(~tzh@c-24-21-73-154.hsd1.or.comcast.net)
2022-06-23 19:39:00 +0200merijn(~merijn@c-001-001-027.client.esciencecenter.eduvpn.nl)
2022-06-23 19:39:03 +0200yauhsien(~yauhsien@61-231-23-53.dynamic-ip.hinet.net) (Remote host closed the connection)
2022-06-23 19:39:37 +0200yauhsien(~yauhsien@61-231-23-53.dynamic-ip.hinet.net)
2022-06-23 19:41:04 +0200Pickchea(~private@user/pickchea)
2022-06-23 19:42:26 +0200 <maerwald> why am I mind-boggled every time I look at 'readsPrec'?
2022-06-23 19:43:05 +0200chele(~chele@user/chele) (Remote host closed the connection)
2022-06-23 19:44:32 +0200yauhsien(~yauhsien@61-231-23-53.dynamic-ip.hinet.net) (Ping timeout: 268 seconds)
2022-06-23 19:44:38 +0200 <moet> maerwald: because it should have used Either/Maybe but instead uses a list.
2022-06-23 19:44:56 +0200 <geekosaur> because it's a crawling horror…
2022-06-23 19:45:02 +0200 <maerwald> 10 years of Haskell and I still can't write a Read instance by hand
2022-06-23 19:45:05 +0200 <maerwald> ...
2022-06-23 19:46:05 +0200tzh_(~tzh@c-24-21-73-154.hsd1.or.comcast.net)
2022-06-23 19:46:18 +0200mbuf(~Shakthi@122.164.15.160) (Quit: Leaving)
2022-06-23 19:46:28 +0200tzh(~tzh@c-24-21-73-154.hsd1.or.comcast.net) (Read error: Connection reset by peer)
2022-06-23 19:46:34 +0200hnOsmium0001(uid453710@user/hnOsmium0001)
2022-06-23 19:54:38 +0200Sklogw(~Sklogw@88.232.49.3)
2022-06-23 19:54:46 +0200yauhsien(~yauhsien@61-231-23-53.dynamic-ip.hinet.net)
2022-06-23 19:59:16 +0200tzh_(~tzh@c-24-21-73-154.hsd1.or.comcast.net) (Remote host closed the connection)
2022-06-23 19:59:32 +0200tzh_(~tzh@c-24-21-73-154.hsd1.wa.comcast.net)
2022-06-23 20:00:10 +0200 <Henson> Read instances are weird
2022-06-23 20:07:04 +0200nate4(~nate@98.45.169.16)
2022-06-23 20:07:07 +0200jpds(~jpds@gateway/tor-sasl/jpds) (Remote host closed the connection)
2022-06-23 20:07:39 +0200jpds(~jpds@gateway/tor-sasl/jpds)
2022-06-23 20:09:44 +0200hiredman(~hiredman@frontier1.downey.family) (Ping timeout: 246 seconds)
2022-06-23 20:11:13 +0200 <monochrom> I can, after spending 10 hours on "the Hom functor" in category theory. Then it's "just" function composition :)
2022-06-23 20:11:44 +0200Sklogw(~Sklogw@88.232.49.3) (Quit: Client closed)
2022-06-23 20:12:03 +0200 <monochrom> Oh wait, misread, you said Read, not Reader.
2022-06-23 20:12:11 +0200nate4(~nate@98.45.169.16) (Ping timeout: 246 seconds)
2022-06-23 20:12:32 +0200 <monochrom> But I can write a Read instance by hand too. :)
2022-06-23 20:13:57 +0200ChaiTRex(~ChaiTRex@user/chaitrex) (Remote host closed the connection)
2022-06-23 20:14:01 +0200 <monochrom> I think it is because I wondered how you know when and when not to add parentheses, and readsPrec answered my question.
2022-06-23 20:14:22 +0200ChaiTRex(~ChaiTRex@user/chaitrex)
2022-06-23 20:14:52 +0200 <monochrom> Err wait, that's Shows and showsPrec.
2022-06-23 20:15:16 +0200 <monochrom> OK, I learned monads around the same time I learned Read, that's how.
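A minimal sketch of a hand-written Read instance for a made-up type, following the same readParen/precedence recipe the derived instances use, with the matching showsPrec for comparison:

```haskell
module ReadDemo where

data T = Leaf | Branch T T

appPrec :: Int
appPrec = 10

instance Show T where
  showsPrec _ Leaf         = showString "Leaf"
  showsPrec d (Branch l r) =
    showParen (d > appPrec) $
      showString "Branch "
        . showsPrec (appPrec + 1) l
        . showChar ' '
        . showsPrec (appPrec + 1) r

instance Read T where
  readsPrec d s =
       -- nullary constructor: parentheses are optional
       readParen False
         (\t -> [ (Leaf, u) | ("Leaf", u) <- lex t ]) s
       -- constructor application: parenthesised whenever d > 10
    ++ readParen (d > appPrec)
         (\t -> [ (Branch l r, w)
                | ("Branch", u) <- lex t
                , (l, v) <- readsPrec (appPrec + 1) u
                , (r, w) <- readsPrec (appPrec + 1) v ]) s
```

When an instance like this gives "no parse", calling reads (or readsPrec 0) directly in GHCi shows the list of candidate parses and the leftover input, which is usually the quickest way to see where things went wrong.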
2022-06-23 20:16:15 +0200moet_(~moet@lib-02-subnet-194.rdns.cenic.net)
2022-06-23 20:16:59 +0200allbery_b(~geekosaur@xmonad/geekosaur)
2022-06-23 20:16:59 +0200geekosaur(~geekosaur@xmonad/geekosaur) (Killed (NickServ (GHOST command used by allbery_b)))
2022-06-23 20:17:02 +0200allbery_bgeekosaur
2022-06-23 20:17:44 +0200dextaa4(~DV@user/dextaa)
2022-06-23 20:18:08 +0200Vajb(~Vajb@85-76-45-183-nat.elisa-mobile.fi) (Ping timeout: 246 seconds)
2022-06-23 20:18:50 +0200moet(~moet@lib-02-subnet-194.rdns.cenic.net) (Ping timeout: 246 seconds)
2022-06-23 20:18:50 +0200takuan(~takuan@178-116-218-225.access.telenet.be) (Ping timeout: 246 seconds)
2022-06-23 20:18:59 +0200toluene4(~toluene@user/toulene)
2022-06-23 20:19:32 +0200Pickchea(~private@user/pickchea) (Ping timeout: 246 seconds)
2022-06-23 20:19:32 +0200dextaa(~DV@user/dextaa) (Ping timeout: 246 seconds)
2022-06-23 20:19:32 +0200jao(~jao@cpc103048-sgyl39-2-0-cust502.18-2.cable.virginm.net) (Ping timeout: 246 seconds)
2022-06-23 20:19:32 +0200mon_aaraj(~MonAaraj@user/mon-aaraj/x-4416475) (Ping timeout: 246 seconds)
2022-06-23 20:19:32 +0200toluene(~toluene@user/toulene) (Ping timeout: 246 seconds)
2022-06-23 20:19:32 +0200dextaa4dextaa
2022-06-23 20:19:33 +0200toluene4toluene
2022-06-23 20:20:50 +0200takuan(~takuan@178-116-218-225.access.telenet.be)
2022-06-23 20:21:45 +0200jao(~jao@cpc103048-sgyl39-2-0-cust502.18-2.cable.virginm.net)
2022-06-23 20:21:51 +0200mon_aaraj(~MonAaraj@user/mon-aaraj/x-4416475)
2022-06-23 20:23:49 +0200ChaiTRex(~ChaiTRex@user/chaitrex) (Quit: ChaiTRex)
2022-06-23 20:24:26 +0200ChaiTRex(~ChaiTRex@user/chaitrex)
2022-06-23 20:26:30 +0200segfaultfizzbuzz(~segfaultf@192-184-223-90.static.sonic.net)
2022-06-23 20:27:27 +0200 <segfaultfizzbuzz> so, i had a weird thought... true pure functional code (with strictly no IO, entropy sources, etc) can only preserve or decrease the entropy of its inputs
2022-06-23 20:27:42 +0200merijn(~merijn@c-001-001-027.client.esciencecenter.eduvpn.nl) (Ping timeout: 264 seconds)
2022-06-23 20:28:14 +0200 <segfaultfizzbuzz> that is to say if i feed one megabyte worth of bits into pure functional code, i can get say, one byte out, or i can get one megabyte out, but i cannot get ten megabytes out
2022-06-23 20:28:51 +0200shiraeeshi(~shiraeesh@46.34.206.119) (Remote host closed the connection)
2022-06-23 20:29:06 +0200 <monochrom> You know what, true of I/O too, you just have to draw your "system boundary" larger.
2022-06-23 20:29:09 +0200shiraeeshi(~shiraeesh@46.34.206.119)
2022-06-23 20:29:09 +0200 <segfaultfizzbuzz> the pure functional code can produce what appears to be ten megabytes from one megabyte, but that output will always compress (it can't hold more than the input's entropy)
2022-06-23 20:29:23 +0200 <segfaultfizzbuzz> monochrom: talking to me there?
2022-06-23 20:29:29 +0200 <monochrom> Yes.
2022-06-23 20:29:42 +0200 <geekosaur> only on ideal hardware. on real hardware there will be entropy leaks in both directions
2022-06-23 20:30:02 +0200 <segfaultfizzbuzz> well yes i suppose you can "freeze" the amount of entropy fed into a pure functional snippet of code after an io event
2022-06-23 20:30:09 +0200 <monochrom> pure function + readFile cannot generate more info than input and what's already on disk.
2022-06-23 20:30:47 +0200toluene1(~toluene@user/toulene)
2022-06-23 20:30:50 +0200 <segfaultfizzbuzz> geekosaur: there is probably nondeterminism which leaks into real-world pure functional code (even without io etc) when it executes (runtime, operating system, etc)
2022-06-23 20:31:26 +0200toluene(~toluene@user/toulene) (Ping timeout: 246 seconds)
2022-06-23 20:31:26 +0200toluene1toluene
2022-06-23 20:31:42 +0200 <geekosaur> and pure functional code itself can cause system load, heat output, cache effects, etc.
2022-06-23 20:32:07 +0200 <monochrom> Oh, that.
2022-06-23 20:32:16 +0200 <segfaultfizzbuzz> geekosaur: right, that is an entropy input leak
2022-06-23 20:32:32 +0200 <Tuplanolla> The term for the other concept is "reversible computing".
2022-06-23 20:33:00 +0200 <segfaultfizzbuzz> well, you can easily have nonreversible pure functions
2022-06-23 20:33:13 +0200 <Tuplanolla> Those produce waste heat.
2022-06-23 20:33:20 +0200 <segfaultfizzbuzz> but all true pure functions are monotonically decreasing in their entropy or at best are entropy-preserving
2022-06-23 20:33:29 +0200z0k(~z0k@206.84.141.12) (Ping timeout: 248 seconds)
2022-06-23 20:33:30 +0200Pickchea(~private@user/pickchea)
2022-06-23 20:34:10 +0200 <monochrom> Well, in this context we focus on input entropy and output entropy. output <= input can be true because output + waste heat >= input. We are in agreement.
2022-06-23 20:34:47 +0200 <segfaultfizzbuzz> i don't mean physical entropy here i mean information theory entropy
2022-06-23 20:34:59 +0200 <monochrom> Instead of idealizing hardware, I am idealizing "output".
2022-06-23 20:35:01 +0200 <segfaultfizzbuzz> and yes i know about the physical reversibility stuff
2022-06-23 20:36:01 +0200 <segfaultfizzbuzz> anyway i just thought this was an interesting observation
2022-06-23 20:36:59 +0200eod|fserucas(~eod|fseru@193.65.114.89.rev.vodafone.pt) (Ping timeout: 256 seconds)
2022-06-23 20:39:19 +0200eggplantade(~Eggplanta@108-201-191-115.lightspeed.sntcca.sbcglobal.net) (Remote host closed the connection)
2022-06-23 20:42:16 +0200superbil(~superbil@1-34-176-171.hinet-ip.hinet.net) (Ping timeout: 265 seconds)
2022-06-23 20:42:43 +0200superbil(~superbil@1-34-176-171.hinet-ip.hinet.net)
2022-06-23 20:47:19 +0200progress__(~fffuuuu_i@45.112.243.220)
2022-06-23 20:52:01 +0200mshiraeeshi(~shiraeesh@46.34.206.119)
2022-06-23 20:52:06 +0200jgeerds(~jgeerds@55d45f48.access.ecotel.net)
2022-06-23 20:54:13 +0200shiraeeshi(~shiraeesh@46.34.206.119) (Ping timeout: 268 seconds)
2022-06-23 20:56:29 +0200coot(~coot@213.134.190.95) (Quit: coot)
2022-06-23 20:56:40 +0200 <Henson> by inputs, do you mean inputs to the function generating the outputs, or does inputs also include the content of the function doing the generating?
2022-06-23 20:56:59 +0200 <segfaultfizzbuzz> my pure function is f and i am feeding it x, so i am calling f x
2022-06-23 20:57:00 +0200zeenk(~zeenk@2a02:2f04:a301:3d00:39df:1c4b:8a55:48d3) (Quit: Konversation terminated!)
2022-06-23 20:57:01 +0200tomgus1(~tomgus1@2a02:c7e:4229:d900:dea6:32ff:fe3d:d1a3) (Quit: ZNC 1.8.2+deb2 - https://znc.in)
2022-06-23 20:57:21 +0200 <segfaultfizzbuzz> the entropy of (f x) is less than or equal to that of x if f is a truly pure function
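One standard way to state that observation (a textbook information-theory fact, not something established in this discussion):

```latex
H(f(X)) \le H(X) \quad \text{for any deterministic } f \text{ and random input } X
```

with equality exactly when f is injective on the support of X; a pure function can discard information but never create it.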
2022-06-23 20:57:43 +0200 <Henson> a face-generation ML network as a function that takes a seed integer as an input would be able to greatly amplify the information present in the input, but all of the potential for that information generation resides in the face-generating function
2022-06-23 20:58:34 +0200 <ski> maerwald : `readsPrec' isn't much harder than `showsPrec'
2022-06-23 20:58:42 +0200tomgus1(~tomgus1@2a02:c7e:4229:d900:dea6:32ff:fe3d:d1a3)
2022-06-23 20:58:53 +0200 <ski> moet_ : that breaks distributivity
2022-06-23 20:59:00 +0200 <maerwald> ski: great... tell me how to debug "no parse"
2022-06-23 21:00:02 +0200shapr(~user@2600:4040:2d31:7100:677f:8b5d:34bb:4aea) (Ping timeout: 255 seconds)
2022-06-23 21:00:02 +0200 <moet_> ski: :thinkthink:
2022-06-23 21:00:42 +0200machinedgod(~machinedg@66.244.246.252) (Ping timeout: 264 seconds)
2022-06-23 21:01:47 +0200ski. o O ( "1972 - Alain Colmerauer designs the logic language Prolog. His goal is to create a language with the intelligence of a two year old. He proves he has reached his goal by showing a Prolog session that says \"No.\" to every query." -- "A Brief, Incomplete, and Mostly Wrong History of Programming Languages" by James Iry in 2009-05-07 at
2022-06-23 21:01:52 +0200ski<https://james-iry.blogspot.com/2009/05/brief-incomplete-and-mostly-wrong.html> )
2022-06-23 21:06:10 +0200unit73e(~emanuel@2001:818:e8dd:7c00:32b5:c2ff:fe6b:5291)
2022-06-23 21:07:21 +0200hiredman(~hiredman@frontier1.downey.family)
2022-06-23 21:08:59 +0200eod|fserucas(~eod|fseru@193.65.114.89.rev.vodafone.pt)
2022-06-23 21:09:30 +0200eod|fserucas(~eod|fseru@193.65.114.89.rev.vodafone.pt) (Remote host closed the connection)
2022-06-23 21:12:29 +0200eggplantade(~Eggplanta@2600:1700:bef1:5e10:99c9:a0a4:f69e:b22e)
2022-06-23 21:13:02 +0200eod|fserucas(~eod|fseru@193.65.114.89.rev.vodafone.pt)
2022-06-23 21:13:34 +0200Pickchea(~private@user/pickchea) (Ping timeout: 272 seconds)
2022-06-23 21:14:13 +0200 <segfaultfizzbuzz> Henson: face-generation? wha...?
2022-06-23 21:15:10 +0200 <segfaultfizzbuzz> Henson: it's just the pigeonhole principle, and the only observation here is that the pigeonhole principle applies to pure functions/programming, which is just something i hadn't considered
2022-06-23 21:15:47 +0200 <segfaultfizzbuzz> what is the practical significance of this observation? i suppose that you can use this kind of observation to think about the "memoizability or partial memoizability" of your code
2022-06-23 21:16:50 +0200coot(~coot@213.134.190.95)
2022-06-23 21:17:55 +0200eod|fserucas(~eod|fseru@193.65.114.89.rev.vodafone.pt) (Remote host closed the connection)
2022-06-23 21:18:21 +0200dsrt^(~dsrt@50.237.44.186) (Ping timeout: 256 seconds)
2022-06-23 21:19:31 +0200eod|fserucas(~eod|fseru@193.65.114.89.rev.vodafone.pt)
2022-06-23 21:21:58 +0200segfaultfizzbuzz(~segfaultf@192-184-223-90.static.sonic.net) (Ping timeout: 240 seconds)
2022-06-23 21:21:59 +0200dsrt^(~dsrt@50.237.44.186)
2022-06-23 21:25:24 +0200segfaultfizzbuzz(~segfaultf@192-184-223-90.static.sonic.net)
2022-06-23 21:28:44 +0200eod|fserucas(~eod|fseru@193.65.114.89.rev.vodafone.pt) (Remote host closed the connection)
2022-06-23 21:29:32 +0200sebastiandb(~sebastian@pool-108-31-128-56.washdc.fios.verizon.net)
2022-06-23 21:30:00 +0200eod|fserucas(~eod|fseru@193.65.114.89.rev.vodafone.pt)
2022-06-23 21:30:30 +0200eod|fserucas(~eod|fseru@193.65.114.89.rev.vodafone.pt) (Remote host closed the connection)
2022-06-23 21:31:55 +0200notzmv(~zmv@user/notzmv)
2022-06-23 21:33:33 +0200eod|fserucas(~eod|fseru@193.65.114.89.rev.vodafone.pt)
2022-06-23 21:33:50 +0200moet_(~moet@lib-02-subnet-194.rdns.cenic.net) (Ping timeout: 272 seconds)
2022-06-23 21:34:47 +0200eod|fserucas(~eod|fseru@193.65.114.89.rev.vodafone.pt) (Remote host closed the connection)
2022-06-23 21:36:29 +0200segfaultfizzbuzz(~segfaultf@192-184-223-90.static.sonic.net) (Quit: segfaultfizzbuzz)
2022-06-23 21:36:54 +0200eod|fserucas(~eod|fseru@193.65.114.89.rev.vodafone.pt)
2022-06-23 21:37:37 +0200Topsi(~Topsi@dyndsl-095-033-088-224.ewe-ip-backbone.de)
2022-06-23 21:40:42 +0200 <monochrom> Significance for environmentalists: Functional programming cannot be carbon-neutral. >:)
2022-06-23 21:41:09 +0200yauhsien(~yauhsien@61-231-23-53.dynamic-ip.hinet.net) (Remote host closed the connection)
2022-06-23 21:41:37 +0200 <monochrom> Significance for security entrepreneurs: Functional programming always leaks information in side channels. >:)
2022-06-23 21:42:28 +0200eod|fserucas(~eod|fseru@193.65.114.89.rev.vodafone.pt) (Remote host closed the connection)
2022-06-23 21:42:45 +0200Midjak(~Midjak@82.66.147.146) (Quit: Leaving)
2022-06-23 21:43:41 +0200 <Tisoxin> Is there a tool that shows if a binding is used strictly or lazily (due to strictness analysis)?
2022-06-23 21:43:58 +0200 <Tisoxin> I think it'd be really nice to have something like that in HLS
2022-06-23 21:44:03 +0200 <unit73e> I only know practical comparisons, like this colleague was trying to program Java like Haskell or maybe Scala and I had to tell him to give it up. Putting lipstick on a horse doesn't make it prettier.
2022-06-23 21:44:04 +0200eod|fserucas(~eod|fseru@193.65.114.89.rev.vodafone.pt)
2022-06-23 21:44:27 +0200 <unit73e> also hi
2022-06-23 21:44:31 +0200 <monochrom> The horse is already pretty.
2022-06-23 21:44:34 +0200eod|fserucas(~eod|fseru@193.65.114.89.rev.vodafone.pt) (Remote host closed the connection)
2022-06-23 21:44:38 +0200 <unit73e> it does the job
2022-06-23 21:45:31 +0200 <unit73e> anyway he was trying to make Java declarative, but the verbosity of the language design always ends up looking meh
2022-06-23 21:45:38 +0200 <monochrom> But you can try this better analogy: putting lipstick around your eyes. 8)
2022-06-23 21:45:43 +0200tromp(~textual@92-110-219-57.cable.dynamic.v4.ziggo.nl)
2022-06-23 21:46:18 +0200eod|fserucas(~eod|fseru@193.65.114.89.rev.vodafone.pt)
2022-06-23 21:46:42 +0200 <monochrom> Tisoxin: If you know how to read the output of -ddump-stg, it has that information.
2022-06-23 21:47:11 +0200merijn(~merijn@c-001-001-027.client.esciencecenter.eduvpn.nl)
2022-06-23 21:48:42 +0200eod|fserucas(~eod|fseru@193.65.114.89.rev.vodafone.pt) (Remote host closed the connection)
2022-06-23 21:49:06 +0200 <monochrom> I haven't used -ddump-str-signatures, but it is advertised as "Dump strictness signatures".
2022-06-23 21:50:25 +0200eod|fserucas(~eod|fseru@193.65.114.89.rev.vodafone.pt)
2022-06-23 21:50:54 +0200eod|fserucas(~eod|fseru@193.65.114.89.rev.vodafone.pt) (Remote host closed the connection)
2022-06-23 21:51:39 +0200_ht(~quassel@231-169-21-31.ftth.glasoperator.nl)
2022-06-23 21:51:47 +0200eod|fserucas(~eod|fseru@193.65.114.89.rev.vodafone.pt)
2022-06-23 21:53:54 +0200 <monochrom> Looks like s/-ddump-stg/-ddump-simpl/ suffices. https://downloads.haskell.org/ghc/latest/docs/html/users_guide/hints.html#faster-producing-a-progr… then scroll down to "How do I find out a function’s strictness?" for how to read the notation.
2022-06-23 21:54:49 +0200 <monochrom> "Besides, Core syntax is fun to look at!" haha
2022-06-23 21:54:55 +0200 <monochrom> "have fun"
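A small module one could use to try those flags; the module and function names are made up, and the exact annotation syntax differs between GHC versions:

```haskell
-- Compile with, e.g.:
--   ghc -O2 -fforce-recomp -ddump-simpl -ddump-str-signatures StrictDemo.hs
-- and look at the strictness/demand annotations printed for 'sumTo' and
-- its local worker.
module StrictDemo where

-- With optimisation on, demand analysis should report the accumulator as
-- used strictly, which is the per-binding information asked about above.
sumTo :: Int -> Int
sumTo n = go 0 1
  where
    go acc i
      | i > n     = acc
      | otherwise = go (acc + i) (i + 1)
```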
2022-06-23 21:55:18 +0200 <Franciman> re. carbon neutral, we have a monad for that!
2022-06-23 21:55:45 +0200 <Franciman> or maybe it should be a comonad?
2022-06-23 21:55:48 +0200 <Franciman> hmm
2022-06-23 21:55:51 +0200pleo(~pleo@user/pleo)
2022-06-23 21:56:09 +0200 <Tisoxin> > "Besides, Core syntax is fun to look at!" haha
2022-06-23 21:56:10 +0200 <lambdabot> error: Variable not in scope: haha
2022-06-23 21:56:17 +0200 <Tisoxin> It's actually not that bad with inspection-testing
2022-06-23 21:57:47 +0200 <mshiraeeshi> unit73e: borrowing some ideas from fp when coding in java could work in some cases
2022-06-23 21:58:04 +0200 <mshiraeeshi> I think you should judge on case-by-case basis
2022-06-23 21:59:15 +0200progress__(~fffuuuu_i@45.112.243.220) (Quit: Leaving)
2022-06-23 22:00:18 +0200alp(~alp@user/alp) (Remote host closed the connection)
2022-06-23 22:00:22 +0200 <unit73e> mshiraeeshi, agreed. the issue was that the guy was trying to have small methods for everything without thinking much about why. a failed attempt to make code readable. it's hard when java has some design flaws.
2022-06-23 22:00:33 +0200alp(~alp@user/alp)
2022-06-23 22:00:39 +0200 <unit73e> it's not a horrible language but definitely has its flaws
2022-06-23 22:00:56 +0200 <ski> "Java Precisely" by Peter Sestoft (of Moscow ML fame) in 2016 (3rd ed.) at <https://www.itu.dk/people/sestoft/javaprecisely/> covers functional interfaces, streams, parallel streams, parallel arrays
2022-06-23 22:00:58 +0200 <monochrom> Like this? http://www.vex.net/~trebla/humour/Nightmare.java
2022-06-23 22:00:59 +0200 <unit73e> now perl, perl is horrible. sorry perl fans
2022-06-23 22:03:00 +0200 <unit73e> I'd say scala is java done right, though bloated
2022-06-23 22:04:02 +0200 <Tisoxin> I'd rather call it a better version of java, instead of java done right
2022-06-23 22:04:05 +0200 <dolio> People have been functional programming in Java for a long time. I remember way back reading some book on the AWT, and it described this really 'novel' data structure that got used for event listeners...
2022-06-23 22:04:19 +0200 <dolio> The novel data structure was an immutable tree. :þ
2022-06-23 22:04:50 +0200 <mshiraeeshi> unit73e: take a Flyweight pattern from GoF book, for example
2022-06-23 22:04:50 +0200eod|fserucas(~eod|fseru@193.65.114.89.rev.vodafone.pt) (Ping timeout: 240 seconds)
2022-06-23 22:04:52 +0200 <dolio> What a concept.
2022-06-23 22:04:54 +0200 <unit73e> Tisoxin, fair enough
2022-06-23 22:04:58 +0200 <monochrom> Ah, so functional programming has been novel in Java for a long time? :)
2022-06-23 22:05:22 +0200 <mshiraeeshi> "Use sharing to support large numbers of fine-grained objects efficiently"
2022-06-23 22:05:47 +0200 <mshiraeeshi> that's how String in Java works and that's the reason String is immutable in Java, btw
2022-06-23 22:06:00 +0200 <monochrom> I had been saying "I'll finish my thesis in a year" for many years, too.
2022-06-23 22:07:03 +0200Techcable(~Techcable@user/Techcable) (Remote host closed the connection)
2022-06-23 22:07:11 +0200Techcable(~Techcable@user/Techcable)
2022-06-23 22:08:59 +0200 <unit73e> I finished my thesis... but I'm one of those guys that starts projects and doesn't finish lol
2022-06-23 22:09:25 +0200 <unit73e> my xp3 extract thingy is working
2022-06-23 22:09:39 +0200yauhsien(~yauhsien@61-231-38-201.dynamic-ip.hinet.net)
2022-06-23 22:10:55 +0200 <monochrom> We turn induction into co-induction
2022-06-23 22:11:52 +0200eod|fserucas(~eod|fseru@193.65.114.89.rev.vodafone.pt)
2022-06-23 22:11:52 +0200eod|fserucas(~eod|fseru@193.65.114.89.rev.vodafone.pt) (Remote host closed the connection)
2022-06-23 22:14:15 +0200 <Henson> has anybody here had experience with the StackBuilders or Serokell consulting companies for Haskell development? If so, would you be interested in telling me about it?
2022-06-23 22:14:17 +0200yauhsien(~yauhsien@61-231-38-201.dynamic-ip.hinet.net) (Ping timeout: 248 seconds)
2022-06-23 22:14:40 +0200eod|fserucas(~eod|fseru@193.65.114.89.rev.vodafone.pt)
2022-06-23 22:15:04 +0200eod|fserucas(~eod|fseru@193.65.114.89.rev.vodafone.pt) (Remote host closed the connection)
2022-06-23 22:16:53 +0200 <mshiraeeshi> although I'm not sure we can say that the idea of "making things immutable to enable sharing of large numbers of objects" originated from fp
2022-06-23 22:17:38 +0200eod|fserucas(~eod|fseru@193.65.114.89.rev.vodafone.pt)
2022-06-23 22:17:42 +0200 <mshiraeeshi> perhaps it's more like a convergence, when the same idea gets invented in several paradigms independently
2022-06-23 22:18:07 +0200eod|fserucas(~eod|fseru@193.65.114.89.rev.vodafone.pt) (Remote host closed the connection)
2022-06-23 22:19:57 +0200bitdex(~bitdex@gateway/tor-sasl/bitdex) (Ping timeout: 268 seconds)
2022-06-23 22:20:18 +0200Pickchea(~private@user/pickchea)
2022-06-23 22:20:37 +0200eod|fserucas(~eod|fseru@193.65.114.89.rev.vodafone.pt)
2022-06-23 22:21:01 +0200eod|fserucas(~eod|fseru@193.65.114.89.rev.vodafone.pt) (Remote host closed the connection)
2022-06-23 22:21:15 +0200merijn(~merijn@c-001-001-027.client.esciencecenter.eduvpn.nl) (Ping timeout: 256 seconds)
2022-06-23 22:22:17 +0200jrm(~jrm@user/jrm) (Ping timeout: 248 seconds)
2022-06-23 22:23:07 +0200eod|fserucas(~eod|fseru@193.65.114.89.rev.vodafone.pt)
2022-06-23 22:23:41 +0200heath(~heath@user/heath) (Quit: WeeChat 1.7)
2022-06-23 22:24:08 +0200heath(~heath@user/heath)
2022-06-23 22:24:18 +0200 <EvanR> pure functional programming predates java at least
2022-06-23 22:24:36 +0200jrm(~jrm@user/jrm)
2022-06-23 22:24:48 +0200 <mshiraeeshi> it's like asking "if two melodies are similar, does it mean that one plagiarized the other?"
2022-06-23 22:24:52 +0200tromp(~textual@92-110-219-57.cable.dynamic.v4.ziggo.nl) (Quit: My iMac has gone to sleep. ZZZzzz…)
2022-06-23 22:24:54 +0200jargon(~jargon@184.101.186.108) (Remote host closed the connection)
2022-06-23 22:25:27 +0200 <EvanR> what are we railing against? If stealing ideas from other languages is technically stealing? xD
2022-06-23 22:26:17 +0200 <mshiraeeshi> whether applying fp techniques in Java is a good or bad idea
2022-06-23 22:26:17 +0200 <monochrom> In the case of music, plagiarism can happen and can be bad.
2022-06-23 22:26:30 +0200 <EvanR> immutable data can be good anywhere
2022-06-23 22:26:40 +0200 <monochrom> This does not apply to cross pollination in programming.
2022-06-23 22:27:02 +0200acidjnk(~acidjnk@dynamic-046-114-168-206.46.114.pool.telefonica.de)
2022-06-23 22:27:06 +0200sebastiandb(~sebastian@pool-108-31-128-56.washdc.fios.verizon.net) (Ping timeout: 264 seconds)
2022-06-23 22:27:32 +0200 <monochrom> But to answer the question of what we are railing against:
2022-06-23 22:27:35 +0200Vajb(~Vajb@hag-jnsbng11-58c3a8-176.dhcp.inet.fi)
2022-06-23 22:27:54 +0200 <EvanR> on fp in java, if monochrom hasn't posted their lazy list implemented in java via exceptions yet, maybe they should. As an example of "bad" xD
2022-06-23 22:27:55 +0200 <monochrom> Everyone is railing against someone else having an ever so slightly different opinion, obviously.
2022-06-23 22:28:28 +0200 <monochrom> But I think I posted it already!
2022-06-23 22:28:31 +0200 <dolio> Yeah, using exceptions for that definitely sounds bad.
2022-06-23 22:28:49 +0200 <EvanR> figured
2022-06-23 22:29:03 +0200 <monochrom> Oh, post and state that it's bad.
2022-06-23 22:29:15 +0200 <EvanR> no it stands for itself
2022-06-23 22:29:44 +0200 <monochrom> Is having "humour" in the URL close enough? :)
2022-06-23 22:29:57 +0200 <monochrom> and using the filename "Nightmare"
2022-06-23 22:30:05 +0200eod|fserucas(~eod|fseru@193.65.114.89.rev.vodafone.pt) (Remote host closed the connection)
2022-06-23 22:30:41 +0200 <EvanR> does java have an "acme" section
2022-06-23 22:31:22 +0200 <monochrom> I think the number or percentage of people who use my Nightmare.java technique is about the same as that of using that "type-level interview solution" technique.
2022-06-23 22:31:49 +0200 <monochrom> Java doesn't have a community central repo, does it?
2022-06-23 22:31:55 +0200bitdex(~bitdex@gateway/tor-sasl/bitdex)
2022-06-23 22:32:01 +0200 <dolio> Maven?
2022-06-23 22:32:18 +0200 <monochrom> Ah OK my bad.
2022-06-23 22:32:22 +0200 <Henson> monochrom: where can I find your Nightmare.java technique?
2022-06-23 22:32:35 +0200odnes(~odnes@5-203-187-167.pat.nym.cosmote.net)
2022-06-23 22:32:39 +0200 <monochrom> http://www.vex.net/~trebla/humour/Nightmare.java
2022-06-23 22:32:40 +0200eod|fserucas(~eod|fseru@193.65.114.89.rev.vodafone.pt)
2022-06-23 22:34:23 +0200odnes(~odnes@5-203-187-167.pat.nym.cosmote.net) (Remote host closed the connection)
2022-06-23 22:34:33 +0200ft(~ft@shell.chaostreff-dortmund.de) (Ping timeout: 248 seconds)
2022-06-23 22:35:49 +0200ft(~ft@shell.chaostreff-dortmund.de)
2022-06-23 22:38:07 +0200lyle(~lyle@104.246.145.85) (Quit: WeeChat 3.5)
2022-06-23 22:39:31 +0200_ht(~quassel@231-169-21-31.ftth.glasoperator.nl) (Remote host closed the connection)
2022-06-23 22:39:52 +0200 <Tuplanolla> Wait, why exceptions?
2022-06-23 22:40:16 +0200odnes(~odnes@5-203-187-167.pat.nym.cosmote.net)
2022-06-23 22:40:36 +0200 <geekosaur> just to prove it could be done?
2022-06-23 22:40:46 +0200 <EvanR> Because It's There
2022-06-23 22:41:09 +0200slack1256(~slack1256@191.125.99.212)
2022-06-23 22:42:07 +0200 <monochrom> Because it is closest to pattern matching. :)
2022-06-23 22:42:11 +0200eggplantade(~Eggplanta@2600:1700:bef1:5e10:99c9:a0a4:f69e:b22e) (Remote host closed the connection)
2022-06-23 22:43:02 +0200 <slack1256> Is there an option so that `-xc` dumps JSON instead of the usual format? Google Cloud Run doesn't deal with raw text well.
2022-06-23 22:43:32 +0200tromp(~textual@92-110-219-57.cable.dynamic.v4.ziggo.nl)
2022-06-23 22:43:56 +0200 <monochrom> And some SML users use their exception system for OOP because it's the only subtyping system. :)
2022-06-23 22:44:36 +0200 <Tuplanolla> Pleasant disgust.
2022-06-23 22:44:48 +0200 <EvanR> one of the next 700 languages needs to give up and just make exceptions the only thing
2022-06-23 22:45:09 +0200 <EvanR> "everything's an exception"
2022-06-23 22:45:16 +0200odnes_(~odnes@5-203-187-167.pat.nym.cosmote.net)
2022-06-23 22:45:19 +0200 <EvanR> (ironically)
2022-06-23 22:45:23 +0200odnes(~odnes@5-203-187-167.pat.nym.cosmote.net) (Read error: Connection reset by peer)
2022-06-23 22:45:26 +0200nate4(~nate@98.45.169.16)
2022-06-23 22:45:39 +0200eod|fserucas(~eod|fseru@193.65.114.89.rev.vodafone.pt) (Remote host closed the connection)
2022-06-23 22:45:42 +0200 <monochrom> Next April 1st, someone might claim "I found an unpublished secret paper of Guy Steele's, it's called Exception: The Ultimate Lambda" >:)
2022-06-23 22:45:45 +0200 <geekosaur> pretty soon everything will have to produce json output and compile to or simply be javascript, the only language
2022-06-23 22:46:19 +0200 <slack1256> Well, maps are useful :>
2022-06-23 22:46:27 +0200mon_aaraj(~MonAaraj@user/mon-aaraj/x-4416475) (Ping timeout: 268 seconds)
2022-06-23 22:46:45 +0200eod|fserucas(~eod|fseru@193.65.114.89.rev.vodafone.pt)
2022-06-23 22:46:48 +0200 <EvanR> google can't convert literally anything to json yet?
2022-06-23 22:47:16 +0200eod|fserucas(~eod|fseru@193.65.114.89.rev.vodafone.pt) (Remote host closed the connection)
2022-06-23 22:47:17 +0200 <slack1256> Not -xc output at least. It records a new JSON object per line. And my stack traces are looooong.
2022-06-23 22:47:47 +0200 <mshiraeeshi> lol, and "main" jumpstarts everything with a single "throw" statement
2022-06-23 22:48:14 +0200mon_aaraj(~MonAaraj@user/mon-aaraj/x-4416475)
2022-06-23 22:48:25 +0200 <monochrom> Isn't it a beauty.
2022-06-23 22:48:33 +0200eod|fserucas(~eod|fseru@193.65.114.89.rev.vodafone.pt)
2022-06-23 22:48:53 +0200 <mshiraeeshi> it's interesting though how SML users would model exceptions in OOP
2022-06-23 22:49:01 +0200 <mshiraeeshi> exceptions on top of exceptions
2022-06-23 22:49:02 +0200eod|fserucas(~eod|fseru@193.65.114.89.rev.vodafone.pt) (Remote host closed the connection)
2022-06-23 22:49:38 +0200 <mshiraeeshi> or would just exceptions suffice? not sure
2022-06-23 22:50:33 +0200 <ski> `exn' is an "open sum type". you can even generate new exception constructors at run-time (e.g. every time a function is called)
2022-06-23 22:50:58 +0200eggplantade(~Eggplanta@2600:1700:bef1:5e10:99c9:a0a4:f69e:b22e)
2022-06-23 22:51:37 +0200 <monochrom> Yeah they don't use throw like I do. They just enjoy exn being a parent type.
2022-06-23 22:52:32 +0200 <monochrom> IOW they declare a lot of "exceptions" that are never thrown but just used as normal parameters and normal return values.
2022-06-23 22:53:09 +0200 <monochrom> Oh haha do you mind if I also call my technique "the game of throws"
2022-06-23 22:53:18 +0200takuan(~takuan@178-116-218-225.access.telenet.be) (Remote host closed the connection)
2022-06-23 22:54:19 +0200eod|fserucas(~eod|fseru@193.65.114.89.rev.vodafone.pt)
2022-06-23 22:55:12 +0200eod|fserucas(~eod|fseru@193.65.114.89.rev.vodafone.pt) (Remote host closed the connection)
2022-06-23 22:55:49 +0200mon_aaraj(~MonAaraj@user/mon-aaraj/x-4416475) (Ping timeout: 256 seconds)
2022-06-23 22:55:54 +0200dsrt^(~dsrt@50.237.44.186) (Ping timeout: 264 seconds)
2022-06-23 22:56:29 +0200eod|fserucas(~eod|fseru@193.65.114.89.rev.vodafone.pt)
2022-06-23 22:56:58 +0200eod|fserucas(~eod|fseru@193.65.114.89.rev.vodafone.pt) (Remote host closed the connection)
2022-06-23 22:57:33 +0200mon_aaraj(~MonAaraj@user/mon-aaraj/x-4416475)
2022-06-23 22:57:36 +0200eod|fserucas(~eod|fseru@193.65.114.89.rev.vodafone.pt)
2022-06-23 22:59:30 +0200jgeerds(~jgeerds@55d45f48.access.ecotel.net) (Ping timeout: 240 seconds)
2022-06-23 23:01:25 +0200eod|fserucas(~eod|fseru@193.65.114.89.rev.vodafone.pt) (Remote host closed the connection)
2022-06-23 23:02:48 +0200eod|fserucas(~eod|fseru@193.65.114.89.rev.vodafone.pt)
2022-06-23 23:05:47 +0200 <alexfmpe[m]> is there a way to write the equivalent of... (full message at https://libera.ems.host/_matrix/media/r0/download/libera.chat/675e0345137e20b45e428da1305e5860386d…)
2022-06-23 23:06:10 +0200 <alexfmpe[m]> since `D` can't figure out the `y` even though there's a functional dependency
2022-06-23 23:06:42 +0200haskellapprenti(~haskellap@204.14.236.211)
2022-06-23 23:07:09 +0200 <haskellapprenti> pl \a b -> if f a b == GT then a else b
2022-06-23 23:07:19 +0200 <haskellapprenti> @pl \a b -> if f a b == GT then a else b
2022-06-23 23:07:19 +0200 <lambdabot> join . (flip =<< (if' .) . flip flip GT . ((==) .) . f)
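For comparison, the pointful original (reconstructed from the lambda that was pasted, so the name is made up) is arguably the clearer of the two:

```haskell
-- Picks the first argument when the supplied comparison says it is greater,
-- otherwise the second; lambdabot's pointfree output above is equivalent.
maxByCompare :: (a -> a -> Ordering) -> a -> a -> a
maxByCompare f a b = if f a b == GT then a else b
```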
2022-06-23 23:07:54 +0200unit73e(~emanuel@2001:818:e8dd:7c00:32b5:c2ff:fe6b:5291) (Ping timeout: 264 seconds)
2022-06-23 23:08:14 +0200 <slack1256> alexfmpe[m]: Maybe an explicit forall?
2022-06-23 23:08:40 +0200bitdex(~bitdex@gateway/tor-sasl/bitdex) (Ping timeout: 268 seconds)
2022-06-23 23:08:54 +0200 <alexfmpe[m]> er, where
2022-06-23 23:10:01 +0200 <slack1256> class definition of D.
2022-06-23 23:10:28 +0200 <slack1256> alexfmpe[m]: https://gitlab.haskell.org/ghc/ghc/-/wikis/quantified-constraints
2022-06-23 23:10:43 +0200 <slack1256> Seems like what you want. But you'd only need the most basic example.
2022-06-23 23:10:53 +0200bitdex(~bitdex@gateway/tor-sasl/bitdex)
2022-06-23 23:11:11 +0200 <alexfmpe[m]> I mean sure, `class (forall y. C x y) => D x` compiles
2022-06-23 23:11:15 +0200 <alexfmpe[m]> but that means a different thing
2022-06-23 23:11:38 +0200 <alexfmpe[m]> that forces `C x` to work for all `y`
2022-06-23 23:12:00 +0200 <alexfmpe[m]> meanwhile there's a fundep saying `x -> y`
2022-06-23 23:12:09 +0200 <alexfmpe[m]> not sure why this even compiles actually
2022-06-23 23:12:19 +0200 <slack1256> Who cares about y on that instance actually?
2022-06-23 23:12:34 +0200 <alexfmpe[m]> I guess an unsatisfiable constraint is still allowed, just unusable
2022-06-23 23:12:53 +0200 <slack1256> I mean what method
2022-06-23 23:13:16 +0200alp(~alp@user/alp) (Ping timeout: 272 seconds)
2022-06-23 23:13:30 +0200pleo(~pleo@user/pleo) (Quit: quit)
2022-06-23 23:13:51 +0200 <alexfmpe[m]> well these classes are method-less, I ran into this on a more complex scenario
2022-06-23 23:13:55 +0200pleo(~pleo@user/pleo)
2022-06-23 23:13:56 +0200 <alexfmpe[m]> basically `D` adds an identity
2022-06-23 23:14:35 +0200 <alexfmpe[m]> or rather, `D` reproduces the issue I had on the class where I wanted to add an identity
2022-06-23 23:16:18 +0200 <slack1256> Any particular reason why you don't want to use Type Families?
2022-06-23 23:16:34 +0200 <alexfmpe[m]> indeed, trying the quantifiable thing leads to a deadend
2022-06-23 23:16:34 +0200 <alexfmpe[m]> `instance D Bool` -> No instance for (C Bool y)
2022-06-23 23:16:34 +0200 <alexfmpe[m]> `instance C y` -> The coverage condition fails in class ‘C’ for functional dependency: ‘x -> y’
2022-06-23 23:16:38 +0200eod|fserucas(~eod|fseru@193.65.114.89.rev.vodafone.pt) (Remote host closed the connection)
2022-06-23 23:16:46 +0200 <slack1256> AFAIK functional dependencies were more of a hack to make type inference work in the presence of MTL-like effects.
2022-06-23 23:16:49 +0200 <alexfmpe[m]> nah, TF are fine, I was just confirming whether they're necessary
2022-06-23 23:17:30 +0200werneta(~werneta@70-142-214-115.lightspeed.irvnca.sbcglobal.net) (Quit: Lost terminal)
2022-06-23 23:17:46 +0200 <ski> (yes, you'd rather want `exists', than `forall')
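A minimal sketch of the shape of the problem and of the type-family alternative slack1256 suggests; the real classes in the discussion are method-less and more involved, and every name here is made up:

```haskell
{-# LANGUAGE FunctionalDependencies #-}
{-# LANGUAGE TypeFamilies #-}
module FundepSuperclass where

-- Fundep version: y is determined by x, but a superclass of another class
-- can only demand "C x y for all y" (via QuantifiedConstraints), not
-- "C x y for the particular y the fundep picks"; there is no
-- "exists y. C x y" constraint, which is ski's point above.
class C x y | x -> y

-- Type-family version: the determined parameter becomes an associated
-- type, so a superclass constraint on C' brings Y x into scope wherever
-- D x is available.
class C' x where
  type Y x

class C' x => D x

-- Example instances:
instance C' Bool where
  type Y Bool = Int

instance D Bool
```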
2022-06-23 23:18:32 +0200azimut(~azimut@gateway/tor-sasl/azimut) (Ping timeout: 268 seconds)
2022-06-23 23:21:58 +0200werneta(~werneta@70-142-214-115.lightspeed.irvnca.sbcglobal.net)
2022-06-23 23:22:42 +0200odnes_(~odnes@5-203-187-167.pat.nym.cosmote.net) (Remote host closed the connection)
2022-06-23 23:24:02 +0200BusConscious(~martin@ip5f5bdf00.dynamic.kabel-deutschland.de) (Quit: Lost terminal)
2022-06-23 23:24:04 +0200slack1256(~slack1256@191.125.99.212) (Read error: Connection reset by peer)
2022-06-23 23:24:48 +0200slack1256(~slack1256@186.11.82.163)
2022-06-23 23:26:25 +0200sibnull[m](~sibnullma@2001:470:69fc:105::1:1291)
2022-06-23 23:26:33 +0200bitdex(~bitdex@gateway/tor-sasl/bitdex) (Ping timeout: 268 seconds)
2022-06-23 23:27:42 +0200jgeerds(~jgeerds@55d45f48.access.ecotel.net)
2022-06-23 23:31:17 +0200bitdex(~bitdex@gateway/tor-sasl/bitdex)
2022-06-23 23:32:28 +0200eggplantade(~Eggplanta@2600:1700:bef1:5e10:99c9:a0a4:f69e:b22e) (Remote host closed the connection)
2022-06-23 23:33:38 +0200tromp(~textual@92-110-219-57.cable.dynamic.v4.ziggo.nl) (Quit: My iMac has gone to sleep. ZZZzzz…)
2022-06-23 23:34:39 +0200zebrag(~chris@user/zebrag)
2022-06-23 23:39:10 +0200eggplantade(~Eggplanta@2600:1700:bef1:5e10:99c9:a0a4:f69e:b22e)
2022-06-23 23:44:14 +0200dsrt^(~dsrt@50.237.44.186)
2022-06-23 23:47:06 +0200tromp(~textual@92-110-219-57.cable.dynamic.v4.ziggo.nl)
2022-06-23 23:48:33 +0200merijn(~merijn@c-001-001-027.client.esciencecenter.eduvpn.nl)
2022-06-23 23:49:05 +0200nate4(~nate@98.45.169.16) (Ping timeout: 256 seconds)
2022-06-23 23:50:54 +0200eggplantade(~Eggplanta@2600:1700:bef1:5e10:99c9:a0a4:f69e:b22e) (Remote host closed the connection)
2022-06-23 23:54:29 +0200slac65789(~slack1256@191.126.99.212)
2022-06-23 23:55:00 +0200coot(~coot@213.134.190.95) (Quit: coot)
2022-06-23 23:56:45 +0200slack1256(~slack1256@186.11.82.163) (Ping timeout: 268 seconds)
2022-06-23 23:57:07 +0200machinedgod(~machinedg@66.244.246.252)
2022-06-23 23:58:24 +0200eggplantade(~Eggplanta@2600:1700:bef1:5e10:99c9:a0a4:f69e:b22e)
2022-06-23 23:58:50 +0200slac65789slack1256