2025/05/24

Newest at the top

2025-05-24 20:24:57 +0200sabathan2(~sabathan@amarseille-159-1-12-107.w86-203.abo.wanadoo.fr)
2025-05-24 20:24:32 +0200sabathan2(~sabathan@amarseille-159-1-12-107.w86-203.abo.wanadoo.fr) (Remote host closed the connection)
2025-05-24 20:21:30 +0200merijn(~merijn@host-vr.cgnat-g.v4.dfn.nl) merijn
2025-05-24 20:20:46 +0200tzh(~tzh@c-76-115-131-146.hsd1.or.comcast.net) tzh
2025-05-24 20:20:29 +0200 <EvanR> in order of decreasing responsiveness: web 2.0, AI stuff, safety theory, legal framework, congressional clue
2025-05-24 20:20:01 +0200sabathan2(~sabathan@amarseille-159-1-12-107.w86-203.abo.wanadoo.fr)
2025-05-24 20:19:38 +0200sabathan2(~sabathan@amarseille-159-1-12-107.w86-203.abo.wanadoo.fr) (Remote host closed the connection)
2025-05-24 20:18:52 +0200 <EvanR> by the time the funding goes through something has already broken out of the lab
2025-05-24 20:18:23 +0200 <EvanR> why does this seem like a ship has sailed sort of area
2025-05-24 20:18:11 +0200 <EvanR> I've been seeing a lot of safety and governance stuff lately
2025-05-24 20:17:54 +0200 <talismanick> https://cybercat.institute/blog/
2025-05-24 20:17:40 +0200 <talismanick> https://cybercat.institute/
2025-05-24 20:17:20 +0200 <EvanR> cloudhaskell : skynet :: cyberhaskell : terminator
2025-05-24 20:16:31 +0200 <talismanick> https://www.philipzucker.com/reverse-mode-differentiation-is-kind-of-like-a-lens-ii/
2025-05-24 20:15:57 +0200 <EvanR> who may or may not terminate
2025-05-24 20:15:57 +0200 <talismanick> some of the work in that line even explicitly mentioned a blogpost which demonstrated that reverse-mode automatic differentiation ("backprop" to the heathen masses) has the type of a(n unlawful) lens lol
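(Aside: a minimal Haskell sketch of the lens shape talismanick is describing, assuming only base and Control.Category; the newtype and all names below are illustrative, not taken from the linked post.)

    import Prelude hiding (id, (.))
    import Control.Category

    -- A differentiable map: forward result plus a pullback carrying an
    -- output cotangent back to an input cotangent (reverse-mode AD's shape,
    -- essentially a lens whose "set" is the backwards pass).
    newtype D a b = D { runD :: a -> (b, b -> a) }

    instance Category D where
      id = D (\a -> (a, id))
      D g . D f = D $ \a ->
        let (b, f') = f a      -- forward through f
            (c, g') = g b      -- forward through g
        in (c, f' . g')        -- pullbacks compose in reverse order

    -- Example: squaring; the pullback multiplies by the derivative 2*x.
    square :: D Double Double
    square = D (\x -> (x * x, \dy -> 2 * x * dy))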
2025-05-24 20:15:52 +0200 <EvanR> you heard of cloud haskell, now we can have cyberhaskell
2025-05-24 20:15:28 +0200 <monochrom> "Haskell can directly manipulate me!" Now that's a killer app. Literally.
2025-05-24 20:15:15 +0200erty(~user@user/aeroplane) (Ping timeout: 276 seconds)
2025-05-24 20:15:08 +0200 <monochrom> I misread that as "direct manipulation" because I was primed by "cybernetics".
2025-05-24 20:14:39 +0200sabathan2(~sabathan@amarseille-159-1-12-107.w86-203.abo.wanadoo.fr)
2025-05-24 20:14:17 +0200sabathan2(~sabathan@amarseille-159-1-12-107.w86-203.abo.wanadoo.fr) (Remote host closed the connection)
2025-05-24 20:14:14 +0200 <talismanick> there's also categorical cybernetics, which helps bring the work closer to direct implementation in Haskell
2025-05-24 20:12:37 +0200 <talismanick> but yeah, the manifold hypothesis is the principal raison d'être for "geometric deep learning", which tries to bring diffgeo into machine learning
2025-05-24 20:12:20 +0200 <monochrom> hee hee was going to say
2025-05-24 20:12:12 +0200 <EvanR> yes, vector-space
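(Aside: vector-space is a real Hackage package; a tiny hedged usage sketch, assuming its Data.VectorSpace module and the tuple instances it ships.)

    import Data.VectorSpace  -- from the vector-space package

    -- Pairs of Doubles form a vector space: ^+^ adds, *^ scales.
    v :: (Double, Double)
    v = (1, 2) ^+^ (3 *^ (0, 1))   -- == (1.0, 5.0)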
2025-05-24 20:12:07 +0200 <EvanR> does haskell have killer webapps
2025-05-24 20:11:16 +0200merijn(~merijn@host-vr.cgnat-g.v4.dfn.nl) (Ping timeout: 276 seconds)
2025-05-24 20:11:09 +0200 <monochrom> I don't know anything beyond that.
2025-05-24 20:10:50 +0200 <talismanick> isn't topology only explicitly used in rarer techniques like persistent homology/TDA?
2025-05-24 20:09:07 +0200 <monochrom> (so not just linear algebra, but also diff geom and topology)
2025-05-24 20:08:25 +0200 <monochrom> Although, I now recall a talk from which I learned that research on the brain does model collected data as an n-dim manifold (if there are n measuring probes).
2025-05-24 20:08:14 +0200 <talismanick> find a submanifold*
2025-05-24 20:07:51 +0200 <talismanick> and training it means walking this manifold surface with calculus to find a manifold with much smaller n
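(Aside: "walking this manifold surface with calculus" is gradient descent; a minimal hedged sketch with a finite-difference gradient, every name below illustrative rather than from any package mentioned in the log.)

    -- Minimise f : R^n -> R by repeatedly stepping against a numerical gradient.
    gradientDescent :: Double -> Int -> ([Double] -> Double) -> [Double] -> [Double]
    gradientDescent lr steps f = go steps
      where
        go 0 x = x
        go n x = go (n - 1) (zipWith (\xi gi -> xi - lr * gi) x (grad x))
        -- forward-difference estimate of each partial derivative
        grad x = [ (f (bump i) - f x) / h | i <- [0 .. length x - 1] ]
          where
            h = 1e-6
            bump i = [ if j == i then xj + h else xj | (j, xj) <- zip [0 ..] x ]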
2025-05-24 20:07:30 +0200sabathan2(~sabathan@amarseille-159-1-12-107.w86-203.abo.wanadoo.fr)
2025-05-24 20:07:07 +0200 <EvanR> I wish my linear algebra course prefaced the innumerable matrix exercises with "this is used in literally everything ever so pay attention"
2025-05-24 20:07:06 +0200 <talismanick> which - for some arbitrarily-large n - maps trivially onto an n-manifold (some arbitrarily curved and connected space with the restriction that it be locally similar to R^n, so calculus is possible)
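(Aside: stated symbolically, the manifold hypothesis is roughly that observed data $x_1, \dots, x_N \in \mathbb{R}^n$ concentrate on or near a smooth $d$-dimensional submanifold $\mathcal{M} \hookrightarrow \mathbb{R}^n$ with $d \ll n$.)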
2025-05-24 20:07:06 +0200sabathan2(~sabathan@amarseille-159-1-12-107.w86-203.abo.wanadoo.fr) (Read error: Connection reset by peer)
2025-05-24 20:05:51 +0200 <monochrom> Wait why is it not called simply "the linear algebra hypothesis"? :)
2025-05-24 20:05:42 +0200merijn(~merijn@host-vr.cgnat-g.v4.dfn.nl) merijn
2025-05-24 20:05:40 +0200srazkvt(~sarah@user/srazkvt) (Quit: Konversation terminated!)
2025-05-24 20:05:18 +0200 <monochrom> Ah. Thanks.
2025-05-24 20:05:07 +0200 <talismanick> for neural nets, the data is internally encoded in long, long vectors of floats
2025-05-24 20:04:48 +0200 <EvanR> it's the 100 - 0 rule, 100% of the practical value comes from 0% of the possible dimensions
2025-05-24 20:04:19 +0200peterbecich(~Thunderbi@syn-047-229-123-186.res.spectrum.com) (Ping timeout: 260 seconds)
2025-05-24 20:04:11 +0200 <monochrom> What is the manifold hypothesis?
2025-05-24 20:02:34 +0200 <talismanick> and whatever happened to all that talk of the manifold hypothesis, anyways? do they still tack on some arbitrarily high dimension rather than speaking of (uniformly-convergent?) sequences of spaces because infinite-dimensional manifolds are too much of a pain to be worth the extra effort?
2025-05-24 20:02:12 +0200 <monochrom> (Instead I paid attention to distinguishing between fast food franchises heh.)
2025-05-24 20:00:48 +0200sabathan2(~sabathan@amarseille-159-1-12-107.w86-203.abo.wanadoo.fr)
2025-05-24 20:00:22 +0200sabathan2(~sabathan@amarseille-159-1-12-107.w86-203.abo.wanadoo.fr) (Remote host closed the connection)