Newest at the top
2025-05-24 20:18:23 +0200 | <EvanR> | why does this seem like a ship has sailed sort of area |
2025-05-24 20:18:11 +0200 | <EvanR> | I've been seeing a lot of safety and governance stuff lately |
2025-05-24 20:17:54 +0200 | <talismanick> | https://cybercat.institute/blog/ |
2025-05-24 20:17:40 +0200 | <talismanick> | https://cybercat.institute/ |
2025-05-24 20:17:20 +0200 | <EvanR> | cloudhaskell : skynet :: cyberhaskell : terminator |
2025-05-24 20:16:31 +0200 | <talismanick> | https://www.philipzucker.com/reverse-mode-differentiation-is-kind-of-like-a-lens-ii/ |
2025-05-24 20:15:57 +0200 | <EvanR> | who may or may not terminate |
2025-05-24 20:15:57 +0200 | <talismanick> | some of the work in that line even explicitly mentioned a blogpost which demonstrated that reverse-mode automatic differentiation ("backprop" to the heathen masses) has the type of a(n unlawful) lens lol |
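[editor's note] The "backprop has the type of a lens" observation from the message above can be sketched in a few lines. This is a hedged illustration, assuming the `s -> (a, b -> s)` presentation of lenses from the linked blog post; the names (`Lens`, `square`, `compose`) are illustrative, not from any real package:

```haskell
-- A lens-like pair: forward pass plus a pullback for the reverse pass.
-- This is the "costate comonad coalgebra"-free, concrete presentation.
type Lens a b = a -> (b, b -> a)

-- Differentiable squaring: the value and its reverse-mode pullback.
square :: Double -> (Double, Double -> Double)
square x = (x * x, \dy -> 2 * x * dy)

-- Lens composition is exactly the chain rule: pullbacks compose backwards.
compose :: Lens a b -> Lens b c -> Lens a c
compose f g = \x ->
  let (y, dfdx) = f x
      (z, dgdy) = g y
  in (z, dfdx . dgdy)

main :: IO ()
main = do
  let (y, back) = compose square square 3   -- (3^2)^2
  print y         -- forward value: 81.0
  print (back 1)  -- derivative of x^4 at x = 3: 108.0
```

The "unlawful" part of the joke: this has a lens's shape but need not satisfy the lens laws.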
2025-05-24 20:15:52 +0200 | <EvanR> | you heard of cloud haskell, now we can have cyberhaskell |
2025-05-24 20:15:28 +0200 | <monochrom> | "Haskell can directly manipulate me!" Now that's a killer app. Literally. |
2025-05-24 20:15:15 +0200 | erty | (~user@user/aeroplane) (Ping timeout: 276 seconds) |
2025-05-24 20:15:08 +0200 | <monochrom> | I misread that as "direct manipulation" because I was primed by "cybernetics". |
2025-05-24 20:14:39 +0200 | sabathan2 | (~sabathan@amarseille-159-1-12-107.w86-203.abo.wanadoo.fr) |
2025-05-24 20:14:17 +0200 | sabathan2 | (~sabathan@amarseille-159-1-12-107.w86-203.abo.wanadoo.fr) (Remote host closed the connection) |
2025-05-24 20:14:14 +0200 | <talismanick> | there's also categorical cybernetics, which brings that line of work closer to direct implementation in Haskell |
2025-05-24 20:12:37 +0200 | <talismanick> | but yeah, the manifold hypothesis is the principal raison d'être for "geometric deep learning", which tries to bring diffgeo into machine learning |
2025-05-24 20:12:20 +0200 | <monochrom> | hee hee was going to say |
2025-05-24 20:12:12 +0200 | <EvanR> | yes, vector-space |
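[editor's note] The `vector-space` plug above refers to Conal Elliott's package of that name. A tiny standalone sketch of the kind of interface it offers, with the class and operators re-declared locally (the real `Data.VectorSpace` uses the same operator names but a more general scalar type):

```haskell
-- Standalone sketch of a vector-space-style interface; covers only
-- Double and pairs, unlike the real, more general Data.VectorSpace.
infixl 6 ^+^
infixl 7 *^

class VectorSpace v where
  zeroV :: v                  -- additive identity
  (^+^) :: v -> v -> v        -- vector addition
  (*^)  :: Double -> v -> v   -- scalar multiplication

instance VectorSpace Double where
  zeroV = 0
  (^+^) = (+)
  (*^)  = (*)

-- Products of vector spaces are vector spaces, componentwise.
instance (VectorSpace a, VectorSpace b) => VectorSpace (a, b) where
  zeroV = (zeroV, zeroV)
  (x1, y1) ^+^ (x2, y2) = (x1 ^+^ x2, y1 ^+^ y2)
  s *^ (x, y) = (s *^ x, s *^ y)
```

For example, `2 *^ (1, 2) ^+^ (1, 1)` evaluates to `(3, 5)`.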
2025-05-24 20:12:07 +0200 | <EvanR> | does haskell have killer webapps |
2025-05-24 20:11:16 +0200 | merijn | (~merijn@host-vr.cgnat-g.v4.dfn.nl) (Ping timeout: 276 seconds) |
2025-05-24 20:11:09 +0200 | <monochrom> | I don't know anything beyond that. |
2025-05-24 20:10:50 +0200 | <talismanick> | isn't topology only explicitly used in rarer techniques like persistent homology/TDA? |
2025-05-24 20:09:07 +0200 | <monochrom> | (so not just linear algebra, but also diff geom and topology) |
2025-05-24 20:08:25 +0200 | <monochrom> | Although, I now recall a talk from which I learned that research on the brain does model collected data as an n-dim manifold (if there are n measuring probes). |
2025-05-24 20:08:14 +0200 | <talismanick> | find a submanifold* |
2025-05-24 20:07:51 +0200 | <talismanick> | and training it means walking this manifold surface with calculus to find a manifold with much smaller n |
2025-05-24 20:07:30 +0200 | sabathan2 | (~sabathan@amarseille-159-1-12-107.w86-203.abo.wanadoo.fr) |
2025-05-24 20:07:07 +0200 | <EvanR> | I wish my linear algebra course prefaced the innumerable matrix exercises with "this is used in literally everything ever so pay attention" |
2025-05-24 20:07:06 +0200 | <talismanick> | which - for some arbitrarily-large n - maps trivially onto an n-manifold (some arbitrarily curved and connected space with the restriction that it be locally similar to R^n, so calculus is possible) |
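[editor's note] The "walking this manifold surface with calculus" step described above is ordinary gradient descent on the parameter vector. A minimal, hypothetical Haskell sketch, using plain lists of Doubles and a numerically estimated gradient rather than any real ML library; the loss and step size are illustrative:

```haskell
-- A point in R^n, represented naively as a list.
type Point = [Double]

-- Central-difference estimate of the gradient of f at p.
grad :: (Point -> Double) -> Point -> Point
grad f p = [ (f (bump i h) - f (bump i (-h))) / (2 * h)
           | i <- [0 .. length p - 1] ]
  where
    h = 1e-5
    bump i d = [ if j == i then x + d else x | (j, x) <- zip [0 ..] p ]

-- One descent step with learning rate eta: move against the gradient.
step :: Double -> (Point -> Double) -> Point -> Point
step eta f p = zipWith (-) p (map (eta *) (grad f p))

-- Illustrative loss: squared distance from the point (1, 2).
loss :: Point -> Double
loss [x, y] = (x - 1) ^ 2 + (y - 2) ^ 2
loss _      = error "expects 2 dimensions"

-- Iterate n descent steps from a starting point.
descend :: Int -> Point -> Point
descend n = foldr (.) id (replicate n (step 0.1 loss))
```

Starting from `[0, 0]`, `descend 100 [0, 0]` converges to roughly `[1, 2]`, the minimum of the loss.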
2025-05-24 20:07:06 +0200 | sabathan2 | (~sabathan@amarseille-159-1-12-107.w86-203.abo.wanadoo.fr) (Read error: Connection reset by peer) |
2025-05-24 20:05:51 +0200 | <monochrom> | Wait why is it not called simply "the linear algebra hypothesis"? :) |
2025-05-24 20:05:42 +0200 | merijn | (~merijn@host-vr.cgnat-g.v4.dfn.nl) merijn |
2025-05-24 20:05:40 +0200 | srazkvt | (~sarah@user/srazkvt) (Quit: Konversation terminated!) |
2025-05-24 20:05:18 +0200 | <monochrom> | Ah. Thanks. |
2025-05-24 20:05:07 +0200 | <talismanick> | for neural nets, the data is internally encoded in long, long vectors of floats |
2025-05-24 20:04:48 +0200 | <EvanR> | it's the 100 - 0 rule, 100% of the practical value comes from 0% of the possible dimensions |
2025-05-24 20:04:19 +0200 | peterbecich | (~Thunderbi@syn-047-229-123-186.res.spectrum.com) (Ping timeout: 260 seconds) |
2025-05-24 20:04:11 +0200 | <monochrom> | What is the manifold hypothesis? |
2025-05-24 20:02:34 +0200 | <talismanick> | and whatever happened to all that talk of the manifold hypothesis, anyways? do they still tag on some arbitrarily high dimension rather than speaking of (uniformly-convergent?) sequences of spaces because infinite-dimensional manifolds are too much of a pain to be worth the extra effort? |
2025-05-24 20:02:12 +0200 | <monochrom> | (Instead I paid attention to distinguishing between fast food franchises heh.) |
2025-05-24 20:00:48 +0200 | sabathan2 | (~sabathan@amarseille-159-1-12-107.w86-203.abo.wanadoo.fr) |
2025-05-24 20:00:22 +0200 | sabathan2 | (~sabathan@amarseille-159-1-12-107.w86-203.abo.wanadoo.fr) (Remote host closed the connection) |
2025-05-24 20:00:17 +0200 | <talismanick> | like, at what point do you start getting better results with automated optimizations when you instead write "assume we have a Banach space with these additional properties giving us reasonably-fast convergence for special cases x, y, & z"? |
2025-05-24 19:59:48 +0200 | <monochrom> | I grew up in a very urban city (Hong Kong) so I never paid attention to tree species and names, be it in English or Chinese. |
2025-05-24 19:58:50 +0200 | <talismanick> | yeah, exactly lol |
2025-05-24 19:58:46 +0200 | <int-e> | . o O ( it's a tree ) |
2025-05-24 19:58:40 +0200 | wootehfoot | (~wootehfoo@user/wootehfoot) (Read error: Connection reset by peer) |
2025-05-24 19:58:40 +0200 | <monochrom> | Well ML just needs 10^10-dimensional for now. |
2025-05-24 19:58:38 +0200 | <talismanick> | and the last commit was a year ago |
2025-05-24 19:58:13 +0200 | <monochrom> | heh OK |