Hacker News | LeCompteSftware's comments

The problem is that if you're wealthy enough to hire someone to do your errands, those errands likely aren't very mundane - the exception is a socialite giving their friend a low-effort job, but executive assistants are paid well because their jobs are cognitively demanding.

OTOH a lower-middle-class Joe like me really does have a lot of mundane social/professional errands, which existing software has handled just fine for decades. I suppose on the margins AI might free up 5 minutes here or there around calendar invites / etc, but at the cost of rolling snake eyes and wasting 30 minutes cleaning up mistakes. Even if it never made mistakes, I just don't see the "personal assistant" use case really taking off. And it's not how people use LLMs recreationally.

Really not trying to say that LLM personal assistants are "useless" for most people. But I don't think they'll be "big," for the same reason that Siri and Alexa were overhyped. It's not from lack of capability; the vision is more ho-hum than tech folks seem to realize.


Let's also point out the $180 is going to a hideously evil AI company which pirated millions of books and movies.

Yeah I didn't want to burst anyone's spiritual bubble earlier, but I had a very similar experience one time when I smoked salvia divinorum: there was an eerie and overwhelming purple light, sort of like a "fluorescent" UV bulb, and the Ministers of The Universe pulled my life history in front of me, something something ALL OF SPACE AND TIME WAS-

I wasn't speaking to God. I was high on salvia. And I'm quite certain A.J. Ayer was high on oxygen deprivation.


No you didn't; that's not even what a salvia trip is like. Why lie about that on HN of all places, haha

Naively this is quite surprising, but the devil is in the details. With the exception of Lean I'd point out they're all fairly close: Chez being 2.5x slower than C++ is not ignorable but it's also quite good for a dynamically-typed JITted language[1]. And I'm not surprised that F# does so well at this particular task. Without looking into it more closely, this seems to be a story about F# on .NET Core having the most mature and painless out-of-the-box parallelism of these languages. I assume this is elapsed time, it would be interesting to see a breakdown of CPU time.

I don't think these results are quite comparable because of slightly differing parallelism strategies; I'd expect the F# implementation of just spinning off threads to be a little more performant than a Rayon parallel iterator, which presumably has some overhead. But that really just shows how hard it is to do a cross-language comparison; Rust and C++ can certainly be made faster than the F# code by carefully manipulating a ton of low-level OS concurrency primitives. This would arguably also be a little misleading. Likewise Chez and Haskell have good C FFI; does that count? It's a tricky and highly qualitative analysis.

[1] FYI, one possible performance improvement in the Chez code is keeping the permutations in fxvectors and replacing math operations with their fixnum-specific equivalents - this tells the compiler/interpreter that the data are guaranteed to be machine integers rather than bigints, so they aren't boxed/unboxed. I am not sure without running it myself, but there seem to be avoidable allocations in the Chez implementation. https://cisco.github.io/ChezScheme/csug/objects.html#./objec...


Lots of us are having fun identifying our choice for missing family :)

One I might suggest is scripting languages, defined loosely as programming tools which dispatch high-level commands to act on data pipelines: sed, AWK, the sh family, Perl, PowerShell, with Python and R as honorary members. In practice I might say SQL belongs here instead of under Prolog, but in theory of course SQL is like Prolog. Bourne shell might be the best representative, even if it's not the oldest.

AWK et al share characteristics from ALGOL and APL, but I feel they are very much their own thing. PowerShell is quite unique among modern languages.


I'd add dataflow "languages" such as Excel and LabVIEW.

This is just wrong, and you're being too didactic. Idris specifically lets you implement nontotal functions in the same way that Rust lets you write memory-unsafe code. The idea is that you isolate them to the part of the program with effects (including the main loop), which the compiler can't verify anyway, and leave the total, formally verified stuff to the business logic. Anything that's marked as total is proven safe, so you only need to worry about a few ugly bits; just like unsafe Rust.

Idris absolutely is a general-purpose functional language in the ML family. It is Haskell, but boosted with dependent types.


Random passerby chiming in: so this means you can write "regular" software with this stuff?

While reading TFA I thought the theorem stuff deserved its own category, but I guess it's a specialization within an ur-family (several), rather than its own family?

It definitely sounds like it deserves its own category of programming language, though. The same way Lojban has ancestry in many natural languages but is very much doing its own thing.


Yes, Idris was meant for writing regular code, and so is F*.

But I think the theorem prover that excels most at regular code is actually Lean. The reason is that Lean has a growing community, or at least is growing much faster than other similar languages, and for regular code you really need a healthy ecosystem of libraries and tooling.

Anyway, here's an article about Lean as a general-purpose language: https://kirancodes.me/posts/log-ocaml-to-lean.html


> if you're a developer who wants to exploit the multiplicative factor of a truly flexible and extensible programming language with state of the art features from the cutting-edge of PL research, then maybe give Lean a whirl!

Does not sound that appealing to me. Sounds like little consistency and having to learn a new language for every project.


I mentioned this later

> You can separate terms that can be used in proofs (those must be total) from terms that can only be used in computations (those can be Turing complete), like in Lean

What I meant is that the part of Idris that lets people prove theorems is the non-total part

But I think you are right. Haskell could go there by adding a keyword to mark total functions, rather than marking nontotal functions like Idris does. The languages are otherwise very similar.


Haskell has Liquid Haskell for that.

Lean is definitely a dependently typed ML-family language like Agda and Idris, so "ML" has it covered. And the long-term goal of Lean certainly is not "execution is only secondary"; Microsoft is clearly interested in writing real software with it: https://lean-lang.org/functional_programming_in_lean/Program...

OTOH if you really want to emphasize "intended to express proofs" then surely Prolog has that covered, so Lean can be seen as half ML, half Prolog. From this view, the Curry-Howard correspondence is just an implementation detail about choosing a particular computational approach to logic.


The real question is why 1.44MB 3 1/2" floppy drives were used for so long when they were totally obsolete by 1990. I would love to read a more coherent and unified history; my understanding is that there were tons of competing higher-capacity 3 1/2" drives between ~1985 and 1995, but software developers were stuck releasing on 1.44MB because that was the only format which worked reliably across manufacturers. By the time the Zip drive came out, software was distributed on CD, and higher-capacity floppies were really only used for (geographically) local data transfer.

Wikipedia says there was a serious attempt to standardize a 20MB floppy in 1990 which fell apart: https://en.wikipedia.org/wiki/Floppy_disk#High-capacity It's really not the case that Zip made some great leap forward; 15 years of technology's steady march didn't fully trickle down to consumer hardware because of compatibility issues between competing manufacturers.


The 3.5-inch format already peaked in 1985; that's when NEC first shipped 1.44MB inside the PC-8801 mkII MR. IBM followed two years later, switching the PS/2 to HD floppies, and Apple in 1988: 80 tracks, ~50KB/s. In 1990 IBM bumped the PS/2 to 2.88MB ED - different magnetic material, double the bitrate, ~100KB/s.

... But NEC beat IBM by already doing 'five blades' in 1988, selling the PC-88 VA3 with the 'Triple' or '2TD' format 3.5" floppy sporting 13MB unformatted / 9MB formatted capacity. Same perpendicular head as ED, same magnetic medium, same bitrate, but 3 times more tracks (240), while still using a cheap stepper motor (unlike ZIP head actuators) and staying compatible with standard ED floppy controller chips. Sadly no one in the west adopted it :(((

There was one more avenue for bumping capacity never really explored on the PC: zone bit recording, invented by Chuck Peddle in 1961 and supported by floppy controllers in Macintoshes, Commodore (Chuck Peddle designed the drives) and the Victor 9000 (Chuck Peddle designed the whole computer). A free 50% capacity bump. The Victor 9000 pulled 1.2MB out of a Double Density 80-track 5 1/4" drive.

Combine 2TD with ZBR and we could have had cheap 13.5MB-formatted-capacity floppies since 1988.


That's kind of the point of my comment - software developers couldn't release on NEC without excluding IBM customers etc etc. They were stuck with 1.44MB because that was the only thing guaranteed to work. There was a human management problem around agreeing on a specification; drive manufacturers and software companies simply had conflicting incentives, so the market was a mess.

In retrospect I think the only reason Zip was able to become the undisputed market leader in high-capacity disks is that CD-ROM fully took over commercial software distribution.


> software developers couldn't release on NEC without excluding IBM customers

Oh, they could and they did in Japan, where those computers were sold. The PC-88 is not IBM PC compatible.

https://necretro.org/PC-88_VA3


Just FYI, this section at the end about R6RS Scheme is a little confused: https://fset.common-lisp.dev/Modern-CL/Top_html/Scheme-_0028...

   Strings are immutable [in Scheme]. Functional point update operations are not provided, presumably out of time complexity concerns, but string-append and substring are provided, and there are functions to convert to and from lists of characters; I guess the idea is that fine-grained string construction will be done using lists and then converted. Amusingly, there’s string-copy, though it’s hard to see why one would ever use it.
Strings are actually mutable in R6RS. See https://www.r6rs.org/final/html/r6rs/r6rs-Z-H-14.html#node_s... - there is an imperative update-in-place function (`string-set!`) which mutates its argument. So of course string-copy really is useful: you might want to mutate a string and keep an unaltered copy. And the intent of string->list is to let your list-processing code automatically become string-processing code. It is way too strong to say "Functional point update operations are not provided, presumably out of time complexity concerns" - R6RS actively encourages functional operations on strings via string->list, even though that conversion is O(n) overhead.

The overall point you are making seems clearly correct: R6RS Scheme does not provide any "mostly functional" datatypes beyond basic s-expressions, so it would take a lot of work to develop Clojure/FSet-style tools. But it's strange to so badly misstate what strings in Scheme are like.


I mentioned the existence of `string-set!` a few days ago in the reddit post on this release of FSet[1] (as well as some Racket inaccuracies). He might just not have gotten around to updating it.

[1] https://old.reddit.com/r/Common_Lisp/comments/1sk2nsl/fset_v...


This is very cool, and quite surprising. Cleaner fish are thought to be among the most intelligent fish because of the complexity and danger of their feeding strategy: it takes careful planning and quick thinking. But they aren't tied to any particular species of host or general tactic; naively I imagine cleaner fish are more versatile and adaptable than cone ants.

It would be interesting to learn if this occurs with other species of ants. I suppose until now nobody thought to look.


Cooperation and symbiosis are very general survival strategies. They appear at all levels of the biological abstraction hierarchy, all the way down to mitochondria, which are almost certainly descended from what was once an independent organism. In fact, even a genome itself can be seen as a collection of mutually cooperating replicators. No intelligence is required for cooperation to evolve. It's a straightforward consequence of game theory.

Yes! Cooperation seems to be just as fundamental, if not more, than competition. We wouldn't have gotten volcanic islands to break down into soil if it weren't for the partnership that is lichen. We wouldn't be able to digest a tenth of what we're able to eat if it weren't for our gut bacteria. We wouldn't have trees if it weren't for mycorrhizal fungi which over 90% of plants depend on.

There's a famous paper/framework called "Major evolutionary transitions in individuality" that sketches out a big-picture pattern of major evolutionary advances in complexity following a surprisingly consistent pattern: as cooperation and division of labor strongly increase, selection starts working on larger entities. This pattern holds all the way back to the origin of life itself, as things moved from self-assembling molecules to compartmentalized populations of molecules, from replicators to chromosomes, from RNA+enzymes to DNA+proteins, from prokaryotic cells to eukaryotic cells, from unicellular to multicellular, from individuals to colonies/superorganisms, and (possibly) onwards to more complex societies.

https://www.pnas.org/doi/10.1073/pnas.1421402112


> Cooperation seems to be just as fundamental, if not more, than competition

Both are fundamental. You can't survive without cooperating, but you also can't survive if you try to cooperate with the entire biosphere because ultimately there is competition for scarce resources. If you don't assert yourself to claim your share of those, something else will.


Yes, but does your kidney compete with your lung?

I would highly recommend you read through the paper. Alongside these "advances in individuality" comes reduced (internal) competition.

I agree that both are important, but competition only seems to be important at the very edges of what selection is acting upon, while cooperation is truly "fundamental".


> does your kidney compete with your lung?

I don't know of any cases of direct kidney-vs-lung competition, but competition among body parts is common. Sometimes that competition is adaptive, sometimes not. Examples of adaptive competition are things like when, under extreme circumstances (particularly cold or hunger), your body will sacrifice parts of itself to keep other parts going. Examples of non-adaptive competition are things like autoimmune diseases. Also, sometimes individual cells go rogue and stop cooperating. That's called cancer.

> competition only seems to be important at the very edges of what selection is acting upon

I guess that depends on what you consider "the edges". In the case of humans, selection produces intuitions about "us vs them". Those intuitions range from very closely drawn boundaries ("us" includes only my immediate family or clan) to very broadly drawn boundaries ("us" is my entire species, or my entire phylum, or all living things). In between are things like "us" is all members of my species with my skin color. But the extremes of this range are non-adaptive. Draw the boundaries too narrowly and you end up without enough genetic diversity in your in-group to drive out maladaptive mutations. Draw them too broadly and you end up defenseless against parasites and with nothing to eat.


That's exactly my point. Competition at the fundamentals is called cancer. It's the exception, not the norm.

Your body prioritizing which parts to keep alive for the survival of the whole ship is not an example of competition. Competition would be if a body part actively attacked another body part. In this case, survival of the entire body will eventually benefit all body parts

> I guess that depends on what you consider "the edges"

The "edges" as thoroughly defined in the paper I linked: "Major evolutionary transitions in individuality" (METI), a widely accepted framework in biology.


> Competition would be if a body part actively attacked another body part.

I guess we'll just have to agree to disagree about this. I don't see a lot of daylight between "active attack" and starvation of resources. Just because your attacker chooses to lay siege to you rather than mount a full frontal assault doesn't make them any less of an attacker IMHO.

> The "edges" as thoroughly defined in the paper I linked.

Sorry, I don't see it. AFAICT the word "edges" only appears once in the paper:

"Evolution is a process of continuous change, and so we should expect blurry edges with a mosaic of features (1)."

[UPDATE] Oh, BTW, I think that paper is actually very good. Thanks for bringing it to my attention.


Mmmm, it's equally likely that mitochondria's precursors were self-interested parasites or predators, whose negative effects were competitively neutralized by defensive host adaptations that exploitatively colonized them. No intelligence or cooperation is required for co-optation to evolve. It's a straightforward consequence of game theory.

> it's equally likely that mitochondria's precursors were self interested parasites or predators

That is in no way at odds with what I said.


This is a frustrating response to my comment. I am aware that symbiosis is universal. That's not what I'm talking about. I am talking about the specific and highly unusual behavior of crawling inside of a much larger animal's mouth and trusting it not to eat you. Cleaner fish are highly intelligent[1] and it appears that this intelligence is necessary for their niche:

- picking a good location for a cleaning station requires long-term planning and real strategic judgment;

- deciding which hosts to accept is a complex skill requiring some sort of rudimentary theory of mind + long-term development of social ties;

- like crows, cleaner fish are jerks who constantly try to screw each other over, so there is something of a cognition arms race.

I will add that the wrasse family of cleaner fish use rocks to smash open shellfish (i.e. they are tool-users), and they have very complex group strategies for raising their young. In fact I'm not convinced that wrasse evolved to be cleaner fish at all: they are natural scavengers and scum-suckers, perhaps cleaning stations are a form of cultural technology.

I would be extremely surprised if any of this was true for cone ants. I suspect that is more hard-wired, perhaps a local subspecies stumbled into a genetic fluke, and as you say due to game theory it is a local optimum this population has settled on. If this behavior were common like it is in vertebrates, we probably would have seen it earlier. But who knows? 20 years ago I would have thought "fish have a form of culture" is too ridiculous an idea to consider.

[1] Seriously: https://journals.library.columbia.edu/index.php/cusj/blog/vi... https://www.nature.com/articles/s41598-025-25837-0


> I am talking about the specific and highly unusual behavior of crawling inside of a much larger animal's mouth and trusting it not to eat you.

OK.

> Cleaner fish are highly intelligent[1]

Yes.

> and it appears that this intelligence is necessary

Manifestly not, at least not in general.

> for their niche:

That is an open question. Just because an unintelligent cleaner fish hasn't evolved doesn't mean it couldn't.

