Hacker News | pydry's comments

They're experts at divide and conquer. They'll probably be able to convince us that we did this to each other.

Just like they convinced the younger generation that "boomers" stole their future.


What if the hiring managers at the jobs you'd actually prefer to work at also cringe when they see it on your profile?

It's becoming so ubiquitous that I highly doubt it. At worst, I think a manager would just see it as fluff, not a negative.

I hope the hiring managers I would actually want to work for would see it as a red flag on resumes.

At this point, I'd assume those hiring managers are also being forced to use AI in their jobs (or pretend, at least) and probably wouldn't read too much into it if it's not a substantial portion of their resume. I do feel the same way, though.

Why? It's just the name of the game, everyone gets it. Especially if you're a generalist/frontend type.

It's simply not a game I'm interested in playing. I'll find something else to do instead, leave the AI jockeying to others.

I asked because I know several managers who would look upon it as a red flag, and I suspect OP would probably prefer to work for them rather than for AI sheep.

That's actually a really good point.

>What is AI actually good at? Implementation. What is it genuinely bad at? Figuring out what you actually want

I've found it to be pretty bad at both.

If what you're doing is quite cookie cutter, though, it can do a passable job of figuring out what you want.


LLMs work OK for "Mostly iterative and mostly one-off" tasks like codegen, where you can effectively "review the result into existence", and that's where most of the buzz is at the moment.

Where they don't work at all well is for hands-off repeatable tasks that have to be correct each time. If you ask an LLM for advice, it will tell you that you need to bound such tasks with a deterministic input contract and a deterministic output contract, and then externally validate the output for correctness. If you need to do that, you can probably do the whole thing old-skool with not much more effort, especially if you use an LLM to help gen the code, as above. That's not a criticism of LLMs, it's just a consequence of the way they work.
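A minimal sketch of that bounding pattern, just to show the shape (the `call_llm` stub and the name/age schema here are hypothetical, not any real API):

```python
import json

def call_llm(prompt: str) -> str:
    """Stand-in for whatever LLM client you actually use (hypothetical)."""
    raise NotImplementedError

def extract_user(text: str, llm=call_llm) -> dict:
    # Deterministic input contract: reject junk before spending tokens.
    if not text.strip():
        raise ValueError("empty input")
    raw = llm(f"Return only JSON with keys name (str) and age (int) for: {text}")
    # Deterministic output contract: parse and validate externally,
    # never trusting the model's output on faith.
    data = json.loads(raw)
    if set(data) != {"name", "age"} or not isinstance(data["age"], int):
        raise ValueError(f"output failed contract: {raw!r}")
    return data
```

Note that the two contract checks are ordinary deterministic code, which is the point above: once they exist, you're a fair way toward the old-skool solution anyway.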

They are also prone to the most massive brain farts, even in areas like coding - I asked an LLM to look for issues in some heavily multithreaded code. Its "high priority fix" for an infrequently used slow path that checked for uniqueness under a lock before creating an object was to replace that with: take out a read lock, copy the entire data structure under the lock, drop the lock, check for uniqueness outside of any lock, then take a write lock and insert the new object. Of course as soon as I told it it was a dumbass it instantly agreed, but if I'd told it to JFDI its suggestions it would have changed correct code into badly broken code.
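The suggested "fix" is the classic check-then-act race. A minimal sketch of both versions (a set guarded by a plain `threading.Lock` as an illustrative stand-in, not the actual code in question):

```python
import threading

lock = threading.Lock()
items = set()

def insert_suggested(key):
    """The LLM's 'fix': uniqueness is checked outside any lock."""
    with lock:                    # take a lock, copy the whole structure...
        snapshot = set(items)
    unique = key not in snapshot  # ...check outside the lock: race window here
    if unique:
        with lock:                # another thread may have inserted key by now,
            items.add(key)        # so duplicate creation work slips through

def insert_original(key):
    """The original slow path: check and insert under one lock. Correct."""
    with lock:
        if key not in items:
            items.add(key)
            return True
        return False
```

Between dropping the lock and re-acquiring it, any other thread can insert the same key, which is exactly the bug the original code's single critical section was there to prevent.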

Like anything else that's new in the IT world, it's a useful tool that's over-hyped as sweeping away everything that came before it, and that's gleefully jumped on by PHBs as a reason to get rid of those annoying humans. Things will settle down eventually and it will find its place. I'm just thankful I'm in the run up (down?) to retirement ;-)


It isn't, but the fact that it's ultra vague and hand-wavy means anybody can claim anything they do is agile, including things that are the exact opposite.

I actually think OP's criticisms apply mostly to Scrum. Scrum is well defined, but its adherents won't hear a critical word said about it: "You just weren't doing it right", even when you were doing it precisely as described.


> It isn't, but the fact that it's ultra vague and hand-wavy means anybody can claim anything they do is agile, including things that are the exact opposite.

I don't really agree. The set of principles is quite straightforward. It's things like delivering software frequently, accommodating new requirements, continuously looking into improving processes, business types and developers working together, etc.

Then you have concrete executions like scrum vs kanban. Agile doesn't specify one or the other. Retrospective meetings are popular, but aren't specified by Agile per se.


Their main value is in being cheap before they realize that they're underpaid and hop jobs.

They tend to catch on quicker these days, making companies more reluctant to hire them. It has little to do with AI.


The cost of nuclear power is absurd. It's 5x the cost of solar and wind.

If you use electricity to synthesize gas and then burn that later to generate electricity that is still cheaper than nuclear power.

https://theecologist.org/2016/feb/17/wind-power-windgas-chea...

Nobody builds nuclear power because it's cost effective or green. They either have nukes like China or have purchased an option on nukes (like Iran or Poland).


You need to overprovision solar and wind capacity by at minimum 5x for northern latitudes' winter months compared to the summer, plus another few multipliers to keep storage topped up, or invest heavily in HVDC and massively overprovision the southern states.
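The 5x multiplier falls out of the gap between summer and winter capacity factors. A back-of-envelope sketch with illustrative numbers (the capacity factors below are assumptions for the sake of the arithmetic, not measured data):

```python
# Illustrative, assumed values for a northern-latitude solar site.
summer_cf = 0.20   # assumed capacity factor in summer
winter_cf = 0.04   # assumed capacity factor in winter (short, dim days)
demand_mw = 100    # flat demand to be covered year-round

# Nameplate capacity sized so even winter output covers demand:
nameplate = demand_mw / winter_cf

# Capacity that would have sufficed if every month looked like summer:
summer_sizing = demand_mw / summer_cf

# Ratio = how much you must overprovision relative to summer-only sizing.
overprovision = nameplate / summer_cf**0 / summer_sizing
```

With these assumed numbers the ratio comes out to 5x, before adding anything for storage losses or transmission.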

For that scenario, nuclear is still marginally cheaper (at today's prices at least).


Northern latitudes have low population density and plenty of hydro power which, unlike nuclear power, CAN actually operate as a battery at a reasonable cost.

There is still nowhere in the world nuclear power makes economic sense.


Poland was ~80% coal before Ukraine. It wasn't energy independence which got them interested in nuclear power; it was the idea that they might one day want a nuclear bomb (in case the current nuclear umbrella goes away).

It's never an economic decision to build nuclear power stations. They're 5x the cost of solar and wind.


If we actually cared about making nuclear cheap - getting rid of the political barriers to building Gen IV reactors, not throwing away our “waste”, it would beat the pants off solar by operating 24/7 and not using up all our land.

While we're at it I would actually prefer it if nuclear power paid for its own catastrophe insurance instead of lumping that burden on taxpayers.

Currently their liability is capped at $300 million. Fukushima cleanup cost $800 billion.

End the insurance free ride first, and then maybe let's talk about deregulation.


While I am a big fan of nuclear, I think the issue of land usage for solar is overblown. We use a lot of land for far less useful things. In the end, anything that helps us burn less fossil fuels, I am happy with.

You're also taking away farmland that could be used to produce all kinds of things. Most of the prime solar areas are the same prime areas for agriculture. By creating massive solar farms, you're at the same time reducing acreage that could be used for range animals and other agriculture:

Modeling by the American Farmland Trust (AFT) finds that 83% of projected solar development will be on agricultural land, of which 49% will be on land AFT deems “nationally significant” due to high levels of productivity, versatility, and resiliency. In May 2024, the U.S. Department of Agriculture’s (USDA) Economic Research Service (ERS) reported that between 2009 and 2020, 43% of solar installations were on land previously used for crop production and 21% on land used as pasture or rangeland.

In a few years we'll have to deal with an impending disposal issue on farmland:

Forecasts suggest that 8 million metric tons of solar panels will have reached the end of their lifecycles by 2030. The National Renewable Energy Laboratory reports that less than 10% of decommissioned panels are recycled. Many end up in landfills at the end of their lifecycle, which could be problematic, according to researchers with the Electric Power Research Institute because panels could break and leak toxic materials like lead and cadmium into the soil. If decommissioned panels are not disposed of properly, they could contaminate the surface and groundwater in the surrounding area, making disposal a major issue for farmers and rural communities who rely on groundwater for needs ranging from crop irrigation to drinking water.


Agricultural land in large parts of the US is going through a massive degradation cycle. We are heading for dustbowl 2.0, especially now that a bunch of the land-grant universities have been shut down. In short, it's being used wrong and left empty too long, meaning the topsoil is blowing away. Not to mention the land drains stopping proper soaking, leading to flash flooding and runoff events.

Depending on how the panels are put in place, the land and soil quality will increase significantly, because it's reverting to fallow and long-rooted stabilising plants will have 25 years to build up the biome again. Converting land back to farming is pretty quick.

I understand the point you're making, and I do agree with the end-of-lifecycle issues. There is going to be a lot of lead leaching into watercourses if not dealt with properly.


The land use argument is less than zero.

If you replaced ONLY existing fields used to grow corn for ethanol, and turned those into solar panels, you would already exceed the entire current US demand for electricity.

Solar energy is a phenomenal use of land, of which we have enormous amounts of in this country.


Fewer cows would be a huge environmental win. Beef farming is a major source of GHG. Also a very expensive/inefficient way to produce calories.

You can do both farming and solar on the same land and it improves crop yields. As of yesterday, studies found it creates rainfall in the desert.

And also be peaceful and never bomb plants.

I've yet to see a human process which used an excessive number of cheap junior developers, however precisely architected, to create high quality software.

If that could have been achieved it would have been very profitable, too. There's no shortage of cheap, motivated interns/3rd world devs and the executive class prefer to rely on disposable resources even when it costs more overall.

The net result was always the opposite though - one or two juniors on a leash could be productive but more than that and it always caused more problems than it solved.

Seeing the same problem with agents. Multi agent orchestration seems like a scam to manufacture demand for tokens.


I'm in absolute agreement that the AI coordination problem exists today when the AI is at junior level. I'm just saying that the mathematical argument is silly to apply to arbitrary future AIs, if and when they reach human capability. Because while coordination problems have not been mathematically solved, the world economy is a case in point that it is possible to coordinate human-level agents to achieve large scale projects at generally sufficient quality levels.

So to be clear, I'm not advising anyone to change their current token consumption habit. I'm just saying that it's silly to apply math to prove the impossibility of something we can literally see around us. It's like a mathematical proof that water isn't really wet.


Or they're just picturing some CRUD app that needs to connect to a few APIs, being made for a clueless exec who doesn't even understand their own business problem.

To be fair, a lot of programming does end up being just that.


>You realize that "The Lisp Curse" isn't some paper, survey or objective reflection? It's just someone's essay back from 2011 - an opinion.

It's also the deficit of code we actually use day to day that is written in Lisp.

I file it under the same heading as Haskell - a language that clearly has useful ideas, but...


You're using Lisp software right now!

I think this is the most treacherous assumption people tend to make about programming languages, for a few reasons. One of them is that we really don't have any way to measure software that we actually use day to day.

Think about the software controlling your local water treatment plant, traffic lights, the software your local power company relies on, the software running the servers you connect to, and all the servers those things connect to. All the infrastructure in between and the infrastructure's own infrastructure. Allegro Lisp's customers are shotgun spread in industries like healthcare, finance and manufacturing. They're paying for it, so we can infer they're using it, but can anybody actually name what software is written in it?

If we play six degrees of separation, accounting for the full gamut of every single computer that does something relevant to your life no matter how distant, how much of that software are you actually familiar with? The fact of the matter is that we genuinely have no broad picture. There is no introspective method to find out what software you are relying on in your day to day life; almost all of it is completely opaque and implicit. To ask "what software do I use?" is to ask an unanswerable question. So to then synthesize an answer is to work with an unsound, unsupported, incomplete conclusion, which is exactly how you end up assuming you don't use software written in Lisp while directly using software written in Lisp (HN).

Of course, even accounting for the epistemic issue, the premise is still flawed. ATS is a language with 'useful ideas, but...', Haskell is an aging pragmatic kitchen sink. Positioning the latter as the former is almost comedic.


That's a lot of words to say a lot of FUD.

I know people who work in the embedded space, working on stuff similar to traffic lights, and Lisp isn't even on their radar. Rust is. Lisp isn't.

Every niche language has its fanboys who can end up using it all over the place but when it doesnt spread to non fanboys there is usually a reason to which they are wilfully blind, usually related to its practical value.


> Every niche language

Lisp is not really a programming language. It is an idea.

Lisp didn't emerge the way most languages do - someone sitting down to design syntax and features for practical software engineering. McCarthy was formalizing a notation for computation itself, building on Church's lambda calculus. The fact that it turned out to be implementable was almost a surprise.

And that origin story matters because it explains why Lisp keeps regenerating. Most languages are artifacts - they're designed, they peak, they fossilize. Lisp is more like a principle that keeps getting re-instantiated: Common Lisp, Scheme, Racket, Clojure, Fennel, Jank. Each one is radically different in philosophy and pragmatics, yet they all share something that isn't really about parentheses - it's about code-as-data, minimal syntax hiding maximal abstraction, and the programmer's ability to reshape the language to match the problem rather than the reverse.

The counterargument, of course, is that at some point the idea has to become concrete to be useful, and once it does, it's subject to all the same engineering tradeoffs as any other language. Rich Hickey for example made very specific, opinionated decisions that are engineering choices, not mathematical inevitabilities. So there's a productive tension between Lisp-as-idea and any particular Lisp-as-language.

> related to its practical value.

Don't be daft, preaching pragmatics to modern Lispers is like trying to explain synaptic connections and their plasticity to neurosurgeons. They already know what's what - tis you who's clueless.


You're all over the place here. Yes a chunk of my post is about the epistemic impossibility of knowing what software you rely on. I had to look up what FUD means, I assume it's just an algorithmbrained primitive cognate of epistemic skepticism.

At no point does a post about "You can't draw sweeping conclusions about this kind of thing" imply "all dark matter tech relies on esoteric stacks". I'm not sure why you would even bring up that anecdote?

> there is usually a reason to which they are willfully blind, usually related to its practical value.

Lame passive aggression aside, I'm not a Lisp "fanboy" and I actively don't like the grain of the language. Language adoption is always down to familiarity, taste and ecosystem constraints. But I'm also not deluded enough to assert something like this, because I actually know better. It's an argument that's always positioned without substance, because there can be none. You're positioning ignorant snobbishness as enlightened pragmatism; no offense, but that's just pretense. If you can't get a Lisp program working on a 100MHz microcontroller with 5K of flash, that's kind of a skill issue, dude.


What a thought-terminating cliché. It sounds reasonable while saying nothing. "Useful ideas, but..." what? "But it's not popular"? Neither was Python before it became popular. "But it doesn't have a big ecosystem"? You're describing a consequence of adoption and presenting it as a cause.

Like I said: Clojure runs significant chunks of Walmart's infrastructure, and Nubank's entire banking stack serving several hundred million customers - and Nubank, for a second, is the biggest digital bank in the world; Apple uses it; Cisco uses it. Emacs Lisp runs one of the most enduring pieces of software in computing history. Have you ever used Grammarly? It's powered by Lisp. This very site runs on Lisp.

"I don't encounter Lisp in my work and that feels meaningful to me" (there's no other way to interpret your words) is just another, personal opinion that has no practical, objective angle. "The Curse of Lisp" opinion, at least back in the day had some objective relevance that no longer holds true.

