Hacker Newsnew | past | comments | ask | show | jobs | submit | bsza's commentslogin

> ChatGPT equalizes intelligence

Yes, I love living in communism too. Imagine if you had to pay money for it or something. The wealthiest people would get unrestricted access to intelligence while the poor would get none. And the people in the middle would eventually find themselves unable to function without a product they can no longer afford. Chilling, huh? Good thing humans are known for sharing in the benefits of technological progress equally. /s


Huh?

Before ChatGPT it cost ~$100,000 to acquire intelligence good enough to solve this Erdős problem; now it costs ~$200.

I'm really confused about what you're even taking issue with.


His core issue is jealousy and fear. I don't think these types of people are at the top of the intelligence curve (closer to the bottom, really), but that is orthogonal to my point. What I'm saying is that his personality archetype makes him think (keyword) he's at the top of the intelligence curve, and an equalization means, personally to him, that he's losing his edge.

More specific to HN is the archetype of: "I have spent years honing my craft as an expert programmer; my identity is predicated on being an expert programmer, in which high intelligence is causal and associated positively with my identity." That's why, ironically, most of HN was completely wrong about AI. They were wrong about driverless cars, and they claimed vibe coding was trash. It's the people who think (keyword) they're stupid/average (aka the general public) who got it right... because perceptually they stand to gain from the equalization.

Anyway... this fear and jealousy is not something most humans can admit to themselves. Nobody will actually be able to realize that these emotions drive their thinking. They have to lie to themselves and rationalize a different reality. That's why you get absurdist takes like this.

To everyone reading: it is obvious that ChatGPT does not equalize intelligence to the point of 100%. That statement is obviously not saying that. Everyone knows this. You want proof?

Look at the Declaration of Independence... without getting too pedantic: "All Men are created equal" is not saying all males are 100% equal. Everyone knows this. First off, no one is 100% equal... and second, in a modern context the statement is obviously not referring only to men. It refers to both women and men, and clearly men and women are nowhere near equal.

So if you all know this about the Declaration of Independence... how can you not see the same nuance in "ChatGPT equalizes intelligence"? First ask yourself... do you think you're smart? If you do, then the self-delusion I just described is likely happening to you.


What? The post is literally titled "Amateur armed with ChatGPT solves an Erdős problem". Stop spreading FUD about unaffordability.

They used ChatGPT Pro to solve it. Over 50% of people in the world couldn't afford ChatGPT Pro ($200/mo) even if they spent more than half of their income on it. [1]

What was that about "spreading FUD about unaffordability"?

[1] https://ourworldindata.org/grapher/share-living-with-less-th...


They didn't buy ChatGPT Pro themselves. You could've done the same as the students in the article and gotten a free subscription if you were interested in this instead of trolling.

> You could've done the same

Please show me the steps to get a $200 subscription for free that works 100% of the time regardless of who you are. I'm listening.


ChatGPT flattened the difference between a top .0001-percentile mathematician and an amateur. This is the definition of making intelligence more available.

You are exaggerating the situation by essentially claiming that since some people can’t afford 200 dollars, ChatGPT is not democratising intelligence. It’s a strange claim, because according to you it only counts as affordable when the maximal number of people can afford it. It’s a bit childish.

Directionally it is democratising. Are more people able to afford higher level intelligence? Yes.


> ChatGPT flattened the difference between a top .0001-percentile mathematician and an amateur

It flattened the difference between a top epsilon percentile mathematician and an amateur with money. It didn't flatten the difference between an amateur with a little money and an amateur with a lot of money. It widened it. That's the part I'm scared about.

You are shrugging this off because it currently isn't that expensive. But we're talking about the massively subsidized price here, which is bound to get orders of magnitude higher when the bubble pops. Models are also likely to get much better. If it gets to a point where the only way to obtain exceptionally high intelligence is with an exceptionally high net worth and vice versa, how is that going to democratize anything?


This is the most pedantic argument ever.

"All men are created equal" is obviously not literally saying all humans are 100% equal. Just like how "ChatGPT equalizes intelligence" is not saying ChatGPT equalizes the intelligence of all humans to a level of 100%.

I'm not going to spell out what I meant by: "ChatGPT equalizes intelligence". You can likely figure it out for yourself, because the problem doesn't have anything to do with your reading comprehension. The problem is more akin to self delusion, you don't want to face reality so you interpret the statement from the most absurdist angle possible.

The admins at HN actually noticed this tendency among people and encoded it into the rules: "Please respond to the strongest plausible interpretation of what someone says, not a weaker one that's easier to criticize. Assume good faith."


It is not “absurdist” to call out a baseless claim that doesn’t take into account over half of humanity, a percentage that will grow even further once investor money inevitably runs out. If your response to that is to wave away more than 4 billion people, then you’re not even trying to look like you care about reality, you’re just trying to make yourself feel better with some made-up nonsense.

You seem to be under the misconception that you somehow “own” ChatGPT or are entitled to the insight it provides. You don’t and you aren’t. You are at the mercy of trillion-dollar private companies that owe you nothing. Their products’ intelligence is not your intelligence. Whatever profits you’re seeing from it, it’s currently losing them money. And when that changes, so will your image of them as benefactors of humanity who make intelligence available to all.


It is fucking absurdist and pedantic when I hear this drivel coming out of the mouth of a hypocrite. You’re already part of the privileged few. Every single thing that you do, from drinking clean water to writing your bullshit on the Internet, is the result of technology being distributed among the top percentage, exactly as you describe. And as a recipient of such benefits you should have the intelligence to see that even that much matters. Why don’t you rage against the assholes who are really making things unequal: Internet service providers and their astronomical fees, which don’t equalize the world enough for homeless people to have access to the internet. That’s society’s real problem according to your genius logic... so stop your tirade against AI, as there are bigger fish to fry.

> You seem to be under the misconception that you somehow “own” ChatGPT or are entitled to the insight it provides.

Right now, for the price of a new car, I can definitely get enough hardware to run a local LLM of ChatGPT-level quality at home. And this is just the status quo. The demand for this technology and the projected price improvements point to a future where you can run one for the price of a new computer. Wake up.

But who the fuck cares? The point is that AI is equalizing intelligence, and you’re just throwing in tangents and side branches to try to obscure the obvious general truth, which I will repeat: AI is fucking equalizing intelligence, and if you don’t agree, you’re absurd.


> 5x5 isn't enough to draw "e" properly if you also want lowercase letters to have less height than uppercase

It can be enough if you "cheat" and make use of the horizontal space. This is how I did it in my font:

   ##
  # #
  ##  #
   ###

They also lead the world in EV production on paper, but in practice a large portion of those numbers might be driven by government pressure, not actual demand [1].

I’d personally take this data with a big grain of Goodhart’s law.

[1]: https://www.bloomberg.com/features/2023-china-ev-graveyards/


I’m guessing this comment was intended for a different post or for someone else’s comment.

I don’t see the relevance; the discussion is about whether boilerplate text that occurs intermittently in the output, purely for the sake of linguistic correctness/sounding professional, is of any benefit. Chain of thought doesn’t look like that to begin with; it’s a contiguous block of text.


To boil it down: chain of thought isn’t really chain of thought; it’s just more generated tokens appended to the context. Those tokens participate in computations in subsequent forward passes that do things we don’t see or even understand. More LLM-generated context matters.


That is not how CoT works. It is all in context. All influenced by context. This is a common and significant misunderstanding of autoregressive models and I see it on HN a lot.
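
For what it's worth, the mechanical point both comments touch on can be sketched in a few lines: in an autoregressive model, "reasoning" tokens are nothing special — every generated token is appended to the context and conditions the next forward pass. The lookup-table "model" below is purely illustrative (a real LLM computes logits over the whole context), and all token names are invented:

```python
# Toy autoregressive loop: chain-of-thought tokens re-enter the context
# exactly like any other token.

def next_token(context):
    # Stand-in for a model's forward pass: a deterministic toy rule
    # keyed on the full context so far.
    rules = {
        (): "think:",
        ("think:",): "2+2",
        ("think:", "2+2"): "=4",
        ("think:", "2+2", "=4"): "answer:4",
    }
    return rules.get(tuple(context), "<eos>")

def generate(max_tokens=10):
    context = []
    while len(context) < max_tokens:
        tok = next_token(context)
        if tok == "<eos>":
            break
        context.append(tok)  # "reasoning" and "answer" tokens alike
    return context

print(generate())  # ['think:', '2+2', '=4', 'answer:4']
```

The "thinking" is just part of the sequence; what makes CoT useful is that those intermediate tokens condition everything generated after them.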


"I don't see the relevance" -- and casually dismiss years of research without even trying to read those papers.


Then what is it? I'm seeing 4x5 transform matrices in the code, looks 4D enough to me.


The best analogy I can think of (quite similar to this one) is that the internet is low Earth orbit and AI is the Kessler syndrome. We abandon the place not to hide ourselves, but because it is saturated with garbage, and anything you try to put up there will only result in even more garbage being generated, without any positive effect.

The ideal solution would be to remove the garbage, but right now we can't even detect it, let alone figure out a way to get rid of it. Besides, it's a zero sum game, why bother cleaning up when you can just effortlessly pump out more garbage in hopes that some of it will remain in orbit for long enough to benefit you.


I don't buy the analogy. The problem with Kessler syndrome is that low Earth orbit is physically crowded; you run into collisions. I don't care about the garbage. I don't care about the AI era. I've been writing code in Emacs for 20 years, I'll be writing code in Emacs in 20 years, and every open source project I contribute to still looks the same, because all these AI people, like the blockchain people, just make new stuff up in their own incestuous tupperware-salesman ecosystems.

I do pity the bug bounty people who rely on goodwill in their programs given that everything with a financial incentive is vulnerable. But otherwise the great thing about digital spaces is that there is, for practical purposes, unlimited space.

Every day there's another "how do you deal with the AI apocalypse" article. I don't deal with it; I just ignore it.


I think by "internet" they mean search engine results pages. If you restrict yourself to short, common queries and only look at the top 10 results on the page, then the space really is very limited. If all those top 10s for common queries start to get crowded out with AI slop, then people are going to start abandoning search.


Well, if you open-source anything these days and it does make it big, you can be prepared for a flood of low-effort slop PRs that you must either review for free or stop accepting external contributions altogether, making it effectively closed-source. You can't choose to ignore the garbage, it will collide with your stuff, unless your stuff is small enough to avoid collisions (in which case no one will see it).


Zero-contribution open source doesn't at all make it closed source.

It delivers on the value of open source, that anyone using your software is permitted to make and distribute their own changes.

SQLite is an example of a project that is open source but closed contribution.


Minor correction: SQLite is not closed to contributions. It just has an unusually high bar to accepting contributions. The project does not commonly accept pull requests from random passers-by on the internet. But SQLite does accept outside contributed code from time to time. Key gates include that paperwork is in place to verify that the contributed code is in the public domain and that the code meets certain high quality standards.


I was about to try to make this point: there have always been projects that attract more potential contributors than there are competent contributors.

And there have always been techniques for identifying quality contributions from new contributors.


Thank you for the correction, I should have said "not open contribution" rather than "closed contribution".


Maybe, but that's hardly comforting (and definitely not in the spirit of open source) if you're forced to make that decision, knowing it will hurt your project, because the alternative is getting DDoSed.


If by the spirit, you only mean the bazaar model, then yes. But it's in the original spirit of free software. GNU preferred to keep the development somewhat contained, even so many years ago.


> I've been writing code in Emacs for 20 years, I'll be writing code in Emacs in 20 years

Bold assumption. On what will you run Emacs if the average PC costs $12,000? Yes, even a Raspberry Pi. It's not called the war on general computing for nothing.

If you say the cloud, that will be cut up and reused by the next AI crawler.


AI will not be able to eat up all chip manufacturing capacity forever. At some point the market will be saturated and PCs will get affordable again.


And COPA didn't succeed at first, but try and try again and you get COPPA, and now age-verification laws.

I don't think we'll see affordable PCs in my lifetime. It didn't happen after the Bitcoin crash, and it didn't happen post-pandemic. The new price gets normalized and the cartels just agree not to make anything for PCs.

And if you get everyone on the cloud? Then you can control the Internet the same way you can control TV or the press.


> I don't think we'll see affordable PCs in my lifetime. It didn't happen after the Bitcoin crash, and it didn't happen post-pandemic. The new price gets normalized and the cartels just agree not to make anything for PCs.

What's your definition of affordable? What years were PCs affordable? By my reckoning PCs are affordable today. If you're not trying to run games they're downright cheap.

I'm not sure what issue you're referring to with bitcoin, but if you want to use bitcoin to buy something it's about as easy/awkward as it ever was.

Food prices went up 15-20% more than they would have with 2% inflation. If PC prices do anything similar, it's not a big deal in the long run.

Cartels just agree not to make anything for PCs? Why would that happen? The point of restricting supply to a market is to maximize profits, not to refuse forever and lose out. They wouldn't even want everything to be in the cloud, because a hundred rarely-idle cloud cores can replace a lot more than a hundred mostly-idle consumer cores, so they end up selling a lot less hardware.


> What's your definition of affordable? What years were PCs affordable?

That DIY entry PCs can be built for 400 USD or less. A budget PC should be able to browse the net and play a few games on the iGPU (so overall a 1TB SSD, some iGPU, and 16GB of RAM). Ideally on the current generation of RAM and processors.

> By my reckoning PCs are affordable today. If you're not trying to run games they're downright cheap.

By what reckoning? And not just games: 3D workloads, compilation. Hell, even browsing + some productivity eats 32GB of RAM as if it were nothing.

> I'm not sure what issue you're referring to with bitcoin

The first permanent jump in GPU prices. After Bitcoin, prices of high-end GPUs remained at $1000+.

> Cartels just agree not to make anything for PCs? Why would that happen?

For bigger profits. You can see most hardware manufacturers moving from selling to consumers to selling to governments, cloud, and data-centers.

Why not make anything for PCs? Because individuals can't compete with the coffers of large corporations and governments.

> The point of restricting supply to a market is to maximize profits, not to refuse forever and lose out.

You can maximize profit by leaving a market. In the same way, you can still sell SSDs but for much bigger margins to data centers and governments.

Say all but one/two manufacturers leave the consumer market. The monopoly/duopoly hikes up prices again and again until you have a few stragglers on 40k USD workstations, and everyone else is on an iOS-like platform.

Once you are in the walled-in-cloud-garden, computer is not your own, and you can be monitored perfectly. This is something most governments want and is essentially the endgame for war on general computing.


> That DIY entry PCs can be built for 400 USD or less. A budget PC should be able to browse the net and play a few games on the iGPU (so overall a 1TB SSD, some iGPU, and 16GB of RAM). Ideally on the current generation of RAM and processors.

Does it have to be DIY? Because a quick search says that if 16GB RAM and 512GB SSD is enough then you can get a Zen 2 machine for $300 and a Zen 3 machine for $370.

But man, $400 in 2026 money is a really tight threshold for "affordable". It means PCs were almost never affordable. If I go back to 2017, when that was equivalent to $300, I don't think I can put together a viable build with even 8GB of RAM and 250GB of SSD. I think that standard is too demanding.

> The first permanent jump in GPU prices. After Bitcoin, prices of high-end GPUs remained at $1000+.

Oh, that was generally other cryptocurrencies but okay I understand.

Nvidia has been overcharging, and they've basically increased prices by one tier: a 70-class card costs as much as an 80-class card used to.

But price per performance continues to improve. A 5050 beats a 1080 for half the price, before even factoring in inflation.

> For bigger profits. You can see most hardware manufacturers moving from selling to consumers to selling to governments, cloud, and data-centers.

> You can maximize profit by leaving a market. In the same way, you can still sell SSDs but for much bigger margins to data centers and governments.

That works when there's enough demand to buy all the chips. AI will stabilize one way or another, and then the remaining datacenter market doesn't need that many chips compared to the consumer market. Manufacturers will have extra supply, and not selling it to consumers would be stupid.

And even if they charged datacenter-level prices to consumers, people would still be able to get PCs. Even if the cheapest new CPU was $500, that's still nowhere near the options being "no PC" and "$40k workstation".

Plus people could buy old datacenter chips for pennies on the dollar.

> Once you are in the walled-in-cloud-garden, computer is not your own, and you can be monitored perfectly. This is something most governments want and is essentially the endgame for war on general computing.

Governments might want it, but that doesn't transfer to chip makers.


> Does it have to be DIY?

Yes. Because only DIY allows your computer to be repaired at will. Go laptop or prebuilt corporate and those get increasingly hard to fix. Not to mention that if the DIY market is healthy, the non-DIY market is even cheaper.

> But price per performance continues to improve. A 5050

If the 5050 didn't beat a 10-year-old graphics card, it would be an even greater waste of sand.

> Governments might want it, but that doesn't transfer to chip makers.

If governments want it, there is money to be made.

> Plus people could buy old datacenter chips for pennies on the dollar.

Sure, but no one will be able to afford all the other amenities. Buying a server CPU isn't the issue. It's buying every other part of the server rack. Namely the board, the cooler, the memory and the storage. And housing and power for it.


> If the 5050 didn't beat a 10-year-old graphics card, it would be an even greater waste of sand.

It beats the 10 year old high end. That's not necessary to avoid being a waste of sand.

But that's not the point. As long as you can keep getting better performance for less money, things are getting more affordable.

> Buying a server CPU isn't the issue. It's buying every other part of the server rack. Namely the board, the cooler, the memory and the storage. And housing and power for it.

Motherboards are looking at the smallest price hikes of all. Coolers are dirt cheap and a quality thermalright is less than $20. Housing for a server is about the same as a desktop and not changing. Half this list is nonsense.

Memory is going up a lot. But that's the one we started on. And you can get a reasonable amount for a couple hundred dollars, and acceptable storage for less than one hundred. Power isn't going crazy either.

And you didn't address how your threshold for "affordable" would exclude every year before about 2019. It's too strict.

> Because only DIY allows your computer to be repaired

Listen, if I can get a whole computer for $300 then I don't need repair. It's a real downside, but if the CPU and motherboard are soldered together and take each other out then it's like I doubled the risk they break within seven years. And after seven years I'd replace both anyway. So that's like a $50 penalty, not a disqualifier. And the mini PCs I was citing have detachable memory and storage.


> It beats the 10 year old high end. That's not necessary to avoid being a waste of sand.

I remember when GPUs didn't need to wait 10 years for the same chip maker's worst offering to beat the old top of the line.

> Motherboards are looking at the smallest price hikes of all.

For now.

And for the record, I bought a bargain-bin Xeon, only to realize later that the only motherboard that accepts it cost $1000. And I needed another Xeon chip. This was around 2020.

> And you didn't address how your threshold for "affordable" would exclude every year before about 2019. It's too strict

Honestly. It's the last time hardware prices were close to sane.

> Listen, if I can get a whole computer for $300 then I don't need repair.

If you are willing to bear externalities of e-waste. Fine.

Also replace them with what? You think industry will care about power users? Nah. They can eat cock. Everyone gets a tightly sealed mobile phone that LARPs to be a computer.


> I remember when GPUs didn't need to wait 10 years for the same chip maker's worst offering to beat the old top of the line.

It sucks, but it's nowhere near being the kind of barrier you're describing.

> Honestly. It's the last time hardware prices were close to sane.

What I mean is such a standard says that 1990-2018 was all unaffordable. 2019 was basically the only year that qualifies.

> If you are willing to bear externalities of e-waste. Fine.

In terms of e-waste, if you look at an unfixable mini computer with core parts that on average die 2 years sooner than a full-size computer, it causes less e-waste because it's so much smaller.

> Also replace them with what? You think industry will care about power users? Nah. They can eat cock. Everyone gets a tightly sealed mobile phone that LARPs to be a computer.

They can sell bigger chips that cost more money to power users, why would they not care?

But even if it was just phone chips, that would only set back the exponential speed increases by a few years. It wouldn't destroy the market. My brand new phone is way more powerful than my aging desktop. If I could let the desktop borrow its CPU I would do so instantly.


> What I mean is such a standard says that 1990-2018 was all unaffordable. 2019 was basically the only year that qualifies.

Nah, the period around the 1080 was honestly the golden age, before the RTX era ushered in the pest that is AI.

> In terms of e-waste, if you look at an unfixable mini computer with core parts that on average die 2 years sooner than a full-size computer, it causes less e-waste because it's so much smaller.

I don't think any particular component in a PC produces the same amount of waste as a whole Mac Mini. And replacing a component is part of the 3Rs (Reduce, Reuse, Recycle).

> They can sell bigger chips that cost more money to power users, why would they not care?

Because power users have less money than large corporations and governments. By selling you a GPU for $1000 that could be sold for $10,000, they are losing $9000 on each sale.

> But even if it was just phone chips

Not just phone chips; a locked-down, sanitized spy garden. It's very hard to do anything remotely creative on a phone, like programming or rendering. You want to use your phone as a desktop? That's not allowed. It's called the war on general computing, not the war on consumer computers.


True, but as they say, the market can remain irrational longer than we can remain solvent

We simply don't know how long this bubble will last


You can use old hardware.


Yeah, but it won't be cheap or easily repairable. And they don't make them anymore.


> It's not called the war on general computing for nothing.

Companies paying too much for hardware to chase a bubble is not "war on general computing".

> Even Raspberry Pi.

What's preventing supply from catching up with demand in this situation?

If high prices stick around long term, there will be so many chip fabs ready to pump out $100 pi-equivalents that still let them have a 200% markup.

Also I can go buy a quite good mini PC with 16GB of RAM for $300. In what world does that price go up another 40x?


I would suffocate it. Know the greedy snake idiom? A snake is so hungry and greedy that it suffocates on its prey?

Best you can do is to spread all of the goods it provides, as it is too greedy to not devour them itself. It will consume them and suffocate slowly.


I assume you mean LLM poisoning with bad web data? (https://news.ycombinator.com/item?id=47561819)


> It will consume them and suffocate slowly.

Can we accelerate it, perhaps? You know, spending ALL our resources on making the snake fatter is not a good idea. It's only a good idea when you have so many resources that you can easily suffocate the snake with a negligible (for us) amount. If you try to suffocate several million snakes, that might backfire a little.


This is why, when I'm researching a solution (one an LLM cannot figure out), I go to GitHub but often check whether the project was created before 2022, due to AI slop concerns.


This is interesting.

When I read it for the second time, trying to understand it - maybe an even better match for the low-orbit flying garbage would be "enshittification"? As time goes on, more and more garbage is produced, and we have no clear way, or specific motivated entity, to start removing it, so it just grows.


Enshittification specifically is when a product/service/platform gets worse from the user’s perspective because the platform vendor can directly profit from user-hostile design; for example, Google intentionally serves up bad results on the first search results page so the user clicks through to the second page of results, resulting in more advert revenue for Google[1].

…whereas I feel what you’re describing is another Tragedy-of-the-Commons.

[1]: https://jackyan.com/blog/2023/09/google-search-is-worse-by-d...


Enshittification is a hip tech-bro term for "rent-seeking" and is nothing new.


Rent-seeking is too general of a term. You can rent-seek just by raising prices.

Enshittification specifically means deliberately making the product worse as a rent-seeking strategy.


I've been encrypting my private git repos for a while because I had suspected they were going to do something like this.

https://github.com/flolu/git-gcrypt

It's very easy to set up and integrates nicely into git. Obviously it only works if you don't need Actions or anything else that requires GitHub to know what's in your repo (duh).
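
For reference, the standard git-remote-gcrypt flow (which the linked repo builds on) looks roughly like this; the remote name, host path, and key ID below are placeholders:

```shell
# Assumes git-remote-gcrypt is installed and you have a GPG key.
# Add an encrypted remote; the gcrypt:: URL prefix makes git encrypt
# everything pushed there.
git remote add cryptremote gcrypt::git@example.com:me/private-repo.git

# Restrict who can decrypt the repo (placeholder GPG key ID).
git config remote.cryptremote.gcrypt-participants "YOUR_GPG_KEY_ID"

# Pushes are now encrypted client-side; the host only sees ciphertext.
git push cryptremote master
```

The host still learns push timing and approximate repo size, but not file names or contents.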


It’s not how many times, it’s what you do about it. DRY doesn’t mean you have to make abstractions for everything. It means you don’t repeat yourself. That is, if two pieces of code are identical, chances are one of them shouldn’t exist. There are a lot of simple ways you might be able to address that, starting from the most obvious one, which is to just literally delete one of them. Abstraction should be about the last tool you reach for, but for most people it’s unfortunately the first.
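
A tiny sketch of the "just delete one of them" option described above (function names are invented for illustration):

```python
# Hypothetical example: two byte-identical helpers that crept into a codebase.
def normalize_email(addr):
    return addr.strip().lower()

def clean_email(addr):
    # Identical logic that grew up in another module.
    return addr.strip().lower()

# The simplest DRY fix is deletion, not a new abstraction layer: remove
# clean_email and point its call sites at the surviving function.
def handle_signup(raw_addr):
    return normalize_email(raw_addr)  # formerly called clean_email

print(handle_signup("  Foo@Example.COM "))  # foo@example.com
```

No new abstraction was needed; the duplication disappeared by updating call sites and deleting the redundant function.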


IME, even when an LLM is right, a few follow-up questions always lead to some baffling cracks in its reasoning that expose it has absolutely no idea what it's talking about. Not just about the subject but basic common sense. I definitely wouldn't call it the "same mental process" a human does. It is an alien intelligence, and exposing a human mind to it won't necessarily lead to the same (or better) outcome as learning from other humans would.

