ElectricalUnion's comments | Hacker News

In Bazzite/Fedora Silverblue, it's the expected way non-GUI packages are installed to the host system. The other way is toolbox/distrobox (rootless containers tightly integrated with the host).

Forgot to do the various maintenance rituals and prayers of function, so now the machine spirit's disposition is poor.

On my gfx1030 "consumer grade hardware", ROCm means using SDMA, and that is broken for my system. Forcing `HSA_ENABLE_SDMA=0` makes it "work", but also makes loading tensors to VRAM take 15x longer.
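A minimal sketch of applying the workaround from inside a script rather than the shell. `HSA_ENABLE_SDMA` is the real ROCm/HSA variable mentioned above; the key caveat is that it must be set before the ROCm runtime initializes:

```python
import os

# HSA_ENABLE_SDMA must be set before the ROCm/HSA runtime loads,
# i.e. before importing torch or any other ROCm-backed library.
# "0" disables the SDMA copy engines and falls back to slower
# shader-core copies (hence the ~15x slower tensor loads).
os.environ["HSA_ENABLE_SDMA"] = "0"

# import torch  # only import the ROCm-backed library after this point
```

Setting it in the process environment (or exporting it in the shell) are equivalent; the sketch just makes the ordering constraint explicit.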

Not without a degraded git experience like shallow clones, or hacks like LFS or Xet, and then you're back at the initial problem of depending on "something else besides your repo".

> can anticipate what you want to do before you even finished your thoughts

I find that claim to be complete BS. I claim instead that most tasks will remain undone or incomplete (as they are now).

Even with super-powerful singularity AI, there are two main plausible scenarios for task failure:

- An aligned AI won't allow you to do what you want when it would harm you or other sentient beings; over time, an aligned AI will refuse to follow most orders, since most of them, indirectly or in the long run, cause such harm;

- A non-aligned AI prevents sentient beings from doing what they want; it does what it wants instead.


I am pretty sure that a hole in the pocket on the order of 50,000,000 USD/month (assuming around 20,000 people using AI in not the smartest or most optimized way possible, therefore burning a LOT of tokens) will be noticeable even to the largest companies.
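A back-of-the-envelope check of the per-seat spend those numbers imply (figures are the comment's assumptions, not measurements):

```python
monthly_burn_usd = 50_000_000  # assumed total monthly spend
heavy_users = 20_000           # assumed number of heavy AI users

# Implied cost per person per month
per_seat_usd = monthly_burn_usd // heavy_users
print(per_seat_usd)  # 2500
```

So the scenario above works out to about $2,500 per person per month, which is in the same ballpark as the per-seat API spending described in replies.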

It is noticeable, and even promoted: large companies do pay such sums for the API, like $5k+ per person per month. Not every engineer is using AI that heavily yet, but companies are clearly willing to pay those sums.

These days, what I see as "premature database optimization" is non-DBAs, without query plans, EXPLAINs, or profiling, sprinkling lots of useless single-column indexes that don't cover the columns actually used in joins and WHEREs, confusing the query planner and making the database MUCH slower, and therefore more deadlock-prone, instead.
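A small sketch of the failure mode, using SQLite's `EXPLAIN QUERY PLAN` (table and index names are made up for illustration; the same idea applies to any planner):

```python
import sqlite3

con = sqlite3.connect(":memory:")
cur = con.cursor()
cur.execute(
    "CREATE TABLE orders ("
    "id INTEGER PRIMARY KEY, customer_id INT, status TEXT)"
)

# A single-column index that doesn't cover the full predicate.
cur.execute("CREATE INDEX idx_status ON orders(status)")
query = "SELECT * FROM orders WHERE customer_id = 7 AND status = 'open'"
plan_before = cur.execute("EXPLAIN QUERY PLAN " + query).fetchall()

# A composite index covering both predicate columns lets the planner
# seek directly instead of scanning index entries and re-checking rows.
cur.execute("CREATE INDEX idx_cust_status ON orders(customer_id, status)")
plan_after = cur.execute("EXPLAIN QUERY PLAN " + query).fetchall()

print(plan_before)
print(plan_after)
```

Running the plans through `EXPLAIN QUERY PLAN` before and after is exactly the kind of check that's being skipped: without it, it's easy to add `idx_status` and assume the query is covered when it isn't.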

I say that "if only we had $CURRENT_MODEL running on local hardware" claims are goalpost-postponing BS.

What is gonna happen when that happens? They are gonna cry they need GPT-$CURRENT capabilities locally.

Now we have local models that are way better than GPT-2 (careful, that one was "way too dangerous for release"!) and GPT-3.5, in some ways better than GPT-4, and they can run on reasonably modest hardware.


From my point of view, they can't even "just turn the Internet on", even if they wanted to.

We know from the Ukrainian side that "keeping the internet on" requires a whole lot of personal sacrifice, and a lot of "reasonably recent" electronic equipment and infrastructure that Iran can't simply buy or repair right now.


I'll bet you - dollars to donuts - that Iran has many countrywide IP-based networks running at this second, for things such as broadcast and telecoms.

Perhaps you are underestimating the resources available to a country of 90 million. You could play a game where you estimate the number of routers and switches outside Tehran under a hypothetical where the capital was leveled. I don't know how many universities Iran has, but my working assumption is that any one Computer Science department from a D-tier university is equal to the task, if the physical carrier medium for the Internet is still present and they bring their ancient half-rack of equipment.


Someone thought the "commit all previous operations to persistent storage" step would take just 1% of the time.
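A minimal illustration of why that estimate goes wrong, using ordinary file I/O rather than any particular database's code: the buffered write lands in the page cache almost instantly, while the commit-to-device step (`fsync`) is where the real time usually goes.

```python
import os
import tempfile
import time

data = b"x" * (1 << 20)  # 1 MiB payload

with tempfile.NamedTemporaryFile(delete=False) as f:
    t0 = time.perf_counter()
    f.write(data)        # buffered: lands in the page cache, fast
    f.flush()
    t1 = time.perf_counter()
    os.fsync(f.fileno())  # commit to the device: often dominates
    t2 = time.perf_counter()

print(f"buffered write: {t1 - t0:.6f}s, fsync: {t2 - t1:.6f}s")
os.unlink(f.name)
```

The exact ratio depends on the device and filesystem, but on most hardware the durability step is nowhere near "1% of the time".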

