It’s not easy to quantify because you’re basically substituting or augmenting labor. How do you quantify the ROI on an employee? You can look at the profit of a project they’re hired to execute. But with AI, the work is mixed with the employees’, so how do you distinguish the ROI of the two? With time we might be able to make comparisons, but outside of very specific scenarios it’s difficult to quantify.
Why is this written with the assumption that we have finite hardware production capacity? Industrial processes can scale up and new factories can come online. It will take a while, but the whole point of economics is that supply scales to meet demand. The shortage is a temporary, point-in-time condition.
And that’s not considering the software innovation that can happen in the meantime.
The economic hypothesis that has dominated the past hundred years is that economic growth is infinite because resources are infinite and (almost) free. We all know this is unrealistic and disconnected from our human condition.
Regarding "innovation", I agree with your idea. I even think the major innovation will be running models locally, on reduced infrastructure that will still be sufficient for the majority of use cases.
I like how GitLab does this, with an SSH server that implements only a few commands for creating PATs, so you can authenticate with your SSH keys and create a short-lived PAT in one command.
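For anyone who hasn't seen it, the flow looks roughly like this. This is a sketch based on GitLab's documented `personal_access_token` SSH command; the hostname, token name, scopes, and TTL below are placeholder examples, and the exact syntax may vary by GitLab version:

```shell
# Authenticate with your existing SSH key and mint a short-lived PAT
# in one command. Arguments: token name, comma-separated scopes, and
# an optional TTL in days. Hostname and values are placeholders.
ssh git@gitlab.example.com personal_access_token my-ci-token read_api,read_repository 7
```

The server prints the new token, so no browser round-trip through the web UI is needed.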
I promise you they’re claiming depreciation on that machine against their taxes every year. If anything, they’ll be upset you didn’t tell them sooner so they could have claimed more.
If you're a US employee being paid market wages, the cost of the MacBook is trivial compared to how much you cost the company, and to how much it costs them when you can't work. But some lower-level managers and employees don't seem to understand this.
“DO YOU HAVE ANY IDEA OF THE PRICE FOR THE PARTS AND LABOR TO REPLACE A SINGLE, GENUINE, APPLE-BRAND, 2021, MACBOOK PRO KEYCAP?!?! CALL THE ACCOUNTANTS, WE WON'T BE PAYING TAXES FOR A FEW YEARS!!!”