Hey, I notice this kind of thing all the time. People use "data" to tell the story they want to -- similar to how it seems humans make a decision subconsciously then weave a rational decision to back it up afterwards.
Do you have principles on how to tackle this? I feel stuck between the irrationality of anecdata and the irrationality of lying with numbers. As if the only useful statistic is one I collect and calculate myself. And, even then, I could be lying to myself.
Review the methodology, if you can, and form your own conclusions. Don't bother trying to change people's minds. It rarely works, and often causes conflict, even in the case of people who say they're data-driven.
> They also found that, if inflation adjusted, you could get, in most categories, the same or better quality for the same price.
I argue you must evaluate against median purchasing power; it accounts for inflation and (lack of) wage increases.
Comments from your linked video:
> The problem with the “adjusted for inflation” argument is that it does not factor in buying power. The increase in wages has risen at about half the rate of inflation, so sure; $20 in 1975 would be $124 today, but the minimum wage in 1975 was $2.10 an hour as opposed to $7.25 today, giving you half the buying power you had 50 years ago.
> healthcare, housing, and education ... have increased by an insane margin leaving people with less money once that has been paid for (if at all).
> It's even worse when you consider that people are paying 45-55% of their monthly income on a house that cost 20x more than it would have in 1975. Your buying power is fucked from all sides.
Purchasing power is probably a better metric in a vacuum, but it's hard to analyze accurately.
For example, the comment you're citing claims that because the minimum wage has increased only ~3x over a period in which inflation eroded the dollar's relative value ~6x, wages overall have risen at half the rate of inflation. But the minimum wage measures a /minimum/, while inflation measures /average/ price increase, so the two can't be compared 1:1 in this way.
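To make the mismatch concrete, here's a quick sketch using only the figures from the quoted comment (the $20 → $124 inflation factor and the two minimum-wage values):

```python
# Figures taken from the quoted comment, not independently sourced.
min_wage_1975 = 2.10   # USD/hour, federal minimum wage in 1975
min_wage_today = 7.25  # USD/hour, federal minimum wage now
cpi_factor = 124 / 20  # the comment's "$20 in 1975 = $124 today", ~6.2x

wage_growth = min_wage_today / min_wage_1975  # ~3.45x
print(f"minimum wage growth: {wage_growth:.2f}x")
print(f"price level growth:  {cpi_factor:.2f}x")

# The ratio below only describes the *minimum* wage, not wages overall;
# median and mean wages grew faster than the minimum, so it can't be read
# as "buying power halved" for the typical worker.
print(f"ratio: {wage_growth / cpi_factor:.2f}")
```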
The housing argument also seems odd. In New Zealand (where I'm from -- I'm not familiar with the US housing market, so the commenter could be right about that geo!) house prices have increased by far more than 20x since the 70s, but the houses available are of substantially higher quality due to improved regulations (e.g. all newer homes are subject to healthy homes rules, which mandate insulation), so just comparing inflation-adjusted home prices vs income doesn't tell the full story.
(Aside from that, a whole heap of items like food, electronics, transportation are all both far cheaper AND higher quality today than in the 70s)
“Higher quality” isn’t an objective measurement though. And it certainly doesn’t matter if the end result is that people cannot afford to buy it.
What I’d be interested to understand is whether changes to materials (be that buildings or home appliances) have caused an increase in the cost to manufacture.
I’d wager most things have gotten cheaper to produce these days, because the same improvements in technology that can be integrated into a product also apply to the technology used to reduce manufacturing costs. Plus, if wages are below inflation, then any labour costs would have declined (relatively speaking) in that time too.
Modern US houses are made of the cheapest, shittiest, flimsiest materials money can buy. I go out of my way not to live in US housing less than 50 years old.
This isn't true for median purchasing power. You're looking at the federal minimum wage, not the median. Only about 1% of hourly workers earn $7.25 or less.
Median earnings were $48,070 in 1975, measured in 2024 dollars, and $51,370 in 2024.
Median earnings in 1970 were closer to 56k in today's dollars. 1970-1980 was a recessionary period, followed by stagflation in the 80s. I hate when people use that time period as an anchor to show growth. It's like using 2009 as an anchor.
I didn't choose 1975. That's the year the parent comment claimed median earnings have dropped from in comparison, so that's the year I have to use to refute the claim.
In the chart, estimated median earnings for full-time male workers peaked in 1973 and weren't surpassed until the 2010s. It's hard to find directly comparable data for earlier decades, but estimates put wages significantly lower. If you anchored to the 1920s, 30s, 40s or 50s instead, you'd just show even more growth in median wage. If you're saying we shouldn't compare to the 70s or 80s either, then what's left? Just years after 1990?
What data are you using? It's hard to get solid numbers pre-1975. I looked at the SSA wage index, which has 1970 at $6,186. Adjusted using PCE, that's only $42,808 in present dollars.
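For reference, the adjustment in question is just a deflator ratio. A minimal sketch, where the ~6.92x cumulative PCE factor is an assumption back-derived from the figures above (real index values come from BEA tables):

```python
def real_dollars(nominal, index_base, index_target):
    """Convert a nominal amount to target-year dollars via a price index ratio."""
    return nominal * index_target / index_base

# SSA wage index figure cited above, plus an assumed cumulative PCE
# deflator factor of ~6.92 between 1970 and today (illustrative only).
nominal_1970 = 6186.0
pce_factor = 6.92
print(f"${real_dollars(nominal_1970, 1.0, pce_factor):,.0f}")  # ≈ $42,807
```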
In either case, IMO, ±10% over 60 years should just be considered flat. Calling it flat is probably generous considering how inflation has affected durable goods vs necessities. We can buy more appliances now, but places to put them have never been more expensive relative to income.
Where are you sourcing that data from? The graph I linked, which uses data from the U.S. Bureau of Labor Statistics, doesn't go back that far, so comparing to 1970 isn't possible.
That's household income. You need to adjust for the change in households with multiple earners. That's why I said the census data is dirty and conflates things. The number of households with both parents working increased from 46% to 52%, so median household income staying flat means median income for individuals went down pretty significantly.
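The adjustment being described can be sketched crudely. Everything below is illustrative: it assumes households have either one or two earners, uses the 46% → 52% dual-earner share from the comment, and holds median household income at a hypothetical flat value:

```python
# Crude model: a household has 1 earner, plus 1 more in dual-earner households.
def per_earner_income(household_income, dual_earner_share):
    avg_earners = 1 + dual_earner_share
    return household_income / avg_earners

flat_income = 80_000  # hypothetical flat median household income

then = per_earner_income(flat_income, 0.46)  # fewer earners per household
now = per_earner_income(flat_income, 0.52)   # more earners per household
print(f"change per earner: {now / then - 1:+.1%}")  # ≈ -3.9%
```

So under these assumptions, flat household income conceals a roughly 4% decline per individual earner.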
I'm really frustrated by inflation numbers because there doesn't seem to be a metric that makes sense.
CPI ignores the reality people feel (and swaps in cheaper items that aren't necessarily on par with the original to keep the number lower), and gold isn't really 1:1 with purchasing power ... there must be some sort of useful composite metric that merges multiple data points over time, like rental/house prices, the CPI market basket, and the dollar vs hard assets like gold, to come up with a more accurate number.
The CPI doesn’t arbitrarily “swap in” items. It changes based on consumer behavior. That’s why it now tracks streaming services but not VCRs. Similarly, if the price of Gala apples triples and everyone switches to Fuji, a fixed index would overstate the actual cost of living.
Insofar as gold impacts the cost of things people buy, it’s already included. Adding it directly to the CPI makes no more sense than adding Bitcoin or soybean futures.
The cost of housing is already a massive component of the CPI.
But if you used to be able to afford steak and now all you can afford is ground turkey, readjusting the basket of goods for that shift in "preference" is just hiding the fact that nobody can afford steak anymore.
And similarly, the hedonic adjustments to smartphones sort of implicitly claim that the $100 cheap smartphone you can buy today would have been worth $8000 back in 2009 because of how much better processors and memory have gotten. But you can't buy an iPhone 1.0 for $1 to satisfy the need for a phone you can install apps onto (plus the upgrade cost every few years, once cheap phones can no longer run an OS version that your banking app requires).
The assertion that the CPI simultaneously overlooks downward product substitution and prices in product improvements in order to paint an overly-rosy picture is belied by the fact that most stuff is cheaper than it’s ever been.
Thirty years ago, internet service was $2.95/hour (in 1996 dollars!), long-distance phone calls were 10 cents/minute, and a low-res 28” color TV with 5 channels cost a fortune.
I don't care about internet service, long distance phone calls, or TVs. I care about shelter, groceries, healthcare, and education. I can forego the former, I must buy the latter.
> a low-res 28” color TV with 5 channels cost a fortune.
Uh, back in 2000 (okay, not quite 30 years ago, but getting close) I had a 36" Sony Wega, which cost around $1500 with DirecTV and hundreds of channels. A 25" to 27" TV was more of a 1988 kind of thing (which is almost 40 years ago now). Being limited to 5 OTA channels was more of a 1980 thing.
But again you can't really buy that Sony Wega anymore, even though the CPI probably prices it at $20 these days.
Back 50 years ago, average household spend on the Internet was also $0, so it was very cheap; we weirdly didn't spend anything on it when I was growing up. Now I spend $80/month on it and have trouble finding anything cheaper around here.
If you want to consider "communications spend": back in 1988, you might have spent $50/month on your landline, cable TV and newspaper subscription. Today households tend to spend $280/month on internet, wireless and streaming/cable services. That is actually double the CPI. They get lots more for that, but the cost of being an average middle class household has grown at double the CPI. And these days you need the internet in order to keep up with the Joneses; it isn't really a choice.
US households faced much higher costs than you recall. According to the BLS, mean monthly expenditures were $44.75 on landline service, $13.50 on cable TV, and $12.33 on newspapers. That’s $70.58 in 1988 USD, or ~$193 in 2026 USD.
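Spelled out, using the itemized BLS figures above and an assumed cumulative CPI multiplier of ~2.74x from the late 1980s to present-day dollars (the exact factor depends on the endpoint year):

```python
# Itemized late-1980s mean monthly expenditures cited above (BLS).
landline, cable, newspapers = 44.75, 13.50, 12.33
bundle = landline + cable + newspapers  # nominal monthly total

cpi_factor = 2.735  # assumed cumulative CPI multiplier to today (illustrative)
print(f"nominal:  ${bundle:.2f}")
print(f"adjusted: ${bundle * cpi_factor:.0f}")  # ≈ $193
```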
More households subscribe to services today, which inflates today's "average expenditures" relative to the 1988 figures cited above: 93% of 1988 households had a landline, 53% cable TV, and 63% a newspaper. Compare with today's household services penetration: 98% mobile phone, 94% broadband, and 74% streaming media.
You’re right that this is less than the cost of internet + cell + streaming services today — these are ~50% higher than the 1988 bundle — but consider the differences: you can access almost any kind of content from almost anywhere. And you can consume it on a smartphone or TV that costs 75% less in real terms than that TV from 2000.
Meanwhile, real median household income grew from ~$65,130 in 1988 to $83,730 in 2024 — and furthermore, the tax burden on the middle class fell during this period.
You're only going to hear from people who think that the CPI underestimates inflation. If the CPI overestimates inflation for any given individual, they have no reason to comment on it.
> The problem with the “adjusted for inflation” argument is that it does not factor in buying power. The increase in wages has risen at about half the rate of inflation, so sure; $20 in 1975 would be $124 today, but the minimum wage in 1975 was $2.10 an hour as opposed to $7.25 today, giving you half the buying power you had 50 years ago.
Now do the same analysis but using median wage not minimum. YouTube comments are for entertainment purposes only.
The parent poster is saying (and I agree) that AirPods and AirTags are only superior because Apple anti-competitively privileges their integration with iPhones. It's not that the hardware itself is better.
And since iPhones form the largest single-company device network in rich countries, that is a pretty big advantage.
Who do you follow for news on the Ukraine-Russia war? I used to follow combat footage to see what was going on, but I had to stop after seeing too many videos of minefields and drones bombing people.
This is considered the most "pro-Russian" sub-Reddit, so it balances the Anglosphere deluge of pro-Ukrainian material pretty well. The most important poster is u/HeyHeyHayden, whose content is so important another user built a site to archive it: https://old.reddit.com/r/UkraineRussiaReport/comments/1pfjpx...
Hayden mostly compiles battlefield progress from Suriyak Maps (reputable Russian mapper) and AMK_Mapping (reputable pro-Ukrainian mapper). I think you can find both of them on Twitter.
On YouTube, I recommend WillyOAM (Australian infantryman turned journalist), MarkTakacs (Hungarian infantry officer who makes tactical analysis vids), Daniel Davis Deep Dive (retired US Army Lt. Colonel, Desert Storm vet), and HistoryLegends (meme-heavy but generally well-researched battlefield progress vids).
Can anyone point me to a good report of the current working status and known drawbacks of Asahi on Apple Silicon? Would there ever be a reason to run it on a Mac Mini or Apple desktop device? Or at that point would you just get a Linux box?
I’ve managed to get NixOS running on an 8 GB MacBook Air, which took a bit of tweaking, but the Asahi installer sets everything up so that you can boot and install NixOS.
More or less. Due to the number of unusual requirements for installing on Apple hardware (such as being kicked off from macOS, to name the tip of the iceberg), the Asahi installer gets used for most (all?) distros running on Apple Silicon. https://asahilinux.org/docs/alt/policy/#installation-procedu...
edit: The minimal UEFI part of the Asahi installer specifically sets up a “normal” environment that other distros (like Nix) can use; it doesn’t actually install a full distro like Asahi Fedora
Apple makes great hardware (even more so now with their own CPUs), but I've steered clear of it simply because I run Linux. Last I checked, the GPU wasn't fully supported, and there were also concerns about efficiency: power draw is generally higher than on macOS, so the same hardware on Linux doesn't have the same benefit as with macOS.
Asahi includes a shell script that you run from macOS before installation to properly partition the storage (it’s quite involved). I guess, GP ran the script and then just booted from Nix ISO and installed to the new partition.
Considering how far behind new hardware releases they are, I'd imagine the most appealing use case is squeezing some more life out of outdated hardware that struggles to run the latest Apple software. But that's kind of the sweet spot for a Linux desktop anyway, isn't it?
Does an M3 struggle to run the latest Apple software? I'm running an M2 Pro as my daily driver, and I doubt this thing will need replacing this side of ~5 years
No. But that laptop could easily last ten+ years. If they're just starting to get it working now I doubt the experience is going to have all the kinks worked out for a while anyhow.
Same with my M1. I haven’t noticed anything struggling, even with tons of expensive apps running. Tahoe slowed it down to shit (and I’m not just talking about electron-gate), but Tahoe slowed everyone down to shit.
Local models are slowish, I guess, but that’s pretty niche and they’re still usable. Nothing else is even noticeably laggy at all compared to my partner’s M4.