> That's the reason to ask for the source: so you can judge whether it's reliable.
So the solution to checking whether an article is reliable is to check whether its sources are reliable? How far back do you go? Or do you disregard immediately any article that does not cite only sources you already trust?
You have one thing on your mind - "social media bad" - and it is poisoning your ability to see the complexity in the world. There is rarely a single cause for anything, and there is never a single cause for everything.
"[G]lobal" is doing a lot of work in this sentence if I'm reading it as intended; this seems to exclude international conflict and intra-national strife (which are very big issues).
I was in high school in the early 2010s. In 2010 I went to a school that gave all students iPod Touches, which seemed futuristic at the time. By 2012 phones weren't banned from school, but a teacher would still take yours if you were blatantly using it during class.
> Longer term, he was also quite optimistic on its ability to cut out roles like radiologists, instead having a software program interpret the images and write a report to send to a consultant.
As a medical imaging tech, I think this is a terrible idea. At least for the test I perform, a lot of redundancy and double-checking is necessary because results can easily be misleading without a diligent tech or critical thinking on the part of the reading physician. For instance, imaging at slightly the wrong angle can make a normal image look like pathology, or vice versa.
Maybe other tests are simpler than mine, but I doubt it. If you've ever asked an AI a question about your field of expertise and been amazed at the nonsense it spouts, why would you trust it to read your medical tests?
> Since the consultant already checks the report against any images, the AI being more sensitive to potential issues is a positive thing: giving him the power to discard erroneous results rather than potentially miss something more malign.
Unless they had the exact same schooling as the radiologist, I wouldn't trust the consultant to interpret my test, even if paired with an AI. There's a reason this is a whole specialized field -- because it's not as simple as interpreting an EKG.
I would wager that's how it goes for most people that are both good artists and good programmers -- they were artists first, then learned to program. It takes a lot longer to become a reasonably good artist than it does to become a reasonably good programmer. I suspect that might be why the article opens the way it does.
> "It takes a lot longer to become a reasonably good artist than it does to become a reasonably good programmer."
Such overgeneralizations are not helpful. People gravitate more strongly towards certain creative disciplines, or a selection of them; how long exactly it takes to develop "reasonable" skills depends on a litany of factors, some of which cannot be controlled (e. g. force majeure). Both programming and pixel art require unwavering commitment and exercise; there is no way to "wing it" if you are intellectually honest and take your craft seriously.
I think it is helpful for certain purposes, and I think you'll be hard pressed to find exceptions to the general rule.
Art is all about repetition. Even if you've done it successfully many times, you still need to keep doing it until it's second nature.
Programming is more like solving puzzles. Once you've solved it once, you can pull the solution out of your head as many times as you need, as long as you still remember it.
With art, it doesn't matter if you remember how to do it, it still takes practice to get reproducible results. Of course it takes longer.
> "Art is all about repetition. [...] Programming is more like solving puzzles. Once you've solved it once, you can pull the solution out of your head as many times as you need, as long as you still remember it. With art, it doesn't matter if you remember how to do it, it still takes practice to get reproducible results. Of course it takes longer."
First and foremost, contrary to you it seems, I see art as a measure of quality, not as a simple descriptor of manifestations of human personal, and therefore cultural, expression (albeit using a, naturally technically imprecise, colloquialism such as "pixel art" to describe a school of aesthetics, or style). See also: The Art of Programming. Et cetera.
And furthermore, I see both disciplines as fields which humans engage in to solve specific identified problems, rationally or intuitively; in both it takes practice to get reproducible results, in both you need to keep doing it until it becomes "second nature". This refers to the process itself, the process to hone one's craft.
I don't understand what you mean by this. Do you mean to say the worth of an artwork for you is tied to how well it executes technique? "Art" is a word so nebulous that it's hard to pin down a definition, but I think the millions of people that prefer a punk rock song over an academic figure drawing study would disagree with this.
>And furthermore, I see both disciplines as fields which humans engage in to solve specific identified problems
Well, I'm both an artist and a programmer, and I can tell you I engage in neither to solve problems. I do both because the process of doing them is enjoyable. If they stop being fun, I'll stop doing them, and there wouldn't be any lingering problem in my life to go unsolved.
If you say you picked up art faster than programming I'll believe you, because I only meant it as a general observation.
Art is like playing Dark Souls -- maybe you beat the hardest boss once, but that doesn't mean you won't die ten more times before beating them again.
Programming is like Zelda. Once you know the solutions to the puzzles, you're basically going through the motions.
This isn't me guessing based on philosophy -- this is my lived experience as both an artist and a programmer.
> "I don't understand what you mean by this. Do you mean to say the worth of an artwork for you is tied to how well it executes technque?"
Art, to me, is a marker of excellence in the already mentioned confines. Technique is just a part of it.
> "Art" is a word so nebulous that it's hard to pin down a definition, [...]"
On that we agree; hence me informing you about mine, otherwise we just run circles around each other.
> "[...] but I think the millions of people that prefer a punk rock song over an academic figure drawing study would disagree with this."
As you probably can deduce by now, I see both examples as having the potential of being art. The rest of your rather labored example is an appeal to preference based on form or expression; such a thing is neither static (e. g. it can change with one's moods, and so on) nor does it have to be a false dichotomy (i. e. I can enjoy both manifestations, even at the same time, and, more importantly, recognize both as artful). But this is also all very basic stuff and in itself tedious, and, especially for the reason you stated, also often useless to engage in online.
> Art is like playing Dark Souls -- maybe you beat the hardest boss once, but that doesn't mean you won't die ten more times before beating them again. Programming is like Zelda. Once you know the solutions to the puzzles, you're basically going through the motions.
Such comparisons, as relatable as they might sound to someone who is familiar with these titles, are often useless as well (I am aware of these games and their game mechanics, but have never played them nor care to do so).
Furthermore, for the reasons outlined in the posts you responded to, they're a misfire anyway: art, to me, is first and foremost about the result, not the way towards the result (as long as certain conditions have been met), and life itself is much more complicated... with significant implications for the process of making art and the development of an artist in one or more disciplines.
> As you probably can deduce by now, I see both examples as having the potential of being art
I suppose it was a mistake to get distracted by trying to find out what exactly you're trying to say -- it's now completely clear that it has nothing to do with whether art or programming takes longer to gain proficiency.
>labored [...] tedious
Saying my points are long-winded or redundant also doesn't support your point. You're doing a lot of philosophizing about what art is or whether my points are "useless," but you still haven't reasoned about why it's not true that art takes longer to learn than programming. Which is rich since you've spent more words on this matter than me.
>Such comparisons, as relatable as they might sound to someone who is familiar with these titles, are often useless as well (I am aware of these games and their game mechanics, but have never played them nor care to do so).
So, you haven't played the games, therefore you have no insight into the analogy, so you're not really in a position to say whether the comparison is useless.
You've also used the word "useless" a handful of times here, all without any follow-up as to why exactly. What "use" are you referring to here?
In the context of a programmer wanting to know how learning to draw compares to learning to program (something I've only been asked once, but even once is enough to prove it's useful), to say "expect drawing proficiency to take longer, because it requires more repetition" is useful.
Once again, this isn't deduction or hypothesis. It's my own experience with both crafts.
> "[...] it's now completely clear that it has nothing to do with whether art or programming takes longer to gain proficiency."
I just replied directly to your comment, as I usually do in discussions. Besides, your point of contention, i. e. what takes longer to gain proficiency in (whatever you define as art or the act of programming), has already been addressed multiple times.
> "So, you haven't played the games, therefore you have no insight into the analogy, so you're not really in a position to say whether the comparison is useless."
You misunderstood. The comment was not about me but about the general value of such comparisons. True, I haven't played the games, but I have seen them being played countless times, have some material in which they come up (reference books, art books, magazines, documentation, etc.), and can therefore make sense of your analogy. In the end it's useless mostly for entirely different reasons, though; reasons I have already explained as well.
> "You've also used the word 'useless' a handful of times here, all without any follow-up as to why exactly. What 'use' are you referring to here?"
These discussions are often cumbersome, as one has to find common, agreed-upon language in the first place. And more often than not such online discussions don't lead to deeper insights (e. g. performative measuring of who "spent more words" is not of relevance to me). That has at least been my experience. Don't take it personally.
> "In the context of a programmer wanting to know how learning to draw compares to learnong to program (something I've only been asked once, but even once is enough to prove it's useful), to say "expect drawing proficiency to take longer, because it requires more repition" is useful."
That's, as you've stated, an anecdotal hypothesis based on your life's experience. To me, programming, writing, making music, painting pictures, etc. require creativity, rigorous exercise, repetition, and so on. What discipline was, is, or will be the easier or easiest way for you to get to whatever your goal is I cannot know for this depends on way too many factors, many of them, to top it off, outside of any parasocial (online) prism.
> Such overgeneralizations are not helpful. People gravitate stronger towards certain creative disciplines, or a selection of them; how long it exactly takes to develop-out "reasonable" skills is dependent on a litany of factors, some of which cannot be controlled (e. g. force majeure). Both programming and pixel art requires unwavering commitment and exercise´; there is no way to "wing it" if you are intellectually honest and take your craft seriously.
> And furthermore, I see both disciplines as fields which humans engage in to solve specific identified problems, rationally or intuitively; in both it takes practice to get reproducible results, in both you need to keep doing it until it becomes "second nature". This refers to the process itself, the process to hone one's craft.
These are all the words you've said so far that address whether art takes longer to learn than programming. Your points boil down to
1) People have different strengths and weaknesses
2) Both require practice
But neither of these contradicts the statement "art generally takes longer to learn than programming."
> In the end it's useless mostly for entirely different reasons, though; reasons I have already explained as well.
Here are all the words you've spent explaining why the observation is useless:
Oh... actually nothing. This whole discussion started when you said
> Such overgeneralizations are not helpful
But they've already been helpful to me before, and to no fewer than one other person. Even if it's not much, "useless" is untrue. I said "this is what I've found to be true, and observed in others like me," and you said "this is not a useful observation." You never said why, you just jumped straight to "I already addressed that."
> "But neither of these contradicts the statement "art generally takes longer to learn than programming."
Man alive, I've already explained this multiple times, and you misread each and every time. You even postulate programming as something outside of art; a statement I have fundamentally disagreed with. You're fighting strawmen, and we therefore run rings around each other.
> "Oh... actually nothing. This whole discussion started when you said [...]"
The discussion started when I objected to your statement that "it takes a lot longer to become a reasonably good artist than it does to become a reasonably good programmer".
To me it's nothing but an imprecisely articulated, sweeping generalization constructed around the anecdotal "evidence" that's your life (with an unknown sample size of people you've met or read about that might agree with you to some extent). In other words it's nothing but tedious fallacies, a thing oft observed in such discussions.
It's also a massive red flag; I at least would never be so presumptuous and arrogant as to make myself the yardstick and declare, cocksure, that one discipline will take longer than the other for some reader completely unknown to me. I know many a great artist who paints and/or writes but could not program their way out of a wet paper bag (they're practically computer illiterate and have absolutely no ambitions or time to change that), let alone reach the same heights there as in their chosen medium of expression. And vice versa. So what's useful to you, and what might be useful to me, is not automatically applicable to others, and therefore it's useless to generalize, at least without any hard data to back it up (and even then the addressed party might be an outlier).
If one wants to find out which form(s) of expression is/are best suited for oneself, one needs to spread one's wings and take to said form(s). How long that will take no one can say for sure; therefore, if one gravitates to more than one form, no one can make reasonably accurate predictions about which will take longer either. Especially not without knowing at least a modicum of relevant information about the individual any advice is supposed to enrich in the first place.
Hence, when addressing a general audience, better to concentrate on giving detailed and sound advice on how to get better, or speak to useful mitigation strategies/life hacks, as opposed to shallow and often inapplicable generalizations about the future. In German there's a terminus technicus for such silliness: Glaskugelei (crystal-ball gazing).
> Man alive, I've already explained this multiple times, and you misread each and every time.
Actually, what happened was you explained once, I rebutted, and your reply is now "I already explained."
>You even postulate programming as something outside of art; a statement I have fundamentally disagreed with.
Is this seriously a point of confusion for you? So I need to spell out I meant "drawing and painting" because you aren't able to extrapolate from context?
>To me it's nothing but an imprecisely articulated, sweeping generalization constructed around the anectodal "evidence" that's your life (with an unknown sample size of people you've met or read about that might agree with you to some extent). In other words it's nothing but tedious fallacies, a thing oft observed in such discussions.
Observed experience and testimony from others with similar experience isn't fallacy -- it's valid evidence. You are choosing to ignore it because... actually, I don't know why my thesis is apparently so offensive to you.
> It's also a massive red flag; I at least would never be so presumptious and arrogant to make myself the yardstick and declare cocksure that one discipline will take longer than the other for some to me completely unknown reader.
Not myself -- please show me the place where I said my own experience is my only evidence.
> I know many a great artist who paints and/or writes but could not program their way out of a wet paper bag, let alone reach the same heights there as in their chosen medium of expression.
Sounds like you know artists that didn't have a reason to take the time to learn to program. That doesn't mean that time would be longer than it took to learn to draw and paint.
Up until this sentence I assumed I was talking to another person who does both art and programming. The fact that you have something to say about people you know but nothing to say about your own experiences suggests to me you're probably not an artist. Which means you're just running your mouth about something you have no experience with.
> So what's useful to you, and what might be useful to me
Oh, I realize now you're just new to internet forums, so I should probably explain that not every individual comment needs to have direct relevance to whatever your exact current pursuits happen to be to be a worthwhile contribution to a discussion.
> If one wants to find out which form(s) of expression is/are best suited for oneself, one needs to spread the wings and take to said form(s). How long that will take no one can say for sure
Sure.
> How long that will take no one can say for sure; therefore what takes longer if one gravitates to more than one form, no one can can make reasonably accurate predictions about either. Especially not without knowing at least a modicum of relevant information about the individual any advice is supposed to enrich in the first place.
That's where you're wrong. There are a lot of people that are qualified to estimate the general amount of time it may take to learn a skill to a certain degree. You're right that no one can tell the exact amount of time, but once again, show me where I claimed to know the exact amount of time it takes anyone to learn anything.
There are art educators that have spent decades teaching how to draw and paint. If you've seen literally hundreds or thousands of students over the course of decades, you know how long it takes to learn your craft. And some of these educators have shared their knowledge with us. For instance, Jeff Watts of the Watts Atelier has spoken about how long an artist needs to train before their skills are at a level where they can start to assist in teaching*, which is about ten years to be a "decent teacher."
Ten years of full time study to learn, according to a master who has been teaching for over 35 years. Are you going to lie and tell me it takes that long to get a job as a programmer? I can name more programmers than I can count on my fingers that got a job straight out of a four-year or two-year program. I've never met or heard of an artist that got a full time professional job with less than ten years of study.
> Hence, when addressing a general audience, better concentrate on giving detailed and sound advice on how to get better as opposed to shallow and often inapplicable generalizations.
Are you seriously suggesting a bunch of unsolicited technique advice would've been an appropriate response in a conversation about why the author of the article suggested programmers don't have a reputation for making good artists? And just in case it causes you further confusion -- the author clearly meant "draftsmen" when they said "artists."
Ach, fuck it, one more, for it got personal. A tad bit out of order:
> "Is this seriously a point of confusion for you?"
Another strawman; it's about art as opposed to programming which I objected to, not about some confabulation of art as "drawing and painting".
> "Up until this sentence I assumed I was talking to another person who does both art and programming. [...] Which means you're just running your mouth about something you have no experience with."
I am interested in and develop my skills in both disciplines; I don't claim to be even close to a master in both. So keep such speculations about my life to yourself.
> "Ten years of full time study to learn, according to a master who has been teaching for over 35 years. Are you going to lie and tell me it takes that long to get a job as a programmer?"
You started out with the imprecise statement "reasonably good". That already begged the question of what you fucking mean by that. Only now, after much back-and-forth, you roll in with Jeff Watts, who talks in his vidya, after being prompted to describe what he considers the fucking teaching elite of his field and what it took to get there, with some extrapolations based on experience. Not exactly an optimum comparison to some "reasonably good" Coder Johnny in whatever particular (set of) coding language(s) you were sadly only dreaming about in these moments, but they are all the same anyway, amirite? ;)
And the essence worth taking home from Watts? He doesn't, and I paraphrase, "try to put his students in a box" when gauging the way ahead of 'em. In other words: "It depends". Yeah, it fucking does, lol. Any educator worth their salt knows that.
> "I can name more programmers than I can count on my fingers that got a job straight out of a four-year or two-year program. I've never met or heard of an artist that got a full time professional job with less than ten years of study."
Good for you. I on the other hand have met many artists that got pro jobs after a four-year program at a university. Of course, like the programmers, almost every one of them [1] had already honed their skills (depending on talent and life circumstances, even for a long time) before they enrolled in art (or compsci) courses. That obviously still leaves one to define whether these people are just "reasonably good" or are peers to "the (teaching) elite" at that point, let alone taking into account outliers such as (child) prodigies or late bloomers.
> "Oh, I realize now you're just new to internet forums, [...]"
No. I only realized too late that you clearly never made it beyond reiterating tedious logical fallacies in this discussion. You can do better.
1. Only one notable outlier: I know two cutters/editors (one now a successful TV film director) that got jobs straight out of a two- or three-year film school who never did anything even remotely close to their chosen field before.
After classical art training, I thought pixel art would be fast and easy -- the low resolution would disguise any mistakes.
Quite the opposite. The fewer pixels, the more each one has to be perfectly in place. Honestly should've been obvious in hindsight. If I have any games left in me after my current one's finished, I'll just use as high a resolution as I'm comfortable with.
Unless the sprites are truly tiny, like 16x16 with 2- or 3-frame animations, I don't know if pixel art makes a good shortcut to an aesthetically appealing game. Then again, it might be easier than six years of everyday practice.
More than a dozen artists I've talked to told me pixel art is entirely its own discipline - they're no more comfortable approaching it than a layman would be.
The traditional workflow of creating a rough sketch on paper or tablet then progressively refining it just entirely doesn't apply.
> "The traditional workflow of creating a rough sketch on paper or tablet then progressively refining it just entirely doesn't apply."
For many a pixel artist that is a typical workflow, especially when working from reference, e. g. by retracing/"converting", say, an architectural period piece such as a street view to be used in a period- and location-accurate adventure game. In other words a classic line-to-pixel A/D conversion.
I've seen at least one indie game (Ta*dQuest) use Midjourney to create pixel art sprites for some NPCs that appear in the dungeon. Extra art, like portraits for those NPCs, was drawn by hand to complement the sprites after they were generated, so it all feels deliberate. I would have never guessed.
If I were starting a new project, would it be unwise to just use OpenGL? It's what I'm used to, but people seem to talk about it as if it's deprecated or something.
I know it is on Apple, but let's just assume I don't care about Apple specifically.
OpenGL is fine; it has the same issues now it had before, but none of it really comes from "old age" or being deprecated in any way. It's less debuggable and much harder to get good performance out of than the lower-level APIs, but beyond that it's still great.
Honestly, starting out with OpenGL and moving to DX12 (which gets translated to Vulkan on Linux very reliably) is not a bad plan overall; DX12 is IMO a nicer and better API than Vulkan while still retaining the qualities that make it an appropriate one once you actually want control.
Edit:
I would like to say that I really think one ought to use DSA (Direct State Access) and generally as modern an OpenGL style as one can, though. It's easy to get bamboozled into using older APIs because a lot of tutorials will do so, but you need to translate those things into modern OpenGL instead; trust me, it's worth it.
Actual modern OpenGL is not as overtly about global state as the older API so at the very least you're removing large clusters of bugs by using DSA.
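To illustrate the point about global state, here is a minimal sketch contrasting the classic bind-to-edit style with DSA, assuming a GL 4.5 context and a function loader (e.g. glad) are already set up; the calls are standard GL, but the fragment is not runnable on its own:

```c
#include <glad/glad.h>  /* or any GL 4.5 loader */

/* Classic style: creating and filling a buffer mutates the global
   GL_ARRAY_BUFFER binding as a side effect, which can silently break
   whatever was bound before. */
GLuint make_vbo_old(const float *verts, GLsizeiptr size) {
    GLuint buf;
    glGenBuffers(1, &buf);
    glBindBuffer(GL_ARRAY_BUFFER, buf);  /* clobbers current binding */
    glBufferData(GL_ARRAY_BUFFER, size, verts, GL_STATIC_DRAW);
    return buf;
}

/* DSA style (GL 4.5): the object is addressed directly by name, so no
   global bind point is touched and no unrelated state is disturbed. */
GLuint make_vbo_dsa(const float *verts, GLsizeiptr size) {
    GLuint buf;
    glCreateBuffers(1, &buf);
    glNamedBufferStorage(buf, size, verts, 0);  /* immutable storage */
    return buf;
}
```

The DSA version is what removes the "large clusters of bugs" mentioned above: since nothing is bound as a side effect, code elsewhere can't accidentally depend on or corrupt the current binding.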
I've found it has fewer idiosyncrasies, is slightly less tedious in general, and provides a lot of the same control, so I don't really see much of an upside to using Vulkan. I don't love the stupid OO-ness of DX12, but I haven't found it to have much of an adverse effect on performance, so I've just accepted it.
On top of that you can just use a much better shading language (HLSL) with DX12 by default without jumping through hoops. I did set up HLSL usage in Vulkan as well, but I'm not in love with the idea of having to add decorators everywhere and using a (sort of) second-class-citizen language to do things. The mapping from HLSL to Vulkan was also good enough, but still just a mapping; it didn't always feel super straightforward.
(Edit: To spell it out properly, I initially used GLSL because I'm used to it from OpenGL and had previously written some Vulkan shaders, but the reason I didn't end up using GLSL is because it's just very, very bad in comparison to HLSL. I would maybe use some other language if everything else didn't seem so overwrought.)
I don't hate Vulkan, mind you; I just wouldn't recommend it over DX12, and I certainly prefer using DX12. In the interest of having less translation going on I might switch to Vulkan for future applications/games, though, but still just write for Win32.
OpenGL is still the best for compatibility in my opinion. I have been able to get my software using OpenGL to run on Linux, Windows, old/new phones, Intel integrated graphics, and Nvidia. Unless you have very specific requirements it does everything you need and, with a little care, plenty fast.
It's the oldest trick in the fascist book. You can't be a tyrant when the people are used to the idea that citizens have inalienable rights, so you slowly chip away at who counts as a "citizen."
The legal system has been chipping away at the rights themselves (and otherwise expanding governmental power) for hundreds of years, predating fascism (and communism, too). This is just the tactic of the moment.
I can install on my Fedora laptop through dnf. I've never felt like I needed a new word to describe downloading and running an AppImage. Why would phones be different?
`adb sideload` existed as a command for installing a package from your PC onto your phone. Sideloading was not meant to refer to installing an apk on the phone from the phone itself.
That actually sounds like a good idea, the situation is similar with an official channel of "trusted" software for which the distributor takes some responsibility, versus whatever file you downloaded yourself. It's certainly more risky on a Debian system to install a .deb from some random website, or an AppImage, compared to a .deb from the official repositories. I guess it's the same for Fedora.
The whole selling point of Android up until now was that it allowed you to install any app you want.
The point of the above comment is that Google intentionally introduced the word "sideload" to make "installing an app on your own device which Google did not curate" sound more risky and sinister than it is, and I'm inclined to agree.
I "make" coffee on my keurig. If Keurig decides that making any single-serve coffe pods that aren't owned by the Keurig brand is now called "off-brewing," I'd dismiss it as ridiculous and continue calling it "making coffee."
We should use the language that makes sense, not the language that happens to be good PR for Google.
>The whole selling point of Android up until now was that it allowed you to install any app you want.
Could've fooled me. Maybe it was a thing a decade ago when Android first launched, but none of the marketing pages for vaguely recent phones has that as a selling point. At best it's a meme that Android proponents repeat on HN or Reddit.
We're not talking about phones, we're talking about an operating system. If those companies could port iOS to their phones, they probably would. Since the OS will be mostly the same across devices, it makes sense to market a phone based on hardware differences -- like having a higher quality camera.
I've never met or talked to an android user that truly believes android is better technology or a better user experience. They all use it because of flexibility.
You've changed the subject. We were discussing whether one ought to use Google's term for it, or the term that's been used to describe this action since (I assume) the beginning of personal computing. Not whether Google is legally allowed to make the change.
My reason for bringing up the "selling point" was to bring attention to the language -- "You can install any app you want" has always been the common refrain when I see friends get into a debate about iOS vs Android. People are already using the term because it makes the most sense.
Calling something a right is an assertion about morality; it implies that a law to the contrary would be a violation of that right.
I do not believe an OS vendor with an app store has a right to limit alternate distribution channels, or that a government does something wrong by restricting such practices as unfair competition.
"I do not believe an an OS vendor with an app store has a right to limit alternate distribution channels or that a government does something wrong by restricting such practices as unfair competition."
but it's not illegal and wrong though???? if this were prohibited then Xbox, PlayStation, Nintendo, iOS, etc. would be fined already
unironically Android is still more "open" than all of its competitors even after all of this
It might be illegal in the EU under the DMA. As I understand it, litigation involving Apple's equivalent is in progress, and the outcome may not be known for years.
Wrong in this context is an assertion about morality. I do think it's wrong in the context of consumer products for a vendor to attempt to override the wishes of the owner of the product outside of a few narrow exceptions. I would absolutely apply that to iOS, and I think the DMA didn't go far enough; Apple should have no ability to enforce notarization or charge fees to app developers if the device owner chooses otherwise.
I feel less strongly about game consoles because they're not as important as smartphones; they don't touch most aspects of life in modern society, and there are viable alternatives for their primary function, such as gaming on PCs. I don't like their business model and I don't own one.
all of big tech has been doing it for 20+ years, and suddenly Google isn't allowed to do "industry standard"? like, what are we talking about here????
I know it's bad for prosumers, who are a minority, but consumers, who are the majority, would get more protection, so I dismiss the HN audience because they are biased vs normal people
They all should be? I've never understood why gamers just accept constant blatant anti-competitive practices, going so far as to act as if "exclusives" via DRM are a good thing rather than monopolistic product tying. e.g. it's been demonstrated that a Steam Deck is technically capable of running Switch games better than a Switch, and yet you are forced to buy a Switch in order to buy the games.
It's no longer 30 years ago, when hardware was unique and quirky and programs were written in assembly specifically for the hardware. It's all the same commodity parts, along with what are supposed to be illegal business practices. In a reasonable world, something like Ryujinx would be just as front-and-center as Proton as part of Valve's product features, and courts would fine companies for trying to stop their software from working on other platforms.
Antitrust law exists exactly to prevent companies from making their own ecosystem/walled garden that competitors cannot sell into. Product tying (forcing you to buy product B in order to buy product A) falls under that umbrella. Game console are not magical in this regard.
Lots of us have a problem with all of those things, and would like the government to enforce the law. I've never bought an Apple product, and the last game console I owned was a PS2 when I was a child.
I don't see how that's related (e.g. Android is FOSS but can use attestation for monopolization), but I do think we ought to make the law require products that contain software come with source as a consumer protection measure.
I do not get this use of the word "reality"? The reality is Ted Bundy's currently-at-large successor has the ability to shoot me with a gun. And that fact is about as relevant as what you said.
What you're doing here is resigning from a game just because of the fact there is a game, and then being condescending to other people for trying to win the game instead, as if what you're doing is something superior. This would already be very odd behaviour if this were only Monopoly or Risk, but is downright dangerous propaganda when the game is capitalism and the future of free computing is at stake.
"Your car needs washed" instead of "you need to wash your car"
Replying "You're good" after someone apologizes.
Adding an S to the end of brand names, especially grocery stores.
I don't do this one, but my extended family in Ohio just says "please" when they mean "could you repeat that?"