Hacker News | musicale's comments

Not judging, but I am having trouble seeing how

> there's literally nothing interesting here at all to most people who would be attracted by the title

is not some kind of judgment.


I mean, every opinion is a judgment. I'm saying I think it's perfectly worthwhile for them to do that project, it's perfectly valid for them to post their work on their blog, and it might even be interesting to someone somewhere. However, if you have an interest in sorting algorithms, or an interest in GPU programming, there is nothing of value for you here.

Commercial art? No, can't be art. Illustrations? Only if unpublished during the artist's lifetime. Commissions? Certainly not. Political art? No, just no. Religious art? Are you kidding me?!

Any arbiter of the arts will find that these handy guidelines readily facilitate the elimination and/or downsizing of unnecessary galleries and other non-art collections.


Boo - All I did was confirm the ick while acknowledging that we can, in fact, enjoy a cool thing for what it is - an ad.

See, that’s the thing about this POV: the term “art” is placed on some sort of pedestal, and then all the other forms are compared against it. “How dare you say [other thing requiring effort] is not art!”

Art !== artwork

An element of art is that it functions for its own relationship between artist and work, and maybe somewhat the viewer or participant. It’s neither better nor worse; it’s just something separate from all of those ideologies. It just is, they just make it. They place it in museums so we don’t have to bump into each other’s egos.

I will reiterate my belief: it’s foolish to wrap up art with corporate, religious or institutional patronage.


> Just because it's commonplace doesn't make it any less hostile to users.

Sure, game consoles are user-hostile. They're also great for playing games, and they tend to "just work" with less configuration and customization than a typical gaming PC.

Less configuration tends to mean fewer problems and easier tech support, but the primary business reason game consoles are locked down is to make it harder to play unlicensed commercial games on them.


It seems you're advocating for the benefits of having a door when the objection is to locking the door.

By all means have some kind of verified/sealed mode and refuse to support anything that's not in that mode--but there are negative consequences to normalizing a lack of control over the technology that people interact with.

Take the CrowdStrike incident, for instance. Millions of people were unable to do the jobs that they're relied upon to do, and we can't even hold them accountable for that, because it turns out they were never in control of their tools in the first place--locked out of the section necessary to carry out the repair.

You wouldn't tolerate a screwdriver that refused to be used to pry open a paint can. I don't see how it should be any different with a phone. I want to be able to rely on users of tools--not vendors of tools--to do things, and I can't. Not because the people are authentically incompetent, but because some vendor has made a dumb decision about what they're now not allowed to do.


Crowdstrike is software that IT departments install in an attempt to mitigate the security threats that come hand in hand with having the freedom to shoot yourself in the foot.

In this case, it was CrowdStrike that shot them in the foot.

Managing complexity has a cost that some people don't want to be bothered with.

They are allowed to choose an appliance instead of a PC, even if you would make a different choice.


> but the primary business reason game consoles are locked down is to make it harder to play unlicensed commercial games on them.

Which is user-hostile. The user bought the hardware, so they should be allowed to play whatever they please. Hiding the true cost of the hardware by inflating game prices using licensing fees is monopolistic and an attempt at misleading the consumer.

This is the exact same business model as printer companies reducing the price of printers by inflating the price of printer cartridges and locking down the ability to use third-party ones. It is unbelievable to see people on a site called "Hacker News" defending that business model.


If there were no way to know in advance if you were buying a gaming appliance or a gaming computer, you might have a point.

Some people prefer the simplicity and reliability of an appliance.

Some people embrace complexity in the name of having the freedom to do anything (including the freedom to shoot yourself in the foot).

The notion that consumers shouldn't be allowed to make decisions that are different than your own... THAT is user hostile.


> The notion that consumers shouldn't be allowed to make decisions that are different than your own... THAT is user hostile.

Having the choice is fine, but if there's no way to opt out, then it's not a choice. While far from perfect, the Xbox One is a good example of a video game platform that offers an opt-out. And it works: it is one of the most secure gaming consoles on the market, and yet it still offers consumers the ability to create their own game software for it.


There is, in fact, a very simple way to opt out of buying an appliance.

Don't buy it.


At some point that means not buying any modern technology because it's all been locked down. We're already most of the way there.

There are plenty of other choices.

Linux isn't going anywhere.

You just don't think people should be allowed to make choices you don't approve of.


> You just don't think people should be allowed to make choices you don't approve of.

This is what Apple believes, not the person you're replying to?


Most of the time people aren't stupid; if they care about where they are spending their money, they aren't buying devices in a split second and complaining later.

People aren't stupid, but there is no way to escape this business model unless you go to PC. And the PC is only begrudgingly still an open platform. If something is ever going to successfully replace the PC, it will be a walled garden as well.

I am appalled by how easily people dismiss the importance of open platforms as insecure and inconvenient. How will people ever learn technical skills if all the technology they own is locked down and glued shut?


They buy Android, Windows, Jolla, Pinephone,.....

How long before Android and Windows are also walled gardens that are completely locked down? They're certainly moving in that direction.

If you buy a device you should be allowed to fully own it. You shouldn't be forced to buy niche inferior alternatives or pay a huge premium.


One more reason to support Linux and BSD OEMs.

The Tuxedo, System76, iXsystems, Pinephone, Jolla, Steam Deck (with native apps, not Proton),... of the world.


That's high praise since it's basically what Apple was going for with the iPad, getting out of your way to let you focus on the task at hand.

WRT drawing, specifically: no, they dragged their feet on introducing a pen/stylus for years. They insisted it be a "touch-only" device (Wacom's patents on display digitizers surely did not factor in /s), which sucked for those of us who couldn't afford a Cintiq but had a perfectly good iPad 2 lying around. There were several valiant, if flawed, stylus efforts before the Pencil, but with awkward connection, pressure, and power solutions, they were DOA.

The supreme art of software marketing is to convince your users that even your bugs are features.

The supreme art of product development is to understand when the lack of some feature is a feature in itself.

I think you have answered your own question.

It is weird. Jobs was divisive and (not infrequently) abrasive, and why would you miss a tech billionaire anyway? Yet I also feel indebted to him and to the folks at Apple who helped to produce some of my favorite products like the Mac, the iPod, and the iPad.

Jobs also said a lot of things that still resonate with me. Recently Apple introduced a "classic Mac" screensaver that shows how carefully designed the original Mac GUI was. I'm sure nobody misses the days when app bugs could crash the OS, but I wish Apple were as obsessive about detail now as they were back then.


Now that I'm becoming an old man, I've taken the time to go back and listen to him properly, to analyze his thoughts and words a bit more contextually, and I've come to believe that Steve Jobs was quite misunderstood, both by us and by himself. When I miss him I think: his thoughts were so very refined for his time, it is quite incredible, and I wish he were around to hear more of them. I guess I'm a fan? Oh well... worse things to be.

(the article is good but giving you the hn for comments too: https://news.ycombinator.com/item?id=2131299)


He's definitely misunderstood. If you read his biography it's incredible how much the author of it misunderstands, but if you read between the lines you can see through them. In particular you should note how he changes before and after getting married.

The biography is really awful though. It constantly misquotes people - Bill Gates is directly quoted as saying something so technically inaccurate he can't possibly have said it.

I also remember that every time his son is quoted it's because he was telling a dick joke. At one point the book claims this is why Apple Park is a circle. Why the author did this is not clear to me.

(Btw, I have an unreported Jobs story about this myself. Actually two. I'm not going to tell them, so feel free to just imagine.)


I think "becoming Steve Jobs" is a far better book.

I feel like the official Isaacson biography was trying to tell a story, and would twist facts and reality to fit that story. This certainly makes for entertaining reading, but is not a great way to study history.

Meanwhile "Becoming Steve Jobs" gives the reader glimpses into Jobs's life, often very contradictory glimpses, ones that don't really tell you what to think. It shows you how complex of a person he really was.


I don’t remember many details from the biography at this point, but I remember not liking it either. It seemed like it was written with the assumption that the reader already knew about Steve’s more public life and career, and it skipped over much of it. It didn’t feel like it would be a good source for future generations to learn about Steve, as it seemed to largely ignore the entire reason a book was being written about him. I also remember it seeming largely negative, trumpeting the views of critics while downplaying the good to balance it out. Though this could also be my memory fading; feel free to correct me if I’m wrong.

It was my first Isaacson biography, and didn’t leave me excited for another one.


> I also remember it seeming largely negative

It definitely was, but at least parts of that must have been warranted given Jobs refused to read it, saying something along the lines of "I know I wouldn’t like what it says"


I think that was him trusting the author to be fair and show a balanced view of who he was; maybe that trust was misplaced.

I second “Becoming Steve Jobs.” It actually gives insight into him, rather than just regurgitating what Isaacson thinks are the facts.

Thanks, I’ll check it out.

I still think about how he tried to cure cancer with crystals and then when that didn’t work he used his wealth to get residency in a different state to jump in line for a transplant and still died before his yacht got completed. I don’t misunderstand him at all. Especially the parking in handicap spaces part. Very easy to understand what kind of person he was through his actions. Perhaps we will never see eye to eye, and I feel posts like yours do deserve legitimate opposition as applicable.

> Do I contradict myself?

> Very well then I contradict myself,

> (I am large, I contain multitudes.)

When you speak ill of Jobs you are speaking on his moral character. When others (incl. myself) speak positively on Jobs, they are speaking on his design, business, and life philosophies, which are quite profound. [0]

How you want to weigh the two is up to you, but it is not a contradiction to say someone contains both good and bad.

[0]: https://youtu.be/cHuqhQmc4ok


The worst part of internet culture is the conflation of simplicity and reductionism. Comments are short, people have different contexts, so there’s an instinct to reduce everything to binary and fight to the death over the binary value.

Worst of all is the false good person / bad person dichotomy that leads to great offense at any slight praise for someone the reader has decided is a bad person, or any slight criticism of someone the reader has decided is a good person.

I can’t think of anything less fruitful than arguing over whether a public figure’s personal plus professional life makes them a 100% good person or 100% bad person. It’s strange the conversation ever happens, and yet it’s so incredibly common.


Ok, but more or less everyone is going to have a few things about them that you’re not going to like. When your whole life is up for scrutiny and you have unlimited resources, that’s how it is. If you had a billion dollars there’d be plenty of things people would criticize about you. And anybody else who did too.

He didn’t jump the line, he just got in multiple lines.

Sure. On the one hand, everything adhered to the letter of the law. On the other, he used his money to get served before other people in an otherwise similar position would have been.

I personally view that as more of a failing in the system itself (why are there multiple lines to begin with when organ transport is a solved problem?), but it's not unreasonable to look at somebody exploiting that broken system and question their character.


I know very few people who wouldn't use their wealth to try to save their lives, or those of their loved ones. It's kind of what wealth is for.

It's the "at the expense of others" thing that makes it more morally grey, and the chain of cause and effect is short enough that people sometimes get up in arms about it.

For some other actions on some sort of badness scale, we have:

- Murdering people for your spare organs. Parts of China do this (somebody survived and escaped recently, so it's stirred things up a bit). Most people think this is very bad.

- Paying for somebody's organs (similar to prostitution at some level, though banned much more frequently than sex work -- if society is structurally so unequal that sacrificing part of your life for a pittance is actually attractive, that reflects poorly on that society, and we try to ban the rich and powerful from using that power to create scenarios more like my first point).

- What Jobs did. It's technically legal, but he necessarily got an organ before somebody else for no other reason than that he had money. Did that somebody else survive? Who knows. If you factor in that it was actually many people who were displaced, did all of them survive? Unlikely. Organ donations are already fraught with ethical issues and strongly held convictions, and I'm not at all surprised that a number of people would be upset at this.


You know that's still bad right?

What’s the point of making a moral judgment about a bit of human nature that literally everyone on earth shares? It doesn’t make you or me superior to condemn it; we would do the same. So… what does “bad” even mean in this context?

If it is bad to use your money to legally buy yourself advantages that other people cannot afford to buy, then capitalism is bad.

Do you think capitalism is bad?


Why do you only pay the minimum amount of tax?

> Why do you only pay the minimum amount of tax?

You didn’t pose the question to me. And yet.

Very many people don’t. We know there are constructs that would enable us to pay less, yet we choose to not pursue them. We are part of a society that enables us to be what we are, why should we strive to give as little as possible in return?

(And yes, we also don’t send extra money. This is not a contradiction.)


> We know there are constructs that would enable us to pay less, yet we choose to not pursue them.

Only because you don't want to put the effort into pursuing it. If I told you you could reduce your tax bill by 20% by spinning around in your chair one time, I doubt you (or anyone else) would decline.

Every entity generally seeks to take as much as they can and give back as little as they can. Individuals are generally a little less extreme, in my experience, with corporations being the worst.


I would not.

My taxes are not a burden on me. While on the other hand, the local politicians have sought tax cut after tax cut, causing the library to limit services, the schools to cut down on teaching staff, infrastructure maintenance delays, less funding for local social services and city events, and more.

My paying an extra 20% wouldn't fix things, as adding to the general budget would end up simply reducing taxes further, instead of everyone sharing the load.

I hate that I've started getting involved with local politics. I would rather code.

Or, following your self-centric analysis, I would put the effort into raising my taxes by 20% since the collective benefits give me much more than what I can do individually.


Because we all live paycheck to paycheck, to fund wars and Tesla carbon rebates.

While he could have funded a new hospital and not even changed his tax bracket.


There are multiple lines because when an organ comes up, it can only last so long, so a person needs to be able to get to the hospital within a certain period of time. Usually this means driving distance. When you have a private plane, the distance expands. The organ still goes to the most sick person in line, not the one with the most money.

I was at a talk with Martine Rothblatt several years back, who created a startup for 3D printed organs. They ended up also building electric helicopters to transport those organs, because the transportation bottleneck was a huge issue.

I try not to judge people's character when they’re looking death in the face. No one really knows what they’ll do in that scenario. Most people who can save their own life will. This was the premise of the movie Saw… how far are you willing to go to save your own life? How strong is your survival instinct? Most people are never tested, and it’s easy to sit back and judge, but would you just sit back and die? How do we even know there was someone else in line behind Jobs? It could be that he got an organ that would have otherwise been wasted.


Pancreatic cancer is known for being incurable, even in the best of circumstances, early diagnosis or not. Having witnessed a family member go through the same thing, I understand Jobs's reaction of trying literally anything else.

Sorry for your loss.

Though in SJ's case: "He was diagnosed with insulinoma, which unlike other pancreatic cancers, is curable and can be treated with surgery."

see: https://www.bbc.com/news/technology-16157142#:~:text=He%20wa...


Well, given apparently the posts in this thread reveal me to be an "manic crazy person" (or such I inferred) - I suppose I'll add to it then by saying: I too have read and understood Yogācārabhūmi-Śāstra. I hadn't thought much about it till today, but, I suspect, will do as Steve did. :) :)

There's plenty to not like about Jobs as a person, but Apple exists because of him (twice).

Jobs was more than a tech billionaire. He was someone who had refined personal taste and stood on values and was willing to do what it took to see them through, despite the friction.

And the outcome was a computing company that was waaaay less mediocre than 99% of these other memetic, mediocre gradient-descent chasing privacy-abusing, ad-supported companies.

Apple has raised the bar so high. And the DNA of what is manifesting is Steve’s insistence and vision followed by Tim’s clarity of execution.

Look at the Apple Architecture moves. They got Intel’s hot, slow CPUs out of the device. And replaced them with excellent, quiet, fast, efficient CPUs, with UMA and great features.

It’s hard to nail every detail when you have the surface area of Apple 2025. A huge huge company with billions of users and dozens of device families and services. But the bar is high for most of what they do.


I think of Apple like I think of Disney: _consistently good_ products. Maybe not the best in all the things all the times, and some duds from time to time, but if you blindly hit "play" on a Disney movie you're going to be watching something at least pretty good.

And I need to be more obsessive about proofreading before HN's editing timeout expires.

It's not just about the products themselves, but the philosophy behind them. He had this relentless obsession with making technology feel right (it is all from my perspective)

> why would you miss a tech billionaire anyway

Because we miss new instances of the great products they created to earn all that money.


I could easily be wrong about this but I don't believe Jobs or anyone else at Jobs-era Apple became a billionaire because of it. Because of early infighting/getting fired, ownership was too dispersed for that.

He became a billionaire because Disney bought Pixar.


> massively overpaying its workforce

As I understand it, tech employees are typically paid a small fraction of the revenue that they bring into the company.

Outside of tech, employees are underpaid, and wages haven't tracked productivity growth since the 1970s. Profit growth has greatly exceeded wage growth since the early 2000s, with the exception of the 2008 recession.


The labour theory of value that you allude to here makes no sense. People are paid at the market price based on supply and demand, just like other goods and services.

Tech employees don't "bring in" the company's revenue. That makes the mistake of attributing the products and services of a business to its workers.

>wages haven't tracked productivity growth since the 1970s.

Propaganda. Productivity growth literally is just wage growth, by definition. It is impossible for them not to track each other.


>>Productivity growth literally is just wage growth, by definition.

No, it is the opposite, whether you are measuring Units Per Worker Hour or especially Units Per Worker Dollar.

You have a dozen $25/hr workers in a factory producing 50 widgets/hour, and you now introduce new tools, techniques, and/or materials and they now produce 80 widgets per hour, productivity per hour and per dollar has risen, but worker pay is exactly the same.

If you instead cut their pay to $22/hr their productivity in Units/WorkerHour is unchanged but productivity in Units/Labor$ has risen.

It is ONLY in the limited case where you are paying 100% by piecework that productivity tracks wages, e.g., if those workers are paid $6/Widget produced and they manage to make 50%more widgets/hr, then their pay rises with productivity. But that is uncommon and labor cost is rarely the only input.
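The arithmetic in this example can be sketched out directly (the numbers come from the comment above; the factory itself is hypothetical):

```python
# Two productivity measures from the widget-factory example above:
# units per worker-hour, and units per labor dollar.

def per_hour(widgets_per_hour: float, workers: int) -> float:
    """Units produced per worker-hour."""
    return widgets_per_hour / workers

def per_dollar(widgets_per_hour: float, workers: int, wage: float) -> float:
    """Units produced per dollar of labor cost."""
    return widgets_per_hour / (workers * wage)

workers, wage = 12, 25.0  # a dozen $25/hr workers

# New tools raise output from 50 to 80 widgets/hour: both measures
# rise, while worker pay stays exactly the same.
print(per_hour(50, workers))          # ~4.17 units/worker-hour
print(per_hour(80, workers))          # ~6.67 units/worker-hour
print(per_dollar(50, workers, wage))  # ~0.167 units/labor$
print(per_dollar(80, workers, wage))  # ~0.267 units/labor$

# Cutting pay to $22/hr instead: units/worker-hour is unchanged,
# but units/labor$ rises anyway.
print(per_hour(50, workers))          # unchanged
print(per_dollar(50, workers, 22.0))  # ~0.189 units/labor$
```

This is just the comment's point in code: either measure of productivity can move without wages moving at all.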

Edit: typos


>No, it is the opposite, whether you are measuring Units Per Worker Hour or especially Units Per Worker Dollar.

That is precisely how it is measured: by measuring wages.

>You have a dozen $25/hr workers in a factory producing 50 widgets/hour, and you now introduce new tools, techniques, and/or materials and they now produce 80 widgets per hour, productivity per hour and per dollar has risen, but worker pay is exactly the same.

As is unlikely to surprise you, productivity as a macroeconomic indicator is not measured by looking at factories and the tools and techniques they use.

96% of the gap can be explained by the fact that these figures compare productivity growth of the whole economy with wage growth of only some workers (the lowest 80% of them - leaving out... the most productive workers), count the productivity growth of the self-employed but not their wages, don't take into account overtime, bonuses, or health insurance benefits, and intentionally use different means of measuring inflation across the two figures in an attempt to inflate the numbers.

At the end of the day they track very closely because they are both measures of wages. Productivity is just net output by hour worked and wages is just net output by hour worked. If you use different methods for calculating each you can make either look higher but it is pure methodology.


>>That is precisely how it is measured: by measuring wages.

Again, NO.

Just go to the Bureau Of Labor Statistics and their description of how productivity is measured [0]:

>>"For a single business producing only one good, output would simply be the number of units of that good produced in each time period, such as a month or a year."

Notice not a single mention of wages

It then goes on describing how they measure aggregate output in sectors of the economy. Wages is only mentioned ONCE, for charities and government organizations (since their output is not sold).

>>Government services and the output of nonprofits are not sold in the marketplace, so these types of output can be difficult to measure. For example, what is the output of a charity? Often these outputs are measured by the wages and benefits - compensation - paid to workers producing these outputs.

and then they point out:

>>Since productivity compares output to input, if the output is measured by the input, any time the input grows, the output grows by the same amount.

>>Measuring output by labor input is similar to including the same amount in the numerator as in the denominator of the labor productivity ratio.

>>This implies no productivity growth for that group of workers, dampening productivity change for the industry and sector. For this reason, BLS productivity measures exclude government, nonprofits, and private household production.

So the ONLY mention of wages is specifically EXCLUDED from measures of productivity.

Then the summary: Output is measured primarily as an index of product revenues, adjusted for price changes. Adjustments are made to ensure that output that is sold to another business within the same measuring unit (industry or sector) is excluded to prevent counting it more than once.

Again, no mention of wages.

I have no idea where you get your misconceptions, but you really need to study some actual economics before posting pages of obviously wrong nonsense.

[0] https://www.bls.gov/k12/productivity-101/content/how-is-prod...


>Productivity growth literally is just wage growth, by definition. It is impossible for them not to track each other.

I don't understand what you mean by this. If I own a business, and employee productivity increases but I don't increase wages, doesn't that disprove your statement?


Productivity as a macroeconomic measure (which is what is being discussed here) is just a measure of wages.

https://www.epi.org/productivity-pay-gap/

The graph on this page disagrees, can you explain please? I haven't heard anyone say the two concepts are the same before.

As I have heard it "productivity" is roughly gdp/hour, which can be different from wages/hour (but you generally expect the two to be related / correlated).


> Tech employees don't "bring in" the company's revenue.

Sure they do. Every employee contributes to the revenue a company brings in. If they don't, then they should be fired.

> Productivity growth literally is just wage growth, by definition.

Completely false. You accuse a commenter in a sibling thread of not having a good grasp of economics, but that feels like the pot calling the kettle black, here.


You don't "bring revenue in". You are paid for providing a service to your employer. Your employer brings revenue in by providing quite different goods and services to its customers.

By your logic, "cost centres" (like IT, HR, and office management) within a business are bad because they don't bring in any revenue. Except of course in reality they are the same: they provide a service to the business that the business makes use of in providing goods and services to its customers.

I am being pedantic but for good reason: there is no a priori reason why your pay should go up just because your employer has become more profitable, except that it is in the interests of employers to make use of resources efficiently. If they are profitable then hiring more people so they can make more money is good. But it isn't a matter of "deserving" to be paid more or something. Pay isn't based on what you deserve for many reasons, including that you can't attribute the business's profits to its workers and ignore, for example, the investment in capital resources (including IP) required to enable the workers to work effectively. Mainly though because of supply and demand.

If an improvement in productivity makes you more efficient then there should be higher demand for you and you should be paid more. And indeed that is exactly what happens: people in industries that have productivity improvements are paid more afterwards than before.


Minimum wage, fairly, should have been around $32 in 2019 or so. I haven't run numbers lately because it's depressing, but I bet it's worse now.

You can't go off mcdonalds sandwiches, and you can't use the economic indicators that ignore food and fuel.

I've heard wild numbers from "the dollar is worth 50¢ compared to 20 years ago" to "the dollar has lost 98% of its value in the last N years."

I'm not an economist, but I do know my electric bill has been the exact same dollar amount for 12 years, and I've halved my usage twice in those years. That puts the dollar's purchasing power for power at 25% of 2013.
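A minimal sketch of that arithmetic, with hypothetical numbers (a constant dollar bill, usage halved twice):

```python
# Same dollar bill, usage halved twice (i.e. quartered), so the
# implied price per kWh is 4x, and a dollar buys 1/4 as much power.
bill = 100.0           # hypothetical constant monthly bill, in dollars
usage_2013 = 1000.0    # hypothetical kWh/month in 2013
usage_now = usage_2013 / 2 / 2  # halved twice

price_2013 = bill / usage_2013  # $/kWh then
price_now = bill / usage_now    # $/kWh now

# Purchasing power of a dollar (in kWh) relative to 2013:
relative_power = price_2013 / price_now
print(relative_power)  # 0.25 -> 25% of 2013
```

The specific bill and usage figures cancel out; only the "same bill, quarter the usage" relationship matters, which is why the result is exactly 25%.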

Gasoline changes prices so much I can't really say, it's about twice as expensive for 87 here as 12 years ago, but 93 is 2.5+ times higher.

Food? Don't get me started.

I live in the rural South. I don't really care about price fixing in Los Angeles or Silicon Valley.


There is no such thing as what minimum wage should be "fairly". What is the minimum amount you should be able to pay someone an hour? Surely it is the amount where if you didn't, someone else would pay them more.

There are many people who do jobs worth much less than $32/hour. That min wage would just make them illegal to employ.

>You can't go off mcdonalds sandwiches, and you can't use the economic indicators that ignore food and fuel.

You can't use economic indicators that track volatile commodities either. We use baskets of consumer goods and the inflation tracking is very accurate in short time frames but it becomes harder to compare the further out you get. A TV today is much better than a TV even 10 years ago but it is hardly even the same product as one from 1970.

>I've heard wild numbers from "the dollar is worth 50¢ compared to 20 years ago" to "the dollar has lost 98% of its value in the last N years."

What is wild about that? If you had 50c then and you had invested it in even relatively poorly performing investments it would be worth much more than $1 now.

>I'm not an economist, but I do know my electric bill has been thr exact same dollar amount for 12 years, and I've halved my usage twice in those years. That puts the dollar purchasing power for power at 25% of 2013.

That you are not an economist is obvious.

You are aware that dollars buy more things than energy from your energy provider according to your energy plan, yeah?


> There is no such thing as what minimum wage should be "fairly".

Sure there is. It's the amount you have to pay someone such that they can work a reasonable amount of hours (40/week), such that they can afford all of life's essentials while having a little extra to save for a rainy day, as well as have a little fun.

But certainly some people's politics ignore the human aspects of the world we live in, and think that the "free market" (something that doesn't actually exist) will sort it out.


What are life's essentials? Do you have the right to a car? To one bedroom per child or should they share? Do you have the right to central heating so you can wear a tshirt indoors or should you be expected to wear a jersey in winter?

The minimum wage doesn't make anyone be paid more. It only causes anyone who would be paid less than it to instead be paid $0 and collect an unemployment benefit. How is that reasonable?


A higher minimum wage puts upward pressure on wages at the lower end of the scale, which helps the working class so that they aren't working and on medicare, section 8, food stamps etc. - as many Wal-mart workers currently are. That saves us money in the long run.

When Seattle raised the minimum wage to $15/hour, everyone screamed it would lead to mass unemployment. That never happened. Suddenly the lower class had more money to spend, which boosted the economy as much as or more than higher wages hurt bottom lines.


Higher minimum wage doesn't put upward pressure on other low wages, and even if it did, pushing up the wages of people making $30/hr by taking away jobs from people making $20/hr to put them on benefits earning $15/hr is obscene.

>which helps the working class so that they aren't working and on medicare, section 8, food stamps etc. - as many Wal-mart workers currently are. That saves us money in the long run.

A higher minimum wage leads to lower employment, not higher employment.

>When Seattle raised the minimum wage to $15/hour, everyone screamed it would lead to mass unemployment. That never happen.

It has been shown many times that a higher minimum wage causes less employment. It is also obvious from first principles and basic logic. Price controls are a very bad idea, and wages are no exception.


> What is the minimum amount you should be able to pay someone an hour? Surely it is the amount where if you didn't, someone else would pay them more.

A livable wage in the geographic jurisdiction they are in. Including stuff like transportation, healthcare, food, heat, housing, and insurance.

Glad you asked.

oh, the company can't compete without exploiting workers?

oh well.


> A livable wage in the geographic jurisdiction they are in.

This is called the Iron Law of Wages. As its name implies, it's neither prescriptive nor pleasant - but it is guaranteed to be liveable.

> Including stuff like transportation, healthcare, food, heat, housing, and insurance.

The thing that trips people up is that the word "liveable" is a synonym for "subsistence," not "fulfilling." A wage that's only liveable would feel quite exploitative to most people.


The intention of "minimum wage" in the US is not merely subsistence level. FDR said, "by living wages, I mean more than a bare subsistence level - I mean the wages of decent living." [0]

The "iron law of wages" is instead an economic principle that wages tend to trend downwards until people are paid the minimum possible for subsistence. It's not meant to be a goal.

0: http://docs.fdrlibrary.marist.edu/odnirast.html


There are ten million stated reasons for the minimum wage! Pretty much the only one that economists can agree on is that it's helpful to prevent abuses of monopsony power in small towns.

Regardless of what FDR said, a living wage is guaranteed because people will not accept anything lower.

The problem with a "decent" living is that reasonable people can disagree about what that looks like. Roommates? Children? An unemployed spouse? Vacations and retirement?

It's not the government's job to guarantee all of that stuff and I would rather we focus on stopping wage and tip theft and protecting the rights of workers (banning noncompetes, decoupling health insurance, etc.) instead of increasing the minimum wage towards some poorly-defined goal.

There's also the other side of the minimum wage debate, which is that most of the specific numbers people list as "liveable" do actually result in some folks losing their jobs and becoming unemployable. There was even a recent BERKELEY study that showed this!


>>There is no such thing as what minimum wage should be "fairly"

Nonsense.

There most definitely IS a definition of a fair minimum wage -- it is the definition used when it was originally introduced into law:

The wage necessary for a full-time (40hr/week) worker head-of-household to support a family of four above the poverty line -- spouse & kids in a house/apartment, food, medical, education, etc.

We have Walmart workers collecting $6 billion in benefits per year to stay above the poverty line while the Waltons sit on a $250 billion fortune; it is clear we are subsidizing the rich by failing to set an above-poverty minimum wage.

EDIT: typos, add referenced line


>The wage necessary for a full-time (40hr/week) worker head-of-household to support a family of four above the poverty line -- spouse & kids in a house/apartment, food, medical, education, etc..

Food of what quality? That available in the 1930s?

Medical of what quality? 1930s medical care?

Education to primary level as most had in the 1930s or more than that?

You don't have a natural right to receive the fruits of the labour of others.

>We have Walmart workers collecting $6 Billion in benefits per year to stay above the poverty line while the Waltons sit on a $250Billion fortune, it is clear we are subsidizing the rich by failing to set an above-poverty minimum wage.

Then stop giving benefits to people that have jobs.


>...of what quality?

There are standards for virtually everything. Meet the current basic standards. For food, a basket of FDA/USDA-approved for distribution food to make a basic but nutritious diet for the family of 4. For housing, you could go with minimums for HUD housing. For education, through public high school. These are not hard to figure out (but may be a bit tedious). These are also minimums required to maintain a functional workforce in a modern society.

>>You don't have a natural right to receive the fruits of the labour of others.

>>Then stop giving benefits to people that have jobs.

Right. So what you want is a Dickensian crabs-in-a-bucket labor market where the wage level is set by the most desperate person, who will work for hours to get a crust of bread for his/her next meal. A market where employers can abuse workers at will because there really are 500 others outside the gate who will take his job if he isn't willing to take the beating?

We are no longer living in a frontier society where 97%+ of the workers are producing food.

That insanely over-simplistic model has been tried, and it is a resounding failure for every society, every country, and every individual living in it. Those societies inevitably collapse or grow out of it with minimum standards for everyone. And while it is obviously awful for the workers, it is no day at the beach for the oligarchs either, who must live in secured closed-off areas, always frightened of everyone in the public as well as their rivals in power. Unproductive misery for everyone is what you want?

So, NO, the solution is not to just make the people at the bottom more poor, more hungry, and more desperate.

The solution is to stop giving benefits by ensuring that their employer pays the workers sufficiently that they do NOT NEED benefits to survive.

If an employer cannot pay their workers a living wage they do NOT have a business model.

They have an exploitation model.

The exploitation model specifically violates your above principle saying that the employers have a natural right to the fruits of the workers' labor to whatever degree they can exploit the worker by their desperation.

You aren't saying that no one has a right to the fruits of anyone else's labor; you are only saying that no other worker has such a right, but the employers do.

You are saying that if someone has power or deception, whatever they can take is their right.

I say, NO, that is the most dishonorable and amoral of societies.


Despite agreeing with the rest: should the kid who graduated high school 2 months ago (1) earn enough as a grocery cashier to support a family of four, including medical and education, (2) not work / not be employable, or (3) is there room for a minimum lower than this definition?

If a high school graduate comes in and offers the same value as the other cashiers making the same amount of money, then, yes, they should get paid the same. If the other cashiers are more valuable, then they should be paid more.

this isn't really as difficult as everyone makes it. "Minimum wage is a company's way of telling you that if it was legal to pay you less, they would."

If a company can't afford to pay cashiers at different rates based on their tenure and skill, then i guess the company will have to deploy self-checkout, and some people don't like that, so they'll take their business elsewhere. If that means that all grocery stores go "self checkout" then i suppose farmer's markets will become a lot bigger.

This is all about grocery cashiers, please do not try to extrapolate my words to anything else, i am speaking to this very narrow thing.


Sure, we can approach from that angle, instead of the just-graduated-high-school angle.

I did not talk about paying cashiers at different rates. I addressed the single minimum rate from the earlier comment, where a household's single income can "support a family of four above the poverty line -- spouse & kids in a house/apartment, food, medical, education, etc [without relying on assistance programs]". According to the back of this envelope, that would be $70,000/yr = $33/hr. In some areas or with other decisions, maybe only $50,000/yr = $24/hr.

A grocery store would be rare indeed that could afford to pay their lowest-skilled, lowest-tenured cashier at $24/hr. Surely society can come up with a better answer than telling so many grocery stores that self-checkout is the only practical way to stay in business.
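The back-of-envelope above is just annual cost of living divided by full-time hours. A quick sketch (the 2080 hours/year figure, i.e. 40 hr/week x 52 weeks, is the assumption; the dollar amounts are the rough estimates from the envelope, not official data):

```python
# Convert an annual cost-of-living estimate into the hourly wage a
# single full-time earner would need (assumes 40 hr/week, 52 weeks/yr,
# no unpaid time off).
HOURS_PER_YEAR = 40 * 52  # 2080

def required_hourly_wage(annual_cost: float) -> float:
    """Hourly wage needed to cover an annual cost at full-time hours."""
    return annual_cost / HOURS_PER_YEAR

print(round(required_hourly_wage(70_000), 2))  # 33.65 -> roughly $33/hr
print(round(required_hourly_wage(50_000), 2))  # 24.04 -> roughly $24/hr
```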


if you search HN for genewitch, you will find i have said many times that exact dollar amount should be the minimum wage, so this isn't a gotcha.

for years^, i've been saying this, since 2018 or 2019. $33 an hour. So if you re-read what i actually said, i explain that i don't really care if a supermarket can't afford to pay cashiers at a livable wage. they can suffer from lack of staff, or go full self checkout and robots, or go out of business. I don't care, like, at all. "But genewitch, what about the families of the shareholders and CEO and board?" uh huh, luckily they can go get a job and make a livable wage somewhere else.

^i've only been posting on HN since 2020, but my point stands


You having said so previously has no impact on this belief:

> A grocery store would be rare indeed that could afford to pay their lowest-skilled, lowest-tenured cashier at $24/hr. Surely society can come up with a better answer than telling so many grocery stores that self-checkout is the only practical way to stay in business.

I'll take you at your word that you don't care to come up with a better answer, and that means I'll gain nothing further from discussing this with you. I'll bow out. Take care!


I would say yes, it is reasonable to have an exception for lower wages for teenage part-time entry-level workers, and maybe some partially-disabled workers.

Of course there would need to be provisions that it not be abused and just used for all positions. E.g., it cannot be used for workers 21 years old or older, etc. And the rules against abuse need to be solid, as we can guarantee that whatever rules are made, employers will work hard to abuse and game the system to their advantage and at the employee's cost.


"Doctor, I need you to declare me disabled or else I'll lose my job when I turn 21. I need this job, and I don't know if I can find another one at the 21yo minimum wage."

I really want to like what you're saying, but I see too many problems. I don't have answers.


>>Of course there would need to be provisions that it not be abused ... And the rules against abuse need to be solid, ...

Exactly that sort of scenario is why I included those phrases.

ANY large system will have imperfections, inadvertent waste, and openings for abuse. Of course these should be minimized, but that shouldn't stop us from making a system. Better a few people benefit undeservedly than many who deserve and need the benefits go hungry.


The need for a system doesn't imply that your proposal is better than the status quo.

> Better a few people benefit undeservedly than many who deserve and need the benefits go hungry.

Agreed. Now consider that regulations exclude people, not include them, overall, by far. (I say this with a job that sees that daily, and where a frequent criticism is that implementing those regulations is government waste.)

What I'm taking away from this is that an option I generally dislike - raising the corporate tax rate, which is at historic lows from what I understand, and increasing public benefits - actually looks much better than I previously thought, and definitely better than raising the minimum wage. You mentioned Walmart's profits being subsidized by benefit programs, but that valid and important complaint seems to be taken care of this way. This also starts to sound a lot like UBI, which I may have never really understood and have never supported. Maybe I should support it.


If a layoff were actually a real layoff, you could get your job back once business conditions improve.

Tech "layoffs" are something of a euphemism for terminating (rather than pausing) employment for business reasons.


That seems unrealistic. Laid-off workers need to find a new job long before their former employer could hire them back.

If you are unable to obtain a satisfactory replacement position after that time, it might be very desirable to have the option to get your old job back.

Some businesses are seasonal, so it might make sense there as well.

"Originally, layoff referred exclusively to a temporary interruption in work, or employment"

https://en.wikipedia.org/wiki/Layoff


> For the same price as 'out of state' tuition at UC Berkeley, you could hire a $100/hour tutor for 1:1 sessions, 8 hours a week.

Out-of-state fees raise tuition from $17K to $51K, but $17K is still a lot of money (and overall costs add up to about $51K a year even for in-state students.)

Besides expensive $100/hour tutors - Berkeley could (as many schools do) hire cheap undergrads from the previous cohort.

It also seems to me that the basic idea of mastery could be implemented with self-paced learning and individualized assessment, which could potentially be batched based on milestones.


  which could potentially be batched based on milestones
This is a form of 'ability grouping'. Many people in California don't like ability grouping, for ideological reasons, even though it works.

In this case "ability" would basically be "the set (or sequence) of course objectives that you have completed so far."

Such an assessment is not an evaluation of future potential, nor is it a moral judgment. The purpose is simply to determine what the next thing to learn should be.


If you're in a group and some of your peers leave the group because they've mastered the material, you might feel like you're being 'held back'. And observers might feel that certain demographic or identity groups are being 'held back' at a higher rate than others.

Sure, perhaps you leave it up to the student to decide which group they should be in. But what if they move to the next group when they're not ready? Does the tutor cater to them or to the rest of the group? What if some demographic group is less confident on average, and doesn't move ahead even when ready?

The setup you've described might work, but people will find all sorts of reasons to be mad about it, to claim discrimination and/or to use whatever outcomes there are to claim that somebody is being treated unfairly.

The academic literature on 'mismatch' doesn't all lean in one direction.

You should read Sonja Starr's article on the coming magnet school wars. She definitely has an opinion, but she made a decent attempt to present fairly both sides of each issue she covers.


You cannot effectively teach children UNLESS you do ability grouping. The gap just becomes too large. You can't teach one child calculus while another in the same math class can't multiply.

> I do not need to email Linus to ask if I may run software on my Linux Mobile device, or Cook to run software on a Mac.

iOS follows the game console model (even though the iPhone is a "smartphone" with additional non-game features). Games are responsible for something like 70% of iOS app store revenue, which Apple taxes via its platform fees, which are comparable to game console platform fees.

Unlicensed games are, of course, the killer app for sideloading on game consoles. IIRC free developer provisioning used to last up to a year or so (?), but Apple reduced it to a week after someone created a competing app store or web site that distributed unlicensed commercial games (as well as other software) by leveraging developer provisioning. (IIRC Sony analogously killed off Linux for PS3 once someone figured out how to leverage it to run unlicensed commercial games.)

If you pay $100/year for an official Apple Dev account I think you can still get one year provisioning. This probably won't make you or anyone on HN happy, but it does exist.


It's certainly a shame that piracy is a real factor here :(

It should probably be noted, however, that from what I understand, technically speaking, the only reason that piracy is possible in the first place is due to Apple's failures to prevent jailbreaking? iOS apps are encrypted and you need a jailbreak to decrypt them and redistribute them on shady websites. If Apple finally stomps jailbreaking out for good one day (which they are trying to do), then piracy should become impossible even if sideloading becomes an option.

See also: Xbox One has a developer mode that allows unsigned code execution, but that still doesn't allow for piracy because nobody's hacked the console, so nobody can obtain the game binaries to pirate.


I like piracy as a means to punish these user-hostile practices like vendor-lockin and bait & switch.

That's why I started downloading netflix shows again after they doubled the price (I used to be on the 720p plan and now I'd have to pay more than twice to get ad-free netflix). These companies only care about their bottom line and this is where you can hurt them back the most.


> nobody's hacked the [Xbox One] console

It looks like that may have changed last year.

Not entirely surprising since there have been jailbreaks for most game consoles, including PS4/5 and Switch.

