charliebwrites's comments

Apple is usually pretty good at keeping older phones performant enough, but:

This is the first upgrade cycle where I upgraded in anger over an unusably slow and energy-inefficient OS on an "older" model.

I was traveling in South/Southeast Asia and my iPhone 15 Pro was dying twice a day despite minimal use besides maps and taking some photos. My battery health was 86%, so not perfect, but surely it shouldn't die twice a day.

That, coupled with the keyboard constantly lagging with every letter I typed, made me realize Apple no longer cares about older models.

They threw that out with Liquid Glass.

I wish Android wasn’t also closing off their ecosystem.


The 15 Pro was an especially poor model as the cooling wasn't up to snuff. She be a throttlin'.

My 14 Pro Max has been a champ (though I never upgraded to Liquid Ass, iOS 18 for life on this thing). I have almost zero real reason to upgrade, esp since it'll only cost me $100 or so to have the battery replaced in a few months (I'm at 84% and treating it poorly to get it to a point I can justify the expense).


> If Gmail goes down in 20 years, it will be a major occurrence. If mailgoforward.fart goes down, you’re screwed.

The technical equivalent of "if you default on a $100,000 loan you have a problem. If you default on a billion dollar loan the _bank_ has a problem."


The steps to trouble:

- identify who owns the number

- compel that person to give up their unlocked phone

- the government can read the messages of _all_ people in the group chat, not just that person's

Corollary:

Disappearing messages severely limit what can be read


Unless they compel people at gunpoint (which prevents the government from bringing a case), they will probably not have much luck with this. As soon as a user sets up a passcode or other lock on their phone, it is beyond the ability of even most parts of the US government to look inside.

It's much more likely that the government convinces one member of the group chat to turn on the other members and give up their phone numbers.


> which prevents the government from bringing a case

Genuinely, from outside, it seems like your government doesn't give a damn about what it is and isn't allowed to do.


Yes, but I lock my phone with a passcode, and unlike biometric unlock they have no way to force me to unlock it.

The district courts will eventually back me up on this. Our country has fallen a long way, but the district courts have remained good, and my case is unlikely to be one that goes up to appellate courts, where things get much worse.

There's an important distinction: the government doesn't care about what it is allowed to do, but it is still limited by what it is actually capable of doing. It's important to understand that they still operate under many constraints, and that we need to find and exploit those constraints as much as possible while we fight them.


They are capable of putting you in prison until you unlock your phone, or simply executing you.


Feels like the latter would be counter-productive unless there's an app for that.


They are, but again, district courts have been pretty good, and I would be out of jail in <30 days, unless my case goes up on appeal.

And if I die in jail because I won’t unlock my phone: fuck ‘em, they’ll have to actually do it.

I don’t plan on being killed by the regime, but I don’t think I would’ve survived as a German in Nazi Germany, either. I’m not putting my survival above everything else in the world.


Looks that way from the inside as well.


Yes, and all of the credulous rubes still whinging about how they "can't imagine" how it's gotten this bad or how much worse it can get, or how "this is not who we are", should at some point no longer be taken as suckers in good faith, and must rightly be viewed as either willfully complicit bad-faith interlocutors or useful idiots.


Learning about WWII in high school, I often wondered how people allowed the Axis leaders to gain power. Now I know. However, I feel we're worse for allowing it to happen, because we were supposed to "never again".


Agreed. To see "Never Again" morphed into "Never Again for me, Now Again for thee" has been one of the most heart-wrenching, sleep-depriving things I've witnessed since some deaths in my family.


Watching it in real time, I still don't understand it. I could see how Trump won the first time around; Hillary Clinton was unpopular with most people outside her party's leadership. But the second time just seems insane. The kinds of things that would happen were obvious to me, and I am no expert.


Two-party system. Just as many people didn't like Hillary, clearly a lot of people were unhappy with Biden->Harris. When you don't like the current admin's direction and/or their party, there's only one other party to select. I think there were plenty of voters who truly did not believe this would be the result of that protest vote.


Protest votes are probably overstated, I think most of it comes down to people staying home. Everybody in America already knows what side they're on, and they either vote for that side or not at all. Virtually all political messaging is either trying to moralize your side or demoralize the other, to manipulate the relative ratios of who stays home on election day.


> I think most of it comes down to people staying home

Obama was able to get people motivated. Neither Biden nor Harris had anywhere near that motivating ability. I don't know that the Dems have anyone as motivating as Obama lined up. The Dems seem to be hoping that enough people will be repulsed by the current admin to show up.


> Obama was able to get people motivated. Neither Biden nor Harris had anywhere near that

How do you explain Biden getting so many more votes than Obama, even while Trump improved with Black and Hispanic voters over past Republican candidates?


> How do you explain Biden getting so many more votes than Obama

US population in 2008: 304 million

US population in 2020: 332 million

https://www.macrotrends.net/global-metrics/countries/usa/uni...

Barring enormous turnout differences, pretty much every US election gets more raw votes than the last.


Interesting theory, explain 2024 when the total went down.

Simple enough explanation… 2020 was a massive outlier.

If you've forgotten, the topic is GP saying Biden didn't motivate voters. Well, that does not seem correct.


2024 was a massive outlier. The first Black woman nominee ever, and the first time a candidate got swapped out mid-campaign. You can't extrapolate much from that one.


I think people were highly motivated in 2020 because of Trump, not Biden. The turnout would have been similar for any credible candidate running against Trump.

What's weird to me is that a lot of people lost that motivation over the next four years. If they found Trump scary in 2020, they should have found him scary in 2024.


And then in 2024 they were motivated in exactly the opposite direction, enough for Trump to win the popular vote, improve with every demo except white women, and move almost every single county in the country to the right?

Why would Trump be so unpopular to boost Biden in 2020, then do so much better in 2024?


> Why would Trump be so unpopular to boost Biden in 2020, then do so much better in 2024?

1. He was President at the time, and people blame the President for what's happening (COVID then, recession now). Same deal now.

2. It didn't wind up being Trump/Biden in 2024 at all.


Newsom is an extremely strong candidate. Vance has several critical vulnerabilities that can demoralize right wing voters if the election is handled properly, and the Republicans really don't have anybody else. Rubio maybe, but Rubio won't be able to get ahead of Vance.


> Newsom is an extremely strong candidate.

For what office? President? Do you live in California?


Trump had more than several critical vulns as well which did not dissuade voters. The electorate isn't as predictable as many try to make it sound


Trump was able to moralize his voters, despite his weaknesses, by using a kind of charisma that Vance utterly lacks.


I think Vance isn't planning on using charisma, but violence.


Prior to 2020, I usually voted for third parties so I do understand that kind of thinking. The danger Trump represented was not obvious until well after he took office; it seemed early on like congress and institutional norms would restrain him. To swing the popular vote in the 2024 election, almost all of the third party votes would have needed to go to Harris, so I don't think that's sufficient to explain it.

By the end of his first term, the danger was hard to miss, and the attempt to remain in power after losing the election should have cemented it for everyone.

I was unhappy with Biden and Harris. I voted for them in 2020 and 2024 anyway because I understood the alternative.


> The danger Trump represented was not obvious until well after he took office

I don't get it, was there anything surprising about him after his inauguration? He sure sounded dangerous on the campaign trail.


The norm in 2016 was that candidates didn't make a serious attempt to do the more outlandish things they talked about in their campaign. When they did, advisers would usually talk them into a saner version of it, or congress wouldn't allow it.


Trump 45 had "adults in the room". Trump 47 has nothing but sycophants. The end of Trump 45 started eliminating the adults in the room, but there wasn't enough time left for him to do much drastic. Trump 45 felt like even Trump was shocked he won and there was no real game plan. The transition team was woefully unprepared. Trump 47 had 4 years of prepping with the aid of things like Project 2025. Trump 47 hit the ground running.


> The danger Trump represented was not obvious until well after he took office;

I just do not understand this sentence at all. The writing was clearly on the wall. All of the Project 2025 conversations told us exactly what was going to happen. People claiming it was not obvious were, at best, not paying attention at all. For anyone paying attention, it was horrifying to see the election results coming in.


Project 2025 did not exist in 2016. We are in agreement about 2024.


Not the second time, the third time. Remember that Biden whooped Trump's ass once and could have whooped his ass a second time, but the donor class (career retards) got cold feet when they were forced to confront his senility, and instead of letting the election be one senile old man against another senile old man, they replaced Biden with the archetype of an HR bitch. I hope nobody thinks it a coincidence that the two times Trump won were the two times he was up against a woman. Americans don't want to vote for their mother-in-law, nor for the head of HR. And yes, that certainly is sexist, but it is what it is.

I just pray they run Newsom this time. Despite his "being from California" handicap, I think he should be able to easily beat Vance by simply being a handsome white man with a white family. Vance is critically flawed and will demoralize much of the far right IFF his opponent doesn't share those same weaknesses.


Worse, I often wondered how some people collaborated. Now I know that many people would rather have a chunk of the population rounded up and killed than lose their job.


"Whoever can make you believe absurdities, can make you commit atrocities." and "It is difficult to get a man to understand something, when his salary depends on his not understanding it."

etc, etc. So it goes


You have to remember that "the government" is not a monolith. Evidence goes before a judge who is (supposed to be) independent, and cases are tried in front of a jury of citizens. In the future that system may fall, but for now it's working properly. Except for the Supreme Court... which is a giant wrench in the idea that the system still works, but that doesn't mean a lower court judge won't jettison evidence obtained at gunpoint.


The courts may (still) be independent, but it feels like they are pointless because the government just wholesale ignores them anyway. If the executive branch doesn't enforce, or selectively enforces, court judgments, you may as well shutter the courts.


I would say if you ever want to prosecute these people, then you're going to need a mountain of evidence. Having all these court decisions they flagrantly violate for 4 years is going to be great evidence at their trials. It may not give us much solace now, but all of their defenses are going to rest on intent, and it's going to be helpful in proving their state of mind that they willfully defied federal court orders, lied to judges, etc. Because they control all the evidence of their own criminality within the Executive branch, we should expect a lot of it will be destroyed when they are gone. Establishing independent records now through court proceedings will preserve those records past the administration's end date.


> Evidence goes before a judge

What evidence went before a judge prior to the two latest executions in Minneapolis?


There's a pretty big difference between getting killed in an altercation with ICE, and executing someone just because they refuse to give up their password.


Not really. ICE breaks into your home — remember they don't need a warrant for this. Demands to see your phone. It's locked. Holds a gun to your head and demands you unlock it. You refuse. Pulls the trigger.

Does it really seem that far-fetched when compared to the other ICE murders?


> Does it really seem that far-fetched when compared to the other ICE murders?

No, not really, because in the two killings you can vaguely argue they felt threatened. Pointing a gun to someone's head and demanding the password isn't anywhere close to that. Don't get me wrong, the killings are an affront to civil liberties and should be condemned/prosecuted accordingly, but to think that ICE agents are going around and reenacting the opening scene from Inglourious Basterds shows that your worldview can't handle more nuance than "fascism? true/false".


> but to think that ICE agents are going around and reenacting the opening scene from Inglourious Basterds shows that your worldview can't handle more nuance than "fascism? true/false".

Precisely.

There's no question that ICE is daily trampling civil liberties (esp 4th amendment).

But in both killings there is a reasonable interpretation that they feared for their lives.

Now should they have is another question. With better training, a 6v1 < 5ft engagement can easily disarm anyone with anything less than a suicide vest.

But still, we aren't at the "run around and headshot dissenters" phase.


The old 'shoot em in the leg' defense.


> there is a reasonable interpretation that they feared for their lives

... Did you watch the videos from multiple people filming?


> ... Did you watch the videos from multiple people filming?

Yeah, did you? Any more substantive discourse you'd like to add to the conversation?

To be clear about the word "reasonable" in my comment, it's similar to the usage of the very same word in the phrase "beyond a reasonable doubt".

The agents involved in the shootings aren't claiming that:

- the driver telepathically communicated their ill intent

- they saw Pretti transform into a Satan spawn and knew they had to put him down

They claim (unsurprisingly, to protect themselves) that they feared for their life because either a car was driving at them or they thought Pretti had another firearm. These are reasonable fears, that a reasonable person has.

That doesn't mean the agents involved are without blame. In fact, especially in Pretti's case, they constructed a pretext to begin engagement with him (given that he was simply exercising his 1st amendment right just prior).

But once in the situation, a reasonable person could have feared for their lives.


> once in the situation, a reasonable person could have feared for their lives.

Sure, all things being equal, a person on the Clapham omnibus, yada, yada.

However, specifically in this situation it is very frequently not "median people" in the mix; it is LEO-philic wannabe (or ex) soldier types that are often exchanging encrypted chat messages about "owning the libs" and "goddamn <insert ethnic slur>s", and swapping grooming notes on provoking "officer-induced jeopardy"... how to escalate a situation into what passes for "justified homicide", or at least a chance to put the boot in.

Those countries that investigate and prosecute shootings by LEOs often find such things at the root of wrongful deaths.


You're not really disagreeing with the parent.

> That doesn't mean the agents involved are without blame. In fact, especially in Pretti's case, they constructed a pretext to begin engagement with him (given that he was simply exercising his 1st amendment right just prior).


> You're not really disagreeing with the parent.

Was there anything else you would like to add as an observation?


They haven't for a long time; it's just that most of the time they were doing things we thought were for good (the EPA, the Civil Rights Act, the Controlled Substances Act, etc.), and we thereby entered a post-constitutional world to let that stuff slide by, despite the 10th amendment limiting federal powers to enumerated powers.

Eventually we got used to letting the feds slide on all the good things, to the point that everything was just operating on slick ice, and people like Trump just pushed it to the next logical step, which is to also use the post-constitutional world to his own personal advantage and for gross tyranny against the populace.


If civil rights are unconstitutional, you don't have a country.


The civil rights acts had firm constitutional grounding in the 14th and 15th amendments.


The 14th and 15th Amendments were binding on government. The Civil Rights Act was binding on private businesses, even those engaging in intrastate trade.

The Civil Rights Act of 1875, which also tried to bind private businesses, was found unconstitutional in doing so, despite coming after the 15th Amendment. But by the '60s and '70s we were already in a post-constitutional society: between FDR's threat to pack the courts, the 'necessities' implemented during WWII, and the progressive era, SCOTUS more or less ended up deferring to everything as interstate commerce (most notably in Wickard v. Filburn). The 14th and 15th Amendments did not change between the time the same things were found unconstitutional and the time they magically became constitutional ~80+ years later.

The truth is, the Civil Rights Act was seen as so important (that time around) that they bent the constitution to let it work. And now many of the most relied-upon pieces of legislation rest on a tortured interpretation of the constitution, making things incredibly difficult to fix and setting the stage for people like Trump.


All they have to do is pretend to be a concerned neighbor who wants to help give mutual aid and hope that someone in the group chat takes the bait and adds them in. No further convincing is needed.


social engineering for the win.


They'll just threaten to throw the book at you if you don't unlock your phone, and if you aren't rich, your lawyer will tell you to take the plea deal they offer because it beats sitting in prison until you die.


If you aren't saving people's phone numbers in your own contacts, Signal isn't storing them in group chats (and even if you are, it doesn't say which number, just that you have a contact with them).

Signal doesn't share numbers by default and hasn't for a few years now. And you can toggle a setting to remove your number from contact discovery/lookup entirely if you are so inclined.


Which is just a redux of what I find myself saying constantly: privacy usually isn't even the problem. The problem is the people kicking in your door.

If you're willing to kick in doors to suppress legal rights, then having accurate information isn't necessary at all.

If your resistance plan is to chat about stuff privately, then by definition you're also not doing much resisting to, you know, the door kicking.


> it is beyond the ability of even most parts of the US government to look inside.

I'm sure the Israeli spyware companies can help with that.

Although then they'd have to start burning their zero days to just go after protestors, which I doubt they're willing to do. I imagine they like to save those for bigger targets.


Cellebrite can break into every phone except GrapheneOS.


Cellebrite still requires the device to be confiscated. So if they are trying for mass surveillance, they'll have to rely on phishing or zero-day exploits to get their spyware on the device to intercept messages. These tend to get patched shortly after being seen in the wild (like the recent WhatsApp one), so they need to decide if it's worth it to burn that zero day or not.


There are multiple companies that can get different amounts of information off of locked phones including iPhones, and they work with LE.

I’m also curious what they could get off of cloud backups. Thinking in terms of auth, keys, etc. For SMS it’s almost as good as phone access, but I am not sure for apps.


or convince one member of a group chat to show their group chat...


I'm confident the people executing non-compliant people in the street would be capable of compelling a citizen.


Or just let the person enter the country after unlocking their phone.



This is accurate, but the important point is that threatening people with wrenches isn’t scalable in the way mass surveillance is.

The problem with mass surveillance is the “mass” part: warrantless fishing expeditions.


Hunh. We haven't even started talking about Stingrays, tracking radios, and so forth.


it is difficult to wrench someone when you do not know who they are


Someone knows who they are and they can bash different skulls until one of them gives them what they're looking for.


Who is someone?


I mean they have a lot of tools to figure out who you are if they catch you at a rally or something like that. Cameras and facial identification, cell phone location tracking and more. What they also want is the list of people you're coordinating with that aren't there.


It's even easier than that. They're simply asking on neighborhood Facebook (and other services too, I assume) groups to be added to mutual aid Signal groups and hoping that somebody will add them without bothering to vet them first.


I think disappearing messages only works if you activate it on your local device. And if the man compromises someone without everyone else knowing, they get all messages after that.

But yes... it does limit what can be read. My point is it's not perfect.


Is the message on storage zero'd out or just deleted?


> compel that person to give up their unlocked phone

Cellebrite, or just JTAG over Bluetooth or USB. It's always been a thing, but legally they are not supposed to use it. Of course, laws after the NSA debacle are always followed. Pinky promise.


Every time I see this, I upvote it

I'm sure it's different than it was when I was a teenager, but building Linux From Scratch was the thing that got me into computers as a kid

It shows that computers can be accessible _and modifiable_ at the lowest levels


Having done it as a teenager myself when it showed up in 1999, that's probably the sweet spot: when we are smart (and persistent) enough to figure problems out, but also have enough time to see it through! :D


It is a bit different indeed - more things to compile nowadays. Things such as LLVM take quite some time to compile too. CMake and Meson are also needed these days.

Other than that it still works fairly well.
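
For a sense of what the LLVM step looks like today, here is a rough sketch of the usual out-of-tree CMake/Ninja configure-and-build, wrapped in a small Node/TypeScript driver only so the commands sit in one runnable place. The paths, the clang-only project list, and the job count are assumptions; run it from an llvm-project checkout with cmake and ninja installed.

    import { execSync } from "node:child_process";

    // Typical out-of-tree LLVM build: configure with CMake, then build with Ninja.
    // Adjust -j and LLVM_ENABLE_PROJECTS for your machine and needs (assumed values here).
    const steps = [
      "cmake -S llvm -B build -G Ninja -DCMAKE_BUILD_TYPE=Release -DLLVM_ENABLE_PROJECTS=clang",
      "ninja -C build -j 8",
    ];

    for (const cmd of steps) {
      console.log(`+ ${cmd}`);
      execSync(cmd, { stdio: "inherit" }); // stops at the first failing step
    }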


I’d love to see how this impacts their bottom line

Sure short term it’s more “focused” and “greedy”

But the damage to the community, and to acquisition through a free tier, must drop those numbers in an impactful way


Oh, it will - but they don't care. I'm sure they'll eke out 1.5b from their 1.3b acquisition and be happy as clams.

It certainly is depressing to look at what was built and what could be made of it, but most of the folks with money lack the creativity or skill to actually build a lasting business. Just burn it down and rob it on the way out - such is the modern economy.


I mean, Broadcom / VMware is basically doing the same, just more for enterprise-level software.

OTOH - if Vimeo has given up all hope of further new features, then giving current users the chance to keep going isn't completely evil, even if it's at a higher price. VMware is basically doing the same, and lots of customers are leaving, and those who aren't may still eventually do so, etc. (Edit: what if the alternative was Vimeo shutting down?)

Think of vintage car parts - if you absolutely want to restore that '30s Ford (keep using 20+ year old software) - someone offering an OEM-equivalent part for 3x what it cost back in the day (even adjusted for inflation) may actually still be good value - because what other alternatives are there?

Now - does it suck for the employees? Sure. One thing an econ prof said back in the late 90s (who loved to guest-lecture to CS/SWEng students): your job as a software person is to put other people out of work by automating stuff they used to do manually. Are you ok with that? Because if you're not, you should go into a different industry right now. Feels much worse when it's programmers getting the axe due to finance types, but not unexpected.


Why so?

Apple explicitly acknowledged that they were using OpenAI’s GPT models before this, and now they’re quite easily switching to Google’s Gemini


The ChatGPT integration was heavily gated by Apple and required explicit opt-in. That won't be the case with the Gemini integration. Apple wants this to just work. The privacy concerns will be mitigated because Apple will be hosting this model themselves in their Private Cloud Compute. This will be a much more tightly integrated solution than ChatGPT was.


And you don't think they will include an abstraction layer?


An abstraction layer doesn’t prevent Google from seeing the data. Last year the story was that Apple would be running a Google model on their (Apple’s) own server hardware.


This story says the custom model will run on-device and in Apple's Private Cloud Compute. The implication is that Google will not see the data. The "promise" of Private Cloud Compute is that Apple wants it to be trusted like "on-device".

Presumably, cutting Google out of getting the data from this is part of why this story was first mentioned last year but only now sounds close to happening. I think it's the same story/project.


Yes, and that's still the story, as far as I can tell. So an abstraction layer would let them swap out the underlying model


I guess the question is, when are they going to use their own model?

Surely research money is not the problem. Can't be lack of competence either, I think.


I think they want it to work well with web search. That's why Google is the obvious choice. Also, their AI offering is at low risk of getting eliminated, whereas OpenAI could fail at any time.


It appears to be lack of competence given they lied about the initial features of Apple Intelligence.

First, they touted features that no one had actually built, and then fired their AI figurehead "leader," who had no coherent execution plan. There also appears to have been territorial squabbling about who would build what.

How on earth did Apple Senior Management allow this to unravel? Too much focus on Services, yet ignoring their absolute failures with Siri and the bullshit that was Apple Intelligence, when AI spending is in the trillions?


There is just too much money being burned in AI for Apple to keep researchers. Also, models have no respect for original art, which leads to a branding issue for a platform for artists.

Apple is competent at timing when to step into a market, and I would guess they are waiting for AI to evolve beyond being considered untrustworthy slop.


Someone tried to get into my account 2 days ago by attempting to reset it with “forgot password”

That’s never happened to me before, wonder if it’s related


The first line in the article alludes to this:

"If you received a bunch of password reset requests from Instagram recently, you're not alone."


Yes, it happened a couple of days ago on my hidden, inactive account. I had it for 13 years and it never happened before.


Wow, exactly the same issue for me, and for two different accounts of mine!


Same for me. Also never happened to me before


My hot take: We’re already in a recession.

We just use AI to justify doing all the same recession behaviors while making it sound like innovation

- not hiring this year > AI is making us productive!

- no wage growth > AI means we don’t need to raise salaries

- layoffs > with AI we can do more with fewer people

- spending less on offsites, work perks etc > we really need that budget for AI

- not spending money on that new business tool > AI can do it instead


My hotter take: we're in an economic death spiral. There isn't enough juice that can be squeezed, short of mass wealth redistribution, to reinvigorate the economy, and what is out there is locked up in stocks that don't translate to revenue. Slashing rates will just skyrocket prices and over-inflate our monetary supply, and further tax cuts will only whittle away what's left of government services and social safety nets. Companies are going to start collapsing in a domino effect as liquidity dries up and contracts get cut.


Massive wealth redistribution is just another word for Taxes.

When we taxed the rich we were happy. It’s literally all we gotta do.


https://fred.stlouisfed.org/series/FYFRGDA188S Federal Receipts as a Percent of GDP measures total government revenue relative to the size of the economy, serving as a standardized way to track the federal tax burden over time.

Although the top marginal tax rate in the 1950s indeed exceeded 90%, federal receipts hovered around 17% of GDP; this was nearly identical to current levels, because loopholes and high income thresholds (roughly equivalent to $2MM for single/$4MM for couples) meant almost no one actually paid that top rate.

The effective tax rate for the top 1% was closer to 42% rather than 90%, demonstrating that extremely high statutory rates on paper do not necessarily generate proportionally higher government revenue.


The total tax intake is fairly unambiguous. Personal tax bill for the vast majority of people is also quite clear. But when you get to the top 0.1% or whatever it is, things like "income" and "tax" get ambiguous. I suppose a lot of the ultra rich don't have much earnings, at best cap gains, and even that can be offset, boxed and off-shored ad nauseam.

Maybe instead of looking at the ultra rich we could look at what GDP fraction the "bottom 95%" contribute to the tax burden - is that more or less than before. Not sure where to look for this data but sounds like a nice little exercise.


> But when you get to the top 0.1% or whatever it is, things like "income" and "tax" get ambiguous. I suppose a lot of the ultra rich don't have much earnings, at best cap gains, and even that can be offset, boxed and off-shored ad nauseam.

But the denominator isn't "income", it's GDP. That's far harder to "offset, boxed and off-shored ad nauseam".


Sure - but not at an individual level. If you're asking, how much tax are the rich paying now vs before, as a % basis, you need an individual numerator and denominator.

Maybe the total tax take now is 17% of GDP, same as before, but when measured correctly, the overall tax rate for the super-rich has gone down, and for the plebs gone up.

To even identify who the super-rich are in this exercise, you may need to be careful with the definition of "rich". If eg you go for highest income earners, you might find upper middle class people instead, with the super-rich having no supposed income as such.


Raising taxes should never be seen as a way to raise revenue. Even if the Laffer curve has come under attack, there is still some profit-maximizing rate, which I'm positive most modern countries are beyond, both at a static rate and at a growth-and-future-revenue-maximizing rate. No, we don't tax at this point to increase tax revenue. We do tax to shape what society looks like.

Right now society doesn’t look very good to so many people in the US it’s almost hard to talk about. Job growth is literally people saying, “hey, tomorrow, I can see it look better. We can spend time and resources to create something we all want more than today.” When job growth is low, that vision must also be low.

Taxation can turn that around in an industry. It can turn that around in aggregate. It does that by both signaling to players, and by changing the game tree payout structure.

I think much of the taxation conversation right now is unfortunate because it keeps getting couched in terms of tax brackets, and that is almost a strawman at this point (even if many people think it’s important). I would say we need to tax the 1% differently. For instance, stock buy backs are currently a hugely distorting effect on the world economy. You can start by greatly taxing that.

The real thing people are talking about when talking about taxing the 1% isn't just about tax brackets; it's more about how taxes don't materially affect people once they reach certain thresholds. It's the same fundamental problem with traffic tickets. They are not proportional to general wealth, so that means it's a set of laws that applies less and less as one gains wealth, which not only feels unfair, it is arguably a corrupting influence undermining the rule of law.


I am choosing not to get involved in a discussion about tax policy minutiae, as I am not an expert in any related way; instead, I wanted to provide factual context to the oft-repeated "America was better in the 1950s due to the tax rate on the rich" claim, so folks might be able to better understand what they're attempting to say.


It's about how the taxes are spent too. If the government cuts welfare and gives handouts and subsidies to special interests, that is not an effective redistribution.


It would be interesting to see the same graph broken down by wealth (preferably) or income quintile. Maybe higher tax rates don't mean more tax income, but it does mean more wealth redistribution.


no one pays the top rate today either. I've spent a large chunk of my adult life in the top 5% of income earners and I've never had an effective tax rate over 17% that I can recall.


Your experience as a 95th-percentile earner does not negate that people in the 99th percentile of earners pay the top rate. (I paid the top rate last year.)


and the effective tax rate today for the ultra wealthy is 0.

Shit, a few years ago Jeff Bezos got a tax credit for his kids.

Think of how absurd that is. Jeff Bezos, the founder of a multitrillion dollar corporation that already receives billions in government contracts and subsidies, who owns a $500m yacht and a multibillion dollar real estate portfolio, asked for, and was given, a tax credit for his adult children.


It really is this simple and this chart of Net Worth Held by the Top 0.1% demonstrates it well -

https://fred.stlouisfed.org/series/WFRBLTP1246


That chart seems to demonstrate that there was a lot of inflation in the early 2020s, which I assume is not your point.


Here's one for share of net worth: https://fred.stlouisfed.org/series/WFRBSTP1300

(Note: I am not GP, and am not necessarily saying you can draw conclusions from this one chart, just that the change in net worth cannot be attributed solely to inflation.)


You can argue it was primarily due to inflation (changes in the economy can never be attributed solely to any one thing).

The upper 0.1% largely owns things that are relatively safe from inflation (like expensive real estate in areas where increases in value have exceeded the rate of inflation for decades) while the lower 50 - 80% does not.

It's effectively impossible to prove definitively, but I find it hard to believe it's a coincidence that the share of wealth held by asset-heavy individuals shot up at the exact same time the money supply increased significantly, especially given that lower-class wages were actually increasing faster than both inflation and upper-class wages at the same time.


You thought that graph mirrored inflation? That $4 in 2000 was $24 today?

I find that difficult to believe.


My hotter take: We've been in a depression since '08 https://fred.stlouisfed.org/graph/?g=cMV


What if some of the population is experiencing a depression, some are in a recession, some are in stagnation, and some are experiencing an expansion?


I don’t know about the US but growth in the UK has been anaemic since the financial crisis.


> excluding current transfer receipts

Why was this metric chosen? How does excluding current transfer receipts show that we're "in a depression"?


Yeah, there's absolutely a reason that we're not releasing a lot of key economic numbers and it's not because the government needs to clean house of people determined to sabotage things, and it's not because the numbers are so amazing that we have the "best economy the country has ever seen", either.


this needs to be shouted from the rooftops. something drastic needs to happen or else we will be the harbinger of a large-scale societal collapse.


> we will be the harbinger of a large-scale societal collapse.

The harbingers are already here and have been for a while now. Harbingers are signs that things are breaking down. More like "or else we will have a large scale societal collapse".


I've been saying this since last year too.

Sincerely, software developer turned fast food manager.


My even hotter take: This was always the inevitable result of shareholder capitalism.

The only outcomes left are either unchecked descent into fascism as oligarchs consolidate power and finish their government takeover before their current power base falls apart, or a successful socialist revolution.

This is what it's always been leading to.


>This was always the inevitable result of shareholder capitalism.

Just capitalism. There's no actual or necessary distinction about what shape that capitalism takes.

A system where getting more money means you have more opportunity to generate more money has, by itself, all the feedback loop you need to consolidate over time, generate monopolies, and end up here.

Competition just doesn't happen in a free market. Actually competing, and trying to win market share or mindshare that way, is too expensive, as there are much simpler and cheaper ways to impact a market.

Competition requires a fair market. This was fully understood by both Roosevelt trust busters, and both Teddy and FDR made big talk about "I'm not trying to kill business, I just want them to compete because that's such a force multiplier".

It doesn't take a socialist revolution. All it takes is like a gentle sprinkling of welfare and a fair and competitive market.

Granted, we have a lot of work to do to make the current market competitive. We've allowed so much consolidation that we would probably have to actively break up companies; we would have to nullify lots of contracts and IP rights and reduce the power a EULA can hold over you. Interoperability is necessary for competitive markets, so we would have to roll back the DMCA anti-circumvention language. Improved customer rights would also help.


> Competition requires a fair market.

Adam Smith's term in "The Wealth of Nations" is "freely competitive market". The "free market" bastardization came much later.


There is a saying that fascism is capitalism in crisis. I don't think this reactionary crisis response is restricted to capitalist economies or fascism, though. We can see this, for just one example, in the various persecutions of Jews in Europe during the Middle Ages any time there was a crop failure or a plague or some similar disaster.


I think that's mostly just HNers assuming AI like Claude Code is already penetrating the day-to-day work of the workforce.

"If I use it, then everyone is probably using it".

Yet AI penetration is so low right now that it probably has zero role in the job market.

And it keeps us distracted from talking about the real reasons behind job opening decline.

That said, once AI ubiquity picks up within the next few years, we'll have all of the existing problems we're not talking about... plus AI. And we'll probably be even less capable of talking about the complexities of the market intelligently.


I think parent comment was talking about hype vs reality rather than disagreeing with you.

"We're not hiring but AI is in the news" = "We're not hiring because of AI! Don't sell our stock!" It's independent of actual current or future AI adoption.


Maybe. I am likely not a typical HNer, but my company actually has use of AI in our 2026 goals. I am not guessing. I know the majority of people in this company have those goals baked in. Now, can I say for sure that other, similar companies do the same? No. But even if they don't, it does not matter, because the companies that don't allow AI have people who use it anyway.


FOMO.

https://www.axios.com/2025/08/21/ai-wall-street-big-tech

Have we forgotten this? It'll find its niche, but it isn't yet a truly transformative one.


<< it isn't yet a << yet

That is a lot of pressure to put on a conjunction. It is up there along with 'it will never be'.

In all seriousness (and some disclosure), I like this tech, so I am mildly biased in my stance. That said, I almost fully disagree with yours.

As much as I dislike Nadella, his last blog entry is not that far off. Using LLMs for stuff like email summaries is... kinda silly at best. The right use cases may not have emerged yet, but, in a very real sense, it already has been transformative..


"it already has been transformative.."

Yea, at being a search interface. But what else? Not that it can't be, but the failure rate for AI is absurd right now. What happens if it collapses and all it's used for is answering questions on your phone and maybe better search of your emails? That seems to be a real and probably likely outcome. What then? Ironically, I think it will improve the economy, because there are a lot of decisions that are on hold until we know what LLMs will be used for. It probably isn't going to be good for SEs either way.


<< but the failure rate for AI is absurd right now.

I keep a personal log of specific failures for simple CYA reasons. I do get some, but I can't honestly say it does not seem high to me. A lot likely depends on what is defined as a failure (to me it is typically a clearly wrong result). But those clearly wrong results do not seem to cross 10% of output... so about the same as the average human.


The writing is on the wall for AI. It is coming fast and it is transformative. That your company is still trying to ramp up AI adoption and processes for 2026 supports my point.

But we've been blaming AI for a couple years now, yet I suspect it's still too early in the adoption curve to have a meaningful impact on hiring compared to more boring explanations.


Even if AI wasn't being used for daily tasks by general employees, it's being used by HR and staff sourcing firms to sort through applications, so it already has had a large (negative) impact on hiring.

Maybe we should do an "Ask HN" for those in HR or adjacent roles to poll for experiences there.


"it's being used by HR and staff sourcing firms to sort through applications"

I think you are correct, but is anyone happy about the current situation? I suspect it will change and that change very likely will intentionally not involve AI. I suspect it will be an economic solution, not a technological one.


I hear what you are saying. In a very practical sense, I have no real way to measure either of those factors, and the company I work for is international, so that does not allow for an easy extrapolation. I guess what it really means is: we will find out :P


Really? I see H1B as the tiniest drop in the bucket compared to AI, at least in software. It's not that AI is filling 1 human role with 1 AI, it's that everyone who has a job knows that they need to keep it because the market is insanely cutthroat right now. Everyone has an AI-polished resume, and employers no longer see the value in having talented employees. Even if they did have talented employees, they don't trust them enough to know how to do the work. If your employer says "I need you to start using AI", they may as well be saying "I don't trust you to know what is worth your time." I see even a lot of people who have jobs acting in a way that's consistent with being on the verge of being fired, which I think is most of the real "value" of AI so far.


Same here, basically word for word.


What industry and roles?


Finance, but tech adjacent. I am not super comfortable going into more detail.


yes, AI isn't penetrating those fields with high job losses at all


AI isn't penetrating, but all the money needed to invest in the economy has moved over. Maybe that's also part of the problem.


Bespoke AI has not gotten everywhere but generic AI absolutely has.

The workforce is happily making themselves more efficient by using AI on their phones for what used to be a multi-step "look it up in the literature or your supplier's catalog, or consult the instructions, or read the rules" process when performing cookie-cutter tasks they know but don't remember exact specifications for.


Do you have a source for that? Everyone I know who works outside of tech is complaining about how AI is making their jobs harder because it’s wrong so much of the time that they’re spending more time correcting it than it saves, and it’s been a boon for cheaters looking to remove obvious tells from their attacks.


I'm talking about people who shower after work, not people who shower before work.

I have no doubt that people who are having AI foisted upon them by admins at the behest of someone else hate it.

They use AI as basically a leveled-up version of the summaries Google used to provide for certain search types. It saves them a bunch of obnoxious clicking around on the internet, or in software that was never designed for mobile or to make giving up the kind of info they're seeking easy.


That’s usually also followed shortly by learning that you can’t trust the results or you’ll be making customers whole.


These people usually know enough to know when it's "not quite right". Same "don't trust the docs" story that existed in many workplaces long before AI.

An example I saw recently: someone asked for a modern equivalent of a grease that's no longer made/relevant, and it replied back with some weird aviation stuff. The "real" answer wound up being "just use anything; the builder's intent in specifying was to prevent you from using tallow or some other crap 100 years ago"


Sources for this?


Hot take? Half of the US is formally in recession already.

https://fortune.com/2025/10/09/america-feels-recession-state...

The economy is completely fucked and we are in a race to steal and horde all the data before people catch on.


[flagged]


22/50 ~= half. No one claimed it was half the population, other than you (strangely).

TIL Moody’s Analytics is "doomer FUD".


"Half the US" very rarely means "slightly less than half of the states in the US" without any qualifiers.

And yes, Moody's Analytics chief economist is generally known for making wildly inaccurate predictions.

Regardless, your take on all of this is what I called doomer FUD, which goes well beyond Zandi's take.


You’re getting very worked up arguing about a 6% difference. Please grow up. You can call people doomers all you want, but that doesn’t change the fact that the economy is getting worse.


I'm not worked up about anything, just pointing out that a contrived metric isn't all that meaningful but feeds into the preconceived notions of people like the parent poster who seem to love to submit drive-by comments about how terrible everything is.

The economy is not doing great. That doesn't mean the "economy is completely fucked and we are in a race to steal and horde all the data before people catch on", which is an absurd statement on numerous levels.

And don't tell me to grow up, especially when you've completely missed the point.


> "economy is completely fucked and we are in a race to steal and horde all the data before people catch on"

Have you been following current events for the past year? This is not an absurd statement.

You can't just dismiss news you don't want to accept as "doomer FUD", call a leading economist a "huckster", and then not back it up with real data. Go to Twitter or Truth Social if you want to cherry-pick the news you want to believe in.


Yes, I've been following current events for the last few decades, including the last year. We weren't "completely fucked" in 1990-1991, 2000-2001, 2008, or 2020, (or any of the other cataclysmic events that occurred long before my lifetime) so why would slowing down from the hottest labor market in a generation mean we are "completely fucked" now?

And what on earth does "we are in a race to steal and horde all the data before people catch on" even mean? Who is the "we" who are stealing and hoarding (which is the correct spelling FYI) compared to the "people" who would catch on that it's happening (and presumably care)? What data is being stolen, how, and from whom? Why is hoarding data a problem? Why would any of this matter in the first place if we're "completely fucked" anyway?

Could it be that people like you and the parent poster lack perspective (and potentially some self-awareness) and are prone to overreacting?


Nice - anything to offer up other than name calling? Do you have any ideas of your own related to the displayed data and narrative?


Sure, I already offered some thoughts up-thread.

And I didn't call the parent poster any names, just pointed out that the assessment he was relying on was nonsensical (and probably designed solely to generate ad revenue and brand awareness) and his conclusions were extremely pessimistic compared to the consensus.

Should I have invented my own biased metric (maybe based on land mass since very few of the larger US states are experiencing a recession based on the source provided) as a counterpoint?

What are your ideas concerning the displayed data and narrative?


> My hot take: We’re already in a recession

Personal-savings rate says we're heading into a recession, but aren't yet in one [1]. Labour-force participation, on the other hand, suggests we may be [2].

Assuming we go into another shutdown at the end of the month, none of this may be clear until well into the autumn.

[1] https://fred.stlouisfed.org/series/PSAVERT

[2] https://fred.stlouisfed.org/series/CIVPART


The prime age (25-54 years old) labor force participation rate disagrees.[1] It's almost the highest it's ever been and steady.

[1] https://fred.stlouisfed.org/series/LNS11300060


Also RTO mandates as a way to cover up for layoffs.


Q3 GDP was +4.3%. We will know Q4 in a bit, but the projection is +2.7%.

Hardly a recession


Remove AI data center spending from that.


> - layoffs > with AI we can do more with less people

That's pretty silly. Just look at the unemployment rate, not at the headlines.


Eh, layoffs also declined, consumer spending is up YoY, and we all know about the investments being made in AI that very well might be propping up the economy.

AI, when effective, is deflationary because it allows for similar productivity at a lower cost. That's what you're describing above, not a shadow recession that is being papered over by claims of AI use.

To my point: You could have replaced "AI" with "computing" for most of the last 50 years and been left with the same argument.


> sudo killall coreaudiod seems to fix it for a while

For me this fixes it for about 30 minutes, then I have to do it again… and again… and again…

I wonder why some folks need to do it more than others
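
Rather than typing it by hand every half hour, a tiny Node/TypeScript watchdog could re-run the same command on a timer. This is only a sketch of that idea: the 25-minute interval is an assumption based on the ~30 minutes above, it still needs sudo, and coreaudiod respawns on its own via launchd after being killed.

    import { execSync } from "node:child_process";

    // Assumed interval: a bit under the ~30 minutes after which the problem returns.
    const INTERVAL_MS = 25 * 60 * 1000;

    function restartCoreAudio(): void {
      try {
        // Same fix as above; launchd restarts coreaudiod automatically after it is killed.
        execSync("sudo killall coreaudiod", { stdio: "inherit" });
        console.log(`${new Date().toISOString()} restarted coreaudiod`);
      } catch (err) {
        console.error("killall failed; is coreaudiod running?", err);
      }
    }

    restartCoreAudio();
    setInterval(restartCoreAudio, INTERVAL_MS);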


> It is a vs code fork

Google may have won the browser wars with Chrome, but Microsoft seems to be winning the IDE wars with VSCode



VSCode is Electron-based, which, yes, is based on Chromium. But the page you link to isn't about that; it's about using VSCode as a dev environment for working on Chromium, so I don't know why you linked it in this context.


Which is based on Apple's WebKit? The winner is always the last marketable brand.


Both are based on khtml. We could be living in a very different world if all that effort stayed inside the KDE ecosystem


Which came from "the KDE HTML Widget" AKA khtmlw. Wonder if that's the furthest we can go?

> if all that effort stayed inside the KDE ecosystem

Probably nowhere; people would rather not do anything that contributes to something that makes decisions they disagree with. Forking is beautiful, and I think it improves things more than it hurts. Think of all the things we wouldn't have if it wasn't for forking projects :)


On the other hand, if that had stopped Google from having a browser they push into total dominance with the help of sleazy methods, maybe that would have been better overall.


I still prefer an open source Chromium base vs a proprietary IE (or whatever else) web engine dominating.

(Fixing IE6 issues was no fun)

Also, I do believe the main reason Chrome gained dominance is simply that it got better from a technical POV.

I started webdev on FF with Firebug. But at some point Chrome just got faster, with superior dev tools. And their dev tools kept improving while FF stagnated and instead started and maintained unrelated social campaigns and otherwise engaged with shady tracking as well.


> I still prefer an open source Chromium base vs a proprietary IE (or whatever else) web engine dominating.

Okay but that's not the tradeoff I was suggesting for consideration. Ideally nothing would have dominated, but if something was going to win I don't think it would have been IE retaking all of firefox's ground. And while I liked Opera at the time, that takeover is even less likely.

> Also I do believe, the main reason chrome got dominance is simply because it got better from a technical POV.

Partly it was technical prowess. But Google pushing it on their web pages, and paying to put an "install Chrome" checkbox into the installers of unrelated programs, was a big factor in Chrome not just spreading but taking over.


> And their dev tools kept improving while FF stagnated and instead started and maintained unrelated social campaigns and otherwise engaged with shady tracking as well.

Since when have you not touched Firefox or tried the dev tools?


Where did I say anything like that?

(Wrote via FF)

I use FF for browsing, but every time I think of opening the dev tools, maybe even just to have a look at some site's source code... I quickly close them again and open Chrome instead.

I wouldn't know where to start listing all the things I miss in FF dev tools.

The only thing they had that was interesting to me, the 3D visualizer of the DOM tree, they dropped years ago.


We might not have had Mozilla/Phoenix/Firefox in the first place if so, either, which I'd like to think has been a net positive for the web since inception. At least I remember being saved by Firefox when the options were pretty much Internet Explorer or Opera on a Windows machine.


> they push into total dominance with the help of sleazy methods

Ah, yes. The famously sleazy "automatic security updates" and "performance."

It is amazing how people forget what the internet was like before Chrome. You could choose between IE, Firefox, or (shudder) Opera. IE was awful, Opera was weird, and the only thing that Firefox did better than customization was crash.

Now everyone uses Chrome/WebKit, because it just works. Mozilla abandoning Servo is awful, but considering that Servo was indirectly funded by Google in the first place... well, it's really hard to look at what Google has done to browsing and say that we're worse off than we were before.


Have you read about the process of "enshittification"?


Bah! Just another "Hello World" fork if you ask me.


> Both are based on khtml. We could be living in a very different world if all that effort stayed inside the KDE ecosystem

How so?

Do you think thousands of Googlers and Apple engineers could be reasonably managed by some KDE open-source contributors? Or do you imagine Google and Apple would have taken over KDE? (Does anyone want that? Sounds horrible.)


I think they meant we wouldn't have had Safari, Chrome, Node, Electron, VSCode, Obsidian? Maybe no TypeScript or React either (before V8, JavaScript engines sucked). The world might have adopted more of Mozilla.


Note that these are somewhat different kinds of "based on".

Chromium is an upstream dependency (by way of Electron) for VSCode.

WebKit was an upstream dependency of Chromium, but is no more since the Blink/WebKit hard fork.


that's a bit misleading. it was based on webcore which apple had forked from khtml. however google found apple's additions to be a drag and i think very little of them (if anything at all, besides the khtml foundation) survived "the great cleanup" and rewrite that became blink. so actually webkit was just a transitional phase that led to a dead end and it is more accurate to say that blink is based on khtml.


It's "based on WebKit" like English is based on Germanic languages.


English is a Germanic language. It’s part of the West Germanic branch of the Germanic family of languages.


This fact adds nothing to the discussion


That drives exactly $0 of Apple's revenue. It's only a win if you care about things that don't matter.


And Apple is not even the last node in the chain.

WebKit came from KDE's khtml

Every year is the year of Linux.


I wouldn't bet on an Electron app winning anything long-term in the dev-oriented space.


I strongly disagree.

Firstly, the barrier to entry is lower for people to take web experience and create extensions, furthering the ecosystem moat for Electron-based IDEs.

Even more importantly, though, the more we move towards "I'm supervising a fleet of 50+ concurrent AI agents developing code on separate branches" the more the notion of the IDE starts to look like something you want to be able to launch in an unconfigured cloud-based environment, where I can send a link to my PM who can open exactly what I'm seeing in a web browser to unblock that PR on the unanswered spec question.

Sure, there's a world where everyone in every company uses Zed or similar, all the way up to the C-suite.

But it's far more likely that web technologies become the things that break down bottlenecks to AI-speed innovation, and if that's the case, IDEs built with an eye towards being portable to web environments (including their entire extension ecosystems) become unbeatable.


Many VSCode extensions are written in C++, Go, Rust, C#, or Java, exactly because performance sucks when written in JavaScript, and most run out of process anyway.
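
For what it's worth, that split is the standard pattern: a thin TypeScript shim in the extension host that just spawns the compiled language server and talks to it over LSP. A minimal sketch, assuming the usual vscode-languageclient wiring (the "bin/my-server" binary and the "mylang" language ID are made up for illustration):

    import * as path from 'path';
    import { ExtensionContext } from 'vscode';
    import {
      LanguageClient, LanguageClientOptions, ServerOptions, TransportKind,
    } from 'vscode-languageclient/node';

    let client: LanguageClient;

    export function activate(context: ExtensionContext) {
      // The heavy lifting lives in a separate compiled binary (Rust/Go/C#/...);
      // only this thin glue runs inside the JavaScript extension host.
      const serverOptions: ServerOptions = {
        command: context.asAbsolutePath(path.join('bin', 'my-server')), // hypothetical binary
        transport: TransportKind.stdio,
      };
      const clientOptions: LanguageClientOptions = {
        documentSelector: [{ scheme: 'file', language: 'mylang' }], // hypothetical language ID
      };
      client = new LanguageClient('mylang', 'My Language Server', serverOptions, clientOptions);
      client.start();
    }

    export function deactivate() {
      return client?.stop();
    }

All the parsing, indexing, and completion work then happens out of process, which is why the language of the extension host matters less than people assume.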


> Firstly, the barrier to entry is lower for people to take web experience and create extensions, furthering the ecosystem moat for Electron-based IDEs.

The last thing I want is to install dozens of JS extensions written by people who crossed that lower barrier. Most of them will probably be vibe coded as well. Browser extensions are not the reason I use specific browsers. In fact, I currently have 4 browser extensions installed, one of which I wrote myself. So the idea that JS extensions will be a net benefit for an IDE is the wrong way of looking at it.

Besides, IDEs don't "win" by having more users. The opposite could be argued, actually. There are plenty of editors and IDEs that don't have as many users as the more popular ones, yet still have an enthusiastic and dedicated community around them.


> Besides, IDEs don't "win" by having more users. The opposite could be argued, actually.

The most successful IDE of all time is ed, which is enthusiastically used by one ancient graybeard who is constantly complaining about the kids these days.

Nobody has told him that the rest of the world uses 250MB of RAM for their text editor because they value petty things like "usability" over purity. He would have a heart attack - the last time he heard someone describe the concept of Emacs plugins he flew into a rage and tried to organize a death panel for anyone using syntax highlighting.


I tried switching to Zed and switched back less than 24 hours later. I was expecting it to be snappier than VS Code and it wasn’t to any significant degree, and I ran into several major bugs with the source control interface that made it unusable for me.

People dunk on VS Code but it’s pretty damn good. Surely the best Electron app? I’m sure if you are heavily into EMACS it’s great but most people don’t want to invest huge amounts of time into their tools, they would rather be spending that time producing.

For a feature-rich workhorse that you can use for developing almost anything straight out of the box, or within minutes of installing a few plugins, it’s very hard to beat. In my opinion a lot of the hate is pure cope from people who have probably never really used it.


All these mountains of shit code are going nowhere.


It’s kind of a meme to dunk on Electron, but it’s been here for years.

It’s part of the furniture at this point, for better or worse. Maybe don’t bet on it, but certainly wouldn’t be smart to bet against it, either.


VS Code is technically an Electron app, but it's not the usual lazy resource hog implementation like Slack or something. A lot of work went into making it fast. I doubt you'll find many non-Electron full IDEs that are faster. Look at Visual Studio, that's using a nice native framework and it runs at the speed of fossilized molasses.


> many non-Electron full IDEs

VSCode has even fewer features than Emacs, OOTB. Complaining about full IDEs' slowness is fully irrelevant here. Full IDEs provide an end-to-end experience in implementing a project. Whatever you need, it's there. I think the only plugin I've installed on Jetbrains's ones is IdeaVim, and I've never needed anything else for Xcode.

It's like complaining about a factory's assembly line, saying it's not as portable as the set of tools in your pelican case.


"Complaining about full IDEs slowness is fully irrelevant here. Full IDEs provide an end to end experience in implementing a project."

So? No excuse for a poor interactive experience.


> VSCode has even fewer features than Emacs, OOTB.

No way that is true. In fact, it's the opposite, which is the exact reason I use VS Code.


Please take a look at the Emacs documentation sometimes.

VSCode is more popular, which makes it easy to find extensions. But you don’t see those in the Emacs world because the equivalent is a few lines of config.

So what you will see are more like meta-extensions. Something that either solve a whole class of problems, could be a full app, or provides a whole interaction model.


> Please take a look at the Emacs documentation sometimes.

I've used Emacs.

> But you don’t see those in the Emacs world because the equivalent is a few lines of config.

That is really quite false. It's a common sentiment that people spend their lives in their .emacs file. The exact reason I left Emacs was that getting a remote development setup was incredibly fragile and meant I was spending all this time in .emacs only to get substandard results. The worst you do in VS Code is set some high-level settings for the editor or the various extensions.

Nothing in the Emacs world comes close to VS Code's remote extensions for SSH and Docker containers, nor its Copilot and general AI integration. I can simply install VS Code on any machine, log in via GitHub, and have all of my settings, extensions, etc. loaded up. I don't have to mess around with cross-platform issues and Git-syncing my .emacs file. Practically any file format has good extensions, and I can embed Mermaid, Draw.io, Figma, etc. all in my VS Code environment.

Now, I'm sure someone will come in and say "but Emacs does that too!". If so, it's likely a stretch and it won't be as easy as in VS Code.


In 2025, you really picked Emacs as the hill to die on? Who under 30 cares about Emacs in 2025? Few. You might as well argue that most developers should be using Perl 6.

    > the only plugin I've installed on Jetbrains's ones
By default, JetBrains' IntelliJ-based IDEs have a huge number of plug-ins installed. If you upgrade from Community Edition to a paid license, the number only increases. Your comment is slightly misleading to me.


Just wait until vi steps into the room. Perhaps we can recreate the Usenet emacs vs vi flame wars. Now, if only '90's me could see the tricked out neovim installs we have these days.


They just made a big song and dance about fully updating Visual Studio so it launches in milliseconds and is finally decoupled from all the underlying languages/compilers.

It's still kinda slow for me. I've moved everything but WinForms off it now, though.


I know. It's still the slowest IDE, but I suppose they deserve props for making it better than the Windows 95 speeds of the last version.


VS Code is plenty fast enough. I switched to Zed a few months back, and it's super snappy. Unless you're running on an incredibly resource constrained machine, it mostly comes down to personal preference.


Exactly.

JetBrains, Visual Studio, Eclipse, Netbeans…

VS Code does well with performance. Maybe one of the new ones usurps, but I wouldn’t put my money on it.


I have always found JetBrains stuff super snappy. I use neovim as a daily driver but for some projects the inference and debugging integration in JetBrains is more robust.


Like writing out of process extensions in compiled languages.

VS is much faster considering it is a full-blown IDE, not a text editor, being mostly C++/COM and a couple of .NET extensions alongside the WPF-based UI.

Load VSCode with the same number of plugins, written in JavaScript, to see where performance goes.


Electron apps will win because they're just web apps - and web apps won so decisively years ago that they will never go anywhere.


No. Electron apps won, not web apps. There's a huge difference.


Web apps won as well. Electron is just a desktop specialization of that.


electron is just a wrapper for the browser tho
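
Pretty much literally. A minimal Electron main process is "open a Chromium window and point it at your existing web app"; a sketch, where index.html stands in for whatever page you already have:

    import { app, BrowserWindow } from 'electron';

    // One Chromium renderer window, loading an ordinary web page.
    app.whenReady().then(() => {
      const win = new BrowserWindow({ width: 1024, height: 768 });
      win.loadFile('index.html'); // any existing web app works here
    });

    // Usual desktop lifecycle: quit when all windows close (except on macOS).
    app.on('window-all-closed', () => {
      if (process.platform !== 'darwin') app.quit();
    });

Everything else (menus, IPC, native modules) is optional on top of that.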


It's funny that despite how terrible, convoluted and maladapted web tech is for displaying complex GUIs, it still gradually ate the lunch of every native component library, and they just couldn't innovate to keep up on any front.

Amazon just released an OS that uses React Native for all of its GUI.


It's easy to design bad software and write bad code. Like the old saying: "I didn't have time to write you a short letter, so I wrote you a long one". Businesses don't have time to write good and nice software, so they write bad software.


If they have time to write nice software, they generally can only afford to do it once.

Lots of Electron apps are great to use.


Why do you consider Electron maladapted? It has really reduced the friction to write GUIs in an enterprise environment.


I didn't really mean Electron, but rather the unholy amalgam of three languages, each with 20 years of "development" which mostly consisted of decrapifying and piling up new (potentially crappy) stuff. Although Electron, with the UI context and the system (backend? background?) context both running js, is another can of worms.


> It has really reduced the friction to write GUIs in an enterprise environment.

Thereby adapted to devs' needs, rather than users'.


It's been winning for a while


The anti-Electron meme is a vocal minority who don’t realize they’re a vocal minority. It’s over represented on Hacker News but outside of HN and other niches, people do not care what’s under the hood. They only care that it works and it’s free.

I used Visual Studio Code across a number of machines including my extremely underpowered low-spec test laptop. Honestly it’s fine everywhere.

Day to day, I use an Apple Silicon laptop. These are all more than fast enough for a smooth experience in Visual Studio Code.

At this point the only people who think Electron is a problem for Visual Studio Code either don’t actually use it (and therefore don’t know what they’re talking about) or they’re obsessing over things like checking the memory usage of apps and being upset that it could be lower in their imaginary perfect world.


why? I don't have a problem with it, building extensions for VS Code is pretty easy

Alternatives have a lot of features to implement to reach parity


Complaining about Electron is an ideological battle, not a practical argument. The people who push these arguments don’t care that it actually runs very well on even below average developer laptops, they think it should have been written in something native.


The word "developer" is doing a lot of work there spec-wise.

The extent to which electron apps run well depends on how many you're running and how much ram you have to spare.

When I complain about electron it has nothing to do with ideology, it's because I do run out of memory, and then I look at my process lists and see these apps using 10x as much as native equivalents.

And the worst part of wasting memory is that it hasn't changed much in price for quite a while. Current model memory has regularly been available for less than $4/GB since 2012, and as of a couple months ago you could get it for $2.50/GB. So even a 50% boost in use wipes out the savings since then (1.5 x $2.50 is about $3.75/GB, right back around the 2012 price). And sure the newer RAM is a lot faster, but that doesn't help me run multiple programs at the same time.


I regularly run 6+ electron apps on a M2 Air and notice no slowdown

2x as many chrome instances, no issues


Sure, 6 electron apps by themselves will eat some gigabytes and you won't notice the difference.

If you didn't have those gigabytes of memory sitting idle, you would notice. Either ugly swapping behaviors or programs just dying.

I use all my memory and can't add more, so electron causes me slowdowns regularly. Not constantly, but regularly, mostly when switching tasks.


> The word "developer" is doing a lot of work there spec-wise.

Visual Studio Code is a developer tool, so there’s no reason to complain about that.

I run multiple Electron apps at a time even on low spec machines and it’s fine. The amount of hypothetical complaining going on about this topic is getting silly.

You know these apps don’t literally need to have everything resident in RAM all the time, right?


> I run multiple Electron apps at a time even on low spec machines and it’s fine.

"Multiple" isn't too impressive when you compare that a blank windows install has more than a hundred processes going. Why accept bloat in some when it would break the computer if it was in all of them?

> Visual Studio Code is a developer tool, so there’s no reason to complain about that.

Even then, I don't see why developers should be forced to have better computers just to run things like editors. The point of a beefy computer is to do things like compile.

But most of what I'm stuck with Electron-wise is not developer tools.

> The amount of hypothetical complaining going on about this topic is getting silly.

I am complaining about REAL problems that happen to me often.

> You know these apps don’t literally need to have everything resident in RAM all the time, right?

Don't worry, I'm looking specifically at the working set that does need to stay resident for them to be responsive.


...so if you spend an extra $4 on your computer, you can get an extra GB of memory to run Electron in?

Here's the other unspoken issue: WHAT ELSE DO YOU NEED SO MUCH MEMORY FOR!?

When I use a computer, I am in the minority of users who run intensive stuff like a compiler or ML training run. That's still a minute portion of the total time I spend on my computer. You know what I always have open? A browser and a text editor.

Yes, they could use less memory. But I don't need them to use less memory, I need them to run quickly and smoothly because even a 64GB stick of RAM costs almost nothing compared to how much waiting for your browser sucks.


My motherboard does not support more memory. Closer to hundreds of dollars than $4. And no I will not justify my memory use to you.

And price is a pathetic excuse for bad work. RAM gets 50x cheaper and some devs think it's fine to use 50x as much of it making their app work? That's awful. That's why computers are still unresponsive half the time despite miracles of chipmaking.

Devs getting good computers compounds this problem too, when they get it to "fast enough" on their machine and stop touching it.

And memory being cheap is an especially bad justification when a program is used by many people. If you make 50 million people use $4 of RAM, that's a lot. Except half the time the OEM they bought the computer from charges $20 for that much extra RAM. Now the bloat's wasting a billion dollars.

And please remember that a lot of people have 4GB or 8GB and no way to replace it. Their apps move to electron and they can't run them all at once anymore? Awful.


> RAM gets 50x cheaper and some devs think it's fine to use 50x as much of it making their app work? That's awful.

That's ABSURD.

> That's why computers are still unresponsive half the time despite miracles of chipmaking.

Have you ever actually used VSCode? It's pretty snappy even on older hardware.

Of course, software can be written poorly and still fit in a small amount of memory, too :)

> Now the bloat's wasting a billion dollars.

Unless users had some other reason for buying a machine with a lot of RAM, like playing video games or compiling code.

Do you think most users spec their machines with the exact 4GB of RAM that it takes to run a single poorly-written Electron app?

> And please remember that a lot of people have 4GB or 8GB and no way to replace it. Their apps move to electron and they can't run them all at once anymore? Awful.

Dude, it's 2025.

I googled "cheapest smartphones India" and the first result was for the Xiaomi POCO F1. It has 8GB of RAM and costs ₹6,199 - about $62. That's a whole-ass _phone_, not just the RAM.

If you want to buy a single 8GB stick of DDR3? That's about $15 new.

> My motherboard does not support more memory. Closer to hundreds of dollars than $4.

If you are buying HUNDREDS of dollars of RAM, you are building a powerful system which almost certainly is sitting idle most of the time.

> And no I will not justify my memory use to you.

Nobody is forcing you to run an electron app, they're just not catering to this weird fetish for having lots of unused RAM all the time.


> That's ABSURD.

What is? The devs or my claim? There are apps that use stupid amounts of memory to do the same thing a windows 98 app could do.

And you can do good or bad within the framework of electron but the baseline starts off fat.

> Unless users had some other reason for buying a machine with a lot of RAM, like playing video games or compiling code.

If they want to do both at the same time, they need the extra. Things like music or chat apps are a constant load.

> Dude, it's 2025.

As recently as 2024 a baseline Mac came with 8GB. Soldered, so you can't buy a stick of anything.

> If you are buying HUNDREDS of dollars of RAM

Not hundreds of dollars of RAM, hundreds of dollars to get a different platform that accepts more RAM.

> Nobody is forcing you to run an electron app

I either don't get to use many programs and services, or I have to deal with these problems that they refuse to solve. So it's reasonable to complain even though I'm not forced.

> weird fetish for having lots of unused RAM

I have no idea why you think I'm asking for unused RAM.

When I run out, I don't mean that my free amount tipped below 10GB, I mean I ran out and things lag pretty badly while swapping, and without swap would have crashed entirely.


same people pushing rust as "it's just faster" without considering the complexities that exist outside the language that impact performance?


Ease of writing and testing extensions is actually the reason Electron won the IDE wars.

Microsoft made a great decision to jump on the trend and just pour money into lapping Atom and the like in optimization and polish.

Especially when you compare it to Microsoft's efforts on the desktop. They have accumulated several component libraries over the years, more or less, and I still prefer WinForms.


What other UI framework looks as good on Windows, Mac and Linux?


If you want an electron app that doesn't lag terribly, you'll end up rewriting the ui layer from scratch anyway. VSCode already renders terminal on GPU and a GPU-rendered editor area is experimental. There will soon be no web ui left at all


> If you want an electron app that doesn't lag terribly

My experience with VS Code is that it has no perceptible lag, except maybe 500ms on startup. I don't doubt people experience this, but I think it comes down to which extensions you enable, and many people enable lots of heavy language extensions of questionable quality. I also use Visual Studio for Windows builds on C++ projects, and it is pretty jank by comparison, both in terms of UI design and resource usage.

I just opened up a relatively small project (my blog repo, which has 175 MB of static content) in both editors and here's the cold start memory usage without opening any files:

- Visual Studio Code: 589.4 MB

- Visual Studio 2022: 732.6 MB

update:

I see a lot of love for Jetbrains in this thread, so I also tried the same test in Android Studio: 1.69 GB!


I easily notice lag in vscode even without plugins. Especially if using it right after zed. Ngl they made it astonishingly fast for an electron app, but there are physical limits to what can be done in a web stack with garbage-collected js


That easily takes the prize for worst-designed benchmark, in my opinion.

Have you tried Emacs, VIM, Sublime, Notepad++, ...? Visual Studio and Android Studio are full IDEs, meaning that upon launch they run a whole host of modules and the editor is just a small part of that. IDEs are closer to CAD software than to text editors.


- notepad++: 56.4 MB (went gray-window unresponsive for 10 seconds when opening the explorer)

- notepad.exe: 54.3 MB

- emacs: 15.2 MB

- vim: 5.5MB

I would argue that notepad++ is not really comparable to VSCode, and that VSCode is closer to an IDE, especially given the context of this thread. TUIs are not offering a similar GUI app experience, but vim serves as a nice baseline.

I think that when people dump on electron, they are picturing an alternative implementation like win32 or Qt that offers a similar UI-driven experience. I'm using this benchmark, because its the most common critique I read with respect to electron when these are suggested.

It is obviously possible to beat a browser-wrapper with a native implementation. I'm simply observing that this doesn't actually happen in a typical modern C++ GUI app, where the dependency bloat and memory management is often even worse.


Try gvim, neovim-qt or any other neovim gui client, before calling vim a "TUI only experience".

Also, emacs has been a GUI app since the '90s.


I never understand why developers spend so much time complaining about "bloat" in their IDEs. RAM is so incredibly cheap compared to 5/10/15/20 years ago, that the argument has lost steam for me. Each time I install a JetBrains IDE on a new PC, one of the first settings that I change is to increase the max memory footprint to 8GB of RAM.
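
(For anyone wondering how: in recent JetBrains IDEs that's Help > Change Memory Settings, which, as far as I know, just writes the heap cap into a user-level .vmoptions override, roughly:)

    -Xmx8192m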


> RAM is so incredibly cheap compared to 5/10/15/20 years ago

Compared to 20 years ago that's true. But most of the improvement happened in the first few years of that range. With the recent price spikes RAM actually costs more today than 10 years ago. If we ignore spikes and buy when the cycle of memory prices is low, DDR3 in 2012 was not much more than the price DDR5 was sitting at for the last two years.


> I never understand why developers spend so much time complaining about "bloat" in their IDEs. RAM is so incredibly cheap compared to 5/10/15/20 years ago, that the argument has lost steam for me. Each time I install a JetBrains IDE on a new PC, one of the first settings that I change is to increase the max memory footprint to 8GB of RAM.

I had to do the opposite for some projects at work: when you open about 6-8 instances of the IDE (different projects, front end in WebStorm, back end in IntelliJ IDEA, DB in DataGrip sometimes) then it's easy to run out of RAM. Even without DataGrip, you can run into those issues when you need to run a bunch of services to debug some distributed issue.

Had that issue with 32 GB of RAM on the work laptop, in part also because the services themselves took between 512 MB and 2 GB of memory to run (thanks to Java and Spring/Boot).


I don’t really complain about bloat in IDEs. They have their uses. But VSCode's feature set is that of a text editor, and it's really bloated for that.


I prefer my RAM being used for fs cache or other more useful stuff, instead of launching full lobotomized web browsers.


Anyone saying that Java-based Jetbrains is worse than Electron-based VS Code, in terms of being more lightweight, is living in an alternate universe which can’t be reached by rational means.


> VSCode already renders terminal on GPU

When did they add that? Last time I used it, it was still based on xterm.js.

Also, technically Chromium/Blink has GPU rendering built in for web pages, so everything could run on GPU.


Enabled by default for about a year now

> GPU acceleration driven by the WebGL renderer is enabled in the terminal by default. This helps the terminal work faster and display at a high FPS by significantly reducing the time the CPU spends rendering each frame

https://code.visualstudio.com/docs/terminal/appearance#_gpu-...
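
If you want to check or flip it yourself, the setting the docs are describing is (current name, as far as I know):

    // settings.json
    "terminal.integrated.gpuAcceleration": "on"  // "auto" | "on" | "off"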


It's actually been the default since v1.55 which released early April 2021: https://code.visualstudio.com/updates/v1_55#_webgl-renderer-...

Before that from v1.17 (~October 2017) it was using a 2d canvas context: https://code.visualstudio.com/blogs/2017/10/03/terminal-rend...


Wow, it's true--Terminal is <canvas>, while the editor is DOM elements (for now). I'm impressed that I use both every day and never noticed any difference.


I'm not sure how you went from terminal and editor GPU rendering, which can benefit from it, to "there will soon be no web ui left at all".


This is the painful truth, isn't it?

IMO The next best cross-platform GUI framework is Qt (FreeCAD, QGIS, etc.)

Qt6 can look quite nice with QSS/QStyle themes, these days, and its native affordances are fairly good.

But it's not close. VSCode is nice-looking, to me.


Godot looks ok and is surprisingly easy to work with.


Could you suggest an example such application we can try / look at screenshots of?


The Godot editor is written in Godot. The look and feel of the editor is set up to be familiar to people working with 3D, but you're using a 2D* desktop application and all of the parts work responsively.

I've been playing around with different GUI approaches for the desktop, and what impresses me the most about Godot is how lightweight and self-contained it can be while still being cross-platform on both ends.


This question is so easy to answer: Qt! Signed by: Person who frequently shills for Qt on HN. :)


Could you suggest an example such application we can try / look at screenshots of?

(I've been aware of Qt for like two decades; back in the early 2000s my employer was evaluating such options as Tk, wxWindows, and ultimately settled on Java, I think with AWT. Qt seems to have a determined survival niche in "embedded systems that aren't android"?)


I would plug my note-taking app written in Qt C++ and QML: https://get-notes.com.


What’s long-term exactly? Between VSCode and previous winners Brackets and Atom, Electron has been in the top 5 in this space for well over a decade already.

I think the ship sailed


Care to explain why? I like Electron but I've switched to Tauri because it feels way faster and more secure.


It's like those recipes for yogurt.

In order to build a web app, you will first need a web app


I wouldn't bet on Google product for anything long-term.


Even if those devs are vibe-oriented?


it's held the market for over 10 years tho... i wish zed weren't under the gpl


Why not GPL? So we could be seeing closed source proprietary forks by now? How do you think the Zed team would feel about that?


15 years ago, every company had its own "BlahBlah Studio" IDE built on top of Eclipse. Now it's VSCode.

Meanwhile, JetBrains IDEs are still the best, but remain unpopular outside of Android Studio.


    > remain unpopular outside of Android Studio
What a strange claim. For enterprise Java, is there a serious alternative in 2025? And, Rider is slowly eating the lunch of (classic) Visual Studio for C# development. I used it again recently to write an Excel XLL plug-in. I could not believe how far Rider has come in 10 years.


Oh, sure. I've been using IntelliJ since 2003. But compare the number of C# developers and the number of JS developers.

In my current company, only I am using IntelliJ IDEs. Other people have never even tried them, except for Android Studio.


And IntelliJ

PyCharm’s lack of popularity surprises me. Maybe it’s not good enough at venvs


IME pycharm’s weakness is not integrating with modern tooling like ruff/pyright - their built-in type checker is terrible at catching stuff, and somehow there isn't an easy way to run MyPy, black or isort in it.

If there’s a workflow I’m missing please let me know because I want to love it!


Oh, it's good at venvs. Lots of flexibility too on whether to use pip, conda, or uv.


I just checked and I don’t even have the JVM installed on my machine. It seems like Java is dead for consumer applications. Not saying that’s why they aren’t popular but I’m sure it doesn’t help.


IntelliJ IDEs bundle the JVM, so you don't need to install it separately.


Every Java app these days bundles a JVM. It was made easy with jlink like 10 years ago. Only parts of the JVM are included so it’s lightweight.
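
Roughly like this (a from-memory sketch; the module list depends on what the app actually uses):

    # Build a trimmed runtime image containing only the needed modules
    jlink --add-modules java.base,java.desktop \
          --strip-debug --no-header-files --no-man-pages \
          --output build/runtime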


In the grand scheme of things, Microsoft had always spent more money on developer tooling than most other companies, even in the 90s.

Hence even the infamous Ballmer quote.


In user numbers, maybe. JetBrains is far ahead in actual developer experience though


I wouldn't underestimate Eclipse user statistics. That may sound insane in 2025, but I've seen a lot of heavily customized eclipse editors still kicking around for vendor specific systems, setting aside that Java is still a pretty large language in its own right.


At best, that's subjective, but it's a fact that JetBrains is comically far behind when it comes to AI tooling.

They have a chance to compete fresh with Fleet, but they are not making progress on even the basic IDE there, let alone getting anywhere near Cursor when it comes to LLM integration.


JetBrains' advantage is that they have full integration and better understanding of your code. WebStorm works better with TypeScript than even Microsoft's own creation. This all translates into AI performance

Have you actually given them a real test yet - either Junie or even the baseline chat?


Junie is good. Needs a few UI tweaks, but the code it generates is state of the art.


Developers, developers, developers!

https://www.youtube.com/watch?v=Vhh_GeBPOhs


I see VSCode management has been firmly redirected to prioritize GitHub's failing and lagging "AI Coding" competition entry. When that predictably falters, expect them to lose interest in the editor altogether.


VSCode IS chrome though.


Kind of like how Android is linux.


More like "OBS is Qt". Which it is not, OBS uses Qt. And Chrome is just a runtime and GUI framework for VS Code. Let's not confuse forks of software with software built on something.


I believe our definitions of "winning the IDE wars" are very, very different. For one thing, using "user count" as a metric for this is like using "number of lines of code added" in a performance review. And even if that was part of the metric, people who use it but don't absolutely fall in love with it, so much so that they become the ones advocating for its use, are only worth a tiny fraction of a "user".

neovim won the IDE wars before it even started. Zed has potential. I don't know what IntelliJ is.


> I don't know what IntelliJ is.

It started as a modernized Eclipse competitor (the Java IDE) but they've built a bunch of other IDEs based on it. Idk if it still runs on Java or not, but it had potential last I used it about a decade ago. But running GUI apps on the JVM isn't the best for 1000 reasons, so I hope they've moved off it.


Android Studio is built on the IntelliJ stack. Jetbrains just launched a dedicated Claude button (the button just opens up Claude in the IDE, but there are some pretty neat IDE integrations that it supports, like being able to see the text selection, and using the IDE's diff tool). I wonder if that's why Google decided to go with VS Code?


Uh, isn't that the regular Claude Code extension that's been available for ages at this point? Not JetBrains but Anthropic's own development?

As a person paying for the jetbrains ultimate package (all ides), I think going with vscode is a very solid decision.

The jetbrains ides still have various features which I always miss whenever I need to use another IDE (like way better "import" suggestions as an easy to understand example)... But unless you're writing in specific languages like Java, vscode is way quicker and works just fine - and that applies even more to agentic development, where you're using these features less and less...


Quick comment, our AI Chat now has Claude integration. Don't need the Anthropic plugin.


Jetbrains IDEs are all based on the JVM - and they work better than VSCode or the full Visual Studio for me. It's the full blown VS (which has many parts written in C++) that is the most sluggish of them all.


I don't know what it's based on, but it works extremely well. I use Rider & WebStorm daily and I find Rider is a lot faster than Visual Studio when it comes to the Unreal Engine codebase and WebStorm seems to be a lot more reliable than VSCode nowadays (I don't know if it's at fault, but ever since copilot was integrated I find that code completion can stop working for minutes at a time. Very annoying)


You don't actually use it but somehow you know that "running GUI apps on the JVM isn't the best for 1000 [unspecified] reasons".

- This isn't a scientific approach.


You clearly don't know how Swing or Eclipse SWT works under the hood.

Java's big strength is that it's a memory safe, compiled, and sandboxed low level platform with over a quarter century of development behind it. But it historically hasn't handled computer graphics well and can feel very slow and bloated when something needs that - like a GUI. That weakness is probably a big reason why Microsoft rewrote Minecraft after they bought it.


I don't know why this post is downvoted. My cynical reply to yours: "No, this isn't a scientific approach. It is the tin-foil hat HN approach!"


Since you last used IntelliJ "about a decade ago", what do you use instead?

    > But running GUI apps on the JVM isn't the best for 1000 reasons, so I hope they've moved off it.
What would you recommend instead of Swing on JVM? Since you have "1000 reasons", it should be easy to list a few here. As a friendly reminder, they would need to port (probably) millions of lines of Java source code to whatever framework/language you select. The only practical alternative I can think of would be C++ & Qt, but the development speed would be so much slower than Java & Swing.

Also, with the advent of wildly modern JVMs (11+), the JIT process is so insanely good now. Why cannot a GUI be written in Swing and run on the JVM?


Notice that IntelliJ uses its own UI framework, really, which I don’t think has much Swing left in it after all these years. And Kotlin has been the main language for a decade now.


> I don’t know what IntelliJ is.

“I never read The Economist” – Management Trainee, aged 42.


The IntelliJ family are probably the best IDEs on the market currently.

