I majored in semiconductors out of genuine interest. It was fascinating to me how miniature components could be turned into digital and analog devices. It turned out to be one of the most boring things I've ever touched. Cadence was an endless loop of crashing and losing progress. Anything you designed was called worthless because there's already a library of capacitors, amplifiers, and DAC/ADC components for you to drag and drop in. It was so incredibly dry.
During our final semester in school, our professor walked in and said, "All of you but X need to know that you're going to be unemployed." He said that to a class of 4 people, the only ones left after most of the original class of 20+ had quit.
"The industry only hires the exceptional, you are all inept. Except for X, my favorite student."
X ended up being the only person to go on to work at a semicon company. The rest of us ended up in software, earning 5x the pay X was making. The barrier to entry for software engineering is so, so much lower than for semicon, which still makes no sense to me.
At the end of the day, the stakes are way too high to allow everyone a chance to play cowboy with the tools. You want your best possible champions to bring a design to bear. Then, the other 99% of the army is responsible for implementing the design in as repeatable & standardized a fashion as physically possible so that there is even the remotest chance of yield+profit. Software is the antithesis of this. The cost of playing around with your tools is not even worth accounting for. Due to certain cultural effects, you probably don't want to do software in the semiconductor industry unless you really, really like the problem domain.
Even if you don't get to play around with the billion dollar tools, you still get to help troubleshoot some of the most intensely complicated problems on earth. Solving these riddles is very rewarding and the experience will stick with you forever. Hard to package those 2 sentences up into a PR campaign for the young generation, but I'm sure we can spin it if the DoD can still find ways to recruit.
>> 99% of the army is responsible for implementing the design in as repeatable & standardized a fashion as physically possible so that there is even the remotest chance of yield+profit.
That sounds like a mature and competitive field. I am suspicious of companies and fields where people aren't working like this. Look at the car industry, or energy, or farming, or shipping, or even aerospace. Only one of every thousand engineers at Boeing will ever decide the shape of an aircraft wing. The other 99% are there to implement and optimize its construction. Any company not spending 99% of its energies on optimization does not operate in a competitive environment and is therefore very likely on borrowed time. Eventually a competitor will appear or an IP monopoly expire and the easy times will be over.
> Any company not spending 99% of its energies on optimization does not operate in a competitive environment
That's how a company stagnates and eventually gets replaced. Yes, operational efficiency and optimization are important, but you only do that if you have a big moat and not much growth ahead of you. You need to spend at least 30% on growth and innovation so you can stay relevant. Higher if you are the one trying to replace the 99% ones.
A "mature and competitive field" is going to already have all low-hanging fruit taken and implemented. The moat is operational efficiency. The barrier to entry is the high costs of investment and general lack of access to know-how.
Not every industry is like software/tech. And the tech industry is already moving towards a "mature and competitive field" model. Otherwise why are Google and Meta cutting all their unnecessary spending and touting efficiency over growth? (besides AI tech, I suppose) Why all the layoffs?
Hah, having worked in automotive, things are dysfunctional as shit. Oh, they ship things, sure, but mostly as a consequence of sheer will, plowing man-hours into things until they kinda work.
Curious what your experience was operating in those fields.
And here I was wondering how manufacturers can take a thing that works and make it not work in a new engine version.
Shit like "we did chain-driven timing and it worked, but in the new engine it prematurely wears", or "someone decided it is fine for belt-driven timing to take a dip in oil; oh surprise, we now have parts of the belt landing in various parts of the engine".
Or my favourite, "a metal, belt-driven water pump is almost never replaced through the lifetime of the engine and it's cheap, so let's make the impeller plastic and also drive it from a separate electric motor".
Forget chains. Gear-driven cams are best. My motorcycle had them, but Honda switched to chains in the new models because cam gears made a small high-pitched noise.
In the grand scheme of things... in a universe that's billions of years old. Humanity is 200k+ years old. Civilization is 6k years old.
Computers? 100 years old.
What part of any "modern" industry isn't still very, very young? Even industries like construction that have been around for thousands of years are still relatively young.
By "very young", I'm talking about industry that is still figuring out the right direction. Monoplanes made biplanes obsolete, no matter how perfectly the biplane's wing was designed. That kind of dramatic and rapid change tends to happen during the infancy of an industry.
See the early shifts from vacuum tubes to semiconductors, from expertly crafted BJTs to crude CMOS, or the rapid march of good-enough architectures on the latest process node clobbering beautiful architectures on older nodes. As the industry has matured and its course has stabilized, perfecting the design has grown in importance.
Old by how close the industry's products are to physical limits.
For example, the newest gas turbine designs are already exceeding 50% of the maximum theoretically possible efficiency as allowed by thermodynamics. So it doesn't matter how many thousands or millions more years gas turbines are optimized, by us, aliens, future descendants, etc...
No future gas turbine industry in this universe can possibly double the thermodynamic efficiency of the finished product.
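To put rough numbers on that (just an illustration with round figures, not exact plant data): with turbine inlet temperatures around 1600 °C (~1900 K) and heat rejected near ambient (~300 K), the Carnot bound is

    eta_max = 1 - T_cold/T_hot ≈ 1 - 300/1900 ≈ 0.84

A simple-cycle turbine in the low 40s of percent thermal efficiency is already around half of that bound, and modern combined-cycle plants in the low 60s are closer to three quarters of it. The remaining headroom only shrinks from here.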
>Old by how close the industry's products are to physical limits.
I was just listening to an AI podcast and they were discussing going from the 1 second, 1 minute, 1 day [unit of response - I can't recall the name of the measurement] -- but I assume that's the "Moore's law" of AI right now?
And as we get closer to the physical limits of chip production scale/die/etc., I assume they will be scaled horizontally while the GPT-X capabilities will scale volumetrically.
Not OP, but even GPT-X and ML reach limits due to lack of compute and/or datasets.
For example, CNNs were largely known and discovered by the 1990s-2000s, but the compute simply didn't exist until GPU manufacturing became commoditized.
OpenAI's quantum leap is thanks to the massive corpora they were able to leverage, which until the past 5 years simply didn't exist.
This is why we've seen massive jumps in Mandarin Machine Translation and Computer Vision from the PRC due to their massive corpora/dataset of English+Mandarin language news from CCTV/CGTN and local surveillance camera data respectively.
This compute limit is a big reason why the US Federal Govt has been working on the Exascale Computing Project for example.
Even for training on a massive dataset, you are still limited by compute and processing time (good ole computational complexity), which is why HPC projects like the Exascale Computing Project were created in 2015, along with additional funding+research in efficient and alternative data structures.
I highly recommend going down the rabbit hole of High Performance/Accelerated Machine Learning.
The eminence of AI is more 'volumetric' than 'out'
Out scales "up and out like a hill or a lift"
- volumetric is spherical - it presses into the future AND the past (it already has been harvesting history, but created a fire-hosed spigot-interface for the future as well) -- and draws it into its center for eval...
They are different.
Always on the positive expressions of XYZ axis - but had ignored the negative of each, where AGI will go in all dimensions...
I don't think this aligns with the Chinchilla scaling law. There is indeed a point at which you can oversaturate a model with data, as well as such a thing as not giving it enough. Compute is the constraining factor, and it scales more or less linearly in both directions.
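For reference, the rough relations from the Chinchilla paper (Hoffmann et al. 2022), with N = parameters and D = training tokens; treat the constants as ballpark figures rather than exact values:

    C ≈ 6 * N * D                      (training compute in FLOPs)
    N_opt ∝ C^0.5,  D_opt ∝ C^0.5      (compute-optimal allocation)
    D_opt / N_opt ≈ 20                 (roughly 20 tokens per parameter)

So doubling the compute budget buys you roughly sqrt(2) more parameters and sqrt(2) more data at the optimum, not a doubling of either.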
>Solving these riddles is very rewarding and the experience will stick with you forever.
That was also my experience working in the semi field (in Europe). People working there weren't in it for obscene compensation, they were in it for the hands-on puzzle solving of uniquely complex HW problems with cool and rare expensive machinery. Some were quite significantly underpaid and while they knew it they never felt the need to complain too badly about it. I guess it's kind of like the arts, which is a shame, because here some publicly traded corporation is abusing you for your passion.
Also, the fact that most of Europe doesn't have many FAANGs and big SW companies paying orders of magnitude more to unbalance the job market surely helps to not discourage people from this industry.
I'm in digital physical design and part of that 99% of the army. The schedule is king. If we make a mistake it is another $30 million for mask costs and 4 months in the fab to get a new chip to test. Contrast with software where you can change 1 line of code and recompile and test in seconds. We do so much to minimize risk by reusing existing blocks, licensing third party IP that has already been validated in silicon, and armies of verification engineers.
Compare that to the lead time of electronics, though. I just started in PCB design, which is leagues more forgiving time-wise than silicon design and fab, and even then, if a non-generic component I'm designing for goes out of stock, the lead time for a new batch is anywhere from a month to a few years, on average around 4-6 months.
Thinking about the design process, testing and revisions, all the way to fab and then market, we're probably talking years just to see a single design reach your test bench (I'm speculating). Oh and millions, because the major chip fabs only do things in large batches.
Compare this to software. It is, indeed, seconds in comparison.
I'd love for semiconductor fabrication to get fragmented out like it sounds like it will.
Lead time isn't because of some magical bottleneck. It's because of demand and batch size. If university students can somehow decimate the batch size needed per design, and enough groups do this (or find new ways to reduce the process size so that more firms can do this), and enough of them do it at once to sustain demand, then the lead time goes down, and so do costs.
The months-long lead time is not purely a scheduling/capacity issue though. Producing the masks for these modern processes is very complex and slow. Adding more fabs won't help because it's a latency issue, not a throughput issue.
AFAIK (but I never studied the latest processes), making a mask takes weeks. That means the minimum lead time on making masks is weeks long, because there is no interdependence between them. But I'm sure this is one of the steps that add up to years in practice, because of production limitations.
Actually making the chips has a higher floor, because every step is interdependent. With a hundred steps, each one taking half a day, we are talking about 2 months here. There are probably more than a hundred, and a few take more than half a day (but many take less), so yeah, I'd easily expect a 4-month minimum. Which doesn't compare to years at all.
Yeah, but that speed also means we can go from idea to broad release quickly, which means getting to market and delivering value sooner, but it can also mean that a screw-up scales quickly too.
I'm a hobbyist in this area, and think the field is somewhat early in its development. MCUs and cyber-physical systems were like this until the Arduino happened. Arduino may not have been the first to do exactly that, but it was just good enough to cause a (re?)explosion in electronics as a hobby. During its heyday, I would say Arduino was a core element of the Maker movement.
So what needs to happen to make this a reality for semicon? First off, we need cheap, cheap fabrication. I actually looked at public funding in Canada and how it was going to the big-name universities who had their own in-house fab labs (at older process nodes). The costs for someone not on the inside were nuts. The actual cost should be in the 100s of dollars to fabricate a design (considering the marginal costs).
There are people that do this at home but it doesn't work either, due to the chemicals being pretty dangerous and the need for a bunch of equipment. I bet that with the amount of money the EU spent on its first metaverse town hall (or whatever it was called .. the thing very few people attended), or a tiny fraction of what Canada wastes on silly things promoting youth culture or whatever, they could fund a lab that is actually open to the public, with the express mission of promoting hobbyists and education. This will NEVER happen because (a) it needs a professor who is on the inside with a kid-like passion for this tech and a commitment to bringing it to the masses (I see some profs like this at schools like MIT but it is so rare at large, competitive schools like the big ones in Canada), and (b) it does not have an instant payoff for the govt. They don't want dabblers and vague educational outcomes. They want workers with degrees.
I am convinced that before I am dead, advances in robotics and fabrication will simplify the process (or use home equipment such as future laser printers for printing stencils). I'd love to spend my retirement fabricating my own CPUs :D
Edit:
Let me add: I don't mean the cutting edge process node. I mean the kind of process node that was used to make the very first chips (but less toxic, repeatable, cheaper equipment). If it is possible for synthetic biology, it must be doable for semicon :D
I'm a 25-year professional in the industry so I've never really thought about the hobbyist side. I use commercial software from Cadence and Synopsys that has a list price of over $1 million for a single physical design tool license, and we use about 200 of those licenses simultaneously to tape out a chip. Then we spend about $30 million in mask costs. If we make a mistake it is another $20-30 million for new masks and another 4 months in the fab for a new chip.
Google / SkyWater / eFabless have this program for 90/130nm chips. That is really old technology from the 2002-2006 time range but it is still useful for a lot of types of chips.
I am curious what kind of hobbyist chips you want to make that can't be done in an FPGA. You can't do custom analog in an FPGA, but these days you can find FPGAs with multiple PCIe / USB / HDMI serdes links, DACs, ADCs, etc.
The problem with FPGAs isn't the hardware... it's quite complete and capable. It's the software stack you're forced to access most of them through that makes them a non-starter for many/most hobbyists. (Cost is a major issue, but hardly the only one.) My guess is that as low-cost producers who have embraced the existing open source FPGA software start producing higher-end parts (more recent process nodes, larger LUT counts, etc.), that's when you'll really start to see them take off in the hobbyist world.
I think the OP isn't even talking about 2006-level capability, but rather something akin to 3D printing for semis that approaches even early (i.e. somewhere in the 1960s-1980s range) fab capability for hobbyists. Obviously it doesn't exist currently, but that's the dream for some.
Cost of the FPGA design tools isn't really the biggest barrier. The big barrier is that the FPGA design tools from the FPGA companies are full of bugs that you spend an inordinate amount of time trying to work around. Some open source tools like Yosys have emerged. So far they're limited to the iCE40 FPGAs from Lattice and a few older Xilinx parts. I've used them with the iCE40 and they do work well. Hopefully FPGA companies will come to realize the value of opening up so that 3rd party tools can be developed for them.
> The big barrier is that the FPGA design tools from the FPGA companies are full of bugs that you spend an inordinate amount of time trying to work around.
you cannot fathom how bad e.g. Xilinx's FPGA tool suite is. like beyond the fact it's some kind of frankenstein monster because they bought up a bunch of smaller tools and glued them together with fucking tcl, any individual piece is worse than your worst open source compiler/ide/whatever tool you use to build software. and yet people say that it's 10x better than the competitors' offerings. hell on earth is being an FPGA dev without a direct line to a Xilinx employee for help debugging (which is how things really work professionally).
just to lend some credence to myself here - vitis, their supposed hls offering that turns C++ into RTL by magic and genius and oracles, today embeds LLVM 7.0 (released 19 Sep 2018). gee i wonder how many perf etc. improvements have landed in LLVM since 7, given that today LLVM is approaching 17.
i could tell you more but it would just spike my blood pressure after 6 months of not having to deal with that mess.
Can confirm. I used Xilinx Vivado to compile for their FPGAs in Amazon F1 servers a few years ago. There came a point when each time I edited a single small Verilog file in my small project, the GUI took 40 minutes of 100% CPU just to update the file list in the GUI. That's before you could do anything useful like tell it to compile or simulate. The GUI wasn't useful until it finished this silly update.
I knew FPGA compilation could be slow, but this wasn't compilation, this was a ridiculously basic part of the GUI. I knew it had to read my file and analyse the few module dependencies, but seriously, 40 minutes? At that time I just wanted to run simple simulations of small circuits, which shouldn't be slow.
In (open source) Verilator, the same simulations ran from the same source files in just a few milliseconds with nothing precomputed or cached.
I looked into what Xilinx Vivado was really doing and found a log indicating that it was re-running a Verilog read on the changed file several thousand times, each time taking a second or so.
That was such a ridiculous bug for software Xilinx said they had spent over $500M developing. If there were good parts of the software I didn't get to see them due to this sort of nonsense. I think it was fixed in a later version, but due to (yay) enforced licensing constraints I couldn't use a fixed version targeting these FPGAs. That's when I abandoned Vivado and Xilinx (and Amazon) for the project as I didn't have that much spare time to waste on every edit.
I am under the impression the current crop of open source FPGA and especially open source ASIC tools are much better designed.
> I looked into what Xilinx Vivado was really doing and found a log indicating that it was re-running a Verilog read on the changed file several thousand times, each time taking a second or so.
Just reading that made me nauseous, to think this forced so many trillions of CPU cycles to be spent on nothing, and nobody at Xilinx corrected it until some time later.
I started working with a Stratix 10 GX dev kit last year, and the Intel Quartus Prime Pro software. It's been a nightmare.
1. Unless you have a direct line to an experienced support engineer inside Intel, technical support amounts to pleading with new-hire support engineers who can't do more than try to look something up and regurgitate it back to you.
2. Sample designs don't work with the tools you have. A design might be for version 18 of Quartus and you have version 23. You can try to auto-upgrade the IP, and maybe 3/4 of the time it works. The other 1/4 of the time it's some obscure error you can't track down, so you're back on the "Community Site" begging for help.
3. Doing something like programming your design into flash on the S10 board involves lots and lots of research, including watching Intel produced YouTube videos.
I could go on, but the whole process is like running in wet cement.
> I could go on, but the whole process is like running in wet cement.
just quit. seriously it's not worth it. no job you have (not even low-latency work in HFT) will compensate you enough to make investing time, blood, sweat, hair, and sleep into this work worth it. there is a better, more rewarding software job with an actually transferable set of skills waiting for you somewhere out there.
> i could tell you more but it would just spike my blood pressure after 6 months of not having to deal with that mess.
There was a time when I could've transitioned to FPGA development and I was pretty keen to do so. But after using (fighting with) Xilinx's tools and watching other people who were deeper into the FPGA side wage similar wars, I decided it just wasn't worth it. That it would be better just to stay on the software side of things where the tools are mostly open source and so much better. In FPGA development you just don't have any control over the backend tools - fortunately there are a lot of open source front-end simulation tools like Verilator, and some limited open source backend tools like Yosys. Hopefully open source will continue to make inroads, but it's slow going.
I get paid to do this professionally so I don't follow the hobbyist side but I've heard of people using this for FPGAs. I have no idea how it compares to commercial tools.
OSS CAD Suite is a binary software distribution for a number of open source software used in digital logic design. You will find tools for RTL synthesis, formal hardware verification, place & route, FPGA programming, and testing with support for HDLs like Verilog, Migen and Amaranth.
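For what it's worth, the entire open-source iCE40 flow fits in a handful of commands (a minimal sketch; the top module name, board/package and pin-constraint file here are placeholders you'd swap for your own):

    # synthesize the Verilog to a JSON netlist with Yosys
    yosys -p 'synth_ice40 -top top -json top.json' top.v
    # place and route for an iCE40 HX8K in a CT256 package with nextpnr
    nextpnr-ice40 --hx8k --package ct256 --json top.json --pcf top.pcf --asc top.asc
    # pack the bitstream and program the board with the IceStorm tools
    icepack top.asc top.bin
    iceprog top.bin

Compared to the multi-hour GUI sessions described above, the lack of ceremony is the whole appeal.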
> I use software from commercial Cadence and Synopsys that has a list price of over $1 million for a single physical design tool license and we use about 200 of those licenses simultaneously to tape out a chip. Then we spend about $30 million in mask costs. If we make a mistake it is another $20-30 million for new masks and another 4 months in the fab for a new chip.
I've thought for a long time now that this is an area ripe for disruption. But it's very difficult to disrupt - it hasn't been yet. EDA software is probably the easier part to disrupt. Some open source EDA tools are out there, but not so much on the physical design side.
I've read there are actually some excellent open source ASIC design flows now, thanks to some open PDKs (notably SkyWater) as well as recent funded research. I haven't tried the toolchains myself but they are readily available, and you have courses like Zero to ASIC using them.
They don't target the advanced nodes where masks cost as much as the GP's prices (though $20-30M sounds higher than I expected even at the leading edge), but they are working their way forward and the space in general is being disrupted at last.
I've taped out 4 chips in 5nm, and our managers and VPs have said it's about $30 million for a full set of masks (base layers plus metal layers).
The $20 million is for metal layers only for a respin.
We are moving to 3nm right now which will be even more expensive.
The open source tools can't handle the latest process nodes. Maybe in the future but this is a highly specialized area with tons of NDAs from the fabs for PDKs and DRC decks.
> MCUs and cyber-physical systems was like this until the Arduino happened
I'd say it technically happened when cheap flash-based microcontrollers became available.
Before, you had to have *at the very least* an EPROM programmer and eraser (or a CRT TV I guess) to even put your code onto a microcontroller. Flash-based microcontrollers meant all you needed was a parallel port and some wires to start programming, dropping the barrier to entry by orders of magnitude if you already had a PC to program on.
Then we had the first wave with the BASIC Stamp and similar, making it even easier.
Then it was Arduino that exploded it, and there were other factors in that too.
>So what needs to happen to make this a reality for semi-con? First off, we need cheap, cheap fabrication. I actually looked at public funding in Canada and how that was going to the big name Universities who had their own in-house fab labs (at older process nodes). The costs of someone not in the inside was nuts. The actual cost should be in the 100s of dollars to fabricate a design (considering the marginal costs).
Like that. Before, making a PCB meant either doing it at home, which took a bunch of time and not everyone wanted to play with chemicals, or paying a lot to prototype a PCB.
Now we have extremely cheap low-volume PCBs and even well-priced low-volume manufacturing options. Same thing with 3D printing: it went from massively expensive to affordable.
The problem really is that it's still a bit profitable for the companies doing it, while doing the same for chip-making would be oh so much harder unless someone put some SERIOUS R&D into making an "industrial chip printer" where each wafer could have a different set of chips (packaging I guess could be handled by requiring each submitted chip to have connectors in the same place). So no mask, but some kind of DLP or similar projector to do the lithography (dunno if that's even possible for anything in the hundreds-of-nm range, just guessing).
I don't believe Google is sponsoring any more free shuttles. Your best bet nowadays is ponying up $10k to get a tiny chip on sky130 (so small you really can't hold even mildly complex in-order RISC-V cores). Sky130 is also generally so old that you can probably get better performance (let alone area) on a modern FPGA for a fraction of the price. Efabless's sky130 MPW is nice for semiconductor research insofar as it makes it more realistic to actually fab designs but it's not particularly useful for hobbyists beyond just the novelty of holding a chip you designed in your hands.
I think it's simpler than that. Semiconductor manufacturing is such an advanced field, and there are so few openings for engineers, that they can be very selective in who they hire. We don't need to conclude that semiconductor engineering is particularly difficult; it's sufficient to conclude that they are simply picking people who will work lots of unpaid overtime for the thrill of working in such a cutting-edge field.
You missed the part where they do it for a fraction of the salary.
I suggest there's a kind of 'cultural imperative' in Korea that is so different from the US that it doesn't even begin to factor into our equations.
My Korean grad school colleagues went to intern at Samsung for serf pay; the only way that could work is if there were some kind of culturally implied merit.
The US also needs a 'different kind of ethos' to compete in semiconductor, Cali Surfer Vibes won't cut it, neither will NY banker vibes, neither will SF Social Media AI Allbirds vibes, nor Cambridge High Intellectual Healthcare Conference vibes, nor Texas Energy Industry or DC Government Consultant etc..
And that will be impossible without a fair number of migrants who will be mostly Asian - so where is this going to happen?
Maybe they need a 'New South Valley' about 50Km south of San Jose, a cross between Orange County, Silicon Valley, Cambridge, and Seoul aka Cali, but slightly more formally regimented, a bit like the actual old-school Silicon Valley which was at the time, away from the buzz, a bit of a sleepy burb where people focused on hard stuff and economics mattered.
> The US also needs a 'different kind of ethos' to compete in semiconductor, Cali Surfer Vibes won't cut it, neither will NY banker vibes, neither will SF Social Media AI Allbirds vibes, nor Cambridge High Intellectual Healthcare Conference vibes, nor Texas Energy Industry or DC Government Consultant etc..
If someone like Netflix/HBO did a proper documentary with Samsung about their LSI fab in Texas, HR would have more applicants than they would know what to do with. This 'ethos' can be created. I saw what it could be like when I worked there. I really believe seeing the whole thing in action is the primary selling point. The draconian security makes sense once you walk that catwalk and witness with your own 2 eyeballs what is at stake. Capturing this sensation for the masses is the challenge.
They don't pay enough is the point. Really smart and competent people don't work for 1/2 wages unless they are caught up in some kind of system like in Korea where it's a culturally backed norm and frankly they can't really escape.
Americans will not work for fractional pay unless it's a kind of a 'wartime' / 'emergency' situation.
> the stakes are way too high to allow everyone a chance to play cowboy with the tools
This is really what pushed me into just doing digital design as a hobby and making my money elsewhere. It's understandable that chip companies are so risk averse (a bad tapeout or, worse, a major escape can cost the company billions), but it is really a miserable experience to have so much passion and ambition and never be able to do anything other than extremely conservative incremental changes over a decades-long career.
I worked on a campus with a lot of semiconductor investment and education, for a somewhat affiliated entity.
The thing I noticed was that the crunch work was brutal, parking lots were full till 11pm half the year. The parking lot told the story - the execs had Teslas and BMWs in their gated lot. The troops were in 7-10 year old Hyundais and couldn’t afford to buy lunch in the cafe.
If you want to be in the semiconductor industry, the real winners are the tradesmen. They were all driving $70k pickups. Pipe fitters, electricians, operating engineers, etc make bank. If you know how to build the supporting infrastructure around some ASML tool, your income is limited by your desire to work. I can think of a dozen serious contracting companies in those lines of work that spun out of building that place.
Unfortunately, this is how Soviet engineering was operating until it all tanked. Engineers with university degrees getting paid peanuts as an official policy.
It's sad seeing this practice repeated. And people falling for that.
"Official policy" makes it sound like the Soviet Union was the only employer and they were deciding to pay peanuts unilaterally.
Is this the case with modern hardware engineers? No one is forcing them to accept low pay, or learn how to write software and earn more.
Sounds like they are getting paid market price, and it has been known for decades that pay for hardware engineers is low, indicating that people should stay away unless they want relatively low pay.
There is no such thing as a 'free market', really. Once people have been pulled into a specific academic program without 'true knowledge of market forces', the switching costs are high. They take an entry-level job because, well, it's a job. 4 years later they find out there are only 3 employers who want their ultra-niche skill, and so the employers use their leverage to pay shit wages.
This shortsighted action causes the opportunities in high wage areas to fail as devs do 'other things', while the jobs move to places where there are even more rigid systems and the 'work hierarchy' is integrated right into culture. Like Korea.
It's also why even very talented software developers outside the US are paid so little - it's hard to justify high pay in most companies, as it's not clear how to discriminate between the 'true talent' and 'regular devs' - so talent gets anchored very low. Devs languish or move or find solace in working on something interesting and intellectual, which is common.
It's why there is no real 'hardware' industry in the USA: all of the jobs are niche even if they are very hard, so industry can hold people to the wall on salary; it's more effective for them to be carpenters, while the jobs go to Asia.
All of that plus the fact that software can move much quicker and can be more profitable, the 'windfall' economic part of software makes the VC math work out better to justify more investment.
If we were to reorganize that around net value creation instead of just local barter between trades and enterprises, it'd be a different situation. Very hard to do. Government is stepping in with a big starting point, will industry follow suit? Maybe, I doubt it.
There’s no compulsion, but people tend to nerd out in the specialty and lose their way. It’s hard to be the respected engineer in <pick a topic> and start over as a newb. Many of the folks in these fields get caught in a visa trap as well that makes the friction that much stronger.
One of my classmates in high school was a well regarded engineer that specialized in some sort of critical manufacturing process. Published papers, had patents, etc. Unfortunately, a new tech replaced that process, and he found himself laid off and kinda screwed.
He had a good network, crammed for a PMP and got into the project management racket and went back to school. Now he does cyber analysis stuff.
> Official policy sounds like the Soviet Union was the only employer and they were deciding to pay peanuts unilaterally.
The official doctrine was that it was a Laborers' Republic, and so physical labor very often paid more than mental labor. It was all about making the common man feel as if the state was really organized for the benefit of people like him, and not some narrow elites. Of course, it was BS.
Somewhat related but anecdotal—there was so much analog gatekeeping during my EE degree it really turned me off of analog electronics design. Some professors acted like nobody was worthy and delighted in ridiculous trick problems on tests to weed everyone out. Many also had the favorite student syndrome that you mentioned where everyone but X was an imposter. The industry also seemed to not want anyone in analog and semi manufacturing anyway or had very high entry requirements, although kept complaining about a lack of candidates publicly. I bet it’s somewhat different today with the resurgence of modular synths and hobby electronics, but I can only imagine that it might make some professors double down on the hazing mentality. All that is to say, in some areas STEM educators have serious cultural issues to work out before just blasting away with funding expecting results.
> The industry also seemed to not want anyone in analog and semi manufacturing anyway or had very high entry requirements, although kept complaining about a lack of candidates publicly.
Didn't take EE in the US, but your statement here seems to be universal. Especially true about Analog professors being gatekeeping jerks.
The funny part is EE professors (analog or digital) don't even teach you the areas where analog actually matters for practical digital electronics. They'll have you doing tons of abstract differential equations and studying transistor characteristics, and then kick you out into the world unprepared for the things you should know. Which is why Johnson & Graham should be the first book on the shelf of any new digital EE grad.
Completely the opposite here (not in the US either), my analog professors were some of the most passionate and caring professors I had.
Didn't end up going into analog design because, although I think it's interesting, I don't think it's actually fun to do the design. I feel like digital design has much more variety.
Well, to be fair, if there really is no place at all for all these people, those professors do in fact help their students.
They got that mentality from some kind of trauma themselves. I'm not sure it's even the wrong adaptation, since it's not in their power to fix the systematic issue.
One of the systemic issues that needs fixing is the gatekeeping itself. Weeding out serves no purpose because there's work besides just analog IC design roles. Industry complains because undergraduate programs have become so detached from reality, focusing on stupid stuff, that everyone graduates way more underprepared than they should be.
I had a similar experience in Germany. Studied Engineering Physics with a major in semiconductor technology. At university we learned everything from high level programming, VHDL, electronics down to semiconductor chemistry. I didn't find it boring at all.
Judging from our industry partnerships, the semiconductor industry seemed to be doing well at the time. At least that was my impression. When I graduated in 2001 the signs of impending death were already there however, so I worked in aerospace for a couple of years, went back to university for a computer science degree and ended up in software anyway. Not the worst thing to happen for me personally, but honestly I still don't really understand how a promising industry could die like that.
For me specifically it was Siemens divesting or spinning off its semiconductor assets. Before that, my study program was almost a free ticket into Siemens. Working for Infineon, EPCOS or (a little later) Qimonda did not seem attractive to me, specifically because it was well known at the time that these divisions were spun off because Siemens wanted to reduce the risk from the, quote, "highly volatile semiconductor business" for the parent company.
Reminds me of talking to brewers and winemakers. The brewers said they could experiment as much as they want because the worst case was you get some different beer.
The winemakers had to follow things exceptionally carefully because you basically couldn’t improve anything, but you had innumerable ways to destroy the wine and end up with vinegar.
In other countries with fewer technical jobs, this effect applies to all kinds of technical fields.
You go to an electronics engineering class. There are a few superbrilliant people. The rest are quite clever though.
As there are only a few jobs available for the technical skills that you learn in class, only the superbrilliant get those jobs.
It turns out that the companies say to these superbrilliant guys: "I am giving you a unique opportunity to work in this top field in this country. So your salary is not about money, but motivation".
End of the story: the rest of the guys, not the superbrilliant, end up working in more generic stuff like management, consultancy, generic simple software... And they make way more money than the superbrilliant guys.
These examples are taken from Spain, although I have seen similar things in other European countries with more technical jobs.
The way that toxic prof treated the class makes no sense, but the barrier to entry in industry makes a lot of sense, even if we don't like it. Semiconductor is just insanely capital-intensive. Now that it's become a major issue of strategic concern for the USG there is finally some capital available for it.
People have finally begun to realize that if all you do is build software, you're completely dependent on your hardware suppliers. It's like being a commuter society that doesn't make their own cars.
They do, for the 1-in-a-thousand engineer. For example, the vast majority of TSMC engineers aren't making that much, but a select few easily earn 10x FAANG compensation.
In Taiwan, working in semicon will earn you more than software engineering. It's insane that the majority of software jobs in Taiwan pay less than monthly rent costs.
There's definitely an increase in interest and demand for semis. Your experience might be very different from that of graduates looking today and in the future as demand increases.
Having said that, the vitriol and toxicity in colleges and towards one another needs to cease. I see that being more naturally achieved in an economy that is growing and seeing things getting cheaper generally. That would look like vast investments in small, big, and mega projects. Imagine how much different the US would be today had it never built a highway system and stayed only where the trains/horses went. I think 80-90%+ of US GDP is transported via truck over a highway at some point.
Are they working at a fab? Working in the manufacturing portion of the fab? Doing process design work on a computer? There are so many different types of jobs within the semiconductor industry and some are going to pay far higher than others.
As I said in my other comment in the thread, I have worked at many fabless semiconductor companies. The difference between software and hardware salaries at companies like Nvidia and Apple is around 10% or so, not 5x.
Last I heard it was Micron. Not too sure what X got into. He was definitely a genius though.
[Edit] Just checked. He left the industry a few years back and is now a software engineering manager. Even the brightest student has given up. Someone on this post said it best: they need to have salaries matching the amount of skill required.
If X works in something low level like device physics then there are a limited number of companies doing cutting edge fab stuff like that. Intel, TSMC, Samsung. The small number of companies limits your salary potential.
It's the same with being a maintenance technician for a piece of ASML 3nm semiconductor manufacturing equipment. You may be highly skilled but if there are only 3 companies in the world that will hire you then your salary potential is limited.
I know about 30 people developing hardware at Apple (writing Verilog, custom analog serdes design) and you can see that their software and hardware salaries are similar. In some cases hardware is higher although I suspect that is because of the limited sample size of salary data at the ICT6 level.
There is more to life than money but people who care about money need to look into the companies and salaries before they commit themselves to 4 to 10 years of college.
> If X works in something low level like device physics then there are a limited number of companies doing cutting edge fab stuff like that. Intel, TSMC, Samsung. The small number of companies limits your salary potential.
You should be very careful about comparing raw salary numbers: a much better metric to compare is "'salary' minus 'cost of living at the job location'".
If you're pulling in 400k USD per year, it actually doesn't matter what the cost of living is. You tie up 40% of your income in a 1-2 million dollar house and still save $50-$100k a year while buying everything you want on a daily basis, then retire after 20 years and live like a king
Other branches of engineering (mechanical engineering, materials science, chemical engineering, certainly civil engineering) tend to pay at a lower level overall as well. And the purer sciences often require advanced degrees to get "real" jobs at all.
When I worked for a computer systems company after school, my impression was that the various types of engineers at the company were on fairly similar pay scales. What I think you've seen the past couple of decades is that out-sized salaries at some mostly software-related West Coast companies have really driven up compensation for specific types of jobs. It seems more likely that will settle down to levels with comparable jobs over time than the other way around as a lot of people who are late to the game try to jump in.
That's an interesting angle I'd never considered. If the government simply creates a lot of potential employers via funding, the labor market becomes competitive, which improves working conditions, compensation, and ultimately attracts talent that would not have otherwise considered the industry. The most interesting part is that outcome is almost a certainty if the government focuses on creating that competitive labor market through their investments; it doesn't require moonshots, breakthroughs, or wild successes to work, yet it can produce them, and can become self-sustaining.
I think maybe in Taiwan and China semiconductor engineers are earning a good salary, especially in China where the government is dumping insane amounts of $$ into it. But there is a lot of hype though.
According to the office for national statistics in China, for Jan/Feb of 2023 http://www.ce.cn/xwzx/gnsz/gdxw/202303/27/t20230327_38464161.., car manufacturing profits are down 41%, non-metal products manufacturing profits are down 39%, chemicals manufacturing profits are down 56%, and electronics manufacturing profits are down 71%!! In fact, all businesses have dropped 22% in profit. Not to mention the 1M+ SMBs that failed in the last 2 years.
Didn't profits boom during Covid? They have to come down as things normalize. Also, profit down 50% still means you're profitable so I wouldn't call that "collapse".
The Chinese GDP is expected to grow by >5% this year.
The global economy is strained. But especially in Europe because of the shock of the Ukraine war.
How does it not make sense? As a software person you are completely relying on the hardware to do its job. You are putting your blind faith into the work of semiconductor engineers in order for you to get your paycheck
More often in startup land, you are relying on the investors for your paycheck.
Where the value is is not at all related to where the difficulties or bottlenecks are. The universe does not have that kind of sense of distributive justice.
It makes no sense because if the semiconductor industry only hires the absolute best student of the cohort, that's a way higher barrier to entry than software, where 90% of the students in my coding class are now working at MANGA. That should guarantee an insanely high pay for the semiconductor engineer. Instead, he's currently making less than what fresh grad MANGA hires are paid.
But you said they hired 1 person out of a class of 20? Obviously the few who get in can't ask for a high salary with that many candidates per position.
>But you said they hired 1 person out of a class of 20? Obviously the few who get in can't ask for a high salary with that many candidates per position.
Obviously they can, assuming that these companies hire "only the best". That means that in the demand/supply calculation, the demand is not for "graduates of semiconductor programs" but for "only top top graduates of semiconductor programs, ignore the rest".
That's in much shorter supply, and means that 95% of the "class of 20" aren't even considered worthy as candidates, and aren't competition for those top semicon jobs. They'll need to find work in another field (like switching to software), or in low-level semicon jobs.
You could write the same thing about the concrete slab the office or data centre is on. The barrier to being a concreter is very close to "applied for a job as a concreter".
It does not make sense because software could be written using the same quality standards that CPUs use, which would involve a lot of formal reasoning and proofs.
This work style, however, is not valued at all and is actively suppressed by "flat", non-expert hierarchies that reward people for being popular and sloppy.
It works for web companies because errors have no consequences for them and the web has turned into a gigantic tabloid anyway.
Abstraction layers become simpler, more rigid and more robust as you peel them.
A painter uses rules of thumb all of the time, yet his work is entirely dependent on chemical engineers getting their science and formulas right. Artistic painting is just one of many uses of paint (and probably its most approximate), but both the painter and the industrial product engineer need the same quality assurances regarding their colors.
Hardware needs to be solid to allow all kinds of software the freedom to be built.
I think you also overestimate the formality in CPU design. Yes, they do a lot of testing, but it's not like an airtight mathematical document for the whole chip.
In my experience what trade-off is the correct one depends on your ability to iterate and where your risk comes from. Can you ship infrequently, but you know with high confidence what product/market-fit looks like (extreme example Mars mission)? In that case you want all the planning and verification you can get. Can you ship often, but have little confidence in what your users/customers want and what will grow your business (extreme example social media app)? In this case you probably want to prioritize speed of iteration and rework technical aspects once you know it's what's needed.
I've had a very angry product manager because a feature took almost a second to load. I told him that we could make it load much faster but it would be almost a week of work, and he should validate the feature with customers before we invest in that. "Of course we need this feature! It's core to the product!". So we spent a week. It worked really nicely and scaled pretty well. In the next 12 months the feature changed completely twice because customers didn't find it useful. Then it was removed entirely, and then we had to bring it back because the sales team liked it for demos. All that iteration was required to find what the market needed, not for technical reasons, except for the performance work, which was entirely wasted because it all got redone after the first user feedback.
That’s an hilarious story because the asshole professor ends up being right, a naive optimist (the supposed reader) gets a dark surprise, and all students involved get reasonably happy endings.
Could be a Hollywood movie, if you fill in the gap between the penultimate and ultimate paragraphs with some interesting action
The professor told them, after they had already invested years, that they would end up unemployed. They did not, because their skillset was valuable somewhere else.
The way OP told the story, the professor meant what he said to humiliate the students (even though he likely knew they would be wanted in other professions). And a fry cook is not quite comparable to a software engineer.
I worked in process monitoring and supply chain departments at a couple semiconductor companies.
There's an impenetrable culture of "best practices" that keeps the tech from catching up with the times. I eventually left because my career stagnated being on teams who couldn't update any of the internal tools because of this or that exec who was too scared to explore beyond the tiny niche they carved for themselves the past 10-30 years.
The last team I was on was using JDK 8 and Postgres 8. Vendor lock-in on downstream dependents, I guess. We finally managed to convince leadership after 3 years that if they wanted any ML in their systems at all they'd need to update their stack or make room in their fabs for a Python server on the intranet at the very least. So they gave us a Windows box with 2 cores and 1GB of memory because "that's all that will fit" and "we aren't a security company".
A lot of the simulation software for PCB design is stuck in the 90s. The core of the simulation hasn't changed, so it's not too big of a deal from a results perspective. However, it integrates terribly into the workflow, so designers only simulate a small portion of a design and there are few automatic checks and things.
This is just true for a whole lot of the industry tooling. Xilinx Vivado is a bloated piece of crap that'll crash all the time unless you have half a terabyte of RAM.
Same goes for lots of other EE tooling in general. The L and B in MATLAB stand for Legacy and Bloat. People still write programs for PLCs in Ladder, where programs still aren't portable between vendors, or even between different products from the same vendor.
All the companies that produce anything invent their own language for the thing and write their own compiler for it.
These compilers are clearly not written by compiler experts.
I don't blame EEs for building bad software. They weren't trained to do it and aren't paid to do it. I blame the "if it works, it works" culture that the industry seems to have.
Never go back to refine anything, just keep pushing more plugins, more software; create a patchwork of programs until you get the job done.
Having lived on both sides of the fence, I feel comfortable blaming the EEs. Many are allergic to basic scripting, don't bother to learn how their tools work, and have an elitist attitude towards other design responsibilities (ex. layout, verification).
P.S. Set up Vivado in scripted mode with either an in-memory project or non-project mode. It works like a champ.
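For anyone wondering what that looks like in practice, here's a minimal sketch of a batch-mode non-project run (the part number and file names are just placeholders for your own design):

    # write a tiny non-project build script and run Vivado in batch mode
    cat > build.tcl <<'EOF'
    read_verilog top.v
    read_xdc top.xdc
    synth_design -top top -part xc7a35tcpg236-1
    opt_design
    place_design
    route_design
    write_bitstream -force top.bit
    EOF
    vivado -mode batch -nolog -nojournal -source build.tcl

No project files, no GUI state to corrupt, and it drops straight into a Makefile or CI job.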
> Many are allergic to basic scripting, don't bother to learn how their tools work, and have an elitist attitude towards other design responsibilities (ex. layout, verification).
In my very limited experience, most people don't learn new stuff until they're forced to; either by their employer, by their university, or by needing to learn it for something they want to accomplish.
This is why you'll have self-taught developers go for years using strings as enums, linear-searching in huge sorted arrays, because it works and why would you seek out something else?
I think the solution is to introduce more software development in EE education; forcibly expose them to it.
My EE bachelor's degree contained a whopping ONE class that was focused entirely on Python programming.
The rest just used cobbled-together C code for microcontrollers, or arcane languages with dumb IDEs for PLCs.
The tried and true :) It's funny seeing people complain about Vivado bugs when I haven't run into any in years. Sure, the IDE may be absolute garbage, but I've thankfully never run into any bugs in the actual synthesis and routing parts of the package, which is all that really matters.
There are basically only two firms which produce chip CAD software, and they get to be a conservative duopoly of "enterprise grade" (i.e. user hostile) software.
And the industry is so brutally cyclical. They'll hire thousands, work them into the ground to get some machine or line done, then let them all go, then hire them all in a flurry next time they need to build something. It's also very toxic in my experience, lots of mean people.
> X ended up being the only person to go on to work in a semicon company. The rest of us ended up in software, earning 5x the pay X was making
Sounds like a great way of nurturing talent... not. Apparently they will only look at your CV if you have a PhD or a very specialized MSc.
And all this for what again? "The privilege of working with hard problems?!" I've done enough C already and that makes my head hurt now. Thanks but I think I'll take it easier with "higher level software" while making more money and having less stress
Top engineering school hazing, creating underpaid and not quite mentally balanced people
For me it's more of a sad realization that what's very hard and requires skill, talent or both just isn't that well correlated with what earns the most money.
Gotta have enough demand so other competent people interested in the same thing won't drive the value down.
And I do feel lucky that my set of skills correlates with stuff that pays decently.
Just demand/supply I guess. Semiconductor companies need just a few (even if extremely talented) engineers compared to software. The industry is also very concentrated and not growing that much, so there's not that much competition for employees.
I wonder if it's because of the nature of the companies involved. The semiconductor one is actually producing something tangible with huge costs associated with that. Most of the "tech" companies with the crazy salaries are just in the business of seeking rents from an entire market by collecting royalties or subscriptions for automated things, so they have way more money per person to give away.
How many people do you think exist on earth that could build a usable but minimal form of computer/computation from scratch? Roughly the same utility, just with lowered expectations on horsepower, and the assumption that the software would be better optimized.
> Anything you designed was called worthless because there's already a library
This attitude (also in software) indicates to me someone who isn't very interested in a subject academically. There is so much to learn from prior solved problems and reinventing the wheel yourself.
If we weren't interested, we would have quit during the first semester.
Our very first assignment was to design a modified Miller. After nights of struggling with the software and connecting transistors, we were then told psyche! You could've imported from the library that we've never told you about! The rest of the assignments were essentially just importing circuits and moving stuff around and hitting the simulator run. Over and over and over again. It takes a certain grit to stare at the Cadence UI for 10 hours running the simulator button to yield a random result. There's no Stack Overflow, the TA was of no help. Just learning the software alone was hell. The instructions were on printed paper that didn't work half the time. Then you have the crashes that wipe out hours of work. We didn't dare to save our work sometimes because it'd crash the software. Loading the software took forever. I genuinely hated Cadence with a passion.
I'll admit I don't have that grit. My best memory from the course was still getting to design and fabricate my own chip and a PCB to go with it, and successfully converting an analog signal to digital and back. Getting the ENOB right took longer than it should have, but it was rewarding. Still, I'm not doing that as a career.
Only 3 out of the original 20+ people graduated. Only one bothered to apply to semicon. The barrier of entry is high, but the educators were of abysmal quality.
I'm not endorsing bad professor games (which are real). But in my time in academia, a lot of students had similar experiences with courses that just weren't for them.
For example, in a state school analysis course, almost half the class decides they just need to squeeze by to graduate and maybe 5 students will enjoy it and do graduate school.
Then again, it's difficult to figure out what you're interested in at university at 18 or 19. Maybe it's just a vague curiosity, maybe the paycheck. But you have to actually take the course to know. I wish there were a better way.
I've been designing semiconductor chips for 25 years in the US. For the last 20 years all the companies I have worked at have been fabless. We send the mask data to TSMC or Samsung to manufacture.
I sit in front of a computer at a desk all day and get to work from home 3 days a week. We get plenty of new grad design engineers from college.
But do young people want to work in manufacturing plants, on call when the production line goes down? There are also thousands of technician jobs needed that are more of the two-year associate degree type.
I've also seen stories that, because of the small number of fabs, it is difficult to switch jobs; there simply aren't many options. This pushes down salaries as well, in contrast to design, where there are hundreds of companies you can leave for and get significant salary increases.
As with most jobs, more pay and better working conditions will attract more people.
> do young people want to work in manufacturing plants being on call when the production line goes down
They do if the jobs pay enough to live on. There are a lot more classes of young people than recent grads from big universities. See Amazon warehouses and airports.
If I attended Oregon State, why would I want to earn $70k base with almost no stock or bonus to work at Intel or Micron as a new-grad process engineer in Beaverton or Boise, where I pay state income tax, when I can make $110-140k base with a 10% bonus plus stock at Amazon, Google, or Microsoft in Seattle (let alone smaller startups or non-name-brand companies like UIPath) while paying no state income tax?
The classes you study in EE/CE are extremely similar to (and often the same as) those in CS, so I may as well earn more with a better quality of life.
Salaries in the semiconductor industry in the United States are stuck at 2000s levels, which led to the massive outsourcing of electronics talent to Taiwan, South Korea, the PRC, Israel, India, and Malaysia, because you could pay $50-70k salaries and still get top-of-the-barrel talent in the EE space (though Israel is seeing the same shift to software - sad days for Haifa).
> I thought this was about technician level with high school and associates degrees
The article is about students attending Bachelor's and Master's programs. Even process and manufacturing roles require a 4-year degree now. I have friends working exactly these jobs at the Intel fab in Beaverton (hence my calling it out).
Associate’s degree or Bachelor’s degree in a Mechanical or Electrical STEM field related to semiconductor manufacturing technology
• Examples: Microelectronics, Mechatronics, Electronic Technology, Equipment Technology, Industrial Technology, Manufacturing Technology, Avionics Technology, Electrical, Chemical, Materials OR Semiconductor related Scientific STEM Degree
> people make career choices for reasons other than money
"Technican level" in steel factory is not same as semiconductor factory. You need to understand process to control and fix it and semiconductor manufacturing is probably the most complex process out of anything humanity makes in factory.
I mean my experience is just listening to podcast with former chip fab technican, but whatever, I won't stop you from thinking making requires same competence as flipping burgers
So, it's been a while since I worked in semiconductor manufacturing, but I know a little bit about this from personal experience. The community colleges, yes maybe; the universities, no not particularly.
Once upon a time every fab needed lots of engineers, because they were designing the manufacturing process (because each company had only one fab, or at least only one modern one). As they grew, they kept hiring engineers for each fab, mostly out of habit and custom. Then they put huge bureaucratic obstacles in the way of changing anything, because the manufacturing process was already designed (at another fab), and you didn't want every engineer who wanted to make his mark on things to screw it up.
The thing is that if you make a change in step 87 out of 450, and that change turns out to screw things up, you can end up with (450-87)/450 ≈ 80% of your factory full of dud semiconductors. It takes a few months for a wafer to get from the beginning to the end of the line, so that's a couple of months of production from your multi-billion-dollar fab that you can throw in the trash. So mostly, they don't want the engineers to do any design, because there's way more downside than upside. Only at the R&D fab is anything like real design done.
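(A rough back-of-the-envelope sketch of that number, assuming WIP is spread evenly across the line and the problem is only caught at final test; the step counts are just the illustrative figures from above, not real fab data.)

    # Back-of-the-envelope: by the time the first bad wafer reaches final
    # test, every wafer downstream of the changed step has already passed
    # through it, so that whole stretch of the line is scrap.
    def ruined_wip_fraction(change_step: int, total_steps: int) -> float:
        return (total_steps - change_step) / total_steps

    print(f"{ruined_wip_fraction(87, 450):.0%} of WIP scrapped")  # ~81%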
Now, to run the (rather complex) equipment, you probably want a lot of community college graduates, although frankly a smart high-school graduate could do it well with a few days' training. But the only reason they "require" so many engineers at production fabs is that they don't realize it's not necessary. It's a bunch of people who worked for years to get an engineering degree, put in a place where they are very nearly forbidden to change anything.
If the multi-billion-dollar fab gets built, they will run it. If there are not enough college graduates to staff it, they will miraculously discover that smart high school graduates will do just as well, as long as there are engineers at the R&D fab.
It's also been a while since I stepped foot in a fab, but my impression is that for modern processes there are no people in the fab proper. It is fully automated, people sit outside the fab or they service machines in drop-down bays underneath the fab floor, accessible only through airlocks.
What the technicians are there for isn't to operate the robots...they're supposed to analyze the performance of the robots constantly and summarize anomalies and flag them to the next shift. So a work shift might involve metrology, statistics, and communication skills; I wouldn't expect that skill set from any but the most exceptional US high school graduates, and those folks are bound for the university, not a job with 12-hour days wearing clean room suits and limited bathroom breaks.
Some of the fussier processes like EUV also require regular cleaning of excess tin in the machine to keep yield up. AFAIK it's a trade secret how this works, but I know one of the big delays in rolling out EUV was that the cleaning process was hard, and that led to a lot of downtime on the line. People on both sides of the Pacific were able to make chips with EUV for a while, but they just couldn't make money while making them. TSMC cracked that nut, but I haven't read any articles describing exactly what their magic was to get the uptime so high. However, I suspect it might involve a platoon of PhD-level experts who also work like an F1 pit crew.
Anyways, the days of carrying boxes of wafers from machine to machine and pressing buttons are long gone. Everything that can be automated has been; what's left is debugging and servicing robots, and searching for early signs of degradation in an otherwise fully automated machine.
So, the real value of a highly trained technician in this case is that they can look at bumps in the data, go "huh...", and flag it to management so that you don't end up with 80% of the fab full of trash wafers. In this situation, you want someone who actually understands the physics of the process going on inside the machine, not just someone who is following an SOP. Any analysis that could be easily automated probably already has been.
Of course, you need even more highly skilled people to set up the robots in the first place, and that process also consumes hundreds if not thousands of engineers, depending on your schedule, scale and the complexity of the machines. A lot of that can be pushed back onto vendors, but you still need an on-site team who actually understands the physics to get your process yields up to several nines and hold it there with almost no downtime.
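(For a flavor of the "bumps in the data" analysis that does get automated, here's a minimal sketch of a plain 3-sigma drift check over a metrology series. The function name, numbers, and window size are invented for illustration; real fab SPC uses much richer rule sets and tool-matching logic.)

    # Minimal sketch: flag readings that drift more than k*sigma from a
    # trailing baseline. Illustrative only; real SPC adds Western Electric
    # rules, CUSUM/EWMA, tool-to-tool matching, etc.
    from statistics import mean, stdev

    def flag_anomalies(readings, baseline_window=50, k=3.0):
        """Return indices of readings outside mean +/- k*sigma of the
        preceding baseline_window points."""
        flagged = []
        for i in range(baseline_window, len(readings)):
            baseline = readings[i - baseline_window:i]
            mu, sigma = mean(baseline), stdev(baseline)
            if sigma and abs(readings[i] - mu) > k * sigma:
                flagged.append(i)
        return flagged

    # e.g. hypothetical film-thickness readings (nm) with a drift at the end
    thickness = [50.0 + 0.1 * (i % 5) for i in range(60)] + [52.5, 53.0]
    print(flag_anomalies(thickness))  # -> [60, 61]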
There are still plenty of people in the fab. They're not usually running product around in a modern automated fab, but there are always equipment issues to work through. When you have fleets of multiple tens of tools, it's always an exercise to keep them matched and in-line with each other. The majority of modern cleanrooms are "ballroom" style, where the tools all sit out in the open. Each tool has a class 1 mini-environment inside which keeps the product clean. Outside the tools is a cleanroom, but it's not as clean as inside.
Remote access has helped out a lot, but the process engineering folks will still spend a bunch of time every week on the line. The equipment engineering folks will spend time every day on the line working on things.
True that, but if they didn't have enough engineering graduates, they wouldn't just not run the fab. They would suddenly discover that a few engineers at the R&D fab, plus some motivated and smart high school graduates at the production fab, can totally make things run. The fabs will not sit idle due to a lack of college graduates.
A lot of what needs to be done definitely is not rocket science, and much of it can be trained -- at least when things are going well.
One thing I do appreciate is that there is a pretty good meritocracy at the second-tier fabs working on legacy nodes. I know a fair number of folks who have worked their way up into upper management with only an associate's or bachelor's degree.
Yes, it's mostly the continual equipment maintenance, plus the occasional process engineer who goes to actually look at the equipment (mostly in drop bays) as wafers go by in boxes. Most of those are associate's-degree positions, because you need to understand enough (cleanliness, electronics, a little optics, and some chemistry) to debug and replace components without breaking more stuff. Most of the equipment is manufactured, updated, and maintained/fixed by European, American, and Japanese FAEs anyway.
That said, I do still hear that when there's an actual yield problem (or a delay in achieving yield), then there's a lot of overtime for the process engineers getting it up and running. Usually some engineer has to be on call all night in each area (photo, diff/dep, etch, implant), but in an office nearby.
Oh, and also there is all the process model extraction/control that has to be done for each fab/process. That's a whole group of serious engineers to create the PDKs.
Everything I hear about semiconductor technology sounds like you need two PhDs minimum and then you have to work in horrible environments with horrible management (either personally or just due to stressful deadlines).
I think prestige is a big factor (and kind of a trap): people want to work in a prestigious field regardless of lower salary, worse working conditions etc. That's why there is so much competition. Kind of like how academic positions are so competitive despite low pay.
EDIT: Also an oversupply of graduates who feel they need to work in their fields of study. Actually I think "graduate oversupply" is the main issue. Feels like such a waste to work elsewhere if you've dedicated years to studying something, it's a major sunk cost.
Maybe it's less common today, but I would say the overwhelming majority of people I went to undergrad with ended up doing things that were only tangentially related (or not related at all) to what their degree was in.
Yep, 50% of graduates from my course went into finance, after spending 4 years learning about electrons. Joke's on me, they probably started at more than what I make now.
No, but it was a decent enough British one. It still seems wasteful of everyone's time to hammer MOSFET equations into someone who has subzero interest in electronics ("doesn't recognise a screwdriver in practicals and has no intention to start now; it's not in the exam" level of apathy). But then again, they all got Firsts by absolutely smashing exams and I didn't. So I have to say they could really focus on something they didn't care a jot about, for years, as a means to an end (landing a job at Deloitte, in the case of the screwdriver person), in a way that I never could.
If you just wanted to learn differential equations or something there must be way more effective ways to do that, and it would free the engineering teachers to teach engineering to aspiring engineers, who are being turned away in droves to make space for City-fuel grads.
However, as they say, the market has spoken, especially in a country where "industrial policy" is either "lol" or "sell".
The UK job market is a bit different from the US tbh. It feels like there isn't enough respect for Engineering+STEM in the UK. Almost everyone from good programs seems to end up in S&T or Consulting or IB.
Not OP but not necessarily. I have a similar percentage of Physics+Applied Math major friends who attended Cal and UCLA who ended up in Consulting and/or Financial Engineering.
A lot of the sciences aren't really terminal degrees for people who want to pursue the subject of their major as a profession. So if you don't want to go on and get a physics PhD--which may not be a great career trajectory anyway--going into something that interests you and pays well for generally smart people who aren't scared of numbers isn't the worst move. (And you're probably still better off with that physics degree than you would have been with an undergrad business degree.)
I am one of those disaffected STEM grads btw - T10 STEM program grad -> Engineering internships -> Policy Stint -> BSchool -> MBB -> Product Management
The business portion is easy to teach, and hiring for Finance+IB+VC+Consulting+PM roles is increasingly oriented only at STEM grads and BSchool grads from top programs (e.g. BSBAs @ Haas, Wharton, Anderson, Ross, Stern).
I'm pretty similar--engineering undergrad/masters -> worked as an engineer in oil business for a few years -> MBA -> product management -> various analyst/marketing roles that followed from that fairly naturally.
To add to that, given when I went to school and the fact that I wasn't a CS major, I didn't touch a computer in undergrad. (Not literally true--I took a FORTRAN course which I didn't use until much later.) Never touched a PC until I was working and that was initially an Apple II we ran a couple of analysis programs on. But... like many of my classmates we ended up in the computer industry anyway.
>increasingly oriented only at STEM grads and BSchool grads from top programs
I assume the idea is that they have not only already been filtered by the schools, but if they got through the program with flying colors they're presumably not dummies; we can do a sanity check that we can put them in front of clients, and then teach them what they need to know.
Oh whoa! That industry definitely played (and still does) a big part in technology today. I heard a lot of stories about the HPC/simulations and networking research done by the ONG sector.
> we can do a sanity check that we can put them in front of clients, and teach them what they need to know
Exactly! That's how I do hiring for interns and new grad PMs as well! Also the added benefit of having technically adept candidates makes imparting domain specific knowledge easier (if I'm a Cybersecurity PM, I'd like to have someone with networking+os chops along with business sense)
I was actually an engineer for an offshore drilling contractor. More planning and running shipyard jobs than doing HPC stuff though I was also involved in a couple accident investigations.
But got in at a hot time and had a planned departure to grad school just as things were having a major downturn.
Semiconductor salaries are still very low. Intel and co pay around 70-100k with marginal stock and bonus for most engineering staff. Even seed stage startups pay more. May as well become an accountant at that point.
> When that happens, people will just go back into finance and other high pay fields that require less education and effort.
Straight from the horse's mouth, I know that quite a few very intelligent, insanely well-educated (in a relevant direction, say, a doctoral degree in mathematics, physics, stochastics, ...) and hard-working people don't have the personality traits that finance is looking for.
I know quite a few insanely talented people who could not, for example, get a permanent position in research (for reasons that have nothing to do with the quality of their research) and who would, as mentioned, stand no chance of getting hired in finance (if I had a startup, I would seriously consider hiring them).
So the whole "go into finance" advice is, in my opinion, divorced from reality.
You are being too literal here. Firstly, there's a large number of software devs who would do just fine in finance. A common refrain is that many of those who would have gone into finance over the past decade or two went into tech instead. Secondly, there are many other fields people could go into. People follow the money. They adapt their skills to those fields. This idea that "well, devs will just work for peanuts" is absurd. Intelligent and talented people will simply go elsewhere if the field is no longer a source of high incomes.
Of course I expect it will continue to provide high incomes for quite some time. There are few things I can think of that wouldn't be improved by software, and as long as that holds true it means there's more demand for software.
This is probably a reason STEM graduate-level classes and PhD programs are predominantly international students, willing to go on into industry and toil in immigration limbo.
Yes and no. International grads in the US also want stability, and FAANG has proven to no longer offer that. Many Stanford grads I know are targeting pharma-tech and health-tech companies, which still pay well but have way more stability, at the cost of being boring as fuck.
FAANGs are especially brutal right now. Because many of them had layoffs, they are not even allowed to process perm certification for the time being, so green card hopefuls are stuck.
In my view FAANG jobs are beside the point. I've puzzled about this a lot, because I'm technically a "hardware person" with a degree in physics yet I'm also a good programmer. Many of my physics classmates became programmers. I could hypothetically bring myself up to speed on the latest software development practices and make myself employable at it. Maybe not at FAANG, but probably somewhere decent. But I've never made that jump.
I work in an office that has a sizable software development department, so I'm quite familiar with what they do, and what their work environment is like. It's not some distant mysterious place. They're my lunch buddies.
There are also lots of people who are quite bright, and do difficult mental work at lower pay, yet don't learn to program for whatever reason. Those of us who program know that it's actually quite easy, yet is an impenetrable barrier for most people, and we just don't know why.
There will always be somebody who makes more than you do, but it doesn't explain why people aren't all leaving their jobs for higher paying ones. Part of it may be psychological inertia -- I might be guilty of that myself. But I think another part is that we don't actually know the recipe for success.
Last I checked, they generally do require you to live in SF, Seattle or similarly "awful places". If anything, rampant homelessness and crime seem to be highly correlated with cities that have "FAANG" HQs.
Given that four of the FAANG companies are headquartered, not in San Francisco, but in suburbs south of it (Menlo Park, Mountain View, Cupertino, and Los Gatos), I'm not sure what you're on about. Los Gatos isn't perfect, but it's not some haven of homelessness and crime.
I lived in the Bay Area for years: in the city, in Oakland, and in Mountain View for a year. I agree the South Bay is less chaotic than SF. It's still a lot more chaotic than either Colorado, where I was born, or Taiwan, where I've spent most of my life. To me, the US West Coast was always a sacrifice of quality of life for career growth.
If you need a code to enter the bathroom at your local coffee shop, you might want to consider moving somewhere with a bit higher baseline for public spaces.
San Francisco is still an incredible city to live and work in if you're young and don't have a family (which is a huge part of the workforce). The homelessness problem is blown out of proportion in terms of everyday living (it's still ridiculous from a societal point of view, but still)
To be fair, my experience there was all in 2012-2015.
It’s not just homelessness, though. Stores here in Taipei aren’t shutting down due to huge losses from shoplifting, but that’s happened to multiple places I used to go to regularly in SF. Nobody I know here has been robbed in public, either. In contrast, my friend’s mom had her purse stolen from her in broad daylight on the street in SF and several other friends had their cars broken into.
Wait, you're comparing an American city to Taipei, and saying it has a crime problem?
Why not compare it to another American city? You'll quickly discover that these 'worst' cities are middle of the road when it comes to most forms of crime, the real horrible places in the US are not in the liberal coastal states. The vast majority of them are in the tough-on-crime god-fearing ones.
(I also always found it a little weird how many of the stores citing crime for closing are ones that just happened to have unionized, but maybe that's just a coincidence...)
If you think Seattle, outside of ~18 square blocks is an "awful place" by the definition of even the most dyed-in-the-wool WASP, you've either never been to the city, never been outside Pioneer Square, or have been eating way too much political propaganda.
When I was being recruited by ASML this is what I thought: investing all that time, money, and energy, only to be at the mercy of one employer, have to live in very specific locations, and the salary is not even that great.
Sidenote: most companies in most countries do this. They tend to align on a standard set of salary grids for the region, more or less the same between each other, to keep competition, wage increases, and employee churn to a minimum.
This is of course illegal in most places, but to counter it in court you need a whistleblower willing to go public and expose them with hard evidence. Since the workers actively involved in these wage-fixing schemes are very well compensated and bound by NDAs, they have no reason to throw all that away in exchange for no other company ever hiring them again and their ex-employer suing them for breaching the NDA. Sure, you might eventually win in court, but do you really want to drag your family through expensive lawsuits against mega-corporations who spend more on toilet paper than you can on a lawyer?
Is there any company anywhere that doesn't research the market rate for jobs to at least some degree? What you decide to do with that information is another matter of course but I'm well within my rights to decide not to pay anyone more than what I've determined to be the market rate no matter what other offers they claim to have in hand.
In the country I work it goes like this: the employers are members of an IT association, once a year each employer sends over the anonymous salaries for each job title. The associations aggregates them and sends back the distribution, which HR then uses as a reference. Similar result, but no direct collaboration between employers needed. You can even buy the report for each position as an employee for a fair price.
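(For illustration, the aggregation step is essentially just this; the job titles, figures, and percentile choices below are made up, and a real association would add anonymization thresholds, outlier trimming, and so on.)

    # Toy version of the association's aggregation: pool anonymous salaries
    # per job title and return a distribution summary for HR to use as a
    # reference. All titles and figures here are invented.
    from collections import defaultdict
    from statistics import quantiles

    def aggregate(submissions):
        """submissions: (job_title, salary) pairs from member employers,
        already stripped of anything identifying."""
        pooled = defaultdict(list)
        for title, salary in submissions:
            pooled[title].append(salary)
        report = {}
        for title, salaries in pooled.items():
            p25, p50, p75 = quantiles(salaries, n=4)
            report[title] = {"p25": p25, "median": p50, "p75": p75,
                             "n": len(salaries)}
        return report

    data = [("RTL Designer", 72_000), ("RTL Designer", 80_000),
            ("RTL Designer", 95_000), ("Process Engineer", 68_000),
            ("Process Engineer", 74_000), ("Process Engineer", 81_000)]
    print(aggregate(data))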
This sounds similar to what RealPage does in the U.S. with apartment rents: https://www.propublica.org/article/yieldstar-rent-increase-r.... After the Propublica story came out, the U.S. Department of Justice started investigating it as essentially collusion-via-3rd-party.
If you're the one setting the market rate, and there are no market forces to stop you, it becomes a big problem
As we can see from the current situation, where the USA is completely reliant on foreign fabs and unable to make its own chips, partly due to a shortage of skilled and willing labor.
What is NOT illegal is for everyone to hire the same "outside consultants" who give you the same slide deck they give all the other employers, "agreeing" on the same wages and giving everyone something to point at.
A lot of that is Californians in skilled trades and back office roles who got priced out of NorCal and SoCal. A lot of my HS teachers for example moved to AZ to become teachers because they could at least afford a home in the Phoenix Metro, along with friends of mine who ended up working in Sales, Accounting, and Back Office Administrative Staff work.
The catch is that you can't have FAANG jobs if you don't have semiconductors. Greedy short-termism, grabbing what you can now and forgetting about the future, is always what kills the golden goose.
Unfortunately, you can't expect people as individuals to make those calls. You need mechanisms that properly value long-term sustainability.
The fact that FAANG jobs pay better means the supply/demand ratio for talent that can do semiconductor jobs is higher than for talent that can do a FAANG job.
GP is right: rationally, everyone who could would rather live in Mountain View and work a FAANG job than live in some factory town in the Arizona desert.
Apparently there just aren't enough such people, so FAANG will keep paying more, and those who can't get those jobs will have to take the worse-paying positions that require harder work and more dependence on a single employer, under far worse working and living conditions.
Phoenix isn't some "factory town". But your general point is true - wages in the electronics sector are stuck at the same levels they were at in the 2000s.
That's why companies like Anduril and Tesla can poach electronics and MechE talent with still-low but not-as-low wages (110-130k with a bunch of stock versus 70-100k with almost no stock).
There is no shortage of chip design talent in the US (I'm in Austin and I constantly bump into these folks), only of chip _fabrication_.
What little I know about fab is that extremely bright physicists do rote tasks with horrible schedules. I don't think the solution to that is more education
Waves from a perch overlooking AUS and Giga Texas.
I've run into enough NVIDIA and AMD engineers at The Domain to surmise this is true.
We need a funnel for training highly skilled blue-/light-blue-collar workers to feed the strategic needs of said fab industry if the US is serious about building domestic capabilities and competitive independence.
Right now, I don't think the current state of the US education system from national to local levels is promoting the fundamentals needed to attain this goal.
Agree completely. High specialization towards physics and science weeds out other brain types that may have a lot to offer. Trades generally don't do that
Because manufacturing at this scale and precision has extremely thin margins and stringent quality requirements. A co-op would necessarily be more communally minded and would be outcompeted by more ruthless, efficiency-seeking operations.
Going to be tough for them: the jobs are in the EU (ASML) and Taiwan (TSMC). I've heard some other critical tools come from Japanese companies (masks and others).
The silicon industry is critical for any modern society, and it seems to make economic sense only at worldwide scale: it therefore requires significant out-of-free-market financing. The state administration has to organise and maintain this "out-of-free-market" financing.
Only bleeding-edge, performant silicon should be tolerated there.
I graduated 5 years ago with a degree in computer engineering and have been working since then in embedded systems for satellites/aerospace. Currently I'm finishing a master's in CS with a thesis on cache prefetching. I started the master's because I decided aerospace is probably not the place for me (the salary is okay but not stellar, there's very high inertia, and the hardest problems are control systems and signal processing rather than software). I loved computer architecture in undergrad, and I think getting a job at a newer fabless chip company like SiFive or Tenstorrent would be amazing, but I am wary of all the warnings I read from people in the field. Is there anybody here with industry experience who has something good to say about working on the fabless side of things?
There’s an absolute shedload of opportunity, for one thing. I run a job board for people who design chips and RTL for FPGAs. The pay is good and all companies are hiring RTL designers.
For another, plenty of people who do it love it. This probably isn’t the place to connect with them. I’d recommend /r/fpga or /r/chipdesign on Reddit.
(That job board is www.fpgajobs.com if you're curious.)
Thanks for the suggestion! As someone with little job experience doing FPGA work (just a single internship and university research), is it that hard to get an offer that’s not extremely junior level? I’m hoping to leverage my embedded experience for a role that involves firmware or OS work.
Lots of folks are starting to leverage hybrid FPGA/ARM chips, like the Xilinx Zynq series. Folks who work with these typically get pretty involved in both embedded development and RTL, as part of the challenge is figuring out what segment of compute lives where in those types of systems.
There are plenty of places who'd take a decent firmware engineer and then allow them to learn the gateware portion through absorption.
A master's in this area also is a good way to break in. Defense companies tend to value it pretty highly.
Intel is planning to build an enormous multi-billion-dollar fab just miles away. It makes total sense to build up semiconductor education there. OSU is a fine state school with a well-regarded CS and EE department.
Let's not count our chickens before they hatch. I have serious doubts that US universities are capable of building anything except nepotism networks for upper-class Americans. I hope they prove me wrong, but you can't build a semiconductor industry on nepotism.
America is home to Qualcomm, Intel, AMD, NVIDIA, Micron, Broadcom, Texas Instruments, Analog Devices, Apple, Applied Materials, Lam Research, KLA, Cymer, and most of the other important fabless houses and IDMs outside of Korea. TSMC and Samsung are both building large foundry fabs in the US. Why don't you stick to opining on what you know?
US tech moved up the chain just like most US businesses did during globalization. Why have an extremely intricate and expensive-to-run fab when you can just have a warehouse full of coders making better profits?