We’re talking about skills that span kernel-level programming, hardware quirks, low-level debugging, distributed systems, security, orchestration logic, even the capability to work with the UI/UX team... and the ability to explain all that without scaring interns. You can’t just hire for that. You have to grow it. Nurture it. Beg for it. Or in some cases, resurrect it.
If you are that person, what is the best way to market yourself? I am the person described. I've got experience from poking registers in firmware, to wireline transport protocol implementation, to infosec, to writing microservice framework middleware, to pipeline orchestration at the OS level, and on and on. In the last week I've debugged Linux UDS issues and TLS cipher suite problems, and wrote code to provision WiFi-connected devices over BLE.
But it's incredibly hard to demonstrate that in an interview, if I can even find a role that warrants it. You're not going to find me on a university campus or in a research lab because I'm at a FAANG trying to pay my mortgage.
I think there's a bit of an "eternal September" where people are promoted to management, asked to hire a bunch of people for some niche role, and blog about their inability to hire good people.
Hate to break it to the author, but there are plenty of people who can write a driver, a bootloader, distributed systems, Paxos, etc., but there are NO JOBS DOING THAT, so they all work for <generic SaaS company> making <generic NodeJS app>.
The author mentions their grand strategy is fishing for talent out of universities, which is probably smart. Pulling people out of generic web dev world to go write "container orchestration logic" (or some other niche) is going to be a hard sell - most of those people were burned before by straying too far from the lucrative web dev jobs. Nothing like going into job interviews and telling the hiring manager that your last job was some ultra-obscure niche.
My first job out of university was working on an embedded Java runtime, with a group of peers from said university. Which sounds similar to what Virtualise.sh is doing.
Now, 20 years later? Tech lead on a Typescript/Flutter/AWS internal system.
In some ways I'd be excited to write container orchestration logic, or hack on a hypervisor. There's precious little of that work available, especially so in a small country like mine (NZ). My CV is crazy enough: being burnt isn't the issue, just finding the work is.
The cynical part of me wonders if the other reason to hire people straight out of university is that they're cheap...
cheap, moldable, and not distracted by families will always be a strong combination for niches you can't really learn otherwise. Not a lot of people will take a "downgrade" in their career to work on some passionate or cutting edge tech.
Also the usual botched hiring process, where even if one has the knowledge, just because it wasn't what one was doing the last five years, it doesn't count, even if one has a portfolio of side projects to show.
Or the dumb leetcode stuff because "we are going to be cool like Google over here".
So many gravitate to companies with more sensible hiring practices, and that is how a company loses candidates, unless it is one of those unicorns that the whole planet wants to work at.
> Also the usual botched hiring process, where even if one has the knowledge, just because it wasn't what one was doing the last five years, it doesn't count, even if one has a portfolio of side projects to show.
So many times this. It doesn't even have to be five years to disqualify so many applicants these days. If you aren't working on it currently, there's already so much bias against you in the hiring process at large (e.g. the "I don't consider anyone that isn't currently employed" mantra that pops back up now and then).
Pretty much matches my experience. I'm an EE, I started out writing an LLVM backend for a custom DSP. Pretty good understanding of hardware, compilers and general low level things.
Now I have a generic web dev job working on a generic nodejs app. There are many more jobs available and in general it's much easier to have a remote web dev job.
I think I'm pretty similar; I don't have great advice. More or less, at my first real job I got the attention of my skip-level boss (he was one of my interviewers and I also did a couple of projects with him), and he's hired me at two more places since then, so I'd say: be sure to network. Bonus points if your network gets you a job where you don't have to work again.
I've also done a couple of sessions of peeking at a problem in production and fixing it / telling people how to fix it, based on reputation, which is networking.
Oh, and one more thing: never ever mention any experience with mail handling, or you'll get roped into doing it again. People remember, even if you only said it once. :P
I think this is the real answer. A personal testimony is the only way to truly distinguish between a bullshitter and one of those radically curious, passionate people who learn everything they can and quickly become experts in whatever they touch.
A sibling comment said to repeat what you just said in an interview. Don't do that.
Demonstrating that you are a stable hire who will fit well with the team is almost all that matters. You wouldn't be having an interview to begin with if you didn't technically qualify-ish. Even when speaking to technical interviewers, you will intimidate them. Actually selling yourself comes after your first 90 days of acclimation.
>because I'm at a FAANG trying to pay my mortgage.
Let's rephrase that:
I'm at a company with top of the line talent in many different domains and I prefer the compensation of this company over networking to find someone who knows the hiring manager of a team like this.
You have the easy way into this. For people without this networking, the best advice is to make employers come to you. You gotta share that knowledge out there and build a reputation in various communities online if you can't stand out in a batch of resumes (and no one these days can, for reasons entirely the fault of HR and ATS). Do that and your newfound network will connect you.
Go for robotics or a specific robotics subtype (UAV, ROV, vehicles, etc.). If there is a "platform team", that would be your best fit, but some companies don't have a separate platform team, or they don't advertise the team name in job postings.
People like that are in demand. Our company just filled a few positions like this, and hiring was hard. Some people were rejected for not enough low-level knowledge, some for not enough high-level knowledge.
Agree with the TFA's author though, people with all the skills are super rare. We were lucky to end up with a few hires like that, but most people don't have all the skills at once, and they have to learn on the job.
I think it's just not possible, because you're stuck with the reality that it's some combination of a machine and HR sorting through resumes.
Even past that step, it's a bit random. I fully admit I ignore the resume when doing interviews, as it would just bias the interview role I'm given, which is typically working through some coding problems.
> You're not going to find me on a university campus or in a research lab because I'm at a FAANG trying to pay my mortgage.
Make sure you are demonstrating it to the people on your team so that when they leave and go somewhere else, they can hit you up. This takes some time.
And, sorry, you have to get out and hit some gatherings in person (hackerspaces, meetups, professional meetings, etc.); online-only isn't going to cut it anymore. With the AI garbage clogging up online interactions, your professional network is back to who you know in person.
I found it is best to keep quiet, to not even have a blog (directly attached to your name) with such varied content, and instead just send the appropriate version of your CV when needed. When it comes to interviews, I found it helps that I'm in my office/lab space, able to pick up the cam, show the person on the call the contents of my shelves, and present a few details about them on the office whiteboard. But even then they of course doubt you.
No he's saying if you are a generalist you shouldn't try to market yourself as that. Usually employers are looking for specialists that are perfectly moulded to the one exact task they need at that very moment.
It's misguided of course, but that's what they think they want and if you say "I've done all sorts of things and I'm good at all of them" they'll hear "I don't have much experience with anything" and discount you.
Exactly what I wanted to get across. That's why I have 3 personal pages/blogs and a separate forum. Only people very close to me know about all of them. Otherwise people get weird about you doing so many, so varied things. And I'm not even talking about work relationships here; on a personal basis, people feel insulted, as if it were a competition, if you list what you do when they ask you to. I have super focused, 16h days at the lab/office, of course I'll get a lot done... https://supermemo.guru/wiki/Planning_a_perfect_productive_da...
On the flip side, a few reasons why I don't favor generalist profiles (while being one myself).
- Resumes which claim "generalist" tend to be SEO'ed rather than real, or AI generated: they claim experience with dozens of technologies, most of which is untrue. Throw everything and see what sticks.
In most cases it just tells me what they have heard of rather than what they know, especially if they are not very senior.
- It takes really hard effort to be more than a superficial generalist. Even when candidates know some of those skills and remember them during an interview, it is often without much depth; they pursued them only out of passing interest, not in a professional capacity. (That is fine, but it is a lot of work to make that distinction when evaluating all relevant skills in a timed interview.)
- It is harder, or simply not viable, for a specialist to interview a generalist, so you need to have a few generalists to hire more. If you don't have any, or they are not doing interviews, you are not going to get more.
- Being a generalist with depth of understanding means you are the type of person who needs to understand things properly before doing them. You spend a lot of time learning things which are not required to get the task completed.
That means you either need to be prepared to spend a lot of personal time and be perpetually stressed, or be slow in completing things.
Letting that go is a hard thing to master. I don't think I have learned it yet.
- Complexity and depth of corporate-backed technologies change a lot over a short period.
I learned Linux architecture, vim, and git 20 years ago; they haven't changed much. You can be productive again in any of these stacks very quickly, even if you didn't use them for years.
The last time I worked on Android or Swift was more than a few years ago. I doubt I could build a serious app without spending major time. The learning curve to become productive is steep, and the prior knowledge is of limited usefulness.
>Being a generalist with depth of understanding means you are the type of person who needs to understand things properly before doing them. You spend a lot of time learning things which are not required to get the task completed.
It's just how my career swung me. I would have loved to have developed as a subject matter expert, but everytime I get into the swing of things, layoffs came around or the studio shut down. Now I'm freelancing and that by nature requires a generalist approach.
No one's really investing in specialists, so I have no idea how the millennials/Gen Z of the world will ever get to properly specialize. Specialization requires time to master something, and that time implies stability to do that thing.
>The learning curve to become productive is steep, and the prior knowledge is of limited usefulness.
business wise, sure. It's a shame all business sees as "productive" is based on how many widgets you churn in that time. It's no surprise such companies want to force AI into it without quality considerations.
Generalist over here as well; that is kind of what pushed me to boring fields: UNIX, Windows, Java, .NET, C++, vanilla JS,...
Changes also happen, but they are kind of glacial compared with areas where everyone is hunting the new shiny every year.
Usually I tend to be a laggard on the adoption graph, the large majority of stuff hardly makes the curve, and I have better things to do with my time.
Plus, that vintage stuff that isn't cool to write blog posts about usually pays well enough.
Stuff like Android or iDevices suffers from yearly fashion: platform owners feel compelled to reboot the development experience every year, forcing app developers to keep up, and also as a means to sell new devices.
> Being a generalist with depth of understanding means you are the type of person who needs to understand things properly before doing them. You spend a lot of time learning things which are not required to get the task completed.
Ha, I feel seen. Though also I wouldn't frame this quite so negatively. I've seen a lot of tasks "completed" by people who just got the job done without fully understanding it and they frequently get it done badly.
I'm one of those generalists who's done everything from bare-metal work, to cloud stuff, to most things in between. Very few companies hire for that. They usually have some particular pain point they need fixed, and I have experience with that. But it's easy to explain that if they need help with all the other stuff the product needs to get out the door, sure, I can pitch in with that. More importantly, I understand intimately how my piece fits in with all the other pieces in the stack, which makes it far easier for me to design and build components that fit into their wider environment.
That's something hiring managers do find compelling.
Most HR hiring managers, and headhunters alike, have a really hard time seeing the point of generalists.
They want the easy way out: what to focus on.
As I usually say, I tend to do whatever; such is the life of agencies. However, when applying for positions, sell the skills relevant for the position.
Not e.g. "I can do .NET and Java", rather "I can do .NET, proud of X, Y and Z projects in .NET, and by the way, I may do Java as well if needed, there was this project...", something like this.
I think the counterintuitive thing is that the marketing communicates a lack of confidence. It's... sortof odd to think about it that way, but to communicate actual confidence in your field, you mostly have to be willing and able to have a conversation about whatever topic your interviewer fancies. Being able to do that comfortably speaks volumes that your resume and project portfolio struggle to replicate.
This is the advice I plus-one. On the other hand, listing everything from kernel to UI raises security questions. In my experience, I was once rejected with "we can't hire you, you're too dangerous."
On the right team at a FAANG or a large SaaS, because the places that need all of that kind of work are few and far between. E.g., somewhere among the teams that make the Oculus firmware would be close enough to the hardware to need all that sort of work in order to ship to the tight latency specs they're looking for. Or early employee at a startup making a product that's going to need those sorts of problems solved on a consistent basis, one that's well funded enough that you have time to solve problems like that in ways that aren't a total hack (because we all know those are going to be removed Real Soon Now).
What you say makes sense, but I guess finding the right team is extremely hard. I would guess that 95% of FAANG is actually kinda bad and only lives from the reputation of the 5% (or less). So how do you specifically interview for the 5% without a prior connection / network?
You're entitled to your own beliefs about FAANG, but that particular one might make it harder to find any team you like. It would definitely make it nigh impossible to find the right one.
There's also a question of making the market. I've got these really weirdly-broad skills. LinkedIn is not great for finding the niches I want to fit. I suppose there's HN's "Who Wants to be Hired" threads, but that's risky if you're already employed. I'm obviously networking already, but what else is there?
I think the issue started way before AI when it comes to not having enough fresh blood for this industry. Companies have decided they don't want to train juniors. They want seniors who already have the skills.
> Companies have decided they don't want to train juniors. They want seniors who already have the skills.
It got worse in the last two years. Now Senior Engineers must have the exact combo of weird tech stack and tools, with N years of experience, exactly matching the existing employees, else GTFO; you don't even get screening calls. You don't have something from the nice-to-have list? Lol, why are you even wasting our recruiter's time? She needs to use GenAI to write her next rejection email.
Also, your 15 YoE does not matter, unless you are coming from a direct competitor, in which case your 1.5 YoE plus an internship is also excellent.
I think it is important to acknowledge that it takes much longer to train juniors in systems software today than a couple decades ago because the nature of the problems has changed. It now takes several years to get someone to an acceptable level for a lot of systems software work. In many cases that’s longer than the entire development cycle — a junior would never really be productive on the project. And yes, this is creating a vicious feedback loop where we are no longer producing many new people with these skills despite a lot of demand.
The minimum level of sophistication required to be effective in systems software has increased dramatically since the 1990s, when I first started doing systems software. The kinds of systems software we were putting in production back then would be considered a trivial toy today. This shift has placed an enormous amount of upward pressure on the minimum level of experience and skill that would allow someone to become productive within a useful amount of time.
It is no longer feasible in many cases to have companies effectively subsidize the development of highly skilled systems software people. The training time has become so long that companies will never recoup the investment. It is easy to see how the incentives have created the current situation and it is not clear how to address the shortage. Even before the current situation, it was widely noted that most systems software developers were self-taught over many years rather than trained up in a structured environment.
>And yes, this is creating a vicious feedback loop where we are no longer producing many new people with these skills despite a lot of demand.
If the company would rather burn out and age out all their talent than invest in the future, I don't know what to say. For niche domains, you can't treat talent as a pure business expense. That's how you lose all that talent to China and suddenly can't compete at all. If anything, training should be treated like insurance.
>It is no longer feasible in many cases to have companies effectively subsidize the development of highly skilled systems software people.
"Subsidize" implies that engineers are somehow saving money by choosing these more niche paths. It doesn't make sense. You have a product and you need talent to develop and maintain it to compete. There are no "subsidies" in a business relationship like this.
Your choices are few and simple: train in-house, lobby schools to train for you, or publicize your tech and hope people train themselves for you. It seems like these days companies aren't looking long term, though; they just want to jump the ship while it still has plenty of time to steer past the iceberg everyone sees ahead.
I agree, but it was always like that. Also, there isn't any incentive anymore in growing juniors if they're going to immediately take the next better offer that comes around once they know enough, and such offers were arriving on a daily basis (at least until a couple of years ago).
I would love to work on low-level, systems stuff (anything as close to the hardware as possible), that's even my education and area of expertise. BUT SaaS companies in reality:
- pay better
- have lower costs
- are way easier
- don't have that many (or any at all) geographical restrictions (e.g. importing hardware prototypes).
>there isn't any incentive anymore in growing juniors if they're going to immediately take the next better offer that comes around when they know enough
Okay. Promote them with proper pay then. This HR debacle of putting more budget into hiring than into retention is entirely self-inflicted.
>People follow the money.
Well, that love sounds more like a low-priority whimsy in that case. I don't think tech workers, of all people, are in a position to complain about low compensation. Unless you work in games, I suppose.
This doesn’t follow. Below some level of skill and experience they can contribute negative value to the project. Companies need a minimum level of experience just to make the role pay for itself.
Does the high skill standard for surgeons mean the market for surgeons is saturated?
If you never need to consider having someone who does not meet a certain skill standard perform surgery on you, then yes, it would seem the surgeon market is saturated in the sense the parent describes.
It depends on the time scale you are looking at it with. It looks saturated short-term, but it surely is not long term. The problem will only show up once your surgeons start retiring.
Instantaneous time is the only time scale that makes sense with respect to the topic at hand.
Sure, it is possible for the market to desaturate at some point as key people leave. But at that point, assuming suitable, per the earlier comment, replacements don't fill the void, the market will either come to accept lesser skilled people or it will come to accept fewer surgeries, returning to equilibrium. Not exactly magic.
...this doesn't make sense. Surely you need to factor in price point. Often times junior engineers deliver disproportionate value. Some ratio of juniors:seniors just seems rational, and those juniors grow into seniors.
Maybe there's a good argument against training, but it could also just be irrational and stubborn in this case.
In my experience it only works if you pull in juniors into strong teams and keep the proportion of juniors reasonably small. You also need to have a process in place for training them - it’s not enough to rely on ad-hoc mentoring from peers.
If you get the ratio wrong, with too many juniors and weak technical leadership then you will end up in a very bad place in your code base.
In terms of value, even if juniors are half the cost, it is much wiser to hire one senior instead of 2 juniors for the same money.
> it is much wiser to hire one senior instead of 2 juniors for the same money.
Maybe in terms of pure productivity, but if you can match hiring to roadmaps, you can give them more approachable / further-from-revenue work for which seniors would be overkill. Etc. I'm just saying anyone with a simple explanation is only telling you part of the story.
Besides, hiring is expensive. If you say live near a university, you have an edge in finding talent.
> give them more approachable / further-from-revenue work for which seniors would be overkill
My experience suggests there is no such thing. There is work that seniors might not want to do, but if you hired well then they will be professional enough to do it, if it really needs to be done. And it takes some experience to determine which “non-revenue generating” work (e.g. tech debt) actually needs to be done, to advocate doing it to the stakeholders and to actually do it well.
Juniors need a lot of supervision, and that is not free. Which is not a reason not to hire them in the first place, just that it should be done mindfully.
>which “non-revenue generating” work (e.g. tech debt) actually needs to be done
This mentality is part of the problem. You can't fundamentally treat juniors as a profit center. Especially in more niche fields. They need to be trained and schools don't cover your specific pipeline.
maybe it doesn't "need" to be done, but some refactoring work will pay off to turn your juniors today into seniors tomorrow. If you can only think in productivity charts, then you don't hire juniors.
I’m a lead consultant on a gig where the CTO outright told us that he’s not really interested in growing their (frankly, underqualified) engineers; he just wants them to get on with the job. It’s very short-sighted and sad.
That said, in my previous job at a startup we hired very junior engineers and gave them plenty of opportunity and support for growth, and several people did stunningly well. Pity the company didn’t do very well (IMHO due to focusing more on this sort of thing than on making money).
>That said, in my previous job at a startup we hired very junior engineers and gave them plenty of opportunity and support for growth, and several people did stunningly well. Pity the company didn’t do very well (IMHO due to focusing more on this sort of thing than on making money).
Mentoring is a key part of technical leadership, way easier to help the talent grow into your requirements (even if they are under-qualified, for now)
I agree, but there's also the extra difficulty of mentoring remotely (we are a remote-first company). I *really* like being remote-first and providing the choice of working on site or wherever you want. But it does come with some challenges.
This may be a hot take, but I don't think most juniors should be remote. You need some time face to face to understand the processes, and you miss a lot of passive knowledge when you're not navigating a workplace of people discussing matters.
once you get a few years under the belt, sure. You're probably fine doing everything online. I can't think of many remote only studios hiring juniors anyways.
Maybe I'm being too cynical, but I can't tell whether this is a promotional piece for something.
That said, I do agree with the premise of the article that it's hard to learn "the stack", especially with the advent of generative AI.
"Back in the day", when google spat out a link to something resembling your problem, you still had to deconstruct the answer given to apply it to your particular case. That required some sort of understanding, or a mental model of the domain. With ChatGPT I find myself copying/pasting error messages between two browser windows, or a terminal, not really understanding what is going on.
No learning is happening. Does not bode well for new folks coming in.
It's not a promotional piece for something; it's my personal experience as CEO and co-founder of a company using Xen as the core of our stack. I like to share my views transparently on how hard it is to do very technical stuff, not just for technical reasons, but due to the lack of people trained for it.
It is a promotional piece, for a Xen-based stack (the most popular hypervisor decades ago). The author laments that few people are interested in bare-metal hypervisors like Xen.
But hypervisors did not disappear; they just got replaced. When we run virtual machines, they are usually backed by KVM (the low level) and qemu (the higher layer). Sometimes there is libvirt on top of it too, but running qemu directly is not that hard.
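To give a flavor of what "running qemu directly" looks like, here's a minimal sketch. The image path and the sizing flags are illustrative assumptions; it presumes qemu-system-x86_64 is installed and /dev/kvm is available on the host.

```shell
# Boot a guest with qemu + KVM directly, no libvirt in between.
# Assumptions: qemu-system-x86_64 is installed, /dev/kvm exists,
# and disk.qcow2 is a guest image you created earlier (hypothetical path).
qemu-system-x86_64 \
  -enable-kvm \
  -m 2048 \
  -smp 2 \
  -drive file=disk.qcow2,if=virtio \
  -nic user,model=virtio-net-pci \
  -nographic
```

That's a whole VM with paravirtual disk and NAT networking in one command; libvirt mostly adds lifecycle management and XML on top.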
And there is plenty of exciting research about this stack: for example, KVM can be driven by things like firecracker and crosvm, and there are some Rust projects targeting it too. There is also BSD's bhyve.
My impression is that it's not that people find hypervisors in general boring, but just Xen specifically (or maybe all classic Type-1 ones? hard to tell, with Xen being the only open-source representative).
I respectfully disagree with much of your comment.
First, this wasn't intended as a promotional piece. It's a personal blog post where I share some of the challenges involved in building a full virtualization stack — a stack that happens to be fully open source. It's unfortunate that sharing real-world experience is sometimes immediately perceived as promotional.
Second, I think there's some confusion between using a hypervisor and mastering one — or building and maintaining an entire stack around it. KVM/QEMU is widely used, but it has significant issues, especially regarding security, performance, and latency consistency. Very few groups in the world are actively trying to tackle these challenges holistically (even major players like VMware have made some questionable shortcuts).
When it comes to low-latency, real-time use cases with a strong security model, Xen remains unique among open-source hypervisors. It's definitely not boring — in fact, it's one of the few that enable certain classes of critical applications at all.
We also work closely with academic research labs, and I can tell you: there’s still a lot of exciting work happening around Xen — even if it's less visible than buzz around newer projects like Firecracker or crosvm.
It made me smirk a little; I've been doing almost exactly this line of work at $dayjob for the past few weeks/months, trying to prove out a concept as a solution to a problem. I actually really enjoyed it; it has been a neat problem to solve.
Sadly, the program I was supporting just had all of its funding yanked, I expect to get laid off tomorrow.
> No learning is happening. Does not bode well for new folks coming in.
Recently got into a completely new language/framework and used Copilot to understand what was going on. I still made all the big decisions and wrote most of the code myself.
You can definitely use AI to avoid learning though (up to a point).
Sure, let me explain it a bit better. It's more that the "stack" is very deep now. Clearly, we have/hire Xen/hypervisor specialists, and we do not ask them to be CSS experts. However, the deeper in the stack (at lower levels) you go, the harder it is to find them, because of the lack of expertise coming out of universities and/or the limited appeal of such jobs.
And if you find or train those low-level/system-oriented people, they also need to understand how a feature they build will be exposed functionally to a user (and why it's needed in the first place), because things are not made in thin air but are required to work in a bigger picture (i.e. the product).
> If it's difficult to find kernel developers then wouldn't it help to not require them to also know web UX?
That means hiring two people, and in $current_year, companies expect one person to know everything. Sysadmin, backend programmer, frontend programmer, designer and a DBA used to be different people not that long ago, now they expect one person to do all that... + it seems they want kernel development experience now.
Sure, some C code, some html, a table here, a colspan there, and you can have a website made by a single person... if we want a website to look like it was made on an 1980s computer by a single person.
If you're not that person, it's fine. Some people still just use Notepad and write HTML like it's 1999. Others have both kernel experience and have picked up React at some point in the past ten years. Plus, LLMs write CSS these days, so no colspan needed.
At the rate LLMs are improving, that certainly seems like a possibility, but until they do, why would you need one for that? Kernel C makes sense. CSS is the problem here.
Or, you can have a decent website made by a single person. It's not that hard to learn basic HTML5, enough ARIA semantics to know not to use ARIA, a programming language with decent synchronisation primitives that supports CGI, an SQL dialect and the principles of relational database design, enough JavaScript to use MDN, enough CSS to use MDN, the basics of server maintenance, TLS certificate provision, and DNS.
If you want to do your own networking, or run email, that's a whole 'nother specialism; but just running a website is easy enough for one person to do.
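To make the "CGI-capable language" part concrete, here is a minimal sketch of a CGI script in Python. The weather-forecast page and the `city` query parameter are hypothetical, and a server such as Apache with mod_cgi is assumed to set `QUERY_STRING` and forward stdout to the client:

```python
#!/usr/bin/env python3
# Minimal CGI script sketch: the web server (e.g. Apache with mod_cgi)
# sets QUERY_STRING in the environment and sends our stdout to the client.
import os
from html import escape
from urllib.parse import parse_qs

def render(query: str) -> str:
    # Parse "city=Oslo&units=metric" into a dict of lists.
    params = parse_qs(query)
    city = escape(params.get("city", ["nowhere"])[0])
    body = (
        "<!DOCTYPE html><html><body>"
        f"<h1>Forecast for {city}</h1>"
        "</body></html>"
    )
    # A CGI response is just headers, a blank line, then the body.
    return f"Content-Type: text/html\r\n\r\n{body}"

if __name__ == "__main__":
    print(render(os.environ.get("QUERY_STRING", "")), end="")
```

That really is the whole protocol surface for a basic dynamic page; everything else on the list (SQL, TLS, DNS) sits around it rather than inside it.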
Good, that's the way it was until the splitting of roles for commodification. A programmer is more like the Renaissance man who makes it a goal to do everything from different disciplines than a drone who has been trained to do one thing and can only be trusted to do one thing.
It's not commodification, it's acknowledging that tech got exponentially more complex over the decades.
Just think of your favorite video game character in 2000 and then one in the 2020s, and consider how much tech is needed to render, animate, light, and conceptualize it. In 2000 this was all done by maybe one artist and one gamedev, probably making a character with some hundreds of polys at best. Now that artist has a pipeline of riggers, material artists, animators, and concept artists, while that single dev became a graphics programmer, gameplay programmer, tech artist, and build engineer.
My point was that they split the roles unnecessarily. An artist can cover concept and materials. A programmer can do gameplay, graphics, rendering and builds. In fact, having people who understand the entire project makes for a better project.
It's like moving from custom-built cars to the assembly line where someone's job is putting in one screw. I understand it's cheaper/faster because you can hire anyone unskilled for cheap, but cars were all supposed to be identical. Software should be unique (if not, just copy the last thing built), but I guess when it comes to major games, things are more of a factory throwing millions of pixels of characters at existing game engines while copying the gameplay of successful games. That's why games are shovelware these days, like a Netflix original.
But we now expect a single person to design the engine and the bodywork, both aesthetically and technically, actually make all those parts, assemble them together, paint the car and test it.
Jack of all trades, master of none. This is why we need clusters, "stacks" and "clouds" on the server side and gigabytes of RAM on the client side, plus many megabytes transferred, just to show one simple weather forecast website that gives the user the same amount of information as a WAP site did on a five-line mobile phone back in GPRS times.
This kind of talent is ... not rare at all? And pretty easy to find too, just wander around the hallways at a major academic systems conference or hang around the kernel mailing lists, you'll meet most of the people working at the cutting edge of these things and get connected to those working on critical systems components. And yes, most of them work for FAANG or are funded by them.
Seriously, book a plane ticket to ATC/OSDI, EuroSys, etc and talk to all the people there. The good ones are already hired by one of the big established players (FAANG, Red Hat, Intel, etc), which is why you need to offer competitive compensation to lure them away.
Because we cannot afford that; we aren't Google, and we're still a relatively small company compared to the task of building a full-stack virtualization solution. Luckily, we have other strong points helping a lot (remote first, no micro-management, a great culture promoting human values, and so on).
Great idea to go to Grenoble (France). There is a nice ecosystem there of companies, grad schools, and researchers; it's one of the birthplaces of IT in France.
If you are looking for the place to hire new engineers, or researchers to work on an advanced project, I would recommend it too.
Curious if there is a way to dip your toe into this kind of thing without dedicating years of learning. Maybe a home lab? I've always been interested in this because it seems hard/interesting, and enjoyed learning about it in school. Wondering how much time you'd have to invest to see if you like it. I assume there is a healthy amount of "this is hard/not fun" along the way before you find out.
The title holds for much more specific and thinner stacks, along with support, QA, and sales; even for simple web apps.
Onboarding is important
Training is important
Retaining is important
Maybe your system is 100% documented, conventional (looking at you Rails, Angular), debt free, tested, instrumented - but more than likely it's not.
But if you end up with staff who can't teach the system, including product teams that don't respect feature overload, value, or internal feature training, you're in trouble. If you prioritize new features over team development and training (i.e. a team that doesn't know the system), you're likely to get muddy with existing features, both technically and user-facing.
I agree. If it's already hard for simpler stacks, you can imagine how hard it is for more critical or complex ones. And it's even worse if you inherited some of it (i.e. collecting technical debt that's not yours, which is the case here, as some parts of the stack are the result of a fork).
Sometimes it's even a catch-22 situation, where the technical/generic knowledge is already hard to find, but you absolutely need it to train more junior people. Luckily we found such experienced people, but then you also need to use their expertise to actually fix stuff and not just mentor juniors. A very, very delicate balance to find, especially under market time pressure.
My boss is about to learn this lesson the hard and painful way. Retention has fallen to the point where I am the only person who groks the full stack. I'm making an attempt to document what I can, but they're going to be fucked when I leave. I would feel bad about it, but CEO has tried to fuck me over many times-- for no apparent reason.
>Partnering with Universities and Research Labs
Like where? In the US? Western Europe? You're clearly not searching hard enough, because I've never heard of your company once throughout my time at university.
The median software developer in France (similar for e.g. Germany) makes around 50k. Software development as a path to personal wealth does not exist outside the US.
The proposition by these companies for new hires is: "Put in a lot of work, longer hours and solve difficult problems, so that (after taxes) you can earn 2k more a year than someone with a far more relaxed job".
The only people you are going to find are the very few ones intrinsically motivated to forego their social life in favor of your job offer and new graduates who don't know any better and will burn out and resent you and possibly the entire industry for the rest of their lives.
I will not put in exceptional effort, unless you are offering me something exceptional in return. And no, the "opportunity" to work hard on some problem does not count.
Basically all of Europe has this problem, companies are looking for people who put in exceptional work for average salaries. The problem isn't that these people don't exist or that you couldn't find them. The problem is that everyone who fits your role knows what kind of effort it involves and knows that their time will never be adequately rewarded.
It's true, but only to some extent. When you are talking about a hypervisor (like Xen), and the many, many subtle things depending on your CPU brand/model, it's really, really *hard*, even with an LLM (and even more so with an LLM hallucinating some CPU features or forgetting basic things like Meltdown and Spectre).
However I agree: to learn a topic, LLMs provide a great speedup. As a CEO/co-founder, I have no issue hiring people without a degree if they are good at what they do. However, our biggest chances are to scout directly in universities to find motivated students (motivation >>> everything else).
The value of knowledge is plummeting. It can, will, and should be subsumed by LLMs. Consider London’s black cabs. Drivers pass a gruelling exam to prove they know “everything” about one part of London. But, Google Maps puts that knowledge and more on every rideshare driver’s dashboard, for free, no advance study needed.
Map knowledge doesn’t make you able to drive. Driving happens in real time - steering, obeying road rules, avoiding accidents.
In engineering, the real time part is when you use mental models you’ve painstakingly developed - how the hardware works so you can debug it, what syntax is valid in your programming languages, what APIs exist in your stack. But, those depend on knowledge - so mental models can be self-taught with AI as tutor in your own time. The LLMs supply the knowledge, you integrate it via study - or a side project.
An organization cannot clone itself a team, it’s true. They need engineers now, not in 6-12 months. But - a motivated individual can use AI to make themselves employable as a full stack engineer faster now than at any point in history.
One of the big issues with LLMs for e.g. systems software is that there are broad domains where training data effectively does not exist. Consequently, you can’t learn much from them of value. It is the blind leading the blind.
The lack of training material for LLMs is of course a lack of training material for people too. Some areas of software have a long history of relying almost entirely on an oral tradition to pass down knowledge. This has some advantages but it doesn’t scale and it makes it basically “dark knowledge” for LLMs or people without access to those that know it. If you want to get into an area like this, you often need to find a way to spend a lot of time with people that already know it.
>This has some advantages but it doesn’t scale and it makes it basically “dark knowledge” for LLMs or people without access to those that know it.
We call this "tribal knowledge" in games. I despise it. It's done for a few reasons:
- NDAs make public knowledge a landmine. And every game studio makes you sign NDAs. Even at the interview stage.
- Churn. No one gets time to really develop expertise as they work on a project for 2-3 years and then layoffs come. Only a relative few become experts, and probably not because of the studio itself
- Lack of incentives. Games aren't very connected with academia to begin with, despite relying so much on cutting-edge tech. So the best avenue for sharing such techniques is shafted. This is slowly getting better as more tech conferences talk about games tech, but it's a pretty slow trickle unless you come from one of the largest studios and specifically come in for R&D.
>If you want to get into an area like this, you often need to find a way to spend a lot of time with people that already know it.
All too true. Open source development is one bastion for this, but that's overall why I keep trying to stay in this domain. You literally can't get the knowledge elsewhere. And it's knowledge that directly leads to better looking, more optimal, and less buggy games overall.
Okay. So what do we do in the meantime, while an LLM cannot in fact help you much with kernel programming? People love talking about the future, but people also have current business needs.
> a motivated individual can use AI to make themselves employable as a full stack engineer faster now than at any point in history.
I'd love to hear one case of that happening. I'll even take someone ramping up from zero to freelance work as long as they have a good portfolio.
Sounds like a good plan. University recruitment is how you invest in society's future. I don't know how large this company is, so it's hard to determine if they can afford a training program.
My first job out of university was working on an embedded Java runtime, with a group of peers from said university. Which sounds similar to what Virtualise.sh is doing.
Now, 20 years later? Tech lead on a Typescript/Flutter/AWS internal system.
In some ways I'd be excited to write container orchestration logic, or hack on a hypervisor. There's precious little of that work available, especially so in a small country like mine (NZ). My CV is crazy enough: being burnt isn't the issue, just finding the work is.
The cynical part of me wonders if the other reason to hire people straight out of university is that they're cheap...
(throwaway account, my CV is bad enough already)
Or the dumb leetcode stuff because "we are going to be cool like Google over here".
So many gravitate to companies with more sensible hiring practices, and that is how a company loses candidates, unless it is one of those unicorns that the whole planet wants to work at.
So many times this. It doesn't even have to be five years to disqualify so many applicants these days. If you aren't working on it currently, there's already so much bias against you in the hiring process at large (e.g. the "I don't consider anyone that isn't currently employed" mantra that pops back up now and then).
Now I have a generic web dev job working on a generic nodejs app. There are many more jobs available and in general it's much easier to have a remote web dev job.
I've also done a couple of sessions of "peek at a problem in production and fix it / tell people how to fix it" based on reputation, which is networking.
Oh, and one more thing: never ever mention any experience with mail handling, or you'll get roped into doing it again. People remember, even if you only said it once. :P
Demonstrating that you are a stable hire who will fit well with the team is almost all that matters. You wouldn't be having an interview to begin with if you didn't technically qualify-ish. Even when speaking to technical interviewers, you will intimidate them. Actually selling yourself comes after your first 90 days of acclimation.
Let's rephrase that:
I'm at a company with top of the line talent in many different domains and I prefer the compensation of this company over networking to find someone who knows the hiring manager of a team like this.
You have the easy way into this. For people without this network, the best advice is to make them come to you. You've got to share that knowledge out there and earn regard in various communities online if you can't stand out in a batch of resumes (and no one these days can, for reasons entirely the fault of HR and ATS). Do that and your newfound network will connect you.
People like that are in demand. Our company just filled a few positions like this, and hiring was hard. Some people were rejected for not having enough low-level knowledge, some for not having enough high-level knowledge.
Agree with TFA's author though, people with all the skills are super rare. We were lucky to end up with a few hires like that, but most people don't have all the skills at once, and they have to learn on the job.
Even past that step, it's a bit random. I fully admit I ignore the resume when doing interviews, as it would be a bias for the interview role I get, which is typically working through some coding problems.
Make sure you are demonstrating it to the people on your team so that when they leave and go somewhere else, they can hit you up. This takes some time.
And, sorry, you have to get out and hit some gatherings in person (hackerspaces, meetups, professional meetings, etc.)--online-only isn't going to cut it anymore. With the AI garbage clog of inline interactions, your professional network is back to who you know in-person.
Tl;dr: raise your price and they'll believe you more easily.
If that has worked for you, that's amazing! But it seems really counter-intuitive to me.
It's misguided of course, but that's what they think they want and if you say "I've done all sorts of things and I'm good at all of them" they'll hear "I don't have much experience with anything" and discount you.
So it's better to pretend to be a specialist.
- Resumes which claim generalist tend to be more SEO'ed than real, or AI-generated, and they will claim experience with dozens of technologies, most of which would be untrue: throw everything and see what sticks.
In most cases it just tells me what they have heard of rather than what they know, especially if they are not very senior.
- It takes really hard effort to be more than a superficial generalist. Even when they know some of those skills and remember them during an interview, it is not with a lot of depth; they pursued them only out of passing interest, not in a professional capacity. (That is fine, but it is a lot of work to make that distinction when evaluating all relevant skills in a timed interview.)
- It is harder, or simply not viable, for a specialist to interview a generalist, so you need to have a few to hire more. If you don't have any, or they are not doing interviews, you are not going to get more.
- Being a generalist with depth of understanding means you are the type of person who needs to understand things properly before doing them. You spend a lot of time learning things which are not required to get the task completed.
That means you either need to be prepared to spend a lot of personal time and be perpetually stressed, or be slow in completing things.
It is a hard thing to master to let it go. I don't think I have learned it yet.
- Complexity and depth of technologies change a lot in a short duration when they are corporate-backed.
I learned Linux architecture, vim, and git 20 years ago; they haven't changed much. You can be productive in any of these stacks very quickly even if you haven't used them for years.
The last time I worked on Android or Swift was more than a few years ago. I doubt I could build a serious app without spending major time. To be productive, the learning curve is steep and the prior knowledge is limited in usefulness.
It's just how my career swung me. I would have loved to develop as a subject matter expert, but every time I got into the swing of things, layoffs came around or the studio shut down. Now I'm freelancing, and that by nature requires a generalist approach.
No one's really investing in specialists, so I have no idea how the millennials/Gen Z of the world will ever get to properly specialize. Specialization requires time to master something, and that time implies the stability to do that thing.
> To be productive, the learning curve is steep and the prior knowledge is limited in usefulness.
Business-wise, sure. It's a shame all business sees as "productive" is how many widgets you churn out in that time. It's no surprise such companies want to force AI into it without quality considerations.
Changes also happen, but they are kind of glacial compared with areas where we are hunting the new shiny every year.
Usually I tend to be a laggard on the adoption graph; the large majority of stuff hardly makes the curve, and I have better things to do with my time.
Plus, that vintage stuff that isn't cool to write blog posts about usually pays well enough.
Stuff like Android or iDevices suffers from yearly fashion: platform owners feel compelled to reboot the development experience every year, forcing app developers to keep up, and also as a means to sell new devices.
Ha, I feel seen. Though also I wouldn't frame this quite so negatively. I've seen a lot of tasks "completed" by people who just got the job done without fully understanding it and they frequently get it done badly.
That's something hiring managers do find compelling.
Most HR hiring managers and headhunters alike have a really hard time seeing the point of generalists.
They want the easy way out, what to focus on.
As I usually say, I tend to do whatever; such is the life of agencies. However, when applying for positions, sell the skills relevant for the position.
Not e.g. "I can do .NET and Java", rather "I can do .NET, proud of X, Y and Z projects in .NET, and by the way, I may do Java as well if needed, there was this project...", something like this.
It got worse in the last two years. Now Senior Engineers must have the exact combo of the weird tech stack and tools, with N years of experience, exactly as the existing employees, else GTFO; you don't even get screening calls. You don't have something from the nice-to-have list? Lol, why are you wasting our recruiter's time? She needs to use GenAI to write her next rejection email.
Also, your 15yoe does not matter, unless you are coming from direct competitor, in which case your 1.5yoe with internship is also excellent.
The minimum level of sophistication required to be effective in systems software has increased dramatically since the 1990s, when I first started doing systems software. The kinds of systems software we were putting in production in back then would be considered a trivial toy today. This shift has placed an enormous amount of upward pressure on the minimum level of experience and skill that would allow someone to become productive within a useful amount of time.
It is no longer feasible in many cases to have companies effectively subsidize the development of highly skilled systems software people. The training time has become so long that companies will never recoup the investment. It is easy to see how the incentives have created the current situation and it is not clear how to address the shortage. Even before the current situation, it was widely noted that most systems software developers were self-taught over many years rather than trained up in a structured environment.
If the company would rather burn out and age out all their talent than invest in the future, I don't know what to say. For niche domains, you can't treat talent as a pure business expense. That's how you lose all that talent to China and suddenly can't compete at all. If anything, training should be treated like insurance.
>It is no longer feasible in many cases to have companies effectively subsidize the development of highly skilled systems software people.
"subsidize" implies that somehow engineers are saving money by choosing to take these more niche paths. It doesn't make sense. You have a product and you need talent to develop and maintain it to compete. There's no "subsidies" in a business relationship like this.
Your choices are few and simple: train in-house, lobby schools to train for you, or publicize your tech and hope people train themselves for you. It seems like these days companies aren't looking long term, though, and just want to jump off a ship that has plenty of time to steer past the iceberg everyone sees ahead.
I would love to work on low-level, systems stuff (anything as close to the hardware as possible), that's even my education and area of expertise. BUT SaaS companies in reality:
- pay better
- have lower costs
- are way easier
- don't have that many (or any at all) geographical restrictions (e.g. importing hardware prototypes).
People follow the money.
Okay. Promote them with proper pay then. This HR debacle of putting more budget into hiring over retaining is entirely self-inflicted.
> People follow the money.
Well that love sounds more like a low priority whimsy in that case. I don't think tech workers of all people are ones who ever complain about low compensation. Unless you work in games, I suppose.
Does the high skill standard for surgeons mean the market for surgeons is saturated?
Sure, it is possible for the market to desaturate at some point as key people leave. But at that point, assuming suitable replacements (per the earlier comment) don't fill the void, the market will either come to accept lesser-skilled people or come to accept fewer surgeries, returning to equilibrium. Not exactly magic.
Maybe there's a good argument against training, but it could also just be irrational and stubborn in this case.
If you get the ratio wrong, with too many juniors and weak technical leadership then you will end up in a very bad place in your code base.
In terms of value, even if juniors are half the cost, it is much wiser to hire one senior instead of 2 juniors for the same money.
Maybe in terms of pure productivity, but if you can match hiring to roadmaps, you can give them more approachable work further from revenue, for which seniors would be overkill. Etc. I'm just saying anyone with a simple explanation is only telling you part of the story.
Besides, hiring is expensive. If you, say, live near a university, you have an edge in finding talent.
My experience suggests there is no such thing. There is work that seniors might not want to do, but if you hired well then they will be professional enough to do it, if it really needs to be done. And it takes some experience to determine which “non-revenue generating” work (e.g. tech debt) actually needs to be done, to advocate doing it to the stakeholders and to actually do it well.
Juniors need a lot of supervision, and that is not free. Which is not a reason not to hire them in the first place, just that it should be done mindfully.
This mentality is part of the problem. You can't fundamentally treat juniors as a profit center. Especially in more niche fields. They need to be trained and schools don't cover your specific pipeline.
maybe it doesn't "need" to be done, but some refactoring work will pay off to turn your juniors today into seniors tomorrow. If you can only think in productivity charts, then you don't hire juniors.
That said, in my previous job at a startup we hired very junior engineers and gave them plenty of opportunity and support for growth, and several people did stunningly well. Pity the company didn't do very well (IMHO due to focusing more on this sort of thing than on making money).
The truth is bound to be somewhere in the middle.
Mentoring is a key part of technical leadership, way easier to help the talent grow into your requirements (even if they are under-qualified, for now)
once you get a few years under the belt, sure. You're probably fine doing everything online. I can't think of many remote only studios hiring juniors anyways.
That said, I do agree with the premise of the article that it's hard to learn "the stack", especially with the advent of generative AI.
"Back in the day", when google spat out a link to something resembling your problem, you still had to deconstruct the answer given to apply it to your particular case. That required some sort of understanding, or a mental model of the domain. With ChatGPT I find myself copying/pasting error messages between two browser windows, or a terminal, not really understanding what is going on.
No learning is happening. Does not bode well for new folks coming in.
It's not a promotional piece for something; it's my personal experience as CEO and co-founder of a company using Xen as the core of our stack. I like to share my views in a transparent fashion on how hard it is to do very technical stuff, not just for technical reasons, but due to the lack of people trained for it.
But hypervisors did not disappear, they just got replaced. When we run virtual machines, they are usually backed by KVM (low-level) and qemu (higher layer). Sometimes there is libvirt on top of it too, but running qemu directly is not that hard.
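To illustrate how approachable driving qemu/KVM directly is, here is a small Python sketch. The disk image path, memory size, and virtio devices are assumptions for the example, not a recommended production setup:

```python
#!/usr/bin/env python3
# Sketch: booting a VM by invoking qemu/KVM directly, no libvirt on top.
import shutil
import subprocess

def qemu_argv(image: str, mem_mb: int = 2048, cpus: int = 2) -> list[str]:
    """Assemble a plain qemu command line for a qcow2 disk image."""
    return [
        "qemu-system-x86_64",
        "-enable-kvm",                        # accelerate via the KVM module
        "-m", str(mem_mb),
        "-smp", str(cpus),
        "-drive", f"file={image},format=qcow2,if=virtio",
        "-netdev", "user,id=n0",              # user-mode NAT networking
        "-device", "virtio-net-pci,netdev=n0",
        "-nographic",                         # serial console on stdio
    ]

if __name__ == "__main__":
    argv = qemu_argv("disk.qcow2")
    if shutil.which(argv[0]):
        subprocess.run(argv)
    else:
        print("qemu not installed; command would be:", " ".join(argv))
```

That single command line is the whole "higher layer" for a basic VM; libvirt and friends mostly add lifecycle management and XML on top of it.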
And there is plenty of exciting research about this stack, for example KVM can be driven by things like firecracker and crosvm, and there are some rust projects targeting it too. There is also BSD's bhyve which
My impression is that it's not that people find hypervisors in general are boring, but just Xen specifically (or maybe all classic Type-1 ones? hard to tell with Xen being the only open-source representative).
I respectfully disagree with much of your comment.
First, this wasn't intended as a promotional piece. It's a personal blog post where I share some of the challenges involved in building a full virtualization stack — a stack that happens to be fully open source. It's unfortunate that sharing real-world experience is sometimes immediately perceived as promotional.
Second, I think there's some confusion between using a hypervisor and mastering one — or building and maintaining an entire stack around it. KVM/QEMU is widely used, but it has significant issues, especially regarding security, performance, and latency consistency. Very few groups in the world are actively trying to tackle these challenges holistically (even major players like VMware have made some questionable shortcuts).
When it comes to low-latency, real-time use cases with a strong security model, Xen remains unique among open-source hypervisors. It's definitely not boring — in fact, it's one of the few that enable certain classes of critical applications at all.
We also work closely with academic research labs, and I can tell you: there’s still a lot of exciting work happening around Xen — even if it's less visible than buzz around newer projects like Firecracker or crosvm.
Sadly, the program I was supporting just had all of its funding yanked, I expect to get laid off tomorrow.
Sometimes, probably most of the time, it is better to work through it and understand an issue than to blindly copy pasta.
Recently got into a completely new language/framework and used Copilot to understand what was going on. I still made all the big decisions and wrote most of the code myself.
You can definitely use AI to avoid learning though (up to a point).
It goes on to say that it's hard to find and develop expertise for low level software like hypervisors.
What's the connection between the topics? It feels like two different rants.
If it's difficult to find kernel developers then wouldn't it help to not require them to also know web UX?
> That means hiring two people, and in $current_year, companies expect one person to know everything.
A single person can in fact write a program for a computer.
People who came after you would write it in VB6; people who came later would use bootstrap.js or Material icons.
If you want to do your own networking, or run email, that's a whole 'nother specialism; but just running a website is easy enough for one person to do.
just think of your favorite video game character in 2000 and then one in the 2020's and consider how much tech is needed to render, animate, light, and conceptualize it. in 2000 this was all done by maybe one artist and one gamedev, probably making a character with some hundreds of polys at best. now that artist has a pipeline of riggers, material artists, animators, and concept artists, while that single dev became a graphics programmer, gameplay programmer, tech artist, and build engineer.
It's like moving from custom-built cars to the assembly line, where someone's job is putting in one screw. I understand it's cheaper/faster because you can hire anyone unskilled for cheap, but cars were all supposed to be identical. Software should be unique (if not, just copy the last thing built), but I guess when it comes to major games it's more of a factory throwing millions of pixels of characters at existing game engines while copying the gameplay of successful games. That's why games are shovelware these days, like a Netflix original.
Jack of all trades, master of none. This is why we need clusters, "stacks", and "clouds" on the server side and gigabytes of RAM on the client side, plus many megabytes transferred, just to show one simple weather forecast website that gives the user the same amount of information a WAP site did on a five-line mobile phone back in GPRS times.
Seriously, book a plane ticket to ATC/OSDI, EuroSys, etc and talk to all the people there. The good ones are already hired by one of the big established players (FAANG, Red Hat, Intel, etc), which is why you need to offer competitive compensation to lure them away.
I don't see "pay like Google" listed.
If you are looking for a place to hire new engineers, or researchers to work on an advanced project, I would recommend it too.
Onboarding is important. Training is important. Retaining is important.
Maybe your system is 100% documented, conventional (looking at you Rails, Angular), debt free, tested, instrumented - but more than likely it's not.
But if you end up with staff who can't teach the system (including product teams that don't respect feature overload, value, or internal feature training), and you prioritize new features over team development and training (i.e., a team that doesn't know the system), you're likely to end up with muddied existing features, both technically and user-facing.
Sometimes it's even a catch-22: the technical/generic knowledge is already hard to find, but you absolutely need it to train more junior people. Luckily we found such experienced people, but then you also need to use their expertise to actually fix stuff and not just mentor juniors. A very, very delicate balance to strike, especially under time pressure from the market.
I had to chuckle.
The proposition by these companies for new hires is: "Put in a lot of work, longer hours and solve difficult problems, so that (after taxes) you can earn 2k more a year than someone with a far more relaxed job".
The only people you are going to find are the very few ones intrinsically motivated to forego their social life in favor of your job offer and new graduates who don't know any better and will burn out and resent you and possibly the entire industry for the rest of their lives.
I will not put in exceptional effort, unless you are offering me something exceptional in return. And no, the "opportunity" to work hard on some problem does not count.
Basically all of Europe has this problem, companies are looking for people who put in exceptional work for average salaries. The problem isn't that these people don't exist or that you couldn't find them. The problem is that everyone who fits your role knows what kind of effort it involves and knows that their time will never be adequately rewarded.
LLMs are quite good at explaining systems and frameworks. I would never have gotten into kernel programming without DeepSeek's guidance.
As for universities: too expensive, too much paperwork, too slow, too elitist.
However, I agree: LLMs provide a great speedup when learning a topic. As a CEO/co-founder, I have no issue hiring people without a degree if they are good at what they do. Still, our best odds are scouting directly in universities for motivated students (motivation >>> everything else).
Map knowledge doesn’t make you able to drive. Driving happens in real time - steering, obeying road rules, avoiding accidents.
In engineering, the real-time part is when you use mental models you've painstakingly developed: how the hardware works so you can debug it, what syntax is valid in your programming languages, what APIs exist in your stack. But those depend on knowledge, so mental models can be self-taught with AI as tutor in your own time. The LLMs supply the knowledge; you integrate it via study, or a side project.
An organization cannot clone itself a team, it’s true. They need engineers now, not in 6-12 months. But - a motivated individual can use AI to make themselves employable as a full stack engineer faster now than at any point in history.
How cool is that!
The lack of training material for LLMs is of course a lack of training material for people too. Some areas of software have a long history of relying almost entirely on an oral tradition to pass down knowledge. This has some advantages but it doesn’t scale and it makes it basically “dark knowledge” for LLMs or people without access to those that know it. If you want to get into an area like this, you often need to find a way to spend a lot of time with people that already know it.
We call this "tribal knowledge" in games. I despise it. It's done for a few reasons:
- NDAs make public knowledge a landmine. And every game studio makes you sign NDAs. Even at the interview stage.
- Churn. No one gets time to really develop expertise as they work on a project for 2-3 years and then layoffs come. Only a relative few become experts, and probably not because of the studio itself
- Lack of incentives. Games aren't very connected with academia to begin with, despite relying so much on cutting-edge tech, so the best venues for sharing such techniques get shafted. This is slowly getting better as more tech conferences cover games tech, but it's a pretty slow trickle unless you come from one of the largest studios and specifically come in for R&D.
>If you want to get into an area like this, you often need to find a way to spend a lot of time with people that already know it.
All too true. Open source development is one bastion for this, but that's overall why I keep trying to stay in this domain. You literally can't get the knowledge elsewhere. And it's knowledge that directly leads to better looking, more optimal, and less buggy games overall.
Okay. So what do you do in the meantime, while an LLM in fact cannot help you much with kernel programming? People love talking about the future, but people also have current business needs.
> a motivated individual can use AI to make themselves employable as a full stack engineer faster now than at any point in history.
I'd love to hear one case of that happening. I'll even take someone ramping up from zero to freelance work as long as they have a good portfolio.
Seems they are plucking self-starters directly from universities. You could either use that funnel or build your own with the same idea.