Can anyone convince me this is truly an effect of AI, and not just pullback from the mass hiring during and following ZIRP in 2021-23ish? I understand either cause makes it a bad time to be a junior, but would like to hear the argument _against_ "AI is taking our jobs".
(I suspect some very junior jobs have genuinely been taken by AI, but it seems to me that the driving factor is still a return-to-mean after ZIRP).
If it were 2018, I personally would have made 3 SWE hires in the last 12 months. The reason I didn't need to is because of LLMs. Not budget, not anything else. I don't think AI is so much to blame for layoffs, but I do think it is a huge component of the slow hiring. There's just less demand for coders.
I was the first dev at my current company to experiment with Claude Code back when it first came out. Some of my coworkers tried it, and some didn’t like it at all.
But now literally all of us are using it. The company gives us a $100 monthly stipend for it. We’re a small dev team, but our CEO is thrilled. He keeps bragging about how customers are gobsmacked by how quickly we’re delivering new features and he’s been asked if we’ve made a ton of hires in the last year. We’re actually down two developers from when I started.
I don’t love the code it writes, but the speed is unbeatable. We still need devs, and I don’t think that’s ever going to change. But we don’t need as many devs. We’re shipping like crazy with a small team. I don’t think more people would speed us up much at all.
This is what I'm seeing in the design market. With Figma Make, you can write a prompt, tell it to use your design library, generate a flow, and then hand it off to developers and say "Hey, look at this, can you implement this?". Alternatively, you can use Cursor/Claude Code/Codex to pull in Figma design system elements via MCP, and generate flows that way. You can push features so much faster with the same or fewer people, and let's be honest, pushing more features in less time is the #1 metric at a lot of companies even if they claim otherwise.
> but the speed is unbeatable
... for "now". Wait until the debt kicks in.
> But we don’t need as many devs. We’re shipping like crazy with a small team.
... for "now". Wait until the debt kicks in.
I own a small agency and it’s the same for me, but that doesn’t explain the urgency nor the layoffs in the big companies. Yes we are not hiring juniors, but we still hire seniors because they are now more productive. That’s not what we are seeing in big companies.
To me it’s the cumulative effect of many things happening coincidentally:
- Massive offshoring
- Money becoming scarce after the ZIRP era and in the recession except for AI investments
- Near monopoly positions that allow layoffs as a strategy to push stock price, without penalty for the decline in quality (enshittification)
- Cherry on the top, LLMs being able to do the work of juniors when used by seniors
If it were only about AI productivity, we wouldn’t see this urgency.
It's almost worse than this, in that IME LLMs also made junior devs worse. They want to use LLMs, and why not, it's the tech du jour, but then they lack the judgment to produce good LLM code.
So as LLMs are getting better, junior devs are getting worse.
It feels to me like everyone is holding their breath to see how the wholesale "AI can replace people" notion pans out. Whether it proves true or not, betting on the wrong result will hit hard so few want to go all in (outside of the companies that produce the tech itself). If there's anything "AI" has been able to ship at scale, it's uncertainty.
AI is absolutely already replacing people. No idea if it will replace everyone, but that doesn't really matter, does it?
My take is that AI adoption is a gear shift to a higher level abstraction, not a replacement. So we will have a lull, then a return to hiring for situations just like this. Maybe there is a lot more runway for AI to take jerbs, but I think it will hit an equilibrium of "creating different jobs" at some point.
Is it that LLMs are making your devs that much more efficient?
From my experience, devs are much faster with LLMs if you measure speed in merge requests.
However, in 2025 they are producing 4x more bugs compared to 2024. The same developers, the same codebase, but now with Cursor.
Biggest argument against AI meaningfully replacing software engineers is software not getting better/faster/cheaper. You'd think if it managed to cut down the cost of producing it we'd see more software, or better/faster software, or it would be cheaper. Same in open-source.
Yet good software is still as scarce as ever, and if anything it's getting worse. So it seems like the effects of AI on software development are either too minimal, or are just short-term effects that translate to technical debt down the line which ends up erasing all the gains.
I don't think any of us has enough data to judge this. The vast majority of code exists strictly behind closed doors in companies' various internal systems.
And for the commercial and open source code you do see, how do you know if it's being produced more quickly or not?
And finally, even if LLMs speed up coding by 10% or 50% or whatever... writing code is only a fraction of the job.
Why would you think this? Most software doesn’t have competition. Even if Microsoft had all of OpenAI’s secret sauce, why would they improve their products? Who are you gonna switch to?
It's not exactly a science, it's the aggregate trend of many distinct players allocating capital where they think it will be most productive. AI isn't "taking jobs", but it might be taking the capital that would otherwise go towards sustaining and growing headcount.
No. ZIRP hangover all the way.
AI is just a plausible scapegoat that sounds hip.
Personally I think it’s “None of the Above”. Frankly, I own several projects that I really wish AI would do better at maintaining or enhancing (please don’t reply with “you’re holding it wrong” messages).
At least with my org and a lot of my friends’ companies, they’ve just kind of stopped building (non-AI) features? Not completely, but like a 70% reduction. And that’s apparently fine with the market, since AI promises infinite productivity in the future. As an example, Amazon had a massive outage recently, like a world-scale, front-page-news outage. Then they followed it up with a massive layoff. In 2018 logic, the markets probably would have been like “this company is fucked, people are going to start moving out of AWS if they can’t promise reliability”. In 2025 logic, it barely even registers. I guess the assumption is that even with fewer employees, AWS can be even more stable in the future because of better AI. Even companies who _should_ be more concerned with reliability aren’t, because _they’re too concerned about their next big AI move_.
I guess in the end I still think it’s about AI but more about how companies are reacting to AI rather than replacing too many jobs.
AI is increasing the capacity of existing employees, and I think we're all still discovering, every day, better ways to leverage it even more. So when the question comes up of hiring a new teammate, the value they'd bring to the table needs to be convincingly more than what I can expect to achieve alone with AI.
I think the glut is ZIRP, but the lack of recovery (or very slow pickup) is AI.
I think smart use of free LLM chat tools has increased my productivity by maybe 50%.
Nothing fancy. No Claude Code, Codex, Cursor, etc. Just focused Q&A with mostly free Gemini models.
I've been writing software for 25 years, though.
As a counter anecdote, I've yet to try a model where I break even. The output is so untrustworthy that I need to spend as much time coaching and fixing as if I'd just written it myself. When it does produce a working result faster, I still end up with less confidence in it.
What I'm working on doesn't have much boilerplate, though, and I've only been programming for 18 years. Maybe I need to work for another 7 before it starts working for me.
Its effectiveness is going to depend on the domain and tech stack used.
You also need to know what chunks of AI output to keep and what to throw away.
For example, Gemini 'Fast' quickly identified a problem for me the other day, based on the stacktrace. But its proposed solution was crappy. So, I was happy with the quick diagnosis, but I fixed the code manually.
My rule on this is that you can only judge your coworker’s productivity, never your own. People are horrible at judging their own productivity, and AI makes it really easy to just dump a bunch of work on someone else.
The report especially highlights data roles as being hit the hardest.
Indeed, during the pandemic the data field sort of matured. The modern data stack emerged. No one knew what they were doing, and the field was extremely over-hired.
In addition, I highly doubt LLMs have a significant effect on data teams as their main jobs aren’t writing code.
LLMs are still very unreliable at converting natural language to SQL. I believe the best benchmark result was only around 70%. Who in their right mind can trust that level of accuracy?
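For what it's worth, this is why teams that do use text-to-SQL tend to wrap the model output in guardrails rather than trusting it blindly. A minimal sketch in Python, assuming a local SQLite copy of the data; `generated_sql` is just a placeholder for whatever the model returned. Note that this only rejects queries that don't parse or reference missing tables — the harder failure mode (plausible-looking but semantically wrong SQL) still needs a human.

    # Minimal sketch: never run LLM-generated SQL blind; dry-run it first.
    import sqlite3

    def is_plausible_sql(conn: sqlite3.Connection, sql: str) -> bool:
        """Check that the query parses against the real schema and is read-only."""
        if not sql.lstrip().lower().startswith("select"):
            return False  # refuse anything that could mutate data
        try:
            # EXPLAIN compiles the statement against the schema without executing it
            conn.execute(f"EXPLAIN {sql}")
            return True
        except sqlite3.Error:
            return False

    conn = sqlite3.connect("warehouse.db")  # assumed local SQLite copy of the data
    generated_sql = "SELECT region, SUM(revenue) FROM sales GROUP BY region"  # placeholder model output
    if is_plausible_sql(conn, generated_sql):
        rows = conn.execute(generated_sql).fetchall()
    else:
        print("Model output rejected; falling back to a human-written query.")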
What is the modern data stack that you speak of?
AI has an effect, though. Due to AI spending there is a smaller budget for salaries.
As a result we have budget cuts, layoffs, and movement to low-cost countries like India.
Slightly tongue in cheek, but maybe there's just a divide in the market? I've been seeing more jobs advertised as "for people excited about using AI at work" or something to that effect.
Maybe the market is bifurcating along the lines of people who hate AI chat bots and people who love them?
A more interesting question: is this due to oversaturation of STEM graduates? For a decade we've been pushing people relentlessly to study STEM; have we just flooded the market with too many people?
It’s mostly ZIRP hangover. The over hiring in those years was ludicrous.
AI is having some effect at the margins but it’s mostly an excuse.
Companies always prefer to avoid saying they are just laying people off. It can be a negative market signal to investors, which is paradoxical, since it might indicate lower growth expectations. It also creates possible exposure to lawsuits depending on the circumstances and state.
The nice thing about AI as an excuse is you can say to your investors and board “we are shedding cost but still growing and in fact our productivity is up because we are leveraging AI!”
No, tech is dead. In 2015 I got endless interviews with a barely functional Chrome extension. Today you're expected to know everything.
Particularly spelling and grammar.
I wish I could hang out in the server closet, going long periods of time without coworker interaction, but you can't get far like that.
Communication skills matter.
Wait until everyone has to make grammar mistakes just to prove they aren't an AI bot.
Will we start using AI bots that add grammar mistakes so that we don't sound like AI bots?
Make sure to use a good prompt.
We are generally rather negatively judgy towards some mistakes (the ones that indicate you're an idiot) versus positive towards others (some mistakes show taste or talent).
I've actually seen a pretty notable uptick in recruiter outreach over the past month or two, but it's still way way below the Good Years.
AI plays *some* role, but the real culprit consists of two things:
- ZIRP policy ended as countries started using currencies other than the dollar for international trade. Now the Fed and private banks can't print money and distribute zero-interest loans. This stopped the 'free cash' that used to go into Silicon Valley. A lot of money has either moved to interest vehicles or foreign investments/currencies. So Silicon Valley doesn't have cash to burn as it did before. As a result, they are having to back the valuations/stock prices with their profitability instead of the infinite cash that kept coming into the market doing it for them. Hence, the layoffs and focus on profitability.
- Trump-era tax law (the 2017 TCJA's change to Section 174) stopped letting tech companies immediately deduct software engineering work as a research expense; it now has to be amortized over several years. And Biden did not reverse it (surprise). It went into effect about 2-3 years ago and made headcount more costly.
The second thing you mention (capex/opex) was reverted this year.
Original source instead of Business Insider blogspam - https://www.hiringlab.org/2025/11/20/indeed-2026-us-jobs-hir...
I made a chart of the number of jobs posted to HN "Who is hiring?" posts (using data from hnhiring.com): https://bsky.app/profile/lemonodor.bsky.social/post/3m4t2b66...
It correlates pretty well with the line showing technology jobs over time in the article at hiringlab.org: https://www.hiringlab.org/wp-content/uploads/2025/11/sector_...
The main difference seems to be that the number of jobs posted to HN has dropped significantly lower, well below the low point of 2020. It's really pretty dismal; the decline seems to have started around the middle of 2022. Maybe the types of jobs that get posted to HN are doing even worse than "technology" in general.
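For anyone who wants to reproduce that kind of chart themselves: a rough sketch against the official HN Firebase API. The thread ID below is a placeholder, and the top-level comment count is only an approximation of job posts (dead/flagged posts aren't filtered out).

    # Rough sketch: count top-level comments (~ job posts) in an
    # "Ask HN: Who is hiring?" thread via the official HN Firebase API.
    import json
    from urllib.request import urlopen

    HN_ITEM = "https://hacker-news.firebaseio.com/v0/item/{}.json"

    def count_top_level_posts(thread_id: int) -> int:
        """Number of direct replies to the thread (approximates job posts)."""
        with urlopen(HN_ITEM.format(thread_id)) as resp:
            item = json.load(resp)
        # "kids" lists the IDs of direct replies; dead/flagged posts are still
        # included, so treat this as an upper bound.
        return len(item.get("kids", []))

    print(count_top_level_posts(123456))  # 123456 is a placeholder thread ID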
I would assume startups are much more likely to post here, and they were hit hard by the end of ZIRP around the time of the downturn you mentioned.
It'd be interesting to correlate this against total posting activity on HN as well, to account for it being a progressively more popular website to post on.
Maybe they are just calling the jobs by different names? It seems like names of roles are constantly shifting. "Data scientist" is a term that is going out of fashion.
Pretty large claim to insinuate Indeed can't even tell when their own users are simply shifting terms around...
This is a company so large that its jobs data was used in lieu of the Fed's jobs data when the government was shut down.
The real comparison needs to be between pre-covid and today.
Is Indeed seeing much of the tech job market? I never considered looking for a job there over LinkedIn.
Jobs don't have to be exclusive to a single site. Pretty much every job gets posted everywhere (usually done automatically by your HRIS/ATS software). Job boards will even scrape each other for postings. LinkedIn is notorious for this, which is why it has so many outdated listings.
I’ve actually seen an increase in recruiter emails in Toronto. Not sure if it’s just me though.
I have 15 years software development experience and just took a job paying barely over minimum wage doing data entry. In the Bay Area.
cripes, it's time to move
Managers need to keep finding efficiencies or else the company will eventually find the efficiency of making them not part of the company anymore.
The hiring bar is going up partly because the effort that would’ve gone into hiring can now be spent vibecoding tools to do the easy bits of the required job and delegating the rest, while finding new efficiencies in other team members’ roles to make room for what’s newly delegated. The net result is the same work getting done with less headcount.
The broad effect is the economy becomes more efficient and new jobs get created just as old jobs get divvied up according to the “replaceability” of each of the many roles that make up a particular job.
Those are not "tech jobs" for the most part, they are business support jobs.
Data scientist is an exception based on title, but in my experience there are a large number of people with that title who have never heard of the scientific method, let alone could apply it with any rigor.
I'm sure LLMs are taking some of these jobs, but a lot of the decrease is probably due to general cutbacks on overhead and a realization that they produce limited value.
We were told that "everyone needs to learn how to code", to the thunderous applause of all corporate stooges.
It is hard to tell what is really going on. No company will admit that they are firing, e.g., DEI hires from 2023. I have seen some open source CoC loudmouths being fired, but that is not enough to establish a large trend.
> CoC loudmouths
Probably means code-of-conduct, and not clash-of-clans.
> No company will admit that they are firing, e.g., DEI hires from 2023.
That's because it's a fantasy you've contrived to make yourself feel good.
Do you feel better now?
Now it’s “everyone get into the trades!”
I’d recommend talking to people in the trades first. Not saying it can’t be a good move, but it is definitely hard and has its own huge downsides like poor working environments, long hours, and years to actually get into decent paying roles.
Faddish career advice is usually bullshit, or it’s too late and the bus it’s telling you to jump on left the stop years ago.
The jobs lost are "business analyst, data analyst, data scientist, and business-intelligence developer."
That's not coding. Those are bullshit jobs.
BAs are actually cool, please don't.
Another thing I’m seeing from the employer side:
As companies tighten their belts, they’re quicker to cut low performers that had been hanging around for too long anyway because:
1) cost reduction
2) companies had been lazy at getting rid of low performers when the market was hot and they didn’t need to cut (and couldn’t find better devs to replace them anyway)
3) with the market this skewed towards employers, you can replace low performers with better talent anyway because everyone’s looking
A brilliant colleague of mine helped invent the Internet. They fired him anyway. Don't give me your comforting just-so story about "low performers". It's insulting.
Don’t take this out on me, Jesus…
Judging by your reaction, guessing you’re the low performer?
Edit: judging by your comment history, you sure have a grudge with the world. So this clearly isn’t about me. Have a good night, dude.
Not the world. The world is a lovely, beautiful place. Some of the people running it? Well.