Summary: we don't understand why programmers are paid so well. If you're a programmer, there's enough of a chance that this is temporary that it's worth explicitly planning for a future in which you're laid off and unable to find similarly high-paying work.

Programmers are paid surprisingly well given how much work it is to become one. Here's Dan Luu comparing it to other high-paid careers:

If you look at law, you have to win the prestige lottery and get into a top school, which will cost hundreds of thousands of dollars. Then you have to win the grades lottery and get good enough grades to get into a top firm. And then you have to continue winning tournaments to avoid getting kicked out, which requires sacrificing any semblance of a personal life. Consulting, investment banking, etc., are similar. Compensation appears to be proportional to the level of sacrifice (e.g., investment bankers are paid better, but work even longer hours than lawyers).

Medicine seems to be a bit better from the sacrifice standpoint because there's a cartel which limits entry into the field, but the combination of medical school and residency is still incredibly brutal compared to most jobs at places like Facebook and Google.

My sister is currently a second-year medical resident, and "incredibly brutal compared..." feels like an understatement to me. She works 80hr weeks, often nights, helping people with deeply personal and painful issues that are hard to leave behind when you go home. This is after four years of medical school, with at least a year still to go before she starts earning doctor-level money. When I compare that to how I started programming right out of college, making more money for 40hr weeks with no on-call, I feel embarrassed.

What makes me nervous, though, is that we don't really understand why programmers are paid this well, and especially why this has persisted. People have a bunch of guesses:

  • Demand: as software eats the world there are far more profitable things for programmers to do than there are people to do them.

  • Supply: it's hard to train people to be programmers, fewer people are suited for it than expected, and bootcamps haven't worked out as well as we'd hoped.

  • Startups: big companies need to compete with programmers choosing to go off and start companies, which is harder to do in many fields.

  • Novelty: the field is relatively new, and something about new fields leads to higher profits and higher pay, maybe via competition not being mature yet?

  • Something else: I'd be curious if people have other thoughts—leave comments!

Things are pretty good now, and seem to have gotten even better since Dan's 2015 post, but something could change. Given how poorly we understand this, and the wide range of ways the future might be different, I think we should treat collapse as a real possibility: not something that will definitely happen, or that will happen on any particular timescale, but something likely enough to prepare against.

Specifically, I'd recommend living on a small portion of your income and saving a multiple of your living expenses. It's far more painful to cut expenses back than it is to keep them from growing, and the more years of expenses you have saved the better a position you'll be in. If you take this approach and there's no bust, you're still in a good place: you can retire early or support things you believe in.
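To put rough numbers on that, here's a minimal sketch of the runway arithmetic; the income and expense figures are hypothetical illustrations, not recommendations:

```python
# Savings runway: how many years of expenses one year of work buys.
# All numbers are hypothetical illustrations.
after_tax_income = 150_000   # annual take-home pay
annual_expenses = 60_000     # living on a small portion of income

saved_per_year = after_tax_income - annual_expenses
runway_per_year_worked = saved_per_year / annual_expenses
print(f"{runway_per_year_worked:.1f} years of expenses saved per year worked")
# -> 1.5, so five years of work buys ~7.5 years of runway,
#    before any investment returns, if expenses stay flat.
```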

If being laid off and unable to find similarly high-paying work would be a disaster, figure out what you need to change so that it wouldn't be.

(This isn't really specific to programming, but I think the chances of a bust are higher in programming than in more mature fields.)

Comment via: facebook

73 comments

I had a similar realization many years ago, but I have a very different (and lonely) perspective. Nobody seems to get it; maybe someone here will.

I realized this (unfair income) in 2011 as a junior in university, right after I got an internship at Facebook. They paid me $6,000/month and I had only been coding for one year (literally). Previously I had dabbled in multiple other majors, and my internship offer was higher than the full-time salary of my peers in other majors (whom I respected deeply).

I saw this as an opportunity. During my internship and my senior year, I taught my high school friend how to code while he completed his major in econ. I figured if it only took me one year to get into Facebook, he could do it in two. A year after I got a job at a startup, he got a job ($105k base).

My girlfriend at the time graduated with a stats degree and was doing customer support. I thought maybe I could get her into coding too, and I did. A year later she got a job ($115k base).

Then I had an idea... could I teach anybody coding? I reached out to a kid I knew back in high school who had a 2.0 GPA. I figured his life sucked, and it did (he was an Uber driver). Things didn't turn out so we... (read more)

Viliam
The idea of paying your students for studying sounds fantastic! Maybe you could make a contract with them that they will return you the money if they get a software development job (similar to how Lambda School does it, except they don't pay their students, only teach them for free).
randomsong

I always get this comment:

Maybe you could make a contract with them that they will return you the money if they get a software development job

No offense, but I don't like that idea and the answer will always be no. Why should somebody whom society left behind be expected to pay in their pursuit of a normal life like everybody else? These people are just getting their lives started; I don't want them to have a looming payment hanging over their heads. If you have been in debt before, you know how stressful it feels to be indebted.

These guys should pay: Microsoft, Google, Facebook, Amazon, PayPal, Apple, etc.

The goal I'm working towards is to lead sustainable open source projects and negotiate direct employment contracts with companies, because the engineers we produce are of such good quality. It's an ambitious journey (I know), but it makes the most sense to me.
Why should somebody whom society left behind be expected to pay in their pursuit of a normal life like everybody else? These people are just getting their lives started; I don't want them to have a looming payment hanging over their heads.

Do as you wish, of course; it's your (potential) money and your time. My perspective was that maybe having some of the money back would allow you to teach more people. Like, you can afford to donate money to ten people, but you could loan money to a hundred people; and although getting a gift is better than getting a loan, a hundred is also more than ten. On the other hand, if money is not the bottleneck but your time is, then this doesn't make sense. No "should"s were involved in the calculation.

Also, payments in the style of Lambda School are not that bad. They are limited in time (unlike school loans), and you only pay if you get a well-paying job. That means that having the new job and the debt is already an improvement over having the old job (and then the debt expires, so it becomes even better), and if you fail to get the promised new job, then there is no payment.

randomsong

I understand where you are coming from. From my perspective, I don't see the point of helping "more" people. Doing so lowers the quality for the existing students and creates more burden on myself. If you were in my shoes, what would be the inspiration for helping more? For me, I'm just looking for a balance: one person at a time, and when a student leaves I'll get one or two more to fill the spot, depending on budget.

I really hope you are right. Personally, the students who are the slowest have severe self-confidence issues and don't communicate their emotions very well. It breaks my heart to imagine the emotional turmoil they might feel if they fail. Much of what I spend my time on is making sure nobody fails; I'm extra committed to making sure nobody gets left behind. Maybe it makes a difference, maybe not. Thanks for sharing your thoughts.

Things are pretty good now, and seem to have gotten even better since Dan's 2015 post, but something could change. Given how poorly we understand this, and the wide range of ways the future might be different, I think we should treat collapse as a real possibility:

Poor understanding is in the map, not the territory. I started to write a comment arguing that this is incorrect, that the factors which cause programmers to be well paid are straightforward and aren't going to go away. But instead of that, how about a bet.

Here's the US Bureau of Labor Statistics series for 2nd quartile nominal weekly earnings of software developers (applications and systems software): https://fred.stlouisfed.org/series/LEU0254530600A. They didn't seem to have mean, median, or other quartiles. There are other series for different subsets of programmers, like web developers; I chose this one arbitrarily. The series is not inflation adjusted.

I will bet up to $1k at 4:1 odds that the 2030 value in this series will be greater than or equal to the 2018 value, which was $1,864. So $1k of my dollars against $250 of other people's dollars.

(I'll accept bets in order of their comment timestamps, up to a maximum of $1

... (read more)
jefftk
I'm not sure we disagree. I was thinking more like a 15% chance. This is high enough that it's definitely worth being prepared for, but I wouldn't take 4:1.
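For readers translating the 4:1 odds above into probabilities, a minimal sketch (the stakes are from the bet as offered; the break-even calculation is standard):

```python
# Implied probabilities of the 4:1 bet above.
stake_offered = 1000   # jimrandomh's side: the series holds up
stake_against = 250    # counterparty's side: it falls

# Break-even for the offerer: p * 250 == (1 - p) * 1000
p_breakeven = stake_offered / (stake_offered + stake_against)
print(f"offerer needs P(wages hold) > {p_breakeven:.0%}")    # 80%
print(f"taker needs P(wages fall) > {1 - p_breakeven:.0%}")  # 20%
# A ~15% chance of collapse is below the 20% threshold,
# which is consistent with declining the 4:1 side.
```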
jefftk
Came across this again randomly. Why were you offering the bet on nominal income instead of adjusting for inflation? (That particular series hasn't updated since 2019)
jimrandomh

I don't specifically remember, but I think I was mostly going for "only have to check the number in one place" (not expecting that the series would stop in 2019), and secondarily I expected we both thought inflation would be predictable, and so this provided some margin in the case where wages stagnated but didn't fall by much.

We don't understand why programmers are paid so well

It might be pretty straightforward.

~~GDP is 2x higher than in 2000~~ Real GDP per capita is up 25% since 2000, so some category of workers must have gotten substantially more productive in that timeframe, and their real income should be higher to match. That seems like a plausible story for the software engineering career.
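For scale, a quick side calculation of what that cumulative figure implies annually; the roughly 18-year span from 2000 to when this thread was written is an assumption:

```python
# Annualized rate implied by 25% cumulative real GDP-per-capita
# growth over roughly 18 years (2000 to ~2018).
years = 18
annual_rate = 1.25 ** (1 / years) - 1
print(f"{annual_rate:.2%} per year")  # ~1.25%/year
```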

This simplified model is consistent with my inside view. I can help sufficiently big companies save $millions/yr, or earn $millions/yr more, by writing software that streamlines some detail of their internal processes or external interactions. And my ability to do this is obviously much higher thanks to all the tools that exist today that didn’t exist in 2000: cloud computing, tons of third-party APIs, ubiquitous mobile phones with fast chips and internet connections, huge high-resolution monitors, Stack Overflow, etc.

The tech entrepreneur and investor Naval Ravikant talks a lot about how “leveraging yourself up” is the key to getting rich. E.g. these days Joe Rogan makes $1M for recording a 90-min podcast because he built a following. He has distribution as leverage.

Programming in the internet age is an unprecedentedly hug

... (read more)
Liam Donovan
Why not use per capita real GDP (+25% since 2000)?
Liron

Ah right, yes, ok so not a huge increase. Meanwhile, I've seen the real value of entry-level programmer compensation packages at least double since 2000, probably more like triple.

I think my point about GDP growth helping the outside view is this: *some* significant chunks of sectors in the economy are getting significantly more productive. What kind of workers are producing more value? What are the characteristics of a job that enables more value creation? One where there's more leverage, i.e. an hour of work produces more economic value, without a corresponding increase in supply. And any sector where the leverage on time is increasing, say, 5x+ per decade is a good bet for an area where supply is trailing demand.
TAG
If you measure programmer productivity by the number of jobs replaced by their code, not lines of code, then programmer productivity is almost unlimited.

I used to work as a software engineer. As the company I work for has grown a lot, I no longer write code; instead I do software design, and hire new team members in different positions, including PMs, visual design, usability design, backend programming, and frontend programming.

It is extremely difficult to find good programmers, especially frontend programmers.

I'm pretty sure that the reason here is not that it is difficult to become a good programmer, but that a lot of people choose not to, for a number of reasons.

Two reasons that I have personally encountered:

  • I studied comp sci between 2000 and 2005. During my first year, we had about 20% women. At graduation, we had about 5% women. Reasons are probably varied, but a major reason, at least back then, was that professors were hostile towards women studying comp sci (one of them explicitly told a friend of mine who was studying with me that he thought women were not suited for comp sci). In effect, we're basically excluding half of the population from this job option.
  • A lot of people just don't consider programming as a job option at all. Trying to encourage people to enter the field, people typically push back because
... (read more)
Shmi

An interesting analysis would be to find the relevant reference class. What other group is/was similarly overpaid, for how long and for what reason?

Being a good software developer is very very difficult. Only a few percent of the population have the wiring, the wattage and the inclination to do it for long enough to be very productive.

Compare coding with portrait painting or composition for orchestra or pro golf - anyone can learn the basics of them, but very few can become good enough to be paid for them.

The thing is, you don't have to actually be particularly good at software development in order to get a high-paying programming job. Even mediocre or very junior programmers can easily break six figures, something that's much harder even in other intellectual labor positions in the Bay Area (e.g. technical writing, which is what I do). So, while I don't disagree that being a good software developer is very difficult, I definitely don't think that explains away the issue discussed in the OP, and I definitely disagree that "very few can become good enough to be paid for" software development.

(Source: I work for a software recruiting company where I have access to information on both the skill level and the salary of thousands of software developers.)


randomsong

I don't know how to say this except: you are wrong. I've been trying to prove you correct since 2011 by teaching people from the bottom rungs of society, and every one of them succeeded. I saw a college dropout (with multiple Fs on her transcript) become a good engineer; I saw a 40-year-old become a good engineer. Last year my dad (60 years old with 0 coding experience) picked up coding, and I think he's gonna do great. I had hoped that you were right so I could have the same sense of job security, but the belief that "being a good software developer is very very difficult" is wrong. It may be helpful for you to start seeing things from a different perspective, better sooner than later.

I don't think your experiment gives much evidence that "anybody" can learn coding, just that it isn't very strongly correlated with social status.

randomsong describes at least 15 successes and zero failures, which is certainly not what I would have predicted in advance. If we take this at face value, either they have a pretty strong filter for who they teach[1] or it's pretty decent evidence that "anybody" can learn programming, at least for colloquial definitions of "anyone".

[1] Which is the opposite of what they're trying to have, though of course that doesn't rule out that they have one anyway.

clone of saturn
Yes, my immediate assumption was that they have a strong filter on who they teach. I don't find it terribly implausible that someone would know 15 people who are smart enough to code. But I think they're going to be unpleasantly surprised if they start teaching strangers from the public library.
randomsong

Perhaps so. If I fail I will write about it. One thing I can confidently say is that teaching is very difficult, so failure is a real possibility. I sure hope this works out, though. 10 of my 15 original students were random people who raised their hands on a Facebook group when I posted a potential pilot program; I think this prepared me well for the coding bootcamp at our local public library that launched last week. I hope to keep this going throughout 2020 and see what happens. Here's the meetup group; if you are around the area, come say hi! https://www.meetup.com/San-Jose-C0D3/
Daniel Kokotajlo
I'm fascinated to hear how this went. Well done, Randomsong, and please let us know what happened!

Last year, my dad (60 years old with 0 coding experience) picked up coding and I think he's gonna do great

That's not the question being posed. The question being posed is whether your dad is now in a similar enough reference class to you to be considered a substitute for you, and thereby lower your salary.

I'm inclined to agree with Mark Roberts here. Not everyone has the mental horsepower and the right ticket in the lottery of fascinations to be a programmer. It's like any other trade skill. Can I do woodworking? Absolutely. I can knock together small projects fairly easily. But do I have the aptitude and interest in woodworking to become a professional carpenter? Absolutely not. Can I do plumbing? Sure. I've replaced my own sinks and faucets. But do I have the aptitude and interest to become a professional plumber? No way. Why is programming any different?

You touched on something important here. The most important hurdle I have to overcome with students is making them feel empowered and needed so they care about coding. Afterwards, the problem solving skills become easier to teach.

If you are the only carpenter in town and your family needs a home, you can absolutely care enough to become a professional carpenter.

You can also develop the aptitude and interest to become a professional plumber if you feel valued and people around you needed a great plumber.

quanticle

I disagree, and what I've seen and read of people doing their own construction work seems to back me up. If you're the only skilled person in town and you need a home, then you'll probably be able to knock something together. But will that structure be safe? Will it keep out the rain in a storm? Will it keep out the wind in winter? Will it work reasonably well immediately after you've built it, or will it require constant patching for months or years before it finally becomes usable? All of these questions have fairly direct analogs to programming.

I do think there are differences between programmers that speak to aptitude differences, rather than differences of experience. When comparing two programmers with roughly equivalent amounts of experience, I have noticed that some programmers just "get it", whereas others don't. Their first solutions are faster (often algorithmically faster). They've thought through more edge conditions. Their code is simpler and easier to read.

I agree that even a less talented programmer, perhaps with coaching and assistance, will eventually be able to arrive at the solution that the more talented programmer arrives at immediately. But it doesn't matter. By the time the less talented programmer has found the best solution for problem 1, the more talented programmer has moved on to problems 2, 3, 4 and 5. This is definitely noticeable over a 6-12 month period, and it's likely that the less talented programmer will be eased out of the organization.

I don't know if these differences are due to IQ or the lottery of fascinations. I suspect it's both. However, it is plain to me that there are differences in ability between programmers who have equivalent experience, and these differences do go some way towards determining who is successful as a programmer and who isn't.
randomsong

Nature vs. nurture. I agree there are less competent people; I believe their incompetence is due to nurture, and anything nurtured can be unlearned. One year is a long time. I believe that less competent people, over time, could be nurtured into great people with the right mentorship. Ten years of good strong mentorship could make an incompetent person a great one. We may have a disagreement based on first principles, which is okay. I'm glad we got down to that.

Alternative hypothesis: for most of human history returns to analytic abilities were anomalously low due to the bottleneck of geography limiting returns to scale.

By "we don't understand", you mean "I don't understand". There is no great mystery; programmers are paid as well as they are because of the amazing efficiency improvements their employers get by automating work. If you think about how much money you make your employer (or even better, talk to your company CEO or someone close enough), you'll see that in fact, programmers could be paid a lot more if they were aware of their impact.

Whether it's "fair" or not is irrelevant - you can accomplish a lot with littl... (read more)

What we don't understand is why this has persisted: the barriers to entry are low and the pay is high, so why don't people shift into the field and bring up the labor supply?

The barrier to entry is higher than you think, it just takes the form of a talent requirement rather than a training requirement.

Also, smart people often live in a bubble of other smart people. Get out of the bubble and then try again teaching programming.

Recently I got a temporary side job teaching "computer skills" to random people. Most of them had serious problems understanding the "IF" statement in Excel.

FactorialCode
They are? I know several people who've pivoted to becoming software developers. I think it's just that growth in demand is keeping up or outpacing growth in supply.
Gytis Daujotas

I'm not sure I'm following. Janitors are also great; nobody would really want to step foot in a business or storefront if it had trash everywhere. Without a janitor you would lose most if not all of your business quite fast. Yet janitorial work is low paid due to the high supply.

Most such roles can be said to have a high impact on a company. Isolating any role in a company, it is easy to hypothesize that it should be paid 10x what it is, since without that role the company would be in ruins. Unfortunately, this is not accurate to reality.

To my understanding, that is the point of the argument being made: why are programmers paid so highly when there are so few barriers to becoming one, meaning that the supply of programmers should be higher than it is? If programmers are so amazing and high-achieving, then there should be many people lining up to become one (as the argument theorizes this is easy).
jimrandomh
If a janitor quits, a new janitor can be hired the next day with minimal disruption. If a programmer quits, it will be half a year before a newly hired replacement can have acquired the context, they may bring expertise about your business to a competitor, and there's a significant risk that the replacement hire will be bad. Projects and businesses do sometimes fail because their programmers quit. This means that even if there were an oversupply of programmers, it would still be worth paying them well in order to increase retention.
jefftk
I agree, though these are factors that are exacerbated by the field being so new.
jimrandomh
How? Is the model that, as the field matures, programmers will get more fungible? Because it actually seems like programmers have gotten less fungible over time (as both projects and tech stacks have increased in size) rather than more.
Viliam
Seems to me that there is pressure on developers to become "full-stack developers" and "dev-ops", which would make them more fungible. But there are also other forces working in the opposite direction, which seem to be stronger at the moment.
jefftk

My model is that over time systems get more similar between companies, as we start learning the best way to do things and get good open source infrastructure for the common things. But you may be right: there's a really strong tendency to build layers on top of layers, which means, for example, "familiarity with the Google Ads stack" is very important to the company and not a very transferable skill.

Timescales matter. The modern internet's only a bit over 25 years old, and common developer compensation has been truly crazy only for maybe 15 years of it. Easy to predict a bubble, difficult to predict the size or when it ends. People were starting to notice in the 70s, and take it seriously in the 80s, that the idea of a career was changing. Nobody could expect to work for one company for many years anymore. In the 90s, that expanded to industries - most people don't (or shouldn't) expect to work in one TYPE of job for an entire car... (read more)

I think coding just generates a ridiculous and growing amount of value. Look at this list of companies with large earnings per employee. Note that they all specialize in some form of tech or finance. With a regular job, you're bottlenecked by how much work you can accomplish as an individual. With programming, the value you generate is proportional to how much value your code generates. A lawyer might generate $100,000 in value per year. The company that makes lawyers 5% more efficient generates $5,000 per lawyer-year. The lawyer has to dedicate his life to

... (read more)
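A quick sketch of the leverage arithmetic in the comment above; the $100,000 value and 5% efficiency figures are the commenter's, while the customer counts are hypothetical:

```python
# Leverage arithmetic: one lawyer's output is capped, but the
# value of code scales with how many lawyers use it.
lawyer_value_per_year = 100_000   # figure from the comment
efficiency_gain = 0.05            # software makes each lawyer 5% more efficient

value_per_lawyer_year = efficiency_gain * lawyer_value_per_year
print(value_per_lawyer_year)      # $5,000 per lawyer per year

# Hypothetical customer counts: the software company's value
# scales with lawyers served; the individual lawyer's does not.
for lawyers_served in (100, 10_000, 1_000_000):
    print(lawyers_served, value_per_lawyer_year * lawyers_served)
```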

I think that automation can save a lot of money, for a company. As an individual, if you automate something for yourself, you probably spent more time analyzing the problem and writing the code, than the task took originally. But in a company, you can automate a repetitive task of hundreds of people. And those people made errors when they did it manually, so you also improved the quality. If you save 40 people 1 hour a week, you have already paid your salary. Actually, the company now got 1 extra hour from those 40 people forever, but they only paid you fo... (read more)
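A rough version of that payback arithmetic; the 40-people, 1-hour-per-week figure is from the comment, while the hourly cost and weeks-per-year are hypothetical assumptions:

```python
# Automation payback: saving 40 people 1 hour/week is roughly
# one full-time person's worth of hours.
people = 40
hours_saved_per_week = 1
cost_per_hour = 50        # hypothetical fully-loaded cost, $/hour
work_weeks_per_year = 48  # hypothetical

weekly_saving = people * hours_saved_per_week * cost_per_hour
print(weekly_saving)                        # $2,000/week
print(weekly_saving * work_weeks_per_year)  # ~$96,000/year, roughly one salary
```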

A question I have is, what do you mean by "less"? Dan Luu is citing programmers making on the order of $250,000 to $300,000 total compensation, but as a programmer who has made his entire career outside of the Bay Area, I have never seen compensation anywhere near that high. What if the phenomenon is that salaries in the Bay Area are skewed upwards, perhaps due to the cost of housing? In that case, perhaps only programmers in the Bay Area need to worry, as tech firm expansion outside the Bay reduces salary growth, but programmers outside of the Bay will be relatively unaffected (and might even benefit, as demand in other markets increases).

jefftk

I'm talking about the top end of the market, which is almost entirely in the US though not confined to the Bay Area: NYC, Boston, Seattle, etc. have similarly high levels of programmer pay at high-paying companies. Ex: I'm in Boston and making ~$370k.
quanticle
So how much should a hypothetical programmer making, say, $120,000 a year worry? Another way of phrasing this question would be, do you expect the median programmer salary to decline or do you expect the variance to decline so that most programmers make about the same amount of money?

One of my internship mentors at Google told me their average software engineer generates $1 million of value for the company every year. So I don't think it's any mystery why they're paid so well.

From my other comment:

What kind of workers are producing more value? What are the characteristics of a job that enables more value creation? One where there's more leverage, i.e. an hour of work produces more economic value, without a corresponding increase in supply.

Another example of a sector that's seeing much higher economic leverage is white-collar work serving high-cost-of-living countries/cities from within lower-cost-of-living countries/cities. E.g. English-speaking workers from our neighbor Mexico where real income per capita is only 28%... (read more)

quanticle

I'm not sure remote work is as advantageous as you think it is. If remote work were so obviously superior, we'd be seeing programming rapidly evolving to support remote work, both here in the US and offshore. Yet that's the opposite of what I'm seeing. I'm seeing more and more companies bring their programming in-house. I'm also seeing more and more companies insist on programmers being in the office, rather than working remotely.
Liron
I'm confident that the larger trend is the opposite of the examples you've witnessed. Sure, big companies who wake up to the importance of having engineering resources might first get the idea to hire them in-house, but remote work is clearly a huge trend throughout the economy. One way to understand this is that even "in-house" often means that a company's own employees are working across national/international offices or working from home.
quanticle

I'm not so sure about that. The trend I've seen at larger companies, such as Google, has been towards more people coming into the office. Moreover, even at startups, once the company reaches a certain size (around 20 developers or so), the trend I've noticed is to encourage workers to co-locate in an office. It's the rare company, in my experience, that manages to stay remote while having a workforce of hundreds or thousands. In fact, I can only name one: Zapier. If remote work were so much more advantageous for programming productivity than co-locating people in an office, then I'd expect to see many more examples of medium and large corporations embracing remote work than I do.
jefftk

Automattic, which makes WordPress, is another one.

Zapier has "250+" employees. Automattic has 1,153 employees. Gitlab, another fully remote company, has 1,117 employees. All of these companies are rather small. I would be interested to see whether they can continue to be fully remote as they scale past 10,000 employees. My suspicion is that large organizations cannot be fully remote, as remote-working tools do not (currently) provide the communications bandwidth and latency that large organizations need to function.

Liron

But every sufficiently large organization is already distributed across lots of offices and timezones. Why should we expect the distinction between "on-site" and "off-site" work to be relevant to productivity if on-site work is already remotely distributed?

The inside view is: even if you're in the same office with all the people who matter to your job, most of your job is done by you interfacing with your computer. Even when I did the whole "live with your startup cofounder in a 2BR apartment" thing, we worked in separate rooms and interacted via text.

So what specific interactions happen in meatspace that are durably necessary for increased productivity in our increasingly virtual world, and can't be compensated for by any creative remote-work best practices? It seems obvious to me that the answer is nothing.
quanticle
There is a huge difference between being scattered across a dozen offices and three timezones and being scattered across literally tens of thousands of offices (since each worker is plausibly in their own office, remote from all the others) and four or five timezones. And I repeat: if it's so obvious, then why isn't it winning? Why do we not hear about major remote-work initiatives from e.g. Google or Facebook?
Liron

I don't have any hard data, but I'd bet that the ratio of "work locations per market cap dollar" has been steadily increasing in the economy over the last few years. (A measure of how distributed each company's workforce is, weighting higher-market-cap companies higher.)

I also bet more than 50% chance that within 3 years at least one of {Google, Microsoft, Facebook, Amazon} will give more than 50% of their software engineers the ability to work from home for at least 80% of their workdays.

The fact that Stripe ($30B+ valuation) is now actively hiring many remote employees is a significant recent anecdote I can offer. Do you have any significant anecdotes indicating a remote work decrease?

I also bet more than 50% chance that within 3 years at least one of {Google, Microsoft, Facebook, Amazon} will give more than 50% of their software engineers the ability to work from home for at least 80% of their workdays.

If that's not just a figure of speech, I'll take this bet. $100 each?

(This is not intended as commentary on the question at hand.)

You think I would use the language of belief probabilities as a figure of speech???

I’m up for $100 vs $100. Just send me a message at https://m.me/sendmessage to confirm with your real identity.

That was not how I was expecting this bet to resolve!

Ben Pace

Always remember to model tail events causing all your bets to come out the wrong way... :)
quanticle

According to this article by the Society for Human Resource Management, Best Buy, Yahoo, IBM and Honeywell have all abandoned remote work initiatives after concluding that the costs outweighed the benefits. I would also note that Stripe is roughly 2,000 employees, which puts it in line with the other remote-first companies I've posted above. They're a long way from companies that have tens, or even hundreds of thousands, of workers.
Liron

Alright, I shall update to a belief that maybe there's some different dynamic that makes remote work less advantageous when an organization's size is in the 10,000+ range. What are all these "costs" that outweigh the benefits? I suspect they are easy enough to circumvent. My observation is that all the same tools that make non-remote companies more productive (e.g. Slack) are usable remotely.

I'd still bet 1:1 odds that 2+ of the top 10 US companies by market cap in 5 years will allow more than 50% of their employed software engineers to work from anywhere.
Elizabeth
I would expect the opposite. When you have 10k employees, it is physically impossible to have all of them receive the benefits of co-location to everyone else. It's also more expensive to co-locate 10k employees with each other than two, even though most of the "co-located" employees are still functionally remote from each other. I dug into this last year, and my conclusion was that either fully remote or fully co-located could work, and the real villain was hybridization.
quanticle

As I indicated in my other comment, not every worker needs the benefits of co-location with every other worker. The corporation desires that every worker be co-located with the other workers with whom they communicate on a regular basis. Teams aren't spread across locations randomly; they're arranged geographically by function.
Elizabeth

I didn't mean to suggest that there were no benefits to co-location, merely that I don't understand how those benefits would scale up with the total number of employees at a company.
quanticle
It's not that the benefits of co-location scale up with size, it's that, to a first approximation, communication overhead scales linearly with the number of employees in a remote-work environment and scales with something like the logarithm of employees in a co-located environment. New technology, such as e-mail or slack, in my model, doesn't go far enough to address that disparity. I think there's still a point at which the benefits of having everyone in a centralized office outweighs the savings from not having to rent office space.
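A minimal sketch of the scaling model described above; the linear-vs-logarithmic framing is the commenter's, and the constants and headcounts are arbitrary illustrations:

```python
import math

# Overhead model from the comment: remote ~ O(n), co-located ~ O(log n).
def remote_overhead(n: int, c: float = 1.0) -> float:
    return c * n             # every exchange is effectively inter-office

def colocated_overhead(n: int, c: float = 1.0) -> float:
    return c * math.log(n)   # most communication stays inside an office

for n in (10, 100, 1_000, 10_000):
    print(n, remote_overhead(n), round(colocated_overhead(n), 1))
# The gap widens with headcount, which is the argument here for why
# very large organizations resist going fully remote.
```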
[comment deleted]
Elizabeth
Assuming you hold the number of timezones constant, what is the difference between dozens of offices and tens of thousands of offices? Or rather, under what circumstances are those very different? To give one scenario where I don't think it is different: if all of your contact is with workers outside your office, it doesn't seem to matter if you drive into that office or call them from home.
quanticle

But that's just the thing: in situations where there are multiple offices, management does not assign workers randomly to various offices. Instead, workers are assigned to offices according to some criterion that is a proxy for how much communication there is going to be among them. While there is inter-office communication, its volume is usually much less (by several orders of magnitude) than the volume of intra-office communication. With remote work, you lose the benefits of colocation, since every communication is, in effect, an inter-office communication. It's like a computer architecture problem: it's much easier to split work among a few large, powerful nodes than among many smaller, weaker nodes.

Summary: we don't understand why programmers are paid so well.

Two things I didn't see addressed here or in Dan Luu's blog post are productivity (as the main economic reason for wages) and scale (as an amplifier for SWE productivity).

A SWE is made more productive by the tools available: fast CPUs, the internet, cloud, OSS, mobile.

Cloud and mobile then amplify the value that the modern SWE brings to a business, but not to every business.

Businesses are not charities; they will pay less if they can.

Can you talk more about retirement and earning to give? I see you max out your 401k, but am curious how much you have saved for retirement and how much you think you'll need. Retirement fears have been the only cause of trepidation when I think about earning to give.

jefftk

I've been putting the max allowed into a pre-tax IRA since I started at Google in 2012. I also put in a little in 2008-2009 when I was at BBN, but much less. We took $10k out, which the IRS lets you do without penalty, when we bought our house. It's invested in index funds, and is currently at $387k.

Our house is also a kind of savings/investment: you can think of the mortgage as a combination of rent and forced saving.

Figure out how much you'd like to have saved for retirement, and see what that leaves?
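One common way to put a number on "how much you'd like to have saved" is the standard 4%-rule heuristic; a minimal sketch, with a hypothetical expense figure:

```python
# Rough retirement target using the common "4% rule" heuristic:
# a portfolio of ~25x annual expenses supports sustained withdrawals.
annual_expenses = 60_000              # hypothetical retirement spending
target = annual_expenses * 25
print(target)                          # 1,500,000

current_savings = 387_000              # figure from the comment above
print(f"{current_savings / target:.0%} of the way there")  # ~26%
```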

I handle this uncertainty via diversification.

I've dumped portions of my income into purchasing and building rental properties.