Bahl ’24: salary versus science when choosing a tech career
Like many other computer science concentrators at Brown, I started looking for a summer internship early last fall. In the weeks leading up to winter break, I saw many of my peers informing their LinkedIn contacts that they would be spending their summers working for high-tech companies. To my surprise, the majority of them showed a dramatic bias toward Silicon Valley. The appeal of these offers is clear enough – the salaries can be considerably higher than the alternatives in computer science research and academia. But I was still shocked that even Brown students — as socially conscious as they usually are — weren't drawn to more impactful and experimental career paths. Frankly, industry salaries tell us little about the extent to which we can actually advance technology itself, a consideration that should weigh far more heavily on students who have received the world-class education that Brown offers.
Starting salaries at big tech companies can reach upwards of $150,000 a year, and that’s before the bonuses and lucrative stock options they also offer. However, junior software engineers at these companies are essentially cogs in gigantic corporate machines: one of thousands of skilled engineers contributing to a code base that is likely millions of lines long. These systems have already been proven to work extremely well – it wouldn’t make sense to let a new graduate attempt significant modifications to software that generates millions of dollars. This dynamic is not inherently negative, but it does remove some potential for creative engagement and innovation. Top-performing software engineers may win major promotions and multiply their salaries many times over (to a figure well over $500,000 per year) before they have the power to experiment with development methods that have not yet been tried and tested.
While the average computer science academic earns significantly less — about $103,000 for tenured and tenure-track faculty at public institutions, according to a study by the National Education Association — they also enjoy a degree of autonomy almost impossible to match at an established company. Although many researchers, especially at universities, are pressured to publish frequently and secure grants, their superiors have far less control over what they investigate or how they choose to conduct their research. In fact, many computational researchers take individual responsibility for the entire research process, from ideation to impact, which allows them to shape every decision they make. Moreover, the success of their work is not closely tied to how easily their findings can be monetized. Thus, unlike their industry counterparts, researchers have the freedom to pursue less established and more experimental methods.
Over the past five years, academia and industry have made significantly different contributions to machine learning. Although machine learning is an important part of the work done at many large tech companies, these companies have largely settled into techniques first developed almost a decade ago. It is common to use libraries such as scikit-learn or TensorFlow that make it easy to apply popular, existing machine learning models to a desired application. These libraries are useful, but they are also standardized, which makes it difficult for their users to innovate with new machine learning techniques. Instead, it is academic researchers who have made the most crucial contributions to the field in recent years, whether discovering new patterns that lead to performance improvements or reconceptualizing how deep learning works.
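To make the point about standardized libraries concrete, here is a minimal sketch (assuming scikit-learn is installed; the dataset and hyperparameters are illustrative choices, not from the original column) of how little code it takes to apply an existing, well-established model:

```python
# A handful of calls to pre-built components covers the whole modeling
# pipeline; the model's internals -- a technique published decades ago --
# are fixed, and the user only selects hyperparameters.
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Load a bundled toy dataset and hold out a test split.
X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Train a standard, off-the-shelf classifier.
model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(X_train, y_train)

# Evaluate with the library's built-in accuracy scoring.
accuracy = model.score(X_test, y_test)
```

This convenience is exactly the trade-off described above: the library standardizes the technique, so using it well requires no new ideas about how the model itself works.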
While it is true that most fields see differences between the tasks performed in academia and industry, this disparity is especially concerning in the technology sector given the dominance of Big Tech today. University research is underpinned by the idea that advancing our knowledge must take priority over monetizing it. While there is significant funding for academic research in technology, the private sector, especially FAANG companies, has access to just as much research money, if not more. The massive assets of these companies allow them to pay extraordinarily high salaries to new hires, not to mention the plethora of amenities they offer. Since most academic research is simply not designed to generate a comparable profit, compensation for academic researchers falls short, regardless of their work’s true value in shaping the field. This financial disparity often pushes people away from academia and into big tech.
Although some students genuinely need the extra money a career in industry can provide – to pay off debt, support family members or cover medical emergencies – not everyone has this kind of motivation. Many are pushed into industry solely on the basis of pay, whether or not they truly feel fulfilled by the work they are doing.
Although neither the contributions of university researchers nor their expertise is sufficiently recognized in their compensation, the influence their work has in defining the future of computing is of paramount importance. While we cannot overlook monetary compensation when choosing our career paths, we must remember that the greatest value of a technology career lies not in its financial rewards, but in the scientific advances it generates.