Comment by snickmy

5 months ago

This was inevitable and we'll see it playing out all over Europe.

You have a desire to be relevant in an important technological shift.

On one side, you have big tech companies laser-focused on attracting the best talent and putting them in a high-pressure cooker to deliver real business outcomes, under a leadership group that has consistently proven effective for the last XX years.

On the other side, you have universities, led by the remnants of that talent pool—those who were left behind in the acquisition race—full of principles and philosophical opinions but with little to no grounded experience in actual execution. Instead, you find a bunch of PhD students who either didn’t make the cut to be hired by the aforementioned tech companies or lack the DNA to thrive in them, actively avoiding that environment. Sprinkle on top several layers of governmental bureaucracy and diluted leadership, just to ensure everyone gets a fair slice of the extra funding.

I'm surprised anyone is surprised.

I don't think universities should become industry. I mean, that is exactly what we have industry for. If you want to be put in a pressure cooker under leadership focused on business outcomes, great, do industry.

The problem really is that universities are treated as if they have the same mandate as industry. Government people shouldn't tell a professor what kind of research is interesting. They should let the best people do what they want to do.

I remember an acquaintance becoming a professor, promoted from senior reader, and he was going to be associated with the Alan Turing Institute. I congratulated him, and asked him what he was going to do now with his freedom. He answered that there were certain expectations of what he would be doing attached to his promotion, so that would be his focus.

This way you don't get professors, you turn good people into bureaucrats.

  • Yes. The demand for increasing control, driven by the "taxpayer's money!" lot evident in this thread, strangles almost all state-funded research because it demands to know up front what the outcome will be. Which instantly forces everyone to pick only sure-bet research projects, while trying to sneak off to do actual blue-sky research in the background on "stolen" fractions of the funding. Like TBL inventing the WWW at CERN: that wasn't in his research brief, I'm sure it wasn't something that was funded in advance specifically for him to do.

    Mind you, it was evident to me even twenty years ago, when briefly considering a PhD, that CS research not focused on applying itself to users would... not be applied, and would languish uselessly in a paper that nobody reads.

    I don't have a good answer to this.

    (also, there is no way universities are going to come up with something which requires LLM like levels of capital investment: you need $100M of GPUs? You're going to spend a decade getting that funding. $10bn? Forget it. OpenAI cost only about half of what the UK is spending on its nuclear weapons programme!)

That doesn't sound like a fair appraisal of university research at all. How much do we rely on day to day that came out of MIT alone? A lot of innovation does come from industry, but certain other innovation is impossible with a corporation breathing down your neck to increase next quarter's profits.

  • The person you replied to is talking about the UK and Europe. I suspect that funding for research works differently at MIT and in the US generally.

    • Europe also seems to hand out PhDs like candy compared to the US (you can earn one faster, and you're less prepared for research), and there's a lot more priority put on master's degrees, which are largely a joke in the US outside a few fields like social work and fine arts.

  • US universities (the usual suspects) have a substantially different approach to industry integration than European ones.

    Yet, European leaders have not got the memo, and expect the same level of output.

  • Your rhetorical question begs the question -- I can't think of anything more recent than the MIT license.

    What DO we rely on that has come out of MIT this century? I'm having a real hard time thinking of examples.

    • I think when talking about university research output it's pretty clear that the objective of university research is to produce output that is much earlier in the stack of productisation than something that comes out of a corporate entity. The vast majority of university research probably won't impact people on a day-to-day basis the way that running a product-led company will, but that's not to say it isn't valuable.

      Take mRNA vaccines for instance - the initial research began in university environments in the 80s, and it continued to be researched in universities (including in Europe) through the 00s until Moderna and BioNTech were started in the late 00s. All the exploratory work that led to the covid vaccine being possible was driven through universities up to the point where it became corporate. If that research hadn't been done, there would have been nothing to start a company about.

      It's the same in computing - The modern wave of LLMs was set off by Attention is All you Need, sure, but the building blocks all came from academia. NNs have been an academic topic since the 80s and 90s.

      I suspect that in 2050, there will be plenty of stuff being built on the foundations of work conducted in academia in the 00s and 10s.

      I wouldn't expect to see that many groundbreaking innovations being useful in day-to-day life coming out of contemporary university research. You have to wait several decades to see the fruits of the labour.

The problem is the "desire to be relevant in an important technological shift".

There's loads of worthwhile research to do that has nothing to do with LLMs. A lot of it will not or cannot be done in an industrial environment because the time horizon is too long and uncertain. It stands to reason that people who thrive in a "high-pressure cooker" environment are not going to thrive when given a long-term, open-ended goal to pursue in relative solitude that requires "principles and philosophical opinions" that aren't grounded in "actual execution". That's what makes real (i.e. basic) research hard and different from applied research. There are lots of people in industry claiming to be researchers or scientists who are anything but.

  • "actual execution" in the business world seems to be more and more synonymous with recklessly and incompetently fucking things up. See also: doge.

Yes, this is so telling:

> For example, neither the key advance of transformers nor its application in LLMs were picked up by advisory mechanisms until ChatGPT was headline news. Even the most recent AI strategies of the Alan Turing Institute, University of Cambridge and UK government make little to no mention of AGI, LLMs or similar issues.

Almost any organisation struggles to stay on task unless there's a financial incentive or another driver, such as exceptional staff/management in place. Give them free money - the opposite of financial incentive - and the odds drop further.

I’m sorry to read this — it just doesn’t feel grounded in my own lived experience.

Many of the best Engineering and Computer Science departments, around the world, operate a revolving door for people to go in and out of industry and academia and foster the strongest of relationships bridging both worlds.

Look at Roger Needham’s Wikipedia page and follow his academic family tree up and down and you’ll see what I mean.

https://en.m.wikipedia.org/wiki/Roger_Needham

> remnants of that talent pool—those who were left behind in the acquisition race—full of principles and philosophical opinions but with little to no grounded experience in actual execution.

I do believe that these people at universities do have experience in the actual execution - of doing research. What they obviously have less experience in is building companies.

> Instead, you find a bunch of PhD students who either didn’t make the cut to be hired by the aforementioned tech companies

Or because they live in a country where big tech is not a thing. Or because these people simply love doing research (I am rather not willing to call what these AI companies are doing "research").

Jesus… are you this judgmental about everyone in society? Some people just value the university environment. It doesn’t mean they’re incompetent and had no other options. Not everyone values money above all else, nor does choosing to opt out of the private sector mean people are “remnants”.

  • From my perspective it's almost exactly opposite. Almost all of the people I consider exceptionally talented are vying for positions in academia (I'm in mathematics), and the people who don't make it begrudgingly accept jobs at the software houses / research labs.

    I'm frequently and sadly reminded when I visit this website that a lot of (smart) people can't seem to imagine any form of success that doesn't include common social praise and monetary gain.

  • Point 10:

    https://www.ettf.land/p/30-reflections

    • "Academia is an absolute fucking cesspool of political corruption, soul crushing metrics gaming and outright fraud functioning mostly as a jobs program for nerds that only produces valuable science completely in spite of itself, not thanks to it, because it manages to trap some genuinely smart and hard working people there like a venus fly trap its prey and keeps them alive to suck more “citations” and “grants” out of them."

      This is not my experience with academia. Rather, my experience was that a lot of very idealistic people tried to make their best out of the complicated situation set up by incompetent politicians.

Another point re: grounded experience, good professors/researchers make a point to take sabbaticals to work in industry for that purpose.

  • I have met lots of professors who are glorified managers doing no actual research, taking sabbaticals for a fat paycheck. I doubt very much they do any real work during these sabbaticals either. If I had to guess, I would bet that these sabbatical positions are frequently sinecures.

    • Could be the case for some, but in the cases I know of, it was nothing like that - they took it seriously and used it as I said.