Oh! There ARE lots of very smart people working on crazily hard problems, like in advanced mathematics and theoretical physics, but the “problems” are… how should I put it? … CRETINOUS!
Like Fermat’s Last Theorem and almost everything initiated by Paul Erdős: cleverness for the sake of cleverness, not going to “change the world” that much…
Like nibbling around the infinite as if it were the sex of angels: http://jdh.hamkins.org/set-theoretic-geology-and-the-downward-directed-grounds-hypothesis-cuny-set-theory-seminar-september-2016/
To me, the problem is finding someone who will be willing to pay me* to do that, if possible in an open context (open data, open source, etc.).
In fact, after finding someone willing to pay, there are not many questions about what to do: whoever pays will have clear ideas about how to use you.
I will be searching for opportunities in the USA soon. Maybe things will look brighter there.
* as much as possible, I should say.
Peter Boothe says:
Universities used to be places with lots of odd little cracks where people could do such work. Consider Richard Stallman, who was at MIT from 1970 until 1984. Or Julian Jaynes, who was a lecturer at various places while he worked on his writing. Or the network operations centers at universities throughout the US, which both provided network service to the campus and were at the forefront of designing modern networking (e.g. Internet2).
As universities have been required to trim the fat and become more efficient, they have lost these little organizational crinkles where people were allowed to just sort of be while they figured out interesting things (or didn’t!).
To find the places where stuff is going to happen, look for places with ill-defined job descriptions that don’t tend to fire people easily and have a tolerance for people just trying new stuff without asking permission. Right now, that looks like Google and maybe Facebook to me (and what Bell Labs and Xerox PARC were), some of the US national labs, and the research institutions (MIT, Harvard, Stanford, Princeton, Oxford, Simons Institute, etc.) that have large enough endowments to not worry too hard at an institutional level about raising money to fund their operating budget through tuition or grants on a year-over-year basis.
Every other organization is involved in a shorter-term hustle for money, and it is almost impossible for weird cracks to survive when the surrounding organization is constantly shifting to try to optimize itself.
As universities have been required to trim the fat and become more efficient, (…)
I see no evidence that universities are more efficient than they were. There is more paperwork, certainly… more administrative staff… but I do not equate this with efficiency. Professors teach no more, if not less, than they did. Contrary to what is sometimes hinted, they do not publish more than they did… though we put many more names on each paper, so it does look like people publish more. We don’t see more textbooks written. Costs have not gone down.
What I think might be happening is that there is a different outlook with respect to risk. In academia, “sure bets” are looked upon favorably… whereas “high risk, high reward” is frowned upon. Don’t even try this: “give us a year to solve this problem we do not know how to solve… we will report back…” This never gets funded, even if the problem is tremendously important and the necessary funds are modest. Instead, you need to have something of the form “we have this very well understood problem with these well established techniques… and we’ll just keep doing what we used to do…”
In effect, it is a cultural issue. Governments want professors to train students who will graduate and move on to safe jobs.
Peter Boothe says:
You are right about universities overall, as their budgets have grown immensely. For the research and teaching portions, however, I think it is inarguable that most faculty are asked to do more with less. Funding is harder and harder to acquire (there is a graph of the average age of first-time PIs going around, and the age has ticked steadily upwards), but funding is becoming more and more important for tenure cases. Class sizes are steadily growing, and an increasing part of the teaching load is being handled by adjuncts.
Perhaps we should look into university administrations to discover the new cracks and crinkles for big thinkers to hide while their ideas gel?
there is a graph of the average age of first-time PIs going around, and the age has ticked steadily upwards
I’d like to see a thorough analysis of this effect. You see, the population is getting older. People retire later. This is all an effect of a healthier population and of the modest gains we make against aging every year. Also, we spend more and more time studying. So I am not particularly impressed to see that people who have grants today are older than before.
It is not that I disagree with your sentiment.
I do see a lot of colleagues running around, supporting a dozen students… managing several large grants… while trying to teach without getting too many complaints… I don’t think this leaves much time for hard problems. Once you have solved your administrative problems, you are probably ready for bed. Every night.
The net result is that the average computer science professor in his 40s couldn’t put together a non-trivial piece of software in a year. But he sure can write 3 grant applications over the summer…
Peter Boothe says:
http://nexus.od.nih.gov/all/2012/02/03/our-commitment-to-supporting-the-next-generation/ (hopefully your comment software will turn the URL into a link?) has the graph that caused the NIH to start worrying. It’s not a concern that the average successful grant recipient age is rising (it’s good, even!); it’s a concern that the average age of the *first* successful grant applicant is rising. A healthy system would not respond to people not retiring by forcing young people (although I mean “people in their 30s” here, so “young people” is not the right term) to spend an extra 6-8 years in career limbo.
For the specific case of computer science I definitely agree that being good at programming is not a skill that is particularly rewarded in the CS academy – it’s part of why I left! I felt like I had to do research, teach, do service, keep up with industry practices, and keep up my programming skills (I was not at an institution that required me to get funding, but they did ask me to teach 3 classes a term). It was too much! I could do any three of those, and the safest ones to drop from a getting-tenure perspective were the programming skills and keeping up with industry practices. But being good at programming and knowing industry practices was what kept my CS classes and knowledge relevant! Faced with the choice of losing my sanity or my relevance, I opted out entirely. Now I am at Google, and am enjoying hanging out in a weird little crinkle of the organization while I try and figure out the best ways of analyzing Internet performance data.
I’ve met a lot of really smart people at Google whose jobs seem valuable, but I can’t quite pin down what their job description actually is. Which is what makes me think Google is (currently) good at supporting smart people in their own weird niches.
it’s a concern that the average age of the *first* successful grant applicant is rising. A healthy system would not respond to people not retiring by forcing young people (although I mean “people in their 30s” here, so “young people” is not the right term) to spend an extra 6-8 years in career limbo.
In academia, through financial engineering and careful marketing, we have ever more PhDs compared to the available jobs (professorships). Meanwhile, the number of grants is often not growing in proportion to the number of jobs. All these imbalances can easily explain why people get research grants later.
Regarding the NIH, it seems to me that there is a lot of academic hype followed with very little actual progress in practice. The solution is not, I think, to get young people earlier on the academic bandwagon.
The model whereby you give grants to professors, who hire PhD students, who do breakthrough research… is not how science must necessarily work. I hope you can agree with this. People like Einstein, Newton, Turing… did not do the work they did by cycles of research grants.
For the specific case of computer science I definitely agree that being good at programming is not a skill that is particularly rewarded in the CS academy – it’s part of why I left!
I suspect, if we go back to the NIH, that trying to cure actual people is not particularly rewarded in the medical academy.
I felt like I had to do research, teach, do service, keep up with industry practices, and keep up my programming skills (…). It was too much! I could do any three of those, and the safest ones to drop from a getting-tenure perspective were the programming skills and keeping up with industry practices.
I suspect that this is true of all CS professors, no matter where they work. I think also that the “getting tenure” angle is overplayed. The fact is, in academia, you are part of a culture where hardly any professor ever programs (except maybe toy examples). Is it any wonder that it is not a rewarded activity?
There is a deeper issue. Practical skills are often low-status skills in our society. So even at Google (where I never worked), there will be a tendency, an irresistible one, to put people who can’t solve practical problems higher up and push down people who have useful skills. I think you can compensate for this, but it is not easy: it requires a strong culture.
Anonymous says:
Wouldn’t basic income mostly solve this problem?
With basic income, if people want to do research, they can just spend their time doing it.
If they need hardware, that is another matter, though; it would have to be either bought or donated.
There ARE lots of very smart people working on crazily hard problems (…) but the “problems” are (…) cretinous!
I agree that working on hard problems with smart people is not sufficient to move our civilization forward.