Comment by archi42
2 days ago
I treat all social science degrees as "likely bullshit" these days. Might as well be astrology.
A few computer science friends of mine worked at a social science department during university. Their tasks included maintaining the computers, but also supporting the researchers with experiment design (when computers were involved) and statistical analysis. They got into trouble because they didn't want to use unsound or incorrect methods.
The general train of thought was not "does the data confirm my hypothesis?" but "how can I make my data confirm my hypothesis?". Experiments were often biased toward achieving the desired results.
As a result, scientific misconduct was business as usual, and the guys eventually quit.
Glad to know they quit. That's exactly what I observed, except it was probably worse when I think back on it. I'm a mathematician "by trade", so I was sort of pulled into this by proxy because they were out of their depth in a tangle of SPSS. Not that I wasn't out of my depth too, but at least I had a conceptual framework in which to do the analysis. I had no interest in or knowledge of the field, but when you're working with someone in it you have to toe the line a little bit.
Observations: First, inventing a conclusion is a big problem. I'm not even talking about a hypothesis that needs to be tested, but a conclusion: a vague, ambiguous hypothesis that was likely true anyway would be invented to support the conclusion, and the relationship inverted. Then data was selected and fitted until there was a level of confidence at which it was worth publishing. Second, they were using very subjective data collection methods, run by extremely biased people, then mangling and interpolating the results to make it look like there was more observational data than there was. Third, when you do some honest research, it doesn't get published, because it looks bad to say the entire field is compromised right before the conference that everyone is really looking forward to and has already booked flights and hotels for.
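To make the "select and fit until there's a publishable level of confidence" part concrete, here's a minimal, purely illustrative Python sketch (hypothetical numbers, nothing to do with the actual work described above): test enough unrelated predictors against an outcome and one of them will clear p < 0.05 by chance alone; report only that one and pure noise looks like a finding.

    # Purely illustrative: 20 predictors that have nothing to do with the outcome.
    # By chance alone, roughly one of them will come out "significant" at p < 0.05;
    # report only that one and noise looks like a result.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    n_subjects, n_predictors = 50, 20

    outcome = rng.normal(size=n_subjects)                     # no real effect anywhere
    predictors = rng.normal(size=(n_predictors, n_subjects))  # all pure noise

    tests = []
    for i, x in enumerate(predictors):
        r, p = stats.pearsonr(x, outcome)  # correlate noise with noise
        tests.append((p, r, i))

    p, r, i = min(tests)  # cherry-pick the "best" predictor
    print(f"predictor {i}: r = {r:.2f}, p = {p:.3f}  (pure chance, looks publishable)")

Correcting for multiple comparisons, or simply reporting every test that was run, makes this kind of "result" evaporate.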
If you want to read some of the hellish bullshit, look up the critiques of Q methodology.
Sounds like economics.
Research fraud is common pretty much everywhere in academia, especially where there's money, i.e. adjacent to industry.
It does rather depend on the industry. Research in fields relevant to electrical engineering is much less likely to be fraudulent, because the industry actually uses the results to make products and the customers depend on those products working as specified. If you discover a better and cheaper ceramic insulator, you can be confident that transformer manufacturers will take it up, but the big companies are well stocked with experts in the field, so a fraudulent paper will quickly be spotted.
Graphene in electrical engineering is a staple of every (dis)reputable papermill.
Let me introduce you to theoretical condensed matter physics, where no one cares if the data confirms the hypothesis, because they are writing papers about topics that very likely can never be tested.
At least in the social sciences there is an expectation of having some data!
That's actually what I find confusing about people constantly negging on the social sciences [1].
There are huge amounts of data available (geography: lots and lots of maps; history: vast amounts of historical documentation; economics: public datasets produced every month by most governments; political science: censuses, voting records, driver registrations, and political contest results from all over the Earth, often going back decades if not centuries).
Most of it is relatively well verified, and often tells you how it was verified [2]. Often it's obtainable in publicly available datasets that numerous other researchers can confirm came from a legitimate source. [3][4][5][6][7][8][9][10][11][12]
There's lots of data available, and much of it is also verifiable in a very personal way, simply by walking somewhere and looking. In many ways, the social sciences should be among the most rigorous disciplines in academia.
[1] Using Wikipedia's grouping on "social sciences" (anthropology, archaeology, economics, geography, history, linguistics, management, communication studies, psychology, culturology and political science): https://en.wikipedia.org/wiki/Social_science
[2] Census 2020, Data Quality: https://www.census.gov/programs-surveys/decennial-census/dec...
[3] Economic Indicators by Country: https://tradingeconomics.com/indicators
[4] Our World in Data (with Demographics, Health, Poverty, Education, Innovation, Community Wellbeing, Democracy): https://ourworldindata.org/
[5] Observatory of Economic Complexity: https://oec.world/en
[6] iNaturalist (at least from a biological history perspective): https://www.inaturalist.org/taxa/43577-Pan-troglodytes
[7] Coalition for Archaeological Synthesis, Data Sources: https://www.archsynth.org/resources/data-sources/
[8] Language Goldmine (linguistics datasets): http://languagegoldmine.com/
[9] Pew Research (regular surveys on economics, political science, religion, communication, psychology - usually 10,000 respondents United States, 1000 respondents international): https://www.pewresearch.org/
[10] Marinetraffic (worldwide cargo shipping): https://www.marinetraffic.com/en/ais/home/centerx:-12.0/cent...
[11] Flightradar Aviation Data (people movement): https://www.flightradar24.com/data
[12] Windy Worldwide Web Cameras: https://www.windy.com/?42.892,-104.326,5,p:cams
People who hate "social science" are surely casting too wide a net, but there's plenty of terrible research hiding under that umbrella, relying exclusively on social media, internet surveys, and self-reported data, and it absolutely deserves criticism.
A lot of psychology research involves data not from these datasets, though.
The complaint is that their data often doesn't strongly support the hypothesis, and dubious statistical techniques are applied to make it appear otherwise. Sometimes it's not even malicious intent, just poor statistics skills.
Physicists get away with it because they often just don't do any statistics. Often the data aligns so well with the hypothesis that you don't need any sophisticated techniques, or their work doesn't involve any data (like my example in my prior comment).
Most US-trained physicists have never taken a course in statistics. It's not in the curriculum at most universities. When I was in school and would point this out, the response was always "Why do we need a whole course in statistics? We learn it in quantum mechanics."
No. That's probability you learn. Not statistics. Probability tells you what data to expect given a model; statistics tells you what you can infer about a model given the data.
In the social sciences (and medicine) people take a lot more statistics courses because the systems are much more complex than typical physics systems: a lot more confounding variables, and so on (a toy sketch of what a confounder does is below). They simply need more statistics.
(Yes, yes. I know. There's probably some experimental branch in physics where people actually do use statistics. But most don't).
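Here's that toy confounder sketch, with entirely made-up numbers (not from any real study): a hidden variable Z drives both X and Y, so X and Y correlate nicely even though neither affects the other, and the apparent effect vanishes once Z is adjusted for.

    # Toy confounder: Z drives both X and Y, so X and Y correlate
    # even though neither causes the other. Regressing Z out of both
    # ("adjusting for Z") makes the apparent relationship vanish.
    import numpy as np

    rng = np.random.default_rng(1)
    n = 10_000

    z = rng.normal(size=n)            # the confounder
    x = 0.8 * z + rng.normal(size=n)  # X depends on Z only
    y = 0.8 * z + rng.normal(size=n)  # Y depends on Z only

    def residualize(v, z):
        # Remove the linear effect of z from v (simple least squares).
        zc = z - z.mean()
        slope = np.dot(v - v.mean(), zc) / np.dot(zc, zc)
        return v - v.mean() - slope * zc

    raw = np.corrcoef(x, y)[0, 1]
    adjusted = np.corrcoef(residualize(x, z), residualize(y, z))[0, 1]

    print(f"raw correlation of X and Y: {raw:.2f}")     # about 0.4
    print(f"after adjusting for Z:      {adjusted:.2f}")  # about 0.0

A physics experiment can often isolate X and Y so that nothing like Z exists; in social science and medicine you usually can't, so you need the statistical machinery to handle it.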
I’m not ragging on the whole field. If I narrow it down too much they’ll know who I am and you will know who they are.
I’ll reduce it to a part of psychology.