As I’ve said, unless you go syllabus by syllabus, it’s very difficult to find out what the current mainstream teaching is in any academic field. How many archeologists teach their students that the first Americans came from Beringia by land? How many teach that it was more likely by sea?
I have some very specific questions:
How did American pragmatic philosophy disappear?
What departments teach Marx as a foundational scholar? When did that start?
How did land grant universities give up on “the liberal and practical education of the industrial classes”? All at once? Or slowly, over time?
Who teaches their students “rape isn’t about sex, it’s about power”? What academic department?
Is someone teaching students that “white supremacy is a hegemonic ideology”? What department? Is it agreed or controversial?
When did economists begin to claim the ability to predict the future? Did other departments notice?
How did critical legal studies start? Did someone decide it was OK to apply anti-science theories about how to read books to real life, or did no one notice that's what they were doing?
How does sociology exist? Who thought it was ok to reject Darwin? How did that happen? Why can’t I blame Marx?
A theory about sociology I am working on, inspired by what's right and what's wrong about Tressie McMillan Cottom: established academic departments like economics and the hard sciences haven't let black people and women join; as a result, all the new smart black women are in fields like sociology, which were much more wide open because they were much less respected. These new scholars then adopt the established wisdom of the field. Indeed, they prove themselves worthy by becoming Durkheim and Marx experts. The problem is that this means they inherit continental philosophy, which is a bunch of self-referential nonsense.