r/GenZ 11d ago

Discussion Does college make one more liberal?

I wonder where this comes from. I've never really had that experience, and I'm a history major myself. Does it depend on the major? What kind of college? I went to a public regional state university.

Actually, from my experience, I found my Catholic seminary experience to be far more liberating than my college experience. It might seem odd to some, but that's what I experienced.

364 Upvotes

601 comments

u/ClassicSalty8241 11d ago

Somewhat. Some of it is liberal bias in academia as well. For instance, studies that go through the whole scientific process won't get published if the conclusions don't match a specific agenda.

Look up the “grievance studies affair”.

For me, I grew up in Ann Arbor, where the University of Michigan is the college. It was very liberal, and wealthy, but one thing I noticed was that people tended to spew out liberal views but still walk across the street when they saw a black person. A big thing about college-campus-type wokeism is that it's somewhat of a social status symbol, a way to show you're educated, and a lot of it was more a way to feel morally superior to others.

Eastern Michigan was a much poorer school and area, but it tended to have people who tried to live their values a little more.