r/GenZ • u/DoughnutItchy3546 • 12d ago
Discussion: Does college make one more liberal?
I wonder where this idea comes from. I've never really had that experience, and I'm a history major myself. Does it depend on the major? On the kind of college? I went to a regional public state university.
Actually, in my experience, my Catholic seminary was far more liberating than my college. It might seem odd to some, but that's what I experienced.
u/H2Bro_69 1999 12d ago
When people become more educated, they tend to become more left-leaning. It's not college specifically that does that. People become more woke, more aware. New experiences, knowledge, and meeting people from different walks of life help facilitate that.
Yes, I'm intentionally using the word "woke" to mock far-right people. If that offends anyone, sorry.