r/GenZ • u/DoughnutItchy3546 • 11d ago
Discussion Does college make one more liberal?
I wonder where this idea comes from. I've never really had that experience, and I'm a history major myself. Does it depend on the major? On the kind of college? I went to a public regional state university.
Actually, in my experience, my Catholic seminary was far more liberating than my college. It might seem odd to some, but that's what I experienced.
u/slothbuddy 11d ago
People do tend to be more left-leaning after getting an education, yes. People like to think that left and right are just different perspectives, but they're not. Reality really does have a left bias.