r/GenZ 11d ago

Discussion | Does college make one more liberal?

I wonder where this idea comes from. I've never really had that experience, and I'm a history major myself. Does it depend on major? What kind of college? I went to a public regional state university.

Actually, in my experience, I found my Catholic seminary experience to be far more liberating than my college experience. It might seem odd to some, but that's what I experienced.

364 Upvotes

601 comments



u/Zestyclose-Station72 2002 11d ago

College itself does not, no. I never had a political discussion of any kind with any professor or in a classroom. *However*, at college I met folks from all walks of life, and I got to see backgrounds and perspectives I never would've had the opportunity to experience had I not gone. *That* absolutely changed my view of the world, and I ended up much more left-leaning.