THE LEFT, THE RIGHT AND THE CENTRE

In politics, the Left refers to parties and governments that lean towards pro-poor policies, whether through concrete action or mere slogans. This strand of politics holds that unchecked capital (or money) creates an unjust society in which the rich sit atop the hierarchy and bend the state to their advantage.
