How America Became Capitalist
Imperial Expansion and the Conquest of the West

Has America always been capitalist? Today, the US sees itself as the heartland of the international capitalist system, its society and politics deeply intertwined with its economic system. James Parisot's new book, How America Became Capitalist: Imperial Expansion and the Conquest of the West, published by Pluto Press, examines the history of North America from the founding of the colonies to debunk the myth that America is 'naturally' capitalist.

From the first white-settler colonies, capitalist economic elements were present but far from dominant, and they did not drive the early colonial advance into the West. Society, too, was far from homogeneous, and the role of the state fluctuated. Racial identities took time to imprint, and slavery, whilst at the heart of American imperialism, took both capitalist and less-capitalist forms. Gender categories and relations were also highly complex: standards of 'manhood' and 'womanhood' shifted over time to accommodate capitalism, and there were always some people challenging this binary.
