"Finance capitalism" refers to an economic system where financial institutions, like banks and investment firms, play a prominent role in generating profits and shaping the overall economy. It involves the idea that financial activities, such as lending money, investing in businesses, and trading securities, have a significant impact on economic growth and wealth accumulation. In essence, it highlights the influence of finance sector in the workings of capitalism.