"To dominate the market" means to have the highest control, influence, or power over a particular industry or sector. In other words, it refers to being the leading or most successful company or brand in that market, surpassing all competitors.