"To take the reins" means to assume control or leadership over a situation. The phrase derives from the literal act of controlling a horse by holding its reins, and it implies taking charge and being responsible for making decisions or guiding a situation.