dominionism


do·min·ion·ism

(də-mĭn′yə-nĭz′əm)
n.
1. The theory or doctrine that Christians have a divine mandate to assume positions of power and influence over all aspects of society and government.
2. The belief that God gave humans the right to exercise control over the natural world.
do·min′ion·ist n. & adj.