WHAT DOES THE TERM "LEGALISM" MEAN?
Legalism is any doctrine which teaches justification by the law, any doctrine which teaches sanctification by the law, or any attempt to bring the people of God under the bondage of the law. The children of God have no desire to break the law in any point. We love the law, and we love him who fulfilled the law for us. But we will not be brought again under the yoke of bondage. We are free-born children in Christ.