Since when did the word evangelical become a dirty word? I never try to hide my Christianity. Yet I’ve noticed that some people look at me sideways when they ask whether I’m evangelical, as if that’s just too weird. I know what they’re thinking because I used to think the same thing. They equate evangelical Christians with Republicans and Tea Party members and meat eaters and people who wear lots of polyester. Politically I am somewhere between leftist and too indifferent to be either a Republican or a Tea Party member. I’m a fallen vegetarian. I prefer natural fibers, but I have no moral repugnance to man-made ones; even polyester is useful on occasion. But I’m still determined to be a Christian—does that make me (that dirty word again) evangelical?
(Never, ever depend on me for a definitive explanation of anything theological. What I write has no theological basis; it’s just an observation.)
I’m not even sure what the term evangelical Christian means any longer. I think it’s supposed to mean that a person who believes in Jesus is called to proselytize, to “spread the faith to all nations.” No one ever reached me by preaching at me, and I’m not comfortable preaching to others. I’m just going to live my life the best way I can, try to “live the Gospel,” and not be an embarrassment to Jesus. Sometimes I fail.
For me it’s okay to defy description. It’s okay to be a liberal, a Bible-toting Christian, a feminist, and a banjo player all at the same time. None of those labels is inconsistent with being evangelical, is it? Can I simply work on being a good Christian and not worry about the labels?