Growing up, I thought Christians should demonstrate our faith by what we didn’t do: Don’t watch certain movies; don’t engage in certain behaviors; don’t hang out in certain places.
But now, I think that’s wrong. Above all else, Christians should be known for the things we do. We should spend our energy fighting racism (even if we’re white), fighting sexism (even if we’re male), fighting poverty (even if we’re rich), and so forth. Basically, we should be constantly watching for who is hurting, who is marginalized, who is being treated as “less than”—and we should join them in their struggle, and make their cause our own.
That’s what Jesus did. And yes, there are still unhealthy or immoral things you shouldn’t do, but at the end of the day, I’m more interested in hearing about what you did do…and how you treated others with the love you say you believe in.