Yes, I've always felt that was so effing weird, you know. It's this emphasis on the world being this scary, dark place where children are snatched from their homes and people are dying left and right. And that's not to say that doesn't happen, but other things happen too, stuff that's not so bad, stuff that isn't either good or bad, things that are really good, and so on. That's life! If anything, and I think we can all attest to this, by making the world out to be a scary, dismal, dark place they make it hard for you to even consider going into it or back to it, because it's scary and/or there's nothing out there. Because I know when I was first facing the idea of leaving, it didn't matter to me (at first) whether it was true or not; what stuck was the idea that "there is nothing out there!" in 'the world.' I mean, imagine that: it doesn't matter if it is "the truth," the world itself is a horrible place to be in.
And I think that's why a lot of people stay. I mean, at times I ask my grandmother about scriptures and policies and practices, and she's almost always pulling stuff out of her ass. She doesn't really read, and she couldn't give a single name of anyone on the Governing Body, past or present... but in her eyes "this world is surely wicked and Jehovah has to come through and clean it up!" And I bet that's the viewpoint of so many. Why even challenge the WTBS when the world itself is so disgusting and evil? Why ruin the "only" good thing in the world for yourself?
To me, it's a silly perspective to have. "The world"? You are always going to be part of the world: you absorbed the culture, the language, the customs, the ethics... You can claim to be different all you want, but the fact remains that everybody is a part of 'the world,' whether they are watching R-rated movies or not, throwing away evil Smurf dolls or not, and so on.