When people tell you it's important to believe in "something", do they ever specify what that something is or why it's good?
I keep hitting walls with those kinds of questions, which I feel compelled to ask when I hear people sing the praises of belief - not gods, just belief - and how important it is to imprint that onto children.
It makes me angry because this is one of those subtle ways of showing bigotry and encouraging a divide that believers may not even be aware of, and when asked about its merits, they just shrug.
While they leave the "something" conveniently vague, the one idea the child will almost certainly walk away from this education with is their parents' conviction that it is very important to believe in "something". This leads to the notion that people who do not are lacking; that there is a part of us missing or broken.
I usually say that I believe I am standing in front of them, and that morning follows night. But that's not their point. What they are asking is whether you have faith. I like to ask what their idea of belief or faith is. When I have the time, I like to engage such people, as long as they are capable of cordial discourse.
I try to believe in gravity, but it keeps letting me down.
I haven't heard of parents pushing "something", but I don't doubt it.
My parents raised my sisters and me in such a way that they introduced us to religion (both the Catholic and Southern Baptist faiths) and always told us that they didn't care WHAT we believed in, just to believe in something. We took all of that into consideration, and now my sisters and I believe in OURSELVES: taking responsibility for our actions and being grateful for those around us who have had our backs during troubling times. We owe nothing to a deity, and we're better people for it....when credit is given where credit is due, other human beings seem to appreciate it more than the sky daddy does.