Does it matter what you have faith in? Will anything you believe in firmly enough tend to become real? For example, "not all boys have penises." I guess it depends on what you mean by real. I would say that, obviously, in some meanings of the word real this never becomes real. But also, obviously, for some people and some meanings of the word real this does become real.
So what would this mean in our everyday world? Well, for some people this would mean their delusions became real. They do see unicorns. And they see racism everywhere.
What I am thinking is: does faith require a belief in God? I do not get poison ivy. I do not catch the flu, or Covid (it’s a flu). If this is just something I believe, and therefore know to be true, and it works, does that require a belief in God? No.
If I believe, and act on the belief, that I will become a professional baseball player, will that tend to become true? Yes, especially if I start early. Being in my 60s makes it a little bit harder to believe, but a 42-year-old just won the Super Bowl!
“If I can see it, I can be it.” Hmmmm.
So why believe in God, from a practical point of view? What does it gain you? If you can have it all simply by believing you can have it all, how is believing in God any different?
I am going to think on this. I do not want to just throw out a religious answer, which is often just an appeal to authority. There should be a good practical answer too.