Out of interest, I would just like to ask: why do you believe that the Bible is the word of God? After all, the Bible was written by men, and men are liars. There are many who claim that God speaks to or through them.
I am not a religious person. I have briefly read the Bible and don't place my faith in it, but I believe that religion is valid if it helps people justify their existence or find solace.
Just wondering, when and why did you decide that the Bible held the answers? Was it how you were raised, or did you pick up a Bible looking for answers after going through a difficult time in your life? Or something else?