Religions are meant to teach good discipline and values to people, and every religion teaches us love and kindness. Yet in the name of religion, we humans fight with each other. So my question is: do religions do more harm than good to society? Guys, I would like to know your thoughts on this, so let me know your opinions in your replies.