The decline has been in progress for years, and it's not all the schools' fault. I recall running into a former teacher of mine quite a few years ago.
I asked him how the school was. He replied that he had quit and was selling insurance because anytime he tried to discipline some brat who deserved it, the kid's parents would come in and yell at him!
While that may be true, it misses the point. This is 2018, not 1950, and the world has changed. Would you like to try answering my questions so we can have a rational discussion? I will copy them below:
I am curious about this. Do you think it is the public schools' responsibility to teach religion? Specifically, do you think it should be one particular religion, given the Constitution and the Bill of Rights? What role should your church and parents play in teaching religion? Should children in school be forced to pray and read the Bible?
For your information, I spent thirty years teaching in the public school system, and I do not remember God being ousted from my classroom. When did this happen? One more piece of information: I am a fiscal conservative, but I am not a Bible-thumper. Religion has little to nothing to do with being a political conservative.