That is an excellent question.
I went to school in a small Southern city. Explaining the Nazis and the Holocaust was a touchy issue. I suppose the State Department of Education figured the kids would learn about it from parents, TV, and other sources besides the classroom. WWII was closer to the present then than it is now, so the truth was always there to seek.
Now, it's all deeper in the past. State Departments of Public Instruction have other matters, other agendas to pursue.
Personally speaking, the more alive, interesting, and real the Nazi era and the World War II that followed can be made, the better US society would be for the effort.
It was indeed the Greatest Generation.
Old Cadillacs never die. The finance company just takes 'em and faaaaades 'em away.