No, actually most historians believe that WWI put an end to the Progressive Era. Other "progressive" periods in the United States include the New Deal era (1933-1936) and, to an extent, the New Frontier & Great Society (1960-1968), which have intellectual and policy-related roots in the Progressive Era.