When the United States entered World War I, the country braced for impact. But Black Americans who still worked in the fields of the South saw an open door. With the country’s white men off to fight ...