From Hollywood's beginnings, Black actors were largely confined to roles as subservient maids and sharecroppers in movies with regressive, racist messages. But over the last century, there have also been movements to present Black people as real, nuanced human beings with stories worth telling on film.