Before Hollywood became known as the center of the film industry, directors filmed on the East Coast. It wasn’t until the 1910s that film production started moving west to California, due to lack of ...