The #MeToo movement exposed the very ugly side of Hollywood's misogyny, with hundreds of claims of sexual harassment and assault made not just against Harvey Weinstein but against many other men working in the industry. What is surprising is that the film industry started out with many women filmmakers. It was only once the industry really began to make money that women were pushed aside. Can they now muscle their way back into more positions of influence?