Everyone knew this culture has permeated Hollywood, and every related industry, for as long as it has existed. Of course they were all abused. People put up with it for the sake of their careers, and now they've simply stopped covering for these people. Especially since Cosby and everything else.
It's not just Hollywood. It's everywhere. Glad to see these a-holes being called out and their lives ruined.