Hollywood (or rather, Hollyweird) is just showing who and what they are via the Oscars these days. It used to represent American values and pride. They actually made flicks with value and meaning back in my parents' era. Movies that resonated with most of the nation. Movies that most Americans couldn't wait to pay to see. It has been evolving and "fundamentally transforming" since that era into something my parents wouldn't recognize if they were alive now.
Bottom line... the movie that won "Best Picture" is one that most folks here will probably never see. Just shows how out of touch and tone-deaf the liberals in Hollywood have become.