Prompted by a post from @krishnanrohit: “I’m once again registering my annoyance at the fact that EVERY SINGLE NATURE DOCUMENTARY talks about how humans suck. Literally every single one. I am so tired of explaining to my 7yo son that no, humans are not destroying everything. That he can be optimistic. It’s obscene.”
I mean. Have you stopped to consider why you can’t find a nature documentary to fit a “humans aren’t destroying the planet” narrative?
Because most of them are directed by white people who bought into the colonial myth that it’s impossible to form symbiotic relationships with nature?
Yup. And it’s not just white people! In a very international course I took with Bija Vidyapeeth about 20 years ago, at least one of the non-white participants shared the view that any human engagement with the rest of the natural world was bound to be a negative. I knew less then, but I did recall and share research from the Amazon documenting higher local biodiversity in areas where humans lived than in ecologically similar areas that had been left alone.
The elites of many countries have absorbed the same Western-dominated views that those of us living in the West are bombarded with.