I changed jobs during the pandemic. I asked if I could work remotely permanently, and they said yes; it's in my contract that I work from home, not the office. I've been watching the "sea change" as remote work has been rolled back at various companies and wondering why. If all the research points to it being better, then - again - why? The speculation that it's related to real estate is depressing!
I did once, for about nine months when I was 18.
Then an exorcism happened; I became doubtful and eventually stopped believing.