A curious thing has been on my mind as I prepare to leave the country for the summer. I hadn't noticed it until I went to Japan last year, and going away again has caused this idea to resurface.
It's funny how when Americans talk about going abroad, we often refer to it as going to a "foreign country". "When you go to a foreign country...", "When you're in a foreign country...", etc.
Depending on how you look at it, yeah, those places are foreign to us. But it's like it never occurs to us as Americans that if we're in someone else's country, we are the ones who are foreign. We would be the foreigners. And as foreigners, it would be on us to learn how to be gracious and respectful guests, no matter how uncomfortable we are.
Admittedly this is a sweeping generalization, but perhaps there's a certain arrogance that comes with being American. It seems we have a hard time seeing ourselves in situations where we're neither in control nor the most important people in the room. We're so attached to what we've claimed to be "our land" that we can't even embrace the idea of being foreign.
And that's a shame. Because depending on the situation, being a so-called "foreigner" can be a very humbling and rewarding experience.