Hah, when I was in school in the US, you were lucky if textbooks portrayed Asians at all.
Contrary to the popular misconception, we spent a long time talking about the abuses of Native Americans in our US history curricula. But most history classes I took in public school were either:
- World History, from the perspective of Europe.
- American History, up until we get to any bits that someone's parents might have been alive for.
China only entered the stage after the Europeans colonized it. Japan only entered the stage when they bombed Pearl Harbor. India only entered the stage when it became independent from the UK. All of Asian history, what little of it there was, was cast in a Eurocentric light and depicted a continent of people that entirely lacked agency.