I noticed that in a lot of areas I am more British than German.
Moreover, I am currently planning my semester abroad, which will take place in less than a year, and I desperately want to go to Great Britain. Unfortunately, my university has only three partner universities in the UK. About 500 people apply each semester, and only two are chosen for each university (at least if you want to go as an Erasmus student, in which case you don't have to pay tuition). The reason only two students are chosen is that this is based on an agreement between the German and British universities, and apparently no one wants to come to Germany. This doesn't only apply to Great Britain but also to France, for example. I can understand this somehow; if I could choose, I wouldn't choose Germany either. Actually, I would love to move to England someday in the future.
Anyway, I am really curious: how much do people (non-Germans) learn about German history? And how much do you know about Germany as it is today? What do you think about Germans (please be honest)?
This woman also talked a lot about stereotypes, told us about some of her experiences, and asked us about ours. A lot of the other students told stories about being asked about Hitler while abroad, and about how many people think of Germans as Nazis. In fact, one student had been asked (not by a Briton, though) what Hitler is doing today. I am always quite shocked to hear that many people don't really know much about this and that they associate Germany with nothing but Hitler. Admittedly, I am not very patriotic, but the Germany of today is not the Germany of 70 years ago.
Also, as a German, it is quite unthinkable not to know about Germany's past, and you can't imagine that other people don't learn about it. Of course, history cannot be erased, so we learn a great deal about it at school, mainly to prevent such a thing from ever happening again. Sometimes it almost feels awkward to admit to being German.
Again, people shouldn't forget about this time in history; on the contrary, they should learn about it (and Germans do, but I don't know how it is in other countries). But I also think it is important to remember that it is in the past.
So, when you hear "Germany", do you really immediately think of WWII and/or Hitler? Do you learn a lot about it at school? Do you associate Germany with anything else that doesn't have to do with the above-mentioned issues?