Germany declares war on the United States.
Adolf Hitler declares war on the United States, drawing America, which had until then remained neutral in the European war, into the conflict.