At the dawn of the 21st century, something new may be happening in the heartland of America: the spread of a negative image of France.1 Traditionally, a mostly positive image of France, linked to its reputation for good food, high fashion, and sophisticated tourism, coexisted with a somewhat negative image in certain elite circles. But the most important factor was undoubtedly a lack of knowledge; above all, indifference reigned supreme. (See Body-Gendrot in this issue.)