An overwhelming majority of Americans agree it is important that the United States is generally respected by other countries around the world, according to a Pew Research Center survey conducted in February. And a new parallel survey of international relations scholars finds that academics are in lockstep with the U.S. public on the importance of America’s image abroad.
Jacob Poushter & Mara Mordecai, Pew Research Center