Bachelor's Degrees in Earth Sciences in United States

Earth Sciences degrees

Earth Sciences degrees focus on the physical aspects of our planet. Earth Sciences is an interdisciplinary field dedicated to understanding the Earth's atmosphere, hydrosphere, biosphere, and geosphere, as well as the effects of human activity on the environment, with the broader aim of preserving the systems that maintain life on our planet. Earth Sciences degrees can lead to careers in Geology, Environmental Sciences, and related fields.


Study in the United States

Study in the U.S.A., home to some of the world's most prestigious universities and colleges. The United States is internationally renowned for its top business, medical, and engineering schools. International students in the U.S. can choose from a huge variety of Bachelor's and Master's degrees offered by some of the best universities in the world.

