Bachelor's Degrees in Social Work in United States

Social Work degrees

Social Work degrees teach students how to improve the well-being and quality of life of vulnerable groups such as children, young people, the homeless, and minorities. Social Work studies draw on theories from Sociology, Education, and Psychology, as well as knowledge of public law. Students learn how to support people in need and how to empower them to take action and solve the problems they face.

Study in United States

The United States is home to some of the most prestigious universities and colleges in the world. With over 150 universities featured in international rankings, the U.S. has some of the best business, medical, and engineering schools. Universities and colleges in the U.S. are well known for academic flexibility and for letting you customize your study experience through elective courses and extracurricular activities. Depending on where you study, you may be able to visit iconic places like the Statue of Liberty, the Empire State Building, the Golden Gate Bridge, the Grand Canyon, Mount Rushmore, Disney's Magic Kingdom Park, and much more.
