There is no universal healthcare in the U.S., so most people buy health insurance.
Health insurance protects you from owing a lot of money to doctors or hospitals if you get sick or hurt and need medical care. It can also help you pay for regular check-ups, laboratory tests, vaccines, and prescription medications.