Healthcare in the United States is costlier than care in other countries, but a new analysis suggests the cost may be worth it, at least when it comes to cancer care. Americans pay more for cancer treatment, but they also live longer after diagnosis, a survival benefit that offsets their higher health expenses, the study says.