Do all full-time jobs have to offer health insurance?
No, absolutely not. Health insurance was not traditionally offered as a condition of employment. Around World War II, wage and price controls were in place, so to entice people to come work for them, employers began offering health insurance instead of higher wages. In those days it cost only a few dollars and never had to cover today's end-of-life expenses …