1. 63333 POINTS
    Peggy Mace
    Most of the U.S.
    Many employers do offer disability insurance, and a few states even mandate that employers offer it. Should they? If they can afford to do so, it is certainly a great benefit for employers to offer their employees, and it is something good employees look for when deciding which job to take.
    Answered on October 22, 2014
  2. 10968 POINTS
    Tim Wilhoit
    Owner, Your Friend 4 Life, Brentwood TN
    I believe the more useful benefits an employer can offer employees, such as disability insurance, the better. Benefits do a couple of important things for a workforce. They help with morale by showing employees that the company cares. They cut down on costly turnover by improving retention. They also become a great recruiting tool for attracting the best talent to your business. Many employers consider their benefit package their "golden handcuffs" for keeping employees long term. Disability insurance, both short and long term, should be part of that package.
    Answered on October 23, 2014
  3. 47 POINTS
    Kevin Haney
    A.S.K. Benefit Solutions, New Jersey
    Yes, employers should offer disability insurance to their employees. Many carriers provide programs designed to be paid for by employees through payroll deduction. Aside from supporting the payroll deduction, the employer incurs no direct costs.

    Many employees cannot purchase these policies directly as individuals. In particular, female employees of childbearing age want policies that cover their maternity leave, which they can only get if the coverage is offered at work.
    Answered on April 24, 2015