    Bill Loughead
    President, SummitMedigap.com, CO, FL, GA, MI, NC, SC & TX
    Yes, health insurance is mandatory in the state of Florida; residents who go without it risk paying a fine.  The Affordable Care Act states that as of January 1, 2014, people are required to have health insurance or pay a fine.  This rule applies not just to Florida but to all states.  An independent insurance agent should be able to show you plans from most of the major carriers and help you pick the plan that fits your needs.  Several independent agencies (like ours) have websites where people can instantly compare health insurance plans online and then ask questions of experienced, licensed agents over the phone.
    Answered on November 23, 2013
