Health Care

Health care, or healthcare, is the treatment and prevention of illness. It is delivered by professionals in medicine, dentistry, nursing, pharmacy, and other allied and alternative health fields.

The social and political issues surrounding the health care industry in the US make improved health education and the responsible dissemination of information essential to improving the overall health of Americans.

This section aims to bring attention to commonly misunderstood or little-known facts about our health care system, in order to help individuals broaden their knowledge and make smarter decisions about their health.
