Overview of Quality Assurance inside Healthcare Organizations
Analyzing survey implementation methods and data interpretation
Student: Vlad Alexandru Sandu
Scientific Coordinator: Conf.dr.ec. Corina Dumitrescu
Universitatea POLITEHNICA din București
Facultatea de Antreprenoriat, Ingineria și
Managementul Afacerilor
I. Main concepts of Quality of Care.
What is Quality of Care?
When we say quality of care, we mean the healthcare activities that we in the medical, nursing,
laboratory and related fields perform daily to benefit our patients without causing them harm.
Quality of care demands that we pay attention to the needs of patients and clients. We also have
to use methods that have been tested and shown to be safe and affordable and to reduce deaths,
illness and disability. Furthermore, we are expected to practice according to the standards laid
down by clinical guidelines and protocols.
What is the Importance of Monitoring Quality of Care?
• Monitoring helps us to identify gaps in the quality of our health care delivery.
• It provides lessons to learn from as we progress with our implementation.
• It tells us if we are making progress in improving quality of care.
Monitoring therefore helps us to identify problems with the implementation of our plans
so that we can take the necessary steps to achieve our targets.
Methods for Monitoring Quality of Care
There are many methods of monitoring quality. The common ones include:
• Review of routine health information. For example, Health Management Information
System data on OPD attendance, in-patient admissions and deaths, and immunization
coverage.
• Client satisfaction surveys.
• Patient complaints system.
• Critical incidents – adverse events.
• Mystery clients
• Supervision
We shall now discuss each of these quality-monitoring methods.
Client Satisfaction Survey
This is a good way of getting the clients' views on our services.
• It tells us what the clients expect from our health services.
• By telling us their expectations and making suggestions, clients indirectly participate
in the decision-making process of the facility.
• It promotes services that are sensitive to the needs of the client.
Preparation for the survey:
It is important to prepare very well before starting any client satisfaction survey. The
quality assurance team should:
• Identify the objective of the survey. We need to be clear about what we want to achieve at
the end of the survey. It is only when we get our objectives right that we can know the
relevant data to collect.
• Develop your questionnaire. There is currently an existing questionnaire on satisfaction,
which is widely used by health facilities (refer to appendix 1A).
• You may have to translate the questionnaire into the local language. This should be done
and agreed upon before the interviews are conducted.
• Determine the number of people to be interviewed (the sample size). It is recommended
that a minimum of 50 clients be interviewed in a clinic or health center survey.
• Select and train the interviewers on how to conduct the interviews.
The interviewers should not be known to the clients.
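A back-of-envelope calculation suggests why at least 50 interviews are advised: the margin of error of a survey proportion shrinks as the sample grows. The sketch below is illustrative only; the function name and the 70% satisfaction figure are our own assumptions, not part of the manual:

```python
import math

def proportion_margin_of_error(p, n, z=1.96):
    """Approximate 95% margin of error for an observed proportion p
    from a simple random sample of size n."""
    return z * math.sqrt(p * (1 - p) / n)

# With 50 exit interviews and an observed 70% satisfaction rate:
print(f"+/- {proportion_margin_of_error(0.7, 50):.1%}")  # +/- 12.7%
```

Even at n = 50 the uncertainty is sizeable, which is why 50 is a minimum rather than an ideal.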
When do we collect the data?
Information should be collected from clients when they are about to leave the facility. This
is called the EXIT interview.
How do we collect the data?
These are the measures that should be taken when conducting exit interviews.
• Spread data collection over a period of two weeks, i.e. 10 working days (5 interviews
per day, Monday to Friday).
• Select patients randomly. You will have to decide whether you will select every 3rd,
4th or 5th person, and keep to that order.
• Number your questionnaires consecutively (1, 2, 3, 4, 5, ...).
• Before interviewing a client, introduce yourself and seek his or her consent.
• Explain briefly why you are carrying out the survey (to help improve services
for clients).
• Let the same person interview all the clients to ensure that questions are asked the
same way.
• The interviewer should not be in uniform.
• Do not influence the client's responses.
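The selection rule above (every 3rd, 4th or 5th person, up to 5 interviews per day) is a systematic sample. It can be sketched as follows; the function and variable names are illustrative, not part of the manual:

```python
def select_exit_interviewees(departing_clients, interval, daily_quota=5):
    """Systematically pick every `interval`-th client leaving the facility,
    stopping at the daily quota (e.g. 5 interviews per day)."""
    picked = departing_clients[interval - 1::interval]  # every k-th person
    return picked[:daily_quota]

# 20 clients leave today; interview every 4th, at most 5 interviews.
clients = [f"client-{i}" for i in range(1, 21)]
print(select_exit_interviewees(clients, interval=4))
# ['client-4', 'client-8', 'client-12', 'client-16', 'client-20']
```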
Complaints/Suggestions
A complaint box, as the name suggests, involves placing a clearly labelled box in an open
place, e.g. the reception. Attached to the box are a pen and paper, which clients use to write
down their complaints and suggestions. There should be a person responsible for emptying the
box, analyzing the complaints and reporting the findings regularly to management for action.
When using the complaint box, the following should be noted:
1. It should not be possible to identify those who make the complaints; otherwise it would
scare off clients or patients who would like to complain about the quality of services.
2. Prompt investigations should be carried out and feedback given to clients who
provide their address.
3. Staff should not sit by the box.
There are some problems that relate to the use of the complaints box. Among them are the
following:
1. The box may not be opened for very long periods.
2. People may write about things that are not related to the quality of service.
3. They may also use it to make accusations against health workers.
4. It is not useful in an area where many of the clients are illiterate.
Tools for Monitoring
Fig. I -1 Yardstick for measuring Staff attitude
You need to use indicators to make monitoring meaningful. An indicator can be defined
simply as the yardstick by which you measure progress.
Indicators are derived from standards. Depending on what you set out to do, you may select
indicators that will help you measure it.
We can categorize indicators for monitoring quality into client and professional
perspectives. Client-defined indicators are those derived from the client's expectations, and
professional indicators are those derived from professional standards.
Fig. I. 2 Climbing the steps to achieve quality standard
Table I-1 Indicators for Monitoring Quality on Patient Satisfaction (OPD)
No. INDICATORS
1. Proportion of patients seen promptly
2. Proportion of patients seen without an unnecessary delay
3. Proportion of patients examined by the Doctor
4. Proportion of patients told about the diagnosis
5. Proportion of patients given instructions about how to take their treatment
6. Proportion of patients told whether or not to return
7. Proportion of patients having privacy during consultation
8. Proportion of patients receiving all drugs prescribed
9. Proportion of patients perceiving staff attitude to be very good
10. Proportion of patients perceiving the clinic to be clean
11. Proportion of patients seeking emergency treatment in the past 6 months who were seen promptly
12. Proportion of patients feeling very satisfied with their visit
13. Proportion of thirty (30) essential drugs in stock
The table above provides a set of indicators widely used in health facilities in the Ghana
Health Service to monitor quality from the clients’ perspective.
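Each indicator in Table I-1 is a simple proportion over the survey responses. Below is a minimal sketch of how such indicators could be computed from coded exit-interview data; the field names are hypothetical:

```python
def indicator_proportion(responses, field):
    """Proportion of respondents answering 'yes' (True) for a given field,
    ignoring respondents who were not asked or did not answer."""
    answered = [r[field] for r in responses if field in r]
    if not answered:
        return None
    return sum(answered) / len(answered)

# Four coded exit interviews (field names are invented for illustration).
responses = [
    {"seen_promptly": True,  "told_diagnosis": True},
    {"seen_promptly": False, "told_diagnosis": True},
    {"seen_promptly": True,  "told_diagnosis": False},
    {"seen_promptly": True},
]
print(indicator_proportion(responses, "seen_promptly"))  # 0.75
```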
Tools for Collecting Data and Use of Information
Before you set out to collect data for monitoring the progress of your QA, you need to
agree on how you are going to collect the data.
The common tools used for data collection during monitoring are:
• Checklist
• Observational guide
• Questionnaires
• A combination of all three.
Checklist
A checklist contains the important information you will need to collect to help you monitor
quality in your facility. It lists the important points that should guide you to ask the necessary
questions and make the required observations. A sample checklist can be found at appendix 1B.
Observational Guide
It is a list of key points that will guide you to observe the important activities that you need
to take note of.
We can use this method to assess staff attitude at the OPD by observing how patients are
handled by health staff at the various points during OPD consultation.
We can also use an observational guide to assess how sick children are managed at the OPD
by sitting in the consulting room and quietly observing the process of consultation, using, for
instance, the sample observation guide at appendix 1C. The rating scale provided with the guide
gives the result of the observation a numerical value.
Questionnaire
A questionnaire is a useful tool containing questions on key issues that you want to know
about. There are several types of questionnaires. A few of them are stated below:
• Structured questionnaire: provides possible answers for the person being
interviewed to choose from.
• Open-ended questionnaire: the person being interviewed is encouraged to come up
with his or her own answers.
• Semi-structured questionnaire: combines both structured and open-ended questions.
Dissemination of Information on Quality Assurance
The purpose of gathering information about quality is to improve our services. People
are more likely to use the information when they understand it, hence the need for creative ways
to disseminate it. It is important to discuss your findings first with management before presenting
them to the general staff body and the community.
Find below some guidelines for dissemination:
• Findings from monitoring should be presented in a very clear manner so that staff
can easily understand them.
• Findings should be presented as absolute figures, proportions or percentages,
or in pictorial form, e.g. line graphs, bar charts, pie charts and histograms.
• Always remember that after initial discussion of your findings with management,
you would have to follow it up with a written report so that they can act where
necessary.
• Findings should be displayed on staff notice boards.
• Always remember to hold staff durbars to inform them about your findings.
The findings and the proposed solutions should be shared with clients and the community.
The use of audiovisual equipment, such as video, will make your message clearer. [1]
Below is an example of bar charts displaying quality indicators.
Fig.I -3 Example from QA Monitoring
Fig. I -4 Discussions about the performance of the facility
Table I-2 Evolution of the clinic applying the monitoring tools and receiving feedback
(% of respondents per quarter)

KEY ISSUE                                                    1st  2nd  3rd  4th
The OPD is not clean                                          75   50   40   35
I was not told how to use my drugs                            50   20   15    0
Nurses at the OPD are very rude to patients                   65   40   35   20
I was not told when to come back                              30   30   15    5
I wasted too much time before seeing a Doctor                 75   60   45   25
I was not given receipts for the drugs I was supplied with    45   40   25    0
The Doctors come to work too late                             85   85   70   50
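The improvement shown in Table I-2 can be summarized programmatically. A small sketch computing the percentage-point change from the 1st to the 4th quarter for a few of the key issues (data copied from the table above):

```python
quarterly = {  # % of respondents reporting the issue, quarters 1-4
    "The OPD is not clean": [75, 50, 40, 35],
    "I was not told how to use my drugs": [50, 20, 15, 0],
    "The Doctors come to work too late": [85, 85, 70, 50],
}

for issue, pct in quarterly.items():
    change = pct[-1] - pct[0]  # percentage-point change, Q1 -> Q4
    print(f"{issue}: {change:+d} percentage points")
```

All issues decline quarter on quarter, which is the expected signature of an effective feedback loop.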
Fig I -5 Questionnaire sample used in this study
II. Case study of HCAHPS as used in American healthcare units
HCAHPS (the Hospital Consumer Assessment of Healthcare Providers and Systems) is a
patient satisfaction survey required by CMS (the Centers for Medicare and Medicaid Services) for
all hospitals in the United States. The survey is for adult inpatients, excluding psychiatric patients.
MGH administers the survey to its patients by phone shortly after discharge.
Why is HCAHPS important?
The survey and its results are important for several reasons:
The survey is the voice of the patient – it gives MGH a view into the patient's perception
of the care provided.
The survey results are publicly reported on the internet for all to see – so results affect the
hospital's reputation.
The government reimburses based on results – so excellent survey performance keeps the
hospital financially strong.
What does the HCAHPS survey ask about?
• Doctor Communication – respect, listening skills and communication ability of
doctors.
• Nurse Communication – respect, listening skills and communication ability of
nurses.
• Staff Responsiveness – answering call bells and responding to toileting needs.
• Hospital Environment – cleanliness and quietness of the hospital.
• Pain Management.
• Medication Communication – explaining medications to patients.
• Discharge Information – preparing patients to leave the hospital.
• Food Services – quality of food and the courtesy of those who serve it.
• Overall Rating of the Hospital – rating the hospital on a scale of 1-10.
What is the purpose of the HCAHPS Survey?
The HCAHPS (Hospital Consumer Assessment of Healthcare Providers and Systems)
Survey is the first national, standardized, publicly reported survey of patients' perspectives of
hospital care. HCAHPS (pronounced "H-caps"), also known as the CAHPS® Hospital Survey, is
a 29-item survey instrument and data collection methodology for measuring patients' perceptions
of their hospital experience. While hospitals collected information on patient satisfaction for their
own internal use prior to HCAHPS, until HCAHPS there were no common metrics and no national
standards for collecting and publicly reporting information about patient experience of care. Since
2008, HCAHPS has allowed valid comparisons to be made across hospitals locally, regionally and
nationally.
Three broad goals shape the HCAHPS Survey. First, the survey is designed to produce
comparable data on patients' perspectives of care that allows objective and meaningful
comparisons among hospitals on topics that are important to consumers. Second, public reporting
of the survey results is designed to create incentives for hospitals to improve quality of care. Third,
public reporting serves to enhance public accountability in health care by increasing transparency.
With these goals in mind, the HCAHPS project has taken substantial steps to assure that the survey
is credible, useful, and practical. This methodology and the information it generates are available
to the public.
How is the HCAHPS Survey administered?
HCAHPS is administered to a random sample of adult inpatients between 48 hours and six
weeks after discharge. Patients admitted in the medical, surgical and maternity care service lines
are eligible for the survey; HCAHPS is not restricted to Medicare patients. Hospitals may use an
approved survey vendor or collect their own HCAHPS data, if approved by CMS to do so.
HCAHPS can be implemented in four survey modes: Mail Only, Telephone Only, Mixed (mail
with telephone follow-up), or Active Interactive Voice Response (IVR). Each mode requires
multiple attempts to contact patients. Hospitals must survey patients throughout each month of the
year. Inpatient Prospective Payment System (IPPS) hospitals must achieve at least 300 completed
surveys over four calendar quarters. The survey and its protocols for sampling, data collection,
coding and submission can be found in the HCAHPS Quality Assurance Guidelines (QAG) manual
located in the Quality Assurance section of the official HCAHPS On-Line website.
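The IPPS requirement of at least 300 completed surveys over four calendar quarters can be sketched as a simple rolling-window check; the function name is ours, not CMS terminology:

```python
def meets_ipps_minimum(completed_by_quarter, minimum=300):
    """Check the most recent four quarters against the IPPS minimum of
    300 completed HCAHPS surveys (rolling four-quarter window)."""
    last_four = completed_by_quarter[-4:]
    return len(last_four) == 4 and sum(last_four) >= minimum

print(meets_ipps_minimum([90, 80, 70, 75]))  # True: 315 >= 300
print(meets_ipps_minimum([90, 80, 70, 50]))  # False: 290 < 300
```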
What must hospitals do in order to participate in HCAHPS?
CMS has developed detailed Rules of Participation and Minimum Survey Requirements
for hospitals that either self-administer the survey or administer the HCAHPS Survey for multiple
hospital sites, and for survey vendors that conduct HCAHPS for client hospitals. The HCAHPS
Rules of Participation include the following activities:
• Attend Introduction to HCAHPS Training.
• Hospitals and survey vendors that are approved to administer the HCAHPS Survey
must participate in HCAHPS Update Training each year.
• Submit an HCAHPS Participation Form.
• Follow the Quality Assurance Guidelines and Policy Updates.
• Attest to the accuracy of the organization's data collection process.
• Develop an HCAHPS Quality Assurance Plan.
• Become a QualityNet Exchange Registered User for data submission.
• Participate in oversight activities conducted by the HCAHPS Project Team.
Hospitals and survey vendors administering the survey must also meet HCAHPS Minimum
Survey Requirements with respect to patient-specific survey experience, survey capacity, and
quality control procedures. Details about these activities and requirements can be found in the
Quality Assurance Guidelines.
How is the HCAHPS Survey data analyzed?
Data submitted to the HCAHPS data warehouse is cleaned and analyzed by CMS, which
then calculates hospitals' HCAHPS scores and publicly reports them on the Hospital Compare
website.
How are HCAHPS results adjusted prior to public reporting?
To ensure that differences in HCAHPS results reflect differences in hospital quality only,
HCAHPS Survey results are adjusted for patient mix and mode of data collection. Only the
adjusted results are publicly reported and considered the official results. Several questions on the
survey, as well as items drawn from hospital administrative data, are used for the patient-mix
adjustment. Neither patient race nor ethnicity is used to adjust HCAHPS results; these items are
included on the survey to support congressionally mandated reports. The adjustment model also
addresses the effects of non-response bias.
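As a toy illustration of the adjustment idea only (CMS's actual model uses many covariates plus a survey-mode adjustment), a single-covariate linear correction removes the expected effect of patient mix from a hospital's raw score; all numbers below are invented:

```python
def patient_mix_adjust(raw_mean, hosp_covariate_mean, overall_covariate_mean, beta):
    """Toy linear patient-mix adjustment: subtract the expected effect of one
    covariate (e.g. mean patient age) on a hospital's raw mean score."""
    return raw_mean - beta * (hosp_covariate_mean - overall_covariate_mean)

# A hospital serving older patients (mean age 70 vs 60 overall), where older
# patients tend to rate higher (assumed beta = 0.05 points per year):
print(patient_mix_adjust(8.5, 70, 60, 0.05))  # 8.0
```

The adjusted score is lower because part of the raw score is attributed to the favorable patient mix rather than to hospital quality.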
Which results from the HCAHPS Survey are publicly reported?
Hospital-level HCAHPS results are publicly reported on the Hospital Compare website at
https://www.medicare.gov/hospital. Results are reported for four quarters on a rolling basis, which
means that the oldest quarter of survey data is rolled off as the newest quarter is rolled on. Ten
HCAHPS measures are publicly reported on Hospital Compare. [2]
HCAHPS THREE-STATE PILOT STUDY ANALYSIS REPORT
Executive summary of the Pilot Study
This study was conducted between December 2002 and January 2003 and was
designed as a three-state pilot test of adult medical, surgical and obstetric patients hospitalized
overnight in units from Arizona, Maryland and New York.
Participating hospitals comprised a core group of 24 and a noncore group of 85 for the
analyses reported; an additional 23 noncore hospitals joined at a later stage of the study.
Eligibility excluded patients under the age of 18, psychiatric patients, and OB/GYN patients
who delivered stillborn babies or had miscarriages.
Core sample members were mailed an advance notification letter, followed 1 week later by
a cover letter and a mail questionnaire. Ten days later, a reminder/thank-you postcard was mailed.
Telephone follow-up for nonrespondents began about 4 weeks after the mailing of the postcard.
Noncore sample members received a second mailed questionnaire in place of the telephone
follow-up.
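Taking the mailing of the advance letter as day zero, the core-sample contact protocol described above can be sketched with plain date arithmetic; the function name is ours:

```python
from datetime import date, timedelta

def core_contact_schedule(advance_letter: date) -> dict:
    """Mail/phone protocol used for core sample members in the pilot:
    advance letter, questionnaire +1 week, reminder postcard +10 days
    after the questionnaire, phone follow-up ~4 weeks after the postcard."""
    questionnaire = advance_letter + timedelta(weeks=1)
    postcard = questionnaire + timedelta(days=10)
    phone_followup = postcard + timedelta(weeks=4)
    return {"advance_letter": advance_letter,
            "questionnaire": questionnaire,
            "reminder_postcard": postcard,
            "phone_followup": phone_followup}

sched = core_contact_schedule(date(2002, 12, 2))
print(sched["phone_followup"])  # 2003-01-16
```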
Fig. II-1 Service Line Distribution of Respondents for the Combined, Core and Noncore
Surveys
Fig. II-2 Distribution of Education among Mail and Phone Respondents in Core Hospitals
An empirical analysis of the HCAHPS pilot data on hospital patients' perspectives of care
was performed to evaluate the degree to which these experiences corresponded with the
Institute of Medicine's (IOM's) nine domains of care: respect for patients' values, preferences and
expressed needs; coordination and integration of care; information, communication and education;
physical comfort; emotional support; involvement of family and friends; transition and continuity;
and access to care. While some of the survey items correlated strongly with their hypothesized
domain or composite, it was clear that the general hypothesized structure was inconsistent with the
observed data. Exploratory factor analyses at the individual and hospital level were used to help
guide refinements to the initially hypothesized structure. The revised structure was evaluated using
a series of analyses that included item-scale correlations, internal consistency reliability, hospital-
level reliability, and correlations with global ratings. Based on analyses of the data and stakeholder
suggestions, a revised HCAHPS survey was produced that consists of 32 questions assessing seven
internally developed domains of care: (1) nurse communication (items 1-3); (2) nursing services
(items 4, 13); (3) doctor communication (items 6-8); (4) physical environment (items 10-11); (5)
pain control (items 15-16); (6) communication about medicines (items 17, 19); and (7) discharge
information (items 21-22). The revised survey also includes global rating items for nursing care
(item 5), doctor care (item 9), and hospital care (item 23). A single item is also included that
assesses whether or not the patient would recommend the hospital to family and friends (item 24).
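Under the revised structure, a respondent's composite score for a domain is essentially the mean of his or her answered items in that domain. A simplified sketch follows (the actual HCAHPS scoring handles skip patterns and scaling in more detail; the example answers are invented):

```python
# A few of the revised-survey domains (item numbers from the report).
DOMAINS = {
    "nurse_communication": [1, 2, 3],
    "doctor_communication": [6, 7, 8],
    "pain_control": [15, 16],
}

def composite_score(item_responses, domain_items):
    """Mean of a respondent's answered items within a domain; items the
    respondent skipped are simply left out of the mean."""
    values = [item_responses[i] for i in domain_items if i in item_responses]
    return sum(values) / len(values) if values else None

# One respondent's answers on a 1-4 scale (item 16 was skipped).
answers = {1: 4, 2: 3, 3: 4, 6: 2, 7: 3, 8: 4, 15: 4}
print(composite_score(answers, DOMAINS["nurse_communication"]))  # 11/3, about 3.67
```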
This report also provides the results of case-mix analyses that were performed in order to
identify variables associated with reports and ratings of care, as well as a variance components
analysis performed to estimate how much of the variation in reports and ratings of care is
attributable to regions, hospitals, service category, and patients. These analyses suggest that
hospital service (medical, surgical, obstetrics), self-reported global health status, age, education,
and an interaction term representing different effects of age in different services should be
controlled for when comparing hospital scores. Language of respondent and race should be
evaluated further when data from a more complete sample of regions in the country are available.
Analyses of variability across hospitals and states suggest that there is substantial variability
among hospitals after state effects have been considered. In addition, we evaluate predictors of
unit and item nonresponse, evaluate responses to open-ended questions, and compare English and
Spanish language survey responses.
This report first provides a brief review of the literature on patient evaluations of hospital
care that preceded the drafting of the HCAHPS survey instrument. The design and results of the
field test follow.
Analysis plan
The study evaluated item-missing data rates, skip pattern errors, item-scale correlations
(convergence and discrimination), internal consistency reliability for hypothesized multi-item
composites, and correlations of items and composites with the global ratings (hospital, doctor,
nurses) and with whether the patient would recommend the hospital to family and friends. It began
with an a priori specification of how survey items cluster in accord with the Institute of Medicine
(IOM) dimensions of care and made revisions based in part on exploratory factor analyses at the
individual and hospital levels of analysis. The reliability of global rating items and multi-item
composites was estimated at the hospital level. In addition, case-mix analyses were conducted to
identify variables significantly associated with reports and ratings of care, and a variance
components analysis was carried out to estimate how much of the variation in reports and ratings
of care is attributable to regions, hospitals, service category, and patients. Furthermore, the
analyses examined predictors of unit and item nonresponse as well as characteristics of early
versus later respondents. This report concludes with recommendations for revising the original
HCAHPS survey instrument.
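Internal consistency reliability for a multi-item composite is conventionally measured with Cronbach's alpha. A small self-contained sketch (the example responses are invented):

```python
def cronbach_alpha(items):
    """Internal consistency reliability for a multi-item composite.
    `items` is a list of per-item response lists (same respondents in order).
    alpha = k/(k-1) * (1 - sum(item variances) / variance of the total)."""
    def var(xs):  # sample variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)
    k = len(items)
    totals = [sum(resp) for resp in zip(*items)]  # per-respondent sum
    return k / (k - 1) * (1 - sum(var(i) for i in items) / var(totals))

# Three items answered by five respondents on a 1-4 scale.
items = [[4, 3, 2, 4, 1], [4, 2, 2, 3, 1], [3, 3, 1, 4, 2]]
print(round(cronbach_alpha(items), 2))  # 0.9
```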
The results summarized above focus on internal consistency reliability and correlations
with global ratings and willingness to recommend the hospital to family and friends. These
analyses were conducted to evaluate the items and composites in case composite score algorithms
are created at the individual patient level. The main purpose of these analyses was to aid the
HCAHPS Analysis Team in their charge to identify items that could be deleted from the pilot study
questionnaire for the purpose of shortening the HCAHPS survey before it is implemented
nationally. Below are results of a hospital-level factor analysis to identify those composites
indicated by hospital-level data. Data accumulated across patients within a hospital provide
information about the reliability of the items and composites with regard to measuring care at the
hospital level (how well items and composites differentiate between hospitals) and the extent to
which items vary by service type.
Analyses of Open-Ended Responses
The HCAHPS pilot survey contained two open-ended questions designed to elicit content
regarding a patient's experience that was not covered by the closed-ended questionnaire items.
Namely, Q54 and Q55 asked, "What did you like most about the care you received during this
hospital stay?" (Like Most) and "If you could change one thing about the care you received during
this hospital stay, what would it be?" (Would Change), respectively. We sought to evaluate
whether these open-ended responses would suggest changes to the survey, including content that
should be deleted or added.
A sample of responses to these two questions was coded and analyzed to identify patterns
in the responses, as well as emerging themes that did not fit in the established HCAHPS domains.
Two hundred cases were randomly sampled and coded from the 16,048 surveys that were
conducted in English. One hundred of the 571 Spanish surveys were also randomly sampled,
translated, and then coded. Thus, Spanish-speaking respondents' comments were over-sampled
(17.5% of total) relative to English-speaking respondents' comments (1.3% of total). To the extent
possible, responses to both open-ended questions were coded to specific HCAHPS questionnaire
items. In many cases, more than one code was applied to an individual response. The largest
number of codes applied to any one response was five.
In general, we found very little information in the open-ended responses. A large
percentage of respondents either did not answer the open-ended questions, gave answers that were
redundant with the questionnaire content, or indicated that there was nothing they would change
about their care. Aspects of care that were mentioned by sampled patients but were not covered by
the HCAHPS items fell into the following categories:
• Staff – general comments about staff friendliness, helpfulness, or treatment that
could not be attributed specifically to nursing or physician staff.
• Care coordination – comments regarding coordinating care with doctors, nurses,
and other staff within the hospital, or with the patient's primary care physician or
other providers outside the hospital.
• Food – comments regarding the taste and quality of the food served in the hospital.
• Timeliness – comments regarding delays in care outside of the admissions process
and delays in discharge.
• Language – comments regarding the ability of hospital staff to speak the
patient's language.
It should be noted that content not included in the questionnaire was mentioned fairly
infrequently. Moreover, it is not entirely accurate to say that questions referring to care delivered
by hospital staff, other than nurses or doctors, were absent from the questionnaire. While hospital
staff do not form a separate composite measure, items that mention hospital staff are part of several
other composites including "nursing services," "pain control," "communication about medicine,"
and "discharge information."
In summary, the results of the analyses indicated that the current HCAHPS questionnaire
seems to tap into most aspects of care that patients care about. Most responses to the open-ended
questions mapped to existing questionnaire items or were missing. Missing responses are perhaps
an indicator that respondents felt it was unnecessary to add anything more.
CONCLUSIONS – HCAHPS survey
Analyses of the three-state pilot study made it possible to reduce the size of the 66-item
pilot survey by more than 50 percent, to a revised 32-item survey. The pilot study analyses provide
initial support for the reliability and construct validity of the HCAHPS survey and provided
important information about potential case-mix variables, variables associated with unit and item
nonresponse, and logistic information for evaluating patient perceptions of hospital care. The
revised survey will be used in a series of pilot studies to be conducted early in 2004 that will
provide the basis for finalizing the HCAHPS survey instrument for future applications. [3]
Fig. II-3, II-4 Samples from the first version of the HCAHPS Survey
III. General conclusions and recommendations
Hospitals may use information about specific doctors or other staff for performance
evaluations, compensation and benefit decisions, and training. Patient satisfaction has become a
priority in hospitals and healthcare facilities. It is estimated that a high percentage of senior
managers and board members are participating in such programs in the United States. However,
patient surveys are viewed differently by leaders and by frontline clinicians.
Clinicians note that there are drawbacks to concentrating on patient satisfaction. Many
hospital administrators are being pressured by boards to improve the patient satisfaction scores
obtained.
The biggest argument against such surveys is that a survey of patients might not tell
the whole story, making the information gained from it unreliable.
Knowing the benefits and drawbacks of surveys, training leadership to properly evaluate
them, and disseminating information to frontline clinicians may allay the fears of healthcare
providers about the efficacy and appropriate use of surveys.
Bibliography
[1] Dr. Aaron K. Offei, Dr. Cynthia, Mr. Kumi Kyeremeh: Healthcare Quality Assurance
Manual for Sub-Districts, July 2004.
• Bannerman, C.; Tweneboah, N.T.; Offei, A.; Nicholas, T.A.; Acquah, S. (February
2002): Health Care Quality Assurance Manual.
• Brown, Lori DiPrete; Franco, Lynne M.; Rafeh, Nadwa; Hatzell, Theresa: Quality
Assurance of Health Care in Developing Countries. Bethesda, USA (USAID).
• Donabedian, Avedis (1966). Evaluating the Quality of Medical Care. Milbank
Memorial Fund Quarterly 44: 166-203.
• Ghana-Denmark Health Sector Support Programme, Ministry of Health, Upper
West Region (September 1996): An Introduction to Quality Care.
• Quality Assurance Programme, MOH Uganda: Quality Assurance for Health
Workers in Uganda. Manual of Quality Improvement Methods.
• Offei, A.; Sagoe, K.; Owusu Acheaw, E.; Doyle, V.; Haran, D.: Health Care Quality
Assurance Manual for a Regional-Led, Institutional-Based Quality Assurance
Programme. Eastern Regional Health Administration & Liverpool School of
Tropical Medicine.
• Regional Centre for Quality of Health Care (February 2001): Improving Quality
of Health Care. Foundations in Quality Assurance (Participant Manual). Institute
of Public Health, Makerere University Medical School, Kampala, Uganda / Quality
Assurance Project, Centre for Human Services, Bethesda, USA (USAID).
• Regional Health Administration, Upper West Region (November 1994): Health
Care Quality Assurance Manual. Ghana-Denmark Health Sector Support
Programme.
• The Quality Assurance Project (2000): Developing Standards for Quality Health
Care.
• Weaklim, D. (1994): Development of Quality Indicators Based on Patients'
Perceptions of Quality for Health Service Monitoring at Health Centres in Ghana.
Collaborative work between the Liverpool School of Tropical Medicine and the
Eastern Regional Health Administration.
• Tweneboah, N.A.; Opoku, S.A. (August 1998): Implementing Quality of Care at the
Sub-District. Ghana-Denmark Health Sector Support Programme, Ministry of Health.
[2] The HCAHPS Survey – Frequently Asked Questions, https://www.hcahpsonline.org
[3] HCAHPS Three-State Pilot Study Analysis Results, December 2003, submitted by The
CAHPS® II Investigators and The Agency for Healthcare Research and Quality (AHRQ)
• Charles, C., Gauld, M., Chambers, L., O'Brien, B., Haynes, R.B., and Labelle, R.
(1994). How was your hospital stay? Patients' reports about their care in Canadian
hospitals. Canadian Medical Association Journal, 150, 1813-1822.
• Chou, S., Boldy, D. (1999). Patient perceived quality of care in hospital in the
context of clinical pathways: development of approach. Journal of Quality in
Clinical Practice, 19, 89-93.
• Cleary, P.D., Keroy, L., Karapanos, G., and McMullen, W. (1989). Patient
assessments of hospital care. Quality Review Bulletin, 15, 172-179.
• Cohen, G., Forbes, J., and Garraway, M. (1996). Can different patient satisfaction
survey methods yield consistent results? Comparison of three surveys. British
Medical Journal, 313, 841-844.
• Conover, C.J., Mah, M.L., Rankin, P.J., and Sloan, F.A. (1999). The impact of
TennCare on patient satisfaction with care. American Journal of Managed Care, 5,
765-775.
• Coulter, A., and Cleary, P.D. (2001). Patients' experiences with hospital care in
five countries. Health Affairs, 20, 244-252.
• Covinsky, K.E., Rosenthal, G.E., Chren, M., Justice, A.C., Fortinsky, R.H., Palmer,
R.M., and Landefeld, C.S. (1998). The relation between health status changes and
patient satisfaction in older hospitalized medical patients. Journal of General
Internal Medicine, 13, 223-229.
• Coyle, J., and Williams, B. (2001). Valuing people as individuals: Development of
an instrument through a survey of person-centeredness in secondary care. Journal
of Advanced Nursing, 36, 450-459.