For most of last year, 20 data collection teams visited 3 880 public health facilities throughout the country, ranging from tiny clinics to national central hospitals. The vast majority (3 074) were clinics.
The aim was to assess all public health facilities according to six criteria so that problems could be identified and addressed.
Areas of concern emerging from the audit are poor infection control in most facilities (the average countrywide score for ‘vital’ life-or-death indicators was a mere 54%) and dismal maintenance of equipment and infrastructure.
Almost half the clinics reported never being visited by a doctor, and over two-thirds could not offer dental care.
There are also 56 clinics without running water, 36 without electricity, and one-fifth of clinics have no manager.
This is very worrying for ordinary South Africans, whose first stop for health services is their local clinic.
The audit, which was presented to Parliament recently, has for the first time provided the health department with baseline information against which future progress can be measured. Ongoing assessments will be conducted by inspectors employed by the new Office of Health Standards Compliance.
Sadly, the City of Cape Town opted not to take part in the audit despite high-level negotiations, so its 21 primary healthcare facilities were excluded.
Apparently, the city officials felt that their facilities’ audit by the Council of Health Service Accreditation of SA (Cohsasa) was sufficient, although this audit uses entirely different measures.
Many health facilities fared badly in the audit, but this was partly because the criteria were strictly applied, according to Ronel Visser, project leader of the National Health Facilities Audit team.
‘If, for example, we were looking at cleanliness and were inspecting toilets, all the toilets had to be spotless. If one wasn’t, then the facility would fail this indicator,’ said Visser.
Visser’s non-profit organisation, the Health Systems Trust (HST), won the health department tender to conduct the audit. The six performance areas selected were:
* Availability of medicine and supplies, particularly availability of medicines on the Essential Drugs List and stock control;
* Cleanliness, particularly clean bathrooms, kitchens, laundries and grounds, and whether daily inspections took place;
* Improved patient safety, including emergency services’ response times, physical safety measures to protect patients, and how a facility manages adverse events;
* Infection prevention and control, including how a facility manages infectious diseases, disposes of waste and monitors infection control;
* Positive and caring attitudes of staff, including how staff interact with patients, whether there is a satisfactory complaints mechanism, whether patients have privacy, and staff satisfaction;
* Waiting times.
These six criteria have been defined as ‘priority areas for patient-centred care’ by the Department of Health in its quest to clean up the country’s health services.
Standards relating to these have been set by the health department. The newly formed Office of Health Standards Compliance, within the department, will be the custodian of these standards and will send inspectors to monitor health facilities.
Facilities fared worst on ‘positive and caring attitudes’, with a countrywide average score of only 30% for compliance with vital measures in this area.
Clinics scored considerably lower than hospitals on all measures, but were particularly poor when it came to staff attitudes to patients, with an average score of only 25%.
The worst staff attitudes towards patients were found in the Northern Cape, where facilities scored a dismal 17%, North West (21%) and Eastern Cape (22%).
‘One of the things we did to measure positive and caring attitudes was to observe how health workers gave patients information, for example on how to take their medication. We also interviewed the patients to see if they understood the information. There was a high failure rate on this,’ said Visser.
Health facilities were also supposed to display posters about patients’ rights and conduct ‘staff satisfaction surveys’.
The average score for patient safety was 34%, while the average cleanliness and infection control scores were both 50%.
Facilities scored best on waiting times, with an average of 68% compliance.
The standards were also categorised as vital, essential and developmental, and a facility was expected to achieve 100% in the ‘vital’ indicators or it would fail.
Visser said that the standards set were new, so the audit was also test-driving them to see whether they worked.
They found that, in many cases, it was not appropriate to expect clinics to meet the same standards as hospitals, which is partly why so many of the clinics failed badly.
‘Some of the standards need to be significantly adapted. For example, in the case of maternity services, a clinic that is not doing a lot of deliveries should not be expected to have a central sterilising unit,’ said Visser.
In addition, there were huge differences between hospitals and clinics in the way they kept patient records, mainly because there is no standard for record-keeping at clinics; some don’t even write down a patient’s clinical diagnosis.
Clinics generally didn’t have a system for fixing broken equipment or for servicing items such as boilers.
Despite the poor performance, however, Visser says many of the problems are relatively easy to fix.
This was a sentiment shared by the Director-General of Health, Precious Matsoso, who said shortly after briefing Members of Parliament about the audit that ‘some of our people seem to have lost their problem-solving skills’.
‘We found things such as broken water pipes that have not been fixed for a number of years, and no one has followed up, and rooms packed to the roof with broken furniture,’ said Matsoso.
In a bid to sort out some of these relatively minor problems, Matsoso set up Facility Improvement Teams made up of people from head office plus provincial officials, and dispatched them to some of the facilities that were first audited to sort out their problems.
As for broken infrastructure (such as pipes, windows and roofs) and furniture, Matsoso has enlisted the help of local FET colleges that train artisans. In the past, the Department of Public Works was charged with fixing such things.
From the audit, it is clear that the Northern Cape needs particular attention. The average score for facilities on all priority areas combined was a dismal 40%, the lowest in the country. Monitors reported filthy facilities and rude staff in this province.
JT Galeshewe district (31%) around Kuruman was the worst performing district in the country while Tshwane (74%) was the best.
Gauteng scored highest with 69%, but many of this province’s health facilities are hospitals rather than PHC clinics. KwaZulu-Natal (58%), Free State (57%) and the Western Cape (57%) were neck-and-neck for second place, with the Western Cape undoubtedly pulled down by Cape Town’s refusal to take part in the audit, as some of the province’s more rural facilities in the Central Karoo and Overberg scored relatively poorly.
The Eastern Cape (51%), North West (48%), Mpumalanga (47%) and Limpopo (46%) made up the bottom half, ahead of the dismal Northern Cape.
While the findings are generally poor, they were not unexpected. Horror stories about public health facilities have been told over a number of years.
The new National Health Insurance (NHI) scheme is based on healthcare being delivered mainly by primary healthcare facilities, yet clinics in particular scored badly.
The facilities audit has delineated very clearly the many failings of the public health services, and unless these are addressed and regularly assessed, the NHI will flounder and fail. – Health-e News Service.