- Patient registration – will auto-registration fail if one of the key identifiers, such as email, fails to match? The auto-registration matches on first name, last name, and date of birth. All three are required to match. If they don’t match, the request is referred to a member of staff for review. We recommend patients create PATCHS accounts, in which case the patient only goes through matching once. They can also use NHS login and the NHS App, in which case they will be matched automatically.
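The all-three-must-match rule above can be sketched as follows. This is a minimal illustration, not the actual PATCHS implementation; the field names and the lowercase/whitespace normalisation are assumptions.

```python
from datetime import date

def auto_register_match(submitted: dict, record: dict) -> str:
    """Illustrative sketch: match on first name, last name, and
    date of birth only. Email is not a matching identifier."""
    def norm(s: str) -> str:
        return s.strip().lower()

    checks = [
        norm(submitted["first_name"]) == norm(record["first_name"]),
        norm(submitted["last_name"]) == norm(record["last_name"]),
        submitted["date_of_birth"] == record["date_of_birth"],
    ]
    # All three identifiers must match for automatic registration;
    # otherwise the request goes to a staff member for manual review.
    return "auto_registered" if all(checks) else "refer_to_staff"
```

A failed email check alone would not trigger a referral under this rule, since email is not one of the three matching identifiers.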
- What checks are in place for proxy users? The details of the patient for whom the request is being submitted are checked against the PDS API, but the staff member will need to check the clinical system to confirm that the carer is approved for the patient.
- When a patient updates their PATCHS details, will this also update the details held in the clinical system? No, because the change requires human verification.
- How are the practice staff notified that there is an outstanding form to be reviewed? PATCHS is the primary workflow tool for staff reviewing and responding to patient requests, so they will generally be using it all the time. When new requests are received, they are shown in the PATCHS inbox, and urgent or emergency requests are flagged.
- What are the safeguards to ensure that a completed form is reviewed on a timely basis and not overlooked? As part of the EULA, staff agree to review requests as soon as possible. Furthermore, the patient requests that have been outstanding longest are shown at the top of the staff inbox (below requests classified as emergencies or urgent). Patients are also told to contact the practice if they have not heard back within a pre-specified time (e.g. 48 hours) set by the practice.
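The inbox ordering described above – emergencies first, then urgent requests, then the longest-waiting requests – can be sketched as a sort key. This is an illustrative sketch only; the priority labels and field names are assumptions, not the PATCHS data model.

```python
from dataclasses import dataclass
from datetime import datetime

# Lower rank sorts first: emergencies above urgent, urgent above routine.
PRIORITY_RANK = {"emergency": 0, "urgent": 1, "routine": 2}

@dataclass
class Request:
    patient: str
    priority: str        # "emergency", "urgent", or "routine" (assumed labels)
    received: datetime   # older requests rise to the top within a band

def inbox_order(requests: list[Request]) -> list[Request]:
    """Sort by priority band, then by longest outstanding first."""
    return sorted(requests, key=lambda r: (PRIORITY_RANK[r.priority], r.received))
```

Within each priority band, the earliest `received` timestamp sorts first, so the longest-outstanding routine request sits directly below any urgent or emergency requests.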
- Do we have to keep checking PATCHS?
- PATCHS is designed for you to manage and respond to requests fully within the portal, so you should have PATCHS open all the time and keep checking it.
- We will be adding notifications to the toolbar by June 2023.
- How does auto-completion of requests work? Practices can set PATCHS to auto-complete patient requests if there has been no response from the patient within a certain time period. To encourage patients to respond, PATCHS sends automatic patient reminders – at intervals set by the practice – to remind patients (watch here).
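A minimal sketch of the auto-completion and reminder timing described above. The function names, parameters, and thresholds are illustrative assumptions; in PATCHS the actual time periods are configured by the practice.

```python
from datetime import datetime, timedelta

def should_auto_complete(last_patient_reply: datetime,
                         now: datetime,
                         auto_complete_after: timedelta) -> bool:
    """Auto-complete a request once the patient has been silent for
    longer than the practice-configured period."""
    return now - last_patient_reply >= auto_complete_after

def reminder_due(last_patient_reply: datetime,
                 reminders_sent: int,
                 now: datetime,
                 reminder_interval: timedelta) -> bool:
    """Send the next automatic reminder once a full interval has
    elapsed for each reminder already sent."""
    return now - last_patient_reply >= reminder_interval * (reminders_sent + 1)
```

For example, with a one-day reminder interval, a first reminder is due one day after the patient's last reply, a second after two days, and so on until the auto-complete threshold closes the request.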
- How easy is it to initiate a video call? The practice can send the patient a link via SMS in one click. The patient then clicks the link to load the video consultation and waits in a virtual waiting room until the clinician is ready. Read more here under ‘Conducting a Video Consultation’.
- Can consultations be audited that have been conducted via PATCHS? All PATCHS requests are viewable from the Completed Inbox. From there you can view all actions taken on a request.
- When submitting a non-digital request for a registered PATCHS patient, will this request show in their PATCHS account? Yes
- When a completed request is saved directly to the clinical system and a clinician has made notes or comments, is there any way of editing what a patient sees when they view their summary care record? The user can edit the consultation directly in the clinical system.
- How are online consultations recorded in the appointment system so they can be included in the GP appointment data dashboard (GPAD)? If a practice wants online consultations conducted in PATCHS to be picked up for GPAD, they need to manually add an appointment to their appointment book. We may be able to automate this in future when access is available to the API that enables staff to book appointments.
- How good is PATCHS at helping clinicians?
The University of Manchester recently conducted the largest systematic review of research evidence on online consultations ever undertaken, covering 62 research studies published up to 2022. Thirteen studies (21%) showed that when patients had to use multiple-choice questionnaires (MCQs) to input their online consultation query, as in eConsult (rather than free text, as in PATCHS), it increased both patient and staff workload. Filling out long lists of questions shifted work from the clinician to the patient, and staff found them burdensome to read.

MCQs limited the amount of detail patients could enter, so staff could not always fully understand the request. This increased workload because staff often had to contact the patient for further information. MCQs also asked questions about seemingly ‘irrelevant’ symptoms, which staff were then responsible for assessing and following up, diverting attention away from the patient’s primary concern.

Due to their restrictive nature, patients regularly adapted their responses to MCQs to get the outcome they wanted, even when it was not the most appropriate use of resources – for example, reporting their symptoms differently to get an in-person consultation when self-care may have been more suitable (‘gaming’). Nine studies (14%) suggested that MCQs could also decrease patient satisfaction. Reasons included the amount of work required to complete them, their inflexibility in getting the answers patients wanted from their primary care provider, and that they could be confusing to navigate.
However, we appreciate that additional targeted questions can help the triage process, which is why we have Topic AI and Topic AI questions, which ask patients to fill out certain validated questionnaires based on their initial query when appropriate. This is much better than a blanket approach of asking every patient pages of questions. Practices can also configure PATCHS to ask their own questions and questionnaires, and share them at a PCN or organisation-wide level.