Do you have any thoughts on RPA increasing the impact of errors for a business? A robot could very rapidly repeat a mistake thousands of times before it gets detected.
Thank you, Richard Griffiths, for the question in the comments last week; it's a good one!
First, redesign the business process to streamline it and minimise the risk of errors. This isn't always possible, but it should be the starting point.
On the technical side, there are ways to defend:
BEFORE – the process trigger should be secured against both accidental and malicious use. For example, if the workflow starts on 'files added to a folder', make sure permissions prevent any non-process-related files from being placed there. Take the same care with database triggers and webhooks.
DURING – defend at runtime with common-sense checks, e.g. stop processing if dozens of updates appear when we expect only a few.
AFTER – use or build logging, and make sure the logs are actually checked. If the process is mission critical, have bots pre-built to monitor for errors and manage rollback.
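To make the BEFORE idea concrete, here is a minimal sketch of a guarded folder trigger. The folder path, allowed file types, and function name are all illustrative assumptions, not from any specific RPA product:

```python
from pathlib import Path

# Hypothetical watched folder and allowlist - adjust to your own process.
ALLOWED_SUFFIXES = {".csv"}

def files_safe_to_process(folder: Path) -> list[Path]:
    """Return only files matching the expected pattern; ignore everything else.

    The bot should only ever pick up files that pass this filter, so a stray
    or malicious file dropped into the folder cannot start the workflow.
    """
    return [
        f for f in folder.iterdir()
        if f.is_file() and f.suffix.lower() in ALLOWED_SUFFIXES
    ]
```

The same allowlist mindset applies to webhooks (validate the payload and the caller) and database triggers (scope them to the exact tables and operations the process expects).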
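The DURING guard can be as simple as a volume check before the bot touches anything. This is a sketch only; the expected batch size and the tolerance factor are assumptions you would tune per process:

```python
# Illustrative thresholds - set these from the real process's normal volumes.
EXPECTED_BATCH = 10

class VolumeGuardError(RuntimeError):
    """Raised to halt the run when the workload looks abnormal."""

def check_batch_size(n_updates: int,
                     expected: int = EXPECTED_BATCH,
                     factor: int = 3) -> None:
    """Stop the bot and flag a human if volume is far above normal."""
    if n_updates > expected * factor:
        raise VolumeGuardError(
            f"{n_updates} updates queued, expected about {expected}; "
            "halting for human review"
        )
```

Raising an exception (rather than logging and carrying on) is the point: a runaway bot repeating a mistake thousands of times is exactly the failure mode this stops.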
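And for AFTER, a monitoring bot can be little more than a log scan plus a rollback hook. The record shape and the `rollback` callable here are hypothetical stand-ins for whatever your process actually writes:

```python
import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("rpa.monitor")

def monitor_run(results: list[dict], rollback) -> int:
    """Count failed records; log each one and hand it to the rollback hook.

    `results` is assumed to be a list of per-record outcomes like
    {"id": "INV-001", "status": "ok" | "error", "message": "..."}.
    """
    failures = [r for r in results if r.get("status") == "error"]
    for r in failures:
        log.error("record %s failed: %s", r.get("id"), r.get("message"))
        rollback(r)
    return len(failures)
```

Returning the failure count makes it easy for a scheduler to alert a human when it is non-zero, rather than burying the errors in a log nobody reads.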
Those in the #rpa community, any other tips?