How technology assurance can overcome data and process risks

By Russ Banham

Today, most companies have made great strides towards automating their business processes with the goal of making their operations more efficient, saving money, and generating analytical information that can inform decision-making. But what if the underlying data is incorrect? Or if the processes to which it is subjected, the way it is manipulated or the algorithms applied to it are faulty?

These are questions that get a lot of attention in C-suites, as much of business today relies on the integrity of financial and non-financial data.

There are many ways to produce inaccurate data, from careless data entry and inconsistencies in data formats to faulty migration or processing of data.
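
To make this concrete, consider a minimal data-quality check of the kind a company might run over incoming records. The sketch below is purely illustrative (the field names, formats and rules are hypothetical, not drawn from the article); it flags malformed emails, unexpected date formats and duplicate identifiers.

```python
from datetime import datetime

# Hypothetical customer records illustrating common data-entry problems
records = [
    {"id": "C001", "email": "ana@example.com", "signup_date": "2021-03-05"},
    {"id": "C002", "email": "bob@example",     "signup_date": "05/03/2021"},  # bad email, wrong date format
    {"id": "C001", "email": "ana@example.com", "signup_date": "2021-03-05"},  # duplicate id
    {"id": "C003", "email": "",                "signup_date": "2021-13-40"},  # missing email, impossible date
]

def check_record(rec):
    """Return a list of data-quality issues found in a single record."""
    issues = []
    if "@" not in rec["email"] or "." not in rec["email"].split("@")[-1]:
        issues.append("invalid or missing email")
    try:
        datetime.strptime(rec["signup_date"], "%Y-%m-%d")  # expected ISO format
    except ValueError:
        issues.append(f"unexpected date format: {rec['signup_date']!r}")
    return issues

seen_ids = set()
for rec in records:
    problems = check_record(rec)
    if rec["id"] in seen_ids:
        problems.append("duplicate id")
    seen_ids.add(rec["id"])
    if problems:
        print(rec["id"], "->", "; ".join(problems))
```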

More than half of respondents (55%) to the 2021 Global Data Management Research survey say they lack confidence in their data assets. They estimate that a third of their customer and prospect data alone is inaccurate.

A company’s internal control systems are meant to identify and mitigate data integrity and security risks, but what if an error falls through the cracks? Regulators expect complete and accurate figures, consumers expect sensitive personal information to be protected, investors expect accurate ESG (environmental, social, governance) metrics, and boards of directors expect all of the above.

Obviously, the consequences of a misstep can be serious.

When data errors compound

“The stakes are high,” said Heather Paquette, National Head of Technology Assurance, Audit, at KPMG US. “Almost every interaction in business begins with data, whether it’s a sales or service transaction or data provided by functions and partner organizations.”

Data, of course, does not exist in a vacuum. Information connects and integrates with other data across a growing array of devices, applications, databases, and systems, both on-premises and in the cloud. It is shared with auditors, regulators, rating agencies, suppliers and other third-party organizations.

If even one set of numbers or an algorithm’s calculations is flawed, the error can escalate to the point of affecting business performance, decision-making, social and cultural biases, and the company’s brand and reputation, not to mention attracting the attention of regulators.

Similar problems can arise if the processes that govern the collection and manipulation of data are broken. And if the controls in place for inspecting and validating the accuracy of data and processes are not up to standard, data imperfections can go unnoticed. Finally, if data is not sufficiently secure, a cybersecurity incident can lead to reputational damage, market losses, a loss of investor confidence, and regulatory disclosure issues.

While the potential for such problems is sobering, no one disputes the extraordinary business value of end-to-end process automation across functions, the digitization of structured and unstructured data, and the use of technologies such as AI and bots to generate insights from massive volumes of data.

This is why, as Paquette says, “companies must do their utmost to ensure the completeness, accuracy, [accessibility] and security of data.”

Avoid legal and regulatory issues

And that’s where technology assurance, a smart, risk-based approach to managing data across the enterprise, can come in handy. Just as internal audit identifies ways to improve business operations, technology assurance identifies how to improve automated business processes.

“Technology assurance systematically examines the risks and controls associated with all aspects of the technology, increasing confidence in the accuracy of financial data among investors and other stakeholders,” said Paquette.

“Without that assurance,” she said, “a significant technology problem can cause a significant weakness in internal controls over financial reporting, calling into question the reliability of the organization’s information.”

The likely outcome? “Legal and regulatory repercussions.”

With change comes risk

There is an old maxim in business that companies that fail to anticipate the risks associated with organizational change suffer the consequences. Digital transformation, arguably the most disruptive organizational change in corporate history, is a case in point.

The integration of digital technologies into all functions of a business has fundamentally changed the way businesses operate and generate value. However, implementing these technologies without a clearly defined strategy or continued attention to data accuracy, resiliency and security can backfire and either cripple the business or destroy it.

“Our studies on the future of finance indicate that 60 to 70% of manual controls will be automated within five years,” Paquette said. “This large-scale automation presents a huge and growing need to test controls to ensure that the information is complete and accurate.”
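
What such an automated control test might look like in its simplest form is sketched below. This is an illustrative example only, with hypothetical data and a hypothetical tolerance: it reconciles a source extract against a target system, checking completeness (no records dropped) and accuracy (totals agree within the tolerance).

```python
# Hypothetical invoice extracts: invoice id -> amount
source = {"INV-1001": 1250.00, "INV-1002": 310.50, "INV-1003": 99.99}
target = {"INV-1001": 1250.00, "INV-1002": 310.50}  # INV-1003 was dropped in transfer

def reconcile(source, target, tolerance=0.01):
    """Compare a source extract with the target system and return control findings."""
    findings = []
    missing = set(source) - set(target)
    if missing:
        findings.append(f"completeness: {len(missing)} record(s) missing from target: {sorted(missing)}")
    diff = abs(sum(source.values()) - sum(target.values()))
    if diff > tolerance:
        findings.append(f"accuracy: totals differ by {diff:.2f}")
    return findings

for finding in reconcile(source, target):
    print(finding)
```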

Without this data assurance, business leaders risk making capital decisions and business forecasts based on misleading information. A survey carried out at the end of 2020 found that 6 in 10 executives fear that their company’s forecasts do not provide an accurate picture of future performance. Only 1 in 3 of the 1,300 respondents are confident in the accuracy of their financial data.

ESG reporting adds a new wrinkle

Errors in a financial statement can prompt shareholders and other investors to take legal action against a company for misleading them in their investment decisions. Such errors also affect the integrity of a company’s ESG metrics, a major concern for investors.

In February, the United States Securities and Exchange Commission announced it would “enhance its focus on climate-related disclosure” in public company filings. More recently, SEC Chairman Gary Gensler said he plans to propose new rules on climate risk and other ESG disclosures in the second half of 2021.

“Many companies already publish additional ESG notes in their financial statements and annual reports to explain how the metrics were prepared,” Paquette said. “If the underlying data is inaccurate or misleading, it can damage the company’s brand and reputation and can soon lead to a regulatory violation.”

Manage the dangers

Certainly, as organizations transform operationally and organizationally around data, it is incumbent on CFOs and CIOs to ensure its integrity. This is not a task for the faint-hearted, given the large number of technology systems, applications and automation solutions in use across all functions of the business, from front-office sales, marketing and client-facing operations to finance and back-office accounting.

Data is transmitted by customers, suppliers, regulators and various partners through these functions; it is automatically processed and stored in on-premises and cloud-based systems and applications that connect to a core enterprise resource planning system. Users can then access the data with AI tools and bots for analysis. But, Paquette points out, given that some of this data “remains unstructured and not yet digitized, an incomplete picture of the financial health of the organization can be presented.”

Added to these challenges is the need to secure data against the growing risk of a cyber incident. Since third-party cloud service companies have access to corporate systems, their cybersecurity is critical. Almost a third of third-party vendors are considered a significant risk in the event of a breach, according to a 2020 survey.

Organizations typically rely on SOC 1 or SOC 2 reports from an independent auditor on the effectiveness of internal controls used by third-party cloud service providers. “These reports cover business process controls and general IT controls such as [those related to] the security, availability, confidentiality and processing integrity of the systems used by vendors to process customer information,” explained Paquette.

Large professional services firms such as KPMG LLP provide SOC attestation reports and may also provide technology assurance assessment services that involve identifying potential data integrity issues and evaluating the effectiveness of associated controls. To do this, KPMG has deployed bots that assess clients’ financial data and controls, looking for financial figures that are imbalanced or out of alignment.
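
The article does not describe how these bots work internally, but a simplified, hypothetical illustration of an imbalance check might look like the following: it scans journal-entry lines and flags any entry whose debits and credits do not net to zero.

```python
from collections import defaultdict

# Hypothetical journal-entry lines: (entry_id, account, debit, credit)
journal_lines = [
    ("JE-100", "Cash",              500.00,    0.00),
    ("JE-100", "Revenue",             0.00,  500.00),
    ("JE-101", "Inventory",        1200.00,    0.00),
    ("JE-101", "Accounts Payable",    0.00, 1150.00),  # this entry does not balance
]

def find_imbalanced_entries(lines, tolerance=0.005):
    """Return entry ids whose debits and credits do not net to zero."""
    net = defaultdict(float)
    for entry_id, _account, debit, credit in lines:
        net[entry_id] += debit - credit
    return {eid: round(amount, 2) for eid, amount in net.items() if abs(amount) > tolerance}

print(find_imbalanced_entries(journal_lines))  # {'JE-101': 50.0}
```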

“Our goal is to audit technology with technology, to make financial audits more efficient and to improve our customer experience,” said Paquette.

With so many issues at stake when it comes to the accuracy, completeness and reliability of financial data and ESG metrics, reliable sources of truth are absolutely necessary. Businesses today must not only chart their digital journeys strategically, but also assess the technology controls that facilitate them.

Russ Banham is a Pulitzer-nominated financial journalist and best-selling author.

