So says Jenn McMackin at Gresham in a Q&A about normalizing, validating, and enriching the data that drives multiple operations.

(Reconciliation and exception management, net asset value oversight, compliance checks, fee billing, and settlement all rely on data that has been properly captured, validated, and distributed — sometimes from thousands of sources. In an FTF News Q&A, Jenn McMackin, global director, data and managed services at Gresham, takes on these issues and discusses the provider’s Pulse Data offering, which won Best Data Aggregator in the FTF Awards program for 2025. The challenge for Gresham is to ensure that, regardless of how the data arrives, it is transformed into high-quality outputs that can scale and drive operations.)
Q: Gresham’s Pulse Data won Best Data Aggregator in the FTF Awards program for 2025. Why do you think your company’s offering placed first?
A: I think we came out on top because Pulse Data solves an ongoing industry challenge, delivering clean, complete, and timely data from thousands of feeds across financial services. Our offering stands out for its ability to normalize, validate, and enrich high-volume data at scale, while also being flexible enough to support bespoke client workflows and needs. It’s the trusted service provided by our team that makes Pulse Data unique and clearly valued by our clients.
Q: Gresham’s Pulse Data encompasses an industry data set of approximately 6,000 feeds. In general, how do financial services firms link these data flows to the workflows for reconciliation, compliance, corporate actions, NAV, fund accounting, performance, client fee billing, and more? Is it all handled by application programming interfaces (APIs)?
A: While APIs can play a role in interoperability, what’s more important is how the data is curated, structured, and linked across functions. Pulse Data is designed to serve as an operationally ready data service that seamlessly integrates into a firm’s ecosystem. One of the key differentiators of Pulse Data is its ability to handle data at scale, from thousands of sources, regardless of how the data is sourced.
Whether it is reconciliation, NAV [net asset value] oversight, compliance checks, fee billing, or settlements, firms rely on Gresham not just for the volume of data it supports, but for the flexibility in how that data is captured, validated, and distributed. It’s about ensuring that no matter what shape the data comes in, it’s transformed into high-quality outputs ready to drive operations.
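(Editor’s note: as a rough illustration of the kind of normalization described above, the sketch below maps differently shaped feed records onto one common layout. The source names, field names, and mappings are hypothetical assumptions, not details of Gresham’s Pulse Data.)

```python
# Hypothetical sketch: normalizing feeds that arrive in different shapes
# into one common record layout. Source and field names are assumptions.

FIELD_MAPS = {
    "custodian_a": {"isin": "security_id", "px": "price", "ccy": "currency"},
    "fund_admin_b": {"SecurityISIN": "security_id", "Price": "price", "Currency": "currency"},
}

def normalize(source: str, raw: dict) -> dict:
    """Rename source-specific fields to the common schema, dropping extras."""
    mapping = FIELD_MAPS[source]
    return {target: raw[src] for src, target in mapping.items() if src in raw}

print(normalize("custodian_a", {"isin": "XS123", "px": 101.5, "ccy": "EUR", "venue": "XLON"}))
print(normalize("fund_admin_b", {"SecurityISIN": "XS123", "Price": 101.5, "Currency": "EUR"}))
```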
Q: Gresham’s Pulse Data service collects and validates a wide variety of data. What is involved in the data validation process?
A: Our validation process is multi-layered and highly configurable. We start by applying structural checks for format, completeness, and logical consistency. Then, contextual validations compare incoming data against historical trends, expected patterns, and known reference points.
Finally, clients can layer in custom rules or tolerances specific to their operations. This proactive approach helps identify issues at the source before they trickle downstream, saving time and reducing operational risk.
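(Editor’s note: the following minimal Python sketch illustrates layered validation of the kind described here: structural checks, contextual checks against recent history, and a client-configured tolerance. The field names, thresholds, and rules are illustrative assumptions only, not Gresham’s actual implementation.)

```python
from statistics import mean, stdev

# Hypothetical layered validation: structural checks, contextual checks
# against history, and a client-specific tolerance. All names and
# thresholds are assumptions for illustration.

REQUIRED_FIELDS = {"security_id", "price", "currency", "as_of_date"}

def structural_checks(record: dict) -> list[str]:
    """Format, completeness, and logical-consistency checks."""
    errors = []
    missing = REQUIRED_FIELDS - record.keys()
    if missing:
        errors.append(f"missing fields: {sorted(missing)}")
    price = record.get("price")
    if price is not None and not isinstance(price, (int, float)):
        errors.append("price is not numeric")
    if isinstance(price, (int, float)) and price < 0:
        errors.append("negative price")
    return errors

def contextual_checks(record: dict, history: list[float]) -> list[str]:
    """Compare the incoming value against recent history for the same security."""
    errors = []
    price = record.get("price")
    if len(history) >= 2 and isinstance(price, (int, float)):
        mu, sigma = mean(history), stdev(history)
        if sigma and abs(price - mu) > 4 * sigma:
            errors.append("price deviates sharply from recent history")
    return errors

def client_tolerance_checks(record: dict, max_move_pct: float, last_price: float) -> list[str]:
    """Client-configured tolerance: flag day-over-day moves beyond a threshold."""
    errors = []
    price = record.get("price")
    if isinstance(price, (int, float)) and last_price:
        move = abs(price - last_price) / last_price * 100
        if move > max_move_pct:
            errors.append(f"price moved {move:.1f}%, above {max_move_pct}% tolerance")
    return errors

# Example usage
record = {"security_id": "XYZ", "price": 151.2, "currency": "USD", "as_of_date": "2025-06-30"}
history = [100.1, 100.4, 99.8, 100.2]
issues = (structural_checks(record)
          + contextual_checks(record, history)
          + client_tolerance_checks(record, max_move_pct=10.0, last_price=history[-1]))
print(issues or "record passed all layers")
```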
Q: How does data aggregation tie into the growing embrace of A.I. within post-trade operations? Would Pulse ever use A.I.?
A: A.I. and data aggregation are deeply entwined. For A.I. to be effective in post-trade operations, whether for exception management, analytics, or intelligent workflow routing, it needs access to accurate, comprehensive data. Pulse lays down that foundation.
We’re actively exploring A.I.-driven enhancements within Pulse, such as anomaly detection, automatic ticket creation, and automatic classification of data issues, among other areas. The future is about enabling operations teams to work smarter by providing clear data insights.
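(Editor’s note: as a hypothetical sketch of how detected data issues might be auto-classified and turned into tickets, the example below uses simple keyword rules. The categories, keywords, and ticket structure are illustrative assumptions, not a description of Pulse.)

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from itertools import count

# Hypothetical sketch: auto-classify data-quality issues and open tickets.
# Categories, keywords, and the Ticket shape are illustrative assumptions.

_ticket_ids = count(1)

@dataclass
class Ticket:
    issue: str
    category: str
    ticket_id: int = field(default_factory=lambda: next(_ticket_ids))
    opened_at: str = field(default_factory=lambda: datetime.now(timezone.utc).isoformat())

CATEGORY_KEYWORDS = {
    "completeness": ["missing", "empty", "null"],
    "timeliness": ["late", "stale", "delay"],
    "accuracy": ["deviates", "tolerance", "mismatch"],
}

def classify_issue(issue: str) -> str:
    """Assign a data-quality category based on keywords in the issue text."""
    lowered = issue.lower()
    for category, keywords in CATEGORY_KEYWORDS.items():
        if any(k in lowered for k in keywords):
            return category
    return "uncategorized"

def open_tickets(issues: list[str]) -> list[Ticket]:
    """Create one ticket per detected issue, tagged with its category."""
    return [Ticket(issue=i, category=classify_issue(i)) for i in issues]

tickets = open_tickets(["price deviates sharply from recent history",
                        "missing fields: ['currency']"])
for t in tickets:
    print(t.ticket_id, t.category, "->", t.issue)
```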
Q: For those whose main focus is to oversee data operations, how important is the role of data aggregation?
A: It’s absolutely essential. Aggregation isn’t just about collecting data; it’s about making the data usable, traceable, and actionable. Without strong aggregation, the entire operational stack becomes fragile and reactive. Pulse empowers operations teams to be proactive, confident, and efficient.
Q: Has Gresham supported any initiatives for securities industry data standards? If so, what were they, and how have they helped Gresham better serve its clients?
A: Yes, Gresham has been a supporter of initiatives like ISO 20022 adoption and industry-wide reference data standardization. We are actively engaged in industry discussions with our clients and peers through events, panel discussions, and user groups.
Q: Are there any technologies and/or standards on the horizon that could help with securities data aggregation?
A: We see several developments on the horizon. Broader adoption of ISO 20022 will continue to improve standardization and reduce friction in cross-border messaging. Advances in distributed ledger technology (DLT) and tokenized securities could also drive new forms of reference data and transparency.
In parallel, open-data and interoperability frameworks — supported by regulators and industry groups — will make it easier to connect data across firms and ecosystems.
Finally, the combination of A.I., including agentic A.I. powered by the Model Context Protocol (MCP), and machine-readable standards holds great promise for automating data validation, mapping, and enrichment at scale. At Gresham, we’re preparing Pulse to take advantage of these innovations so our clients can stay ahead of the curve.