Business development executive Bill Sullivan’s career began over two decades ago with IBM, and he has since held leadership positions with Oracle, Cloudera, and Amazon Web Services, among others. With his experience in strategy and development, he has worked to push organizations further into their target markets.
In a recent interview with ExecutiveBiz, Sullivan, who now serves as vice president and general manager of Denodo’s federal sector, spoke about challenges for government agencies in moving to the cloud and adopting artificial intelligence as well as Denodo’s expansion into these areas of the federal marketplace.
What are some of the key barriers that remain in widespread federal AI adoption, and how do you think we can overcome them?
There are two key barriers, the first being contractual. The government faces high and growing barriers to adopting new AI applications. Some of them are essential, like the security requirements for operating in the federal government, but others are contractual, and this latter group can and should be easily overcome. The government needs to give the systems integration community an incentive to add new technology to existing or prospective five-year firm-fixed-price contracts. Currently, awards are often made on a firm-fixed-price basis with little incentive for prime contract holders to try to add new technology. By building consideration of new technologies into these contracts, the government could reward primes for introducing new technology as it becomes available.
The second barrier is architectural. The government has data scattered on-premises, in hybrid clouds and in multi-clouds.
AI applications produce more granular, predictive results when they have a steady supply of relevant, reliable data drawn from as many different sources as possible. Locating that data, accessing it and using it in a timely fashion are particular challenges for the government given its distributed data architecture. A typical corporation does not have three or more disparate cloud service providers, or CSPs, operating in its environment, but this is common for the government.
A lot of our customers are using Denodo because we allow them to access data from multiple clouds in real-time, securely, and at scale. The ability to find, catalog and make all data available enables better testing of AI applications in government customers' environments and brings selected AI applications into production faster and more cost-effectively. The Denodo platform also features AI/ML functionality for dataset recommendation, collaboration, performance optimization and DataOps that is highly valuable to our enterprise customers.
What is the top challenge you’re seeing as federal agencies migrate to the cloud? What solution would you propose to this problem?
It is easy to migrate to the cloud. Each of the CSPs has tools for data ingest and a consulting ecosystem designed to help customers get data into any of the prospective cloud environments. What is hard is accessing and understanding what data you have in a multi-cloud environment to ensure the data you are accessing is complete, timely and accurate for a specific purpose.
The reason Denodo has seen 203 percent revenue growth in federal business over the last year is that our government customers understand that we allow them to catalog their data, and instead of trying to replicate it for applications, production, or data lakes, they can use the data where it rests. By leveraging data virtualization to enable access to the data, we bring the computing to the data instead of bringing the data to the computing.
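The "bring the computing to the data" idea can be illustrated with a minimal sketch. All names below are hypothetical and this is not Denodo's actual API; it simply shows a virtual layer that catalogs sources and federates a query across them at request time, rather than replicating data into a central store.

```python
# Minimal data-virtualization sketch: a virtual layer federates a query
# across several sources at request time instead of copying the data.
# All names here are illustrative; this is not Denodo's actual API.

class Source:
    """One data source (e.g., an on-prem database or a cloud store)."""
    def __init__(self, name, rows):
        self.name = name
        self._rows = rows  # stand-in for data that stays where it is

    def query(self, predicate):
        # The predicate is "pushed down" to the source, so only the
        # matching rows cross the wire -- the data itself never moves.
        return [r for r in self._rows if predicate(r)]

class VirtualLayer:
    """Catalogs sources and federates queries across all of them."""
    def __init__(self, sources):
        self.sources = sources

    def query(self, predicate):
        results = []
        for src in self.sources:
            results.extend(src.query(predicate))
        return results

layer = VirtualLayer([
    Source("on_prem_db", [{"id": 1, "cls": "U"}, {"id": 2, "cls": "S"}]),
    Source("cloud_a",    [{"id": 3, "cls": "U"}]),
    Source("cloud_b",    [{"id": 4, "cls": "U"}, {"id": 5, "cls": "S"}]),
])

# One query, answered from three environments, with no replication.
unclassified = layer.query(lambda r: r["cls"] == "U")
print([r["id"] for r in unclassified])
```

The design point is that the virtual layer holds only catalog metadata and connectors; each source keeps custody of its own rows, which is what avoids the replication that a central data lake requires.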
The second reason we have seen this growth is that we are able to do it securely, at scale, in both classified and unclassified environments, accessing both structured and unstructured data. Finally, we are growing because we uniquely help the government by managing clearances for data access, ensuring that only those who should have access do have access, and we can provide an audit trail for that access.
Our customers include the Missile Defense Agency, the U.S. Army and ten of the National Laboratories. We handle some of the largest, most complex data in the U.S. government, securely and at scale. We also have three of the largest systems integrators as customers, who use us for the same reason: understanding where their data is and accessing it in real-time without having to replicate it.
One of the things the government is waking up to on a monthly or quarterly basis is the cost of managing its cloud implementations. Customers using Denodo are able to access their data without incurring IO charges for regular queries; if they have a regularly scheduled query, we can cache the data while still ensuring they are accessing the most up-to-date results.
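The cost-saving mechanism described above can be sketched as a simple time-to-live cache in front of a query. This is illustrative only, not Denodo's implementation: a regularly scheduled query is served from the cache until the entry expires, so the metered backing store is hit, and IO charges incurred, only once per refresh interval.

```python
import time

# Illustrative TTL cache in front of a query; not Denodo's implementation.
# A scheduled query is answered from the cache until the entry expires,
# so the (metered) backing store is only hit once per refresh interval.

class CachedQuery:
    def __init__(self, run_query, ttl_seconds):
        self.run_query = run_query    # function that hits the backing store
        self.ttl = ttl_seconds
        self._value = None
        self._fetched_at = None

    def get(self, now=None):
        now = time.time() if now is None else now
        if self._fetched_at is None or now - self._fetched_at >= self.ttl:
            self._value = self.run_query()   # incurs IO charges
            self._fetched_at = now
        return self._value                   # otherwise served from cache

calls = {"count": 0}
def expensive_query():
    calls["count"] += 1                      # count trips to the store
    return ["row1", "row2"]

cq = CachedQuery(expensive_query, ttl_seconds=300)
cq.get(now=0)      # first call hits the store
cq.get(now=100)    # within TTL: served from cache
cq.get(now=400)    # TTL expired: refreshed from the store
print(calls["count"])  # prints 2 -- the store was only hit twice
```

The refresh interval would be tuned to the schedule of the query itself, so freshness is preserved while per-query IO charges are not repeated.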
Where are you seeing opportunities for expansion in your company’s portfolio? What new capabilities or markets are you eyeing?
The government at all levels is moving to a multi-cloud world. We are seeing an explosion of Denodo adoption across all federal market sectors as well as in state and local governments. The government is being pushed or drawn into the multi-cloud environment, and once there, it must consider how to access its data.
One solution is to build a data lake in the cloud. However, it is possible to leave the data where it is and access it in real-time, which turns the current thinking on its head. Customers are realizing they do not need to first move data into all the clouds and then set up a data lake; the data can remain in the various cloud environments, taking advantage of whatever value each cloud uniquely offers.
We can also access that data quickly without having to move it into a data lake for a second time or incur IO charges.
We have a sales team dedicated to each of the federal government markets and the state and local government market. We have dedicated marketing, channels and contractual relationships via Carahsoft. This allows us to serve the government at all levels.
What can you tell us about how federal agencies are handling the massive influx of data we are seeing today?
Data architecture has historically cycled through phases of centralization and decentralization. We are currently at a juncture where enterprise data remains extremely distributed and is unlikely to come back together, which necessitates new approaches to dealing with such distributed data. The key challenge for the government is not just the massive amount of data distributed across its various clouds, but also the rate at which it is increasing, understanding what data it has and using that data in a way that makes it valuable. Doing all this in a timely fashion is critical. Fortunately, this is what we do.
We can enable things such as JADC2, or Joint All-Domain Command and Control. There will always be some latency, but hopefully no more than split-second latency in accessing data to make targeting and defense decisions in a given tactical situation.
That is where Denodo really shines. You need unrivaled data access to generate actionable insights from massive amounts of data, both structured and unstructured, in a classified environment to be used for life-and-death decisions like targeting or defense. We can be an essential partner.