January 14, 2023

Systecon CTO Justin Woulfe Sheds Light on Building Predictive Maintenance Tech for Military Clientele & Beyond


Justin Woulfe is committed to the mission of helping aviation, defense, and oil and gas industry clients make data actionable. At Systecon, where he serves as chief technology officer of the North American arm, Woulfe focuses on creating predictive maintenance tools and systems that help clients develop and upgrade support structures and reinvigorate strategy, manpower and readiness for weapon systems and beyond.

Woulfe has an electrical engineering background and spent over five years at Lockheed Martin after his undergraduate work at Virginia Military Institute. At Lockheed, he began working in the mission-critical logistics group under the supervision of the corporate vice president for logistics. It was in this department that he first encountered the concept of using modeling and simulation to predict the future, and he became fascinated by the ability to gain insight into what level of weapon system performance might be expected in the coming years.

The executive eventually decided to join a group of young professionals who constitute the founders of Systecon’s U.S.-based business (the company collaborates closely with a Swedish software development unit that was the organization’s genesis). Woulfe spoke with ExecutiveBiz recently to discuss the complexity of bringing a large group of systems of record together to build predictive models, overcoming the sometimes confrontational dynamic between government and industry and more.

Data is often coming from multiple sources that organizations need to collect, analyze and understand in order to use it. What are some of the key challenges and opportunities you’re seeing emerge as organizations harness data and use it to drive decisions?

If you start with the end goal and ask, ‘why are we collecting data? What do we intend to do with it?’ I would posit that, in general, the reason why we have data, especially when we think about weapon system performance or some sort of measure of capability of our weapon systems—it’s really about how we understand the mission performance or mission effectiveness that we can expect from the system.

In that same realm, all of the data coming off that platform or the data that we’re using or keeping or storing, should in some way be useful for that purpose. As the systems become more complex and more interconnected, there is more and more data.

If you look at, let’s say, weapon systems from the 1980s, there was a limited set of computational and sensor data available that could be used to predict future performance. Now you’re seeing terabytes of data coming off these systems, and not all of it is useful. So, as we think about what to do with it, the first question is how we bring these data systems and data sources together to create something actionable. Our goal is to influence mission performance or mission capability, and to do that, disparate systems often need to be connected to one another. We need to find ways to very quickly transform that information and make it available and useful.

We’re talking about sensor data off the platform itself; the maintenance records from what happens in the depots when we actually go fix the components or the weapon systems; and, of course, our supply transaction data: how much are we paying for components? How much are we paying for the repairs of these things?

Frequently there are a dozen different systems of record that need to be brought together to create a model that can usefully predict something in the future. As we think about bringing all that data together, finding ways to make these systems talk to one another in a more complete or, to use the buzzword, more holistic manner will really enable us to do something with the terabytes of data we’re collecting, rather than putting it in a place and looking at it in awe. We want to be able to act on it in a very timely, matter-of-fact way so that we can influence operations in near real time.
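As a loose illustration of that fusion step, the sketch below joins three invented systems of record (sensor hour counters, depot maintenance events and supply costs) into one view per part number. All field names, part numbers and values here are hypothetical, not drawn from any real system.

```python
from collections import defaultdict

# Illustrative records from three hypothetical systems of record.
sensor_hours = {"PN-1001": 5200, "PN-1002": 830}            # operating-hour counters
maintenance = [("PN-1001", "bearing replaced"),
               ("PN-1001", "seal leak")]                    # depot events
supply_cost = {"PN-1001": 1250.0, "PN-1002": 90.5}          # unit repair cost

def build_part_view(hours, events, costs):
    """Fuse the three sources into one record per part number."""
    by_part = defaultdict(list)
    for part, event in events:
        by_part[part].append(event)
    view = {}
    for part in set(hours) | set(by_part) | set(costs):
        view[part] = {
            "operating_hours": hours.get(part),
            "maintenance_events": by_part.get(part, []),
            "repair_cost": costs.get(part),
        }
    return view

view = build_part_view(sensor_hours, maintenance, supply_cost)
print(view["PN-1001"])  # fused record: hours, both depot events, repair cost
```

In practice each source arrives in its own schema and cadence, so the real work is in the transformation and keying, but the shape of the output, one analyzable record per asset, is the same.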

What kind of tools and technologies can organizations use to make their data more accessible and understandable?

Before we even get to tools and technologies, in many instances there is a process or cultural barrier that we need to overcome first, and part of that is making sure we’re putting the right data requirements in our contracts. This entails ensuring that we have the right relationship with our vendors and with the original equipment manufacturers of our systems so that they expose and make available the right kinds of data.

That foundational piece of buying the right information from our suppliers, or ensuring that we’re able to access the right data from these weapon systems, is the base for everything that follows. From there, it gets really interesting as we think about the tools and techniques, because every system is going to have its own unique ways of providing information, and big data solutions can be used to architect a point estimate of what we can expect out of our weapon systems.

There is a whole host of proven AI and machine learning techniques for classifying data and comparing it against previous maintenance events to do predictive maintenance. There is a whole host of optimization schemes, like heuristic or evolutionary algorithms, that enable us to ‘cut the corner,’ so to speak, with regard to simulation. Some of our war gaming simulations, as an example, can be done much faster than they traditionally have been. And then, of course, there are just good, old-fashioned database tools and techniques that enable us to bring data together.
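As a toy illustration of classifying new readings against previous maintenance events, the sketch below uses a 1-nearest-neighbor rule over invented (vibration, temperature) histories. Real predictive maintenance pipelines use far richer features and models; every number and label here is made up for illustration.

```python
import math

# Hypothetical history of (vibration, temperature) readings, each labeled by
# whether a failure followed shortly after the reading was taken.
history = [
    ((0.2, 70.0), "healthy"),
    ((0.3, 72.0), "healthy"),
    ((1.4, 95.0), "failure"),
    ((1.6, 99.0), "failure"),
]

def classify(reading, history):
    """1-nearest-neighbor: label a new reading by its closest past event."""
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])
    nearest = min(history, key=lambda item: dist(item[0], reading))
    return nearest[1]

print(classify((1.5, 97.0), history))   # nearest past events were failures
print(classify((0.25, 71.0), history))  # nearest past events were healthy
```

The design choice to show is the comparison against labeled past maintenance events; in production this is usually a trained classifier over many sensors rather than a distance rule over two.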

There are a number of vendors out there today that are doing a really good job of enabling us to do data classification in a much more rapid way. But again, at the end of the day, the key is making sure that we understand the data and that we actually have access to it. Because ultimately we can’t apply these great tools and techniques if the data doesn’t exist. And so making sure we can get it, that we understand it and we have persistent access to it is really critical.

What’s the most impactful trend you’re currently seeing in the GovCon market? How are you seeing GovCon organizations respond to that trend?

First, there is the cultural side of things. Traditionally, over the last eight or 10 years, you’ve seen this somewhat confrontational relationship between the contractor community and the government community. You have the government on one side saying, ‘Hey, we’re buying an asset.’ And the contractor on the other side saying, ‘okay, you can buy it.’ But there was always this clash of maybe the government people feeling like they weren’t getting a good deal and the contractor community feeling like the government was asking for things that were unreasonable. I never really see that antagonism in our customer base in the UK, for example, where there has always been this very positive relationship. With those clients, it truly is a teaming arrangement between government and industry.

I think over the past few years in the U.S. we’ve seen this evolution toward that partnership. And I think for us to be successful in the technology space, we really have to foster that partnership between government and industry. The government can’t do it without industry’s involvement and vice versa. The people that are designing and building these very complex weapons systems really need to be an integral part of that team moving forward to leverage some of these amazing technological pieces, right? That relationship is really critical.

We’ve just seen an explosion in capability here. At Systecon, we have been doing optimization, predictive analytics and modeling and simulation for over a decade. I’ve been with the company since 2011, and in that time we’ve seen an increase in people asking the right questions, saying, ‘hey, we really need to understand our weapon system capability or availability over the next three, four, five years.’ Certainly more recently, with potential conflicts around the world and now in Eastern Europe, there is a real desire to make sure we can apply the right kinds of capability at the right time. Doing that really requires predictive analytics.

We now have customers asking the right questions, and we have experienced a boom in computational capability, so we’re able to deploy our modeling and simulation capabilities and our optimization algorithms in quite capable cluster computing environments. In doing so, we can solve previously unsolvable problems in hours or minutes rather than the months it might otherwise take to figure them out.
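The fan-out pattern behind that speedup can be sketched loosely as follows: independent replications of a deliberately trivial stand-in simulation are mapped across a local worker pool. A real deployment would use a distributed cluster executor and a genuine simulation model; the availability model and all parameters here are invented.

```python
import random
from concurrent.futures import ThreadPoolExecutor

def simulate_availability(seed, flights=1000, failure_rate=0.02):
    """One Monte Carlo replication: fraction of sorties completed without a
    failure. A toy stand-in for a real modeling-and-simulation run."""
    rng = random.Random(seed)
    ok = sum(1 for _ in range(flights) if rng.random() > failure_rate)
    return ok / flights

# Fan 100 independent replications out across workers; on a cluster the same
# map-over-seeds pattern applies with a distributed executor instead.
with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(simulate_availability, range(100)))

estimate = sum(results) / len(results)
print(round(estimate, 3))
```

Because each replication depends only on its seed, the runs are embarrassingly parallel, which is what lets wall-clock time shrink roughly with the number of workers.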

That speed has been really critical in enabling us to apply our algorithms and answer questions in a relevant timeframe. And then the continuous development of the algorithms themselves has brought real benefits: the evolution of predictive maintenance algorithms and their ability to predict failures and understand weapon system effectiveness, but also work in the genetic algorithms space, where we’re able to shortcut analyses that would traditionally take months. That continuous improvement toward more real-time analytics is really cool, and it has been fascinating to watch come to fruition over the last decade.
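A minimal genetic algorithm of the kind alluded to might look like the sketch below, which evolves spare-stock levels against a made-up fitness function. The problem, the fitness scores and every parameter are invented purely to show the crossover/mutation/selection loop.

```python
import random

rng = random.Random(42)

def fitness(stock):
    """Toy objective: availability with diminishing returns, minus holding
    cost. Purely illustrative; optimum here is three spares of each part."""
    availability = sum(min(s, 3) for s in stock)
    cost = 0.4 * sum(stock)
    return availability - cost

def evolve(generations=50, pop_size=20, n_parts=4):
    # Random initial population of stock-level vectors (0-5 spares per part).
    pop = [[rng.randint(0, 5) for _ in range(n_parts)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[: pop_size // 2]        # elitist selection
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = rng.sample(survivors, 2)
            cut = rng.randrange(1, n_parts)
            child = a[:cut] + b[cut:]           # one-point crossover
            if rng.random() < 0.2:              # occasional mutation
                child[rng.randrange(n_parts)] = rng.randint(0, 5)
            children.append(child)
        pop = survivors + children
    return max(pop, key=fitness)

best = evolve()
print(best, round(fitness(best), 2))
```

The shortcut is that the search evaluates a cheap fitness function instead of exhaustively simulating every candidate configuration, which is the broad idea behind using evolutionary methods to cut simulation time.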

More and more of our nation’s critical systems and capabilities are reliant on advanced software. Can you tell us your vision of the role software will play in the future of the U.S. government?

I don’t think there are too many things in the fleet or in inventory today that aren’t software-driven. There aren’t very many analog components out there, aside from small arms; everything else has a computer connected to it. Whether we’re talking about embedded software controls or cloud computing, nearly everything is software-driven, and I don’t think that trend is going to change.

It’s really critical that we maintain the ability to produce good software. Oftentimes we get enthralled by the bright, shiny object, whether that’s the weapon system, the bomber or the new truck or tank, and forget that all of it is software-driven and that there are millions of lines of code behind it. Being cognizant that it’s all software-driven, and that this isn’t going to change, will become even more important, and it requires a continuous evolution in software and, frankly, good software developers. Within the Department of Defense, it’s harder every year to find software engineers who can get a TS clearance. So that brain trust inside our nation is really important to preserve, and it’s something we’re really interested in here at Systecon.

For example, we have a number of scholarship programs for U.S. citizens that are in software and related fields because it’s really critical. It’s really important that we as a nation are able to have that kind of capability and bring that kind of capability to bear.

It’s only going to become more critical as these systems become more complex, more interconnected and more evolutionary. If we can push a software update to an operational weapon system much like Tesla does with their car fleet, where you get a new software push every month that enables some new platform capability—that’s incredible. And there is no reason why we can’t. We just need the right skillset, the right people and the right security protocols and things like that to support them.

Learn more here: https://blog.executivebiz.com/2023/01/systecon-cto-justin-woulfe-sheds-light-on-building-predictive-maintenance-tech/


