Sep 18, 2019
| 4 min read

Podcast #70: Unlocking Industrial Insight with Cognite - A Conversation with Dr. Francois Laborie

Dr. Francois Laborie is President of Cognite North America. Our conversation covered his background working in aerospace for EADS (now Airbus) and how his firsthand experience with supply chains, the assembly line, and early iterations of augmented reality helped shape his views on how information analysis is critical to optimizing business processes. We discuss some of the unique aspects of working with industrial data, and he highlights the work Cognite is doing with key clients to help bridge disparate information systems and deliver relevant, value-add insights to users.

Recommendations:

How Google Works: The Rules for Success in the Internet Century - By Eric Schmidt & Jonathan Rosenberg 


We'll notify you weekly about new podcast episodes, upcoming guests, and news. You can subscribe to the podcast, and if you'd like to be considered to appear on the podcast, contact us.

 

Transcript

Good day everyone, and welcome to another episode of the Momenta Podcasts. With us today is Francois Laborie of Cognite, and he is the President of Cognite in North America. We’ll get into this a little bit, but Cognite is a company which is really focused on data, in particular they’re doing a lot of interesting work around industrial data, so very close to our passions around industrial IoT, they’re doing some fascinating work so we’ll dive in a bit deeper. Francois, first of all I want to thank you for joining us.

Well, thank you for having me Ed.

Great. First, I’d like to start with a little context and a bit of background, and understand a bit of your history, and what brought you to your focus, and ultimately to your current role at Cognite.

That’s interesting; looking back, I don’t do that very often, but thinking about the path I took, I actually started working with industry and industrial players right at the start of my career. Back in 2001 I joined Airbus Group, which was called EADS at that time. Back then I was a computer scientist; I joined the research team and we worked very closely with the operations. I knew nothing of aeronautics when I joined them; I had a pure computer science/computer vision background.

As I started working with EADS we were asked to work very closely not just with the engineering teams, which had super-interesting challenges, but also with the operations on the shop floor where they were assembling the planes. That’s where I realized that there is so much opportunity working with heavy industrial players, because the complexity of the operation, the number of people involved, and the processes involved can benefit from technology on multiple levels. So, that was an interesting start to the career, and I came back to it, obviously, when I joined Cognite, through these first experiences.

So, back at Airbus I worked together with the engineers. Airbus had this vision; EADS in general and Airbus in particular were spearheading what was called at that time the virtual mock-up, or the digital mock-up. What that meant was that they were building a massive plane, the A380, entirely virtually, using CAD, using only digital systems. That was extremely early on; no other programs had been doing that, at least in Airbus and even globally for the major aircraft manufacturers. So I was working together with the engineers, trying to support them there.

So, they were working remotely, they had a full chain of suppliers, and at that time it was eye-opening for me, because I realized that it was not only about the technology, it was also about the way that people worked. Actually, that led to the topic of my PhD, which was collective decision-making: how do groups of people come together to make decisions. It was heavily applied to industry and our knowledge in that sense.

I also worked on the assembly lines, and there again I had no idea what that was like, so I was on the floor. At that time it was the A380 programme that was assembled in Toulouse, where I grew up and where I worked. They were having massive issues with multiple suppliers. I mean, Airbus is traditionally a European plane, and you could say it was a global plane already at that time, where you had parts of the plane coming from all over Europe, being shipped, or freighted, to Toulouse and assembled there. Things change, and things have delays, so you had people coming from Germany, from Spain, contractors being involved in these massive hangars, and they were doing hundreds of tasks in parallel, with a huge amount of inter-connection and interdependency between all the tasks.

So, that’s when we started playing, as much as on the engineering side, where we were playing with digital mock-ups and 3D models and sharing. Here it was really about giving the information to the user: the information they needed in order to know what they had to do, and in order to be aware of the context in which they were doing it. So, they would watch what the other team, the German team on the other side of the plane, was doing, because that might impact what their task sheet said they should be doing today. So, we deployed some of the first tablets in the field; they were pretty heavy things, and they didn’t have very long battery life, but the users appreciated this.

We also started playing with AR, so we had some of the first headsets deployed as well, and that was a little bit less of a success! They were giving people massive headaches, they were also very cumbersome, a lot of wires, very poor battery life, but it was starting to show the path to a lot of things that have been discussed in the industry and are still not rolled out at scale in the industry today. That was a really, really fascinating experience.

So, that was my first real interaction with both industry, and how technology could be applied in industry.

On the personal side, I fell in love and I moved to Norway, and I was very lucky, as I moved, to join a software company that was working exclusively with media. On the distribution side of media there was a massive change ongoing, with the on-demand revolution and the change of distribution channels. But on the production side there were also a lot of challenges. When I joined, that company had a little over 100 people operating in a few countries already, and I was part of a journey of growing it to around 600 people in 40 countries. We were working with all the major broadcasters around the world, most of the sports leaders around the world, helping them on the production side, focusing on live production.

The value of live, as I’m sure you are aware, has been increasing tremendously, so live original content production was something that was really, really important for a lot of our customers. Our focus as a software company was going to an industry and saying, ‘You do not need specialized hardware in order to produce this content. Computers are more than powerful enough to help you produce full shows and cover full events on the sports side.’ That resonates quite a lot with some of the things you see in industry right now. But it was also coming to this place and saying, well, actually, the journalists used to do very defined tasks, and you had a lot of people collaborating in order to produce a piece, and that’s just simply not economical. That’s simply boxing people in, so you had very disconnected teams working on a single piece, on a single event.

Our approach, coming from Norway, which is a very high-cost but also a very progressive country when it comes to technology adoption, was to say we will empower the user to perform their full task. So, we’ll empower the journalist to be able to create and edit his piece, and the voice-over, and the graphics; or enable the producer to produce the full show. What that meant was that you had a lot of intelligence in the software, you had great UIs that were built up, and it was about efficiency, empowerment, and of course, overall, allowing people to produce more content.

There again, these are things that I took with me as I moved to Cognite, because it is about changing the way people work; it’s about giving them the right tools and the right information in order to make them more efficient. As I joined Cognite, that’s where I reconnected with my roots in industry, and with everything that I’d been doing in real-time software and enterprise software when I was working at SRT.

That’s some great background, and I think it’s unusual to be able to have that experience, both in media and in industry around data, just because the business problems around industrial data are enormous and sprawling, but also when you’re dealing with media the need to create, tag, and organize this unstructured data presents a whole different set of challenges. So, I think that’s great perspective in being able to attack some of the bigger problems around managing industrial data.

I’d love to turn to Cognite, and if you could tell me a little bit about what the company’s mission is, and the focus, and what were some of the business problems that had attracted the founders to start up the company?

That’s a great question Ed, and you mentioned it’s a bigger challenge, and indeed it is. The customers we work with are the customers that are effectively keeping our world running, whether we’re talking about energy customers, which are both providing the fuel and keeping the lights on, people that work in transportation, or customers that work in the manufacturing segments. These are really a different scale of customer, and very often they’re hidden behind the scenes, but without them the world doesn’t run. I think that’s a fascinating challenge.

So, the mission of Cognite is not just to liberate the data that they have, but also to effectively enable them to operate in a completely different fashion, and maybe I’ll take a few steps back there.

Cognite originates from the industry itself. The founder of Cognite, Dr. John Markus Lervik, worked very closely with Aker, which is one of the largest industrial conglomerates, or industrial owners, in Norway, with portfolio companies spanning from fishing, to oil and gas, to energy; so they have a very wide portfolio of companies. They were profound believers in the ability of technology to transform the way they were working, but they were also very frustrated by the fact that they were doing great POCs, they were working with great companies, great partners, but at the end of the day it was having very little impact on the operations. So, they could see clearly that there was a disconnect, and I’m not talking about the traditional operating technologies, I’m talking about making use of the data, what people normally drop under these big terms of digitalization.

So, they were doing advanced analytics projects, they were doing a lot of transformation proof-of-concept tests, but at the end of the day very little was impacting the day-to-day operation. When they started looking at the problem, they realized very quickly that there were a few factors preventing them from getting the impact that they should have had from these initiatives. The first of them is that very often the POCs were taking too much effort and time to get the right data in order to give a result. So, they were spending far too much time looking and going spelunking for the right data in order to build the right algorithm, the right visualization tool, predictive model, or what have you, and to feed the process that should come out of it. They ended up spending far too much time understanding and gathering the data.

The other issue was that they quickly realized this was never going to be something that they would solve on their own. These are massive challenges; they have a lot of domain expertise, they are specialists obviously in what they’re doing, which is operating heavy industries, but some of the competencies needed to solve their problems would have to come from partners. They would have to come from an ecosystem of partners, and the only way to attract an ecosystem of partners is to start by admitting that. If you take that statement to its conclusion, it means that you have to be ready to change the way you work with your ecosystem of partners, and you’re going to have to be a lot more open about inviting them to solve the problems. It cannot be just a traditional supplier-buyer relationship. That has had a profound impact on the structures.

The last realization was that, again, they were not a software company; they didn’t have the software competencies to build something like a data platform. So, they were very clear from the start that they wanted a product. They believed that the ability to extract this data and make sense of it so they could scale was something that would benefit so many other actors in the industry that they were certain they would find a product to help them do that. So, they set out to do this, and at that time, a little over three years ago, they obviously talked to GE Predix, they talked to a lot of the cloud vendors, and a lot of players in this industry. After doing this tour they came to the conclusion that it was either very smart people doing great applications, but not really trying to have a holistic view on how to make the data available at scale; they used to say available like oxygen, I mean you need it to breathe, and it should be available like oxygen.

So, they found companies doing machine learning, doing AI, and they were piping data, but nothing that was at scale. The other thing they found was a lot of people that were ready to sell services, so bespoke, using technology but really bespoke solutions. And again, you go back to their belief that this is not going to be the future; this has to be a product, because then they can rise with the tide: as the solution gets better and more customers are using it, they would benefit from features and insights provided by new customers. So, that was really the cradle that created Cognite. They decided it was an opportunity, they were working with John Markus Lervik on some of these initiatives, and basically they said, ‘Well, if it doesn’t exist then we have to create it, because it’s certainly going to benefit the industry.’

They had a discussion about, ‘How do we go about building such a company?’, and it was pretty clear that it had to be an independent company, and it had to be a software company in its DNA, really focused on building the product with software competencies. But it had to be anchored in the industrial world’s reality. It could not be something built in California, or in a garage; it had to be built in close collaboration with industrial actors in order to understand: what are we talking about there, what type of use cases, what type of value are we talking about? That’s how Cognite was created, with a mandate to build a really top software team that would work very closely with the first customer, which would be provided by Aker: Aker BP, a joint venture between Aker and British Petroleum on the Norwegian continental shelf.

You’re talking about certainly some massive datasets to start with. As you’ve discussed the business problem, I’d be interested to get your perspective on how different, and how difficult data from industrial systems can be, compared to some of the data from your traditional enterprise applications, or media I guess from your perspective. If you could maybe compare and contrast your experience there.

That’s a great question, and that really is the base and the fundamentals. You could say data is ones and zeros, so what’s the big difference, and that would be a fair comment. But if you want to actually do something with that data, it has to have a meaning, so you cannot just stop at, I don’t know, the IoT stream that you’re receiving, or the IP reference that you’re receiving. You need to understand the data in a context, so that you can share it with these partners, so that they can create value out of it.

What we’re looking at here is data that, if I take some of our process customers, they would describe a process, so there’s a flow; it would be attached to a piece of equipment or an asset. So, it’s not just ones and zeros; you need to understand that this is actually a pressure sensor that is attached to, say, a pump. You may want to know, for that pump, what has been happening to it recently? Who touched it last? When was it bought? What does it look like today: does it have rust, does it have cracks, do we have all the samples that have been taken? What is it connected to from an electrical standpoint, and from a flow and process standpoint?
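To make the idea of "data in context" concrete, here is a minimal sketch of what those questions look like once a pump's sensor readings, maintenance history, and inspection findings are attached to the asset itself. All names, dates, and the tag "P-101" are illustrative, not Cognite's actual data model:

```python
from dataclasses import dataclass, field

@dataclass
class SensorReading:
    timestamp: str   # ISO 8601
    value: float     # e.g. pressure in bar

@dataclass
class MaintenanceEvent:
    date: str
    technician: str
    action: str

@dataclass
class Asset:
    """A piece of equipment with its context attached, instead of bare ones and zeros."""
    name: str
    purchased: str
    readings: list = field(default_factory=list)
    events: list = field(default_factory=list)
    findings: list = field(default_factory=list)  # e.g. "light rust on casing"

    def last_touched_by(self):
        """Answer 'who touched it last?' from the maintenance history."""
        if not self.events:
            return None
        return max(self.events, key=lambda e: e.date).technician

# Hypothetical pump with a little history attached
pump = Asset(name="P-101", purchased="2014-03-02")
pump.events.append(MaintenanceEvent("2019-05-11", "A. Hansen", "seal replaced"))
pump.events.append(MaintenanceEvent("2019-08-30", "B. Olsen", "visual inspection"))
pump.readings.append(SensorReading("2019-09-01T12:00:00Z", 4.2))
pump.findings.append("light rust on casing")

print(pump.last_touched_by())  # B. Olsen
```

The point of the sketch is only that once readings, events, and findings hang off the same asset object, questions like "who touched it last?" become one-line lookups instead of searches across disconnected systems.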

So, you start hearing it: in order to answer these questions, you need to start looking at data very much in context, and you need to be able to rebuild that context. That’s the big challenge, because the context doesn’t exist for many of our industrial customers: it’s either locked into a system, or it’s modeled very differently across systems, or, for some of the operational data, nobody in the IT world knows how to make use of it and connect it back to a representation that would be useful for a data scientist, or for a specialist to use.

So, is that answering your question Ed?

Yes, that’s great. I’d love to talk a bit about some of the technology that you’re developing and understand a bit of the approach you’re taking.

Sure. Maybe before jumping into the technology, it’s interesting to look at it as: okay, so we’re talking data here, and insights, but what’s the value? What type of use cases, what type of value are you creating for your customers? When you start looking at the usage of the data, we found three categories.

The first category is just actually giving the data to the people who need it, and I go back to what I was doing at Airbus initially: there was very little intelligence in it, there was some optimization for planning, but really, at the end of the day, it was allowing the operator, the technician, to get the right information for him or her to operate, and this is still true. A lot of value for most of our customers can be captured by just giving the right information to the right people. So, here you’re looking at displaying data, whether that’s on a dashboard, on mobile, or in 3D models: taking the data that is locked into a system somewhere and making it available to them. So, here we have a whole set of technology for real-time data display, and also real-time data consumption, so that they can get accurate, up-to-date data to visualize.

The second category is starting to use the data and feed it to a machine or an algorithm to increase our understanding of what is happening in the operation. That can mean starting to give insights that are not necessarily perceivable, for example doing computer vision on equipment to be able to know what is happening right now, where people are located for safety concerns, or whether there’s rust developing right now. Or it can be to start making predictions: what is the remaining useful lifetime of my asset, or my piece of equipment? Or starting to ask questions like, ‘I’m operating a huge power network, and what are the chances of vegetation encroaching on my grid?’ That’s a huge risk of fire, so you give me the data you have, and I’m going to build programs that will tell me what my risks are, and where my risks are.

So, augmenting the data, and here we have a range of technologies to do that. Actually, that’s a place where we have learned a lot, because our original intention was to say, ‘Okay, we’re going to use this data, apply machine learning to it, and build great predictions out of it.’ Not so simple.

The third category goes back to some of the philosophical standpoints: it is to allow the data to be easily shared, so that you can create new insights. That’s extremely important, and extremely powerful, because when you start being able to share the data, you also impact the business model of our customers. So, I’ll give you two examples.

A lot of the heavy-asset operators are using huge, expensive equipment, and the business model they have with their suppliers is that they buy the equipment on a CAPEX basis, and then they have to maintain it, paying for time, material, and spare parts to maintain the equipment. At the end of the day their core competence is not to be an expert on the compressor, turbine, or generator; their core competence is to assemble it all into an operation.

So, by being able to share data with their suppliers, they can start trading hardware for insights, and maybe go all the way to performance. There again, aeronautics has been very early, where you have all these models of performance-based contracts, where you sell a jet engine by the hour, or by the kilometer. The same type of transition is happening in industry, and the only way you can do that, the only way the manufacturer can afford to take the risk, is if he or she has insights on how the equipment is behaving and how it’s operated, so that they can commit to the performance of this piece of equipment.

So, that’s one very important change in the business model, and some of these equipment manufacturers are potential clients for Cognite, obviously.

The other interesting piece is when you realize and accept that, as an operator, you don’t necessarily have the competence to solve all of the problems with the data. Fortunately, this equipment doesn’t fail very often, so you may not have all the insights you need in order to understand what failure looks like. So, you can start sharing this data with other suppliers, and Cognite has built a lot of tooling to make sharing data easy; we actually encourage our customers to share some of their data. There will always be some data that is business critical, but it’s about changing the paradigm from ‘everything is critical except…’ to ‘actually, everything can be shared except…’.

What we see there is that our customers start sharing data with some of their ‘competitors’, because that may allow them to be more efficient. A lot of this is around sustainability goals, for example: how do I operate my equipment better so that I have less environmental impact? These are things that all the operators are willing to share and learn from. Other things can be sharing with smaller players in the market, startups or new companies, and giving these companies easy access to data, to give them a chance to show what smart things they can bring to the customer. Historically it has been an extremely cumbersome process to get data from an industrial company, whilst if you make it easy to share data, then you have the opportunity to engage with a huge ecosystem of talent.

I didn’t answer your question on the technologies, so I’m happy to talk about that as well, but that was a little bit more about the why, because I think that really is, and should be, our primary driver: what is the value of it for our customer.

It did strike me, you are hitting on this powerful downstream effect of being able to empower companies with easy access to data from heterogeneous sources. What you just described is really profound; it’s business model change, and a complete rethinking of competitive dynamics too. Historically we’ve seen companies in the financial services industry, for instance, pool data to develop credit scores; even though they’re competitors, everybody benefits from having a more accurate ability to predict. But I think what you just articulated is an extension of that concept to industrial, hard manufacturing, hard goods companies, and that’s quite a significant change. I just thought that was remarkable; I didn’t want to knock you off track there, because I do want to hear about the technology.

You’re right, it’s about moving all these concepts that have been present in a lot of industries to the more operational parts, the ‘dirty, greasy’ part where the production happens. That brings us to the technology: how do you empower this type of concept? You’re not just going to connect to IT types of infrastructure and IT types of data, which are very often tables of data that are pretty well structured; you say, ‘Well, I’m going to combine that with the operational data, which can come from sensors and control systems, from cameras, from diagrams that were drawn at the outset; they’ve been printed somewhere, and no machine can understand them like that.’

So, that becomes a very interesting technology problem, where you have to apply extractors and work very closely on the security side to get access to that data. As you can imagine, security is paramount for these players; some of them may be running the critical infrastructure of a country, or generating most of the income for some of the countries they operate in. It is really critical, on the data extraction part, to have a very thorough process for where you embed this technology, and in which layers of security you embed it. There’s a lot of work being done on the edge for data extraction; I think there’s still quite a lot to be done and developed there.

The second element of it is: okay, now I’m extracting this data, but it’s still not giving me any meaningful insight, because the data is just as it was in the source systems. So, that’s where we look at what we call contextualization, which is effectively fusing the data together, so you can start asking questions like, ‘I’m interested in that compressor, tell me everything you know about it: what is it connected to, what are the sensors, what happened to it, what insights do we already have about it, and what do we expect in the future?’ And to do that, you need to use a range of technologies. Some of them will be based on machine learning: trying to understand and find patterns that allow you to understand the data that you’re manipulating, and what it is associated with.
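One very simple flavor of that pattern matching, far cruder than what a production contextualization pipeline would use, is normalizing tag strings so that a sensor tag written one way in a control system can be linked to the asset it belongs to in another system. The tag formats below are made up for illustration:

```python
import re

def normalize(tag):
    """Strip separators and case so differently formatted tags can be compared."""
    return re.sub(r"[^A-Z0-9]", "", tag.upper())

def match_sensors_to_assets(sensor_tags, asset_names):
    """Link each sensor tag to the first asset whose normalized name it contains."""
    assets = {normalize(name): name for name in asset_names}
    links = {}
    for tag in sensor_tags:
        norm = normalize(tag)
        for key, name in assets.items():
            if key in norm:
                links[tag] = name
                break
    return links

# A control-system tag and an asset-register name that differ only in formatting
print(match_sensors_to_assets(
    ["21_pt_1019a", "21-TT-2055"],
    ["21-PT-1019A", "21-TT-2055"],
))
```

Real systems have to cope with far messier cases (abbreviations, typos, tags that match several assets), which is where the machine learning and the user feedback come in; the sketch only shows the shape of the problem.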

But you also want to involve users and crowdsource these insights, so that they can help you understand some of it; you learn from them and you start creating rules from them. You also combine that with computer vision, for example to analyze printed-out documents, say a layout document, where you’re going to analyze it, understand what it means, and be able to represent it in a way that can later be queried by a machine. So the machine itself can run an algorithm saying, ‘Show me two steps downstream’, or ‘two steps upstream’, or ‘show me all the pieces that are connected between this equipment and that equipment; show me anything in the middle’.

To describe these types of complex relationships, people would normally use a graph; graph as a technology has become very popular. It has a lot of challenges when it comes to scaling to millions of datapoints, petabytes of data, and tens of millions of documents, which is effectively what we’re talking about for a single customer in this case, so there are big engineering challenges. In order to solve these challenges our approach has been very firm from the start: on-premise is really not the way to go; you have to do this in the cloud, because the type of elasticity and resources you’re going to need to perform this at scale, in a reliable and secure fashion, will force you to use cloud infrastructure.

It’s funny, because in the media world I was in exactly the same discussions: you need to move to the cloud to get access to on-demand elasticity, and the cloud vendors really specialize in maintaining this infrastructure and its security, so it would be a tremendous cost for any of our customers to maintain the same level of staffing and competences to do the same. So, part of the equation is convincing these heavy industries to embrace the cloud as a way to make sense of the data, the terabytes, if not petabytes, of data that they have and that we’re collecting. Then you need to be able to consume it, and you need to make it extremely fast to consume. There again the cloud is your ally, because you can really leverage a lot of the cloud technology.

If you think about some of the technology, take Bigtable from Google, which is powering AdWords and the AI behind AdWords, globally, in a synchronized fashion. These are the types of technology that you need if you really want to give performance to the users, whilst keeping a huge amount of data accessible in real time, in sub-second fashion.

That makes an enormous amount of sense. Who were some of your initial customers, and some of the use cases where you’re finding the most promising adoption, or the most impactful adoption?

That’s a great question: where is the impact, where do we see not just customer adoption, but where do we see that we can make an impact. As I mentioned, we started with oil and gas; that’s a very strong vertical for us on the operator side, and we’re also working with a number of operators in Europe, OMV for example, some of the smaller operators as well, and some very large global oil and gas operators. There you see that the applications empowering them with the data visualization and access we’ve mentioned, and mobile working applications, shave off 50 percent of the time to perform monitoring tasks. They can increase on-tool time, the amount of work people can actually do, by up to 70 percent, because now they have the data to do it rather than radioing home. So, you’re having a pretty big impact just by giving these people the right information.

But then, when you start combining that with predictive models, and there again there’s a very interesting trend where we’re using not just data science approaches but blending them with good old physics-based simulation, then we can help them optimize how they run the operations. Here you’re talking about tens of millions of dollars every year that can be saved just by optimizing the way they operate single pieces of equipment, because they will improve the production on this specific equipment. You can dynamically tell them what is happening inside this separator, or this compressor, so that they can optimize how much production they can get whilst remaining within very safe limits of operation. These are proven results already.

So, in oil and gas there’s a lot to be gained from data visualization, and then predictive models, and of course moving to performance-based contracts has had a huge impact; they report 100 percent improvement in the uptime of the machinery, so they’ve doubled the availability of the machinery, whilst at the same time lowering their cost and their risk. So, really big impacts.

When you move to some of the customers we have in the energy segment, we’re working with Statnett, who owns and operates the grid in Norway. Norway has these interesting challenges of very harsh weather and very harsh environments, where keeping the grid operational is a big challenge, but at the same time the highest penetration of electric vehicles, and energy production the vast majority of which is based on renewables, hydro being the main source. So, we also have customers who operate hydro plants, and the type of upside we see is enabling resilience of the grid: predicting failures, optimizing the usage of the grid based on patterns of consumption, but also optimizing the reliability of the equipment on the production side. There again, you’re talking about millions of dollars for any of these optimizations.

If I look at our shipping customers, there are two very interesting areas. They're interested in optimizing the logistics around the shipping operations, so there you have optimization. But we're also very often, in fact most of the time, involved in optimizing the handling of the ship itself. Fuel is the biggest cost center, so anything you can do to optimize or minimize fuel consumption is going to be a great benefit for the owner and operator. That means gathering real-time data, combining it with weather data and a lot of other data sources, and making sure that you can provide insight on what route should be taken and what operations should be performed. There you see up to 40 percent improvement; that's a huge amount of saving that can be achieved.

It's really compelling when you see the impact of technology being applied. In terms of alternatives, when you're working with customers and opportunities, what are your customers wrestling with? Have they historically had to deal with building solutions themselves? Do you find yourselves having to compete against others with similar ideas? Or is this still a greenfield opportunity where you're involved with evangelizing, and teaching people about the potential of what they can do with data?

It's a mix of things, and I think you're spot on when you talk about homegrown initiatives; that's what we find most of the time, and that's quite natural. Our customers are well aware of the opportunity that lies in the data, so they've been trying to liberate that data and make it useful for the operations for a number of years. So, there are always a number of initiatives ongoing, trying to build some level of data liberation and contextualization. The whole idea there is to embrace this, and explain how you would benefit from the product approach, how you would benefit from new features that benefit your customers, but also how you need to speed this up and think about the approach in a holistic way, so that the data model you adopt will allow you to scale to new use cases, not just the ones you can think about today, but the moment you want to add video, images, point clouds, LiDAR, you will want the model to be able to handle these new data structures. That's always an interesting discussion.

There are a number of competitors. The cloud vendors are of course democratizing access to these resources, computing resources and managed services, which rightfully encourages the build-up of platforms on the clouds. Now, what we find is that most of the people we meet are focusing on the end applications, and you've actually interviewed a few in this podcast series, and there's great value in building AI models and building applications that will drive the value. But from our perspective, we embrace this as a partnership. We have a very wide partner network, from incumbents like Siemens or Omni oil, to newcomers in the field, and again you have interviewed Uptake; we could mention a number of the AI-driven companies.

So, from the outset it can look like a competing offering, but when the customer understands the need for scale, and the need for the right ecosystem of systems to drive the value from digitalization, or drive the value from this data, then it becomes a very different positioning between our offering and what is currently in the market.

What are some of the challenges that you find as you go to market, in terms of implementing and realizing your vision? I'm speaking more broadly than just the Cognite technology and your customers; what do you see as current obstacles, impediments, roadblocks, chokepoints, whether they be technological or organizational, not so much regulatory at this point? As you look down the road to envision the full realization of your vision, and your customers' vision, what are some of those challenges that are most near-term, and over the long term?

You mentioned organizational, and I think that's one of the big challenges. Conceptually, it's about scaling transformation within your organization by giving additional insights, augmenting the understanding of the operation so that you can optimize it and make it safer. This means you organize yourself to be able to take these insights and act on them. It also means you should use these insights to change the way you have normally operated. So, when you're driving such a transformation, there's a big need to make sure that, together with the technology implementation, you have very, very strong buy-in from the operational arm of the company, and that there's also a very strong vision from the leadership as to the company they want to become, and what that implies in terms of restructuring and accepting change in the way the processes are currently driven, in the way the operations are currently conducted.

So, when we engage with a customer, we always start by explaining that we love our technology, and we could speak for hours about it! But in order for them to succeed they have to make sure that they evolve their operation, and it has to become clear that this is more than just a technology implementation. So, we work in a very agile way, very closely with operations in our projects, and it's still very difficult, because you're talking about a real organizational transformation in order to reap the benefits of the type of insight we can provide.

The second challenge, which is in a way a lesser challenge but still present everywhere, is that you're changing the perception of how technology has been deployed so far. Most of our customers are very much on premise, with traditional IT and its separation from OT (Operational Technology). We say you should actually combine the two if you really want to have a full view of what is happening, and you should move the data to the cloud. So, you can imagine that creates a number of reactions when you make these statements. First of all you're saying we're looking at it as one dataset describing the operation, not two separate silos, or melding two separate silos as it happens. And the second is you have to accept that cloud will be your safest, most resilient, and most scalable solution for what you are engaging with. This obviously raises a lot of concerns from the IT organization.

Yes, certainly the scalability, extensibility, and agility you gain are unmatched when you support these workloads with cloud technology; you can really achieve everything.

I guess the challenges of course are orchestrating the interplay between edge computing, and sometimes limited bandwidth between the edges or the endpoints obviously, and then the centralized repositories where you can do the real heavy lifting. Again, we’ve got this evolving art and science of combining the historical with the real-time, and the edge with the cloud, but it sounds like you guys are very much in the thick of things.

Absolutely, and the edge is absolutely a requirement. We work with customers that are operating close to the North and South Poles, but you still get decent enough satellite coverage to extract relevant data. You obviously don't get as much data, but you do have enough, even in these remote locations, to operate. There are very few blackspots on the globe right now, and a lot of players, including SpaceX, are aiming to solve this.

We recently spoke with Fleet Space, which is developing these, I would say, nano-satellite enhanced networks, wide area networks, and it's really amazing the applications and use cases that are now becoming possible. It sounds like where you guys sit is very close to the business value that comes from the application of data, or the extraction of data from the instrumentation of the physical world, which again I think is one of these long-term secular trends that is going to keep a lot of people very busy for decades to come. So, that's very exciting.

Looking forward, how do you see the market evolving, and talk about where you’re optimistic, and where you might have some concerns, or what keeps you up at night?

Cognite has been enjoying tremendous growth; in our two-and-a-half-year existence we have grown to over 250 people. That growth has been organic, funded by our revenue, so it's been an incredible journey up to this point, and we don't see it slowing down. We believe there's a massive opportunity for a company providing the type of services that we are, and by all means I'm sure there will be other players; there are a lot of smart companies out there.

We see adoption and interest across the industry, and we see big drivers around energy and around manufacturing, where they have big challenges these days. There's a question of course, when you're operating under constrained cashflow: if the macro picture slows down, and there are some predictions of a slowdown in several segments of the industrial world, then of course that would immediately impact our ability to convince our customers. On the other hand, that's the time when they should invest; that's the time when they need to gain efficiency.

But we know, and I know from experience having worked in the media world, that even though they know this is the future, it's very hard to change and not go the "faster horses" way, as Henry Ford is said to have put it; just a little bit faster with what you already have. So, that of course could have a tremendous impact, all the changes happening currently, and all the tensions, with the possibility of a trade war that would negatively impact Cognite, and our customers by extension.

Another question mark is how much our customers are going to be willing to share, because what we're doing is liberating data, allowing them to share it and make use of it with partners, and to create value. Sometimes that message is understood on an intellectual level, but translating it into action is very difficult for some of these industrial players. And if you do not agree to share and open up to an ecosystem that can help you create value, then effectively the value of the platform is going to be a lot less. So, a lot of what we are proposing is based on data sharing; it doesn't mean sharing with everybody, obviously, but accepting to make the data available to your partners and your suppliers, in order to gain more insight, is critical to the success of a company like Cognite.

Yes, I think what you touch on also relates to the value of an open-source ecosystem, and whilst sharing data isn't directly analogous to an open-source project, I think the concepts are very similar: you have people who are willing to share what in the past might have been thought of as proprietary work, proprietary data, but what you're able to share in aggregate results in a better experience for everybody. It's certainly become the prevailing development model in a lot of areas of software right now. I think what you're arguing for may take a little bit longer to materialize, but it's compelling nonetheless.

Francois, it's been fascinating talking about your background and your perspectives, and learning a bit more about Cognite. I would love to get from you any recommendations or resources that we might be able to share with our listeners.

There is one resource I can think of from the get-go. Given the journey that we are experiencing, the growth that Cognite has been enjoying, and our drive to bring impact and to be meaningful as a company, How Google Works by Eric Schmidt and Jonathan Rosenberg, the 2014 book, has been very useful reading for us as we have embarked on that journey, alongside a lot of other reading on scaling and the hyperscaling of companies. So, I would warmly recommend it as a tool, as a source of discussion, not necessarily as a truth in itself, but as something that we have used in the company and openly discussed together with the rest of my colleagues.

Fantastic, that’s a terrific recommendation. We hadn’t heard of that one before, so looking forward to sharing that, and diving into it as well.

This brings us to the end of our time, but again this has been Ed Maguire, Insights Partner at Momenta, and we've been speaking with Dr. Francois Laborie, President of Cognite North America. We appreciate your listening, and if you have any questions please feel free to reach out.

Thank you again Francois.

Thank you Ed, it’s great to be here. Thank you very much.

 

[End]