Evolution into the Gas Digital Twin | SSP iLLUMINATE 2020
Does it take you weeks to gather yearly PHMSA data and submissions? Can your GIS provide all this information to you instead? MLGW now has this information at their fingertips at a moment’s notice. SSP Innovations has partnered with MLGW to support the migration of their gas transmission data into UPDM and the implementation of Esri’s ArcGIS Pipeline Referencing toolset. This process included gathering records on pipeline data going back as far as 70 years. The team digitized information and migrated data to the standard Esri model for continual maintenance, risk analysis, data gap analysis, and regulatory functions. Upon establishing the best-known representation of the gas environment (the digital twin), SSP and MLGW created a custom process that pulls all the required data together into a summarized PHMSA dashboard based on any deltas/edits. In doing so, the entire organization can consume its regulatory snapshot throughout the year, eliminating weeks of data gathering prior to the yearly submission every March. The presentation will provide an overview of the entire project as well as our final outputs.
Transcript
00:00 thank you all for joining and welcome to today's illuminate webinar today's topic is evolution into the
00:09 gas digital twin before we begin we wanted to let you know this webinar
00:14 will run approximately 30 minutes and we will have time on the back end for questions
00:23 your speakers today are clark wiley and russell webb
00:28 and clark is the director of pipeline here at ssp um he's been working in gis pipeline
00:37 software solutions integrity management and data management for over 15 years
00:42 we're also joined by russell webb and he is the supervisor of gas system integration at mlgw his primary
00:50 responsibilities for the past 10 years have been transmission integrity management
00:56 and distribution integrity management so just a note we will be conducting
01:02 a formal q a session uh following this webinar so if you do have questions please use the question icon in the go
01:10 to webinar panel and then finally a recorded version of
01:15 this webinar will be made available for everyone at a later date so with that i'm going to go ahead and
01:22 pass it over to clark here thanks carrie and thanks everyone for joining today we
01:29 obviously wish we were in person but it's going to have to do this year so welcome to my home office and
01:35 hopefully you enjoy the presentation i'll be presenting today and russ will be joining us after the presentation for
01:41 question and answers so please hang on to those for us let's start with a bit of an overview on
01:47 memphis light gas and water or mlgw they deliver three commodities electric
01:53 gas and water to the citizens of shelby county more specifically memphis tennessee
01:58 on the gas side they have both transmission high pressure distribution and distribution assets for transmission
02:04 they have over 200 miles of regulated assets fed from seven transmission gate stations as well as over 62
02:12 regulator stations 200 plus farm taps and ultimately we manage these as 22
02:18 linearly referenced routes this is how we can report off of or set different inspections and intervals
02:24 and how we establish kind of the baseline moving into that future state and that system of record
02:31 on the distribution side they provide over 300 000 customers natural gas within shelby county as well as over
02:38 four thousand miles of main they deliver nearly 40 billion cubic feet each year
02:44 so a substantial size and a great customer of ssp we've been working with them a long time
02:49 over many upgrades many technology implementations and and the work that we'll walk through today
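A linearly referenced route locates assets by a measure along the line rather than by x/y alone; a minimal sketch of that kind of lookup, with hypothetical route names, measures, and attributes (not MLGW's actual data):

```python
# Minimal sketch of measure-based (linearly referenced) lookup.
# Route names, measures, and attributes are hypothetical examples.

def find_segment(routes, route_id, measure):
    """Return the attributes of the segment on `route_id` that
    contains linear `measure` (in miles), or None if off-route."""
    for seg in routes.get(route_id, []):
        if seg["from_m"] <= measure < seg["to_m"]:
            return seg
    return None

routes = {
    "TR-01": [  # one of the (hypothetically) 22 transmission routes
        {"from_m": 0.0, "to_m": 4.2, "diameter_in": 12, "wall_in": 0.250},
        {"from_m": 4.2, "to_m": 9.8, "diameter_in": 12, "wall_in": 0.312},
    ],
}

seg = find_segment(routes, "TR-01", 5.0)
```

With the data keyed by route and measure, inspections and reports can be set against intervals along a route instead of against individual features.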
02:56 so when we think about gis gis really has evolved over time it started as a spatial map making
03:04 tool that has evolved into simple tracing network management and analytics but with
03:10 the advancements within the esri technology as well as
03:16 computing power we have been able to utilize gis to not only just manage it as as
03:22 editing or managing your network or changes but really understanding gis as your system
03:29 of record and using it as such when we're working with not only mlgw but a lot of
03:34 our large customers and and we're looking at upgrades and utility network implementations and
03:41 major system of record advancements a lot of the time the first thing that we ask is what
03:46 is your system of record is it asset management today is it gis is it a combination of both and where do we
03:53 want it in the future it's a big key to understanding and being able to have a successful implementation of your
04:00 gis and technology there's some great esri slides that help visualize that concept but as we
04:08 look at the system of record up here in our top left corner and being able to incorporate that into the system of
04:14 insight being able to use that system of record as not only a data repository but being able to analyze and understand where our
04:21 weak points are where our highest risk of our system is where low pressure where we might have gaps in information
04:27 where we can do some advanced analytics at the same time looking at the system of engagement
04:33 where do our users and our organizations and even the public engage with our information internally is that
04:39 through a web app is that through desktop applications through our field crews what mobility tools do we use where are we using paper
04:46 today and where can we utilize technology and digital formats in the future
04:51 so understanding the complete gis as a whole is very important because when we reference a digital twin that is driven
04:57 off the system of record but is also driven off of the system of insight and system of engagement
05:03 a twin could be driven off of a piece of paper that's entered into gis but if someone doesn't have a way to
05:08 interact with that then that true digital twin does not exist so in my mind that digital twin
05:14 encompasses everything through that network of location as well as all the variables within the gis
05:20 as it and the technology on top of it gis obviously starts with location and
05:26 understanding your gas network is driven through location the connectivity
05:32 any real-time information that we want to start looking at into our gis asset information customer and bringing
05:39 all of that together when we are starting to evaluate our needs and not only looking at regulatory drivers but
05:46 needs of implementing a complete system of record in gis we need to kind of bring that back a
05:52 little bit and understand where can we gather all of our information together to support that digital twinning system
05:58 of record because it's your crews out in the field it's the mobility and data gathering it's the customers and the information
06:04 there because upstream of that we can look towards outage management and tracing of our network and managing the continual maintenance
06:11 and as building and maintenance records that come into our gis so it's really the complete picture
06:17 of your entire network through location through the what through the where the how when and why
06:23 and those are the questions that we want to answer through our gis so a complete gis is driven based off of
06:30 again the customers the overall asset management the safety network management and innovation
06:37 so when we sat down with mlgw we wanted to understand what do you consider your digital twin
06:43 and where do we want to get with your particular project because if we just have the customer information but we don't have service
06:49 pipe to connect to the main to be able to trace to where we understand customers uh may be affected
06:56 how is that going to be truly a system of record or digital twin if we have a point for a
07:01 regulator station but don't represent all the assets how are we going to send crews out there to inspect individual
07:07 assets within that regulator station or even trace through that if we're looking at gaps within our
07:13 transmission or mains to calculate maop how we can import that data and be able to
07:19 manage that effectively so it's customers it's asset management and ultimately it's safety of your
07:24 system being able to safely deliver natural gas is of the utmost importance and let's leverage the
07:31 data let's leverage the technology and let's leverage the work that our field crews do to be more efficient
07:37 but also safely delivering the gas being the utmost importance
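The service-to-main tracing described above — following connectivity so we understand which customers an isolated main affects — can be sketched as a simple breadth-first trace; the network, node IDs, and naming convention below are hypothetical, and a real implementation would trace the GIS network (e.g. the geometric or utility network) rather than a hand-built graph:

```python
# Hypothetical sketch of tracing a gas network downstream of an
# isolated main to collect affected customers. Node IDs are made up.
from collections import deque

def affected_customers(adjacency, isolated_main):
    """Breadth-first trace from `isolated_main`, collecting every
    customer node reached through services."""
    seen, queue, customers = {isolated_main}, deque([isolated_main]), []
    while queue:
        node = queue.popleft()
        if node.startswith("CUST"):
            customers.append(node)
        for nxt in adjacency.get(node, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return sorted(customers)

# main -> service pipes -> customers (illustrative only)
net = {
    "MAIN-7": ["SVC-1", "SVC-2"],
    "SVC-1": ["CUST-100"],
    "SVC-2": ["CUST-101"],
}
```

Without the service pipe features connecting customers to mains, this trace has nothing to follow — which is the point made above about what a true system of record requires.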
07:42 so if we look at this mlgw and ssp came together to focus on some of
07:47 these aspects and looked at what do we have today and where do we want to go tomorrow and again this is an important point
07:55 for representing your gis because a true digital twin ultimately might have to be 3d or have a
08:01 lot of information that has to be gathered over time so what is our digital twin today and then
08:06 where do we want to get there tomorrow is it looking and saying i want to replace every single
08:11 paper process and make that digital that's a great step what are my asset management use cases
08:18 from gis or asset management those bi-directional data sharing points
08:23 the round-trip use case of that asset management and that individual stick of pipe or junction or device when and where
08:31 should data be entered when and where is it entered today is that entered back in the office after we get a paper form today but we want to
08:38 enter that in the field how much do we want to push out to the field where do we want to be barcoding
08:44 information into our devices in the field in the ditch how are we bringing that data back in
08:49 who needs access and how does it support their job how can we shorten timelines within your
08:56 organization from posting data to sending out new inspections to to work management
09:01 time savings equals money saving so who needs access and how does it support their job
09:06 the president might access a read-only dashboard that shows where our work crews are the number of miles of our assets
09:13 nice dashboards but our integrity manager wants to see exactly when our immediates have been repaired or when
09:20 180-day anomalies are scheduled for a review or bell hole inspection
09:26 or understanding when that next direct assessment or ili is in our system so who needs access and
09:33 what do they need access to is very important and then ultimately what level of detail
09:38 do we need to gather to create a true system of record in the office
09:44 versus the field and versus reporting where do those need to exist how do we structure the data model how do we
09:49 structure the gis and how do we optimize that so that gets us into our digital twin
09:56 and what does that really mean to you right our advancement in technology has
10:01 enabled us to expect information on our devices if i want to look up a local restaurant
10:07 i can see their reviews i can see directions to that particular establishment i can pull up pictures of individual
10:16 items on the menu to see if i might like a dish or
10:21 not why should we not have a similar experience within our gis and our asset repositories at any point
10:28 in time i should be able to click on an asset i should be able to see the historical information around it when
10:34 was that placed what was the original work order who was the original contract team when was the last
10:40 inspection pictures of that and ultimately directions to that location how does that
10:45 interact with my system and being able to understand and have that connectivity and the experience that we expect on a
10:52 day-to-day basis with our gas assets it's a user expectation and we have the it
10:58 modernization to do that technology is far outpacing our system knowledge
11:03 we have the ability to do great things through ai and advanced analytics and the computing
11:08 power there's so many things we can do on the cloud there's so many things we can do with predictive analytics
11:14 but we don't even have the baseline to understand what our gas network is or what assets are out there or even
11:20 exactly where they are sometimes they're on the wrong side of the street in our gis and we need to locate those and
11:25 understand what they are if we have gaps in information from our records we need to understand that we need to go out to do
11:31 non-destructive testing or understand we need to take that offline and redo a pressure test to be able to
11:37 get the actual information and understand our true attribution details of our
11:43 system to be able to support additional maximum allowable operating pressure but also an understanding and
11:49 access of our data that availability so what is your digital twin
11:55 what do you need to do to be able to support that it might just be this half of the pie
12:02 and that's where we looked at it for mlgw let's look at the future innovations and let's look at that future network
12:08 management a lot of that's still in gas control a lot of that's just through network pressure modeling but if we get
12:14 this top quadrant of this pie then we're going to be in great shape we're going to be automating regulatory reporting we're
12:21 going to be automating our npms we're going to understand asset performance as we get into advanced metering being
12:27 able to consume that information shortening the design and planning and as building timelines but also getting a
12:34 complete understanding of every historical record and tying that to our gis the what
12:39 where when how and why so your digital twin it's that system
12:46 knowledge it's the data accuracy it's going to the vaults it's going to those locations it's gathering
12:52 the documents it's scanning those in it's taking the effort to be able to get all of that
12:57 information aligned to your gis create new features or assets in gis if
13:03 you need to get a baseline that you can then grow into the gis into the advanced analytics into the
13:10 additional network management and really start to drive key decisions within your gis but again without that
13:16 system of record without your digital twin we can't get there and we'll have defaults in place for different
13:23 analytics or we'll have gaps in our data or gaps in connectivity and our inability to really grow into
13:29 something like the utility network because of our inability to fill those gaps so it starts today it
13:35 starts filling in that data to create your digital twin where is that data where is that
13:40 document and making that available to the end user and ultimately having the
13:47 insight and ability to project out to what our continual maintenance looks like what does our design process look like what do we want
13:53 mobility to be because that's a key part in understanding how we structure your gis
13:59 how we incorporate data and how we make it available to the end user
14:04 again technology is outpacing our system knowledge we have to be able to catch up so what
14:09 is your digital twin and where do you want to get it for mlgw we started at the beginning
14:17 where is all of our information and what do we have today the legacy gis and data and workflows
14:23 and we walked through and whiteboarded we got all the different groups together but it was a combined gas distribution and gas
14:30 transmission so we needed to linearly reference our assets we were utilizing arcfm to manage
14:35 transmission and distribution networks and there was a lot of paper documents
14:41 there was manual reporting there was laserfiche there were vaults there were a lot of locations where
14:46 where mlgw needed to scan and create major initiatives to be able to support that data gathering and understanding of
14:53 their system and they started this a long time ago a lot sooner than most actually and
14:59 were able to get through this but it was a significant effort and they had some
15:04 ability to project out and say we're going to do this today we're going to go through the pain to get that done to understand
15:09 everything we have to be able to operate our system with the knowledge that we need and then advance with the technology because we
15:16 have that knowledge some of the project goals from the team were a validation and collection of all
15:21 the phmsa data inputs so looking at our phmsa reporting and saying where do we have any gaps where do we have
15:27 unknowns where do we have the tvc records a permanent implementation of the maop record
15:33 verification procedure so establishing technology and workflows for the future state
15:38 after gathering all of the historical records to get that baseline establishing the gis as the main
15:44 repository for access previously we'd have vaults or cabinets or a work order number
15:52 where users would have to go and find the work order find the document find the information but having gis as
15:57 that repository and true system of record to link that information and make that available
16:03 and ultimately establishing where can gis support their overall data needs and starting to think outside of just the
16:09 data gathering but implementing that and making it available to other users the field etcetera
16:17 we had to first locate all of the maop related documents and mlgw went through a major process to
16:23 go through that again ensure that they were all scanned indexed categorized what is it is it a pressure test chart
16:29 is it a main card is it a service record all of that information capturing the maop related information
16:35 linking that to individual assets within the gis tying that not only document and location
16:41 within the document management but potentially creating new features along the way where there was potentially a horizontal directional
16:47 drill under a road or a casing removed and updates that needed to be done in gis
16:52 so it was a multi-step process to not only get all the information in but then update the gis with the best
16:59 representation all of that was then used to confirm the maop right if there were any gaps in
17:06 that there were field validations done if there were not then we were able to calculate the maop and the original
17:11 design and the operating pressure and understanding to ultimately populate that part q on the phmsa report from a
17:19 senior management side they often have some other goals but it was value added to the company being
17:24 able to minimize time from what the work order is to what the asset is to go find it to send it out
17:30 to the field to have the documentation in hand not paper but really access to the data and
17:36 streamlining those workflows that creates time savings which ultimately in all the data we were able
17:42 to validate the maop we can understand what we had today if there were any gaps
17:48 understanding what pressure we were running at through the historical documents and everything that was in our gis
17:54 and ultimately creating that enterprise system repository that go-to where whether it's gas engineering or
18:00 work management or scheduling or folks on the asset management side they knew where to go for that
18:06 information and that was gis so it really started about where's the
18:12 information and there were ultimately 230 locations that we found and those ranged from different service
18:18 centers to store rooms offices cubicles you know it was just everywhere
18:24 and we had to gather those and mlgw went through that process to gather those scan those
18:30 and then got those entered into gis that included paper maps historical laserfiche
18:36 film spool view data some gis some personal geodatabases file geodatabases some enterprise gis
18:44 mainframe right paper records all of that was included and all that was accounted for
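The indexing workflow described here — every scanned record typed, categorized, and tied to the asset attributes it verifies — can be sketched as a small catalog; the record types, attribute sets, and evidentiary ranking below are illustrative assumptions, not MLGW's actual rules:

```python
# Illustrative sketch of typing records and tracking which pipe
# attributes each record type verifies (for the traceable/verifiable/
# complete requirement). Types, attributes, and ranks are assumptions.

RECORD_TYPES = {
    # type: (attributes it can verify, evidentiary rank; lower = stronger)
    "MTR": ({"nominal_diameter", "wall_thickness", "grade"}, 1),
    "pressure_test_chart": ({"test_pressure"}, 1),
    "main_card": ({"nominal_diameter", "install_year"}, 2),
    "service_record": ({"install_year"}, 3),
}

def verified_attributes(linked_records):
    """Given the record types linked to one asset, return which
    attributes are verified and by which (strongest) record."""
    best = {}
    for rec_type in linked_records:
        attrs, rank = RECORD_TYPES[rec_type]
        for attr in attrs:
            if attr not in best or rank < best[attr][1]:
                best[attr] = (rec_type, rank)
    return {attr: src for attr, (src, _) in best.items()}
```

A ranking like this is what lets the system decide, per attribute, whether an MTR "wins" over other documents or whether two weaker documents are still needed.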
18:50 and then 15 different types of records too so when we went through and looked at all the different metadata
18:56 we defined those and set the different record and document types so that it could be tied to gis what
19:03 individual attributes was that verifying because on the verifiable traceable and complete we needed to say
19:09 this mtr is verifying my nominal diameter my wall thickness the grade and
19:16 information around that so we not only had the type but what it was verified by and
19:22 what it was verifying so down to the attribute level so that range asset or individual point
19:29 asset within our gas network was tied to a document and then tied to what did it verify and so
19:35 there's a system and ranking that needed to be in place to be able to support this did an mtr
19:41 win over additional documents or if i did not have an mtr what two documents would you need to be
19:47 able to say this is verifiable traceable complete and reliable and that's important because it's ultimately reported to the
19:54 government and through our regulatory requirements and maop validation to say that these are
20:01 valid inputs and we know that because it's a historical document it's not necessarily from a notepad or
20:06 from an engineer's mind to know hey this is what we've always run it at but this is tied to an actual po and order
20:14 information that we have from our historical system so we've located the line and we have all the information
20:19 to it and we're comfortable operating the gas at that pressure establishing
20:24 that maop building up the system of record building up those representations through these documents
20:31 and records drives so many different things it drives that pressure we can run the system at
20:36 we might have to lower it which could affect profitability within your organization because we don't necessarily have a
20:43 record or we need to re-pressure test or we found out the design pressure has a limiting factor on a particular
20:49 fitting and we can't run it at what we're running it at there's also the subsequent analytics
20:55 being able to look at our potential impact radius for our hca and understanding what that pressure is
21:01 and the different diameters so any of those changes from the location of the line shifting to the attributes drives hcas
21:08 drives changes in hca size changes in inspections over time and how we manage those the same thing with dot class and our
21:16 placement of the line and understanding of that information obviously that also drives
21:22 when it was placed in our risk our likelihood of failure probabilistic models relative models and how we can
21:28 incorporate the likelihood of failure and also the consequence of failure in those
21:34 areas when do we expect failures based off of all of the information across our system
21:40 and being able to analyze that and where can we start to make strategic and educated decisions
21:45 about replacement within our system versus just replacing certain areas at a certain time we can
21:51 have all the information to understand what we have and where it is
21:56 advanced analytics and ultimately being able to make educated and wise decisions with our capital
22:04 dollars to support our infrastructure and where we want it replaced
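Two of the calculations this section leans on can be sketched directly: design pressure via Barlow's formula (49 CFR 192.105) and the potential impact radius used for HCA analysis (ASME B31.8S). The formulas are standard regulatory math, but the pipe inputs below are hypothetical examples, not MLGW values, and a real MAOP determination under 192.619 also weighs test pressures and historical operating records:

```python
# Sketch of design pressure (Barlow, 49 CFR 192.105) and potential
# impact radius (ASME B31.8S). Inputs below are hypothetical.
import math

def design_pressure_psig(smys_psi, wall_in, od_in, f, e=1.0, t=1.0):
    """Barlow: P = (2 * S * t / D) * F * E * T."""
    return (2 * smys_psi * wall_in / od_in) * f * e * t

def potential_impact_radius_ft(od_in, maop_psig):
    """PIR (ft) = 0.69 * d(in) * sqrt(p(psig)) for natural gas."""
    return 0.69 * od_in * math.sqrt(maop_psig)

# e.g. 12.75" OD, 0.250" wall, X-42 pipe, class 1 design factor 0.72
p = design_pressure_psig(42000, 0.250, 12.75, 0.72)   # ~1186 psig
r = potential_impact_radius_ft(12.75, 500)            # ~197 ft
```

This is why a shifted line location or a corrected wall thickness or diameter ripples outward: it changes the allowable pressure, the impact radius, and therefore the HCAs and inspection obligations downstream of it.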
22:10 so some of the project achievements we really were able to advance the arcgis enterprise we were on a legacy version
22:17 of arcgis so there were some upgrades there this enabled that web enablement and access to our system as all of the
22:24 data was gathered in we focused on the utility pipeline data model and started with the transmission
22:30 assets first those were first to go as we were able to linearly reference those introduce arcgis pro some of the change
22:37 management some of the web some of the reporting some of the web app builder while mlgw continued to
22:43 support the growth of the gis and additional tools around that to
22:48 manage the entire gas assets so the intent is through a subsequent phase where we're going to
22:54 merge the distribution into the un (utility network) but really we focused on the transmission
22:59 side and data gathering we took all of the records to support mlgw in
23:06 the network but focused on getting the transmission side up into the updm model
23:12 the automated regulatory reporting was key that was taking us four to six weeks based off of
23:19 pulling all together all that information and now it's reported through an automated dashboard
23:25 right so throughout the year engineers and the integrity management team can work with the gis
23:31 and gis analysts to circle different areas or annotate or identify areas where they see gaps based
23:38 off the regulatory reporting so you can work top down it's a substantial shift from working from the gis
23:44 up and then having to match those different parts and mileages together at the end it's often off a
23:51 little bit you might not be able to submit it because of those delta differences so if we work from the report
23:56 down and we say this looks like a gap or why is this not verified and being able to drill down with those
24:02 individual areas and working with the analyst and working with the document management and asset management teams
24:08 to be able to support that hey where is this and what do we have so it was a nice shift to be
24:14 working on both ends of it to work towards the holistic system of record and gis repository
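The "report-down" workflow described above — summarizing mileage the way an annual-report part does, and flagging unknowns so analysts can drill back into the GIS — might look like the sketch below; the field names, segment IDs, and data are illustrative assumptions, not the actual dashboard logic:

```python
# Hypothetical sketch of report-down gap finding: roll up pipe
# mileage by an attribute and flag segments where it is unknown.
from collections import defaultdict

def summarize_miles(segments, attribute):
    """Roll up mileage by `attribute`; unknowns go to a GAP bucket."""
    totals, gaps = defaultdict(float), []
    for seg in segments:
        value = seg.get(attribute)
        if value is None:
            totals["UNKNOWN"] += seg["miles"]
            gaps.append(seg["segment_id"])  # drill-down list for analysts
        else:
            totals[value] += seg["miles"]
    return dict(totals), gaps

segments = [
    {"segment_id": "S1", "miles": 3.0, "decade_installed": "1960s"},
    {"segment_id": "S2", "miles": 1.5, "decade_installed": None},
    {"segment_id": "S3", "miles": 2.5, "decade_installed": "1980s"},
]
totals, gaps = summarize_miles(segments, "decade_installed")
```

Because the summary is computed from the live GIS rather than assembled by hand, the totals and the gap list stay in sync all year instead of being reconciled once before the March submission.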
24:21 almost 400 000 records were scanned verified reviewed attributed and linked into the gis so a
24:28 massive undertaking with a lot of streamlined workflows that were addressed or some that are planned
24:33 for future right but a lot were addressed in terms of how do we gather data how do we attach new documents
24:39 how do we design an as built those are big questions and those are big questions as you look at upgrading your system and creating
24:46 this repository this digital twin this this data access point it transforms
24:51 your business and you have to be able to look at those different variables and understand what those are we were able to validate maop across
24:58 the system we were doing that for years and calculating that through spreadsheets
25:03 and understanding what pressures they were running at and doing that well below design but having the actual
25:09 documents tied into gis and running the calculations and using those different variables
25:14 makes it much easier and then we can see the results we can visualize those and we can understand in context
25:20 with those documents how those work together and then ultimately that enterprise system repository and
25:26 building that out and how we utilize that today and being able
25:31 to support the gis and access to it moving forward
25:39 so what were the next steps and what are the next steps we focused on mobility and continual
25:44 collection in the field we looked at the utility network now that we've gathered all that information and gotten to this point we can say
25:50 what are our engineering practices and how does that fit with our base un rules right with that intelligent network coming we
25:58 have the ability now to look at engineering rules and how that matches our data gathering and data editing and
26:03 data workflows there are a lot of rules to validate that network topology and we see a lot of
26:08 times in these projects that that's a big gap and a thing that we need to look at so having gathered all the data we have
26:15 a great baseline to now understand what do we need to have a successful un implementation
26:21 and where do we potentially have any further gaps maybe there's a reducer that didn't come across through our document gathering
26:29 that we need and it's not a huge deal in our geometric network but we would need that transition within our utility network
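The reducer example here is a typical utility-network-style connectivity rule: a diameter change between connected mains should pass through a reducer fitting. A minimal sketch of that kind of topology check follows; the rule encoding and data are illustrative, not Esri's actual rule engine:

```python
# Illustrative connectivity-rule check in the spirit of utility
# network topology validation; not Esri's actual rule engine.

def find_missing_reducers(connections):
    """Each connection: (upstream_diam, downstream_diam, fitting).
    Flag diameter changes not joined by a reducer."""
    errors = []
    for i, (up, down, fitting) in enumerate(connections):
        if up != down and fitting != "reducer":
            errors.append(i)
    return errors

connections = [
    (12, 12, None),        # same diameter, no fitting needed
    (12, 8, "reducer"),    # valid transition
    (8, 6, None),          # gap: missing reducer feature
]
```

Running checks like this against the migrated data is how the gaps that a geometric network tolerated get surfaced before a utility network implementation.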
26:35 so we can now evolve and start to look at our next future state and see where do
26:41 we need to prepare where do we need to potentially update current editing workflows to support that
26:47 next move station representations right so these were points within mlgw's system
26:53 we gathered all the documents we pulled in that information and now we're looking at how we best represent
26:59 that through 3d not sure if you saw a previous presentation on ai and machine learning well there are
27:06 opportunities there to be able to gather from regulator station pictures and represent those in a nice way within
27:11 that future state so where do we continue to collect that information to support the documents and data that
27:17 we have from this type of effort and then continual improvements and adoption of gis as we were able to
27:24 get arcgis enterprise out we now have many many more users
27:29 opening gis and understanding where they need to open gis is that through the web
27:34 is that through a web app is that through their mobile devices or viewers and understanding what your true digital
27:40 twin is and how you represent it in your organization and within your asset repository
27:47 no one organization's digital twin is going to be the same right so understand where you need to go
27:53 and where you need to take it for your end users to support knowledge of your system
27:59 the asset representation the what where when how and why something was entered
28:06 so if we go back to that complete gis take a step back and as you look to build up your asset
28:12 repository i know there's a lot of work to do but don't just look at it as a document
28:18 gathering document linking gis entry into the system of record
28:23 take the opportunity to be able to evaluate where you want to go what is your digital twin and how can
28:29 you make that successful again thanks everyone for joining today hopefully this was valuable and
28:35 understanding where we like to take these types of projects and understanding that yeah it's not only gathering data
28:42 but it is really evaluating your business process evaluating how you're managing data
28:47 evaluating how users access that information hopefully you gleaned a little information from us
28:52 today and with that we'll open it up for questions and answers
28:59 okay thank you for that that was good now we do have a few questions already and i
29:06 know russ is joining us now so if you guys want to make sure your
29:13 phones are off mute and we'll start with the first question so how long did it take
29:20 to scan all your records and i don't know which one of you want
29:25 to answer that one yeah russ you want to take that one
29:35 yeah i'll start i think russ is uh oh there we go i think he's coming back in here hey yeah
29:43 did i miss the question where are we
29:48 the last question was how long did it take to scan all the records that we had just
29:54 discussed through the presentation yeah i noticed that there was some background information
29:59 in the presentation about just the different locations we had to go to things like that so keep in mind that
30:06 i'm including uh all of that work into the effort associated i was looking back
30:14 to some uh other documents just on the project for you know this has been
30:19 some years ago that we actually kicked this off but one of the first things mlgw did was
30:25 went out and rounded up 40 retirees that would be familiar with our documentation types
30:33 you know where is it available because i know one of the slides showed that there were several different
30:38 formats that we'd be going through and so of those 40 people
30:45 i saw one document that said that they assumed that they touched two million documents to sift out
30:52 the four hundred thousand that ultimately needed to be scanned in uh so in touching those you're giving
31:00 them a unique uh name so that you know you have a file name that ultimately can become a directory you
31:06 can sort through and then attach to gis is what we've
31:12 done here and uh so those 40 people worked a solid six months uh
31:20 and then at one point when i fully took over the project i bumped it up with another 30 of
31:28 data entry employees um just specifically pulling out anything that
31:34 wasn't readily accessible or you know optical character recognition or that kind of stuff that they needed to type
31:42 back in um so at one point we had upwards of 70 people working on this project
31:48 uh and overall that lasted probably a year but we really once we hit that
31:55 peak and started using some data entry professionals we really started driving down the number
32:01 of retirees and then ultimately would cut those data entry people uh almost weekly
32:09 as we really started getting the plane on the ground or being comfortable with where we were
32:15 with uh the format of all of our documents so that's a very long answer but
32:22 uh you're talking over a million dollars invested by mlgw just
32:28 right out of the gate basically in response to the advisory bulletin that came out originally
32:37 awesome thanks russ all right next question what was the hardest part integrating
32:42 the records information into gis um okay yeah russ i can take
32:49 that one um so as russ described in going through all the documents and data
32:55 gathering there was a nice system and user interface that mlgw was using to enter the
33:01 information and that would catalog the metadata of the record
33:06 and the attribution for the maop uh reconfirmation as well as tvc
33:13 the hardest part was establishing that spatial location especially for areas
33:18 that might not have a start and end station value on our transmission assets and essentially conflating that between
33:27 the legacy gis through the upgrade and getting that into the correct spatial
33:32 location on the line based off of the best known information so that was really kind of
33:37 our biggest uh task at hand and then also structuring the records based
33:45 off of whether we wanted to tie it to an individual asset versus tying it to the document range so
33:52 an example might be an mtr a material test record where
33:57 that's validating certain attributes that we wanted to tie to the pipeline line rather than an individual asset
34:06 so those were some of our challenges that we had um but certainly worked through those and
34:12 worked with the engineers to get our best placement and get that spatialized and accurately represented in the gis
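[The stationing problem Clark describes above — placing a record on the line when all you have is a start and end station value — boils down to linear referencing: interpolating a position at a given measure along the centerline. A minimal pure-Python sketch of that idea (illustrative only, not MLGW's tooling and not the Esri ArcGIS Pipeline Referencing API):]

```python
import math

def point_at_measure(vertices, measure):
    """Interpolate the (x, y) point at a linear-referencing measure
    along a polyline given as a list of (x, y) vertices.
    Measures run from 0 at the first vertex."""
    walked = 0.0
    for (x1, y1), (x2, y2) in zip(vertices, vertices[1:]):
        seg = math.hypot(x2 - x1, y2 - y1)
        if walked + seg >= measure:
            t = (measure - walked) / seg  # fraction along this segment
            return (x1 + t * (x2 - x1), y1 + t * (y2 - y1))
        walked += seg
    raise ValueError("measure exceeds line length")

# A record spanning stations 50-150 can be located by interpolating
# both endpoints against the centerline geometry.
line = [(0, 0), (100, 0), (100, 100)]
print(point_at_measure(line, 150))  # (100.0, 50.0)
```

[In practice the conflation Clark mentions means matching these computed positions against the legacy GIS geometry and engineer review, rather than trusting the stationing alone.]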
34:22 okay thanks clark um okay let's see what is your definition of digital
34:28 twin okay yeah and i spoke of that a bit
34:35 through the presentation and it's different for each organization and that's where we want you
34:40 to evaluate that but um the way i look at a digital twin it's different for different commodities and
34:46 uh differently represented in that sense but for gas it's important to understand where is
34:53 that asset what is that asset when was it placed in the ground and the historical legacy information so
35:00 if we think about a digital twin let's think about it for today's purposes and regulatory requirements around
35:06 traceable verifiable complete i think a digital twin of the future is a bit different because that's a 3d representation
35:13 potentially for augmented reality and new tools um so in in those scenarios we have to
35:19 evaluate what is your digital twin today right and it's accessing the data
35:24 all of your historical information and the best known information that you have within your system
35:30 utilizing the gis as that system of record thanks clark all right next question
35:37 which version of the updm model was used updm 2016 or updm 2018
35:45 yeah we started with 2016 uh as the data collection was going and
35:50 have continued to evolve based off of the changes uh released within uh each subsequent
35:57 model uh so even with the latest 2020 release and some future uh consolidation um efforts within mlgw
36:06 uh with legacy editing tools um on the distribution side in the future um we are looking to add
36:14 some of the changes within 2020 so we evolved as that model changed but
36:19 again those updm uh data models are templates uh and they're made to extend to support uh
36:26 the the data needs of your organization and so we did that and certainly evolved with the model where uh additional
36:33 regulatory domain updates or astm standards came into play and and supported that as well
36:42 okay thanks for that let's see if i can get through this one it's a long one are all of the
36:48 scanned documents stored in the geodatabase or did you employ an
36:55 integrated external records management system and if esri i
37:01 think what they're trying to ask is how did you gain buy-in from legal compliance
37:07 slash risk as this may appear to relinquish control from their standpoint so i'll answer
37:16 yep yeah i'll answer from the how and then uh russ i'll toss it over to
37:21 you in terms of continual records management but uh we went with the approach that we did
37:27 not add them as attachments in the enterprise geodatabase
37:32 the reason we went that direction was due to the size of a lot of these images and an
37:39 existing uh data catalog we went through the process
37:44 to link those and so that they're now available through the web applications and linked and and verified down to the
37:52 attribute level so we went through that linking you have to be on the internal mlgw environment to pull up those links
38:00 but that was our process to not reinvent the document management system but instead
38:06 to link and validate um and then potentially russ if you have any thoughts on kind of the legal or
38:13 risk or continual storage of those anything to add there
38:18 i'm not a hundred percent sure that i'm following but like all of this is on our servers so we're
38:25 not too worried about it but uh continual records management i kind of took that as
38:31 you know during this project we identified the critical documents that we want to capture and then we came
38:38 up with a naming convention for each so as an engineer is working on a design
38:44 we let them know that at the end of the project we're going to want these documents back and we've got
38:50 someone who will uh you know create file names and put it out there on that same
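[Russ's point above — every scanned record gets a sortable file name that can later be attached back to GIS — can be sketched with a small build/parse pair. The naming convention below is entirely hypothetical (MLGW's actual scheme wasn't shared); it just illustrates why a rigid pattern makes records machine-linkable:]

```python
import re
from datetime import date

# Hypothetical naming convention for scanned records, illustration only:
#   <DOCTYPE>_<LINEID>_<YYYYMMDD>_<SEQ>.pdf
#   e.g. MTR_TL-104_19870612_0001.pdf
NAME_PATTERN = re.compile(
    r"^(?P<doc_type>[A-Z]+)_(?P<line_id>[\w-]+)_(?P<scanned>\d{8})_(?P<seq>\d{4})\.pdf$"
)

def build_name(doc_type, line_id, scanned, seq):
    """Compose a sortable, GIS-linkable file name from record metadata."""
    return f"{doc_type}_{line_id}_{scanned:%Y%m%d}_{seq:04d}.pdf"

def parse_name(name):
    """Recover the metadata fields from a file name, or None if it
    doesn't follow the convention (flag it for manual review)."""
    m = NAME_PATTERN.match(name)
    return m.groupdict() if m else None

name = build_name("MTR", "TL-104", date(1987, 6, 12), 1)
print(name)                          # MTR_TL-104_19870612_0001.pdf
print(parse_name(name)["doc_type"])  # MTR
```

[The payoff Russ describes is exactly the `parse_name` direction: once names are deterministic, linking a document to its GIS feature is a lookup rather than a manual search.]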
39:01 all right thanks guys next question do you have ili data attached
39:09 uh we don't do a lot of that at mlgw we've actually done
39:15 one run and that was just in the last couple of years uh on a new line that we were putting in
39:22 pba and we haven't got to the point where we
39:27 want to put that in the web map i see personally a lot of value there
39:32 um that's kind of one of our goals even as this project continues to make our
39:39 system more piggable we did go back and get all of our direct assessment information
39:47 uh which you can imagine is thousands of pages of documents as well and attached them in their location
39:56 made that available to engineers moving forward uh just you know it's an opportunity to
40:02 get eyes on the pipe themselves with all the pictures that are taken and that kind of information
40:08 um for the ecda we are there as far as incorporating
40:13 timp information uh but we just don't do enough ili right now to prioritize that
40:23 yet okay thanks for that um what
40:28 do you consider done in quotations oh that's a good follow-up to uh
40:36 the answer i just gave because we want to get our system piggable
40:43 this is just really the first stone of understanding you know
40:50 one of the key terms in a lot of the integrity management programs is you you don't always know
40:57 what you don't know and that's one of the toughest things to figure out is where the gaps are and going about this in this manner
41:06 has really given us that opportunity to run kind of a gap analysis map of the locations where we are
41:14 lacking on the data that we've determined to be necessary um so that'll be our
41:21 next step obviously you know over the next seven to ten years we'll be addressing any issues luckily
41:29 we didn't have a lot so it seems very manageable now that we've got this
41:35 project under our belt but this project was an undertaking but couldn't be more
41:43 proud of where we're at and the ability to to see where we need more data
41:48 if that equates to an additional direct assessment or ultimately uh you know abandoning
41:55 some pipe and doing an offset or something like that but that that's where we're headed seven
42:02 years seven to ten years and then hopefully along that same time frame we can
42:08 get some of these lines more piggable we're looking at converting some lines to distribution um
42:17 so all of that is kind of in the mix as to moving towards what we
42:24 hope would be done but i always have more of a continuous improvement mentality uh as
42:30 if there is no finish line you just keep pushing forward getting better uh prioritizing
42:37 what your next good bite of the apple would be so it's i don't really have a date
42:46 out there that i think we will be done but obviously in response to this uh the next seven to
42:53 ten years thank you russ all right next question another long one okay
42:58 how many of the workflows were built into esri technology versus an external work or asset management
43:04 system if external asset management will you be willing to share which system mlgw utilizes so
43:12 i'll answer from the gis and the the project standpoint um the the work that we described today
43:20 was being done at the same time that enterprise asset management
43:25 implementation was occurring and so through that effort it was a
43:32 really a collaborative approach to understand what the gis was to store
43:40 as the system of record from these documents um versus what asset management
43:47 was going to take from work order management
43:53 and the capital improvement projects and then what we needed back from the eam
43:58 system if any um our our main focus here was getting the system
44:04 to a point where we had all legacy information and all documents tied to it
44:09 to then continue uh to update the um editing and continual maintenance
44:15 workflows with the esri tools so um the first part of that question
44:21 around the esri editing workflow is we're utilizing the arcgis pipeline
44:27 referencing tools um and so now when those work packages come back in
44:32 uh that are that are often driven from the eam work management and asset management
44:39 side those are then entered in through the new workflows
44:44 while linking and indicating which attributes have been validated
44:50 through the documents that are being linked in the updated enterprise system so all
44:56 that's entered into that now um while also linking to the document management okay thanks
45:03 for that um we have quite a few questions here i hope we can get to all of them but
45:09 let's see do you have access to this in the field oh that's a good one we are not
45:17 currently accessing the web map from the field but mlgw is definitely moving quickly
45:25 towards having that capability on every truck uh at which point it's just a matter of
45:32 getting everybody the link and getting them hands-on experience using
45:37 it it is definitely doable but i think our
45:42 primary focus was to gain the knowledge we needed on any
45:48 missing information missing data so that we could get out there to those locations and gather that data
45:55 so that's where we've been so far the design engineers have really taken it on in the last year and
46:02 embraced it uh clark helped us put together kind of a training manual
46:07 uh it's been very nice to be able to kick that over to them and give them the link to the web map
46:14 and let them go it's just mind-blowing to have all of this information at your fingertips
46:22 compared to where we were just a few years ago thanks russ all right next up
46:29 how did you get buy-in from senior management for this type of project
46:36 yeah so originally a phmsa advisory bulletin had come out i wasn't on the project initially
46:44 they had a senior engineer over me that was handling it he was the
46:52 design engineering supervisor uh but he was very familiar with the
46:59 retirees that he wanted to call and bring back in and they just kind of used him to uh
47:05 build that team initially uh for those that are
47:13 involved in timp you know this thing has been drug out for several years initially
47:20 when it dropped i think it was something like uh you have 12 months to do part of it and
47:28 you know 16 months 18 months to do uh maybe the part q so he was handling
47:36 it then so i wasn't heavily involved in getting the buy-in but we didn't see a lot of options
47:42 it looks like you either do something or you do nothing and when we decide to do
47:48 something we certainly take it on and i don't believe that there was a
47:56 lot of trouble getting buy-in from the top okay um if you have any questions please
48:04 use the question icon it looks like we have one more here unless more come in um
48:11 are there any other areas you're using the esri product
48:18 uh yeah well we definitely have uh rolled out an app for leak survey
48:25 um that's been very um well received from those inspectors uh we've got
48:32 uh valve inspectors working with an app for valve maintenance um we're well on our way to having
48:40 corrosion control surveyors and technicians um interfacing with their software
48:47 through esri apps and then we'll move on to
48:52 some of the other gas operations groups from there uh corrosion control is just one of the
48:58 groups that was really stuck in the past as far as how they manage their documents and those kind of
49:05 things so we wanted to get them automated i've actually got and i guess this is kind of bleeding over into my
49:11 dimp responsibilities as opposed to my timp but i've got a large project going
49:16 on we're calling gas records automation uh and really a lot of those ideas were
49:22 spawned through this effort uh with clark just kind of
49:28 walking through the wilderness with him uh i've learned quite a bit about how we can apply some of these
49:34 tools and it it's served very well for the needs of a
49:40 gas operator yeah and and to add to that uh i think one of
49:46 the things that we saw as arcgis enterprise continued to evolve was our ability to
49:53 use some of the web tools like operations dashboard
49:59 uh and so what we've been able to do is uh create a dashboard that represents
50:07 the phmsa report uh and what that allows us to do is based off of any edits within uh the
50:15 updm uh system it's it's updated to support
50:21 those mileage and all the different parts um outside of the inspection ones as we
50:26 indicated but all the parts just think about part q and the complexity of that um and so what we saw was
50:34 we got individuals who weren't necessarily used to getting into heavy
50:41 gis desktop applications but were were using a url link on their desktop
50:47 to see the latest and greatest reporting values and in those cases they're able to
50:53 easily identify uh things that might stick out and then work with the gis editing team
50:59 to start to dive in a little bit deeper so it just really started to expand once people got their hands on
51:05 the information and started to ask where certain things were or updates and we've just really seen
51:12 that organic growth of it all right thank you guys all right this may be our last one i'm not sure
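[The dashboard pattern Clark describes above — PHMSA report values recomputed from the UPDM data whenever edits land — is at its core a mileage rollup over pipe segments. A minimal pure-Python sketch of that aggregation, with hypothetical field names (the actual implementation sits on ArcGIS Operations Dashboard over the UPDM geodatabase):]

```python
from collections import defaultdict

# Hypothetical pipe-segment records; the real source would be the UPDM
# feature class, and these field names are illustrative only.
segments = [
    {"report_part": "Part Q", "class_location": 1, "length_ft": 26400},
    {"report_part": "Part Q", "class_location": 3, "length_ft": 5280},
    {"report_part": "Part M", "class_location": 1, "length_ft": 10560},
]

def mileage_rollup(segments):
    """Sum segment lengths into miles per (part, class location) bucket,
    the kind of figure a PHMSA dashboard tile would display."""
    totals = defaultdict(float)
    for seg in segments:
        key = (seg["report_part"], seg["class_location"])
        totals[key] += seg["length_ft"] / 5280.0  # feet -> miles
    return dict(totals)

print(mileage_rollup(segments))
# {('Part Q', 1): 5.0, ('Part Q', 3): 1.0, ('Part M', 1): 2.0}
```

[Because the rollup is derived from the live data rather than a yearly extract, any edit to a segment changes the dashboard figures immediately — which is how the weeks of pre-submission data gathering get eliminated.]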
51:18 are those all using survey123 or collector or both or maybe something else which is esri
51:24 based i believe it's both i don't have off
51:29 the top of my head which group is using what but
51:34 i feel like one went with collector and one went with survey yep that was it
51:42 and then uh for viewing um mlgw
51:49 from a gis standpoint is moving into an enterprise runtime application
51:56 that's actually ssp's mims uh for field viewing additionally so
52:02 there's some tools going out there uh side by side with some of the base esri
52:08 implementation and inspections referenced by russell okay thanks for that it looks like
52:15 that's it for all of our questions thank you for joining us um and we want to
52:21 invite you to the next webinar which is our final digital illuminate webinar
52:26 um titled t&d vegetation management with lifecycle and mims this will be presented
52:33 by john feldkirk and ian martin on october 1st so i encourage you guys
52:40 to sign up at sspilluminate.com and if you have any additional questions
52:46 let us know and we can have either russ or clark answer those
52:52 through email thank you so much