In her insightful December 2014 post, An All Edits and State Zero Combo Platter, Ms. Dennise Ramirez discussed a significant GIS data improvement project that SSP and its partner RAMTeCH were about to begin at a large utility. The twist: the data improvements could not adversely impact the utility’s day-to-day data management operations.
Dennise’s article laid out the “what.” This article, more than a year in the making, lays out the “how”: how the design was actually implemented. To keep from overwhelming the reader, we will break this into a multi-part series. This month, we’ll look at the overall process for a single delivery. Next month, we’ll cover some lessons learned.
The purpose of this project was to improve data quality. Specifically, it was undertaken to reduce excavation damage caused by incomplete records and to improve the quality and completeness of the data. The scope of the project was huge: 28 separate delivery areas, 130 edit weeks, and over 866,000 features to be created or edited. And the schedule was aggressive. Starting in June 2015, RAMTeCH’s team of 30+ editors in India would spend an average of seven weeks making approximately 30,000 edits per delivery area.
For each delivery area, SSP would provide a copy of the client’s geodatabase to RAMTeCH at the beginning of a particular delivery “cycle”. RAMTeCH’s editors in India would make edits over the course of several weeks, then return the edited database to SSP. SSP would then use its All Edits Extraction and Replay technology to merge the edits into the client’s geodatabase. The client would then review the applied edits, approve them, and post the versions to the production environment. Let’s take a look at how this played out by examining the most recent delivery area (#5 of 28) in more detail.
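The details of SSP’s All Edits Extraction and Replay technology are proprietary and not described in this post, but the general idea of an extract-and-replay merge can be illustrated. The sketch below is purely hypothetical: it diffs two snapshots of a feature table into an ordered edit log, then replays that log onto a copy of the production data that may have moved on in the meantime. All class and function names here are our own illustration, not SSP’s actual API.

```python
from dataclasses import dataclass

@dataclass
class Feature:
    feature_id: int
    attributes: dict

# A hypothetical edit record: the operation plus the feature it touches.
@dataclass
class Edit:
    op: str            # "create", "update", or "delete"
    feature: Feature

def extract_edits(original: dict, edited: dict) -> list:
    """Diff two snapshots of a feature table into an ordered edit log."""
    edits = []
    for fid, feat in edited.items():
        if fid not in original:
            edits.append(Edit("create", feat))
        elif original[fid].attributes != feat.attributes:
            edits.append(Edit("update", feat))
    for fid, feat in original.items():
        if fid not in edited:
            edits.append(Edit("delete", feat))
    return edits

def replay(target: dict, edits: list) -> dict:
    """Apply the edit log to a (possibly newer) copy of the production data."""
    result = dict(target)
    for e in edits:
        if e.op == "delete":
            result.pop(e.feature.feature_id, None)
        else:
            result[e.feature.feature_id] = e.feature
    return result
```

The key property this models is the one Dennise’s article emphasized: because only the edits are carried forward, production work that happened after the database copy was cut survives the merge.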
Delivery 5 included 30,550 features that were created, edited, or deleted by RAMTeCH. These 30,550 features were broken out across 20 versions for ease of QA. (The client usually had three or four QA teams. One team could review a version without impacting another QA team doing the same thing with a different version.) Below is a partial example of a scoresheet for Version #13 within Delivery 5:
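The post doesn’t say how features were assigned to versions (in practice the split likely followed geography or feature class), but the mechanics of carving one delivery into independently reviewable versions can be sketched with a simple round-robin partition. The function name and strategy below are our own illustration:

```python
def partition_into_versions(feature_ids, n_versions):
    """Deal feature IDs round-robin into n_versions independent QA buckets."""
    versions = [[] for _ in range(n_versions)]
    for i, fid in enumerate(feature_ids):
        versions[i % n_versions].append(fid)
    return versions
```

For Delivery 5’s 30,550 features and 20 versions, each QA team’s version holds roughly 1,527 features, small enough to review in parallel without teams stepping on each other.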
Delivery 5 actually started several weeks before RAMTeCH delivered the edited database to the client:
- Saturday, January 30: The client provides a conflict-free cut of the production database to RAMTeCH. This will allow RAMTeCH to autonomously manage the database without impacting client production or support operations.
- Monday, February 1 through Tuesday, April 5: RAMTeCH’s editors go to work. For this particular delivery, there are 11 edit weeks. So approximately 2,800 edits are made per week — or a little less than 100 edits per week by each of the 30 editors.
- Tuesday, April 5: RAMTeCH places the database and any delivery-specific notes on the client’s FTP site. SSP restores the database and then extracts the data to be replayed.
- Tuesday, April 5 through Friday, April 8: SSP creates and reviews a post-extract report to confirm that what was extracted matches what was delivered. The extract report is shared with RAMTeCH and the client. Anomalies are investigated and resolved.
- Friday, April 8 through Saturday, April 9: SSP runs the replay.
- Sunday, April 10: SSP creates and reviews a post-replay report to see if anything errors out. The replay report is shared with RAMTeCH and the client. Anomalies are investigated and resolved.
- Monday, April 11 through Friday, April 29: The client conducts QA with RAMTeCH and SSP providing support.
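The workload numbers in the timeline above are easy to sanity-check. Dividing Delivery 5’s edit count across the edit weeks and the editing team gives the per-week figures quoted:

```python
total_edits = 30_550   # features created, edited, or deleted in Delivery 5
edit_weeks = 11        # Monday, February 1 through Tuesday, April 5
editors = 30

edits_per_week = total_edits / edit_weeks           # about 2,800 per week
edits_per_editor_week = edits_per_week / editors    # a little under 100
```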
How successful was Delivery #5? Very. RAMTeCH had promised 98% accuracy in its edits, and Delivery #5 passed with a score well above that threshold.
So the process is going pretty smoothly now, but it took several months (and several deliveries) to get to this point. Next month, we’ll talk about some of the lessons learned along the way!