Test methods and results

Testing was conducted to validate that the design would perform as expected and support the workflows, users, and intended load. System tests provide the opportunity to discover and correct problems in lower environments during deployment, ideally before they appear in production. For this test study, the testing approach focused on system performance and end-user experience.

Each component was monitored as the workflows were conducted against different load scenarios. Upon test completion, results were assembled and analyzed to identify both bottlenecks and over-resourced components in the system. This information was used to determine which system components needed to be scaled up, down, or out before testing was repeated.
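
As a sketch of this analysis step, the example below reads per-machine CPU samples (for example, exported from Windows Performance Monitor) and flags each machine as a bottleneck candidate, a scale-down candidate, or adequately sized. The file layout, column names, and thresholds are assumptions for illustration only, not values from this study.

    import csv
    from statistics import mean

    SCALE_UP_THRESHOLD = 80.0    # sustained CPU % suggesting a bottleneck
    SCALE_DOWN_THRESHOLD = 15.0  # sustained CPU % suggesting over-provisioning

    def classify(samples):
        avg = mean(samples)
        if avg >= SCALE_UP_THRESHOLD:
            return "bottleneck candidate: scale up or out"
        if avg <= SCALE_DOWN_THRESHOLD:
            return "over-resourced: candidate to scale down"
        return "adequately sized"

    def analyze(path):
        """Read rows of (machine, cpu_percent) and report a sizing verdict per machine."""
        by_machine = {}
        with open(path, newline="") as f:
            for row in csv.DictReader(f):
                by_machine.setdefault(row["machine"], []).append(float(row["cpu_percent"]))
        return {machine: classify(samples) for machine, samples in by_machine.items()}

    for machine, verdict in analyze("utilization.csv").items():  # placeholder file name
        print(f"{machine}: {verdict}")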

Manual user experience testing was conducted by capturing screen recordings of testers performing the workflows, to ensure users of the system could complete their workflows productively.

For more information, see how to design an effective test strategy.

Workflow pacing

This test study applied a pacing model to the tested workflows. The pacing model shows how the test simulates the pace of work at a utility, where workflows are performed as some number of operations per hour across a team of staff. This approach was based on Esri customer input and aimed to match the small to medium electric utility scenario on which the test data was based.

The various workflows were spread out through a one-hour test period and staggered so that they did not all start at the same time, while overlapping with one another as real-world workflows would. This overall breakdown of workflow pacing is considered the “design load” that the system is subjected to. The load was then increased by multiplying the workflows until the system was no longer able to provide acceptable responses or support successful workflows. Note that the workflow pacing model applied in this test study might not match typical daily use at your organization.

Results of the workflow pacing model for a Network Information Management System: Electric Utility (Oracle)
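
To make the pacing model concrete, the following minimal sketch spreads each workflow's operations across a one-hour window, staggers the workflow start times, and applies a load multiplier. The workflow names and operations-per-hour rates are hypothetical placeholders, not the rates used in this study.

    TEST_WINDOW_S = 3600  # one-hour test period

    # Hypothetical workflows and operations-per-hour rates (illustrative only)
    workflows = {
        "edit_service_connection": 12,
        "run_downstream_trace": 30,
        "validate_network_topology": 6,
    }

    def schedule(workflows, multiplier=1):
        """Spread each workflow's operations evenly across the test window,
        staggering start times so workflows overlap instead of starting together."""
        plan = []
        for i, (name, per_hour) in enumerate(workflows.items()):
            count = per_hour * multiplier
            interval = TEST_WINDOW_S / count        # pacing between operations
            offset = i * interval / len(workflows)  # stagger the first start
            plan.extend((name, offset + n * interval) for n in range(count))
        return sorted(plan, key=lambda op: op[1])

    # Example: 4x the design load
    for name, start_s in schedule(workflows, multiplier=4)[:5]:
        print(f"{start_s:7.1f}s  {name}")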

Performance testing tools

Because ArcGIS is a multi-tier system, performance tests were conducted across client, service, and data storage tiers, as well as the underlying infrastructure itself. In this test study, JMeter was used to simulate the user workflows and measure system performance under different loads. ArcGIS Pro requests were recorded and then replayed to simulate load in addition to manual workflows that were performed to assess end-user experience. Windows Performance Monitor and ArcGIS Monitor were also used to monitor resource utilization across different components.

For more information, see tools for performance testing.
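
Results analysis of this kind can also be scripted against JMeter's output. The sketch below summarizes a JMeter results file, assuming results were written in CSV (.jtl) format with the default fields (label, elapsed, success); the file name is a placeholder.

    import csv
    from collections import defaultdict

    def p95(values):
        """95th-percentile response time (nearest-rank on the sorted list)."""
        ordered = sorted(values)
        return ordered[int(0.95 * (len(ordered) - 1))]

    def summarize(jtl_path):
        elapsed = defaultdict(list)   # response times (ms) per request label
        failures = defaultdict(int)   # failed samples per request label
        with open(jtl_path, newline="") as f:
            for row in csv.DictReader(f):
                elapsed[row["label"]].append(int(row["elapsed"]))
                if row["success"] != "true":
                    failures[row["label"]] += 1
        for label, times in sorted(elapsed.items()):
            print(f"{label}: n={len(times)}, p95={p95(times)} ms, "
                  f"failures={failures[label]}")

    summarize("design_load_results.jtl")  # placeholder file name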

Test results

This architecture was validated with automated load tests and manual user testing in four scenarios, and the results from each are shown below. At a high level, the test results show that, as implemented, the system is adequately resourced to support loads from the design load through 8x the design load. The tests also reinforced the importance of proper application and system configuration for performance. Across the scenarios, system utilization increased proportionally with load.

Test scenario: Design load

Automated load test results at design load: ArcGIS Web Adaptor (two machines), Portal for ArcGIS (two machines), ArcGIS Server (hosting server, two machines), ArcGIS Server hosting Utility Network services (two machines), ArcGIS Data Store (relational, two machines), Oracle with associated workflow run times (one machine), and concurrent users

  • The system supported the load with low overall resource usage
  • ArcGIS Data Store (relational) was not used; the basemap was accessed as a vector tile service

Test scenario: 4x design load

Automated load test results at 4x design load: ArcGIS Web Adaptor (two machines), Portal for ArcGIS (two machines), ArcGIS Server (hosting server, two machines), ArcGIS Server hosting Utility Network services (two machines), ArcGIS Data Store (relational, two machines), Oracle with associated workflow run times (one machine), and concurrent users

  • The system supported the load
  • The hosting servers generally ran below 10% CPU
  • The GIS Servers generally ran at about 20% CPU
  • The Oracle CPU utilization stayed below 40%

Test scenario: 8x design load

Automated load test results at 8x design load: ArcGIS Web Adaptor (two machines), Portal for ArcGIS (two machines), ArcGIS Server (hosting server, two machines), ArcGIS Server hosting Utility Network services (two machines), ArcGIS Data Store (relational, two machines), Oracle with associated workflow run times (one machine), and concurrent users

  • The system supported the load
  • The hosting servers generally ran below 20% CPU
  • The GIS Servers generally ran at or below 40% CPU
  • The Oracle CPU utilization peaked between 60% and 70%

Test scenario: 10x design load

Automated load test results at 10x design load: ArcGIS Web Adaptor (two machines), Portal for ArcGIS (two machines), ArcGIS Server (hosting server, two machines), ArcGIS Server hosting Utility Network services (two machines), ArcGIS Data Store (relational, two machines), Oracle with associated workflow run times (one machine), and concurrent users

  • The system did not support the load
  • The hosting servers generally ran below 20% CPU
  • The GIS Servers generally ran at or below 40% CPU
  • The Oracle CPU utilization peaked around 90%

User experience - conducted workflow times

While the system was under load, conducted workflow times were captured as experienced by the users. These represent the time it took to complete all the steps listed in the workflows. Conducted workflow times remained consistent until the system became overloaded at 10x the design load.

Conducted workflow times in ArcGIS Pro across each tested design load scenario

User experience - conducted workflow step times

While the system was under load, the times of key steps across all eight workflows were captured. These represent the average time it took to complete a given step.

Conducted workflow step times in ArcGIS Pro across each tested design load scenario
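
As an illustration of how step averages like these can be derived, the sketch below aggregates per-run step timings, assuming each manual run was transcribed from the screen recordings as (workflow, step, seconds) records. All names and values are illustrative, not measurements from this study.

    from collections import defaultdict
    from statistics import mean

    # Hypothetical timings transcribed from screen recordings (illustrative only)
    observations = [
        ("edit_service_connection", "open_map", 4.2),
        ("edit_service_connection", "open_map", 5.1),
        ("edit_service_connection", "save_edits", 2.8),
        ("run_downstream_trace", "run_trace", 3.5),
    ]

    step_times = defaultdict(list)
    for workflow, step, seconds in observations:
        step_times[(workflow, step)].append(seconds)

    for (workflow, step), times in sorted(step_times.items()):
        print(f"{workflow} / {step}: avg {mean(times):.1f}s over {len(times)} runs")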
