Faster Geo-smart
Enterprise Architecture
& Software Solutions
Bootcamp GIS Instructor
Bootcamp GIS Course

Integration • Automation • Analytics • Conversion • Data Quality • GIS • CAD • BIM • 3D/AR/VR • Imagery • Software/Applications • Enterprise Architecture • Web/Cloud • Sensors/IoT • Real-time • AI/ML

Having come to GIS in 2006, and FME soon thereafter, with an extensive software engineering background, I have sorely missed several varieties of tools that enable best practices for assuring software quality. A regression test framework is one such tool. The proper application of a test framework gives greater confidence that software produces correct results and, moreover, that as software evolves and functions are added or changed, existing, supposedly unchanged functions are not broken in the process.

And make no mistake: FME workspaces and custom transformers are software!  They run like software, can be broken down into interacting modules like software, and can have bugs…like software.  They just happen to be implemented in a high-level, graphical, domain-specific (geoprocessing) programming language with a pipe-and-filter data flow architecture.  There is absolutely no reason why we should not be applying all that we have learned over the history of software development to systems written in the FME programming language.

To that end, I have begun building such a test framework, released under a standard 3-clause BSD open source license; the initial components are now published here and here on FME Hub. These first modules are custom transformers designed to work with standard AttributeValidator transformers for both workflow testing and data quality checking. For the former, regression test driver workspaces can call a target workspace or transformer with predefined test data and verify that the results are correct and can be generated without error, for both common and edge cases. For the latter, data can be tested and exception reports generated from inside any workspace or transformer.
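To illustrate the regression-test-driver idea outside of FME, here is a minimal sketch in plain Python: feed a target "workspace" predefined test data and check its results against expectations for both common and edge cases. Every name here (the target function, the attribute names) is a hypothetical stand-in for illustration, not part of the framework itself.

```python
# Conceptual sketch of a regression test driver, assuming a target
# "workspace" that transforms a list of feature dictionaries.
# All names are hypothetical illustrations, not framework APIs.

def target_workspace(features):
    """Stand-in for the workspace or transformer under test:
    here it simply uppercases the 'name' attribute of each feature."""
    return [{**f, "name": f["name"].upper()} for f in features]

# Predefined test data: one common case and one edge case (empty string),
# each paired with its expected output.
test_cases = [
    ([{"name": "road"}], [{"name": "ROAD"}]),
    ([{"name": ""}],     [{"name": ""}]),      # edge case
]

def run_regression(target, cases):
    """Run every case through the target and collect mismatches."""
    failures = []
    for i, (input_features, expected) in enumerate(cases):
        actual = target(input_features)
        if actual != expected:
            failures.append((i, expected, actual))
    return failures

print(run_regression(target_workspace, test_cases))  # → [] (all cases pass)
```

In the real framework the driver is itself an FME workspace, but the shape is the same: known inputs, expected outputs, and an automated comparison that flags any regression.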

The AttributeValidationListPopulator enhances the output of a standard AttributeValidator transformer with contextual information to assist later diagnosis of invalid attribute values. The AttributeValidationHTMLReportGenerator takes the output of one or more standard AttributeValidator transformers and their paired AttributeValidationListPopulator transformers and generates a human-readable HTML report of invalid attributes at each position within the calling workspace or custom transformer.
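The pattern these two transformers implement can be sketched in plain Python: enrich raw validation failures with context about where they occurred, then render the enriched records as HTML. The field names and report layout below are invented for illustration and do not reflect the transformers' actual output schema.

```python
# Conceptual sketch (assumed field names, not the transformers' real schema):
# stage 1 tags each validation failure with its position in the workspace;
# stage 2 renders the tagged failures as a simple HTML table.
import html

def populate_context(failures, position):
    """Analogous to AttributeValidationListPopulator: annotate each
    failure record with where in the workspace it was detected."""
    return [{**f, "position": position} for f in failures]

def html_report(failures):
    """Analogous to AttributeValidationHTMLReportGenerator: emit one
    table row per invalid attribute value."""
    rows = "".join(
        "<tr><td>{}</td><td>{}</td><td>{}</td></tr>".format(
            html.escape(f["position"]),
            html.escape(f["attribute"]),
            html.escape(str(f["value"])),
        )
        for f in failures
    )
    return ("<table><tr><th>Position</th><th>Attribute</th>"
            "<th>Invalid value</th></tr>" + rows + "</table>")

failures = populate_context(
    [{"attribute": "lanes", "value": -1}], position="AfterReprojection")
print(html_report(failures))
```

The design point is the separation of concerns: validation, contextual enrichment, and reporting are independent stages, so several validator/populator pairs placed at different positions can all feed one report generator.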

Soon I intend to develop equivalent transformers that work with standard GeometryValidator transformers, so that geometric requirements can be tested directly, just as attribute requirements can be now.

On a final note, I would be remiss if I did not mention another open source test framework for FME: the rTest package built by the fine folks at Veremes. I have tried it, and it works just fine as designed. However, its test specifications are written as XML documents against a particular DTD (schema), rather than in the more FME-like spirit of transformers. I greatly respect my colleagues at Veremes, but I personally prefer to stay within the FME workflow paradigm. If you are so inclined, please check out the rTest open source repository here. But for those of you who share my aversion to XML, you now have an alternative.