Best Practice Workflows: Data Integration
Updated on 28 Aug 2023
A Best Practice Workflow (BPW) is a practical guide to achieving some broad goal in Ursa Studio. The types of activities covered are sufficiently open-ended and heterogeneous that a fully prescriptive, cookbook-style guide could never be complete; instead, BPWs provide detailed guidance on best practices generally relevant to the activity, as well as the background and rationale supporting that guidance.
This documentation is intended for both new and experienced Ursa Studio users, though as teams become more fluent with Ursa Studio, they will likely discover variations on, and improvements to, the practices listed here that are better suited to their local working and data environments.
Each Workflow is broken down into a number of Tasks, listed below. Generally, the ordering of the Tasks within each Workflow reflects the recommended chronological order of the work.
Accessing Source Data Workflow
- Gather information on the source data
- Determine appropriate storage locations for flat files
- Generate a Source ID value for each source system
- Determine the appropriate Namespace for integration assets
- Create Registered Table objects
- Create Import objects and perform initial loads
- Configure the object visibility settings
- Create data exploration objects
- Validate the source data
- Determine the grain size of each source data object
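To make the grain-size task above concrete: determining grain amounts to checking whether a candidate key uniquely identifies every row of a source table. A minimal sketch in pandas follows; the file name, table, and key columns are hypothetical placeholders for illustration, not part of Ursa Studio.

```python
import pandas as pd

def check_grain(df: pd.DataFrame, key_columns: list[str]) -> bool:
    """Return True if key_columns uniquely identify every row (the grain)."""
    duplicates = df.duplicated(subset=key_columns).sum()
    print(f"{duplicates} duplicate rows under candidate grain {key_columns}")
    return duplicates == 0

# Hypothetical source extract and candidate grains.
claims = pd.read_csv("claims_extract.csv")
check_grain(claims, ["claim_id"])                  # one row per claim?
check_grain(claims, ["claim_id", "line_number"])   # or one row per claim line?
```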
Semantic Mapping Workflow
- Determine what Semantic Mapping objects are needed
- Identify the Natural objects impacted by each source data object
- Determine the mappings for identifier fields
- Determine the mappings for claims or billing transaction fields
- Checkpoint to review the interpretation of source data
- Create or extend Semantic Mapping Templates
- Create and run Semantic Mapping objects
- Validate the semantic mapping results
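As a conceptual illustration of the mapping and validation tasks above, the sketch below applies a code crosswalk and flags unmapped source values for review. The column names and code values are invented for the example; in Ursa Studio this logic would live in Semantic Mapping objects rather than ad hoc scripts.

```python
import pandas as pd

# Hypothetical crosswalk from a source system's local codes to standard values.
GENDER_MAP = {"M": "male", "F": "female", "U": "unknown", "1": "male", "2": "female"}

def map_codes(df, source_col, target_col, crosswalk):
    """Apply a code crosswalk and surface unmapped values for review."""
    df[target_col] = df[source_col].map(crosswalk)
    unmapped = df.loc[df[target_col].isna(), source_col].unique()
    if len(unmapped) > 0:
        print(f"Unmapped {source_col} values needing review: {list(unmapped)}")
    return df

patients = pd.DataFrame({"gender_code": ["M", "F", "X", "2"]})
patients = map_codes(patients, "gender_code", "gender", GENDER_MAP)
```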
Data Mastering Workflow
- Determine which concepts require mastering
- Determine which source tables contain information needed for mastering
- Determine the combinations of fields that should trigger a merge (see the sketch after this list)
- Configure and run the patient and provider mastering objects
- Create and run mastering objects for other concepts
- Validate the data mastering results
- Checkpoint to review the implementation of semantic mapping and data mastering
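The sketch below illustrates the merge-trigger task referenced above: merge rules expressed as combinations of fields that must all match for two records to be treated as the same patient. The specific field combinations are assumptions chosen for illustration; actual rules are configured in the mastering objects themselves.

```python
# Hypothetical merge rules: each rule is a set of fields that must all match
# (and be non-null) for two patient records to merge into one mastered identity.
MERGE_RULES = [
    ("ssn",),                                      # exact SSN match alone
    ("last_name", "first_name", "date_of_birth"),  # full name plus DOB
    ("last_name", "date_of_birth", "zip_code"),    # name fragment, DOB, ZIP
]

def should_merge(rec_a: dict, rec_b: dict) -> bool:
    """Return True if any rule's fields all match on both records."""
    for rule in MERGE_RULES:
        if all(rec_a.get(f) and rec_a.get(f) == rec_b.get(f) for f in rule):
            return True
    return False

a = {"ssn": None, "last_name": "Ortiz", "first_name": "Ana", "date_of_birth": "1980-03-14"}
b = {"ssn": None, "last_name": "Ortiz", "first_name": "Ana", "date_of_birth": "1980-03-14"}
print(should_merge(a, b))  # True, via the name-plus-DOB rule
```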
Creating Local Transform Objects Workflow
- Determine the sequence of Local Transform objects needed for each Natural object (see the sketch after this list)
- Create and run Local Transform objects
- Validate the Local Transform object results
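The sketch below illustrates the sequencing idea referenced above: each Local Transform is modeled as a function applied in order, with a row-count check after each step as a lightweight validation. The transform names and logic are invented for the example; real Local Transform objects are built in Ursa Studio, not in scripts like this.

```python
import pandas as pd

def deduplicate(df):
    """Drop exact duplicate rows, e.g. repeated claim lines."""
    return df.drop_duplicates()

def standardize_dates(df):
    """Normalize service dates to ISO calendar dates."""
    df = df.copy()
    df["service_date"] = pd.to_datetime(df["service_date"]).dt.date
    return df

# Hypothetical transform sequence feeding one Natural object.
TRANSFORMS = [deduplicate, standardize_dates]

df = pd.DataFrame({"service_date": ["01/02/2023", "01/02/2023"]})
for step in TRANSFORMS:
    before = len(df)
    df = step(df)
    print(f"{step.__name__}: {before} -> {len(df)} rows")
```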
Connecting to the Natural Object Layer Workflow
- Add new fields to existing Natural objects
- Create Natural objects for new source system concepts
- Connect upstream objects to Natural objects
- Review and add validation rules to Natural objects
- Run Natural objects and validate results
- Run downstream assets and review data diagnostic measure results
- Validate aggregate measure results against a gold standard (see the sketch after this list)
- Perform case review on a random sample of patients
- Checkpoint to review end-to-end integration results
- Determine which Semantic Mapping and Local Transform objects should be views
- Clean up integration assets
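For the gold-standard task referenced above, validation typically means comparing a computed aggregate measure to an external benchmark within a tolerance. A minimal sketch follows; the measure, benchmark value, and tolerance are placeholder assumptions.

```python
# Hypothetical gold-standard check: compare a computed aggregate measure
# against an externally reported benchmark within a relative tolerance.
def validate_against_gold(measured: float, gold: float, tolerance: float = 0.05) -> bool:
    deviation = abs(measured - gold) / gold
    print(f"measured={measured:.3f} gold={gold:.3f} deviation={deviation:.1%}")
    return deviation <= tolerance

# e.g., 30-day readmission rate from the integrated data vs. a published rate.
validate_against_gold(measured=0.152, gold=0.148)  # passes at 5% tolerance
```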