Release Notes v6.4
- Updated on 15 Jun 2022
What's new?
- APPLY REPORT-DESIGNATED RESTRICTION pattern. This pattern can only be used in measures. Any report containing measures that use this restriction will have the option to supply per-report string values for each "prompt key" identified by the pattern, which allows multiple reports of varying scopes to use the same measure with different subpopulations. The report can designate one or more acceptable values for each prompt key; if no values are selected, no restriction will be enforced. (See the sketch after this list.)
- IS IN REPORT-DESIGNATED CONTROL VALUES pattern. This pattern is the derived field version of APPLY REPORT-DESIGNATED RESTRICTION.
- Added an "Environment Health" tab to the Object Workshop dashboard. This tab will show the status of Ursa Studio's connections to the application database and customer database. For PostgreSQL, Redshift, and Snowflake implementations there is also a button to reset the connection pool to the database, which will help repair some database-locking scenarios.
What improvements have we made to existing features?
- Import "Latest file matching my_import_YYYYMMDD.csv" option
- Every import needs to know the data effective date of the file to be imported, which is typically specified via an input in the import object screen. For cloud-based deployments, Ursa Studio can be configured to automatically impute the data effective date from the filename, using the "Impute Data Effective Date from Filename" checkbox. By default, the date format mask is assumed to be YYYYMMDD and to sit at the end of the filename. Both defaults can be overridden: the user can enter any mask consisting of the characters Y, M, D, period, underscore, and hyphen. If the date string is not at the end of the filename, the user can specify the (1-indexed) start character of the date string, which is especially useful when the filename contains multiple numeric strings. (Illustrative sketches of this filename-driven behavior appear after these import notes.)
- Imputing the data effective date from the filename unlocks the possibility of running the import object via the Object Workshop dashboard screen, as well as automating the import via a Saved ELT.
- Imputing the data effective date from the filename also unlocks the possibility of choosing a generic, masked filename as the file to be imported. For example, if there is an existing file named "my_import_20200101.csv" in the list of files, users will also see an option to import "Latest file matching my_import_YYYYMMDD.csv". If this option is selected, the actual file to be imported will be determined dynamically at the time of import. Upon successful import of such a file, Ursa Studio will automatically delete all but the latest three files in the import container that match the filename mask, keeping the container at a manageable size when an automated process periodically uploads files.
- Ursa Studio supports a special treatment for imports that need to specify not only the data effective date but also the data coverage start and end dates. If the date format mask is exactly as follows: YYYYMMDD_YYYYMMDD_YYYYMMDD, then Ursa Studio will assume that the first of these dates represents the data effective date, and the other two represent the data coverage start and end dates, respectively.
- Import objects can now import multiple files at a time, under specific scenarios. The files must be contiguous within a single storage container, and the effective date of the files must be imputed from the filename. Users can choose the start and end files of the range to be imported, and Ursa Studio will import them one after the other, in alphabetical order, appending each subsequent file and using the chosen append key to de-duplicate records. The original file can still be set to either overwrite or append if the table already exists.
- Due to limitations in the AWS and Azure APIs, the Import Object screen is only ever able to list 1000 files within a bucket. For buckets containing over 1000 files, users have had to enter the full name of the file to be imported, which has been problematic when users do not know the exact name of the file and do not have personal access to the bucket to verify the filename. Now, when Ursa Studio recognizes that the filename list is being truncated to 1000 by the AWS or Azure API, the Import Object screen will offer an optional extra control in which users can enter a prefix string to act as a filter; the list of available files in the filename dropdown will then be limited to files that start with the prefix string. If the files exist in a subdirectory, the prefix string should be the subdirectory name, not the beginning of the actual filename.
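The filename-mask behavior above can be pictured with a small Python sketch. This is not Ursa Studio's parser; it simply mirrors the documented rules (mask anchored at the end of the filename by default, an optional 1-indexed start character, and the special three-date YYYYMMDD_YYYYMMDD_YYYYMMDD mask), and it assumes each date group is written as contiguous YYYYMMDD digits.

```python
# Illustrative sketch of imputing dates from a filename mask; not Ursa Studio code.
from datetime import datetime

def impute_dates(filename, mask="YYYYMMDD", start_pos=None):
    """Extract the substring covered by `mask` and parse each contiguous YYYYMMDD group.

    `start_pos` is 1-indexed; when omitted, the mask is assumed to sit at the
    end of the filename, just before the extension.
    """
    stem = filename.rsplit(".", 1)[0]
    if start_pos is None:
        raw = stem[-len(mask):]
    else:
        raw = stem[start_pos - 1 : start_pos - 1 + len(mask)]

    dates, i = [], 0
    while i < len(mask):
        if mask[i] in "YMD":
            j = i
            while j < len(mask) and mask[j] in "YMD":
                j += 1
            dates.append(datetime.strptime(raw[i:j], "%Y%m%d").date())
            i = j
        else:
            i += 1  # skip separators: period, underscore, hyphen
    return dates

# Default mask: the data effective date only.
print(impute_dates("my_import_20200101.csv"))
# Three-date mask: data effective date, coverage start, coverage end.
print(impute_dates("claims_20220601_20220101_20220531.csv",
                   mask="YYYYMMDD_YYYYMMDD_YYYYMMDD"))
```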
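Similarly, the "latest file matching" option, the post-import housekeeping, and the multi-file range all key off simple filename ordering. Here is a hedged sketch of that selection logic; the helper below is hypothetical and Ursa Studio's actual matching may differ.

```python
# Hypothetical helper; not Ursa Studio's implementation.
import re

def mask_to_regex(mask):
    """Turn a masked filename like 'my_import_YYYYMMDD.csv' into a regex."""
    return re.compile("^" + "".join(r"\d" if ch in "YMD" else re.escape(ch) for ch in mask) + "$")

files = ["my_import_20200101.csv", "my_import_20210101.csv",
         "my_import_20220101.csv", "my_import_20220601.csv", "other.csv"]

matcher = mask_to_regex("my_import_YYYYMMDD.csv")
matching = sorted(f for f in files if matcher.match(f))

# "Latest file matching my_import_YYYYMMDD.csv", resolved at import time:
latest = matching[-1]          # my_import_20220601.csv

# Housekeeping after a successful import: keep only the latest three matches.
to_delete = matching[:-3]      # my_import_20200101.csv

# Multi-file import: the chosen start and end files bound an alphabetical range,
# and each file in the range is appended in turn (de-duplicated on the append key).
start, end = "my_import_20210101.csv", "my_import_20220601.csv"
batch = [f for f in matching if start <= f <= end]
```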
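The 1000-file ceiling comes from the cloud storage listing APIs themselves; for example, S3's ListObjectsV2 returns at most 1,000 keys per call. The new prefix control simply narrows that listing, roughly as in the sketch below (the bucket and prefix names are made up for illustration).

```python
import boto3

s3 = boto3.client("s3")
response = s3.list_objects_v2(
    Bucket="my-import-bucket",   # made-up bucket name
    Prefix="claims_feed/",       # for files in a subdirectory, use the subdirectory name
)
filenames = [obj["Key"] for obj in response.get("Contents", [])]
```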
- The measure archetype picker at the top of the measure wizard has been replaced by a new screen that shows the available archetypes in a grid whose dimensions are measure type and temporal structure. Users will also see, in the applicable grid cell, any measures that are available to clone.
- The temporal structure Timeline Period is now called Interval, and the temporal structure Timeline Snapshot is now called Entity.
- Removed the designation of "dominant case" in measure workshop. Moving forward, all count measures will be assumed to be numerator-dominant, and all other measures will be assumed to be denominator-dominant.
- Created new visibility setting for boards: "All report users can view, all report architects and admins can edit".
- The data model screen now supports a list view to complement the existing "dependency tree" view. Both views can be toggled for any lens via an icon-button at the top of the screen, but each lens will default to its most suitable view when initially selected. The one-asset-and-its-immediate-relatives "browse mode" will continue to operate as before, in the "dependency tree" view.
- All non-SSO implementations now have two-factor authentication turned on by default, instead of turned off. Deployments can opt out from two-factor authentication by means of the ESCHEW_2FA environment variable. The old REQUIRE_2FA environment variable is now obsolete. SSO-based implementations may still use 2FA, as determined by the SSO identity provider, but such 2FA will not be managed by Ursa Studio.
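As a rough sketch of the new default, a deployment might resolve the flag at startup roughly as follows; the exact values accepted for ESCHEW_2FA are an assumption here, not documented behavior.

```python
import os

def two_factor_enabled():
    # 2FA is on unless the deployment explicitly opts out via ESCHEW_2FA.
    # The recognized opt-out values below are illustrative assumptions.
    return os.environ.get("ESCHEW_2FA", "").strip().lower() not in ("1", "true", "yes")
```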
What's been fixed?
- Stopped Chrome password manager from autocompleting user credentials into the "Zip Password" field in Import Objects.
- Fixed a bug in transitive matching logic during master data management for RDBMSes that don't support recursive CTEs.