Release of Daml SDK 1.5.0
By Anthony Lusardi. Sep 9, 2020
Daml SDK 1.5.0 has been released on September 16th 2020. You can install it using:
daml install latest
There are no changes requiring migration. However, we recommend using Daml Script instead of Scenarios going forward. Learn more about Daml Script in the notes below.
Interested in what's happening in the Daml community and broader ecosystem? If so we've got a jam packed summary for you in our latest community update.
- Daml Script is now fully integrated into the IDE and will supersede Scenarios.
Impact and Migration
- There are no changes requiring migration.
- We recommend using Daml Script instead of Scenarios for any new testing and scripting needs. It is more versatile and will supersede Scenarios.
Daml Script in the IDE
Daml Script was designed from the ground up to be “Scenarios 2.0”: A single scripting language for Daml for all use-cases. To make sure Scripts could be used truly universally, the first covered use-case was scripted interaction with live ledgers, introduced with SDK 0.13.55 in March 2020. Daml Repl and Triggers added interactive tooling and reactive automation. This latest release now completes the journey by providing the same real-time IDE integration that Scenarios have always offered, as well as adding a few more Script features that ease migration.
Scenarios offer functionality that cannot be offered via the Ledger API, which made it impossible to extend them into what Daml Script now is without breaking backwards compatibility. They are also in such widespread use that breaking backwards compatibility on Scenarios would have been costly. The two features will therefore live in parallel for a while, but Scenarios are going to be deprecated in an upcoming release, which means they may be removed with a major SDK version from 12 months after the date of deprecation and will not receive new features.
For more details on Daml Script and its role as “Scenarios 2.0”, please refer to our latest blog post.
- Daml Scripts are now run in Daml Studio just like scenarios. The functionality for inspecting the results is identical.
- Add a `script` function to aid type inference. This is the equivalent of the `scenario` function.
- In Daml Studio, you can now set the ledger time to the past. This is not supported when running against a ledger.
- Add a `queryContractId` function for querying for a contract with the given identifier. This offers the Scenario functionality of `submit p do fetch cid` in a way that's consistent with Ledger API use, but also allows for negative results by returning an `Optional`.
- Add a `passTime` helper to advance the time by the given interval.
- Add `archiveCmd cid` as a convenience wrapper around `exerciseCmd cid Archive`.
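Taken together, the new helpers can be sketched in a single Script. This is an illustrative sketch, not taken from the release itself; the `Asset` template is hypothetical and defined only for this example:

```daml
module ScriptFeatures where

import DA.Time (hours)
import Daml.Script

-- Hypothetical minimal template, defined only for this illustration.
template Asset
  with
    owner : Party
  where
    signatory owner

demo : Script ()
demo = script do                      -- `script` aids type inference
  alice <- allocateParty "Alice"
  cid <- submit alice do createCmd Asset with owner = alice
  -- `queryContractId` returns an Optional instead of failing like `fetch`
  asset <- queryContractId alice cid
  assert (asset == Some (Asset with owner = alice))
  passTime (hours 1)                  -- advance time (Daml Studio / static time only)
  submit alice do archiveCmd cid      -- same as exerciseCmd cid Archive
```

Running `demo` in Daml Studio shows its transaction view inline, just as Scenarios always have.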
Impact and Migration
Given Scenarios’ widespread use and the fact that Script does not have complete feature parity with Scenarios, Scenarios will go through the proper deprecation cycle:
- They will be deprecated in an upcoming SDK version, and from SDK 2.0, the SDK will emit warnings on Scenario use.
- They may be removed from the next major SDK version 12 months from the time of deprecation.
Given Daml Script’s additional capabilities and the fact that Scenarios will not receive new features after deprecation, we recommend developing any new tests or initialization scripts using Daml Script instead of Scenarios.
If you would like to migrate existing Scenarios, please refer to the migration guide. If you are not sure how to migrate your Scenario to Daml Script, please get in touch with us.
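As a sketch of what such a migration typically looks like (using a hypothetical `Asset` template, not an example from the migration guide), the Scenario primitives map one-to-one onto Script commands:

```daml
module Migration where

import Daml.Script

-- Hypothetical template used only to illustrate the mapping.
template Asset
  with
    owner : Party
  where
    signatory owner

-- Before: a Scenario using ledger primitives directly.
-- test = scenario do
--   alice <- getParty "Alice"
--   cid <- submit alice do create Asset with owner = alice
--   submit alice do exercise cid Archive

-- After: the equivalent Daml Script. getParty becomes allocateParty,
-- and create/exercise become createCmd/exerciseCmd.
test : Script ()
test = script do
  alice <- allocateParty "Alice"
  cid <- submit alice do createCmd Asset with owner = alice
  submit alice do exerciseCmd cid Archive
```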
- You can now configure the application id that Daml Script, REPL, and Triggers submit to the Ledger API using the `--participant-config` command line flag. This is needed if you are working against a ledger with authentication and need to match the application id in your token.
- You can now convert Daml expressions to JSON in the Daml REPL using the meta-command `:json`. For example: `:json [1, 2, 3]`. This allows you to test how the JSON API would convert your data to JSON.
- The maximum size for packages can now be configured independently in the JSON API. The optional `--package-max-inbound-message-size` command line option sets the maximum inbound message size in bytes used for uploading and downloading package updates. It defaults to the value of the general `--max-inbound-message-size` option.
- The Daml Standard Library has gained a new function `visibleByKey`, together with improved documentation on the authorization rules of the various by-key functions.
- The `@daml/react` library can now fetch a contract by its contractId.
- The Daml Engine has had significant performance improvements:
  - `foldl` and `foldr` are each four times as fast as before.
  - Record update expressions of the form `R with f_1 = E_1; ...; f_n = E_n` are much more efficient.
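For instance, a multi-field update like the following now compiles to much cheaper code. The `Account` record is hypothetical, shown only to illustrate the `R with f_1 = E_1; ...; f_n = E_n` form:

```daml
module RecordUpdate where

data Account = Account with
    owner : Text
    balance : Int
    frozen : Bool
  deriving (Eq, Show)

-- A single record update expression setting several fields at once;
-- this is the form that benefits from the engine improvement.
settle : Int -> Account -> Account
settle amount acc = acc with
  balance = acc.balance + amount
  frozen = False
```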
- New docs pages giving further detail on the ordering guarantees Daml Ledgers give on events, and how the theory can be extended to span multiple infrastructures.
Early Access Development
- Daml Trigger Service
- Parties are now specified in request bodies as opposed to via HTTP Basic auth. This is done in preparation for running the trigger service against authenticated ledgers.
- The database format has changed to allow migrations in future releases. Databases are always initialized or migrated to the current version on start, so use of `--init-db` is no longer required. See issue #7092.
- The new `--address` option can be used to listen for HTTP connections on interfaces other than localhost, such as `0.0.0.0` for all addresses. See issue #7090.
- The fetch by key streaming endpoint (`/v1/stream/fetch`) of the JSON API had a bug where streaming multiple keys could silently ignore some of the given keys. This has been fixed. The feature was not used by the `@daml/*` client libraries, so you were only affected if you subscribed to the websocket endpoint directly.
- The JSON API no longer returns offsets before the initial active contract set block. This matches the documented behavior that the first message containing an offset indicates the end of the ACS block.
- The `--application-id` command-line option on the JSON API is now hidden and deprecated. The JSON API never used it, as it uses the application ID specified in the JWT.
- A bug in the Extractor that could cause transient contracts (created and archived within the same transaction) to be shown as active was fixed. See issue #7201 for details.
- Calling the `off` method of the `@daml/*` client libraries' event streams returned by `streamFetchByKey` now correctly removes the given listener.
- The Scala bindings' `LedgerClientConfiguration` was fixed. It previously only set the maximum size for the metadata. `maxInboundMessageSize` now does the correct thing, and a new option `maxInboundMetadataSize` was added to set the maximum metadata size. These names match the Netty channel builder.
- The Scala bindings' client was sometimes not closed properly on application shutdown. You may have seen some `RejectedExecutionException` errors in your logs if this affected you. This is now fixed, but we encourage all users of the `LedgerClient` to call the `close` method explicitly to ensure it is closed at an appropriate time in your application lifecycle.
- `withTimeProvider` was removed from `CommandClient` in the Scala bindings. This method has done nothing since the new ledger time model was introduced in 1.0.0. See issue #6985.
- When running the Ledger API Test Tool, the required DAR files are now uploaded to the ledger automatically before running the tests. You no longer need to upload these DARs before running the test tool.
- kvutils now expects execution contexts to be passed into the various `LedgerStateOperations` methods. This is a source-breaking change. Instead of providing an execution context implicitly to your ledger implementation, you are encouraged to construct the relevant contexts for different operations explicitly. Please refer to the open-source implementations as a reference.
- We have enriched the contextual information exposed by the Ledger API server. You will notice richer logging output, which can be consumed via either unstructured or structured logging frameworks. A paragraph on how to configure structured logging has been added to the docs. For more details, see issue #6837.
We are continuing to work on performance of the Daml integration components and improving production readiness of Daml Ledgers, but there are exciting features and improvements in the pipeline for the next few releases as well.
- The Trigger Service will reach feature completion and move into Beta
- The authentication framework for Daml client applications (like the Trigger Service) is being revisited to make it more flexible (and secure!)
- Daml’s error and execution semantics are being tidied up with a view towards improving exception handling in Daml
- Daml will get a generic Map type as part of Daml-LF 1.9