Release of Daml SDK 0.13.38

Ledger API

  • Allow non-alphanumeric characters in Ledger API server participant ids (space, colon, hash, slash, dot). This is the proper fix for the change originally attempted in v0.13.36. See issue #3327.
  • Add health check endpoints conforming to the gRPC Health Checking Protocol. They always report SERVING for now.

Ledger API Server

  • Ledger API Server and Indexer now accept an instance of MetricRegistry as a parameter. This gives implementers of ledger integrations maximum flexibility in setting up the metrics reporting that works best for them.
  • Add various metrics to track gRPC requests, command submissions, and state update processing. See #3513.

Daml Ledger Integration Kit

  • Add conformance test coverage for the service.
  • Add a Ledger API Test Tool --load-scale-factor option that allows dialing up or down the workload applied by scale tests (such as the TransactionScaleIT suite). This allows improving the performance of different ledgers over time.
  • The Ledger API Test Tool no longer shows individual test durations colored based on how long they took.


Sandbox

  • Add support for JWT tokens that only authorize to read data, but not to act on the ledger.
  • Add CLI options to start the sandbox with JWT based authentication with RSA signed tokens. See issue #3155 .
  • The --auth-jwt-hs256 CLI option is renamed to --auth-jwt-hs256-unsafe: you are advised not to use this JWT token signing scheme in a production environment.


  • Fixed a bug where the --access-token-file option did not work correctly.

Daml Compiler

  • Bugfix: The Sdk-Version field in a DAR manifest file now matches the SDK version of the compiler, not the sdk-version field from daml.yaml. These are usually the same, but they could be different if you set the DAML_SDK_VERSION environment variable before running daml init or daml build.
  • Make the experimental feature “generic templates” unavailable. The current implementation is at odds with other, more important language features still under development.

Daml Studio

  • Notify users about new Daml Driven blog posts.

Java Bindings

  • Deprecated the existing constructors for DamlLedgerClient; please use the static newBuilder method to instantiate a builder and use it to create the client, starting from either a NettyChannelBuilder or a plain host/port pair.
  • Rename DamlMap to DamlTextMap.
  • The DamlCollectors class provides Collectors to more easily build DamlList and DamlTextMap.
  • Change the recommended method to convert DamlValue containers from/to Java Bindings containers. See the new methodology for more details.

Daml-LF Interface Reader

  • Rename PrimTypeMap to PrimTypeTextMap and PrimType.Map to PrimType.TextMap.

JSON API – Experimental

  • Accept a path to a file containing a token at startup for package retrieval. See issue #3627.

Daml Triggers – Experimental

  • Daml Triggers now allow you to specify which templates you want to listen for, which can improve performance.
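As a sketch of what this looks like (the Asset template is hypothetical, and the record layout follows the 0.13.x Daml.Trigger API, which may differ in later SDKs), a trigger that only listens for Asset contracts sets its registeredTemplates field:

```daml
-- Sketch: a trigger that only receives events for the hypothetical
-- `Asset` template instead of all templates in the DAR.
assetTrigger : Trigger ()
assetTrigger = Trigger
  { initialize = const ()
  , updateState = \_acs _message state -> state
  , rule = \_party _acs _time _commandsInFlight _state -> pure ()
  , registeredTemplates = RegisteredTemplates [registeredTemplate @Asset]
  , heartbeat = None
  }
```

Restricting the registered templates shrinks the ACS the trigger has to maintain, which is where the performance improvement comes from.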

Daml Script – Experimental

  • Daml Script can now be used in distributed topologies.
  • Expose the Ledger API exerciseByKey command.
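A hedged sketch of using the newly exposed command from a script (the Account template, its (Party, Text) key and the Deposit choice are made up for illustration, and the exact Daml Script API may differ across SDK versions):

```daml
-- Sketch: exercising a choice by contract key rather than contract id.
-- `Account`, its key and `Deposit` are hypothetical.
depositByKey : Script ()
depositByKey = do
  bank <- allocateParty "Bank"
  _ <- submit bank $ createCmd Account with owner = bank; accountId = "acc-1"; amount = 0.0
  _ <- submit bank $ exerciseByKeyCmd @Account (bank, "acc-1") Deposit with amount = 10.0
  pure ()
```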

Release of Daml SDK 0.13.37

Daml Stdlib

  • Added the NumericScale typeclass, which improves the type inference for Numeric literals, and helps catch the creation of out-of-bound Numerics earlier in the compilation process.
  • fromAnyChoice and fromAnyContractKey now take the template type into account.


Navigator

  • Fixed a bug where Navigator becomes unresponsive if the ledger does not contain any Daml packages.


Ledger API

  • Add field gen_map in the Protobuf definition for Ledger API values. This field is used to support generic maps, a new feature currently in development. See the issue tracker for more details about generic maps. The Ledger API will send no messages where this field is set when using a stable version of Daml-LF. However, the addition of this field may cause exhaustiveness warnings in pattern matches in the code of Ledger API clients. Those warnings can be safely ignored until GenMap is made stable in an upcoming version of Daml-LF.


Navigator

  • The app can now work against a Ledger API server that requires client authentication. See issue #3157.

Daml Compiler

  • Breaking The default Daml-LF version is now 1.7. You can still produce Daml-LF 1.6 by passing --target=1.6 to daml build. This removes the Decimal type in favor of a Numeric s type with a flexible scale. Decimal is now a synonym for Numeric 10. If you get errors about ambiguous literals, you might need to add a type annotation, e.g., replace 1.0 by (1.0 : Decimal).
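In practice, the annotation fix looks like this (a minimal sketch):

```daml
-- With Daml-LF 1.7 a bare literal such as 1.0 can be any `Numeric n`,
-- so an annotation may be needed where inference cannot pick the scale:
limit : Decimal          -- Decimal is now a synonym for Numeric 10
limit = 1.0

rate : Numeric 4         -- a Numeric with four decimal places
rate = 0.0001

total : Decimal
total = limit + (1.0 : Decimal)
```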

JSON API – Experimental

  • CLI configuration to enable serving static content as part of the JSON API daemon: --static-content "directory=/full/path,prefix=static". This configuration is NOT recommended for production deployment. See issue #2782.
  • The database schema has changed; if using --query-store-jdbc-config, you must rebuild the database by adding ,createSchema=true. See issue #3461.
  • Terminate process immediately after creating schema. See issue #3386.

Daml Triggers – Experimental

  • emitCommands now accepts an additional argument that allows you to mark contracts as pending. Those contracts will be automatically filtered from the result of getContracts until we receive the corresponding completion/transaction.
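As a sketch (the Asset template is hypothetical, and the exact signatures follow the 0.13.x Daml.Trigger API), archiving a contract and marking it pending so it is filtered out of getContracts immediately:

```daml
-- Sketch: emit an Archive command and mark `cid` as pending so it
-- disappears from `getContracts` until the corresponding
-- completion/transaction arrives. `Asset` is a hypothetical template.
archiveOnce : ContractId Asset -> TriggerA ()
archiveOnce cid = do
  _ <- emitCommands [exerciseCmd cid Archive] [toAnyContractId cid]
  pure ()
```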

Daml Script – Experimental

  • This release contains a first version of an experimental Daml script feature that provides a scenario-like API that is run against an actual ledger.

Smart Contracts Deserve Smart Analytics

Anywhere from 50-80% of a data scientist’s time is estimated to be spent on cleaning and preparing data for analytics. This is needed because data is often inconsistent, has gaps, and has to be enriched from multiple external sources to present a full picture. For example, customer transactions gathered from various sources often have different product SKU names, locations that must be reconciled, as well as duplicate records that must be removed. This also implies that enterprises spend a large chunk of their business innovation budget on the data management problem, even before it reaches the data scientists.

In this blog, we’ll outline why this problem occurs and how Actian’s integration with Daml, the open source smart contract platform created by Digital Asset, can greatly alleviate this problem. Less time spent on preparation means more time spent on actual analytics, leading to direct business benefits. We also have much happier data scientists!


A unifying business process

In a digital economy, the industry is moving towards a collaborative model based not only on shared data, but also shared business processes. Traditionally, applications and organizations have interacted with each other through various means such as APIs and messaging protocols, and have maintained their own copies of the data for reasons that naturally come about as a result of an evolving technology landscape. Multiple copies of data require constant and expensive reconciliations, and a parallel infrastructure of checks and balances to ensure business agility and optimal customer experiences. It is no surprise that any aggregation of all these silos for analysis purposes also needs significant clean up and matching.

It is in this context that applications leveraging smart contracts in Daml provide us a new model for enabling such collaboration. Each application that is a party on a smart contract based workflow has access to a single golden source of the data while still enjoying complete flexibility with respect to privacy and isolation of data to meet various regulatory, audit, and compliance requirements. The business functions can be executed with the knowledge that the underlying data layer cannot get out of sync between applications. This architecture means that the traditional problems of reconciliation and data inconsistencies can be eliminated to a significant extent.


A New Challenge

While smart contracts and blockchain based applications provide tremendous business and operational benefits when it comes to clean and consistent data, we also need to be able to analyze the operations, derive insights, and apply those insights to improve future operations: for example, AML (Anti-Money Laundering) and fraud analytics on financial transactions. One approach is to take a periodic snapshot of the smart contract data store and do these analytics on an offline copy of the database. However, this approach can lead us back to the same data islands and the same reconciliation problems.

Enabling analytics on smart contract data directly and solving this data synchronization problem is what Actian set out to do with our integration with Daml. Traditionally, Actian’s Data Connect product has addressed the data aggregation challenge through an exhaustive set of connectors to various enterprise applications (ERP, CRM, various custom databases, marketing systems, etc.). Using Data Connect, enterprises have been able to get the data where they want it, when they want it, and in the form they want it.

However, smart contracts and blockchains provide a new set of challenges. First, since the Daml runtime has granular privacy guarantees, it is not just a simple matter of retrieving all existing data from the blockchain into an offline database – that is neither easily allowed nor desired. Second, the data that is stored into the smart contracts needs to be transformed to make sense to analytics applications. And finally, we do not just want to create an offline copy, but instead prefer it to be constantly in sync with the smart contract store to avoid the reconciliation and inconsistency problem.

The Solution Components

We created a couple of real life applications to demonstrate this exciting capability. The first was a supply chain use case that allowed retailers and manufacturers to share demand data. Second was a financial use case on Swaps using the ISDA CDM where a trade lifecycle was executed between brokers, clients and central counterparties.

Modeling & Business Operations: To make the scenarios realistic we modeled real life parties for these use cases, such as multiple brokers and clients for the financial use case. These multi-party Daml applications were then hosted on a scalable cloud backend for Daml applications (projectDABL) that abstracts away the ledger and other plumbing complexities from the participating entities. These distributed applications generated transaction data as part of typical business operations. For example, trade proposals, approved trade instances, cash movements etc.

Data Synchronization: The next step in the process was to make this data available for detecting patterns such as demand trends in the case of the supply chain use case, and routine business intelligence on trade information in the case of the financial use case. This is where we integrated Actian’s Data Connect product with Daml. We had two options: we could either act on behalf of a participating entity, or we could act as a third-party regulator or agency that requires access to certain parts of the information. While the connection semantics are the same, we highlight this point because even in the latter case, no special access workarounds are needed. This is because Daml models allow for what are called “Observers”, which can also be considered parties on the network. These observers could be a reporting entity or application, a regulator, or a financial intermediary. The Data Connect integration can be built in a way that enables continuous syncing with the data this Observer receives, which means that any updates to the contracts on the ledger trigger updates in Data Connect as well. Using this approach also helped us avoid accessing data that reporting systems are not required to store.

Transformation: Smart contract data must be transformed into a format that makes sense for business analytics. We used Data Connect’s built in data exploration and mapping tool to perform this activity. At the end of this step, the data on the Daml ledger is in perfect sync with Data Connect but in a form that is also readily consumable by analytics applications in real time.

Enrichment: Once we had the base ledger data in the form we needed, we then used Data Connect’s comprehensive set of connectors to pull in information from external applications to enrich the data from the ledger. For example, for the trade information, we pulled up client profiles from an external database (dummied up for the purposes of this demonstration). In an ideal world, all of this data might reside on the ledger as reference data. However, in a practical enterprise technology deployment roadmap, functionality is often deployed in phases, or sometimes various constraints do not allow the ideal point of arrival architecture to be realized. We also have multiple systems such as ERPs, CRMs, billing systems, fraud systems etc. to account for. The data enrichment stage was designed within this prototype to account for those realistic scenarios.

Analytics & Insights: Finally, to complete the end to end use case, we used Vector, Actian’s high performance analytics database as an endpoint. Both ledger and enrichment data were then populated into Vector allowing business users to slice and dice the data as desired. Vector also allows for different kinds of visualization and analytic tools to be used such as Tableau, Kibana and others. While we used Actian’s Vector for this integration, we could also have seamlessly used Avalanche, Actian’s cloud data warehouse, or any other enterprise database that is used by organizations. Data Connect has ready connectors with almost every enterprise system.



There are several alternatives to this architecture which are useful in some cases. For example, for simple decisioning at the streaming level – as real time ledger updates are happening – we can use Daml automations to trigger those actions without having to first pull the data into a synchronized data store. However, these decisions that need to be taken by the automations are often those that will be generated and defined through various kinds of advanced analytics already completed. That necessary advanced analytics and enrichment of data is enabled by the integration we have outlined in this post.

A second alternative is provided by the Daml SDK itself, and is very useful while testing applications as they are built. The extractor tool within the SDK allows pulling of the ledger data into an offline DB such as PostgreSQL. Once we get to any significant analytics and business intelligence however, we need advanced transformation and enrichment capabilities that Data Connect provides.



We hope we were able to outline an important use case in data analytics and prediction that is frequently coming up as enterprises move to the next step of their DLT programs. Actian’s integration with Daml is an important step towards enabling the digital transformation journeys of our clients. It also showcases the effectiveness of Daml’s open technology stack in provisioning an enterprise-ready landscape. While Daml applications can run on multiple ledgers and databases, this integration with Actian’s Data Connect provides a wide array of enterprise integrations to enable enrichment of data.

As enterprises move forward in their DLT and smart contracts journey, they must start thinking in terms of their overall enterprise architecture and data analytics roadmap. Both of those will need to be defined as a more flexible and contextual business process is created with the help of Daml smart contracts based workflows.

Release of Daml SDK 0.13.36

Daml Compiler

  • Support for incremental builds in daml build using the --incremental=yes flag. This is still experimental and disabled by default but will become enabled by default in the future. On large codebases, this can significantly improve compile times and reduce memory usage.
  • Support for data dependencies on packages compiled with an older SDK (experimental). To import data dependencies, list the packages under the data-dependencies stanza in the project’s daml.yaml file.
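A daml.yaml using the new stanza might look like the following sketch (the project name and DAR path are made up for illustration):

```yaml
# Sketch: a project that imports a DAR built with an older SDK
# as a data-dependency. Names and paths are hypothetical.
sdk-version: 0.13.36
name: my-project
version: 1.0.0
source: daml
dependencies:
  - daml-prim
  - daml-stdlib
data-dependencies:
  - ./lib/old-model-1.0.0.dar
```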

Daml Stdlib

  • Added the maintainer function, which gives you the list of maintainers of a contract key.


Sandbox

  • Add the option to start the sandbox with JWT based authentication. See issue #3363.
  • Fixed a bug in the SQL backend that caused the database to be flooded with requests when streaming out transactions.
  • Fix divulged contract visibility in multi-participant environments. See issue #3351.
  • Enable the ability to configure ledger api servers with a time service (for test purposes only).
  • Allow a ledger api server to share the Daml engine with the Daml-on-X participant node for performance. See issue #2975.
  • Allow non-alphanumeric characters in ledger api server participant ids (space, colon, hash, slash, dot).
  • Include SQL statement type in ledger api server logging of SQL errors.

Daml Triggers

  • Added exerciseByKeyCmd and dedupExerciseByKey to exercise a choice given the contract key instead of the contract id.
  • getTemplates has been renamed to getContracts to describe its behavior more accurately. getTemplates still exists as a compatibility helper but it is deprecated and will be removed in a future SDK release.
  • Fix a bug where the use of Numeric caused triggers to crash with an assertion error.
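The by-key helpers in the first bullet can be sketched as follows (the Account template, its (Party, Text) key and the UpdateBalance choice are hypothetical, and signatures follow the 0.13.x trigger API):

```daml
-- Sketch: exercise a choice by key from a trigger rule, only sending
-- the command if an identical one is not already in flight.
-- `Account`, its key and `UpdateBalance` are hypothetical.
bumpBalance : Party -> TriggerA ()
bumpBalance owner =
  dedupExerciseByKey @Account (owner, "acc-1") UpdateBalance
```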

JSON API – Experimental

  • Fix to support the Archive choice. See issue #3219.
  • Implement replay on database consistency violation. See issue #3387.
  • Comparison/range queries supported. See issue #2780.

Extractor – Experimental

  • Fix bug in reading TLS parameters.

Combining Blockchain, Cloud and Nanotechnology to secure the supply chain with Sextant for Daml

Nanotechnology, smart contracts and blockchain all come together to create an elegant and effective solution to a $1.8 trillion problem that has vast economic and human consequences.

Today Digital Asset and Blockchain Technology Partners announced the availability of Sextant for Daml with early adopter Quantum Materials Corp. Download the Daml SDK and deploy with Sextant for Daml today.

Counterfeit goods are a $1.8 trillion criminal industry with vast economic and human consequences. The Boston Consulting Group estimates that as much as $200 billion of drugs sold annually are fakes and 10% of European luxury products are counterfeited. The immutable uniqueness of Quantum Materials Corp’s optical signatures can reinforce trust at all stages of the supply chain. 

How so? After over a decade of nanotechnology research in their labs in Austin, Texas, QMC have released Quantum Dots, nanoscale particles so tiny you could line ten thousand of them across the diameter of a human hair. These quantum dots react to energy and light and can be tuned to emit predetermined wavelengths of light. By varying the wavelength QMC can design billions of unique optical signatures which can be incorporated in the production of almost anything, from aircraft components to luxury handbags. Impossible to copy, these optical signatures can determine absolute product identity.

Quantum dots are the stuff of magic but these unique optical signatures are only useful with the technology that enables them to be read and identified securely. This is where Digital Asset and BTP have stepped in. 

In order to verify an optical signature, an immutable and cryptographically secure digital twin is required. QMC’s use case lends itself perfectly to distributed ledger technology which, essentially, is an innovative secure database architecture where multiple parties can transact using a single source of truth with mathematical certainty. 

Combining their expertise in getting DLT projects into production and generating real business value, Digital Asset and BTP partnered with QMC to provide a blockchain platform that will enable customers to verify the authenticity of their optical light signatures as simply as using a hand-held scanner or smartphone application.

To this end, Digital Asset created and open sourced Daml, a smart contract language that abstracts away the complexity of multi-party business processes on distributed ledgers. Daml is intuitive and focuses on business logic allowing developers to focus on the essential business value of an application and get to market first.

BTP works under the covers by providing its blockchain management platform, Sextant for Daml. Sextant for Daml abstracts away the operational complexity and takes care of the service delivery of the blockchain network. Sextant for Daml currently supports Daml smart contracts running on Hyperledger Sawtooth, an open source distributed ledger technology hosted by the Linux Foundation, as well as Amazon Aurora, a cloud storage engine on Amazon Web Services (AWS), with support for further ledgers coming later this year. 

Get started with Sextant for Daml today!

Release of Daml SDK 0.13.34

Daml-LF – Internal

  • Freeze Daml-LF 1.7. Summary of changes (see the Daml-LF specification for more details):
    • Add support for parametrically scaled Numeric type.
    • Drop support for Decimal in favor of Numeric.
    • Add interning of strings and names. This drastically reduces DAR file size.
    • Add support for ‘Any’ type.
    • Add support for type representation values.
  • Add immutable bintray/maven packages for handling Daml-LF archives up to version 1.7:
    • com.digitalasset.daml-lf-1.7-archive-proto: This package contains the archive protobuf definitions as they were introduced when 1.7 was frozen. These definitions can be used to read Daml-LF archives up to version 1.7.

Daml Triggers

  • Triggers must now be compiled with daml build --target 1.7 instead of

Release of Daml SDK 0.13.33


Navigator

  • Fixed regression in Navigator to properly respect the CLI option --ledger-api-inbound-message-size-max again. See issue #3301.

Daml Compiler

  • Reduce the memory footprint of the IDE and the command line tools (ca. 18% in our experiments).
  • Fix compile error caused by instantiating generic templates at Numeric n.
  • The compiler now accepts single-constructor enum types. For example data A = A or data Foo = Bar.

Daml Triggers

  • Add dedupCreate and dedupExercise helpers that will only send commands if they are not already in flight.
  • Remove the custom AbsoluteContractId type in favor of the regular ContractId type used in Daml templates.
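A sketch of the dedup helpers in a trigger rule (the Ping template and its Pong choice are hypothetical; note the use of the regular ContractId type mentioned in the second bullet):

```daml
-- Sketch: helpers that only send a command when an identical one is
-- not already in flight. `Ping` and `Pong` are hypothetical.
pingRule : Party -> ContractId Ping -> TriggerA ()
pingRule p cid = do
  dedupCreate Ping with sender = p   -- create, unless already pending
  dedupExercise cid Pong             -- exercise, unless already pending
```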


Sandbox

  • Fixed a bug in a database migration script for Sandbox on Postgres introduced in SDK 0.13.32. See issue #3284.
  • Timings of database operations are now exposed over JMX as well as via the logs.
  • Added a missing index to the SQL schema for the Postgres Ledger.

Daml Integration Kit

  • Re-add the integration kit documentation (daml-integration-kit/index) that got accidentally deleted.

Daml Stdlib

  • Add DA.TextMap.filter and DA.Next.Map.filter.
  • Add assertEq and assertNotEq to DA.Assert as synonyms for === and =/=.
  • Add DA.Foldable.mapA_, DA.Foldable.forA_, DA.Foldable.sequence_ and DA.Action.replicateA_. These functions match the behavior of corresponding functions without the underscore suffix but ignore the result, which can be more convenient and efficient.
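The additions above can be exercised in a scenario; a minimal sketch:

```daml
import DA.Assert (assertEq, (===))
import DA.Foldable (mapA_)
import qualified DA.TextMap as TextMap

testStdlib = scenario do
  let m = TextMap.fromList [("a", 1), ("b", 2), ("c", 3)]
  -- keep only the entries whose value is greater than 1
  TextMap.filter (> 1) m === TextMap.fromList [("b", 2), ("c", 3)]
  -- assertEq is a synonym for (===)
  assertEq (1 + 1) 2
  -- run an action for each element, ignoring the results
  mapA_ (\x -> assert (x > 0)) [1, 2, 3]
```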

Extractor – Experimental

  • Extractor now stores exercise events in the single table data format. See issue #3274.

JSON API – Experimental

  • workflowId no longer included in any responses.
  • /contracts/search endpoint can optionally store searched contracts in a Postgres-based cache, by passing the new --query-store-jdbc-config option. See issue #2781.

Daml SDK

  • Display release notes in the IDE when the Daml extension is upgraded.