Abstracting Away the Complexities of Blockchain and Smart Contracts

Guest post from Duncan Johnson-Watt, CEO, Blockchain Technology Partners

Implementing new technology across the enterprise is a time-consuming, complex process, and blockchain is no exception to the rule. According to Gartner’s recent “CIO Guide to Blockchain”, only 3 percent of CIOs have a form of live and operational blockchain for their business. Nevertheless, business leaders are seeking ways to use this transformative technology to automate multi-party transactions across internal and external teams and produce a single source of truth for that data.

A blockchain infrastructure management platform to accelerate adoption
With blockchain still far from global adoption across industries, organizations are finding it difficult to secure large investments for infrastructure deployment and management, and to find developers with the smart contract expertise to build multi-party applications. Additionally, many IT organizations do not have the time or resources to build a production-ready blockchain network for distributed applications. What businesses need is a management platform that simplifies the deployment and ongoing management of distributed, multi-party applications, together with a smart contract language that abstracts away the complexity of the underlying infrastructure, so that decentralized applications can be brought to market quickly.


Smart contracts across industry-best blockchain persistence layers

Digital Asset and Blockchain Technology Partners have joined forces to accelerate blockchain adoption and bring innovative applications to market faster with Sextant for DAML. This offering enables businesses to deploy DAML smart contracts across industry-best blockchain persistence layers using their DAML Driver license entitlements for Hyperledger Sawtooth and Besu, as well as Amazon QLDB and Aurora.

Sextant for DAML, a blockchain infrastructure management platform that frees up resources to focus on DAML applications

DAML is the native smart contract language that enables rapid multi-party application development and platform independence. With DAML, developers write only the business logic, while the DAML runtime and integration layer manage everything else. Sextant is the infrastructure management platform that provides one-click deployment of decentralized DAML applications across supported networks, on-premises or in the cloud, utilizing Kubernetes, enabling the broadest range of deployment options.

Learn More about DAML Drivers

A recent example of Sextant for DAML in action is the Demex Group’s new paradigm to customize and deliver climate-resilience through financial risk platforms. The Demex Group, a leading technology company that shields customers from financial surprises of volatile weather, underpins their risk provider platform with Sextant for DAML. The Demex Group can now deliver a customized transaction platform for risk providers that leverages smart contracts and blockchain to streamline multi-party workflows while facilitating more transparent transactions. This solution helps end users creatively analyze client exposure and develop bespoke financial risk solutions.

In the November 2020 press release, Ed Byrns, President and CEO of The Demex Group stated, “We selected Sextant for DAML to allow us to fully realize its benefits by providing us with a stable platform to build upon, accelerating our development and reducing our time to market.”

Sextant utilizes Kubernetes to simplify deployment and management of distributed ledgers and smart contract infrastructure

Behind the scenes, Sextant uses parameterized Helm charts to achieve the broadest range of deployment through Kubernetes orchestration. Sextant can be integrated into an enterprise’s CI/CD pipeline via its API. Alternatively, the Sextant UX renders these parameters in easy-to-use dialogs. Users can import the details of the Kubernetes clusters they are entitled to use and then select the appropriate DAML Driver to target those clusters for the actual DAML application deployment.

Sextant also maintains and tests DAML applications to ensure optimal performance. Moreover, Sextant for DAML is delivered as enterprise software rather than a centralized managed service; this enables enterprises to deploy flexibly and on their own terms.

With Sextant for DAML, businesses can instantly deploy DAML applications in production without the operational complexity. To learn more about DAML, DAML Drivers, and Sextant for DAML, visit https://blockchaintp.com/sextant/daml/ .

Community Update – December 2020

Our community recognition ceremony is open! Nominate who you think deserves to win! 

What’s New in The Ecosystem

@gyorgybalazsi shared what he, @Gyorgy_Farkas, Janice, and Dani learned from participating in Odyssey and grappling with new problems. From fishing quotas to licenses to matching reports with independent observers it’s thoroughly impressive how much was built in such a short time.

@eric_da is presenting on why open banking and open APIs are the tip of the iceberg at Open Core Summit on 12/17 at 4:05 PST / 7:05 EST.

@bartcant got his bobblehead

Odyssey and YHack are a wrap! The winning YHack team wrote a small DAML backend; it’s great to see someone pick up DAML and run with it so quickly. Odyssey rewards were announced here.

Blogs and Posts

@entzik released the 2nd part of “A Doodle in DAML”, with the clearest explanation I’ve ever seen of how a preconsuming choice works. Awesome stuff!

@anthony talked to Gints at Serokell about why DAML is a different kind of programming language, how it’s rooted in Haskell, and how its unique features make it a great option for writing smart contracts.

Videos

@andreolf recently demonstrated how to use DAML to build robust data pipelines in practice. Really great presentation.

Manish, Leve, and Francesco recently gave a presentation on DAML for beginners; check out the video recording here.

Corporate News

DAML is now available for Microsoft’s Azure Database, check it out.

Knoldus has added DAML to their Techhub, with lots of projects to check out, some by our very own forum members @Nishchal_Vashisht, @upanshu21, and @ksr30!

HKEX, the world’s second-largest exchange group by market capitalization, is now using DAML to standardize and streamline their post-trade workflows.

Demex, a climate risk insurtech, is using Sextant for DAML (on Sawtooth) to build financial risk solutions.

VMware Blockchain 1.0 is released with DAML support right out of the box! Check out the full announcement here.

Other Fun

If you didn’t know Richard has weekly updates on security and privacy news, you can check them out here.

DAML Connect 1.8

Highlights

  • The API coverage of DAML Script, the JavaScript Client Libraries, and the DAML Assistant has been improved.
  • DAML Driver for PostgreSQL Community Edition is now stable.
    • Action required unless you were already using the --implicit-party-allocation=No flag.
    • Running the Sandbox with persistence is now deprecated.

The full release notes and installation instructions for DAML Connect 1.8.0 can be found here.

Impact and Migration

There are no backwards incompatible changes to any stable components.

DAML Driver for PostgreSQL (daml-on-sql) Community Edition ledger has been downloadable as Early Access from GitHub releases since SDK 1.4.0. The option --implicit-party-allocation used to be on by default, but has now been removed. Users that were using the DAML Driver for PostgreSQL with the implicit party allocation option will need to explicitly allocate parties now.

Users running DAML Triggers (Early Access) against authenticated ledgers may now need to handle authentication errors externally.

What’s Next

The eagle-eyed reader may have noticed that some features have appeared in the “What’s Next” section for some time, and that there hasn’t been a DAML-LF release since SDK 1.0.0 and DAML-LF 1.8. This will change with one of the next releases because several features that require a new DAML-LF version are currently being finalized:

  • Choice observers (see Early Access section above)
  • Generic Maps
  • Better Exception Handling in DAML

Work also continues to move DAML Triggers and the Trigger Service (see Early Access section above) to general availability.

Lastly, the multi-party read features on the gRPC Ledger and JSON APIs will be supplemented with multi-party writes, allowing the submission of commands involving multiple parties via the APIs as long as they are all hosted on the same node.

Release of Daml Connect 1.8.0

DAML Connect 1.8.0 was released on December 16th. Because 1.8.0 is superseded by the 1.8.1 patch release (see the note below), install it using:

daml install 1.8.1

Note: Daml Connect 1.8.0 contained a bug that is now fixed and noted in the 1.8.1 release. Daml Connect 1.8.0 is no longer supported.

Want to know what’s happening in our developer community? Check out the latest massive update for this month.

Highlights

  • The API coverage of DAML Script, the JavaScript Client Libraries, and the DAML Assistant has been improved.
  • DAML Driver for PostgreSQL Community Edition is now stable.
    • Action required unless you were already using the --implicit-party-allocation=No flag.
    • Running the Sandbox with persistence is now deprecated.

Impact and Migration

The DAML compiler now enforces sequential ordering for let expressions with multiple bindings. This fixes a bug where let expressions would be evaluated in the wrong order. In rare cases it may be necessary to re-order your let expressions if they relied on forward references.

Under the old behavior, the code below relied on a forward reference and resulted in `x` being assigned the value `1`, even though `y` is bound after `x`. With the fix, this is now a compile-time error.

let x = y
    y = 1
in x

To correct this code, change the order of the bindings like so:

let y = 1
    x = y
in x

There are no other backwards incompatible changes to any stable components.

DAML Driver for PostgreSQL (daml-on-sql) Community Edition ledger has been downloadable as Early Access from GitHub releases since SDK 1.4.0. The option --implicit-party-allocation used to be on by default, but has now been removed. Users that were using the DAML Driver for PostgreSQL with the implicit party allocation option will need to explicitly allocate parties now.

Users running DAML Triggers (Early Access) against authenticated ledgers may now need to handle authentication errors externally.

What’s New

Better API coverage in Script, Libraries, and Assistant

Background

This change addresses several gaps in the API coverage of DAML Connect tooling. In particular, it adds support for the party and package management APIs to the JavaScript Client libraries, easing the development of client applications that perform administrative tasks.

Specific Changes

  • The JavaScript Client Libraries’ Ledger object, returned by useLedger, now has three new methods covering the package management API and three new methods covering the party management API (see the sketch after this list):
    • listPackages returns a list of all known packageIDs,
    • getPackage returns the binary data of the corresponding DALF, and
    • uploadDarFile takes binary data and uploads it to the ledger. Note that uploadDarFile requires admin access.
    • getParties takes one or more party IDs and returns details about the corresponding parties, or can be used to check whether a party exists,
    • listKnownParties will return a list of all known parties, and 
    • allocateParty will allocate a new party.
  • The JavaScript Client Libraries’ Ledger object now also exposes the API method createAndExercise, which creates a contract and exercises a choice on it in the same transaction.
  • daml ledger can now also be run against the JSON API instead of the gRPC API.
  • The listKnownParties function in DAML Script is now also supported when running over the JSON API.
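To make this concrete, here is a minimal TypeScript sketch of the new methods on the Ledger object. It assumes the Ledger class from @daml/ledger (the object returned by useLedger in @daml/react) and a token with admin rights; the exact constructor options and method signatures should be checked against the @daml/ledger typings.

import Ledger from '@daml/ledger';

const adminToken = 'your-admin-jwt'; // hypothetical token with admin rights

async function adminTour(): Promise<void> {
  const ledger = new Ledger({token: adminToken});
  const packageIds = await ledger.listPackages();      // all known package IDs
  const dalf = await ledger.getPackage(packageIds[0]); // binary data of one DALF
  const known = await ledger.listKnownParties();       // every party known to the ledger
  const details = await ledger.getParties(['Alice']);  // look up specific party IDs
  const carol = await ledger.allocateParty({identifierHint: 'Carol'});
  console.log(packageIds, known, details, carol);
}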

Impact and Migration

This change is fully backwards compatible.

DAML Driver for PostgreSQL Community Edition now stable

Background

DAML Driver for PostgreSQL (daml-on-sql) Community Edition ledger has been downloadable as Early Access from GitHub releases since SDK 1.4.0. With release 1.8.0 it is now stable and completes the separation of the Sandbox development and test tool from a SQL-based ledger intended for production use. Accordingly, as announced in the release notes of SDK 1.4.0, the persistence mode in Sandbox is now deprecated.

Documentation is available at https://docs.daml.com/daml-driver-for-postgresql/.

Specific Changes

  • The --implicit-party-allocation option, previously enabled by default, is no longer supported by DAML for PostgreSQL.
  • The PostgreSQL JDBC URL can now be passed in through an environment variable via --sql-backend-jdbcurl-env. For example, to instruct the DAML Driver to read the PostgreSQL JDBC URL from JDBC_URL, use --sql-backend-jdbcurl-env "JDBC_URL".
  • The feature to run Sandbox with persistence using the --sql-backend-jdbcurl flag is now deprecated.

Impact and Migration

Anyone wanting to run an open source DAML ledger on SQL is advised to migrate to DAML Driver for PostgreSQL. SQL support in Sandbox may be removed with a major version release following the usual 12-month deprecation cycle.

Implicit party allocation was previously the default mode of operation in DAML Driver for PostgreSQL. Users of DAML Driver for PostgreSQL should now allocate parties explicitly using the admin endpoints of the JSON or gRPC APIs. The combination of persistence and implicit party allocation will not be supported after the deprecation cycle; implicit party allocation will continue to be supported in Sandbox. 
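With implicit allocation gone, parties can be allocated explicitly through the party management endpoints. As a minimal sketch in TypeScript, assuming a JSON API instance on localhost:7575 and a token with admin rights (all values here are hypothetical):

async function allocateAlice(adminToken: string): Promise<void> {
  // POST /v1/parties/allocate creates a new party on the ledger.
  const resp = await fetch('http://localhost:7575/v1/parties/allocate', {
    method: 'POST',
    headers: {
      'Authorization': `Bearer ${adminToken}`,
      'Content-Type': 'application/json',
    },
    // identifierHint asks the ledger to use this name if possible.
    body: JSON.stringify({identifierHint: 'Alice', displayName: 'Alice'}),
  });
  const {result} = await resp.json(); // the allocated party's details
  console.log(result);
}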

Minor Improvements

  • The JSON API’s /v1/fetch endpoint now uses the Postgres database, if configured, to look up contracts by ID or key, except when querying a contract by ID without its corresponding template ID. The fallback in-memory version of /v1/fetch is also significantly more efficient for large datasets, though still linear.

    You may optionally re-create the JSON API database to take full advantage of this change. See issue #7993.
  • The Navigator’s password option in the config file is now deprecated. Note that the option has not had an effect since SDK 1.0.0.
  • If no parties are in the Navigator config or daml.yaml, Navigator will now pick up parties from the party management service. Those parties are periodically refreshed.
  • The JavaScript Client Libraries will now log warnings received from the JSON API.
  • The SDK Docker image is now signed. Note that this is a dev-only image, not intended for production use.

    To verify the signature, use the docker trust inspect command. You can also set the DOCKER_CONTENT_TRUST environment variable to 1 to instruct Docker commands to only pull and run signed images. Keep in mind, however, that this only checks that there is a signature, not that the signer is who you expect it to be. For optimal security, you should manually check the signature once with docker trust inspect --pretty and then pin the image hash rather than relying on tags.

    The expected output of the docker trust inspect command should mention a signer named automation with a public key ID matching 533a6e09faa512f974f217668580da1ceb6aa5b00aad34ea1240afc7d249703f (note that the --pretty output only shows the first 12 chars) and a repository key matching f5dc2aee6aed2d05d7eda75db7aa2b3fac7fc67afbb880d03535d5a5295a0d3b.
  • The gRPC Ledger API proto definitions and documentation pages now document the error codes endpoints may return.
  • DAML-LF protobuf definitions are now published to Maven Central in JARs. The artifact names follow the format “<name>_proto_jar” under group “com.daml”.
  • The *.proto files containing the gRPC definitions are now provided by a new Maven Central artifact, with the group “com.daml” and the artifact name “ledger-api-proto”.

Early Access Features

  • Support choice observers in DAML-LF 1.dev. DAML-LF 1.dev can be activated by adding the build option --target=1.dev to your daml.yaml, but note that packages compiled to DAML-LF 1.dev cannot be deployed to production ledgers.

    Choice observers, documented on the reference page for choices, add an additional keyword observer to the choice … syntax. Parties designated choice observers using that keyword are guaranteed to see the exercise of the choice. This can be useful for defining custom events, for example.

    nonconsuming choice MyEvent : ()
      with
        message: Text
        sender: Party
        receivers: [Party]
      observer receivers
      controller sender
      do
        return ()


    Here the parties in receivers will all see an exercise node of type MyEvent with a payload containing message.
  • Trigger Service
    • Fixed a bug where complex models resulted in a fatal error when restoring the state from the database due to an incorrect protobuf recursion limit.
    • The application id used by a trigger can now be configured by an optional applicationId in the start request.
    • The trigger status endpoint /v1/triggers/:id now includes metadata about the trigger like the party and the trigger id. The logs field has been replaced by a status field.
    • Endpoints have been rearranged to be more consistent:
      • GET /v1/triggers (was /v1/list): list triggers
      • POST /v1/triggers (was /v1/start): start a trigger
      • GET /v1/triggers/:id (was /v1/status/:id): trigger status
      • DELETE /v1/triggers/:id (unchanged): stop or delete a trigger
      • POST /v1/packages (was /v1/upload_dar): upload a DAR
      • GET /livez (was /v1/health): liveness check
    • The trigger service now has a --port-file option matching the corresponding option in the JSON API.
    • The trigger service now accepts multiple --dar options.
  • Triggers
    • The Daml.Trigger module now re-exports Event which avoids having to import Daml.Trigger.LowLevel for implementing a non-trivial updateState function.
    • UNAUTHENTICATED errors will now terminate the trigger. These errors are no longer available for handling in the trigger DAML code. Instead, they are forwarded to the trigger service for handling, e.g., access token refresh.
  • Command submission requests to the gRPC Ledger API now accept some optional fields for use in the upcoming multi-party submissions feature. Such submissions currently return UNIMPLEMENTED errors, but they will be enabled in the future.

Bugfixes

  • The DAML compiler now enforces sequential ordering for let expressions with multiple bindings. This fixes a bug where let expressions would be evaluated in the wrong order.
  • Fixed a bug where trace statements from a failing transaction were not displayed in DAML Studio.
  • Fixed a regression in the HTTP JSON API introduced in SDK 1.7.0, where using a party multiple times in the same JWT token (e.g., readAs and actAs) broke database queries for that party. Note that there is never a reason to include a party multiple times since actAs implies readAs.

Integration Kit

  • When using a PostgreSQL-based index, native parallel unnesting is leveraged to index new transactions more efficiently.
  • The kvutils Protobuf definition is now published to Maven Central in a JAR, under the group “com.daml”, with the artifact name “participant-state-kvutils-proto”.
  • The Scala JARs containing the gRPC definitions no longer contain the *.proto files used to generate the ScalaPB-based classes.
  • New CLI option --cert-revocation-checking for enabling TLS certificate revocation checking in the LedgerApiServer.
  • kvutils reports LookupByKey node mismatches during validation as Inconsistent instead of Disputed if they can be due to contention on the contract key.
  • Bugfix: daml.index.db.store_transaction metrics were keeping track of empty insertions, skewing down the numbers.
  • Performance: the number of traversals needed to index a transaction has been minimized, making indexing more efficient.
  • The integrity checker now validates the contents of each write set, regardless of whether it is used subsequently.
  • Pipelining in the indexing process improves throughput by up to 15%.
  • Corrected some remaining package name references to com.digitalasset.platform in the logback and readme files.
  • Fixed a bug resulting in the error message “Dispatcher is closed” on participant shutdown (#7986).
  • The preview of ParticipantPruningService enables ledger participants to prune the “front” of ledger state at the participant, including the ledger API server index.

What’s Next

The eagle-eyed reader may have noticed that some features have appeared in the “What’s Next” section for some time, and that there hasn’t been a DAML-LF release since SDK 1.0.0 and DAML-LF 1.8. This will change with one of the next releases because several features that require a new DAML-LF version are currently being finalized:

  • Choice observers (see Early Access section above)
  • Generic Maps
  • Better Exception Handling in DAML

Work also continues to move DAML Triggers and the Trigger Service (see Early Access section above) to general availability.

Lastly, the multi-party read features on the gRPC Ledger and JSON APIs will be supplemented with multi-party writes, allowing the submission of commands involving multiple parties via the APIs as long as they are all hosted on the same node.

Unlock the Power of Your Existing Infrastructure

This blog is the second of a three-part series focusing on the power of DAML and distributed ledger technology.

In “Unlock the Power of Developer Productivity”, we highlighted how smart contract adoption is hindered by lock-in at the infrastructure level: developers are tasked with writing decentralized applications against new technology that may change as IT investments evolve. Porting smart contracts to a new ledger usually means a full code rewrite, learning the details of a new system, and a non-trivial resource investment to perform the migration. DAML smart contracts insulate developers from the underlying infrastructure complexities, enabling them to focus only on the code that creates business value. This abstraction has the added benefit of providing seamless application portability: with DAML applications, when the underlying ledger changes, the application doesn’t have to. DAML future-proofs your application through our write once, deploy anywhere model.

Businesses need ways to leverage existing infrastructure
While most businesses are building multi-party applications for distributed ledger platforms, many wish they could leverage their existing infrastructure to automate internal processes and unlock the power of multi-party workflows without having to dive headfirst into distributed ledger technology. Unfortunately, traditional software frameworks are not fit for purpose, since they do not support multi-party concepts such as rights and obligations, which are found in advanced smart contract frameworks. Implementing multi-party workflows manually with traditional tools not only adds a layer of risk to the final application but also forces an application rewrite when distributed ledger technology is ultimately adopted by the organization.

To unlock the benefits of multi-party workflows today, businesses need a smart contract language and ecosystem that supports both traditional database and distributed ledger technology so that your applications stay relevant even as the underlying technology continues to evolve.

Learn More about Ledgers Unlocked

Augment your existing system with DAML smart contracts

With DAML for PostgreSQL, users get DAML’s smart contract language, runtime, and ecosystem, which provide everything necessary to build and run production-grade multi-party applications on top of traditional database platforms. The DAML Driver for PostgreSQL allows applications built with the DAML smart contract framework to run on any PostgreSQL-compliant database, including Amazon Aurora and RDS, Google Cloud SQL, Microsoft AzureDB, and more. DAML for PostgreSQL provides an easy and cost-effective entry point into multi-party applications for use cases that do not require a fully distributed ledger on day one. You can now automate interdepartmental workflows, augment your existing systems, and simplify complex business processes with the power of DAML workflows on top of the database infrastructure you are already running today.

The same DAML application you deployed on DAML for PostgreSQL can also be migrated to any DAML enabled DLT platform, such as Corda or VMware Blockchain, without rewriting the application or redoing the integrations with external systems. When external counterparties want to store a copy of the data they are entitled to, you can redeploy your DAML application onto a distributed ledger and enable your users to run their own DLT node – all without rewriting your application. All DAML-driven applications are ledger agnostic, so you can take advantage of the latest DLT platforms and features automatically without having to make changes to your DAML application. That’s what we mean by write once, deploy everywhere!

As you start to build new solutions for multi-party applications, only DAML future proofs the application from an evolving ledger landscape while also supporting today’s technology. Be sure to check out the Ledgers Unlocked Program, Digital Asset’s solution for businesses seeking to build applications across various stacks without license restrictions, and learn how you can leverage DAML Drivers for Corda, VMware Blockchain, and PostgreSQL without experiencing ledger lock-in.

Secure DAML Infrastructure – Part 2 – JWT, JWKS and Auth0

In Part 1 of this blog, we described how to set up a PKI infrastructure and configure the DAML Ledger Server to use secure TLS connections and mutual authentication. This protects data in transit and only authorised clients can connect. 

An application will need to issue DAML commands over the secure connection and retrieve the subset of contract data that it is authorised to see. To enable this, the Ledger Server uses HTTP security headers (specifically “Authorization” Bearer tokens) to receive an authorization token from the application that describes what it is authorised to do. 

The user or application is expected to authenticate against an Identity Provider and in return receive an authorization token for the Ledger. This is presented on every API call.

What are JWT & JWKS?

JSON Web Tokens (JWT) are an industry-standard way to transmit data between two parties. Full details on JWT can be found in the JWT: Introduction and the associated JWT Handbook. Here we will provide a summary of the specification and how the Ledger Server uses custom claims to define the allowed actions of an application.

A JWT is a JSON-formatted structure with a HEADER, a PAYLOAD and a SIGNATURE. The header defines the algorithm used to process the payload, in particular the algorithm used to sign (or encrypt) it. The payload contains the details of the authorization given to the application, and the signature covers the other two sections to ensure they have not been tampered with in transit. Each section is base64-encoded, with a dot separator between sections, and the result is placed in the Authorization HTTP header of each HTTP request.
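As an illustration, the sections can be split and decoded in a few lines; this sketch assumes a Node.js environment (whose Buffer supports the base64url encoding used by JWTs):

// Decode a JWT's header and payload; the signature is verified separately.
function decodeJwt(token: string): {header: unknown; payload: unknown} {
  const [header, payload] = token
    .split('.')
    .slice(0, 2)
    .map((part) => JSON.parse(Buffer.from(part, 'base64url').toString('utf8')));
  return {header, payload};
}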

An end-user or application obtains the token by first authenticating to an Identity Provider and being issued an access token. The OAuth protocol defines several flows to achieve this, including one for web users (a three-way handshake that also asks the human end-user for consent) and one for applications (the two-step Client Credentials Flow, which uses a client_id and client_secret for machine accounts). The Identity Provider validates the provided credentials and issues a signed token for the requested service or API.
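For the machine-account case, obtaining a token is a single HTTP call. A sketch of the Client Credentials Flow against an Auth0 tenant (the tenant, credentials and audience values below are hypothetical):

async function getAccessToken(): Promise<string> {
  const resp = await fetch('https://your-tenant.auth0.com/oauth/token', {
    method: 'POST',
    headers: {'Content-Type': 'application/json'},
    body: JSON.stringify({
      grant_type: 'client_credentials',
      client_id: 'your-client-id',             // machine account credentials
      client_secret: 'your-client-secret',
      audience: 'https://daml.com/ledger-api', // must match the API identifier
    }),
  });
  const {access_token} = await resp.json();
  return access_token;
}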

So how does the Ledger Server get the public key of the signer so it can validate the signature and trust the token? This is where JSON Web Key Sets (JWKS) come in. Each Identity Provider publishes a well-known URL, and we configure the Ledger Server to query it to retrieve the JWKS structure. This contains the public key of the signer and some additional metadata.

In the previous blog post you may have noticed a parameter to the Ledger Server as follows:
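The parameter points the server at the identity provider’s JWKS URL; it takes roughly the following form (the tenant name is illustrative, and the exact flag should be checked against your Ledger Server version):

--auth-jwt-rs256-jwks="https://your-tenant.auth0.com/.well-known/jwks.json"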

This tells the Ledger Server to trust tokens generated by, in this case, a specific Auth0 tenant and to use the URL to fetch the Auth0 JWKS. It also enforces token signing with RS256, i.e. RSA keys with a SHA-256 hash function.

JWT can also use other algorithms, including Elliptic Curve (ES256: the EC P-256 cipher with SHA-256) and shared secret (HS256). We do not recommend using HS256 for anything more than development and testing, as it is open to brute-force attacks on the shared secret.

In-depth JWT Example

To give some more detail, an authenticated application submits a Ledger API command over HTTPS (gRPC or JSON) and provides a security header of the form Authorization: Bearer <token>.

The token string is of the format:

HEADER.PAYLOAD.SIGNATURE

If we separate out the sections of the JWT token, you would get:
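Purely as an illustration (the payload and signature below are placeholders, not real values), the token has the shape:

eyJhbGciOiJSUzI1NiIsInR5cCI6IkpXVCJ9.<base64url-encoded payload>.<base64url-encoded signature>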

This is the header, payload and signature encoded as base64 with each section separated by a dot. This is normally a single string.

After decoding (using the provided script ./decode-jwt.sh <filename>) or via the JWT Debugger (https://jwt.io/), the JWT becomes the Header and Payload portions as follows.
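For example, a decoded header and payload might look like this (all values are illustrative, not taken from the sample repo):

{
  "alg": "RS256",
  "typ": "JWT",
  "kid": "my-signing-key-1"
}

{
  "iss": "https://your-tenant.auth0.com/",
  "sub": "auth0|user-id",
  "aud": "https://daml.com/ledger-api",
  "iat": 1608048000,
  "exp": 1608134400,
  "azp": "your-client-id",
  "gty": "client-credentials",
  "https://daml.com/ledger-api": {
    "ledgerId": "my-ledger",
    "applicationId": "ex-secure-daml-infra",
    "admin": false,
    "actAs": ["Alice"],
    "readAs": ["Alice"]
  }
}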

These represent the header and payload sections. What does this tell us? 

The header shows that the JWT was signed using RS256 and which specific key was used (the kid value). An identity provider may have multiple signing keys described in the JWKS, and this selects which one to use to verify the JWT.

The payload contains many standard attributes (the three-letter combinations) and one custom claim (“https://daml.com/ledger-api”). The standard attributes include:

  • alg: algorithm used for signing, here RS256 [checked by API]
  • aud: audience for the token
  • azp: authorized party
  • exp: expiry time in epoch seconds [checked by API]
  • gty: grant type
  • iat: issued-at time in epoch seconds
  • iss: issuer
  • sub: subject (i.e. the account name)

The custom claim (“https://daml.com/ledger-api”) details a variety of capabilities for this application:

  • admin: whether the application is allowed to call administrative API functions (true/false)
  • actAs: an array of ledger party IDs that the application is allowed to submit DAML commands as
  • readAs: an array of ledger party IDs that the application is allowed to read contracts for
  • applicationId: a unique ID for the application; if set, the Ledger Server validates that submitted commands also carry this application ID
  • ledgerId: the ID of the ledger that the application is trying to connect to

The authorizing Identity Management provider is expected to set these to appropriate values for the application that is requesting access.

Full details of the API and the exposed service endpoints are available in the DAML Documentation. Details of the API and associated permissions are summarised in the Core Concepts section of the sample repo:

https://github.com/digital-asset/ex-secure-daml-infra/blob/master/Documentation/CoreConcepts.md

Public services are available to any application connecting to a ledger (mutual TLS may restrict who can connect, but a valid token, minimally one with no admin rights and no parties, is still required). Administrative services are expected to be used by specific applications or operational tooling. The remaining Contracts, Command and Transaction services are restricted to the set of parties the application is authorised for.

JWKS (JSON Web Key Sets)

The final piece of the puzzle is JSON Web Key Sets (JWKS) which an identity provider exposes to distribute its public key to allow signature verification. 

An example JWKS format is:
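A representative structure, with the key material replaced by placeholders, looks like this:

{
  "keys": [
    {
      "alg": "RS256",
      "kty": "RSA",
      "use": "sig",
      "kid": "<key fingerprint>",
      "x5t": "<key fingerprint>",
      "x5c": ["<base64 DER certificate>"],
      "n": "<RSA modulus>",
      "e": "AQAB"
    }
  ]
}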

The details for each key include the algorithm being used (RS256), the key type (RSA), the use (signatures), the key ID (the kid value, for which Auth0 uses the key fingerprint x5t), the public key certificate (x5c) and some RSA key parameters (the n and e fields; other fields appear for EC keys). JWKS also supports the distribution of private keys via additional fields, but that is not used here.

A receiving service (in this case the Ledger Server API) will use this to validate the signed JWT to validate that it was issued by the trusted provider and is unaltered.

Using an example Identity Provider – Auth0

Now that we have described JWT and JWKS, how do we use these standards? The following builds on the previous post, Easy authentication for your distributed app with DAML and Auth0, which focused on end-user authentication and authorisation. You may want to read that first.

In the reference sample, we provide two options:

  • Authenticating using Auth0 for end-users and service accounts
  • Authenticating services via local JWT provider for CI/CD automation

Auth0

The full detailed steps and scripts are described in the reference documentation. To use Auth0 you will need to do the following:

  1. Create an Auth0 tenant – a free trial tenant is usable for this sample
  2. Create an Auth0 API to represent the Ledger Server API. This is the target for users and services to access Ledger information via the API
    1. Create New API
    2. Provide a name (ex-secure-daml-infra)
    3. Provide an Identifier (https://daml.com/ledger-api)
    4. Select Signing Algorithm of RS256
  3. Create an Auth0 “web application” for end-user authentication and access. This uses a single page application (SPA) with React, to create a sample authenticated page that displays the logged-in user details and accesses the current contract set via the API.
    1. Create new Application
    2. Select Single Page Application
    3. Select React
    4. In App Settings:
      1. Set Allowed CallBack URLS, Allowed Logout URLS, Allowed Web Origins:
        1. http://localhost:3000, https://web.acme.com
  4. Configure two Auth0 “rules”, a programmatic way in Auth0 to add custom claims to tokens generated for user authentication requests (a hedged sketch of the second rule follows below)

Rule: “Onboard user to ledger”

Rule: “Add Ledger API Claims”
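The rule bodies themselves are not reproduced here. Purely as a hedged illustration of the second rule, an Auth0 rule that copies the party identifier from the user’s app_metadata into the custom claim might look like this (the actual rules in the sample repo may differ):

// Illustrative Auth0 rule, not the exact rule from the sample repo.
function addLedgerApiClaims(user, context, callback) {
  const namespace = 'https://daml.com/ledger-api';
  const party = user.app_metadata && user.app_metadata.partyIdentifier;
  // Attach the custom claim to the access token issued for this request.
  context.accessToken[namespace] = {
    admin: false,
    actAs: party ? [party] : [],
    readAs: party ? [party] : [],
  };
  return callback(null, user, context);
}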

  5. Set up and configure end-users and define login credentials and metadata about their Ledger ID. One of the provided Rules allows metadata to be configured on first access of the user. The DAML Sandbox auto-registers new Parties on first use but production ledgers, particularly on DLT platforms, may require more complex provisioning flows.
    1. Create a New User
    2. Enter Email and your preferred (strong) passphrase
    3. If using local Username / Password database, set connection to Username-Password-Authentication.
    4. In the app_metadata section of the User, add the following template. You will need to adjust it per user so that partyIdentifier matches the name of the user in the Ledger, i.e. “Alice”, “Bob”, “George”

User Metadata
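A minimal template might look like the following; partyIdentifier is the field the rules above read, and any other fields expected by the sample repo should be taken from its documentation:

{
  "partyIdentifier": "Alice"
}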

  6. Define services (machine-to-machine “applications” in Auth0 terminology) and some associated metadata for each service, namely which parties they can act or read on behalf of. These are linked to the above API. Each m2m application defines client_id and client_secret credentials for the service
  7. Configure an Auth0 “hook”; this is the equivalent of the end-user rules above, but configures a way to define custom claims for services

Once this is in place, you can then update the following:

  • env.sh
    • Add the Auth0 tenant details and each of the service account credential pairs. In a production setting you would use some form of credential vault (e.g. HashiCorp Vault, or the AWS or GCP KMS services) to store and pass these to the respective services.
  • Update the ./ui/src/auth_config.json to point to the correct Auth0 tenant

The Auth0 environment is now ready for use.

Local JWT Provider

Since depending on third-party services is complicated in automated testing environments, we implemented a sample JWT provider that uses code signing certificates issued from the local PKI.

In particular, you can set an option in the env.sh script to require the environment to use a local signer. In this model the following is used:

  • A signing certificate is issued from the Intermediate CA
  • A small Python program then issues JWT tokens for each service with the respective custom claims for access. These are placed in the <pwd>/certs/jwt directory
  • A simple Python web server is provided that exposes a local JWKS endpoint with the code signing certificate reformatted to JWKS format. It also acts as a simple authentication provider for clients, which use the OAuth two-step Client Credentials flow to obtain a token

The steps are in the following scripts:

  • ./make-jwt.sh
  • ./run-auth-service.sh

The tokens are issued with an expiry of one day.
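Purely as an illustration of what the Python signer does, issuing an equivalent token in TypeScript with the jsonwebtoken package might look like this (paths, key IDs and claim values are hypothetical):

import * as fs from 'fs';
import * as jwt from 'jsonwebtoken';

const privateKey = fs.readFileSync('./certs/jwt/signing-key.pem'); // hypothetical path

// Sign a token carrying the custom Ledger API claim, valid for one day.
const token = jwt.sign(
  {
    'https://daml.com/ledger-api': {
      ledgerId: 'my-ledger',
      applicationId: 'ex-secure-daml-infra',
      admin: false,
      actAs: ['Alice'],
      readAs: ['Alice'],
    },
  },
  privateKey,
  {algorithm: 'RS256', expiresIn: '1d', issuer: 'local-jwt-provider', keyid: 'local-signer-1'},
);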

Summary and Next Steps

In this post we reviewed the JWT and JWKS Standards to allow an application to request an authorization token from an Identity Provider and submit with Commands to a Ledger. We showed how to use a sample identity provider (in this case Auth0) to allow end-user and service account authentication and get appropriate authorization tokens.

The next step is to run the sample environment and execute some tests against it. This is the topic for the final part of this series. If you want to see the first part, on “PKI and certificates”, please check here:

Read the first part on PKI and certificates