How to Start a Startup

Our inspiration for building project: DABL came from wanting to help founders and small teams make a difference with as few technical distractions as possible. To my surprise, we were asked far more often about what it means to start a healthy company than about how to ship good code. 

I was always proud to try to help, but I worried that I was presenting a narrow, biased point of view — my own!  

When we started Digital Asset, I didn’t have a wealth of experience or knowledge to help guide my decision-making — especially in the early days. There’s no shortage of great books and posts from others about how to think about starting a startup, but I found that they often fell short. Too often, they were told through the rose-tinted nostalgia of success and distance from the early days. 

Where were all the stories of the chaos from the first 100 days of Facebook? The struggle to organize the first 50 engineers at Google? How did all these behemoths get their very first customers? And where are all the honest retros about strong contenders crashing and burning?

I craved an honest review — a raw and uncut view into the reality of the early days. Were our expectations within DA realistic? Woefully incomplete? Looking back on the past (almost) six years, we have learned a lot — more than I could ever have imagined — and I have really enjoyed returning to some of these thornier issues from the beginning of the company.

This short series is a humble attempt to present a range of views on some of the things you need to consider when starting a new team, product, or initiative. My hope is that any entrepreneur can read past the jargon and consider how these points of view apply to starting any new business. 

Since mine is a single, opinionated point of view, I wanted to give the reader perspectives well beyond my own. In this series, we will share the views of a range of founders who are still in the midst of their own journeys. 

In coming weeks, we’ll wade into these high-level topics: 

  • What to consider before you even get started
  • How to pick co-founders
  • Acquiring early funding
  • How to think about the first hires
  • Thinking about life beyond the early startup phase

Each post will comprise a short set of questions that explore a specific topic. Our participants will offer their own views and we encourage you to add your voice to the discussion. 

Do you have your own questions for Digital Asset’s panel of entrepreneurs? Please write to us at

Daml also has a new learn section where you can begin to code online or try DABL here:

Learn Daml online

Try DABL for free

If you are also interested in Daml smart contracts, you should read:

Programming Smart contracts – A look into Daml, Kotlin & Java

The terms smart contract and blockchain have taken the world by storm as a means to transform how we collaborate and share information. Not having to constantly reconcile with each other to determine which copy of the data is true, and allowing all participating entities to share a single version of a business process, are important benefits for any digital transformation strategy.

In this blog, I’ll reflect upon some differences between two popular paradigms used to create smart contract applications that can run on a blockchain-based persistence layer.

The first of these paradigms is using domain-specific languages (the domain being smart contracts and multi-party applications). For the purposes of this blog, we consider Daml, an open source smart contracts language created by Digital Asset. Through the Daml runtime – which manages the integration with the blockchain platform and exposes standard application APIs – Daml smart contracts can be deployed to multiple platforms without any changes. Daml is built on a modified version of the Haskell GHC compiler. While its core language constructs are similar to Haskell’s, Daml adds several important constructs relevant to smart contracts, such as templates, rights and obligations, and privacy checks.

The second paradigm is using general-purpose languages. For the purposes of this blog, these will be Kotlin and Java, primarily because they are supported by leading blockchain platforms such as Corda. Kotlin is a full-fledged modern programming language developed by JetBrains, the company behind the IntelliJ IDE. Kotlin can compile to different target environments such as the JVM and JavaScript. It has some differences from Java and cannot be thought of as synonymous with it, but in most cases it behaves similarly enough that Java developers can pick it up without much difficulty.

The framework

It is not my intention to pit one paradigm or language against another. Rather, I simply provide an analysis of the pros and cons of both approaches and their place in a typical enterprise technology strategy. Other developers and experts may weigh these parameters differently, or even consider new ones. Overall, my goal is to enable an educated choice of tools as you embark on this exciting transformation journey for your organization.

Here are the key parameters I will use for this blog:

  1. Where does each language fit in the tech stack?
  2. Developer experience and time to business validation
  3. Portability
  4. Interoperability across networks

This is a vast topic, but I think that these parameters will be useful to developers and business leaders without leading them down a rabbit hole.

Where does each language fit in the technical stack?

This first question also serves to ground the analysis. For simplicity, let’s consider the traditional 3-tiered business application architecture that we are all familiar with. The top layer is the user interface, the middle layer is the business logic + orchestration layer, and the bottom layer is the data layer (in our case the underlying ledger).

Our focus for this blog is on the middle layer.

The traditional 3-tiered business application architecture: interaction, orchestration, smart contract workflow and data layer

Kotlin and Java are general-purpose languages, so they can be used to write a generic middle orchestration layer that connects multiple services together using APIs or other means (e.g. ERP integration, banking system integration, API gateways). The underlying blockchain platform can also provide libraries so that these general-purpose languages can be used to write the smart contracts layer that interacts as needed with the underlying ledger.

Daml, on the other hand, is a pure smart contracts layer. Being a domain specific language (the domain being smart contracts), it abstracts common implementation requirements for ledgers, intrinsically ensures multi-party rights and obligations, offers privacy guarantees, and provides connectors to existing implementations like Corda. The Daml runtime automatically exposes APIs that can be used by the application layer. However the Daml smart contracts layer also must be complemented by an application layer that can perform the business process orchestration and interfacing with non-Daml applications. For example, shipping data may need to be sent to an ERP, or IoT signals may need to be aggregated and cleaned before being stored on the ledger. That application layer can be written in Kotlin or Java, for example.

It must be noted that we can pursue a different architecture for the middle layer, namely an event-driven architecture as outlined in this blog by Eric Saraniecki. In that architecture, for every smart contract persisted to the underlying ledger, events can be bubbled up to the orchestration layer (as automations) to invoke other services and perform other housekeeping tasks. That doesn’t change the thrust of our analysis, though: Daml must still be paired with a general-purpose language, such as Kotlin or Java, to create these automations in the orchestration layer.

By being domain-specific, Daml enables developers and business users to quickly write a comprehensive business process that is self-contained. The orchestration layer can then be integrated with the APIs that the Daml layer exposes. Daml code (not unlike that of other DSLs such as Solidity) can be several times shorter and much simpler, allowing you to focus on the business use case.

Further, given this smart contracts foundation provided by Daml, Kotlin and Java developers can use their existing skills to create an enterprise scale orchestration layer. There is a clean separation of business rules and the complexity of the codebase can be reduced with this approach leading to a simplified enterprise architecture overall.

Developer experience and time to validation of business case

One advantage of Daml that impressed me is the speed with which I could convert a business hypothesis into a working prototype ready to be tested and validated by business users. Since it is a domain-specific language, I didn’t have to deal with cryptography, checking authorization, or even checking for a valid state of the data before performing a new operation. I could focus from the start on the business problem at hand. If I wanted a similar outcome using Kotlin or Java, I could use an existing technology like Corda by R3, but some of the comforts that Daml provides are not included. These comforts include being able to spin up a GUI and ledger environment with a single command, which is valuable when iterating on a prototype with your business users.

In contrast, with a general-purpose language like Kotlin or Java, I see two scenarios: either you develop your own ledger technology, or you extend an existing ledger that offers Kotlin/Java libraries or an API. If you develop your own ledger, Daml is not relevant to building the ledger itself, but it can still be used to develop the smart contracts layer on top of it. If you build on existing ledger technology, you can use Kotlin/Java and integrate with the provided ledger framework to make things easier. Even so, you still need to set up the entire scaffolding before you can get to the business problem you are trying to solve. Most Java and Kotlin developers can probably pick up the necessary skills, but it can be quite complex.

To help understand this, let’s take an example. Here is how you would create a commercial paper in Daml and give it to a new owner.
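A minimal sketch of what such a template might look like (the module, template, and field names here are illustrative, not the exact listing from the original post):

```daml
module CommercialPaper where

-- Illustrative sketch: a commercial paper issued by `issuer`,
-- held by `owner`, which the owner can hand to a new owner.
template CommercialPaper
  with
    issuer : Party
    owner  : Party
    face   : Decimal
  where
    signatory issuer
    observer owner

    controller owner can
      -- Transfers the paper: archives this contract and creates a
      -- new one with the new owner. Only `owner` may exercise this.
      Give : ContractId CommercialPaper
        with newOwner : Party
        do create this with owner = newOwner
```

The `signatory` and `controller` declarations are where Daml encodes rights and obligations directly, instead of hand-written authorization checks.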

To interact and demonstrate this ledger description it would be just a matter of running `daml start`, which exposes the Navigator and allows you to simulate ledger transactions – create a commercial paper, and then “Give” it to a new owner.

To do this in Kotlin or Java, we will use an existing framework like Corda by R3 which, like Daml, solves a lot of these problems but is written in a general-purpose language (Kotlin) that you can extend and implement with your own behaviors. This avoids introducing needless complexity of having to interact with the ledger yourself, especially when it doesn’t directly address your business use case.

In this paradigm of using a general-purpose language with provided platform libraries, the Kotlin developer (Java is similar) must also take into account additional constructs required by the underlying ledger, so the code is significantly more verbose. For example, for the Corda blockchain, a Kotlin developer must create three different classes to represent the commercial paper above:

  1. Contract class (immutable business rules).
  2. Flow class (logic and mutation of data).
  3. State class (the data on the ledger).

This also has the unintended initial drawback of making the above Kotlin or Java programs specific to the ledger for which they are written (in this case Corda). If you know you want to target Corda, this may not be a problem, of course. Equally, it is wise, especially during the initial development phase when requirements can change, to architect your solution to reduce coupling to the ledger technology and allow for more flexibility.

In contrast, the Daml runtime abstracts the ledger semantics from the developer, allowing for simple code that is also ledger independent (more on this in the next section). Being a smart contracts language, Daml also provides built-in constructs for privacy, authorization, and rights and obligations. For example, only the “owner” in the code above can “Give” the commercial paper to a new owner; the “issuer” is not permitted to do that unless they are also the owner. Daml also automatically ensures that a party cannot be put in an obligable position without their consent. All of this can also be done with Kotlin and Java, but developers will need to rely on ledger-specific libraries to validate their business processes. These implementations are naturally opinionated, like any framework, but that tradeoff for initial productivity is important to consider and evaluate based on your use case.
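These authorization rules can be exercised directly in a test. The sketch below uses Daml Script and assumes a CommercialPaper template whose Give choice is controlled by the owner, as described in the text (all names are illustrative):

```daml
module CommercialPaperTest where

import Daml.Script
import CommercialPaper

test : Script ()
test = do
  issuer <- allocateParty "Issuer"
  owner  <- allocateParty "Owner"
  alice  <- allocateParty "Alice"

  -- The issuer creates the paper (the issuer is the signatory).
  cp <- submit issuer do
    createCmd CommercialPaper with issuer; owner; face = 100.0

  -- The issuer is not the owner, so this attempt must be rejected.
  submitMustFail issuer do
    exerciseCmd cp Give with newOwner = alice

  -- The owner, however, may transfer the paper freely.
  submit owner do
    exerciseCmd cp Give with newOwner = alice
  pure ()
```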

If the underlying ledger has been determined beyond question, then expert developers should not have a problem with either of the above approaches. Daml does allow you to quickly prototype and validate your business use case if you are already an expert.

The abstraction of ledger level complexity in the Daml architecture leads to a more streamlined developer experience in my view. For example, you can immediately test the above Daml code by simulating multiple parties in a sandbox ledger by using the built-in Navigator tool. Developers and business owners can sit together to review the business process that has been developed via a GUI rather than a command line. This level of collaboration is made possible by Daml’s syntax and focus on the business process itself.


Portability

Portability is the ability of your programs to run on multiple platforms without any changes. I consider portability from two perspectives:

  1. First, it is an advantage for business decision making. For example, given an identified business opportunity, selection of the underlying ledger has traditionally had to happen at the beginning of a DLT project so that the application can be developed accordingly. Valuable time is lost even before the software can be developed and the business opportunity validated. Innovation has thus far needed to overcome a high level of inertia.
  2. Second, the notion of synchronizing business processes, minimizing reconciliation, and enabling a single golden source of truth is an enterprise problem that we have battled for years. Bringing this smart contracts paradigm inside an enterprise, in addition to between enterprises, will yield significant benefits. If the smart contracts language can also run on traditional databases – deploying a blockchain within an enterprise may run up against performance and complexity considerations – then businesses can use existing skills and open up opportunities to further improve their business functions. The underlying database or blockchain deployment option can be kept open.

In both these areas Daml has made great strides by abstracting away ledger level details into the Daml runtime. Your applications can be deployed on any Daml enabled ledger (e.g. Corda, VMware Blockchain, Hyperledger Sawtooth, Besu, and Fabric, etc.) and also traditional databases (e.g. AWS Aurora, PostgreSQL, AWS QLDB etc.) without any changes.

In contrast, programs developed in Kotlin or Java must be significantly revised to execute on a newly selected platform, and only if that ledger integration exists. Typically this architecture looks like an API that sits on top of the ledger, with your other services consuming that API. If the API is designed to abstract most blockchain behaviors, changes to other services should be minimal, if any. Remember that because your integration is general-purpose, you have a large degree of control over, and responsibility for, this API. There is an argument to be made for “native” programs in some cases, which is brought out nicely by Richard Brown, CTO of R3, in the interchange section of his blog titled “Whipping Up Your Market With The Five Ingredients Of Interoperability”.

In addition, because of Daml’s portability, the ledger selection step at the beginning of a project can be deferred, so the entire focus is dedicated to making the business process come alive. Once the process has been prototyped and discussed, the appropriate ledger can be determined (e.g. R3 Corda). Since Daml allows for rapid prototyping and validation of business and technology needs, at this stage you can choose to continue with Daml or move to another technology. 


Interoperability across networks

In the blog by Richard Brown that I mentioned above, he draws a nice analogy of using FaceTime between Apple device owners, or of continuing a transaction from one Apple device to another.

Just like these users are underpinned by the Apple ecosystem, such interoperability has been traditionally achieved at the network or ledger level. For example, the Corda Network enables multiple groups of nodes running different applications to talk to each other.

The reason I bring up network interoperability in a blog that focuses on smart contract languages is that, being a domain-specific language, Daml bubbles this interoperability feature up to the applications. Regardless of the underlying ledger being used, Daml applications across multiple networks from different software providers will be able to transact seamlessly with each other. For example, a Hyperledger Fabric network can atomically exchange tokens and assets with a Corda network. And either of these networks can exchange information with Daml applications running on PostgreSQL inside an enterprise.

Interoperability and portability comparisons may seem unfair to Kotlin and Java, which are general-purpose languages enabled by ledger-specific libraries. However, I make these points to show the benefits of using a domain-specific language such as Daml at the smart contract layer, in case these considerations are important to you. It is not that these interoperability concerns cannot be addressed in Java or Kotlin, but the complexity is easier to manage in Daml because it is a specific focus of the Daml community.


Conclusion

As you can see, Kotlin/Java and Daml are complementary to each other if we consider the overall enterprise technology stack.

For complex applications that need heavy lifting and integrations with multiple enterprise systems, Kotlin and Java are well suited, and provide a wide range of existing integrations and libraries in addition to a vast developer base. If your underlying ledger is Corda, then libraries are already provided to make the smart contract development efficient in these languages.

When it comes to writing the smart contract layer itself, practitioners will do well to evaluate a DSL such as Daml to pair with a general-purpose language such as Kotlin or Java for the application layer. A DSL, and specifically Daml, provides important built-in constructs for privacy, rights and obligations, ledger portability, and interoperability. Both the developer experience and the speed of business innovation will be enhanced.

A dual approach offers huge productivity advantages to developers and reduces risk for businesses. For complex enterprise technology architectures, it does not require extensive reskilling while offering benefits not easily achievable today. Furthermore, you can always start with Daml for the productivity gains and, as your prototype progresses and the use case demands it, migrate to the underlying ledger technology if you need to.


Views expressed in this blog are my own and do not represent the views of my employer. Many thanks to Manish Grover for his expert insights and editorial assistance in writing this blog. If you are also interested in a comparison of Daml and Solidity, check out “The World of Smart Contracts using Daml & Solidity”.

Daml has also a new learn section where you can begin to code online:

Learn Daml online

If you want to master Daml smart contracts, you should also read:

Release of Daml SDK 1.0.1

This is a bugfix release. All users of SDK 1.0.0 are encouraged to upgrade at their earliest convenience.

This release fixes 3 issues:

  1. Fix an issue with false negative key lookups by non-stakeholders. This issue affected the new Sandbox released in SDK 1.0 (but not sandbox-classic) as well as the scenario service. Both the Sandbox and the scenario service now behave properly.
  2. Fix a crash in the scenario service. SDK 1.0 introduced a bug where the scenario service would crash if a failing transaction referenced a transient contract. In Daml Studio this error was shown as `Scenario service backend error: BErrorClient (ClientIOError (GRPCIOBadStatusCode StatusUnknown (StatusDetails {unStatusDetails = ""})))`.
  3. Fix an issue where the new Sandbox introduced in SDK 1.0 incorrectly rejected certain commands relying on getTime during validation. This was only an issue if you set either min_ledger_time_rel or min_ledger_time_abs.
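For context, a command “relies on getTime” when a choice body reads the ledger time during interpretation. A minimal, illustrative sketch (all names are hypothetical):

```daml
module Deadline where

template Offer
  with
    seller : Party
    buyer  : Party
    expiry : Time
  where
    signatory seller
    observer buyer

    controller buyer can
      Accept : ()
        do
          -- Validation of this command depends on the ledger time,
          -- which is where the min_ledger_time settings come in.
          now <- getTime
          assertMsg "offer expired" (now <= expiry)
```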

Calling Any API Through Daml

Editor’s note: This post is the fifth in our series “How to Make the Most of the Daml Application Framework.” Part 1 introduced the fundamentals of Daml/DABL architecture, and Part 2 explained how that architecture simplifies the challenges of user authentication. Part 3 discussed how it speeds front-end development, and Part 4 discussed how DABL’s own APIs make quick work of validating and deploying Daml applications. 

In last week’s edition, we discussed how DABL’s API syncs up complementary endpoints in Daml and DABL to enforce user/Daml Party authentication without additional configuration or layers. 

This week, we’ll consider how DABL talks to other APIs to speed integration of third-party functionality into Daml applications. 

Because Daml/DABL architecture bakes in support for privacy and business processes, applications built in Daml can initiate secure workflows with a single API call via DABL. 

Now, Digital Asset is extending that capability across multiple external APIs, starting with those specified within the Swagger framework. 

Because each third-party API represents a distinct implementation, you currently have to do all the work to normalize, abstract, and coordinate across those APIs. The task becomes harder and messier with each new implementation you need to code for. 

Daml/DABL architecture untangles the mess. By mirroring the external API in Daml, your app is already normalized into a single layer of functionality — the Daml layer — and coordinating is as straightforward as writing your Daml workflow. DABL is working to make the UX of running and managing your connectivity to those APIs “just work” without any additional effort to maintain their connectivity.

You can add support for third-party APIs simply by creating bots that listen to external activity and make judgments about what data to send or receive.

Consider a Daml application that checks on Bitcoin prices. For it to have true value, this application has to reach the outside world via integration with a third party like CoinDesk. Other examples would be an application that uses Slack to notify users when a contract is created or harnesses Stripe to send payments. 

To connect, DABL will use a package that contains a Daml implementation of the required third-party API, as well as any bots or network connections required for the integration. Working with a deployed ledger, engineers won’t have to implement the API code themselves — they just create a proper Daml contract. 
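As a sketch of the pattern, the request to a third-party service can be modeled as an ordinary Daml contract that an off-ledger integration bot observes and fulfills. Everything below is hypothetical and only illustrates the shape such a package might take:

```daml
module SlackIntegration where

-- Hypothetical request contract: an off-ledger bot acting as
-- `integrationBot` watches for these contracts, calls the external
-- Slack API, and exercises Acknowledge to record the outcome.
template SendSlackMessage
  with
    requester      : Party
    integrationBot : Party
    channel        : Text
    message        : Text
  where
    signatory requester
    observer integrationBot

    controller integrationBot can
      Acknowledge : ()
        with delivered : Bool
        do
          -- Archiving the request marks it as handled; a real package
          -- might instead create a delivery-receipt contract here.
          pure ()
```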

Extending the concept, Digital Asset is building a layer that lets Daml developers plug APIs into Daml workflows with Daml’s permissioning and security model and the error handling and scalability required for production use. 

Do you want to extend your applications’ capabilities via external APIs? Reach out to us today to let us know which API you would like to see supported by DABL. 

And if you want to get started building your own application, go to, download the SDK, and use today. Daml is open source and always free to use, and DABL is free to start. 

Unbounded tech stacks with Daml

When you create an application, you can be reasonably assured that the program starts and stops at its runtime, with fairly flexible interfaces for data persistence such as relational SQL, document stores, key-value stores, time-series databases, and so on. So it comes as quite a shock to people when they write smart contracts and find that they’re locked into a single language, in a single runtime, with a single database design. 

If you are composing an Ethereum smart contract, you are stuck with Solidity and with data from within the blockchain itself, with no access to data from the outside world unless you choose to store it permanently within the blockchain. If you are writing for other platforms, you may get locked into Java or another language of choice, with libraries and fixtures specific to the platform. Still other stacks, like Tendermint, allow any language and runtime but restrict your ledger/database to a single application.

Each of these cases limits the portability of your application and keeps you locked in to specific ledgers, databases, architectures, and backends. In early development phases, where flexibility is needed, you may find yourself rewriting the same application on several different platforms just to have a rough comparison between them.


Image courtesy of Andrey Matveev at

If we want to solve these issues, we need a language that can be used on any backend we choose, and where the concepts used to enforce permissions and access to data are consistent across platforms. Unfortunately, this ideal is a bit unrealistic: every ledger handles these confounding variables differently, and no two ledgers are alike. If they were, we wouldn’t care to choose between them.

However, there is a better way, and it is the one Daml employs (surprising for a blog about Daml, I know). In all seriousness, the Daml approach does work, and it is somewhat akin to the Tendermint approach in that the ledger is used for data persistence. It differs from Tendermint in that the runtime does not need to talk to a ledger in a specific way, nor does it restrict the ledger to a single application.

In Daml the runtime runs entirely outside of the ledger and when it needs to talk to a specific ledger for data persistence it does so through an adapter that allows it to translate its desired state to commands that the underlying ledger can understand. The programmer does not need to worry about how Fabric, Sawtooth, Corda, or any other ledger stores and retrieves its data, it all just happens through a common interface of Daml code and JSON APIs.

Ultimately this gives us a variety of benefits:

  1. Developing an MVP and designing our application’s workflow is easier. Business logic is defined in a single place, the Daml code, rather than being spread between database logic and the UI. This may not seem like a portability gain, but it is an indirect one: without it we could not have portable code.
  2. We avoid vendor lock-in, and with it gain unparalleled flexibility in the design and architecture of our programs.
  3. Portable code is composable code: distributed applications can more easily interact with each other.
  4. Interchangeable UIs: the complete decoupling of frontend and backend logic allows for more independent development of interfaces and faster iteration. Improving your UI poses zero risk to your business logic.

If you want to know more about Daml, check out our documentation, or come chat on our forum. This article is the third of many in an ongoing series on distributed ledger concepts. If you enjoyed it, check out the first post, where we break down the practical implications of centralized, distributed, and decentralized systems, and the second, where we discuss the importance and improvement of smart contracts.

Announcing Daml SDK 1.0

It’s been a year in the making – more if you count the significant development that happened before we open sourced Daml on the 4th of April 2019 – but today we are thrilled to announce the release of Daml SDK 1.0 on the 15th of April 2020.

With Daml we want to bring you the best experience for building distributed applications, at all scales and for any infrastructure. We had already laid the groundwork for this a year ago with the Daml Smart Contract Language. But applications are so much more than just shared state and rules. Applications need to be composable, upgradable, and maintainable. They include automation, integrations, and user interfaces. And of course they need compelling deployment targets.

Consequently, 2019 and Q1 2020 saw a lot of investment into all of these areas with the end result that it is now possible to build and deploy complete end-to-end Daml applications to a range of targets in a matter of days, using sound APIs that we believe will stand the test of time.

Having spent the last month putting some polish on the ledger client tooling, the next generation Sandbox, and tidying up after a year of intense feature development, we now feel the time is ripe to call this the first major version of the Daml SDK and give our users a much stronger commitment on the stability of the developer experience.

Release Candidate for Daml SDK 1.0

The preliminary release notes for Daml SDK 1.0 can be found here. A community open door session will be held on Tuesday, 14th April, 2.30pm-3.00pm CET on Zoom.


  • A new Getting Started Guide shows how several new pieces of JavaScript client tooling fit together to build and extend a distributed social networking application.
  • The Time Model has been improved so that it works seamlessly without user input to the Ledger API.
  • The next generation Sandbox gives an experience closer to distributed ledger infrastructures.

A little teaser of the app you can build with the new Getting Started Guide:

What’s Next

The next month, and possibly months, will be focused on finishing more of the features we started and making the entire Daml ecosystem more robust by improving documentation, performance, compatibility, and production readiness of Daml Ledgers.

  • Daml Triggers currently need to be started one-by-one using the daml trigger command making them difficult to control dynamically at runtime. We will work on a solution to make them easier to use in practice.
  • We will work to complete the Websockets streaming part of the JSON API.
  • We will work to complete Daml REPL.
  • Daml will get a generic Map type as part of Daml-LF 1.9.
  • We will work to make Daml Ledgers more performant.
  • We will continue to work on our release process and to tighten the interfaces between different components of Daml so that we can give clearer compatibility and long term support guarantees.

Release of Daml SDK 1.0

Daml SDK 1.0.0 has been released on the 15th of April 2020. You can install it using

`daml install latest`

If you’re planning to upgrade your Daml SDK to take advantage of our newest features please note that some action may be required on your part. If you’re not planning to upgrade then no change is necessary.


The summary of the release is available in video format on the Daml Youtube channel:



  • New JavaScript/TypeScript client-side tooling is now stable and the recommended way to build Daml applications. A new Getting Started Guide based on these tools has replaced the Quickstart guide.
  • The Time Model has been improved so that it works seamlessly without user input to the Ledger API. Action needed when you update to the latest version of API bindings or recompile gRPC clients.
  • More TLS configuration options for Daml Ledgers.
  • The next generation Sandbox is now the default, bringing an experience closer to a distributed ledger. Immediate action is needed if your project is relying on scenarios for ledger initialization.
  • Cleanup of names, deprecated features, and language versions. Immediate action needed if you use any Java dependencies with com.digitalasset packages or Maven coordinates.

Known Issues

  • The new Sandbox has a known issue where some false negative contract key lookups are only correctly validated on the read path, not on the write path. The net effect is that with carefully constructed Daml models, non-conformant transactions can be recorded in the underlying storage, which may lead to data continuity issues when this issue is fixed. Full details can be found in GitHub issue #5563.

What’s New

New Client Tooling


Distributed applications are much more than smart contracts running on a distributed ledger, and in 2019 we set out to make it significantly easier to build that part of applications which lives off-ledger: Automations, Integrations, and UIs. The new tooling is focused on giving application developers an easy-to-consume, real-time ledger state, which moves the development experience away from event sourcing and makes it similar to working with a database.

  • The HTTP JSON API: giving a queryable view of the ledger state and endpoints to submit transactions, all using an easy-to-consume JSON format.
  • A JavaScript/TypeScript code generator: turning a Daml package into a (typed) library to interact with the HTTP JSON API.
  • A set of JavaScript/TypeScript client libraries: working hand in hand with the code generator to interact with the HTTP JSON API, and bind ledger data to React components.
  • A new Getting Started Guide shows how all these pieces fit together to build a complete distributed end-to-end application with a custom UI.

The HTTP JSON API is designed to be consumable from any language ecosystem. The choice of JavaScript (and React) for the rest of the tooling was driven by the desire to aid application development all the way up to UIs, using the most widely adopted technologies.
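As a sketch of what consuming the JSON API from TypeScript might look like, the snippet below builds a request body for the /v1/create endpoint. The `User:User` template and its fields are illustrative placeholders; in a real project the template identifiers come from the JavaScript/TypeScript code generator.

```typescript
// Sketch: building the JSON body for the HTTP JSON API's /v1/create
// endpoint. The "User:User" template and its fields are illustrative.

interface CreateRequest {
  templateId: string;                // "<Module>:<Entity>"
  payload: Record<string, unknown>;  // contract arguments, JSON-encoded
}

function buildCreateRequest(
  templateId: string,
  payload: Record<string, unknown>
): string {
  const body: CreateRequest = { templateId, payload };
  return JSON.stringify(body);
}

// A client would POST this body to /v1/create with a Bearer token attached.
const body = buildCreateRequest("User:User", {
  username: "alice",
  following: [],
});
```

The same shape, with a `choice` and `argument` added, extends naturally to exercising choices; the point is that everything is plain JSON rather than event-sourced gRPC streams.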

Specific Changes

  • The documentation has a new Getting Started Guide. The previous Quickstart guide has moved under the Java Bindings section.
  • There is a new SDK template with a skeleton for an end-to-end application using the new tooling. It’s documented and used in the new Getting Started Guide. Use daml new my-proj create-daml-app to get started.
  • The /v1 endpoints of the HTTP JSON API and the JavaScript code generator and support libraries are now stable.
    • The JSON API has gained an endpoint to allocate parties: /v1/parties/allocate.
  • Support for maps and lists has been removed from the query language.
  • Note that the WebSockets streaming endpoint of the HTTP JSON API is still under development.

Impact and Migration

The new client tooling is almost purely additive so for most, no action is needed. For new applications, we recommend this tooling as it makes a lot of things quicker and easier. However, direct use of the Ledger API and HTTP JSON API continues to be a good option for anyone needing lower-level control or wanting to use a different language for their applications.

The only non-backwards-compatible change compared to previous versions is the removal of queries on lists and maps in the HTTP JSON API. There is no trivial migration for this. If you were relying on these capabilities, please get in touch with us on Slack. We'd like to hear how you were making use of the feature so that we can replace it with something better, and we will make some suggestions to work around the removal.

Improved Time Model


SDK Release 0.13.55 introduced a new method for command deduplication and deprecated the command field maximum_record_time. SDK Release 1.0 further improves the Ledger Time model so that users no longer need to pass any time-related information to the Ledger API. The new time model is designed to work under almost all circumstances without user intervention, making it easier in practice to develop applications against Daml Ledgers.

Specific Changes

  • The Sandbox no longer emits Checkpoints at regular intervals in wall clock mode.
  • The ledger_effective_time and maximum_record_time fields have been removed from the Ledger API, and corresponding fields have been removed from the HTTP JSON API and Ledger API language bindings.
  • The --default-ttl command line argument of the HTTP JSON API has been removed.
  • Ledger Time is no longer strictly monotonically increasing, but only follows causal monotonicity: Ledger Time of transactions is greater than or equal to the Ledger Time of any input contract.
  • The Command Service is no longer idempotent with respect to duplicate submissions. Duplicate submissions now instead return an ALREADY_EXISTS error, consistent with the new deduplication mechanism of the Command Submission Service.

Impact and Migration

Old applications will continue running against new ledgers, but ledger effective time and maximum record time set on submissions will be ignored. As soon as the client-side language bindings or compiled gRPC services are updated, the fields will need to be removed as they are no longer part of the API specification.

Better TLS Support


Daml Ledgers have always supported exposing the Ledger API via TLS, but support on consuming applications was inconsistent and often required client certificates. From this release onward, more client components support consuming the Ledger API via TLS without client authentication.

Specific Changes

  • When Sandbox is run with TLS enabled, you can now configure the requirement for client authentication via --client-auth. See the documentation for more information.
  • The daml deploy and daml ledger commands now support connecting to the Ledger API via TLS. See their documentation for more information.
  • Daml Script and Daml Triggers now support TLS by passing the --tls flag. You can set certificates for client authentication via --pem and --crt, and a custom root CA for validating the server certificate via --cacrt.
  • Navigator, Daml Script, Daml REPL, Daml Triggers, and Extractor can now run against a TLS-enabled ledger without client authentication. You can enable TLS without any special certificates by passing --tls.

Impact and Migration

This is a new capability, so no action is needed. These new features are useful in production environments where client-to-ledger connections may need to be secured.

Next Generation Sandbox


The Daml Sandbox has had a major architectural overhaul to bring it and its user experience even closer in line with other Daml Ledgers. The new Sandbox is now the default, but the “classic” Sandbox is included as a deprecated version in this release. The classic Sandbox will be removed from the SDK in a future release and will not be actively developed further.

Specific Changes

  • daml sandbox and daml start start the new Sandbox. The classic Sandbox can be invoked via daml sandbox-classic and daml start --sandbox-classic=yes.
  • Wall Clock Time mode (--wall-clock-time) is now the default.
  • Scenarios are no longer supported for ledger initialization.
  • Contract identifiers are now hashes instead of sequence numbers.
    • A new static contract identifier seeding scheme has been added to enable reproducible contract identifiers in combination with --static-time. Set the flag --contract-id-seeding=static to use it.
  • Ledger API offsets are no longer guaranteed to be a parsable number. They are an opaque string that can be compared lexicographically.
  • The command line flags --auth-jwt-ec256-crt and --auth-jwt-ec512-crt were renamed to --auth-jwt-es256-crt and --auth-jwt-es512-crt, respectively, to align them with the cryptographic algorithms used.
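Since offsets are now opaque, client code that tracked its position in the stream should compare offsets as strings rather than parse them. A minimal sketch (the offset values below are made up for illustration):

```typescript
// Sketch: treating Ledger API offsets as opaque, lexicographically
// comparable strings. Parsing them as numbers is no longer safe.

function isAtOrBefore(a: string, b: string): boolean {
  // Lexicographic comparison is the only ordering guarantee.
  return a <= b;
}

// Made-up example offsets; real values are ledger-specific.
const checkpoint = "000000000000000a";
const latest = "000000000000000b";
```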

Impact and Migration

The impact is primarily on demo applications running in static time mode and/or using scenarios for ledger initialization. Since both the classic and the new Sandbox are compliant Daml Ledgers, there is no difference in behavior apart from these fringe cases.

  • If you rely on static time mode, set it explicitly using --static-time.
    • If you rely on reproducible contract identifiers, also set --contract-id-seeding=static.
  • If you use a scenario for ledger initialization, migrate to Daml Script.
  • If you were parsing ledger offsets, you need to find a way to stop doing so. This is not guaranteed to be possible on Daml Ledgers other than the classic Sandbox. If you were relying on doing so, get in touch with us. We'd like to help with migration and want to understand how you were using this so we can better support your use case.
  • If you were using ES256 or ES512 signing for authentication, adjust your command line flags.
  • If you were running the now-classic Sandbox with persistence in a SQL database, you need to recreate the contracts in a ledger run with the new Sandbox. There is no automatic data migration available.

To ease the transition, you can revert to the classic Sandbox using daml sandbox-classic and daml start --sandbox-classic=yes. Note that the classic Sandbox is deprecated and will be removed in a future release.

Cleanup for Daml SDK 1.0


As we are moving into the 1.0 release line, we have done some cleanup work, aligning names of artifacts, removing deprecated language versions, streamlining the release process, and finishing a few language tweaks by turning select warnings into errors. 

Specific Changes

  • All Java and Scala packages starting with com.digitalasset.daml and com.digitalasset are now consolidated under com.daml.
    • Impact: Changing the version of some artifacts to 1.0 will cause a resolution error.
    • Migration: Changing Maven coordinates and imports using a find and replace should be enough to migrate your code.
  • Ledger API services are now under the com.daml package. A compatibility layer has been added to also expose the services under the com.digitalasset package.
    • Impact: grpcurl does not work with the compatibility layer.
    • Migration: Scripts using grpcurl need to change the service name from com.digitalasset to com.daml.
  • Example: com.digitalasset.ledger.api.v1.TransactionService (before SDK 1.0) is now com.daml.ledger.api.v1.TransactionService (SDK 1.0 and later).
  • The default Daml-LF target version is now 1.8.
    • Impact: Projects will not run against old Daml Ledgers that do not support Daml-LF 1.8.
    • Migration: You can target 1.7 by specifying --target=1.7 in the build-options field in your daml.yaml.
  • All Daml-LF versions <1.6 are deprecated and will not be supported on Daml Ledgers.
    • Impact: The new Sandbox will not run Daml code compiled to Daml-LF 1.5 or earlier.
    • Migration: Use classic Sandbox to run older Daml models.
  • We no longer release the SDK to Bintray.
    • Impact: If you were relying on artifacts on Bintray, you will not be able to update to version 1.0 without changing the repository.
    • Migration: The new locations are as follows:
      • SDK Releases and Protobuf files are released to GitHub Releases.
      • Java/Scala artifacts are on Maven Central.
      • JavaScript artifacts are on NPM.
  • File names must now match up with module names. This already produced a warning in previous releases.
    • Impact: Projects in which there are mismatches will no longer build.
    • Migration: Change your .daml filenames to match module names.
  • It is now an error to define a record with a single constructor where the constructor does not match the type name. This restriction only applies to single-constructor records. Variants and enums are not affected. This already produced a warning in SDK 0.13.55.
    • Impact: Projects with now illegal type declarations will no longer build.
    • Migration: In declarations of the type data X = Y with .., you have to change the type name (X) to match the data constructor name (Y), or vice versa.
  • The compiler name collision check has been extended to also count the case as a collision where you have a type B in module A and a module A.B.C (but no module A.B).
    • Impact: Projects with such module names will produce warnings and stop compiling in a future release. The JavaScript Code Generator is not usable on packages that don’t uphold this restriction.
    • Migration: You have to rename your modules to avoid such name clashes.

Impact and Migration

Impacts and migrations are covered item by item in Specific Changes above.

Progress on Features Under Development


Work is progressing on two features that are currently under active development.

  1. The Daml REPL, introduced with SDK 0.13.55, is becoming richer in its abilities, getting ever closer in capabilities to Daml Script.
  2. Work on a Websockets streaming version of the HTTP JSON API’s querying endpoints is progressing. The aim with this streaming service is to combine the ease of consumption of the HTTP JSON API with the liveness provided by a streaming API.

Specific Changes

  • Daml REPL
    • You can now use import declarations at the REPL prompt to bring additional modules into scope.
    • You can now use more complex patterns in statements, e.g., (x,y) <- pure (1,2).
    • You can now connect to a ledger with authentication by passing the --access-token-file option to daml repl.
  • Websockets on the HTTP JSON API
    • The error format has changed to match the synchronous API: {"status": <400 | 401 | 404 | 500>, "errors": <JSON array of strings>}.
    • The streaming version of the query and fetch-by-key endpoints now emit the last seen ledger offset. These offsets can be fed back to new requests to start the stream at said offset. Such offset messages are also used for heartbeating instead of the previous explicit heartbeat messages.
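A client consuming these streams now has to handle three kinds of messages. The sketch below shows one way to distinguish them; the exact shape of the event payload is illustrative, but the error and offset shapes follow the formats described above.

```typescript
// Sketch: classifying messages from the streaming endpoints. Errors use
// the synchronous API's format, offset messages double as heartbeats,
// and anything else is regular event data (shape illustrative here).

type WsMessage =
  | { status: number; errors: string[] }
  | { offset: string }
  | { events: unknown[] };

function classify(raw: string): "error" | "offset" | "events" {
  const msg = JSON.parse(raw) as WsMessage;
  if ("status" in msg) return "error";
  if ("offset" in msg) return "offset";
  return "events";
}
```

Offset messages can be stored and fed back into a new request to resume the stream where the previous one left off.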

Impact and Migration

The only impacts are on consumers of the Websocket streaming APIs. Those consumers will have to make some minor adjustments to include the API changes around error handling and ledger offsets.

Minor Changes and Fixes

  • Better support for snapshot releases in the Daml Assistant:
    • daml version can now list the available snapshot versions by passing the flag --snapshots=yes.
    • daml install latest can now include the latest snapshot version by passing the flag --snapshots=yes.
  • Daml Script can now be run over the HTTP JSON API, which means it now runs against project:DABL. Take a look at the documentation for instructions and limitations.
  • Party strings are now restricted to 255 characters.
    • Impact: If you used the Sandbox with very long Party strings they’ll be rejected by the new Sandbox and other Daml Ledgers.
    • Migration: Shorten your Party strings. Note that in ledgers other than Sandbox, you may not be able to choose them entirely freely anyway.
  • You can now disable starting Navigator as part of daml start in your daml.yaml file by adding start-navigator: false.
  • Calls to the GetParties API function with an empty list of parties no longer result in an error, but in an empty response.

How Daml & DABL Provide All the APIs Your App Will Ever Need

Editor’s note: This post is the fourth in our series “How to Make the Most of the Daml Application Framework.” Part 1 introduced the fundamentals of Daml/DABL architecture, Part 2 explained how that architecture simplifies the challenges of user authentication, and Part 3 discussed how it speeds front-end development. In this installment, we’ll discuss how the DABL APIs make quick work of validating and deploying Daml applications. 

In last week’s edition, we introduced some ways API support in Daml and DABL speeds and simplifies development. 

To recap: DABL’s built-in support for API logic can kick off a workflow with a single API call, while Daml provides provisions for privacy and business processes, eliminating the need for manual security logic. Daml and DABL auto-generate all the APIs you’ll ever need, so you can focus on the behavior of the application itself.

Now, let’s discuss how DABL’s API syncs up complementary endpoints in Daml and DABL. 

DABL’s API provides a RESTful endpoint that can create and exercise contracts as well as query or fetch active contracts. It also provides Websocket endpoints for streaming an active contract set. 

DABL’s JSON endpoints precisely mirror those of your Daml templates, so once your Daml templates are complete and deployed to DABL, authentication is simply a matter of passing a party-specific API key in the form of a JWT.

Because all read/write controls are baked into your Daml file, it’s straightforward for the DABL service to enforce user/Daml Party authentication without additional configuration or layers.
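As a sketch, a request to such an endpoint only needs the party's token in a standard Bearer header. The token value is a placeholder for whatever DABL issues for the party:

```typescript
// Sketch: attaching a party-specific JWT to a JSON API request.
// The token value is a placeholder; the real one is issued per party.

function authHeaders(partyJwt: string): Record<string, string> {
  return {
    Authorization: `Bearer ${partyJwt}`,
    "Content-Type": "application/json",
  };
}

const headers = authHeaders("<party-jwt-from-dabl>");
```

Because the ledger itself enforces who may see and do what, the client needs no further security plumbing beyond this header.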

Application developers can use the DABL API as a substitute for Sandbox actions to check whether their workflow works as a web application. What’s more, they can expose their apps to developers or end users: it’s easy to add integrations, interact programmatically, and hook up a UI.

Because these Daml/DABL APIs are automatically made available, client-server contracts can remain fixed. That allows you to scale the back end transparently and extend a single approach to web services among unlimited templates. 

If you want to get started building your own application, go to, download the SDK, and use today. Daml is open source and always free to use, and DABL is free to start. You can find a generic React UI with the login widget implemented in this open source repo that is deployable in DABL today. 

Smart contracts are good. Let’s make them better.

Nick Szabo’s smart contract in Daml

Smart contracts are an interesting concept, first introduced by Nick Szabo in his seminal paper on smart contracts. The idea of representing agreements as code, making code more legible for non-programmers, and enforcing processes and agreements is quite attractive. But have we figured out how to best implement smart contracts? Are they only for legal contracts, or is there wider applicability for these concepts? In short, can we do better?

It may first help to understand what we’ve learned about smart contracts now that they have been live on public (e.g., Ethereum) and private networks for many years. Namely:

  1. Interest outweighs usage. Experiments abound, and a small but sizeable number of intrepid programmers are exploring this vast new landscape.
  2. They are not well suited to imperative languages, which often obscure their behavior and can lead to very expensive bugs.
  3. They generally represent privately enforceable agreements between parties and rarely need their contents, execution, or enforcement to be public.

So let’s take a look at one of Szabo’s earliest reified contracts, that of a leased car.

“(1) A lock to selectively let in the owner and exclude third parties;

(2) A back door to let in the creditor;

(3a) Creditor back door switched on only upon nonpayment for a certain period of time; and

(3b) The final electronic payment permanently switches off the back door.”

It may not be a lot of text, but there is a lot going on here, namely:

  1. Two parties to the contract, the creditor and the owner
  2. A lock that has configurable permissions
  3. A backdoor to the lock that has payment based permissions
  4. An implied ability for the car to be able to query a payment system

Let’s refine these assumptions a little for implementation and complexity:

  1. There are actually three parties to the contract, the creditor, the owner, and the car (enforcing the rules)
  2. The lock governs starting the car and is a series of assertions about the state of the contract. If the assertions fail, the lock stays locked
  3. The backdoor is also limited by such assertions
  4. The car has privileged and uninterruptible knowledge of a payment system, be it a bank account, third-party attestation, or a public value transfer network like Bitcoin. While our design glosses over these specifics, the interesting implication here is that the car is a reasonably autonomous party to our contracts

We’ll also assume, as Szabo did, that the mechanism enforcing the rules of these contracts would cost more to remove or replace than the cost of the car. Given the availability of tamper-resistant processors this is a reasonable assumption to make.

So what does our contract look like then? Well, the whole thing comes to roughly 50 lines excluding comments. You’ll find that not only is it short, but it’s also relatively legible even for non-programmers. This is one of the reasons we use a declarative language like Daml rather than an imperative language like Solidity or Java: it’s easier to reason about.

The other reason we like this is because these contracts don’t need to be public. They’re private and between only those parties that need to be aware of them, and only in the ways they need to be aware.

Only the creditor can create the `LeaseOperation` contract, and only the owner and car can see it. Each of them can do some things to the contract, but none can do everything.

Similarly, the `Payment` contract is controlled exclusively by the car. Payments are involved, and they can be on any private (e.g., bank, credit card) or public network (e.g., Bitcoin), but they are side effects of the contract, and there’s no reason your public or private network needs to know, or be involved in, the terms of your agreement. They process your payment; they don’t enforce your contracts.

All variables are explicit and strongly typed; they are well known to the programmer, legal auditors, contract participants, the compiler, and the execution engine. All of these layers of checks, and the ease with which this code is interpreted, allow for far more participation in and understanding of the contract than either a generic or imperative programming language or a legal contract alone. If you’re interested in a comparison between these approaches, you can read this great article by Manish Tomer.

Similarly, actions are much better when it is explicit who is authorized to take them. We should write smart contracts this way, defaulting to no one being able to interact without authorization, rather than the common pattern of defaulting to everyone being able to interact and then coding restrictions afterwards. This is a nuanced but important distinction; one of the biggest examples of the latter is a hack that permanently lost about $150 million worth of assets on Ethereum.

There’s no doubt as to who can start the car in a contract like this. Similarly, there should be no doubt about who can move (or freeze) funds, or perform any other high-value operation.

Ultimately better smart contracts boil down to the following: 

  1. Be explicit rather than implicit. Call them actions, choices, or functions, but no matter their name they should state exactly who has authority, when, in all cases. If it’s ever unclear, no contract should execute.
  2. Source code should be clear and legible to the point where even non-programmers can make reasonable sense of what the contract does, and who can do it.
  3. There should be as many layers of fail-safes as possible between when an application is designed and when it is in use in the real world.
  4. Smart contracts do not need to encompass the whole world; they can stay private, and their interactions with the real world can be side effects.


If you want to know more about Daml, check out our documentation or come chat on our Slack channel. This article is the second of many in an ongoing series on distributed ledger concepts; if you enjoyed it, check out the first post, where we break down the practical implications of centralized, distributed, and decentralized systems.

DABL backends make frontends easy


Editor’s note: This post is the third in our series “How to Make the Most of the Daml Application Framework.” Part 1 introduced the fundamentals of Daml/DABL architecture, while Part 2 explained how that architecture simplifies the challenges of user authentication. Now let’s consider how the Daml/DABL architecture speeds front-end development. 

Daml and DABL make application development much easier than traditional application programming by reducing the number of layers you need to develop. Because DABL handles much of the functionality you’d typically have to build from scratch, you only have to specify a UI at the front end and a Daml application at the back end.

So what do front-end developers have to do, and what does it look like in real life? 

A key to UI development: DABL’s out-of-the-box support for API logic that can kick off a workflow with a single API call. 

Daml’s baked-in provisions for privacy and its understanding of business processes automatically prevent application users from seeing data they aren’t supposed to see, eliminating the need for brittle and error-prone manual security logic.

That means the level of your API logic can be “dumber.” Front-end developers can quickly connect buttons and other interface features to existing API calls, lightening the workload of their homegrown UIs. 

Thanks to some clever features in Daml and DABL, it’s possible to develop different front ends for different application users. That allows a developer to build custom apps for different user types and needs without having to build complicated supporting layers in the back end.

New enhancements keep making Daml and DABL faster and more powerful for front-end developers. One new feature generates TypeScript types from your Daml code. Because Daml is a strongly typed language, it understands what data you’re manipulating and can check your work while you’re developing, rather than later during testing or production. If you try to access data from the front end that doesn’t exist on the back end, it will call out the issue as you develop, rather than waiting until the application is running to report the error.

We’re even developing a feature to stream events from server to client, making it easier for front-end developers to provide end users with far more timely updates of the data they’re working with. 

If you want to get started building your own application, go to, download the SDK, and use today. Daml is open source and always free to use, and DABL is free to start. You can find a generic React UI with the login widget implemented in this open source repo that is deployable in DABL today.