MELD

We build MELD on top of the Cardano blockchain, so improving the platform is one of our priorities. This technical report gives insights into our contributions to the Cardano ecosystem and the progress of our internal projects so far in September.

Cardano

Plutus

Documentation

We have not been able to continue our public Plutus documentation work this month due to increased demand for internal writing. Fortunately, much of this internal writing can be ported to our public Plutus documentation in the future.

Current progress:

The good news is that the first entry to the Plutus V2 interface is here:

This PR includes essential improvements to TxInfo that we would love to document. For example, having redeemers in ScriptContext would make validator scripts cleaner and easier to reason about. Fewer extra checks on datums would also yield smaller and faster scripts.
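
A small toy model may help illustrate the point. The types below are our own simplification, not the actual Plutus V2 interface: with only its own redeemer visible, a validator that wants to know how another script in the same transaction is invoked has to smuggle that information through datums and re-check it; with all redeemers exposed in the context, it becomes a single lookup.

-- Toy model only, not the real Plutus API.
import qualified Data.Map as Map

newtype Redeemer = Redeemer Integer
  deriving (Eq, Show)

data ScriptPurpose = Spending String | Minting String
  deriving (Eq, Ord, Show)

-- V2-style context: every redeemer of the transaction is visible.
newtype Context = Context { allRedeemers :: Map.Map ScriptPurpose Redeemer }

-- A spending validator can require that a given minting policy runs in the
-- same transaction with the expected redeemer, without auxiliary datum checks.
requiresMintWith :: String -> Redeemer -> Context -> Bool
requiresMintWith policy expected ctx =
  Map.lookup (Minting policy) (allRedeemers ctx) == Just expected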

We are also looking forward to publishing our internal dApp standards with examples, from iterative protocol specifications & contract designs to coding standards.

Improve Chain Index Config Handling

We continue to work with the Chain Index component for both security research and dApp development. This endeavor includes customizing the component to efficiently store and query any data we want, getting raw data from on-chain hashes, building specialized GET endpoints for MELD-specific applications, and more.

cardano-db-sync is another great candidate for this use case. We prefer Chain Index for now because it is leaner to customize.

During our work, we opened a quick PR that improves config handling for plutus-chain-index:

This PR further distinguishes the chain-index and logging configs and supports printing default configs to a file. Previously, the DumpDefaultConfig CLI command was unhandled, and the two config types were described inconsistently.

Fix Pretty-Printing fakeNameDeBruijn

This fix addresses a minor inconvenience we found during Hachi’s work.

According to the Plutus Core specifications, names can only start with letters.

Name n ::= [a-zA-Z][a-zA-Z0-9_']*

But previously fakeNameDeBruijn gave an empty ndbnString, which led to “invalid” pretty-printed programs in this scenario:

  • Call mkTermToEvaluate on a Script.
  • The program is named with fakeNameDeBruijn.
  • The NamedDeBruijns are translated to Name through unDeBruijnProgram.
  • Pretty-printing the program/term with prettyPlcClassicDebug leads to names like _0, _1 because of the PrettyBy instance of Name:
    instance HasPrettyConfigName config => PrettyBy config Name where
      prettyBy config (Name txt (Unique uniq))
          | showsUnique = pretty txt <> "_" <> pretty uniq
    ...
    

This bug led to the uplc CLI itself failing to parse the program:

$ cat alwaystrue.uplc | uplc print
uplc: Lexical error: lexical error at line 3, column 12

Our PR fixed it:

$ cat alwaystrue.uplc | uplc print
(program
  1.0.0
  [
    [
      (lam i_0_0 (lam i_1_1 (lam i_2_2 (lam i_3_3 (lam i_4_4 i_0_0)))))
      (delay (lam i_5_5 i_5_5))
    ]
    (lam i_6_6 i_6_6)
  ]
)
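
For readers outside the codebase, the gist of the change is simply to give the fake names a non-empty string that starts with a letter. A simplified sketch of the idea (types trimmed down here, not the exact upstream patch):

import qualified Data.Text as Text

type Index = Word

newtype DeBruijn = DeBruijn Index

data NamedDeBruijn = NamedDeBruijn { ndbnString :: Text.Text, ndbnIndex :: Index }

-- Before the fix the fake name was "", which pretty-prints to names like "_0"
-- that the lexer rejects; any letter-leading placeholder such as "i" keeps the
-- printed program parseable.
fakeNameDeBruijn :: DeBruijn -> NamedDeBruijn
fakeNameDeBruijn (DeBruijn ix) = NamedDeBruijn (Text.pack "i") ix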

This fix is helpful for security researchers who do not use Haskell in their everyday work but can swiftly write parsers and transpilers for a well-scoped language like Plutus Core. These transpiling efforts are critical for bringing existing security toolboxes to Plutus programs.

Check Full Outputs in Plutus.Contract.Test

During testing, we often want to check the outputs at a single address. We already have dataAtAddress in Plutus.Contract.Test, but it only checks the datums:

dataAtAddress :: forall d . FromData d => Address -> ([d] -> Bool) -> TracePredicate

We want to add another utility function that zips datums with Values at each output:

forall d. FromData d => Address -> ([(d, Value)] -> Bool) -> TracePredicate

Since this is only for testing, we suspect this could even replace dataAtAddress, as a generalization covering more cases. Breaking people’s CI/CD is unfortunate, but the fix on their end should be trivial.
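
A sketch of what we have in mind is below. The name dataAndValueAtAddress is our placeholder, not an existing function; only the surrounding types come from the actual APIs.

{-# LANGUAGE ScopedTypeVariables #-}

import Plutus.Contract.Test (TracePredicate)
import Plutus.V1.Ledger.Api (Address, Value)
import PlutusTx (FromData)

-- Placeholder name and signature for the proposed predicate; the body would
-- fold over the chain-index state much like dataAtAddress does today.
dataAndValueAtAddress
  :: forall d. FromData d => Address -> ([(d, Value)] -> Bool) -> TracePredicate
dataAndValueAtAddress = error "sketch only"

-- The datum-only check falls out by discarding the Value component, which is
-- why the new predicate could replace dataAtAddress as a generalization.
dataAtAddress' :: forall d. FromData d => Address -> ([d] -> Bool) -> TracePredicate
dataAtAddress' addr check = dataAndValueAtAddress addr (check . map fst)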

Cardano Base

During our periodic dependency bumping, we found a failing orphan Serialise instance for Hash in ouroboros-network that propagates to places like Ouroboros.Consensus.Mock.Ledger.UTxO.

Since the new PackedBytes solution intentionally does not export its core module, we helped add a Serialise instance right there:

instance Serialise (PackedBytes n) where
  encode = encodeBytes . unpackPinnedBytes
  decode = packPinnedBytesN <$> decodeBytes

We planned to add tests and to try packPinnedBytes in place of packPinnedBytesN for tighter decoding. The PR was merged before we could get back to it, so maybe next time.
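
The test we had in mind is a straightforward roundtrip property; a generic sketch, not tied to the internal PackedBytes constructors:

import Codec.Serialise (Serialise, deserialise, serialise)

-- Any Serialise instance, including the one above, should satisfy this law.
roundtrips :: (Serialise a, Eq a) => a -> Bool
roundtrips x = deserialise (serialise x) == x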

We have also been looking at the use of FFI in this component. This effort aligns with our dedicated security research on Cardano and the Bounty Program from the Cardano Foundation.

The Concurrency Challenge

We have always been aware of the concurrency challenge on the UTXO ledger, where multiple agents compete to spend the same UTXO in the same block. We attempted to design our own concurrent state machine months ago but stopped because it was not worth the effort: Layer-1 latency is to be expected for a young ecosystem with a small block size, so we should either invest in a Layer-2 solution or focus first on other areas like security and economic models.

That said, the recent conversations on concurrency once again triggered our curiosity. We started small with some internal discussions that quickly led to more formalization efforts. We had to publish the first paper draft in just 2-3 days to help address heated but unmerited questions about Cardano’s concurrency capability. Since then, much more progress has been made: from new ideas, to formalizing and verifying models with mathematicians, to building proofs-of-concept in different environments.

We are still against publishing immature research, but we hope the following drafts are helpful to other projects building on Cardano, especially when other proposed solutions lead people to trust non-consensus, non-deterministic off-chain bots. For example, a single-swap batch races fastest in such a setting, yet it is still rewarded even though it reduces a DEX to one swap per block. The lack of on-chain consensus is also a straightforward recipe for front-running and MEV. Hachi is ready to stress-test any protocol that trusts these independently racing bots. We are also unsure whether it even counts as a DoS attack when the protocol itself does not check or punish these malicious minimal batches: the bots are still rewarded for submitting them, while it is the users who pay for the steps.

Auto-Scaling Concurrent & Deterministic Batching on the UTXO Ledger

This architecture is designed explicitly for frequently updated on-chain states. It uses a deterministic validation rule that removes any influence from off-chain agents. The most significant trade-off is congestion in the reserving phase, where users have to compete for a reserve. At least that congestion comes from users who have to pay for each reserve: a continuous DoS is very expensive to maintain, and it is easy to guarantee a net profit to the dApp as a whole after every step. Applying minimum limits to each step in the reserving phase is also easier than applying them to batches, a distinct advantage of local over global state.

After building a Proof-of-Concept on Cardano, we believe there are still limits on Layer-1 that prevent this solution from scaling. Deploying this architecture on a dedicated sidechain should solve those timing and congestion problems.

Long-Term Quotaless Deterministic Batching on the UTXO Ledger

These architectures accept some extra latency to remove the congestion of the reserving phase. They suit longer-term states like governance voting, where the final state can wait but people want no congestion when casting votes.

Again, deploying these architectures on a dedicated sidechain solves most timing and latency issues.

One-Shot Markets: Redefine Liquidity on the UTXO Ledger

(The rest of the paper is confidential at the moment)

Apart from general solutions like the above that work for most states, we believe dApp developers should rethink contract design per application: design dApp models of, for, and by the UTXO ledger, instead of merely fitting account-based, state-centric keys into UTXO locks.

For example, instead of replicating stateful economic models from the account-based world, MELD has designed a new UTXO-driven economic model called One-Shot Markets. The key is to create new markets, loans, and other economic components by consuming one market UTXO, instead of maintaining shared on-chain state. This way, different markets can be processed in parallel, while multiple markets can be consumed together for more liquidity per transaction. The generalized market also allows a single UTXO to act as a lending, liquidity, staking, or insurance pool, with different redeemers producing different output markets.
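
As a purely illustrative toy model (our own simplification, not the confidential design): a one-shot market is a UTXO that is consumed whole, and the redeemer decides which output markets the transaction creates.

-- Toy model only: no shared global state is read or written, so transactions
-- that consume distinct market UTXOs validate independently and in parallel.
data Market = Market { marketId :: String, liquidity :: Integer }
  deriving Show

data MarketRedeemer
  = Lend Integer     -- add liquidity to the market
  | Borrow Integer   -- take liquidity out of the market
  | Split            -- fork the market into two smaller markets
  deriving Show

-- Consuming one market UTXO either fails validation or produces the new
-- market UTXOs of the transaction.
step :: MarketRedeemer -> Market -> Maybe [Market]
step (Lend amount) m = Just [m { liquidity = liquidity m + amount }]
step (Borrow amount) m
  | amount <= liquidity m = Just [m { liquidity = liquidity m - amount }]
  | otherwise             = Nothing
step Split m =
  let half = liquidity m `div` 2
  in Just [ m { marketId = marketId m ++ "-a", liquidity = half }
          , m { marketId = marketId m ++ "-b", liquidity = liquidity m - half } ]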

Removing on-chain global state also forces dApps to move more components off-chain. This simplification of on-chain validators usually leads to fewer edge cases, better security, and cheaper, faster on-chain transactions thanks to lower resource usage. It also encourages off-chain innovation. For example, when price discovery happens off-chain, a typical liquidity provider can follow a specific oracle, while more involved providers can apply their own quantitative analysis and pricing strategies.

In summary, MELD is designing new economic models of, for, and by the UTXO ledger. Simultaneously, we are working on both Layer-1 and Layer-2 scaling solutions, and we expect to build our own dedicated sidechain as MELD reaches massive scale. Dedicated mathematicians are formalizing and verifying all of this work, and we expect to publish many more results towards the end of the year.

Haskell

GHC

Errors as (structured) values

This body of work is instrumental. We want to join in because it offers newcomer-friendly tasks that get us familiar with GHC development. We can then help integrate the new diagnostic infrastructure into HLS, and perhaps into more tooling for Hachi.

Our first GHC MR, which converts diagnostics in GHC.Tc.Validity to proper TcRnMessage values, has been merged:

We have been working on part two here:

Summary:

  • Move InstanceWhat to GHC.Tc.Types.Origin.
  • Add TcRnSimplifiableConstraint.
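
To give a flavour of the pattern, here is a generic illustration (not GHC’s actual TcRnMessage type): diagnostics become plain data first, and rendering to text happens only at the edge, so tools like HLS can consume the structured values directly.

-- Generic illustration of "errors as (structured) values", not GHC's real types.
data Severity = Warning | Error
  deriving (Eq, Show)

data Diagnostic
  = SimplifiableConstraint String  -- a constraint that could be simplified away
  | MissingSignature String        -- a binding without a type signature
  deriving Show

severity :: Diagnostic -> Severity
severity (SimplifiableConstraint _) = Warning
severity (MissingSignature _)       = Warning

-- Only the final consumer turns the structured value into text.
render :: Diagnostic -> String
render (SimplifiableConstraint c) = "The constraint " ++ c ++ " can be simplified"
render (MissingSignature name)    = "Top-level binding with no type signature: " ++ name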

We have slowed down significantly due to the heavy load of MELD- and Cardano-specific work. That said, we are looking forward to introducing more college students to this project, hopefully speeding everything up, all the way to HLS integration.

HLint

We had another PR merged and continued to complete a long-overdue one:

After a while, we concluded that a pure Plutus linter is not that useful at the moment. We are now directing Haskell training to GHC, HLS, and Cabal instead. Moreover, we have been exploring more powerful approaches, such as stan, for source-code-level analysis.

For the Haskell ecosystem, we continue to work with former teachers and the best universities in Vietnam to find the best candidates to train. This training is essential to the growth of the MELD engineering team. Nurturing a purely functional mindset should also help us flourish on Cardano, a functional blockchain.

Hachi

MELD Announces Hachi!

Hachi is our effort to develop a suite of security analysis tools for Plutus smart contracts. The project will eventually be open-sourced to the public. For now, we are focusing on research and experiments on Cardano to better understand the best approach to each problem.

Here is a summary of what we have done lately:

  • Continue smart contract security research in general.
  • Continue to study Cardano architectures, Plutus interpreters, and more.
  • More static analysis research of functional programming languages & Lambda Calculus.
  • Refine and write more proposals & documentation. Go to greater lengths to plan the ambitious future of Hachi!
  • Write and review more contracts with vulnerabilities on Cardano.
  • Continue to extend internal Haskell libraries and tooling for security research.
  • Update Antlr4 grammar for Plutus Core.
  • Add VIM syntax highlighting for Plutus Core code.
  • Archive Hachi Lint in favor of more powerful static analysis approaches like stan, and more.
  • Continue to build a foundation environment with a passive Cardano node, ChainIndex, script deserialization, and more.
  • Continue work to transform Plutus Core AST to XML for XPath queries.
  • Continue work to transpile Plutus Core to LLVM IR.
  • Symbex:
    • Convert more Haskell types to Racket.
    • Build symbolic ScriptContexts.
    • Support sum types & concrete-length lists including pairs.
    • Basic strategy on finalization to make sure programs terminate.
  • UPLC Parser:
    • Improve pretty printing configuration.
    • Support string constants in addition to bytestrings.
  • Continue to implement our own CEK machines.
  • Survey Cardano low-level FFI for security risks.
  • Survey Cardano web interfaces and endpoints for security risks.
  • Plan an auditing framework for Cardano dApps.

We are writing a White Paper for Hachi to document the vision and projects. We hope to release the first draft next week to paint a much clearer picture.

ADAmatic

MELD and VENT Present ADAmatic!

We are very excited to announce that Obsidian Systems, one of the best Haskell consultancies and a partner of IOG themselves, has joined to build the bridge with us. Many more partners are expected to join soon. The ADAmatic bridge is, and will always be, a community-driven project.

We have continued to complete the formal specifications of the bridge, aiming to publish its first White Paper by the end of the month. The current focus is locking ADA on Cardano and minting mADA on Polygon.

More concretely:

  • Continuously survey the ecosystem on both ends to improve planning.
  • Seek more partners to co-build and run the bridge network.
  • A Proof-of-Concept bash script for the Cardano side of the “minting mADA” flow: a user sends ADA to a bridge-controlled address (with the sending transaction carrying the desired target Polygon address), and cardano-wallet is used to list all payments to the bridge, including the target Polygon addresses (a toy sketch of this bookkeeping follows this list).
  • cardano-node and cardano-wallet running in CI, connected to the public Cardano testnet, with the above bash script running successfully against it.
  • Private Cardano testnet up and running successfully in CI. Write more tests in general.
  • Planning the database structure that will be used to coordinate the two sides of the bridge. E.g., minting mADA in response to locking ADA.
  • Nixify a Polygon node to deploy to Mumbai. This effort is still highly experimental.
  • Deploy the first mADA ERC20 contract on the Mumbai testnet. It simply has a mint function usable by the contract owner, plus the ability for mADA holders to burn their tokens, which emits an event for the bridge to pick up.
  • Attempt to pick up Polygon events from Haskell instead of using PolygonScan or other JSON-RPC tools.
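
To make the “minting mADA” flow concrete, here is a toy sketch of the bookkeeping behind the bash Proof-of-Concept; the types and names are our own illustration, not the actual script.

import qualified Data.Map as Map

-- One payment observed at the bridge-controlled Cardano address; the target
-- Polygon address travels with the sending transaction (e.g., as metadata).
data BridgePayment = BridgePayment
  { lovelaceReceived  :: Integer
  , targetPolygonAddr :: String
  } deriving Show

-- Aggregate observed payments into how much mADA to mint per Polygon address.
mintPlan :: [BridgePayment] -> Map.Map String Integer
mintPlan = Map.fromListWith (+) . map (\p -> (targetPolygonAddr p, lovelaceReceived p))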

Upon completion, we will have a solid foundation to sprint through:

  • Burn mADA on Polygon to redeem ADA on Cardano.
  • Lock MATIC on Polygon to mint mMatic on Cardano.
  • Burn mMatic on Cardano to redeem MATIC on Polygon.

After that, we can focus on completing the first governance setup for safelisted partners to run the bridge network, supporting more tokens, and eventually integrating cross-chain contracts with ChainLink, RenVM, and more.

MELD

MELD.com

Smart Contracts

Our focus has been on innovating smart contract designs, optimizing output scripts, and early App integration. Here is a summary of our progress:

  • Continue to track Cardano’s progress and smart contract research in general.
  • Continue to refine protocol specifications and smart contract designs.
  • Continue to test the latest dependencies.
  • Continue to innovate on contract designs, whether with brand-new architectures or with new features like redeemers in ScriptContext.
  • Continue to refactor code.
  • Continue to improve the current liquidation system.
  • Complete a prototype for a new lending model.
  • Complete the first liquidity pool prototype with the Concurrent & Deterministic Batching architecture.
  • Continue to work with the ChainIndex component for App integration.
  • Start deploying components to the testnet.
  • Develop more techniques to reduce script size, transaction size, and fees.

Economics & Tokenomics

Our focus has been on designing dynamic interest rates, MELD incentive functions, and brand-new economic models! Here is a summary of our progress:

  • Continue to study many DeFi protocols on many different blockchains.
  • More economics and tokenomics research, especially on interest rates, incentive functions, AMMs, derivatives, and futures contracts.
  • Collect more data to analyze, especially on DEX and lending protocols.
  • More in-depth market research.
  • Continue work on several forecast models on protocol usage, fees, MELD token’s value, and more.
  • Continue work for dynamic interest rates.
  • Continue work on MELD incentivization for lenders and borrowers.
  • Continue work to design better liquidity pools or better economics models in general with One-Shot markets.
  • Continue work on fiat (stablecoin) yield.
  • Start working on yield aggregation.

It is still very early, but with the smart contract launch on the Cardano mainnet we now have real-life DeFi data to collect and analyze. We have also onboarded a seasoned quant/mathematician to further formalize and prove much of this work. This academic capability gives us a lot more confidence to innovate. Do expect the first economics & tokenomics articles and draft papers soon!

MELDApp

We have been facing a few integration problems that have slowed us down. We still hope to beta-test the MELDApp this month and release the first production version next quarter. Beyond that, more intuitive demos are planned towards the end of the month to demonstrate core on-chain functionalities and interactions.

A summary of progress:

  • Continue to scale the MELDApp team.
  • Continue to partner with more teams to build faster.
  • Continue to finalize the MELDApp, wallet, and API specifications.
  • Continue to complete UI/UX designs.
  • Complete the first AWS setup.
  • Complete the first Firebase setup.
  • Continue security operations.
  • Continue to implement and document MELDApp endpoints.
  • Continue to work on the frontend core and interfaces.
  • Continue to work on intuitive demos to demonstrate smart contracts.