Polygon CDK: All About Data Availability

How do DACs work in a validium built with CDK?

Polygon Labs
December 4, 2023

The first design decision projects building with Polygon Chain Development Kit (CDK) must make is between rollup mode and validium mode. Rollup mode provides the highest degree of security, while validium mode offers the lowest transaction fees. Both modes use ZK proofs to validate the state of the Layer 2 (L2), but in validium mode, the transaction data that would be costly to post to Ethereum is instead stored cheaply off-chain.

Projects building with Polygon CDK inevitably ask: where does that transaction data go, and how can users be sure it will always be retrievable?

This post covers Data Availability (DA) and Data Availability Committees (DACs): what they are and how they work within a CDK-built L2.

Data Availability and DACs 

Ethereum’s consensus logic dictates that the transactions included in any given new block must be independently verifiable. This means that the transaction data must be available. For L2s, the common practice for guaranteeing the availability of their users’ transaction data is to post it to Ethereum as CALLDATA. This ensures that, in the event of some malice, users can still access and retrieve their assets.

CALLDATA is expensive, however, and, on Polygon zkEVM, it accounts for more than 80% of transaction fees. 
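To see why CALLDATA dominates fees, it helps to recall how Ethereum prices it: since EIP-2028, each nonzero byte of calldata costs 16 gas and each zero byte costs 4 gas. A minimal sketch of that pricing:

```python
# Illustrative only: estimate the gas cost of posting a payload as CALLDATA,
# using Ethereum's post-EIP-2028 pricing of 16 gas per nonzero byte and
# 4 gas per zero byte. (Base transaction cost and execution gas excluded.)
def calldata_gas(payload: bytes) -> int:
    return sum(16 if b != 0 else 4 for b in payload)

batch = bytes.fromhex("00ff") * 512  # a 1 KiB payload, half zero bytes
print(calldata_gas(batch))  # 512 * 4 + 512 * 16 = 10240
```

A validium avoids this per-byte cost entirely by keeping the batch data off-chain and posting only a hash and signatures.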

The solution for projects building with Polygon CDK in validium mode is to use a DAC. A DAC is a group of permissioned nodes whose core responsibility is to attest that the transaction data needed to reconstruct the state of the L2 is available. Projects building with Polygon CDK configure the makeup of DAC members in the datacommittee.sol contract on Ethereum.
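Conceptually, the on-chain configuration boils down to two things: a registered member list and a signature threshold. The sketch below models that idea in Python; the names (`Member`, `CommitteeConfig`, `required_signatures`) are illustrative and do not reflect the actual contract's ABI.

```python
# Hypothetical model of what a DAC contract tracks: a member list and the
# number of member signatures required before a batch hash is accepted.
from dataclasses import dataclass

@dataclass(frozen=True)
class Member:
    address: str  # Ethereum address of the DAC node's signing key
    url: str      # endpoint where the node serves stored batch data

@dataclass
class CommitteeConfig:
    members: list[Member]
    required_signatures: int  # approval threshold checked on-chain

    def is_quorum(self, signer_addresses: set[str]) -> bool:
        # Count only signers who are registered committee members.
        valid = {m.address for m in self.members} & signer_addresses
        return len(valid) >= self.required_signatures

config = CommitteeConfig(
    members=[Member("0xA", "https://dac-a.example"),
             Member("0xB", "https://dac-b.example"),
             Member("0xC", "https://dac-c.example")],
    required_signatures=2,
)
print(config.is_quorum({"0xA", "0xB"}))  # True
print(config.is_quorum({"0xA", "0xZ"}))  # False: 0xZ is not a member
```

The key property is that signatures from unregistered addresses are simply ignored, so a quorum can only come from the configured committee.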

The Data and DAC Lifecycle in a CDK Validium

On Polygon CDK-built chains, the functions performed by the Sequencer and Aggregator are the same as in Polygon zkEVM. The Sequencer executes, orders, and batches transactions, and the Aggregator executes those batches and generates a ZK proof. 

In validium mode, the DAC interacts with the Sequencer to attest to the secure and efficient handling of user transaction data. Here’s what that looks like in practice: 

  1. Batching: Once a batch of transactions is ordered, the Sequencer forwards the batch data and its corresponding hash to the DAC.
  2. Data validation and storage: DAC nodes independently validate the batch data. Once validated, both the transaction data and the hash are stored in each node's local database for future reference.
  3. Signature generation: DAC nodes generate a signature for the batch hash, endorsing the batch’s integrity and authenticity.
  4. Signature verification: The Sequencer collects the signatures and the original batch hash and submits them to Ethereum for verification.
  5. Verification: A designated smart contract on Ethereum checks the signatures against a list of valid DAC members and confirms that sufficient approval has been provided for the batch hash. Sufficient approval is a threshold configured by the chain in the datacommittee.sol contract.
  6. Signature confirmation: Once the smart contract confirms the validity of the signatures, the Sequencer passes the batch to the Aggregator, which will generate the ZK proof that is posted to Ethereum. The proof confirms the validity of the batch's transactions.

Tune into the blog and our social channels to keep up with updates about the Polygon ecosystem.

Together, we can build an equitable future for all through the mass adoption of Web3!

Website | Twitter | Developer Twitter | Forum | Telegram | Reddit | Discord | Instagram | Facebook | LinkedIn
