Apibara: open-source data platform
Apibara
This project aims to add Ethereum data (Execution + Consensus Layer) to Apibara, an open-source data platform. Apibara allows developers and researchers to synchronize any on-chain data to a target database or API. Currently, we provide support for PostgreSQL, MongoDB, Parquet, and webhooks, and it is easy to add support for more integrations. Apibara focuses on “online” usage: it first backfills historical data and then keeps synchronizing as the chain moves forward, so developers can access the data using the tools they already know. Our data synchronization protocol is chain-independent, which is why we can support indexing both Execution Layer and Consensus Layer data.
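As a rough illustration of the “online” pattern described above (backfill, then follow the chain head into a sink such as PostgreSQL), a minimal sketch might look like the following. The `fetch_block` and `write_rows` helpers are hypothetical stand-ins, not Apibara's actual API.

```python
import time

# Hypothetical stand-ins for a chain client and a sink such as PostgreSQL; a real
# indexer would use its node RPC client and database driver here instead.
def fetch_block(number):
    """Return block data for `number`, or None if the chain has not reached it yet."""
    return {"number": number, "hash": f"0x{number:064x}"}

def write_rows(table, rows):
    """Insert decoded rows into the target table."""
    print(table, rows)

def sync(start_block, head_block):
    # 1. Backfill: stream historical blocks, in order, into the sink.
    for number in range(start_block, head_block + 1):
        write_rows("blocks", [fetch_block(number)])

    # 2. Follow: poll for new blocks as the chain moves forward.
    next_block = head_block + 1
    while True:
        block = fetch_block(next_block)
        if block is None:
            time.sleep(12)  # roughly one mainnet slot
            continue
        write_rows("blocks", [block])
        next_block += 1

# sync(0, 5)  # backfill blocks 0-5, then keep following the head (runs until interrupted)
```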
DotPics
Anton Wahrstätter
DotPics is a collection of dashboards, data, and tools for Ethereum. On the dashboard side, I plan to build one focused on EIP-4844 blobs and blob usage, as well as to incorporate blobs into mevboost.pics. Additionally, there are open-source data sets that I maintain. Lastly, my parser for the CL, the EL, MEV-Boost (bids and payloads), and other data will be open-sourced soon; it is currently in the final testing stage. The final parser will have a simple GUI that lets everyone extract the data they want as simply as possible. Furthermore, the parser directly labels validators with their respective entities (Lido, Coinbase, etc.), and marks potentially censorable transactions and ETH2 deposits. The parser can be plugged into a node and is ready to go.
Healthy Network Baselines
Metrika
The problem we aim to solve is establishing clear metrics and thresholds to define a healthy Ethereum network. Given Ethereum’s dynamic, decentralized nature, the responsibility for monitoring and preserving its health falls upon the entire community. To accomplish this, the community needs to agree on network health indicators, including the specific metrics that should be tracked and the corresponding thresholds that signal potential issues when the network is veering toward an unhealthy state. By leveraging Xatu, we will establish robust health baselines for Ethereum’s peer-to-peer (P2P) network layer. Our goal is to document our findings, rationale, and detailed descriptions of the selected metrics, empowering the community with the knowledge to safeguard Ethereum’s stability and well-being.
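As a toy sketch of how such baselines could be consumed once the metrics are agreed on, a threshold check might look like this. The metric names and threshold values below are illustrative assumptions, not the baselines the project will produce.

```python
# Illustrative baselines only; defining the real metrics and thresholds for
# Ethereum's P2P layer is exactly what this project sets out to do.
BASELINES = {
    "connected_peers":        {"min": 20,   "max": None},  # too few peers -> isolation risk
    "attestation_delay_ms":   {"min": None, "max": 4000},  # late gossip -> missed inclusion
    "duplicate_message_rate": {"min": None, "max": 0.30},  # excessive gossip amplification
}

def evaluate(observations):
    """Compare observed P2P metrics against the baselines and flag breaches."""
    alerts = []
    for metric, value in observations.items():
        baseline = BASELINES.get(metric)
        if baseline is None:
            continue
        if baseline["min"] is not None and value < baseline["min"]:
            alerts.append(f"{metric}={value} below healthy minimum {baseline['min']}")
        if baseline["max"] is not None and value > baseline["max"]:
            alerts.append(f"{metric}={value} above healthy maximum {baseline['max']}")
    return alerts

print(evaluate({"connected_peers": 12, "attestation_delay_ms": 2100}))
```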
MigaLabs Data Collection
MigaLabs
The Ethereum blockchain is constantly evolving. It has changed dramatically in the past, with the transition from proof of work to proof of stake, and it will change substantially in the future, with the arrival of EIP-4844 and other upgrades. Understanding those changes and anticipating possible bottlenecks is the main job of blockchain researchers, but that requires a vast array of tools to collect massive amounts of data, extract information from it, analyze the observed patterns, and visualize them in an intuitive way. The goal of this ambitious project is to develop and enhance tools for monitoring Ethereum nodes, keeping track of data propagation, discovering the nodes in the network, unveiling patterns in MEV, exploring the limits of DVT technology, monitoring devnets and feature forks, following validator performance, and visualizing all this data in a clear and insightful fashion.
Allowing validators to provide client information privately
Nethermind
Understanding the distribution of the execution-layer and consensus-layer clients used by Ethereum’s validators is vital to ensuring a resilient and diverse network. Although there are currently methods to estimate the Beacon Chain’s client distribution among validators, the same cannot be said about execution client distribution, and there is no standard means of anonymously reporting which ELs and CLs are being used. This proposal aims to research and design a way to submit and extract this crucial data while aiming to avoid compromising user anonymity and network performance.
Anonymous Validator Data Collection using ZK
Abhishek Kumar
There are close to 900k validators on Ethereum mainnet. This translates to a treasure trove of data about validators that is waiting to be captured, data that would allow us to design the Ethereum protocol better by understanding validators’ pain points. But the hard fact is that we don’t have enough data on these validators. Sure, we have dashboards like rated.network, but they are incomplete. For example, we don’t have information about which clients a node is running (Reth, Nimbus, Teku), on what kind of machine (arm64/Linux), and so on. Validator operators don’t wish to expose too much information about their staking setups. This is the problem that we’re trying to solve: we plan to use zero-knowledge proofs for data collection, allowing validator operators to provide this information while staying anonymous.
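One plausible shape for such a scheme, offered here as an assumption rather than the proposal's chosen design, is for an operator to prove inside a ZK circuit that they control a key in the validator set (for example via a Merkle membership proof) while publishing only their survey answers. The snippet below shows the plain, non-ZK membership check such a circuit would encode; the leaves and proof are made-up placeholders.

```python
import hashlib

def h(left, right):
    return hashlib.sha256(left + right).digest()

def verify_membership(leaf, proof, root):
    """Plain Merkle membership check; in the ZK setting the leaf (the validator's
    identity) stays private and only the root plus the survey answers are public."""
    node = hashlib.sha256(leaf).digest()
    for sibling, leaf_is_left in proof:
        node = h(node, sibling) if leaf_is_left else h(sibling, node)
    return node == root

# Tiny two-validator set: prove leaf_a belongs without revealing which leaf it is.
leaf_a, leaf_b = b"validator-pubkey-a", b"validator-pubkey-b"
ha, hb = hashlib.sha256(leaf_a).digest(), hashlib.sha256(leaf_b).digest()
root = h(ha, hb)
print(verify_membership(leaf_a, [(hb, True)], root))  # True
```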
Core Platform Expansion
Growthepie
growthepie has a solid foundation providing reliable Layer 2 data and blockspace analysis, as well as content for end-users, developers, and investors. Our aim is to provide our users with the most neutral and complete curated set of metrics, tools, and knowledge to understand the ever-growing L2 space and make the ecosystem more transparent. To that end, we aim to expand the feature set of the platform: list more Ethereum Layer 2s and include more metrics, blockspace analysis, and knowledge content, all while staying public-goods funded, keeping the infrastructure reliable under high demand, and keeping the user experience responsive and quick.
Standardised and Crowdsourced Smart Contract Labels & ABIs
Growthepie
This proposal addresses the issue of isolated and non-standardised contract labelling datasets within the blockchain data community. By introducing a standardised data model for smart contract labels, inclusive of the ABI, we advocate for consolidation into a single, universally accessible database utilised by various data providers. Our solution extends beyond standardisation, bringing the community in as a key participant in the labelling effort. We have identified that the long-term success of a comprehensive label database relies on crowdsourcing from the community, achieved by lowering the entry barriers with more user-friendly front ends and open API endpoints for seamless integration. This approach marks a pivotal shift for smart contract labels towards a community-driven, standardised, and eventually decentralised public good.
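To make the idea of a standardised label record concrete, a hypothetical schema including the ABI might look like the sketch below; the field names are our own illustration, not the data model the proposal will define.

```python
from dataclasses import dataclass, field

@dataclass
class ContractLabel:
    """One standardised, crowdsourceable label record for a deployed contract (illustrative)."""
    address: str        # checksummed contract address
    chain_id: int       # 1 = Ethereum mainnet, 10 = OP Mainnet, ...
    name: str           # human-readable contract name
    owner_project: str  # project or entity behind the contract
    category: str       # e.g. "dex", "bridge", "nft_marketplace"
    abi: list = field(default_factory=list)  # the contract ABI as parsed JSON
    source: str = "community"                # who submitted the label

# Placeholder example record; the address is not a real deployment.
example = ContractLabel(
    address="0x0000000000000000000000000000000000000000",
    chain_id=1,
    name="ExampleRouter",
    owner_project="ExampleDEX",
    category="dex",
)
```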
Economic analysis of L2s
Nethermind
The rapid adoption of Layer 2 solutions (L2s) necessitates a clear understanding of the profitability and data requirements of the new chains. We aim to develop tooling that provides data on the calldata costs of L2s and the fees the L2 networks pay for L1 security. The size of the calldata costs will also help study the dynamics of EIP-4844, and we hope to provide insight into the data requirements of the expected largest consumers of blob space. The analysis of the profitability and current costs of the rollups will give rollups vital information for designing competitive gas markets, and will increase the information available to rollup users so they can make informed choices about the architectures they rely on. This, coupled with our other proposal on rollup security, will give consumers a strong basis for selecting rollup services at a known cost and risk. The data will also be useful for modelling and predicting the behaviour of the data blob market in Ethereum. Following the paper by Offchain Labs and the Ethereum Foundation, we assume that the top five rollups by TVL will be classified as ‘large rollups’ in the near future and that their data-posting strategy will be to use EIP-4844. We can compute what the historical cost under EIP-4844 would have been, assuming the rollups had used blobs from their geneses, and try to predict the market dynamics of 4844 in the near future based on the rollups’ current and expected usage. Finally, we will propose a standard for comparing and benchmarking the computational capacity of EVM and non-EVM chains.
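A back-of-the-envelope version of the calldata-versus-blob comparison underlying this analysis is sketched below. The gas constants are the protocol's (16/4 gas per non-zero/zero calldata byte, 131,072 blob-gas per blob), while the batch contents and the two base fees are made-up inputs; a real model would also account for priority fees and the usable payload per blob being slightly below 131,072 bytes.

```python
import math

CALLDATA_GAS_NONZERO = 16   # gas per non-zero calldata byte
CALLDATA_GAS_ZERO = 4       # gas per zero calldata byte
BLOB_SIZE_BYTES = 131_072   # one blob = 4096 field elements * 32 bytes
GAS_PER_BLOB = 131_072      # blob-gas charged per blob

def calldata_cost_wei(batch: bytes, base_fee_wei: int) -> int:
    """Cost of posting the batch as plain L1 calldata (data bytes only)."""
    nonzero = sum(1 for b in batch if b != 0)
    zero = len(batch) - nonzero
    gas = nonzero * CALLDATA_GAS_NONZERO + zero * CALLDATA_GAS_ZERO
    return gas * base_fee_wei

def blob_cost_wei(batch: bytes, blob_base_fee_wei: int) -> int:
    """Cost of posting the same batch in EIP-4844 blobs (blob-gas only)."""
    blobs = math.ceil(len(batch) / BLOB_SIZE_BYTES)
    return blobs * GAS_PER_BLOB * blob_base_fee_wei

# Made-up inputs: a 120 KiB batch of mostly non-zero bytes, a 30 gwei execution
# base fee, and a 1 gwei blob base fee.
batch = bytes((i * 7 + 3) % 256 for i in range(120 * 1024))
print(calldata_cost_wei(batch, 30 * 10**9))
print(blob_cost_wei(batch, 1 * 10**9))
```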
Analysis of L2 Finality and Economics of L2 Security
Nethermind
The rapid adoption of Layer 2 solutions (L2s) necessitates a clear understanding of the associated risks for developers and users alike. We aim to develop tooling to provide real-time data and assess these risks across various L2s. The tooling will address the risk of L2 networks forking from their L1 canonical chain, and L2 blocks not finalising on L1. A real-time asset risk tracking feature will also quantify and display the assets at risk, providing a clear view of financial exposure. Through this tooling and associated dashboard, we strive to improve transparency and understanding in the L2 ecosystem, fostering a safer and more informed community while encouraging L2s to push for the economic security they require.
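One simple reading of “assets at risk”, offered as an illustrative assumption rather than the dashboard's actual methodology, is the value moved in L2 blocks whose state has not yet been finalised on L1; the block data below is made up.

```python
# Illustrative only: each entry is (l2_block_number, value_moved_eth, finalised_on_l1).
recent_l2_blocks = [
    (1_000_001, 420.0, True),
    (1_000_002, 155.5, False),
    (1_000_003,  87.2, False),
]

def assets_at_risk_eth(blocks):
    """Value in L2 blocks whose state has not yet been finalised on L1."""
    return sum(value for _, value, finalised in blocks if not finalised)

print(assets_at_risk_eth(recent_l2_blocks))  # 242.7
```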
WalletLabels – Standardizing & Enriching Ethereum Account Labels for Transparency and Utility
Function03 Labs
WalletLabels is a platform that simplifies the identification of on-chain wallets through custom labels. The need for clear, accessible, and actionable insights into wallet behaviors becomes increasingly important as the space grows and matures. Our intuitive interface lets users easily search and categorize wallet addresses by name, label, or entity type, transforming anonymous hashes into meaningful insights. We envision offering a labeling infrastructure that extends its value across a broad spectrum of platforms, be it block explorers, wallet services, or consumer-oriented applications.