IOSG Ventures_EN

Posted on Dec 31, 2021

Scaling Summit 2021 Recap | New Layer, Same Old Problem: MEV Prevention on Layer-2s

We are proud to present this insightful panel discussion featuring Peter Kris (CEO of Mangata); Stephane Gosselin (Founder of Flashbots); Deli Gong (Co-founder of Automata); Amir Bandeali (Co-CEO of 0x Labs / Matcha); and Felix Leupold (Tech Lead of CowSwap)!

Overview

On MEV, the panel went in-depth into questions like: what is the problem with arbitrage these days? Is it necessarily exploitative? What are the arguments about the fundamental differences between MEV on L1 and MEV on L2? Is there any difference in how PoW and PoS mechanisms compare with respect to MEV? Would the spamming game change on L2 with sequencers in place, or does it even make sense given that L2s are much more performant? How would privacy change these dynamics between different domains? We have transcribed the full discussion below!

Peter: Hello everyone, and welcome to this panel. Let me start right away. I think it would be best if everyone introduced themselves briefly, in one line, and then we can jump straight into the questions. Maybe Amir, can you take it?

Amir: Yes, my name is Amir. I'm a co-founder and co-CEO of 0x Labs. We build all sorts of decentralized exchange infrastructure, including a liquidity aggregator and a request-for-quote system.

Stephane: Hey everyone, I’m Stephane. I am a founder and a current steward at Flashbots. We build all kinds of technologies that help blockchains deal with MEV.

Felix: I am Felix. I'm with Gnosis at the moment. At Gnosis we've built a decentralized exchange protocol which we call CowSwap. It is based on batch auctions and multi-dimensional order books. We're trying to minimize the amount of MEV that is inherent to existing decentralized exchange protocols, and we are also planning to spin CowSwap out of Gnosis. I've been leading the technical development on this project.

Deli: This is Deli, I'm a co-founder of Automata Network. We are building decentralized middleware focused on privacy for all kinds of DApps across many L1 and L2 platforms. One of our solutions, called Conveyor, tries to minimize MEV on many L1s and L2s as well. It's an application-specific, application-level solution, in contrast to the many blockchain-level solutions.

Peter: Thank you, Deli. I have the honor of hosting this panel. I'm Peter Kris, co-founder of Mangata Finance. Mangata is an application-specific blockchain focused on decentralized exchange, where the mempool is encrypted and reordering powers are prevented. It's a parachain in the Polkadot ecosystem. Let me jump right into the hot topic of these days. What is the problem with arbitrage lately? As far as I remember, arbitrageurs were good and necessary for the blockchain to be efficient, but now they are seen as exploitative. What are the arguments in this area? Maybe Stephane?

Stephane: Sure. I think it's a nuanced topic. I don't think it's possible to say that all arbitrage is exploitative, though I do want to push back against the idea that arbitrage, as we see it today in the DeFi ecosystem, has a purely positive impact on end users. There is a tremendous amount of value being captured by a small number of actors through this process. That value is being taken directly away from users in one way or another, usually by giving them worse price execution on their transactions. I think it's important for the ecosystem to have a conversation about the role of arbitrage bots. Are they actually providing a benefit to users, or are there other designs that are able to provide better value for the users who are executing trades on these exchanges?

Peter: Any follow-up? Amir? I think you might want to say something.

Amir: Yeah, I was just going to say, I totally agree with Stephane. I don't think arbitrage is inherently bad, but there are certain ways in which it can be used that are purely extractive and abusive, like a sandwich attack, for example, where you're basically manipulating a market to almost steal money from a user in a risk-free way. I think that's obviously not good for anyone. Another issue that I think is not very well known or talked about is that the amount of arbitrage on-chain causes this problem where, if a user is trading against an automated market maker and sees the quoted price on the front end, by the time that trade actually executes and hits the blockchain, they're not going to realize that quoted price. On average, they will realize a price probably about 10 basis points worse than what they were shown. That doesn't necessarily happen intentionally, but in some ways it is also deceiving to users. Those are the two main issues that I see.
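To put a rough number on that, here is a small illustrative calculation; the quoted price is a hypothetical figure, and only the 10-basis-point average comes from Amir's remark:

```python
# Illustrative only: what "about 10 basis points worse than quoted" means for a buyer.
quoted_price = 3_000.00   # hypothetical quoted price, USD per ETH
slippage_bps = 10         # Amir's rough average figure
realized_price = quoted_price * (1 + slippage_bps / 10_000)
print(realized_price)     # 3003.0 -> the buyer pays about $3 more per ETH than quoted
```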

Deli: I see, I can also chip in a bit. I also think that arbitrage is sometimes healthy for the system, because it helps with liquidity and with price discovery for DEXes in particular. But the underlying execution engine, the blockchain itself, is supposed to treat all participants fairly, and nowadays it gives advantages to some participants, namely the bots and the miners. That looks unfair to users, because they normally don't have the knowledge to run these kinds of sophisticated strategies in order to arbitrage other people. So in that sense, I think all blockchain layers should try to solve these issues once and for all.

Peter: I would like to jump into the cross-chain domain now. Do you see any fundamental difference between MEV on L1 and MEV on L2?

Felix: Fundamentally, I would say I don't see a difference in the type of MEV that exists between these layers, but I'm happy to discuss this as well. One thing that might make at least some of the L2s we see today, which also have different consensus mechanisms, more targeted for MEV is that, at least in the past, miners have been very good at understanding the economics of mining and getting cheap electronics and cheap energy to perform the mining job, but they have not necessarily been super engaged with the community and ecosystem as a whole. I think MEV-Geth and Flashbots saw this gap between the DeFi-native searchers, who actually understand MEV and find strategies to extract it, and the miners, who actually have the power to propose these blocks, and built a bridge between them. In a lot of L2s, and specifically with proof of stake, this gap between blockchain natives who understand everything quite specifically and people who are just in the business of providing a service that requires hardware and cheap energy is smaller, which makes MEV extraction more plausible, or more likely.

Peter: I know that validators maximize their MEV gains on Solana, for instance. So are you saying that proof-of-stake validators have more resources to invest in MEV research compared to proof-of-work miners? Is there any difference in how those two consensus mechanisms compare when it comes to MEV?

Felix: Fundamentally, the MEV exists in both systems, I would say. It's just that the background of people who tend to become validators in proof-of-stake systems tends to be more aligned with the background of people who actually understand MEV and can extract it, whereas in proof of work I feel the type of companies and people that run nodes might be a bit more disjoint from the people who understand MEV quite specifically. Another thing in proof of stake that can make things different is that you have a little more foresight into who gets to propose a given block, so it's not necessarily completely random. That may have some impact on the types of MEV and on the communication and foresight you have with potential searchers, or, if you're a searcher yourself, on how you come up with a strategy.

Peter: Any follow-up to that? Fundamentally, miners have the power to reorder, reject, or insert transactions. If we're talking about L2s, where there are sequencers, do sequencers have fundamentally the same powers, or will the game somehow change when we compare L1 and L2?

Stephane: If we look strictly at reordering, inclusion, and censorship of transactions, sequencers have all of these powers. One way to think about the powers that L2s have with regard to this type of MEV is to imagine the equivalent if Ethereum had only a single miner, and that single miner produced every single block: it could include whatever transactions it wants, exclude whatever transactions it wants, and reorder transactions. That's sort of how L2s are designed today. There are different types of MEV that are specific to the consensus mechanisms and the ordering rules of the protocols, so in theory you could have different systems with different properties, but I don't know of any L2s right now that don't operate with a single-sequencer model.

Peter: So, for instance, on most chains with low fees, or chains that are just young enough not to be congested, there is usually an obvious problem with spamming tactics, and those spamming tactics can get really sophisticated. Do you think the spamming game would change on L2 with sequencers in place, or does it even make sense given that L2s are much more performant?

Stephane: Again, I can try to touch on this one as well. The way these L2s currently work (again, with the caveat that there's a huge design space in how ordering rules can be implemented) is a single-sequencer model with a sort of first-come-first-serve, or receive-time, ordering system: whenever the sequencer receives a transaction from a user, it attempts to include it as-is. It seems like these are the rules the L2 implementation teams are running with to provide the simplest, most user-friendly implementation, but it has some externalities, namely that it does not deal with MEV. It sort of ignores it, and it makes strategies like spam and colocation possible. So in this world, in order to benefit from MEV extraction, you can definitely run spam strategies as long as the fees on the network are low enough, and you are clearly incentivized to colocate the strategies you're running as a bot operator with the sequencers, to get minimum latency on execution.

Amir: Can you describe one of those spam strategies? I'm actually not too familiar with them.

Stephane: So it would work something like this: you have the execution of your arbitrage between multiple pools, the code for the routing or the execution of it, baked into a smart contract. Then you just spam the same calldata to it all the time, and you do an early revert if the arbitrage doesn't exist, so the trade only goes through if the arbitrage exists. You have some on-chain computation cost to check whether the arbitrage exists, but if you're able to execute this on a low-fee chain, you can get a significant advantage. We've seen a lot of this on, for example, Avalanche and Polygon; these have been very effective strategies while fees are still relatively low, and then gradually over time on these networks the competitive advantage has shifted to colocation and latency advantages.
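In rough pseudocode terms, the pattern Stephane describes looks like the minimal Python sketch below. This is not code from the panel or from any real bot: the prices, the profit formula, and the `send_tx` callback are hypothetical stand-ins for a deployed contract that re-checks profitability on-chain and reverts early when the opportunity is gone.

```python
def arb_profit(price_a: float, price_b: float, size: float, gas_cost: float) -> float:
    """Rough profit from buying on the cheaper pool and selling on the dearer one."""
    return abs(price_a - price_b) * size - gas_cost

def maybe_send_arb_tx(price_a, price_b, size, gas_cost, send_tx):
    # The bot blasts this every block with the same calldata; the on-chain
    # contract would redo the same check and revert early, so a missed
    # opportunity only costs cheap revert gas on a low-fee chain.
    if arb_profit(price_a, price_b, size, gas_cost) <= 0:
        return None       # nothing to do (or the on-chain call simply reverts)
    return send_tx(size)  # execution proceeds only while the spread persists
```

Because the failure mode is a cheap revert, spamming every block stays rational until fees rise or latency becomes the real bottleneck, which is exactly the shift toward colocation Stephane describes on Avalanche and Polygon.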

Amir: Gotcha, thank you. That's pretty interesting.

Deli: I just want to know, is there any anti-spam strategy these sequencers can use to identify those spammers, basically from their behavior or the patterns of their transactions?

Stephane: I don't know. You could always try to do behavioral analysis and try to de-anonymize addresses, but that's a lot of work, and I don't know that everyone wants to be sending all their transactions through a neural net to classify them before executing them. I think the way networks probably deal with this best is that when the chain starts to get resource-constrained, gas prices, or just inclusion prices, go up, which makes spam strategies less attractive. But that doesn't increase the number of different actors who are able to participate; it just means that instead of investing in spam or execution costs, actors invest in hardware and colocation costs. So it shifts the equilibrium of what the most successful strategy is.

Peter: This environment leads me to a question of whether it makes sense for sequencers to be the same as searchers on Flashbots when there is a settlement of L2 batches on L1. Would it make sense for that settlement to go through the Flashbots auction?

Stephane: I think it's sort of a separate question. When I think about L2 MEV, there are two things to think about. There's the MEV from the execution of transactions within the L2 chain, within that state, and then there's the separate task of the L2 settling those state transitions as a rollup back to the main chain. We've looked at this a bit at Flashbots, and Alex Obadia, who's on the Flashbots research team, knows a lot about this, but the current impression we have is that there really isn't much MEV from settling rollups on the main chain. They are large transactions that require a lot of resources on Ethereum L1, but the only thing a miner could really do is delay the settlement by a few blocks, which doesn't meaningfully impact the experience of the rollup and is quite costly. Reorders also don't seem to have much of an impact; the operator can just resubmit the transactions and get them included at a later point in time. So any form of censorship is quite costly and has no clear economic benefit, which means it's unlikely to be a strategy. For these rollup operators there isn't much benefit to rerouting through something like Flashbots; what they really want is inclusion at the lowest possible cost, which I guess is what everyone else on Ethereum wants right now.

Amir: Is it possible that it could actually be advantageous for sequencers to take advantage of MEV opportunities themselves and act as searchers, in order to prevent spam and colocation and things like that? There would essentially be no incentive for anyone else to even attempt to capture MEV if they knew the sequencer was taking all the opportunities itself.

Felix: Isn't that a bit like stock exchanges being in the business of high-frequency trading themselves? I think it might be advantageous for the chains to separate these concerns and sell colocation as an extra revenue stream to people who are actually experts on the topic.

Amir: That's true. I guess what I had in mind is that maybe the sequencer captures the value and redistributes it to the token holders somehow, or something like that.

Stephane: I guess Flashbots' position is that we're quite big fans of this idea. The point Felix brought up about separation of concerns is a good one: do we want all the L2s to be investing in trading teams so they can extract all that MEV? I think that's probably not the most efficient way to go about it. We're big fans of inserting a market mechanism here, an auction, to auction off the rights to do the ordering. It has the same economic impact in that the MEV does get extracted, but it removes the latency advantage, it removes the spam, and it makes sure anyone can participate in a permissionless competition. And then, yes, the funds that are raised can be used for whatever that L2 wants to design, whether it's funding public goods or returning value to token holders through some buyback and burn; there are probably multiple different designs for what to do at that point.
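Sketched very loosely in Python, an ordering auction of this kind could look like the toy below. This is not a Flashbots or L2 specification, just an illustration of the mechanism Stephane describes: searchers bid for the right to order a batch, the highest bidder wins and pays, and the proceeds go wherever the L2 decides (public goods, buyback-and-burn, and so on).

```python
from dataclasses import dataclass

@dataclass
class Bid:
    searcher: str        # hypothetical bidder identity
    payment: float       # amount offered for the ordering rights of this batch
    ordering: list[str]  # the transaction ordering the bidder wants to enforce

def run_ordering_auction(bids: list[Bid]) -> tuple[Bid, float]:
    """Award the batch's ordering rights to the highest bidder."""
    if not bids:
        raise ValueError("no bids submitted for this batch")
    winner = max(bids, key=lambda b: b.payment)
    proceeds = winner.payment  # captured MEV, routed per the L2's own policy
    return winner, proceeds
```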

Peter: Felix, you've been mentioning on Twitter what an inter-chain batch auction could look like. Can you maybe describe how that could work?

Felix: Yeah, so fundamentally the way batch auctions work is that instead of settling each trade individually against the best liquidity on-chain, or against whatever a market maker quotes individually for that order, we collect multiple user orders over time into an order book that can potentially be overlapping: if somebody wants to sell ETH at $4,500 and somebody wants to buy ETH at a price of up to $5,000, then there is an overlap in this order book. That, at a high level, is how batch auctions differ from continuous order books.

Now, on a conceptual level, doing this across multiple chains would just mean that the orders we collect, and the places where these orders eventually end up getting settled, would not all be on a single layer or a single chain. The challenge with doing so, of course, is the lack of atomicity between different chains. If we're trying to settle, for example, somebody selling ETH on L1 Ethereum and somebody buying ETH on Polygon, the settlement now needs two transactions, and the transaction on Ethereum might go through while the transaction on Polygon reverts. Achieving a settlement guarantee, or atomicity, across different layers is the main challenge here, not so much creating a batch and settling it.

We are still very early in the design phase there, but one idea we're exploring, specifically for connecting L1 liquidity (right now we've deployed on xDai, which is a sidechain rather than a true L2), is to have CowSwap's settlement contract act a little bit like a bridge. There would be a main source-of-truth chain, which from our perspective would of course be Ethereum mainnet, where people could deposit funds into the CowSwap settlement contract, and those funds could then be used to mint tokens on other L2s. So you could envision the settlement contract acting as a bridge into Polygon, into xDai, or into whatever other sidechain you're looking at, and then we can use the balances stored in the bridge to make sure we have a certain kind of atomicity, or at least certain guarantees that funds are not moved around on either side of the bridge while the user is trying to make a trade.

If you wanted to trade on mainnet, you would probably not lock anything, because that's working quite well at the moment; you can have one chain that is basically your source of truth, where you don't really need locking guarantees. But on the cheap L2s or sidechains, whenever you wanted to make a trade you would lock that capital in a place where it is safe and can be used by the solvers, the infrastructure that actually performs the trades. Then you're no longer bound by different execution speeds; you can actually make sure that trades either go through or don't go through in lockstep. It becomes a bit more challenging if you're trying to access liquidity on those sidechains that you don't control, for example an AMM on Polygon that you want to use for a trade on mainnet. There you might again have more probabilistic approaches rather than this lockstep guarantee.

We're researching how to make this kind of external liquidity, the liquidity that is not intrinsic to our batch auction, available as well, but the general idea would be to have a bridge contract that also acts as a settlement engine and decentralized exchange at the center.
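As a purely illustrative aid (this is not CowSwap's actual solver or settlement logic, and the midpoint clearing rule is just one possible choice), the overlapping-order case Felix mentions can be sketched like this: a seller willing to accept $4,500 per ETH and a buyer willing to pay up to $5,000 can be matched directly within the batch at a single uniform price, without routing through on-chain liquidity.

```python
from dataclasses import dataclass

@dataclass
class Order:
    trader: str
    side: str           # "buy" or "sell"
    limit_price: float  # max price for buys, min price for sells (USD per ETH)
    amount_eth: float

def match_cow(buy: Order, sell: Order):
    """Match a 'coincidence of wants' when the limit prices overlap."""
    if buy.limit_price < sell.limit_price:
        return None  # no overlap: the orders would fall back to external liquidity
    clearing_price = (buy.limit_price + sell.limit_price) / 2  # illustrative rule only
    traded = min(buy.amount_eth, sell.amount_eth)
    return {"price_usd": clearing_price, "amount_eth": traded}

# Felix's example: seller at $4,500, buyer up to $5,000 -> matched at $4,750 here.
print(match_cow(Order("buyer", "buy", 5000.0, 1.0), Order("seller", "sell", 4500.0, 1.0)))
```

The cross-chain difficulty Felix describes sits outside this matching step: once the two legs live on different chains, executing the matched settlement atomically is what the bridge-style settlement contract is meant to address.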

Peter: Thank you. Deli, what about privacy? How would privacy change these dynamics between different domains?

Deli: In terms of privacy, if you look at MEV from a different angle, it is really about transaction information getting leaked before the transaction is executed on the DEX or other DApps. So we could potentially patch this loophole by setting up a channel where the information is not revealed until the transaction ordering is decided and becomes immutable. Our approach ended up being quite simple: a user sends a meta-transaction to a relayer running in a Trusted Execution Environment, or TEE. The relayer decides the ordering without any bias; a simple way is to just use first-come-first-serve ordering, but there can be other ways, depending on which approach the DApp wants to use. Because of the TEE, the relayer is not able to reorder or deny transactions, and once the transactions leave the relayers and are forwarded to the blockchain, miners and validators are not able to manipulate them anymore, because the meta-transaction ordering has already been locked in by the relayers and no one can change it. In a sense, it creates a local ordering particular to that DApp, or even to a particular trading pool. It is quite lightweight and works across many L1s and L2s as well. If we compare this to the searchers or the solvers, the big difference is that this only safeguards the per-app local ordering, rather than the global ordering of the entire chain or the entire L2.
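As a rough, non-normative sketch (this is not Automata's Conveyor implementation; the commitment scheme and the `decrypt` callback are stand-ins), the core idea can be pictured as a relayer that receives encrypted transactions, fixes a first-come-first-serve ordering, publishes a commitment to that ordering, and only then reveals and forwards the batch:

```python
import hashlib
from dataclasses import dataclass, field

@dataclass
class ToyTeeRelayer:
    """Toy model: ordering is committed before transaction contents are revealed."""
    queue: list = field(default_factory=list)  # ciphertexts, in arrival order

    def submit(self, encrypted_tx: bytes) -> int:
        self.queue.append(encrypted_tx)  # first come, first serve
        return len(self.queue) - 1       # the position is fixed at arrival time

    def commit_ordering(self) -> str:
        # Hash over the ordered ciphertexts, published before decryption, so the
        # ordering cannot change once the plaintext (and any MEV) is visible.
        digest = hashlib.sha256()
        for tx in self.queue:
            digest.update(tx)
        return digest.hexdigest()

    def reveal_and_forward(self, decrypt) -> list:
        # `decrypt` stands in for the key held inside the TEE; the contents only
        # appear after the ordering commitment above has been made.
        return [decrypt(tx) for tx in self.queue]
```

The point, as Deli notes, is that this protects the local, per-app ordering rather than the global ordering of the whole chain.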

Peter: I think one of the biggest, ultimate rejections, or value extractions by rejection, is when there is an arbitrage opportunity and the relayer, despite the transactions being encrypted, can basically reject all the transactions and replace them with one transaction that exploits, or fills up, the arbitrage. So despite the enhanced privacy, do you see any approach to this kind of problem?

Deli: I guess the first thing is that, because users have an end-to-end encrypted channel to those relayers, the only strategy a relayer can take is to just blindly deny all the transactions, or randomly deny transactions, which might not be very efficient for executing its strategy. Also, that would damage its reputation, because in this kind of Trusted Execution Environment setup the identities of the relayers are known; if a particular relayer behaves abnormally, people can notice it and just avoid it in the future. For example, if I can choose to connect to different endpoints set up by different relayers, there will always be, I would say, good or benign relayers. The TEE technology also helps here, because it clearly separates the relayer engine itself from the actual operating system or hardware that is running it.

Felix: If I understood the question correctly, Peter, you were asking: if, let's say, a back-running opportunity were created, then with this technology it's impossible for the relayer, or for arbitrageurs, to extract that back-running opportunity within the block itself. But wouldn't it just postpone that opportunity to the top slot of the next block? The relayer, even though they don't know what other transactions are coming in, could still make sure that they, or a party they're working with, get that first slot in the block, and basically just execute whatever arbitrage opportunity was created before, immediately at the top of the next block?

Deli: Yeah, that could be possible, yes. Sounds quite interesting.

Peter: One of the more controversial things that happened, I believe two months ago, was that a mining pool rejected transactions that were clearly front-running; that was possible because of some legal compliance reasons. So maybe that's a question for the future: if MEV becomes legally troublesome, do you think MEV will be mitigated just because of compliance reasons?

Deli: Yeah, I feel it's possible, because in the future, if some big players are running this, I mean the L2 nodes, maybe they have to care about their reputations in the entire space, so they will tend not to do these kinds of MEV attacks, because all the records will be public on-chain and everyone can check and track whether a particular party is doing MEV against users or not. Yeah, I feel that's possible in the future.

Amir: Yeah, I agree. It seems like a pretty likely outcome to me. That being said, I don't think anyone's going to regulate away 100% of MEV. I think it would probably be just the MEV that's very targeted towards users. But general arbitrage opportunities, like the last example Felix mentioned, where the blockchain is in some state that can be arbitraged and a miner wants to get the very first transaction in that block, are not inherently bad for a specific user. I think something like that would be difficult to regulate away, but I would not be surprised if 10 years from now there is actual regulation that prevents miners from specifically front-running or sandwiching users.

Felix: One problem, however, is that if we don't reduce the amount of MEV that is inherent to the network, then this MEV might not be taken by the miners, but we will go back to the state we had before Flashbots came along, where some people just spammed the network probabilistically and caused even more negative externalities by wasting block space with failing transactions, which drives up costs for everyone, even people who have nothing to do with the trade or the sandwich attack. So I wonder, in a completely trustless network, in the presence of MEV, even if miners agreed not to extract it themselves, whether it will be possible to prevent the old-school way of extracting MEV that we saw a year and a half ago.

Stephane: This is a super, super nuanced topic. The main position I have on this is to be very careful not to solve a technical problem with a regulatory solution, because regulatory solutions are very big hammers that strike a lot of things at once. If you look at what regulating away front-running or sandwiching looks like, it's solving MEV through compelled censorship. It means putting restrictions on actors on the network and on the way they are able to select and include transactions, and it by default takes away the credible neutrality that a lot of these systems depend on. So I think the possibility of such limitations being introduced is a massive threat to the way these systems work today and to their ability to keep working in a trust-minimized manner. When I say this is likely to be solved through technical means, I do think natural market forces are going to eliminate the user costs of front-running and sandwiching, just through the development of systems that create more benefit for the users. Simply giving users the choice to use systems that create more value for them will naturally eliminate these characteristics.

Peter: When we're talking about costs for users, do you think MEV affects gas prices positively or negatively? What's the future outlook on that? Anyone?

Stephane: I can riff on this. I think MEV enables a lot of new experiments with gas prices. Looking at Ethereum specifically, pre-London hard fork, MEV all of a sudden made it possible to provide a user experience of zero gas price, and this is something the 0x team has been able to replicate in the post-EIP-1559 world as well. It makes it possible to change the way users experience paying for the inclusion of their transactions, from an upfront fee for inclusion to a baked-in fee, somewhat similar to how Robinhood works. That's one way to think about the impact of MEV on gas fees: you could imagine an L2, or some other chain, where all transactions are free and the way you pay for inclusion is just through MEV. There could be some new experiments to watch there. Separately, one can think of the impact of MEV on the gas price experienced by the average user. What we've seen here is that in periods of high MEV, say there's a particularly restricted opportunity on-chain, like a token drop or an NFT drop, where for a period of several blocks all the bots compete by spamming the chain to try to be the first to capture that opportunity, as we've seen with many first-come-first-serve sales, the result is that you clutter the chain with a bunch of reverting transactions. That puts upward price pressure on the gas prices experienced by everyone else on the network who is trying to do regular activity. So mishandling and misdesigning the way the protocol handles MEV certainly has an impact on the average gas price experienced by all the users of the platform.

Felix: I totally agree with what Stephane said. Also, the idea of baking gas prices into prices from the user's perspective, I mean, that's also how CowSwap works: you're not paying any gas for submitting your order, you only pay part of your sell tokens when the order actually gets executed. However, that is the user's perspective, where it looks like gas prices go away. At the global level, gas prices are the cost of inclusion into a scarce computational resource, the blockchain, and the presence of MEV makes the overall demand for this inclusion larger. I also agree with Stephane that if MEV is handled incorrectly this spike can be even higher, so designing for MEV in your protocol can make this extra demand lower. But generally speaking, the presence of MEV and of arbitrage opportunities simply increases demand for block space and therefore, I would say, increases the gas price overall, no matter how that is abstracted away and presented to the user.

Peter: Thank you. We are slowly coming to the end of the session. The last question I would ask is a very broad one: by which attribute should fairness be assessed? What does a fair blockchain mean? Is it equal access to arbitrage opportunities? Is it equal latency? What does MEV democratization look like in this area? Anyone?

Amir: I think maybe I could go first. Personally, I find fairness hard to pin down. I think there was also this great thread by Robert Miller about how fairness is a super subjective question or criterion. A lot of systems in the real world, or in traditional finance, are not necessarily designed around it. Another question you could ask is what makes the market most efficient, and you could hope that the most efficient market also coincides with the most fair market.

Felix: For me personally, I'm a strong believer that these low-level latency wars, colocation, and first-come-first-serve, where being one nanosecond ahead of somebody else gives you the right to be executed before that other person, do not make the market more efficient. That's why I personally think that, while fifteen seconds of block time is maybe too long to say that everything happened at once, breaking this continuum of time into discrete epochs, maybe one second or five seconds, is a reasonable way of preventing these kinds of high-frequency arms races for hardware and colocation, which don't really make the market more efficient and mostly create negative externalities for users. So in my perspective, if people express the same intent to trade within the same second, they should not receive a different price for that intent.

Stephane: I can offer my perspective on fairness. I think fairness is maximizing user choice and minimizing barriers to entry. If those two properties are present in a system, I think the most fair outcome will emerge just through market activity.

Deli: Maybe a different angle is to look at what fairness does not mean. Manipulation is something we don't want, so that's clearly not part of a fair system. Then there are some very clear attacks: for example, if someone can be identified as a victim in a certain MEV incident, then I guess that kind of MEV is not something cool to have. Yeah.

Peter: I see, thank you very much. It was an awesome session. We are already out of time. I wish we had two more hours to talk about this, but hopefully we were able to go through all the diverse topics in MEV.

About Co-hosts

❄️ IOSG Ventures

IOSG Ventures, founded in 2017, is a community-friendly and research-driven early-stage venture firm. We focus on open finance, Web 3.0, and infrastructure for a decentralized economy. As a developer-friendly fund with long-term values, we launched the Kickstarter Program, which offers innovative and courageous developers capital and resources. We consistently cooperate with our partners and connect with communities, and we work closely with our portfolio projects throughout their entrepreneurial journey.

❄️ StarkWare

StarkWare invented, and continually develops, STARK-based Layer-2 Validity Proof scaling solutions over Ethereum. StarkWare’s solutions, which rely on Ethereum’s security, have settled over $250B, and over 60M transactions, serving hundreds of thousands of users. StarkNet, StarkWare’s permissionless general-purpose scaling solution, is live (Alpha) on Ethereum Mainnet. StarkEx, a custom standalone scaling service, has been powering applications since June 2020, including dYdX, Immutable X, Sorare, and DeversiFi.

❄️ imToken

imToken is a decentralized digital wallet used to manage and safeguard a wide range of blockchain- and token-based assets, identities, and data. Since its founding in 2016, it has helped its users transact and exchange billions of dollars in value across more than 150 countries around the world. imToken allows its users to manage assets on 12 mainstream blockchains and all EVM chains; it also supports decentralized token exchange and an open DApp browser.

❄️ Arbitrum

Arbitrum is a leading Ethereum Layer-2 scaling solution developed by Offchain Labs. Based on the Optimistic Rollup scheme, Arbitrum enables ultrafast, low-cost transactions without sacrificing the security of the Ethereum ecosystem. Launched on August 31st, 2021, Arbitrum has attracted 100+ ecosystem projects. Arbitrum is currently EVM-compatible at the bytecode level. In the next upgrade, Arbitrum Nitro, Arbitrum will further improve the developer experience by incorporating WASM support.