doug

Posted on Feb 21, 2022

Turf NFT, Metadata, and You

I recently wrote about a few techniques for storing NFT metadata and assets on-chain. As I mentioned there, a responsible NFT team should make sure that their project’s content remains online and accessible for as long as technically possible.

Having said all that, I wanted to describe exactly what we’re doing with Turf NFT, and why.

Here’s what we’ve done and what we intend to do:

  1. We initially hosted our pre-reveal JSON on S3. All the JSON files were stripped of their traits; only the token names remained. Each file’s image_url pointed to the same reveal.gif, which was visible on holders’ OpenSea profiles immediately after minting. None of the actual metadata or final images were available online anywhere, on S3 or Arweave.

  2. Following the mint, we replaced these JSON files with our “final” files (final aside from some necessary edits, which we’ll discuss below) and uploaded the actual token-specific imagery to S3. This allowed any holder to view their actual items online after a metadata refresh. The reveal process was really fun, and seeing the community’s reactions on Discord and Twitter as the actual artwork rolled in was great.

  3. The next step, soon to be complete, is the transfer of all our JSON and images to Arweave for permanent hosting. We’ll accomplish this by uploading the assets and then pointing our contract’s baseURI variable at the appropriate Arweave directory via our setBaseURI() method (sketched in code just after this list).

    We’ll likely opt not to use the ar:// prefix, given its uneven support among NFT marketplaces. We’ll stick with https://arweave.net as the gateway for the time being.

  4. Finally, we can lock the baseURI permanently by calling our lockBaseTokenURI() method. Once called, it prevents us from ever changing the baseURI again. We’re not quite ready to do that, given the fast-changing landscape of decentralized hosting platforms. Plus, we’re in the happy position where our community trusts us not to scam them by deleting all of our NFT assets, so there’s no immediate pressure to migrate. At the very least, we’ll only do this after the ar:// protocol is widely supported.

    (As an aside, a more drastic take on locking the baseURI is renouncing ownership of the contract entirely. This is done by transferring ownership to a burn address, making it impossible to call any onlyOwner methods at all.)

  5. All of this logic - and a little baseURI surprise for those paying attention - can be found in our contract on Etherscan.
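
To make steps 3 and 4 concrete, here’s a stripped-down Solidity sketch of that baseURI pattern. It isn’t our actual contract (the real thing lives inside a full ERC-721, and it’s on Etherscan), and details like the .json suffix are illustrative, but the method names match the ones above:

```solidity
// SPDX-License-Identifier: MIT
pragma solidity ^0.8.0;

// Simplified sketch of the baseURI pattern described above -- not the
// verbatim Turf contract.
contract BaseURIExample {
    address public owner;
    string public baseURI;          // e.g. "https://arweave.net/<manifest-id>/"
    bool public baseTokenURILocked; // once true, baseURI can never change again

    modifier onlyOwner() {
        require(msg.sender == owner, "not owner");
        _;
    }

    constructor(string memory initialBaseURI) {
        owner = msg.sender;
        baseURI = initialBaseURI; // starts off pointing at S3
    }

    // Step 3: repoint every token's metadata at Arweave after the upload.
    function setBaseURI(string memory newBaseURI) external onlyOwner {
        require(!baseTokenURILocked, "baseURI is locked");
        baseURI = newBaseURI;
    }

    // Step 4: make the current baseURI permanent. There is deliberately
    // no way to flip this back to false.
    function lockBaseTokenURI() external onlyOwner {
        baseTokenURILocked = true;
    }

    // A typical ERC-721 tokenURI() just concatenates baseURI and the token ID,
    // which is why swapping baseURI retargets the whole collection at once.
    function tokenURI(uint256 tokenId) external view returns (string memory) {
        return string(abi.encodePacked(baseURI, toString(tokenId), ".json"));
    }

    // Minimal uint-to-string helper (in practice you'd use OpenZeppelin's
    // Strings.toString).
    function toString(uint256 value) internal pure returns (string memory) {
        if (value == 0) return "0";
        uint256 digits;
        for (uint256 v = value; v != 0; v /= 10) digits++;
        bytes memory buffer = new bytes(digits);
        while (value != 0) {
            digits--;
            buffer[digits] = bytes1(uint8(48 + (value % 10)));
            value /= 10;
        }
        return string(buffer);
    }
}
```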

So what?

Why are we doing it like this? Why didn’t we just upload all of our content to Arweave before the mint and call it a day? For Reasons! There are trade-offs and risks, and even assuming you work out the technical details, there’s a lot to consider from an operational point of view:

1. Metadata Leak Mitigation

People generally accept that the initial mint of a randomized collection is a gamble. You might end up with a sweet one-of-a-kind token, or it could be a dud. That’s part of the thrill, and waiting for the big reveal has become a common cultural dynamic, driving activity around an NFT collection.

But what if some buyers could identify the rarest pieces before the mint occurred and focus their buying on exactly those pieces? That leaves the most valuable tokens going to the most sophisticated players, and the leftovers going to the general public. (Of course your collection has no filler; I’m talking about other people’s collections.) That’s no fun, but it happens with some frequency.

One way this happens is by uploading all your metadata to any kind of hosting platform - in a discoverable way - ahead of the mint. These uploads are public by design, so pushing your full collection to IPFS or Arweave up front would allow some sly foxes to peek at the full dataset early.

Likewise, anything uploaded to S3 could conceivably be scraped if a snoop guessed the bucket’s domain name or URL scheme.

An easy way to avoid this leak is to upload only the minimum data necessary for your pre-reveal. It’s slightly easier to do that on mutable storage, since post-mint you can simply replace the JSON files rather than juggling baseURIs.
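
To make that concrete, here’s roughly what a single token’s metadata looks like in each state. Every value below is made up for illustration rather than pulled from Turf’s real files, and the attributes array just follows the common OpenSea-style convention for traits; the point is how little the pre-reveal file gives away.

Pre-reveal, just a name and the shared placeholder image:

```json
{
  "name": "Token #1234",
  "image_url": "https://example-bucket.s3.amazonaws.com/reveal.gif"
}
```

Post-reveal, the real image plus the trait data that rarity snipers would love to see early:

```json
{
  "name": "Token #1234",
  "image_url": "https://example-bucket.s3.amazonaws.com/1234.png",
  "attributes": [
    { "trait_type": "Background", "value": "Sunset" },
    { "trait_type": "Rarity Tier", "value": "Legendary" }
  ]
}
```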

2. Edits and Corrections

If your project has any amount of reasonably complex trait data, spread across thousands of programmatically generated tokens, you’re going to miss a few things before you mint. We sure did, and we worked with our community of holders to determine an acceptable way to correct that data over a period of weeks.

The immutable nature of blockchain storage would have prevented us from making these adjustments, as it’s designed to do.

If we had locked our data on Arweave from the start, we’d be stuck with some buggy metadata forever, so we started off on AWS S3 to give ourselves room for tweaks.

For what it’s worth, Turf has 5,041 tokens and 5.4MB of JSON representing the whole collection, and I don’t think we’re anywhere near as complex (or large) as some collections get.

If you’re adjusting metadata post-mint, you have to pay attention to the cultural implications, too. It’s not just about the JSON. People immediately assign value and perceptions of rarity to your tokens based on whatever traits are present. Altering them changes the calculus, for better or for worse, and you really have to tread lightly when it comes to people’s investments.

We engaged our owners and went to great lengths to ensure that any edits we made were consensual, even going as far as buying tokens back from holders who were otherwise resistant to the changes.

Bottom line, we’re trying to do this right.
