In this article, I will discuss how I designed and developed an entire SaaS, rarityrank.net (which has since been taken down).
Introduction to NFTs
As the pandemic struck the world, so did the NFT craze: from small influencers to big brands, everyone joined the NFT train.
A non-fungible token (NFT) is a unique digital asset consisting of data stored on a blockchain, a form of distributed ledger. The ownership of an NFT is recorded on the blockchain and can be transferred by the owner, allowing NFTs to be sold and traded.
So, as said above, NFTs are traded via marketplaces, but the final transactions are settled transparently on the blockchain.
There are a lot of marketplaces for trading NFTs; Opensea is notably the largest among the major players.
NFT value is driven by NFT rarity, which can be described as follows:
NFT rarity originated from the traits found in the CryptoPunks collection. Rarity drives a large part of the economy around collectible NFTs and generates excitement. It is the result of a calculation over the various traits found in the NFTs of a collection.
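To make the trait-based calculation concrete, here is a minimal Go sketch of the widely used rarity.tools-style score: each trait an NFT carries contributes the inverse of that trait value's frequency in the collection, so rarer values contribute larger terms. The toy collection and trait names below are hypothetical, not data from rarityrank.net.

```go
package main

import "fmt"

// NFT maps trait name -> trait value (hypothetical toy collection).
type NFT map[string]string

// rarityScores computes a rarity.tools-style score for every NFT:
// for each trait, add total / (number of NFTs sharing that trait value).
func rarityScores(collection []NFT) []float64 {
	total := float64(len(collection))

	// Count how many NFTs share each (trait, value) pair.
	counts := map[string]map[string]int{}
	for _, nft := range collection {
		for trait, value := range nft {
			if counts[trait] == nil {
				counts[trait] = map[string]int{}
			}
			counts[trait][value]++
		}
	}

	scores := make([]float64, len(collection))
	for i, nft := range collection {
		for trait, value := range nft {
			scores[i] += total / float64(counts[trait][value])
		}
	}
	return scores
}

func main() {
	scores := rarityScores([]NFT{
		{"Background": "Blue", "Hat": "None"},
		{"Background": "Blue", "Hat": "None"},
		{"Background": "Blue", "Hat": "Cap"},
		{"Background": "Gold", "Hat": "Cap"}, // rarest: unique Background value
	})
	for i, s := range scores {
		fmt.Printf("NFT %d score %.2f\n", i, s)
	}
}
```

With this toy data, the NFT with the one-of-a-kind Gold background scores highest (6.00 vs 3.33 for the rest), which matches the intuition that unique trait values dominate the score.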
A Few Screenshots of RarityRank.net
Individual Collection Page
Filters or Search Functionality
Individual NFT Metadata Page (shared by a user on our Discord channel :) )
Why I Built RarityRank.net
NOTE: I used Opensea as the golden source for our data, as it was the marketplace with the most trading volume.
An NFT collection can be traded as soon as it is listed on the marketplace, even before it is revealed; at that stage, most trades happen near the floor price.
However, once the collection is revealed, it takes a few hours before the metadata is synchronized on marketplaces. Once it syncs up, people can use tools like rarity.tools to find the rare ones and trade them using filters, etc.
What if we gave users a way to know the rare NFTs in a collection before the data is synced up on the marketplace, so that they could snipe ahead of time and buy rare ones near the floor price, before their price increases 10x to 100x or even more?
The NFT metadata is stored either on-chain or off-chain, depending on the owner of the collection.
I saw similar tools like Rarity Sniffer and Traits Sniper already doing this, so I decided to build something comparable that was as fast as them, or in some cases faster (which we achieved in most cases, based on naked-eye comparisons).
The high-level workflow looks like this:
- Onboard the NFT collection and check on a schedule whether the collection has been revealed.
- If revealed, pull all NFT metadata from the collection, based on how the data is stored.
- Pass all the pulled metadata to the rarity algorithm engine, which I worked out from a mathematical formula and which is close to the rarity.tools algorithm.
- Allow users to snipe up the rare ones by directing them to the Opensea marketplace if they are open to buy. (Whether an NFT is open to buy is determined by logic based on Opensea data pulled via their Assets or Events APIs.)
So my goals for the technical design and development were scale, speed, and elasticity, because as soon as an NFT collection is revealed, our users should know the rare ones in the collection and snipe them up.
On average, based on the data analysis I did, a single collection has around 8,000-10,000 NFTs, which amounts to the same number of HTTP or blockchain calls to fetch the metadata for each NFT in the collection. We set ourselves an SLA of 30 seconds for a collection of 10k NFTs to perform all these calls, run the rarity algorithm engine, and persist the data. In other words: fork the requests (HTTP or blockchain, ~10k per collection on average) and join the data to run it through the rarity algorithm engine.
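The fork/join pattern described above can be sketched with a bounded pool of goroutines: fan out one fetch per token, then join all the results before scoring. This is a simplified sketch, not the production fetcher; `fetchMetadata` is a hypothetical stand-in for the real HTTP or blockchain call.

```go
package main

import (
	"fmt"
	"sync"
)

// fetchMetadata stands in for one HTTP or blockchain call (hypothetical).
func fetchMetadata(tokenID int) string {
	return fmt.Sprintf("meta-%d", tokenID)
}

// forkJoin fans one fetch per token out across a bounded worker pool,
// then joins the results so they can be fed to the rarity engine.
func forkJoin(numTokens, workers int) []string {
	jobs := make(chan int)
	results := make([]string, numTokens)
	var wg sync.WaitGroup

	for w := 0; w < workers; w++ {
		wg.Add(1)
		go func() {
			defer wg.Done()
			for id := range jobs {
				results[id] = fetchMetadata(id) // each index is written by exactly one goroutine
			}
		}()
	}
	for id := 0; id < numTokens; id++ {
		jobs <- id
	}
	close(jobs)
	wg.Wait() // join: block until every fetch has completed
	return results
}

func main() {
	metas := forkJoin(10000, 200)
	fmt.Printf("fetched %d metadata records\n", len(metas))
}
```

Bounding the pool (200 workers here, an arbitrary illustrative number) matters in practice: 10k unbounded goroutines all opening sockets at once would hit file-descriptor limits and upstream rate limits long before the 30-second SLA did.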
So I had a mix of both long-running and short-lived processes; however, in the long term, had I not decided to shut it down, I would have moved about 85% of the application to short-lived processes for a more cost-effective solution.
The Crux of Building Rarity Rank
The crux of my application was fetching the metadata of ~10k NFTs per collection on average, either via HTTP or blockchain calls. So I decided to use a mix of a Golang goroutine-based wrapper proxy for the HTTP calls and NodeJS for the blockchain calls, due to its better web3 library support.
High-Level Architecture Workflow
The diagram below depicts the RarityRank.net architecture, which I personally designed and developed. (It gives me immense pleasure, tbh :) )
I built it with a mind to keep services loosely coupled, each adhering to a specific piece of functionality, a superset of FaaS.
I used serverless in a few cases where the functions were purely stateless, doing the job and returning data to clients, as I wanted to keep them lightweight.
The design was implemented as shown above; however, there was still a lot of scope for improvement. I wanted it to be 100% event-driven, but due to time and cost constraints, I skipped that final version, which differed from the one above.
I won't go into what each individual microservice or Lambda function did, as the diagram gives the whole picture of what the design did and how it worked. If you want to know more, let me know and I will update this article accordingly.
Tech Stack Used for RarityRank.net
- ReactJS coupled with Ant Design for UI/UX.
- Middleware consisted of NodeJS, Golang, and Java.
- Postgres for data, with Redis in a write-around caching strategy.
- AWS services such as BeanStalk, Lambda, API Gateway, etc. (I wanted to use SQS along with Step Functions for more event-driven control; next time.)
- LogRocket for UI Session Monitoring.
- Firebase for User management and Authentication.
- It cost me 10K INR per month to keep it running, with 50+ contracts onboarded.
- Over 1,500 users signed up (data captured from Firebase).
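For readers unfamiliar with the write-around strategy mentioned in the stack list: writes go straight to the database, bypassing the cache, and the cache is only populated on read misses, which suits write-heavy data that is read later. Here is a minimal sketch, with plain maps standing in for Redis and Postgres (not the production code):

```go
package main

import "fmt"

// WriteAround models the cache strategy; maps stand in for
// Redis (cache) and Postgres (store) in this sketch.
type WriteAround struct {
	cache map[string]string
	store map[string]string
}

// Write goes straight to the store, bypassing the cache entirely;
// any stale cached copy is invalidated.
func (w *WriteAround) Write(key, val string) {
	w.store[key] = val
	delete(w.cache, key)
}

// Read checks the cache first and fills it from the store on a miss.
// The second return value reports whether it was a cache hit.
func (w *WriteAround) Read(key string) (string, bool) {
	if v, ok := w.cache[key]; ok {
		return v, true // cache hit
	}
	v, ok := w.store[key]
	if ok {
		w.cache[key] = v // populate cache on miss
	}
	return v, false
}

func main() {
	wa := &WriteAround{cache: map[string]string{}, store: map[string]string{}}
	wa.Write("collection:1", "revealed")
	_, hit := wa.Read("collection:1") // first read after a write is a miss
	fmt.Println("first read hit:", hit)
	_, hit = wa.Read("collection:1") // cache is now warm
	fmt.Println("second read hit:", hit)
}
```

This fits the RarityRank workload: a burst of ~10k writes at reveal time never churns the cache, while the handful of collections users are actively browsing stay cached after the first read.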
Voila! Let me know your views and questions. I will be writing low-level technical blogs on the things I learned while building rarityrank.net.
Thank you for reading. If you have made it this far, please like the article; it will encourage me to write more such articles. Do share your valuable suggestions, I appreciate your honest feedback!