Dune Analytics is a tool for analyzing blockchain data. The Dune team gathers data from various blockchain sources, combines it into user-friendly SQL databases, and then allows users to freely query those databases. They also let you create visualizations from your queries, and you can combine several of these visualizations into a single cohesive dashboard.
The best part? All of these services are completely free for anyone to use. This is an amazing public service for those looking to learn more about and explore crypto. As I’m interested in several aspects of the crypto space, I wanted to dive into learning Dune Analytics by setting up a simple dashboard. This post will describe my learning process and the steps required to create a dashboard.
One of the Web3 projects I’m interested in is Poolsuite FM, which is difficult to describe. At its core, it’s a music curation service with hand-selected playlists for different summer vibes. But, it’s much more than that: they have an incredibly cool retro website, a line of sunscreen, and are planning several other projects. I described it to a friend as “vibe incarnate.” You kind of just need to explore the website to get it.
They recently launched a line of NFTs, which will eventually allow the NFT holders to access exclusive content and features. There are two separate NFTs: a Pool Member pass and an Executive Member pass. For my first Dune Analytics dashboard, I wanted to create a simple set of charts to track the price of each of these NFTs over time.
The first step of using Dune is creating a query. They have a list of all the tables available, and the one I’m interested in is nft.trades, which stores NFT transaction data. Here’s what the interface looks like:
The top pane is the query builder, and the bottom half is the results table. Above the results table there’s an option to build visualizations from your query results, which can later be added to dashboards.
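Before doing anything fancy, it helps to just peek at the table. Here’s a minimal sketch of that kind of query (it assumes only the nft.trades columns used later in this post: block_time, usd_amount, and nft_contract_address):

```sql
-- Peek at a handful of recent trades to see what the columns contain.
SELECT block_time,
       nft_contract_address,
       usd_amount
FROM nft.trades
ORDER BY block_time DESC
LIMIT 10
```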
My first step was to get a basic query running, because I’m not great at SQL syntax. I typically use dbplyr to connect to SQL databases and pull data, and so I very rarely write actual SQL queries by hand. Fortunately, I recently got access to the OpenAI beta, including its extraordinary Codex model, which can turn natural language into code. This is an excellent tool for learning the syntax of a new language. For example, here is what I typed in to understand how to group and average data:
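The screenshot isn’t reproduced here, but the exchange went roughly like this: a natural-language comment as the prompt, with Codex completing the SQL below it (this is a reconstruction of the idea, not the exact prompt or output):

```sql
-- Prompt: group the trades by day and compute the average usd_amount per day
SELECT date_trunc('day', block_time) AS time,
       AVG(usd_amount) AS avg_usd
FROM nft.trades
GROUP BY time
```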
Everything after SELECT was generated by OpenAI, which is honestly almost hard to believe. I’ll have a blog post in the future demoing some more incredible aspects of OpenAI. For this project, I basically used it as a supercharged Stack Exchange, letting me instantly find the correct syntax with just a few lines of input.
With this example, I learned the basics of grouping syntax, and put together the following query after some trial and error. The two nft_contract_address values that I filter on are the addresses for each of the NFTs, which I found by searching on
WITH exec AS (
  SELECT date_trunc('day', block_time) AS time,
         AVG(usd_amount) AS exec_avg,
         SUM(usd_amount) AS exec_sum,
         nft_contract_address
  FROM nft.trades
  WHERE nft_contract_address = '\xb228d7b6e099618ca71bd5522b3a8c3788a8f172'
  GROUP BY time, nft_contract_address
),
member AS (
  SELECT date_trunc('day', block_time) AS time,
         AVG(usd_amount) AS member_avg,
         SUM(usd_amount) AS member_sum,
         nft_contract_address
  FROM nft.trades
  WHERE nft_contract_address = '\x123214ef2bB526D1b3FB84A6D448985F537d9763'
  GROUP BY time, nft_contract_address
)
SELECT member_avg,
       member_sum,
       exec_avg,
       exec_sum,
       exec.time,
       member_avg / exec_avg AS member_exec_avg_ratio,
       exec_sum / exec_avg AS n_exec,
       member_sum / member_avg AS n_member
FROM member
FULL JOIN exec ON exec.time = member.time
A few notes on issues I ran into, and things I learned:
Double quotes do not work for strings in SQL! This took me an embarrassingly long time to figure out. SQL reserves double quotes for identifiers like column names; string literals need single quotes. I’m used to using double quotes, so much of my query was breaking completely.
date_trunc was returning either a string or integer representing the date, rather than a date class column, which was really messing with my resulting visualizations (all of the date labels were very ugly).
Reshaping SQL tables to wide format seems pretty difficult, which messed with my plan to create a single table and then reshape it. Instead, as you can see, I created two tables and merged them together. It’s not the most elegant solution, but for this toy project it seems to work just fine.
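To make those lessons concrete, here’s a small sketch (hypothetical, not the query I actually used) that rolls all three into one statement: single quotes for string literals, a ::date cast to get a proper date column out of date_trunc, and conditional aggregation as an alternative way to pivot to wide format without joining two tables:

```sql
SELECT date_trunc('day', block_time)::date AS day,  -- ::date keeps the axis labels clean
       -- Single quotes for the address literals; double quotes would mean identifiers.
       -- AVG skips the NULLs produced by CASE, so each column only averages its own NFT.
       AVG(CASE WHEN nft_contract_address = '\xb228d7b6e099618ca71bd5522b3a8c3788a8f172'
                THEN usd_amount END) AS exec_avg,
       AVG(CASE WHEN nft_contract_address = '\x123214ef2bB526D1b3FB84A6D448985F537d9763'
                THEN usd_amount END) AS member_avg
FROM nft.trades
GROUP BY day
```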
With that query running, I was left with a table of the average USD price and number of transactions for each type of NFT. The next step was to create some visualizations. Dune has its own built-in visualization setup, allowing you to make various simple charts. It doesn’t let you customize every last tiny detail, but it’s great for quickly throwing some charts together.
After you make some visualizations, the real magic is the dashboards. A dashboard is a collection of visualizations and text to give a variety of data on a single topic. They’re also publicly available by default, so everyone can see your dashboard, and you can see others' for inspiration. For example, here’s a dashboard showing off Opensea’s mind-boggling growth over the past few months:
You can check out the dashboard I made here. Here’s a screenshot:
I’m pretty happy with the result, especially considering the amount of time it took to make. I threw all of this together, including learning the SQL syntax, in only an hour or two. And the results are useful: you can see the price of the NFTs in USD, rather than ETH. This way, it’s easier to get a sense of whether the NFTs' prices are tied to ETH or to USD. From my quick reading, it does look like the Executive Member NFT’s price is correlated with ETH, indicating that people are primarily thinking in ETH terms when purchasing these NFTs. For a future project, maybe I’ll try to analyze this question more: what currency are NFTs really priced in?
I had a great first experience with Dune Analytics, and look forward to using it more in the future.