SCF Lab Fund 1 Retrospective

Tyler van der Hoeven
Published in Stellar Community · Nov 4, 2020 · 5 min read

The SCF Lab Fund №1 has closed and we’ve got the final results!

This lab fund was the first of its kind, and as such was quite experimental. Let's begin with the final results, then detail how this lab fund performed and what changes may be needed for future Lab and Seed SCFs.

Project | Vote Share | XLM Awarded
Anara Farmers Market | 22.5225% | 112,612 XLM
Ayadee TRAK Supply Chain System | 19.4337% | 97,168 XLM
Lumenswap | 15.3797% | 76,898 XLM
Blocknify | 10.2960% | 51,480 XLM
DEB: Ride-Share App Built on Stellar Network | 10.2317% | 51,158 XLM
Crypto Link | 8.1725% | 40,862 XLM
EduNode: Meet, learn and build on the Stellar Network | 7.0785% | 35,392 XLM
Stellarscam.report | 5.4054% | 27,027 XLM
Relax | 1.4801% | 7,400 XLM

This round we required voters to register via SMS. We had 682 SMS registrations, and of those 491 successfully submitted a vote. So while we did get a lot of folks through the system, we also lost a few along the way. Our takeaway: the user experience for flaggable quadratic voting (FQV) could use some tweaking. In future rounds, we plan to improve the UX and the explanatory content to make it easier to participate in — and successfully complete — the process.
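For readers unfamiliar with the mechanic, here's a minimal sketch of how a quadratic-voting tally works. The 100-credit budget and the treatment of flags as signed downvotes are illustrative assumptions for this sketch, not SCF's actual implementation:

```typescript
// Minimal sketch of a quadratic-voting tally. The 100-credit budget and
// the signed-downvote rule are illustrative assumptions only.

type Allocation = { project: string; credits: number; downvote?: boolean };

const CREDIT_BUDGET = 100; // assumed per-voter budget

// Effective votes grow with the square root of credits spent, so each
// additional credit on the same project buys less influence.
function effectiveVotes(credits: number): number {
  return Math.sqrt(credits);
}

function tallyBallot(ballot: Allocation[]): Map<string, number> {
  const spent = ballot.reduce((sum, a) => sum + a.credits, 0);
  if (spent > CREDIT_BUDGET) throw new Error("ballot exceeds credit budget");

  const totals = new Map<string, number>();
  for (const a of ballot) {
    const sign = a.downvote ? -1 : 1; // flags count against a project
    totals.set(
      a.project,
      (totals.get(a.project) ?? 0) + sign * effectiveVotes(a.credits)
    );
  }
  return totals;
}

// Example ballot: 100 credits split across three projects, with one flag.
console.log(tallyBallot([
  { project: "Project A", credits: 50 },
  { project: "Project B", credits: 25 },
  { project: "Project C", credits: 25, downvote: true },
]));
```

Because effective votes grow with the square root of credits, piling credits onto one project yields diminishing returns, which is what pushes the final sentiment distribution toward the flatter shape discussed below.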

The reason we switched to an SMS verification flow was to prevent fraudulent votes, but it turns out that while SMS verification does slow down bad actors, it can’t prevent them entirely. Of those 491 accounts that submitted votes, we disqualified 181, leaving us 310 valid voters. Those disqualifications are the reason these final results don’t match the results that appeared on the Stellar Community Fund voting site Sunday night.

How did we decide to disqualify votes? Rather than doing it unilaterally, we enlisted the help of the `scf_panel` Keybase group, which consists of the SDF employees and community members who make up the jury pool. Together we landed on a number of objective criteria for evaluating vote validity and applied them judiciously to screen out votes. We won't divulge the exact criteria, since doing so could create a playbook for bad actors, but we took advantage of all the data we had, including IPs, time zones, and timestamps. It isn't just me manually combing through thousands of Keybase profiles anymore.
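To make that kind of signal-based screening concrete, here's a purely hypothetical example; flagging ballots that cluster on one IP address illustrates the general approach, and is not one of the actual criteria:

```typescript
// Purely hypothetical illustration of signal-based vote screening; the
// real SCF criteria are deliberately undisclosed. Here a vote is flagged
// when too many ballots share the same IP address. A real screen would
// combine several signals (time zones, timestamps, etc.).

type Vote = { voterId: string; ip: string; submittedAt: Date };

const MAX_VOTES_PER_IP = 3; // arbitrary threshold for this sketch

function flagSuspiciousVotes(votes: Vote[]): Set<string> {
  // Group ballots by originating IP.
  const byIp = new Map<string, Vote[]>();
  for (const v of votes) {
    const group = byIp.get(v.ip) ?? [];
    group.push(v);
    byIp.set(v.ip, group);
  }

  // Flag every voter in an over-represented IP cluster.
  const flagged = new Set<string>();
  for (const group of byIp.values()) {
    if (group.length > MAX_VOTES_PER_IP) {
      for (const v of group) flagged.add(v.voterId);
    }
  }
  return flagged;
}
```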

Despite the need to disqualify those votes, this round was a definite success, and it's worth focusing on the positives. Nine amazing projects are receiving funding. The panel made good choices with the finalists. Voter manipulation is down significantly from SCF 5, and the process we used to identify bad actors is more decentralized and objective. The mechanics of FQV, while perhaps not explained sufficiently, did produce the intended result: a more logarithmic sentiment distribution instead of a stagnant linear one. It was a great SCF considering the dramatic changes that version 2.0 introduced, and I'm excited to implement what we've learned in the upcoming Seed Fund.

Finally, some interesting stats and some comments on the future of the SCF:

Downvotes aren’t used much.

On average, only 10–15% of an entry's votes were downvotes, which is what we were hoping to see. Negative sentiment is important, and being able to express it is a powerful signal. However, if negativity were used more often than positivity, it would cripple the system's ability to appropriately allocate funds. Seeing downvotes used at all, and especially used in the intended proportion, was great.

The average time spent selecting and confirming votes is 90 seconds.

People aren’t reading proposals. They’re just quickly clicking on projects they’ve been sent to upvote.

The percentage of votes coming from users who put all their credits on one project is very high.

This directly correlates with the fact that people aren't reading proposals. They're showing up to vote for a specific project, they spend all their credits on it, and then they're done. We may be able to increase voter engagement by making it easier to access information about projects, but ultimately this kind of focus may be inevitable: voters vote for the project that directed them to the community fund page in the first place, and they don't concern themselves with the rest of the contestants. I had hoped more folks would spread their votes, and the fact that they didn't will be the biggest input into how the SCF is improved. The community panel is likely to play a much bigger role, not only in selecting finalists but in distributing some, most, or even all of the funds.
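What makes the all-in behavior doubly interesting is that, under quadratic voting, concentrating credits is the least efficient way to spend them. A quick back-of-the-envelope comparison, again assuming a 100-credit budget:

```typescript
// Effective votes grow with the square root of credits spent, so a voter
// who concentrates loses influence relative to one who spreads.
// The 100-credit budget is an assumption for illustration.

const concentrated = Math.sqrt(100);   // all-in on one project: 10 effective votes
const spread = 4 * Math.sqrt(25);      // 25 credits on each of four projects: 20 effective votes

console.log({ concentrated, spread }); // { concentrated: 10, spread: 20 }
```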

As always, if you have feedback, questions, or concerns, please feel free to reach out in our Keybase group.

Lastly, I want to thank everyone who participated for shooting your shot with us in this first experimental SCF Lab Fund round. It was a wild ride, and I cannot thank you enough for your patience, feedback, and effort. I hope you all continue building and innovating on Stellar and find great success as you press on.

— Tyler, SDF Ecosystem Evangelist
