Objectives
The overall goal and objectives for this contributor funding experiment
Last updated
The overall goal of this contributor funding experiment is to learn as much as possible about operating a contributor funding process and the potential impact that contributor funding could have for Web3 ecosystems. This initial experiment will focus on open source contributor funding and will trial a number of previously suggested ideas. The objectives for this experiment are focused on generating a large amount of data to learn about how these different approaches perform in practice. The data and feedback generated will be valuable for analysing the effectiveness of contributor funding and how it differs from other funding processes across the industry. The knowledge and experience gained will also help to identify the most promising ideas for improving this funding process in subsequent experiments. The following objectives are data focused and are intended to support the goal of learning as much as possible about operating a contributor funding process.
Record prioritisation suggestion data
A prioritisation suggestion board will help with learning about what priorities the community thinks are important and how those priorities get addressed or change over time. A separate priority process means that suggestions can be submitted and upvoted at any time. It will be insightful to understand how these priorities influence the efforts of the contributors and what impact this process could have in supporting a more flexible and dynamic contribution process that responds to these suggestions. Measuring this objective:
Priority suggestions - The priority suggestions board will be publicly available to allow any community members to make suggestions and upvote or give feedback to existing suggestions. This data can be collected and analysed at the end of the funding experiment to understand how this influenced the contribution outcomes.
Contribution tasks - A contribution task board will be public and enable anyone in the community to see what a contributor is working on. This board will provide data about what actual contribution efforts get focused on, which can then be compared to the priorities that have been suggested. It should also be insightful to see how community members give feedback to contributors based on the tasks they are executing and the priorities which are currently being suggested and endorsed by the community.
Contribution logs - Contribution logs will make it easier to see exactly what contribution outputs have been delivered. The aggregation of these contribution outputs can be analysed and compared with the suggested priorities to see how the community suggestions actually impacted what got executed.
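One simple analysis this data enables is measuring how far the community's suggested priorities overlapped with what was actually delivered. The sketch below uses hypothetical priority and task tags (the real boards would supply this data; the tag names and matching rule are assumptions for illustration):

```python
# Hypothetical tags collected from the suggestion board and the task board.
# The real experiment's data would replace these illustrative values.
suggested_priorities = {"docs", "wallet-ux", "security-audit", "sdk"}
delivered_task_tags = {"docs", "sdk", "ci-tooling"}

# Priorities the community suggested that contributors actually addressed.
addressed = suggested_priorities & delivered_task_tags
coverage = len(addressed) / len(suggested_priorities)
print(sorted(addressed), coverage)  # ['docs', 'sdk'] 0.5
```

A richer analysis could weight each priority by its upvote count, but even this simple overlap gives a first signal of how suggestions influenced execution.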
Record contributor proposal data
The submission of contributor proposals and the selection of those proposals can provide insightful data about what information is most important and which factors were the most correlated with proposal success. This data can help with future efforts to minimise the information that is needed for voters to make well informed decisions so that the voting process can become more efficient and scalable. Measuring this objective:
Contributor proposals submitted - All of the contributor proposals submitted should provide useful data about how different people have approached writing a contributor proposal.
Contributor proposal submission feedback - The proposers might have valuable feedback about the submission process and provide ideas about what other information could be added.
Voting results - The results from the voting process can help with identifying any trends in how the voters responded to each proposal based on the contents of the proposal.
Voting process feedback - Voters may provide feedback about the information that is included in the proposals and what information they believe should be included in the future.
Record voting data
Expressive approval voting is an adaptation of approval voting that should help to increase the expressiveness of this voting process. This suggested voting system will be used in the contributor funding experiment and the voting results data can be analysed to see how people vote in practice. This data will provide more evidence about how effective this approach is and whether it should be trialled in a larger experiment. Measuring this objective:
Voting results - Expressive approval voting will be used in this funding experiment. This voting system can be trialled using existing services and tools such as Google Forms and Google Sheets. Anonymised voting data can then be released to the public for further review and analysis.
Voting process feedback - Voters will be asked to provide their feedback about the voting process when they complete the contributor selection decision process. This should help to identify any concerns or issues that people had with the voting approach and any suggestions for improvement.
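As a sketch of how the anonymised results might be tallied, the snippet below assumes a hypothetical CSV export from Google Forms in which each voter marks each proposal as "approve", "neutral", or "against". The exact ballot options used by expressive approval voting are an assumption here, not something this document specifies:

```python
import csv
import io
from collections import Counter

# Hypothetical anonymised export: one row per voter, one column per proposal.
sample_csv = """voter,proposal_a,proposal_b,proposal_c
v1,approve,neutral,against
v2,approve,approve,neutral
v3,neutral,approve,against
"""

def tally(csv_text):
    """Count each response level per proposal from the anonymised results."""
    reader = csv.DictReader(io.StringIO(csv_text))
    counts = {}
    for row in reader:
        for proposal, response in row.items():
            if proposal == "voter":
                continue
            counts.setdefault(proposal, Counter())[response] += 1
    return counts

results = tally(sample_csv)
# Rank proposals by approvals; other scoring rules are equally possible.
ranking = sorted(results, key=lambda p: results[p]["approve"], reverse=True)
print(ranking)  # ['proposal_a', 'proposal_b', 'proposal_c']
```

Because the raw rows are anonymised, anyone in the community could rerun or extend an analysis like this on the published data.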
Record contribution task board data
A public contribution task board should provide useful insights about how contributors coordinate themselves and execute different tasks. Contribution task data should also provide some initial ideas about what might be needed if a custom solution were developed to handle the suggestion and development of ideas. Measuring this objective:
Contribution task board - Contribution tasks will be publicly created and managed on a contribution board. This data can be reviewed during and at the end of the funding process.
Contribution task feedback - Any community member can provide feedback to a contributor about a task they are working on. It will be insightful to see how much feedback is given to contributors and what the value and impact is of the feedback given.
Record contribution log data
Contribution logs - Contributors will be required to submit their contribution logs to provide evidence of the contribution outputs they have generated in that month. All contributions can be recorded publicly as all contribution outcomes must be open source. Contribution logs will be open source and available on a code repository for anyone to collect and review.
Contribution attestations - Community members can give attestations about someone's contribution efforts. This will provide more evidence in situations where contribution efforts are harder to digitally record and verify.
Funding process contributor feedback - Contributors will be asked to provide their feedback about the contribution log process once they have completed the funding process. This feedback could be highly valuable in understanding how simple or complex this process was for them and what the overall sentiments are.
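Since the logs will live in an open code repository, simple tooling can aggregate them. The sketch below assumes a hypothetical format in which each monthly log is a JSON record listing contribution outputs; the actual log format is not specified by this document:

```python
import json

# Hypothetical monthly contribution logs (the real format is not defined here).
logs = [
    json.loads('{"contributor": "alice", "month": "2024-01", "outputs": ["docs PR", "bug fix"]}'),
    json.loads('{"contributor": "alice", "month": "2024-02", "outputs": ["feature spec"]}'),
    json.loads('{"contributor": "bob", "month": "2024-01", "outputs": ["code review"]}'),
]

def outputs_by_contributor(entries):
    """Aggregate every logged output under the contributor who delivered it."""
    aggregated = {}
    for entry in entries:
        aggregated.setdefault(entry["contributor"], []).extend(entry["outputs"])
    return aggregated

summary = outputs_by_contributor(logs)
print(summary["alice"])  # ['docs PR', 'bug fix', 'feature spec']
```

Keeping the logs machine-readable like this would make the end-of-experiment aggregation and comparison work far easier, whatever concrete format is chosen.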
Record collaboration data
A contributor funding process could help with creating a highly collaborative environment. Contributors in most cases will have the flexibility to work on any contribution area that they believe will generate impact for the ecosystem. Contributors are not tied to a single idea and can contribute towards many ideas or they can change the idea they are working on if something more impactful presents itself. Much of the data captured in this experiment should be useful for assessing how much collaboration has occurred which can then be compared with other funding processes. Measuring this objective:
Priority suggestions - Priority suggestions will generate valuable information about how people collaborate and discuss different suggestions and how those suggestions eventually lead to executed ideas.
Contribution tasks - Contribution tasks will show how contributors are collaborating together and with other ecosystem projects. Contributors may also need to respond to community feedback.
Contribution logs - Contribution logs will highlight what actually happened and what was delivered. This data will be highly insightful to see how many ideas a contributor worked on, how they collaborated with different teams and which contribution styles were the most effective.
Contributor peer reviews & feedback - Contributors and community members could provide insightful information about the impact and relevance of certain contribution efforts and how those contributions were valuable.
Record voter and contributor participation time data
Data about how long it takes voters and contributors to participate in this experiment will be useful for making comparisons with other funding processes. For contributors it will be useful to know how long it takes them to write their proposal to be considered as a potential contributor. For voters it will be useful to know how long it takes them to read and select the most promising contributors during voting. Measuring this objective:
Contributor proposal submission feedback - Contributors will be asked to keep a record of how long they spend creating their proposal. A proposal submission feedback form will then help with capturing this information.
Voting process feedback - Voters will be asked to keep a record of how long they spent reading profiles and selecting contributors. A contributor selection process feedback form will then help with capturing this information.
Average read and write times - Estimated read and write times can be determined by multiplying the total number of words in a contributor proposal by the average time it takes someone to read or write a word of text. These estimates can then be compared against the actual times reported by voters and contributors.
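The estimate above can be sketched as follows. The rates used are illustrative assumptions, not values fixed by the experiment: silent reading speeds of roughly 200 to 250 words per minute are commonly cited, and composed writing is assumed here to be an order of magnitude slower:

```python
def estimated_minutes(word_count, words_per_minute):
    """Estimate time spent from a proposal's word count and an average rate."""
    return word_count / words_per_minute

# Illustrative averages (assumptions, not values prescribed by the experiment):
READ_WPM = 240   # typical silent reading speed
WRITE_WPM = 20   # composing text is far slower than reading it

proposal_words = 1200
read_minutes = estimated_minutes(proposal_words, READ_WPM)    # 5.0
write_minutes = estimated_minutes(proposal_words, WRITE_WPM)  # 60.0

# These baselines can then be compared with the times that voters and
# contributors self-report in the feedback forms.
print(read_minutes, write_minutes)
```

A large gap between the estimated and self-reported times would itself be useful data, for example indicating that proposals take longer to evaluate than their word count suggests.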
Record voter and contributor opinions and preference data
After the funding process experiment is completed the opinions and preferences of the contributors and voters who participated can be recorded. For contributors it will be useful to know what they thought of the process, what they intend to do next and whether they prefer this funding process versus other approaches. For voters it will be useful to know what they thought of the process and whether they believe it was effective or not overall. Measuring this objective:
Priority suggestions - Anyone in the community can submit priority suggestions and provide feedback to any other suggestion that has been shared. This will be useful for finding out the breadth and diversity of opinion and preferences there are in the ecosystem.
Contribution task feedback - Any feedback given could be useful for identifying different ways to execute a task or sharing things that the contributor might not have considered or known about.
Contributor peer review & feedback - What other people say about another contributor could be very insightful for seeing some honest opinions and preferences that people might have about the contributors and funding process. This data could reveal a number of opinions and preferences such as what expectations were missed or how one contributor might have impacted them personally.
Funding process contributor feedback - Asking contributors about what they thought of the funding process, what their next steps are in the ecosystem and whether they preferred this funding process versus other approaches could be highly valuable feedback. This information should be useful for spotting any trends in what contributors think about the funding process and what they end up doing next after having this experience. Contributors may end up working with collaborators they met during the funding process or they might prefer to continue being a funded contributor in a future funding round.
Funding process voter feedback - Voters will be asked about what they thought of the funding process and how effective it was at generating impact for the ecosystem. Each voter could have a different perspective about what outcomes they expected to happen with the funding process. This feedback should help to identify shortcomings of the funding process or where it might have met their expectations.
Record funding process outcomes data
Each contributor could contribute towards a range of open source projects. These contributions will all be open source. At the end of the experiment the final contribution outcomes can be aggregated to better understand what outcomes have been achieved. Comparisons can then be made about what has been achieved using this experiment against what might have happened with another funding process. Understanding and comparing these outcomes should be highly valuable for determining whether a contributor funding process is a promising long term solution. Measuring this objective:
Contribution logs - The contribution logs will point towards all of the contributions that each person has made and the different projects a contributor has collaborated with. These contributions can be aggregated together to understand what actual outcomes have been generated overall. It will also be useful for identifying any trends in how people contribute and which of those contribution behaviours have been the most effective.
Funding process voter feedback - Voter feedback at the end of the funding process will provide valuable information about what the voters thought about the funding process as a whole based on the outcomes that were actually generated.
Funding process contributor feedback - Contributors will give their feedback about the funding process at the end of their funded contribution period. Contributors may want to continue being funded by the same funding process or they might be looking to join an existing project team in the ecosystem. Learning about any of these outcomes would be highly valuable for understanding how successful the experiment has been for the contributors. Contributor feedback should also help with understanding the overall sentiment about what has been achieved and whether this format was preferable to them over any other funding approaches they might have experienced.
Contribution logs represent a simple and potentially highly effective way of recording and verifying contribution efforts. Contribution log data will help with making comparisons with the contribution verification approaches used in other funding processes to see which ones are most effective. Measuring this objective: