All six activities below are potentially useful for a full distributed decision-making process, involving as many participants as you like – 10s, 100s or 1000s of people, depending on your application.
Realistically, though, you’re likely to use fewer activities – e.g. perhaps just a Preferences Survey on its own, or a Ranking Survey followed by a Preferences Survey, or other combinations.
Simply enter people’s email addresses into 1000minds, and they’ll be invited to take part. Participants can also self-enrol from a sign-up webpage, which is great for ‘convenience’ or ‘snowball’ sampling.
As well as 1000minds surveys, you can embed another online survey (e.g. Wufoo, SurveyMonkey, Google Forms) into your survey to collect other information of interest such as participants’ socio-demographic data.
This survey – also known as a Conjoint Survey when 1000minds is used for conjoint analysis – entails participants answering questions involving trade-offs between pre-specified criteria.
Participants’ answers determine, via the PAPRIKA method, their individual preference values (or ‘part-worth utilities’) – representing the relative importance (weights) of the criteria – as well as the average values for the group.
The preference values can be used to rank alternatives, and it’s easy to compare participants’ rankings, preference values and answers to the survey questions.
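As an illustration of how preference values rank alternatives, the sketch below sums each alternative’s part-worth utilities across the criteria and sorts by the total. All criteria, levels and numbers here are hypothetical, not output from 1000minds or the PAPRIKA method itself:

```python
# Hypothetical part-worth utilities (preference values) per criterion level,
# e.g. as might come out of a Preferences (or Conjoint) Survey.
utilities = {
    "cost":    {"high": 0.0, "medium": 0.15, "low": 0.30},
    "quality": {"poor": 0.0, "good": 0.25, "excellent": 0.45},
    "speed":   {"slow": 0.0, "fast": 0.25},
}

# Each alternative is described by its level on each criterion.
alternatives = {
    "Option A": {"cost": "low", "quality": "good", "speed": "slow"},
    "Option B": {"cost": "medium", "quality": "excellent", "speed": "fast"},
    "Option C": {"cost": "high", "quality": "excellent", "speed": "fast"},
}

def total_score(levels):
    """Sum the part-worth utilities for an alternative's criterion levels."""
    return sum(utilities[criterion][level] for criterion, level in levels.items())

# Rank alternatives from highest total score to lowest.
ranking = sorted(alternatives, key=lambda a: total_score(alternatives[a]), reverse=True)
for alt in ranking:
    print(alt, round(total_score(alternatives[alt]), 2))
```

Because the scores are additive, swapping in each participant’s own utilities (or the group averages) immediately yields that participant’s (or the group’s) ranking for comparison.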
This is like the Preferences Survey above, except that participants as a group reveal their preference values – representing the relative importance (weights) of the criteria – by voting on their decisions online.
Alternatively, participants can use a decision-support centre or their laptops from a shared location.
This activity reveals the preference values for the group of participants as a whole, and can be used to rank alternatives.
This survey involves participants ranking descriptions of real or hypothetical alternatives intuitively. Significant disagreements about rankings attract people’s attention, and may indicate the need for a new prioritization approach (e.g. using 1000minds!).
By having participants discuss their rankings, criteria for differentiating amongst alternatives can be teased out and then used to create a 1000minds decision model.
Participants can also work together to rank alternatives by consensus, for use later as a pseudo-gold standard to be compared against rankings from other group decision-making activities.
Participants categorise descriptions of real or hypothetical alternatives on pre-specified criteria. Any disagreements about alternatives’ categorizations might highlight issues with how criteria are worded, so that the criteria can be refined.
Participants can also work together to rank the alternatives by consensus.
Alternatives can also be ranked based on how they were categorised, by applying the preference values (or ‘part-worth utilities’) – e.g. from a Preferences (or Conjoint) Survey (as above).
This activity enables participants’ rankings of alternatives from the various other activities (as above) to be easily compared. Ranking similarities and differences are useful for considering the rankings’ face validity.
For example, some users apply the consensus ranking from a Ranking Survey as a pseudo-‘gold standard’ to be compared against rankings from other group decision-making activities.
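One simple way to quantify how closely two such rankings agree is a rank correlation, such as Spearman’s rho. The sketch below uses hypothetical rankings and a minimal no-ties implementation; it is an illustration of the general idea, not a feature of 1000minds:

```python
# Hypothetical rankings of the same five alternatives from two activities,
# e.g. a consensus Ranking Survey (pseudo-'gold standard') vs a Preferences Survey.
consensus = ["A", "B", "C", "D", "E"]
survey    = ["B", "A", "C", "E", "D"]

def spearman_rho(rank1, rank2):
    """Spearman rank correlation for two rankings of the same items (no ties)."""
    n = len(rank1)
    position = {item: i for i, item in enumerate(rank2)}
    # Sum of squared differences between each item's positions in the two rankings.
    d_squared = sum((i - position[item]) ** 2 for i, item in enumerate(rank1))
    return 1 - 6 * d_squared / (n * (n * n - 1))

print(spearman_rho(consensus, survey))  # close to 1 means strong agreement
```

A rho near 1 suggests the rankings largely agree (supporting face validity), while a value near 0 or below signals disagreements worth discussing with participants.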
This activity is for groups to enter alternatives into 1000minds – usually when there are large numbers of them – including rating them on pre-specified criteria.
The entered alternatives can then be aggregated (by 1000minds) for decision-makers to review and apply in a 1000minds decision model.
This activity enables an administrator or manager to distribute the task of entering alternatives across a group of people – perhaps with specialised roles or expertise – including rating the alternatives on pre-specified criteria.
Equivalently, this activity can be implemented as a survey whereby participants suggest or nominate alternatives for consideration.
(Of course, the usual approach for entering alternatives into 1000minds – when you don’t want to distribute the task across a group of people – is directly via the decision model itself. Data in Excel can be imported directly.)