This guide explains in detail how to run a 1000Minds Preferences Survey or Conjoint Analysis Survey.

Preferences Surveys – also known as Conjoint Analysis Surveys – are used to discover the preference values, or part-worth utilities (‘weights’), of potentially tens or hundreds (even thousands!) of participants, representing the relative importance of the decision criteria or attributes to them.

This guide is deliberately detailed. Many users won’t need it (nor all the features explained); but for those who do, everything is carefully explained below. Bolded text usually denotes buttons or data fields in 1000Minds for you to look out for.

You might like to print this page so you can refer to it while using 1000Minds. There are also links to this page inside 1000Minds.

Getting started: Log in and create a decision model with decision criteria or attributes

Jump to the main section below if you’re already familiar with these preliminary steps.

  • Go to www.1000minds.com and click log in. If you don’t have a 1000Minds user account, click free trial.
  • Enter your username (usually your email address) and password.
  • You can choose the particular 1000Minds service you want to use – ‘All-purpose Decision-Making’, ‘Conjoint Analysis’, etc, depending on what you want to rank – by clicking area of application.
  • To create a 1000Minds decision model – consisting of decision criteria or attributes, etc – click new model. This is the model you will use in the preferences survey or conjoint survey (if you’re logged into the Conjoint Analysis service). Your survey will be based on this model (i.e. it will be the selected model for survey at step 5 below).
  • For this new model, follow these two steps (top of screen) in particular: setup and criteria (or attributes if you’re logged into the Conjoint Analysis service).

At the setup step, give your model a name and customise the question asked at the decisions step as appropriate for your application. This question will be used in the preferences survey – so it’s important that it says what you want it to. (Note, you can also give shared access to the model to other people with 1000Minds accounts – e.g. colleagues – so you can work together on the model.)

At the criteria step, enter the criteria to be included in your preferences survey (as in the next section). Be careful that within each criterion the levels are entered down the page from lowest ranked to highest ranked. This is very important, as 1000Minds does not read the words of your criteria; it relies on the rankings of the levels within each criterion, as specified by you.
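
For example (purely illustrative), a criterion ‘Work experience’ might have its levels entered in this order down the page:

  Work experience:
    none                (lowest ranked – entered first, at the top)
    less than 2 years
    2 years or more     (highest ranked – entered last, at the bottom)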

Also at the criteria step, you might like to enter any impossible combinations of criteria that you can think of – i.e. combinations of levels that are impossible or unrealistic in practice (e.g. perhaps the best level on one criterion together with the best level on another, if they cannot occur together). These will be automatically excluded from the questions asked in your survey, thereby reducing the burden on participants. Alternatively, participants can identify such impossible combinations themselves when they’re asked about them.

  • In addition, if you want to, at the decisions step (or choices if using the Conjoint Analysis service), you can set up the integrated consistency checker, whereby participants in your preferences survey will each be asked to re-answer a few questions – as a check of their consistency.
  • To set up a preferences survey, click the distributed processes tab. The next section explains how to create and administer a preferences survey.
  • If you’d like to continue with the decision-making exercise you initiated when you created your model (as above), you can. For example, you can enter alternatives (including via import from excel) described in terms of the criteria you entered; these will be ranked after you’ve made decisions. You can also enter other factors and make selections at selections, etc (e.g. see How does 1000Minds work?). This is also a good way of testing the model your Preferences Survey will be based on, to make sure you have things the way you want them.

Creating and administering a Preferences Survey

Follow these 10 easy steps.

  1. Click the distributed processes tab (to the right of the decision models tab), and then click new process.

    Alternatively, if you’ve already initiated other 1000Minds surveys or distributed activities (also via distributed processes), you can open and add to them as you like. You can use the same participants (i.e. ignore the next step) or add additional participants if you want to.

  2. In your new process (for which you should enter a process name), you’ll immediately see an activity that has been automatically included in your process, entitled participants – available for all activities in current process. This ‘activity’ is where the names and email addresses, or anonymised ID numbers, of the participants in your survey will appear.

    As explained in detail at steps 7 and 8 below, there are two main approaches for distributing the survey: (1) Participants can self-enrol for the survey from a sign-up webpage that you can easily create using 1000Minds, and/or (2) if you know participants’ names and email addresses, you can enter them into 1000Minds yourself and send them a personalised email using 1000Minds.

    Depending on the circumstances of your survey, either approach can be used on its own or both approaches can be used together. For most applications, we recommend approach (1) over (2), because (1) gives you more options for distributing your survey, as well as avoiding potential problems – e.g. overly sensitive spam filters – that may arise from emailing people from a server (1000Minds) not directly associated with you.

  3. With respect to approach (1) above, step 7 below explains how, at the webpage and email page, you can create a sign-up webpage (via create sign-up webpage) for new participants to self-enrol for the survey. These participants will also appear at participants (just like any you entered yourself, as explained below).

    With respect to approach (2) above – which, though not generally recommended relative to approach (1) (as mentioned at the previous step), may still be useful for some applications – you can enter participants’ names and email addresses by clicking the above-mentioned participants ‘activity’. You can enter each participant individually by clicking new participant (e.g. you might like to enter your own name and email address to test this). Also, near the top of the participants page you’ll see paste in participants, which allows you to paste in batches of participant names and email addresses (e.g. copied from Word or Excel). You can enter additional participants at any time later on too.
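
    As a sketch only – the exact format expected is explained on-screen at paste in participants – a pasted batch with one participant per line (e.g. name and email address in adjacent columns copied from Excel) might look like:

      Jane Smith     jane.smith@example.com
      Raj Patel      raj.patel@example.com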

  4. Next, click add activity at the process activities page, and then preferences survey (or conjoint survey if using the Conjoint Analysis service). (The other 1000Minds activities also on offer pertain to other possible components of an overall distributed process; for more information, see How to run a distributed process?)

  5. After your Preferences Survey opens at the preferences survey page (or conjoint survey page), it is very important that the selected model for survey there (chosen by you) is the one you want to use for surveying your participants. Be sure it’s the model in which you specified the criteria to be evaluated in your survey (the 5th bullet point above), as the selected model for survey can’t be changed once participants have started the survey. If you’re unsure, click open model and check it. (When you’re ready for the survey to begin – to be sent out to participants – activity status should be “open for survey”.) If you’ve not already created a model, return to the 4th and 5th bullet points above and create a model to be used in your survey.

    As mentioned at the 6th bullet point above, it’s still possible (if you’ve not done so already) to set up the consistency checker, whereby participants in your survey will each be asked to re-answer a few questions – as a check of their consistency.

    You can also edit the brief instructions for participants on survey webpage – e.g. to set a date/time that you’d like them to finish the survey by, etc. The messages you include there, and in the other text boxes, are up to you; they are specific to the application you’re working on, and perhaps to the survey participants too. Try to keep these instructions as brief and to the point as possible. As explained below, you’ll get a chance to experience for yourself how this appears.

    It’s also easy to suppress some of the screen features that participants see when doing their Preferences Survey by unticking the boxes at display options on survey webpage. Depending on your application, you may want to do this to simplify things. For example, you may want to remove the help tips (yellow box) that by default appear on the survey webpage, or hide the “this combination is impossible” buttons (if they’re not relevant or you’ve pre-specified impossible combinations in the selected model for survey), or suppress the “allow comments to be entered” tickbox, etc (it’s up to you).

  6. Also at the preferences survey page, you will see message when participants finish survey. The default message in the space there thanks participants and also, if appropriate, informs each participant of their results when they finish the survey: their individual preference values or part-worth utilities (weights), representing the relative importance of the decision criteria or attributes to them. The code {PreferenceValues} in message when participants finish survey is used by 1000Minds to display the participant’s preference values. Do not change this code (i.e. {PreferenceValues}) unless you want to delete it (i.e. if you don’t want these results to be reported to participants when they finish the survey).

    There are several other things you can do to the results you report to participants when they finish their surveys (in addition to keeping or deleting {PreferenceValues}). They are all discussed in the help tips at the preferences survey page.

    For example, if, for simplicity, you want just the overall weights on the criteria (summing to 1) – i.e. no levels – to be reported to each participant, replace the code {PreferenceValues} with {Attributes} at message when participants finish survey (and edit the accompanying text appropriately too). Or if you want just the attributes, ranked in descending order – i.e. no levels or utility values – to be reported to each participant, insert the code {Attributes scores=false bars=false} (and edit text). Or if you want only the top-ranked n attributes – e.g. the top 3 – to be reported to each participant, insert the code {Attributes scores=false bars=false topn=3} (here with n = 3) – with or without {PreferenceValues} as well (and edit text).

    You can also enable ranked alternatives to be reported to participants (provided the selected model for survey includes some alternatives) – by inserting the code {RankedAlternatives categorization=false description=true}, or {RankedAlternatives} to also report their categorizations, in message when participants finish survey (and edit text).

    You can also display a ‘value for money’ chart, like at the selection step in the selected model for survey – by inserting the code {VfmChart} in message when participants finish survey (and edit text).
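
    To illustrate how these codes are used (the wording here is purely illustrative; the codes themselves are as described above), a finish message might read:

      Thank you for completing our survey!
      Based on your answers, the relative importance of the criteria to you is:
      {PreferenceValues}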

    In addition, if you want to link from the end of your 1000Minds survey to another survey, such as Google Forms, Wufoo or SurveyMonkey, use URL for another page to finish on. This extra survey – which, as far as participants are concerned, will be seamlessly joined to their 1000Minds survey – is for participants to answer after the 1000Minds questions. For example, you might use such an extra survey for collecting participants’ socio-demographic data or other information that you are interested in. This extra survey is also available if you are emailing participants to invite them to do a 1000Minds survey (explained in the next step).
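
    For example, the URL for another page to finish on for a Google Forms survey typically has this general shape (use the link from Google Forms’ own ‘Send’ option; the form ID below is a placeholder):

      https://docs.google.com/forms/d/e/<your-form-id>/viewform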

    Detailed instructions about how to link to a Google Forms survey appear in the help tips near the top of the preferences survey page. Please note that Wufoo and SurveyMonkey, albeit not free, have several advantages over Google Forms (which is free!); so think about which of these survey tools is best for you.

    In particular, these paid-for survey tools, in conjunction with 1000Minds, allow you to do more complex yet seamless integration, such as automatically passing and storing a hidden unique ID for each participant and chaining from 1000Minds to the other survey and back again. This integration maximizes your ability to match participants’ data as well as allowing you to present the 1000Minds activity, the other survey and, finally, a closing message to participants. For more information, please see How to pass survey parameters?

    Similarly, if you want to direct participants to another webpage instead of the one defined by message when participants finish survey, enter a URL for another page to finish on. Otherwise, just leave this space blank.

    Especially if you make any changes as discussed above, it’s good practice to check what participants will experience when they do the survey, and to make sure that everything is working as you want it to. This is easy to do (follow the instructions in the next step about how to impersonate a participant).

  7. At the webpage and email page, you are offered two means of distributing the survey: (1) create sign-up webpage for participants to self-enrol for the survey (explained at the next step), and/or (2) email participants using 1000Minds. As mentioned at step 2 above, depending on the circumstances of your survey, either approach can be used on its own, or both approaches can be used together – though, for the reasons discussed earlier, for most applications we recommend approach (1) over (2). It’s important, however, that you do not mix-and-match the two approaches (i.e. do a bit of one and a bit of the other, creating some kind of Frankenstein-hybridised approach – that will not work!).

    The create sign-up webpage approach to distributing the survey includes a range of participant registration options in terms of anonymous versus not anonymous, and whether an email address is required, or encouraged, or not, etc (further discussed below). This approach has the obvious advantage that you don’t need to know participants’ names and email addresses, which is great for ‘snowball’ or ‘convenience’ sampling, for example. All you need to do is email the sign-up webpage link – as created by 1000Minds (see get sign-up web address) – to prospective participants. You can do this via your own means of distributing emails. Or, for example, you might put the link on a webpage of your own (or on Facebook, etc) and direct traffic there.

    When you create sign-up webpage, the participant registration options include requiring – optionally or compulsorily (or not at all!) – participants to enter their email addresses, so that you can email participants using 1000Minds, if appropriate. For example, a link to the survey can be automatically emailed to them (e.g. in case they want to take a break from the survey and return later, and/or return to see their survey results). Later on you can also email participants using 1000Minds – with an updated email message – e.g. to remind them to complete their surveys or to thank them for their participation, etc.

    If you choose a participant registration option that includes an email being sent out to participants (i.e. after they enrol for their survey via the sign-up webpage link you sent them, as discussed above), then you should probably edit the subject line and email message (via email participants using 1000Minds) so that they communicate what you want to say. In the email message it’s important that you don’t change these three codes: {fullname}, {url} and {reply-to}. These codes are used by 1000Minds to personalise the message to each participant; thus, each email is personally addressed to {fullname}, and each participant gets his or her own {url} in their personal email, etc.
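
    As a minimal sketch (your actual wording will differ, and the default message supplied by 1000Minds shows where each code belongs), an email message with these codes left intact might read:

      Dear {fullname},

      We would value your participation in our survey. To begin, please click your personal link: {url}

      If you have any questions, please contact us at {reply-to}.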

    You should test the survey you created by clicking the sign-up webpage link yourself (inside get sign-up web address) or entering it into your browser – to experience what survey participants will experience (especially brief instructions for participants on survey webpage and message when participants finish survey, as explained at step 6). Such testing allows you to check that you have everything the way you want it (and, if not, you can fix things!). Also, after someone has self-enrolled (e.g. you as a test-pilot), you can go to participant progress and click their open-in-new-window icon (i.e. for anyone participating in the survey). This allows you to see what a participant sees when they do their survey.

    Notice that at the participant progress page you can easily delete responses for any participant (including yourself!), and also easily change their state of progress. You can also delete participants – e.g. yourself! – at the participants page. These options are especially useful during testing when you want to iteratively make changes to a model (which would be locked against changes if it had any participant responses) and test the impact on the survey.

    In addition – though most users won’t be interested in this feature – it’s possible to embed a unique User ID in the sign-up webpage link referred to above. This can be useful, for example, if you have a sample of survey participants who are known to you but whose names you don’t want to have to enter (e.g. for anonymity reasons, and also for convenience perhaps). For detailed instructions about how to do this, especially if you have an ICT-technical bent, see How to pass survey parameters? and/or email enquiries@1000minds.com and we’ll help you implement this (note, though, that as alluded to above, this feature is likely to be useful only in exceptional circumstances).
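
    Purely as a hypothetical sketch – the actual link format and parameter names are documented at How to pass survey parameters? – such a link might look something like:

      https://www.1000minds.com/go/abc123?userid=P017

    where both the path and the userid parameter here are placeholders, not the real syntax.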

  8. As mentioned earlier, if you know everyone’s name and email address, it’s possible to distribute the survey via the email participants using 1000Minds approach – using the names and email addresses you entered at participants (step 2 above). You can add additional participants at any time by clicking the participants button near the top of the webpage and email page. Then, all you have to do at the webpage and email page is make sure that the participants to whom you want to send emails are ticked.

    Before sending out the email, as you would for any email you were sending, you can edit the subject line and the email message inviting participants to participate in the survey. Later on, as discussed at step 7 above, you can also send reminder emails to participants who have not completed their surveys via this facility – but with an updated email message.

    As at step 7 above, in the email message it’s important that you don’t change the three codes {fullname}, {url} and {reply-to}, which 1000Minds uses to personalise the message to each participant.

    Before sending out the emails, it’s good practice to experience what participants will experience (e.g. as a test that you have everything the way you want it, especially brief instructions for participants on survey webpage and message when participants finish survey, as also mentioned at the previous step). Go to participant progress and click their open-in-new-window icon (i.e. for any participant). This allows you, in effect, to ‘be’ that particular participant (i.e. see what he or she will see when the survey starts). Another simple test is to include your own email address (as usual, entered at participants), and send an email (as explained next) just to yourself, so that you can see how the email appears and make any changes if required (before sending out the finalised email to all participants).

    As mentioned at the previous step, at the participant progress page you can easily delete responses for any participant (including yourself!), and also easily change their state of progress (e.g. reset it to “email not sent yet”). This is especially useful during testing when you want to iteratively make changes to a model (which would be locked against changes if it had any participant responses) and test the impact on the survey.

    When you are ready – i.e. you’re confident that you have the correct set of participants (to send emails to), the correct model (selected model for survey), and you’re happy with the subject line and your email message (as well as brief instructions for participants on survey webpage and message when participants finish survey on the preferences survey page) – click send emails (bottom of the webpage and email page; and make sure activity status is “open for survey” at the preferences survey step). Personalised emails will be sent to the ticked participants individually, so their email addresses aren’t revealed to each other.

  9. After having started the survey, as time passes (e.g. later in the week) it’s a good idea to check on the progress of your survey participants. Do this by returning to the distributed processes tab, clicking on your process name, and then preferences survey (or conjoint survey if using the Conjoint Analysis service). Click participant progress (and export to excel for more details) and/or results to see how participants are getting on with their surveys.

    You can also send reminder emails to participants who have not started or completed their surveys (provided you have their addresses) via the webpage and email page (as mentioned earlier) – but this time with an updated email message reminding them to start or finish their surveys. As mentioned earlier, you can also email participants who self-enrolled via the create sign-up webpage approach and who recorded their email addresses in the process (if you asked them to). You can also see how participants are getting on with their surveys (and email them easily – e.g. perhaps to thank them for doing the survey) via the distribution lists button near the top of the webpage and email and participant progress pages.

  10. When the survey is finished (e.g. the closing date has passed) – or, indeed, at any time – you can see participants’ results at the results page by clicking preference values and criteria rankings or ranking of all possible alternatives (and the other two ‘results’ buttons there too: ranking of entered alternatives and questions, decisions and comments). These results can be opened or saved in a spreadsheet (click export to excel near the top of the specific result pages) for further analysis, etc.

    For potentially greater confidence in your survey results, if your survey was set up for this, you can check consistency results – from participants having re-answered a few questions each (as a check of their consistency).

    By clicking new model with means (near the top of the preference values and criteria rankings page), the mean preference values from the survey (i.e. across all participants) will be copied into a new decision model (with the same criteria and levels). This can be useful for you to work with, perhaps on a decision-making problem involving the participants’ preferences overall. (To learn about 1000Minds’ stand-alone decision tools overall, see How does 1000Minds work?)

    Finally, and in addition to the steps above, it’s easy for you to see the decision model for each individual participant if you want to. Click the decision models tab (to the left of the distributed processes tab), and at the bottom of the list of models tick show models of process participants. In the table listing all the models in your 1000Minds user account, you will now be able to see a model for each participant (usually marked with an asterisk). You can open these models, just like any other models. (The way 1000Minds works is that when a participant begins her survey, 1000Minds creates an individual decision model for her – by copying the selected model for survey referred to at step 5 above – and stores it just like any other model that is created.) This enables you to see what each participant did in the survey, usually in more detail than at the results page (as above). You can also make changes to an individual participant’s model – but you should think carefully before doing so. At the decision models tab you can also compare results for two or more participants’ models (useful for investigating similarities and differences between participants).

Good luck with your survey!

If you have any problems, email enquiries@1000minds.com and we’ll help you.