Company Profile

01.  What experience does your company have in providing online samples for market research? How long have you been providing this service? Do you also provide similar services for other uses such as direct marketing? If so, what proportion of your work is for market research?

Founded in October 2020 by Lewis Reeves and Patrick Fraser, Walr has experienced rapid growth due to its unique positioning as a company focused on service and powered by advanced, proprietary research technology. We now deliver over 10 million survey completes annually to over 250 clients.

Our executive team has broad industry experience from both a supplier and an agency perspective, and as a collective we bring over 400 years’ experience from businesses such as Dynata, VIGA (now Savanta), Verve, GfK and System1.

100% of our work is for market research purposes only.

02.  Do you have staff with responsibility for developing and monitoring the performance of the sampling algorithms and related automated functions who also have knowledge and experience in this area? What sort of training in sampling techniques do you provide to your frontline staff? 

Our product team are all tenured market research professionals, with backgrounds ranging from research to technical operations. Therefore, our platform, and all processes resulting in the provision of sample, are designed, built and monitored by an experienced team.

100% of our project management team have prior experience in the field and were put through a comprehensive onboarding and training program, designed to ensure they deliver the consultative service levels expected from Walr.

03.  What other services do you offer? Do you cover sample-only, or do you offer a broad range of data collection and analysis services? 

We provide full-service data collection to our clients covering scripting (including translation overlays), data collection, quota and fieldwork management, data processing (including open-end coding), and visualization (editable PPT charts or online dashboards).

All data deliverables are provided with free access to the Walr Platform. This allows our clients to work alongside our service, to interrogate the data themselves and create further bespoke deliverables independently, if preferred. The Walr Platform operates as a searchable library for all historical research projects, with the ability to share access to content with relevant internal and external stakeholders.

Sample Sources & Recruitment

04. Using the broad classifications above, from what sources of online sample do you derive participants?

The Walr network has been built to offer best-in-class reach to sample across the globe, for both broad and niche audiences. Working with carefully vetted partners, as well as our own proprietary panels, our extensive network covers both the largest panel suppliers in the world and smaller specialists, offering our clients access to niche markets or specific audiences globally.

All of our supply consists of highly engaged audiences who have opted to participate in online research through a combination of more traditional online panels, app-based panels, publisher networks and engaged online communities.

05.  Which of these sources are proprietary or exclusive and what is the percent share of each in the total sample provided to a buyer? 

Our network includes proprietary assets which are fully managed by us, and others to which we have direct third-party access. All our relationships are long-term and well-established, giving us full control over the percentage share from any one source.

We currently have two proprietary panels. These are Nova Opinions, our consumer panel, and Specialist Opinions, our niche B2B panel.

06.  What recruitment channels are you using for each of the sources you have described? Is the recruitment process ‘open to all’ or by invitation only? Are you using probabilistic methods? Are you using affiliate networks and referral programs and in what proportions? How does your use of these channels vary by geography? 

Our Nova Opinions consumer panel uses an “open to all” registration approach. We utilize multiple channels to recruit to Nova Opinions, which include social media, direct publisher connections, affiliate networks, and partnerships. All traffic sources are constantly monitored and optimized for quality. We do not run referral programs.

Specialist Opinions recruitment is by invitation only. Invitations are sent to individuals who have been selected by our recruitment team, or who are pre-vetted members of groups we have partnered with for specific audiences.

07.  What form of validation do you use in recruitment to ensure that participants are real, unique, and are who they say they are? 

The Walr Platform contains a technology called ‘Gatekeeper’ which uses a blend of third-party and proprietary fraud mitigation solutions. It offers robust validation to ensure participants are real, unique and who they say they are.

This blend of technology includes: digital fingerprinting (device and browser level, alongside IP address and geo-location), Hidden Captcha, longitudinal behavioural analysis/consistency checks and ‘bot catcher’ hidden questions.

We also provide consultancy to clients on survey best practices – all of which are standard in our proprietary scripting platform, Survey Builder – to ensure validation after survey entry. These include industry-standard speeder checks, fair flatliner flagging, automated open-end validation against known profanities and poor responses, and in-survey trap questions to check that participants are paying attention and are true subject matter experts (where appropriate).

Our proprietary panels operate on a double opt-in basis, requiring new joiners to confirm their email prior to participating.

08. What brand (domain) and/or app are you using with proprietary sources? 

Nova Opinions (www.nova-opinions.com) is our consumer panel, with panellists in the United States, United Kingdom, and Australia.

Specialist Opinions (www.specialist-opinions.com) is our niche B2B panel and operates on an invitation only basis.

09.  Which model(s) do you offer to deliver sample? Managed service, self-serve, or API integration? 

Our model is focused on human expertise powered by our proprietary technology, with most of our clients preferring to leverage our managed services to deliver sample.

We do support API integration on a case-by-case basis and currently run several fully automated solutions for clients, with no human involvement on either side, delivering ongoing tracking and ad-hoc research.

10.  If offering intercepts, or providing access to more than one source, what level of transparency do you offer over the composition of your sample (sample sources, sample providers included in the blend). Do you let buyers control which sources of sample to include in their projects, and if so how? Do you have any integration mechanisms with third-party sources offered? 

We do not offer intercepts but do utilize more than one source. We can offer full transparency on the sample sources and providers included at a project level and block/remove providers if a buyer has a specific preference.

We do have integration methods for third-party sources for clients with long-term engagements.

11.  Of the sample sources you have available, how would you describe the suitability of each for different research applications? For example, is there sample suitable for product testing or other recruit/recall situations where the buyer may need to go back again to the same sample? Is the sample suitable for shorter or longer questionnaires? For mobile-only or desktop-only questionnaires? Is it suitable to recruit for communities? For online focus groups? 

Because we are not restricted to a single panel, we can offer flexible solutions, matching the suitability of each source to the specific project’s requirements.

Most of our work is in quantitative online research (ad-hoc and tracking) and varies in duration and device usage/targeting with suitability for all options. We also have fully defined recruitment processes for more bespoke activities such as qualitative exercises, IHUTs, community recruitment and app downloads.

We will provide a consultancy service to discuss the best approach to achieve the desired outcome in relation to the research objective.

Sampling & Project Management

12. Briefly describe your overall process from invitation to survey completion. What steps do you take to achieve a sample that “looks like” the target population? What demographic quota controls, if any, do you recommend? 

Every project is treated individually with a custom sampling plan to ensure the achieved results are in line with the target population.

Quota controls are set in our platform to mirror those required within each project. Alternatively, they can replace quota management in a survey, if needed.

We can control the population within a survey based on both completion and entry to offer greater flexibility to deliver the required outcome (e.g. market sizing with quotas on entry and natural fallout within a survey).
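The distinction between entry and completion quotas described above can be sketched in code. This is a purely illustrative, hypothetical example (the `QuotaCell` and `record` names are not part of the Walr Platform): a cell counted on entry caps who gets in, while a cell counted on completion lets natural fallout occur within the survey.

```python
# Illustrative sketch of entry vs. completion quota counting.
# QuotaCell/record are hypothetical names, not Walr's platform API.
from dataclasses import dataclass


@dataclass
class QuotaCell:
    name: str
    limit: int
    count_on: str = "complete"   # "entry" or "complete"
    count: int = 0

    def is_full(self) -> bool:
        return self.count >= self.limit


def record(cell: QuotaCell, event: str) -> bool:
    """Increment the cell when its configured event occurs.

    Returns False (route the respondent out) if the cell is full.
    """
    if event != cell.count_on:
        return True              # this event is not counted against the cell
    if cell.is_full():
        return False             # quota reached: reject
    cell.count += 1
    return True


# e.g. market sizing: cap on entry, then allow natural fallout in-survey
entry_cell = QuotaCell("18-34 entrants", limit=2, count_on="entry")
assert record(entry_cell, "entry")       # first entrant accepted
assert record(entry_cell, "entry")       # second entrant accepted
assert not record(entry_cell, "entry")   # third entrant routed out
```

In practice the same survey can carry both kinds of cell at once, which is what allows entry to be managed while completion falls out naturally.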

13. What profiling information do you hold on at least 80% of your panel members plus any intercepts known to you through prior contact? How does this differ by the sources you offer? How often is each of those data points updated? Can you supply these data points as appends to the data set? Do you collect this profiling information directly or is it supplied by a third party? 

Our proprietary panels, along with all partners in our network, perform extensive profiling on members. Profiles are updated regularly to ensure the highest level of accuracy possible, given the dynamic nature of most profiling fields.

As a result of the close nature of our relationships with network partners, our project management team have a deep knowledge of profiling fields at a centralised and individual partner level, enabling us to select the most relevant sample blend and targeting for any audience, on any project.

Profiling can be appended to any dataset in real time by passing it through the survey link. However, outside of core, static demographics (e.g. age), we always recommend that other questions are asked in real time to ensure complete accuracy of data.

14.  What information do you need about a project in order to provide an estimate of feasibility? What, if anything, do you do to give upper or lower boundaries around these estimates? 

The key to accuracy in feasibility lies in understanding our clients’ aims and as such, our team are always on hand to discuss a brief and provide input at the scoping stage before we even get to feasibility.

For a simple, ad-hoc online survey we require the following at a minimum to provide accurate feasibility:

  • A definition of the target audience including all screening criteria being used.
  • An estimate of the incidence, either among the general population or within the target group (helpful, but not required).
  • Countries and languages required.
  • The duration of the survey (LOI).
  • Total completes required.
  • Details of any quotas.
  • Client’s desired field window, if applicable.

Additional considerations that may impact feasibility are listed below and should be shared at the quoting stage:

  • Description of the project/methodology, if not a standard online survey (e.g. IHUT, pre/post, eye tracking).
  • Tracker details – number of waves, exclusion period.
  • Device limitations (e.g. desktop or mobile only).
  • Request to capture PII/sensitive information.
  • Any requirement for recontacts.

Our quoting team are experts in understanding what can be achieved but will always include a buffer in calculations as a precautionary measure.

15.  What do you do if the project proves impossible for you to complete in field? Do you inform the sample buyer as to who you would use to complete the project? In such circumstances, how do you maintain and certify third-party sources/sub-contractors? 

Project progress against the timeline is monitored very closely throughout fieldwork, with constant communication to avoid and mitigate any potential challenges. Our team are dedicated to providing solutions first and foremost, and assuming there has been no change in scope, will recommend how we could amend the approach to reach the required audience target, before looking at additional sources.

Given the extent of our network, and the fact we do not rely on a single proprietary panel, it is incredibly rare that a project will prove impossible to complete. In the instances where this does occur, we have additional partners who do not sit within the formal network, who we work with on an ad-hoc basis. All of these are well-established panel companies who have been validated by our team through a similar onboarding process to our core network.

We offer a CPI guarantee for projects where a feasibility concern is not due to a change in scope, to ensure no impact is felt by clients on project pricing.

16.  Do you employ a survey router or any yield management techniques? If yes, please describe how you go about allocating participants to surveys. How are potential participants asked to participate in a study? Please specify how this is done for each of the sources you offer. 

The Walr Platform does not use a survey router and leverages API connections with our network partners to automate the fielding of studies.

17.  Do you set limits on the amount of time a participant can be in the router before they qualify for a survey? 

We do not employ a survey router.

18.  What information about a project is given to potential participants before they choose whether to take the survey or not? How does this differ by the sources you offer? 

Each of the panels in our network is responsible for the information provided before a participant chooses to take a survey. At a minimum, the respondent is provided with the following: a generic (non-biasing) subject topic, the survey duration and the incentive.

Where applicable, additional information may be provided in advance of participation e.g. requirement for webcam, PII capture etc.

19.  Do you allow participants to choose a survey from a selection of available surveys? If so, what are they told about each survey that helps them to make that choice? 

Our proprietary panellists can select which surveys they complete from the list of available opportunities on their member page, as well as by direct email or SMS invitations for certain studies.

The other panels in our network are responsible for how they present survey opportunities to their panellists. These will vary from individual invites sent via email/SMS, to apps/online panel sites showing all opportunities available to allow for a selection.

As per the above response, the information shown varies by opportunity but at a minimum will include a generic (non-biasing) subject topic, survey duration and incentive.

20.  What ability do you have to increase (or decrease) incentives being offered to potential participants (or sub-groups of participants) during a survey? If so, can this be flagged at the participant level in the dataset? 

Each of the panels in our network is responsible for the incentives offered to respondents. Some partners are able to adjust incentives during the course of a survey. This is not common practice and has been shown to have little impact on the quality or quantity of responses. In some cases it can train participants to wait for incentive increases, leading to extended field times.

Specific situations do require an additional incentive for participation, but these are limited to survey opportunities requiring an extensive follow-up activity (e.g. telephone follow-up). These incentives can be paid in cash outside of the standard panel incentive model.

21.  Do you measure participant satisfaction at the individual project level? If so, can you provide normative data for similar projects (by length, by type, by subject, by target group)? 

Each of the panels in our network is responsible for measuring the satisfaction of their members and the cadence of this measurement.

22.  Do you provide a debrief report about a project after it has completed? If yes, can you provide an example? 

Debrief reports can be provided for any project on completion, where preferred by a client. We tailor our reports to an individual client’s requirements but will usually cover the following as standard:

  • Actual project stats vs. expected (LOI, incidence, dropout rate) and any associated impact on project delivery vs. expected.
  • Review on delivery against timelines vs. expected.
  • Report on any in-field challenges experienced, outcomes agreed and steps to avoid in future projects.
  • Qualitative PM view on overall project delivery against objective.

During a debrief call following a project, we will also discuss elements of the bidding cycle (response speed, feasibility, pricing) to ensure we are providing a high-quality, tailored service for all clients.

Data Quality & Validation

23.  How often can the same individual participate in a survey? How does this vary across your sample sources? What is the mean and maximum amount of time a person may have already been taking surveys before they entered this survey? How do you manage this? 

We ensure individuals cannot participate in a survey more than once, unless specifically required to do so.

As standard, each project is configured to disable multiple entries by an individual from any specific panel. Where multiple sources are used, our Gatekeeper solution provides de-duplication alongside its quality checks. It utilizes best-in-class digital fingerprinting to identify individuals down to a device level and blocks re-entry to any survey which has already been seen.
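The de-duplication principle described here can be illustrated with a minimal sketch: hash a set of device/browser signals into a single fingerprint and block any fingerprint that has already entered a given survey. This is a hypothetical illustration under simplified assumptions, not Walr’s Gatekeeper implementation; the signal names are illustrative only.

```python
# Minimal sketch of fingerprint-based de-duplication (illustrative only).
import hashlib

def fingerprint(device: dict) -> str:
    """Hash stable device/browser signals into one identifier."""
    signal = "|".join(str(device.get(k, "")) for k in
                      ("user_agent", "screen", "timezone", "ip", "geo"))
    return hashlib.sha256(signal.encode()).hexdigest()

# survey_id -> set of fingerprints already admitted
seen_by_survey: dict[str, set[str]] = {}

def admit(survey_id: str, device: dict) -> bool:
    """Allow entry only if this fingerprint has not yet seen the survey."""
    fp = fingerprint(device)
    seen = seen_by_survey.setdefault(survey_id, set())
    if fp in seen:
        return False          # duplicate device: block re-entry
    seen.add(fp)
    return True

device = {"user_agent": "Mozilla/5.0", "screen": "1920x1080",
          "timezone": "Europe/London", "ip": "203.0.113.7", "geo": "GB"}
assert admit("S1", device)        # first entry allowed
assert not admit("S1", device)    # same device blocked on re-entry
assert admit("S2", device)        # a different survey is still open to it
```

A production system would combine many more signals (and probabilistic matching) than this sketch, but the core check per survey is the same membership test.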

Our partners will each have a proprietary algorithm to ensure members do not suffer from survey fatigue and display an appropriate volume of surveys within any set period.

24.  What data do you maintain on individual participants such as recent participation history, date(s) of entry, source/channel, etc? Are you able to supply buyers with a project analysis of such individual level data? Are you able to append such data points to your participant records? 

Our Gatekeeper solution assigns a unique identifier to each survey participant as they enter one of our surveys. This ID can be matched back to a unique panel source and individual panel member to provide a more holistic picture of recent survey participation. This information can be shared at an aggregate level if required.

25.  Please describe your procedures for confirmation of participant identity at the project level. Please describe these procedures as they are implemented at the point of entry to a survey or router. 

Participant identity is validated at multiple points throughout the respondent’s journey before and during the survey. Prior to survey entry, we utilize industry-leading technology to validate individuals and prevent fraud:

  • Digital fingerprinting: device fingerprint, browser fingerprint, IP address and geo-location.
  • Hidden Captcha.
  • Longitudinal behavioural analysis/consistency checks.

26.  How do you manage source consistency and blend at the project level? With regard to trackers, how do you ensure that the nature and composition of sample sources remain the same over time? Do you have reports on blends and sources that can be provided to buyers? Can source be appended to the participant data records?

By utilizing a proprietary network of panels, we can fully manage the blend at an individual project level and therefore ensure it is delivered consistently across the waves of any tracker. Reports on blends and sources can be provided to clients as required on a project-by-project basis.

27.  Please describe your participant/member quality tracking, along with any health metrics you maintain on members/participants, and how those metrics are used to invite, track, quarantine, and block people from entering the platform, router, or a survey. What processes do you have in place to compare profiled and known data to in-survey responses? 

Our Gatekeeper solution enables us to track quality at both an individual and source level. As detailed above, this technology allows us to track, monitor and block any individual failing one or more of our quality checks before they enter a survey.

We can pass profiled and known data into surveys, which can be validated against responses if required. However, this should only be used to assess quality based on profiles that are not subject to frequent change, where the in-survey response can supersede profiling.

28.  For work where you program, host, and deliver the survey data, what processes do you have in place to reduce or eliminate undesired in-survey behaviours, such as (a) random responding, (b) illogical or inconsistent responding, (c) overuse of item non-response (e.g. “Don’t Know”) (d) inaccurate or inconsistent responding, (e) incomplete responding, or (f) too rapid survey completion?  

We have a complete toolkit of standard and customizable in-survey quality measures which can be applied to any project. We will consult with a client on the most suitable approach for the audience within their study.

  • ‘Bot catcher’ hidden question – a question hidden within a 1×1 pixel, invisible to the human eye. Any respondent who answers it is terminated.
  • Industry-standard speeder/flatline checks.
  • Real time verbatim validation against known profanities and poor open-end responses.
  • Logic-based trap questions with conflicting statements at either end of a Likert scale.
  • Fake brand lists.
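Two of the measures above, speeder checks and flatline checks, reduce to simple rules on respondent data. The sketch below is illustrative only; the thresholds (a third of the median duration, five grid items) are hypothetical examples, not Walr’s production values.

```python
# Illustrative speeder and flatliner checks (thresholds are examples only).
from statistics import median

def is_speeder(duration_s: float, all_durations: list[float],
               floor: float = 1 / 3) -> bool:
    """Flag respondents who finished far faster than the median LOI."""
    return duration_s < floor * median(all_durations)

def is_flatliner(grid_answers: list[int], min_items: int = 5) -> bool:
    """Flag grids where every item received the identical rating."""
    return len(grid_answers) >= min_items and len(set(grid_answers)) == 1

durations = [600, 580, 640, 610, 95]        # seconds; median is 600
assert is_speeder(95, durations)            # well under a third of the median
assert not is_speeder(580, durations)
assert is_flatliner([3, 3, 3, 3, 3])        # straightlined grid
assert not is_flatliner([3, 4, 2, 5, 3])
```

Flagged respondents would typically be reviewed or replaced rather than removed automatically, since legitimate fast readers and genuinely uniform opinions do occur.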

Policies & Compliance

29.  Please provide the link to your participant privacy notice (sometimes referred to as a privacy policy) as well as a summary of the key concepts it addresses. 

The privacy policy describes the data we collect, how this information is used, information about our use of cookies, who we will share data with, data retention and transfers, security and an individual’s rights with respect to our use of personal data. You can access it here.

30.  How do you comply with key data protection laws and regulations that apply in the various jurisdictions in which you operate? How do you address requirements regarding consent or other legal bases for the processing of personal data? How do you address requirements for data breach response, cross-border transfer, and data retention? Have you appointed a data protection officer? 

Whilst most of the data we process is anonymized, we consistently monitor data protection laws in the regions in which we operate to ensure we adhere to the strictest requirements for processing personal data.

As an ISO27001 certified company, we have appointed individuals who are fully responsible for security within the organisation and have documented and audited processes for dealing with data, including data breach response, cross-border transfer, and data retention.

31.  How can participants provide, manage and revise consent for the processing of their personal data? What support channels do you provide for participants?

All panels in our network are assessed during onboarding to ensure the relevant consent for processing of personal data is in place and that there is a clear support channel for participants.

Any request from a participant received by Walr will be forwarded on to the relevant party to be managed. Members of our proprietary panels have a direct channel to our Panel Support team to assist with any issues or concerns.

32.  How do you track and comply with other applicable laws and regulations, such as those that might impact the incentives paid to participants?

We work with third-party advisors to comply with all applicable laws and regulations in the regions in which we operate.

33.  What is your approach to collecting and processing the personal data of children and young people? Do you adhere to standards and guidelines provided by ESOMAR or GRBN member associations? How do you comply with applicable data protection laws and regulations? 

We are members of ESOMAR and the MRS and adhere to the global and local standards provided by industry bodies in the markets we operate in. We also comply with the GDPR and the UK GDPR.

In addition, we adhere to all local laws with regards to conducting online market research with people under the age of 18, and where required gain consent from parents prior to any child participating in a study we run.

34.  Do you implement “data protection by design” (sometimes referred to as “privacy by design”) in your systems and processes? If so, please describe how. 

As an ISO27001 certified company we implement privacy by design from the ground up, through the development of our platform and all associated processes used for delivering professional services to clients. All data is hosted in Microsoft Azure Cloud Infrastructure with industry-leading security, following least privilege principles for access to the platform.

All staff are regularly trained and tested on data security best practices.

35.  What are the key elements of your information security compliance program? Please specify the framework(s) or auditing procedure(s) you comply with or certify to. Does your program include an asset-based risk assessment and internal audit process? 

We are ISO27001 certified meaning we follow a comprehensive, globally recognised standard for information security. As a part of our ongoing accreditation, our ISMS (Information Security Management System) is regularly subjected to internal and external audits.

36.  Do you certify to or comply with a quality framework such as ISO 20252? 

We are not ISO 20252 certified but are fully aware of its guiding principles and have developed our processes to ensure we comply with this framework.

Metrics

37.  Which of the following are you able to provide to buyers, in aggregate and by country and source? 

To ensure up-to-date accuracy of data, the following reports can be requested with 2-4 weeks’ advance notice:

  • Average qualifying or completion rate, trended by month.
  • Percent of paid completes rejected per month/project, trended by month.
  • Percent of members/accounts removed/quarantined, trended by month.
  • Percent of paid completes from 0-3 months tenure, trended by month.
  • Percent of paid completes from smartphones, trended by month.
  • Percent of paid completes from owned/branded member relationships vs. intercept participants, trended by month.
  • Average number of dispositions (survey attempts, screenouts, and completes) per member, trended by month (potentially by cohort).
  • Average number of paid completes per member, trended by month (potentially by cohort).
  • Active unique participants in the last 30 days.
  • Active unique 18-24 male participants in the last 30 days.
  • Maximum feasibility in a specific country with nat rep quotas, 7 days in field, 100% incidence, 10-minute interview.
  • Percent of quotas that reached full quota at time of delivery, trended by month.

Book a demo today.

Get in touch to see how the Walr Platform can support your business.