
Customer feedback is important

How a company obtains feedback and suggestions on its products sounds like a simple problem. However, the amount of dissatisfaction with what appear to be simple, easy-to-fix problems would seem to challenge that assumption. Last week I had the pleasure of going out for coffee and catching up with a friend I hadn’t seen in some time. We met at Tim Horton’s and caught up on the usual topics: family, hobbies, and work. My friend works for a large mining company in Northern Canada. He asked me about work, and I mentioned my research and writing on market research online community software. Click here to read the full blog.
 

Insight communities and modes of thought

In a recent article on Marketresearch.com’s blog, Anne Beall examines two different modes of thinking: one conscious and deliberate, the other unconscious and automatic. These distinct types of thinking are not new, but in terms of market research insight, treating the two as separate is new and could be of considerable importance. Before we get into further detail, let’s stop and understand how these two types of behaviour shape a consumer’s decision-making process. The two modes of thinking are called systematic and heuristic. Click here to read the full blog.
 

Do you plan to engage millennials to gather their opinions and drive product innovation?

In a recent Fast Company article entitled These Millennials Have Become The Top Decision Makers At IBM, author Cale Guthrie Weissman looks at how IBM has created a panel of millennial employees whom it can survey on key ideas to gather insights on how to stay relevant. Why is this so revolutionary, you might ask? Read more >>
 

With the large amounts of data that market researchers deal with, finding ways to present data in a creative, interesting way can be a challenge. Here's a list of the six best ways to present your research data.

#1. Interactive Dashboards
Interactive dashboards let you communicate important information to your audience. A dashboard is a visual display of the most significant information from a project. The information appears on a single screen, offering a quick and simple way to monitor and evaluate a study’s progress. Dashboards are a highly effective way to present data to executives who don’t have a lot of time and need to be able to check data at any point in a project. (A minimal dashboard sketch follows this list.)

#2. Infographics
Infographics illustrate data and combine text, images, and design to tell the story of a study. They are becoming increasingly popular, and since infographics present data in an engaging and easy-to-understand manner, they are frequently shared on social media, boosting the viral potential of your information. Infographics can drive increased traffic to your website and highlight key elements of your data.

#3. Prezi
Prezi is a new way to present information that engages audiences, visually demonstrates how ideas relate to one another, and allows collaboration in virtual space. Prezi is cloud-based, so you can present from your browser, desktop, or iPad, and you will always have the most recent version available. Prezi offers visually engaging features such as zooming in and out of images and barrel rolls, helping you make presentations that are engaging and memorable.

#4. Videos/Vox Pops
Videos let you put a face to the research, making study results more relatable and memorable. Vox pops (or streeters) are another effective way to bring research to life: they are interviews with members of the public who speak on camera and tell the viewer what they think and how they feel about a particular subject. Videos and vox pops can supplement both qualitative and quantitative research and are a compelling way to involve the viewer in the research.

#5. Motion Graphics
Motion graphics use video footage or animation technology to create the appearance of movement. They are often combined with audio and used in multimedia projects. Motion graphics are a captivating way to present your data and help create a story around it. The graphics help people understand concepts more clearly and make your project more appealing.

#6. Web & Mobile Apps
The increase in the number of smartphone users has led to the development of new ways of presenting data. In an increasingly fast-moving world, people need to be able to check reports and research data at any time, and apps are the perfect solution. Web apps let users check research data on their mobile devices, and the interactive nature of the apps lets the user control the research data they want to access and present. Apps are intuitive, easy to use, and an engaging way to view data and results.

Related post: 4 Chart Tips to Turnaround Your Report Quickly
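As a hedged illustration of the interactive dashboard idea in #1 above (not part of the original post), here is a minimal sketch assuming the Plotly Dash library (2.x) and a hypothetical survey_results.csv file with "question" and "score" columns:

```python
# Minimal interactive dashboard sketch, assuming Dash 2.x and a hypothetical
# survey_results.csv with "question" and "score" columns (not from the post).
import pandas as pd
import plotly.express as px
from dash import Dash, dcc, html, Input, Output

df = pd.read_csv("survey_results.csv")  # hypothetical study data

app = Dash(__name__)
app.layout = html.Div([
    html.H3("Study progress dashboard"),
    dcc.Dropdown(  # let a time-pressed executive pick which question to check
        id="question-filter",
        options=[{"label": q, "value": q} for q in df["question"].unique()],
        value=df["question"].unique()[0],
    ),
    dcc.Graph(id="score-chart"),
])

@app.callback(Output("score-chart", "figure"), Input("question-filter", "value"))
def update_chart(question):
    # Show the score distribution for the selected question on a single screen
    subset = df[df["question"] == question]
    return px.histogram(subset, x="score", nbins=10)

if __name__ == "__main__":
    app.run(debug=True)
```

The point of the sketch is the single-screen, self-serve aspect described above: the viewer filters the data themselves rather than waiting for a report.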
 

With the ubiquity of mobile devices and smartphones, many researchers are asking how mobiles and smartphones can be successfully incorporated in a research setting. Whether your aim is to improve existing research projects or you’re looking to try completely new research methodologies, these devices can augment other research methods and are also important research tools in their own right.

#1. Add Impact with Video and Picture Libraries
Most smartphones have camera and video capabilities, both of which can greatly enhance a research program. Respondents can film themselves completing a research task, such as describing the products they purchased at a store or how they interact with a new product. It’s a great way to communicate the results too: a montage of the pictures or videos can help make the research findings more impactful.

#2. Quick Data Collection via Pulse SMS Surveys
A one- or two-question survey via text message is a viable way to collect data very quickly (usually within minutes). Younger age groups use email less frequently, making an SMS survey a more effective way to reach this demographic.

#3. Feedback via User-initiated SMS Surveys
A short code can be used to allow potential respondents to initiate a survey using a keyword (e.g., text JAVA to 78789 to start the survey; standard rates may apply). User-initiated SMS surveys are a useful way to gain feedback on a transactional basis, and using a variety of start words allows you to track where or when a respondent learned about the survey.

#4. Make Reminders More Effective
Text messages are an easy way to remind respondents to complete an online survey or to attend a focus group. Since most people carry their cell phones, the reminders are often more effective than those sent by telephone or email. Many respondents will have internet capabilities on their phones and may opt to complete the survey right then and there.

#5. Support Your Mystery Shoppers
Mystery shopping often requires the shopper to notice many different things during the task, such as the time spent in line or the number of people in the store when they enter. Smartphones provide an easy way for mystery shoppers to record the key points in a discreet manner so that they don’t have to rely on memory. Using technology in this way provides a more accurate result for the client and makes the mystery shopping task less onerous for respondents.

#6. Aid Auto-ethnography
Ethnographic research can provide holistic, qualitative insight into consumers’ lives, but having a researcher in-home is expensive and has the potential to introduce bias into the results. Fortunately, technology can partially automate this involved research process and allow participants to compile much of the information themselves. Rather than have a researcher observe the subject’s behaviour, the participant can fill in a diary about his or her daily activities at specific times. Auto-ethnography relies on participants remembering to record their activities at specific times, and since many of us have our phones with us at all times, sending a timed prompt to record the necessary information works well. A smartphone can even be used to record the information itself.

#7. Run Co-research Programs & Spotter Diaries
Empowering research participants as co-researchers can provide a viable way to understand complex cultural factors that researchers may not be able to identify on their own, and these methods are nicely augmented by the use of smartphones. Co-researchers can take pictures or record their thoughts surrounding a common topic as they go about their day. These reflections can be used to uncover market gaps and to design new products. Additionally, since marketing campaigns usually encompass executions across various media, it is difficult for marketers to understand the overlap of the various channels. Having respondents record each time they come across an advertisement for a certain brand (creating a spotter diary) can provide a better picture of the whole campaign’s reach.

#8. Collect Location-Based Data
Using GPS functionality, researchers can better understand location-based information as it relates to consumers: how far they travel to a store or other location, or their travel patterns within a venue such as a mall or leisure facility. This data can demonstrate issues with congestion, help to optimize within-venue placement, and provide a reference point for advertising metrics. (A simple distance-calculation sketch follows this list.)

#9. Use Gamification Methods
Although gamification in research is relatively unexplored, an ideal venue for research games may be the smartphone. Canadians are already playing games on their smartphones: sixty percent of Canadians do so, according to the 2012 Rogers Innovation Report. If a game is well designed, analyzing how users play it could allow researchers to gain insights into consumer behaviour that could not be measured by a survey.

#10. Get Beneath the Surface with Passive Data
Using the functionality of participants’ smartphones, passively tracked data can be used to gain insight that might be impossible to obtain with a survey methodology. A user can opt in to provide information about websites visited, health statistics, GPS location, communications, or any number of other types of data from their smartphone. This data can even be linked to survey responses in order to compare or augment the dataset.
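As a hedged illustration of item #8 (not part of the original post), the sketch below shows one common way to turn opt-in GPS readings into "distance travelled to a store" figures using the standard haversine formula; the coordinates and names are hypothetical examples.

```python
# Sketch only: converting two opt-in GPS readings into a travel distance
# with the haversine (great-circle) formula. Coordinates are hypothetical.
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres between two latitude/longitude points."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))  # mean Earth radius ~6371 km

# Hypothetical example: a respondent's home reading vs. a store location
home = (52.1332, -106.6700)
store = (52.1579, -106.6702)
print(f"Distance travelled to store: {haversine_km(*home, *store):.1f} km")
```

The same calculation, applied to a stream of readings, is what underlies within-venue travel patterns and congestion analysis mentioned above.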
 

By Briana Brownell

Does a Yellow Checkbox Give You Better Brand Equity?

The meaning of colours in branding and marketing is a popular topic: blue means you’re trustworthy and yellow means fun. But do yellow checkboxes mean that respondents will give you better scores on brand aspects like approachability or likeability? Maybe… as researchers love to say, more research may be required.

Even though research on the impact of colour in surveys is pretty slim, many survey design guides warn that colour can potentially influence survey responses. Most of the research on the effect of colour in surveys has looked at response rates in mail surveys, and unfortunately, there are not many conclusive results about whether colour has a significant effect online. One study found that added elements like pictures don’t seem to cause detrimental results in online surveys. It’s not all good news, though: other experiments showed that question types might affect responses to perceptual questions, and in some of these instances, colour has played a role.

How Can Colours Affect Your Research?

Colour can make a survey less clear. A difficult-to-read or difficult-to-complete survey will have lower response rates and potentially misleading results if respondents misunderstand the survey questions or if the answer options are difficult to read.

A coloured mail survey could be conspicuous – or look like junk. Both positive and negative effects have been found for coloured mail surveys. If colour makes a survey more noticeable, it can serve as a reminder to complete it and increase response rates. However, if a survey is mistaken for junk mail, response rates can decrease.

A coloured scale can affect rating questions. Colour can influence the perception of a scale’s spread and influence results on perception-based rating questions such as agree-disagree scales or numerical rating scales. When the gradient of the colours from one end to the other is amplified, respondents perceive the scale as more severe and give more moderate ratings.

Inventory Questions Are Pretty Safe

Inventory questions such as “Who is your current telecommunications provider?” or “In what year were you born?” do not appear to be affected by the question design or colours used, because they have an objectively true answer. As long as the question design and layout are clear and the design doesn’t cause confusion, there is no evidence that the survey’s design affects the responses.

Perceptual Questions May Be Affected

Perceptual questions, on the other hand, may be affected by various factors concerning the question style. I know what you’re thinking: we already know that. Very true. Perceptual questions should always be taken with a grain of salt and considered a comparator rather than an absolute measure, whether they’re rainbow coloured or black on white.

Sliders seem to have some interesting effects on survey responses. Both the initial placement of the slider and its size matter: a wide slider discourages respondents from answering at the extremes, and a slider initially placed in the middle discourages a neutral response (respondents prefer to move it rather than leave it where it is). Colouring may also matter in the interpretation of the scale, if the colours used affect the respondent’s perception of the measurement.

Overall, using colour and changing design seem to be okay as long as they are consistent. Think of the different question styles as different anchor points and treat them this way in the analysis.

Entering the Era of Grayscale Research Surveys?

Colour and visual elements might be a fun addition to your survey as long as you don’t go overboard: clarity is key to collecting quality data. Remember that researchers see far more surveys than respondents do: make sure it’s not you who is bored with the formatting. It’s a safer bet to keep the wilder stuff for the inventory questions. Consistency should be a key priority in tracking work (I’m sure I’m the first one to ever recommend that!). Questions that are going to be compared should be in the same format. There, I said it: researchers, here’s your excuse to feed your addiction and give respondents a few pages of item-bank radio button grids.
 

Different research challenges require different research solutions, and knowing when to use a specific approach can be a daunting task. This overview highlights some instances in which online communities may be preferred in place of custom ad hoc research.

Combination of Quant and Qual – Online communities offer researchers a solid opportunity to gather both quantitative and qualitative data at the same time and at a lower cost. Because most online community platforms have both quant and qual tools built in, research can be conducted much more quickly and efficiently than with a combined qual-quant ad hoc study.

Demographic Segments – If you are looking to segment individuals based on demographics, online communities work well. Short surveys are used to profile individuals, and then targeted research questions are presented to the entire group to pinpoint where profile differences emerge. Groups can also be formed based on demographics, and targeted research can be conducted with specific subgroups. This approach can be achieved much more easily with an online community than with a long ad hoc questionnaire that uses skip logic to segment groups during the survey.

Regional, National, and International Research – If the research question requires insights from individuals who are geographically dispersed, an online community is an excellent research platform. If a wide scope is required, individuals can be recruited from different regions, provinces/states, and countries. Online communities are borderless, and research can easily be conducted in several languages.

Engaging Research – In place of long and often boring surveys, try using an online community to spice up your research questions and increase engagement. If your research topic is dull in survey form, consider an online community to allow for a more open forum for discussion. The community also allows for innovative approaches such as co-moderation, where one or more community members take an active role in conducting the research. Rather than gathering a lot of yes/no and scale answers, you can collect rich, organic data from engaged members whom you can return to with future research questions.
 

This white paper provides an introduction to statistical and significance testing in market research and answers the following questions:

What does statistical testing mean, how is it shown, and how should it be interpreted?
Why are there multiple statistical tests, and how are they different?
What do terms like “margin of error” and “nineteen times out of twenty” mean, and how are they relevant?
Why is a margin of error not reported in online research?

Most of the time when doing marketing research, there is interest in differences between groups. Demographic groups, groups based on psychographics or attitudes, or any number of other slices and dices may be relevant to the researcher. However, for a non-researcher or a new researcher, entering the world of stat testing and interpretation can be daunting. Sometimes researchers forget that most people don’t look at research results all day, and we often forget that not everyone can eyeball a significant difference!
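The “nineteen times out of twenty” wording refers to the 95% confidence level behind a reported margin of error. As a quick, hedged illustration (not taken from the white paper), the standard arithmetic for the margin of error of a proportion looks like this in Python; the sample size of 400 is just an example figure:

```python
# Sketch of the standard margin-of-error calculation for a proportion
# at 95% confidence ("nineteen times out of twenty"). Example values only.
from math import sqrt

def margin_of_error(n, p=0.5, z=1.96):
    """Half-width of the 95% confidence interval for a proportion.
    p = 0.5 gives the most conservative (widest) margin."""
    return z * sqrt(p * (1 - p) / n)

# Example: a probability sample of 400 respondents
print(f"+/- {margin_of_error(400) * 100:.1f} percentage points")  # about +/- 4.9
```

This formula assumes a probability sample, which is why, as the white paper discusses, a margin of error is not reported for non-probability online panel research.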
 

A new independent online poll conducted by Insightrix Research suggests that residents are divided on whether the new Regina sewage treatment plant should follow a traditional Design, Bid and Build (DBB) approach or a Public-Private Partnership (P3) approach.

Awareness of and Following the Debate

Awareness of the debate regarding the development of a new sewage treatment plant is widespread. Nearly all Regina residents surveyed (96%) report they are aware of the debate taking place regarding whether the City of Regina should use a DBB or P3 approach to building the new sewage treatment plant. Further, 94% are aware that a referendum is being held on September 25th where Regina residents can vote on the issue. Additionally, eight in ten (81%) residents aware of the issue say they are actively following the discussion (22% very closely, 59% somewhat closely), while the remainder (19%) are either not following the issue at all (8%) or are only listening to what their friends or family tell them as the debate unfolds (11%).

Support for P3 vs. DBB

Respondents were presented with the following brief description of the two approaches: The City of Regina Council unanimously approved using a public-private partnership (P3) for the sewage treatment plant because it believes this to be the best option for the city. Council reports that a P3 costs less than other options, is less risky, and is much more likely to be built on time and on budget. However, some do not support the idea of a P3 approach because they feel it does not provide accountability to citizens, it will cost more than the traditional Design, Bid and Build (DBB) approach, privatization is risky, and Regina’s entire water system should be kept public.

After hearing this description, respondents were asked to state which approach they personally support. Four in ten (40%) Regina residents say they support a P3 approach, while three in ten (30%) support a traditional DBB approach. More than one quarter (27%) are unsure, and another 3% are indifferent on the issue. A P3 approach is more strongly supported by males (46% vs. 35% among females), and support for this approach tends to rise with household income. Among those who plan to vote in the upcoming referendum (66% of respondents), 45% are in favour of a P3 approach while 37% prefer a DBB method. Nearly two in ten of those who plan to vote (18%) are unsure which approach they support.

Research Details

A total of 400 randomly selected SaskWatch Research™ panel members who live in Regina participated in the online research study from September 11th to 15th, 2013. Quotas were set by age, gender and region to match the general population of the city. As the research is conducted online, it is considered a non-probability sample and, therefore, margins of error are not applicable.

About SaskWatch Research™

Insightrix started developing the SaskWatch Research™ online market research panel in October 2007, using high-quality techniques including telephone recruitment and referrals from existing panel members. Presently, there are over 14,000 active panel members representing all regions of the province, with distributions that reflect the general population. The panel membership closely matches the 2011 Census by age, gender, household composition, household income and education. For more information, please visit: http://saskwatch.ca.

About Insightrix

Founded in 2001, Insightrix Research Inc. is a full-service market research firm that helps clients develop, administer and manage data collection and information strategies. From its office in Saskatoon, Insightrix offers a comprehensive range of research services.

For further information, contact:
Lang McGilp, Senior Research Executive
Insightrix Research Inc.
Tel: 306.657.5640 Ext. 229
Cell: 306.290.9599
Fax: 306.384.5655
Email: lang.mcgilp@insightrix.dev1.commandbase.ca
Web: insightrix.dev1.commandbase.ca