
Insight Communities and Modes of Thought

In a recent article on Marketresearch.com's blog, Anne Beall examines two different modes of thinking: one conscious and deliberate, the other unconscious and automatic. These distinct types of thinking are not new, but in terms of market research insight, treating the two as separate is new and could be of considerable importance. Before we get into further details, let's stop and understand the difference in a consumer's decision-making process based on the two types of behaviour: the two modes of thinking are called systematic and heuristic. Click here to read the full blog....
 

Do you plan to engage millennials to gather their opinions on product innovation? In a recent Fast Company article entitled These Millennials Have Become The Top Decision Makers At IBM, author Cale Guthrie Weissman looks at how IBM has created a panel of millennial employees whom they can survey on key ideas, gathering insights on how to stay relevant. Why is this so revolutionary, you might ask? Read more >> ...
 

Ten years ago, the world was introduced to YouTube. Ever since, the media has been rife with stories of viral videos. These videos are often seen as overnight successes, where uploaders post something they found funny, interesting, or absurd to show their friends. At Insightrix Research, we set out to make a fun video that showed a little about who we are. Before we knew it, our slang-based video, “SaskatcheWHAT?!”, had a quarter of a million views and was featured in news outlets across the country, including Huffington Post, CBC News, and Global TV. The following white paper highlights what makes a viral video, as well as the data and feedback we received from ours....
 

By Corrin Harper, CEO and President, Insightrix Research Inc.

Recently I had the opportunity to attend the W100 Idea Exchange, designed specifically for women entrepreneurs by the editors of PROFIT and Chatelaine. The Idea Exchange, held at the Ritz-Carlton in Toronto on November 26th, provided a unique opportunity to share best practices with other leading women entrepreneurs, to learn from management experts about today's key business opportunities and challenges, and to create personal connections with other top entrepreneurs. Both educational and inspirational, it was a wonderful experience!

One item from the Exchange really resonated with me: every entrepreneur has to find their own path and use the lessons they learn along the way to their advantage. Entrepreneurs, no matter what their educational background, achieve success by focusing on their passion and backing it up with an unparalleled work ethic. Here are a few tips discussed at the Exchange that can help a person become the most successful entrepreneur possible.

Work Hard, No Matter What

If you aren't giving it 100 percent, you won't be successful, and this resonates beyond business. Success is directly related to the amount of effort you're willing to put in. Always. Giving 100 percent does not mean having to put in 100 hours a week, but it does mean taking care of your investment, taking breaks when needed to clear your mind and see things from a fresh perspective, and, above all, being dedicated and driven to meet your goals.

People are Your Greatest Assets

All businesses need a range of skills to be sustainable and able to grow. As the owner of a business, you are called upon to perform several roles out of necessity. However, there are some roles you are better at than others. If you want your business to progress, it will reach a stage when the necessary skills need to be improved and extended. Getting the right mix of people to complement and reinforce your skills is essential. When hiring for a small business, you need employees who demonstrate entrepreneurial characteristics and work habits. Employees come in all shapes and sizes, with all sorts of diverse skills and quirks. Look for those who can handle risk and are results-oriented, team players, high-energy, and growth-oriented. No one is perfect, so create an environment where these dynamics are supported and work with employees to maximize their potential in those areas.

Learn from Your Hits and Misses

Every entrepreneur will have missteps and false starts along the way, but they will end up helping in the long run. You have to be prepared to take those high-reward risks and accept the fact that you will make mistakes along the way. The key is not to fall into the trap of believing you always have to get everything right.

Risk Taking

Risk-taking is almost synonymous with entrepreneurship. When just starting your business, you'll have to put your career, personal finances, and sometimes even your mental health at stake. Almost all entrepreneurs at the Idea Exchange shared stories about abandoning their steady paycheques, sacrificing personal capital, and donating personal time and health. Let's face it -- this sounds awful. However, entrepreneurs frequently talk about this as the best time of their lives! Because we are dedicated to our businesses, enjoy what we are doing, and worked hard to overcome those stressors, we often reflect on these times with a positive feeling, knowing how far we've really come.

The Exchange was a wonderful experience, sharing stories with other entrepreneurs, and one that I will remember always....
 

#1. What does a research company need to know from me?

If your business has questions that need answering, you may have made the same decision that a lot of businesses do: turning to market research to get those answers. However, for companies that have never done market research before, it may be difficult to know what to do in order to get the best possible results. Start with an internal discussion of what problems your business may be facing. You don't need to know exactly how you will address those questions through research – that's where the provider will come in. Also think about what goals you are hoping to achieve with your research results and who in your organization will need to use them. Talk with stakeholders within your organization to see what's on their wish list. These points will allow the researchers to choose a research methodology that fits your needs. Furthermore, letting your research provider know what kind of budget you are working with will help them operate within your means.

#2. Who is my target population?

Determining your target population or respondent group (those who will be providing you with answers – such as current or potential customers, employees, or other stakeholder groups) is crucial to gathering actionable results that answer the research objectives. The target respondent is often determined by thinking critically about what hurdle your business is trying to overcome. For example, if you wanted to determine customer satisfaction for a line of winter tires, you might want to survey only those with a vehicle. Your target audience may be quite general (all Canadians, for example) or quite specific. If you're unsure who your target audience is, our team of research executives can help you pinpoint exactly who you should be talking to. For example, our SaskWatch Research™ panel has over 100 profile questions to ensure your survey is taken by exactly who you want to reach.

#3. What are quotas and do I need them?

Quotas are partitions of the population created to make sure your research is representative of the population you're trying to survey. For example, if you want to get an idea of what Canadians think about a certain topic, you would want about half of the people answering to be male and half female in order to match demographics. The most common quotas are based on age, gender, and region. Setting quotas helps ensure that the results you get are applicable to the population at large. This extra step allows you to make accurate forecasts about things like market share or uptake of a new product.

#4. What approach should I take?

The kind of research methods you use depends on what kind of answers you're looking for. If the questions you want to ask start with 'how many', 'how often', 'what', or 'where', then chances are you will want to use a quantitative research method. This type of research is intended to be statistically reflective of the market and will give you statistics that you can extrapolate to a larger population. If your question is about 'why' or 'how', you might want to use a qualitative research method. Qualitative research employs methods, such as focus groups and in-depth interviews, that allow you to dig deeper, but with fewer respondents. These exploratory methods more fully uncover respondents' perceptions, experiences, and feelings, and add context to quantitative results. The division between qualitative and quantitative research is frequently blurred: often, a comprehensive research project will involve more than one type of research to answer various objectives.

#5. How much data should be collected?

This depends heavily on what kind of budget you are working with; however, spending more is not necessarily better. Depending on your objective, impactful research can be done on a relatively minimal budget. Think about how much detail you need in order to make the business decisions you're trying to make. Are you looking to understand whether your customers are satisfied with a certain product? Measuring a potential customer's perceptions and barriers? Determining awareness of your brand on a local scale? Whatever the need, it is crucial that enough respondents are obtained to ensure your results are statistically valid. However, paying more to get extremely accurate results when the organization is not able to act on them means wasted budget. An experienced research provider can guide you through these decisions and recommend an effective methodology.

#6. How long will it take?

How long your research takes depends on the type of research you want to conduct, how many people you want to reach, and the method you would like to use (telephone takes longer than online, for example). It also depends on how clear you are about your research objectives. Your research provider will give you a timeline at the beginning of your project so that you are able to plan accordingly. However, there are options if you are in a time crunch. For example, Insightrix offers a monthly OnTopic™ omnibus survey. OnTopic™ surveys are great if you only have a few questions to ask and a tight deadline for the data. Your questions are combined with those of other omnibus clients and given as a single survey to our panelists. OnTopic™ has a one-week turnaround, giving you faster access to the answers you've been looking for.

#7. How will I be kept aware of the progress of my research?

Your project will be assigned a project manager who will keep you informed of the status of your research. This is your primary point of contact, and this individual will inform you if any complications arise. Especially if your research project has a long field window, you may want to monitor the results as interviews are taking place. Topline reporting allows clients to monitor the results of each survey the moment responses are collected. The results ideally include user-friendly features that display counts, percentages, and graphs for each question, offering the ability to share these topline results within your organization.

#8. What kind of results will I receive?

Research results don't mean much if they are indecipherable. Depending on the needs of your business, there are many different types of deliverables that can be provided at the end of your research project: written reports in Word, PowerPoint reports, detailed tables, in-person presentations, and infographics, to name a few. With individuals at all levels of an organization becoming shorter and shorter on time, a concise reporting style is essential. Sharp analysis, visually engaging presentations, structured narrative, and succinct summaries, as well as infographics that "pop" and engaging videos, will engage stakeholders with the story your data is trying to tell....
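The quota idea in question #3 can be sketched in a few lines of code: targets for completed interviews are allocated in proportion to the population. This is a minimal illustration only; the gender-by-age proportions below are made-up placeholders, not real census figures, and a real project would source them from official statistics for the target region.

```python
# Sketch: allocating quota targets so completed interviews mirror the
# population. The proportions here are illustrative placeholders only.

population_profile = {
    ("male", "18-34"): 0.14,
    ("male", "35-54"): 0.18,
    ("male", "55+"): 0.17,
    ("female", "18-34"): 0.14,
    ("female", "35-54"): 0.18,
    ("female", "55+"): 0.19,
}

def quota_targets(total_completes, profile):
    """Allocate the target number of completes to each gender/age cell."""
    return {cell: round(total_completes * share)
            for cell, share in profile.items()}

# For an 800-complete survey, each cell's target is its population share of 800.
targets = quota_targets(800, population_profile)
```

In practice, fieldwork software closes a cell once its target is reached, so the final sample matches the demographic mix above within rounding.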
 

Because of recent news events, the 2014 CIRPA conference in Hamilton, Ontario had a more introspective atmosphere compared to years past. All the same, the conference was full of great information and friendly people, all wanting to share their knowledge and insights on various facets of institutional research. Many of the themes at this year's conference had been recognized in sessions in prior years; however, these trends continue to evolve with advancements in technology and broader, worldwide changes in research. There were many new insights at the sessions. In my opinion, two interesting themes were changes in data collection and changes in data dissemination: the types of data and methods used to collect primary data, and the ways in which research results are being disseminated to key stakeholder groups.

Survey methodologies are quickly evolving, in educational institutions and elsewhere in market research. As mobile phones become ubiquitous, it becomes ever more imperative that researchers adapt their survey design and their expectations to the changing technology. The use of mobile phones to take surveys has been steadily increasing – to such an extent that research must take the device into account, most specifically in the design of the survey. This means more than making sure the survey technically works on a smartphone. It might mean shortening surveys or adapting the question types to be easier to complete on a mobile device. There was also a great deal of discussion about augmenting survey results with other data, including nationally or provincially collected statistics from government agencies and data from online conversations on social media sites. Additionally, more sophisticated data tools and outcomes-driven predictive modelling, on things like retention and enrolment, are becoming more common parts of an educational institution's research toolkit. I find this development promising, and it has the capacity to be a relevant theme for many years as the amount and nature of the data available increases.

Disseminating information within a large organization such as an educational institution can be difficult. For this reason, many research groups are actively creating tailored materials to disseminate information to a variety of audiences. At CIRPA, there were several sessions with tips on data visualization, dashboarding, and combining multiple datasets into more holistic research results. With the amount and variety of information ever increasing, it is often difficult to ensure that stakeholders aren't drowning in data. I anticipate it will become common for institutional research groups to provide stakeholders with more frequent, shorter sets of research results rather than a long, drawn-out report that they may or may not have time to read at once. This follows a greater theme in research where clients are increasingly asking for a suite of deliverables rather than a single report.

The conference left me with a feeling of confidence in the way institutional researchers are tackling some of the major changes in the industry. Adaptability is key to making sure that participation in research remains high, and I saw evidence of many institutions innovating on their current practices to address these needs. Overall, it was a great conference where I had a great time connecting with old friends and making new ones. I'm looking forward to seeing everyone in Halifax for CIRPA 2015!...
 

Dashboards are a great way to present information, especially when the data needs to be shown at a high level. Digital dashboards are collections of key reports, metrics, KPIs, and other data that provide relevant context and highlight the essential elements of a research study. They are a great tool for presenting information to executives who may only have a few minutes to review and make decisions about a project. Here are five key points to consider when developing dashboards for executives.

#1. Dashboards are not scorecards.

Scorecards are report cards for your projects. Scorecards measure performance against goals, show the success or failure of specific metrics, and are utilized once a project is complete. Dashboards, on the other hand, are used throughout a project and offer a snapshot of a study's progress. Dashboards are a collection of reports, KPIs, and comments from consumers, all of which provide context for the status of a project.

#2. Looks matter.

A dashboard needs to convey information quickly and clearly, so appearance is very important. All elements of a dashboard, including gauges, colour, highlights, and fonts, are critical to ensuring that messages are communicated efficiently.

#3. Dashboards should be actionable.

Every dashboard should be created with the goal of making the data actionable. Since organizations collect large amounts of data, dashboards need to provide an overview of the most relevant information in a concise, clear manner. Remember that dashboards are not reports: their function is to assist with the decision-making process.

#4. A one-size-fits-all approach does not work.

While dashboards keep track of the relevant information for a project, the same information and style of presentation will not meet the needs of all hierarchical levels. According to dashboardinsight.com, performance dashboards can be loosely categorized into four levels, and each should include a different number of metrics:

CEO/board level – about six high-level metrics
Corporate vice president/director level – between 12 and 20 metrics
IT strategic level – a range of 12 to 50 metrics
IT operational level – around 20 metrics

Always begin dashboard design with a clear understanding of the end user and his or her executive level. While different levels of users will require different dashboard views, remember that you can create filters to extract the information required for each type of user.

#5. Focus on simplicity.

Poorly designed dashboards cram huge amounts of data onto one screen, preventing clear understanding and slowing down decision making. With more and more web applications using a minimalist design (a change for the better), dashboards need to be clear and simple. Use clear fonts, appropriate whitespace, and iconography to guide the user through the dashboard....
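The per-level filtering idea in point #4 can be sketched as one shared metric store filtered into role-appropriate views. This is a hedged illustration only: the metric names, priorities, role tags, and per-role caps below are hypothetical and not drawn from any particular BI tool.

```python
# Sketch: one metric store, filtered per audience level. Each metric is
# tagged with the roles that should see it; each role caps how many
# metrics appear. All names and numbers are illustrative.

METRIC_LIMITS = {"ceo": 6, "vp": 20, "it_strategic": 50, "it_operational": 20}

metrics = [
    {"name": "customer_satisfaction", "priority": 1, "audience": {"ceo", "vp"}},
    {"name": "survey_completion_rate", "priority": 2, "audience": {"vp", "it_strategic"}},
    {"name": "server_uptime", "priority": 3, "audience": {"it_operational", "it_strategic"}},
]

def dashboard_view(role, metrics):
    """Return the highest-priority metrics tagged for a role, capped at the role's limit."""
    visible = [m for m in metrics if role in m["audience"]]
    visible.sort(key=lambda m: m["priority"])
    return visible[:METRIC_LIMITS[role]]
```

The design point is that one dataset feeds every audience: a CEO view and an IT view differ only in the filter applied, not in the underlying data.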
 

By Briana Brownell

Does a Yellow Checkbox Give You Better Brand Equity?

The meaning of colours in branding and marketing is a popular topic: blue means you're trustworthy and yellow means fun. But do yellow checkboxes mean that respondents will give you better scores on brand aspects like approachability or likeability? Maybe... as researchers love to say, more research may be required. Even though research on the impact of colour in surveys is pretty slim, many survey design guides warn that colour can potentially influence survey responses. Most of the research on the effect of colour in surveys has concerned response rates in mail surveys, and unfortunately, there are not many conclusive results about whether colour has a significant effect online. One study found that added elements like pictures don't seem to cause detrimental results in online surveys. It's not all good news, though. Other experiments showed that question types might affect responses to perceptual questions, and in some of these instances, colour has played a role.

How Can Colours Affect Your Research?

Colour can make a survey less clear. A difficult-to-read or difficult-to-complete survey will have lower response rates and potentially misleading results if respondents misunderstand the survey questions or if the answer options are difficult to read.

A coloured mail survey could be conspicuous – or look like junk. Both positive and negative effects have been found for coloured mail surveys. If colour makes a survey more noticeable, it can serve as a reminder to complete it and increase response rates. However, if a survey is mistaken for junk mail, response rates can decrease.

A coloured scale can affect rating questions. Colour can influence the perception of a scale's spread and influence results on perception-based rating questions such as agree-disagree scales or numerical rating scales. When the colour gradient from one end of the scale to the other is amplified, respondents perceive the scale as more severe and give more moderate ratings.

Inventory Questions Are Pretty Safe

Inventory questions such as "Who is your current telecommunications provider?" or "In what year were you born?" do not appear to be affected by the question design or colours used, because they have an objectively true answer. As long as the question design and layout are clear and the design doesn't cause confusion, there is no evidence that the survey's design affects the responses.

Perceptual Questions May Be Affected

Perceptual questions, on the other hand, may be affected by various factors in the question style. I know what you're thinking: we already know that. Very true. Perceptual questions should always be taken with a grain of salt and considered a comparator rather than an absolute measure, whether they're rainbow-coloured or black on white. Sliders seem to have some interesting effects on survey responses. Both the initial placement of the slider and the size of the slider matter: a wide slider discourages respondents from answering at the extremes, and a slider with an initial placement in the middle discourages a neutral response (respondents prefer to move it rather than leave it where it is). Colouring may also matter in the interpretation of the scale, if the colours used affect the respondent's perception of the measurement. Overall, using colour and changing design seem to be okay as long as they are consistent. Think of different question styles as different anchor points and treat them this way in the analysis.

Entering the Era of Grayscale Research Surveys?

Colour and visual elements might be a fun addition to your survey as long as you don't go overboard: clarity is key to collecting quality data. Remember that researchers see far more surveys than respondents ever will: make sure it's not you who is bored with the formatting. It's a safer bet to keep the wilder stuff for the inventory questions. Consistency should be a key priority in tracking work (I'm sure I'm the first one ever to recommend that!). Questions that are going to be compared should be in the same format. There, I said it: researchers, here's your excuse to feed your addiction and give respondents a few pages of item-bank radio button grids....
 

by Briana Brownell

I always find conferences inspiring, and the recent MRIA conference in Saskatoon was especially motivating. It was packed with great concurrent sessions, provocative panel discussions, motivating keynote presentations, beautiful views of Saskatoon, and great times talking with the many industry leaders in attendance. Now that I'm back in the office, I've got my desk cleared off, and I've written up my lengthy conference follow-up to-do list. These are the top three changes I'm going to make:

#1. Be okay with frayed edges

I'll always remember one designer's interview comment about the Canadian fashion market: "You can't sell frayed edges to Canadians. They just don't get it. They need everything to be done." This is true as well for marketing research in this country – we've got big firms with a fairly wide array of off-the-rack products that see minimal modification. And I see why: structure is cozy. It's so nice when a research project has clear boundaries. When you've collected the data, done the analysis as per the plan, and written that last word on the PowerPoint "conclusions" slide, you can be satisfied that it's done. But, unfortunately for a quant person like me, not everything can be so nicely captured in an SPSS file. Instead, the edges of the research often become important, as we have seen in the many new and often surprising findings in customer satisfaction research. To this end, I was happy to see several exploratory presentations that examined a customer's holistic interaction with the brand. Lesley Haibach and Anne Kossatz's presentation on their successful implementation of a change in RBC's inbound call centre explored the moments when a company has the greatest chance to impact customer satisfaction. They found that an important insight came from understanding the customer's state of mind when he or she contacts customer support, which allowed RBC to make a small change in the organization that had big results. What was so impressive about this research program was the alignment of the organization with the research results. They achieved considerable buy-in from all levels within RBC, so much so that human resources even altered their hiring practices! Amy Charles and Joel Weinberger explored the edge of conscious and unconscious responses using a very interesting method based on psychology experiments to derive implicit associations. I've seen this technique used before (actually, I personally feel that this is an example of such successful gamification that it becomes a methodology in its own right, but that's a discussion for another time)...
 

Different research challenges require different research solutions, and knowing when to use a specific approach can certainly be a daunting task. This overview highlights some instances in which online communities may be preferred over custom ad hoc research.

Combination of Quant and Qual – Online communities offer researchers a solid opportunity to gather both quantitative and qualitative data at the same time and at a lower cost. Because most online community platforms have both quant and qual tools built in, research can be conducted much more quickly and efficiently than with a combined qual-quant ad hoc study.

Demographic Segments – If you are looking to segment individuals based on demographics, online communities work well. Short surveys are used to profile individuals, and then targeted research questions are presented to the entire group to pinpoint where profile differences emerge. Groups can also be formed based on demographics, and targeted research can be conducted with specific sub-groups. This approach can be achieved much more easily with an online community than with a long ad hoc questionnaire that uses skip logic to segment groups during the survey.

Regional, National, and International Research – If the research question requires insights from individuals who are geographically dispersed, an online community is an excellent research platform. If a wide scope is required, individuals can be recruited from different regions, provinces/states, and countries. Online communities are borderless, and research can easily be conducted in several languages.

Engaging Research – In place of long and often boring surveys, try an online community to spice up your research questions and increase engagement. If your research topic is dull in survey form, consider an online community to allow for a more open forum for discussion. The community also allows for innovative approaches such as co-moderation, where one or more community members take an active role in conducting the research. Rather than gathering a lot of yes/no and scale answers, you can collect rich, organic data from engaged members whom you can return to for future research questions....
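The profile-based segmentation described above amounts to filtering stored profile answers. As a minimal sketch only: the field names, values, and member records below are hypothetical and not from any actual community platform.

```python
# Sketch: targeting a community research activity at a sub-group by
# matching stored profile answers. All fields and records are illustrative.

members = [
    {"id": 1, "region": "SK", "age_group": "18-34", "owns_vehicle": True},
    {"id": 2, "region": "ON", "age_group": "35-54", "owns_vehicle": False},
    {"id": 3, "region": "SK", "age_group": "55+", "owns_vehicle": True},
]

def segment(members, **criteria):
    """Return community members whose profile matches every criterion."""
    return [m for m in members
            if all(m.get(field) == value for field, value in criteria.items())]

# e.g. invite only Saskatchewan vehicle owners to a winter-tire discussion
sk_drivers = segment(members, region="SK", owns_vehicle=True)
```

Because profiles are collected once and stored, each new research question reuses the same data instead of re-asking screening questions in every survey.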