Home

Rick Frank is the President and founder of Dufferin Research Inc. He has a Master’s Degree from the University of Toronto and has been employed in the Market Research industry for over 25 years. An early advocate for internet data collection on the Linux platform, he started Dufferin Research in 2000 as a small operation in Dufferin County, Ontario, moving to Ottawa in 2002.

His current research interests include climate and environmental topics, sustainable living and business practices, sustainability reporting standards (he supports GRI), travel and tourism, as well as technical issues related to doing business in a totally connected environment.

Rick has been an environmentalist since the mid-1980s, when he designed and built an R2000 home and researched materials, energy sources, recycling and sustainable land use. He has planted many thousands of trees, and during that time he also spent a year on a regional waste management board planning a new landfill site (never approved). He currently lives in a log house on 26 acres of mostly forested land in rural west Ottawa. He has chainsaws and axes and knows how to use them. Rick and his family live as sustainably as possible: they manage a large garden, surrounded by chickens, ducks, turkeys, dogs and cats. He has more eggs in the fridge than would seem reasonable, but you know, chickens.

Rick has been very active on social media platforms since 2007. He has traveled extensively (25+ countries) and recently lived for a couple of years in Europe exploring the cultural differences in lifestyles, food and workplace norms. He would be happy to discuss this over beer or coffee anytime. And finally, he tries to keep up to date on internet security issues and hacks away on his own network to improve processes, user experiences and data security.


Our roots

Dufferin Research has been specializing in online survey programming and the hosting of both stand-alone custom research projects and continuous tracking studies for close to two decades. The firm was established in 2000 to provide these specialized services to smaller Market Research firms and consultants. Dufferin Research now deals with a wide range of clientele, from international research organizations to direct research buyers and users from global, national and local businesses and organizations (including not-for-profits), in addition to our original target market.

When we started online research, the industry was in its infancy. If you were online, dial-up access was the norm and rich media referred to the television networks. Fast forward to today and internet-based research is the dominant data collection methodology globally. Rich media, in the form of video, audio and high-resolution graphics, is standard, and broadband and wireless access is the norm. The new frontiers are in mobile computing, semantic technologies and AI.

The data collection services at Dufferin Research follow best-practice principles simply not available if you D-I-Y program in an ad hoc fashion. The data is collected and stored within Canada. We can certify where the data is physically stored, where the backup copies are and who has had access; moreover, we can destroy any and all copies on request. Data stored in a cloud computing environment, using software as a service (SaaS) with head offices in foreign countries, can offer no such guarantees.

Data collection

Dufferin Research provides cost-effective, reliable services that combine superior technical skills, excellent project management and the sort of robust & secure infrastructure needed to succeed in this business. For further technical details see our capabilities page. But while we love the beauty of automation, and are masters of many programming languages, we still view ourselves first & foremost as a service company.

We are 100% client focused. Your needs are what drive us. It's what we do. This philosophy has a price: service does not scale like technology, so we have come to terms with, and embrace, the fact that we will remain a fairly small company, simply because what we do cannot be replicated cost-effectively on a large scale. Hands on means hands on.

Full service

Over the past few years, Dufferin Research has been moving towards providing an additional range of services. In addition to traditional quantitative online data collection, we provide a full range of services typically found in a full-service market research firm. Where the expertise exists in-house, we provide end-to-end research: from the up-front consultative process to define your needs and the project scope, through project and questionnaire design, to the analysis and delivery of the results.

Should we feel your project requires it, we will collaborate with our network of partners to provide the solution that best suits your needs and your budget. We are experts in quantitative methods but work with qualitative researchers when the numbers alone may not tell the whole story.

Your needs are very likely unique, so an out-of-the-box solution is unlikely to be the best one for you. We will turn the data into intelligence and will engage in whatever research methodology is required to do this. Unless you can take the results of a research project and make an informed business decision based on them, the exercise has been futile.

Assisted service

Being typically Canadian, we can compromise to reach a solution that meets your needs.

There is a middle ground between D-I-Y (do-it-yourself) projects and full service that is more than data collection only. Starting with our Research Rabbit product (for employee satisfaction and engagement surveys), we developed a series of products where the client makes some of the design decisions alone within the software framework; we then evaluate the design, execute the data collection and report the findings using a templated reporting structure that minimizes our labour and thus lowers the cost for the user.

You do a bit more, you get a lot more, and you pay less.

Secondary research (1.0 to 3.0)

Traditional secondary research is compiling data from known archival sources; it is akin to the work you did for essays when you went to school.

Version 1.0 was hard copy or electronic records usually in libraries.
Version 2.0 is secondary research online (can I Google that for you?)
Version 3.0 is the semantic web.

Now we have the growing use of semantic software to scrape or data-mine the web and glean insights through "smart" interpretation. Semantic software is used to conduct secondary research on steroids; the quantity of data online is so vast that no one can process it. But it is no magic bullet: like archival research of the past, this is merely collecting, classifying and interpreting that which already exists.

Your needs

Surveys, however executed, are primary research: the creation of NEW data that, for the moment at least, is yours alone (see data collection, full & assisted service).

Dufferin Research can help you choose when primary quantitative research is best, when you need secondary research and, if you do, which type.

We can also advise you when qualitative research may be better suited to your needs, or when a hybrid methodology would be best, and then of course we can conduct the research needed.

The Advantages of Online Primary Research

  • Speed
    Surveys can be completed in days, sometimes within hours of emailing invitations.

  • Larger sample sizes for very little extra cost
    Unlike a telephone survey, which is very labour-intensive, a web survey has a fixed labour cost. The cost of programming the survey and delivering the agreed-upon final product is largely unaffected by sample size. Costs rise with sample size for only three components; a rough cost sketch follows this list.

    • Purchased access panel sample
    • Length of time in field
    • Coding open ended questions
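
To make that cost structure concrete, here is a minimal, hypothetical sketch in Java: the dollar figures, variable names and the choice of two open-ended questions are illustrative assumptions, not Dufferin Research pricing. Programming and reporting are treated as a fixed cost, while panel sample and open-end coding scale with the number of completes.

```java
// Hypothetical cost sketch; all dollar figures are illustrative assumptions.
public class SurveyCostSketch {
    public static void main(String[] args) {
        double fixedCost = 5000.0;          // programming, project management, reporting (assumed)
        double panelCostPerComplete = 6.0;  // purchased access panel sample, per complete (assumed)
        double codingCostPerOpenEnd = 1.5;  // coding one open-ended response (assumed)
        int openEndsPerSurvey = 2;          // open-ended questions per survey (assumed)

        for (int n : new int[] {500, 1000, 2000}) {
            double total = fixedCost
                    + n * panelCostPerComplete
                    + n * openEndsPerSurvey * codingCostPerOpenEnd;
            System.out.printf("n = %d completes -> total about $%.0f ($%.2f per complete)%n",
                    n, total, total / n);
        }
    }
}
```

The per-complete cost falls as the sample grows because the fixed programming and reporting cost is spread over more completed surveys.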


Sampling Issues in Internet Research

  • Representativeness of the sample

    All sampling methodologies have some limitations preventing them from being 100% representative of any given population. Just as telephone studies have limitations due to the problem of unlisted numbers and cellular phones, and mailout surveys have problems such as restricted access buildings and bad mailing addresses, internet surveys also have limitations.

    You can only claim to represent those who have access to the Internet, whether at home, at work or from some other location. However, the most recent Statistics Canada survey shows that around 80% of people aged 16+ in Canada are online.

    Online studies are representative of populations of online users or subpopulations of Internet users. Internet-based studies are often viewed as appropriate for populations perceived to be high-tech users, or for many business-to-business studies, as these populations will likely be Internet users.

    Online studies cannot, and likely never will be able to, claim to represent the entire country's population, or all adults 35-50 years of age, or any other population that may have a significant proportion of non-Internet users. However, similar things can be said of all methodologies.

Solutions to the Sampling Issues

  • Combined methodologies

    Dufferin Research can combine CATI (computer-assisted telephone interviewing), self-completed mail and web methodologies, thereby passing on the speed and cost savings of Internet research for those respondents with Internet access while still offering the option of a telephone or self-completed mail survey. This type of survey, if designed properly, can be considered representative of the entire population.

  • Use Internet surveys for appropriate populations

    Internet research is not a replacement for all other survey methods. It is appropriate for many studies, but not all. We can advise you on the appropriateness of this methodology on a study-by-study basis.

Currently we run DASH Web software for all online surveys. The software is written in Java and the source programs for each survey are coded in XML. The web applications themselves are designed to accommodate heavy in-bound usage. There are very few design & logic limitations in the application. Complex logic based on multiple conditions, dynamically generated lists or tables based on previous responses, recalled data, on-the-fly calculations, max-diff and discrete choice conjoints are all common.
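
As a simple illustration of the kind of logic involved, the hypothetical Java sketch below builds a follow-up question's answer list dynamically from an earlier multi-select response. It is only an illustration of the technique; it is not DASH Web code and does not use its XML survey schema.

```java
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

// Hypothetical illustration of dynamic list logic; not DASH Web code or its XML schema.
public class DynamicListSketch {
    public static void main(String[] args) {
        List<String> brands = List.of("Brand A", "Brand B", "Brand C", "Brand D");

        // Q1 (recalled data): which brands is the respondent aware of?
        Map<String, Boolean> q1Awareness = Map.of(
                "Brand A", true, "Brand B", false, "Brand C", true, "Brand D", true);

        // Q2 is only asked about the brands selected at Q1
        // (logic conditioned on a previous response).
        List<String> q2List = brands.stream()
                .filter(b -> q1Awareness.getOrDefault(b, false))
                .collect(Collectors.toList());

        if (q2List.isEmpty()) {
            System.out.println("Skip Q2: nothing to evaluate");   // skip condition
        } else {
            System.out.println("Ask Q2 for: " + q2List);          // dynamically generated list
        }
    }
}
```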

Bandwidth/Capacity/Uptime

All regular production web servers are located at one of Rogers' secure data centres in Ottawa. The facility is multi-homed (carrier feeds from multiple providers other than Rogers) and the servers sit directly on the data centre's fibre network. These facilities are state of the art, with full generator backup and UPS protection. We have never had a serious outage in over a decade.

The network is robust and can sustain heavy traffic. The heaviest continuous traffic we have sustained and measured was a core of 200-300 (with peaks up to 400) simultaneously connected people completing surveys in the same study for 10-16 hours a day over about a week, producing close to 8,000 completed 20-25 minute surveys in that time. About a dozen smaller concurrent projects were running at the same time. This never approached the capacity of the internet connection at any point.

Since our inception in 2000 we have completed over 1,600 projects, resulting in well over two million completed surveys.

Addendum 2019.

In March of 2019 we will be bringing online new Dell servers with solid-state drives and more RAM to increase throughput, replacing our trusty but now aging Linux production servers (which will be moved to non-production roles). We are running RHEL 7.8 and multiple virtual hosts running several versions of Linux. Throughout 2019 we will be finishing the upgrades and hardware replacement started last year.

There will be no downtime in data collection. The new servers will be brought online prior to retiring the old servers, and all data will be synced prior to switching traffic to the new machines.

We remain committed to locally stored data within Canada. We believe this is a better way of securing proprietary client data than storing it somewhere in the cloud, where we cannot truly know where the data resides or how many copies have been made. That being said, we are not by any means technological Luddites. The cloud is great for public data and for raw CPU processing of real-time data; however, we need strict knowledge of where all the data we manage (original and backups) resides. That certainty we cannot ensure using cloud storage.

Security/Data Integrity

In July 2020 the primary firewall, a 2015 Barracuda Networks firewall appliance, was replaced with a Cisco Firepower 1010; we will maintain the Barracuda with an up-to-date configuration as a redundant backup. Access to the datacentre can only be gained after a retinal scan and use of an electronic access card. Behind the perimeter firewall, each server and each desktop machine runs a secondary local firewall.

For reporting, client logins for monitoring online results use a user name and password over a secure HTTPS browser connection.

Daily incremental backups of all business data are automated and stored on one of our NAS servers co-located at Rogers. As well, all current projects are fully synchronized at 15-minute intervals to a different internal server for immediate use in the event of a webserver failure. Finally, every 2 hours from 9 am to 9 pm, 7 days a week, encrypted backups of all active projects are made and download links are sent via email to key personnel, so up-to-date offsite backups can be kept even when no one is at the office. Periodic archives are written to encrypted CDs and stored offsite. Normally all data is destroyed after 5 years unless otherwise requested.
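
As a rough sketch of the two-hour daytime backup cadence described above, the Java snippet below schedules a recurring job and only runs it between 9 am and 9 pm. The runEncryptedBackup helper is a hypothetical stand-in; the actual archiving, encryption and email-delivery tooling is not shown.

```java
import java.time.LocalTime;
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

// Sketch of the backup cadence only; real tooling and encryption steps are not shown.
public class BackupScheduleSketch {
    public static void main(String[] args) {
        ScheduledExecutorService scheduler = Executors.newSingleThreadScheduledExecutor();

        // Fire every 2 hours; only run the job between 09:00 and 21:00.
        scheduler.scheduleAtFixedRate(() -> {
            int hour = LocalTime.now().getHour();
            if (hour >= 9 && hour <= 21) {
                runEncryptedBackup();
            }
        }, 0, 2, TimeUnit.HOURS);
    }

    // Hypothetical stand-in: archive active projects, encrypt the archive,
    // and email download links to key personnel for offsite copies.
    private static void runEncryptedBackup() {
        System.out.println("Encrypted backup of active projects started at " + LocalTime.now());
    }
}
```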

All data is stored in Canada on our secured hardware and is not cloud based. When we are requested to destroy data, we know where the data is and how many copies there are, and we can ensure its complete destruction.

Our official privacy policy is posted at http://www.dufferinresearch.com/privacy.html. We would be happy to supply any other specific non-confidential information not covered by this document upon request.
