I have a quantitative analysis bias: I think everyone needs statistical literacy to function in society. To be an intelligent consumer of news, you must understand fractions and percentages; unfortunately, many citizens lack these basic skills. That's where educational technology comes into play. Educational technology has evolved from the abacus, to whiteboards, to computers, with tons of technology in between and ahead of us. These technologies have allowed us to be more confident in what we're doing, to learn with greater ease, to learn faster, and to learn at lower cost.
Data science technology has evolved too. Over the past 40 years, statistical software, particularly IBM SPSS and SAS, has dominated the education and business markets with smart, easy-to-use graphical displays and quantitative analytics. Other companies like Tableau have engaged us in data visualization, and SurveyMonkey and Qualtrics have let us create questionnaires, describe data, and build basic tables.
But data science is much more than graphical display and quantitative analytics. I fear that too much time has been spent learning the programs, interpreting the output, and presenting the findings, and too little attention paid to the research methodology: examining the strengths and limitations of a study's design, considering who elects to respond and who elects not to participate, how attrition plays a role in the findings, and how to manage the data. On the back end, more time and energy should have been given to what the results mean for theory and practice, how those results fit into the historical context of the subject, and what generalizations, if any, can be made.
If, as a mentor said, educational technology is about extending human capabilities to be more effective and efficient, then there is a lot more that can be done in the cleaning, conducting, interpreting, and presenting of analyses, leaving more time for the front and back ends of the research enterprise. Intellectus Statistics is playing a role in building that effective and efficient technology.
Intellectus Statistics has a patent-pending process that cleans the data, comprehensively assesses a test's assumptions, interprets the results, and then writes the findings in plain English, all in about .125 milliseconds. Effectiveness (doing the right things) is improved by having all of a test's assumptions thoroughly assessed, analyses accurately conducted and interpreted, post-hoc tests run, and interpretations given with Bonferroni corrections, all parsimoniously written in English narrative, every time. Efficiency (doing things with the least waste of time and effort) is improved by automating statistician-like decisions, automatically generating additional tests, tables, and figures when necessary, and eliminating the need to copy, paste, edit, and format tables and figures.
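To give a flavor of the statistician-like decisions being automated, here is a minimal sketch, in Python with scipy, of the kind of pipeline described above: check a test's assumptions, run the omnibus test, then apply Bonferroni-corrected post-hoc comparisons. The three sample groups are hypothetical illustration data, and this is a simplified stand-in, not the actual Intellectus Statistics process.

```python
from itertools import combinations
from scipy import stats

# Hypothetical data: three groups to compare with a one-way ANOVA.
groups = {
    "A": [2.1, 2.5, 2.3, 2.8, 2.6],
    "B": [3.0, 3.4, 3.1, 3.6, 3.3],
    "C": [2.2, 2.4, 2.0, 2.5, 2.3],
}

# Assumption checks: normality within each group (Shapiro-Wilk),
# then homogeneity of variance across groups (Levene's test).
for name, values in groups.items():
    w, p = stats.shapiro(values)
    print(f"Shapiro-Wilk for group {name}: W={w:.3f}, p={p:.3f}")
lev_stat, lev_p = stats.levene(*groups.values())
print(f"Levene's test: W={lev_stat:.3f}, p={lev_p:.3f}")

# Omnibus test: one-way ANOVA across the three groups.
f_stat, f_p = stats.f_oneway(*groups.values())
print(f"ANOVA: F={f_stat:.3f}, p={f_p:.3f}")

# Post-hoc pairwise t-tests with a Bonferroni correction:
# each raw p-value is multiplied by the number of comparisons
# (capped at 1.0) to control the family-wise error rate.
pairs = list(combinations(groups, 2))
alpha = 0.05
for a, b in pairs:
    t, p = stats.ttest_ind(groups[a], groups[b])
    p_adj = min(p * len(pairs), 1.0)
    verdict = "significant" if p_adj < alpha else "not significant"
    print(f"{a} vs {b}: t={t:.2f}, Bonferroni p={p_adj:.3f} ({verdict})")
```

Automating exactly these steps, and then narrating their outcome in plain English, is what frees the researcher to spend time on the front and back ends of the study.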
I’m interested in your thoughts about educational technology, data science, Intellectus Statistics, and where the data science field might go from here.